In-App Events Hit the App Store, TikTok Tries Stories, Apple Reveals New Child Safety Plan

Welcome back to This Week in Apps, the weekly TechCrunch series that recaps the latest in mobile OS news, mobile applications and the overall app economy. The app industry continues to grow, with a record 218 billion downloads in 2020 and $143 billion spent by consumers worldwide.

And in the United States, app usage has surged ahead of time spent watching live TV. The average American watches 3.7 hours of live TV per day, but now spends four hours per day on their mobile devices. Apps aren't just a way to pass idle time; they're also big business. In 2019, mobile-first companies had a combined valuation of $544 billion, 6.5x higher than companies without a mobile focus. In 2020, investors poured billions of dollars into mobile companies, a 27% increase year over year.

This Week in Apps offers a way to keep up with this fast-moving industry in one place, with the latest news from the world of apps, including OS updates, startup funding, mergers and acquisitions, and suggestions for new apps and games to try.

This week's top story: Apple announced a major initiative to scan devices for CSAM imagery. The company said Thursday that a new feature, coming later this year, will detect known child sexual abuse material (CSAM) in photos being uploaded to iCloud and report it to law enforcement agencies.

Companies like Dropbox, Google and Microsoft already scan for CSAM on their cloud services, but Apple has allowed users to encrypt their data before it reaches iCloud. Now, Apple's new technology, NeuralHash, will run on users' devices to detect when known CSAM images are being uploaded, without decrypting the images first.
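Apple has published only a technical summary of NeuralHash, which is a neural-network-based perceptual hash, so its internals are not public. The general idea, though — map visually similar images to nearby fingerprints, then compare those fingerprints against a database of hashes of known images — can be illustrated with the much simpler, classic "average hash." The sketch below is purely illustrative: the hash function, blocklist and threshold are hypothetical stand-ins, and it omits the cryptographic protections Apple described for its actual system.

```python
from PIL import Image  # pip install pillow

def average_hash(path: str, size: int = 8) -> int:
    """Classic 'aHash': downscale to 8x8 grayscale, then set one bit
    per pixel depending on whether it is brighter than the mean.
    Visually similar images yield hashes with a small Hamming distance."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    h = 0
    for p in pixels:
        h = (h << 1) | (p >= mean)
    return h

def hamming_distance(a: int, b: int) -> int:
    """Number of bits on which two hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical database of fingerprints of known images; in a real system
# this would come from a vetted external source, not be hardcoded.
KNOWN_HASHES = {0x0F0F0F0F0F0F0F0F}
MATCH_THRESHOLD = 5  # max differing bits still considered a match

def matches_known_image(path: str) -> bool:
    h = average_hash(path)
    return any(hamming_distance(h, k) <= MATCH_THRESHOLD for k in KNOWN_HASHES)
```

The reason for a perceptual rather than cryptographic hash is robustness: a resized or re-compressed copy of a known image still lands within a few bits of the original fingerprint, whereas a cryptographic hash would change completely.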

Meanwhile, on iPhone and iPad, the company will add tools to the Messages app that filter images and warn children and their parents when sexually explicit photos are sent to or received by a child's account. Children will not be shown the photo itself, but a grayed-out version instead. If a child tries to view the image anyway, they will see an interruptive screen that explains why the material could be harmful and warns that their parents will be notified.
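Apple has not said how this filter is implemented; below is a minimal sketch of the kind of client-side gating flow the description implies. Every name here — the classifier stub, the functions, the threshold — is a hypothetical stand-in, not Apple's code.

```python
def explicit_score(image_bytes: bytes) -> float:
    """Stub for an on-device classifier that rates how likely an
    image is to be sexually explicit (0.0 = safe, 1.0 = explicit)."""
    return 0.9  # pretend the model flagged this image

def show_image(image_bytes: bytes) -> None:
    print("Displaying image.")

def show_blurred_placeholder() -> None:
    print("Displaying grayed-out placeholder instead of the image.")

def child_confirms_viewing() -> bool:
    """Stub for the interruptive screen: explains why the material
    could be harmful and warns that parents will be notified."""
    answer = input("View anyway? Your parents will be notified. [y/N] ")
    return answer.strip().lower() == "y"

def notify_parents() -> None:
    print("Parent notification sent.")

def deliver_to_child(image_bytes: bytes, threshold: float = 0.5) -> None:
    """Gate an incoming photo on a child's account: show it directly if
    it scores below the threshold, otherwise blur first and only reveal
    it after the child clicks through the warning screen."""
    if explicit_score(image_bytes) < threshold:
        show_image(image_bytes)
        return
    show_blurred_placeholder()
    if child_confirms_viewing():
        notify_parents()
        show_image(image_bytes)
```

Note that everything in this flow happens on the device itself, which is consistent with Apple's framing that the Messages feature does not give the company access to the photos.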

Some privacy advocates have pushed back against the idea of such a system, believing it could expand to end-to-end encrypted photos, lead to false positives, or set the stage for on-device government surveillance in the future.