
In-app events hit the App Store, TikTok tries Stories, Apple reveals new child safety plan

Welcome back to This Week in Apps, the weekly TechCrunch series that recaps the latest mobile OS news, mobile applications, and the overall app economy. The app industry continues to grow, with a record 218 billion downloads and $143 billion in global consumer spending in 2020. Consumers last year also spent 3.5 trillion minutes using apps on Android devices alone. And in the U.S., app usage surged ahead of the time spent watching live TV. The average American watches 3.7 hours of live TV daily but now spends four hours per day on their mobile devices.

Apps aren’t just a way to pass idle hours — they’re also a big business. In 2019, mobile-first companies had a combined $544 billion valuation, 6.5x higher than those without a mobile focus. In 2020, investors poured $73 billion in capital into mobile companies — a figure up 27% year-over-year. This Week in Apps offers a way to keep up with this fast-moving industry in one place, with the latest from the world of apps, including news, updates, startup fundings, mergers and acquisitions, and suggestions about new apps and games to try, too.

App Store

Apple to scan for CSAM imagery.

Apple announced a significant new child safety initiative this week. On Thursday, the company detailed a set of features, arriving later this year, that will detect known child sexual abuse material (CSAM) in its cloud and report it to authorities. Companies like Dropbox, Google, and Microsoft already scan for CSAM in their cloud services, but Apple had allowed users to encrypt their data before it reached iCloud. Apple’s new technology, NeuralHash, will instead run on users’ devices to detect when a user uploads known CSAM imagery, without having to decrypt the images. It can detect a match even if the image has been cropped or edited in an attempt to avoid detection.
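Apple hasn’t published NeuralHash’s internals, but the general approach it belongs to, perceptual hashing, can be illustrated with a toy sketch. Below is a minimal Python example using a simple “average hash,” assuming the Pillow library is installed; the hash value, threshold, and function names are hypothetical illustrations, not Apple’s actual algorithm. The key property shown is that small edits to an image change only a few bits of its hash, so near-duplicates still match a database of known hashes.

```python
# Illustrative sketch only: a toy perceptual "average hash," not Apple's
# NeuralHash. Assumes Pillow (pip install Pillow); hash values are made up.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a size x size grayscale image, then set one bit per
    pixel depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits where two hashes differ."""
    return bin(a ^ b).count("1")


# A database of hashes of known images (hypothetical value).
KNOWN_HASHES = {0x8F3C_21A0_55E7_90D1}


def matches_known(path: str, threshold: int = 5) -> bool:
    """Flag an image whose hash is within `threshold` bits of a known hash.
    Crops, re-encodes, and minor edits typically flip only a few bits,
    so near-duplicates are still caught."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)
```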

Meanwhile, on iPhone and iPad, the company will roll out protections to Messages app users to filter images and alert children and parents if sexually explicit photos are sent to or from a child’s account. Children will not be shown the photos but instead see a grayed-out image. If they try to view the image through the link, they’ll be shown interruptive screens explaining why the material may be harmful and warned that their parents will be notified.

Some privacy advocates pushed back against such a system, believing it could expand to end-to-end encrypted photos, lead to false positives, or set the stage for broader on-device government surveillance. However, many cryptography experts believe the system Apple developed strikes a good balance between privacy and utility, and have endorsed the technology. In addition, Apple said reports are manually reviewed before being sent to the National Center for Missing and Exploited Children (NCMEC).

The changes may also benefit iOS developers who deal in user photos and uploads, as the new risk of detection could deter predators from storing CSAM imagery on iOS devices in the first place.

Meanwhile, those testing the new iOS 15 mobile operating system, which is not yet publicly available to all users, got their first glimpse of a new App Store discovery feature this week: “in-app events.” First announced at this year’s WWDC, the feature will allow developers and Apple editors alike to showcase upcoming events taking place inside apps directly on the App Store.
