The company plans to scan all photos uploaded to iCloud Photos, looking for images of child abuse. Apple also plans to scan the iMessages of minors for a broader range of “sexually explicit” images.
Child abuse is a scourge, but it can be investigated and prosecuted without breaking encryption or scanning our private personal photos. Crimes against children cannot be an excuse for Apple to install surveillance software that will scan millions of iPhones. This type of “client-side scanning” violates the promise of end-to-end encryption.
Apple is very likely to face pressure to expand the system to search for additional types of content. Governments around the world would love to scan for and report matches against their own databases of censored material, which could lead to disastrous results, especially in countries whose regimes already track activists and censor online content. Far from protecting children, the system will endanger them, especially LGBTQ kids and children in abusive homes.