Chandigarh: Apple Inc has said that iPhone users’ entire photo libraries will be checked for known child abuse images if they are stored in the online iCloud service.
The disclosure came in a series of media briefings in which Apple is seeking to dispel alarm over its announcement last week that it will scan users’ phones, tablets and computers for millions of illegal pictures.
Although Google, Microsoft and other technology platforms check uploaded photos or emailed attachments against a database of identifiers provided by the National Center for Missing and Exploited Children and other clearing houses, security experts have faulted Apple’s plan as more invasive.
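At its core, this kind of checking is a set-membership test: an uploaded image is fingerprinted and the fingerprint is compared against a list of known illegal images. The sketch below is a minimal, hypothetical illustration in Swift that uses an ordinary SHA-256 digest from CryptoKit as a stand-in fingerprint; production systems such as Microsoft’s PhotoDNA or Apple’s NeuralHash use perceptual hashes that tolerate resizing and re-encoding, which a cryptographic hash does not, and the hash list itself comes from clearing houses such as NCMEC.

```swift
import Foundation
import CryptoKit

// Hypothetical known-image fingerprint list. In real deployments the
// hashes are supplied by clearing houses, not hard-coded placeholders.
let knownDigests: Set<String> = [
    "0000000000000000000000000000000000000000000000000000000000000000" // placeholder
]

// Compute a SHA-256 digest for a photo file. This is only a stand-in:
// perceptual hashes (PhotoDNA, NeuralHash) are designed to survive
// resizing and re-encoding, which a cryptographic hash cannot.
func digest(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    return SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Flag an uploaded photo whose fingerprint appears in the known list.
func matchesKnownImage(_ fileURL: URL) -> Bool {
    guard let d = try? digest(of: fileURL) else { return false }
    return knownDigests.contains(d)
}
```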
“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands,” Apple wrote.
Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple, the company said on its website.
Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of child sexual abuse material (CSAM) online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos, the company said in a statement.
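Apple’s published technical summary describes this detection as threshold-based: an account is surfaced for human review only after it accumulates a certain number of matches, with private set intersection and threshold secret sharing ensuring that individual matches reveal nothing on their own. The following is a simplified, non-cryptographic sketch of that threshold-reporting logic only; the threshold value is a placeholder, not Apple’s figure.

```swift
// Simplified illustration of threshold-based reporting: nothing is
// escalated until an account accumulates more than `threshold` matches
// against the known-image hash list. Apple's actual protocol layers
// cryptography on top so neither device nor server learns anything
// below the threshold; this sketch conveys only the reporting logic.
struct MatchTracker {
    var threshold = 30      // placeholder value, not Apple's figure
    var matchCount = 0

    // Record one scan result; returns true once the account should be
    // escalated for human review.
    mutating func record(isMatch: Bool) -> Bool {
        if isMatch { matchCount += 1 }
        return matchCount > threshold
    }
}

// Usage: the tracker stays silent until the count exceeds the threshold.
var tracker = MatchTracker()
let shouldReport = tracker.record(isMatch: true)   // false until match 31
```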
Finally, updates to Siri and Search will provide parents and children with expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics, the statement read, adding that these features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8 and macOS Monterey. The program is ambitious, and protecting children is an important responsibility, Apple said, noting that these efforts will evolve and expand over time.