*Updated 5:20pm ET, Wednesday, December, to include commentary from RAINN.*

In August 2021, Apple announced a plan to scan photos that users stored in iCloud for child sexual abuse material (CSAM). The tool was meant to be privacy-preserving and allow the company to flag potentially problematic and abusive content without revealing anything else. But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups who were concerned that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world. At the beginning of September 2021, Apple said it would pause the rollout of the feature to “collect input and make improvements before releasing these critically important child safety features.” In other words, a launch was still coming. Now the company says that in response to the feedback and guidance it received, the CSAM-detection tool for iCloud photos is dead.

Instead, Apple told WIRED this week, it is focusing its anti-CSAM efforts and investments on its “Communication Safety” features, which the company initially announced in August 2021 and launched last December. Parents and caregivers can opt into the protections through family iCloud accounts. The features work in Siri, Apple’s Spotlight search, and Safari Search to warn if someone is looking at or searching for child sexual abuse materials and provide resources on the spot to report the content and seek help. Additionally, the core of the protection is Communication Safety for Messages, which caregivers can set up to provide a warning and resources to children if they receive or attempt to send photos that contain nudity. The goal is to stop child exploitation before it happens or becomes entrenched and reduce the creation of new CSAM.

Apple’s CSAM update comes alongside its announcement today that the company is vastly expanding its end-to-end encryption offerings for iCloud, including adding the protection for backups and photos stored on the cloud service. Child safety experts and technologists working to combat CSAM have often opposed broader deployment of end-to-end encryption because it renders user data inaccessible to tech companies, making it more difficult for them to scan and flag CSAM. Law enforcement agencies around the world have similarly cited the dire problem of child sexual abuse in opposing the use and expansion of end-to-end encryption, though many of these agencies have historically been hostile toward end-to-end encryption in general because it can make some investigations more challenging. Research has consistently shown, though, that end-to-end encryption is a vital safety tool for protecting human rights and that the downsides of its implementation do not outweigh the benefits.

Communication Safety for Messages is opt-in and analyzes image attachments users send and receive on their devices to determine whether a photo contains nudity. The feature is designed so Apple never gets access to the messages, the end-to-end encryption that Messages offers is never broken, and Apple doesn’t even learn that a device has detected nudity.
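That on-device design can be sketched in a few lines of code. The Swift below is a hypothetical illustration, not Apple’s actual Messages implementation: the types and names are invented, and the only point it makes is architectural — the classification and the decision to warn both live on the device, and there is no network call anywhere in the path, so neither Apple nor its servers ever learn that a detection happened.

```swift
import CoreGraphics
import Foundation

// Hypothetical sketch only -- these types are invented for illustration
// and are not Apple's Messages internals.

/// Verdict produced by a classifier that runs entirely on the device.
struct SensitivityVerdict {
    let containsNudity: Bool
    let confidence: Double
}

/// Anything that can classify an image locally, e.g. a bundled Core ML model.
/// Note the signature: an image goes in, a verdict comes out, and no bytes
/// are sent anywhere.
protocol OnDeviceImageClassifier {
    func classify(_ image: CGImage) -> SensitivityVerdict
}

/// Decides what the local UI should do with an incoming or outgoing
/// attachment. There is deliberately no networking code in this path: the
/// warning, like the detection, stays on the phone.
struct CommunicationSafetyPolicy {
    /// Caregivers turn the protection on per child account; off by default.
    let isOptedIn: Bool
    let classifier: OnDeviceImageClassifier

    enum Action {
        case deliverNormally
        case blurAndWarn   // blur the preview and show age-appropriate guidance
    }

    func action(for attachment: CGImage) -> Action {
        // Opt-in gate: if the feature is off, the classifier never even runs.
        guard isOptedIn else { return .deliverNormally }
        let verdict = classifier.classify(attachment)
        // Arbitrary threshold for the sketch; a real system would tune this.
        return (verdict.containsNudity && verdict.confidence > 0.9)
            ? .blurAndWarn
            : .deliverNormally
    }
}
```

Opt-in is modeled as a simple guard: if a caregiver hasn’t enabled the protection for the child’s account, the attachment is delivered untouched and the classifier never runs.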
“Technology that detects CSAM before it is sent from a child’s device can prevent that child from being a victim of sextortion or other sexual abuse, and can help identify children who are currently being exploited,” says Erin Earp, interim vice president of public policy at the anti-sexual violence organization RAINN. “Additionally, because the minor is typically sending newly or recently created images, it is unlikely that such images would be detected by other technology, such as Photo DNA. While the vast majority of online CSAM is created by someone in the victim’s circle of trust, which may not be captured by the type of scanning mentioned, combatting the online sexual abuse and exploitation of children requires technology companies to innovate and create new tools. Scanning for CSAM before the material is sent by a child’s device is one of these such tools and can help limit the scope of the problem.”

Countering CSAM is a complicated and nuanced endeavor with extremely high stakes for kids around the world, and it’s still unknown how much traction Apple’s bet on proactive intervention will get. But tech giants are walking a fine line as they work to balance CSAM detection and user privacy.
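Earp’s point about Photo DNA rests on a structural property of hash-matching systems: they can only recognize images that have already been catalogued. The sketch below is a toy illustration of that limitation — it uses an exact SHA-256 digest where real systems such as PhotoDNA or Apple’s shelved design use perceptual hashes that survive resizing and re-encoding — but the conclusion is the same: a newly created image has no entry in any database of known material, so this kind of scanning cannot flag it.

```swift
import CryptoKit
import Foundation

// Toy illustration: exact digests stand in for the perceptual hashes real
// systems use so that resized or re-encoded copies still match.

/// Stand-in "fingerprint" of an image. A perceptual hash would be robust to
/// small edits; SHA-256 is not, but the structural point is identical.
func fingerprint(of imageBytes: Data) -> Data {
    Data(SHA256.hash(data: imageBytes))
}

/// Database of fingerprints for previously identified, catalogued material.
struct KnownImageDatabase {
    private let knownFingerprints: Set<Data>

    init(knownFingerprints: Set<Data>) {
        self.knownFingerprints = knownFingerprints
    }

    /// True only if this exact material was catalogued before.
    func matches(_ candidateImageBytes: Data) -> Bool {
        knownFingerprints.contains(fingerprint(of: candidateImageBytes))
    }
}

// A newly created photo has, by definition, never been catalogued, so its
// fingerprint is absent from every database and `matches` returns false --
// the gap that on-device detection of new imagery is meant to address.
```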