Dadparvar
Staff member
Apple on Thursday announced a new set of child-safety measures for its devices, prompting several digital rights organisations, including the European Digital Rights network (EDRi) and the Electronic Frontier Foundation (EFF), to raise concerns over the privacy and security of the company’s large global customer base.
The new policy scans messages sent or received by a minor’s account and warns children and their parents when sexually explicit photos are being sent or received. When a minor receives such a photo, it will be blurred; the child will be reassured that it is all right not to view it and will be offered helpful resources. Children sending such photos will be similarly warned. Parents can also be notified if their child decides to view or send sexually explicit photos.
Additionally, Apple will scan photos uploaded by users in iCloud Photos to identify Child Sexual Abuse Material (CSAM) and report these instances to the National Center for Missing and Exploited Children (NCMEC). Apple will match content against an unreadable database of known CSAM image hashes provided by child safety organizations that will be stored in the operating system of users’ devices.
Where a match is found, a cryptographic safety voucher encoding the result will be uploaded to iCloud Photos along with the image. Apple cannot interpret the contents of these vouchers unless a user exceeds a threshold preset by the company, after which Apple will manually review each match, disable the user’s account and send a report to the NCMEC.
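The mechanism described above can be illustrated with a highly simplified sketch. This is not Apple’s actual protocol: the real system uses a perceptual hash (NeuralHash), private set intersection, and threshold secret sharing, and Apple has not published its threshold value. Here a cryptographic hash, an ordinary set lookup, and an assumed threshold of 3 stand in for those components, purely to show the threshold-gated reporting logic.

```python
import hashlib

# Illustrative value only; Apple has not disclosed its real threshold.
MATCH_THRESHOLD = 3

def image_hash(image_bytes: bytes) -> str:
    # A cryptographic hash stands in for Apple's perceptual NeuralHash.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_uploads(uploads, known_hashes):
    """Collect match 'vouchers', revealing them only past the threshold."""
    vouchers = [h for h in map(image_hash, uploads) if h in known_hashes]
    if len(vouchers) >= MATCH_THRESHOLD:
        return vouchers  # would trigger manual review and reporting
    return []            # below the threshold, nothing is revealed

# Example with synthetic data (two matches, below the threshold of three):
known = {image_hash(b"known-image-1"), image_hash(b"known-image-2")}
uploads = [b"known-image-1", b"known-image-2", b"holiday-photo"]
print(scan_uploads(uploads, known))  # prints []
```

Note that a cryptographic hash only matches byte-identical files, whereas a perceptual hash like NeuralHash is designed to match visually similar images; that difference is central to the critics’ false-positive concerns quoted below.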
The EDRi and EFF recognise the serious problem posed by online child exploitation, but argue that the changes “build a backdoor into [Apple’s] data storage system and its messaging system” and that it is “impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children.” Rather, they say, Apple is compromising the end-to-end encryption that protects citizens against state surveillance: with additional tweaking or expansion, the machine-learning system could be used to scan all users’ devices, opening the floodgates to misuse by authoritarian regimes.
They also highlighted that machine learning, when used without human oversight, habitually classifies content incorrectly, including sexually explicit content, and employing such tools to scan users’ iCloud Photos will have a “chilling effect.”
The new policy has also drawn criticism from experts such as Edward Snowden, Matthew Green, and Kendra Albert. WhatsApp CEO Will Cathcart has said his company will not implement this policy. A consortium of legal experts, cryptographers, researchers, professors, and Apple consumers has also written an open letter asking the company to halt the deployment of the new policy and reaffirm its commitment to end-to-end encryption and to user privacy.
The post Digital rights groups say Apple’s new child protection policy weakens user privacy appeared first on JURIST - News - Legal News & Commentary.
Note: We take no responsibility for this news. It was posted here automatically by a feed reader, without our review or control. Because news posted here is deleted automatically after 21 days, threads are closed so that no one spends time posting and discussing here. You can always check the source and discuss on their site.