{"id":192200,"date":"2021-08-23T12:00:20","date_gmt":"2021-08-23T11:00:20","guid":{"rendered":"https:\/\/www.transcend.org\/tms\/?p=192200"},"modified":"2021-08-19T08:30:54","modified_gmt":"2021-08-19T07:30:54","slug":"apples-plan-to-think-different-about-encryption-opens-a-backdoor-to-your-private-life","status":"publish","type":"post","link":"https:\/\/www.transcend.org\/tms\/2021\/08\/apples-plan-to-think-different-about-encryption-opens-a-backdoor-to-your-private-life\/","title":{"rendered":"Apple&#8217;s Plan to &#8220;Think Different&#8221; about Encryption Opens a Backdoor to Your Private Life"},"content":{"rendered":"<p><a href=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2021\/08\/AppleFBIKeys-encryption.png\" ><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-192207\" src=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2021\/08\/AppleFBIKeys-encryption-1024x512.png\" alt=\"\" width=\"400\" height=\"200\" srcset=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2021\/08\/AppleFBIKeys-encryption-1024x512.png 1024w, https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2021\/08\/AppleFBIKeys-encryption-300x150.png 300w, https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2021\/08\/AppleFBIKeys-encryption-768x384.png 768w, https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2021\/08\/AppleFBIKeys-encryption.png 1200w\" sizes=\"auto, (max-width: 400px) 100vw, 400px\" \/><\/a><\/p>\n<p><em>5 Aug 2021 &#8211; <\/em>Apple has announced impending changes to its operating systems that include new \u201cprotections for children\u201d features in iCloud and iMessage. 
If you\u2019ve spent any time <a target=\"_blank\" href=\"https:\/\/www.vice.com\/en\/article\/wjwpmn\/ahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh\" >following the Crypto Wars<\/a>, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.<\/p>\n<p>Child exploitation is a serious problem, and Apple isn&#8217;t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can <a target=\"_blank\" href=\"https:\/\/www.apple.com\/child-safety\/\" >explain at length<\/a> how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.<\/p>\n<p>To say that we are disappointed by Apple\u2019s plans is an understatement. Apple has <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2016\/02\/bay-area-rallies-against-fbi-threats-privacy-and-security\" >historically been a champion<\/a> of end-to-end encryption, for all of the same reasons that EFF has articulated <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2019\/07\/dont-let-encrypted-messaging-become-hollow-promise\" >time<\/a> <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2020\/03\/earn-it-bill-governments-not-so-secret-plan-scan-every-message-online\" >and<\/a> <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2021\/03\/fbi-should-stop-attacking-encryption-and-tell-congress-about-all-encrypted-phones\" >time<\/a> <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2016\/04\/eff-joins-32-civil-liberties-groups-and-companies-ask-obama-reject-anti-encryption\" >again<\/a>. 
Apple\u2019s compromise on end-to-end encryption may <a target=\"_blank\" href=\"https:\/\/www.reuters.com\/article\/us-apple-fbi-icloud-exclusive-idUSKBN1ZK1CT\" >appease government agencies<\/a> in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company\u2019s leadership in privacy and security.<\/p>\n<p>There are two main features that the company is planning to install in every Apple device. One is a scanning feature that will scan all photos as they get uploaded into iCloud Photos to see if they match a photo in the database of known child sexual abuse material (CSAM) maintained by the National Center for Missing &amp; Exploited Children (NCMEC). The other feature scans all iMessage images sent or received by child accounts\u2014that is, accounts designated as owned by a minor\u2014for sexually explicit material, and if the child is young enough, notifies the parent when these images are sent or received. This feature can be turned on or off by parents.<\/p>\n<p>When Apple releases these <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2019\/11\/why-adding-client-side-scanning-breaks-end-end-encryption\" >\u201cclient-side scanning\u201d<\/a> functionalities, users of iCloud Photos, child users of iMessage, and anyone who talks to a minor through iMessage will have to carefully consider their <a target=\"_blank\" href=\"https:\/\/ssd.eff.org\/en\/module\/your-security-plan\" >privacy and security priorities<\/a> in light of the changes, and possibly be unable to safely use what has been, until this development, one of the preeminent encrypted messengers.<\/p>\n<h3><b>Apple Is Opening the Door to Broader Abuses<\/b><\/h3>\n<p>We\u2019ve said it <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2019\/11\/why-adding-client-side-scanning-breaks-end-end-encryption\" >before<\/a>, and we\u2019ll say it again now: it\u2019s impossible to build a client-side scanning system that can only be used for sexually 
explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger\u2019s encryption itself and open the door to broader abuses.<\/p>\n<blockquote>\n<p class=\"pull-quote\"><em><strong>That\u2019s not a slippery slope; that\u2019s a fully built system just waiting for external pressure to make the slightest change.<\/strong><\/em><\/p>\n<\/blockquote>\n<p>All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children\u2019s, but anyone\u2019s accounts. That\u2019s not a slippery slope; that\u2019s a fully built system just waiting for external pressure to make the slightest change. Take the example of India, where recently passed <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2021\/07\/indias-draconian-rules-internet-platforms-threaten-user-privacy-and-undermine\" >rules<\/a> include dangerous requirements for platforms to identify the origins of messages and pre-screen content. New laws in Ethiopia requiring content takedowns of \u201cmisinformation\u201d in 24 hours <a target=\"_blank\" href=\"https:\/\/www.accessnow.org\/ethiopias-hate-speech-and-disinformation-law-the-pros-the-cons-and-a-mystery\/\" >may apply to messaging services.<\/a> And many other countries\u2014often those with authoritarian governments\u2014have <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2021\/02\/indonesias-proposed-online-intermediary-regulation-may-be-most-repressive-yet\" >passed<\/a> <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2020\/11\/turkey-doubles-down-violations-digital-privacy-and-free-expression\" >similar<\/a> <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2021\/04\/proposed-new-internet-law-mauritius-raises-serious-human-rights-concerns\" >laws<\/a>. 
Apple\u2019s changes would enable such screening, takedown, and reporting in its end-to-end messaging. The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.<\/p>\n<p>We\u2019ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2020\/08\/one-database-rule-them-all-invisible-content-cartel-undermines-freedom-1\" >database of \u201cterrorist\u201d content<\/a> that companies can contribute to and access for the purpose of banning such content. The database, managed by the <a target=\"_blank\" href=\"https:\/\/gifct.org\/\" >Global Internet Forum to Counter Terrorism<\/a> (GIFCT), is troublingly without external oversight, despite <a target=\"_blank\" href=\"https:\/\/cdt.org\/insights\/human-rights-ngos-in-coalition-letter-to-gifct\/\" >calls from civil society<\/a>. While it\u2019s therefore impossible to know whether the database has overreached, we do know that platforms <a target=\"_blank\" href=\"https:\/\/www.eff.org\/wp\/caught-net-impact-extremist-speech-regulations-human-rights-content\" >regularly flag critical content<\/a> as \u201cterrorism,\u201d including documentation of violence and repression, counterspeech, art, and satire.<\/p>\n<h3><b>Image Scanning on iCloud Photos: A Decrease in Privacy<br \/>\n<\/b><\/h3>\n<p>Apple\u2019s plan for scanning photos that get uploaded into iCloud Photos is similar in some ways to <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2019\/07\/dont-let-encrypted-messaging-become-hollow-promise\" >Microsoft\u2019s PhotoDNA<\/a>. The main product difference is that Apple\u2019s scanning will happen on-device. 
The (unauditable) database of processed CSAM images will be distributed in the operating system (OS); the processed images will be transformed so that users cannot see what the image is, and matching will be done on those transformed images using private set intersection, so that the device itself does not learn whether a match has been found. This means that when the features are rolled out, a version of the NCMEC CSAM database will be uploaded onto every single iPhone. The result of the matching will be sent up to Apple, but Apple can only tell that matches were found once the number of matching photos exceeds a preset threshold.<\/p>\n<p>Once that threshold is crossed, the photos in question will be sent to human reviewers within Apple, who will verify whether the photos are in fact part of the CSAM database. If confirmed by the human reviewer, those photos will be sent to NCMEC, and the user\u2019s account disabled. Again, the bottom line here is that whatever privacy and security aspects are in the technical details, all photos uploaded to iCloud will be scanned.<\/p>\n<p>Make no mistake: this is a decrease in privacy for all iCloud Photos users, not an improvement.<\/p>\n<p>Currently, although Apple holds the keys to view Photos stored in iCloud Photos, it does not scan these images. Civil liberties organizations have <a target=\"_blank\" href=\"https:\/\/fixitalready.eff.org\/apple\" >asked the company<\/a> to remove its ability to do so. But Apple is choosing the opposite approach and giving itself <i>more<\/i> knowledge of users\u2019 content.<\/p>\n<h3><b>Machine Learning and Parental Notifications in iMessage: A Shift Away From Strong Encryption<br \/>\n<\/b><\/h3>\n<p>Apple\u2019s second main new feature is two kinds of notifications based on scanning photos sent or received by iMessage. 
To implement these notifications, Apple will be rolling out an on-device machine learning classifier designed to detect \u201csexually explicit images.\u201d According to Apple, these features will be limited (at launch) to U.S. users under 18 who have been enrolled in a <a target=\"_blank\" href=\"https:\/\/support.apple.com\/families-and-kids\" >Family Account<\/a>. In these new processes, if a child under 13 using such an account wishes to send an image that the on-device machine learning classifier determines is sexually explicit, a notification will pop up, telling the under-13 child that their parent will be notified of this content. If the under-13 child still chooses to send the content, they have to accept that the \u201cparent\u201d will be notified, and the image will be irrevocably saved to the parental controls section of their phone for the parent to view later. For users between the ages of 13 and 17, a similar warning notification will pop up, though without the parental notification.<\/p>\n<p>Similarly, if the under-13 child <i>receives<\/i> an image that iMessage deems to be \u201csexually explicit\u201d, before being allowed to view the photo, a notification will pop up that tells the under-13 child that their parent will be notified that they are receiving a sexually explicit image. Again, if the under-13 user accepts the image, the parent is notified and the image is saved to the phone. Users between 13 and 17 years old will similarly receive a warning notification, but a notification about this action will not be sent to their parent\u2019s device.<\/p>\n<p>This means that if\u2014for instance\u2014a minor using an iPhone without these features turned on sends a photo to another minor who does have the features enabled, the sender will not receive a notification that iMessage considers their image to be \u201cexplicit\u201d or that the recipient\u2019s parent will be notified. 
The recipient\u2019s parents will be informed of the content without the sender consenting to their involvement. Additionally, once sent or received, the \u201csexually explicit image\u201d cannot be deleted from the under-13 user\u2019s device.<\/p>\n<p>Whether sending or receiving such content, the under-13 user has the option to decline without the parent being notified. Nevertheless, these notifications give the sense that Apple is watching over the user\u2019s shoulder\u2014and in the case of under-13s, that\u2019s essentially what Apple has given parents the ability to do.<\/p>\n<blockquote>\n<p class=\"pull-quote\"><em><strong>These notifications give the sense that Apple is watching over the user\u2019s shoulder\u2014and in the case of under-13s, that\u2019s essentially what Apple has given parents the ability to do.<\/strong><\/em><\/p>\n<\/blockquote>\n<p>It is also important to note that Apple has chosen to use the notoriously <a target=\"_blank\" href=\"https:\/\/facctconference.org\/2021\/acceptedpapers.html\" >difficult-to-audit<\/a> technology of machine learning classifiers to determine what constitutes a sexually explicit image. We know from years of documentation and research that machine-learning technologies, used without human oversight, have a habit of wrongfully classifying content, including supposedly \u201csexually explicit\u201d content. When blogging platform Tumblr <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2018\/12\/dear-tumblr-banning-adult-content-wont-make-your-site-better-it-will-harm-sex\" >instituted a filter for sexual content<\/a> in 2018, it famously caught all sorts of other imagery in the net, including pictures of Pomeranian puppies, selfies of fully-clothed individuals, and more. 
Facebook\u2019s attempts to police nudity have resulted in the removal of pictures of famous statues such as Copenhagen\u2019s <a target=\"_blank\" href=\"https:\/\/www.telegraph.co.uk\/news\/worldnews\/europe\/denmark\/12081589\/Copenhagen-Little-Mermaid-statue-Facebook-accused-of-censoring-photo.html\" >Little Mermaid<\/a>. These filters have a history of chilling expression, and there\u2019s plenty of reason to believe that Apple\u2019s will do the same.<\/p>\n<p>Since the detection of a \u201csexually explicit image\u201d will be using on-device machine learning to scan the contents of messages, Apple will no longer be able to honestly call iMessage \u201cend-to-end encrypted.\u201d Apple and its proponents may argue that scanning before or after a message is encrypted or decrypted keeps the \u201cend-to-end\u201d promise intact, but that would be semantic maneuvering to cover up a tectonic shift in the company\u2019s stance toward strong encryption.<\/p>\n<h3><b>Whatever Apple Calls It, It\u2019s No Longer Secure Messaging<\/b><\/h3>\n<p>As a reminder, a secure messaging system is a system where no one but the user and their intended recipients can read the messages or otherwise analyze their contents to infer what they are talking about. Despite messages passing through a server, an end-to-end encrypted message will not allow the server to know the contents of a message. When that same server has a channel for revealing information about the contents of a significant portion of messages, that\u2019s not end-to-end encryption. In this case, while Apple will never see the images sent or received by the user, it has still created the classifier that scans the images that would provide the notifications to the parent. 
Therefore, it would now be possible for Apple to add new training data to the classifier sent to users\u2019 devices or send notifications to a wider audience, easily censoring and chilling speech.<\/p>\n<p>But even without such expansions, this system will give parents who do not have the best interests of their children in mind one more way to monitor and control them, limiting the internet\u2019s potential for expanding the world of those whose lives would otherwise be restricted. And because family sharing plans may be organized by abusive partners, it&#8217;s not a stretch to imagine using this feature as <a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2021\/05\/fighting-disciplinary-technologies\" >a form of stalkerware<\/a>.<\/p>\n<p>People have the right to communicate privately without backdoors or censorship, including when those people are minors. Apple should make the right decision: keep these backdoors off of users\u2019 devices.<\/p>\n<p>________________________________________________<\/p>\n<p><a target=\"_blank\" href=\"https:\/\/www.eff.org\/es\/deeplinks\/2021\/08\/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life\" class=\"language-link\" xml:lang=\"es\"  hreflang=\"es\">Espa\u00f1ol<\/a><\/p>\n<p style=\"padding-left: 40px;\"><em><a href=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2021\/08\/India-McKinney.png\" ><img loading=\"lazy\" decoding=\"async\" class=\"alignleft wp-image-192209 size-full\" src=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2021\/08\/India-McKinney-e1629357926996.png\" alt=\"\" width=\"100\" height=\"66\" \/><\/a>India McKinney, Director of Federal Affairs. Prior to joining EFF, India spent over 10 years in Washington, DC as a legislative staffer to three members of Congress from California. 
India\u2019s passion has always been for good public policy, and she\u2019s excited to be using skills developed during legislative battles to fight for consumer privacy and for robust surveillance oversight. Email:\u00a0<a href=\"mailto:india@eff.org\">india@eff.org<\/a><\/em><\/p>\n<p style=\"padding-left: 40px;\"><em><a href=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2021\/08\/Erica-Portnoy.jpg\" ><img loading=\"lazy\" decoding=\"async\" class=\"alignleft wp-image-192208 size-full\" src=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2021\/08\/Erica-Portnoy-e1629357996994.jpg\" alt=\"\" width=\"100\" height=\"100\" \/><\/a>Erica Portnoy, Senior Staff Technologist. Erica develops the Let&#8217;s Encrypt client Certbot, which makes it easy for people who run websites to turn on https, keeping their users private and secure against network-based attackers. She writes and speaks about encryption in practice, including what people need from secure messaging providers and what the next generation of encryption in the cloud might look like. Erica\u00a0previously worked on EFF&#8217;s net neutrality project, writing technical filings and opinion pieces and organizing technologists from the networking industry to speak up for technical accuracy in policy decisions. Email:\u00a0<a href=\"mailto:erica@eff.org\">erica@eff.org<\/a><\/em><\/p>\n<p><a target=\"_blank\" href=\"https:\/\/www.eff.org\/deeplinks\/2021\/08\/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life\" >Go to Original &#8211; eff.org<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Apple is planning to build a backdoor into its data storage and messaging systems.  Apple\u2019s compromise on end-to-end encryption may appease government agencies in the U.S. 
and abroad, but it is a shocking about-face for users who have relied on the company\u2019s leadership in privacy and security.<\/p>\n","protected":false},"author":4,"featured_media":192207,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[216],"tags":[2632,910,1009,1772,2383,1277,1109,911],"class_list":["post-192200","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-technology","tag-apple","tag-big-brother","tag-big-tech","tag-computer-science","tag-encryption","tag-privacy-rights","tag-spying","tag-surveillance"],"_links":{"self":[{"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/posts\/192200","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/comments?post=192200"}],"version-history":[{"count":0,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/posts\/192200\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/media\/192207"}],"wp:attachment":[{"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/media?parent=192200"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/categories?post=192200"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/tags?post=192200"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}