Privacy

Apple Delays Plan To Snoop on Users' Pictures

An encryption back door will lead to abusive authoritarian surveillance—even if you present it as a way to stop child porn.


A controversial plan by Apple to scan users' photos for the possible presence of child pornography has been shelved (for now) after tech privacy advocates warned of a future of oppressive government surveillance.

In early August, Apple announced it was rolling out a tool that would scan everybody's iPhone and iPad photos when they were uploaded into iCloud storage. Apple promised that no human would actually look at the photos unless an automated matching system flagged them as copies of known child pornography images. If the system found child porn, Apple would disable the account and send a report to the National Center for Missing and Exploited Children, a quasi-private nonprofit created and primarily funded by Congress.
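The matching step described above can be illustrated with a toy perceptual hash. This is a minimal sketch of the general technique, not Apple's system: Apple used a neural perceptual hash ("NeuralHash") and a private-set-intersection protocol, whereas this illustration uses a simple average hash over an 8x8 grayscale grid with a Hamming-distance comparison. All function names and the threshold here are illustrative assumptions.

```python
# Toy sketch of hash-based image matching, the general idea behind
# scanning for *known* images. Not Apple's actual algorithm or API.

def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grid of 0-255 gray values."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the image's mean.
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches_known_database(photo_pixels, known_hashes, threshold=5):
    """Flag a photo only if its hash is near a hash in the known database."""
    h = average_hash(photo_pixels)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)

# Example: a photo identical to a database entry matches;
# a very different image does not.
original = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
unrelated = [[255 - (r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
db = {average_hash(original)}
print(matches_known_database(original, db))   # True
print(matches_known_database(unrelated, db))  # False
```

The point of a perceptual (rather than cryptographic) hash is that near-duplicates of a known image still land within the distance threshold, which is also why critics note the same machinery could match any image set a government supplies.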

The announcement was greeted with shock and dismay by tech privacy and security organizations like the Electronic Frontier Foundation (EFF). One massive problem is that there's no reason to believe that, once implemented, this system would remain confined to child pornography. A technology that could be programmed to detect images of child pornography could be programmed to detect other types of images if Apple is pressured or ordered to do so by a government. This technology could be used by authoritarian leaders to figure out who is transmitting images they don't approve of or those associated with protests and activism. Will Chinese authorities demand to know who has images of Winnie the Pooh on their phones, given President Xi Jinping's apparent sensitivity to the comparison?

Apple had initially planned to implement the scanning system with an August patch. But on Friday, in a message to The Verge, the company announced it was delaying the plan. "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," a representative from Apple told The Verge.

Federal law already requires Apple to report Child Sexual Abuse Material (CSAM) to the authorities whenever the company finds any. But the same statute also clearly states that platforms and providers are not required to monitor users' content or "affirmatively search" for CSAM images, which is what Apple was proposing to do.

Apple's plan made a policy of violating users' privacy just to make sure they weren't breaking the law. If the U.S. government actually attempted to scan all our photos just to be certain we weren't storing illegal images of child pornography, we would see it as the massive Fourth Amendment violation that it is. Government is not supposed to be looking through our private files absent suspicion of criminal activity and a warrant.

But data held by third parties lack such strong constitutional protections, and so the government pressures companies like Apple to provide this information as a way of bypassing constitutional privacy protections. And in other countries, like China, where those protections are simply nonexistent, Apple has given governments even broader access to user information.

If Apple moves forward with this plan, the consequences could be huge and dangerous: a mechanism for worldwide mass surveillance. It amounts to an official policy of an encryption back door designed not to look like one. The problem with back doors that bypass users' privacy protections is that once they've been created, nobody can guarantee that only the "right" people with the "right" intentions will be able to access them, and only under certain legal circumstances. Encryption back doors will ultimately lead to abuse, both by criminal elements and by government officials (and sometimes those are the same people).

As such, EFF Executive Director Cindy Cohn responded to Apple's delay by saying the plan isn't good enough. Her organization wants Apple to drop the program entirely:

The responses to Apple's plans have been damning: over 90 organizations across the globe have urged the company not to implement them, for fear that they would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.



  1. A controversial plan by Apple to scan users’ photos for the possible presence of child pornography has been shelved (for now)

    There were concerns that Tony’s dick pics would crash the system.

    1. Too much magnification required?


    2. Cameras really are good in those aren’t they?!?!? All those pixels to capture Tony’s micro-penis!!



  2. Apple Delays Plan To Snoop on Users’ Pictures

    “We took note of progressive tactics and will now wait until the news cycle moves on before implementing it anyway.”

    Hunter is going to need another place to store those pictures of his niece.

  3. “Will Chinese authorities demand…”

    Yes, yes, a thousand times yes. They are a communist dictatorship and they will demand anything and everything. As I have mentioned a million times, trillion-dollar multinational corporations don’t care who they deal with, so companies that build phones with slave wages (or actual slavery) will happily add back doors for the police state. Just like they will ban negative articles about Biden prior to an election.

    1. My guess is it was written for the ChiComs and they are just advertising it as anti-child porn outside china.

      1. Maybe the “Advertising” was a warning.

        I mean, now we KNOW they can do it. Which means the encryption is breakable. If there’s a back door, China can exploit it. This was Apple’s warrant canary.

        1. It wasn’t Apple’s warrant canary – you act like Apple gives two shits about anything but profit. If it was a warning, it was just to test the waters to see how much they’d lose by implementing it.

          The mere proposal and attempted implementation is a slap in the face to all Apple consumers. Apple is fb with a fake smile. Fuck em.

      2. My guess is it was written for the ChiComs and they are just advertising it as anti-child porn outside china.

        Bingo. Maybe someone should ask their board of directors about this. They could start with fucking Al Gore.

  4. Given up the plan, or given up admitting it?

  5. It wasn’t even their plan. It was a three-letter-agency behind the initiative, obviously.

    1. FYTW is four letters – – – –

  6. I don’t even know why they announced the plan in the first place. No one would have known their pix were being scanned in the cloud unless Apple mentioned it.

    So now, Apple just waits a few months and does it anyway.

    1. It seems odd to me. I see one of two possibilities:
      1: They genuinely didn’t think of any negatives and thought they would get public praise for this program.
      2: Someone on the inside wanted to kill the plan with public pressure because someone powerful in the company (or possibly the government) was pushing for it, so they sent the press release knowing the backlash would make upper management panic and pull it.

      There are possible variants, but I can’t think of any other answer than one of those two.

      1. The other possibility is they are incrementally deploying more and more authoritarian controls because they are in the government’s pocket like all the other prog corporations.

        I mean, this is formulaic government rights usurping – just say it’s for “the kids” and watch people hand their rights away.

    2. Everyone already knows their pictures are being scanned in the cloud. Apple wanted to scan your pictures ON YOUR DEVICE.

      1. And this changes my argument? If Apple had never said a word about it, no one would have known either way.

      2. My privates are not public but they are pubic.

      3. No, Apple’s original announcement was that the pictures would be scanned when they were saved to Apple’s cloud storage service.

      4. I consider scanning in the cloud and scanning in an individual app’s sandbox to be about the same. They both suck, but I don’t put one before the other. If Apple were to instead scan the whole device, I would have a major problem with this since I wouldn’t have a way to opt out.

  7. I’m a techy by profession. The moral here is that, if you can at all avoid it, NEVER use cloud storage products, especially for personal stuff. There is no reason anyone NEEDS to move their personal stuff to Apple’s iCloud.

    The problem here is that whatever the privacy agreement was when you signed up and/or started using the storage WILL change. Apple, Google, et al. WILL change it the second it is convenient for them to do so. Their existing agreement with you means NOTHING to them. If your stuff is already on there, you are stuck figuring out how to close out your account.

    As this article points out, it will not stop at child porn. That’s a noble goal, but this will quickly move on to lesser crimes, political crimes, and thought crimes. Second, this creates an easy way to “SWAT” someone: send them an image that could be perceived as child porn, and before they notice, it ends up in their cloud account, the account is shut off, and the authorities are called on them.

    Don’t use nor ever trust Apple iCloud storage.

    1. Personally, if you know what you’re doing, you’re fine. Criminals do. Whether you like it or not, the cloud services are hosting your data and they will get to have some say-so over what is there, and it’s in the user agreement. If you’re too stupid to read them, then woe be to you. However, this was crossing the line: Apple was becoming the new FBI extension, and the tech crowd started yelling about how this hasn’t been done before in such a public and broad manner. Apple is very sensitive to bad publicity, even if it was just a vocal minority. They do not like bad PR that threatens their “we’re way more private than Android” spiel.

      1. But that’s how ALL cloud storage is going to be, if you want to take your chances …

        Next it will be anything construed as COVID misinformation, including things like political cartoons. Eventually, doing something as egregious as saving a recipe that calls for meat will be enough that the thought police need to deactivate you.

        1. Encryption is your friend in the cloud.
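The advice in the thread above (encrypt locally, so the provider only ever stores ciphertext it cannot scan) can be sketched in a few lines. This is a deliberately simplified toy: the SHA-256-based keystream below is for illustration only, and all names are hypothetical. In practice you would use a vetted authenticated-encryption library (for example, the `cryptography` package's Fernet or AES-GCM), never a hand-rolled cipher.

```python
# Toy sketch of client-side "encrypt before upload". NOT real crypto:
# the keystream construction here is illustrative only.
import hashlib
import secrets

def _keystream(key, nonce, length):
    """Derive `length` pseudorandom bytes from key+nonce (toy, not secure)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key, plaintext):
    """Return nonce || ciphertext; the provider stores only this blob."""
    nonce = secrets.token_bytes(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))

def decrypt(key, blob):
    """Recover the plaintext; only the key holder can do this."""
    nonce, ciphertext = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))

key = secrets.token_bytes(32)       # stays on your device, never uploaded
photo = b"raw bytes of a private photo"
blob = encrypt(key, photo)          # this is all the cloud ever receives
assert blob[16:] != photo           # the stored blob is not the plaintext
print(decrypt(key, blob) == photo)  # True
```

The design point is that scanning (of any kind, CSAM or otherwise) becomes impossible for the provider, since the key never leaves the user's device; this is also exactly the property that on-device scanning was designed to work around.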

  8. We should be looking at what China is doing with surveillance and doing exactly the opposite, not trying to copy them.

  9. “A controversial plan by Apple to scan users’ photos for the possible presence of child pornography has been shelved…”

    At least for now. I wonder if a picture of a young female, dressed in a “Catholic school girl” outfit, while reclined, would be considered “child porn.” Incidentally, the “young female” in the picture was one of our many cats.

    And, no, I am not being entirely specious, because the same pic was taken by my wife and shared at her office. And several people complained that it was “improper.” I kid you not.

    So, it begs the question: “Do we want those ‘kitty-porn’ folks deciding what is ‘proper’?”

    1. What begs the question is “does your wife still work there?”

      1. “What begs the question is ‘does your wife still work there?’”

        LOL. Nope. We moved.
        Note: the vast majority of the folks there thought the pic was cute.

  10. C’mon Man . . . It’s For The Children!

  11. Shouldn’t the current Reason position be that this is fine, because it’s a private company? It doesn’t matter if the government tells them to; they are still a private company.

  12. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” a representative from Apple told The Verge.

    So it’s coming, but next Wednesday instead of this one.

  13. If Apple moves forward with this plan, the consequences could be huge and dangerous—a mechanism for worldwide mass surveillance.

    Private company something something do whatever they something.

    1. Private Company being pressured by Government = Fascism

      Government can’t ban speech, but they can pressure private companies to ban speech.

      I’m only 35, but old enough to remember lefties non-stop complaining about corporations. Everything was corporations this, corporations that.

      Then they gained power over them, because that’s what locusts do, and now they love corporations. Again, with them it’s all about power.

  14. Just another reason I don’t use Apple products, and refuse to update or do anything with my Microsoft OneDrive. Can’t trust these people at all.

  15. See, the authoritarians always pick on the worst of people (who doesn’t hate those evil chomos?), and dumbasses will go along with it, not understanding where that leads.

    Then they will go, “Wait, why is this being used against me? I’m innocent, after all. I thought this was just supposed to stop illegal activity. Sure, I did misgender someone in a private conversation, but that’s not the same thing! Not the same thing!”

    Just like the real communists in the gulags.

  16. Private company’s private pictures of privates.

  17. Apple hasn’t said so, but many think they took this approach to CSAM in order to end-to-end encrypt all photos. That would mean that if the government presented a warrant to release photos, Apple wouldn’t be able to comply. This move may have been to show the state that Apple was giving something in exchange for taking something away. If that’s true, it could come down to a pragmatic argument for increased privacy, and Apple totally screwed up the optics.

    At least as long as it isn’t expanded, it is about as borderline as surveillance gets. It only finds specific photos from the CSAM database, without sending any photos off the device, and only within iCloud Photos, which you can choose not to use. It doesn’t scan the photo roll, iCloud storage, or other apps.

    1. The other “feature” announced at the same time is a bigger concern to me. They have the option to notify parents if kids are sent nudes. This might seem OK at first glance, but many kids live in abusive households and could be endangered by this.

    2. Not mentioned is that Google scans photos, mail, and other content for CSAM. I guess we can still use Plex to store photos privately.



