20 votes

An open letter against Apple's privacy-invasive content scanning technology

34 comments

  1. [30]
    stu2b50
    (edited )
    Link

    I feel like I'm going crazy but none of this seems to make sense, and a significant portion seems to be based on something that's not in reality. For instance

    Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.

    What does that even mean? A backdoor to what? Given the 2 new, completely discrete features, the two possibilities can only be

    1. iCloud

    and

    2. iMessage

    But 1) how can you have a backdoor to iCloud Photos when iCloud Photos has the front door open - it's not E2EE, can't be, and Apple explicitly states in their ToS that they can and will look at your images and take your images down. And 2) I fail to see how the client-side detection of nude images that only applies to accounts of children under the age of 13 and just shows 2 popups is a backdoor to anything. The iMessage feature doesn't send anything to Apple or check against the CSAM database or any of that.

    Apple's proposed technology works by continuously monitoring all photos stored or shared on a user's iPhone, iPad or Mac, and notifying the authorities if a certain number of objectionable photos is detected.

    That's literally false - the fact that it only applies to iCloud Photos is kind of important. In case you didn't know, anything you upload to OneDrive also gets scanned by Microsoft's PhotoDNA, which does essentially the same thing.

    "Images you upload to iCloud photos are fingerprinted and matched to the CSAM database" is so very different from "continuously monitoring all photos stored or shared on a user's iPhone, iPad, or Mac".
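
    To make that distinction concrete, here is a toy sketch of an upload-time check (purely illustrative: the names are made up, and Apple's actual NeuralHash is a perceptual hash that matches visually similar images, not an exact cryptographic hash like the stand-in below):

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A cryptographic hash only matches
    # exact bytes, but the control flow is the same.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known fingerprints (placeholder contents).
known_fingerprints = {fingerprint(b"known-bad-image-bytes")}

def scan_on_upload(image_bytes: bytes) -> bool:
    # Runs only in the upload path: an image never handed to the
    # uploader is never fingerprinted.
    return fingerprint(image_bytes) in known_fingerprints
```

    The point is that the check is gated on the upload, not on a photo merely existing on the device.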

    If you didn't know, this is what you agree to whenever you signed up for iCloud Photos:

    You acknowledge and agree that Apple may, without liability to you, access, use, preserve and/or disclose your Account information and Content to law enforcement authorities, government officials, and/or a third party, as Apple believes is reasonably necessary or appropriate, if legally required to do so or if Apple has a good faith belief that such access, use, disclosure, or preservation is reasonably necessary to: (a) comply with legal process or request; (b) enforce this Agreement, including investigation of any potential violation thereof; (c) detect, prevent or otherwise address security, fraud or technical issues; or (d) protect the rights, property or safety of Apple, its users, a third party, or the public as required or permitted by law.

    Fair to be wary of uploading things there, in that case, but that was always true.


    Apple's proposed measures could turn every iPhone into a device that is continuously scanning all photos and messages that pass through it in order to report any objectionable content to law enforcement

    Also a falsehood. It conflates the two separate features. The iMessage feature doesn't send anything to law enforcement - it's a mild annoyance if it gets tripped, and it doesn't use the CSAM database - it's just an ML model Apple says can detect any kind of nudity. Will it actually work well? Who knows, but the cost for false positives is low in this case.

    It also, like, doesn't scan all photos.

    Is it just me, or is this in extremely bad faith?

    21 votes
    1. [18]
      aditya
      Link Parent

      Yes, Apple is currently limiting scanning of photos to those being uploaded to iCloud, and yes, this can be disabled by opting out of backing it up to iCloud. However, they're doing this scanning on-device, rather than on their servers, so the "front door is open" argument, while valid, never comes into the picture. The problem with this entire thing for me is that this is a small step to go from "scan all photos on this device being uploaded to iCloud" to "scan all photos on this device". The problem isn't necessarily what photos get scanned, but rather that the mechanism is being built into the devices, and can likely be expanded quite trivially.

      10 votes
      1. [9]
        stu2b50
        Link Parent

        the problem with this entire thing for me is that this is a small step to go from "scan all photos on this device being uploaded to iCloud" to "scan all photos on this device".

        That's not the case from an implementation point of view. iCloud Photos is what does the fingerprinting and CSAM checking. As a userland app, iCloud Photos only has access to what you give it to be uploaded. If Apple did want to scan all photos, they could not continue with this route - it would have to be implemented deeper into the OS rather than in an app, into a process that has true access to the filesystem instead of the facade given to apps.

        In that respect, there is nothing in iCloud Photos, or in how it does this scanning, that another app not made by Apple could not do. It's all stuff apps get to do.

        5 votes
        1. [7]
          aditya
          (edited )
          Link Parent

          Apple is saying this is for iCloud photos yes, but they're also promising it's happening on device. We also don't really know if this is built into the Photos app for photos being uploaded to iCloud or into the iCloud stuff, because we can't see the source, correct?

          Edit: This is the part that has my attention most.

          Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.

          Edit again: I don't think anyone would be complaining if this was just happening on iCloud, rather than on-device. That extra hook, I think, is the problem. You've made the hole, how much bigger can you make it?

          7 votes
          1. [6]
            stu2b50
            Link Parent

            I don't find that part particularly concerning at all. It's just an implementation detail.

            Again, it would be very different if iOS itself were fingerprinting all of your photos and it was just that iCloud Photos checks it. That would be concerning.

            In this case, once you give your image to the iCloud Photos app to be uploaded - and by app I do mean the code on your iPhone, not the servers - it is fair game in my eyes. Yes, the iCloud Photos app, on your phone, does the fingerprinting. But since it's just the behavior of a userland sandboxed app, we know it cannot access anything you don't give it access to because apps don't get filesystem access at all on iOS.

            It's like the difference between the post office coming into your home, X-raying everything, and giving you a corresponding bar code for each item in your house to use if you ever choose to mail it, versus the post office X-raying your package in front of your house as you hand it to them.

            Once you're handing over a photo to a cloud app to be uploaded, I do not care whether the app does the scanning or the server - you should assume that Apple has access to the data and can do whatever matching and filtering it wants, for whatever reason it wants, and not use the service if that would put you in jeopardy in some fashion.

            2 votes
            1. [5]
              aditya
              Link Parent

              As someone who doesn't use iCloud storage, I've never interacted with / seen the iCloud photos app. Does it exist as a standalone app or is iCloud Sync built into the regular photos app? I see an option in settings -> icloud to enable photos, but I also don't have enough storage to experiment with this. If there is an app, maybe we can somewhat verify that this is happening there, and it can't access photos that aren't imported into it to upload, but again my understanding is it just usually syncs everything in your regular photos app. If the hook is in the regular photos app, it can theoretically see everything right now, or be modified trivially to see everything, regardless of whether it's being uploaded somewhere.

              Also, to be 100% certain that even if the app exists, that's where this scan is happening, iOS or the app would have to be open source?

              Edit: https://support.apple.com/en-us/HT204264 skimming through this, I don't see it being a separate app.

              I also think that this discussion is focusing too much on what this feature means today, and less on the broader impact this feature can have in the hands of an unfriendly government. It's an easier step to go from this to scanning all photos for "objectionable" content than if this feature didn't exist.

              https://twitter.com/kaepora/status/1423388549529968645

              Reminder: Apple sells iPhones without FaceTime in Saudi Arabia, because local regulation prohibits encrypted phone calls. That's just one example of many where Apple's bent to local pressure.

              What happens when local regulation mandates that messages be scanned for homosexuality?

              4 votes
              1. [4]
                stu2b50
                Link Parent

                Photos is the iCloud Photos app. Photos only has access to what you give it access to - naturally the camera roll, for instance, but not, by default, the pictures in Dropbox. Furthermore, you can disable iCloud syncing (or, as for many people, it's effectively off because you've run out of space).

                Again, once iCloud syncs a file, you should have no assumption of privacy. Hence I see no more issue with it doing the CSAM checking as part of the sync. If you have iCloud syncing turned on, you should assume you have no privacy on anything Photos touches. If you do not have iCloud syncing turned on, you should assume Apple cannot touch any of your photos.

                I don't see any change to that threat model.

                Also, to be 100% certain that even if the app exists, that's where this scan is happening, iOS or the app would have to be open source?

                Of course, that's the whole premise. Apple told us something they're doing, that was the source. Apple can of course install all the backdoors they want into whatever they want.

                3 votes
                1. [3]
                  aditya
                  Link Parent

                  Then Photos can see all the images on the device, to my understanding. I'm not thinking about other cloud services, but rather what I receive in, say, a WhatsApp group. That image goes into Photos, and the new scanning feature checks it right before uploading to iCloud. But again, it's a far more trivial change to go from "on upload to iCloud" to "just scan everything the Photos app can see". If Saudi Arabia wants Apple to scan for images involving homosexuality, or China for photos of the Tiananmen Square protests, it just got a lot closer, even if it's not there yet.

                  Of course, that's the whole premise. Apple told us something they're doing, that was the source. Apple can of course install all the backdoors they want into whatever they want.

                  That's fair, I'll concede that. However, the argument that iCloud photos is a userland app doesn't hold when it's the default home for photos on iOS, iCloud or otherwise. IMO Dropbox etc are the outliers here.

                  3 votes
                  1. [2]
                    stu2b50
                    Link Parent

                    I think this confusion is just over how apps interact with the filesystem on iOS.

                    On a desktop OS, for instance, you have a filesystem. All apps have access to the filesystem (pending permissions on different paths, of course). So a "gallery" app on a desktop OS would go through all the folders it has access to and index all the images so you can view it.

                    But that's not how Photos works on iOS. No app has access to the filesystem. As far as they're concerned, the files they OWN are the only files in existence. So a file on iOS cannot just float in existence - it must belong to some app's virtual filesystem (or to the OS itself).

                    Photos has access to all the photos in its virtual filesystem. You cannot just "save" an image in safari, for instance, like you can on a desktop OS (or android). You can only send it to another app - photos, for instance.

                    Photos only has access to what it owns. It does not have access to all photos on your phone - it doesn't even know that anything other than the files it owns exists. Now, if you want to define "all photos" as "everything Photos owns", then sure, it's true by tautology, but the images owned by Dropbox, or by a third-party photos app like Google Photos, certainly seem to count as photos too.
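
                    The sandbox model being described can be caricatured in a few lines (a toy model, nothing like actual iOS internals):

```python
# Toy model of per-app sandboxes: each app sees only the files it owns.
class Sandbox:
    def __init__(self, owner: str) -> None:
        self.owner = owner
        self.files: dict[str, bytes] = {}

    def save(self, name: str, data: bytes) -> None:
        self.files[name] = data

    def list_files(self) -> list[str]:
        # An app can enumerate only its own virtual filesystem; other
        # apps' files simply do not exist from its point of view.
        return sorted(self.files)

photos = Sandbox("Photos")
dropbox = Sandbox("Dropbox")
photos.save("beach.jpg", b"...")
dropbox.save("receipt.jpg", b"...")
```

                    In this toy model, photos.list_files() returns only its own image; Dropbox's file is invisible to it.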

                    2 votes
                    1. aditya
                      Link Parent

                      The way I see it, while a lot of this is accurate, in practice it really doesn't matter. On iOS no one is storing anything on their filesystem directly. Photos go in the photos app.

                      1. Photos go in the photos app, unless they're explicitly stored on some other service.
                      2. The photos app is also the iCloud Photos client.
                      3. This app can now scan photos for CSAM, if they're flagged for iCloud upload.

                      That last step can trivially become "This app can now scan photos for 'objectionable' content", where there's no telling who decides what is objectionable. I am not concerned about third party services, but rather how the Photos app on iOS natively works.

                      Photos has access to all the photos in its virtual filesystem. You cannot just "save" an image in safari, for instance, like you can on a desktop OS (or android). You can only send it to another app - photos, for instance.

                      You can absolutely hard-click a photo on safari and it shows an option to "Add to photos". It is the default mechanism for photo management on iOS, and we really shouldn't even be arguing this. The file system stuff is true, but it's abstracted away to a degree that in a user's workflow it doesn't matter. My mom is going to right click on her desktop and store it to the file system, and on her phone hard click and save to photos. To her, there's no distinction. To the average user, there is no distinction. I've never indicated a concern that the photos app can see what's on my dropbox or google photos. I do understand that distinction, but I maintain that the photos app remains the default. If I have auto-download media setup on WhatsApp, where do my images go? Dropbox? Google Photos? No! It goes to the default photos app. Let's stop pretending that the default photos app is on par with other services here.

                      6 votes
        2. aldian
          Link Parent

          As a userland app

          Apple's Photos app is not a userland app. Any software signed by Apple's certificates has full, unrestricted access to all APIs on the system, including 'private' APIs that include the entire suite of Mach kernel functions. Also, if you look at the Photos app in Settings, it doesn't feature the same set of permissions that other photo-managing apps have. In fact, if you look at another photo-managing app, like Google Photos, Apple's Photos app is the permission those apps are granted in order to access the device's photos, because Apple's app is the gatekeeper for all of it.

          5 votes
      2. [8]
        NaraVara
        Link Parent

        The ability to check a hash against another hash was always built into these devices. It's literally how password authentication works. It's not that much less trivial to build that in from scratch if that's what you want to do than it is to build on top of this functionality.
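
        For comparison, the password case looks roughly like this (illustrative only; real systems use a salted, deliberately slow KDF such as bcrypt or Argon2, never a bare fast hash):

```python
import hashlib

def hash_password(password: str) -> str:
    # Illustrative only: see the caveat above about real KDFs.
    return hashlib.sha256(password.encode()).hexdigest()

stored_hash = hash_password("correct horse battery staple")

def check_password(attempt: str) -> bool:
    # Authentication compares hashes, never plaintext: the same
    # "check a hash against a known hash" primitive.
    return hash_password(attempt) == stored_hash
```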

        1 vote
        1. [5]
          aditya
          Link Parent

          Sure, but now they've gone and announced to every government out there that they have a channel specifically to do so for photos. I'm not sure your average politician knows how passwords are hashed / stored / compared or whatever. What they care about are high-level mechanisms. Even if the scanning functionality isn't for every image today, it's definitely one step closer. I asked a question elsewhere: will Apple say no to a major foreign government, say China, if it requires this functionality to be extended to all photos for Apple to continue selling in that market? Why even open themselves up to that?

          Reminder: Apple sells iPhones without FaceTime in Saudi Arabia, because local regulation prohibits encrypted phone calls. That's just one example of many where Apple's bent to local pressure.

          What happens when local regulation mandates that messages be scanned for homosexuality?

          https://twitter.com/kaepora/status/1423388549529968645

          7 votes
          1. [2]
            stu2b50
            Link Parent

            But they already did? Apple can do all the scanning they want on iCloud photos. They say as much on the tin.

            You acknowledge and agree that Apple may, without liability to you, access, use, preserve and/or disclose your Account information and Content to law enforcement authorities, government officials, and/or a third party, as Apple believes is reasonably necessary or appropriate, if legally required to do so or if Apple has a good faith belief that such access, use, disclosure, or preservation is reasonably necessary to: (a) comply with legal process or request; (b) enforce this Agreement, including investigation of any potential violation thereof; (c) detect, prevent or otherwise address security, fraud or technical issues; or (d) protect the rights, property or safety of Apple, its users, a third party, or the public as required or permitted by law.

            Why is this such an advertisement of Apple being able to comply with requests when they literally say they can and will already?

            What happens when local regulation mandates that messages be scanned for homosexuality?

            Apple tells them to sod off because the iMessage feature is not at all the same as the CSAM one? On the other hand, if a country says "no homosexual images allowed on cloud services in this country", with or without the CSAM scanning I would assume Apple would comply.

            2 votes
            1. aditya
              Link Parent

              I have literally zero problem with Apple scanning photos on their servers. I do have a problem with them doing this scanning locally, on-device, because there is no implementation transparency to what's being scanned. I also really really think it's easier to go from "we scan all photos tagged for iCloud upload" to "we scan all photos" on device, at the pressure of government. I have never said Apple should not scan the images on their server.

              6 votes
          2. [2]
            NaraVara
            Link Parent

            Even if the scanning functionality isn't for every image today, it's definitely one step closer.

            It’s really not though. It’s already close enough that Apple doing it isn’t making it any closer. If you want to live in a world where this stuff doesn’t happen, you should be exerting democratic pressure on the lawmakers you’re so afraid of not to do these things. Expecting private, profit-motivated corporations not to move technology forward in ways that are net benefits to their customer base, out of some vaguely articulated concern about slippery-slope externalities, isn’t going to do it.

            1 vote
            1. aditya
              Link Parent

              It’s already close enough that Apple doing it isn’t making it any closer.

              In technical terms, sure. But considering the stance Apple had taken, or projected as taking, for some time now, this is a significant change; it's them "caving", or close enough, and I do believe this is meaningful both to consumers who buy their pro-privacy stance and to less technically adept lawmakers in terms of what they can ask Apple to do.

              you should be exerting democratic pressure on the lawmakers you’re so afraid of not to do these things

              I mean, in an ideal world, sure. But for one thing, I currently reside in the US where there has been bipartisan support to backdoor encrypted communication in the recent past. And that's not a problem that's going to be fixed any time soon, looking at the political clusterfuck here over the last few years. Also, it doesn't seem to be limited to just the US either, because I think the EU is considering something similar, but I'm not sure. Further, I'm not a citizen here, so I really don't have the ability to exert any democratic pressure.

              Expecting private, profit motivated corporations to not more technology forward in ways that are net benefits to their customer base out if some vaguely articulated concern about slippery slope externalities isn’t going to do it.

              I'm not, though? I like to think I'm sufficiently cynical about how these things work. What I am doing is engaging in discussions with people who don't even see these moves as a problem. You don't have the same argument as the others I'm talking to here. Heck, if this discussion means I can convince one other person that such changes are a problem, maybe they're in a better position to exert democratic pressure lol.

              2 votes
        2. [2]
          Wes
          Link Parent

          Comparing hashes isn't difficult. Building a mechanism to compare hashes from a server and flag them for review though - that's a little more meaningful.
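
          A rough sketch of what that flag-for-review layer adds on top of a plain hash comparison (the names and threshold here are hypothetical; Apple's published design actually uses cryptographic threshold secret sharing rather than a visible counter):

```python
# Hypothetical client-side match counter with a reporting threshold.
REVIEW_THRESHOLD = 3

class MatchTracker:
    def __init__(self) -> None:
        self.match_count = 0

    def record(self, is_match: bool) -> bool:
        # Returns True once enough matches have accumulated to flag
        # the account for human review.
        if is_match:
            self.match_count += 1
        return self.match_count >= REVIEW_THRESHOLD
```

          One match alone flags nothing; only repeated matches cross the threshold, which is the "mechanism" part that goes beyond comparing two hashes.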

          4 votes
          1. NaraVara
            Link Parent

            I guess I don’t see this as particularly difficult, since this is also something most large companies do with passwords. In fact, services like Exabeam even automatically review repeated failed checks of those hashes and tag them with different flavors of potentially anomalous behavior as an anti-insider threat measure.

    2. [8]
      joplin
      Link Parent

      Is it just me or this is in extremely bad faith?

      Like they say, never attribute to malice that which could be explained by incompetence. This letter is signed by a bunch of individuals who have been duped by the media's outrage cycle. The media are definitely working in bad faith, as they understand what source checking is, and they aren't doing it with the stories they're writing.

      8 votes
      1. aditya
        Link Parent

        I'm not so sure about the media or all the signatories, but I know some of the folks quoted by reputation / have come across their work / read their papers, and they're legit. In particular, when Matthew Green, Carmela Troncoso etc. speak up, I think it's definitely important to listen. I think it's especially silly to consider these folks incompetent. They've proven themselves.

        Edit: I must add that I'm a PhD student in security, and privacy isn't my field of study but one I keep an eye on / have read papers in because of personal interest.

        7 votes
      2. [6]
        aditya
        Link Parent

        I should also add that in particular Carmela Troncoso was part of the team that developed the anonymized covid contact-tracing protocol, a version of which was used by Apple and Google (IIRC). Literally, Apple listens to people like her on these matters. It's absolutely ridiculous for anyone to call established academics who have made these things their life's work "incompetent", even if it's just an implication. The media has a lot to answer for in a lot of cases, but not here. I'd also urge you to consider what possible reason a wide range of academics could have for coming out against this feature. What's their interest, and how does that stack up against the priorities of a trillion-dollar company?

        5 votes
        1. [5]
          joplin
          Link Parent

          I don’t doubt those individuals’ credentials. However, the letter as written is false. They say:

          Apple's proposed technology works by continuously monitoring photos saved or shared on the user's iPhone, iPad, or Mac.

          This is factually incorrect. It is odd to me that an expert in the field would sign a letter that’s complaining about something which they should clearly understand isn’t happening.

          What's their interest, and how does that stack with the priorities of a trillion dollar company?

          What were these same individuals saying when Microsoft, another trillion-dollar company, implemented PhotoDNA, which does essentially the same thing, in their products? Microsoft reaches many more people than Apple does. Where were they when companies like Adobe and Xerox were adding detection of the EURion constellation to their products, literally checking every image sent through them? Did they speak out against those instances? Why this, and why now?

          Regardless, my point was that most of the signatories (at least when I wrote the above comment) were individuals who were responding to incorrect information from bad-faith media. If that doesn't describe these particular researchers, so be it. In no way did I intend to besmirch their good names.

          3 votes
          1. Wes
            Link Parent

            ... when Microsoft, another trillion dollar company, implemented PhotoDNA, which does essentially the same thing, in their products?

            It doesn't run on my computer. I have no problem with Microsoft scanning Bing or OneDrive because that is on their servers. I'd have no problem with Apple applying this to their iCloud storage either.

            But it's my device, and I should get to decide what runs on it.

            4 votes
          2. [3]
            aditya
            (edited )
            Link Parent

            This is factually incorrect. It is odd to me that an expert in the field would sign a letter that’s complaining about something which they should clearly understand isn’t happening.

            I think this is a question of semantics. iCloud Photos is on by default AFAIK. I read recently that every time you plug it into a new device, it enables some iCloud features even if you'd previously disabled them. This is an anti-pattern IMO. Next, either sync is on or it's off AFAIK. I have too many photos and not enough iCloud storage to test this, but how do you select what gets synced? For the average user, who may or may not be tech savvy (heck, something tells me most of them are not on tildes.net), that statement essentially holds. Even in your interpretation, it's only missing one conditional: "Apple's proposed technology works by continuously monitoring photos saved or shared on the user's iPhone, iPad, or Mac, flagged for upload to iCloud." Now having established that iCloud sync itself is not exactly an opt-in service, I really think the statement works.

            What were these same individuals talking about when Microsoft, another trillion-dollar company, implemented PhotoDNA, which does essentially the same thing, in their products? Microsoft reaches many more people than Apple does. Where were they when companies like Adobe and Xerox were adding detection of the EURion constellation to their products, literally checking every image sent through them? Did they speak out against those instances? Why this, and why now?

            Microsoft didn't do it on the users' machines, right? You seem to think that "Microsoft did it for photos on their cloud / as a service for anyone who wants to use it" is equivalent to "Apple is doing it for images on iCloud BUT ON USER DEVICES". This is absolutely a foot in the door to scanning on user devices, regardless of the iCloud flag being set to true or false, and that's what's got all the experts freaked out. No one seems to really have a problem with Apple scanning iCloud servers; heck, they're entitled to. And people will get caught out by the iCloud usability issue I mentioned earlier.

            However, what Apple has done is show that this can happen on-device, and doing it for photos not flagged for iCloud upload is a small step away. I'm not a software developer by day, but I do have CS degrees (Bachelor's and Master's). I have contributed code to open source projects, I'm a co-maintainer for one project, and I'm a Google Summer of Code mentor this year for that project. I really wouldn't be surprised if the change from scanning photos flagged for iCloud upload to scanning all photos is a relatively minor diff. Apple has opened the door to governments putting pressure on them to scan all photos on user devices.
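            To make the "minor diff" worry concrete, here is a hypothetical sketch of an on-device scanning pipeline gated on the iCloud flag. This is emphatically not Apple's code; every name in it (`Photo`, `perceptual_hash`, `scan`, the flag) is invented for illustration, and the fingerprint function is a trivial stand-in for something like NeuralHash. The point is only that the "iCloud uploads only" restriction could plausibly live in a single conditional:

            ```python
            # Hypothetical sketch (NOT Apple's code; all names here are invented
            # for illustration) of a scanning pipeline gated on the iCloud flag.
            from dataclasses import dataclass

            @dataclass
            class Photo:
                data: bytes
                flagged_for_icloud: bool

            def perceptual_hash(data: bytes) -> int:
                # Trivial stand-in fingerprint; a real system would use
                # something like NeuralHash here.
                return sum(data) % 2**32

            # Fingerprints of known flagged images (toy example).
            KNOWN_HASHES = {perceptual_hash(b"known-sample")}

            def scan(photos):
                matches = []
                for photo in photos:
                    if not photo.flagged_for_icloud:  # <-- deleting this one check
                        continue                      #     turns "scan iCloud uploads"
                    if perceptual_hash(photo.data) in KNOWN_HASHES:  # into "scan everything"
                        matches.append(photo)
                return matches
            ```

            If the real pipeline is shaped anything like this, widening the scan from "photos flagged for iCloud" to "all photos on the device" is a one-line change, which is exactly the foot-in-the-door concern.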

            Why this, and why now?

            Because the scale is completely different? It's happening on-device, and it hits a massive chunk of the world's smartphone users. "They're only speaking up now, so I'm suspicious of them" -- I don't think I really understand this argument; it ignores the difference that having this scanning happen on-device makes.

            Edit: I think I'm repeating myself a fair bit on this matter, on the fact that on-device is problematic not because of what it can do now, but what it can be somewhat easily turned to etc etc. I think I'm going to drop from this discussion...

            4 votes
            1. [2]
              joplin
              Link Parent
              This is not true. I have had iCloud photos off for years on several devices and it has never been turned back on. It’s certainly possible there’s a bug somewhere that could cause it to turn on...

              iCloud Photos is on by default AFAIK. I read recently that every time you plug it into a new device, it enables some iCloud features even if you'd previously disabled them. This is an anti-pattern IMO.

              This is not true. I have had iCloud photos off for years on several devices and it has never been turned back on. It’s certainly possible there’s a bug somewhere that could cause it to turn on unexpectedly in some odd edge case, as all software has weird cases like that, but it certainly isn’t happening to everyone or as a policy.

              Microsoft didn't do it on the users' machine, right? You seem to think that "Microsoft did it for photos on their cloud / as a service for anyone who wants to use it" is equivalent to "Apple is doing it for images on iCloud BUT ON USER DEVICES".

              Yeah, calculating a hash takes significantly less processing power than the computational photography that goes into every photo taken on the device. So no, I don't see it as any worse. In fact, because it's on-device, you know it's not being sent anywhere else and seen by third parties you don't know about. One of the things that Amazon, Apple, and Google all got in trouble for recently was that voice assistant recordings were being sent to third parties for "quality purposes" without users' knowledge. If it happens on my device, that problem goes away. It's more secure that way.

              This entire argument purports to be about privacy and security, and Apple has gone to great lengths to balance those things by not sending your data when you don't use the service, and not sending the data to a third party when you do use the service.

              1 vote
              1. aditya
                Link Parent
                I’m glad to hear that. There isn’t a question of a third party here. If Apple truly wanted to limit this to scanning iCloud photos, why isn’t it happening server side once they have the photo?...

                This is not true. I have had iCloud photos off for years on several devices and it has never been turned back on. It’s certainly possible there’s a bug somewhere that could cause it to turn on unexpectedly in some odd edge case, as all software has weird cases like that, but it certainly isn’t happening to everyone or as a policy.

                I’m glad to hear that.

                There isn’t a question of a third party here. If Apple truly wanted to limit this to scanning iCloud photos, why isn’t it happening server-side once they have the photo? Heck, they almost certainly do some scanning already.

                5 votes
    3. [2]
      jcdl
      Link Parent
      I agree with your take on message scanning. From Apple, clear as day: It really seems like this is only enabled on parent-controlled devices. I don't think you and me with full control of our...

      I agree with your take on message scanning. From Apple, clear as day:

      The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos.

      [...]

      Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.

      This feature is coming in an update later this year to accounts set up as families in iCloud for iOS 15, iPadOS 15, and macOS Monterey.*

      It really seems like this is only enabled on parent-controlled devices. I don't think you and me with full control of our devices would be opted into this.

      Now, obviously, this would never be effective because kids will just use literally any other app if they don't want their parents spying on their smut.

      As for the iCloud photo scanning, I'm still extremely wary. I read a comment on an early HN thread about how the CSAM database is broader in scope than just explicit media of children. Apparently whenever illegal content is seized, everything "adjacent" to it (scope unclear) is also added to the database and is subject to flagging. It seems easy for perfectly innocent stuff to get caught in the dragnet.

      5 votes
      1. joplin
        Link Parent
        That's a legitimate concern, but not something that Apple (or Microsoft, who are also known to be using the CSAM database) controls. There's definitely room for some sort of public auditing here...

        the CSAM database is broader in scope than just explicit media of children. Apparently whenever illegal content is seized, everything "adjacent" to it (scope unclear) is also added to the database and is subject to flagging. It seems easy for perfectly innocent stuff to get caught in the dragnet.

        That's a legitimate concern, but not something that Apple (or Microsoft, who are also known to be using the CSAM database) controls. There's definitely room for some sort of public auditing here (by which I mean government run, perhaps, or audited by a hired firm, not that all images would be available to the public). At some point we either have to check all the images ourselves (no thank you!) or trust someone to do it for us. I kind of wish Apple controlled it because right now I trust them more than any other company and more than many parts of our government (which is sad, but here we are).

        3 votes
    4. NaraVara
      Link Parent
      There also seems to be a fundamental misapprehension of what "scanning your photos" means. What it's doing is checking a hash of your photo against hashes of images in the CSAM database. So...

      That's literally false - the fact that it only applies to iCloud Photos is kind've important. In case you didn't know, anything you upload to OneDrive also gets scanned by Microsoft's PhotoDNA, which does essentially the same thing.

      There also seems to be a fundamental misapprehension of what "scanning your photos" means. What it's doing is checking a hash of your photo against hashes of images in the CSAM database. So practically speaking, nobody is actually peeping on your photos and fears that it will flag you as a pedo for sharing pics of your kids in the bath are unfounded. It's also difficult to imagine this being expanded to some of the dystopian use cases people are asserting, like checking facial recognition data against databases of dissidents. Again, it's checking hashes of pictures against each other so they basically have to be very similar pictures in order to match. It's probably not easily able to flag and identify individual components of the pictures, like peoples' faces.

       The database is a specific repository of images known to NCMEC (the National Center for Missing & Exploited Children) and the FBI as being in active circulation among child pornographers. Unless the pictures you are sharing are in those collections of images (in which case you would probably want to know, wouldn't you?), nothing will be flagged except as a false positive. From what I understand, these people usually swap images as parts of large, bulk downloads on the dark web, and law enforcement only manages to catch them when they slip up and combine their offending content with their normal content in a channel that scans for it.
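       To illustrate the matching technique being described, here is a toy perceptual "difference hash" in Python. It is emphatically not NeuralHash or PhotoDNA (both are far more sophisticated and robust to transformations), but it shows the basic idea: fingerprint the image, then compare fingerprints against a list of known hashes rather than looking at the picture itself:

       ```python
       # Toy "difference hash": one bit per adjacent pixel pair, set when
       # brightness increases left-to-right. Real systems (NeuralHash, PhotoDNA)
       # are far more robust; this only illustrates match-by-fingerprint.

       def dhash(pixels):
           """pixels: rows of grayscale values (each row one longer than the hash width)."""
           return [1 if left < right else 0
                   for row in pixels
                   for left, right in zip(row, row[1:])]

       def hamming(a, b):
           """Number of differing bits between two equal-length hashes."""
           return sum(x != y for x, y in zip(a, b))

       def matches(image, known_hashes, threshold=2):
           """True if the image's hash is within `threshold` bits of any known hash."""
           h = dhash(image)
           return any(hamming(h, known) <= threshold for known in known_hashes)

       # A slightly brightened copy of an image hashes identically, so it still
       # matches the known hash, while an unrelated image does not.
       original   = [[10, 20, 30], [30, 20, 10]]
       brightened = [[15, 25, 35], [35, 25, 15]]
       different  = [[30, 20, 10], [10, 20, 30]]
       ```

       Note the flip side of this design: only near-duplicates of already-known images can match, which is why it can neither "see" novel photos nor recognize faces, but also why the contents of the hash list matter so much.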

      3 votes
  2. post_below
    Link
    Most of the comments here seem to fall into two camps... Camp one: Apple isn't doing anything here that carries a significant privacy concern, or indeed much more than they, and other companies,...

    Most of the comments here seem to fall into two camps...

    • Camp one: Apple isn't doing anything here that carries a significant privacy concern, or indeed much more than they, and other companies, were already doing. So what's the big deal all of a sudden? It seems alarmist.

    I think this is entirely correct. I too am frustrated when people get the tech wrong.

    • Camp two: Slippery slope.

    I think this is entirely correct too.

    Why the outrage now? Because it dropped into the media cycle at the right time.

    I don't think the fact that Apple isn't doing anything particularly egregious right now should end the conversation.

    Without conversation these slips become normalized, which makes the next slip down the slope easier.

    Where privacy is concerned, and where giant multinational companies are concerned, I think the slippery slope argument is often valid. We've seen the slide into moral grey area, and farther, happen over and over, with countless companies and governments. It's consistent enough to safely call it an inevitability.

     So questions like "what next step might this facilitate?" and "what would that potentially allow people or governments acting in bad faith to do?" are important.

    In addition to privacy, there are a variety of other issues at play. For example, that other slope we've been sliding down for some time now... the one where the concept of truly owning a computing device (arguably the most important object we own in the modern world) is evaporating as we lose, by degrees, the ability to choose what our devices do and don't do.

     For the tech savvy, there are alternatives, but the convenience barriers put them effectively out of reach for the average user. They could theoretically get a Google-, Apple-, or Microsoft-free device, but it won't actually happen. That makes any major decision by OS makers important.

    I'm less concerned about the merits of calling out this particular move by Apple, and more happy to see the conversation happening.

    5 votes
  3. [2]
    skybrian
    Link
    One thing I'm wondering about is the distinction between the OS and apps. Apple talks about this as a feature of the OS, and that it's coming in future OS updates, but then talks about what the...

    One thing I'm wondering about is the distinction between the OS and apps. Apple talks about this as a feature of the OS, and that it's coming in future OS updates, but then talks about what the Messages app and Photos app will do.

    Reading between the lines, perhaps they have a private API that these two apps call into?

    Apple hasn't done themselves any favors by talking about this as part of the OS since it makes it sound like a pervasive, unavoidable feature, versus another service that apps can use.

    I'm wondering if any other apps might start using it.

    4 votes
    1. p4t44
      Link Parent
       System apps update with the OS on iOS; I have seen Apple use similar wording before for other unrelated updates to system apps.

      Apple talks about this as a feature of the OS, and that it's coming in future OS updates, but then talks about what the Messages app and Photos app will do.

       System apps update with the OS on iOS; I have seen Apple use similar wording before for other unrelated updates to system apps.

      4 votes
  4. MetArtScroll
    Link
    In the case of iCloud image scanning, I see absolutely no point in doing it on-device. Even the best iPhones have at most a few TB memory, and their computation power is still quite limited,...

     In the case of iCloud image scanning, I see absolutely no point in doing it on-device. Even the best iPhones have at most a few TB of storage, and their computational power is still quite limited, whereas the iCloud servers definitely have access to much more efficient computation, and I expect the storage they have is in the exabyte range.

    So what's the point of doing it on-device?

    3 votes