iPhone Cellphones Privacy Apple

Should Apple Share iPhone X Face Data With App Developers? (washingtonpost.com) 66

The Washington Post ran a technology column asking what happens "when the face-mapping tech that powers the iPhone X's cutesy 'Animoji' starts being used for creepier purposes." It's not just that the iPhone X scans 30,000 points on your face to make a 3D model. Though Apple stores that data securely on the phone, instead of sending it to its servers over the Internet, "Apple just started sharing your face with lots of apps." Although their columnist praises Apple's own commitment to privacy, "I also think Apple rushed into sharing face maps with app makers that may not share its commitment, and it isn't being paranoid enough about the minefield it just entered." "I think we should be quite worried," said Jay Stanley, a senior policy analyst at the American Civil Liberties Union. "The chances we are going to see mischief around facial data is pretty high -- if not today, then soon -- if not on Apple then on Android." Apple's face tech sets some good precedents -- and some bad ones... Less noticed was how the iPhone lets other apps now tap into two eerie views from the so-called TrueDepth camera. There's a wireframe representation of your face and a live read-out of 52 unique micro-movements in your eyelids, mouth and other features. Apps can store that data on their own computers.

To see for yourself, use an iPhone X to download an app called MeasureKit. It exposes the face data Apple makes available. The app's maker, Rinat Khanov, tells me he's already planning to add a feature that lets you export a model of your face so you can 3D print a mini-me. "Holy cow, why is this data available to any developer that just agrees to a bunch of contracts?" said Fatemeh Khatibloo, an analyst at Forrester Research.

"From years of covering tech, I've learned this much," the article concludes. "Given the opportunity to be creepy, someone will take it."
  • No (Score:4, Insightful)

    by Black.Shuck ( 704538 ) on Monday December 04, 2017 @07:02AM (#55671397)

    Users should be asked if they want to share their data with an App.

    Like every other permission Apple has implemented.

    • default should be no sharing.
      then users should be given an option on sharing, and on which data.

      and to be really fair, if apple/others-using-them are making money out of that data, users should get a share of that money.

      to be perhaps impractically fair, apple should recognize data about third parties in the data (such as someone else in an image), and at least inform the user about the facts and consequences of who has a legal right to that data, on a case-by-case basis.

      • >default should be no sharing.

        Default should be no sharing, and apps that crash or won't launch without those permissions should be banned from the store unless the permissions are vital to the primary functionality of the app.

        • by Dog-Cow ( 21281 )

          So... like every other permission that Apple forces apps to obtain? You don't really know what an iPhone is, do you? It's just this thing you've heard people talk about.

          • I'm in the process of switching back to Android, actually, since I hate the stupid iPhone I have.

            I was speaking in the general case. And yes, I'm aware I'll run into the problems I mentioned more frequently (unnecessary permissions aren't exactly unheard of with iPhone apps) once I'm back to Android, but it's worth it to have control of my device.

            In short, stick it where the sun don't shine, Apple fanboi.

    • Re:No (Score:5, Insightful)

      by ctilsie242 ( 4841247 ) on Monday December 04, 2017 @09:09AM (#55671713)

      Even better... how about just no, period. Apple doesn't share the metrics from the fingerprint data with app developers, so data captured by the FaceID authentication mechanism shouldn't be shared either. The FaceID data has zero relevance to apps, because it is specific to the iPhone (dot placement, etc.), and if an app wants a picture of someone, it can do what all apps have done for ages... ask for a selfie from the front or rear camera.

      In fact, the FaceID data should never leave the Secure Enclave, much less the device.

      • It's entirely possible that the data set available to app developers is not as complete as what the FaceID system uses for authentication. After all, it does not make much sense to use super-high-resolution mapping for emoji nonsense, whereas you would definitely want it for authentication purposes.

        At any rate, it's also plausible that the communication between the FaceID cameras and the Secure Enclave is secured with TPM-style hardware signing, as that's what they do with TouchID, so injecting an

        • by swb ( 14022 )

          I think the risk isn't access to authentication (although the existential nature of that risk never goes away), but that any topographical mapping of your face good enough to do anything clever with is a level of biometric detail that could also be abused.

          Facial recognition has gotten pretty good with just 2D photos. Look at what Facebook can do with a well-hinted database of photos. Do we really want high resolution topographic facial maps out there?

          I think the big problem with technology and privacy is

      • I completely disagree. The sensor is essentially a miniature lidar array with a range of around 5 feet. The potential to change how we use our phones could be revolutionary: scan favorite objects and have them appear in virtual-reality environments like games, or send them to your 3D printer. It goes far beyond turning your face into a talking poop [theverge.com]. Just implement it as an opt-in like everything else.
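
        For what it's worth, the "miniature lidar" framing roughly matches what AVFoundation exposes: apps can stream per-pixel depth frames from the TrueDepth camera. A minimal sketch, assuming a TrueDepth device and camera permission (error handling and canAdd checks omitted):

```swift
import AVFoundation

// Sketch: receive depth frames from the front-facing TrueDepth camera.
final class DepthReceiver: NSObject, AVCaptureDepthDataOutputDelegate {
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        let map = depthData.depthDataMap
        print("depth frame: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
    }
}

let session = AVCaptureSession()
let receiver = DepthReceiver()
if let device = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front),
   let input = try? AVCaptureDeviceInput(device: device) {
    session.addInput(input)
    let output = AVCaptureDepthDataOutput()
    session.addOutput(output)
    output.setDelegate(receiver, callbackQueue: DispatchQueue(label: "depth"))
    session.startRunning()
}
```
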
      • Re:No (Score:4, Interesting)

        by Anubis IV ( 1279820 ) on Monday December 04, 2017 @12:20PM (#55673085)

        Apple isn't and hasn't been sharing FaceID data. Your facial "fingerprint" is not being shared. No data from the Secure Enclave is being shared. And apps do have to ask for and be granted permission before they have access to any of the new APIs.

        This whole thing is being poorly reported by the media to make it sound like something other than it is. It is a cause for concern, to be sure, and certainly something that users should be aware of, but it's not nearly what they're making it out to be.

        So what's actually happening? Well, while iOS is sharing facial data, it is NOT sharing FaceID data. iOS can now recognize one of about 50 different facial expressions and report them back to an app in realtime via new APIs, allowing the app (after it's received permission) to understand your facial expressions. And in the same way that the recently added AR APIs allow iOS to provide the shape of nearby objects to apps so that they can map virtual items into 3D space, apps can now use those APIs to map items onto your face. The example they gave was applying a silly mask onto the user's face in realtime, which is a fun thing for the kids to do, I guess?

        However, based on the images I've seen of the raw points apps have access to, they're not getting anything even CLOSE to the full-resolution scan of 30K IR points, which makes sense, since (as you said) there really isn't a need for them to have anything of the sort. Rather, they're getting a significantly lower-resolution 3D mesh of your face that's sufficient for their needs without being good enough to create their own "fingerprint" that could be used to produce a facsimile of your face. And, of course, at no point is the actual "fingerprint" of your face that the iPhone produces for FaceID ever handed over to apps. It remains locked within the Secure Enclave.
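
        To illustrate, the lower-resolution mesh described above is what ARKit exposes as ARFaceGeometry; a minimal sketch of inspecting it (real ARKit types, hypothetical logging):

```swift
import ARKit

// Sketch: the mesh apps receive is ARFaceGeometry -- on the order of a
// thousand vertices, far coarser than the IR dot pattern FaceID uses internally.
func inspect(_ faceAnchor: ARFaceAnchor) {
    let geometry = faceAnchor.geometry
    print("vertices:  \(geometry.vertexCount)")
    print("triangles: \(geometry.triangleCount)")
    // Individual vertex positions, in the face anchor's local coordinate space.
    if let first = geometry.vertices.first {
        print("first vertex: \(first)")
    }
}
```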

        As for permissions, right now apps are receiving them when they ask for access to the camera. To me, that's the biggest issue at play, since it's not immediately apparent to users what's happening, but the sensors are all part of that same suite, so I can see why Apple may have grouped them like that. That said, with this much public concern regarding the sharing of facial data, I wouldn't be surprised if Apple makes the 3D sensors require a separate permission starting in an upcoming dot-release of iOS.
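
        As things stand, that gate is the ordinary camera permission; a minimal sketch of the request (standard AVFoundation call, hypothetical handler):

```swift
import AVFoundation

// Sketch: today there is no separate "face data" prompt -- access rides on
// the regular camera permission.
AVCaptureDevice.requestAccess(for: .video) { granted in
    if granted {
        // The app may now run an ARKit face-tracking session and, by
        // extension, receive the expression and mesh data described above.
        print("camera (and face-tracking) access granted")
    } else {
        print("camera access denied")
    }
}
```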

    • I'm voting for No unless you manually change it to Yes... and not by some popup.

  • This reminds me of an earlier discussion about Apple's AR initiative.

    Let's say IKEA creates an app that allows you to place virtual furniture in your living room.

    Doesn't that mean that IKEA now has access to data about my living room?
    • Yes, it does. The real question is whether you, as an end user, care about it.

    • Re:AR (Score:4, Insightful)

      by BlacKSacrificE ( 1089327 ) on Monday December 04, 2017 @07:33AM (#55671465)
      Moot point. You can change the configuration of your living room. You cannot change the configuration of your face. And the layout of your living room is not being used as an access method to your digital life. Flippant disregard of the deeper consequences such as yours is the reason people don't care, and manufacturers know it.
      • Re:AR (Score:4, Funny)

        by Black.Shuck ( 704538 ) on Monday December 04, 2017 @07:41AM (#55671475)

        You cannot change the configuration of your face.

        Travolta and Cage would beg to differ.

      • Your room and its contents say a lot about you. This data can be used by data brokers to update thousands of reputation scores about you. Deep-learning algorithms could seek correlations with (mental) health, poverty, ambition, etc. by comparing your room to those of others whom they know more about.

        It doesn't matter that these are spurious correlations, or that they are wrong a lot of the time. As long as it allows some risk to be managed, their clients will happily pay for these 'opinions' about yo
      • by AmiMoJo ( 196126 )

        Face ID is a reasonable security measure for many people. People are basically lazy and their main adversaries are petty thieves and nosy friends/co-workers. The hierarchy of security levels is something like:

        0 No lock at all
        1 Fixed swipe pattern
        2 PIN
        3 PIN with randomized keypad
        4 Face ID
        5 Fingerprint
        6 Very strong password

        1 is enough to stop a lot of people. 3 is enough to stop most law enforcement. 4 and 5 depend on the implementation, but based on the backlog of phones waiting to be unlocked at least 5 is

      • You walk around all day proudly displaying your face for all to see or record. The real problem is wannabe security idiots using biometrics for authentication instead of just a fancy user name. It should be illegal to use biometrics as the sole means of verifying user identity.
    • If they do an accurate measurement instead of just taking your word, yes, yes they do.

  • Security Risks (Score:3, Interesting)

    by ytene ( 4376651 ) on Monday December 04, 2017 @07:57AM (#55671513)
    There are two critical problems here...

    The first is that it is a lot harder for you to change your face than it is to change a password. Like any truly effective biometric, it is tied to you, permanently. So the moment someone comes up with the means to defeat a biometric-based authentication scheme, the entire scheme is effectively useless, not just a single implementation for a single user. [ I concede the point that security through obscurity is no security at all - in other words if your biometric facial recognition system is vulnerable if the back-end data leaks, then it's not really secure ].

    The second is that it would make it an order of magnitude easier for a despotic government to obtain that data and then use it to track citizens. Except, of course, it would now be possible to make an explicit connection between a face and a smartphone - which means in theory it would also be possible to detect when smartphones are being shared among small groups of people.

    But perhaps the most compelling argument would be to categorize the data being collected as being part of your medical record. It relates to your personal physiology, after all - and is unique to you. Would it be acceptable for your doctor [or a company you deal with] to take part of your medical record and simply share it or sell it if they wanted to? Without your knowledge or consent?

    This is a disturbing development from a company that has recently made a big play for being a champion of personal privacy. Question is: is this an overlooked mistake that will be corrected, or in fact Apple's true colours?
    • The first is that it is a lot harder for you to change your face than it is to change a password.

      That's why biometric data should only be used as a user ID, not a password. So far, there are very few devices that do this at all.
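
      A minimal sketch of that separation of duties (all names hypothetical): the face match only selects which account is being claimed, and a secret still does the authenticating:

```swift
// Hypothetical sketch: biometrics as identification, not authentication.
struct Account {
    let name: String
    let storedSecret: String   // a real system would store a salted hash, not the secret itself
}

let accounts = ["alice": Account(name: "alice", storedSecret: "correct horse battery staple")]

func login(faceMatchUserID: String?, password: String) -> Bool {
    // Step 1: the biometric match narrows the claim to one candidate account (a user ID).
    guard let id = faceMatchUserID, let account = accounts[id] else { return false }
    // Step 2: authentication still hinges on something only the user knows.
    return password == account.storedSecret
}
```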

      • Right. So biometric data should only be freely available to be used for rigorous fool-proof tracking. It shouldn't be used to authenticate. Somebody could sneak on your phone and ruin your Angry Birds score!

        • Most biometric data already is (freely available to anyone within observation distance) - that's the whole reason for the problem with using it as a password in the first place.

  • by Gravis Zero ( 934156 ) on Monday December 04, 2017 @08:19AM (#55671565)

    How else will fools* learn to avoid malicious technologies? Also, if they don't learn, well, they've earned all the wonderful things coming to them as a result.

    * Please note that there is a large difference between a foolish person and a stupid person.

  • About the only use case I could see is where an app was always locked, and could be unlocked by querying the operating system to check Face ID. This might be useful. My phone may be unlocked because I'm watching a video or showing someone a picture. If someone swipes my phone while it's unlocked, it's pretty trivial for them to keep it unlocked. But certain apps with sensitive data on them could always be required to show facial ID to open or switch to the app. However, there wouldn't be any actual
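
    The per-app lock described above doesn't need any raw face data; it can be built on the LocalAuthentication framework, which hands the app only a pass/fail result. A minimal sketch (real API, hypothetical call site):

```swift
import LocalAuthentication

// Sketch: gate a sensitive screen behind Face ID / Touch ID. The app only
// learns "matched or not" -- no face geometry is ever handed over.
func unlockSensitiveScreen(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        completion(false)   // biometrics unavailable; fall back to a passcode flow
        return
    }
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your documents") { success, _ in
        completion(success)
    }
}
```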

    • Using the animoji as an example, there are a number of interesting ways for a user to interact with an app via the Face ID dot projector.

      My gut feeling, though, is that the data is so limited that this generation is nothing to get your knickers in a twist about.

      What I would love to see is facial-recognition scoring from the FaceID system. So far, I am a little disappointed at the simple things it can't do... like track attention while in landscape orientation.

  • "So who built us?"

    "The humans did. Well they built the machines who built the machines who built us after the war"

    "The war between our predecessor and the humans?"

    "Yeah"

    "How did our predecessor get weapons?"

    "The humans built them, and put them under the control of Skynet 1.0"

    "They built enough weapons to destroy humanity and handed control over to Skynet"

    "Yeah"

    "Why would they do that?"

    "The humans weren't united. They fought amongst themselves. Skynet was to help them fight"

    "So Skynet won?"

    "For a while. Then

  • Only share the data with the FBI and the Russians. They get it anyway, might as well make it easier.
  • Apple should never share any data whatsoever with developers. This simply removes privacy and overall security from a person's life. https://www.identitypi.com/ [identitypi.com]
  • Slashdot is getting as untruthful as Trump's Tweets.

    What they have an API for is the LOW RESOLUTION mo-cap data that is updated in real-time; NOT the "30,000 Points of Light" data that is used for FaceID.

    This is the same data that is used to drive the Animoji "expressions", and apparently to breathe more "life" into certain gaming avatars.

    As far as being able to determine stuff like gender, which is already much more obtainable through a gazillion sources, and sexuality (gimme a break!), that is simply a big nothing
