
Tim Cook: Privacy Is Worth Protecting (washingtonpost.com)

An anonymous reader shares a report from InformationWeek: In a wide-ranging interview with The Washington Post, Apple's CEO Tim Cook talks iPhones, AI, privacy, civil rights, missteps, China, taxes, and Steve Jobs -- all without addressing rumors about the company's Project Titan electric car. One of Cook's biggest concerns is user privacy. Earlier this year, Apple was in the news for refusing a request from the U.S. Department of Justice to unlock a suspected terrorist's iPhone, arguing that doing so would affect millions of other iPhones, that the demand was unconstitutional, and that it would weaken security for everyone. Cook told the Washington Post: "The lightbulb went off, and it became clear what was right: Could we create a tool to unlock the phone? After a few days, we had determined yes, we could. Then the question was, ethically, should we? We thought, you know, that depends on whether we could contain it or not. Other people were involved in this, too -- deep security experts and so forth, and it was apparent from those discussions that we couldn't be assured. The risk of what happens if it got out, could be incredibly terrible for public safety." Cook suggests that customers rely on companies like Apple to set up privacy and security protections for them. "In this case, it was unbelievably uncomfortable and not something that we wished for, wanted -- we didn't even think it was right. Honestly? I was shocked that [the FBI] would even ask for this," explained Cook. "That was the thing that was so disappointing that I think everybody lost. There are 200-plus other countries in the world. Zero of them had ever asked [Apple to do] this." Privacy is a right to be protected, Cook believes: "In my point of view, [privacy] is a civil liberty that our Founding Fathers thought of a long time ago and concluded it was an essential part of what it was to be an American. Sort of on the level, if you will, with freedom of speech, freedom of the press."

  • True (Score:2, Insightful)

    by Anonymous Coward

    Unfortunately, the fact that privacy is worth a lot is why so many people are trying to sell our privacy to the highest bidders.

  • I agree with our Founding Fathers, but they also didn't know what a germ was. Anyways... without privacy there is no security. I rarely agree with Mr. Cook, but I do agree with him on this. Realistically, no entity would be able to contain a tool of this caliber. I reckon that within days, maybe hours, of Apple releasing a tool to assist the Feds, the tool would end up being publicly distributed and Apple would have to make a new one... and another one... ad infinum (my Latin is kinda bad, I th
    • by Anonymous Coward

      Agreed. That's why we need to start from scratch, starting with a new and modern constitution. I've started it below:

      We the [REDACTED] of the [REDACTED], in Order to form a more perfect [REDACTED], [DELETED], [DELETED], provide for the common defense of Social Justice, [DELETED], and secure the [REDACTED] of [REDACTED] to [REDACTED] and [REDACTED], do ordain and establish this [REDACTED] for [REDACTED].

      • by Anonymous Coward

        We the [APPS] of the [APPS], in Order to form a more perfect [APPS], [COWS], [COWS], provide for the common defense of Social Justice, [COWS], and secure the [APPS] of [APPS] to [APPS] and [APPS], do ordain and establish this [APPS] for [APPS].

    • The Founding Fathers would be appalled to see how the use and abuse of personal information is completely subverting their Bill of Rights. You have no protection of anything if all of your personal information is already outside of your control. If Cook was sincere, then he would at least offer a business model that would profit by protecting privacy (even if it were optional). For example:

      Create a privacy protecting intermediary (PPI) that would be motivated to gather and protect ALL of your personal infor

      • Maybe some of the things, but this particular incident...nope.

        There was a warrant involved and the owners of the phone were dead, so they had no right to privacy; there was no constitutional or privacy issue.

        If they were afraid of the tool getting out, why not unlock the phone in their facility with a tool designed to unlock ONLY that phone?

        This was Apple making a stand to raise their sales numbers; that is all it was.

        Also, them taking this stand did not stop the tool from being created and used, and guess what

        • by shanen ( 462549 )

          Your reply was evidently intended for the comment above mine. No relevance to anything that I wrote.

          However, you do sound amazingly naive. May I recommend you consider reading Data and Goliath by Bruce Schneier, Future Crimes by Marc Goodman, Geeks by Jon Katz, The Facebook Effect by David Kirkpatrick, and The Filter Bubble by Eli Pariser?

          There was a warrant involved and the owners of the phone were dead, so they had no right to privacy; there was no constitutional or privacy issue.

          And Apple was an uninvolved third party conscripted against its will to perform duties against its conscience. Would you be so eager to be conscripted to work on a project for an arbitrary government agency just because you knew how to do the job? Particularly if you thought the project might affect the reputation or long term profitability of your business?

          This w

  • Could we create a tool to unlock the phone? After a few days, we had determined yes, we could.

    Now there's your problem. You should not be *able* to unlock it by any known means, and that should be enforced by both the software and the hardware design. Design a phone that you *cannot* open even upon request and you've solved the problem in the best possible way.
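
    A minimal sketch of that idea, assuming a hypothetical design in which the disk key is derived from the user's passcode entangled with a per-device secret fused into the hardware (the names and parameters here are illustrative, not any vendor's actual scheme). Because the device secret never leaves the chip, the derivation can only run on the phone itself, so there is nothing useful for the manufacturer to hand over:

        # Hypothetical sketch: passcode-entangled key derivation.
        # device_uid stands in for a per-device secret fused into silicon;
        # software can use it for derivation but never read it out.
        import hashlib
        import os

        def derive_disk_key(passcode: str, device_uid: bytes) -> bytes:
            # Slow KDF: the iteration count is tuned so that every guess
            # costs noticeable time even on the device's own hardware.
            return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                                       device_uid, 200_000, dklen=32)

        device_uid = os.urandom(32)  # stand-in for the fused hardware secret
        key = derive_disk_key("correct horse battery staple", device_uid)

    With a design like that, a vendor asked to "unlock the phone" cannot compute the key off-device; the best anyone can do is guess passcodes on the device itself, which is exactly the attack a strong passphrase defeats.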

    • by dgatwood ( 11270 ) on Monday August 15, 2016 @04:58PM (#52707403) Homepage Journal

      To be fair, the iPhone in question lacked the secure enclave. The techniques to crack into it would not work with newer hardware. It is still an open question whether other techniques could compromise current hardware—though to be fair, that is always the case with new technology up until the point when somebody comes up with a way to break it, so I guess that isn't really saying anything. :-)

      • by allo ( 1728082 )

        To be fair: A normal computer doesn't have this at all and a strong passphrase protects it just fine.

        • by allo ( 1728082 )

          * disk encryption passphrase.

        • by dgatwood ( 11270 )

          To be fair: A normal computer doesn't have this at all and a strong passphrase protects it just fine.

          A strong passcode protects an iPhone just fine, too, AFAIK. A four-digit numerical passcode does not, and would not protect a computer, either. If anything, it would protect a typical computer far less, because it is far easier to interpose a disk emulator (passing reads through, storing writes to a separate device) on the SATA bus between a computer and its drive than between a CPU and flash parts that ar

          • by allo ( 1728082 )

            Yeah, but that's the point. Of course a secure enclave does not hurt. But Kerckhoffs's principle says that if your scheme isn't secure when everything but the password is known, it's not secure at all. So use a damn passphrase which is secure and you do not need to worry about hardware implementations. With a fingerprint sensor, the iPhone has everything that is needed for convenience AND security with a long passphrase. Otherwise you can use an Android phone with SnooperStopper to have different passcodes for the
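
            A rough back-of-the-envelope comparison of that point, assuming each guess has to run through an on-device key derivation costing about 80 ms (an assumed figure, roughly in line with what Apple has published for its passcode derivation) and ignoring any escalating lockout delays:

                # Worst-case brute-force times, assuming ~80 ms per
                # on-device guess and no lockout delays (both are
                # assumptions, not measured numbers).
                PER_GUESS = 0.08  # seconds

                def days(keyspace: int) -> float:
                    return keyspace * PER_GUESS / 86400

                print("4-digit PIN:   ", days(10 ** 4), "days")    # ~13 minutes
                print("6-digit PIN:   ", days(10 ** 6), "days")    # ~22 hours
                print("4 random words:", days(7776 ** 4), "days")  # Diceware-style
                print("10-char alnum: ", days(62 ** 10), "days")

            The gap is the whole argument: a short numeric PIN survives only because the hardware refuses to let anyone guess quickly, while a real passphrase stays safe even if every hardware protection fails.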

    • You fools (Score:2, Insightful)

      by Anonymous Coward

      You fools. Apple's security and privacy are to protect the walled garden. They keep "your data" private to prevent their competition from monetizing you. They keep "your phone" secure to protect the walled garden. There is not an ounce of concern about your dignity or rights; this is 100% about greedily protecting their revenue stream.

    • by MikeMo ( 521697 )
      You'll recall that the phone in question was an older model with far less security than the phones they sell today. The particular tool being requested was essentially a new version of the firmware that would ignore the failed unlock attempt counter, installed via a maintenance path. It is said they are working to remove that, too.
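
      For illustration, a minimal sketch of the kind of failed-attempt policy such firmware enforces and the requested build would have had to ignore; the delay schedule and the 10-attempt wipe below are assumptions modeled loosely on iOS's documented behavior, not Apple's actual code:

          # Hypothetical retry guard: escalating delays after repeated
          # passcode failures, and key erasure after too many attempts.
          import time

          DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}  # seconds, illustrative

          class RetryGuard:
              def __init__(self, wipe_after: int = 10):
                  self.failures = 0
                  self.wipe_after = wipe_after
                  self.wiped = False

              def record_failure(self) -> None:
                  self.failures += 1
                  if self.failures >= self.wipe_after:
                      # A real device would erase the key material protecting
                      # the data partition, making the data unrecoverable.
                      self.wiped = True
                      return
                  time.sleep(DELAYS.get(self.failures, 0))

      Firmware that simply never calls record_failure (or always resets the counter) is all an attacker needs to brute-force a short numeric passcode at full speed, in minutes to hours instead of being stopped by the wipe.
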
    • by gweihir ( 88907 )

      It was an older model with known flaws. They never said anything about the newer ones.

      • That doesn't really change anything, and the discussion isn't about newer phones. It's about what they knew and what they did. What they knew was that there was a problem with the design, and what they did was greenlight it.

        • by gweihir ( 88907 )

          And you know that how? The fact is that they know today that there was a problem with a design that went into production several years ago. We do not know (and in fact have no indication) that they knew back then, when the decision to go into production was made.

          • We do not know (and in fact have no indication) that they knew back then, when the decision to go into production was made.

            Umm, what? The iPhone 5S and 5C were both released on Sep 20, 2013. One had the Secure Enclave and the other didn't.

            I think that's pretty much "they knew".

            • by gweihir ( 88907 )

              Your thinking is flawed. Less security does not mean "breakable", just the same as more security does not mean "unbreakable".

              • No, the phone is flawed. :-)

                Your argument is correct, but the question was whether they knew it was possible to open the phone model in question upon request. They most definitely knew it was possible, because the Secure Enclave in the 5S defeats this particular design flaw. They would not have designed something like the Secure Enclave if they did not know what it was there for.

  • by Chas ( 5144 ) on Monday August 15, 2016 @05:01PM (#52707419) Homepage Journal

    I don't care for Cook personally, or Apple, or the entire Apple-sphere.

    But this is one thing he and I have a meeting of the minds on.

    My privacy is valuable, which is why I'm so parsimonious about doling out pieces of it. Why the hell should I have to submit five forms of identification, provide blood, sperm, and stool samples, open up my financial data back to the date of my birth, get a hundred and thirteen character witnesses, etc., etc., just to participate online?

    Fuck that noise. I'd rather shiver in a cave in the woods.

    On top of that, my privacy also protects me from theft of my identity and, theoretically, also provides protection against illegal behavior by bad actors with government credentials. Hence, it guards my freedom.

    And don't tell me it never happens. It does.

    If you have zero use for your freedoms, rights and liberties, by all means. Go ahead and shotgun all your data to the Internet.

    But the second you (or anyone (and I mean ANYONE) else) demands that I do the same, you're going to be met with a giant "fuck you" and a fist in the face.

  • by Zombie Ryushu ( 803103 ) on Monday August 15, 2016 @05:02PM (#52707423)

    We don't protect ourselves by destroying freedom. The FBI knew there was nothing on that phone. They wanted to set a precedent so they could unlock everyone's phone. These invasive privacy efforts do nothing to protect private citizens from terrorist attacks. They exist to create an atmosphere of fear, social control, and paranoia in our own society.

    If we really wanted to stop Sunni terrorist organizations, we would be relentlessly trying to level the places where they are headquartered, like Raqqa.

    • From my understanding, this guy is correct. The rumours go that the shooter deliberately smashed his personal phone to pieces and left his WORK iPhone in his drawer/house or something.

      I imagine law enforcement would want to check the thing but it was always likely to have very little on it.

    • If we really wanted to stop Sunni terrorist organizations, we would be relentlessly trying to level the places where they are headquartered, like Raqqa.

      To what end? Every bomb we drop that happens to harm an innocent person is egg on our face in other countries' eyes. It's a deadly game of whack-a-mole that really doesn't have an end.

      Hate breeds more hate. Sunnis and Shiites will never be peaceful toward one another, and neither will truly accept Western civilization (e.g., the US and UK) as long as we keep going in and ham-handedly killing women and children in the name of peace. Ever wonder what sparked terrorism and revenge on the West? Do you think maybe

  • by supernova87a ( 532540 ) <kepler1@NoSpaM.hotmail.com> on Monday August 15, 2016 @05:25PM (#52707563)
    Apple certainly has no shortage of issues to criticize them on. But on the issue of privacy and making the iPhone backdoor-able, at least they were smart enough to know what they could not know and could not control, and to want no part of it.

    And what they were smart enough to know is that no government authority, no matter how secure and authoritative it claims to be, can control all of its own people and the hundreds of places that a backdoor capability might leak or be used improperly. The FBI cannot even control leaks and incompetence within their own ranks -- what's the likelihood that a capability so valuable would remain unleaked and well-protected in their hands, even with many checks?

    So I applaud Apple for at least knowing that it should not develop such a capability and instead leave it in the hands of users to choose when to make things private, out of even Apple's reach.

    There have always been secrets, and people trying to foil the methods of hiding them. Time for the government to do a bit more legwork for the next move.
  • ..that morning, Cook had stood in front of employees at Apple headquarters and held up the phone, which a staffer had hand-delivered from a store in Beijing to commemorate a notable occasion: Apple had sold its billionth iPhone.

    Wait, did Tim Cook jack someone's iPhone just because it was the billionth? I can only imagine a scene similar to Willy Wonka and the Chocolate Factory.

  • Tim, I would like more control over my iPhone so I could ensure my privacy myself.

    a few quick examples:

    - Can I use my apple phone without apple knowing who I am?
    - Can I block some apps from internet access at all times (not just over cellular)?
    - Can I create/adjust my own content blockers?
    - Can I have a firewall, bidirectional? Please?

    • also:

      -Can I turn off the software update nagging?
      -Can I play all songs by an artist with one tap (like I used to)?
      -Can I permanently shut off the confusing Time Travel for Watch (what is that, anyway)?

      Otherwise, kudos.
  • The only reason they ever bothered is that some people of means were hurt by the lack of it. They don't care about ordinary people or what happens to them.

  • by Anonymous Coward

    He talks privacy yet builds huge cloud analysis data centers.
    He really needs to step down.
    No new products in years. What a failure.

  • by gweihir ( 88907 ) on Monday August 15, 2016 @09:41PM (#52708961)

    In a free society, people must be able to experiment with ideas and thoughts. Some of these thoughts and ideas will, by the very nature of the process, be, to put it mildly, problematic. Others will threaten holders of power. Hence, in order not to have to self-censor, people must have privacy in the spaces they use to evolve their ideas and opinions; that is what a free society is all about. Today, these spaces are more often than not reflected in the computing equipment people own.

    Sure, many people do not use these freedoms or only use them rarely. That does not matter one bit. If they are missing, freedom goes out the window and tyranny sets in. And tyranny is far, far worse than any other threat could ever be.

  • Then the question was, ethically, should we?

    yeah- i'm sure he wrestled with that mightily.

    what he wrestled with was the financial implications. Somehow they came to the conclusion that it would cost them more money to go ahead and break into that phone- probably because they'd have to start doing it all the time.

    That's how that decision got made, not because of anything soft and fuzzy like ethics.

    • by tlhIngan ( 30335 )

      Then the question was, ethically, should we?

      yeah- i'm sure he wrestled with that mightily.

      what he wrestled with was the financial implications. Somehow they came to the conclusion that it would cost them more money to go ahead and break into that phone- probably because they'd have to start doing it all the time.

      That's how that decision got made, not because of anything soft and fuzzy like ethics.

      Here's a simple, basic question. We know iOS accounts for about 20% of the market, and Android, 80% (4 android p

  • https://en.wikipedia.org/wiki/... [wikipedia.org] https://en.wikipedia.org/wiki/... [wikipedia.org] lol. (I laughed because privacy doesn't exist in the medium where many of our thoughts largely exist, the internet and computers in general). Nearly all of us wouldn't know if the NSA/CIA/etc... came into our computers to check on us-- from the little I know these guys have a huge amount of brain power and almost unlimited authority. And I'm sure there are a few leaps in this thought, but I'm worried that perhaps the public might conflat
  • > After a few days, we had determined yes, we could.
    Enough. What can be done eventually will be done. Others try to secure their software so that not even they can crack it. That's the way to go, because otherwise there just needs to be enough bribery or pressure and it will be done. Look at your anonymous VPN provider. They will most likely cooperate as well, turning over all their logs, which means nothing at all. That's useful security for you and for them. Now suppose they have logs but store them strongly

  • Translation: "I care deeply about privacy because Apple's business model at present is based on selling hardware, not advertising."
