OT: Apple and Privacy vs. National Security

Submitted by StephenRKass on

I haven't generated any posts lately, but there's a current hot topic I'm interested in. I'm curious about the privacy vs. national security questions raised in recent days between the FBI and Apple. Here's the synopsis, if you've been living under a rock: apparently the San Bernardino terrorist's Apple iPhone wasn't destroyed, and the FBI wants Apple to help unlock the encryption so they can presumably see a record of calls, stored information, contacts, etc. And (edit) Tim (not Robert) Cook of Apple is refusing, suggesting that to do this would be to create a "backdoor" giving the government access to every single iPhone out there, and all the content.

I've googled this topic and read several articles on it, but I'm still unsure what to think. Here's what I don't understand: why can't Apple unlock the phone for the FBI and assist them in getting the data off of it? Can't they do this without giving the FBI software that would allow for the creation of a universal backdoor the FBI could use on everyone's phone? From what I've read, the encryption is so good that even Apple can't get in... it would have to write new software to be able to get in. And Cook doesn't even want that kind of software written, even if it stays in-house at Apple. Is that correct?

My interest is really in what Apple can do to preserve privacy while still allowing the government to do everything it can for national security. Is that possible, or do we really have to choose between privacy and national security? I want to have my cake and eat it too!

UMProud

February 18th, 2016 at 5:37 PM ^

Unlock the damn phone. These people were mass murderers, and Apple has done it 70 times in the past. It would be the same as a locksmith refusing to drill open a safe. They could be preventing future murders. Zero problem with this.

Codeman

February 18th, 2016 at 6:38 PM ^

Almost all Android phones are based on the open-source Android project, but are modified by the individual manufacturer/carrier. Just because there's no obvious backdoor in the open-source code doesn't mean the Android phones you can buy don't have one. Food for thought.

VintageBlue

February 18th, 2016 at 5:38 PM ^

I see this all the time in my line of work: people immediately think of information differently solely because it's sitting on a piece of silicon instead of a piece of paper. If the terrorists had all their evil plans in a notebook somewhere and only Apple knew the location, wouldn't Apple be compelled to spill it under court order? Is there really that big of a difference?

carolina blue

February 18th, 2016 at 5:46 PM ^

You make an interesting but, in my opinion, incomplete point. This wouldn't be like just allowing the terrorist's papers to be found. It would be like giving the government the keys to the room where everyone's papers are held. These papers are in one locker in that room, but the key can open up all the lockers.
So it's not quite as simple as you put it.




VintageBlue

February 18th, 2016 at 7:01 PM ^

But a SWAT team kicking in your door to execute a search warrant in the middle of the night is legal, and certainly scary and obtrusive to the neighbors, but at the end of the day they're just after you, right? It's an interesting argument to be sure, but sometimes I think the fact that this is technology-related complicates the question of whether the government is right to ask for it (this specific request).

Gucci Mane

February 18th, 2016 at 5:39 PM ^

Apple should not help open the phone in any way. If they do, then every iPhone will be subject to being unlocked by the government. What good is "safety" if we lose our freedoms along the way?

UMich87

February 18th, 2016 at 5:40 PM ^

Did anyone buy the iPhone primarily because of its encryption capabilities? Or did they buy it for its cutting edge coolness factor? Or because everyone was buying it? I have a Samsung Galaxy (something something) and cannot tell you what kind of privacy protections it has other than I have to type in my password a hundred times a day because my employer requires me to. I am responsible for who I call and what pictures I take and the messages I send. I have never thought for a moment that Samsung owed me a duty to protect my privacy.

TruBluMich

February 18th, 2016 at 6:09 PM ^

OK, so you don't use your phone to do banking, pay bills, or do any investing. I specifically chose an iPhone for the encryption and security. It's always dangerous and you must be cautious regardless, but if I only have to worry about my own mistakes, that eliminates 50% of whatever risk I'm taking.

Jack Hammer

February 18th, 2016 at 5:46 PM ^

You don't need Apple to look at phone and text records.  The NSA can (legally) get that information from the wireless carrier.  It won't reveal all of the phone contacts, but it provides quite a bit of data they can use as leads.  Each wireless carrier has a department dedicated to assisting law enforcement with call/text records.

Kevin13

February 18th, 2016 at 5:44 PM ^

It amazes me that with all the resources the FBI has, they can't unlock an iPhone. I know it's encrypted and all, but my gawd... you're the FBI.

SalvatoreQuattro

February 18th, 2016 at 6:16 PM ^

And no, it's not disturbing at all, because we know how the towers fell. People like you see ghosts where there are trees. It's a paranoia borne of ignorance of history, war, economics, and politics.

Conspiracy theories flourish where weak and ignorant minds are dominant.

SalvatoreQuattro

February 18th, 2016 at 6:33 PM ^

Fires during traumatic events and the spread of flammable materials can weaken steel beams enough to bring about a building's collapse.

It doesn't take a poorly thought out conspiracy narrative that oh-so-conveniently fits the worldview of its author to understand what happened.

False flag operations are largely the fantasies of know-nothings. The ones that actually did happen didn't involve mass casualties or deal a crushing blow to the economy. No businessman, no matter how nefarious his designs, would sign on to an operation that would place his company in jeopardy of losing huge sums of money. 9/11 is not the type of operation such people would seek.

http://www.dailymail.co.uk/news/article-2056088/Footage-kills-conspiracy-theories-Rare-footage-shows-WTC-7-consumed-fire.html

BlueMan80

February 18th, 2016 at 5:47 PM ^

It was owned by his employer, and the employer has no problem with it being hacked. Employers can take their devices back, along with all the information on them - theirs and yours. It's just that the phone can't be opened because of how Apple architected its security. I can see Apple's concern, and I certainly want my personal privacy protected, but in this case I would hope they can find a middle ground. Similar to how encryption keys can be held by a third party (as with the cloud storage service I use), perhaps a similar arrangement could be part of the eventual solution. Make the government file a petition for access, as they should: search warrant approved, then the third party applies the tool.
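To make that third-party key idea a little more concrete, here's a toy sketch in Python. This is just an illustration under my own assumptions - it uses the third-party cryptography package and a simple two-share XOR split, and it is nothing like how Apple or any cloud provider actually does this - but it shows the shape of the arrangement: neither share alone can decrypt anything, and the escrow share would only be released under a court order.

```python
# Toy sketch of a key-escrow arrangement (illustration only, not Apple's design):
# the data key is split into two shares, so decryption requires both the owner's
# share and the escrow agent's share.
import base64
import secrets
from cryptography.fernet import Fernet  # third-party "cryptography" package

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """2-of-2 secret sharing via XOR: neither share alone reveals the key."""
    share_a = secrets.token_bytes(len(key))
    share_b = bytes(x ^ y for x, y in zip(share_a, key))
    return share_a, share_b

def combine_shares(share_a: bytes, share_b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(share_a, share_b))

# Encrypt some device data under a freshly generated key.
raw_key = secrets.token_bytes(32)
ciphertext = Fernet(base64.urlsafe_b64encode(raw_key)).encrypt(b"contacts, messages, photos ...")

# The owner keeps one share; the escrow agent keeps the other.
owner_share, escrow_share = split_key(raw_key)

# Only when both shares come together (owner consent, or a court order that
# releases the escrow share) can the data be decrypted.
recovered_key = base64.urlsafe_b64encode(combine_shares(owner_share, escrow_share))
print(Fernet(recovered_key).decrypt(ciphertext))
```

Of course, the whole debate is about whether anyone should hold that second share at all.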

remdog

February 18th, 2016 at 5:48 PM ^

Allowing the government to literally force Apple into creating new software in this case sets a dangerous precedent. Historically, the biggest threat to humanity has been an overly powerful and/or irresponsible government. By far. The government should not be given the power to literally enslave a private company, innocent of any wrongdoing, at its whim. And in this case, it would be forcing the company to destroy its own security systems. The "backdoor," once created, would easily fall into the wrong hands, especially with the incompetent government involved. Then all of our sensitive data would be at risk of exploitation by anybody, including terrorists.

michgoblue

February 18th, 2016 at 5:50 PM ^

This is a really tough issue, and while this is likely to be unpopular, I think that I lean in favor of the government on this one (something that I rarely do).  Here is my reasoning:

As a starting premise, let's all agree that civil liberties and privacy are, in fact, important. Regardless of whether the government is being run by Republicans or Democrats, there needs to be a limit to the degree to which the government can meddle in the private affairs of its citizens, provided that those citizens are not breaking the law or harming others. So, as a general matter, I think most of us would agree that the government should not have the ability or right to hack into our cell phones, computers, tablets, etc., or to monitor our day-to-day activities.

However, there are exceptions. The San Bernardino case involves what is now accepted to be a terrorist incident. One or both of the killers was radicalized, and there is documented evidence of at least one of them visiting known terrorist webpages. The fact that their home was littered with bomb-making materials, unused pipe bombs, etc., indicates the possibility that they - OR OTHERS - were planning further attacks.

So there is a real possibility that their iPhones might contain other valuable information that would lead the FBI and Homeland Security to (1) identify other would-be terrorists, or (2) thwart another potential terrorist attack. Is the infringement upon privacy IN THIS CASE worth it, if it accomplishes either of these goals? As someone who was in NY on 9/11, I would say, unquestionably, yes.

While it is not pretty, the reality in which we live is that there are misguided individuals and groups who seek to do harm to our country and our citizens. While civil liberties and privacy MUST be safeguarded whenever possible, it is my belief that we also MUST protect the lives of our citizens from another terrorist attack.

laus102

February 18th, 2016 at 6:00 PM ^

Can Apple get into this phone without damaging the data?  Of course.

Do they want to?  No.  For several reasons:

  • If they wrote this separate iOS (the iPhone/iPad/Apple Watch operating system) like the government wants them to, it would mean that there exists a version of iOS that allows anyone in possession of it to forcibly break into any iOS device, as long as they had said iOS and a solid knowledge of brute-force techniques (a toy sketch of the brute-force point appears after this list). The problem with this? Software is transmitted incredibly easily. Think about how long it takes you to download and install a new operating system for your laptop: depending on your machine's limitations, anywhere from 45 minutes to an hour, and remember that an iPhone is a significantly less complex device than, say, even a MacBook Air. A mobile operating system image is small enough that it could very easily be transferred to anyone, at any time.
  • You might say, "Well, just put it on a super-encrypted hard drive that only certain members of the government are ever able to see."  See, the problem with this is, information is power, and no system is without corruption.  Even if it were given to a select group of people (even 1 person), I have serious doubts that it would remain only within that individual's possession forever (forever is a long time).  
  • Additionally, all of the above assumes that granting the government access to this kind of software is even something that we as a society want. I believe it is not what our society wants or needs. As Tim Cook said very shrewdly, "this would set a very dangerous precedent" for future interactions between the government and technology companies. The federal government (not to mention state and even local jurisdictions) would claim in the future that they "need" this technology to solve certain crimes that might not (and very likely would not) carry the same importance that the San Bernardino attacks did. Remember, if you give a mouse a cookie...
  • Furthermore, the federal government does not realistically even need to break into Syed Farook's iPhone. Why? Well, all of the information the Fed wants from the phone is available in other places. Phones are just very small computers, and a computer can't establish correspondence with people (read: ISIS terrorist cells) without a two-way transfer of data with another computer, which happens in the form of cellular data, an internet connection, or, I suppose, satellite communication. In my opinion, it would be both easier and safer for the Fed to subpoena companies like Verizon, AT&T, or Vodafone for the information they want so much. Actually, now that I think about it, they probably don't even need to subpoena any companies... in an AT&T building (in San Francisco, I think) the NSA splits - essentially clones - the huge optical trunk that carries a large share of the traffic flowing in and out of the United States.
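Here is the toy sketch promised in the first bullet. It's Python under made-up assumptions - the key derivation below is a stand-in, nothing like Apple's real, hardware-entangled scheme - but it shows why the retry limits, escalating delays, and auto-wipe matter: once those are gone, a four-digit passcode is just a 10,000-entry search space.

```python
# Toy sketch: why a short numeric passcode falls to brute force once the
# software protections (try limits, delays, wipe-after-10) are removed.
# derive_key is a stand-in; the real derivation is tied to the device's
# hardware and is deliberately slow per guess.
import hashlib
import itertools
import secrets

def derive_key(passcode: str, salt: bytes) -> bytes:
    """Stand-in for the device's passcode-to-key derivation."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 1_000)

# The "device": a salt plus the key derived from the owner's secret passcode.
salt = secrets.token_bytes(16)
true_key = derive_key("7291", salt)

# With no try limit, an attacker simply walks all 10,000 four-digit codes.
for digits in itertools.product("0123456789", repeat=4):
    guess = "".join(digits)
    if derive_key(guess, salt) == true_key:
        print("passcode found:", guess)
        break
```

Longer alphanumeric passcodes blow that search space up enormously, which is why the guess-rate protections in the software are the part the FBI is asking Apple to remove.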

Sopwith

February 18th, 2016 at 6:51 PM ^

1. Could the government, in theory, require lock or safe manufacturers to use weaker metal in their products if it became too difficult to break through them in cases where they wanted to do a search? Because they're basically asking Apple to weaken everyone's phone, not just one they're interested in.

2. Once Apple writes the new software that creates the backdoor, and 3 hours later the Russian and Chinese govts have the code (because obviously) and start imprisoning/murdering human rights and democracy activists after accessing their encrypted messages, will the FBI/DOJ feel any moral responsibility to those people? Should they?

3. If we all agreed repealing the 4th, 5th, and 6th amendments would objectively be likely to reduce crime and keep everyone safer from everyday crime as well as terrorism, shouldn't we consider that, too? Would make policing 100x easier.

I'm Batman

February 18th, 2016 at 6:05 PM ^

A conversation like this comes up, and someone says, "I don't have anything to hide." Why not just wipe your ass with the Constitution? Give me liberty, or give me death. This country was built on that philosophy.