Should the government be able to require a "backdoor" into our phones?

 
 
Robert Gentel
 
  1  
Reply Fri 19 Feb, 2016 03:14 pm
@Finn dAbuzz,
Pretty much. All encryption basically works because certain calculations are so computationally expensive that reversing them would take an unreasonably long time.

Thing is, the PIN used could likely be cracked in a reasonable amount of time if you allow for brute-force methods (the software limits your incorrect guesses precisely to defend against the brute-force attack of guessing everything).
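A toy sketch of that point in Python, assuming a hypothetical 4-digit passcode and an iOS-style ten-guess limit (the numbers and names are illustrative, not Apple's actual implementation):

```python
import itertools

def brute_force_pin(check_pin, digits=4, max_attempts=None):
    """Exhaustively try numeric PINs in ascending order.

    Returns (pin, attempts) on success, or (None, attempts) when the
    keyspace is exhausted or an attempt limit locks us out first.
    """
    attempts = 0
    for candidate in itertools.product("0123456789", repeat=digits):
        if max_attempts is not None and attempts >= max_attempts:
            return None, attempts  # locked out (or the device wipes itself)
        attempts += 1
        pin = "".join(candidate)
        if check_pin(pin):
            return pin, attempts
    return None, attempts

SECRET = "7291"  # hypothetical 4-digit passcode

# With no guess limit, all 10,000 PINs are searchable almost instantly.
pin, tries = brute_force_pin(lambda p: p == SECRET)

# With an iOS-style 10-attempt limit, the same attack gets nowhere.
locked, capped = brute_force_pin(lambda p: p == SECRET, max_attempts=10)
```

The guess limit, not the math, is what stops the attack here, which is why removing it is the whole point of the FBI's request.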

The FBI is asking Apple to write and sign software that removes those fundamental restrictions. If they get it, they are definitely getting into the phone (though they probably won't find anything super useful; after all, they can already get the content of his email and who he called from the email and phone providers).

Getting Apple to produce a signed version of code that removes the restrictions on brute force hacking (and going so far as allowing the codes to be sent from another computer, instead of typed into the screen) is a significant request.

I think the government should be able to get at whatever it can, but essentially what is happening is that tech companies are building devices that only the user (not even themselves) can get into. This is great for all of our security but bad for the government so they are trying to force tech companies to refrain from making secure products and to cooperate with them in weakening critical parts of infrastructure that billions of people use every day.

And it is especially stupid because if they succeed they'll just shoot American business in the foot. If they get American run companies like Apple or Facebook to give them backdoors then people will just flock to tech products from countries that do not insist on undermining the technical security.

So even if they succeed, they'll only succeed in harming America's tech industry, as the consumer data (and terrorist data) will simply migrate to other countries.
Robert Gentel
 
  1  
Reply Fri 19 Feb, 2016 03:35 pm
@Finn dAbuzz,
Finn dAbuzz wrote:
This is a very tough question, and anyone who has argued that "slippery slope" propositions are logical fallacies should probably recuse themselves from this discussion, especially if they come down on the side of Apple.


It is a tough question, but at its core it is not a slippery slope argument. Apple's argument about legal precedent would be (if it were unsubstantiated), but that isn't the key point.

The key point is really simple (and complex at the same time): the way encryption works is based on some complex math and some simple assumptions. The central simple assumption is that you, and only you, hold a mathematical secret.

Give any third party that secret, and this fundamental assumption no longer holds.

This is a fundamentally changed situation, and no actual abuse needs to take place for technologists to point out that we all now have weaker security. It is objectively true.

Think of it like this. Right now your garage opener uses encrypted keys to give you a general assurance that you hold a secret that opens your door. It can be hacked, and it is not perfect, but you understand the reasonable level of security it provides.

Now let's say that a company starts making them so secure they can't be hacked. That's great for you, but bad for people who want in against your wishes. So the government goes to the company and demands a master key to all garages.

This is what's at stake. If this is granted, whether or not it is ever abused, the situation has changed: somebody has been granted the key to every garage in the world. In security theory, this is objectively a significant step back.

It is a legal demand that users not be allowed to have secrets, and the user having the secret is the whole key to encryption.

It would be like a requirement that every user share their passwords with the government in case they want to get into a bad person's account, just because passwords have become so secure their previous methods no longer reliably work.

Quote:
This single instance can probably be worked out to give the FBI what it needs without compromising privacy, but it is valid to wonder what it might lead to. The Slippery Slope.


The thing about encryption is that it fundamentally cannot. Encryption is a complex bit of math around a simple set of assumptions and the kind of assumptions there are involve fundamental things like X is a secret, and Y can be trusted.

The whole concept is predicated on the notion that we can trust Apple not to do this one thing. The entire system revolves around the assumption that the key is my secret, and that I can trust Apple not to deliberately undermine the fundamental ways it is kept a secret or to build in a backdoor (i.e. break the arrangement). They built a box that only I can open, and only they can be forced to unmake that design and make my box openable by third parties.

If you can't trust your software and hardware vendor (with Apple thankfully this means trusting fewer people) then the security stack collapses. Computer security requires several key assumptions to work, and if you deliberately break them to get into one computer you are still deliberately breaking the system for all people.

Quote:
I am someone who has an inherent distrust of people who hold and wield power, and that is a fine definition of the Government. Anything that can be abused will be abused, and I always laugh when progressives declare their confidence in the integrity of government but are the absolute last ones to come to its defense when abuses are claimed, and, in fact, generally believe the worst of it.


It's not even a matter of trusting the government to not abuse something. I don't personally fear the government at all in my encryption concerns. But if they have a master key to everyone's phones then I do have to trust their competence too, and their technical understanding.

We don't just have to trust that they will not abuse this key, we have to trust that they can actually keep the key a secret. And we have to trust that our fundamentally weakened locks are going to hold, even though one of the core assumptions they are built on is no longer true.

Quote:
I'm not happy declaring my support of Apple, but I must.


I hear ya. This is a legitimately tough debate. I personally prefer to have functional computer security, but there are going to be legitimate cases where this costs lives. I do not take it lightly at all, and I understand why the FBI et al. seek the powers they do.
rosborne979
 
  1  
Reply Fri 19 Feb, 2016 05:52 pm
@Robert Gentel,
Robert Gentel wrote:
It is a legal demand that users not be allowed to have secrets, and the user having the secret is the whole key to encryption.

Yes, that is the core problem in a nutshell. Are people allowed, or will they be allowed, to encrypt data such that only they can access it?

People already have the technical capability to encrypt data such that it cannot be decrypted, but the question is whether they will retain the legal right to do so.
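As a minimal illustration of encrypting data so that only the key holder can recover it, here is a one-time pad sketch in Python. It is a stand-in for real ciphers like AES, and the message is made up, but the design goal is the same: without the key, the ciphertext tells you nothing.

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt with a random key as long as the message (a one-time pad).

    Without the key, the ciphertext is information-theoretically
    unrecoverable: every plaintext of the same length is equally likely.
    """
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so applying the key again restores the text.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"meet at noon"
ciphertext, key = otp_encrypt(message)
recovered = otp_decrypt(ciphertext, key)
```

The legal question in the thread is exactly whether people will keep the right to hold `key` themselves, since the math plainly permits it.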
0 Replies
 
ossobuco
 
  1  
Reply Fri 19 Feb, 2016 06:37 pm
@Setanta,
Thanks. I've been waffling, and that helps clarify it for me, an acknowledged internet tech Dummkopf.
0 Replies
 
BillRM
 
  -1  
Reply Fri 19 Feb, 2016 07:05 pm
@Setanta,
Look guys, there are roughly a hundred software companies with end-to-end encryption software that are not based in the US and do not need to answer to the US government or US court orders.

A large percentage of it is even free, and all of it can be downloaded over the net.

So other than harming US firms such as Apple, there is nothing the US government can do about end-to-end encryption, or encryption in general.

https://www.schneier.com/cryptography/archives/2016/02/a_worldwide_survey_o.html
BillRM
 
  0  
Reply Fri 19 Feb, 2016 10:34 pm
@BillRM,
Quote:
https://www.schneier.com/cryptography/paperfiles/worldwide-survey-of-encryption-products.pdf


Implications for US Policy
Currently in the US, UK, and other countries, there are policy discussions about mandatory backdoors in encryption products. Law
enforcement is the impetus behind these discussions; they claim that they are “going dark” and unable to decrypt either communications
or data in storage [Com14]. Security researchers have long argued that such backdoors are impossible to implement
securely, and will result in substandard security for everyone [AA+15]. Others argue that going dark is the wrong metaphor, and
that many avenues for surveillance remain [GG+16].
Our research points to a different argument. Proposed mandatory backdoors have always been about modifying the encryption
products used by everyone to eavesdrop on the few bad guys. That is, the FBI wants Apple—for example—to ensure that everyone’s
iPhone can be decrypted on demand so the FBI can decrypt the phones of the very few users under FBI investigation.
For this to be effective, those people using encryption to evade law enforcement must use Apple products. If they are able to use
alternative encryption products, especially products created and distributed in countries that are not subject to US law, they will
naturally switch to those products if Apple’s security weaknesses become known.
Our survey demonstrates that such switching is easy. Anyone who wants to evade an encryption backdoor in US or UK encryption
products has a wide variety of foreign products they can use instead: to encrypt their hard drives, voice conversations, chat
sessions, VPN links, and everything else. Any mandatory backdoor will be ineffective simply because the marketplace is so international.
Yes, it will catch criminals who are too stupid to realize that their security products have been backdoored or too lazy to
switch to an alternative, but those criminals are likely to make all sorts of other mistakes in their security and be catchable anyway.
The smart criminals that any mandatory backdoors are supposed to catch—terrorists, organized crime, and so on—will easily be
able to evade those backdoors. Even if a criminal has to use, for example, a US encryption product for communicating with the
world at large, it is easy for him to also use a non-US non-backdoored encryption product for communicating with his compatriots.
The US produces the most products that use encryption, and also the most widely used products. Any US law mandating backdoors
will primarily affect people who are unconcerned about government surveillance, or at least unconcerned enough to make
the switch. These people will be left vulnerable to abuse of those backdoors by cybercriminals and other governments.
Feb 2016, v1.0, A Worldwide Survey of Encryption Products, p. 7
Conclusions
Laws regulating product features are national, and only affect people living in the countries in which they’re enacted. It is easy to
purchase products, especially software products, that are sold anywhere in the world from everywhere in the world. Encryption
products come from all over the world. Any national law mandating encryption backdoors will overwhelmingly affect the innocent
users of those products. Smart criminals and terrorists will easily be able to switch to more-secure alternatives.
0 Replies
 
Olivier5
 
  3  
Reply Wed 24 Feb, 2016 03:48 am
Which government are we talking about here? The Chinese?
0 Replies
 
DrewDad
 
  2  
Reply Wed 24 Feb, 2016 08:00 am
@Robert Gentel,
Robert Gentel wrote:
The whole concept is predicated on the notion that we can trust Apple not to do this one thing. The entire system revolves around the assumption that the key is my secret, and that I can trust Apple not to deliberately undermine the fundamental ways it is kept a secret or to build in a backdoor (i.e. break the arrangement). They built a box that only I can open, and only they can be forced to unmake that design and make my box openable by third parties.

Yup. And it's already been demonstrated that they can open the box, or at least weaken the lock. Even if Apple doesn't do it for this phone, someone, eventually, will write the code to do so.
BillRM
 
  0  
Reply Wed 24 Feb, 2016 03:14 pm
@DrewDad,
Sorry, but the phone in question allows much longer passcodes than a five-or-so-digit number, to the point that no one, including Apple, could break it.

The hope, and it is only a hope at this point, is that the user went with the shorter default-length passcode, which can be broken by brute-force guessing once the limit of ten guesses has been removed by Apple.

But as long as you used the maximum level of protection instead of going with the convenience defaults, the technology will protect your secrets until hell freezes over.
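The arithmetic behind this is easy to check. A rough sketch, assuming a hypothetical throttled rate of about 12 guesses per second (treat that figure, and these policies, as illustrative only):

```python
# Keyspace sizes for different passcode policies, and rough worst-case
# crack times at an assumed hardware-throttled guessing rate.

GUESSES_PER_SECOND = 12  # assumed rate, for illustration only
SECONDS_PER_DAY = 86_400

def worst_case_days(alphabet_size: int, length: int) -> float:
    """Days to exhaust the full keyspace at the assumed guess rate."""
    keyspace = alphabet_size ** length
    return keyspace / GUESSES_PER_SECOND / SECONDS_PER_DAY

print(f"4-digit PIN:          {worst_case_days(10, 4):.3f} days")
print(f"6-digit PIN:          {worst_case_days(10, 6):.3f} days")
print(f"10-char alphanumeric: {worst_case_days(62, 10):.2e} days")
```

A short numeric PIN falls in under a day even at a throttled rate, while a 10-character mixed-alphabet passcode pushes the worst case past hundreds of billions of days, which is the "until hell freezes over" territory described above.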



BillRM
 
  0  
Reply Wed 24 Feb, 2016 03:55 pm
@BillRM,
LOL, a comment pointing out that if a full-length alphanumeric passcode had been used instead of the much shorter 5-digit default code on the phone, all the people at Apple and the federal government combined would not likely be able to brute force it, got a vote down!!

Would anyone care to explain the vote down?
BillRM
 
  1  
Reply Wed 24 Feb, 2016 06:10 pm
@BillRM,
Yet another vote down, on a post that deals only with the mathematical fact that a longer passcode including both numbers and letters is many, many, many times more difficult to brute force than a shorter numbers-only passcode.

Lord, I would love to know the type of person who would give such a post a vote down, and why he or she does so.
McGentrix
 
  0  
Reply Wed 24 Feb, 2016 06:38 pm
@BillRM,
BillRM wrote:

Yet another vote down, on a post that deals only with the mathematical fact that a longer passcode including both numbers and letters is many, many, many times more difficult to brute force than a shorter numbers-only passcode.

Lord, I would love to know the type of person who would give such a post a vote down, and why he or she does so.


Don't worry about it. There are a few people who think it's a popularity contest like in high school. Their immaturity should have no effect on your posts, and when you comment on it, it lets them sit behind their monitors and snigger about how funny they are for giving you the thumbs down.

Think of it as an intelligence test. They have no intelligence, so all they can do is give you a thumbs down and snigger, alone in their lonely, pathetic lives.
cicerone imposter
 
  1  
Reply Wed 24 Feb, 2016 06:43 pm
@McGentrix,
HA HA HA HA.....intelligence test......HA HA HA.....
0 Replies
 
Fil Albuquerque
 
  1  
Reply Thu 25 Feb, 2016 10:21 am
Next thing we will have a mind reader installed in the back of our necks...
0 Replies
 
Robert Gentel
 
  2  
Reply Wed 2 Mar, 2016 01:59 pm
@DrewDad,
DrewDad wrote:
Yup. And it's already been demonstrated that they can open the box, or at least weaken the lock. Even if Apple doesn't do it for this phone, someone, eventually, will write the code to do so.


I think there is a key piece of misunderstanding here: the software must be signed with Apple's keys. If that were not the case, the FBI would have no legal basis for this demand and could simply be told to do it themselves.

The whole point is that nobody else can do this. Sure, anyone can write the code, but they can't do anything to run it, given that the phone is set up not to allow unsigned code to replace the OS.
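To make the signing point concrete, here is a toy model. Real iPhones verify a public-key signature chained back to Apple before booting an OS image; this sketch substitutes an HMAC with a made-up key purely to show the gate, so every name and key here is hypothetical:

```python
import hashlib
import hmac

# Stand-in for Apple's private signing key. In reality this is a
# public-key scheme rooted in the boot ROM, not a shared secret.
APPLE_SIGNING_KEY = b"stand-in for Apple's signing key"

def sign(firmware: bytes, key: bytes) -> bytes:
    """Produce a signature over a firmware image."""
    return hmac.new(key, firmware, hashlib.sha256).digest()

def phone_will_boot(firmware: bytes, signature: bytes) -> bool:
    """The phone only runs images whose signature checks out."""
    expected = hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

image = b"iOS build with brute-force limits removed"

# Signed with the real key: the phone accepts it.
ok = phone_will_boot(image, sign(image, APPLE_SIGNING_KEY))

# Anyone can write the identical code, but signed with any other key
# the phone refuses to run it.
forged = phone_will_boot(image, sign(image, b"attacker key"))
```

This is why the order targets Apple specifically: the code is easy, but only the holder of the signing key can make the phone accept it.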
DrewDad
 
  1  
Reply Thu 3 Mar, 2016 09:42 am
@Robert Gentel,
You have more faith in the security of the keys than I do.

Given how many times digital certificates have been mishandled, I think it's entirely possible that someone could get a valid certificate to sign the code with.
InfraBlue
 
  1  
Reply Thu 3 Mar, 2016 11:09 am
@Robert Gentel,
Robert Gentel wrote:
The FBI is asking Apple to write and sign software that removes those fundamental restrictions.


The FBI is compelling, or trying to compel, Apple to do these things.

In addition to the Fifth Amendment stance that Apple is arguing from, they are also taking a First Amendment free-speech stance: that the government cannot compel them to say, in code, what they don't want to say, with regard to the code the FBI is trying to force them to write.
cicerone imposter
 
  1  
Reply Thu 3 Mar, 2016 11:12 am
@InfraBlue,
We need to contact our congressmen, and tell them to support Apple.
0 Replies
 
Robert Gentel
 
  2  
Reply Thu 3 Mar, 2016 02:08 pm
@DrewDad,
Yes, I definitely do; I would not use a computer if I believed what you appear to believe about encryption.
maxdancona
 
  2  
Reply Thu 3 Mar, 2016 07:34 pm
@Robert Gentel,
Yeah, I am with Robert on this.

If keys are managed correctly, encryption is quite strong. Technology companies know how to manage and protect encryption keys.
 
