|
Read the rules in the OP before posting, please. In order to ensure that this thread continues to meet TL standards and follows the proper guidelines, we will be enforcing the rules in the OP more strictly. Be sure to give them a re-read to refresh your memory! The vast majority of you are contributing in a healthy way, keep it up! NOTE: When providing a source, explain why you feel it is relevant and what purpose it adds to the discussion if it's not obvious. Also take note that unsubstantiated tweets/posts meant only to rekindle old arguments can result in a mod action.
On February 19 2016 02:35 Gorsameth wrote:
Any backdoor in the OS itself is exploitable, it's pretty much impossible for it not to be. Apple can do this with a one-time program to only break this one phone. Then the FBI/NSA/CIA will have another phone, and another, and another and... Before long Apple has several people doing nothing but writing one-use programs to break phones.
"Hey, wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so it's all safe."
"Hey, yeah, that backdoor? It's kinda bad if we need the phone itself, you know. We want to catch people before they commit crimes, so we need to be able to remotely break in."
Oh hey, look, someone broke the iPhone backdoor and now everyone's data is on the streets, evil Apple let this happen!
Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No, I don't trust them to act responsibly with this technology for even a second.

On February 19 2016 03:56 oneofthem wrote:
the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

On February 19 2016 04:08 puerk wrote:
there is already a front door: the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?) everything else is by definition of encryption systems a back door.

On February 19 2016 04:13 oneofthem wrote:
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

On February 19 2016 04:20 WolfintheSheep wrote:
The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality. You think it means "only the people we want". The reality is "only people who have the key". And there is a gigantic difference.

On February 19 2016 04:25 oneofthem wrote:
that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.

On February 19 2016 04:33 Acrofales wrote:
OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN. He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever. Now what you're proposing is that somehow Apple magically has this key too? How? EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.

On February 19 2016 04:40 oneofthem wrote:
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access the device in the process of investigation and so on.

On February 19 2016 04:45 Acrofales wrote:
You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure, as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.

On February 19 2016 04:47 oneofthem wrote:
uh it does not have to be a passcode key obviously. there can be a device-specific code that requires a secured algorithm to decrypt. it would not be a universal one point of failure, and this is just a technical challenge that should not define the tradeoff in question.

On February 19 2016 04:53 Acrofales wrote:
Except that now your golden algorithm has taken the place of your golden key. I am literally repeating the wired article I posted not one page back. Please read a wiki page on information security or something... It is either device-specific, or universally usable whenever the FBI wants. In the latter case, it is a single point of failure. And while at first that key/algorithm will be fairly safe with just the FBI having access, I give it a few months, a year at most, before it is reproduced, copied or stolen.
You might want to look at the parallels with DRM on movies and games. They have all been cracked. Even the dedicated "uncrackable" chips in consoles.
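To make the "device-specific code" dispute concrete, here is a minimal, hypothetical sketch (not Apple's actual design, and not a scheme anyone in the thread spelled out this precisely) of key diversification: every device gets its own key derived from one escrowed master secret. The keys genuinely differ per device, yet whoever holds the master secret can re-derive all of them, which is exactly the single point of failure being argued about. All names here are illustrative.

```python
import hashlib
import hmac
import os

# Hypothetical escrow scheme: one master secret, held by the
# "authorized" party, from which each per-device key is derived.
MASTER_SECRET = os.urandom(32)

def device_key(device_id: bytes) -> bytes:
    # HMAC-SHA256 as a key-derivation function: the output looks
    # device-specific, but anyone holding MASTER_SECRET can
    # re-derive it for *every* device, not just one.
    return hmac.new(MASTER_SECRET, device_id, hashlib.sha256).digest()

k1 = device_key(b"phone-001")
k2 = device_key(b"phone-002")

assert k1 != k2                         # keys differ per device...
assert k1 == device_key(b"phone-001")   # ...but are reproducible from the master
```

So "device-specific" and "single point of failure" are not mutually exclusive: the per-device keys are distinct, but the master secret compromises all of them at once.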
|
On February 19 2016 04:48 WolfintheSheep wrote:
"device specific code that requires a secured algorithm to decrypt"
That's called a password.

On February 19 2016 04:51 oneofthem wrote:
not really, they can secure the method of applying the password. it doesn't have to be a simple passcode. maybe a piece of hardware that is authenticated via a special satellite that only the u.s. govt can send up there.

https://en.wikipedia.org/wiki/Spoofing_attack#GPS_Spoofing
|
On February 19 2016 04:53 Acrofales wrote:
Except that now your golden algorithm has taken the place of your golden key. I am literally repeating the wired article I posted not one page back. Please read a wiki page on information security or something... It is either device-specific, or universally usable whenever the FBI wants. In the latter case, it is a single point of failure. And while at first that key/algorithm will be fairly safe with just the FBI having access, I give it a few months, a year at most, before it is reproduced, copied or stolen.

On February 19 2016 04:58 oneofthem wrote:
you can't reverse engineer some encrypted locks to find the solver algorithm, so just properly secure the golden algorithm and we are okay. the overall landscape is that strength of encryption can get very high, so if your question is on the security of the government door then it should not be an unsolvable problem, except when there is so much dismissal for want of a solution.
|
On February 19 2016 04:51 oneofthem wrote:
not really, they can secure the method of applying the password. it doesn't have to be a simple passcode. maybe a piece of hardware that is authenticated via a special satellite that only the u.s. govt can send up there.

What on earth makes satellites special snowflakes? You really have no idea...
If anything, satellites make things easier, because literally everybody can listen in on satellite signals. And given enough data, EVERY encryption system in existence can be cracked.
|
On February 19 2016 04:53 Soap wrote:
https://en.wikipedia.org/wiki/Spoofing_attack#GPS_Spoofing

sure and you can spoof remote authentication servers but this is not really an insurmountable challenge when you want to limit the access to physical devices in possession of law enforcement. put it in the fbi office.
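Satellite or not, a remote-authorization unlock of the kind being proposed reduces to a challenge-response protocol over a shared secret; the transport adds nothing if the secret itself leaks. A minimal sketch of that reduction, with all names hypothetical:

```python
import hashlib
import hmac
import os

# Hypothetical remote-unlock protocol, standing in for the
# "authenticated via special satellite" idea: the channel does not
# matter, only possession of the shared secret does.
UNLOCK_SECRET = os.urandom(32)  # held by the authorizing server

def respond(challenge: bytes, secret: bytes) -> bytes:
    # Prove possession of the secret without sending it directly.
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def device_accepts(challenge: bytes, response: bytes) -> bool:
    # The device must store (or derive) the same secret to verify,
    # so anyone who extracts it can mint valid unlock responses.
    return hmac.compare_digest(response, respond(challenge, UNLOCK_SECRET))

ch = os.urandom(16)
assert device_accepts(ch, respond(ch, UNLOCK_SECRET))      # legitimate unlock
assert not device_accepts(ch, os.urandom(32))              # random guess rejected
```

The protocol is sound as far as it goes; the argument in the thread is about what happens once UNLOCK_SECRET is copied or stolen, because every device that trusts it is then open.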
|
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous. Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, neither should we try to block the computational equivalent of these technologies. And no, a search warrant doesn't mean you have to open your safe. Just your front door. Edit: forgot the link. http://www.wired.com/2014/10/golden-key/

On February 19 2016 03:43 Jayme wrote:
That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own, you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.

On February 19 2016 03:48 Acrofales wrote:
True, but they can't prove you didn't forget the code. And in this case the owner is dead, so he can't be compelled to give them the code.

On February 19 2016 03:56 Plansix wrote:
From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone from it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats. A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

On February 19 2016 04:11 WolfintheSheep wrote:
This exists already, and it's called the human brain. Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt. Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head.

On February 19 2016 04:20 Plansix wrote:
but a phone has no rights. Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness who could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for twitter and all other forms of social media that allow for anonymous log in.

On February 19 2016 04:34 Simberto wrote:
IP addresses. If the phone sends data through the internet, it has a very specific address.

On February 19 2016 04:39 Gorsameth wrote:
Just as an aside, the legal problem is not proving that device X sent the data. It's to prove the suspect was the one using device X at the time.

On February 19 2016 04:47 Plansix wrote:
And you prove that by saying they have sole access to the device or had the device at the time the criminal activity took place. But to do that, you need to prove that the device sent the data and was the one used. It's a hurdle in all internet-based crime and digital evidence. http://www.theverge.com/2016/2/16/11027732/apple-iphone-encryption-fbi-san-bernardino-shooters
On a side note, the Judge specifically ordered that Apple assist with bypassing the auto-delete feature, which appears to be no small task. Also removing the limit on the number of password attempts. If they cannot currently do that, they need to write software that will allow them to. So the order isn't for them to create a back door, but to remove the defenses they put in place so the FBI can force the phone open. With those in place, every iPhone is completely immune to being forced open. I don't see anything unreasonable about that order.

On February 19 2016 04:52 puerk wrote:
Sorry, but have you ever felt that maybe law school did not make you an expert in cryptography? Does american law have no form of: de.wikipedia.org? The gain of hacking that phone is insignificantly low, but its detriments are unpredictably large and severe. There is no compelling argument to do it. And "fuck i have no clue how stuff works, i will force the apple guys to do my bidding" just doesn't cut it.

On February 19 2016 05:00 Plansix wrote:
First of all, I am a paralegal. Second, don't be an asshole. That second one might be hard for you.
It's a piece of tech. Apple has created a system where it is marketing and selling communication devices that cannot be forced open without destroying the evidence. They either comply or the FBI is going to push congress to make this form of encryption illegal without some way of opening it. They are not going to be able to sell smartphones that no one can open. Once that becomes common knowledge, Apple will be the number 1 phone among criminals.
|
oneofthem, may i ask you one question: have you ever in your life thought to yourself "well, i do not know enough about this topic"?
|
On February 19 2016 04:58 oneofthem wrote:
you can't reverse engineer some encrypted locks to find the solver algorithm, so just properly secure the golden algorithm and we are okay. the overall landscape is that strength of encryption can get very high, so if your question is on the security of the government door then it should not be an unsolvable problem, except when there is so much dismissal for want of a solution.

This is why the technologically inept should not comment on these things.
"Just properly secure it" is not a solution, it's a wish.
|
On February 19 2016 05:00 Plansix wrote:
First of all, I am a paralegal. Second, don't be an asshole. That second one might be hard for you. It's a piece of tech. Apple has created a system where it is marketing and selling communication devices that cannot be forced open without destroying the evidence. They either comply or the FBI is going to push congress to make this form of encryption illegal without some way of opening it.

Ok, sorry, i will try.
Please look at that legal concept; i hope google autotranslate makes it kinda understandable. Appropriateness of means of law enforcement is an issue that we can talk about in a politics thread; giving advice on how encryption should get circumvented by people who do not know how said encryption works, however, is pointless.
I agree with you that it might happen this way, and i am saying it is a bad outcome for society overall.
|
On February 19 2016 05:03 WolfintheSheep wrote:Show nested quote +On February 19 2016 04:58 oneofthem wrote:On February 19 2016 04:53 Acrofales wrote:On February 19 2016 04:47 oneofthem wrote:On February 19 2016 04:45 Acrofales wrote:On February 19 2016 04:40 oneofthem wrote:On February 19 2016 04:33 Acrofales wrote:On February 19 2016 04:25 oneofthem wrote:On February 19 2016 04:20 WolfintheSheep wrote:On February 19 2016 04:13 oneofthem wrote: [quote]so what is the problem with creating another similarly secure front door mechanism that authorized access can open? The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality. You think it means "only the people we want". The reality is "only people who have the key". And there is a gigantic difference. that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw. OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever. Now what you're proposing is that somehow Apple magically has this key too? How? EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door. the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on. You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. 
It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context. uh it does not have to be a passcode key obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal one point of failure, and this is just a technical challenge that should not define the tradeoff in question. Except that now your golden algorithm has taken the place of your golden key. I am literally repeating the wired article I posted not 1 page back. Please read a wiki page on information security or something... It is either device specific, or universally usable whenever the FBI wants to. In the latter, it is a single point of failure. And while at first that key/algorithm will be fairly safe with just the FBI having access, I give it a few months/year at most before it is reproduced, copied or stolen. you can't reverse engineer some encrypted locks to find the solver algorithm, so just properly secure the golden algorithm and we are okay. the overall landscape is that strength of encryption can get very high, so if your question is on the security of the government door then it should not be an unsolvable problem, except when there is so much dismissal for want of a solution. This is why the technologically inept should not comment on these things. "Just properly secure it" is not a solution, it's a wish. They should remove the auto delete feature, that is a little much. From reports, it takes a really long time to crack these phones by brute force, so that should be enough.
|
Cayman Islands24199 Posts
On February 19 2016 05:01 puerk wrote: oneofthem, may i ask you one question: have you ever in your life, thought to yourself "well, i do not know enough about this topic"?
i don't know the particular technical implementations but this is just based on computational mathematics results. there are some well known hard computational problems that enable extremely strong encryption in asymmetric key situation. and this enables the public door to be sufficiently secure.
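For readers wondering what "well known hard computational problems" refers to here: the textbook example is integer factorization, which underpins RSA. A toy sketch with deliberately tiny primes (the numbers are purely illustrative; real keys use moduli of 2048 bits or more, which is what makes the asymmetry actually hold):

```python
# Toy RSA with deliberately tiny primes (real keys use ~2048-bit moduli).
# Encrypting with the public key is cheap; recovering d without knowing
# p and q requires factoring n, which is the "hard computational problem"
# that makes asymmetric encryption strong at realistic key sizes.
p, q = 61, 53
n = p * q                   # 3233, the public modulus
phi = (p - 1) * (q - 1)     # 3120
e = 17                      # public exponent, coprime to phi
d = pow(e, -1, phi)         # private exponent via modular inverse (Py 3.8+)

msg = 65
cipher = pow(msg, e, n)     # anyone holding (e, n) can encrypt
plain = pow(cipher, d, n)   # only the holder of d can decrypt
assert plain == msg
```

None of this settles whether a "government door" key could be kept safe; it only shows that the underlying math can be made arbitrarily hard to brute-force, which is the narrow claim being made above.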
|
On February 19 2016 04:58 oneofthem wrote:Show nested quote +On February 19 2016 04:53 Acrofales wrote:On February 19 2016 04:47 oneofthem wrote:On February 19 2016 04:45 Acrofales wrote:On February 19 2016 04:40 oneofthem wrote:On February 19 2016 04:33 Acrofales wrote:On February 19 2016 04:25 oneofthem wrote:On February 19 2016 04:20 WolfintheSheep wrote:On February 19 2016 04:13 oneofthem wrote:On February 19 2016 04:08 puerk wrote: [quote] there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?) everything else is by definition of encryption systems a back door so what is the problem with creating another similarly secure front door mechanism that authorized access can open? The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality. You think it means "only the people we want". The reality is "only people who have the key". And there is a gigantic difference. that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw. OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever. Now what you're proposing is that somehow Apple magically has this key too? How? EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door. the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on. 
You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context. uh it does not have to be a passcode key obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal one point of failure, and this is just a technical challenge that should not define the tradeoff in question. Except that now your golden algorithm has taken the place of your golden key. I am literally repeating the wired article I posted not 1 page back. Please read a wiki page on information security or something... It is either device specific, or universally usable whenever the FBI wants to. In the latter, it is a single point of failure. And while at first that key/algorithm will be fairly safe with just the FBI having access, I give it a few months/year at most before it is reproduced, copied or stolen. you can't reverse engineer some encrypted locks to find the solver algorithm, so just properly secure the golden algorithm and we are okay. the overall landscape is that strength of encryption can get very high, so if your question is on the security of the government door then it should not be an unsolvable problem, except when there is so much dismissal for want of a solution.
Nothing is impenetrable.
Security is about making things so hard to break into that it's prohibitively expensive to do so.
The problem you are creating, however, is that this golden widget will be almost indescribably valuable to a lot of people, ranging from organized crime to foreign agencies. It also cannot be changed sufficiently often to not give these organizations a lot of time and data to crack the system.
Combine the two, and I give any such system between a few months and a year before it is blown wide open, and with it, all data on all smartphones... or at least all American ones. I give it about a week before Samsung and all Chinese brands drop Android and switch to Tizen or some other homebrew OS, and the rest of the world laughs at how stupid the US was to purposefully expose their entire population.
And you thought the Ashley Madison hack was bad...
|
On February 19 2016 04:58 oneofthem wrote:Show nested quote +On February 19 2016 04:53 Acrofales wrote:On February 19 2016 04:47 oneofthem wrote:On February 19 2016 04:45 Acrofales wrote:On February 19 2016 04:40 oneofthem wrote:On February 19 2016 04:33 Acrofales wrote:On February 19 2016 04:25 oneofthem wrote:On February 19 2016 04:20 WolfintheSheep wrote:On February 19 2016 04:13 oneofthem wrote:On February 19 2016 04:08 puerk wrote: [quote] there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?) everything else is by definition of encryption systems a back door so what is the problem with creating another similarly secure front door mechanism that authorized access can open? The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality. You think it means "only the people we want". The reality is "only people who have the key". And there is a gigantic difference. that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw. OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever. Now what you're proposing is that somehow Apple magically has this key too? How? EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door. the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on. 
You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context. uh it does not have to be a passcode key obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal one point of failure, and this is just a technical challenge that should not define the tradeoff in question. Except that now your golden algorithm has taken the place of your golden key. I am literally repeating the wired article I posted not 1 page back. Please read a wiki page on information security or something... It is either device specific, or universally usable whenever the FBI wants to. In the latter, it is a single point of failure. And while at first that key/algorithm will be fairly safe with just the FBI having access, I give it a few months/year at most before it is reproduced, copied or stolen. you can't reverse engineer some encrypted locks to find the solver algorithm, so just properly secure the golden algorithm and we are okay. the overall landscape is that strength of encryption can get very high, so if your question is on the security of the government door then it should not be an unsolvable problem, except when there is so much dismissal for want of a solution.
And who has that algorithm? You can't make an algorithm that only works when you put in a legitimate court order. Is it somewhere at the FBI? Because we know how well we can trust government agencies not to abuse that kind of access. Or is it somewhere at Apple? How is it kept safe? You need incorruptible people as the only ones who are able to access that algorithm.
Let's say we give the algorithm to Bob, who stores it in his brain. Bob is the most honest person in the world, and would never do anything wrong or illegal. Now someone abducts Bob's daughter and starts sending him fingers. How many will it take for Bob to give out the algorithm?
Or let's say you split the algorithm into a dozen parts, and Bobs 1-12 each have one of them. Now they have to come together to unlock a phone, because no one has the complete algorithm. But that is really complicated and annoying, so Bob 1 just asks Bob 2 to send his part over, because Bob 2's son has a football game that Saturday. And since Bob 2 knows that Bob 1 is a good guy, where is the harm in that? The remaining 10 parts are still at other people, so all is still fine. And what if Bob 8 suddenly dies? Now the whole thing is unusable. So obviously you need backup people with algorithm fragments. Or the FBI has three hundred iPhones they need decrypted in one weekend, half in Alaska, half in Texas, all of them with very legitimate court orders and so on. It would be really impractical to fly all of the Bobs and backup Bobs three times across the country; there must be an easier solution to this, right? How about we just give all the data to one of the Bobs? That would make things more practical.
You see how that central point of failure is a major security risk?
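Simberto's twelve-Bobs scheme has a real cryptographic counterpart: secret splitting. A minimal n-of-n XOR sketch (illustrative only; real key-escrow proposals use threshold schemes like Shamir's, which tolerate a dead Bob 8 but reintroduce exactly the backup-copies problem described above):

```python
import secrets

def split_secret(secret: bytes, n: int) -> list:
    """Split a secret into n shares. All n are required to reconstruct;
    any subset of fewer shares is statistically indistinguishable from
    random noise, so no single Bob learns anything on his own."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))  # XOR folds in each share
    shares.append(last)
    return shares

def combine(shares) -> bytes:
    """XOR all shares together to recover the original secret."""
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

golden = b"unlock-algorithm"              # stand-in for the "golden algorithm"
parts = split_secret(golden, 12)          # one piece per Bob
assert combine(parts) == golden           # all twelve together recover it
```

The math works; the post's point is that the humans and logistics around it (coercion, convenience shortcuts, backups, three hundred warrants in a weekend) are where the scheme degrades back into a single point of failure.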
|
Cayman Islands24199 Posts
On February 19 2016 05:03 WolfintheSheep wrote:Show nested quote +On February 19 2016 04:58 oneofthem wrote:On February 19 2016 04:53 Acrofales wrote:On February 19 2016 04:47 oneofthem wrote:On February 19 2016 04:45 Acrofales wrote:On February 19 2016 04:40 oneofthem wrote:On February 19 2016 04:33 Acrofales wrote:On February 19 2016 04:25 oneofthem wrote:On February 19 2016 04:20 WolfintheSheep wrote:On February 19 2016 04:13 oneofthem wrote: [quote]so what is the problem with creating another similarly secure front door mechanism that authorized access can open? The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality. You think it means "only the people we want". The reality is "only people who have the key". And there is a gigantic difference. that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw. OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever. Now what you're proposing is that somehow Apple magically has this key too? How? EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door. the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on. You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. 
It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context. uh it does not have to be a passcode key obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal one point of failure, and this is just a technical challenge that should not define the tradeoff in question. Except that now your golden algorithm has taken the place of your golden key. I am literally repeating the wired article I posted not 1 page back. Please read a wiki page on information security or something... It is either device specific, or universally usable whenever the FBI wants to. In the latter, it is a single point of failure. And while at first that key/algorithm will be fairly safe with just the FBI having access, I give it a few months/year at most before it is reproduced, copied or stolen. you can't reverse engineer some encrypted locks to find the solver algorithm, so just properly secure the golden algorithm and we are okay. the overall landscape is that strength of encryption can get very high, so if your question is on the security of the government door then it should not be an unsolvable problem, except when there is so much dismissal for want of a solution. This is why the technologically inept should not comment on these things. "Just properly secure it" is not a solution, it's a wish. this is as technical as holding gold in a vault, since we are really talking about securing the method not a piece of passcode. learn to read.
|
United States43187 Posts
On February 19 2016 05:05 Plansix wrote:Show nested quote +On February 19 2016 05:03 WolfintheSheep wrote:On February 19 2016 04:58 oneofthem wrote:On February 19 2016 04:53 Acrofales wrote:On February 19 2016 04:47 oneofthem wrote:On February 19 2016 04:45 Acrofales wrote:On February 19 2016 04:40 oneofthem wrote:On February 19 2016 04:33 Acrofales wrote:On February 19 2016 04:25 oneofthem wrote:On February 19 2016 04:20 WolfintheSheep wrote: [quote] The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.
You think it means "only the people we want".
The reality is "only people who have the key".
And there is a gigantic difference. that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw. OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever. Now what you're proposing is that somehow Apple magically has this key too? How? EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door. the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on. You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context. uh it does not have to be a passcode key obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal one point of failure, and this is just a technical challenge that should not define the tradeoff in question. Except that now your golden algorithm has taken the place of your golden key. I am literally repeating the wired article I posted not 1 page back. Please read a wiki page on information security or something... It is either device specific, or universally usable whenever the FBI wants to. In the latter, it is a single point of failure. 
And while at first that key/algorithm will be fairly safe with just the FBI having access, I give it a few months/year at most before it is reproduced, copied or stolen. you can't reverse engineer some encrypted locks to find the solver algorithm, so just properly secure the golden algorithm and we are okay. the overall landscape is that strength of encryption can get very high, so if your question is on the security of the government door then it should not be an unsolvable problem, except when there is so much dismissal for want of a solution. This is why the technologically inept should not comment on these things. "Just properly secure it" is not a solution, it's a wish. They should remove the auto delete feature, that is a little much. From reports, it takes a really long time to crack these phones by brute force, so that should be enough. After all, computing power doesn't increase over time so there's no need to worry about it. Additionally they put tight limits on who has access to computing power, it's restricted only to humans. The auto delete is the defence against the brute force which is the point.
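Some rough arithmetic on why the attempt limit and auto-delete matter more than raw guess speed. Assuming ~80 ms per guess from the hardware-enforced key-derivation delay (a commonly cited ballpark for these devices, not an official Apple figure), short numeric passcodes fall quickly once the ten-try wipe is removed:

```python
# Assumed figure: ~80 ms per passcode guess, imposed by hardware key
# derivation (a commonly cited ballpark, not an official Apple spec).
GUESS_SECONDS = 0.080

def worst_case_hours(keyspace: int) -> float:
    """Hours to exhaust an entire passcode space at one guess per 80 ms."""
    return keyspace * GUESS_SECONDS / 3600

print(f"4-digit PIN:         {worst_case_hours(10**4):.1f} h")      # ~0.2 h
print(f"6-digit PIN:         {worst_case_hours(10**6):.1f} h")      # ~22 h
print(f"6-char alphanumeric: {worst_case_hours(62**6):,.0f} h")     # ~144 years
```

So the per-guess delay only protects long alphanumeric passcodes; for the 4- or 6-digit PINs most people actually use, the attempt limit and auto-delete are the defence, which is the point being made above.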
|
On February 19 2016 05:05 puerk wrote:Show nested quote +On February 19 2016 05:00 Plansix wrote:On February 19 2016 04:52 puerk wrote:On February 19 2016 04:47 Plansix wrote:On February 19 2016 04:39 Gorsameth wrote:On February 19 2016 04:34 Simberto wrote:On February 19 2016 04:20 Plansix wrote:On February 19 2016 04:11 WolfintheSheep wrote:On February 19 2016 03:56 Plansix wrote:On February 19 2016 03:48 Acrofales wrote: [quote]
True, but they can't prove you didn't forget the code. And in this case the owner is dead, so can't be compelled to giving them the code.
From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an Iphone for review, you could harass someone from it forever, sending death threats and other terrible thing and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats. A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal. This exists already, and it's called the human brain. Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt. Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights. Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross examination, where the other side can as if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for twitter and all other forms of social media that allow for anonymous log in. IP addresses. If the phone sends data through the internet, it has a very specific address. Just as an aside, the legal problem is not proving that device X sent the data. 
Its to prove the suspect was the one using device X at the time. And you prove that by saying they have sole access to the device or had the device at the time the criminal activity took place. But to do that, you need to prove that the device sent the data and was the one used. Its a hurdle in all internet based crime and digital evidence. http://www.theverge.com/2016/2/16/11027732/apple-iphone-encryption-fbi-san-bernardino-shootersOn a side, note, the Judge specifically ordered that Apple assist with bypassing the auto delete feature, which appears to be no small task. Also removing the limit on the number of password attempts. If they cannot currently do that, they need to write software that will allow them. So the order isn't for them to create a back door, but remove the defenses they put in place so the FBI can force the phone open. With those in place, every iphone is completely immune to being forced open. I don't see anything unreasonable about that order. Sorry but have you ever felt that maybe lawschool did not make you an expert in cryptography? Does american law have no form of: de.wikipedia.org? The gain of hacking that phone is insignificantly low, but it's detriments are unpredictably large and severe. There is no compelling argument to do it. And "fuck i have no clue how stuff works, i will force the apple guys to do my bidding" just doesn't cut it. First of all, I am a paralegal. Second, don't be an asshole. That second one might be hard for you. Its a piece of tech. Apple has created a system where it is marketing and selling communication devices that cannot be forced opened without destroying the evidence. They either comply or the FBI is going to push congress to make this form of encryption illegal without some way of opening it. Ok, sorry i will try. Please look at that legal concept, i hope google autotranslate makes it kinda understandable. 
Approprietness of means of law enforcement is an issue that we can talk about in a politics thread, giving advice on how encryption should get circumvented by people not knowing how said encryption works however is pointless. I agree with you that it might happen this way, and i am saying it is a bad outcome for society overall. Let me put it to you this way, you understand encryption. I understand how hard it is to get evidence entered into a court. Even physical evidence is challenging at times. We have had documents that are worthless because we couldn't provide a witness to confirm they were authentic. Same with photos, because no one could testify when they we needed to prove they were taken in a specific time frame. Emails are a nightmare if one side won't admit they sent it. You need to call the system administrator and have them confirm where the email came from and then prove that the person in question had sole access to the machine. If you can't access the phone to provide proof to the jury that the email/text/tweet was sent from that phone, it will likely be impossible to prove who send it. That is just a reality of the legal process for almost all evidence.
|
Cayman Islands24199 Posts
On February 19 2016 05:08 Simberto wrote:Show nested quote +On February 19 2016 04:58 oneofthem wrote:On February 19 2016 04:53 Acrofales wrote:On February 19 2016 04:47 oneofthem wrote:On February 19 2016 04:45 Acrofales wrote:On February 19 2016 04:40 oneofthem wrote:On February 19 2016 04:33 Acrofales wrote:On February 19 2016 04:25 oneofthem wrote:On February 19 2016 04:20 WolfintheSheep wrote:On February 19 2016 04:13 oneofthem wrote: [quote]so what is the problem with creating another similarly secure front door mechanism that authorized access can open? The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality. You think it means "only the people we want". The reality is "only people who have the key". And there is a gigantic difference. that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw. OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever. Now what you're proposing is that somehow Apple magically has this key too? How? EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door. the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on. You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. 
It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context. uh it does not have to be a passcode key obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal one point of failure, and this is just a technical challenge that should not define the tradeoff in question. Except that now your golden algorithm has taken the place of your golden key. I am literally repeating the wired article I posted not 1 page back. Please read a wiki page on information security or something... It is either device specific, or universally usable whenever the FBI wants to. In the latter, it is a single point of failure. And while at first that key/algorithm will be fairly safe with just the FBI having access, I give it a few months/year at most before it is reproduced, copied or stolen. you can't reverse engineer some encrypted locks to find the solver algorithm, so just properly secure the golden algorithm and we are okay. the overall landscape is that strength of encryption can get very high, so if your question is on the security of the government door then it should not be an unsolvable problem, except when there is so much dismissal for want of a solution. And who has that algorithm? You can't make an algorithm that only works when you put in a legitimate court order. Is it somewhere at the FBI? Because we know how well we can trust government agencies not to abuse that kind of access. Or is it somewhere at apple? How is it kept safe? You need incorruptable people as the only ones who are able to access that algorithm. Lets say we give the algorithm to Bob, who stores it in his brain. Bob is the most honest person in the world, and would never do anything wrong or illegal. Now someone abducts bobs daughter and starts sending him fingers. 
How many will it take for Bob to give out the algorithm? Or lets say you split the algorithm into a dozen parts, and Bob 1-12 each have one of them. Now they have to come together to unlock a phone, because noone has the complete algorithm. But that is really complicated and annoying, so Bob 1 just asks Bob 2 to send his part over because Bob 2s son has a football game that saturday And since Bob 2 knows that Bob 1 is a good guy, where is the harm in that, the remaining 10 parts are still at other people, so all ist still fine. And what if Bob 8 suddenly dies? Now the whole thing is unusable. So obviously you need Backup people with algorithm fragments. Or the FBI has three hundred i phones they need decrypted on one weekend, half in Alaska, half in Texas, all of which with very legitimate court orders and so on. Would be really impractical to fly all of the Bobs and Backupbobs three times across the country, there must be an easier solution to this, right? How about we just gave all the data to one of the Bobs, that would make things more practical. You see how that central point of failure is a major security risk? now we are getting somewhere. it is reduced to the question of whether a politically acceptable level of risk exists to enable a front door for law enforcement.
your question about the FBI securing info is really a technical matter that should not define the risk or preclude the possibility. obviously they need to do a good job to secure this stuff. just like apple needs to secure its firmware and security design. as long as no weaker link is created in the process it should be fine. it is not impossible for the FBI to be at least as secure as apple's own security.
|
On February 19 2016 05:09 oneofthem wrote:Show nested quote +On February 19 2016 05:03 WolfintheSheep wrote:On February 19 2016 04:58 oneofthem wrote:On February 19 2016 04:53 Acrofales wrote:On February 19 2016 04:47 oneofthem wrote:On February 19 2016 04:45 Acrofales wrote:On February 19 2016 04:40 oneofthem wrote:On February 19 2016 04:33 Acrofales wrote:On February 19 2016 04:25 oneofthem wrote:On February 19 2016 04:20 WolfintheSheep wrote: [quote] The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.
You think it means "only the people we want".
The reality is "only people who have the key".
And there is a gigantic difference. that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw. OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN He memorizes this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever. Now what you're proposing is that somehow Apple magically has this key too? How? EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door. the front door would be a secured access option with the FBI or whoever holding the key. i am talking about designing the security process so as to avoid both insecure backdoors and inability to access the device in the process of investigation and so on. You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context. uh it does not have to be a passcode key obviously. there can be device-specific code that requires a secured algorithm to decrypt. it would not be a universal single point of failure, and this is just a technical challenge that should not define the tradeoff in question. Except that now your golden algorithm has taken the place of your golden key. I am literally repeating the Wired article I posted not 1 page back. Please read a wiki page on information security or something... It is either device specific, or universally usable whenever the FBI wants to. In the latter, it is a single point of failure. 
And while at first that key/algorithm will be fairly safe with just the FBI having access, I give it a few months, a year at most, before it is reproduced, copied or stolen. you can't reverse engineer some encrypted locks to find the solver algorithm, so just properly secure the golden algorithm and we are okay. the overall landscape is that strength of encryption can get very high, so if your question is on the security of the government door then it should not be an unsolvable problem, except when there is so much dismissal for want of a solution. This is why the technologically inept should not comment on these things. "Just properly secure it" is not a solution, it's a wish. this is as technical as holding gold in a vault, since we are really talking about securing the method, not a passcode. learn to read.
You fail to understand that there is no difference between the passcode and the method.
Whether you use a key or a sequence of genetically modified monkeys that have to dance in a specific order, it is only the complexity that increases, not the underlying principles. And given enough incentive, that complexity can be overcome (usually because some FBI schmoe leaves his laptop in his car).
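The "key vs. method" point being argued here is Kerckhoffs's principle: a system must stay secure even when everything except the key is public. A toy sketch (the salt and passcode are hypothetical, and the iteration count is kept low for speed) of why, once attempt limits are gone, a 4-digit code falls to a trivial exhaustive search even with the derivation method fully known:

```python
import hashlib

def derive_key(passcode: str, device_salt: bytes) -> bytes:
    # The method is completely public (PBKDF2-HMAC-SHA256); the only
    # secret remaining in the system is the passcode itself.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_salt, 100)

salt = b"hypothetical-per-device-salt"
target = derive_key("4821", salt)  # key derived from an "unknown" 4-digit code

# Knowing the method but not the code, just try all 10,000 possibilities:
found = next(code for code in (f"{n:04d}" for n in range(10_000))
             if derive_key(code, salt) == target)
assert found == "4821"
```

The search space, not the secrecy of the algorithm, is what carries the security, which is exactly why the dancing-monkeys method would fare no better than a key of equal entropy.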
|
On February 19 2016 05:12 Plansix wrote:Show nested quote +On February 19 2016 05:05 puerk wrote:On February 19 2016 05:00 Plansix wrote:On February 19 2016 04:52 puerk wrote:On February 19 2016 04:47 Plansix wrote:On February 19 2016 04:39 Gorsameth wrote:On February 19 2016 04:34 Simberto wrote:On February 19 2016 04:20 Plansix wrote:On February 19 2016 04:11 WolfintheSheep wrote:On February 19 2016 03:56 Plansix wrote: [quote] From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an Iphone for review, you could harass someone from it forever, sending death threats and other terrible thing and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.
A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.
This exists already, and it's called the human brain. Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination, aligning perfectly with the time stamps, still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt. Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. But a phone has no rights. Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness who could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous login. IP addresses. If the phone sends data through the internet, it has a very specific address. Just as an aside, the legal problem is not proving that device X sent the data. It's to prove the suspect was the one using device X at the time. And you prove that by showing they had sole access to the device, or had the device at the time the criminal activity took place. But to do that, you need to prove that the device sent the data and was the one used. It's a hurdle in all internet-based crime and digital evidence. 
http://www.theverge.com/2016/2/16/11027732/apple-iphone-encryption-fbi-san-bernardino-shooters On a side note, the judge specifically ordered that Apple assist with bypassing the auto-delete feature, which appears to be no small task, and also with removing the limit on the number of password attempts. If they cannot currently do that, they need to write software that will allow it. So the order isn't for them to create a back door, but to remove the defenses they put in place so the FBI can force the phone open. With those in place, every iPhone is completely immune to being forced open. I don't see anything unreasonable about that order. Sorry, but have you ever felt that maybe law school did not make you an expert in cryptography? Does American law have no form of: de.wikipedia.org? The gain of hacking that phone is insignificantly low, but its detriments are unpredictably large and severe. There is no compelling argument to do it. And "fuck i have no clue how stuff works, i will force the apple guys to do my bidding" just doesn't cut it. First of all, I am a paralegal. Second, don't be an asshole. That second one might be hard for you. It's a piece of tech. Apple has created a system where it is marketing and selling communication devices that cannot be forced open without destroying the evidence. They either comply or the FBI is going to push Congress to make this form of encryption illegal without some way of opening it. Ok, sorry, i will try. Please look at that legal concept; i hope Google autotranslate makes it kinda understandable. Appropriateness of means of law enforcement is an issue that we can talk about in a politics thread; giving advice on how encryption should get circumvented by people not knowing how said encryption works, however, is pointless. I agree with you that it might happen this way, and i am saying it is a bad outcome for society overall. Let me put it to you this way: you understand encryption. 
I understand how hard it is to get evidence entered into a court. Even physical evidence is challenging at times. We have had documents that are worthless because we couldn't provide a witness to confirm they were authentic. Same with photos, because no one could testify that they were taken within the specific time frame we needed to prove. Emails are a nightmare if one side won't admit they sent them. If you can't access the phone to provide proof to the jury that the email/text/tweet was sent from that phone, it will likely be impossible to prove who sent it. That is just a reality of the legal process for almost all evidence. And you think the solution is to throw everything out on the street instead? (Since it is almost inevitable that it will be broken once introduced.) Sometimes the privacy of everyone is more important than your ability to convict one man.
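Some rough arithmetic shows why the two defenses named in the order (auto-erase and the attempt limit) carry essentially all of the security of a short passcode. The ~80 ms per-guess figure is taken from Apple's published description of its passcode key derivation; treat the rest as illustrative assumptions:

```python
# Back-of-the-envelope brute-force times once retry limits are removed.
PER_TRY = 0.08  # seconds per guess; Apple documents ~80 ms of hardware-bound
                # key derivation per passcode attempt (assumed figure here)

def worst_case_hours(n_codes: int) -> float:
    """Time to try every possible code, at PER_TRY seconds each."""
    return n_codes * PER_TRY / 3600

assert worst_case_hours(10**4) * 60 < 15  # 4-digit code: ~13 minutes
assert worst_case_hours(10**6) < 24       # 6-digit code: ~22 hours

# With the 10-tries-then-erase setting intact, an attacker gets only 10
# guesses out of 10,000 before the data is gone -- which is exactly the
# defense the court order asks Apple to disable.
```

So the order does not ask Apple to break the encryption itself; it asks Apple to remove the throttling that makes a short passcode survivable.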
|
On February 19 2016 05:13 oneofthem wrote:Show nested quote +On February 19 2016 05:08 Simberto wrote:On February 19 2016 04:58 oneofthem wrote:On February 19 2016 04:53 Acrofales wrote:On February 19 2016 04:47 oneofthem wrote:On February 19 2016 04:45 Acrofales wrote:On February 19 2016 04:40 oneofthem wrote:On February 19 2016 04:33 Acrofales wrote:On February 19 2016 04:25 oneofthem wrote:On February 19 2016 04:20 WolfintheSheep wrote: [quote] The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.
You think it means "only the people we want".
The reality is "only people who have the key".
And there is a gigantic difference. that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw. OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN He memorizes this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever. Now what you're proposing is that somehow Apple magically has this key too? How? EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door. the front door would be a secured access option with the FBI or whoever holding the key. i am talking about designing the security process so as to avoid both insecure backdoors and inability to access the device in the process of investigation and so on. You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context. uh it does not have to be a passcode key obviously. there can be device-specific code that requires a secured algorithm to decrypt. it would not be a universal single point of failure, and this is just a technical challenge that should not define the tradeoff in question. Except that now your golden algorithm has taken the place of your golden key. I am literally repeating the Wired article I posted not 1 page back. Please read a wiki page on information security or something... It is either device specific, or universally usable whenever the FBI wants to. In the latter, it is a single point of failure. 
And while at first that key/algorithm will be fairly safe with just the FBI having access, I give it a few months, a year at most, before it is reproduced, copied or stolen. you can't reverse engineer some encrypted locks to find the solver algorithm, so just properly secure the golden algorithm and we are okay. the overall landscape is that strength of encryption can get very high, so if your question is on the security of the government door then it should not be an unsolvable problem, except when there is so much dismissal for want of a solution. And who has that algorithm? You can't make an algorithm that only works when you put in a legitimate court order. Is it somewhere at the FBI? Because we know how well we can trust government agencies not to abuse that kind of access. Or is it somewhere at Apple? How is it kept safe? You need incorruptible people as the only ones who are able to access that algorithm. Let's say we give the algorithm to Bob, who stores it in his brain. Bob is the most honest person in the world, and would never do anything wrong or illegal. Now someone abducts Bob's daughter and starts sending him fingers. How many will it take for Bob to give out the algorithm? Or let's say you split the algorithm into a dozen parts, and Bobs 1-12 each have one of them. Now they have to come together to unlock a phone, because no one has the complete algorithm. But that is really complicated and annoying, so Bob 1 just asks Bob 2 to send his part over because Bob 2's son has a football game that Saturday. And since Bob 2 knows that Bob 1 is a good guy, where is the harm in that? The remaining 10 parts are still with other people, so all is still fine. And what if Bob 8 suddenly dies? Now the whole thing is unusable. So obviously you need backup people with algorithm fragments. Or the FBI has three hundred iPhones they need decrypted in one weekend, half in Alaska, half in Texas, all of them with very legitimate court orders and so on. 
It would be really impractical to fly all of the Bobs and backup Bobs three times across the country; there must be an easier solution to this, right? How about we just give all the data to one of the Bobs? That would make things more practical. You see how that central point of failure is a major security risk? now we are getting somewhere. it is reduced to the question of whether a politically acceptable level of risk exists to enable a front door for law enforcement. your question about the FBI securing info is really a technical matter that should not define the risk or preclude the possibility. obviously they need to do a good job of securing this stuff, just like Apple needs to secure its firmware and security design. as long as no lower plank is added to the barrel, it should be fine. it is not impossible for the FBI to be at least as secure as Apple's own security.
Except that nobody is trying to break Apple's security, because they purposefully don't have a magic golden key to their cell phones.
Edit: oh, and Apple's security is not anything special. Their "secret" designs consistently leak before launch. As do Samsung's, and everyone else's.
|