|
Read the rules in the OP before posting, please. In order to ensure that this thread continues to meet TL standards and follows the proper guidelines, we will be enforcing the rules in the OP more strictly. Be sure to give them a re-read to refresh your memory! The vast majority of you are contributing in a healthy way, keep it up! NOTE: When providing a source, explain why you feel it is relevant and what purpose it adds to the discussion if it's not obvious. Also take note that unsubstantiated tweets/posts meant only to rekindle old arguments can result in a mod action.
|
My understanding is that if your house is built of adamantium, you destroy the key to your house, the lock is unpickable, and they can only get in via your passcode, then they cannot make you give them the passcode.
I think that's a fairer analogy to the iPhone issue.
|
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 03:35 Acrofales wrote: Extremely relevant article from Wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials, that they should be able to rifle through your stuff, is quite ludicrous. Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid people from using safes, or locking their front door, neither should we try to block the computational equivalent of these technologies. And no, a search warrant doesn't mean you have to open your safe. Just your front door. Edit: forgot the link. http://www.wired.com/2014/10/golden-key/
On February 19 2016 03:43 Jayme wrote: That actually depends on what the search warrant says. If it specifically indicates a safe they know you own, you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search warrants aren't limited to your house.
On February 19 2016 03:48 Acrofales wrote: True, but they can't prove you didn't forget the code. And in this case the owner is dead, so he can't be compelled to give them the code.
On February 19 2016 03:56 Plansix wrote: From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone from it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats. A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.
On February 19 2016 04:11 WolfintheSheep wrote: This exists already, and it's called the human brain. Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination, aligning perfectly with the time stamps, still exist. You don't need to break open a criminal's phone to prove beyond a reasonable doubt that they were transmitting the incriminating messages. Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head.
But a phone has no rights. Courts require all evidence to be authenticated by a witness confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness who could testify, under oath, that the email was sent from that specific phone. And that witness is open to cross-examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous log-in.
IP addresses. If the phone sends data through the internet, it has a very specific address.
And regarding the "front door for authorized personnel only": if that door exists, it has a key. That key is a code of some sort. That means that if someone gets access to that code, they can open all the phones. Suddenly, the question of whether your data is safe becomes "Who can access that key?" Someone could crack the code, through brute force or whatever other method they prefer. Someone could pay a guy at Apple to hand over the code. Someone could break into Apple and steal the physical drive the code is stored on. And as soon as the code is out, there is nothing you can do. It is out there, and bad people will be able to access all of the phones. The existence of this possibility makes the data of a lot of people a lot less secure. And a lot of people will want access to that key, because it is worth a lot. Most people are already very uncomfortable with what the NSA does with our data. Now imagine that the FSB or Chinese intelligence gets that code. Or the mob. A single breaking point for a large system is not a good thing to have if you want to keep that system secure. Demanding to put one in is demanding to make everyone's data less secure just so the government can spy on you better.
(Of course that simplifies things a bit: it doesn't have to be one key for all phones. It could be something that generates a per-device key from an algorithm seeded with some hardware specifications or the like. That doesn't change anything, though; now the algorithm is the central weak point that could break.)
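To make that concrete, here is a minimal sketch (hypothetical names, not any vendor's actual scheme) of such a key-generating design, where each per-device key is derived from one vendor-held master secret:

```python
# Sketch of per-device key derivation from a single master secret.
# Everything here is illustrative, not a real escrow system.
import hmac
import hashlib

MASTER_SECRET = b"vendor-escrow-master-secret"  # the single breaking point

def derive_device_key(device_id: str) -> bytes:
    """Derive a device-specific unlock key via HMAC-SHA256 of the device ID."""
    return hmac.new(MASTER_SECRET, device_id.encode(), hashlib.sha256).digest()

key_a = derive_device_key("serial-AAA")
key_b = derive_device_key("serial-BBB")
assert key_a != key_b  # every phone gets a different key...

# ...yet anyone who obtains MASTER_SECRET can regenerate every one of them,
# which is exactly the "central weak point" described above.
```

The per-device keys look independent, but they all collapse back to the one secret, so the scheme inherits the single point of failure it was meant to avoid.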
|
On February 19 2016 04:34 Simberto wrote: IP addresses. If the phone sends data through the internet, it has a very specific address.
Just as an aside, the legal problem is not proving that device X sent the data. It's proving the suspect was the one using device X at the time.
|
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 02:27 oneofthem wrote: privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location. strawmanning open access when it need not go so far is just a waste of time
On February 19 2016 02:35 Gorsameth wrote: Any backdoor in the OS itself is exploitable; it's pretty much impossible for it not to be. Apple can do this with a one-time program to only break this one phone. Then the FBI/NSA/CIA will have another phone, and another, and another, and.... Before long Apple has several people doing nothing but writing single-use programs to break phones. "Hey, wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone, so it's all safe." "Hey, yeah, that backdoor? It's kinda bad that we need the phone itself, you know; we want to catch people before they commit crimes, so we need to be able to break in remotely." Oh hey, look, someone broke the iPhone backdoor and now everyone's data is on the streets, evil Apple let this happen! Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on, regardless of use. No, I don't trust them to act responsibly with this technology for even a second.
On February 19 2016 03:56 oneofthem wrote: the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.
On February 19 2016 04:08 puerk wrote: there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?) everything else is by definition of encryption systems a back door
On February 19 2016 04:13 oneofthem wrote: so what is the problem with creating another similarly secure front door mechanism that authorized access can open?
On February 19 2016 04:20 WolfintheSheep wrote: The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality. You think it means "only the people we want". The reality is "only people who have the key". And there is a gigantic difference.
On February 19 2016 04:25 oneofthem wrote: that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.
On February 19 2016 04:33 Acrofales wrote: OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN. He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever. Now what you're proposing is that somehow Apple magically has this key too? How? EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both insecure backdoors and inability to access the device in the process of investigation and so on.
|
On February 19 2016 04:39 Gorsameth wrote: Just as an aside, the legal problem is not proving that device X sent the data. It's proving the suspect was the one using device X at the time.
Well, decrypting the device's memory and getting the records from there is not going to help anybody at all with that task.
|
On February 19 2016 04:34 JinDesu wrote: My understanding is that if your house is built of adamantium, you destroy the key to your house, the lock is unpickable, and they can only get in via your passcode, then they cannot make you give them the passcode. I think that's a fairer analogy to the iPhone issue.
problem is, this 'they cannot make you give them the passcode' is wrong, and it is also dynamic, depending on the technological environment.
if indestructible safes and houses ever do exist, then laws will be created around them to resolve the similar issues that will arise.
|
On February 19 2016 04:40 oneofthem wrote: the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both insecure backdoors and inability to access the device in the process of investigation and so on.
This is a back door.
Stop mislabelling it.
|
On February 19 2016 04:40 oneofthem wrote: the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both insecure backdoors and inability to access the device in the process of investigation and so on.
You understand that humans are the weak point in security, right? If an FBI guy has the key, there's a fair chance he'll give it out over the phone to someone who says he's calling from Apple tech support.
|
On February 19 2016 04:42 WolfintheSheep wrote: This is a back door. Stop mislabelling it.
don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security, right?
|
On February 19 2016 04:39 Gorsameth wrote: Just as an aside, the legal problem is not proving that device X sent the data. It's proving the suspect was the one using device X at the time.
That's about anonymity, not encryption. A threat needs to be decrypted to work.
|
On February 19 2016 04:40 oneofthem wrote: the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both insecure backdoors and inability to access the device in the process of investigation and so on.
You know that the analogy is a house with two doors: one with your key, and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure, as about 100 people here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.
|
On February 19 2016 04:43 KwarK wrote: You understand that humans are the weak point in security, right? If an FBI guy has the key, there's a fair chance he'll give it out over the phone to someone who says he's calling from Apple tech support.
i'm sure they can design these systems against technical problems like that kind of fragility. within the space of practical options, having a secure front door for law enforcement is the best way to go.
|
On February 19 2016 04:44 oneofthem wrote: don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security, right?
If I told you to make a gun that only good guys can use, you'd call me stupid.
When you say to make a security hole that only good guys can access, this is apparently a good idea.
|
On February 19 2016 04:44 oneofthem wrote: don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security, right?
please stop giving a shit about this thread if you can't be bothered to inform yourself and constantly spew totally misinformed bullcrap.
either you know how it works, or you ask, but don't lecture everyone on stuff you have no clue about... seriously, it is getting embarrassing.
|
On February 19 2016 04:39 Gorsameth wrote: Just as an aside, the legal problem is not proving that device X sent the data. It's proving the suspect was the one using device X at the time.
And you prove that by showing they had sole access to the device, or had the device, at the time the criminal activity took place.
But to do that, you need to prove that the device sent the data and was the one used. It's a hurdle in all internet-based crime and digital evidence.
http://www.theverge.com/2016/2/16/11027732/apple-iphone-encryption-fbi-san-bernardino-shooters
On a side note, the judge specifically ordered that Apple assist with bypassing the auto-delete feature, which appears to be no small task, and with removing the limit on the number of passcode attempts. If they cannot currently do that, they need to write software that will let them.
So the order isn't for them to create a back door, but to remove the defenses they put in place so the FBI can force the phone open. With those defenses in place, every iPhone is completely immune to being forced open. I don't see anything unreasonable about that order.
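For a sense of scale on why those two defenses matter: assuming Apple's publicly reported figure of roughly 80 ms per passcode attempt (the cost of its hardware-entangled key derivation; treat it as an approximation), a quick estimate of brute-force time once the attempt limit and auto-erase are gone:

```python
# Back-of-envelope brute-force time with the retry limit and auto-erase
# removed. 0.08 s/attempt is Apple's published approximation for its
# passcode key derivation; it is an assumption, not a measurement.
SECONDS_PER_ATTEMPT = 0.08

for digits in (4, 6):
    worst_case_s = (10 ** digits) * SECONDS_PER_ATTEMPT
    print(f"{digits}-digit passcode: up to {worst_case_s / 3600:.1f} hours")

# A 4-digit passcode falls in about 13 minutes, a 6-digit one in about
# 22 hours; a long alphanumeric passphrase would remain impractical.
```

In other words, the defenses the order targets are what stand between a short numeric passcode and a trivial brute force; the encryption itself is untouched.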
On February 19 2016 04:44 Soap wrote: That's about anonymity, not encryption. A threat needs to be decrypted to work.
No, it's about evidence.
You need to prove that the person performed the criminal act, which normally means finding the evidence on their PC/phone. You can't just infer; that isn't good enough.
|
On February 19 2016 04:45 Acrofales wrote: You know that the analogy is a house with two doors: one with your key, and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure, as about 100 people here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.
uh, it does not have to be a single passcode key, obviously. there can be a device-specific code that requires a secured algorithm to decrypt. it would not be a universal single point of failure, and this is just a technical challenge that should not define the tradeoff in question.
|
On February 19 2016 04:47 oneofthem wrote:
uh, it does not have to be a passcode key, obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal single point of failure, and this is just a technical challenge that should not define the tradeoff in question.
"device specific code that requires a secured algorithm to decrypt"
That's called a password.
|
On February 19 2016 04:42 WolfintheSheep wrote:
This is a back door. Stop mislabelling it.

On February 19 2016 04:44 oneofthem wrote:
don't really give a shit what you call it as long as it is secure enough.

On February 19 2016 04:46 puerk wrote:
you do realize the functional significance of the "backdoor" is its lack of security, right? please stop giving a shit about this thread if you cannot be bothered to inform yourself and constantly spew totally misinformed bullcrap. either you know how it works, or you ask, but don't lecture everyone on stuff you have no clue about... seriously, it is getting embarrassing.

lol, fuck off. this post here is to resolve a linguistic confusion, and you don't know more about that than me.
|
On February 19 2016 04:48 WolfintheSheep wrote:
"device specific code that requires a secured algorithm to decrypt"
That's called a password.

not really, they can secure the method of applying the password. it doesn't have to be a simple passcode. maybe a piece of hardware that is authenticated via a special satellite that only the U.S. govt can send up there.
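It is worth spelling out why "device specific" does not remove the universal secret: if the authorized party can derive every phone's key on demand, whatever they derive it from becomes the new master key. A hedged sketch of such a scheme, standard library only and entirely hypothetical:

```python
# Sketch of the "device specific code" variant: derive each phone's
# unlock key from a master secret plus the device's serial number.
# Hypothetical scheme and names, for illustration only.
import hashlib
import hmac

MASTER_SECRET = b"held-by-the-vendor-or-the-government"  # the crown jewels

def device_unlock_key(device_id: bytes) -> bytes:
    """Per-device keys differ, but all derive from MASTER_SECRET."""
    return hmac.new(MASTER_SECRET, device_id, hashlib.sha256).digest()

k1 = device_unlock_key(b"serial-AAAA-0001")
k2 = device_unlock_key(b"serial-BBBB-0002")
assert k1 != k2  # "device specific", as claimed...

# ...but anyone who obtains MASTER_SECRET can recompute both on demand,
# so one secret still guards every phone at once. The single point of
# failure has moved, not disappeared.
```

The same logic applies whether the master secret lives in software, in a hardware token, or behind a satellite link: the secrecy of one thing still protects every device.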
|
On February 19 2016 04:34 Simberto wrote:
IP addresses. If the phone sends data through the internet, it has a very specific address.

On February 19 2016 04:39 Gorsameth wrote:
Just as an aside, the legal problem is not proving that device X sent the data. It's proving the suspect was the one using device X at the time.

And you prove that by showing they had sole access to the device, or had the device at the time the criminal activity took place. But to do that, you need to prove that the device sent the data and was the one used. It's a hurdle in all internet-based crime and digital evidence.

http://www.theverge.com/2016/2/16/11027732/apple-iphone-encryption-fbi-san-bernardino-shooters

On a side note, the Judge specifically ordered that Apple assist with bypassing the auto-delete feature, which appears to be no small task, and with removing the limit on the number of password attempts. If they cannot currently do that, they need to write software that will allow them to. So the order isn't for them to create a back door, but to remove the defenses they put in place so the FBI can force the phone open. With those in place, every iPhone is completely immune to being forced open. I don't see anything unreasonable about that order.
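Some rough arithmetic shows why those two protections (the attempt limit and auto-erase) carry the weight. The per-guess cost below is an assumed figure, commonly cited for passcode key derivation on phone hardware of this generation, not something taken from the order itself:

```python
# Back-of-the-envelope: time to brute-force a numeric passcode once the
# retry limit and auto-erase are gone. ~80 ms of key-derivation work
# per attempt is an assumption; adjust to taste.
SECONDS_PER_GUESS = 0.080

for digits in (4, 6, 8):
    worst_case_s = (10 ** digits) * SECONDS_PER_GUESS
    print(f"{digits}-digit passcode: worst case "
          f"{worst_case_s / 3600:,.1f} h ({worst_case_s / 86400:,.1f} days)")

# 4 digits fall in minutes, 6 digits within a day. This is why the order
# targets the software protections rather than the encryption itself.
```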
|
Sorry, but have you ever felt that maybe law school did not make you an expert in cryptography? Does American law have no form of: de.wikipedia.org? The gain from hacking that phone is insignificantly low, but its detriments are unpredictably large and severe. There is no compelling argument to do it. And "fuck, i have no clue how stuff works, i will force the apple guys to do my bidding" just doesn't cut it.
|