|
Read the rules in the OP before posting, please. In order to ensure that this thread continues to meet TL standards and follows the proper guidelines, we will be enforcing the rules in the OP more strictly. Be sure to give them a re-read to refresh your memory! The vast majority of you are contributing in a healthy way, keep it up! NOTE: When providing a source, explain why you feel it is relevant and what purpose it adds to the discussion if it's not obvious. Also take note that unsubstantiated tweets/posts meant only to rekindle old arguments can result in a mod action.
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location. strawmanning open access when it need not go so far is just a waste of time

On February 19 2016 02:35 Gorsameth wrote:
Any backdoor in the OS itself is exploitable; it's pretty much impossible for it not to be. Apple can do this with a one-time program that breaks only this one phone. Then the FBI/NSA/CIA will have another phone, and another, and another, and... Before long Apple has several people doing nothing but writing single-use programs to break phones. "Hey, wouldn't it be easier if there were just a general backdoor we could use? We'd need the physical phone, so it's all safe." "Hey, about that backdoor? It's kind of bad that we need the phone itself, you know; we want to catch people before they commit crimes, so we need to be able to break in remotely." Oh hey, look, someone broke the iPhone backdoor and now everyone's data is on the streets; evil Apple let this happen! Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on, regardless of use. No, I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.
|
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from Wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials, to be able to rifle through your stuff, is quite ludicrous. Now, I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid people from using safes, or locking their front door, neither should we try to block the computational equivalent of these technologies. And no, a search warrant doesn't mean you have to open your safe. Just your front door. Edit: forgot the link. http://www.wired.com/2014/10/golden-key/

On February 19 2016 03:43 Jayme wrote:
That actually depends on what the search warrant says. If it specifically indicates a safe they know you own, you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search warrants aren't limited to your house.

On February 19 2016 03:48 Acrofales wrote:
True, but they can't prove you didn't forget the code. And in this case the owner is dead, so he can't be compelled to give them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone from it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.
A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.
|
On February 19 2016 03:48 Acrofales wrote:
True, but they can't prove you didn't forget the code. And in this case the owner is dead, so he can't be compelled to give them the code.

They can seek the manufacturer's assistance, but the manufacturer replied that it would have to make a key to all of its safes. That can be construed as a violation of the Fourth Amendment requirement that searches be specific.
|
On February 19 2016 03:56 oneofthem wrote:
the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door: the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?) everything else is, by definition of encryption systems, a back door
|
On February 19 2016 03:56 Plansix wrote:
From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone from it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats. A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, along with records that a message was sent to that exact destination, aligning perfectly with the timestamps. You don't need to break open a criminal's phone to prove beyond a reasonable doubt that they transmitted the incriminating messages.

Honestly, people pretend that secrets didn't exist before computers and that the situation is somehow worse now. In reality it's far, far better for law enforcement, even with encryption, because you can't (legally) brute force or hack a person who is keeping secrets in their head, but a phone has no rights.
|
On February 19 2016 04:08 puerk wrote:
there is already a front door: the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?) everything else is, by definition of encryption systems, a back door

so what is the problem with creating another, similarly secure front door mechanism that authorized access can open?
|
On February 19 2016 04:13 oneofthem wrote:
so what is the problem with creating another, similarly secure front door mechanism that authorized access can open?

creating it after the fact only works through a vulnerability in the system, so doing it demonstrates the existence of said vulnerability. in this case there seems to be a way to force a locked iphone to update its firmware to a faulty version that will give access.

whether this is possible is still unclear, but if apple does it, the system's flaw is exposed and others will copy the technique
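To make that concrete, here is a minimal, self-contained Python sketch of the trust problem being described. It is a toy under loud assumptions: real iPhones verify asymmetric signatures rooted in hardware, while this sketch uses an HMAC as a stand-in, and the "govt_os" image is hypothetical. What it illustrates is that the boot chain checks only who signed the firmware, not what the firmware does, so a signed update with the retry limits stripped out passes the exact same check as stock firmware.

    import hashlib
    import hmac

    # Stand-in for the vendor's firmware signing key (an assumption of this
    # toy; real devices check an asymmetric signature, not a shared-key HMAC).
    SIGNING_KEY = b"vendor-signing-key"

    def sign_firmware(image: bytes) -> bytes:
        # The vendor signs whatever image it chooses to ship.
        return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

    def device_accepts(image: bytes, signature: bytes) -> bool:
        # The locked phone verifies the signer, not the behavior of the image.
        expected = hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()
        return hmac.compare_digest(expected, signature)

    stock_ios = b"enforces escalating delays and the 10-try auto-erase"
    govt_os = b"same firmware with the retry limits stripped out"

    # Both pass verification, because both carry a valid vendor signature.
    assert device_accepts(stock_ios, sign_firmware(stock_ios))
    assert device_accepts(govt_os, sign_firmware(govt_os))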
|
On February 19 2016 04:13 oneofthem wrote:
so what is the problem with creating another, similarly secure front door mechanism that authorized access can open?

How often do we need to lay out the path this takes before you understand why it is a problem?
|
On February 19 2016 03:56 Plansix wrote:
From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone from it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats. A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.
You're once again completely missing the point.
The police can:
1. Listen to what you're saying and writing through a wiretap they got a court order for. If those messages are encrypted, then whoever is on the receiving end is either receiving nonsensical gibberish or can provide the decryption key.
2. Take the phone and figure out any number of ways to work around the security, just as today they manage to get info out of suspects in completely legal interrogations.

Neither of these has anything to do with the case at hand.

And the ability to store data in a way that nobody except the owner can access already exists and is freely available after a very brief Google search. All Apple did was take this already existing technology and embed it in their phone.

In fact, it has existed since virtually forever: you could always take your sensitive documents and hide them somewhere.

Edit: or, as someone just pointed out, keep them memorized.
|
On February 19 2016 04:11 WolfintheSheep wrote:
Realistically, everything you send out on a phone still exists on the receiver's phone, along with records that a message was sent to that exact destination, aligning perfectly with the timestamps. You don't need to break open a criminal's phone to prove beyond a reasonable doubt that they transmitted the incriminating messages.
Courts require all evidence to be authenticated by a witness who confirms that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness who could testify, under oath, that the email was sent from that specific phone. And that witness is open to cross-examination, where the other side can ask whether they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous login.
|
On February 19 2016 04:13 oneofthem wrote:
so what is the problem with creating another, similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.
You think it means "only the people we want".
The reality is "only people who have the key".
And there is a gigantic difference.
|
I was under the impression that if the FBI tries to brute force the phone or break the encryption, the phone will erase itself and destroy the evidence. Is that not true?
|
On February 19 2016 04:22 Plansix wrote:
I was under the impression that if the FBI tries to brute force the phone or break the encryption, the phone will erase itself and destroy the evidence. Is that not true?

It might be, but that has a VERY simple workaround. The real problem is the time it will take to crack the passcode.

EDIT: the workaround is obviously to bypass the OS entirely and read out the ROM directly. You can keep re-flashing it ad infinitum if it keeps getting destroyed. I don't really see any way for iOS to get around that, even if there are two separate ROMs as in their newer Secure Enclave systems. The only way I can think of to do this irreversibly is by tying the code up in some hardware and actually destroying that hardware if the code is typed wrong too often. Fairly certain that isn't happening.
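For a sense of scale on that cracking time, here is a back-of-the-envelope Python sketch. The ~80 ms per guess is the key-derivation cost Apple cites in its iOS Security Guide (calibrated to the device's hardware, so it can't be sped up off-device unless the key material leaves the chip); the rest, no retry delays and no auto-erase, is the assumption that the workaround above has already succeeded.

    # Expected brute-force time once delays and auto-erase are out of the way,
    # assuming ~80 ms of hardware-bound key derivation per guess.
    SECONDS_PER_GUESS = 0.08

    for digits in (4, 6):
        guesses = 10 ** digits
        # On average the PIN turns up halfway through the search space.
        avg_hours = guesses * SECONDS_PER_GUESS / 2 / 3600
        print(f"{digits}-digit PIN: ~{avg_hours:.1f} hours on average")

    # Prints roughly 0.1 hours for a 4-digit PIN and 11.1 hours for a
    # 6-digit one; a long alphanumeric passcode pushes this into years.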
|
On February 19 2016 04:17 Gorsameth wrote:
How often do we need to lay out the path this takes before you understand why it is a problem?

it's not a problem under some relaxed assumptions. again, there are a lot of intermediate and less scary options that properly take into account compelling security/law enforcement interests
|
On February 19 2016 04:20 WolfintheSheep wrote:
The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality. You think it means "only the people we want". The reality is "only people who have the key". And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact that the owner can use the phone is a security flaw.
|
On February 19 2016 04:20 Plansix wrote:
If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous login.

And if I wore gloves and sent these evil messages with newspaper cutouts like in the good old days, are you saying I would never be convicted?

Sure, encryption is a roadblock for law enforcement, but again, the fact that the evidence even exists on a phone or device now, instead of nowhere, is a step up from even 20 years back, so let's not pretend that encryption is opening some magical hole that criminals can escape into.
|
On February 19 2016 04:15 puerk wrote:
creating it after the fact only works through a vulnerability in the system, so doing it demonstrates the existence of said vulnerability. in this case there seems to be a way to force a locked iphone to update its firmware to a faulty version that will give access. whether this is possible is still unclear, but if apple does it, the system's flaw is exposed and others will copy the technique

the discussion i am interested in having concerns the general design of secured devices. this particular phone can just be individually unlocked; then apple can issue an update that introduces a front door measure at a later date, for future problems.

i see no security flaw necessitated by that sort of design.
|
On February 19 2016 04:25 oneofthem wrote:
that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact that the owner can use the phone is a security flaw.

A secondary access point does not become magically more secure just because you called it a front door.

The security flaw is that there is an access vector, identical across all devices, that sits outside the control of the owner of the secured information.
Not that it has the word "back" in it.
|
On February 19 2016 04:30 oneofthem wrote:
the discussion i am interested in having concerns the general design of secured devices. this particular phone can just be individually unlocked; then apple can issue an update that introduces a front door measure at a later date, for future problems. i see no security flaw necessitated by that sort of design.

you expertly elaborate about issues you don't have the first clue about. this might be the problem in this conversation: you have no fucking clue, you act like you do, and you throw a tantrum when your dreamt-up solution doesn't work.

you behave like a child who talks about dry rain and calls the parents mean when they say that is not how rain works...
|
On February 19 2016 04:25 oneofthem wrote:
that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact that the owner can use the phone is a security flaw.

OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorizes this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
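And for why no second copy of that key can exist, a minimal Python sketch assuming a PBKDF2-style derivation (real iPhones additionally entangle the passcode with a per-device hardware UID, which this toy omits; the passcode below is the example from the post, shortened):

    import hashlib
    import os

    def derive_key(passcode: str, salt: bytes) -> bytes:
        # The encryption key is recomputed from the passcode at every unlock.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

    salt = os.urandom(16)  # stored on the device; not secret
    key = derive_key("fjord$^_fhrid4568nrtbr", salt)

    # Nothing recoverable is ever written to disk: no passcode, no key.
    # If the owner dies with the passcode, the key dies too; there is no
    # second copy for Apple, or anyone else, to hand over.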
|