US Politics Mega-thread - Page 2959

Read the rules in the OP before posting, please.

In order to ensure that this thread continues to meet TL standards and follows the proper guidelines, we will be enforcing the rules in the OP more strictly. Be sure to give them a re-read to refresh your memory! The vast majority of you are contributing in a healthy way, keep it up!

NOTE: When providing a source, explain why you feel it is relevant and what purpose it adds to the discussion if it's not obvious.
Also take note that unsubstantiated tweets/posts meant only to rekindle old arguments can result in a mod action.
JinDesu
February 18 2016 19:34 GMT
#59161
My understanding is that if your house is built of adamantium and you destroy the key to your house and it's unpickable and they can only get in via your passcode, they cannot make you give them the passcode.

I think that's a fairer analogy to the iphone issue.
Simberto
February 18 2016 19:34 GMT
#59162
On February 19 2016 04:20 Plansix wrote:
Show nested quote +
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so he can't be compelled to give them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone with it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross-examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous log-in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

And regarding the "front door for authorized personnel only": if that door exists, it has a key. That key is a code of some sort. That means that if someone gets access to that code, they can access all the phones. Suddenly, the question of whether your data is safe becomes "Who can access that key?". Someone could crack the code, through brute force or whatever other way they prefer. Someone could pay a dude at Apple to give them the code. Someone could break into Apple and steal the physical drive that the code is on. And as soon as the code is out, there is nothing you can do. It is out there, and bad people will be able to access all of the phones. The existence of this possibility makes the data of a lot of people a lot less secure. And a lot of people will want access to that key, because it is worth a lot. Most people are already very uncomfortable with what the NSA does with our data. Now imagine that the FSB or Chinese intelligence gets that code. Or the mob. A single breaking point for a large system is not a good thing to have if you want to keep that system secure. Demanding to put one in is demanding to make everyone's data less secure just so the government can spy on you better.

(Of course that simplifies things a bit; it doesn't have to be one key for all phones, it could be a key generated by an algorithm from some hardware specification or something similar. That doesn't change anything though: now the algorithm is the central weak point that could break.)
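To make the "single breaking point" concrete, here is a minimal, purely hypothetical key-escrow sketch in Python (illustrative only; the names and design are assumptions, not anything Apple or the FBI has actually proposed). If every device's unlock key is derived from one escrowed master secret, then anyone who obtains that secret can derive the unlock key for every device:

import hmac, hashlib

# Hypothetical escrowed secret held by the vendor or an agency.
MASTER_SECRET = b"held-by-vendor-or-agency"

def device_unlock_key(hardware_id: str) -> bytes:
    # Derive the per-device unlock key from the master secret and the
    # device's hardware ID. Whoever knows MASTER_SECRET can compute this
    # for any device, which is the single point of failure described above.
    return hmac.new(MASTER_SECRET, hardware_id.encode(), hashlib.sha256).digest()

print(device_unlock_key("serial-A1B2C3").hex())

A warrant-backed use and a leaked master secret look identical to the math; the scheme's security reduces entirely to who can read MASTER_SECRET.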
Gorsameth
February 18 2016 19:39 GMT
#59163
On February 19 2016 04:34 Simberto wrote:
Show nested quote +
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so he can't be compelled to give them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone with it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross-examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous log-in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. It's to prove the suspect was the one using device X at the time.
oneofthem
February 18 2016 19:40 GMT
#59164
On February 19 2016 04:33 Acrofales wrote:
Show nested quote +
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.
Acrofales
February 18 2016 19:41 GMT
#59165
On February 19 2016 04:39 Gorsameth wrote:
Show nested quote +
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so he can't be compelled to give them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone with it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross-examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous log-in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. It's to prove the suspect was the one using device X at the time.


Well, decrypting the device's memory and getting the records from there is not going to help anybody at all with that task.
oneofthem
February 18 2016 19:42 GMT
#59166
On February 19 2016 04:34 JinDesu wrote:
My understanding is that if your house is built of adamantium and you destroy the key to your house and it's unpickable and they can only get in via your passcode, they cannot make you give them the passcode.

I think that's a fairer analogy to the iphone issue.

the problem is that this 'they cannot make you give them the passcode' is itself wrong, and also dynamic, depending on the technological environment.

if indestructible safes and houses do exist, then laws will be created around them to resolve the similar issues that will arise.
WolfintheSheep
February 18 2016 19:42 GMT
#59167
On February 19 2016 04:40 oneofthem wrote:
Show nested quote +
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

This is a back door.

Stop mislabelling it.
KwarK
February 18 2016 19:43 GMT
#59168
On February 19 2016 04:40 oneofthem wrote:
Show nested quote +
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

You understand that humans are the weak point in security, right? That if an FBI guy has it, then there's a fair chance he'll give it out over the phone to someone who says he's calling from Apple tech support.
oneofthem
February 18 2016 19:44 GMT
#59169
On February 19 2016 04:42 WolfintheSheep wrote:
Show nested quote +
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

This is a back door.

Stop mislabelling it.
don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security right?
Soap
February 18 2016 19:44 GMT
#59170
On February 19 2016 04:39 Gorsameth wrote:
Show nested quote +
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so he can't be compelled to give them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone with it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross-examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous log-in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. It's to prove the suspect was the one using device X at the time.


That's about anonymity, not encryption. A threat needs to be decrypted to work.
Acrofales
February 18 2016 19:45 GMT
#59171
On February 19 2016 04:40 oneofthem wrote:
Show nested quote +
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.


You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.


oneofthem
February 18 2016 19:45 GMT
#59172
On February 19 2016 04:43 KwarK wrote:
Show nested quote +
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

You understand that humans are the weak point in security, right? That if an FBI guy has it, then there's a fair chance he'll give it out over the phone to someone who says he's calling from Apple tech support.

i'm sure they can design these systems against the technical problems like fragility. within the space of practical options having a secure frontdoor for law enforcement is the best way to go.
WolfintheSheep
February 18 2016 19:46 GMT
#59173
On February 19 2016 04:44 oneofthem wrote:
Show nested quote +
On February 19 2016 04:42 WolfintheSheep wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
[quote]
Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

This is a back door.

Stop mislabelling it.
don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security right?

If I told you to make a gun that only good guys can use, you'd call me stupid.

When you say to make a security hole that only good guys can access, this is apparently a good idea.
puerk
February 18 2016 19:46 GMT
#59174
On February 19 2016 04:44 oneofthem wrote:
Show nested quote +
On February 19 2016 04:42 WolfintheSheep wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
[quote]
Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

This is a back door.

Stop mislabelling it.
don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security right?


please stop giving a shit about this thread if you cannot be bothered to inform yourself and constantly spew totally misinformed bullcrap

either you know how it works, or you ask, but don't lecture everyone on stuff you have no clue about... seriously it is getting embarrassing
Plansix
Last Edited: 2016-02-18 19:50:01
February 18 2016 19:47 GMT
#59175
On February 19 2016 04:39 Gorsameth wrote:
Show nested quote +
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so he can't be compelled to give them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone with it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross-examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous log-in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. It's to prove the suspect was the one using device X at the time.

And you prove that by showing they have sole access to the device, or had the device at the time the criminal activity took place. But to do that, you need to prove that the device sent the data and was the one used. It's a hurdle in all internet-based crime and digital evidence.

http://www.theverge.com/2016/2/16/11027732/apple-iphone-encryption-fbi-san-bernardino-shooters

On a side note, the judge specifically ordered that Apple assist with bypassing the auto-delete feature, which appears to be no small task, and also with removing the limit on the number of password attempts. If they cannot currently do that, they need to write software that will allow it.

So the order isn't for them to create a back door, but to remove the defenses they put in place so the FBI can force the phone open. With those defenses in place, every iPhone is completely immune to being forced open. I don't see anything unreasonable about that order.
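As a rough illustration of why those two defenses matter (a hypothetical Python sketch, not Apple's actual implementation): once the attempt limit, escalating delays, and auto-wipe are gone, a short numeric passcode falls to simple enumeration.

import itertools

def try_passcode(guess):
    # Stand-in for whatever per-guess check the phone performs; the
    # hard-coded value is purely for demonstration.
    return guess == "7391"

def brute_force_pin(length=4):
    # With no attempt limit or wipe, just try every combination.
    for digits in itertools.product("0123456789", repeat=length):
        guess = "".join(digits)
        if try_passcode(guess):
            return guess
    return None

print(brute_force_pin())  # at most 10,000 guesses for a 4-digit PIN

A long alphanumeric passcode would still be impractical to enumerate this way, which is why the order targets the guess-limiting defenses rather than the encryption itself.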

On February 19 2016 04:44 Soap wrote:
Show nested quote +
On February 19 2016 04:39 Gorsameth wrote:
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so he can't be compelled to give them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone with it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and records of a message being sent to that exact destination, aligning perfectly with the time stamps, still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. But a phone has no rights.


Courts require all evidence to be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness who could testify, under oath, that the email was sent from that specific phone. And that witness is open to cross examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous login.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. It's proving the suspect was the one using device X at the time.


That's about anonymity, not encryption. A threat needs to be decrypted to work.

No, it's about evidence. You need to prove that the person performed the criminal act, which normally means finding the evidence on their PC or phone. You can't just infer it; that isn't good enough.
I have the Honor to be your Obedient Servant, P.6
TL+ Member
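On the IP-address point above: one common way this gets tied together is by correlating what arrived on the victim's end with the sending service's connection logs, matching timestamps to a source address, and then asking the carrier which subscriber held that address at that moment. A rough sketch, with the log format and field names invented purely for illustration:

from datetime import datetime, timedelta

# Hypothetical records; real provider logs look different.
received_threat = {"received_at": datetime(2016, 2, 10, 21, 4, 12)}

provider_log = [
    {"ts": datetime(2016, 2, 10, 21, 4, 11), "src_ip": "203.0.113.57", "action": "SEND"},
    {"ts": datetime(2016, 2, 10, 22, 30, 0), "src_ip": "198.51.100.9", "action": "SEND"},
]

def candidate_senders(log, received_at, window_seconds=30):
    # Entries whose timestamp falls within a short window of the received message.
    lo = received_at - timedelta(seconds=window_seconds)
    hi = received_at + timedelta(seconds=window_seconds)
    return [e for e in log if e["action"] == "SEND" and lo <= e["ts"] <= hi]

print(candidate_senders(provider_log, received_threat["received_at"]))
# -> the 203.0.113.57 entry; a carrier record can then say which subscriber or
#    device held that address at that time.

Which is exactly the gap being described: this chain gets you to a device and a subscriber, but it still does not, by itself, prove who was holding the device.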
oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:47 GMT
#59176
On February 19 2016 04:45 Acrofales wrote:
Show nested quote +
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable; it's pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look, someone broke the iPhone backdoor and now everyone's data is on the streets, evil Apple let this happen!

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.


You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure, as about 100 people here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.



uh it does not have to be a passcode key obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal one point of failure, and this is just a technical challenge that should not define the tradeoff in question.
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
WolfintheSheep
Profile Joined June 2011
Canada14127 Posts
February 18 2016 19:48 GMT
#59177
On February 19 2016 04:47 oneofthem wrote:
Show nested quote +
On February 19 2016 04:45 Acrofales wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
[quote]
Any backdoor in the OS itself is exploitable; it's pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look, someone broke the iPhone backdoor and now everyone's data is on the streets, evil Apple let this happen!

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.


You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure, as about 100 people here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.



uh it does not have to be a passcode key obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal one point of failure, and this is just a technical challenge that should not define the tradeoff in question.


"device specific code that requires a secured algorithm to decrypt"

That's called a password.
Average means I'm better than half of you.
oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:49 GMT
#59178
On February 19 2016 04:46 puerk wrote:
Show nested quote +
On February 19 2016 04:44 oneofthem wrote:
On February 19 2016 04:42 WolfintheSheep wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
[quote]
the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

This is a back door.

Stop mislabelling it.
don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security right?


please stop giving a shit about this thread if you cannot be bothered to inform yourself and constantly spew totally misinformed bullcrap

either you know how it works, or you ask, but don't lecture everyone on stuff you have no clue about... seriously it is getting embarrassing
lol fuck off this post here is to resolve a linguistic confusion. and you dont know more about that than me
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:51 GMT
#59179
On February 19 2016 04:48 WolfintheSheep wrote:
Show nested quote +
On February 19 2016 04:47 oneofthem wrote:
On February 19 2016 04:45 Acrofales wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
[quote]
the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.


You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure, as about 100 people here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.



uh it does not have to be a passcode key obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal one point of failure, and this is just a technical challenge that should not define the tradeoff in question.


"device specific code that requires a secured algorithm to decrypt"

That's called a password.

not really, they can secure the method of applying the password. it doesn't have to be a simple passcode. maybe a piece of hardware that is authenticated via special satellite that only the u.s. govt can send up there.
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
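What is being sketched there is essentially key escrow: every device keeps its own key, but a copy of that key is also stored wrapped (encrypted) under a key held by some authority. A minimal illustration using the Python cryptography library's RSA-OAEP wrapping, assumptions and all (this is not how iOS key management actually works), which also shows why others in the thread call it a single point of failure:

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The authority's escrow keypair (the hypothetical "front door" holder).
escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_public = escrow_private.public_key()

# Each device gets its own random key, but a copy of every key is wrapped
# under the same escrow public key.
devices = {}
for device_id in ("phone-A", "phone-B", "phone-C"):
    device_key = os.urandom(32)                        # unique per device
    wrapped = escrow_public.encrypt(device_key, OAEP)  # only the escrow key opens this
    devices[device_id] = {"device_key": device_key, "wrapped_copy": wrapped}

# "Device specific" in the sense above: unwrapping phone-A's copy reveals
# nothing about phone-B's key...
recovered = escrow_private.decrypt(devices["phone-A"]["wrapped_copy"], OAEP)
assert recovered == devices["phone-A"]["device_key"]

# ...but whoever obtains escrow_private can repeat this for every device ever
# made, which is the single point of failure the other posters are pointing at.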
puerk
Profile Joined February 2015
Germany855 Posts
February 18 2016 19:52 GMT
#59180
On February 19 2016 04:47 Plansix wrote:
Show nested quote +
On February 19 2016 04:39 Gorsameth wrote:
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from Wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to rifle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid people from using safes, or locking their front doors, neither should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own, you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so he can't be compelled to give them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone from it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and records of a message being sent to that exact destination, aligning perfectly with the time stamps, still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. But a phone has no rights.


Courts require all evidence to be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness who could testify, under oath, that the email was sent from that specific phone. And that witness is open to cross examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous login.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. It's proving the suspect was the one using device X at the time.

And you prove that by showing they had sole access to the device, or had the device, at the time the criminal activity took place. But to do that, you need to prove that the device sent the data in the first place. It's a hurdle in all internet-based crime and digital evidence.

http://www.theverge.com/2016/2/16/11027732/apple-iphone-encryption-fbi-san-bernardino-shooters

On a side note, the judge specifically ordered that Apple assist with bypassing the auto-delete feature, which appears to be no small task, and with removing the limit on the number of passcode attempts. If they cannot currently do that, they need to write software that will allow them to.

So the order isn't for them to create a back door, but to remove the defenses they put in place so the FBI can force the phone open. With those in place, every iPhone is completely immune to being forced open. I don't see anything unreasonable about that order.

Sorry, but have you ever felt that maybe law school did not make you an expert in cryptography?
Does American law have no form of: de.wikipedia.org?
The gain from hacking that phone is insignificantly low, but its detriments are unpredictably large and severe. There is no compelling argument to do it. And "fuck i have no clue how stuff works, i will force the apple guys to do my bidding" just doesn't cut it.