US Politics Mega-thread - Page 2959

Read the rules in the OP before posting, please.

In order to ensure that this thread continues to meet TL standards and follows the proper guidelines, we will be enforcing the rules in the OP more strictly. Be sure to give them a re-read to refresh your memory! The vast majority of you are contributing in a healthy way, keep it up!

NOTE: When providing a source, explain why you feel it is relevant and what purpose it adds to the discussion if it's not obvious.
Also take note that unsubstantiated tweets/posts meant only to rekindle old arguments can result in a mod action.
JinDesu
Profile Blog Joined August 2010
United States3990 Posts
February 18 2016 19:34 GMT
#59161
My understanding is that if your house is built of adamantium and you destroy the key to your house and it's unpickable and they can only get in via your passcode, they cannot make you give them the passcode.

I think that's a fairer analogy to the iphone issue.
Yargh
Simberto
Profile Blog Joined July 2010
Germany11787 Posts
February 18 2016 19:34 GMT
#59162
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid people from using safes, or locking their front door, neither should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so can't be compelled to giving them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone from it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness who could testify, under oath, that the email was sent from that specific phone. And that witness is open to cross examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous log in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

And regarding the "front door for authorized personnel only": if that door exists, it has a key, and that key is a code of some sort. That means that if someone gets access to that code, they can access all the phones. Suddenly the question of whether your data is safe becomes "who can access that key?" Someone could crack the code, by brute force or whatever other way they prefer. Someone could pay a guy at Apple to hand over the code. Someone could break into Apple and steal the physical drive the code is stored on. And as soon as the code is out, there is nothing you can do: it is out there, and bad people will be able to access all of the phones. The existence of this possibility makes the data of a lot of people a lot less secure, and a lot of people will want access to that key, because it is worth a lot. Most people are already very uncomfortable with what the NSA does with our data. Now imagine that the FSB or Chinese intelligence gets that code. Or the mob. A single breaking point for a large system is not a good thing to have if you want to keep that system secure. Demanding to put one in is demanding to make everyone's data less secure just so the government can spy on you better.

(Of course that simplifies things a bit: it doesn't have to be one key for all phones; it could be a key generated per device by an algorithm from hardware specifications or something similar. That doesn't change anything, though, because the algorithm then becomes the central weak point that could break.)
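Simberto's "central weak point" argument can be sketched concretely. The scheme below is purely hypothetical (the names `MASTER_SECRET` and `derive_device_key` are made up for illustration): a vendor derives a distinct unlock key for each phone from one master secret, so every derived key looks unique, yet anyone who obtains the master secret can re-derive all of them.

```python
import hashlib
import hmac

# Hypothetical escrow scheme (illustration only): the vendor holds one
# master secret and derives each phone's unlock key from its serial number.
MASTER_SECRET = b"vendor-master-secret"  # the single point of failure

def derive_device_key(serial: str) -> bytes:
    # HMAC-SHA256 as a simple KDF: key = HMAC(master_secret, serial)
    return hmac.new(MASTER_SECRET, serial.encode(), hashlib.sha256).digest()

key_a = derive_device_key("PHONE-0001")
key_b = derive_device_key("PHONE-0002")

# Every device gets a distinct key...
assert key_a != key_b
# ...but the derivation is deterministic, so anyone holding MASTER_SECRET
# can recompute any device's key on demand. Leak the secret (or the
# algorithm plus its inputs) and every phone is open at once.
assert derive_device_key("PHONE-0001") == key_a
```

This is why "one key per phone" doesn't rescue the design: the security of every derived key still collapses to the secrecy of the one master value.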
Gorsameth
Profile Joined April 2010
Netherlands22164 Posts
February 18 2016 19:39 GMT
#59163
On February 19 2016 04:34 Simberto wrote:
IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. It's proving that the suspect was the one using device X at the time.
It ignores such insignificant forces as time, entropy, and death
oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:40 GMT
#59164
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
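WolfintheSheep's quoted point, that "authorized" in practice reduces to "whoever holds the key", is a property of the math itself. A toy cipher (deliberately insecure, for illustration only; the names `keystream` and `xor_crypt` are made up) shows that decryption verifies key possession and nothing else:

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Derive n pseudorandom bytes from the key (toy construction, NOT secure).
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

ct = xor_crypt(b"escrow-key", b"attack at dawn")

# An investigator with the key and a thief with the same key recover the
# same plaintext; nothing in the math distinguishes "authorized" holders.
assert xor_crypt(b"escrow-key", ct) == b"attack at dawn"
```

Any policy notion of "authorized access" has to be bolted on around the cryptography (custody procedures, audits, people); the cipher itself only ever asks one question: do you have the key bytes?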
Acrofales
Profile Joined August 2010
Spain18246 Posts
February 18 2016 19:41 GMT
#59165
On February 19 2016 04:39 Gorsameth wrote:
Just as an aside, the legal problem is not proving that device X sent the data. It's proving that the suspect was the one using device X at the time.


Well. Decrypting the device's memory and getting the records from there is not going to help anybody at all with that task.
oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:42 GMT
#59166
On February 19 2016 04:34 JinDesu wrote:
My understanding is that if your house is built of adamantium and you destroy the key to your house and it's unpickable and they can only get in via your passcode, they cannot make you give them the passcode.

I think that's a fairer analogy to the iphone issue.

the problem is that this "they cannot make you give them the passcode" is wrong, and it also shifts with the technological environment.

if indestructible safes and houses did exist, laws would be created around them to resolve the similar issues that would arise.
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
WolfintheSheep
Profile Joined June 2011
Canada14127 Posts
February 18 2016 19:42 GMT
#59167
On February 19 2016 04:40 oneofthem wrote:
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

This is a back door.

Stop mislabelling it.
Average means I'm better than half of you.
KwarK
Profile Blog Joined July 2006
United States43755 Posts
February 18 2016 19:43 GMT
#59168
On February 19 2016 04:40 oneofthem wrote:
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

You understand that humans are the weak point in security, right? If an FBI guy has it, there's a fair chance he'll give it out over the phone to someone who says he's calling from Apple tech support.
Moderator | The angels have the phone box
oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:44 GMT
#59169
On February 19 2016 04:42 WolfintheSheep wrote:

This is a back door.

Stop mislabelling it.
don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the "backdoor" is its lack of security, right?
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
Soap
Profile Blog Joined April 2010
Brazil1546 Posts
February 18 2016 19:44 GMT
#59170
On February 19 2016 04:39 Gorsameth wrote:
Show nested quote +
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from Wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to rifle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, neither should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so can't be compelled to give them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone from it forever, sending death threats and other terrible things and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous log in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. Its to prove the suspect was the one using device X at the time.


That's about anonymity, not encryption. A threat needs to be decrypted to work.
Acrofales
Profile Joined August 2010
Spain18246 Posts
February 18 2016 19:45 GMT
#59171
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear: this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.


You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.
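The "one key opens every door in the world" point can be made concrete with a toy model (not real cryptography; every name here is illustrative). If each phone's data can be unlocked by either the owner's key or one shared escrow key, a single leak of that escrow key opens every device ever made:

```python
# Toy model of a shared escrow key. Illustrative only, not a real scheme.
ESCROW_KEY = "master-escrow-key"

def unlock(device: dict, key: str) -> bool:
    # A device opens for its own owner's key or for the one escrow key.
    return key in (device["owner_key"], ESCROW_KEY)

# Ten thousand phones, each with a unique owner key...
phones = [{"owner_key": f"owner-{i}"} for i in range(10_000)]

# ...but a single leaked escrow key unlocks all of them.
assert all(unlock(p, "master-escrow-key") for p in phones)
# Owner keys, by contrast, only open their own device.
assert not unlock(phones[0], "owner-1")
```

That single shared secret is the "single point of failure" in the analogy: compromise it once and every door is open.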


oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:45 GMT
#59172
On February 19 2016 04:43 KwarK wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear: this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

You understand that humans are the weak point in security, right? That if a FBI guy has it then there's a fair chance he'll give it out over the phone to someone who says he's calling from Apple tech support.

i'm sure they can design these systems against the technical problems like fragility. within the space of practical options having a secure frontdoor for law enforcement is the best way to go.
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
WolfintheSheep
Profile Joined June 2011
Canada14127 Posts
February 18 2016 19:46 GMT
#59173
On February 19 2016 04:44 oneofthem wrote:
On February 19 2016 04:42 WolfintheSheep wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear: this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

This is a back door.

Stop mislabelling it.
don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security right?

If I told you to make a gun that only good guys can use, you'd call me stupid.

When you say to make a security hole that only good guys can access, this is apparently a good idea.
Average means I'm better than half of you.
puerk
Profile Joined February 2015
Germany855 Posts
February 18 2016 19:46 GMT
#59174
On February 19 2016 04:44 oneofthem wrote:
On February 19 2016 04:42 WolfintheSheep wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear: this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

This is a back door.

Stop mislabelling it.
don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security right?


please stop giving a shit about this thread if you cannot be bothered to inform yourself and constantly spew totally misinformed bullcrap

either you know how it works, or you ask, but don't lecture everyone on stuff you have no clue about... seriously it is getting embarrassing
Plansix
Profile Blog Joined April 2011
United States60190 Posts
Last Edited: 2016-02-18 19:50:01
February 18 2016 19:47 GMT
#59175
On February 19 2016 04:39 Gorsameth wrote:
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from Wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to rifle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, neither should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so can't be compelled to give them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone from it forever, sending death threats and other terrible things and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous log in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. Its to prove the suspect was the one using device X at the time.

And you prove that by saying they have sole access to the device or had the device at the time the criminal activity took place. But to do that, you need to prove that the device sent the data and was the one used. Its a hurdle in all internet based crime and digital evidence.

http://www.theverge.com/2016/2/16/11027732/apple-iphone-encryption-fbi-san-bernardino-shooters

On a side note, the judge specifically ordered that Apple assist with bypassing the auto-delete feature, which appears to be no small task, and with removing the limit on the number of password attempts. If they cannot currently do that, they need to write software that will let them.

So the order isn't for them to create a back door, but to remove the defenses they put in place so the FBI can force the phone open. With those defenses in place, every iPhone is completely immune to being forced open. I don't see anything unreasonable about that order.
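A rough back-of-the-envelope shows why those two defenses are the whole ballgame: once the attempt limit and the auto-erase are gone, a short numeric passcode falls to brute force almost immediately. A minimal sketch, where the guess rate is an assumption (real hardware-bound attempts take roughly 80 ms each):

```python
# Back-of-the-envelope: brute-forcing a 4-digit PIN once the retry
# limit and auto-erase are removed. The guess rate is an assumption.
PIN_SPACE = 10 ** 4          # 0000..9999 => 10,000 possible PINs
GUESSES_PER_SECOND = 12.5    # ~80 ms per attempt (assumed)

worst_case_seconds = PIN_SPACE / GUESSES_PER_SECOND
print(f"worst case: {worst_case_seconds / 60:.0f} minutes")  # ~13 minutes
```

With the limits in place, by contrast, ten wrong guesses trigger the wipe, so the PIN's tiny keyspace never matters.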

On February 19 2016 04:44 Soap wrote:
On February 19 2016 04:39 Gorsameth wrote:
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from Wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to rifle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, neither should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so can't be compelled to give them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone from it forever, sending death threats and other terrible things and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous log in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. Its to prove the suspect was the one using device X at the time.


That's about anonymity, not encryption. A threat needs to be decrypted to work.

No, it's about evidence. You need to prove that the person performed the criminal act, which normally means finding the evidence on their PC/phone. You can't just infer it; that isn't good enough.
I have the Honor to be your Obedient Servant, P.6
TL+ Member
oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:47 GMT
#59176
On February 19 2016 04:45 Acrofales wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear: this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.


You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.



uh it does not have to be a passcode key, obviously. there can be a device-specific code that requires a secured algorithm to decrypt. it would not be a universal single point of failure, and this is just a technical challenge that should not define the tradeoff in question.
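For what it's worth, the standard way to get device-specific codes is to derive each one from a single vendor secret plus the device ID, which moves the single point of failure rather than removing it. A hedged sketch (VENDOR_SECRET and the derivation scheme are assumptions for illustration, not Apple's actual design):

```python
import hashlib
import hmac

# Hypothetical vendor-side derivation of per-device escrow codes.
# Illustrative only: the secret and the scheme are assumed.
VENDOR_SECRET = b"assumed-vendor-secret"

def escrow_code(device_id: str) -> str:
    # Each device gets a unique code keyed off the one vendor secret.
    return hmac.new(VENDOR_SECRET, device_id.encode(), hashlib.sha256).hexdigest()

# Per-device codes do differ...
assert escrow_code("phone-A") != escrow_code("phone-B")
# ...but whoever obtains VENDOR_SECRET can regenerate every one of
# them, so the derivation secret is still a single point of failure.
```

The "technical challenge", in other words, is that the derivation secret concentrates exactly the risk the per-device codes were supposed to spread out.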
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
WolfintheSheep
Profile Joined June 2011
Canada14127 Posts
February 18 2016 19:48 GMT
#59177
On February 19 2016 04:47 oneofthem wrote:
On February 19 2016 04:45 Acrofales wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey, look, someone broke the iPhone backdoor and now everyone's data is on the streets, evil Apple let this happen!

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option, with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both insecure backdoors and the inability to access a device in the course of an investigation.


You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure, as about 100 people here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.



uh it does not have to be a passcode key, obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal single point of failure, and this is just a technical challenge that should not define the tradeoff in question.


"device specific code that requires a secured algorithm to decrypt"

That's called a password.
Average means I'm better than half of you.
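The "device specific code" exchange above can be made concrete. What is being described is essentially key escrow: each device carries its own random key, stored wrapped (encrypted) under a single escrow key held by the authority. A toy sketch of that shape — illustration only, not real cryptography, and all names are hypothetical:

```python
import os
import hmac
import hashlib

# Toy illustration only -- NOT real cryptography. All names are hypothetical.
ESCROW_KEY = os.urandom(32)  # the one secret the escrow authority must guard

def wrap_device_key(device_id: bytes, device_key: bytes) -> bytes:
    """Encrypt a per-device key under the escrow key (XOR with a keyed stream)."""
    stream = hmac.new(ESCROW_KEY, device_id, hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(device_key, stream))

def unwrap_device_key(device_id: bytes, wrapped: bytes) -> bytes:
    """Whoever holds ESCROW_KEY can recover *any* device's key."""
    return wrap_device_key(device_id, wrapped)  # XOR is its own inverse

phone_a = os.urandom(32)
phone_b = os.urandom(32)
wrapped_a = wrap_device_key(b"phone-a", phone_a)
wrapped_b = wrap_device_key(b"phone-b", phone_b)

# The wrapped blobs differ per device ("device specific"), yet one leaked
# escrow key recovers every device key.
assert unwrap_device_key(b"phone-a", wrapped_a) == phone_a
assert unwrap_device_key(b"phone-b", wrapped_b) == phone_b
```

Both sides of the argument are visible here: the per-device blobs are indeed distinct, but every one of them reduces to `ESCROW_KEY` — which is the single point of failure the thread keeps circling.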
oneofthem
Profile Blog Joined November 2005
Cayman Islands, 24199 Posts
February 18 2016 19:49 GMT
#59178
On February 19 2016 04:46 puerk wrote:
On February 19 2016 04:44 oneofthem wrote:
On February 19 2016 04:42 WolfintheSheep wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
[quote]
the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option, with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both insecure backdoors and the inability to access a device in the course of an investigation.

This is a back door.

Stop mislabelling it.
don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security, right?


please stop giving a shit about this thread if you can't be asked to inform yourself and constantly spew totally misinformed bullcrap

either you know how it works, or you ask, but don't lecture everyone on stuff you have no clue about... seriously it is getting embarrassing
lol fuck off, this post here is to resolve a linguistic confusion. and you don't know more about that than me
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
oneofthem
Profile Blog Joined November 2005
Cayman Islands, 24199 Posts
February 18 2016 19:51 GMT
#59179
On February 19 2016 04:48 WolfintheSheep wrote:
On February 19 2016 04:47 oneofthem wrote:
On February 19 2016 04:45 Acrofales wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
[quote]
the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option, with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both insecure backdoors and the inability to access a device in the course of an investigation.


You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure, as about 100 people here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.



uh it does not have to be a passcode key, obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal single point of failure, and this is just a technical challenge that should not define the tradeoff in question.


"device specific code that requires a secured algorithm to decrypt"

That's called a password.

not really, they can secure the method of applying the password. it doesn't have to be a simple passcode. maybe a piece of hardware that is authenticated via a special satellite that only the u.s. govt can send up there.
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
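The hardware-token idea in the post above amounts to an authorization check: the device honors an unlock request only when it comes with a credential that only the government can produce. A toy sketch of that shape (hypothetical names; HMAC standing in for whatever the real scheme would use — a real design would use signatures so devices hold only a public key, but the private signing key would still be one universal secret):

```python
import os
import hmac
import hashlib

# Toy illustration only -- NOT real cryptography. All names are hypothetical.
GOVT_SECRET = os.urandom(32)  # whatever the "satellite" ultimately authenticates with

def issue_unlock_token(device_id: bytes) -> bytes:
    """Mint an unlock credential that is valid for exactly one device."""
    return hmac.new(GOVT_SECRET, b"unlock:" + device_id, hashlib.sha256).digest()

def device_accepts(device_id: bytes, token: bytes) -> bool:
    """The device recomputes the expected tag and compares in constant time."""
    expected = hmac.new(GOVT_SECRET, b"unlock:" + device_id, hashlib.sha256).digest()
    return hmac.compare_digest(expected, token)

tok_a = issue_unlock_token(b"phone-a")
assert device_accepts(b"phone-a", tok_a)
assert not device_accepts(b"phone-b", tok_a)  # device-specific, as claimed
# ...but anyone who obtains GOVT_SECRET can mint tokens for every phone,
# so the scheme still hinges on a single secret staying secret.
```

So the tokens are genuinely device-specific, which is the point being made in favor of the design; the counterpoint in the replies is that the minting secret itself remains a single point of failure.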
puerk
Profile Joined February 2015
Germany, 855 Posts
February 18 2016 19:52 GMT
#59180
On February 19 2016 04:47 Plansix wrote:
On February 19 2016 04:39 Gorsameth wrote:
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from Wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to rifle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid people from using safes, or locking their front door, neither should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own, you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so he can't be compelled to give them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone from it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and records of a message being sent to that exact destination, aligning perfectly with the timestamps, still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. But a phone has no rights.


Courts require all evidence to be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness who could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross-examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous login.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. It's proving the suspect was the one using device X at the time.

And you prove that by showing they had sole access to the device, or had the device at the time the criminal activity took place. But to do that, you need to prove that the device sent the data and was the one used. It's a hurdle in all internet-based crime and digital evidence.

http://www.theverge.com/2016/2/16/11027732/apple-iphone-encryption-fbi-san-bernardino-shooters

On a side note, the Judge specifically ordered that Apple assist with bypassing the auto-delete feature, which appears to be no small task, and with removing the limit on the number of password attempts. If they cannot currently do that, they need to write software that will allow it.

So the order isn't for them to create a back door, but to remove the defenses they put in place so the FBI can force the phone open. With those in place, every iPhone is completely immune to being forced open. I don't see anything unreasonable about that order.

Sorry, but have you ever felt that maybe law school did not make you an expert in cryptography?
Does American law have no form of: de.wikipedia.org?
The gain from hacking that phone is insignificantly low, but its detriments are unpredictably large and severe. There is no compelling argument to do it. And "fuck, i have no clue how stuff works, i will force the apple guys to do my bidding" just doesn't cut it.
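The order described in the quoted post matters because of arithmetic: a 4-digit passcode has only 10,000 possibilities, so the 10-try auto-wipe and escalating delays are the only things standing between it and a trivial brute force. A minimal sketch, assuming a hypothetical `check_passcode` oracle in place of the phone's real hardware-backed check:

```python
from itertools import product

# Hypothetical stand-in for the phone's passcode check. The real check is
# entangled with a hardware key, rate-limited, and wipes after 10 failures --
# those limits are exactly what the order asks Apple to remove.
SECRET = "7294"

def check_passcode(guess: str) -> bool:
    return guess == SECRET

def brute_force_4_digits():
    """With no attempt limit, at most 10,000 tries are needed."""
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if check_passcode(guess):
            return guess
    return None

assert brute_force_4_digits() == "7294"
# Even at roughly 80 ms per hardware-backed attempt, 10,000 tries take
# under 15 minutes once the wipe and delays are gone.
```

Which is the crux of the "it's just removing defenses" framing: the software change is modest, but it converts the passcode from an effective lock into a quarter-hour delay.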
Original banner artwork: Jim Warren
The contents of this webpage are copyright © 2026 TLnet. All Rights Reserved.