US Politics Mega-thread - Page 2959

Read the rules in the OP before posting, please.

In order to ensure that this thread continues to meet TL standards and follows the proper guidelines, we will be enforcing the rules in the OP more strictly. Be sure to give them a re-read to refresh your memory! The vast majority of you are contributing in a healthy way, keep it up!

NOTE: When providing a source, explain why you feel it is relevant and what purpose it adds to the discussion if it's not obvious.
Also take note that unsubstantiated tweets/posts meant only to rekindle old arguments can result in a mod action.
JinDesu
Profile Blog Joined August 2010
United States3990 Posts
February 18 2016 19:34 GMT
#59161
My understanding is that if your house is built of adamantium and you destroy the key to your house and it's unpickable and they can only get in via your passcode, they cannot make you give them the passcode.

I think that's a fairer analogy to the iphone issue.
Yargh
Simberto
Profile Blog Joined July 2010
Germany11512 Posts
February 18 2016 19:34 GMT
#59162
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so can't be compelled to giving them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an Iphone for review, you could harass someone from it forever, sending death threats and other terrible thing and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross examination, where the other side can as if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for twitter and all other forms of social media that allow for anonymous log in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

And regarding the "front door for authorized personnel only": if that door exists, it has a key. That key is a code of some sort. That means that if someone gets access to that code, they can open all the phones. Suddenly, the question of whether your data is safe becomes "Who can access that key?" Someone could crack the code, through brute force or whatever other way they prefer. Someone could pay a dude at Apple to give them the code. Someone could break into Apple and steal the physical drive that the code is on. And as soon as the code is out, there is nothing you can do. It is out there, and bad people will be able to access all of the phones. The existence of this possibility makes the data of a lot of people a lot less secure. And a lot of people will want access to that key, because it is worth a lot. Most people are already very uncomfortable with what the NSA does with our data. Now imagine that the FSB or the Chinese intelligence agency gets that code. Or the mob. A single breaking point for a large system is not a good thing to have if you want to keep that system secure. Demanding to put one in is demanding to make everyone's data less secure just so the government can spy on you better.

(Of course that simplifies things a bit; it doesn't have to be one key for all phones, it could be something that generates a per-device key from an algorithm based on hardware specifications or something similar. That doesn't change anything, though: now the algorithm and whatever secret it relies on are the central weak point that could break.)
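
To make that concrete, here is a minimal, purely illustrative sketch (the master secret, the serial numbers, and the HMAC-based derivation are assumptions for illustration, not Apple's actual design) of why "one algorithm plus one master secret" still collapses to a single point of failure:

import hmac
import hashlib

# Hypothetical "front door" escrow scheme, for illustration only: the vendor
# derives each device's unlock key from ONE master secret plus a public
# hardware identifier. This is not Apple's real design.

MASTER_SECRET = b"held-by-apple-or-the-fbi"  # the single point of failure

def device_unlock_key(serial_number: str) -> bytes:
    """Derive a per-device key from the master secret and a public serial."""
    return hmac.new(MASTER_SECRET, serial_number.encode(), hashlib.sha256).digest()

# Every phone gets a different key, so access looks scoped per device...
key_a = device_unlock_key("SERIAL-AAAA-0001")
key_b = device_unlock_key("SERIAL-BBBB-0002")
assert key_a != key_b

# ...but anyone who obtains MASTER_SECRET (a leak, a bribe, a break-in, a hack)
# can recompute every device key offline. Publishing the algorithm doesn't
# matter; the secrecy of one value is protecting every phone at once.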
Gorsameth
Profile Joined April 2010
Netherlands21687 Posts
February 18 2016 19:39 GMT
#59163
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so can't be compelled to giving them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an Iphone for review, you could harass someone from it forever, sending death threats and other terrible thing and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross examination, where the other side can as if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for twitter and all other forms of social media that allow for anonymous log in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. It's proving that the suspect was the one using device X at the time.
It ignores such insignificant forces as time, entropy, and death
oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:40 GMT
#59164
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
Acrofales
Profile Joined August 2010
Spain17992 Posts
February 18 2016 19:41 GMT
#59165
On February 19 2016 04:39 Gorsameth wrote:
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so can't be compelled to giving them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an Iphone for review, you could harass someone from it forever, sending death threats and other terrible thing and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross examination, where the other side can as if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for twitter and all other forms of social media that allow for anonymous log in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. Its to prove the suspect was the one using device X at the time.


Well. Decrypting the device's memory and getting the records from there is not going to help anybody at all with that task.
oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:42 GMT
#59166
On February 19 2016 04:34 JinDesu wrote:
My understanding is that if your house is built of adamantium and you destroy the key to your house and it's unpickable and they can only get in via your passcode, they cannot make you give them the passcode.

I think that's a fairer analogy to the iphone issue.

problem is this 'they cannot make you give them the passcode' is wrong and also dynamic depending on technological environment.

if indestructible safes and houses do exist, then laws will be created around them to resolve the similar issues that will arise.
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
WolfintheSheep
Profile Joined June 2011
Canada14127 Posts
February 18 2016 19:42 GMT
#59167
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

This is a back door.

Stop mislabelling it.
Average means I'm better than half of you.
KwarK
Profile Blog Joined July 2006
United States42694 Posts
February 18 2016 19:43 GMT
#59168
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

You understand that humans are the weak point in security, right? That if an FBI guy has it, then there's a fair chance he'll give it out over the phone to someone who says he's calling from Apple tech support.
Moderator | The angels have the phone box
oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:44 GMT
#59169
On February 19 2016 04:42 WolfintheSheep wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

This is a back door.

Stop mislabelling it.
don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security right?
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
Soap
Profile Blog Joined April 2010
Brazil1546 Posts
February 18 2016 19:44 GMT
#59170
On February 19 2016 04:39 Gorsameth wrote:
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so can't be compelled to giving them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an Iphone for review, you could harass someone from it forever, sending death threats and other terrible thing and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross examination, where the other side can as if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for twitter and all other forms of social media that allow for anonymous log in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. Its to prove the suspect was the one using device X at the time.


That's about anonymity, not encryption. A threat needs to be decrypted to work.
Acrofales
Profile Joined August 2010
Spain17992 Posts
February 18 2016 19:45 GMT
#59171
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.


You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.


oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:45 GMT
#59172
On February 19 2016 04:43 KwarK wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

You understand that humans are the weak point in security, right? That if a FBI guy has it then there's a fair chance he'll give it out over the phone to someone who says he's calling from Apple tech support.

i'm sure they can design these systems against the technical problems like fragility. within the space of practical options having a secure frontdoor for law enforcement is the best way to go.
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
WolfintheSheep
Profile Joined June 2011
Canada14127 Posts
February 18 2016 19:46 GMT
#59173
On February 19 2016 04:44 oneofthem wrote:
On February 19 2016 04:42 WolfintheSheep wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
[quote]
Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

This is a back door.

Stop mislabelling it.
don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security right?

If I told you to make a gun that only good guys can use, you'd call me stupid.

When you say to make a security hole that only good guys can access, this is apparently a good idea.
Average means I'm better than half of you.
puerk
Profile Joined February 2015
Germany855 Posts
February 18 2016 19:46 GMT
#59174
On February 19 2016 04:44 oneofthem wrote:
On February 19 2016 04:42 WolfintheSheep wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
[quote]
Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear this, and only this is the front door. Amy other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

This is a back door.

Stop mislabelling it.
don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security right?


please stop giving a shit about this thread if you can't be bothered to inform yourself and constantly spew totally misinformed bullcrap

either you know how it works, or you ask, but don't lecture everyone on stuff you have no clue about... seriously it is getting embarrassing
Plansix
Profile Blog Joined April 2011
United States60190 Posts
Last Edited: 2016-02-18 19:50:01
February 18 2016 19:47 GMT
#59175
On February 19 2016 04:39 Gorsameth wrote:
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so can't be compelled to giving them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an Iphone for review, you could harass someone from it forever, sending death threats and other terrible thing and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness that could testify that the email was sent from that specific phone while the witness is under oath. And that witness is open to cross examination, where the other side can as if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for twitter and all other forms of social media that allow for anonymous log in.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. Its to prove the suspect was the one using device X at the time.

And you prove that by showing they had sole access to the device, or had the device at the time the criminal activity took place. But to do that, you first need to prove that the device sent the data and was the one used. It's a hurdle in all internet-based crime and digital evidence.

http://www.theverge.com/2016/2/16/11027732/apple-iphone-encryption-fbi-san-bernardino-shooters

On a side note, the judge specifically ordered that Apple assist with bypassing the auto-delete feature, which appears to be no small task, and with removing the limit on the number of password attempts. If they cannot currently do that, they need to write software that will allow them to.

So the order isn't for them to create a back door, but to remove the defenses they put in place so the FBI can force the phone open. With those in place, every iPhone is completely immune to being forced open. I don't see anything unreasonable about that order.
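
For a rough sense of what removing just those two protections buys the FBI, here is a back-of-the-envelope sketch; the ~80 ms per guess is the commonly cited hardware key-derivation delay for iPhones of that generation and is an assumption here, not a figure from the order:

# Worst-case brute-force time once the 10-attempt wipe and the escalating
# retry delays are gone, leaving only the per-guess key-derivation cost.
# 0.08 s per attempt is an assumed figure (commonly cited for the iPhone's
# passcode key derivation), not a measurement of this specific phone.

SECONDS_PER_ATTEMPT = 0.08

def worst_case_hours(digits: int) -> float:
    attempts = 10 ** digits          # a numeric passcode of the given length
    return attempts * SECONDS_PER_ATTEMPT / 3600

print(f"4-digit passcode: about {worst_case_hours(4):.1f} hours")   # ~0.2 h
print(f"6-digit passcode: about {worst_case_hours(6):.1f} hours")   # ~22 h

# A long alphanumeric passphrase stays effectively out of reach, which is
# why the order goes after the retry protections rather than the encryption.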

On February 19 2016 04:44 Soap wrote:
On February 19 2016 04:39 Gorsameth wrote:
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to ruffle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid ppl from using safes, or locking their front door, should we try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so can't be compelled to giving them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an Iphone for review, you could harass someone from it forever, sending death threats and other terrible thing and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence to be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness who could testify, under oath, that the email was sent from that specific phone. And that witness is open to cross examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous login.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. Its to prove the suspect was the one using device X at the time.


That's about anonymity, not encryption. A threat needs to be decrypted to work.

No, it's about evidence. You need to prove that the person performed the criminal act, which normally means finding the evidence on their PC or phone. You can't just infer it; that isn't good enough.
I have the Honor to be your Obedient Servant, P.6
TL+ Member
oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:47 GMT
#59176
On February 19 2016 04:45 Acrofales wrote:
Show nested quote +
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
On February 19 2016 02:27 oneofthem wrote:
privacy hawks are making this harder than necessary. there are ways to make iphones unlock without thereby propagating said tools or methods. the means could itself be protected by encryption in some secure location.

strawmanning open access when it need not go so far is just a waste of time

Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.


You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.



uh it does not have to be a passcode key obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal one point of failure, and this is just a technical challenge that should not define the tradeoff in question.
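
For what it's worth, a per-device escrow scheme along the lines being gestured at here can be sketched in a few lines. It is purely illustrative (nothing in it is Apple's actual design, and it assumes the third-party `cryptography` package), and the comments mark the catch the other posters keep pointing at: whoever holds the escrow private key holds a master key for every device.

```python
# Illustrative sketch of "per-device key, wrapped for an escrow holder".
# Not Apple's design; it shows why critics call any such scheme a single
# point of failure: the one escrow private key unwraps EVERY device key.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The "front door" key pair the government (or Apple) would hold.
escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_public = escrow_private.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Each phone gets its own random data key; only the wrapped copy differs.
device_key = AESGCM.generate_key(bit_length=256)
wrapped_for_escrow = escrow_public.encrypt(device_key, oaep)

# User data is encrypted under the per-device key as usual.
nonce = os.urandom(12)
ciphertext = AESGCM(device_key).encrypt(nonce, b"text messages, photos...", None)

# Investigator path: the single escrow private key recovers ANY device's key.
recovered_key = escrow_private.decrypt(wrapped_for_escrow, oaep)
print(AESGCM(recovered_key).decrypt(nonce, ciphertext, None))
```

The per-device wrapping doesn't change the underlying math: lose, leak, or compel that one escrow key and every phone opens, which is the single point of failure the replies above describe.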
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
WolfintheSheep
Profile Joined June 2011
Canada14127 Posts
February 18 2016 19:48 GMT
#59177
On February 19 2016 04:47 oneofthem wrote:
Show nested quote +
On February 19 2016 04:45 Acrofales wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
On February 19 2016 02:35 Gorsameth wrote:
[quote]
Any backdoor in the OS itself is exploitable, its pretty impossible for it not to be.
Apple can do this with a 1 time program to only break this one phone.

Then the FBI/NSA/CIA will have another phone, and another, and another and....
Before long Apple has several people doing nothing but writing 1 use programs to break phones.

"Hey wouldn't it be easy if there was just a general backdoor we could use? We would need the physical phone so its all safe"

"Hey yeah that backdoor? Its kinda bad if we need the phone itself you know, we want to catch people before they commit crimes so we need to be able to remote breakin"

Oh hey look someone broke the Iphone backdoor and now everyone's data is on the streets, evil Apple let this happen!.

Intelligence organizations have a long history of breaking laws and gathering anything and everything they can get their hands on regardless of use. No I don't trust them to act responsibly with this technology for even a second.

the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.


You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.



uh it does not have to be a passcode key obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal one point of failure, and this is just a technical challenge that should not define the tradeoff in question.


"device specific code that requires a secured algorithm to decrypt"

That's called a password.
Average means I'm better than half of you.
oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:49 GMT
#59178
On February 19 2016 04:46 puerk wrote:
Show nested quote +
On February 19 2016 04:44 oneofthem wrote:
On February 19 2016 04:42 WolfintheSheep wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
[quote]
the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.

This is a back door.

Stop mislabelling it.
don't really give a shit what you call it as long as it is secure enough. you do realize the functional significance of the 'backdoor' is its lack of security right?


please stop giving a shit about this thread if you cannot be asked to inform yourself and constantly spew totally misinformed bullcrap

either you know how it works, or you ask, but don't lecture everyone on stuff you have no clue about... seriously it is getting embarrassing
lol fuck off this post here is to resolve a linguistic confusion. and you dont know more about that than me
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
oneofthem
Profile Blog Joined November 2005
Cayman Islands24199 Posts
February 18 2016 19:51 GMT
#59179
On February 19 2016 04:48 WolfintheSheep wrote:
Show nested quote +
On February 19 2016 04:47 oneofthem wrote:
On February 19 2016 04:45 Acrofales wrote:
On February 19 2016 04:40 oneofthem wrote:
On February 19 2016 04:33 Acrofales wrote:
On February 19 2016 04:25 oneofthem wrote:
On February 19 2016 04:20 WolfintheSheep wrote:
On February 19 2016 04:13 oneofthem wrote:
On February 19 2016 04:08 puerk wrote:
On February 19 2016 03:56 oneofthem wrote:
[quote]
the whole point of bringing it up in a court of law is to have a front door rather than a backdoor, with the proper precautions and safeguards.

there is already a front door, the passkey the owner of the phone used. (have they even tried swiping his finger on the sensor, if it is biometrically locked?)
everything else is by definition of encryption systems a back door
so what is the problem with creating another similarly secure front door mechanism that authorized access can open?

The problem, as stated a hundred times already, is that your use of the word "authorized" is not in line with reality.

You think it means "only the people we want".

The reality is "only people who have the key".

And there is a gigantic difference.

that is not what i think at all, try again. a front door works the same way but with proper safeguards that a 'back door' lacks. you might as well say the fact the owner can use the phone is a security flaw.


OK. So the front door requires a special key that the user can configure. He sets his code to fjord$^_fhrid4568nrtbr÷==AT&TIEN

He memorized this key and doesn't tell anybody. He subsequently dies, leaving the front door locked forever.

Now what you're proposing is that somehow Apple magically has this key too? How?

EDIT: and just to be clear, this, and only this, is the front door. Any other access mechanism is, by its very definition, a back door.
the front door would be a secured access option with the fbi or whoever holding the key. i am talking about designing the security process so as to avoid both unsecure backdoors and inability to access device in the process of investigation and so on.


You know that the analogy is a house with two doors. One with your key and one with the FBI's key. Except that the FBI's key also opens everybody else's door in the entire world. It's a single point of failure as about 100 ppl here have pointed out, and by its very definition insecure. You seem to be throwing words around without knowing their technical definition in this context.



uh it does not have to be a passcode key obviously. there can be device specific code that requires a secured algorithm to decrypt. it would not be a universal one point of failure, and this is just a technical challenge that should not define the tradeoff in question.


"device specific code that requires a secured algorithm to decrypt"

That's called a password.

not really, they can secure the method of applying the password. it doesn't have to be a simple passcode. maybe a piece of hardware that is authenticated via special satellite that only the u.s. govt can send up there.
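
Setting the satellite aside, the nearest real-world version of "securing the method of applying the password" is a challenge-response step tied to possession of a specific piece of hardware. A toy sketch (purely illustrative; the in-memory shared secret below stands in for whatever a real token would hold) shows the idea, and also why it reduces to the same question of who holds that secret:

```python
# Toy challenge-response "unlock" gated on possession of a hardware token.
# Purely illustrative: the shared secret stands in for the token's contents,
# and anyone who can copy it is back to holding a single master secret.
import hashlib
import hmac
import os

TOKEN_SECRET = os.urandom(32)  # provisioned into the government-held token

def token_response(challenge: bytes) -> bytes:
    """What the physical token would compute when presented to the phone."""
    return hmac.new(TOKEN_SECRET, challenge, hashlib.sha256).digest()

def phone_accepts_unlock(challenge: bytes, response: bytes) -> bool:
    """The phone checks the response against the secret it was provisioned with."""
    expected = hmac.new(TOKEN_SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)                 # fresh nonce issued by the phone
response = token_response(challenge)
print(phone_accepts_unlock(challenge, response))  # True only with the real token
```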
We have fed the heart on fantasies, the heart's grown brutal from the fare, more substance in our enmities than in our love
puerk
Profile Joined February 2015
Germany855 Posts
February 18 2016 19:52 GMT
#59180
On February 19 2016 04:47 Plansix wrote:
Show nested quote +
On February 19 2016 04:39 Gorsameth wrote:
On February 19 2016 04:34 Simberto wrote:
On February 19 2016 04:20 Plansix wrote:
On February 19 2016 04:11 WolfintheSheep wrote:
On February 19 2016 03:56 Plansix wrote:
On February 19 2016 03:48 Acrofales wrote:
On February 19 2016 03:43 Jayme wrote:
On February 19 2016 03:35 Acrofales wrote:
Extremely relevant article from Wired in 2014 on this issue. I agree completely. The sense of entitlement of law officials to be able to rifle through your stuff is quite ludicrous.

Now I sympathize with the FBI in this case and wish them all the luck in the world cracking that phone. But just as we don't forbid people from using safes or locking their front doors, we shouldn't try to block the computational equivalent of these technologies.

And no, a search warrant doesn't mean you have to open your safe. Just your front door.


Edit: forgot the link.
http://www.wired.com/2014/10/golden-key/


That actually depends on what the Search Warrant says. If it specifically indicates a safe they know you own you're actually compelled to open it. Normally they just take the entire thing and open it themselves. Careful with statements like that. Search Warrants aren't limited to your house.



True, but they can't prove you didn't forget the code. And in this case the owner is dead, so he can't be compelled to give them the code.

From a legal standpoint, that makes it very hard to prove a lot of things. If they can never open an iPhone for review, you could harass someone from it forever, sending death threats and other terrible things, and never be charged. As long as you did it all on the phone, including the creation of the address, it could be impossible to prove you sent the threats.

A tiny computer that can never be opened by anyone but the owner is the dream of almost every slightly smart criminal.

This exists already, and it's called the human brain.

Realistically, everything you send out on a phone still exists on the receiver's phone, and the records of some message being sent to that exact destination which align perfectly with the time stamps still exist. You don't need to break open a criminal's phone to prove that they were transmitting the incriminating messages beyond a reasonable doubt.

Honestly, people pretend that secrets didn't exist before computers, and that the situation is somehow worse. In reality it's far, far better for law enforcement, even with encryption, because you can't brute force or hack a person (legally) who is keeping secrets in their head. but a phone has no rights.


Courts require all evidence to be authenticated by a witness, confirming that the evidence is what the attorney claims it is. So in the case of an email, they would need a witness who could testify, under oath, that the email was sent from that specific phone. And that witness is open to cross examination, where the other side can ask if they are 100% sure. If they can never open the phone, I question how they would ever prove that it came from that specific phone and not someplace else. The same goes for Twitter and all other forms of social media that allow for anonymous login.


IP addresses. If the phone sends data through the internet, it has a very specific address.

Just as an aside, the legal problem is not proving that device X sent the data. Its to prove the suspect was the one using device X at the time.

And you prove that by showing they had sole access to the device, or had the device, at the time the criminal activity took place. But to do that, you first need to prove that the device in question is the one that sent the data. It's a hurdle in all internet-based crime and digital evidence.

http://www.theverge.com/2016/2/16/11027732/apple-iphone-encryption-fbi-san-bernardino-shooters

On a side note, the Judge specifically ordered that Apple assist with bypassing the auto-delete feature, which appears to be no small task, and with removing the limit on the number of password attempts. If they cannot currently do that, they need to write software that will allow them to.

So the order isn't for them to create a back door, but to remove the defenses they put in place so the FBI can force the phone open. With those defenses in place, every iPhone is effectively immune to being forced open. I don't see anything unreasonable about that order.

Sorry, but have you ever felt that maybe law school did not make you an expert in cryptography?
Does American law have no equivalent of: de.wikipedia.org?
The gain from hacking that phone is negligible, but its detriments are unpredictably large and severe. There is no compelling argument to do it. And "fuck, I have no clue how stuff works, I will force the Apple guys to do my bidding" just doesn't cut it.