UK Politics Mega-thread - Page 307
In order to ensure that this thread meets TL standards and follows the proper guidelines, we ask that everyone please adhere to this mod note. Posts containing only Tweets or articles add nothing to the discussion. Therefore, when providing a source, explain why you feel it is relevant and what purpose it adds to the discussion. Also take note that unsubstantiated tweets/posts meant only to rekindle old arguments will be actioned upon. All in all, please continue to enjoy posting in TL General and partake in discussions as much as you want! But please be respectful when posting or replying to someone. There is a clear difference between constructive criticism/discussion and just plain being rude and insulting. https://www.registertovote.service.gov.uk
LightSpectra
United States1128 Posts
On March 28 2017 02:26 Plansix wrote: The justice system functions because there is a reasonable expectation that any order to release information will be complied with. If bank records are needed to prove the state's case or mount a defense against it, the bank will provide them. The same goes for phone records or any other form of documentation. They may redact information, but they will still provide what they can and justify the redactions they made. Companies like Facebook, Google, WhatsApp and others subvert this expectation. They offer the ability to communicate through their services, but wash their hands of any responsibility to provide the government with access. They build locks and keys to assure they don't have access, even to the data on their own servers. And then they make the argument that providing access to the data on the servers they control would be too risky for all their customers. The argument the government and others have been making is whether that is sustainable. Whether they can co-exist with companies that have actively created systems that make it impossible for them to comply with court orders, and tied those systems together with the argument that they are protecting their clients' privacy. Whether WhatsApp is any different from a phone company, which is required to keep records and provide them to the court if ordered to do so. All of these systems and services were designed to make a profit and place as little burden on the company as possible, to protect it from liability. The companies can argue that it is for the good of everyone, but that is also marketing: our services are so private even the government can't get in. So, again, the choice remains: either everybody is compromised by default, but good AND bad people can become secure with a bit of effort; or everybody is secure by default. Those are the options. Nothing you say is unreasonable, but none of that changes the fact that there are only two options on the table.
LightSpectra
United States1128 Posts
On March 28 2017 02:31 LegalLord wrote: The continuation of this "no one understands encryption if they disagree with me" game is worth a laugh, for sure. Even if I do agree that backdoors are idiocy this insistence on "I and I alone know how things work" is quite tiresome. I'm sorry, who is making the argument that "no one understands encryption if they disagree with me"?
Plansix
United States60190 Posts
On March 28 2017 02:31 LegalLord wrote: The continuation of this "no one understands encryption if they disagree with me" game is worth a laugh, for sure. Even if I do agree that backdoors are idiocy this insistence on "I and I alone know how things work" is quite tiresome. It is pretty standard for the tech industry, sadly. In the recent lawsuit against Oculus for breaking NDA, John Carmack testified that he not only broke the NDA but freely admitted he took software he developed for ZeniMax. But that was only so he could make his real defense: that stealing the software didn't matter. That he was such a wizard at code that the software wasn't even relevant to the final product. So it wasn't really stealing, and only slightly wrong. But not really wrong, since no one was really harmed. He was really grumpy when the jury didn't find that amazing argument compelling.
LegalLord
United Kingdom13774 Posts
On March 28 2017 02:38 Plansix wrote: It is pretty standard for the tech industry, sadly. In the recent lawsuit against Oculus for breaking NDA, John Carmack testified that he not only broke the NDA but freely admitted he took software he developed for ZeniMax. But that was only so he could make his real defense: that stealing the software didn't matter. That he was such a wizard at code that the software wasn't even relevant to the final product. So it wasn't really stealing, and only slightly wrong. But not really wrong, since no one was really harmed. He was really grumpy when the jury didn't find that amazing argument compelling. I do have to say that software folks in general seem to be quite bad, by the standards of technical experts, at convincing people of the validity of their technical concerns in a legal setting. Engineers (not software "engineers" who build websites) tend to be much better at arguing for why science is on their side than programmers. This reminds me of the FBI "we want Apple to crack open this iPhone for us" case and how stupid some of the software folk looked legally - even though from a technical perspective they were right. I remember you made a point about how the courts would see it that seemed quite effective at describing the problem.
LightSpectra
United States1128 Posts
On March 28 2017 02:43 LegalLord wrote: This reminds me of the FBI "we want Apple to crack open this iPhone for us" case and how stupid some of the software folk looked legally - even though from a technical perspective they were right. In what way do you think they looked stupid? Because they refused to crack the phone of a terrorist at the expense of all of their other customers? Yeah, unfortunately everybody looks bad when they defend human rights contra the universal argument of "but think of the children!!"
Plansix
United States60190 Posts
On March 28 2017 02:45 LightSpectra wrote: In what way do you think they looked stupid? Because they refused to crack the phone of a terrorist at the expense of all of their other customers? Yeah, unfortunately everybody looks bad when they defend human rights contra the universal argument of "but think of the children!!" Because they were ordered by a court to do it, and they were protecting a dead terrorist's information on a phone. Of course the argument was that they were protecting everyone's privacy. But the other argument is that they are protecting their profit margins and lowering their exposure to liability. Apple loves that people feel safe punching their credit card number into their phone.
Acrofales
Spain17263 Posts
On March 28 2017 02:30 LightSpectra wrote: Great, then since you know so much about such an obvious solution, go make a couple million dollars as a government consultant to politicians and stop wasting my time. Pretty sure I'm not forcing you to respond. Every terrorist attack that has succeeded in the first world in the past 15 years is proof that a couple of average guys with a negligible amount of resources can overcome an intercontinental multi-trillion-dollar surveillance apparatus. What about... oh wait. That's right. This has nothing to do with the topic at hand. The problem isn't that terrorists are geniuses, the problem is that encryption actually exists in nature and no force in the world can stop bad people from learning about it. Nowhere is anybody claiming otherwise. My focus is not on WhatsApp per se but on really any encrypted communications app in widespread use (Signal, iMessage, et al.). But I'll play along. Any system where you send information through channels you don't control is compromised. The moment your message leaves your system, you should assume someone is trying to read it. If you didn't encrypt it yourself, someone *may* be able to read it. In the case of WhatsApp, the only guarantee you have that Facebook isn't reading your conversation is that Facebook tells you so. If fully secure communication is your aim, you should absolutely not be using WhatsApp, Telegram or whatever other commercial app is available. In my opinion, though, that is more tinfoil-hat territory than it's worth. WhatsApp is hella convenient, I don't care too much if Facebook is listening in to most of my conversations, and I know how to switch when I do worry. Moreover, I trust both Facebook and the government when they tell me that they are not reading my conversations. Am I saying Facebook has already added the wiretap protocol to the WhatsApp software? Of course not. And yes, they'd have to push an update.
Of course, they could easily make it mandatory by simply making WhatsApp not work anymore for anybody who hasn't updated. This would be very bad for business, but the government could pass legislation that requires messaging services to have this type of protocol. Moreover, something being open source doesn't make it secure. Unless you also compiled it yourself, there is absolutely no guarantee that whatever you downloaded in the app store (or wherever) is the same thing you looked at the source code of. If you are indeed downloading the source code, compiling it, and installing it on your phone on your own, then kudos. You are very secure. Of course, you're still only as secure as the people you are communicating with... So it's a backdoor, just one that takes a little more concentrated effort to overcome. Yes, some malicious guy at Facebook with full access to the WhatsApp system could use such a protocol to spy on his girlfriend (or random people). Just as someone at BT/Vodafone/whatever might be listening in to any of your phone conversations. Generally speaking, we (1) don't care and (2) trust them not to care. If you want fully secure conversations, definitely don't use the telephone... But more broadly, there tend to be protocols in place to stop random employees at Vodafone from listening in on conversations. And those same protocols could be used for WhatsApp. This part of the issue is not a new problem. In fact, modern encryption technology allows a more secure solution through secret sharing.
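The "secret sharing" mentioned at the end of that post can be made concrete. Below is a minimal, toy-parameter sketch of Shamir secret sharing in Python: a sensitive key (say, one guarding a lawful-intercept capability) is split into shares so that no single employee or agency can reconstruct it alone, but any quorum can. All names and parameters here are illustrative, not from any real system.

```python
# Toy sketch of Shamir secret sharing: split a secret into `count`
# shares over a prime field so that any `threshold` shares recover it,
# while fewer than `threshold` reveal nothing about the secret.
import random

PRIME = 2**61 - 1  # a Mersenne prime, large enough for this toy secret

def make_shares(secret: int, threshold: int, count: int):
    """Evaluate a random degree-(threshold-1) polynomial at x = 1..count."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def recover(shares) -> int:
    """Lagrange interpolation at x = 0 reconstructs the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(123456789, threshold=3, count=5)
assert recover(shares[:3]) == 123456789   # any 3 of the 5 shares suffice
assert recover(shares[1:4]) == 123456789
```

The design point this illustrates: requiring a quorum of shareholders is exactly the kind of protocol that stops a lone "malicious guy at Facebook" while still permitting court-ordered access.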
LightSpectra
United States1128 Posts
On March 28 2017 02:50 Plansix wrote: Because they were ordered by a court to do it, and they were protecting a dead terrorist's information on a phone. Of course the argument was that they were protecting everyone's privacy. But the other argument is that they are protecting their profit margins and lowering their exposure to liability. Apple loves that people feel safe punching their credit card number into their phone. That's because there's a distinction here that a lot of people missed. They weren't refusing to crack the shooter's iPhone. They could not do that. It is not within their means. What Apple was doing was refusing to use their private key to sign malware that would eliminate security features on the phone, which would have allowed EVERY iPhone to be cracked without a warrant. Are safe companies stupid if they don't make a custom skeleton key for the FBI to crack every safe they've ever made? No. Nobody would buy safes from them ever again if their security was so compromised. The same is true of Apple. They could either submit to becoming a jewelry company that no serious person would ever use for real work, or they could stand up and say "We're not compromising our millions of customers for the government, because even if we do, those bad people will just stop using iPhones and move on to something else that's otherwise impenetrable."
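The "sign with their private key" mechanism in that post is worth unpacking. The toy below uses textbook RSA with deliberately tiny, insecure numbers (real code-signing uses 2048+ bit keys and padded signature schemes) to show why the vendor's signing key is the whole game: the device ships with only the public half and refuses any update whose signature does not verify. Everything here is a hypothetical illustration, not Apple's actual scheme.

```python
# Toy textbook-RSA signature check, illustrating code signing:
# only the holder of the private exponent d can produce a signature
# that a device holding (n, e) will accept.
import hashlib

p, q = 61, 53
n = p * q                              # public modulus
e = 17                                 # public exponent, coprime to phi(n)
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent (Python 3.8+)

def sign(firmware: bytes) -> int:
    """Vendor-side: requires the secret d."""
    digest = int.from_bytes(hashlib.sha256(firmware).digest(), "big") % n
    return pow(digest, d, n)

def device_accepts(firmware: bytes, signature: int) -> bool:
    """Device-side: knows only the public key (n, e)."""
    digest = int.from_bytes(hashlib.sha256(firmware).digest(), "big") % n
    return pow(signature, e, n) == digest

official = b"official-update-build"     # hypothetical build contents
sig = sign(official)
assert device_accepts(official, sig)
# A build the vendor never signed would (overwhelmingly likely) fail:
# device_accepts(b"backdoored-build", sig) -> False
```

This is why the FBI needed Apple's cooperation rather than just writing the unlocking tool themselves: without the signing key, the phone simply will not install it.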
LightSpectra
United States1128 Posts
Any system where you send information through channels you don't control is compromised. The moment your message leaves your system, you should assume someone is trying to read it. If you didn't encrypt it yourself, someone *may* be able to read it. That's true to an extent, sure. Of course, there are open source security solutions that demonstrably guarantee that your private key is not in danger and thus nobody will ever be able to read the messages encrypted under your public key, such as Signal and GPG. WhatsApp is a harder case to judge considering they're not open source, but they've undergone a security review that theoretically is the equivalent. I'm not vouching for the auditors who did the review, maybe they fucked up. I'm just responding to the notion that Facebook could 'easily' backdoor it. Am I saying Facebook has already added the wiretap protocol to the WhatsApp software? Of course not. And yes, they'd have to push an update. Of course, they could easily make it mandatory by simply making whatsapp not work anymore for anybody who hasn't updated. This would be very bad for business, but the government could pass legislation that requires messaging services to have this type of protocol. Yeah, and then everybody who's interested in security would switch to extranational solutions that aren't bound by the same legislation. Until we get to a world government, there's no stopping that. Trump of course wants this, he wants to backdoor every American communications program in order to catch terrorists. His mistake, and the mistake of everybody who agrees with him, is that it takes mere minutes to switch to something not American. All of our companies thus lose all of their business and the bad people are no worse off. Moreover, something being open source doesn't make it secure. 
Unless you also compiled it yourself, there is absolutely no guarantee that whatever you downloaded in the app store (or wherever) is the same thing that you looked at the source code of. If you are indeed downloading source code, compiling it, and installing it on your phone on your own, then kudos. You are very secure. Of course, you're still only as secure as the people you are communicating with. Open source isn't a guarantee of security, no. But it's a guarantee of no backdoors so long as you review the code for backdoors (doesn't take terribly long and even non-programmers can do so) and compile it yourself, which is quite easy to do. Yes, some malicious guy at Facebook with full access to the WhatsApp system could use such a protocol to spy on his girlfriend (or random people). Just as someone at BT/Vodafone/whatever might be listening in to any of your phone conversations. Generally speaking, we (1) don't care and (2) trust them not to care. If you want fully secure conversations, definitely don't use the telephone... That's the precise point I'm making here. Ordinary folk need a secure-by-default solution, otherwise it's too much work for them and they'll just settle for being compromised. But that doesn't stop bad people from putting in minutes of effort to use something secure. Do we want to live in a world where innocent people are prone to mass surveillance but bad people are just as secure as they are now?
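The "cryptographic hash verification" step referred to in this exchange (and again later in the thread) is mechanically simple. A minimal sketch, with a made-up filename and with the "published" digest fabricated locally purely so the example is self-contained:

```python
# Sketch of verifying a downloaded build against a published SHA-256
# digest. In real use, `expected` would be copied from the project's
# release page over a trusted channel; here it is computed locally so
# the example runs standalone.
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large downloads don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

build_contents = b"example build contents"           # stand-in download
expected = hashlib.sha256(build_contents).hexdigest()  # "published" digest

with open("downloaded.apk", "wb") as f:              # hypothetical filename
    f.write(build_contents)

assert sha256_of("downloaded.apk") == expected       # build matches
```

Note the caveat raised later in the thread still applies: a matching hash proves you got the bytes the publisher intended, not that those bytes correspond to the source code you audited; that requires a reproducible build or compiling yourself.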
Acrofales
Spain17263 Posts
On March 28 2017 02:50 Plansix wrote: Because they were ordered by a court to do it, and they were protecting a dead terrorist's information on a phone. Of course the argument was that they were protecting everyone's privacy. But the other argument is that they are protecting their profit margins and lowering their exposure to liability. Apple loves that people feel safe punching their credit card number into their phone. Completely different cases. But yes, it would not be a good thing if losing your phone meant you had to cancel your credit card. And I cannot think of a technical way for Apple to ensure they could get into a locked phone without compromising the locking mechanism in such a way that any sophisticated hacker could get into any locked phone. The same asymmetry of control does not apply as in what I described about communication channels. But if you don't like the pragmatic argument, here's a philosophical one: smartphones should be considered an extension of the body, not of the home, because for most intents and purposes that's how we treat them. They serve us as an extension of our memory, and as a communication channel, among other things. We store things in our smartphones that we would otherwise memorize. Smartphones have effectively turned us into cyborgs, and a whole new set of issues arises with that. Simply treating a smartphone as a safe that you should be able to open if you have physical access to it overlooks this fact. Maybe we should treat it more as having access to someone's body, so that 5th Amendment rights apply to their smartphones... or maybe we should treat it as a mix. But technology is opening new ethical conundrums, and treating smartphones as just another iteration of the beaten track is a category error.
Plansix
United States60190 Posts
If Facebook and Apple want a law that makes the act of breaking into their clients' data using internal backdoors illegal, they can ask for it. If they want some protection from liability for having to create it, they can work with Congress. If they have special requirements that the warrant to backdoor given software be sealed when issued, or be non-specific, they can work with Congress to create that system. There are countless ways to create a system that can be responsibly secure, but not perfect. Perfection is impossible when humans are involved. But Apple and others' plan to remove all humans and all responsibility for their products isn't going to be sustainable. At some point they become like Uber, yelling that they shouldn't have to play by the rules everyone else has to. Edit: Acrofales - I agree that smartphones are a special case and should have different requirements. I would have no problem with them being more secure or having other safeguards built in. I'm not for unlimited access or backdoors on every product. Just an end to the all-or-nothing argument that the tech industry is making. Because that argument ends with the government saying "Nothing sounds fine."
Jockmcplop
United Kingdom8769 Posts
Fair enough, but it's not going to stop the next guy driving his car into a bunch of people and stabbing a cop, is it?
LightSpectra
United States1128 Posts
On March 28 2017 03:07 Plansix wrote: If Facebook and Apple want a law that makes the act of breaking into their clients' data using internal backdoors illegal, they can ask for it. Makes perfect sense. Criminals of course always obey the law, so they'll be totally bound to obey this one as well. On March 28 2017 03:07 Plansix wrote: There are countless ways to create a system that can be responsibly secure, but not perfect. Then go make one and become an overnight billionaire for having solved a problem that is widely considered to be unsolvable. On March 28 2017 03:07 Plansix wrote: Just an end to the all-or-nothing argument that the tech industry is making. Because that argument ends with the government saying "Nothing sounds fine." Reality is all-or-nothing, and no law is going to change that. They can pass a law to try, certainly. And that will result in American tech companies losing billions so that European companies that aren't bound by the same laws can pick up where they left off.
Plansix
United States60190 Posts
On March 28 2017 03:10 LightSpectra wrote: Makes perfect sense. Criminals of course always obey the law, so they'll be totally bound to obey this one as well. Please refrain from making these garbage arguments.
LightSpectra
United States1128 Posts
On March 28 2017 03:13 Plansix wrote: Please refrain from making these garbage arguments. What's garbage about it? Russian/Chinese/North Korean hackers are already breaking the law, why would a law that says "please don't use the backdoor" be any different?
Acrofales
Spain17263 Posts
On March 28 2017 03:02 LightSpectra wrote: That's true to an extent, sure. Of course, there are open source security solutions that demonstrably guarantee that your private key is not in danger and thus nobody will ever be able to read the messages encrypted under your public key, such as Signal and GPG. WhatsApp is a harder case to judge considering they're not open source, but they've undergone a security review that theoretically is the equivalent. I'm not vouching for the auditors who did the review, maybe they fucked up. I'm just responding to the notion that Facebook could 'easily' backdoor it. Yeah, and then everybody who's interested in security would switch to extranational solutions that aren't bound by the same legislation. Until we get to a world government, there's no stopping that. Well, you could of course make it illegal to use phones of that type. It wouldn't stop criminals from using them, but it would give law enforcement another tool: they could arrest them on the use of contraband technology. Trump of course wants this, he wants to backdoor every American communications program in order to catch terrorists. His mistake, and the mistake of everybody who agrees with him, is that it takes mere minutes to switch to something not American. All of our companies thus lose all of their business and the bad people are no worse off. I don't think I'll address this strawman. Open source isn't a guarantee of security, no. But it's a guarantee of no backdoors so long as you review the code for backdoors (doesn't take terribly long and even non-programmers can do so) and compile it yourself, which is quite easy to do. Absolutely. Do you, though? Or do you just download the app from Google Play? Because I guarantee you that virtually nobody installs their homebrew apps. In fact, most people don't even know where the button to install from "untrusted sources" (irony noted) is. That's the precise point I'm making here. 
Ordinary folk need a secure-by-default solution, otherwise it's too much work for them and they'll just settle for being compromised. But that doesn't stop bad people from putting in minutes of effort to use something secure. Do we want to live in a world where innocent people are prone to mass surveillance but bad people are just as secure as they are now? Where you say mass surveillance I say court-ordered surveillance. Remember that the default mode is still encryption on. Facebook/Google/Telegram/my homebrew messaging app have absolutely no incentive to switch that off. So unless a court orders a wiretap on you, your WhatsApp messages are encrypted. And if you're paranoid you are free to add extra layers of encryption in there. It also won't catch smart criminals, who will of course be paranoid and add those extra layers of encryption. But nobody is claiming catching smart criminals will ever be easy. Just that denying law enforcement, by very simple design, any possibility of listening in to conversations makes catching dumb criminals needlessly hard.
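The "extra layers of encryption" both posters keep returning to can be sketched with the simplest possible layer: a one-time pad agreed out of band, so the messaging service (or anyone wiretapping it) carries only noise. This is a toy to illustrate the layering idea, not a recommendation to roll your own crypto; real use would rely on a vetted protocol.

```python
# Toy "extra layer": XOR the plaintext with a random pad shared in
# advance, before handing the ciphertext to any messaging service.
import secrets

def otp_encrypt(plaintext: bytes, pad: bytes) -> bytes:
    # A one-time pad must be at least message-length and never reused.
    assert len(pad) >= len(plaintext), "pad too short"
    return bytes(p ^ k for p, k in zip(plaintext, pad))

otp_decrypt = otp_encrypt  # XOR is its own inverse

pad = secrets.token_bytes(64)        # shared secretly, out of band
msg = b"meet at the usual place"
wire = otp_encrypt(msg, pad)         # what the service actually carries
assert otp_decrypt(wire, pad) == msg
```

This is also why the thread's "smart criminals" point holds: even a mandated service-level backdoor only ever exposes `wire`, which is useless without the pad exchanged outside the service.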
Plansix
United States60190 Posts
On March 28 2017 03:16 LightSpectra wrote: What's garbage about it? Russian/Chinese/North Korean hackers are already breaking the law, why would a law that says "please don't use the backdoor" be any different? That is a risk with phone calls, mail, checks, faxes (maybe not faxes), wiring funds and all other forms of communication. Why is digital communication different? Why should it be any different from those systems? Why does it gain special protection from court orders that those companies do not have? What if the phone companies created a special version of phone line that couldn't be wiretapped, ever?
LightSpectra
United States1128 Posts
Well, you could of course make it illegal to use phones of that type. It wouldn't stop criminals from using them, but it would give law enforcement another tool: they could arrest them on the use of contraband technology. Another charge to add to the "blew themselves up in public" charge, I suppose. But would that work against scammers and smugglers? Sure, until they decide to just revert back to word-salad as was done in the days of olde. Absolutely. Do you, though? Or do you just download the app from Google Play? Because I guarantee you that virtually nobody installs their homebrew apps. In fact, most people don't even know where the button to install from "untrusted sources" (irony noted) is. I do download it from Google Play, but I do a cryptographic hash verification. Granted most people don't do that, but I'm sure they will the moment the very first bait-and-switch exploit is demonstrated to have been used in the iOS App Store or Google Play. If I was a Bad Person, I would compile from source. It's not hard. Non-programmers can do it, and in fact many of them probably do. Where you say mass-surveillance I say court-ordered surveillance. Remember that the default mode is still encryption on. Facebook/Google/Telegram/my homebrew messaging app have absolutely no incentive to switch that off. So unless a court orders a wiretap on you, your WhatsApp messages are encrypted. And if you're paranoid you are free to add extra layers of encryption in there. I don't know how many times I have to repeat this. If Facebook/Google/Telegram can do it arbitrarily for a court order, they can do it arbitrarily for anyone and any reason. It also won't catch smart criminals, who will of course be paranoid and add those extra layers of encryption. But nobody is claiming catching smart criminals will ever be easy. Just that barring law enforcement the possibility to listen in to conversations by very simple design makes catching dumb criminals needlessly hard. 
I don't agree with the premise that we should compromise billions of people's security for a chance to catch a few dumb criminals.