In order to ensure that this thread continues to meet TL standards and follows the proper guidelines, we will be enforcing the rules in the OP more strictly. Be sure to give them a re-read to refresh your memory! The vast majority of you are contributing in a healthy way, keep it up!
NOTE: When providing a source, explain why you feel it is relevant and what purpose it adds to the discussion if it's not obvious. Also take note that unsubstantiated tweets/posts meant only to rekindle old arguments can result in a mod action.
On December 17 2016 02:21 LegalLord wrote: The problem with combating "bullshit" is that sometimes it's hard to differentiate from straight-up censorship of opinions not popular with the mediaverse. In that sense xDaunt's concerns are perfectly valid.
It would not have been difficult for anyone with half a brain to separate the fabricated stories about disrupting rallies from real news. Same with the pizza place shit. It's not like this filtering would need to be extremely aggressive to still chop off a lot of bullshit. Even a delicate touch would have prevented that guy from showing up to the pizza place.
I either do that (whoever has bla@bla.com is receiving a shitton of spam), or if an actual email is required, www.10minutemail.com is awesome. My yahoo mail is my last ditch choice if they insist on something that's persistent and I'll have to check occasionally.
the yahoo leaks are gonna be a bunch of wapo "SPECIAL SUBSCRIPTION OFFER TO LOYAL READER" emails.
Facebook detailed a new plan Thursday to target the rapid spread of fake news across its site, a phenomenon that received renewed attention in the weeks following the 2016 election, with accusations that it may have influenced the behavior of voters.
The problem reached a breaking point two weeks ago when a gunman entered a pizza restaurant in Washington, D.C., to investigate an internet-based conspiracy theory about a child-sex ring that does not exist.
Now the move from the internet’s largest social-media platform has some intentional fake-news writers, who created their websites to “satirize” right-wing conspiracies or exploit Facebook’s algorithm, believing they’ll soon be out of business.
But the new program also has conspiracy theorists, ones who believe Hillary Clinton’s fictitious ties to the occult are the “real news,” excitedly drawing battle lines over the future of the news on social media.
Should Facebook’s fact-check initiative take off and result in censorship of propagandist sites, editors at websites like Infowars and alt-right leaders insist it will only reinforce the belief that certain ideas are being suppressed in favor of facts from mainstream outlets. One editor told The Daily Beast the Facebook plan proves that now the “‘Infowar’ isn’t a cliché, it’s perfectly apt.”
If Facebook’s experiment is applied correctly, authors of intentionally fake news face a potential hurdle for generating advertising revenue for their sites, if not the banning of their stories from the social network outright.
Marco Chacon, the creator of the intentional fake news website RealTrueNews.org, says Facebook is finally taking a positive step toward making sure websites like his no longer go viral on the social network. In an article for The Daily Beast in November, Chacon wrote that he created his site to make those who share fake right-wing news on Facebook more aware that they’re “susceptible to stories written ‘in [their] language’ that are complete, obvious, utter fabrications.” Chacon’s larger aim, he wrote, was to force Facebook to work out a solution for a fake-news epidemic he believed was “deeply entrenched” and easily monetized.
“This is the right approach,” said Chacon of Facebook’s new plan Thursday. “The people who fear censors fear a whitelist of ‘approved news sites.’ This sounds like a more intelligent heuristic that is exactly the kind of thing a company like Facebook should employ.”
Chacon, who said he was preparing for NBC News to interview him about his antics in his home later in the day, added that the new safeguards “will give people some greater responsibility in what they spread.”
We'll have to wait and see what Facebook actually does, but it is becoming pretty clear that these major tech platforms (particularly Twitter) are aligning themselves with the Left. I tend to think that this is going to be a mistake long term.
Pretty sure anti-fake-news is not left or right, unless you are willing to concede that reality has a liberal bias (mainly because conservatives have taken pants-on-head anti-science stances on a number of issues ranging from climate change to trickle-down economics).
But platforms that people consider important sources of information (although why people think Facebook is a good source of news is beyond me) should take care of the information they are providing. They either shut down their "sponsored links", or they curate it, because they are the ones who are responsible for what shows up on their site. And if that is stuff like "Pope Francis says Hillary is devilspawn" (or whatever the fake headline along those lines was), then that is (partially) their responsibility.
I don't think that anyone is going to disagree with the proposition that the truly fake news (ie outright making shit up) is a problem. However, there are two problems with the Left's current attack on fake news. The first is that the breadth of the attack encompasses not just true fake news sites but also conservative media as well. The second problem is that left wing sites aren't receiving the same scrutiny and attention as the right wing analogs. For these reasons, the war on fake news looks very much like an excuse to wage an information war in the name of partisanship.
We have to wait and see what Facebook actually does, but if it goes down the same path that Twitter has, it will be a real problem.
I don't know if news media that denies science and scientists can be considered real news.
Do scientists ever bullshit? Is scientific consensus ever bullshit, especially on political issues?
The skepticism is usually wrong but absolutely warranted.
Work published in scientific journals is actually heavily tested, reviewed, and re-reviewed constantly. Disagreeing with the current consensus without running your own test of the disputed study is nothing but intentional misleading of your reader base.
If you think a journal is wrong--prove it. Show tests, show the math, show the proof that the test was wrong. Then have your test get published in a scientific journal to also be equally vetted.
Even when science ends up being wrong, it is always the best shot we had at being correct. There is nothing that we ever have more reason to believe than the scientific method. The method is designed to give us as much confidence as we can have, and to give people the best shot at proving it wrong.
That's the main problem with climate change denial. When we have 98% consensus, we're in a good spot.
If you needed surgery and you had a fleet of 100 surgeons advising you on what should be done, and 98 of them all agreed, I refuse to believe that anyone skeptical of climate change would be skeptical of their consensus. You would all do what those 98 agreed on, even though 2 of them weren't on board.
Can someone explain to me why this video isn't legitimate.
There are a couple things to be skeptical about off the top of my head:
- The first five matching points could be the same for all birth certificates in that range due to having been mass printed
- How many certificates were sampled to find one with date stamps at a similar angle, and what's the actual probability of that under these conditions?
- Is Johanna Ah'Nee's birth certificate verified to be real and as presented, and hasn't itself been reverse-engineered to look like the template for a forgery?
I think that's 7 out of the 9 points? That's where I stopped when I watched before.
Remember how many people here posted stuff about the Clinton campaign hiring people to disrupt Trump rallies? This stuff is legitimately bad. It feels like you're ignoring the downside to straight up bullshit being propagated and believed by swaths of people.
Lots of people believe the shit being propagated about Syria too, and refuse to recognize that the civilians in eastern Aleppo are being liberated by the Syrian Army and the Russians. Where are the filters for the crap being shown on TV, where they take videos from clearly biased sources, such as rebels posting videos on YouTube, and report on the allegations in those videos in a sympathetic tone that suggests they are facts?
Anyway, besides that, I'm worried that these mechanisms will be used for purposes other than just filtering out fake news. As xDaunt indicated, there are already some signs of a partisan air around this type of filtering. I am not a fan of Cenk Uygur, but I found this report rather disturbing: he says politicians are basically constantly interfering with what should or should not be said on TV, and the media companies listen for various monetary reasons, such as getting certain politicians on their shows and potential ad money. I fear the mechanisms being implemented will be subject to a similar sort of interference in the long run (especially as internet/social media become the dominant form of communication).
Quite frankly, whenever things like this are implemented, I am worried they will be used for things other than their original intent. This is especially true for censorship, although I don't have an example ready of that occurring in recent Western history with specific regard to censorship. However, to give an example of using laws beyond their original intent: I believe Trump has suggested utilizing the Patriot Act to prevent banks from allowing Mexicans to send money back to their families in Mexico, in order to blackmail Mexico into paying for the Great Wall. That doesn't seem like the original purpose of the Patriot Act. It certainly wasn't an argument presented by proponents of the law at the time of its implementation.
Another example that's actually occurred: in the Netherlands we had a law passed requiring you to carry a government-issued ID when you are out on the street. Proponents said it was so people couldn't escape fines, and it's been suggested as a measure against terrorism. A couple of weeks ago, IIRC, an 80-year-old man was cuffed for cycling on the wrong side of the street without his ID on him. I'll add that I often cycle on the wrong side of the street on short stretches, because otherwise I'd have to cross the street twice, and there are corners that make it hard to see if it's safe to cross. It's also a little disturbing that he was on his way to the mosque, especially with the far-right anti-Muslim movement here in the Netherlands; I doubt he would have been handled quite as roughly (and cuffed to boot) if that law hadn't been there.
That said, I am not going to disagree that genuinely fake news such as the PizzaGate thing is bad. But I do believe that these two criticisms ("partisan filtering" [whether it comes from civilians or politicians/government] and "beyond original intent") have some validity to them.
travis -> I'm not up for a full debunking; why not just google it, though? It seems likely someone has already put up a full debunking of it somewhere. In general, there are probably a bunch of fallacies in the logic or claims in there. And the guy's history means there's less reason to give him the benefit of the doubt on anything.
If other people you know are buying it, I wonder what circles you live in. Very different from the ones I live in, wherein everyone dismisses the video as trash.
That's why I was asking where he finds this stuff. He clearly hangs around some extremely questionable echo chamber online communities that just instantly eat up this shit.
On December 17 2016 02:51 a_flayer wrote: That said, I am not going to disagree that genuinely fake news such as the PizzaGate thing is bad. But I do believe that these two criticisms have some validity to them.
Being able to think of a theoretical scenario where this could be a bad thing is meaningless, though.
Let's say we brought up the question of a country having a military. Couldn't that military be used to violently suppress dissent for the ruling party? And yet, Canada kinda has a military and that doesn't happen. The existence of a theoretical scenario where something goes bad does not mean the idea is itself flawed. As I said, there are also a variety of scenarios where this could be effective and helpful without being a tool for partisan censorship.
Do scientists ever have their judgment clouded by their own bias in favor of funding for their own work? Do they ever make hasty conclusions that are hard to refute because the few people who are capable of doing so have the same conflicts of interest as the first? Do they ever make imprecise and unknowable predictions about the future for which it's really hard to know if an "educated guess" is based on good science or politicized? Do they ever just have such a poor understanding of the political aspects of their proposals that they just miss the mark entirely? Do they ever do experiments that are hard to replicate because of money and expertise barriers?
I'm generally not a "science skeptic," but damn, don't give them more credit than they deserve. Scientists are far from unbiased arbiters of truth, which is the implication here, with the emphasis on idolizing "peer review" as if it eliminates the possibility of being wrong.
In short, there are two major problems with regard to science in general: shitty media reporting (taking the most extreme conclusion of a paper and using it as the narrative of a news report) and a lack of funding for, and willingness to do, peer review. Beyond that, there will be human errors such as the ones you describe, but that's really a minor issue compared to the other two, and one with no real resolution, because humans will always make mistakes.
Facebook detailed a new plan Thursday to target the rapid spread of fake news across its site, a phenomenon that received renewed attention in the weeks following the 2016 election, with accusations that it may have influenced the behavior of voters.
The problem reached a breaking point two weeks ago when a gunman entered a pizza restaurant in Washington, D.C., to investigate an internet-based conspiracy theory about a child-sex ring that does not exist.
Now the move from the internet’s largest social-media platform has some intentional fake-news writers, who created their websites to “satirize” right-wing conspiracies or exploit Facebook’s algorithm, believing they’ll soon be out of business.
But the new program also has conspiracy theorists, ones who believe Hillary Clinton’s fictitious ties to the occult are the “real news,” excitedly drawing battle lines over the future of the news on social media.
Should Facebook’s fact-check initiative take off and result in censorship of propagandist sites, editors at websites like Infowars and alt-right leaders insist it will only reinforce the belief that certain ideas are being suppressed in favor of facts from mainstream outlets. One editor told The Daily Beast the Facebook plan proves that now the “‘Infowar’ isn’t a cliché, it’s perfectly apt.”
If Facebook’s experiment is applied correctly, authors of intentionally fake news face a potential hurdle for generating advertising revenue for their sites, if not the banning of their stories from the social network outright.
Marco Chacon, the creator of the intentional fake news website RealTrueNews.org, says Facebook is finally taking a positive step toward making sure websites like his no longer go viral on the social network. In an article for The Daily Beast in November, Chacon wrote that he created his site to make those who share fake right-wing news on Facebook more aware that they’re “susceptible to stories written ‘in [their] language’ that are complete, obvious, utter fabrications.” Chacon’s larger aim, he wrote, was to force Facebook to work out a solution for a fake-news epidemic he believed was “deeply entrenched” and easily monetized.
“This is the right approach,” said Chacon of Facebook’s new plan Thursday. “The people who fear censors fear a whitelist of ‘approved news sites.’ This sounds like a more intelligent heuristic that is exactly the kind of thing a company like Facebook should employ.”
Chacon, who said he was preparing for NBC News to interview him about his antics in his home later in the day, added that the new safeguards “will give people some greater responsibility in what they spread.”
We'll have to wait and see what Facebook actually does, but it is becoming pretty clear that these major tech platforms (particularly Twitter) are aligning themselves with the Left. I tend to think that this is going to be a mistake long term.
Pretty sure anti-fake-news is not left or right, unless you are willing to concede that reality has a liberal bias (mainly because conservatives have taken pants-on-heads anti-science stances on a number of issues ranging from climate change to trickle-down economics)?
But platforms that people consider important sources of information (although why people think Facebook is a good source of news is beyond me) should take care of the information they are providing. They either shut down their "sponsored links", or they curate it, because they are the ones who are responsible for what shows up on their site. And if that is stuff like "Pope Francis says Hillary is devilspawn" (or whatever the fake headline along those lines was), then that is (partially) their responsibility.
I don't think that anyone is going to disagree with the proposition that the truly fake news (ie outright making shit up) is a problem. However, there are two problems with the Left's current attack on fake news. The first is that the breadth of the attack encompasses not just true fake news sites but also conservative media as well. The second problem is that left wing sites aren't receiving the same scrutiny and attention as the right wing analogs. For these reasons, the war on fake news looks very much like an excuse to wage an information war in the name of partisanship.
We have to wait and see what Facebook actually does, but if it goes down the same path that Twitter has, it will be a real problem.
I don't know if news media that deny science and scientists can be considered real news.
Do scientists ever bullshit? Is scientific consensus ever bullshit, especially on political issues?
The skepticism is usually wrong but absolutely warranted.
Scientific journals are actually heavily vetted, and the work in them is reviewed and re-reviewed constantly. Disagreeing with the current consensus without running your own test of the disputed result is nothing but intentional misleading of your reader base.
If you think a paper is wrong--prove it. Show tests, show the math, show the proof that the original test was wrong. Then get your test published in a scientific journal so it can be equally vetted.
Do scientists ever have their judgment clouded by their own bias in favor of funding for their own work? Do they ever make hasty conclusions that are hard to refute because the few people who are capable of doing so have the same conflicts of interest as the first? Do they ever make imprecise and unknowable predictions about the future for which it's really hard to know if an "educated guess" is based on good science or politicized? Do they ever just have such a poor understanding of the political aspects of their proposals that they just miss the mark entirely? Do they ever do experiments that are hard to replicate because of money and expertise barriers?
I'm generally not a "science skeptic" but damn, don't give them more credit than they deserve. They are far from unbiased arbiters of truth, as is the implication here with the emphasis on idolizing "peer-review" as if it gets rid of being wrong.
All of the things you listed are extremely risky things to do because of how much street cred people get for proving other people's shit wrong. Once you publish, you are insanely vulnerable. I have been a part of many different investigations that never got published because it was still technically possible we were too vulnerable to a certain avenue.
In reality, a lot of stuff *could* be likely published but is not because the consequences of being wrong are too great. It is really, really bad to have to withdraw a paper. It is borderline career ending.
Note: This only applies to Western research. Chinese research is littered with trash, and none of this applies to the Chinese scientific community.
Edit: Except for when you said "Do they ever do experiments that are hard to replicate because of money and expertise barriers?"
Being the only person to make a certain type of spectrometer, and then publishing work that uses it, is not necessarily a bad thing. Papers where someone is the first to do a type of detection always come with lots of caution, and the reasons for believing the new spectroscopy method are all clearly laid out. People are like, "Yo, this shit is new and wild, but here's why I'm pretty sure it is right."
Facebook detailed a new plan Thursday to target the rapid spread of fake news across its site, a phenomenon that received renewed attention in the weeks following the 2016 election, with accusations that it may have influenced the behavior of voters.
The problem reached a breaking point two weeks ago when a gunman entered a pizza restaurant in Washington, D.C., to investigate an internet-based conspiracy theory about a child-sex ring that does not exist.
Now the move from the internet’s largest social-media platform has some intentional fake-news writers, who created their websites to “satirize” right-wing conspiracies or exploit Facebook’s algorithm, believing they’ll soon be out of business.
But the new program also has conspiracy theorists, ones who believe Hillary Clinton’s fictitious ties to the occult are the “real news,” excitedly drawing battle lines over the future of the news on social media.
Should Facebook’s fact-check initiative take off and result in censorship of propagandist sites, editors at websites like Infowars and alt-right leaders insist it will only reinforce the belief that certain ideas are being suppressed in favor of facts from mainstream outlets. One editor told The Daily Beast the Facebook plan proves that now the “‘Infowar’ isn’t a cliché, it’s perfectly apt.”
If Facebook’s experiment is applied correctly, authors of intentionally fake news face a potential hurdle for generating advertising revenue for their sites, if not the banning of their stories from the social network outright.
Marco Chacon, the creator of the intentional fake news website RealTrueNews.org, says Facebook is finally taking a positive step toward making sure websites like his no longer go viral on the social network. In an article for The Daily Beast in November, Chacon wrote that he created his site to make those who share fake right-wing news on Facebook more aware that they’re “susceptible to stories written ‘in [their] language’ that are complete, obvious, utter fabrications.” Chacon’s larger aim, he wrote, was to force Facebook to work out a solution for a fake-news epidemic he believed was “deeply entrenched” and easily monetized.
“This is the right approach,” said Chacon of Facebook’s new plan Thursday. “The people who fear censors fear a whitelist of ‘approved news sites.’ This sounds like a more intelligent heuristic that is exactly the kind of thing a company like Facebook should employ.”
Chacon, who said he was preparing for NBC News to interview him about his antics in his home later in the day, added that the new safeguards “will give people some greater responsibility in what they spread.”
In short, there are two major problems with regard to science in general: shitty media reporting (taking the most extreme conclusion of a paper and using it as the narrative in a news report) and a lack of funding/willingness for peer review. Beyond that, there are going to be human errors such as what you said, but that's really a minor issue compared to the other two, and one that has no real resolution because humans will always make mistakes.
There's also a known third, medium-scale problem: the bias toward interesting study results being the ones that get reported (publication bias, which has several facets).
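That publication-bias point is easy to demonstrate with a toy simulation (all numbers here are made up for illustration; this is a sketch, not a model of any real field). If journals mostly print "significant" results, the published literature overstates the true effect even when every individual study is honest:

```python
import random
import statistics

random.seed(0)

TRUE_EFFECT = 0.1   # small real effect (hypothetical)
N = 30              # subjects per study
STUDIES = 2000      # studies run across the field

published = []      # estimates that clear the "significance" bar
all_estimates = []  # every study's estimate, published or not

for _ in range(STUDIES):
    sample = [random.gauss(TRUE_EFFECT, 1.0) for _ in range(N)]
    est = statistics.mean(sample)
    se = statistics.stdev(sample) / N ** 0.5
    all_estimates.append(est)
    # crude filter: |t| > 2 roughly corresponds to p < 0.05
    if abs(est / se) > 2:
        published.append(est)

print(f"true effect:            {TRUE_EFFECT}")
print(f"mean of all studies:    {statistics.mean(all_estimates):.3f}")
print(f"mean of published only: {statistics.mean(published):.3f}")
print(f"fraction published:     {len(published) / STUDIES:.2%}")
```

The published-only mean lands well above the true effect, because only the studies that happened to overshoot clear the significance bar; the unpublished "boring" results that would pull the average back down sit in the file drawer.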
If other people you know are buying it, I wonder what circles you live in; very different from the ones I live in, where everyone dismisses the video as trash.
That's why I was asking where he finds this stuff. He clearly hangs around some extremely questionable echo chamber online communities that just instantly eat up this shit.
On December 17 2016 02:51 a_flayer wrote: That said, I am not going to disagree that genuinely fake news such as the PizzaGate thing is bad. But I do believe that these two criticisms have some validity to them.
Being able to think of a theoretical scenario where this could be a bad thing is meaningless, though.
Let's say we brought up the question of a country having a military. Couldn't that military be used to violently suppress dissent for the ruling party? And yet, Canada kinda has a military and that doesn't happen. The existence of a theoretical scenario where something goes bad does not mean the idea is itself flawed. As I said, there are also a variety of scenarios where this could be effective and helpful without being a tool for partisan censorship.
Just as you can always come up with a theoretical scenario of abuse, you can always use the above argument to dismiss concerns regarding flaws in the idea being presented. Same goes for any potential implementation: you can cite concerns about potential flaws, and then argue against it with the same thing you said.
We might as well say nothing at all.
It all comes down to whether your conclusion is "this box shouldn't even be opened" or "let's cautiously open this box". I agree with caution, but the idea of just not even giving it a shot because something could go wrong is what I am arguing against. I'm not saying crank that shit up and go nuts. I'm saying the idea itself is not dangerous enough to warrant being completely shut down.
My argument is: I think this could be done in such a variety of ways that are not disastrous that it is at least worth trying and not being too scared of. Cautiously do it, but do it.
Who is vulnerable, and how? The biggest existential crisis for most scientists is that they won't get more money to do their work. If they're being funded to consistently lie and/or mislead with partial truths that simply cannot be painted as fabrications (no one gets punished for what can be a "misdirection" done in good faith), are they more vulnerable to pissing off their benefactors or to some guys proving that they are kinda-sorta-maybe wrong?
Would Google Labs publish something that, for example, proves that Google's approach to search is fundamentally flawed and will not scale to the web in 10 years, with little in the way of recourse? If they would not, do you suspect political, rather than academic, motives? And do you consider Google Labs researchers to be real scientists, given both their political motives and their genuinely important scientific contributions?
christ im not gonna come back here. sorry I don't read through every single page. And I am so so sorry that I get 2nd opinions on things and remain skeptical of things that I am not an expert on. I am so sorry I recognize I am not all knowing. If only I could be like so many of the geniuses in this thread that know if something is real or not at a glance and are just so fucking smart that they already have such a solid grasp on what is possible or not. As for doing a google search, it's literally the first thing I did. If a "debunking" happened it's not in the first 2 pages for the last 24 hours.
as for "people buying it" gosh I don't know where I get that idea maybe the 90% like ratio on the video?
On December 17 2016 03:17 travis wrote: christ im not gonna come back here. sorry I don't read through every single page. And I am so so sorry that I get 2nd opinions on things and remain skeptical of things that I am not an expert on. I am so sorry I recognize I am not all knowing. If only I could be like so many of the geniuses in this thread that know if something is real or not at a glance and are just so fucking smart that they already have such a solid grasp on what is possible or not. As for doing a google search, it's literally the first thing I did. If a "debunking" happened it's not in the first 2 pages for the last 24 hours.
as for "people buying it" gosh I don't know where I get that idea maybe the 90% like ratio on the video?
seriously what the fuck is wrong with some of you
What's wrong with you? You're being needlessly aggro over it. A 90% like ratio doesn't mean much in terms of quality; I'm sure we could find tons of utter dreck that gets a 90% like ratio.
Oddly, I look for debunking and find tons of stuff on it with Google, though maybe not for this specific video; not really worth going through all the links to check. Pesky Google, giving different results for different people.
Also, as long as you're being aggro, I'm gonna aggro back a little: you shouldn't really need a counter to know that any actual issues with Obama's birth would've been settled long ago, and that finding specific counters to something that's obviously trash is unnecessary.