Read the rules in the OP before posting, please. In order to ensure that this thread continues to meet TL standards and follows the proper guidelines, we will be enforcing the rules in the OP more strictly. Be sure to give them a re-read to refresh your memory! The vast majority of you are contributing in a healthy way - keep it up! NOTE: When providing a source, explain why you feel it is relevant and what purpose it adds to the discussion if it's not obvious. Also take note that unsubstantiated tweets/posts meant only to rekindle old arguments can result in a mod action. |
On December 17 2016 03:15 LegalLord wrote:Show nested quote +On December 17 2016 03:03 Mohdoo wrote:On December 17 2016 02:57 LegalLord wrote:On December 17 2016 02:38 Thieving Magpie wrote:On December 17 2016 02:27 LegalLord wrote:On December 17 2016 02:25 Thieving Magpie wrote:On December 17 2016 01:57 xDaunt wrote:On December 17 2016 01:45 Acrofales wrote:On December 17 2016 01:29 xDaunt wrote:Facebook detailed a new plan Thursday to target the rapid spread of fake news across its site, a phenomenon that received renewed attention in the weeks following the 2016 election, with accusations that it may have influenced the behavior of voters.
The problem reached a breaking point two weeks ago when a gunman entered a pizza restaurant in Washington, D.C., to investigate an internet-based conspiracy theory about a child-sex ring that does not exist.
Now the move from the internet’s largest social-media platform has some intentional fake-news writers, who created their websites to “satirize” right-wing conspiracies or exploit Facebook’s algorithm, believing they’ll soon be out of business.
But the new program also has conspiracy theorists, ones who believe Hillary Clinton’s fictitious ties to the occult are the “real news,” excitedly drawing battle lines over the future of the news on social media.
Should Facebook’s fact-check initiative take off and result in censorship of propagandist sites, editors at websites like Infowars and alt-right leaders insist it will only reinforce the belief that certain ideas are being suppressed in favor of facts from mainstream outlets. One editor told The Daily Beast the Facebook plan proves that now the “‘Infowar’ isn’t a cliché, it’s perfectly apt.”
If Facebook’s experiment is applied correctly, authors of intentionally fake news face a potential hurdle for generating advertising revenue for their sites, if not the banning of their stories from the social network outright.
Marco Chacon, the creator of the intentional fake news website RealTrueNews.org, says Facebook is finally taking a positive step toward making sure websites like his no longer go viral on the social network. In an article for The Daily Beast in November, Chacon wrote that he created his site to make those who share fake right-wing news on Facebook more aware that they’re “susceptible to stories written ‘in [their] language’ that are complete, obvious, utter fabrications.” Chacon’s larger aim, he wrote, was to force Facebook to work out a solution for a fake-news epidemic he believed was “deeply entrenched” and easily monetized.
“This is the right approach,” said Chacon of Facebook’s new plan Thursday. “The people who fear censors fear a whitelist of ‘approved news sites.’ This sounds like a more intelligent heuristic that is exactly the kind of thing a company like Facebook should employ.”
Chacon, who said he was preparing for NBC News to interview him about his antics in his home later in the day, added that the new safeguards “will give people some greater responsibility in what they spread.” Source. We'll have to wait and see what Facebook actually does, but it is becoming pretty clear that these major tech platforms (particularly Twitter) are aligning themselves with the Left. I tend to think that this is going to be a mistake long term. Pretty sure anti-fake-news is not left or right, unless you are willing to concede that reality has a liberal bias (mainly because conservatives have taken pants-on-heads anti-science stances on a number of issues ranging from climate change to trickle-down economics)? But platforms that people consider important sources of information (although why people think Facebook is a good source of news is beyond me) should take care of the information they are providing. They either shut down their "sponsored links", or they curate it, because they are the ones who are responsible for what shows up on their site. And if that is stuff like "Pope Francis says Hillary is devilspawn" (or whatever the fake headline along those lines was), then that is (partially) their responsibility. I don't think that anyone is going to disagree with the proposition that the truly fake news (ie outright making shit up) is a problem. However, there are two problems with the Left's current attack on fake news. The first is that the breadth of the attack encompasses not just true fake news sites but also conservative media as well. The second problem is that left wing sites aren't receiving the same scrutiny and attention as the right wing analogs. For these reasons, the war on fake news looks very much like an excuse to wage an information war in the name of partisanship. We have to wait and see what Facebook actually does, but if it goes down the same path that Twitter has, it will be a real problem. I don't know if claiming news media that denies science and scientists can be considered real news. Do scientists ever bullshit? Is scientific consensus ever bullshit, especially on political issues? The skepticism is usually wrong but absolutely warranted. Scientific journals are actually heavily tested and their work reviewed and re-reviewed constantly. Disagreeing with the current consensus without running your own test on the disagreed upon journal is nothing but intentional misleading of your reader base. If you think a journal is wrong--prove it. Show tests, show the math, show the proof that the test was wrong. Then have your test get published in a scientific journal to also be equally vetted. Do scientists ever have their judgment clouded by their own bias in favor of funding for their own work? Do they ever make hasty conclusions that are hard to refute because the few people who are capable of doing so have the same conflicts of interest as the first? Do they ever make imprecise and unknowable predictions about the future for which it's really hard to know if an "educated guess" is based on good science or politicized? Do they ever just have such a poor understanding of the political aspects of their proposals that they just miss the mark entirely? Do they ever do experiments that are hard to replicate because of money and expertise barriers? I'm generally not a "science skeptic" but damn, don't give them more credit than they deserve. 
They are far from unbiased arbiters of truth, as is the implication here with the emphasis on idolizing "peer-review" as if it gets rid of being wrong. All of the things you listed are extremely risky things to do because of how much street cred people get for proving other people's shit wrong. Once you publish, you are insanely vulnerable. I have been a part of many different investigations that never got published because it was still technically possible we were too vulnerable to a certain avenue. In reality, a lot of stuff *could* be likely published but is not because the consequences of being wrong are too great. It is really, really bad to have to withdraw a paper. It is borderline career ending. Note: This only applies to western research. Chinese research littered with trash and none of this applies to the Chinese scientific community. Edit: Except for when you said "Do they ever do experiments that are hard to replicate because of money and expertise barriers?" Being the only person to make a certain type of spectrometer, and then publishing working that uses it, is not necessarily a bad thing. Papers where someone is the first to do a type of detection always comes with lots of caution and the reason for believing the new spectroscopy method are all clearly laid out. People are like "Yo, this shit is new and wild, but here's why I'm pretty sure it is right." Who is vulnerable, and how? The biggest existential crisis for most scientists is that they won't get more money to do their work. If they're being funded to consistently lie and/or mislead with partial truths that simply cannot be painted as fabrications (no one gets punished for what can be a "misdirection" done in good faith), are they more vulnerable to pissing off their benefactors or to some guys proving that they are kinda-sorta-maybe wrong? Would Google Labs publish something that, for example, proves that Google's approach to search is fundamentally flawed and will not scale to the web in 10 years, with little in the way of recourse? If they would not, do you suspect political, rather than academic, motives for doing so? And do you consider Google Labs researchers to be real scientists, given both their political motives and the fact that they genuinely have important scientific contributions?
Well, first of all, there's a difference between hard and soft sciences. There is a real perceived problem in terms of funding and the questions it raises about how backers may shape the data. I think most scientists wouldn't willingly post fake data just because it makes their backer look bad, but it may affect the study in other ways. There's a reason scientists are always trying to replicate experiments and verify claims, and for the most part science is eventually self-correcting if a mistake happens.
Regarding Google, that would be way more social science and therefore impossible to prove. If there was something wrong with their math, I'm sure Google Labs would love to hear it and fix it. If someone really thought it was unsustainable, Google would probably find out why they think that and independently come to their own conclusion about it. Now the possibility is of course there that they would just dismiss it, but if they did they'd probably have a reason. Whether it was a good or bad reason is up for debate, but for them it would be a good reason, and it's ultimately their company.
|
On December 17 2016 03:21 zlefin wrote:Show nested quote +On December 17 2016 03:17 travis wrote: christ im not gonna come back here. sorry I don't read through every single page. And I am so so sorry that I get 2nd opinions on things and remain skeptical of things that I am not an expert on. I am so sorry I recognize I am not all knowing. If only I could be like so many of the geniuses in this thread that know if something is real or not at a glance and are just so fucking smart that they already have such a solid grasp on what is possible or not. As for doing a google search, it's literally the first thing I did. If a "debunking" happened it't not in the first 2 pages for the last 24 hours.
as for "people buying it" gosh I don't know where I get that idea maybe the 90% like ratio on the video?
seriously what the fuck is wrong with some of you what's wrong wtih you? you're being needlessly aggro over it. a 90% like ratio doesn't mean much in terms of quality. I'm sure we could find tons of utter drek that gets 90% like ratio. oddly, I look for debunking and finds tons of stuff on it with google; though maybe not one for this specific video; not really worth going through all the links to check. pesky google giving different results for different people.
I sympathize with travis. I didn't follow or care about that particular chain of reactions in this thread (because I thought the video was absolute trash and not worth discussing), but I saw people bitching about him living in an echo chamber, etc. I can see how he can feel annoyed by that. It would feel like a personal attack on me if I had to deal with responses such as that. Now you're saying "what is wrong with you" and continuing the line of personal attacks.
|
On December 17 2016 02:57 LegalLord wrote:Show nested quote +On December 17 2016 02:38 Thieving Magpie wrote:On December 17 2016 02:27 LegalLord wrote:On December 17 2016 02:25 Thieving Magpie wrote:On December 17 2016 01:57 xDaunt wrote:On December 17 2016 01:45 Acrofales wrote:On December 17 2016 01:29 xDaunt wrote:Facebook detailed a new plan Thursday to target the rapid spread of fake news across its site, a phenomenon that received renewed attention in the weeks following the 2016 election, with accusations that it may have influenced the behavior of voters.
The problem reached a breaking point two weeks ago when a gunman entered a pizza restaurant in Washington, D.C., to investigate an internet-based conspiracy theory about a child-sex ring that does not exist.
Now the move from the internet’s largest social-media platform has some intentional fake-news writers, who created their websites to “satirize” right-wing conspiracies or exploit Facebook’s algorithm, believing they’ll soon be out of business.
But the new program also has conspiracy theorists, ones who believe Hillary Clinton’s fictitious ties to the occult are the “real news,” excitedly drawing battle lines over the future of the news on social media.
Should Facebook’s fact-check initiative take off and result in censorship of propagandist sites, editors at websites like Infowars and alt-right leaders insist it will only reinforce the belief that certain ideas are being suppressed in favor of facts from mainstream outlets. One editor told The Daily Beast the Facebook plan proves that now the “‘Infowar’ isn’t a cliché, it’s perfectly apt.”
If Facebook’s experiment is applied correctly, authors of intentionally fake news face a potential hurdle for generating advertising revenue for their sites, if not the banning of their stories from the social network outright.
Marco Chacon, the creator of the intentional fake news website RealTrueNews.org, says Facebook is finally taking a positive step toward making sure websites like his no longer go viral on the social network. In an article for The Daily Beast in November, Chacon wrote that he created his site to make those who share fake right-wing news on Facebook more aware that they’re “susceptible to stories written ‘in [their] language’ that are complete, obvious, utter fabrications.” Chacon’s larger aim, he wrote, was to force Facebook to work out a solution for a fake-news epidemic he believed was “deeply entrenched” and easily monetized.
“This is the right approach,” said Chacon of Facebook’s new plan Thursday. “The people who fear censors fear a whitelist of ‘approved news sites.’ This sounds like a more intelligent heuristic that is exactly the kind of thing a company like Facebook should employ.”
Chacon, who said he was preparing for NBC News to interview him about his antics in his home later in the day, added that the new safeguards “will give people some greater responsibility in what they spread.” Source. We'll have to wait and see what Facebook actually does, but it is becoming pretty clear that these major tech platforms (particularly Twitter) are aligning themselves with the Left. I tend to think that this is going to be a mistake long term. Pretty sure anti-fake-news is not left or right, unless you are willing to concede that reality has a liberal bias (mainly because conservatives have taken pants-on-heads anti-science stances on a number of issues ranging from climate change to trickle-down economics)? But platforms that people consider important sources of information (although why people think Facebook is a good source of news is beyond me) should take care of the information they are providing. They either shut down their "sponsored links", or they curate it, because they are the ones who are responsible for what shows up on their site. And if that is stuff like "Pope Francis says Hillary is devilspawn" (or whatever the fake headline along those lines was), then that is (partially) their responsibility. I don't think that anyone is going to disagree with the proposition that the truly fake news (ie outright making shit up) is a problem. However, there are two problems with the Left's current attack on fake news. The first is that the breadth of the attack encompasses not just true fake news sites but also conservative media as well. The second problem is that left wing sites aren't receiving the same scrutiny and attention as the right wing analogs. For these reasons, the war on fake news looks very much like an excuse to wage an information war in the name of partisanship. We have to wait and see what Facebook actually does, but if it goes down the same path that Twitter has, it will be a real problem. I don't know if claiming news media that denies science and scientists can be considered real news. Do scientists ever bullshit? Is scientific consensus ever bullshit, especially on political issues? The skepticism is usually wrong but absolutely warranted. Scientific journals are actually heavily tested and their work reviewed and re-reviewed constantly. Disagreeing with the current consensus without running your own test on the disagreed upon journal is nothing but intentional misleading of your reader base. If you think a journal is wrong--prove it. Show tests, show the math, show the proof that the test was wrong. Then have your test get published in a scientific journal to also be equally vetted. Do scientists ever have their judgment clouded by their own bias in favor of funding for their own work? Do they ever make hasty conclusions that are hard to refute because the few people who are capable of doing so have the same conflicts of interest as the first? Do they ever make imprecise and unknowable predictions about the future for which it's really hard to know if an "educated guess" is based on good science or politicized? Do they ever just have such a poor understanding of the political aspects of their proposals that they just miss the mark entirely? Do they ever do experiments that are hard to replicate because of money and expertise barriers? I'm generally not a "science skeptic" but damn, don't give them more credit than they deserve. 
They are far from unbiased arbiters of truth, as is the implication here with the emphasis on idolizing "peer-review" as if it gets rid of being wrong.
When a paper is published in a science journal, it literally walks you through how the authors got to their conclusion. The reason it is laid out that way is so that people can either replicate it or find a mistake in it. Anyone can do it. That means an American liberal's work can be tested by a Russian conservative, and if the Russian finds mistakes, he can publish and get mucho rewards for moving the field forward. And if a former warlord in Africa also finds mistakes in the conservative Russian's work, he can publish that as well, gaining similar praise among his peers.
Praise from peers is one of the biggest ways scientists get funding, i.e. because they are famous.
Scientists are literally financially incentivized to prove each other wrong.
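To make the "anyone can check it" point concrete, here is a minimal sketch of what an outside replication check looks like - hypothetical numbers, plain Python, not something any lab actually ran. Because a paper spells out the method and reports its estimate with an error bar, a second group can rerun the experiment and test whether the two numbers are even consistent:

```python
# Hypothetical illustration: checking a published estimate against an
# independent replication. All numbers here are made up.
import math
import random

random.seed(1)

# The "published" paper reports an effect of 0.80 with a standard error of 0.05
# and describes exactly how the measurement was done.
reported_effect, reported_se = 0.80, 0.05

# A second group reruns the described experiment. Suppose the real underlying
# effect is only 0.60, i.e. the original paper overstated it.
true_effect, noise_sd, n = 0.60, 0.5, 100
sample = [random.gauss(true_effect, noise_sd) for _ in range(n)]
rep_effect = sum(sample) / n
rep_sd = (sum((x - rep_effect) ** 2 for x in sample) / (n - 1)) ** 0.5
rep_se = rep_sd / n ** 0.5

# Two-sided z-test: are the published and replicated estimates consistent?
z = (reported_effect - rep_effect) / math.hypot(reported_se, rep_se)
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"replication: {rep_effect:.2f} +/- {rep_se:.2f} (published: {reported_effect:.2f})")
print(f"z = {z:.2f}, p = {p_value:.4f}")  # a small p-value says the two disagree
```

The write-up itself (method, sample size, uncertainty) is what makes that kind of check possible for a stranger on another continent.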
|
On December 17 2016 03:05 zlefin wrote:Show nested quote +On December 17 2016 03:02 a_flayer wrote:On December 17 2016 02:57 LegalLord wrote:On December 17 2016 02:38 Thieving Magpie wrote:On December 17 2016 02:27 LegalLord wrote:On December 17 2016 02:25 Thieving Magpie wrote:On December 17 2016 01:57 xDaunt wrote:On December 17 2016 01:45 Acrofales wrote:On December 17 2016 01:29 xDaunt wrote:Facebook detailed a new plan Thursday to target the rapid spread of fake news across its site, a phenomenon that received renewed attention in the weeks following the 2016 election, with accusations that it may have influenced the behavior of voters.
The problem reached a breaking point two weeks ago when a gunman entered a pizza restaurant in Washington, D.C., to investigate an internet-based conspiracy theory about a child-sex ring that does not exist.
Now the move from the internet’s largest social-media platform has some intentional fake-news writers, who created their websites to “satirize” right-wing conspiracies or exploit Facebook’s algorithm, believing they’ll soon be out of business.
But the new program also has conspiracy theorists, ones who believe Hillary Clinton’s fictitious ties to the occult are the “real news,” excitedly drawing battle lines over the future of the news on social media.
Should Facebook’s fact-check initiative take off and result in censorship of propagandist sites, editors at websites like Infowars and alt-right leaders insist it will only reinforce the belief that certain ideas are being suppressed in favor of facts from mainstream outlets. One editor told The Daily Beast the Facebook plan proves that now the “‘Infowar’ isn’t a cliché, it’s perfectly apt.”
If Facebook’s experiment is applied correctly, authors of intentionally fake news face a potential hurdle for generating advertising revenue for their sites, if not the banning of their stories from the social network outright.
Marco Chacon, the creator of the intentional fake news website RealTrueNews.org, says Facebook is finally taking a positive step toward making sure websites like his no longer go viral on the social network. In an article for The Daily Beast in November, Chacon wrote that he created his site to make those who share fake right-wing news on Facebook more aware that they’re “susceptible to stories written ‘in [their] language’ that are complete, obvious, utter fabrications.” Chacon’s larger aim, he wrote, was to force Facebook to work out a solution for a fake-news epidemic he believed was “deeply entrenched” and easily monetized.
“This is the right approach,” said Chacon of Facebook’s new plan Thursday. “The people who fear censors fear a whitelist of ‘approved news sites.’ This sounds like a more intelligent heuristic that is exactly the kind of thing a company like Facebook should employ.”
Chacon, who said he was preparing for NBC News to interview him about his antics in his home later in the day, added that the new safeguards “will give people some greater responsibility in what they spread.” Source. We'll have to wait and see what Facebook actually does, but it is becoming pretty clear that these major tech platforms (particularly Twitter) are aligning themselves with the Left. I tend to think that this is going to be a mistake long term. Pretty sure anti-fake-news is not left or right, unless you are willing to concede that reality has a liberal bias (mainly because conservatives have taken pants-on-heads anti-science stances on a number of issues ranging from climate change to trickle-down economics)? But platforms that people consider important sources of information (although why people think Facebook is a good source of news is beyond me) should take care of the information they are providing. They either shut down their "sponsored links", or they curate it, because they are the ones who are responsible for what shows up on their site. And if that is stuff like "Pope Francis says Hillary is devilspawn" (or whatever the fake headline along those lines was), then that is (partially) their responsibility. I don't think that anyone is going to disagree with the proposition that the truly fake news (ie outright making shit up) is a problem. However, there are two problems with the Left's current attack on fake news. The first is that the breadth of the attack encompasses not just true fake news sites but also conservative media as well. The second problem is that left wing sites aren't receiving the same scrutiny and attention as the right wing analogs. For these reasons, the war on fake news looks very much like an excuse to wage an information war in the name of partisanship. We have to wait and see what Facebook actually does, but if it goes down the same path that Twitter has, it will be a real problem. I don't know if claiming news media that denies science and scientists can be considered real news. Do scientists ever bullshit? Is scientific consensus ever bullshit, especially on political issues? The skepticism is usually wrong but absolutely warranted. Scientific journals are actually heavily tested and their work reviewed and re-reviewed constantly. Disagreeing with the current consensus without running your own test on the disagreed upon journal is nothing but intentional misleading of your reader base. If you think a journal is wrong--prove it. Show tests, show the math, show the proof that the test was wrong. Then have your test get published in a scientific journal to also be equally vetted. Do scientists ever have their judgment clouded by their own bias in favor of funding for their own work? Do they ever make hasty conclusions that are hard to refute because the few people who are capable of doing so have the same conflicts of interest as the first? Do they ever make imprecise and unknowable predictions about the future for which it's really hard to know if an "educated guess" is based on good science or politicized? Do they ever just have such a poor understanding of the political aspects of their proposals that they just miss the mark entirely? Do they ever do experiments that are hard to replicate because of money and expertise barriers? I'm generally not a "science skeptic" but damn, don't give them more credit than they deserve. 
They are far from unbiased arbiters of truth, as is the implication here with the emphasis on idolizing "peer-review" as if it gets rid of being wrong. In short, there's two major problems with regards to science in general: shitty media reporting (take the most extreme conclusion of a journal and use it as the narrative in a news report) and a lack of funding/willingness for peer review. Beyond that, there's going to be human errors such as what you said, but that's really a minor issue compared to the other two and one that has no real resolution because humans will always make mistakes. there's a known 3rd medium-scale problem: the bias toward interesting study results being reported (this has several facets) this link seems to have some decent info on that, though I haven't fully read it to verify that: http://www.editage.com/insights/publication-and-reporting-biases-and-how-they-impact-publication-of-research
Not just that, but the bias towards novel research is huge. Although researchers are encouraged to test their hypotheses thoroughly, it is generally assumed that the underlying data is correct and representative. And there is very, very little incentive to redo someone else's experiment to gather your own data, because most of the time you'll just confirm the other people's work, and that won't get published anywhere (and thus is considered a waste of time). People only redo experiments when they have some other reason to believe that the conclusion is wrong. E.g., you have a hypothesis and have done experiments and it seems to work, but someone published a paper with experiments whose results would undermine your own theory. Now you have to either discard your own hypothesis, or you have to figure out why there are conflicting results.
Most research goes unchecked for basic validity. Soundness is generally pretty thoroughly checked in peer review, but validity is hard and often expensive to check, so we assume that a single researcher or group is honest and checks it themselves when publishing. And this can lead to huge scandals in research communities when it turns out someone was actually falsifying data.
Luckily those cases are very rare (if they weren't, we'd expect the scarce checks for validity that do occur to turn up scandals more often). Even so, it is not easy to prove data invalid, and it can cause real issues (think Wakefield's autism bullshit for an infamous example).
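The size of that publish-only-the-interesting-results effect is easy to see with a toy simulation - a minimal sketch under made-up assumptions (plain Python, invented effect sizes, no particular field in mind): lots of labs measure the same small effect, only the positive and statistically significant results get written up, and the average of the "published" studies lands well above the truth.

```python
# Toy model of publication bias: everyone studies the same small true effect,
# but only "interesting" results (positive and p < 0.05) get written up.
# All parameters are invented for illustration.
import math
import random

random.seed(0)

TRUE_EFFECT = 0.10   # the small real effect every lab is measuring
NOISE_SD = 1.0
N_PER_STUDY = 50
N_STUDIES = 2000

def run_study():
    """One lab runs one experiment and reports (mean, standard error)."""
    data = [random.gauss(TRUE_EFFECT, NOISE_SD) for _ in range(N_PER_STUDY)]
    mean = sum(data) / N_PER_STUDY
    sd = (sum((x - mean) ** 2 for x in data) / (N_PER_STUDY - 1)) ** 0.5
    return mean, sd / N_PER_STUDY ** 0.5

def interesting(mean, se):
    """Crude filter: positive effect, significant at the usual 5% level."""
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(mean) / se / math.sqrt(2))))
    return mean > 0 and p < 0.05

results = [run_study() for _ in range(N_STUDIES)]
published = [m for m, se in results if interesting(m, se)]

print(f"true effect:              {TRUE_EFFECT:.3f}")
print(f"average over all studies: {sum(m for m, _ in results) / N_STUDIES:.3f}")
print(f"average over published:   {sum(published) / len(published):.3f}")
print(f"written up: {len(published)} of {N_STUDIES}")
```

There is no fraud anywhere in that model; the inflation falls straight out of which results are considered worth writing up, which is the reporting bias the editage link above is describing.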
|
On December 17 2016 03:24 a_flayer wrote:Show nested quote +On December 17 2016 03:21 zlefin wrote:On December 17 2016 03:17 travis wrote: christ im not gonna come back here. sorry I don't read through every single page. And I am so so sorry that I get 2nd opinions on things and remain skeptical of things that I am not an expert on. I am so sorry I recognize I am not all knowing. If only I could be like so many of the geniuses in this thread that know if something is real or not at a glance and are just so fucking smart that they already have such a solid grasp on what is possible or not. As for doing a google search, it's literally the first thing I did. If a "debunking" happened it't not in the first 2 pages for the last 24 hours.
as for "people buying it" gosh I don't know where I get that idea maybe the 90% like ratio on the video?
seriously what the fuck is wrong with some of you what's wrong wtih you? you're being needlessly aggro over it. a 90% like ratio doesn't mean much in terms of quality. I'm sure we could find tons of utter drek that gets 90% like ratio. oddly, I look for debunking and finds tons of stuff on it with google; though maybe not one for this specific video; not really worth going through all the links to check. pesky google giving different results for different people. I sympathize with travis. I didn't follow or care about that particular chain of reactions in this thread (because I thought the video was absolute trash and not worth discussing), but I saw people bitching about him living in an echo chamber, etc. I can see how he can feel annoyed by that. It would feel like a personal attack on me if I had to deal with responses such as that. Now you're saying "what is wrong with you" and continuing the line of personal attacks. if he has a problem, he should report offenders, take it to website feedback, or more civilly object. swearing at people, and beign as aggro as he was, is not justified anger, and is not acceptable imho. what I said does not qualify as continuing the line of personal attacks.
|
On December 17 2016 03:23 Karis Vas Ryaar wrote:Show nested quote +On December 17 2016 03:15 LegalLord wrote:On December 17 2016 03:03 Mohdoo wrote:On December 17 2016 02:57 LegalLord wrote:On December 17 2016 02:38 Thieving Magpie wrote:On December 17 2016 02:27 LegalLord wrote:On December 17 2016 02:25 Thieving Magpie wrote:On December 17 2016 01:57 xDaunt wrote:On December 17 2016 01:45 Acrofales wrote:On December 17 2016 01:29 xDaunt wrote:[quote] Source. We'll have to wait and see what Facebook actually does, but it is becoming pretty clear that these major tech platforms (particularly Twitter) are aligning themselves with the Left. I tend to think that this is going to be a mistake long term. Pretty sure anti-fake-news is not left or right, unless you are willing to concede that reality has a liberal bias (mainly because conservatives have taken pants-on-heads anti-science stances on a number of issues ranging from climate change to trickle-down economics)? But platforms that people consider important sources of information (although why people think Facebook is a good source of news is beyond me) should take care of the information they are providing. They either shut down their "sponsored links", or they curate it, because they are the ones who are responsible for what shows up on their site. And if that is stuff like "Pope Francis says Hillary is devilspawn" (or whatever the fake headline along those lines was), then that is (partially) their responsibility. I don't think that anyone is going to disagree with the proposition that the truly fake news (ie outright making shit up) is a problem. However, there are two problems with the Left's current attack on fake news. The first is that the breadth of the attack encompasses not just true fake news sites but also conservative media as well. The second problem is that left wing sites aren't receiving the same scrutiny and attention as the right wing analogs. For these reasons, the war on fake news looks very much like an excuse to wage an information war in the name of partisanship. We have to wait and see what Facebook actually does, but if it goes down the same path that Twitter has, it will be a real problem. I don't know if claiming news media that denies science and scientists can be considered real news. Do scientists ever bullshit? Is scientific consensus ever bullshit, especially on political issues? The skepticism is usually wrong but absolutely warranted. Scientific journals are actually heavily tested and their work reviewed and re-reviewed constantly. Disagreeing with the current consensus without running your own test on the disagreed upon journal is nothing but intentional misleading of your reader base. If you think a journal is wrong--prove it. Show tests, show the math, show the proof that the test was wrong. Then have your test get published in a scientific journal to also be equally vetted. Do scientists ever have their judgment clouded by their own bias in favor of funding for their own work? Do they ever make hasty conclusions that are hard to refute because the few people who are capable of doing so have the same conflicts of interest as the first? Do they ever make imprecise and unknowable predictions about the future for which it's really hard to know if an "educated guess" is based on good science or politicized? Do they ever just have such a poor understanding of the political aspects of their proposals that they just miss the mark entirely? 
Do they ever do experiments that are hard to replicate because of money and expertise barriers? I'm generally not a "science skeptic" but damn, don't give them more credit than they deserve. They are far from unbiased arbiters of truth, as is the implication here with the emphasis on idolizing "peer-review" as if it gets rid of being wrong. All of the things you listed are extremely risky things to do because of how much street cred people get for proving other people's shit wrong. Once you publish, you are insanely vulnerable. I have been a part of many different investigations that never got published because it was still technically possible we were too vulnerable to a certain avenue. In reality, a lot of stuff *could* be likely published but is not because the consequences of being wrong are too great. It is really, really bad to have to withdraw a paper. It is borderline career ending. Note: This only applies to western research. Chinese research littered with trash and none of this applies to the Chinese scientific community. Edit: Except for when you said "Do they ever do experiments that are hard to replicate because of money and expertise barriers?" Being the only person to make a certain type of spectrometer, and then publishing working that uses it, is not necessarily a bad thing. Papers where someone is the first to do a type of detection always comes with lots of caution and the reason for believing the new spectroscopy method are all clearly laid out. People are like "Yo, this shit is new and wild, but here's why I'm pretty sure it is right." Who is vulnerable, and how? The biggest existential crisis for most scientists is that they won't get more money to do their work. If they're being funded to consistently lie and/or mislead with partial truths that simply cannot be painted as fabrications (no one gets punished for what can be a "misdirection" done in good faith), are they more vulnerable to pissing off their benefactors or to some guys proving that they are kinda-sorta-maybe wrong? Would Google Labs publish something that, for example, proves that Google's approach to search is fundamentally flawed and will not scale to the web in 10 years, with little in the way of recourse? If they would not, do you suspect political, rather than academic, motives for doing so? And do you consider Google Labs researchers to be real scientists, given both their political motives and the fact that they genuinely have important scientific contributions? well first of all theres a difference between hard and soft sciences. There is a real percieved problem in terms of funding and raising questions about how they may shape the data. I think most scientists wouldn't willingly post fake data just because it makes there backer look bad but it may affect the study in other ways. There's a reason scientists are always trying to replicate experiments and verify claims and for the most part science is eventually self correcting if a mistake happens. Similarly, I am more skeptical of economics, history, and the like, softer science, than I am of mathematics, physics, engineering, chemistry, and biology, harder science. Few people would contest that. Similarly, the former has less recourse in the way of "replicating experiments" since they mostly are forced to rely on historical data and micro-examples. Even the latter can be politicized though. Say that a certain physics lab is funded by the contribution of Lockheed Martin doing R&D work on the stealth aspect of their F-35 jet. 
Do you think said physicist might, at least subconsciously, be a little biased towards producing results that give LM the perception they want of the feasibility of the stealth functionality? And it doesn't even have to be wrong - it could just be one of those frequent cases where not all the conditions needed to make the experiments work actually apply.
|
On December 17 2016 03:29 Acrofales wrote:Show nested quote +On December 17 2016 03:05 zlefin wrote:On December 17 2016 03:02 a_flayer wrote:On December 17 2016 02:57 LegalLord wrote:On December 17 2016 02:38 Thieving Magpie wrote:On December 17 2016 02:27 LegalLord wrote:On December 17 2016 02:25 Thieving Magpie wrote:On December 17 2016 01:57 xDaunt wrote:On December 17 2016 01:45 Acrofales wrote:On December 17 2016 01:29 xDaunt wrote:[quote] Source. We'll have to wait and see what Facebook actually does, but it is becoming pretty clear that these major tech platforms (particularly Twitter) are aligning themselves with the Left. I tend to think that this is going to be a mistake long term. Pretty sure anti-fake-news is not left or right, unless you are willing to concede that reality has a liberal bias (mainly because conservatives have taken pants-on-heads anti-science stances on a number of issues ranging from climate change to trickle-down economics)? But platforms that people consider important sources of information (although why people think Facebook is a good source of news is beyond me) should take care of the information they are providing. They either shut down their "sponsored links", or they curate it, because they are the ones who are responsible for what shows up on their site. And if that is stuff like "Pope Francis says Hillary is devilspawn" (or whatever the fake headline along those lines was), then that is (partially) their responsibility. I don't think that anyone is going to disagree with the proposition that the truly fake news (ie outright making shit up) is a problem. However, there are two problems with the Left's current attack on fake news. The first is that the breadth of the attack encompasses not just true fake news sites but also conservative media as well. The second problem is that left wing sites aren't receiving the same scrutiny and attention as the right wing analogs. For these reasons, the war on fake news looks very much like an excuse to wage an information war in the name of partisanship. We have to wait and see what Facebook actually does, but if it goes down the same path that Twitter has, it will be a real problem. I don't know if claiming news media that denies science and scientists can be considered real news. Do scientists ever bullshit? Is scientific consensus ever bullshit, especially on political issues? The skepticism is usually wrong but absolutely warranted. Scientific journals are actually heavily tested and their work reviewed and re-reviewed constantly. Disagreeing with the current consensus without running your own test on the disagreed upon journal is nothing but intentional misleading of your reader base. If you think a journal is wrong--prove it. Show tests, show the math, show the proof that the test was wrong. Then have your test get published in a scientific journal to also be equally vetted. Do scientists ever have their judgment clouded by their own bias in favor of funding for their own work? Do they ever make hasty conclusions that are hard to refute because the few people who are capable of doing so have the same conflicts of interest as the first? Do they ever make imprecise and unknowable predictions about the future for which it's really hard to know if an "educated guess" is based on good science or politicized? Do they ever just have such a poor understanding of the political aspects of their proposals that they just miss the mark entirely? 
Do they ever do experiments that are hard to replicate because of money and expertise barriers? I'm generally not a "science skeptic" but damn, don't give them more credit than they deserve. They are far from unbiased arbiters of truth, as is the implication here with the emphasis on idolizing "peer-review" as if it gets rid of being wrong. In short, there's two major problems with regards to science in general: shitty media reporting (take the most extreme conclusion of a journal and use it as the narrative in a news report) and a lack of funding/willingness for peer review. Beyond that, there's going to be human errors such as what you said, but that's really a minor issue compared to the other two and one that has no real resolution because humans will always make mistakes. there's a known 3rd medium-scale problem: the bias toward interesting study results being reported (this has several facets) this link seems to have some decent info on that, though I haven't fully read it to verify that: http://www.editage.com/insights/publication-and-reporting-biases-and-how-they-impact-publication-of-research Not just that, but the bias towards novel research is huge. Although researchers are encouraged to test their hypotheses thoroughly, it is generally assumed that the underlying data is correct and representative. And there is very very little incentive to redo someone else's experiment to gather your own data, because most of the time, you'll just confirm the other peoples' work and that won't get published anywhere (and thus is considered a waste of time). People only redo experiments when they have some other reason to believe that the conclusion is wrong. E.g. you have a hypothesis and have done experiments and it seems to work, but someone published a paper with experiments whose results would undermine your own theory. Now you have to either discard your own hypothesis, or you have to figure out why there are conflicting results. Most research goes unchecked for basic validity. Soundness is generally pretty thoroughly checked in peer reviews, but validity is hard and often expensive to check, so we assume that a single researcher/group is honest and checks that himself when publishing. And this can lead to huge scandals in research communities when it turns out someone was actually falsifying data. Luckily those cases are very rare (if they weren't we'd assume the scarse checks for validity that do occur would result in scandals more often). Even so, it is not easy to prove data invalid, and it can cause real issues (think Wakefield's autism bullshit for an infamous example).
What you are describing is peer review, is it not? That was one of two things I listed as a major problem in the post that was quoted by the one you responded to. Good elaboration though.
|
On December 17 2016 03:30 zlefin wrote:Show nested quote +On December 17 2016 03:24 a_flayer wrote:On December 17 2016 03:21 zlefin wrote:On December 17 2016 03:17 travis wrote: christ im not gonna come back here. sorry I don't read through every single page. And I am so so sorry that I get 2nd opinions on things and remain skeptical of things that I am not an expert on. I am so sorry I recognize I am not all knowing. If only I could be like so many of the geniuses in this thread that know if something is real or not at a glance and are just so fucking smart that they already have such a solid grasp on what is possible or not. As for doing a google search, it's literally the first thing I did. If a "debunking" happened it't not in the first 2 pages for the last 24 hours.
as for "people buying it" gosh I don't know where I get that idea maybe the 90% like ratio on the video?
seriously what the fuck is wrong with some of you what's wrong wtih you? you're being needlessly aggro over it. a 90% like ratio doesn't mean much in terms of quality. I'm sure we could find tons of utter drek that gets 90% like ratio. oddly, I look for debunking and finds tons of stuff on it with google; though maybe not one for this specific video; not really worth going through all the links to check. pesky google giving different results for different people. I sympathize with travis. I didn't follow or care about that particular chain of reactions in this thread (because I thought the video was absolute trash and not worth discussing), but I saw people bitching about him living in an echo chamber, etc. I can see how he can feel annoyed by that. It would feel like a personal attack on me if I had to deal with responses such as that. Now you're saying "what is wrong with you" and continuing the line of personal attacks. if he has a problem, he should report offenders, take it to website feedback, or more civilly object. swearing at people, and beign as aggro as he was, is not justified anger, and is not acceptable imho. what I said does not qualify as continuing the line of personal attacks.
You are being kind of a dick
EDIT: I have fat thumbs and posted before I intended (from phone). Point was: while I agree with you that he has other venues to bring up such complaints, perhaps it would be appropriate to reflect a bit upon why he acted like he did. Supposedly we are all here in our spare time, and when someone brings up a topic for discussion it is because they would like an actual discussion. If all you are going to do is ridicule people for whatever topic, why not ignore the topic instead, as you are clearly not interested in an honest discussion.
|
On December 17 2016 03:15 LegalLord wrote:Show nested quote +On December 17 2016 03:03 Mohdoo wrote:On December 17 2016 02:57 LegalLord wrote:On December 17 2016 02:38 Thieving Magpie wrote:On December 17 2016 02:27 LegalLord wrote:On December 17 2016 02:25 Thieving Magpie wrote:On December 17 2016 01:57 xDaunt wrote:On December 17 2016 01:45 Acrofales wrote:On December 17 2016 01:29 xDaunt wrote:Facebook detailed a new plan Thursday to target the rapid spread of fake news across its site, a phenomenon that received renewed attention in the weeks following the 2016 election, with accusations that it may have influenced the behavior of voters.
The problem reached a breaking point two weeks ago when a gunman entered a pizza restaurant in Washington, D.C., to investigate an internet-based conspiracy theory about a child-sex ring that does not exist.
Now the move from the internet’s largest social-media platform has some intentional fake-news writers, who created their websites to “satirize” right-wing conspiracies or exploit Facebook’s algorithm, believing they’ll soon be out of business.
But the new program also has conspiracy theorists, ones who believe Hillary Clinton’s fictitious ties to the occult are the “real news,” excitedly drawing battle lines over the future of the news on social media.
Should Facebook’s fact-check initiative take off and result in censorship of propagandist sites, editors at websites like Infowars and alt-right leaders insist it will only reinforce the belief that certain ideas are being suppressed in favor of facts from mainstream outlets. One editor told The Daily Beast the Facebook plan proves that now the “‘Infowar’ isn’t a cliché, it’s perfectly apt.”
If Facebook’s experiment is applied correctly, authors of intentionally fake news face a potential hurdle for generating advertising revenue for their sites, if not the banning of their stories from the social network outright.
Marco Chacon, the creator of the intentional fake news website RealTrueNews.org, says Facebook is finally taking a positive step toward making sure websites like his no longer go viral on the social network. In an article for The Daily Beast in November, Chacon wrote that he created his site to make those who share fake right-wing news on Facebook more aware that they’re “susceptible to stories written ‘in [their] language’ that are complete, obvious, utter fabrications.” Chacon’s larger aim, he wrote, was to force Facebook to work out a solution for a fake-news epidemic he believed was “deeply entrenched” and easily monetized.
“This is the right approach,” said Chacon of Facebook’s new plan Thursday. “The people who fear censors fear a whitelist of ‘approved news sites.’ This sounds like a more intelligent heuristic that is exactly the kind of thing a company like Facebook should employ.”
Chacon, who said he was preparing for NBC News to interview him about his antics in his home later in the day, added that the new safeguards “will give people some greater responsibility in what they spread.” Source. We'll have to wait and see what Facebook actually does, but it is becoming pretty clear that these major tech platforms (particularly Twitter) are aligning themselves with the Left. I tend to think that this is going to be a mistake long term. Pretty sure anti-fake-news is not left or right, unless you are willing to concede that reality has a liberal bias (mainly because conservatives have taken pants-on-heads anti-science stances on a number of issues ranging from climate change to trickle-down economics)? But platforms that people consider important sources of information (although why people think Facebook is a good source of news is beyond me) should take care of the information they are providing. They either shut down their "sponsored links", or they curate it, because they are the ones who are responsible for what shows up on their site. And if that is stuff like "Pope Francis says Hillary is devilspawn" (or whatever the fake headline along those lines was), then that is (partially) their responsibility. I don't think that anyone is going to disagree with the proposition that the truly fake news (ie outright making shit up) is a problem. However, there are two problems with the Left's current attack on fake news. The first is that the breadth of the attack encompasses not just true fake news sites but also conservative media as well. The second problem is that left wing sites aren't receiving the same scrutiny and attention as the right wing analogs. For these reasons, the war on fake news looks very much like an excuse to wage an information war in the name of partisanship. We have to wait and see what Facebook actually does, but if it goes down the same path that Twitter has, it will be a real problem. I don't know if claiming news media that denies science and scientists can be considered real news. Do scientists ever bullshit? Is scientific consensus ever bullshit, especially on political issues? The skepticism is usually wrong but absolutely warranted. Scientific journals are actually heavily tested and their work reviewed and re-reviewed constantly. Disagreeing with the current consensus without running your own test on the disagreed upon journal is nothing but intentional misleading of your reader base. If you think a journal is wrong--prove it. Show tests, show the math, show the proof that the test was wrong. Then have your test get published in a scientific journal to also be equally vetted. Do scientists ever have their judgment clouded by their own bias in favor of funding for their own work? Do they ever make hasty conclusions that are hard to refute because the few people who are capable of doing so have the same conflicts of interest as the first? Do they ever make imprecise and unknowable predictions about the future for which it's really hard to know if an "educated guess" is based on good science or politicized? Do they ever just have such a poor understanding of the political aspects of their proposals that they just miss the mark entirely? Do they ever do experiments that are hard to replicate because of money and expertise barriers? I'm generally not a "science skeptic" but damn, don't give them more credit than they deserve. 
They are far from unbiased arbiters of truth, as is the implication here with the emphasis on idolizing "peer-review" as if it gets rid of being wrong. All of the things you listed are extremely risky things to do because of how much street cred people get for proving other people's shit wrong. Once you publish, you are insanely vulnerable. I have been a part of many different investigations that never got published because it was still technically possible we were too vulnerable to a certain avenue. In reality, a lot of stuff *could* be likely published but is not because the consequences of being wrong are too great. It is really, really bad to have to withdraw a paper. It is borderline career ending. Note: This only applies to western research. Chinese research littered with trash and none of this applies to the Chinese scientific community. Edit: Except for when you said "Do they ever do experiments that are hard to replicate because of money and expertise barriers?" Being the only person to make a certain type of spectrometer, and then publishing working that uses it, is not necessarily a bad thing. Papers where someone is the first to do a type of detection always comes with lots of caution and the reason for believing the new spectroscopy method are all clearly laid out. People are like "Yo, this shit is new and wild, but here's why I'm pretty sure it is right." Who is vulnerable, and how? The biggest existential crisis for most scientists is that they won't get more money to do their work. If they're being funded to consistently lie and/or mislead with partial truths that simply cannot be painted as fabrications (no one gets punished for what can be a "misdirection" done in good faith), are they more vulnerable to pissing off their benefactors or to some guys proving that they are kinda-sorta-maybe wrong? Would Google Labs publish something that, for example, proves that Google's approach to search is fundamentally flawed and will not scale to the web in 10 years, with little in the way of recourse? If they would not, do you suspect political, rather than academic, motives for doing so? And do you consider Google Labs researchers to be real scientists, given both their political motives and the fact that they genuinely have important scientific contributions?
They have no incentive to publish it. They do have an incentive to change their practice, and in 10 years the one that works will replace the current Google practice. Also, every single non-Google search engine is actively trying to prove Google wrong as we speak--unless you think Microsoft with its Bing search engine is just a shitty startup with no future.
|
On December 17 2016 03:27 Thieving Magpie wrote:Show nested quote +On December 17 2016 02:57 LegalLord wrote: [quote]
They are far from unbiased arbiters of truth, as is the implication here with the emphasis on idolizing "peer-review" as if it gets rid of being wrong. When a scientific paper is published, it literally walks you through how the authors got to their conclusion. The reason it is written that way is so that people can either replicate it or find a mistake in it. Anyone can do it. That means an American liberal's work can be tested by a Russian conservative, and if the Russian finds mistakes, he can publish and get mucho rewards for moving the field forward. And if a former warlord in Africa also finds mistakes in the conservative Russian's work, he can publish that as well, gaining similar praise among his peers. Praise from peers is one of the biggest ways scientists get funding, i.e. because they are famous. Scientists are literally financially incentivized to prove each other wrong. Again, this is idolizing an imperfect "peer review" process that doesn't actually get rid of being wrong.
If I have, say, a large particle collider, I will make sure to double-check CERN and the LHC results for accuracy. I'll also make sure that if some study from the ISS comes in and says one thing or other, I will have my own independent space station on which I can verify the accuracy of their work. I'll make sure to do that.
|
On December 17 2016 03:30 LegalLord wrote:Show nested quote +On December 17 2016 03:23 Karis Vas Ryaar wrote:On December 17 2016 03:15 LegalLord wrote: [quote]
Well, first of all, there's a difference between hard and soft sciences. There is a real perceived problem in terms of funding, and it raises questions about how backers may shape the data. I think most scientists wouldn't willingly post fake data just because the real data makes their backer look bad, but it may affect the study in other ways. There's a reason scientists are always trying to replicate experiments and verify claims, and for the most part science is eventually self-correcting if a mistake happens. Similarly, I am more skeptical of economics, history, and the like (the softer sciences) than I am of mathematics, physics, engineering, chemistry, and biology (the harder sciences). Few people would contest that. The former also has less recourse in the way of "replicating experiments," since it is mostly forced to rely on historical data and micro-examples. Even the latter can be politicized, though. Say that a certain physics lab is funded by contributions from Lockheed Martin for R&D work on the stealth aspects of their F-35 jet. Do you think said physicist might, at least subconsciously, be a little biased towards giving results that give LM the perception they want of the feasibility of the stealth functionality? And it doesn't even have to be wrong - it could just be one of those frequent cases where not all the conditions needed to make the experiments work actually apply.
Yeah, but that's where other scientists come in. Obviously people have subconscious beliefs that can interfere. That situation is weird because the work would obviously be classified and not available for the general public to review.
At the end of the day, though, you still want accurate science, because if you sell a plane that doesn't work it's going to negatively affect your reputation, your company's reputation, and the bottom line. But yeah, those problems always slip through; see Samsung's exploding phone.
The media also might make it seem worse than it is, because they tend to make a big deal about revolutionary discoveries and then not make a big deal when a result gets corrected or when scientists say to wait until it can be verified. See the faster-than-light neutrinos a couple of years ago, which later turned out to be an equipment fault.
|
On December 17 2016 03:15 LegalLord wrote:Show nested quote + [quote]
If I am a professor and I get a grant to research something from the NSF, and my findings end up being shit, and my paper gets withdrawn, the NSF will not fund me next time I ask. And when I ask the DOE, they will be skeptical as well, given my history.
Additionally, it is easier to prove someone wrong than it is to prove you yourself are right. If I tell Nature about how an article they published can actually be shown to be false by work I have done, their eyes are going to go wide because they need their impact factor to remain high and their reliability to remain intact. They will be very quick to write someone off as soon as they are shown to have been wrong or presented their data in a way that was negligent.
I don't know anything about Google Labs because they largely don't work in my industry. My point only relates to academic research funded by grants from various public sources, like the NSF, DOE, DOD, etc. And that point is that we can trust atmospheric scientists because they have a lot to lose by being wrong. People would GLADLY prove them wrong if they had the chance, because proving someone else's work wrong is a great way to climb the ladder.
On December 17 2016 03:17 travis wrote: Christ, I'm not gonna come back here. Sorry I don't read through every single page. And I am so, so sorry that I get second opinions on things and remain skeptical of things that I am not an expert on. I am so sorry I recognize I am not all-knowing. If only I could be like so many of the geniuses in this thread who know whether something is real or not at a glance and are just so fucking smart that they already have such a solid grasp on what is possible or not. As for doing a Google search, it's literally the first thing I did. If a "debunking" happened, it's not in the first 2 pages for the last 24 hours.
as for "people buying it" gosh I don't know where I get that idea maybe the 90% like ratio on the video?
seriously what the fuck is wrong with some of you
You've also posted things like this in the past, oddly enough. You take some totally bogus video (did you seriously just use YouTube's vote ratio as evidence?) and ask people to disprove it; we all respond, and you get pissy.
|
On December 17 2016 03:32 Thieving Magpie wrote:Show nested quote + [quote] They have no incentive to publish it. They do have an incentive to change their practice, and in 10 years the approach that works will replace Google's current one. Also, every single non-Google search engine is actively trying to prove Google wrong as we speak--unless you think Microsoft with its Bing search engine is just a shitty startup with no future. And now we will have an internal political game that is well removed from science. Their reason for not publishing is political, pure and simple, despite being something that is rooted in heavily hard-science fields.
|
Off-topic: The term "aggro" sounds douchey as fuck. Just say aggressive.
|
On December 17 2016 03:35 Mohdoo wrote:Show nested quote + [quote] People sometimes get public funding to research climate change for explicitly political reasons. Word on the grapevine is that climate skeptics are big funders of climate research - to avoid making decisions right away on the matter. And would those scientists refuse more money to say that climate change is very real, but in the future?
What if your paper isn't really wrong, it just doesn't tell the full story? That is far more common than simple fabrication. It looks like minor errors but it certainly might give a wrong direction, and it doesn't mean retractions and public shaming.
And here is an even more insidious problem: would a scientist perhaps say "support the EU because of their principled stance on the environment" when what they are really after is the continuation of a large, yet in some circles highly polarizing, entity that gives them stable and consistent funding for their own work?
|
On December 17 2016 03:31 a_flayer wrote:Show nested quote +On December 17 2016 03:29 Acrofales wrote:On December 17 2016 03:05 zlefin wrote:On December 17 2016 03:02 a_flayer wrote: [quote]
In short, there are two major problems with regard to science in general: shitty media reporting (taking the most extreme conclusion of a paper and using it as the narrative in a news report) and a lack of funding/willingness for peer review. Beyond that, there are going to be human errors such as what you said, but that's really a minor issue compared to the other two, and one that has no real resolution because humans will always make mistakes.
There's a known third, medium-scale problem: the bias toward interesting study results being reported (this has several facets). This link seems to have some decent info on that, though I haven't fully read it to verify: http://www.editage.com/insights/publication-and-reporting-biases-and-how-they-impact-publication-of-research Not just that, but the bias towards novel research is huge. Although researchers are encouraged to test their hypotheses thoroughly, it is generally assumed that the underlying data is correct and representative. And there is very, very little incentive to redo someone else's experiment to gather your own data, because most of the time you'll just confirm the other people's work, and that won't get published anywhere (and thus is considered a waste of time). People only redo experiments when they have some other reason to believe that the conclusion is wrong. E.g. you have a hypothesis and have done experiments and it seems to work, but someone published a paper with experiments whose results would undermine your own theory. Now you have to either discard your own hypothesis, or you have to figure out why there are conflicting results. Most research goes unchecked for basic validity. Soundness is generally pretty thoroughly checked in peer review, but validity is hard and often expensive to check, so we assume that a single researcher/group is honest and checks that themselves when publishing. And this can lead to huge scandals in research communities when it turns out someone was actually falsifying data. Luckily those cases are very rare (if they weren't, we'd assume the scarce checks for validity that do occur would result in scandals more often). Even so, it is not easy to prove data invalid, and it can cause real issues (think Wakefield's autism bullshit for an infamous example). What you are describing is peer review, is it not? That was one of two things I listed as a major problem in the post that was quoted by the one you responded to. Good elaboration though.
Peer review in a far broader sense than it is currently used, yes.
|
On December 17 2016 03:34 Karis Vas Ryaar wrote:Show nested quote + [quote] What if the other scientists are also bought off?
Also, a physicist might conclude something like "optical properties of material X suggest that it may be undetectable by Y means," which may or may not be completely accurate, but the assertion would be favorable for LM. And that's related, but it's hard to pin that down as fabrication when the real question of interest is not material X and tracker Y but rather "will the Russian S-400 system be able to track and target F-35 jets," which is what people actually care about and which people might use the physicist's word to answer, even though the narrow result simply doesn't answer that question very well. And how long will it take before someone can actually test that in real combat scenarios?
|
On December 17 2016 03:33 LegalLord wrote:Show nested quote + [quote]
They are far from unbiased arbiters of truth, as is the implication here with the emphasis on idolizing "peer-review" as if it gets rid of being wrong.
When a scientific paper is published, it literally walks you through how the authors got to their conclusion. It is written that way so that people can either replicate it or find a mistake in it. Anyone can do it. That means an American liberal can be tested by a Russian conservative, and if the Russian finds mistakes, he can publish and get mucho rewards for moving the field forward. And if a former warlord in Africa also finds mistakes in the conservative Russian's work, he can publish that as well, gaining similar praise among his peers. Praise from peers is one of the biggest ways scientists get funding, i.e. because they are famous. Scientists are literally financially incentivized to prove each other wrong.
Again, this is idolizing an imperfect "peer review" process that doesn't actually get rid of being wrong. If I have, say, a large particle collider, I will make sure to double-check the CERN and LHC results for accuracy. I'll also make sure that if some study from the ISS comes in and says one thing or another, I will have my own independent space station on which I can verify the accuracy of their work. I'll make sure to do that.
CERN is actually a really bad example for this. It is set up with so many competing teams that if one group were to publish something using bad data, it would be caught out really quickly.
|
On December 17 2016 03:42 LegalLord wrote: People sometimes get public funding to research climate change for explicitly political reasons. Word on the grapevine is that climate skeptics are big funders of climate research - in order to avoid making decisions on the matter right away. And would those scientists refuse more money to say that climate change is very real, but a problem for the future?
This will always be the case. There were papers showing cigs are *not* carcinogenic, but the overwhelming consensus still showed cigs were carcinogenic. 2% of climate scientists don't believe in climate change, just like roughly the same percentage of researchers in the relevant bio field didn't believe cigs were carcinogenic.
There are also physicists who don't buy relativity. It's not a bad thing for that to exist; it's just that you should probably just go with what the 98% are saying.
On December 17 2016 03:42 LegalLord wrote:
What if your paper isn't really wrong, it just doesn't tell the full story? That is far more common than simple fabrication. It looks like a minor issue, but it can certainly point people in the wrong direction, and it doesn't lead to retractions or public shaming.
And here is an even more insidious problem: would a scientist perhaps say "support the EU because of its principled stance on the environment" when what they are really after is the continuation of a large, yet in some circles highly polarizing, entity that gives them stable and consistent funding for their own work?
You are creating hypotheticals that don't relate at all to what I'm talking about.
My point is: When 98% agree, it is for a good reason and it can be believed.
|
On December 17 2016 03:35 LegalLord wrote:
On December 17 2016 03:32 Thieving Magpie wrote:
All of the things you listed are extremely risky things to do because of how much street cred people get for proving other people's shit wrong. Once you publish, you are insanely vulnerable. I have been a part of many different investigations that never got published because it was still technically possible that we were vulnerable along a certain avenue. In reality, a lot of stuff probably *could* be published but is not, because the consequences of being wrong are too great. It is really, really bad to have to withdraw a paper. It is borderline career-ending. Note: this only applies to Western research.
Chinese research is littered with trash, and none of this applies to the Chinese scientific community. Edit: Except for when you said "Do they ever do experiments that are hard to replicate because of money and expertise barriers?" Being the only person to make a certain type of spectrometer, and then publishing work that uses it, is not necessarily a bad thing. Papers where someone is the first to do a type of detection always come with lots of caution, and the reasons for believing the new spectroscopy method are all clearly laid out. People are like "Yo, this shit is new and wild, but here's why I'm pretty sure it is right."
Who is vulnerable, and how? The biggest existential crisis for most scientists is that they won't get more money to do their work. If they're being funded to consistently lie and/or mislead with partial truths that simply cannot be painted as fabrications (no one gets punished for a "misdirection" done in good faith), are they more vulnerable to pissing off their benefactors or to some guys proving that they are kinda-sorta-maybe wrong? Would Google Labs publish something that, for example, proves that Google's approach to search is fundamentally flawed and will not scale to the web in 10 years, with little in the way of recourse? If they would not, do you suspect political, rather than academic, motives for that? And do you consider Google Labs researchers to be real scientists, given both their political motives and the fact that they genuinely make important scientific contributions?
They have no incentive to publish it. They do have an incentive to change their practice, and in 10 years the approach that works will replace the current Google practice. Also, every single non-Google search engine is actively trying to prove Google wrong as we speak--unless you think Microsoft, with its Bing search engine, is just a shitty start-up with no future.
And now we will have an internal political game that starts to be well removed from science. Their reason for not publishing is political, pure and simple, despite being something heavily rooted in hard-science fields.
Which is why news media should not really talk about unpublished work?
Google finds out its methods could be wrong and starts working on a new way to do its work. Because they did not publish it, the research never gets reviewed. Since it never gets reviewed, they won't know for certain whether their work is accurate once it is tested outside of their own team. It is up to them whether they want accuracy or control.
If they publish it, the media can then point to it and talk about it. As that publication gets tested and changes the industry, the media can talk about it more.
If they don't publish it, the media can't really talk about a paper that does not technically exist.
This is fairly simple and does not need you to complicate it.
|