Detecting fake news at its source

Windparadox

Gold Member
May 3, 2017
"Machine learning system aims to determine if an information outlet is accurate or biased. (Massachusetts Institute of Technology) - Lately the fact-checking world has been in a bit of a crisis. Sites like Politifact and Snopes have traditionally focused on specific claims, which is admirable but tedious; by the time they’ve gotten through verifying or debunking a fact, there’s a good chance it’s already traveled across the globe and back again.

Social media companies have also had mixed results limiting the spread of propaganda and misinformation. Facebook plans to have 20,000 human moderators by the end of the year, and is putting significant resources into developing its own fake-news-detecting algorithms.

Researchers from MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) and the Qatar Computing Research Institute (QCRI) believe that the best approach is to focus not only on individual claims, but on the news sources themselves. Using this tack, they’ve demonstrated a new system that uses machine learning to determine if a source is accurate or politically biased.

“If a website has published fake news before, there’s a good chance they’ll do it again,” says postdoc Ramy Baly, the lead author on a new paper about the system. “By automatically scraping data about these sites, the hope is that our system can help figure out which ones are likely to do it in the first place.”

Baly says the system needs only about 150 articles to reliably detect if a news source can be trusted — meaning that an approach like theirs could be used to help stamp out new fake-news outlets before the stories spread too widely." - Source

[Image: InfoWars page annotated by the MIT fake-news machine-learning system]

News and politics forums (as well as social forums) are notorious for false and misleading news. While I always like to at least double-check sources, there are a few news sites that I hold in high credibility and others in little or none. Checking sources is time-consuming, as the article states. While I will never fully trust a computer and its algorithms, this is a step in the right direction.
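
The write-up doesn't spell out the implementation, but the core idea of profiling an outlet from a batch of its articles, instead of fact-checking one claim at a time, can be sketched roughly like this in Python with scikit-learn. The outlet names, labels, and plain bag-of-words features below are placeholders of mine, not what the MIT/QCRI system actually uses:

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: a batch of article texts per outlet
# (the article suggests roughly 150 articles per source is enough).
articles_by_source = {
    "outlet-a.example": ["article text ...", "another article ..."],
    "outlet-b.example": ["article text ...", "another article ..."],
}
# Hypothetical outlet-level labels: 1 = generally reliable, 0 = not
labels = {"outlet-a.example": 1, "outlet-b.example": 0}

# Fit a simple bag-of-words model over every article from every outlet
vectorizer = TfidfVectorizer(max_features=5000, stop_words="english")
vectorizer.fit([a for arts in articles_by_source.values() for a in arts])

def source_profile(article_texts):
    # Average the TF-IDF vectors of an outlet's articles into one profile vector
    return np.asarray(vectorizer.transform(article_texts).mean(axis=0)).ravel()

X = np.vstack([source_profile(arts) for arts in articles_by_source.values()])
y = np.array([labels[s] for s in articles_by_source])

# Train a classifier on outlet-level profiles, not on individual claims
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Score a previously unseen outlet from a sample of its scraped articles
new_outlet_articles = ["some scraped article text ...", "more scraped text ..."]
prob_reliable = clf.predict_proba([source_profile(new_outlet_articles)])[0, 1]
print(f"Estimated probability the outlet is reliable: {prob_reliable:.2f}")

The real system presumably draws on much richer signals than raw word counts, so treat this as the bare skeleton of the idea.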
 

The existence and practice of “fake news” is far more complicated than most people realize. There are countless ways to “fake” the news without outright lying, and it's these subtler instances of deceit that are most effective because they're not noticed as often, if at all.

What sucks is that fake news needs to be talked about, but people can rarely discuss it without pointing fingers at specific outlets, when really the vast majority of outlets should be addressed no matter the sentiment they reflect.
 
Wikipedia is fake news.
I was wondering why that was there.
I'm still wary of Alex Jones and InfoWars, though Paul Joseph Watson is their best correspondent. There are rumblings about them being controlled opposition, and there's a fair amount of evidence to support the claim.

But outfits like Wiki, PolitiFact, Snopes, et al. calling InfoWars fake news totally pegs the irony-o-meter.
 
Conservatives oppose this, of course, because they contrive and propagate most of the fake news.
 
Look, it's the king of fake! Goebbels reincarnated!
 

Wikipedia itself is just an online encyclopedia, not a news source per se. For all the people I've seen complain about its accuracy, I've never seen anyone prove the complaints, which isn't to say they're wrong. They also rent out their software, which allows users to make their own specialized wikis, like the Lord of the Rings Wiki.

I'll always consider InfoWars to be 100% trash, but obviously a bunch of people like it, so that is that.
 
Liberals are 100 percent trash, by their own doing.
 
Wiki is subject to editing by its subscribers, and "verified" by its staff, which has a good solid left lean to it.

PJW's content keeps InfoWars at about 75% trash... his stuff is digital gold.
 
