
Did Google Influence our Elections?

Denechek

Senior Member
Joined
Dec 8, 2016
Messages
197
Reaction score
49
Points
48
Location
Trump Tower


“Providing relevant answers has been the cornerstone of Google’s approach to search from the very beginning. It would undermine the people’s trust in our results and company if we were to change course.”- Google, in response to SEME research.

SEME, in this context, for those who Googled the acronym and found links to Japanese culture instead, is the search engine manipulation effect:

“The search engine manipulation effect (SEME) is the change in consumer preference from manipulations of search results by search engine providers.”- Wikipedia
According to some, “Such manipulations…could shift the voting preferences of undecided voters by 20 percent or more and up to 80 percent in some demographics.”
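To make the stakes of that claimed shift concrete, here is a toy back-of-the-envelope calculation. The vote splits are invented for illustration; they are not drawn from the SEME studies themselves:

```python
# Toy model: how a shift among undecided voters can flip a close race.
# All numbers below are invented for illustration, not from the SEME research.

def outcome(decided_a, decided_b, undecided, break_toward_a):
    """Final totals (rounded to one decimal) if undecided voters
    break toward candidate A at the given rate."""
    a = decided_a + undecided * break_toward_a
    b = decided_b + undecided * (1 - break_toward_a)
    return round(a, 1), round(b, 1)

# A dead-even race: 46% vs 46%, with 8% undecided.
print(outcome(46, 46, 8, 0.5))  # undecideds split 50/50 -> (50.0, 50.0), a tie
print(outcome(46, 46, 8, 0.6))  # a 20-point shift to 60/40 -> (50.8, 49.2), A wins
```

A 20-point change in how a small undecided bloc breaks is invisible in the decided electorate, yet it is enough to decide the race.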

Who knows? These days one has to be very careful sharing any information as the proliferation of fake news is so widespread and so deep.
  • Net fake/fake
  • Fake based on truth
  • Truth clouded by fake
  • Fake creating more fake
  • Truth hard to ascertain
I am content to merely comment on Google’s statement, as it strikes a chord of dissonance. After all, the whole monetization structure of the enterprise is based on its ability to affect key and hyper-targeted audiences. Yet seemingly that holds true only for the purple whale pants with the pink embroidery and not for anything else…hmm.

Frankly, as in the past, my issue is less with fake news—let’s be clear, I find the fabled New York Times guilty in that arena as well—than it is with the way algorithms are developing. To suggest that the Google algorithm is open, honest and neutral is disingenuous at best and, more likely, misleading.

“The amoral status of an algorithm does not negate its effects on society,” wrote Amit Datta and Anupam Datta of Carnegie Mellon and Michael Carl Tschantz of the International Computer Science Institute, authors of a 2015 Google advertising study.

Damien Tambini, an associate professor at the London School of Economics, who focuses on media regulation, was quoted by The Guardian as saying:

There’s an editorial function to Google and Facebook but it’s being done by sophisticated algorithms. They say it’s machines not editors. But that’s simply a mechanised editorial function.
It’s a function I’ve written about, with respect and admiration, as the head of the Creative Data Jury at the Cannes Lions International Festival of Creativity in France.

I refer you to a recent article in The Guardian, “Google, democracy and the truth about internet search,” written by Carole Cadwalladr, published Sunday, December 4, 2016.

It’s a piece well worth reading, whatever your view, as it sets up the arguments in a coherent and rational way. The author documents her journey of typing “Jews/Muslims/Women (etc.) are” into the Google search bar and watching, with horror, the autocomplete suggestions that followed.

To be fair, and clear, Google has cleaned up this mess since I first read Cadwalladr’s article last week—try it and you’ll see for yourself. Nevertheless, what was coming up just a short while ago were hateful terms and hate-filled links.

“Jews are evil.” “Muslims need to be eradicated.” “Women all have a little prostitute in them.” Cadwalladr writes about typing in the question “Was Hitler bad?” As I’ve just confirmed, here’s Google’s top result: “10 Reasons Why Hitler Was One of the Good Guys.” Among other things, the article states, “He implemented social and cultural reform.” Eight of the other 10 search results agree: Hitler really wasn’t that bad.

Cadwalladr continues by quoting Danny Sullivan, founding editor of SearchEngineLand.com:

He’s been recommended to me by several academics as one of the most knowledgeable experts on search. Am I just being naive, I ask him? Should I have known this was out there? “No, you’re not being naive,” he says. “This is awful. It’s horrible. It’s the equivalent of going into a library and asking a librarian about Judaism and being handed 10 books of hate. Google is doing a horrible, horrible job of delivering answers here. It can and should do better.”
You seek answers thinking you’ll find the truth and the trusted librarian hands you books on hate…

Yet here’s the thing.

While sources from the left and right are attacking Google and others for bias, mainstream media has jumped in as well. It’s not a new issue; in fact, it’s a problem identified as early as 1999 in the United States and now recognized globally.

In the words of acclaimed media critic Robert McChesney, as quoted in “Shaping the Web: Why the Politics of Search Engines Matters” (2000):

The American media system is spinning out of control in a hyper-commercialized frenzy. Fewer than ten transnational media conglomerates dominate much of our media; fewer than two dozen account for the overwhelming majority of our newspapers, magazines, films, television, radio, and books. With every aspect of our media culture now fair game for commercial exploitation, we can look forward to the full-scale commercialization of sports, arts, and education, the disappearance of notions of public service from public discourse, and the degeneration of journalism, political coverage, and children’s programming under commercial pressure.
Many, myself included, have written about the money at stake here. See “Google’s Dance”:

…from Google’s perspective – and I don’t mean Google’s PR department, I mean Google’s management – Google is an advertising company. Ninety-seven percent of Google’s revenues, after all, come from advertising.
And we have all followed the fake news farms, whose farmers made a killing during the recent US election. Why look to Russian hackers (if it was them)? Sadly, what they shared was true; it was the fiction writers who were doing far worse damage.

I have shared this thought from The New York Times before, but it’s critical enough to repeat:

There is a widespread belief that software and algorithms that rely on data are objective. But software is not free of human influence. Algorithms are written and maintained by people, and machine learning algorithms adjust what they do based on people’s behavior. As a result, say researchers in computer science, ethics and law, algorithms can reinforce human prejudices.
And according to The Wall Street Journal,

The legacy media companies addressed this issue by trying, admittedly with varying degrees of success, to establish walls between the departments responsible for editorials, news reporting and advertising. This will be far more difficult in an era where algorithms—not editors—often control the content and ads a person consumes.
To be fair, there are many who would argue, rightly so, that the varying degrees of success referenced above have diminished over time and, because of—you guessed it—monetization issues, are diminishing ever more quickly.

Robert Epstein, senior research psychologist at the American Institute for Behavioral Research and Technology in California, worries this is where it all leads:

Google has become the main gateway to virtually all knowledge, mainly because the search engine is so good at giving us exactly the information we are looking for, almost instantly and almost always in the first position of the list it shows us after we launch our search – the list of ‘search results’. That ordered list is so good, in fact, that about 50 per cent of our clicks go to the top two items, and more than 90 per cent of our clicks go to the 10 items listed on the first page of results; few people look at other results pages, even though they often number in the thousands, which means they probably contain lots of good information. Google decides which of the billions of web pages it is going to include in our search results, and it also decides how to rank them. How it decides these things is a deep, dark secret – one of the best-kept secrets in the world, like the formula for Coca-Cola.
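Epstein’s two aggregate figures imply a steep attention curve by rank. The per-position shares below are my own invented illustration, chosen only so the totals match the roughly 50 per cent (top two) and 90-plus per cent (first page) numbers he cites:

```python
# Invented click-share-by-rank curve, built so the aggregates match the quote:
# roughly half of all clicks on the top 2 results, over 90% on page 1.
shares = [0.30, 0.20, 0.11, 0.08, 0.06, 0.05, 0.04, 0.03, 0.02, 0.02]

top_two = sum(shares[:2])
first_page = sum(shares)
print(f"top 2 results:  {top_two:.0%} of clicks")    # 50%
print(f"first page:     {first_page:.0%} of clicks") # 91%
print(f"beyond page 1:  {1 - first_page:.0%}")       # 9%

# The ranker's leverage: promoting a page from rank 5 to rank 1 would,
# under this assumed curve, multiply its expected clicks fivefold.
print(shares[0] / shares[4])  # 5.0
```

Whatever the true per-position numbers are, any curve consistent with those aggregates gives whoever controls the ordering enormous leverage over what gets seen.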
And in my first and only appearance before a congressional hearing, in a scene out of a movie, I sat at the green-covered desk in front of a room full of people and warned of the dangers of an algorithm in the hands of a monopoly…any monopoly.

So Google and the rest have to come to grips with who they are… Either they can influence or they can’t. Either they can cause perception shifts or they can’t.

Google, democracy and the truth about internet search
 

MindWars

Diamond Member
Joined
Oct 14, 2016
Messages
40,829
Reaction score
9,526
Points
2,040


It's probably a little too complicated for the average fake liberal to understand Google's connection/control. They are under the impression their thoughts are their own.
 
 
OP
Denechek

“Providing relevant answers has been the cornerstone of Google’s approach to search from the very beginning. It would undermine the people’s trust in our results and company if we were to change course.”- Google, in response to SEME research.

SEME, in this context, is for those who Googled and found links to Japanese culture:

“The search engine manipulation effect (SEME) is the change in consumer preference from manipulations of search results by search engine providers.”- Wiki
According to some, “Such manipulations…could shift the voting preferences of undecided voters by 20 percent or more and up to 80 percent in some demographics.”

Who knows? These days one has to be very careful sharing any information as the proliferation of fake news is so widespread and so deep.
  • Net fake/fake,
  • Fake based on truth
  • Truth clouded by fake
  • Fake creating more fake
  • Truth hard to ascertain
I am content to merely comment on Google’s statement, as it strikes a chord of dissonance. After all, the whole monetization structure of the enterprise is based on its ability to affect key and hyper targeted audiences. Yet seemingly that only holds true for the purple whale pants with the pink embroidery and not for anything else…hmm.

Frankly, as in the past, my issue is less with fake news—let’s be clear, I find the fabled New York Times guilty in that arena as well—than it is with the way algorithms are developing. To suggest that the Google algorithm is open, honest and neutral is disingenuous at best and, more likely, misleading.

“The amoral status of an algorithm does not negate its effects on society,” wrote Amit Datta and Anupam Datta of Carnegie Mellon and Michael Carl Tschantz of the International Computer Science Institute, authors of a 2015 Google advertising study.

Damien Tambini, an associate professor at the London School of Economics, who focuses on media regulation, was quoted by The Guardian as saying:

There’s an editorial function to Google and Facebook but it’s being done by sophisticated algorithms. They say it’s machines not editors. But that’s simply a mechanised editorial function.
It’s a function I’ve written about, with respect and admiration, as the head of the Creative Data Jury at the Cannes Lions International Festival of Creativity in France.

I refer you to a recent article in The Guardian, “Google, democracy and the truth about internet search,” written by Carole Cadwalladr, published Sunday, December 4, 2016.

It’s a piece most worth reading, no matter your view, as it sets up the arguments in a coherent and rational way. The author documents her journey of typing “Jews/Muslims/Women (etc.) are” into the Google search bar and watching, with horror, the auto-fill that followed.

To be fair, and clear, Google has cleaned up this mess since I first read Cadwalladr’s article last week—try it and you’ll see for yourself. Nevertheless, what was coming up just a short while ago were hateful terms and hate-filled links.

“Jews are evil.” “Muslims need to be eradicated.” “Women all have a little prostitute in them.” Cadwalladr writes about typing in the questions “Was Hitler bad?”As I’ve just confirmed, here’s Google’s top result: “10 Reasons Why Hitler Was One of the Good Guys.” Among other things, the article states, “He implemented social and cultural reform.” Eight out of the other 10 search results agree: Hitler really wasn’t that bad.

Cadwalladr continues by quoting Danny Sullivan, founding editor of SearchEngineLand.com:

He’s been recommended to me by several academics as one of the most knowledgeable experts on search. Am I just being naive, I ask him? Should I have known this was out there? “No, you’re not being naive,” he says. “This is awful. It’s horrible. It’s the equivalent of going into a library and asking a librarian about Judaism and being handed 10 books of hate. Google is doing a horrible, horrible job of delivering answers here. It can and should do better.”
You seek answers thinking you’ll find the truth and the trusted librarian hands you books on hate…

Yet here’s the thing.

While sources from the left and right are attacking Google and others for bias, mainstream media has jumped in as well. It’s not a new issue; in fact, it’s a problem identified as early as 1999 in the United States and now recognized globally.

In the words of acclaimed media critic Robert McChesney from his paper “Shaping the Web: Why the Politics of Search Engines Matters (2000):

The American media system is spinning out of control in a hyper-commercialized frenzy. Fewer than ten transnational media conglomerates dominate much of our media; fewer than two dozen account for the overwhelming majority of our newspapers, magazines, films, television, radio, and books. With every aspect of our media culture now fair game for commercial exploitation, we can look forward to the full-scale commercialization of sports, arts, and education, the disappearance of notions of public service from public discourse, and the degeneration of journalism, political coverage, and children’s programming under commercial pressure.
Many, myself included, have written about the money at stake here. See “Google’s Dance”:

…from Google’s perspective – and I don’t mean Google’s PR department, I mean Google’s management – Google is an advertisingcompany. Ninety-seven percent of Google’s revenues, after all, come from advertising.
And we have all followed the fake news farms, whose farmers made a killing during the recent US election. Why look to Russian hackers (if it was them)—sadly they were sharing the truth—it was the novelists who were doing way worse damage.

I have shared this thought from The New York Times before, but it’s critical enough to repeat:

There is a widespread belief that software and algorithms that rely on data are objective. But software is not free of human influence. Algorithms are written and maintained by people, and machine learning algorithms adjust what they do based on people’s behavior. As a result, say researchers in computer science, ethics and law, algorithms can reinforce human prejudices.
And according to The Wall Street Journal,

The legacy media companies addressed this issue by trying, admittedly with varying degrees of success, to establish walls between the departments responsible for editorials, news reporting and advertising. This will be far more difficult in an era where algorithms—not editors—often control the content and ads a person consumes.
To be fair, there are many who would argue, rightly so, that the varying degrees of success referenced above have diminished over time and, because of (you guessed it) monetization issues, are diminishing ever more quickly.

Robert Epstein, senior research psychologist at the American Institute for Behavioral Research and Technology in California, worries this is where it all leads:

Google has become the main gateway to virtually all knowledge, mainly because the search engine is so good at giving us exactly the information we are looking for, almost instantly and almost always in the first position of the list it shows us after we launch our search – the list of ‘search results’. That ordered list is so good, in fact, that about 50 per cent of our clicks go to the top two items, and more than 90 per cent of our clicks go to the 10 items listed on the first page of results; few people look at other results pages, even though they often number in the thousands, which means they probably contain lots of good information. Google decides which of the billions of web pages it is going to include in our search results, and it also decides how to rank them. How it decides these things is a deep, dark secret – one of the best-kept secrets in the world, like the formula for Coca-Cola.
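The concentration Epstein describes is a steep position bias. A toy power-law model (the exponent is an illustrative guess of mine, not a measured Google value) shows how sharply a ranked list funnels attention to the top:

```python
# Toy position-bias model: assume the share of clicks a result receives
# falls off as a power law in its rank. Illustrative only.
def click_shares(n_results=100, exponent=1.6):
    weights = [1 / rank ** exponent for rank in range(1, n_results + 1)]
    total = sum(weights)
    return [w / total for w in weights]

shares = click_shares()
print(f"top two results:  {sum(shares[:2]):.0%} of clicks")
print(f"first page of 10: {sum(shares[:10]):.0%} of clicks")
print(f"everything after: {sum(shares[10:]):.0%} of clicks")
```

Whatever the exact exponent, the bulk of clicks lands on the first handful of results, which is why the ordering itself carries so much power.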
And in my first and only appearance before a congressional hearing, in a scene out of a movie, sitting at the green-covered desk in front of a room full of people, I warned of the dangers of an algorithm in the hands of a monopoly… any monopoly.

So Google and the rest have to come to grips with who they are… Either they can influence or they can’t. Either they can cause perception shifts or they can’t.

Google, democracy and the truth about internet search
It's probably a little too complicated for the average fake liberal to understand Google's connection/control. They are under the impression their thoughts are their own.
The Electoral College is casting votes right now.
This is the last chance for the Heillery crowd to "Save the Republic." It's time to put up or shut up, Libtards!
 

Brynmr

Gold Member
Joined
Jun 12, 2016
Messages
5,491
Reaction score
872
Points
290
The entire MSM tried to influence the election in favor of Clinton, so the fascist alt-Left can stick it up their asses.
 

Baron

Platinum Member
Joined
Sep 19, 2008
Messages
6,237
Reaction score
2,581
Points
370
Location
Brooklyn, NYC
Sure:

Put in Trump, get only shit back; put in Hillary, receive only nice things and pictures.

Killary according to Google:



And here is the most common pic of Trump.



Thanks to fake-news king Google alone, Trump lost at least 3 million votes.
 

Indeependent

Platinum Member
Joined
Nov 19, 2013
Messages
39,023
Reaction score
3,477
Points
1,115
Google, Facebook and the CNN App convinced lots of people that their information outlet was run by idiots.
 

Bush92

GHBush1992
Joined
May 23, 2014
Messages
29,567
Reaction score
4,452
Points
280


I think it was Taco Bell. The whole "make a run for the border" thing was a Trump subliminal message promoting his immigration policies. Lol.
 

aris2chat

Gold Member
Joined
Feb 17, 2012
Messages
18,678
Reaction score
4,677
Points
280



People have got to do their own research and talk to others, without the mainstream media brainwashing them.
 
Joined
Sep 15, 2016
Messages
7,605
Reaction score
464
Points
155
Location
All in your mind
Only if you expected Google to do your critical reasoning for you.
The Nose-Pin Zone

Give an example of critical thinking. From what I can see, it's just a buzzword for being able to follow the twisted diagnosis of a spin doctor. Whether from the Left or the Right, they both spin like a top; but one side calls it Dialectic.
 
