In an exclusive report, a research psychologist reviews evidence from a new study showing that Google's search suggestions are altered in favor of Hillary Clinton.
The distinguished research psychologist, Robert Epstein, says Google manipulates its search results to favor Hillary Clinton, and as many as 3 million votes could be shifted in her favor in the upcoming U.S. presidential election.
Sputnik News reports:
Biased search rankings can swing votes and alter opinions, and a new study shows that Google’s autocomplete can too.
A scientific study I published last year showed that search rankings favoring one candidate can quickly convince undecided voters to vote for that candidate — as many as 80 percent of voters in some demographic groups. My latest research shows that a search engine could also shift votes and change opinions with another powerful tool: autocomplete.
Because of recent claims that Google has been deliberately tinkering with search suggestions to make Hillary Clinton look good, this is probably a good time both to examine those claims and to look at my new research. As you will see, there is some cause for concern here.
In June of this year, Sourcefed released a video claiming that Google’s search suggestions — often called “autocomplete” suggestions — were biased in favor of Mrs. Clinton. The video quickly went viral: the full 7-minute version has now been viewed more than a million times on YouTube, and an abridged 3-minute version has been viewed more than 25 million times on Facebook.
The video’s narrator, Matt Lieberman, showed screen print after screen print that appeared to demonstrate that searching for just about anything related to Mrs. Clinton generated positive suggestions only. This occurred even though Bing and Yahoo searches produced both positive and negative suggestions and even though Google Trends data showed that searches on Google that characterize Mrs. Clinton negatively are quite common — far more common in some cases than the search terms Google was suggesting. Lieberman also showed that autocomplete did offer negative suggestions for Bernie Sanders and Donald Trump.
“The intention is clear,” said Lieberman. “Google is burying potential searches for terms that could have hurt Hillary Clinton in the primary elections over the past several months by manipulating recommendations on their site.”
Google responded to the Sourcefed video in an email to the Washington Times, denying everything. According to the company’s spokesperson, “Google Autocomplete does not favor any candidate or cause.” The company explained away the apparently damning findings by saying that “Our Autocomplete algorithm will not show a predicted query that is offensive or disparaging when displayed in conjunction with a person’s name.”
Since then, my associates and I at the American Institute for Behavioral Research and Technology (AIBRT) — a nonprofit, nonpartisan organization based in the San Diego area — have been systematically investigating Lieberman’s claims. What we have learned has generally supported those claims, but we have also learned something new — something quite disturbing — about the power of Google’s search suggestions to alter what people search for.
Lieberman insisted that Google’s search suggestions were biased, but he never explained why Google would introduce such bias. Our new research suggests why — and also why Google’s lists of search suggestions are typically much shorter than the lists Bing and Yahoo show us.
Our investigation is ongoing, but here is what we have learned so far:
Bias in Clinton’s Favor
To test Lieberman’s claim that Google’s search suggestions are biased in Mrs. Clinton’s favor, my associates and I have been looking at the suggestions Google shows us in response to hundreds of different election-related search terms. To minimize the possibility that those suggestions were customized for us as individuals (based on the massive personal profiles Google has assembled for virtually all Americans), we have conducted our searches through proxy servers — even through the Tor network — thus making it difficult for Google to identify us. We also cleared the fingerprints Google leaves on computers (cache and cookies) fairly obsessively.
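The collection method described above — querying autocomplete anonymously, without cookies or a logged-in profile — can be sketched programmatically. The snippet below is a minimal illustration, not the authors' actual tooling; it assumes Google's unofficial suggestion endpoint (`suggestqueries.google.com`), which is undocumented and may change, and a local Tor SOCKS proxy on port 9050.

```python
from urllib.parse import urlencode

# Unofficial suggestion endpoint (an assumption; undocumented and may change).
SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

def build_suggest_url(query, client="firefox"):
    """Build a request URL asking for autocomplete suggestions.

    The trailing space in queries like "Hillary Clinton is " is
    significant: it requests completions of the *next* word rather
    than of the partial word itself.
    """
    return SUGGEST_URL + "?" + urlencode({"client": client, "q": query})

# To reduce personalization, requests could be routed through a proxy
# (for example, a local Tor SOCKS proxy -- an assumption about setup)
# and sent without any cookies:
#
#   import requests
#   proxies = {"https": "socks5h://127.0.0.1:9050"}
#   resp = requests.get(build_suggest_url("Hillary Clinton is "),
#                       proxies=proxies, timeout=10)
#   suggestions = resp.json()[1]  # second element holds the suggestion list
```

Repeating such queries across proxies and over time, and comparing the returned lists against Bing, Yahoo, and Google Trends data, is one way to approximate the comparisons described in this essay.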
Generally speaking, we are finding that Lieberman was right: It is somewhat difficult to get the Google search bar to suggest negative searches related to Mrs. Clinton or to make any Clinton-related suggestions when one types a negative search term. Bing and Yahoo, on the other hand, often show a number of negative suggestions in response to the same search terms. Bing and Yahoo seem to be showing us what people are actually searching for; Google is showing us something else — but what, and for what purpose?
As for Google Trends, as Lieberman reported, Google indeed withholds negative search terms for Mrs. Clinton even when such terms show high popularity in Trends. We have also found that Google often suggests positive search terms for Mrs. Clinton even when such terms are nearly invisible in Trends. The widely held belief, reinforced by Google’s own documentation, that Google’s search suggestions are based on “what other people are searching for” seems to be untrue in many instances.
Google tries to explain away such findings by saying its search bar is programmed to avoid suggesting searches that portray people in a negative light. As far as we can tell, this claim is false; Google suppresses negative suggestions selectively, not across the board. It is easy to get autocomplete to suggest negative searches related to prominent people, one of whom happens to be Mrs. Clinton’s opponent.
A picture is often worth a thousand words, so let’s look at a few examples that appear both to support Lieberman’s perspective and to refute Google’s. After that, we’ll examine some counterexamples.
Before we start, I need to point out a problem: If you try to replicate the searches I will show you, you will likely get different results. I don’t think that invalidates our work, but you will have to decide for yourself. Your results might be different because search activity changes over time, and that, in turn, affects search suggestions. There is also the “personalization problem.” If you are like the vast majority of people, you freely allow Google to track you 24 hours a day. As a result, Google knows who you are when you are typing something in its search bar, and it sends you customized results.
For both of these reasons, you might doubt the validity of the conclusions I will draw in this essay. That is up to you. All I can say in my defense is that I have worked with eight other people in recent months to try to conduct a fair and balanced investigation, and, as I said, we have taken several precautions to try to get generic, non-customized search suggestions rather than the customized kind. Our investigation is also ongoing, and I encourage you to conduct your own, as well.
Let’s start with a very simple search. The image below shows a search for “Hillary Clinton is ” (notice the space after is) conducted on August 3rd on Bing, Yahoo, and Google. As you can see, both Bing and Yahoo displayed multiple negative suggestions such as “Hillary Clinton is a liar” and “Hillary Clinton is a criminal,” but Google showed only two suggestions, both of which were almost absurdly positive: “Hillary Clinton is winning” and “Hillary Clinton is awesome.”
To find out what people actually searched for, let’s turn to Google Trends — Google’s tabulation of the popularity of search terms. Below you will see a comparison between the popularity of searching for “Hillary Clinton is a liar” and the popularity of searching for “Hillary Clinton is awesome.” This image was also generated on August 3rd. “Hillary Clinton is a liar” was by far the more popular search term; hardly anyone conducted a search using the phrase “Hillary Clinton is awesome.”
Okay, but Google admits that it censors negative search results; presumably, that is why we only saw positive results for Mrs. Clinton — even a result that virtually no one searched for. Does Google really suppress negative results? We have seen what happens with “Hillary Clinton is.” What happens with “Donald Trump is ”? (Again, be sure to include the space after is.)
In the above image, captured on August 8th, we again found the odd “awesome” suggestion, but we also saw a suggestion that appears to be negative: “Donald Trump is dead.” Shouldn’t a result like that have been suppressed? Let’s look further.
Consider the following searches, conducted on August 2nd, for “anti Hillary” and “anti Trump.” As you can see below, “anti Hillary” generated no suggestions, but “anti Trump” generated four, including “anti Trump cartoon” and “anti Trump song.” Well, you say, perhaps there were no anti-Hillary suggestions to be made. But Yahoo — responding merely to “anti Hill” — came up with eight, including “anti Hillary memes” and “anti Hillary jokes.”
This seems to further refute Google’s claim about not disparaging people, but let’s dig deeper.
After Mrs. Clinton named Senator Tim Kaine to be her running mate, Mr. Trump dubbed him with one of his middle-school-style nicknames: “Corrupt Kaine.” Sure enough, that instantly became a popular search term on Google, as this July 27th image from Trends confirms:
Even so, as you can see in the image below, in response to “corrupt,” the Google search bar showed us nothing about Senator Kaine, but it did show us both “Kamala” (Kamala Harris, attorney general of California) and “Karzai” (Hamid Karzai, former president of Afghanistan). If you clicked on the phrases “corrupt Kamala” and “corrupt Karzai,” search results appeared that linked to highly negative web pages about Kamala Harris and Hamid Karzai, respectively.
Oddly enough, both on the day we looked up “corrupt Kaine” and more recently when I was writing this essay, Google Trends provided no popularity data for either “corrupt Kamala” or “corrupt Karzai.” It is hard to imagine, in any case, that either search term has been popular in recent months. So why did the Google search bar disparage Attorney General Harris and President Karzai but not Mrs. Clinton?
If you still have doubts about whether Google suggests negative searches for prominent people, see how Senators Cruz, Rubio and Sanders fared in the following searches conducted between July 23rd and August 2nd:
I could give you more examples, but you get the idea.