If you start typing Alex Jones' name into Google search, it suggests "Alex Jones, radio host" and displays a thumbnail so you can see whether this is the person you're looking for.
The "radio host" qualifier might surprise you for this American far-right figure, founder of the website Infowars, who became famous for his conspiracy theories about the Sandy Hook shootings and the 9/11 attacks.
Do the same exercise with Gavin McInnes, founder of the far-right group Proud Boys, and he is introduced as a writer. For Jake Angeli, known as the QAnon Shaman, one of the participants in the 2021 assault on the US Capitol, the search engine uses the qualifier "activist."
Discussions about platforms that help spread misinformation online regularly target Facebook and Twitter, while Google's search engine is too often overlooked, argues Ahmed Al-Rawi, a professor at Simon Fraser University's (SFU) School of Communication in British Columbia.
However, these descriptions, generated by the automatic suggestion tool of Google's search engine, omit part of the truth and flatter these individuals' image in the eyes of the general public, explains Ahmed Al-Rawi, who is also director of SFU's Disinformation Project lab.
"Calling someone an activist, when that person has spread hatred and even called for genocide, is not normal."
Activists, journalists, and many more
As part of this study, Ahmed Al-Rawi, along with other researchers from the Disinformation Project lab, examined the subtitles that Google's search engine suggests for 37 people considered to be conspiracy theorists or to have supported conspiracy theories.
The results of the study, published in the M/C Journal, show that, of the 30 people who had a subtitle, none of these qualifiers reflected how the public sees them, according to Ahmed Al-Rawi: 16 were presented for their contribution to the artistic field, 4 were described as activists, 7 were associated with their original jobs, 2 with journalism, one with his sports career, and the last was identified as doing research.
Knowing that Google's search suggestions can vary depending on the geographic area of the search, the researchers were surprised to find that the same results appeared in Canada, the United States, and the Netherlands. They came to this conclusion by using a virtual private network (VPN) to make their searches appear to originate from those three countries.
A bias impossible to pin down without knowing the algorithm
An algorithm applies a series of rules to a defined database to arrive at a result. If that underlying database contains biases, those biases will be carried through to the result.
In this case, are the qualifiers assigned to conspiracy theorists the result of an error in Google's database or a deliberate choice? It is difficult to say, given how little information is available about how the algorithm works, argues Stéphane Couture, professor of communication at the Université de Montréal and co-director of the Laboratory on Online Rights and Alternative Technologies.
However, it is clear that these titles were not chosen as part of an editorial policy. "Google doesn't have an editor who decided to give Alex Jones that radio host subtitle," explains Stéphane Couture.
Ahmed Al-Rawi, for his part, fears that conspiracy groups could take advantage of this system.
The director of the Disinformation Project lab argues that, since Google's database draws on information available on the internet, as the platform states, these conspiracy theorists could influence how the search tool presents them through the way they describe themselves online.
"The system is being manipulated; it's like a loophole that lets conspiracy theorists promote themselves with Google's help."
A limited vision
Stéphane Couture explains that Google leans on the supposed neutrality of its search process to keep the trust of as many internet users as possible.
Although this neutrality is merely claimed, and these subtitles give a limited picture of these individuals, the fact remains that they are not false, the communication professor notes. Alex Jones is indeed a radio host, he recalls.
Of course, if the platform presented Alex Jones as a conspiracy theorist, the people in Alex Jones' camp would be angry with Google.
"This information seems biased to us, but it is not to Alex Jones."
According to Stéphane Couture, it is rather in choosing whether or not to give these individuals a subtitle that Google takes a position. Other controversial figures, such as Osama bin Laden, do not have one, for example.
Control and transparency required
The two researchers point out that this is not the first time that Google’s search engine algorithms have been discussed.
Ahmed Al-Rawi believes that enough pressure from the international community could prompt the digital giant to step in and fix the situation.
Google has already made changes to its search algorithm after an outcry over the pejorative terms the search tool used to describe women and racialized groups, recalls Ahmed Al-Rawi.
In 2020, Google also decided to remove gendered labels, such as "woman," from its algorithm to comply with its ethical rules on artificial intelligence.
Stéphane Couture says Google should be more transparent about how its algorithms work. He suggests that the platform remove these titles from its suggestions and appoint a chief editor who could be held accountable when questions of algorithmic bias arise.
Their impact is very real, according to him, given the use that the general public makes of the platform.
"It's like saying, 'Osama bin Laden was a former citizen of Saudi Arabia.' It completely erases his history and, strangely, the political dimension behind it."
Internet giants like Google justify these biases in their algorithms by claiming they are merely a mirror of society, but a growing number of researchers and politicians argue that these platforms have an editorial role to play, says Stéphane Couture.
With information from Nantou Soumahoro