
Last weekend, in the hours after a deadly Texas church shooting, Google search promoted false reports about the suspect, suggesting that he was a radical communist affiliated with the antifa movement. The claims popped up in Google’s “Popular on Twitter” module, which made them prominently visible — although not the top results — in a search for the alleged killer’s name. It was just the latest of multiple similar missteps in a long-standing problem. As usual, Google promised to improve its search results, while the offending tweets disappeared. But telling Google to retrain its algorithms, as appropriate as that demand is, doesn’t solve the bigger issue: the search engine’s monopoly on truth.

Surveys suggest that, at least in theory, very few people unconditionally believe news from social media. But faith in search engines — a field long dominated by Google — appears consistently high. A 2017 Edelman survey found that 64 percent of respondents trusted search engines for news and information, a slight increase from the 61 percent who did in 2012, and notably more than the 57 percent who trusted traditional media. (Another 2012 survey, from Pew Research Center, found that 66 percent of people believed search engines were “fair and unbiased,” almost the same proportion that did in 2005.) Researcher danah boyd has suggested that media literacy training conflated doing independent research with using search engines. Instead of learning to evaluate sources, “[students] heard that Google was trustworthy and Wikipedia was not.”

Google encourages this perception, as do competitors like Amazon and Apple — especially as their products depend more and more on virtual assistants. Though Google’s text-based search page is a flawed system, at least it makes clear that Google search functions as a directory for the larger internet — and, at a more basic level, as a tool for humans to master.

Google Assistant turns search into a trusted companion dispensing expert advice. The service has emphasized the idea that people shouldn’t have to learn special commands to “talk” to a computer, and demos of products like Google Home show off Assistant’s prowess at analyzing the context of simple spoken questions, then guessing exactly what users want. When bad information inevitably slips through, hearing it authoritatively spoken aloud is even more jarring than seeing it on a page.

Even if search is overwhelmingly accurate, highlighting just a few bad results around topics like mass shootings is a major problem — especially if people are primed to believe that anything Google says is true. And for every advance Google makes to improve its results, there’s a host of people waiting to game the new system, forcing it to adapt again.

Simply shaming Google over bad search results might actually play into its mythos, even if the goal is to hold the company accountable. It reinforces a framing in which Google search’s ideal final state is a godlike, omniscient benefactor, not just a well-designed product. Yes, Google search should get better at avoiding obvious fakery, and at not creating a faux-neutral system that presents conspiracy theories next to hard reporting. But we should be wary of overemphasizing its ability, or that of any other technological system, to act as an arbiter of what’s real.

Alongside pushing Google to stop “fake news,” we should be looking for ways to limit trust in, and reliance on, search algorithms themselves. That might mean seeking handpicked video playlists instead of searching YouTube Kids, which recently drew criticism for surfacing inappropriate videos. It could mean reestablishing trust in human-led news curation, imperfect as it has been. It could mean pushing Google to kill, not improve, features that fail in predictable and damaging ways. At the very least, Google should rename or abolish the Top Stories carousel, which lends legitimacy to pages without vetting their accuracy. Reducing the prominence of “Popular on Twitter” might make sense, too, unless Google clearly commits to strong human-led quality control.

The past year has made web platforms’ tremendous influence clearer than ever. Congress recently grilled Google, Facebook, and other tech companies over their role in spreading Russian propaganda during the presidential election. A report from The Verge revealed that unscrupulous rehab centers used Google to target people seeking addiction treatment. Simple design decisions can strip out the warning signs of a spammy news source. We have to hold these systems to a high standard. But when something like search screws up, we can’t just tell Google to offer the right answers. We have to operate on the assumption that it won’t ever have them.
