Ever wonder why people’s perception of the incidence of crime, terrorism, kidnapping and other violent acts is often much higher than the reality? Why the U.S. is becoming a low-trust society? Why Americans are collectively in a funk?
A big part of the answer, according to experts in social science, psychology and computer science, is that the biases that were once useful to our primitive forebears have become — like the craving for sweet foods — detriments in our modern world. Instincts that may once have saved us from real dangers have now, thanks to global instantaneous communication, turned us all into Chicken Littles.
Our best hope for breaking their spell may lie in understanding the workings of our cognitive and social biases — and the algorithms of online social networks that reinforce them.
First described in 1973 by psychologists Amos Tversky and Daniel Kahneman — the latter the author of the book “Thinking, Fast and Slow” — the availability bias refers to our tendency to think that whatever comes most easily to mind, often whatever we heard about most recently, is more common than it actually is. This might have been useful when we had to make life choices based on a trickle of information, but now that we have a fire hose of it, we can’t seem to be rational about the likelihood of bad things happening.
The availability bias helps explain why people are afraid of shark attacks, even though they’re more likely to drown at the beach. People fear terrorism, even though the odds they will die in a plane crash are far higher — and the odds that they’ll be killed walking down the street are many times higher still.
Sometimes known as the availability heuristic, this bias is one reason parents are afraid to let children play unsupervised, though it’s never been safer to be a child in America.
Mass media has leveraged this bias since at least the birth of so-called yellow, or sensationalist, journalism in the late 1800s, but the internet makes every child abduction, shark bite and terrorist attack seem like it’s happening in our backyards, says Lenore Skenazy, president of Let Grow, a nonprofit that advocates for childhood independence.
We also have social biases that come out when we’re in crowds, says Jonah Berger, a professor at Wharton who studies how ideas spread. The extremity bias is our tendency to share the most extreme version of any story, to keep our listeners rapt. A positive story becomes absolutely glowing, a negative one turns horrific, like the tall tales of ancient oral tradition.
Online, this tendency goes into overdrive. “Our audiences are getting larger and larger, so our bias is to make things more and more extreme to engage those audiences,” says Prof. Berger. Note the rise of hyperbolic phrases — things aren’t merely “exciting,” they’re “extremely exciting.”
Content that evokes both positive and negative responses at the same time is even more viral. For example, sharing content about children being abducted from their parents by strangers — an exceedingly rare phenomenon — simultaneously arouses feelings of anger and feelings of self-righteousness, says Ms. Skenazy. Even as we’re incensed, we feel we are helping to protect children by sounding the alarm. “It is this double whammy of outrage and virtue.”
We have a natural tendency to seek information that confirms our pre-existing views and discount information that doesn’t. That’s confirmation bias, and ironically, it may have evolved as a way to keep us from succumbing to manipulation by others.
Confirmation bias has come to the fore this week as President Trump has seized on a perception in conservative circles that Google elevates news articles critical of his presidency, and has threatened action against the search giant. Google says its search results aren’t politically biased.
Social media’s algorithms tend to lump us into buckets and feed us information that more or less conforms to what we’ve previously shown an interest in. Doing this across millions of people has meant dividing and polarizing populations into nonoverlapping views of reality.
As a result, when inaccurate information infects one of these echo chambers — for example, that kidnapping is on the rise or that vaccines cause autism — there are few checks on its spread.
Algorithms that maximize engagement play off our biases, or unwittingly fuel them. Either way, this leads to a litany of well-documented ills, from mental-health issues to ever-deeper political polarization.
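For readers curious about the mechanics, the bucketing-and-feedback loop described above can be illustrated with a toy model. This is a hypothetical sketch, not any company’s actual system: a feed ranker that scores items by a user’s past click-through rate per topic. Because each click feeds back into the scores, the feed quickly converges on whatever the user engaged with first — a miniature filter bubble. All names and parameters here are invented for illustration.

```python
# Toy model (illustrative only): an engagement-maximizing feed ranker.
# Items are scored by the user's observed click-through rate for their
# topic; clicks feed back into the scores, so the feed narrows toward
# whatever the user already engages with.

from collections import defaultdict

def rank_feed(items, click_counts, shown_counts):
    """Sort items by observed click-through rate for their topic."""
    def score(item):
        topic = item["topic"]
        shown = shown_counts[topic]
        # Unseen topics get a neutral prior of 0.5.
        return click_counts[topic] / shown if shown else 0.5
    return sorted(items, key=score, reverse=True)

def simulate(user_interest, rounds=10):
    """Deterministic toy user who clicks only items matching their interest.

    Returns the topic shown in each round, so the narrowing is visible.
    """
    items = [{"topic": t} for t in ("politics", "sports", "science")]
    clicks = defaultdict(int)
    shown = defaultdict(int)
    history = []
    for _ in range(rounds):
        top = rank_feed(items, clicks, shown)[0]  # show top-ranked item only
        shown[top["topic"]] += 1
        if top["topic"] == user_interest:
            clicks[top["topic"]] += 1
        history.append(top["topic"])
    return history
```

Running `simulate("sports")` shows the feed sampling other topics briefly, then serving the user’s preferred topic in every remaining round — the engagement loop rewards itself, with no one intending a filter bubble.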
The end result is systems that — whatever their makers’ intent — are highly optimized to make us believe things that aren’t true. Facebook Inc. and Alphabet Inc. (parent of Google and its YouTube division), along with a few other tech companies, have built history’s biggest, farthest-reaching and most profitable delusion machine.
Chief Executive Mark Zuckerberg promised to spend 2018 fixing Facebook’s assorted issues, and pledged to help ensure that users’ time on its services is “time well spent.” Facebook also says it’s actively working to make its platform less susceptible to manipulation of the sort that occurred when Russia used Facebook to attempt to disrupt the 2016 U.S. elections. Whether or not these measures have had any effect, people are spending less time on Facebook.
YouTube previously said it was beefing up content moderation and surfacing more authoritative news sources to people searching breaking-news topics. It has also recently terminated accounts found to be pushing misinformation. It’s not clear what impact that has had on its user experience.
Skeptics might argue that this column is itself a product of our cognitive biases.
“I’m always skeptical of now-more-than-ever observations that are not backed up by time-series data, since they themselves can be products of the availability heuristic and may be inaccurate,” says Harvard University psychology professor Steven Pinker.
The good news, says Peter Reiner, a neuroethicist at the University of British Columbia, is that educating ourselves about these cognitive biases could help. “The best thing you can do to inoculate yourself is to know that they exist,” he adds.
That’s why it’s imperative that you don’t share this column on social media, where it will just become part of one bias-reinforcing echo chamber or another. Instead, talk about it with friends or family members. Or better yet, total strangers. After all, the odds of being killed by one are astronomically remote.
Credit: By Christopher Mims
(c) 2018 Dow Jones & Company, Inc. Reproduced with permission of copyright owner. Further reproduction or distribution is prohibited without permission.