If collective decision-making is to succeed, it is vital that all citizens are informed so that everyone can critique — from their own perspective — claims that land in their digital vicinity.


The Industrial Revolution changed the world. The steam engine and machine tools increased labor efficiency a hundredfold. Advances in iron production raised energy efficiency and cut costs, expanding the distances one could travel and shrinking the time it took to do so. But these technological developments carried serious side effects. The new coal-powered factories and large-scale chemical production blackened the sky and darkened the waterways. Pollution was not an abstract concept; people only had to look up at the sky.

The Digital Revolution has followed a similar path. Leaps in computer technology have again streamlined production — not of physical goods, but of information goods. Inexpensive and nearly infinite storage, gigahertz microprocessors, light-speed communication and the rise of sociotechnical platforms have all made the production and transfer of information fast and cheap. Everyone has become a publisher. The gatekeeper models of news dissemination have lost their dominance. Nearly two-thirds of Americans get at least some of their news from social media. This decentralization of news filtering is not all bad. Sure, the proliferation of fake news is a byproduct, but this era also provides access to a diversity of thought and interests that the Walter Cronkite era could never have afforded.


Jevin West is one of four panelists participating in The Seattle Times LiveWire event Sept. 13 at the University of Washington’s Kane Hall. It is sold out, but you can join on Facebook at facebook.com/STLiveWireEvent.

Just as in the early 1800s, the costs are high. Our digital environments have become gold-rush boomtowns of trolls, bots and clickbait prospectors. Fake news outpaces fact-based news. Even the purity of science has been pierced by the scourge of predatory publishers: "fake science" journals are attracting tens of millions of dollars a year and, worse than the wasted money, are potentially misleading the millions of people who read them.

So, what to do?

Technologists propose new algorithms and better platforms. Google has removed more than 200 publishers producing fake news from its ad network. I recently attended a meeting at Arizona State University, sponsored by Facebook. The all-too-easy, thumbs-up, one-click sharing design was a common discussion point. Personally, I would love to see a button that simply asked the user: “Did you read the article?” If not, that is OK. You can still share, but your Facebook friends should at least know whether you read the article or not.

Although commendable, these technology adjustments by themselves will not solve the problem. Others have proposed laws against fake news. Early this year, California legislators proposed a law prohibiting fake news. The bill was eventually pulled, and for good reason. First, how does one define “fake news” and how would this be enforced? Second, this kind of law is a slippery slope to First Amendment degradation.

Research is another avenue. I co-direct the DataLab in the Information School at the University of Washington, where we study misinformation, data curation, information visualization and data analytics. Research can help us better understand the dynamics of fake news and inform solutions to the problem, but cleaning up the digital pollution will require more than research papers.

Ultimately, we must arm the information consumer with the skills necessary to combat the onslaught of misinformation. Over the last couple of years, my colleague Carl Bergstrom and I have been sharing and discussing examples of bullshit (few other words in the English language better capture the frustration that comes with this type of misinformation) in our professional world of science and in our personal information environments. Bullshit comes in all shapes and sizes, and from across the political spectrum; no human organization is immune to it. Earlier this year, we released a course curriculum aimed at combating digital pollution. The course, "Calling Bullshit: Data Reasoning for the Digital Age," pays special attention to BS clothed in the authority of data, figures, statistics and algorithms. We call this "new-school bullshit." Our philosophy is that you don't need a Ph.D. in statistics or computer science to call BS on the vast majority of data bullshit. If you think clearly about what might be wrong with the data someone is using and what might be wrong with their interpretation of the results, you'll catch a huge fraction of the bullshit without ever going into the mathematical details.

We have developed case studies and short video vignettes, freely available to the public. We will continue to add content and will make announcements on Twitter (@callin_bull) and Facebook (callinBS). If collective decision-making is to succeed, it is vital that all our citizens, young and old, blue and red, are informed so that everyone can critique, from their own perspective, the claims that land in their digital vicinity.

This misinformation age worries me. As a parent and a concerned citizen, I genuinely lose sleep over it. Saber-rattling tweets based on fake news are not hypothetical. Yet amid all the post-truth doomsday talk, there is a silver lining and reason for optimism. Within hours of our course's release, tens of thousands of visitors from all over the world came to the website. And over the last several months, that energy has not abated. We are in active conversations with teachers, business leaders, librarians, journalists and concerned citizens from around the world and from all sides of the political spectrum. Faculty from more than 50 universities have asked to use our course. High school and middle school teachers are building on our efforts as well. Just as the Industrial Revolution led to the first environmental laws, I am confident, within limits, that society will find workable solutions.

Those workable solutions will hopefully come sooner rather than later. The most important principle is that bullshit is easy to create but difficult to clean up. Jonathan Swift said it best in 1710: "Falsehood flies, and the truth comes limping after it." Truth needs help. I encourage everyone to dig to the source of at least one article a week. Figure out whether the claims are too good to be true; check whether the headline matches the article's content and whether the conclusions follow from the data; and ask whether the person making the claim is an expert and what they have to gain from it. None of us has the time to do this for everything we read, but let's at least begin the collective cleanup. The health of our democracy, if not the preservation of the world, depends on it.