A global effort to reinforce good reporting
The basic pledge of journalism is this: Serve society with a truthful, intelligent, and comprehensive account of ideas and events, and function as “the immune system of democracy.” How’s it going? Not well—as far as health and reputation are concerned. So in 2014, SCU’s Markkula Center for Applied Ethics launched The Trust Project. An international cooperative effort, the project aspires to harness Silicon Valley tech and imagination to bake the evidence of trustworthy reporting—accuracy, transparency, and inclusion—plainly into news practices, tools, and platforms. “Trust in journalism has been declining for several decades,” says veteran journalist Sally Lehrman, who founded the project. They started with some basic questions: “How can we think about ways to make technology a support for quality journalism instead of it being seen as a barrier? How can we flip this picture?”
Funding from philanthropist Craig Newmark got the project off the ground. Nearly 70 media organizations are involved. Google was there from the beginning. That was critical, Lehrman says, because now when people are looking for news, it’s often divorced from the brand. “You don’t necessarily know where this piece of information is coming from—whether it’s a piece of news from a trusted news brand or it’s actually advertising or propaganda, or just poorly reported news.”
The Trust Project started by consulting with the public in one-on-one interviews. Through workshops to apply the resulting insights, news executives developed 38 “indicators of trust.” These reveal the practices behind a story: the organization’s ethics and corrections policies, its commitment to diversity, and sources of funding, plus author information, where a story’s facts came from, and where it was originally reported, among others. Late last year, international teams of news organizations participated in a hackathon in London—the “Trust Project Development Challenge”—to produce open-source technology that will display these indicators to readers and create signals that help news platforms like Google and Facebook give priority to factual, ethical news. A few results:
- The Economist: a validator to automate indicator tags and to display a score.
- Washington Post/BuzzFeed: a tool that scans news stories for author information, sources, and links, then makes these visible to readers and platforms.
- Ex-BBC News Labs: checks the similarity of articles to see whether they are recycled “churnalism” or original reporting.
- The Guardian: a tool to pop filter bubbles—allows users to suggest articles with an opposing view from the one they originally chose.
- La Stampa: an author database and tool that displays expertise based on previous coverage of a topic.
- Mirror Group: a tool that displays an author profile and warns users when an organization is not following Trust Project guidelines, also suggesting stories with alternate viewpoints.
“Now we’ve got both Facebook and Twitter engaged,” Lehrman says. “They can help us think about how a trust system would work within their particular environment. If you think about the platforms, they really are creating information environments.”
There’s a design element to this effort: creating a set of icons that news organizations use to vouch for the practices behind individual stories. An interesting dynamic is also at work in this phase of the project, which draws on lessons from people who aggressively seek news and information from a broad array of sources. The project aims to use this knowledge to reach a middle ground: people who are honestly looking for quality news but aren’t necessarily going to put much effort into finding it.
Steven Boyd Saum is editor of Santa Clara Magazine.