On December 7, Google published a post in its company news section about the future of news, proposing that “Artificial Intelligence [AI] has the potential to enhance storytelling and newsroom processes.” It is a timely piece. The media has been the topic of extensive news coverage over the past two years: Russian agents attempted to conduct information warfare against the United States, the President of the United States frequently casts aspersions on any news organization whose coverage he finds unfavorable, and algorithms encourage echo chambers, to say nothing of the confirmation bias that often directs internet and media usage. But can emerging forms of machine intelligence save us from ourselves?
The Google News Initiative (GNI), announced as a mechanism to restore order to the chaos, will be partnering with Polis, a London-based think tank, to create a program called Journalism AI. Next year, they expect to release a global survey revealing how news media outlets are currently using AI, followed by a handbook of best practices.
AI has already enabled many businesses to streamline their operations and convert data into actionable insights. However, the digitization and technological transformation of journalism has a checkered past. In 2011, the Columbia School of Journalism released a 143-page paper breaking down the business of digital journalism, titled “The Story So Far.” The authors examined the many ways that the digitization of the news industry had fundamentally changed the gathering, dissemination, and comprehension of information. At the time, the CSJ authors noted that top-performing outlets had dedicated more resources to audience valuation than to content creation. These publications were analyzing traffic and usage data, responding quickly, and rivaling venerated newspapers in audience numbers. It changed the profession.
“Indeed, data analysis has moved from being a required skill in media companies’ finance departments to being an essential part of the résumé for editors, writers and designers,” they wrote. The authors observed that journalists were being encouraged to watch real-time data and see which stories were gaining traction. This should not be particularly surprising, as it is a direct reflection of the underlying business model: advertising revenue is often tied to views.
Editorial and advertising efforts were clearly blurring together. Media experts noted that these digital developments were leading to breaches of the Chinese wall that separates objective journalism from advertiser influence. Although this observation was often framed as a warning, some observers suggested that this ethical barrier should be unapologetically bulldozed. Ultimately, quality journalism is dependent on a viable business model, or a reliable, nonintrusive source of funding. Some suggested that a refusal to adapt to new business realities was self-defeating, not virtuous.
The digitization of the news media also allowed true expertise to shine. Lewis DVorkin of Forbes wrote, “By taking to the Web, audience members with deep topic-specific expertise successfully took on quite a few professional journalists with far less knowledge. Marketers, experts in their own right, also became respected content providers in an increasingly information-obsessed society.”
At the time of the CSJ report, loyal readers accounted for 56 percent of a news site’s page views, and news outlets wanted to capture the sustained attention of those readers, although in recent years some have challenged the notion that a “flyby reader” has less value. This pursuit of loyalty presented ethical conundrums. Looking at the incentives, a publication has a vested interest in appealing to the biases of its known readership, as this leads to loyalty, engagement, and page views. The publication also has an interest in curating a readership that is valuable to advertisers and in producing content that complements ad campaigns.
Sometimes data interferes, and at other times, it illuminates; but ultimately, data is only valuable within a human context. Most humans have moral compasses. This is one of the reasons why it’s important to keep people in the equation, even as AI tools start to automate aspects of work.
But the rise of AI could present another problem. Tech companies and moguls are increasing their profits through effective automation, and those profits may fund further acquisitions. The resulting concentration of media ownership could affect the quality and accuracy of journalism. Multinational corporations and billionaires often have diverse financial interests, which can become entangled with the outcomes of coverage. It might not even be obvious when coverage is being swayed one way or the other; ideological and partisan biases are more readily detectable than complex conflicts of interest. And unlike those hidden entanglements, the influence of audience analytics and advertisers at least operates in direct view of the newsroom.
As news organizations try to expose deceptions and revisit their internal ethics and practices, they may find that the most relevant questions transcend technology. In a recent Nieman Lab piece, Robin Kwong wrote, “As even the biggest tech companies are starting to discover, there are larger issues than just how to reach, monetize, and retain users. How do we convene civic spaces, or ones that further human connection? What motivates learning and curiosity, to ward against disinformation? How do we study and change our own organizational cultures?”
Faced with the breadth of these sociological challenges, AI may seem like a distraction. It is important to remember that AI is a means, not an end. Human judgment becomes ever more valuable as news companies are acquired by billionaires and algorithms pluck content for us to consume.