How will Pandora’s AI Box impact the news industry?

date: Apr 30, 2024
slug: 2024-how-will-the-pandoras-ai-box-impact-the-news-industry
status: Published
tags: AI, Artificial intelligence, journalism, regulation, news
type: Post
summary: Artificial intelligence is set to have a significant impact on the news environment. Has the media industry learned from previous major changes to handle it effectively?
Just as with the advent of digitisation, search, and social media, journalism and the information media will feel a significant impact from the consolidation of artificial intelligence. Every aspect of the media business can - and will - have its processes adapted to this new paradigm to varying degrees. But to what extent? Will AI replace the current system or become a tool? We should not attempt to predict the future, but rather identify the aspects that society can leverage to make this new technology work for it - and not against it.
AI arrives in the news industry at a time when it is riddled with systemic problems. Disinformation occupies a large part of information systems, bringing no benefit to the sector and driving the audience to a level of polarisation not seen for many decades. Layoffs have been the norm in a sector that today has the same number of jobs as in the late 90s (or even fewer, in some countries). Trust in media is at a historic low, and the industry lacks a business model that ensures sustainability in its battle for survival, let alone its independence.
Technological shifts have a history of boosting productivity, generating new revenue, and providing flexibility for early adopters (consider Amazon, Google, and Facebook). However, the gains these pioneers achieve are unevenly distributed among customers, employees, and society at large. While such companies create new jobs in the short term, they often resort to layoffs to trim costs and please shareholders. This pattern seems likely to continue with the advent of AI, but with additional concerns: the sheer speed and scope of the change, and the fact that the unevenly distributed wealth does not benefit the media ecosystem at all, but remains in the pockets of technology corporations, exacerbating income inequality.
Imagine hard, but harder
The news system will also become more complex with the addition of a new layer. LLM providers will form a very exclusive group. Their lack of interest in interoperability, aimed at shielding their users from potential competitors, introduces another level of walled gardens. Struggling to visualise it? Think about how Google or Meta impacted the news environment and the consequences that followed. Now imagine that, on top of the existing infrastructure, search, and social media layers, AI becomes another monopolistic entity striving for ubiquity.
Like any technology, AI can be a tool for positive change, but its potential developments can also be negative. In the newsroom, AI has the potential to significantly enhance the production flow. Automating manual tasks such as proofreading, creating articles about statistics and official announcements, and building context could free up journalists to focus on more complex content, such as investigative journalism. The process of tailoring content for different platforms could become easier. Adaptation to human behaviour could be quicker, and data-driven pivots could be made more frequently, leading to new or improved products.
Solutions for chronic, longstanding problems may also lie in the reshuffled news environment. New projects can thrive in news deserts—regions that have lost their news outlets over the years. With a leaner team, local news production can benefit from AI resources that were once impractical. Even if foundational problems, like the digital ad business model, are yet to be tamed, news entrepreneurs will have a chance to try their luck again, hopefully with some help from policymakers.
Why do we have to regulate, and what can it prevent?
The news environment exemplifies market failure, having been undermined by previous technological shifts. Governments' inability to effectively regulate search and social media has led to the Orwellian reality we currently face. These unresolved issues must be addressed now. As American jurist Oliver Wendell Holmes Jr. stated, laws should be made considering the bad man in power, not the good. These companies have consistently failed to safeguard societal interests. Expecting self-regulation after a technological shift in an industry that only trillion-dollar companies can enter is akin to waiting for a miracle.
Even without any futuristic exercises, it's possible to imagine how things could go very wrong. Scaling technologies tend to centralise markets around a few companies and concentrate revenues in the hands of one or two of them. The pace of development could go into overdrive, leaving no time for oversight or checks by external organisations. Privacy is already exploited today, but the capabilities of AI could exacerbate the problem (consider facial recognition, deepfakes, and identity takeover, to mention just a few). Plus, disinformation could be boosted to unseen levels.
Reality had always been a shared experience until the combination of digital technology and algorithms led us into the post-truth realm, where anything can be disputed. Humans have always had a hard time processing reality because we cannot absorb everything in the world (what Hegel called "totality"). Digital technology has exacerbated this problem as we face multiple sources of information, and disinformation actors like Donald Trump took advantage by simply denying the undeniable. In practical terms, what we've seen in the last decade is the creation of different realities. It doesn't matter whether something is real or not: what matters is the perception of a part of society that wants it to be real.
Fasten your seat belt, Dorothy…
Artificial intelligence can power factories that manufacture realities in such detail that it becomes impossible to prove a given customised reality is false. "Facts" as we once knew them - undisputed pieces of information like official statistics and scientific output - may cease to exist. This could fragment communities and isolate them in silos that refute any information unfriendly to their beliefs, making the concept of a singular truth impossible. A deregulated, AI-driven news and information market would be controlled by a few platforms (and serve bad actors with the right budget) that would optimise the system for higher profits alone, leading to dystopian scenarios: war, social unrest, and the further dissolution of the already torn fabric of society.
The ubiquity of the subject in today's media and pop culture, and the distorted versions of reality presented in sci-fi classics like The Matrix, Terminator, and Neuromancer, certainly feed a kind of primal fear. Some may argue that everything the so-called doom-bringers put on the table is a hysterical fear of progress. There is some truth in this. We live at the end of a historical cycle, plagued by existential threats (climate change, disinformation, wars, etc.), which makes it difficult to stay completely detached from the discontent afflicting civilisation now. Plus, AI will give rise to events, technologies, and industries that we can't yet imagine. The level of unpredictability will only rise, and reacting fast will be paramount.
Putting the subject in perspective, however, is not a license to dismiss the severity of the consequences if we get it wrong. Advances driven by artificial intelligence will be welcome, but we have an obligation to address the large set of problems that already exist and the inevitable developments to come. The news industry, having suffered severe setbacks recently, should have learned enough to prevent new cycles of decay from undermining its role in society once again. Society and lawmakers, not elusive boards with their own agendas, should make the decisions. This should already be happening. We are already late, and that's unfortunate.
[This is a series of posts processing the insights achieved by the Open Society AI in Journalism Futures 2024 workshop. The opinions are mine, but the subjects were collectively discussed]

© Cassiano Gobbet 2023 - 2024