Can co-ops protect press freedom in the age of AI?

As we enter uncharted waters when it comes to news information, we hear from the Independent Media Association and UK regulator Impress

Journalism has come under huge transformational pressure since the millennium, with the online revolution shattering business models, depressing wages and ushering in social media echo chambers. And tech giants are taking the place of the old media barons – and proving no better when it comes to impartiality or journalistic ethics. Meanwhile, authoritarian politics are on the rise around the world; arguably, in part, a product of the new media landscape, they in turn threaten press freedom.

And now AI enters the frame, bringing threats of fake news, autogenerated copy and increasing control by biased algorithms over what news the public gets to see. The world faces an information ecosystem where people struggle to tell the real from the fake, harming our ability to make crucial decisions about our health, our finances, and our politics. Is mutualising big tech, and the media, the answer?

Among the numerous warnings over AI is a report compiled in 2020 for the Canadian government by Julia Haas from the Organization for Security and Co-operation in Europe.

“AI can be used as a tool to censor the media and unlawfully surveil citizens and independent journalists,” she wrote. “Moreover, in today’s online environment, a few dominant internet intermediaries act as gatekeepers in the curation, distribution and monetisation of information, including news content. These intermediaries increasingly deploy AI to govern private speech and public discourse.”

She adds: “The use of AI to distribute content based on the predicted preferences of individuals is based on extensive data-driven profiling. To maximise revenue, intermediaries may prioritise content that increases user engagement over providing access to diverse information of public interest or to independent quality journalism. This may undermine users’ ability to access pluralistic information and bias their thoughts and beliefs.

“The power and influence of a few intermediaries, as well as the fact that most AI tools operate opaquely with little regulation or oversight, exacerbates this threat.”

For Haas, the answers are government regulation to ensure transparency and good practice on the part of the technology’s owners. Crucial to this is that “users should have a choice and control over collection, monitoring and analysis of their data.”

Related: Co-ops join MPs’ calls for more government support for local journalism

Control is also important for journalists. “Generative AI poses the biggest threat to press freedom in decades, and journalists should act quickly to organise themselves and radically reshape its power to produce news,” wrote researchers Mike Ananny and Jake Karr in a 2023 study for NiemanLab. “A truly free press controls its language from start to finish. It knows where its words come from, how to choose and defend them, and the power that comes from using them on the public’s behalf.”

With generative AI – a system with no “commitment to the truth, eloquence, or the public interest” – increasingly being used to produce copy, journalistic integrity crumbles, they warn.

But how do we give users that control? The co-op or mutual model is an obvious solution. But this is a big ask, as shown by the failure of the campaign in 2017 to mutualise Twitter.

For journalists in the newsroom, there have always been issues around control. These demanded collective action, including the formation of media co-ops – and AI needs the same response, say Ananny and Karr.

“Taking a page from the Writers and Screen Actors Guilds, and aligning with some newsroom unions, journalists could find their collective voice on GenAI,” they argue, noting “some halting but hopeful efforts at collective action. Some publishers are attempting to form a coalition to demand fair compensation from GenAI companies that use news copy to train their models. And newsroom unions are pushing for greater worker protection.”

Journalists should also ask if “GenAI’s synthetic, statistical, and proprietary nature – its language comes from systems controlled by a few powerful people – is even compatible with a free press that commands its own words.”

Fed by the underlying prejudices of those creating the dataset, GenAI is “anything but neutral or objective”, they warn. “How much power do journalists have to refuse some or all of a GenAI infrastructure?”

One organisation tackling these issues is the Independent Media Association (IMA), which was born from the Media Fund, created in 2015 to raise money for new public interest media in the UK.

Thomas Barlow

Co-founder Thomas Barlow said: “We were interested in organisations that were financially independent. They weren’t dependent on either a state or a multinational corporation financially … and we wanted them to be democratic organisations. So we made it co-operative.”

Members of the fund included media co-op New Internationalist, alongside the likes of Byline Times and Red Pepper, and it aimed to build a “plural information ecosystem, which is the bland, technocratic term for organisations that tell good stories that are important and are different.”

In 2019 the team decided they wanted to do more than raise funds: they wanted to negotiate with big tech, lobby government, and fight for their members. That, says Barlow, means having “everyone able to collaborate together, because a lot of the organisations actually have overlapping interests, and we’re going to learn from each other and share skills.”

This is now a case of convincing the tech giants of the monetary value of their content – and getting them to pay up. “We’ve got this massive negotiation that we hope to be undertaking with Google to get significant amounts of income into the UK media, particularly independent sources.”

The IMA is looking to the UK’s Digital Markets, Competition and Consumers Act for support. If Google is designated under the Act as having strategic market status, the tech giant will be liable to government intervention.

Related: Co-ops and the commons – imagining a ‘People’s Media’

“Google’s so big that if you try to fight it, you wouldn’t win,” says Barlow. With 90% of people accessing the internet through Google, the tech giant “basically has a monopoly. As media organisations, our access to our audiences is very much controlled by Google.”

So the IMA wants to negotiate a deal. “They need us because we provide huge amounts of value for their AI, because we produce original content. We’re not just repackaging Press Association work. We provide regulated, high quality information, produced by humans, right? This isn’t junk stuff. So that’s great for their AI, for their search results.”

Barlow believes the IMA holds valuable cards. “People want to find news, and now Google puts that at the top of its search results, as AI results. Well, they need us to do the work. We’ve got to make sure there’s something mutually beneficial.

“We don’t know whether Google will be designated as having strategic market status. If it is, and if there are negotiations, we are absolutely determined to make sure Google feels that there’s a massive win for them and that there’s a massive win for us.

“For Google, they can make money from that, and for us, we can maintain our independence, and, at the same time, make a living. And that’s all that quality journalists are asking to do … The poverty that journalists live in is outrageous.”

With algorithms controlling the flow of information, press freedom is at stake, warns Barlow. “There’s two companies in the world, Meta and Google, that basically decide what almost everyone on the planet sees. Press cannot have freedom in that world. We work for them.”

Which returns us to the idea of mutualisation. “If these companies were co-operatives,” says Barlow, “and the public controlled these algorithms, they could be used for the public good … Algorithms themselves don’t have to squash press freedom. They don’t, but they could prioritise regulated media information that’s useful, stories that bring coherence and cohesiveness to society.”

Related: The role of user-led and member-owned media

Sadly, he says, the current reality is “addictive, extremely negative machines that constantly put us in a state of fear, anger, aggression, and very highly rewards disinformation and misinformation, because good quality information doesn’t elicit that response.”

Instead, says Barlow, it could prioritise “regulated media, independent sources, sources that are financially independent of nation states and of multinational corporations.”

The stakes are high, he warns. “We’re two or three, four years away from destroying all human news media in the UK. We’ve created something that we are now actively destroying or allowing to be destroyed by American global corporations.” 

As Julia Haas noted, the new tech-dominated media landscape poses a challenge for regulators – and currently often escapes their control.

In the UK, Ofcom regulates TV and radio, with most UK publications regulated by the Independent Press Standards Organisation (IPSO). Following the 2012 Leveson Inquiry into the culture, practices and ethics of the British press, the Press Recognition Panel was set up. 

In 2016, the Independent Monitor for the Press (Impress) became the UK’s first recognised press regulator, fully compliant with Leveson. Impress regulates over 200 titles, consisting of a variety of independent local, investigative and special interest publications. No national newspaper signed up to the new regulator; most continue to be members of the unrecognised IPSO.

Lexie Kirkconnell-Kawana

Impress CEO Lexie Kirkconnell-Kawana shares concerns over the digital revolution. “I could see this crazy shift happening, this digital divide emerging, and people coming online and accessing different types of media in a very converged way,” she says. “And then these big pillars of companies rose.”

The new era brought new dangers. “I worked a lot in child protection and could see how in these very unregulated spaces children suddenly had access to everything all at once, and there was no ability to monitor that, or create interventions.

“I was interested in looking at ways to govern this. What should be the role of the state? Of civil society? Of the industries producing this stuff and profiting off of it?”

Six years ago, Kirkconnell-Kawana joined Impress to develop a framework and set of standards that could regulate digital news. 

“We’ve been really successful as an entity, bringing lots of news publishers into the fold that want to subscribe to that framework and make sure they’re legally and ethically compliant and doing right by their readers.”

Impress is organised as a community interest company, with articles of association that require it to maintain a board independently appointed by a satellite panel, and consult the public annually on its rules, fees and code. It has a number of co-ops in membership, including Co-op News, Bristol Cable, New Internationalist, the Meteor and the Ferret.

Its aim is to “build a new ecosystem grounded in accuracy, transparency and accountability” and, through this, mitigate harm. 

“But we’re not here to promote a certain type of journalism,” Kirkconnell-Kawana says. “We are impartial in the sense that we allow all kinds of partisanship within the space. So whatever worldview you want to bring to your journalism, we think is acceptable.”

Social media has made things harder, she says. “We want to make sure that where someone experiences harm and wants to engage with their news provider, they’ve got a body they can go to, to reinforce that. So we’re not just looking at news organisations. We’re also looking at the influencer creator space, and at providing redress for harm on tech platforms. We’re also thinking about where these new frontiers will be in the future as well.”

Among the new breed of creators, she warns, “there isn’t a lot of education or literacy around the rights and responsibilities when engaging with media. There is still a tendency to sit in this space of free speech absolutism that says, ‘I can say what I want because it won’t harm anyone’. And it’s about shifting that back into more of a social responsibility mindset that acknowledges all media can cause harm in some way – and the degree to which you can mitigate those harms in advance or then redress it when it goes wrong, isn’t about curtailing freedom. It’s about ensuring that communities stay in a good relationship.

“We’re not in the business of censorship. But it’s saying, ‘be aware that this will have an effect … and that if that effect causes harm, you have a responsibility to put that right’.”

The idea of media freedom “isn’t that complicated”, she adds. “A lot of people get really sensitive about this issue because I don’t think they have spent enough time in rights discourse to know that rights are balanced. One person’s ‘right’ can butt up against another, and then that friction needs to be addressed. When people aren’t prepared to do that, that’s where you’ve lost the conversation.”

Another challenge with the transition to digital is that print media organisations traditionally had a legal team, sub-editors and other checks and balances to prevent the publication of harmful material. But accelerating news and publication cycles, alongside resourcing issues, mean much of this has been set aside – including the training that teaches journalists how to be responsible in their newsgathering.

Kirkconnell-Kawana sees a particular gap here when it comes to influencers, who “are very good at building communities and affinity with their audiences; but that doesn’t correspond to any kind of fiduciary relationship, that responsibility for holding millions of followers in the round.”

This brings huge risks, as influencers “have probably never had any journalistic training or been involved in an organisation that was communicating or publishing and needed to think about the risks involved”.

In this context, Impress plans to launch an accreditation this autumn, for journalists, freelancers, influencers and social media hobbyists who want a rights-rich education. 

Generative AI further muddies the waters, says Kirkconnell-Kawana – and puts an onus on journalists to ask questions. “If you are going to be making AI images, it’s about thinking through, what’s the likely effect of this going to be on my audience, how are they going to engage with this content? How am I ensuring that there’s a level of oversight that if something goes wrong, we can redress that?”

Oversight is crucial where “there are whole websites, whole new accounts, whole new personalities being completely autonomously generated, where there’s no supervision or interface at all with the human agent. If I can’t hold to account a computer, then how are we supposed to repair harm?”

A community response – which the co-op model could provide – is needed, she thinks. “There is something about news which should be sense making, which should be about a common set of facts and experiences.” But hyper-personalised digital content is leading to “polarisation and a lack of cohesion”.

The co-operative model, she says, “is a really interesting way of thinking about frameworks. Obviously co-ops don’t have to just be based in a locality. They can be based in interest, or they can be based in other types of principles and values. 

“But it is a really interesting way of thinking about participation, particularly from a redress perspective.

“And I see co-ops having a much more prominent role over the next 10 to 20 years, as we move out of statist paradigms for how we solve some of the biggest challenges that we’ve got coming down our way.”

Meanwhile, the risks of new tech become more worrying given the rise of authoritarian governments around the world – meaning there are risks involved in encouraging state regulation. “I think we have to ask questions if a country like Hungary uses the power of the Digital Services Act to start interpreting the national laws on, for example, discriminatory content,” warns Kirkconnell-Kawana. 

On the other hand, she notes, in the US “you have that libertarian model, and we’re seeing how that’s being weaponised by a particular set of oligarchic characters to their own ends. 

“For me, the state still has a significant role to play out over the next 10 or 20 years, but so long as we can build genuine democratic participation in these alternative governance structures, ultimately the state does fall away.”

Mutualism is an obvious contender for building this democratic system. Kirkconnell-Kawana says moves to mutualise the BBC, for instance, make for a “really interesting project” although she remains “agnostic” about the idea.

“As an organisation, I think it does suffer a lot from the politicisation of its independence and its governance because of its relationship, again, to the state. And we’ve really seen in the last couple of years how that’s played out in terms of reporting on Gaza, in terms of reporting on government, and political parties.”

Related: Petition for BBC Trust doubles signatures amid licence fee row

At IMA, Barlow is also interested in a mutual BBC. “I would say, you start gently … start electing the board and the chairs”.

This could remove some of the corporate figures and political appointees from the upper echelons of the BBC, whose current “cultural production is inherently political,” he argues.

Citizens’ assemblies could also feed into programming, he adds. “Let’s be real. The BBC is in trouble unless there’s a radical change. People are not paying the licence fee.

“The only way I think you can sell a tax is to say: you’re paying for something that is genuinely yours. People don’t realise how much the BBC turns up in their everyday life. It’s radio, it’s the news site, it’s weather reports – and again, AI and Google are scraping all the stuff the BBC does and presenting it as its own. The BBC is a public resource.”
