




The AI Creative Sector Conundrum: A Government Unsure

 

The conflict between AI companies and the creative sector is intensifying. AI developers push for broader access to creative works for training purposes, while artists and content creators demand fair compensation and protection of their intellectual property. This tug of war is being played out in the media on a daily basis.

A recent Guardian article discussed issues around ethical and responsible AI, highlighting artists' calls to protect work that AI developers use without oversight, compensation or transparency.

The issue for me is the article's obvious bias, which I think immediately limits the conversation on AI because it comes from a place of trepidation and suspicion on the part of creatives.

The article discusses the opportunities around AI and then moves quickly to the risks. That is valid, but the focus on risk is overwhelming, and it drowns out most of the wider conversation around AI.

It also describes a landscape dominated by bland aesthetic output, but this is a limited view of AI in creativity. Research by Alexander Manu, for example, indicates that AI pushes human creativity and innovation, opening up new ways of seeing that bring satisfaction to viewers.

The article also discusses the fact that over 400 creatives have written to an office in the White House urging it to protect creative intellectual property.

The letter presented to the White House uses the words: "America's global AI leadership must not come at the expense of our essential creative industries."

On the surface this looks fine (a truism that is not necessarily truth), but it immediately draws my attention to the work of Hye-Kyung Lee, who describes the use of the term 'creative industries' as dehumanising for creatives. I like the definition Lee provides, which I think gives us a less emotional, less subjective view of creativity: 'Since Tony Blair's government in the UK introduced the idea of "creative industries" (those industries which have their origin in individual creativity, skill and talent and which have a potential for wealth and job creation through the generation and exploitation of intellectual property), it quickly has become an orthodox template for cultural policy globally.' Lee notes that 'creative industries' is a term coined in regard to the financial exploitation of creative and intellectual property, which can sometimes make the cry for creative ownership ring hollow.

The irony is that Lee insists AI is humanising creativity by forcing a conversation that looks at human creativity and its ownership.

This dichotomy in AI's impact is brought to light, and it does indeed force the conversation about creativity, ownership and AI.

These are conversations that I don't think have been clarified in the public domain, even for those writing these letters of protest.

Google, on the other hand, insists it understands copyright law, including 'fair use', but it would like exemptions; otherwise, it argues, prolonged negotiations with creative copyright holders will delay AI innovation.

Fair use permits a party to use a copyrighted work without the copyright owner’s permission for purposes such as criticism, comment, news reporting, teaching, scholarship, or research.

Yet two days after the protest letter to the White House, Lionsgate signed a deal with Runway allowing Runway to train its systems on Lionsgate's film content. HarperCollins has also signed a deal with an AI developer for its catalogue of books. This makes sense: the creative industries are about exploiting IP for money, and surely such collaboration is one of the ways forward.

Another Guardian article reports that 35 figures from the worlds of theatre, music and drama have written to the UK government, protesting that it has allowed AI companies to train their models on their copyrighted content and citing the freelance nature of the sector, which leaves creators unprotected.

A government spokesperson, however, said that the current adversarial relationship between developers and creatives is slowing the pace of AI innovation, and that the government has proffered solutions advantageous to both sides.

Recently, OpenAI's CEO, Sam Altman, said the company had created a new writing model that is 'good at creative writing'. Given the ongoing battles with the creative sector, it is understandable that OpenAI wants something that gives it independence in creating, but Altman also acknowledges that without the works of creatives, including copyrighted work, OpenAI would be unable to train these models.

In the meantime, Google and OpenAI are leading the charge for access to data, including copyrighted data, to train their models. They have written to the US government calling for policy changes that favour their use of content without the permission of its owners, avoiding what they describe as "unpredictable, imbalanced, and lengthy negotiations."

Google and other AI developers want national policies that allow them to exploit AI's capabilities and drive innovation, but they do not want to be held responsible for how people use the models, arguing that output depends solely on the user.

The EU's AI laws call for more transparency, but Google and the other AI developers feel this would hand their secrets to competitors, an idea that of course plays into American sensibilities about global dominance.

OpenAI insists on the need for 'freedom to learn'; otherwise, it argues, countries like China will have the same access and American innovation will be left lagging. It is hoping Trump will side with it and agree that learning from all publicly available content is 'fair use'.

It must of course be noted that Trump is quite cosy with AI developers, having received a $1m donation to his inauguration fund from OpenAI's Sam Altman.

Meanwhile, Kate Moss has called AI developers thieves for wanting content without having to pay for it, and the UK fashion industry has also called on the government to protect an industry worth over £81 billion by ensuring creative IP is protected from developers.

Copyright law in the UK has worked well for many years, and developers should adhere to it like everybody else. But of course, AI developers are using the very concept of fair use in copyright law to fight for the right to train their models on content without compensation.

Baroness Kidron, herself a creative, is urging the government from the House of Lords to protect the IP of creatives and not to create laws that favour only the AI developers.

So the conundrum remains: how much freedom and leeway should AI developers have at everyone else's expense, or can a compromise that works for all be found?

The UK AI Action Plan calls for innovation while still protecting creatives, and suggests the EU AI Act as a possible template. The EU AI Act, however, has been accused of falling short and of failing to deal with copyright infringement, and GenAI has gone through several iterations since it was written.

The Act puts the responsibility on the artist to opt out of having their content used. The question then is whether artists simply issue a blanket 'do not use by AI developers' notice, stored or written into law like a patent. Otherwise, how do artists know their content will be used? Should AI companies not simply compensate the owners of the data they use?

This entire saga highlights several things for me: 

1. Instead of all stakeholders sitting down together to negotiate a long-term solution acceptable to all, they are writing to governments individually. This creates a lack of movement and hampers innovation that would benefit all parties. Stakeholders in AI must sit around the table; the EU has called for it, and so has the UK government. Google and OpenAI look like bullies when they need to be at the table, and governments must ensure they get there instead of trying to please them and then turning around to placate the creatives. Google, in writing to the government, decries the focus on risk that has influenced and accompanied policy making. I agree with this to a large extent; if anything, it further confirms the need for a conversation.

 

2. A compensation scheme must be worked out for the long term. These are the creative industries, the industries that have commercialised creativity, and that, it seems, is the model moving forward; AI in the mix should not change that. However, there should be fairness in these decisions, given that most creatives do not make much money from their copyrighted work as it is.

 

3. The agreement between Lionsgate and Runway is an indication that deals can be reached by organisations willing to be fair, and Google and OpenAI, as the biggest developers, must be open to such negotiations.

 

4. The deal between Runway and Lionsgate did not require the approval of actors, writers, directors and so on, because the studios and publishers own the commercial IP, not the creatives, who, as Hye-Kyung Lee states, have been dehumanised for the industry. Thus creatives cannot really protest the use of their IP once it has been sold. The new owners of the IP can use it as they wish; ultimately this is a commercial sector, and money will always lead the conversation.

In conclusion, these debates and arguments should not be played out in the public domain, in the newspapers, in sensationalised headlines or on talk shows. They should be dealt with around the table by all key stakeholders: government, developers, creatives and perhaps even some members of the public. This public bickering leaves the GenAI landscape unprotected, dishevelled, open to misinterpretation and rudderless.

‘Get around the table’ is my call.

 

Reference List

Ashley Belanger, 13 March 2025, OpenAI declares AI race "over" if training on copyrighted works isn't fair use, Ars Technica, https://arstechnica.com/tech-policy/2025/03/openai-urges-trump-either-settle-ai-copyright-debate-or-lose-ai-race-to-china/

Dan Milmo, 18 March 2025, Performing arts leaders issue copyright warning over UK government's AI plans, The Guardian, https://www.theguardian.com/culture/2025/mar/18/performing-arts-leaders-issue-copyright-warning-over-uk-governments-ai-plans

Dan Milmo, 12 March 2025, ChatGPT firm reveals AI model that is 'good at creative writing', The Guardian, https://www.theguardian.com/technology/2025/mar/12/chatgpt-firm-reveals-ai-model-that-is-good-at-creative-writing-sam-altman

Hye-Kyung Lee, 2022, Rethinking creativity: creative industries, AI and everyday creativity, Sage Journals, https://journals.sagepub.com/doi/epub/10.1177/01634437221077009

Katie Kilkenny, 18 March 2025, 400 Hollywood Creatives Push Back on OpenAI and Google's Calls to Train AI on Copyrighted Material, The Hollywood Reporter, https://www.hollywoodreporter.com/business/business-news/hollywood-pushes-back-openai-google-argument-copyright-1236166626/

Paula Gortázar, 11 March 2025, Protecting artists' rights: what responsible AI means for the creative industries, The Conversation, https://theconversation.com/protecting-artists-rights-what-responsible-ai-means-for-the-creative-industries-250842

Ryan Whitwam, 14 March 2025, Google joins OpenAI in pushing feds to codify AI training as fair use, Ars Technica, https://arstechnica.com/google/2025/03/google-agrees-with-openai-that-copyright-has-no-place-in-ai-development/

Mark Sellman, 20 March 2025, AI copyright shake-up is 'wrong approach', say luxury brands, The Times, https://www.thetimes.com/uk/technology-uk/article/ai-copyright-shake-up-is-wrong-approach-say-luxury-brands-qsqhsj6fm

 

 

 
 
 
