TBI Weekly: Why we need to talk about AI
The current US writers’ strike is as much about artificial intelligence (AI) as it is about pay and other conditions. The Writers’ Guild of America (WGA) is concerned about AI being used to write scripts for film and TV, especially taking into account the high output required by streaming services.
Earlier this year, the WGA indicated that it might not object to some use of AI, provided the AI was no more than a support tool for writers, the AI received no credit, and writers’ income was unaffected.
If a script is based primarily on a novel, screenplay or newspaper article, the written work is derived from source material; if it is based on a writer’s original idea, the story or screenplay is regarded as literary material. In practice, the writer receives a lower fee, usually around 25% less, when the work is not based on their own idea.
However, if an AI system writes a script using data sourced from the internet, can that be classed as an original literary work? And how would credits be assigned? AI works with existing data; its strength lies in identifying that data and piecing it together. It is not a creator in the sense that a writer or artist is. AI-generated material is not eligible for copyright protection (certainly in the US), nor can an AI provide a certificate of authorship.
The WGA wants to protect writers against having to rewrite or adapt material that has been generated by AI and carries no copyright, and to prevent film and TV companies from asserting that the AI is responsible for the literary material.
Another issue concerns a writer using an AI tool while retaining substantial human involvement in the creative process; such work is likely to be protectable by copyright. It remains unclear, however, where the line of protectability lies between purely AI-generated material and work involving significant human input.
The fear for writers is that AI could replace them in creating scripts, especially for productions with high output and relatively low budgets. Seasoned, well-known writers may be ‘safe’ and continue to be commissioned for premium, higher-budget productions, but less experienced or established writers look vulnerable.
The question is whether writers and AI can co-exist. AI cannot replicate the sensibilities, nuances and emotions of a human. The WGA dispute revolves around how to ensure writing remains a human-centred activity while exploiting AI where appropriate; one example would be the equivalent of the virtual assistants used in gaming, which use natural language processing to understand player requests and provide information or guidance during gameplay.
Will the issues arising from the WGA strike prompt any changes to copyright law in jurisdictions such as the US and UK? Earlier this year, the UK Law Commission (UKLC) completed its consultation on possible reform of copyright law in the context of AI, and it is hoped its recommendations will be published later this year.
The issues the UKLC is considering include whether computer-generated works without a human author should be copyright protected at all; currently, such works are protected in the UK for 50 years. Another question is how laws on licensing, or exceptions to copyright, should be modified for text and data mining, which is often significant in the use and development of AI. Getty Images, for instance, is suing an AI company for alleged breach of copyright over the use of its photographic images to ‘teach’ AI systems to create photographs.
The discussion about AI and its regulation has only just started. Be prepared for a long-running soap opera rather than a one-off commission as the story unfolds.
Julian Wilkins is a consultant solicitor and notary public with Eldwick Law, and a founding member of mediation and arbitration practice Q Chambers.