AI developers say it’s not their fault that their machine learning programs produce copyrighted material, even though they are the ones who trained their systems on authors’ copyrighted works. Instead, they want users to take legal responsibility for the material generated by their systems.
The US Copyright Office is considering new regulations regarding generative AI and in August issued a request for comment on artificial intelligence and copyright. Responses to the request are publicly available.
Among the responses are submissions from companies including Google, Dall-E developer OpenAI and Microsoft, which argue that only the unlicensed reproduction of copyrighted material violates existing protections. They say AI software is like audio or video recording devices, photocopiers or cameras, all of which can be used to infringe copyrights. The manufacturers of those products are not held responsible when that happens, so why, the thinking goes, should AI companies be?
Microsoft, which has a multi-billion dollar partnership with OpenAI, wrote:
(U)sers must take responsibility for using the tools responsibly and as designed. …To address the concerns of rights holders, AI developers have taken steps to mitigate the risk of AI tools being misused for copyright infringement. Microsoft builds many such measures and safeguards into our AI tools to mitigate potentially harmful uses. These measures include meta-prompts and classifiers, controls that add additional instructions to a user’s prompt to limit harmful or infringing output.
It’s worth noting that the safeguards Microsoft supposedly has in place have done little to prevent massive trademark and copyright violations. In fact, The Walt Disney Company recently asked the tech giant to prevent users from infringing on its trademarks.
Google, for its part, argued:
The possibility that a generative AI system could, through “prompt engineering,” be made to replicate the contents of its training data raises questions about the appropriate boundary between direct and secondary infringement. When a user induces an AI system to produce an infringing output, any resulting liability should fall on the user as the party whose willful behavior directly caused the infringement. …A rule that held AI developers directly (and strictly) liable for any infringing results created by users would impose overwhelming liability on AI developers, even if they have taken reasonable steps to prevent infringing activity by users. If this standard had applied in the past, we would not have legal access to photocopiers, personal audio and video recording devices, or personal computers, all of which are capable of being used for infringement as well as for substantial beneficial purposes.
And OpenAI wrote:
When evaluating infringement claims relating to outputs, the analysis begins with the user. After all, there is no output without a user prompt, and the nature of the output is directly influenced by what was requested.
It is worth pointing out that all of the above companies have used copyrighted and trademarked material without permission to train their software, and OpenAI is currently being sued by more than a dozen major authors who accuse the company of violating their copyrights.
And to further muddy the waters, although these companies have told the US government that users should be responsible for the output of their systems, many of them, including Google, OpenAI, Microsoft and Amazon, are offering to cover the legal costs of customers who are sued for copyright infringement.
But, ultimately, the companies argue that current copyright law is on their side and that there is no need for the Copyright Office to change it, at least not yet. They say that if the office cracks down on developers and changes copyright law, it could cripple the nascent technology. In its letter, OpenAI said it “urges the Copyright Office to proceed with caution in calling for new legislative solutions that may prove, in hindsight, premature or misguided as technology evolves quickly.”
It’s perhaps surprising that the big movie studios are on the side of big tech here, even if they approach the issue from a different angle. In its submission to the Copyright Office, the Motion Picture Association (MPA) distinguished between generative AI and the use of artificial intelligence in the film industry, in which “AI is a tool that supports, but does not replace, the human creation” of members’ works. The MPA also opposed updating current copyright law:
MPA members have a uniquely balanced perspective regarding the interaction between AI and copyright. Members’ copyrighted content is extremely popular and valuable. Strong copyright protection forms the backbone of their industry. At the same time, MPA members are strongly interested in developing creator-focused tools, including AI technologies, to support the creation of world-class content. AI, like other tools, supports and enhances creativity and attracts audiences to the stories and experiences that characterize the entertainment industry. The MPA’s overall view, based on the current state, is that while AI technologies raise a host of new questions, these questions involve well-established copyright doctrines and principles. At this time, there is no reason to conclude that these existing doctrines and principles will be inadequate to provide courts and the Copyright Office with the tools they need to address AI-related questions as they arise.
Although the MPA believes that existing copyright law is sufficient, it strongly opposes the idea that AI companies should be able to freely train their systems on its members’ material. In its letter, the MPA writes:
The MPA currently believes that existing copyright law should be up to the task of answering these questions. A copyright owner who establishes infringement should be able to avail itself of the existing remedies available in §§ 502 through 505, including monetary damages and injunctive relief. …At present, there is no reason to believe that copyright owners and companies engaged in training generative AI models and systems cannot enter into voluntary licensing agreements, such that government intervention would be necessary.