Generative AI and Copyright | The Foundation for American Innovation
The ways in which existing laws are interpreted or reformed, and whether generative AI is appropriately treated as the tool it is, will have real consequences for the future of creative expression. To mitigate these concerns, some scholars propose new regulations to protect and compensate artists whose work is used for training. These proposals include a right for artists to opt out of having their data used for generative AI, or a mechanism that automatically compensates artists when their work is used to train an AI. Unlike inanimate cameras, AI possesses capabilities, such as the ability to convert basic instructions into impressive artistic works, that make it prone to anthropomorphization. Even the term “artificial intelligence” encourages people to think that these systems have humanlike intent or even self-awareness. Beyond the difficulty of determining which copyrighted works went into generating a specific piece of AI content, a further struggle is determining how much human involvement is enough for the person using the tool to own the resulting work.
For instance, the recent Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) strike highlighted the concern many have that AI could write screenplays and create characters without any human involvement, resulting in significant job displacement. The concern extends to copyright protection: scripts generated with tools like ChatGPT and Midjourney could incorporate elements from existing filmmakers’ works, leading to infringement issues. Addressing these policy challenges through copyright policy and law is crucial to encouraging innovation and fostering creativity in the United States. Another issue is that, in jurisdictions such as the US, output generated solely by a machine is ineligible for copyright protection, as most jurisdictions protect only “original” works having a human author. However, some have argued that the operator of an AI may qualify for copyright if they exercise sufficient originality in their use of an AI model. I will not discuss here the important question of whether it makes sense to have special rules for copyrighted material as compared to other materials in the training data, although this is certainly a discussion that must be had.
Baio has dubbed this practice “AI data laundering.” He notes that this method has been used before in the creation of facial recognition software, and points to the case of MegaFace, a dataset compiled by researchers at the University of Washington by scraping photos from Flickr. “The academic researchers took the data, laundered it, and it was used by commercial companies,” says Baio. Now, he says, this data, including millions of personal pictures, is in the hands of “[facial recognition firm] Clearview AI and law enforcement and the Chinese government.” Such a tried-and-tested laundering process will likely help shield the creators of generative AI models from liability as well. In the U.S., much of this question will be settled by the courts, where several creators and companies are duking it out right now.
- Such proposals would also undermine competition in the AI marketplace by imposing significant financial and logistical burdens that new entrants may not be able to bear.
- It is not yet clear whether a generative AI system may use input data that is protected by copyright law.
- Because creative labor markets are already heavily concentrated, dominant companies with significant bargaining power will be able to impose contractual terms that require artists to sign away their “training rights” for reduced compensation.
- And in enterprise applications that leverage premium licensed content (many market research and competitive intelligence systems, for example), copyright issues are less relevant: the content license controls what is and is not permitted, and it supersedes any general question of copyright compliance.
Generative AI tools are trained on collections of material gathered from many places. Some AI image and text generation tools have been trained on material scraped from web pages without the consent or knowledge of the web page owners. At its extreme, the fair use minimalist view considers all Output Works to be unoriginal and therefore derivative of Input Works, even if, because of the intervention of GAI, no Output Work is an exact copy of any single Input Work. The question of whether a machine can legally do the same thing, Leong says, is one that the courts will now have to grapple with.
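To make the web-scraping step concrete, here is a minimal sketch of how visible text from public pages can be collected into a training corpus. It is an illustration only, not any vendor's actual pipeline; the seed URL and output file name are placeholders, and, tellingly, nothing in it asks the page owner for consent.

```python
# Minimal illustration (standard library only) of scraping page text into a
# training corpus. The seed URL and output path are placeholders, not a real
# pipeline; note that nothing here checks for the page owner's consent.
from html.parser import HTMLParser
from urllib.request import urlopen


class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())


def scrape_page(url: str) -> str:
    """Download one page and return its visible text."""
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)


if __name__ == "__main__":
    seed_urls = ["https://example.com/"]  # placeholder page list
    corpus = [scrape_page(u) for u in seed_urls]
    with open("corpus.txt", "w", encoding="utf-8") as f:
        f.write("\n\n".join(corpus))
```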
In this case, whoever prompted the AI to produce the output appears to be the default owner. Frequently, artists who use generative AI tools go through many rounds of revision to refine their prompts, which suggests a degree of originality. This is not the first time legal systems have faced questions about how to define ownership of a creation where a new tool serves as the medium of its making. In a recent paper published in Science, researchers discussed the copyright challenges raised by AI-generated work.
The Israeli Ministry of Justice issued an opinion that its fair use provision, modeled on the U.S. fair use doctrine, permits the training of AI systems without compensation. The EU recently adopted a directive that established two exceptions for text and data mining (TDM): one covers TDM for scientific research, and the other covers TDM for all other uses, which is permitted subject to an express opt-out by the copyright owner. In other words, unless a copyright owner expressly prohibits the ingestion of her works, the AI system may ingest them. The opt-out must occur in an appropriate manner, such as machine-readable means in the case of material that is made publicly available online. In short, in no jurisdiction are artists compensated for ingestion unless they exercise affirmative means of preventing the ingestion.
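The directive does not prescribe a particular opt-out mechanism, so the sketch below is only one plausible reading of “machine-readable means”: before ingesting a page, a crawler could check the site's robots.txt for its user agent and look for a TDM Reservation Protocol style meta tag. The "GPTBot" user agent, the tdm-reservation tag name, and the example URL are illustrative assumptions, not requirements of the directive.

```python
# Hedged sketch of honoring machine-readable opt-outs before ingesting a page.
# "GPTBot" and the tdm-reservation meta tag are example signals, not mandated
# by the EU directive; the URL is a placeholder.
import re
from urllib.parse import urlparse
from urllib.request import urlopen
from urllib.robotparser import RobotFileParser


def robots_allows(url: str, user_agent: str = "GPTBot") -> bool:
    """Check the site's robots.txt for the given crawler user agent."""
    parts = urlparse(url)
    rp = RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    return rp.can_fetch(user_agent, url)


def tdm_reserved(html: str) -> bool:
    """Look for a meta tag in the style of the TDM Reservation Protocol,
    e.g. <meta name="tdm-reservation" content="1">."""
    pattern = r'<meta[^>]+name=["\']tdm-reservation["\'][^>]+content=["\']1["\']'
    return re.search(pattern, html, flags=re.IGNORECASE) is not None


def may_ingest(url: str) -> bool:
    """Ingest only if robots.txt allows the crawler and no TDM reservation is found."""
    if not robots_allows(url):
        return False
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    return not tdm_reserved(html)


if __name__ == "__main__":
    print(may_ingest("https://example.com/"))  # placeholder URL
```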
Some creators and creator communities in these areas have called for “consent, credit, and compensation” when their works are included in training data. The obstacle to that position is that, if the use of training data is a fair use, none of this is required, at least not by copyright. Fair use maximalists contend that each Output Work is entirely distinct from any Input Work and relies very little on any single Input Work. Instead, fair use maximalists view AI models as tools, like a pencil or computer software, used to create works.
US Court Rejects Copyright Application for “Creativity Machine” AI, Lexology, 15 Sep 2023.
Indeed, training datasets for generative AI are so vast that there’s a good chance you’re already in one (there’s even a website where you can check by uploading a picture or searching some text). Giorgio Franceschelli, a computer scientist who has written on the problems surrounding AI copyright, says measuring human input will be “especially true” for deciding cases in the EU. And in the UK, the other major jurisdiction of concern for Western AI startups, the law is different yet again. Since then, the Copyright Office has released a more sweeping policy change to address all AI-human creative collaborations moving forward, a response to what it sees as new trends in registration activity.
Turning the filter on is a relatively straightforward process that takes just a few steps; visit GitHub’s docs for step-by-step instructions. Downing adds that, to the extent your company gets an IP indemnity from GitHub, GitHub will only honor it if you have all the filters enabled. For one, generative AI tools like GitHub Copilot and ChatGPT have hit the mainstream and are now regularly used by developers. A recent U.S. Copyright Office (USCO) decision provided some clarity on whether generative AI output is actually copyrightable, while at the same time creating practical challenges for software developers. Adobe Stock noted it wasn’t able to weigh in on the ownership rights of AI-generated content. Going forward, photographers may need to include clauses in their contracts to cover the license to use photos to train generative AI.
Artificial Intelligence and Copyright — AI: The Washington Report, Lexology, 17 Aug 2023.
Getty Images has placed an all-out ban on AI-generated content, citing its potential legal risk. Shutterstock, another stock imagery site that was “critical” to the training of OpenAI’s DALL-E, according to CEO Sam Altman, has gone so far as to pay content creators when their work is used in the development of generative AI models. The court is now trying to determine whether Warhol’s work was transformative enough to be considered a new piece of art, separate from the original Prince portrait and not in direct competition with it.
Generative AI Has an Intellectual Property Problem
In recent guidance from the U.S. Copyright Office, the agency attempts to clarify its stance on AI-generated works and their eligibility for copyright protection. However, several experts have pointed to previous fair use cases to justify a fair use argument for the use of various training data for AI image generation tools.
If the court finds that the Warhol piece is not a fair use, it could mean trouble for AI-generated works. In each of these cases, the legal system is being asked to clarify the bounds of what is a “derivative work” under intellectual property laws, and, depending upon the jurisdiction, different federal circuit courts may respond with different interpretations. If a court finds that the AI’s works are unauthorized and derivative, substantial infringement penalties can apply. Lommel acknowledges that “theoretically, the output is a derivative work.” But, he says, “it is derivative of many, many works, all of which contribute in an infinitesimal degree to the output. This is not to say that a clever legal strategy might not succeed in finding some infringing use, but I believe the risk to be quite low.” For written content creators, there are tools such as Grammarly’s Plagiarism Checker or Turnitin (used by teachers) that can be used to identify plagiarism. There are also tools such as OpenAI’s AI Text Classifier, which lets users paste in text and analyze the likelihood that it was created by a human or by AI.
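As a rough illustration of what such overlap checks do at their simplest, the snippet below measures how many word n-grams a candidate text shares with a reference text. It is a toy sketch under stated assumptions, not how Grammarly, Turnitin, or OpenAI’s classifier actually work; the sample strings and the n-gram length are arbitrary.

```python
# Toy overlap check: the fraction of a candidate's word n-grams that also
# appear in a reference text. Real plagiarism and AI-detection tools use far
# more sophisticated matching and trained classifiers.
def ngrams(text: str, n: int = 5) -> set:
    """Return the set of lowercase word n-grams in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}


def overlap_ratio(candidate: str, reference: str, n: int = 5) -> float:
    """Fraction of the candidate's n-grams that also occur in the reference."""
    cand = ngrams(candidate, n)
    if not cand:
        return 0.0
    return len(cand & ngrams(reference, n)) / len(cand)


if __name__ == "__main__":
    original = "the quick brown fox jumps over the lazy dog near the riverbank"
    suspect = "a quick brown fox jumps over the lazy dog near a quiet riverbank"
    print(f"overlap: {overlap_ratio(suspect, original, n=4):.2f}")  # high overlap
```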
Unsurprisingly, this is the approach pushed by GAI providers, including OpenAI in a comment to the USPTO. Thaler’s case is one of the first in a coming wave of litigation surrounding artificial intelligence and the work it’s used to create. The U.S. Copyright Office has launched an AI initiative to examine laws and policies regarding copyright and artificial intelligence. Artists and writers have spoken out against the technology, saying its use amounts to plagiarism and copyright infringement, but they have faced an uphill battle in proving that point in court. According to a report by the Hollywood Reporter, Judge Beryl Howell handed down the decision in the case of Stephen Thaler, the CEO of neural network firm Imagination Engines, who in 2018 attempted to register art created by an AI with the federal agency.
These tools include plagiarism detection software, optical character and speech recognition, and search engines for websites and books. Most copyright experts believe that the fair use analysis for generative AI is the same as it is for these other AI tools. Art created by artificial intelligence does not receive copyright protection, a federal judge ruled late last week, upholding an earlier decision. We owe these value-driving innovations to the broad and flexible framework that Congress wisely created when fashioning our copyright regime. In essence, that framework ensures a fair reward to creators while also enabling technological innovation and follow-on creativity. That dynamic structure is the reason that the United States is not only the most successful creative economy in history, but also the primary source of the technological innovation that has driven the global economy for over half a century.