About Boardroom

Boardroom is a sports, media, and entertainment brand co-founded by Kevin Durant and Rich Kleiman. Its flagship media arm features premium video and audio, editorial, and daily and weekly newsletters showcasing how athletes, executives, musicians, and creators are moving the business world forward. Boardroom’s ecosystem encompasses B2B events and experiences (such as its renowned NBA and WNBA All-Star events) as well as ticketed conferences such as Game Plan, in partnership with CNBC. Its advisory arm consults for and connects athletes, brands, and executives with the broader Boardroom network and initiatives.

Recent film and TV projects also under the Boardroom umbrella include the Academy Award-winning Two Distant Strangers (Netflix), the critically acclaimed scripted series SWAGGER (Apple TV+) and Emmy-nominated documentary NYC Point Gods (Showtime).

Boardroom’s sister company, Boardroom Sports Holdings, features investments in emerging sports teams and leagues, including the Major League Pickleball team, the Brooklyn Aces, NWSL champions Gotham FC, and MLS’ Philadelphia Union.


Who Owns the Work Created by Artificial Intelligence?

Last Updated: January 11, 2024

The disruptive nature of technology is as clichéd a saying as “move fast and break things.” But there is a kernel of truth in every platitude, and it is undeniably true that technology breaks things. More accurately, it breaks our way of thinking about the world as we know it, because it creates new questions and new issues that none of us had ever anticipated.

One such question that many are asking themselves about artificial intelligence or “AI” — in addition to: “How can they do that?” — is, “Are they allowed to do that?”

The latter question has largely been asked about generative AI models and applications that can take a prompt and create “original” poems, essays, books, images, and even songs. Over the past several months, the internet has lost its mind and gotten duped: AI-generated pictures depicted Donald Trump’s arrest, songs sounded like they were written and recorded by Drake, and just this week, an AI-generated image of an explosion at the Pentagon prompted wild swings in the financial markets.

In a relatively short period of time, the world has borne witness to the power and perils of this new technology.


One issue that has yet to be resolved is who actually owns the outputs created by an AI tool like OpenAI’s ChatGPT. From a legal standpoint, the majority of the outputs most of us are interested in or concerned with are governed by copyright, the body of law that protects original works of authorship or expression: songs, movies, books, photographs, and the like. “Copyright” is a very literal name because, historically, the way most people or artists made money off their copyrighted works was by creating and selling numerous copies of the original.

Think back to the days when people actually bought records or DVDs; copyright gives the creator or the author the exclusive right to reproduce (copy) and commercialize their work(s). A creator or an artist automatically receives copyright protection over their work the minute they create it and record it, whether that’s actually writing something down, saving it to a computer, or recording it via a tape recorder.

In terms of monetization, copyrighted works are usually monetized through licensing by giving a limited right to a third party to make use of the copyrighted work in exchange for a sum of money or a share of future revenue streams or royalties.

Take Taylor Swift, for example. When she puts a new album on Spotify, every stream of one of her songs generates licensing revenue that Spotify pays back to her in exchange for the right to carry her music on its platform. The copyrighted work has value that is monetized by working out deals with third parties, allowing them to reap some of that value for a fee.

But what happens when a machine created by one set of humans takes an input created by an unaffiliated human and creates a wholly original and potentially valuable output?

As is the answer to most legal questions: It depends.

Legislatures, regulatory bodies, judges, artists, and businesses are all grappling with this issue at the same time. The US Copyright Office, which oversees registration of copyrighted works at the federal level, issued guidance in March 2023 suggesting that if a machine like a generative AI tool produces an output in which all of the creative elements typically associated with a copyrighted work are created by the machine, that output is not subject to copyright protection.

Essentially, no one can claim ownership of it.


The example the Copyright Office provides: if someone prompts a text-generating tool like ChatGPT to write a poem in the style of William Shakespeare and the machine comes up with the words, the structure, and the rhyme scheme (what the Office describes as the expressive elements), the poem is not protected by copyright, because copyright protection is afforded only to works of human authorship or expression.

On the other hand, the Copyright Office did acknowledge that some works involving generative AI may be protected by copyright and owned by the work’s creator. Say, for example, a digital artist uses an AI tool to produce an image of a pug, then modifies that image in Photoshop to make the pug look like a gangster with the words “Pug Life” emblazoned across it. That is something that would be protected by copyright and capable of being owned by the artist.

The rule of thumb, as it stands right now: the more of a work’s expressive components are created by humans, the more likely it is that someone can own that work. In turn, the more of its expressive components are created entirely by a machine, the less likely it is that someone can assert ownership of the work. However, as the technology evolves, the line between where the machine is expressing itself and where the human is expressing themselves is likely to become even more blurred.

Even if an individual could claim ownership over something created in part by generative AI, it doesn’t mean that they can monetize it without repercussions.

Let’s circle back to the AI-generated Drake song that went viral over the past month. The vocalist on the song sounds exactly like Drake and the beat is reminiscent of a Drake track, but just because a machine can reproduce someone’s voice does not mean it can profit from it. In most states, individuals have a right known as the “right of publicity” that precludes others from monetizing a person’s name, image, and likeness without their consent or a license.

“Sound-alikes” are not a new phenomenon; people have been imitating celebrities for decades to profit off their physical or auditory resemblance to their unwitting doppelgangers. In the 1980s, Ford used a Bette Midler sound-alike in a commercial. Midler sued Ford and its ad agency, ultimately scoring a pivotal victory for right-of-publicity law in this country, as the court effectively ruled that an individual’s exclusive right to profit off their name, image, and likeness extends to the sound of their voice.

While an individual may be able to own the lyrics created by the “Fake Drake” machine, anyone who tried to monetize the actual sound recording would be sued so fast their head would spin. The same is true of AI-generated images or videos: you may be able to own the rights to the image, but if you try to monetize it, you had better have a good lawyer.

It is almost always the case that technology outpaces the law, and it will be years before the law gets a handle on the issues presented by generative AI.

In the meantime, it’s probably best to tread lightly.


Daniel Marcus

Daniel Marcus is a Columnist for Boardroom. When he's not entertaining the masses with his literary stylings, he is a lawyer who runs his own practice, where he represents prominent clients in sports, tech, entertainment, and crypto. Daniel is also a well-traveled entrepreneur who has started a number of companies in sports, including a ticketing company as well as a production company called Relentless (he is the one to credit, or to blame, for developing and selling Pete Rose's gambling podcast). In another life, Daniel teaches a number of classes, including Sports Law and the Business of Esports, at his alma mater, New York University. He is a beleaguered Jets fan who hopes to (once again) see a home playoff game in his lifetime.
