
Hollywood agency CAA aims to help stars manage their own likenesses with AI

Creative Artists Agency (CAA), a leading entertainment and sports agency, hopes to be at the forefront of AI protection services for Hollywood celebrities.

With many stars seeing their digital likenesses used without permission, CAA has created a virtual media storage system for A-list talent (actors, athletes, comedians, directors, musicians, etc.) to store their digital assets, such as their names, images, digital scans and voice recordings. The new offering is part of theCAAvault, the company’s studio where actors record their bodies, faces, movements and voices using scanning technology to create AI clones.

CAA has partnered with AI technology company Veritone to provide its digital asset management solution, the company announced earlier this week.

The announcement comes amid a wave of AI deepfakes of celebrities, which are often created without their consent. Tom Hanks, a famous actor and CAA client, fell victim to an AI scam seven months ago. He claimed a company used an AI-generated video of him to promote a dental plan without authorization.

“Over the past several years, there has been misuse of our clients’ names, images, likenesses and voices without consent, without credit, without proper compensation. It’s very clear that the law is not currently designed to be able to protect them, and so we see a lot of lawsuits going on right now,” said Alexandra Shannon, CAA’s head of strategic development.

A significant amount of personal data is required to create digital clones, which raises many privacy concerns due to the risk of compromise or misuse of sensitive information. CAA clients can now store their AI digital duplicates and other assets in a secure personal hub in theCAAvault, accessible only to authorized users, allowing them to share and monetize their content as they see fit.

“This provides an opportunity to start setting precedents around the use of consent-based AI,” Shannon told TechCrunch. “Frankly, our view is that the law is going to take time to catch up, and so, by having talent create and own their digital likeness with [theCAAvault]… there is now a legitimate way for companies to work with one of our clients. If a third party chooses not to work with them in the right way, it makes it much easier for lawsuits to demonstrate that there was a violation of their rights and helps protect clients over time.”

Notably, the vault also ensures that actors and other talent are legitimately compensated when companies use their digital likenesses.

“All of these assets are owned by the individual client, so it’s largely up to them to decide whether they want to grant access to anyone else… It’s also up to the talent to decide on the appropriate business model for the opportunities. It’s a new space, and it’s still taking shape. We believe the value and opportunities of these assets will increase over time. It shouldn’t be a cheaper way to work with someone… We view [AI clones] as an enhancement rather than a cost saving,” Shannon added.

CAA also represents Ariana Grande, Beyoncé, Reese Witherspoon, Steven Spielberg and Zendaya, among others.

The use of AI cloning has sparked much debate in Hollywood, with some saying it could lead to fewer job opportunities, as studios may prefer digital clones to real actors. This was a major point of contention during the 2023 SAG-AFTRA strikes, which ended in November after members approved a new agreement with AMPTP (Alliance of Motion Picture and Television Producers) that recognized the importance of human performers and included guidelines on how “digital” replicas should be used.

There are also concerns about the unauthorized use of AI clones of deceased celebrities, which can upset family members. For example, Robin Williams’ daughter expressed disdain for an AI-generated voice recording of the star. However, some argue that, when done ethically, it can be a sentimental way to preserve an iconic actor and recreate their performances in future projects for all generations to enjoy.

“AI clones are an effective tool that allows a legacy to continue into future generations. CAA takes a consent- and authorization-based approach to all AI applications and will only work with estates that own and have permissions for the use of these likeness assets. It is up to the artists to decide to whom they wish to grant ownership and permission for use after their death,” Shannon noted.

Shannon declined to say which of CAA’s clients are currently storing their AI clones in the vault, but she said it’s just a few at the moment. CAA also charges clients a fee to participate in the vault, but has not said exactly how much it costs.

“The ultimate goal will be to make this accessible to all of our clients and everyone in the industry. It’s not cheap, but over time the costs will continue to come down,” she added.

Source: TechCrunch
