ChatGPT’s use of a soundalike Scarlett Johansson reflects a troubling history of gender-stereotyping in technology

Credit: CC0 Public Domain

Actress Scarlett Johansson released a statement this week expressing anger and concern that OpenAI used a voice "eerily similar" to her own as a default voice for ChatGPT.

The voice in question, known as Sky, has been available to users since September 2023, but the resemblance to Johansson's voice became clearer last week when OpenAI demoed an updated model called GPT-4o. Johansson claims that OpenAI's CEO Sam Altman had previously asked her to provide her voice for ChatGPT, and that she declined the invitation.

The warm and playful tone of Sky's voice bears a striking resemblance to that of the virtual companion Samantha in the film Her (2013), voiced by Johansson.

CNN reports on Scarlett Johansson's response to the similarity between her voice and the one OpenAI used for its latest version of ChatGPT.

Though Altman has since claimed that Sky's voice was never meant to resemble Johansson's, he appeared to allude to the connection by tweeting the single word "her" on May 13, 2024, the day GPT-4o launched.

OpenAI has since described its process for creating Sky's voice in a blog post, stating that the voice was provided by "a different professional actress using her own natural speaking voice." Nonetheless, as ever smaller audio samples can be used to generate synthetic voices, cloning a person's voice without their consent is easier than ever.

As a sound studies scholar, I am interested in the ways AI technology is introducing new questions and concerns about voice and identity. My research situates recent developments, anxieties and aspirations about AI within longer histories of voice and technology.

Stolen voices

This isn't the first time a performer has objected to an unlicensed simulation of their voice.

In 1988, Bette Midler pursued legal action against Ford Motor Company for using a voice resembling hers in a series of advertisements. The U.S. Court of Appeals for the Ninth Circuit ultimately ruled in her favor, with Circuit Judge John T. Noonan writing in his decision that "to impersonate her voice is to pirate her identity."

Tom Waits brought a similar, and successful, lawsuit against Frito-Lay after hearing what sounded like his own gravelly voice in a radio commercial for Doritos. As musicologist Mark C. Samples describes, this case "elevat[ed] a person's vocal timbre to the level of his or her visual representation" in the eyes of the law.

Legislators have only just begun to address the challenges and risks that accompany the increased adoption of AI.

For example, a recent ruling by the Federal Communications Commission banned robocalls that use AI-generated voices. In the absence of more specific policy and legal frameworks, these cases of voice mimicry continue to serve as important precedents.

Chatbots and gender

OpenAI's apparent reference to the film Her in the design of Sky's voice also situates ChatGPT within a long-standing tradition of assigning feminine voices and personas to computers.

The first chatbot was built in 1966 by MIT professor Joseph Weizenbaum. Called ELIZA, it was designed to converse with its users in the manner of a psychotherapist. ELIZA was an influence on, and reference point for, today's digital assistants, which often have feminized voices as their default setting. When it first launched in 2011, Siri told stories about ELIZA as if it were a friend.

Many technoscience scholars, including Thao Phan and Heather Woods, have criticized the way tech companies appeal to gender stereotypes in the design of voice assistants.

Communication scholars Jessa Lingel and Kate Crawford suggest that voice assistants invoke the historically feminized role of the secretary, as they take on both administrative and emotional labor. By referencing this submissive trope, they argue, tech companies seek to distract users from the surveillance and data extraction that voice assistants carry out.

OpenAI says that when casting ChatGPT's voices, it sought "an approachable voice that inspires trust." It's telling that the voice the company selected to make users feel comfortable with rapid advances in AI technology sounds like a woman. Even as the conversational abilities of voice assistants become far more advanced, Sky's voice demonstrates that the tech industry has yet to move on from these regressive tropes.

Protecting our voices

Johansson's statement ends with a call for "transparency and the passage of appropriate legislation" to protect vocal likeness and identity. Indeed, it will be interesting to see what legal and policy ramifications may follow from this high-profile case of unauthorized voice simulation.

However, celebrities are not the only ones who should be concerned about how their voices are being used by AI systems. Our voices are already being recorded and used to train AI by platforms like Zoom and Otter.ai, and employed in the training of virtual assistants like Alexa.

The illicit AI impersonation of Johansson's voice may seem like a story from a dystopian future, but it is best understood within the context of ongoing debates about voice, gender and privacy. It is a sign not of what is to come, but of what already exists.

Provided by The Conversation


This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation:
ChatGPT's use of a soundalike Scarlett Johansson reflects a troubling history of gender-stereotyping in technology (2024, May 26)
retrieved 26 May 2024
from https://techxplore.com/news/2024-05-chatgpt-soundalike-scarlett-johansson-history.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.
