
Deepfake videos prompt false memories of films in half of participants

The authors say we don't need technical advances to distort memory; we can do it very easily and effectively using non-technical means. Credit: Anemone123, Pixabay, CC0 (creativecommons.org/publicdomain/zero/1.0/)

In a new study, deepfake video clips of movie remakes that do not actually exist prompted participants to falsely remember the films, but simple text descriptions of the fake movies prompted similar false memory rates. Gillian Murphy of University College Cork, Ireland, and Lero, the Science Foundation Ireland Research Centre for Software, and colleagues presented these findings in the open-access journal PLOS ONE.

Deepfake videos are clips that have been altered using artificial intelligence (AI) technology to swap out one person's voice or face with that of a different person. Tools for making deepfakes have recently become cheaper and more accessible, amplifying conversations about potential creative applications as well as potential risks, such as spreading misinformation and manipulating viewers' memories.

To explore potential risks and benefits, Murphy and colleagues invited 436 people to complete an online survey that included watching deepfake videos of fake movie remakes starring different actors, such as Will Smith as the character Neo (originally played by Keanu Reeves) in The Matrix, and Brad Pitt and Angelina Jolie in The Shining. Other fake remakes in the study were Indiana Jones and Captain Marvel.

For comparison, participants also saw clips of real remakes, including Charlie & The Chocolate Factory, Total Recall, Carrie, and Tomb Raider. In addition, in some cases participants read text descriptions of the remakes instead of watching the deepfakes. Participants were not told that the deepfakes were fake until later in the survey.

In line with prior research, the deepfake videos and descriptions prompted false memories of the fake remakes, with an average of 49 percent of participants believing each fake remake was real. Many participants reported remembering the fake remakes as being better than the originals. However, false memory rates from the text descriptions were similarly high, suggesting that deepfake technology may not be more powerful than other tools at distorting memory.

Most participants reported being uncomfortable with deepfake technology being used to recast films, citing concerns including disrespect for artistic integrity and disruption of shared social experiences of movies.

These findings could help inform the future design and regulation of deepfake technology in films.

The authors add, “While deepfakes are of great concern for many reasons, such as non-consensual pornography and bullying, the current study suggests they are not uniquely powerful at distorting our memories of the past. Though deepfakes caused people to form false memories at quite high rates in this study, we achieved the same effects using simple text. In essence, this study shows we don’t need technical advances to distort memory, we can do it very easily and effectively using non-technical means.”

More information:
Gillian Murphy et al, Face/Off: Changing the face of movies with deepfakes, PLOS ONE (2023). DOI: 10.1371/journal.pone.0287503

Citation:
Deepfake videos prompt false memories of films in half of participants (2023, July 13)
retrieved 22 November 2023
from https://techxplore.com/news/2023-07-deepfake-videos-prompt-false-memories.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without written permission. The content is provided for information purposes only.


