Using deepfakes of students to create “personalized” training videos leads to quicker learning, scientists say.
Although deepfake technology has been widely criticized for fooling people, scientists at the University of Bath set out to show that it can have its positive uses.
They concluded that watching a training video starring yourself as a deepfake, as opposed to a clip featuring somebody else, makes learning faster, easier and more fun.
The researchers first looked at applying the technique to personal fitness videos and then to public speaking.
The study, led by the REVEAL research center at the University of Bath, was published in CHI ’23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems.
Deepfake technology has not had a good press. It has been used to spread political misinformation and to maliciously create pornography by superimposing the face of one person on the body of another.
It also poses a threat to professionals in the creative industries.
FakeForward is the name the researchers use for teaching a new skill with deepfakes. Dr. Christopher Clarke from the Department of Computer Science at Bath said: “Deepfakes are used a lot for nefarious purposes, but our research suggests that FakeForward is an unexplored way of applying the technology so it adds value to people’s lives by supporting and improving their performances.”
Study co-author Dr. Christof Lutteroth added: “From this study, it’s clear that deepfake has the potential to be really exciting for people.
“By following a tutorial where they act as their own tutor, people can immediately get better at a task – it’s like magic.”
For the fitness experiment, study participants were asked to watch a training video featuring a deepfake of their own face pasted over the body of a more advanced exerciser.
The researchers chose six exercises: planks, squats, wall squats, sit-ups, squat jumps and press-ups, each targeting a different muscle group and requiring a different type of movement.
For each exercise, study participants first watched a video tutorial where a model demonstrated the exercise, and then had a go at repeating the exercise themselves.
The model was chosen both to resemble the participants and to outperform them, though at a skill level the participants could realistically attain.
The process of watching the video and mimicking the exercise was then repeated with a deepfake instructor, in which the participant’s own face was superimposed on the model’s body.
For both conditions, the researchers measured the number of repetitions or the time participants were able to hold an exercise.
For all exercises, regardless of the order in which the videos were watched, participants performed better after watching the video of “themselves,” compared to watching a video showing someone else.
Dr. Lutteroth said: “Deepfake was a really powerful tool.
“Immediately people could do more press-ups or whatever it was they were being asked to do.
“Most also marked themselves as doing the exercise better than they did with the non-deepfake tutorial and enjoyed it more.”
The other FakeForward study by the same team found that deepfake can significantly boost a person’s skills as a public speaker.
When the face of a proficient public speaker was replaced with a user’s, learning was significantly amplified, with both confidence and perceived competence in public speaking growing after watching the FakeForward video.
Participants felt inspired seeing ‘themselves’ deliver speeches, saying things such as, “it gives me a lot of strength,” “the deepfake video makes me feel that speaking is actually not that scary” and “when I saw myself standing there and speaking, I kind of felt proud of myself.”
To guard against potential misuse, the FakeForward research team has developed an ethical protocol to guide the development of ‘selfie’ deepfake videos.
Dr. Clarke said: “For this technology to be applied ethically, people should only create self-models of themselves, because the underpinning concept is that these are useful for private consumption.”
Dr. Lutteroth added: “Just as deepfake can be used to improve ‘good’ activities, it can also be misused to amplify ‘bad’ activities – for instance, it can teach people to be more racist, more sexist and ruder.
“For example, watching a video of what appears to be you saying terrible things can influence you more than watching someone else saying these things.”
“Clearly, we need to ensure users don’t learn negative or harmful behaviors from FakeForward. The obstacles are considerable but not insurmountable.”
Produced in association with SWNS Talker