Bourdain film illustrates ethical issues with voice cloning, media manipulation
Anthony Bourdain speaks on stage during the DC Central Kitchen’s Capital Food Fight event at Ronald Reagan Building on Nov. 11, 2014 in Washington, DC. (Photo by Larry French/Getty Images for DC Central Kitchen)
In a new documentary about the late celebrity chef Anthony Bourdain, viewers hear him discussing his life shortly before his suicide: “You are successful, and I am successful, and I’m wondering: Are you happy?”
Questions arose about Bourdain’s voice. He wrote those words in an email, and viewers wondered how the filmmaker had obtained an audio recording of them.
As it happened, director Morgan Neville used “deep fake” technology in his film, “Roadrunner.” He resorted to voice cloning because he could not find adequate and authentic audio for the story he wanted to tell.
In journalism, we used to call this lazy.
When movie critics learned about this, they panned Neville’s use of deep fakes, a technology currently being deployed to deceive viewers on social media, especially in political ads.
Neville claimed he was not manipulating the audience. “We can have a documentary ethics panel about it later,” he quipped.
We can do so now.
The term “documentary ethics” is oxymoronic. Neville believes deep-fake technology is a storytelling tool. It’s not. It’s manipulative.
I have no doubt the technique in due time will become acceptable. We’re living in a post-truth age.
In one generation, media ethics went from debating whether to interview a source in person or by telephone, then via email, tweet or text message, and now to deep fakes.
Neville argues that he wasn’t putting words into Bourdain’s mouth; he was just inserting audio.
That is true, in a sense; but it ignores the importance of tone. You can ask, “Are you happy?” in an introspective or infuriated voice.
Voice cloning is being used not only by filmmakers but by cybercriminals, too.
In one case, criminals cloned a chief executive officer’s voice to trick a subordinate into transferring $243,000 into their account.
Cybercrimes are classic examples of manipulation, defined as attacking a person’s mental and emotional states, thereby creating an imbalance of power to gain control, benefits or privileges at the expense of the victim.
An encyclopedia entry, “The Ethics of Manipulation,” identifies three types:
- Manipulation that bypasses reason, such as subliminal advertising or hypnosis.
- Manipulation as trickery, such as advertising that promotes false claims or induces false beliefs.
- Manipulation as pressure, such as scam phone calls warning about costs if demands are not heeded.
Watch out for deep fakes in political ads
Voice cloning has the potential to combine all three manipulative types. You can anticipate its wide use in midterm election political advertisements.
Edward Bernays, the so-called father of public relations, wrote this ominous passage in 1928:
“The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country. We are governed, our minds are molded, our tastes formed, and our ideas suggested, largely by men we have never heard of. … It is they who pull the wires that control the public mind.”
Little did Bernays realize his methods would be used years later by dictators.
In “The Manipulation of the American Mind,” Richard Gunderman, Chancellor’s professor of medicine at Indiana University–Purdue University Indianapolis, writes that Bernays used fear to sell products. “For Dixie cups, Bernays launched a campaign to scare people into thinking that only disposable cups were sanitary.” Bernays even founded the Committee for the Study and Promotion of the Sanitary Dispensing of Food and Drink.
“Bernays sought to turn citizens and neighbors into consumers who use their purchasing power to propel themselves down the road to happiness,” Gunderman writes.
If we change the words “purchasing power” to “votes,” we can see how manipulation plays a role in political advertising.
The news magazine The Week compiled some of the most manipulative political ads of the 2020 election.
One of the worst was titled “Meet Joe Biden’s Supporters,” showing riots and mayhem associated with Black Lives Matter and culminating with hellish music and a maniacal laugh.
Biden hit back, turning the tables on former President Trump in an advertisement titled, “You’ll Never See Me Again.”
The ad is only 10 seconds and shows Trump speaking at one of his rallies, stating, “If I lose to him, I don’t know what I am going to do. I will never speak to you again. You’ll never see me again.” Then we hear: “I’m Joe Biden, and I approved this message.”
To understand manipulation on a personal level, take an inventory of your deepest desires, convictions, fears, values and beliefs. Manipulators target them in a strategy to make you do something you ordinarily would not do.
When you respond emotionally to a political ad, positively or negatively, remember you are the target voter. You can still hold your political beliefs while acknowledging that the video, audio and voice are manipulating you.
This is true even if that voice is one you recognize and admire — Biden, Nancy Pelosi, Kamala Harris, Trump, Mike Pence, Mitch McConnell — because you no longer can trust what you see or hear.