When Roadrunner, a documentary about the late TV chef and traveler Anthony Bourdain, was released in theaters last month, its director, Morgan Neville, enlivened the publicity interviews with a disclosure unconventional for a documentarian: some of the words audiences heard Bourdain speak in the film were forged, synthesized by artificial intelligence software made to imitate his voice.
Accusations from Bourdain fans that Neville had behaved unethically soon dominated coverage of the film. Despite that attention, it has remained unclear until now how many of the Bourdain lines in the two-hour movie are fake, and what they say.
In the interview that made his movie notorious, Neville told The New Yorker that he had produced three fake Bourdain clips with the permission of the chef's estate, all drawn from text Bourdain had written or spoken but that was not available as audio. He revealed only one, in which Bourdain "reads" an email; it appears in the movie's trailer. But he boasted that the other two clips would go undetected. "If you watch the film," The New Yorker quoted the Oscar-winning Neville as saying, "you probably don't know what the other lines are that were spoken by the AI, and you're not going to know."
Audio experts at Pindrop, a startup that helps banks and other companies fight phone fraud, believe they do know. If the company's analysis is correct, the deepfaked Bourdain at the root of the controversy amounts to less than 50 seconds of audio in the 118-minute movie.
Pindrop's analysis flagged the email quotation Neville disclosed, as well as a clip early in the film apparently drawn from an essay on Vietnam that Bourdain wrote, titled "Hungry Americans," collected in a 2008 book of his writing. It also flagged audio about halfway through the film in which the chef observes that many chefs and writers share a "relentless instinct to mess up a good thing." The same sentence appears in a 2016 interview Bourdain gave to the food website First We Feast on the occasion of his 60th birthday, two years before he died by suicide.
All three clips sound like Bourdain. Listened to closely, though, they show hallmarks of synthesized speech, such as unnatural prosody and oddly rendered fricatives, consonant sounds like "s" and "f." A Reddit user independently flagged the same three clips as Pindrop, writing that they are easy to hear on a second viewing of the movie. Focus Features, the film's distributor, did not respond to a request for comment. Neville's production company declined to comment.
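Pindrop has not detailed its detection pipeline, but the fricative cue mentioned above can be illustrated with a very crude measurement. The sketch below is purely illustrative and not Pindrop's method: it computes a spectral centroid, the frequency "center of mass" of a sound, using NumPy. Natural "s" sounds concentrate energy at high frequencies, while synthesis artifacts often smear or dull that energy; the white-noise and smoothed-noise signals here are toy stand-ins for a crisp versus a muffled fricative.

```python
import numpy as np

def spectral_centroid(signal, sample_rate):
    """Frequency 'center of mass' of the signal's magnitude spectrum (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return float(np.sum(freqs * spectrum) / np.sum(spectrum))

rate = 16000
rng = np.random.default_rng(0)

# Toy stand-ins: white noise approximates a crisp, natural 's' burst;
# a moving-average (low-pass) filter mimics the duller, smeared
# high frequencies sometimes heard in synthesized fricatives.
natural_s = rng.standard_normal(rate // 10)
muffled_s = np.convolve(natural_s, np.ones(32) / 32, mode="same")

# The low-pass filtering pulls the centroid down, so an unusually low
# centroid on an 's' segment is one simple red flag.
print(spectral_centroid(natural_s, rate) > spectral_centroid(muffled_s, rate))
```

Real detectors combine many such features (and learned representations) over thousands of frames; a single centroid comparison is only a hint at why trained ears, and trained models, can pick out a fake "s."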
When Neville predicted that his use of AI-generated media, sometimes referred to as deepfakes, would go undetected, he may have overestimated the sophistication of his forgeries. He also may not have anticipated that his use of the technology would stir controversy, or curiosity among fans and audio experts. When the uproar reached the ears of Pindrop's researchers, they saw a perfect test case for software the company is developing to detect audio deepfakes; when the movie debuted on a streaming service earlier this month, they set it to work. "We're always looking for ways to test our systems, especially under real conditions; this was a new way to validate our technology," said Collin Davis, Pindrop's chief technology officer.
Pindrop's findings may have solved the mystery of Neville's missing deepfakes, but the incident foreshadows controversies to come, as deepfake technology becomes more sophisticated and more accessible, for creative and malicious projects alike.