
Deepfakes of Canadian politicians…


Face of Nation: Deepfakes — the startlingly convincing videos that use artificial intelligence to manipulate clips of well-known people — have arrived in Canada.

Over the last few weeks, one YouTube channel has posted multiple deepfakes of Conservative Leader Andrew Scheer and Ontario Premier Doug Ford.

These particular examples were clearly meant to be humorous rather than deceptive — for example, one video superimposed Andrew Scheer’s face onto an old Pee-wee Herman public service announcement.

But these clips demonstrate that the technology is easy to use — and there is more than enough footage of Canadian politicians to make it possible to fake their likeness and mislead voters ahead of the October election. 

“I have no background in video or production,” said the creator of the Scheer and Ford videos, who asked to be identified only by his username, FancyScientician. “I was initially, and remain, very intrigued by the power of deep learning and would say my main motivation is experimentation for the purposes of learning and laughter.”

The creator told CBC News that he used a free, open-source tool to make his videos, and has been learning as he goes. FancyScientician — in reality a 33-year-old from Richmond Hill, Ont., who works in golf course turf maintenance — said he was curious about the technology, but doesn’t have any intention of making misleading videos.

“I personally have no intention of using deepfakes for the purpose of misinformation, but it is entirely possible [to do so].”

Deepfakes broke into the mainstream in 2017 after Motherboard, VICE’s science and technology site, reported on a Reddit user named deepfakes who had begun to share easy-to-use technology that allowed average users to create realistic-looking face-swapped videos. Originally, it was used almost exclusively to generate porn videos with celebrity faces superimposed onto the bodies of adult film stars.

The technology uses artificial intelligence to learn the facial details of the input source, such as photos or footage of actors, and map them onto the output source, in this case a porn video.
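To make that "learn and map" step concrete: a common design in open-source face-swap tools pairs one shared encoder with a separate decoder per person. The encoder learns a compressed representation of faces in general, each decoder learns to render one specific person, and a swap is produced by decoding one person's frames with the other person's decoder. The sketch below is a minimal, illustrative version in PyTorch — the layer sizes and class names are assumptions for demonstration, not taken from any particular tool.

```python
# Minimal sketch of the shared-encoder, two-decoder autoencoder idea behind many
# open-source face-swap tools. Sizes and names here are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a small latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a 64x64 face crop from the latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One shared encoder learns faces in general; each decoder learns one person.
# The swap happens at inference time: encode a frame of person B, then decode
# it with person A's decoder so it is rendered with person A's appearance.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

frame_b = torch.rand(1, 3, 64, 64)     # stand-in for a face crop of person B
swapped = decoder_a(encoder(frame_b))  # person B's frame rendered as person A
print(swapped.shape)                   # torch.Size([1, 3, 64, 64])
```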

Since then, deepfake software has advanced significantly, and researchers say in just a few months, we could see videos where the manipulation is imperceptible to the naked eye.

Many observers have sounded alarm bells about the potential for deepfakes to be used in a political context, as a BuzzFeed video demonstrated last year. In it, comedian and director Jordan Peele used his spot-on impression of former U.S. president Barack Obama to create a convincing fake video of Obama saying things like “President Trump is a total and complete dipshit.”

Hany Farid, a professor and image forensics expert at Dartmouth College, said Canada is not immune to this kind of manipulation.

“The computer software to create these fakes are freely available online, which means that an increasing number of people have access to this technology,” Farid said in an email. “And these same people have access to social media and can therefore distribute fake content to millions of people around the world. It is access to this sophisticated technology and the ability to distribute widely that is the new threat.”

In preparation for the 2020 U.S. election, Farid and his colleagues have created a tool that uses hours of footage of politicians to learn their distinct facial movements and gestures, which lets it distinguish a real video from a fake. The tool will be made available to news organizations to verify any suspicious videos that emerge.
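The article doesn't describe the tool's internals, but the general approach it points to — learning a person's characteristic facial behaviour from authentic footage and flagging clips that don't match — can be sketched as a one-class anomaly detector. The example below is purely illustrative: the "behaviour vectors" are random stand-ins for measured facial movements, and the model choice (scikit-learn's OneClassSVM) is an assumption, not Farid's actual implementation.

```python
# Illustrative sketch: fit a one-class model on a politician's authentic footage,
# then flag clips whose behaviour falls outside that learned range.
# The features here are random stand-ins, not real facial-movement measurements.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Pretend each clip is summarized by a 20-dimensional behaviour vector
# (e.g., statistics of facial movements and head gestures over the clip).
authentic_clips = rng.normal(loc=0.0, scale=1.0, size=(500, 20))  # hours of real footage
suspicious_clip = rng.normal(loc=3.0, scale=1.0, size=(1, 20))    # behaviour that doesn't match

# Train only on the politician's authentic behaviour.
model = OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit(authentic_clips)

# +1 means "consistent with this person's real mannerisms", -1 means "anomalous".
print(model.predict(authentic_clips[:5]))  # mostly +1
print(model.predict(suspicious_clip))      # likely -1
```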

The only Canadian politician included in the tool is Prime Minister Justin Trudeau, because footage of him was “more readily available,” according to Farid.

But as the FancyScientician videos show, Trudeau is not the only subject of deepfakes, and someone with nefarious intentions wouldn’t be limited to gags comparing Doug Ford to U.S. President Donald Trump.

“I think it is reasonable to assume deepfakes could be used to fool people with the right amount of effort and resources,” FancyScientician said.