Doubts over Trump video show how hard deepfakes are to spot

A recent video posted to Twitter by President Donald Trump left some users suspecting a deepfake, highlighting both the difficulty of telling true from false in social media videos and the erosion of public trust in government and the media.

In the nearly three-minute video, posted Jan. 7, Trump acknowledges his loss to Democrat Joe Biden in the presidential election and thanks Republicans for their “loyalty.”

The video came as lawmakers weighed the president’s impeachment after he helped incite a violent crowd of supporters to storm the Capitol Building in Washington, D.C., on Jan. 6.

Trump stands behind a podium in the video, speaking directly to the camera. For some Twitter users, however, something seemed off.

“An impressive deepfake video!” one Twitter user posted. “No way that’s the real DJT.”

“This video is a deepfake,” posted another user.

Possible explanations

Forrester analyst Brandon Purcell said Trump and his team may have used a virtual background in the video, similar to the mock backgrounds available in Zoom and Microsoft Teams, which can make footage look slightly artificial.

“For people who are already suspicious of the government, many of whom did not want to hear a concession, that was probably enough to spark suspicions of a deepfake,” Purcell said.

Deepfake refers to images, videos, or sound that have been manipulated using sophisticated machine learning and AI tools. The technology for creating deepfakes has become increasingly powerful and easy to use in recent years, leading to a proliferation of deepfake images and videos online.
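To make the mechanics concrete: many classic face-swap deepfakes rely on an autoencoder with one shared encoder and two person-specific decoders, so that person A’s pose and expression can be rendered with person B’s face. Below is a minimal, untrained sketch of that idea in PyTorch; the class name, toy dimensions, and random input are illustrative only, not any particular tool’s implementation.

```python
# Conceptual sketch of the shared-encoder/two-decoder face-swap idea behind
# many classic deepfakes. Untrained toy model; all names and sizes are illustrative.
import torch
import torch.nn as nn

class FaceSwapSketch(nn.Module):
    def __init__(self, dim=64 * 64 * 3, latent=256):
        super().__init__()
        # One encoder learns a person-agnostic code for pose and expression...
        self.encoder = nn.Sequential(nn.Flatten(), nn.Linear(dim, latent), nn.ReLU())
        # ...and each decoder learns to render that code as one specific person.
        self.decode_a = nn.Sequential(nn.Linear(latent, dim), nn.Sigmoid())
        self.decode_b = nn.Sequential(nn.Linear(latent, dim), nn.Sigmoid())

    def forward(self, face, as_person="a"):
        z = self.encoder(face)
        out = self.decode_a(z) if as_person == "a" else self.decode_b(z)
        return out.view(-1, 3, 64, 64)

model = FaceSwapSketch()
face_of_a = torch.rand(1, 3, 64, 64)    # stand-in for a cropped face of person A
fake = model(face_of_a, as_person="b")  # A's pose rendered as person B, once trained
print(fake.shape)                       # torch.Size([1, 3, 64, 64])
```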

“It’s easier than ever for anyone to make a deepfake video, whereas a year ago it was harder and less convincing. So in a sense, the ability to make deceptive videos has been trivialized,” said Alan Pelz-Sharpe, senior analyst and founder of Deep Analysis.

Professionally made videos, in particular, are easier to manipulate, as they are shot with high-quality lighting, cameras and sound, he continued.

“That makes it great for faking and tampering,” Pelz-Sharpe said.

Erosion of trust

While deepfakes are usually created for entertainment, foreign and domestic political actors, among others, also create deepfakes and other manipulated images and videos to influence elections and public opinion.

On Dec. 25, 2020, British broadcaster Channel 4 aired a deepfake video, produced by an animation and visual effects studio, featuring a fake Queen Elizabeth II performing a dance. The video followed the Queen’s annual Christmas speech and, according to Channel 4, was intended to warn viewers that not everything they see and hear is real.

After years of eroding public trust, politicians and governments likely face an uphill battle in convincing the public of what is and is not real when it comes to deepfakes.

Social media platforms, including Facebook and Twitter, are developing technology to detect deepfake content on their platforms, Purcell noted. But, he said, as detection technology advances, so does deepfake technology, likely leaving detection efforts a step behind.

In the absence of reliable detection technology, the best defense against deepfakes is a trustworthy press, despite the decline in public confidence in the media, Purcell said.

Even so, he noted, there are still a few ways to attempt to detect deepfakes.

For example, the subject’s eyes in a deepfake may be asymmetrical, or the boundary between the subject and the background may appear blurry in the video. Additionally, the voice may not match the subject’s real voice, as deepfake audio still lags behind deepfake video in maturity, Purcell noted.
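These cues can be probed crudely in code. The sketch below is purely illustrative, not the method Purcell or any platform actually uses: it checks for eye-size asymmetry with OpenCV’s stock Haar cascade and uses Laplacian variance as a rough frame-sharpness proxy. The file name suspect_clip.mp4 is hypothetical, and real detectors rely on trained models rather than heuristics like these.

```python
# Illustrative heuristics only -- real deepfake detectors use trained models.
import cv2  # pip install opencv-python

def eye_asymmetry(frame_bgr):
    """Return a 0..1 score comparing the areas of the two largest detected eyes."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
    eyes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) < 2:
        return None  # fewer than two eyes found; no signal
    (x1, y1, w1, h1), (x2, y2, w2, h2) = sorted(
        eyes, key=lambda e: e[2] * e[3], reverse=True
    )[:2]
    a1, a2 = w1 * h1, w2 * h2
    return abs(a1 - a2) / max(a1, a2)  # 0.0 = identical sizes, near 1.0 = very unequal

def frame_sharpness(frame_bgr):
    """Variance of the Laplacian: low values suggest blur, e.g. soft subject edges."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

cap = cv2.VideoCapture("suspect_clip.mp4")  # hypothetical input file
ok, frame = cap.read()
if ok:
    print("eye asymmetry:", eye_asymmetry(frame))
    print("sharpness (Laplacian variance):", frame_sharpness(frame))
cap.release()
```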

Yet, said Purcell, these methods are far from foolproof, and people should always consider the source of the content and try to corroborate it with other reputable sources.

