How to Identify Deepfake Videos Like a Fact-Checker | By The Digital Insider

Deepfakes are synthetic media in which a person’s likeness is replaced with someone else’s. They’re becoming more common online and often spread misinformation around the world. While some may seem harmless, others are made with malicious intent, so it’s important to be able to tell genuine footage from digitally fabricated content.

Unfortunately, not everyone has access to state-of-the-art software for identifying deepfake videos. Here’s a look at how fact-checkers examine a video to determine its legitimacy and how you can use their strategies yourself.

1. Examine the Context

Scrutinizing the context in which the video is presented is vital. This means looking at the background story, the setting and whether the video's events align with what you know to be true. Deepfakes often slip here, presenting content that doesn’t hold up against real-world facts or timelines upon closer inspection.

One example involves a deepfake of Ukrainian President Volodymyr Zelensky. In March 2022, a deepfake video surfaced on social media where Zelensky appeared to be urging Ukrainian troops to lay down their arms and surrender to Russian forces. 

Upon closer examination, several contextual clues exposed the video’s inauthenticity. Neither the Ukrainian government’s official channels nor Zelensky himself shared this message, and the timing and circumstances didn’t align with known facts about Ukraine’s stance and military strategy. The video was created to demoralize Ukrainian resistance and spread confusion among the international community supporting Ukraine.

2. Check the Source

When you come across a video online, check its source. Understanding where a video comes from is crucial because attackers can use deceptive media as part of a cyberattack. In one recent survey, 75% of cybersecurity professionals reported a spike in cyberattacks, and 85% attributed the rise to malicious actors using generative AI.

This ties back to the rise of deepfake videos: security teams are increasingly dealing with incidents fueled by AI-generated content. Trace where the video first appeared. A clip that originates from a dubious source could be part of a larger cyberattack strategy.

Trusted sources are less likely to spread deepfake videos, making them a safer bet for reliable information. Always cross-check videos with reputable news outlets or official websites to ensure what you’re viewing is genuine.
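
One practical way to trace a clip’s origin is to pull a few still frames and run them through a reverse image search to see where the footage first appeared. The sketch below does the frame extraction with OpenCV; the file name and frame count are placeholders, not values from any particular tool.

```python
# Sketch: pull a few evenly spaced frames from a clip so they can be
# reverse-image-searched to trace where the footage first appeared.
# "suspicious_clip.mp4" and the frame count are placeholders.
import cv2

def extract_keyframes(video_path, num_frames=5, prefix="frame"):
    cap = cv2.VideoCapture(video_path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    if total <= 0:
        raise ValueError("Could not read frame count from video")

    saved = []
    for i in range(num_frames):
        # Jump to evenly spaced positions across the clip.
        cap.set(cv2.CAP_PROP_POS_FRAMES, int(i * total / num_frames))
        ok, frame = cap.read()
        if not ok:
            continue
        path = f"{prefix}_{i}.jpg"
        cv2.imwrite(path, frame)
        saved.append(path)

    cap.release()
    return saved

if __name__ == "__main__":
    print(extract_keyframes("suspicious_clip.mp4"))
```

Uploading the saved frames to a reverse image search can reveal whether the footage predates the event it claims to show or was lifted from an unrelated source.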

3. Look for Inconsistencies in Facial Expressions

One of the telltale signs of a deepfake is the presence of inconsistencies in facial expressions. While deepfake technology has advanced, it often struggles with accurately mimicking the subtle and complex movements that occur naturally when a person talks or expresses emotions. You can spot these by looking out for the following inconsistencies:

  • Unnatural blinking: Humans blink in a fairly regular, natural pattern. Deepfakes may under-represent blinking or overdo it. For instance, a deepfake could show a person talking for an extended period without blinking, or blinking far too rapidly. (A rough blink-rate check is sketched in the code after this list.)
  • Lip sync errors: In a deepfake, the lip movements may not match the spoken audio. Watch closely to see whether the lips stay in sync with what you hear. In some deepfakes the mismatch is subtle but becomes detectable on close inspection.
  • Facial expressions and emotions: Genuine human emotions are complex and reflected through facial movements. Deepfakes often fail to capture this, leading to stiff, exaggerated or not fully aligned expressions. For example, a deepfake video might show a person smiling or frowning with less nuance, or the emotional reaction may not match the context of the conversation.
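
The blink cue above can be roughly quantified. The following is a minimal sketch, assuming MediaPipe Face Mesh is installed, that estimates blinks per minute using the classic eye-aspect-ratio (EAR) heuristic. The landmark indices and the 0.2 threshold are common rough choices rather than calibrated values.

```python
# Sketch: estimate blink rate with an eye-aspect-ratio (EAR) heuristic.
# Assumes MediaPipe Face Mesh; the landmark indices and the 0.2 threshold
# are rough, commonly used choices, not calibrated values.
import cv2
import mediapipe as mp
from math import dist

# Approximate Face Mesh indices outlining one eye (verify against the
# official landmark map): p1..p6 in the standard EAR formula.
EYE = [33, 160, 158, 133, 153, 144]
EAR_THRESHOLD = 0.2  # below this the eye is treated as closed

def eye_aspect_ratio(landmarks, w, h):
    p = [(landmarks[i].x * w, landmarks[i].y * h) for i in EYE]
    return (dist(p[1], p[5]) + dist(p[2], p[4])) / (2.0 * dist(p[0], p[3]))

def blinks_per_minute(video_path):
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    blinks, frames, closed = 0, 0, False
    with mp.solutions.face_mesh.FaceMesh(max_num_faces=1) as mesh:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frames += 1
            res = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if not res.multi_face_landmarks:
                continue
            h, w = frame.shape[:2]
            ear = eye_aspect_ratio(res.multi_face_landmarks[0].landmark, w, h)
            if ear < EAR_THRESHOLD and not closed:
                blinks, closed = blinks + 1, True
            elif ear >= EAR_THRESHOLD:
                closed = False
    cap.release()
    minutes = frames / fps / 60.0
    return blinks / minutes if minutes else 0.0

# Adults on camera typically blink roughly 15-20 times per minute; a rate
# far outside that range is worth a closer look, not proof of a fake.
```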

4. Analyze the Audio

Audio can also give you clues about whether a video is real or fake. Deepfake technology attempts to mimic voices, but discrepancies often give it away. Pay attention to the voice’s quality and characteristics: deepfaked speech can sound robotic or flat, or it may lack the emotional inflections a real human would naturally exhibit.

Background noise and sound quality can also provide clues. A sudden change could suggest that parts of the audio were altered or spliced together. Authentic recordings typically stay consistent from start to finish.
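
A minimal sketch of that idea, assuming librosa is available and the clip’s audio track has already been extracted to a WAV file (for example with ffmpeg), flags abrupt jumps in frame-level loudness that can hint at spliced audio. The file name and the 3-sigma threshold are placeholders.

```python
# Sketch: flag abrupt jumps in frame-level loudness that can hint at
# spliced or altered audio. "clip_audio.wav" and the 3-sigma threshold
# are placeholders, not calibrated values.
import librosa
import numpy as np

def flag_loudness_jumps(audio_path, hop_length=512):
    y, sr = librosa.load(audio_path, sr=None)
    rms = librosa.feature.rms(y=y, hop_length=hop_length)[0]
    # Convert to dB so jumps are comparable across quiet and loud passages.
    rms_db = librosa.amplitude_to_db(rms, ref=np.max)
    jumps = np.abs(np.diff(rms_db))
    threshold = jumps.mean() + 3 * jumps.std()

    flagged = np.where(jumps > threshold)[0]
    return flagged * hop_length / sr  # timestamps (seconds) of abrupt changes

if __name__ == "__main__":
    for t in flag_loudness_jumps("clip_audio.wav"):
        print(f"Suspicious loudness jump near {t:.2f}s")
```

Flagged timestamps are only starting points for a closer listen; scene cuts and normal edits also produce loudness jumps.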

5. Investigate Lighting and Shadows

Lighting and shadows play a large part in revealing a video’s authenticity. Deepfake technology often struggles to accurately replicate how light interacts with real-world objects, including people. Paying close attention to lighting and shadows can help you spot artifacts that indicate a video has been manipulated.

In authentic videos, the lighting on the subject is consistent with the surroundings. Deepfake videos may display irregularities, such as the face being lit differently from the background. If the direction or apparent source of light in the video doesn’t make sense, it could be a sign of manipulation.

Shadows, too, should behave according to the light sources in the scene. In deepfakes, shadows can fall at the wrong angles or fail to correspond with other objects. Anomalies in shadow size and direction, or the presence or absence of expected shadows, are all worth noting.
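
A rough sketch of the face-versus-background lighting check, using OpenCV’s bundled Haar cascade face detector: it compares the brightness of the detected face region with the rest of each sampled frame. The 40-level gap and the sampling interval are loose heuristics, not values from any published detector.

```python
# Sketch: compare face brightness with the rest of the frame. A face that is
# consistently much brighter or darker than its surroundings can hint at a
# composited (deepfaked) face. The 40-level gap is a loose heuristic.
import cv2
import numpy as np

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def face_background_brightness_gap(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    mask = np.zeros(gray.shape, dtype=bool)
    mask[y:y + h, x:x + w] = True
    return abs(gray[mask].mean() - gray[~mask].mean())

def scan_video(video_path, sample_every=30, gap_threshold=40):
    cap = cv2.VideoCapture(video_path)
    suspicious, checked, idx = 0, 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % sample_every == 0:
            gap = face_background_brightness_gap(frame)
            if gap is not None:
                checked += 1
                if gap > gap_threshold:
                    suspicious += 1
        idx += 1
    cap.release()
    return suspicious, checked

if __name__ == "__main__":
    flagged, total = scan_video("suspicious_clip.mp4")
    print(f"{flagged}/{total} sampled frames show a large face/background brightness gap")
```

A genuine video shot in unusual lighting can trip this check too, so treat a high flag count as a prompt for manual inspection rather than a verdict.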

6. Check for Emotional Manipulation

Deepfakes do more than create convincing falsehoods — people often design them to manipulate emotions and provoke reactions. A key aspect of identifying such content is to assess whether it aims to trigger an emotional response that could cloud rational judgment.

For instance, consider the incident in which an AI-generated image of an explosion near the Pentagon circulated on X (formerly Twitter). Despite being completely fabricated, the image’s alarming nature caused it to go viral and trigger widespread panic, and the brief scare reportedly wiped around $500 billion off the stock market before prices recovered.

Deepfake videos can stir up the same kind of panic. While evaluating such videos, ask yourself:

  • Is the content trying to evoke a strong emotional response, such as fear, anger or shock? Authentic news sources aim to inform, not incite.
  • Does the content align with current events or known facts? Emotional manipulation often relies on disconnecting the audience from rational analysis.
  • Are reputable sources reporting the same story? The absence of corroboration from trusted news outlets can indicate the fabrication of emotionally charged content. 

7. Leverage Deepfake Detection Tools

As deepfakes become more sophisticated, relying solely on human observation to identify them can be challenging. Fortunately, deepfake detection tools that use advanced technology to distinguish between real and fake are available. 

These tools can analyze videos for inconsistencies and anomalies that may not be visible to the naked eye. Many rely on AI and machine learning; speech watermarking is one method, in which models trained to recognize the watermark’s placement can determine whether the audio has been tampered with.

Microsoft developed a tool called Video Authenticator, which provides a confidence score indicating the likelihood of a deepfake. Similarly, startups and academic institutions continually develop and refine technologies to keep pace with evolving deepfakes.
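
Most of these services can be scripted as well as used through a web interface. The sketch below shows how a suspicious clip might be submitted to a detection service over HTTP; the endpoint URL, API key, and response field are hypothetical placeholders, since each tool exposes its own API.

```python
# Sketch: submit a clip to a deepfake-detection service over HTTP.
# The endpoint, API key, and response field below are hypothetical
# placeholders; substitute the real values from whichever tool you use.
import requests

API_URL = "https://example-detector.invalid/v1/analyze"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                                 # placeholder credential

def check_video(path):
    with open(path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"video": f},
            timeout=300,
        )
    response.raise_for_status()
    result = response.json()
    # Hypothetical response shape: {"deepfake_confidence": 0.0-1.0}
    return result.get("deepfake_confidence")

if __name__ == "__main__":
    score = check_video("suspicious_clip.mp4")
    print(f"Reported deepfake confidence: {score}")
```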

Detecting Deepfakes Successfully

Technology has a light side and a dark side and is constantly evolving, so it’s important to be skeptical of what you see online. When you encounter a suspected deepfake, use your own judgment and the tools available, and always verify where it originated. As long as you stay on top of the latest deepfake news, your diligence will be key to preserving the truth in the age of fake media.


Published on The Digital Insider at https://is.gd/i8Ujvd.
