2026-01-27

SECURITY HINTS & TIPS:

Deepfakes & Misinformation

What Are Deepfakes?
Deepfakes are fake videos, images, or audio recordings that look and sound real. They’re created using artificial intelligence to make it appear that someone said or did something they never actually did.
Think of it like this: it’s Adobe Photoshop on overdrive, but for video and audio.

🚨 Why Should You Care?

Deepfakes and misinformation are being used to:

  • Trick people into sending money by impersonating authority, family members, or celebrities
  • Steal login credentials through fake customer service videos or messages
  • Damage reputations by making it look like someone said something offensive
  • Spread false information, such as during elections, emergencies, or major events
  • Commit fraud by faking identity verification for financial transactions

Be a MGC Cyber Hero


Our amazing users play a crucial role in keeping our organization safe every day by spotting and reporting phishing attempts through the Phish Alert Button (PAB). Recently, Lucas Hihn from the Manitoulin Group Forwarding Team reported an email impersonating an executive in an attempted Business Email Compromise (BEC) aimed at redirecting company funds.

Thanks to Lucas’s swift action, our Cybersecurity team was able to isolate the threat and enrich our threat intelligence with the associated indicators of compromise, preventing harm to the company at large. His vigilance and dedication as a human firewall were key to stopping this attack.

A big thank you to Lucas and all our Cyber Heroes for your ongoing commitment to protecting our organization! πŸš€πŸ”’

βœ… How to Protect Yourself

1. Slow Down and Verify

If something seems urgent or emotional, that’s a red flag. Scammers create panic to make you act without thinking.

What to do: Contact the person by another method (call them directly, text them, or visit their official website).

2. Look for Warning Signs

Watch for:

  • Unnatural facial movements or blinking
  • Lip-syncing that is slightly off
  • Strange lighting or shadows on the face
  • Audio that doesn’t quite match the video
  • Blurry areas around the face or hairline

3. Check Multiple Sources

Before believing or sharing something shocking, check if legitimate news outlets are reporting it.

4. Set Up Authentication Codes

Create a family password or code word that only your close contacts know. Use it to verify identity during unusual requests.

Example: If someone claiming to be your family member asks for money, ask them for the code word first.

5. Enable Multi-Factor Authentication (MFA)

Add an extra security step to your accounts. Even if scammers get your password, they can’t get in without the second verification.
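For the technically curious: the “second verification” in most authenticator apps is a time-based one-time password (TOTP, defined in RFC 6238). A minimal Python sketch of how such a code is derived from a shared Base32 secret is shown below; this is an illustration of the math, not a production implementation, and the example secret is hypothetical.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, period=30):
    """Derive a time-based one-time password from a shared Base32 secret.

    Follows RFC 6238 (TOTP) over RFC 4226 (HOTP) with HMAC-SHA1.
    `at` is a Unix timestamp; defaults to the current time.
    """
    key = base64.b32decode(secret_b32)
    # The moving factor: number of `period`-second windows since the epoch.
    counter = int((time.time() if at is None else at) // period)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: pick 4 bytes at an offset taken from the last nibble.
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)
```

Because the code changes every 30 seconds and depends on a secret stored only on your device and the server, a scammer who phishes your password still cannot log in without it.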

6. Be Skeptical of Requests for Money or Information

No legitimate organization will pressure you to:

  • Send money immediately
  • Pay with gift cards or cryptocurrency
  • Share passwords or login codes
  • Click suspicious links “urgently”

βœ… Real-World Examples

  • Italian Defence Ministry Voice Clone – €1 Million Stolen (2025)
    What Happened: A voice clone of the Italian Defence Minister was used to extract nearly €1 million. Voice cloning now requires only three to five seconds of sample audio, and research shows humans are poor at detecting high-quality deepfake videos, with some studies reporting an accuracy rate of only 24.5%.
  • Arup Engineering Firm – $25 Million Deepfake Heist (Feb 2024)
    What Happened: A finance employee at global engineering firm Arup joined a video conference call with what appeared to be the company’s CFO and other senior executives. All participants were AI-generated deepfakes. The employee authorized 15 transactions totalling $25 million to Hong Kong bank accounts.

Manitoulin Group of Companies Security Team
Cybersecurity@manitoulingroup.com

Stop, Look, and Think. Don’t be fooled.