Scammers Are Using AI 'Voice Cloning' To Trick Victims

We've all heard about the so-called 'family emergency' scam - you get a call from someone claiming to be a niece or nephew who needs you to send money quickly. But now this classic scam has a new wrinkle - AI-powered 'voice cloning' lets scammers impersonate the actual family member's voice. The FTC says scammers can make a convincing-sounding call using just a short audio clip of someone's voice, often found online through social media or YouTube. The agency recommends taking extra steps to verify the story - check the phone number, call the person directly, or confirm with other family members. Known scams can be reported at ReportFraud.ftc.gov.

Listen to the Brother Wease Show on the iHeartRadio app: Radio951.com/listen
