Voice cloning scams driven by artificial intelligence (AI) are on the rise, making it harder to tell friend from foe over the phone. These scams range from impersonating friends, family members, or co-workers to more elaborate schemes such as election misinformation or fake kidnapping-for-ransom calls. Advances in AI let scammers build a realistic voice clone from as little as three seconds of a person's voice, then layer on accents, age ranges, and background sounds to make the call more convincing.
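To make the three-second claim concrete, the snippet below is a minimal sketch of how little effort cloning now takes. It assumes the open-source Coqui TTS Python package and its XTTS v2 voice-cloning model; the model name and arguments follow the project's published examples and may differ across versions, and the voicemail file name is a hypothetical stand-in for any short recording of a voice.

    # A minimal sketch, assuming the open-source Coqui TTS package
    # (pip install TTS) and its XTTS v2 voice-cloning model; names
    # and arguments may differ across versions.
    from TTS.api import TTS

    # Load a multilingual voice-cloning model (downloads on first use).
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # "my_voicemail.wav" is a hypothetical stand-in for any few-second
    # clip of a voice, such as a voicemail greeting or social-media video.
    tts.tts_to_file(
        text="Hi, it's me. Can you call me back as soon as you can?",
        speaker_wav="my_voicemail.wav",
        language="en",
        file_path="cloned_voice.wav",
    )

Running a sketch like this against one's own outgoing voicemail greeting makes the risk tangible, which is one reason the advice below recommends a generic, non-personalized greeting.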

While regulators have made it illegal to use AI-generated voices in robocalls, cyberthieves are constantly on the lookout for personal information. Individuals who suspect a scam should hang up and contact the supposed caller directly, on a number they already know, to verify the story. It is important to stay vigilant on every call, even when the voice sounds familiar, and to listen for odd statements, questions, or requests, especially those involving money or personal information. Skipping a personalized voicemail greeting also denies bad actors an easy sample of a person's voice.

Individuals who believe they have fallen victim to a scam should report it to their local police and the Federal Trade Commission (FTC). The FTC shares its reports with more than 2,800 law enforcement agencies to help track and combat AI-driven scams. Remaining cautious, and reporting suspicious calls promptly, protects both the victim and others from these increasingly sophisticated schemes.
