Scammers have started using artificial intelligence (AI) in various ways to carry out more sophisticated and convincing scams.
A new AI scam making the rounds targets the elderly. The scammer calls and pretends to be the victim's grandson, explaining that he is in jail with no wallet or cell phone and needs cash fast.
In 2022, impostor scams were the second most common racket in America: more than 36,000 people reported being scammed by someone impersonating a friend or family member. More than 5,100 of those incidents happened over the phone and cost victims over $11 million, according to FTC officials.
As AI gets better and scammers find new ways to harvest data, those numbers are all but certain to be way up in 2023.
Our best defense, at this point, is awareness: know how these scams work so they don't happen to you.
These are the top AI-related scams of 2023:
Voice Phishing (Vishing):
- AI-generated voice synthesis: Scammers can use AI to mimic the voices of trusted individuals or organizations in phone calls, making it difficult for victims to detect the scam.
Phishing and Social Engineering:
- AI-generated messages: Scammers use AI to create convincing phishing emails, messages, and even voice calls. They can mimic the writing style of a trusted entity, such as a bank or a colleague, making it more challenging to identify a scam.
- Personalized content: AI can analyze social media profiles and other online data to craft personalized and believable messages that trick individuals into revealing sensitive information.
Deepfake Technology:
- Deepfake videos and audio: Scammers use AI-driven deepfake technology to create convincing videos or audio recordings of individuals, such as CEOs or public figures, instructing employees to transfer funds or disclose sensitive information.
Chatbots and Automated Conversations:
- AI chatbots: Scammers employ AI chatbots that can engage in conversations, answering questions and convincing individuals to click on malicious links or share sensitive data.
- Voice assistants: Scammers can manipulate voice-activated AI assistants into making unauthorized transactions or handing over access to sensitive information.
Credential Stuffing Attacks:
- AI-driven credential stuffing: Scammers use AI to automate the process of trying stolen usernames and passwords from previous data breaches on various online accounts. When they find a match, they can gain unauthorized access to accounts.
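One practical defense against credential stuffing is to stop reusing passwords that have already leaked. The sketch below is a minimal illustration, assuming Python with the `requests` package and the public Have I Been Pwned "Pwned Passwords" range API; only the first five characters of the password's SHA-1 hash are sent over the network, so the full password never leaves your machine.

```python
import hashlib
import requests

def password_breach_count(password: str) -> int:
    """Return how many times a password appears in known breaches (0 = not found).

    Uses the Have I Been Pwned k-anonymity range API: only the first five
    hex characters of the SHA-1 hash are sent over the network.
    """
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]

    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}", timeout=10)
    resp.raise_for_status()

    # Each response line has the form "<hash suffix>:<breach count>".
    for line in resp.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    hits = password_breach_count("correct horse battery staple")
    print(f"Seen in breaches {hits} times" if hits else "Not found in known breaches")
```

If a password you rely on shows up in that dataset, change it everywhere it is reused; credential stuffing only works when the same leaked login opens more than one door.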
Investment and Trading Scams:
- AI-powered trading bots: Scammers offer AI-driven trading bots that promise high returns on investments. These bots often turn out to be fraudulent and result in financial losses for victims.
Social Media Manipulation:
- AI-generated content: Scammers use AI to generate fake news articles, reviews, or social media posts to spread misinformation and promote scams or fraudulent products.
Email Spoofing and Domain Mimicry:
- AI-generated email content: Scammers can use AI to craft convincing email content and to mimic a trusted sender's email domain, making a message appear to come from a legitimate source.
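One simple way to catch domain mimicry is to compare a sender's domain against the domains you actually trust and flag anything that is close but not identical. The sketch below is a minimal illustration using only Python's standard library; the trusted-domain list and the similarity threshold are assumptions you would tune for your own inbox.

```python
from difflib import SequenceMatcher

# Hypothetical list of domains the reader actually corresponds with.
TRUSTED_DOMAINS = {"chase.com", "irs.gov", "mycompany.com"}

def classify_sender(address: str, threshold: float = 0.8) -> str:
    """Label a sender's domain as trusted, a likely lookalike, or unknown."""
    domain = address.rsplit("@", 1)[-1].lower()
    if domain in TRUSTED_DOMAINS:
        return "trusted"
    for trusted in TRUSTED_DOMAINS:
        similarity = SequenceMatcher(None, domain, trusted).ratio()
        if similarity >= threshold:
            return f"possible lookalike of {trusted} (similarity {similarity:.2f})"
    return "unknown"

if __name__ == "__main__":
    for sender in ["alerts@chase.com", "security@chasse.com", "promo@example.org"]:
        print(f"{sender}: {classify_sender(sender)}")
```

A domain that is one letter off from your bank's is far more suspicious than one you have never seen before, which is exactly the pattern this kind of check surfaces.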
To protect yourself from AI-driven scams, stay vigilant and verify the authenticity of communications, for example by creating a CODE WORD with your family members so you can confirm it is really them on the line. Be cautious of unsolicited requests for personal or financial information, and educate yourself about the latest scams and their characteristics. Additionally, using strong, unique passwords, enabling two-factor authentication, and keeping your software up to date can help mitigate the risk of falling victim to AI-powered scams.
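As a concrete illustration of one of those recommendations, the sketch below shows how the time-based one-time passwords behind most authenticator apps work, assuming Python with the third-party `pyotp` package; the secret is generated on the fly here and is purely illustrative, whereas a real service would store it and show it to you once as a QR code.

```python
import pyotp  # third-party package: pip install pyotp

# Shared secret established when you enable two-factor authentication.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# The authenticator app derives a fresh 6-digit code from the secret and the
# current time; by default the code rotates every 30 seconds.
current_code = totp.now()
print("Current one-time code:", current_code)

# The service checks the submitted code against the same secret and clock,
# so a scammer who steals only your password still cannot log in.
print("Valid code accepted:", totp.verify(current_code))
print("Guessed code accepted:", totp.verify("000000"))
```

Because the code changes every half minute and never travels with your password, even a convincing AI-generated phishing page gets far less value out of whatever it manages to capture.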