CELINA, Ohio — Artificial intelligence has many perks, including for scammers, who continue to work AI into their schemes.
In a newer scheme, criminals are using AI to clone the voices of loved ones.
Scammers use audio clips gathered from social media, voicemail greetings or videos to create a convincing voice replica.
The scams are designed to create panic and sound very real, with calls sounding like they are from a child, grandchild or other family member who is in trouble and needs money immediately.
Examples of scenarios include:
- “I’ve been in an accident and need bail money.”
- “I’m stuck somewhere and need you to send money right away.”
- “I was arrested and can’t talk long – please don’t tell mom/dad.”
The caller may beg the victim to stay on the phone and not to tell other family members.
Police are urging Ohioans to protect themselves.
Steps to take include:
- Pause and verify. Hang up and call your loved one directly using a number you already have
- Ask a personal question only they would know the answer to
- Create a family “safe word” to use in real emergencies
Law enforcement is also urging caution when sharing voice recordings online and asking users to check their social media privacy settings.
Money, gift cards or wire transfers should never be sent based solely on a phone call.
If you receive a suspicious call:
- Do not provide personal or financial information
- Report the incident to local law enforcement
- Report scams to the FTC at reportfraud.ftc.gov
“These scams rely on fear and urgency,” Celina police said. “Taking a moment to slow down and verify can prevent financial loss and emotional distress.”
Madison MacArthur