PHOENIX–Attorney General Kris Mayes today warned Arizonans that scammers appear to be using voice clones generated by artificial intelligence to defraud consumers. With this rapidly evolving technology, scammers can trick consumers into thinking a loved one is in trouble and needs money or gift cards. Some scammers may combine this technology with spoofing equipment so that it seems a call is coming from a friend or family member.
“Scammers are using AI technology to personalize scams and mimic a loved one’s voice—or to send similar personalized text messages—to trick people,” said Attorney General Mayes. “Receiving a call from a loved one in distress, with a voice that appears to be real, can easily push a consumer into rushing to send money or other forms of payment. Be wary of any call asking for emergency money. Contact the family member who is supposed to be calling to verify the ask – and always seek help from others, including law enforcement, before sending any form of payment.”
The Attorney General provides the following tips to protect yourself from voice-clone AI scams:
- Beware of any emergency call asking for money to be sent right away.
- Don’t trust the voice or message alone; voices can be convincingly imitated with AI.
- Hang up and call your loved one through a trusted number to verify the call or text.
- Consider establishing a code word or phrase that only you and your loved ones know, and use it to verify their identity.
- Beware of high-pressure scare tactics.
- Beware of requests for payment through gift cards, peer-to-peer payment apps, and other hard-to-reverse methods.