LastPass targeted in AI voice phishing scam
LastPass recently faced a sophisticated cyber attack involving deepfake audio. An employee received calls, texts, and a voicemail featuring an AI-generated voice impersonating the company's CEO, Karim Toubba. The employee recognized the hallmarks of a social engineering attempt, such as unusual urgency, and reported the incident to the internal security team.
LastPass reported no impact from the attempt but chose to share the experience to raise awareness of the increasing use of deepfake technology in cyber attacks. The incident highlights that such advanced tactics are no longer limited to nation-state actors and are now appearing in everyday executive impersonation fraud.
Security experts note that these attacks are an evolution of business email compromise (BEC) scams, layering personal pressure through voice, SMS, and video to manipulate employees into taking unauthorized actions. The rise of AI-powered tools has made it easier for threat actors to create convincing fake content across multiple platforms.
Cybersecurity professionals emphasize the need for employee awareness training to combat these threats, noting that traditional security controls may not be sufficient against sophisticated AI-generated content. Critical thinking and verifying unusual requests through established, known-good channels remain crucial defenses against such scams.
The incident underscores the growing challenge deepfake technology poses to organizations and the importance of adapting security strategies to address these evolving threats.