Cyber Mirage: How AI is Shaping the Future of Social Engineering
In this webcast, Senior Security Consultant Brandon Kovacs aims to illuminate the sophisticated capabilities that AI brings to the table in creating hyper-realistic deepfakes and voice clones.
The emergence of artificial intelligence (AI) has transformed the landscape of social engineering and given rise to a new class of AI-powered threats targeting the finance, healthcare, and energy sectors. Earlier this year, a finance professional at a global corporation in Hong Kong was deceived into transferring $25 million to scammers, who leveraged deepfake and voice-cloning technology to impersonate the company's chief financial officer on a video conference call.
Through a live demonstration, the session showcases:
- Real-time AI-powered deepfake and voice cloning technologies
- How these technologies can be used by offensive cybersecurity professionals to conduct highly effective social engineering attacks
- The critical need for the development of more sophisticated defense mechanisms to mitigate the risks posed by these rapidly evolving AI-based cyber threats
Lessons Learned from the Session
AI has given rise to a new class of cyberattacks that enable attackers to create hyper-realistic deepfake video and voice impersonations to manipulate or deceive their victims into taking actions they otherwise wouldn’t take.
How AI-Powered Attacks are Created
- Deepfakes and voice impersonations are created by training models on data collected from public sources, such as social media posts, podcasts, and earnings calls.
- Training is the process of teaching a model to recognize patterns within a dataset.
Training Vocal Clones
- Collect audio or video files.
- Prepare the data by cleaning and formatting it: remove background noise and muffled speech, splice the audio into shorter chunks, convert the files to WAV format, and transcribe the speech.
- Train a model using specialized software that analyzes the audio files and produces a voice model of that specific individual. This gives you the ability to impersonate the individual's voice.
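The preparation step above can be sketched as a small helper that builds the cleanup and splicing commands. This is an illustrative sketch only: the file names are hypothetical, and it assumes ffmpeg (with its `afftdn` denoise filter and segment muxer) is available on the system.

```python
# Sketch of the audio-preparation stage for voice-clone training data.
# Builds (but does not run) ffmpeg commands; all paths are illustrative.

def build_prep_commands(src: str, out_dir: str, chunk_secs: int = 10) -> list[list[str]]:
    """Return ffmpeg commands that denoise the recording, convert it to
    mono WAV, and splice it into short fixed-length training chunks."""
    cleaned = f"{out_dir}/cleaned.wav"
    return [
        # 1. Reduce background noise (afftdn filter) and resample to
        #    22.05 kHz mono WAV, a common format for voice-model training.
        ["ffmpeg", "-i", src, "-af", "afftdn", "-ar", "22050", "-ac", "1", cleaned],
        # 2. Splice the cleaned audio into short chunks for the trainer.
        ["ffmpeg", "-i", cleaned, "-f", "segment",
         "-segment_time", str(chunk_secs), f"{out_dir}/chunk_%03d.wav"],
    ]

for cmd in build_prep_commands("interview.mp3", "dataset"):
    print(" ".join(cmd))
```

Transcription of each chunk would follow as a separate step, typically with a speech-to-text tool.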
Training Deepfakes
Technologies like DeepFaceLab use advanced machine learning to train a model. Through a process known as “merging,” the face is swapped from the source to the destination.
- Collect high-quality photos or videos.
- Preparation is a lengthy process involving extraction, alignment, labeling, and masking. It revolves around identifying and defining the source and the destination. The source is the person being impersonated (e.g., a celebrity's face). The destination is the target of the manipulation (e.g., the individual impersonating the source).
- During training, the model learns to recognize patterns in both the source and destination and makes a series of predictions. As the number of iterations increases, the images become more lifelike. A tool like DeepFaceLive enables you to deepfake in real time.
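The iterative training idea above can be shown with a deliberately tiny numeric sketch. Real face-swap training (e.g., DeepFaceLab) optimizes deep autoencoder networks over face images; this toy version only illustrates why more iterations produce output closer to the training data.

```python
# Toy sketch of iterative training: a single "prediction" value is nudged
# toward the target data each iteration, and the remaining error shrinks.
# NOT a real deepfake trainer; purely illustrative of the iteration effect.

def train(target: float, iterations: int, lr: float = 0.1) -> float:
    prediction = 0.0
    for _ in range(iterations):
        error = target - prediction   # compare prediction to training data
        prediction += lr * error      # nudge the model toward the data
    return abs(target - prediction)   # remaining error after training

print(train(1.0, 10))    # error after a few iterations
print(train(1.0, 100))   # error after many iterations: far smaller
```

The same principle is why deepfake quality improves the longer a model trains, until diminishing returns set in.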
Using Deepfakes and Voice Clones for Offensive Security
Incorporate this technology into your red team exercises by:
- Using a vocal clone during an external breach assessment to attempt a password reset through the IT helpdesk.
- Using a real-time deepfake during an internal video conference to attempt to gain trust and manipulate employees to take an action (such as wiring money).
How to Protect Against Vocal Clones and Deepfakes
When facing AI-powered cyber threats, a low- or no-tech approach can enable you to block the attacker. For example, in a scenario where an individual requests that an action be taken, especially while employing fear, uncertainty, or doubt (FUD) as a pressure tactic, establish the caller's identity by:
- Calling the person back using a verified phone number from the company directory. Remember, attackers can perform caller ID manipulation to make it appear like the impersonated individual is calling.
- Using a previously agreed upon safe word that only you and that individual know for authentication.
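The two checks above can be combined into a simple approval policy. This is a minimal sketch, not a production control: the directory entries, safe words, and function names are all hypothetical, and a real deployment would never hard-code secrets.

```python
# Sketch of an out-of-band verification policy for high-pressure requests.
# Both checks must pass: a callback on a directory-verified number AND a
# pre-agreed safe word. Caller ID alone proves nothing, since attackers
# can spoof it. All data below is hypothetical.

DIRECTORY = {"cfo": "+1-555-0100"}   # verified numbers from the company directory
SAFE_WORDS = {"cfo": "bluebird"}     # pre-agreed, known only to both parties

def approve_request(requester: str, callback_confirmed: bool, spoken_word: str) -> bool:
    """Approve only if the requester was re-contacted on their directory
    number and supplied the correct safe word."""
    if requester not in DIRECTORY:
        return False
    return callback_confirmed and spoken_word == SAFE_WORDS[requester]

print(approve_request("cfo", callback_confirmed=True, spoken_word="bluebird"))   # True
print(approve_request("cfo", callback_confirmed=False, spoken_word="bluebird"))  # False
```

Requiring both factors means a deepfake video call or cloned voice alone is not enough to get a request approved.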
While cloning technology has positive applications for society, it is a powerful tool that threat actors will use for nefarious purposes. Make sure to watch the video for a deeper dive into this topic.