Cyber Mirage: How AI is Shaping the Future of Social Engineering

The emergence of artificial intelligence (AI) has transformed the landscape of social engineering and given rise to a new class of AI-powered threats targeting the finance, healthcare, and energy sectors. Earlier this year, a finance professional at a global corporation in Hong Kong was deceived into transferring $25 million to scammers, who leveraged deepfake and voice-cloning technology to impersonate the company's chief financial officer on a video conference call.

In this webcast, Senior Security Consultant Brandon Kovacs examines the sophisticated capabilities AI offers for creating hyper-realistic deepfakes and voice clones.

Through a live demonstration, we will showcase:

  • Real-time AI-powered deepfake and voice cloning technologies
  • How these technologies can be used by offensive cybersecurity professionals to conduct highly effective social engineering attacks
  • The critical need for the development of more sophisticated defense mechanisms to mitigate the risks posed by these rapidly evolving AI-based cyber threats

Slides in PDF format will be made available in the coming months.


About the speaker, Brandon Kovacs

Senior Security Consultant

Brandon Kovacs (CRT, OSCP) is a Senior Security Consultant at Bishop Fox, where he specializes in red teaming, network penetration testing, and physical penetration testing. As a red team operator, he is adept at identifying critical attack chains that an external attacker could use to fully compromise organizations and reach high-value targets.
