Unmasking the threat: how deep fakes endanger corporate security

18 July 2024

Artificial Intelligence (AI)

In today’s digital era, the landscape of cybersecurity threats has evolved dramatically, with AI-generated cyber threats emerging as a significant concern for businesses. These threats, powered by AI and machine learning, can adapt, learn and execute attacks with unprecedented sophistication and efficiency.

As an example – and one construction and transport leaders around the world need to start paying close attention to – British engineering group Arup lost approximately US$25 million earlier in 2024 after scammers used AI-manipulated “deep fakes” to pose as the group’s CFO and request transfers from an employee to bank accounts in Hong Kong (according to the Financial Times).

In essence, as reported by Hong Kong police, a staff member received a message purportedly from Arup’s UK-based CFO regarding a “confidential transaction.” Following a video conference with the fake CFO and other AI-generated “employees,” the staff member made a series of transfers to five Hong Kong bank accounts before the fraud was discovered.

Joel Dandrea

Hyper-realistic digital falsification

As rapid technological advancement continues, the rise of AI-generated deep fakes presents a significant challenge for businesses worldwide. Deep fakes – hyper-realistic digital falsifications created using artificial intelligence – can manipulate audio, video and images, making it appear as though individuals said or did things they never did. As the technology behind them becomes more sophisticated and accessible, the threat to corporate security only grows.

While CFOs may not be keeping such incidents top of mind at present, that will change as they become more frequent. As keepers of a business’s financial keys – and therefore prime targets for impersonation by enterprising scammers – finance chiefs should treat awareness as a critical first step. The entire staff needs to understand threats such as executive impersonation and how such fraud could compromise critical business processes.

Fighting chance

To safeguard against deep fakes, businesses must bolster their cybersecurity frameworks. Implementing advanced threat-detection systems that use machine learning can help identify deep fakes by analysing inconsistencies in video, audio and images, flagging suspicious content for further investigation. In addition, companies will likely soon be investing in biometric authentication methods, such as voice and facial recognition systems designed to detect anomalies indicative of deep fakes.
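To make the idea concrete, the sketch below shows – in purely illustrative Python, with a hypothetical anomaly score standing in for a real trained detector – how flagged content might be triaged for human review:

```python
# Illustrative sketch only: the model and threshold are hypothetical
# placeholders, not a real deep-fake detector. In practice the anomaly
# score would come from a trained classifier analysing inconsistencies
# in video frames, audio spectra or image artefacts.

from dataclasses import dataclass

@dataclass
class MediaItem:
    item_id: str
    kind: str             # "video", "audio" or "image"
    anomaly_score: float  # 0.0 (clean) to 1.0 (almost certainly synthetic)

REVIEW_THRESHOLD = 0.7  # hypothetical cut-off, tuned to business risk appetite

def triage(items: list[MediaItem]) -> list[MediaItem]:
    """Return items whose anomaly score warrants human investigation."""
    return [m for m in items if m.anomaly_score >= REVIEW_THRESHOLD]

if __name__ == "__main__":
    inbox = [
        MediaItem("call-0417", "video", 0.91),  # e.g. a suspicious CFO video call
        MediaItem("memo-0021", "audio", 0.12),
        MediaItem("scan-0333", "image", 0.74),
    ]
    for flagged in triage(inbox):
        print(f"Escalate {flagged.item_id} ({flagged.kind}) "
              f"score={flagged.anomaly_score:.2f} for manual review")
```

The design point is that detection feeds a human escalation path rather than acting alone: the system narrows the haystack, and people investigate what it flags.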

While cybersecurity fraud isn’t new, companies in the construction and transport industries have traditionally been slower to adapt to it – even resistant in many cases. That said, now would be exactly the right time for CFOs to gain as much awareness as possible and start looking closer at the fraud controls they have in place. Finance chiefs should revisit any business processes that could be susceptible to deep-fake and/or social engineering-type attacks. The emergence of generative AI and the tools built on it allows fraudsters to scale such scams massively – effectively altering the economics of fraud.

Employee training, as always, is also crucial in combating the threat of deep fakes, as is developing robust verification protocols – for example, confirming any unusual payment request via a call-back to a pre-registered number and requiring two independent approvals for large transfers.
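By way of illustration only – the policy, threshold and field names below are hypothetical assumptions, not an industry standard – such a verification rule might be expressed as follows:

```python
# Hypothetical sketch of an out-of-band verification rule for payment
# requests: the kind of dual-control policy under which no single video
# call or message can authorise a large transfer on its own.

from dataclasses import dataclass, field

LARGE_TRANSFER_LIMIT = 10_000  # hypothetical threshold, in USD

@dataclass
class TransferRequest:
    amount_usd: float
    requested_via: str                # e.g. "video_call", "email"
    callback_verified: bool = False   # confirmed via a pre-registered phone number
    approvals: set[str] = field(default_factory=set)  # distinct approver IDs

def may_execute(req: TransferRequest) -> bool:
    """Apply a simple dual-control, out-of-band verification policy."""
    if req.amount_usd < LARGE_TRANSFER_LIMIT:
        return len(req.approvals) >= 1
    # Large transfers: require a call-back on a pre-registered number
    # plus two independent human approvers.
    return req.callback_verified and len(req.approvals) >= 2

if __name__ == "__main__":
    req = TransferRequest(amount_usd=200_000, requested_via="video_call")
    print(may_execute(req))   # False: a video call alone is not enough
    req.callback_verified = True
    req.approvals.update({"approver_a", "approver_b"})
    print(may_execute(req))   # True: verified out-of-band and dual-approved
```

Had a rule of this shape been in force in the Arup case, the video conference alone could not have authorised the transfers.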

Ultimately, protecting your company against AI-generated deep fakes requires a multifaceted approach involving advanced technology, employee education, robust verification protocols and proactive monitoring. But by staying vigilant and implementing comprehensive safeguards, businesses can mitigate the risks posed by deep fakes and give themselves a fighting chance against the emergent dangers of the digital age.
