Over the past few years AI has been bolted onto everything from cars to light bulbs, whether it makes sense or not, and criminals are not far behind in this trend. We already have AI-based systems testing computer security, working on bypassing checks and balances in systems and so on, and now, in a new twist, AI is being used for vishing as well. Voice phishing, or vishing as it is sometimes called, is a form of criminal phone fraud that uses social engineering over the telephone system to gain access to private personal and financial information for financial gain.
Anatomy of a Vishing Attack. Source: https://www.biocatch.com/blog/detect-vishing-voice-phishing
In this particular instance criminals used commercially available voice-generating AI software to impersonate the CEO of a German company and then convinced the CEO of its UK-based subsidiary to transfer $243,000 to a Hungarian supplier. The AI was able to mimic the voice almost perfectly, including his slight German accent and speech patterns. This is a new phase of crime and unfortunately it will not be a one-off case; once criminals realize the potential, these kinds of attacks are only bound to increase in frequency. Interestingly, it also makes the biometric voice authentication systems used by certain banks like Citibank more vulnerable to fraud.
To safeguard against the economic and reputational fallout, it is crucial that all instructions are verified over a separate channel, i.e. if you get an email asking for a transfer or for details, call the person, and if you get a call asking for a transfer, follow up via email or other means. Do not use a number provided by the caller for verification; call the number in the company address book or in your own records.
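To make the idea concrete, here is a minimal Python sketch of that call-back rule. The directory contents, the function names and the TransferRequest structure are all hypothetical, invented for illustration; the only point it shows is that the verification number must come from your own records, never from the request itself.

from dataclasses import dataclass

# Hypothetical internal directory, sourced from your own records,
# never from the incoming call or email.
COMPANY_DIRECTORY = {
    "ceo@example-parent.de": "+49-30-5550-0100",
}

@dataclass
class TransferRequest:
    requester: str          # claimed identity, e.g. an email address
    amount: float           # requested transfer amount
    callback_number: str    # number supplied by the caller/sender (untrusted)

def verification_number(request: TransferRequest):
    """Return the number to call back, taken only from the internal directory.

    The callback number supplied with the request is deliberately ignored,
    because an attacker controls it.
    """
    return COMPANY_DIRECTORY.get(request.requester)

def verify_out_of_band(request: TransferRequest) -> bool:
    number = verification_number(request)
    if number is None:
        # Unknown requester: reject and escalate instead of trusting the request.
        return False
    print(f"Call {number} (from internal records) to confirm the "
          f"{request.amount:.2f} transfer before acting on it.")
    return True

if __name__ == "__main__":
    req = TransferRequest(
        requester="ceo@example-parent.de",
        amount=243000.00,
        callback_number="+36-1-555-0199",  # attacker-supplied, never used
    )
    verify_out_of_band(req)

The same principle applies to humans as much as to code: the contact details used for verification have to come from a source the attacker does not control.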
Well this is all for now. Will post more later.
Thanks to: Slashdot.org for the original link.
– Suramya