Can Audio Deepfakes Really Fake a Human?
Audio deepfakes are a new frontier in business compromise schemes, giving criminals an increasingly common pathway to deceive employees and gain access to corporate funds. Nisos recently investigated and obtained the original synthetic audio used in an attempted deepfake fraud against a technology company. The deepfake took the form of a voicemail message purporting to come from the company’s CEO, asking an employee to call back to “finalize an urgent business deal.” The recipient immediately found the message suspicious and, rather than calling the number, referred it to the company’s legal department; as a result, the attack was unsuccessful.