By VINCENT OWINO

The African Union Commission has fallen victim to cybercrime after fraudsters deployed artificial intelligence tools to impersonate the continental body's head, Moussa Faki.

It could be one of the first diplomatic breaches enabled by the new technology.

Mr Faki, who is the Chairperson of the African Union Commission, the secretariat of the African Union, routinely writes to global leaders whenever he needs to place a call.

Such a letter is formally known as a note verbale and is the standard procedure for scheduling meetings between the African Union leadership and representatives of other countries or international organisations.


But fraudsters faked his voice and placed several video calls to European capitals, ostensibly seeking to arrange meetings.


The AU Commission has revealed that the cybercriminals also used fake email addresses, pretending to be the organisation's deputy chief of staff, to arrange calls between foreign leaders and Mr Faki.

Ebba Kalondo, Mr Faki's spokesperson, confirmed that the pranksters then went ahead to hold video calls with several European leaders, using deepfake video alterations to impersonate the chairperson.

In a statement on Friday, the AUC said it "regrets these incidents," reiterating that the commission only uses official diplomatic channels to communicate with foreign governments, through their embassies in Addis Ababa.

“The African Union Commission reiterates its strict adherence to diplomatic protocol and exclusive usage of Note Verbale for high-level engagement requests,” Ms Kalondo said in a tweet.

It is not yet clear what the impostors' intentions were, but the AU statement termed their fake emails "phishing," an indication that they might have intended to steal digital identities to gain access to privileged information.

Deepfakes, the technology used by the cybercriminals, are becoming increasingly common and are sometimes used to spread misinformation and propaganda.

They involve using artificial intelligence tools to manipulate a person's image, voice and mannerisms into a video of them doing or saying something they have not actually done.
