How Deepfake Audio-visuals Were Used To Shape Political Narratives In Nigeria’s 2023 Polls

Political actors have at different times deployed various strategies for circulating propaganda to bring down their opponents; this time, however, the advancement of tech tools has taken the art to an entirely different level.

The just-concluded 2023 general elections witnessed the use of what was later described as deepfake images, videos and audio clips to push certain narratives about candidates and political parties participating in the polls.

The most recent example that easily comes to mind is that of last Thursday, when a photo went viral purportedly showing the executive chairman of the Nigerians in Diaspora Commission (NiDCOM), Dr Abike Dabiri-Erewa, seated in an office with the presidential candidate of the Labour Party (LP), Mr Peter Obi, and a United Kingdom law enforcement officer.

The report that accompanied the photo, posted on Twitter by a chieftain of the All Progressives Congress (APC), Adamu Garba, and sighted by a Prime Business Africa correspondent, alleged that Dabiri-Erewa had met with Obi to secure the LP presidential candidate’s release from detention.

This came a few days after Obi was reportedly detained by UK immigration authorities at Heathrow Airport, London, over impersonation allegations.

As the fake photo circulated on social media, NiDCOM spokesperson Abdur-Rahman Balogun issued a statement countering it.

Balogun said in the statement that Dabiri-Erewa was not in the UK at the said time and had no power to secure the release of any Nigerian undergoing interrogation in the country. He dismissed the photo as one of the “social media stunts” deliberately circulated to deceive the public.

It was later discovered that the images of Obi, Dabiri-Erewa and the law enforcement officer seen in the photo were cropped from different photographs and pieced together to make it look as though the LP presidential candidate was seated with the NiDCOM executive chairman to secure his release from detention.

Another recent controversial case is the purportedly leaked audio of Peter Obi having a conversation with the presiding pastor of Living Faith Church, Bishop David Oyedepo.

The audio, which was released by the People’s Gazette, a Lagos-based online newspaper, featured what sounded like the voices of Obi and Oyedepo.

The voice that sounded like Obi’s was urging the other speaker, presumed to be Oyedepo, to canvass for Christian votes on his behalf.

Many people, especially members of opposition political parties and their supporters, called Obi names, accusing him of being a religious bigot and of calling for a religious war in the country during and after the election.

Obi’s camp, however, denied it, describing it as a “deepfake” audio produced by opposition forces to tarnish the LP presidential candidate’s image and abort his quest to seek redress over the outcome of the February 25 presidential election in court.

While denying the audio, Obi disclosed that his party’s legal team had been directed to institute a case against People’s Gazette and others for circulating it.

“Let me reiterate that the audio call being circulated is fake, and at no time throughout the campaign and now did I ever say, think, or even imply that the 2023 election is, or was, a religious war. The attempts to manipulate Nigerians is very sad and wicked. Our legal team has been instructed to take appropriate legal actions against People’s Gazette and others,” Obi said in a statement.

Also, the Labour Party, in a statement signed by the chief spokesperson of the Obi-Datti Presidential Campaign Council, Dr Tanko Yunusa, described the audio as part of continued attempts by the All Progressives Congress to tarnish the image of Peter Obi.

“From the show of shame in Port Harcourt to the drama in the Ibom Air aircraft, both of which they contrived, they have now moved to the circulation of a deep fake audio file aimed at promoting religious tension in the country,” the statement read.

People’s Gazette has said it stands by the authenticity of the audio it released. It, however, went ahead to suspend the reporter who filed the report that accompanied the leaked audio, stating that his “conduct online violated the newspaper’s social media policy and called into question its integrity.”

Prior to the February 25 presidential and National Assembly elections, a similar audio involving the Peoples Democratic Party (PDP) presidential candidate, Atiku Abubakar; his running mate, Governor Ifeanyi Okowa; and Sokoto State Governor, Aminu Waziri Tambuwal, was circulated. The audio file contained voices that sounded like the trio’s, purportedly discussing how to rig the election.

Atiku Abubakar, however, denied it, saying it was the work of his opponents, who were on a pernicious propaganda drive.

A statement by Atiku’s Special Assistant on Public Communication, Phrank Shaibu, claimed that it was created with artificial intelligence technology by the opponents to deceive the public.

“In this age of artificial intelligence technology, even dead people can be portrayed as delivering speeches. This is nothing new,” Shaibu stated.

Meanwhile, the Centre for Democracy and Development (CDD) said findings from a fact-check it conducted indicated that the audio file had been “manipulated.”

“Running the audio through deepware, we discovered that the audio has been manipulated and was 97% fake.

“The viral audio is a fake; it is an ensemble of various audio files containing the voices of the persons in question, falsely edited to spread a false narrative,” CDD said in its verdict on the fact-check it conducted.

Deepfake audio-visuals gaining prominence

In this era of digital media, where information manipulation has become rife, deepfake technology has no doubt gained popularity among tech enthusiasts who deploy it for selfish motives, whether political, economic or terror-related.

This artificial intelligence technology enables the alteration of videos, photos and audio. Also called voice cloning, audio deepfake is a type of artificial intelligence used to create convincing speech that sounds like specific people saying things they never said.

Deepfakes use AI to generate completely new videos or audio, with the goal of portraying something that never actually happened.

The technology was reportedly created for positive uses, to help people solve real-life problems, but it has today been turned to negative ends by criminals, terrorists and political gladiators for propaganda purposes.

Tianxiang Chen, Avrosh Kumar, Parav Nagarsheth, Ganesh Sivaraman and Elie Khoury, in a 2020 workshop paper titled “Generalization of Audio Deepfake Detection,” noted that such tools can serve as a logical-access voice-spoofing technique and can be used to manipulate public opinion for propaganda, defamation, or terrorism.

A Washington Post report in August 2021 gave an account of how a UK-based software company, Sonantic, which helps clone actors’ voices, helped Hollywood actor Val Kilmer “to speak again.” Kilmer had undergone surgery for throat cancer in 2015, which affected his voice and made it difficult for his audience to understand him.

But he embraced the technology through the tech firm to connect with his audience once again.

However, voice cloning technologies have in recent times become instruments of manipulation and disinformation because they produce audio that seems real and is very difficult to disprove.

“The generated voices have gotten more realistic in the age of deepfakes, a technology that uses AI to manipulate content to look and sound deceptively real. This media is so good that it is sometimes tough to tell the difference between human voices and their synthetic counterparts,” said Dalvin Brown, Washington Post tech writer.

According to experts, creating a synthetic audio of someone requires audio recordings of the person, which are fed into AI software on a computer or a mobile device with such capabilities. A new script can then be written, and the AI will generate an audio clip of the intended person appearing to say it.
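
How little this now takes can be illustrated in a few lines of code. The sketch below is a minimal illustration, assuming the open-source Coqui TTS Python library and its XTTS v2 voice-cloning model; the file names and script are hypothetical:

```python
# Minimal illustration of voice cloning with the open-source Coqui TTS library.
# "reference.wav" is a hypothetical recording of the target speaker's voice;
# "generated.wav" is the synthetic output.
from TTS.api import TTS

# Load a pretrained multilingual voice-cloning model (XTTS v2)
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Generate speech in the target speaker's voice from an arbitrary script
tts.tts_to_file(
    text="Any script the creator wants the target to appear to say.",
    speaker_wav="reference.wav",  # sample of the target's real voice
    language="en",
    file_path="generated.wav",    # the resulting cloned audio
)
```

A few seconds of clean reference audio, which public figures supply in abundance through interviews and speeches, is often enough for a convincing result.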

Impact of Audio-based Deepfakes in Nigerian Politics

Though there are still contestations over whether the Obi “Yes Daddy” audio is real or not, a group identified as Democracy Watchman said it conducted a forensic analysis of the purportedly leaked audio.

The group took to its Twitter handle on April 2 to explain the steps it took to verify the authenticity of the said audio.

The forensic analysis was conducted using Adobe Premiere Pro, a timeline-based, non-linear video editing software application developed by Adobe Inc.

Describing its procedure for verification, the Democracy Watchman said: “The first step is to analyse the audio itself for any form of manipulation. Is this an authentic audio that has been cut and joined or is this an entirely scripted conversation that never took place?”

“That question is not so hard to answer, as a quick “Scene Edit Detection” check on Adobe Premier Pro reveals that the audio has indeed been manipulated.

“Premier Pro was able to detect 4 different audios that have been put together to form the 4mins 17 secs clip.”
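
Scene Edit Detection is a point-and-click feature, but the underlying idea, flagging abrupt discontinuities where separate recordings were joined, can also be sketched programmatically. Below is a minimal illustration, not the Premiere Pro algorithm, assuming the Python librosa library; the file name and threshold are hypothetical:

```python
# Illustrative sketch: flag abrupt spectral discontinuities that may indicate
# splice points where separate recordings were joined. Not Premiere Pro's
# algorithm; "leaked.wav" and the threshold are hypothetical.
import librosa

y, sr = librosa.load("leaked.wav", sr=None)

# Onset strength measures frame-to-frame spectral change; hard cuts between
# different recordings tend to show up as outlying spikes.
flux = librosa.onset.onset_strength(y=y, sr=sr)
times = librosa.times_like(flux, sr=sr)

# Flag frames whose spectral change is far above the clip's typical variation
threshold = flux.mean() + 4 * flux.std()
for t in times[flux > threshold]:
    print(f"Possible edit point near {t:.2f} seconds")
```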

The analyst said he went ahead with the second phase of the forensic analysis, which was to determine whether the audio was generated with AI and therefore an outright fabrication.

“To prove or disprove this theory, I relied on the only tool that is 99% reliable – AI. Using an AI tool created to detect AI-generated audio, I ran a scan of the leaked audio.”

The scan, according to him, was done using GitHub’s dessa-oss, an open-source deepfake detection tool.
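
The dessa-oss project’s own interface is not reproduced here, but detectors of this kind generally follow the same recipe: convert the audio into a spectrogram and pass it through a classifier trained to separate human speech from synthetic speech. A generic, hypothetical sketch of that workflow in Python:

```python
# Generic illustration of how audio deepfake detectors are typically run.
# This is NOT the dessa-oss API; "detector.pt" is a hypothetical pretrained
# binary classifier (real vs. synthetic speech).
import librosa
import torch

y, sr = librosa.load("leaked.wav", sr=16000)

# Convert the waveform into a log-mel spectrogram, the usual detector input
mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=128)
log_mel = librosa.power_to_db(mel)

model = torch.load("detector.pt")  # hypothetical pretrained detector
model.eval()

with torch.no_grad():
    x = torch.tensor(log_mel).unsqueeze(0).unsqueeze(0).float()
    score = torch.sigmoid(model(x)).item()  # probability the clip is synthetic

print(f"Estimated probability of synthetic speech: {score:.0%}")
```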

He uploaded screenshots showing the differences displayed between a manipulated audio and a real one when run through the detection tool.

The analyst concluded that, based on the results he got in the first and second phases of the forensic analysis, the audio is not original.

The use of AI-generated audio deepfakes in Nigerian politics has introduced another level of political propaganda.

However, this is not new in foreign climes.

Audio deepfake attackers have targeted individuals and organizations, including politicians and governments.

In the third week of the Russia-Ukraine war, a video emerged online and was even broadcast on television stations. The video appeared to show Ukrainian President Volodymyr Zelenskyy, stilted, with his head moving and his body largely motionless, calling on the citizens of his country to stop fighting Russian soldiers and to surrender their weapons. The video further portrayed Zelenskyy as having already fled Kyiv.

The Ukrainian president later came out to debunk the video, a Northeastern Global News report of April 1, 2022, indicated.

Similar cases of the use of deepfakes involving the United States have been spotted at different times.

In 2018, a Belgian political party released a video of then US President Donald Trump giving a speech calling on Belgium to withdraw from the Paris Climate Agreement. It was discovered that Trump never gave such a speech; the video was created with deepfake technology.

Also, in early 2021, at the peak of the COVID-19 scourge in India, a video went viral showing idols dumped on what looked like a riverbank, with the narrative that Indians, who are mostly Hindu worshippers, had, out of rage, discarded their religious objects because the gods could not save them from the disease that caused unprecedented deaths in the country.

Some reputable media platforms later ran fact-checks on the report and found that the video first appeared in 2015.

Using reverse image search tools, AFP Fact Check was able to trace when the video first appeared online, a date that predates the outbreak of the coronavirus pandemic.

“The claim is false: the said clip dates back to 2015 – four years before the pandemic erupted in December 2019. It shows a ritual during a Hindu festival dedicated to the Hindu god Ganesh,” the AFP Fact Check revealed.
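
Commercial reverse image search engines are proprietary, but the core idea, matching a frame from a viral clip against earlier copies of the same picture, can be approximated with perceptual hashing. A minimal sketch, assuming the Python Pillow and imagehash packages and hypothetical file names:

```python
# Illustrative sketch: compare a frame from the viral clip against an archived
# 2015 frame using perceptual hashing. File names and the threshold are
# hypothetical; real reverse image search engines query far larger indexes,
# but the principle is similar: near-identical images yield near-identical hashes.
from PIL import Image
import imagehash

viral = imagehash.phash(Image.open("viral_frame.jpg"))
archived = imagehash.phash(Image.open("archived_2015_frame.jpg"))

# Hamming distance between the two hashes; small values mean a visual match
distance = viral - archived
print(f"Hash distance: {distance}")
if distance <= 8:  # hypothetical matching threshold
    print("Likely the same footage: the clip predates the pandemic.")
```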

Experts have expressed concerns about the impact of such manipulated audio-visuals, saying they end up reinforcing the biases of certain categories of audience, who are then difficult to convince otherwise.

Victor Ezeja is a passionate journalist with six years of experience writing on the economy, politics and energy. He holds a master’s degree in Mass Communication.

