What to know about user generated content during Ukraine conflict

Russia has so far been unable to change or even muddy the global narrative on Ukraine. The world is largely united in seeing that Ukraine and its people did nothing to provoke aggression from Russia. Syrians were also victims. I witnessed horrible violence against peaceful protests throughout Damascus and other cities at the start of that conflict. Yet, despite the extraordinary amount of visual evidence in Syria, the international community was unable to fully align on a narrative the way it has thus far in Ukraine. Why? The answer is complicated, shaped by geo-strategic, historical, political, and interest-based factors—all of which should be honestly examined. But there is one critical piece worth examining: visual truth.

Currently, Ukraine has a largely permissive media environment. Though it is dangerous, most international news organizations can enter the country, run a live feed, and allow the world to watch events unfold live on television. Similar to the protests in Egypt in 2011, the whole world saw the crowds of people in Tahrir Square live, despite swirling false narratives.

In Ukraine today, social media is littered with “cheapfakes.” These out-of-context or altered pieces of media are problematic but have not truly influenced the global narrative. No doubt President Biden’s strategic declassification of intelligence in the weeks leading up to the invasion helped undercut false narratives, but the ongoing live broadcasts and broadcasters’ access confirmed to the world what was happening on the ground. This has remarkable value in helping preempt false narratives and rally global empathy toward Ukraine and its brave people.

Perhaps more importantly, it forces governments to address the reality rather than use the fog of war as a shield to avoid difficult political and geo-strategic decisions. Even UN Secretary-General Guterres directly called out the Russians for the attack, rather than suggest any equivalency or offer a tepid plea for cessation of hostilities on all sides.

Syria, on the other hand, was already a closed-off media environment before the conflict, and access worsened further at the start of the uprising. Simultaneously, the Assad regime immediately rolled out a counter-narrative warning of “fitna,” or sectarian strife, to heighten fears in its multi-ethnic society, even before widespread protests started.

The world—including Syrians themselves—was forced to rely on user-generated content, images and videos captured on smartphones, to decipher what was happening throughout the country. In response, bad actors pivoted and weaponized the “Liar’s Dividend,” discrediting all user-generated content because it could not be authenticated. In other words, if a piece of digital content could theoretically be fake, then everything can be claimed to be fake and nothing has to be accepted as real. This will remain a problem until more tech platforms begin seriously engaging in and adopting digital content provenance standards.

Until they do, advanced editing tools and the rise of synthetic media will allow artificial intelligence (AI) to continue to be weaponized to exacerbate visual fraud that undermines trust in all digital content. A recent UC Berkeley study shows that AI-generated faces are indistinguishable from real ones and perceived as more trustworthy. We are potentially seeing signs of this already, as at least one fake persona was allegedly created and active on social media, complete with an alleged synthetic headshot.

Use of the Liar’s Dividend is especially effective when combined with a fabricated counter-narrative. It forces decision-makers to prove that asynchronous, often anonymous, poorly filmed video is authentic, which is still not possible in real time or at scale. As a result, bad actors can propel their narratives and cast doubt simply by questioning authenticity. Those same bad actors can simultaneously flood the internet with purposefully fake content to reinforce their claims that “everything is fake.” This dilutes authentic information and promotes general confusion and apathy both abroad and at home, especially as Russians may start seeking non-state media sources in greater numbers.

On March 1, Ukrainian defense officials warned that a large-scale “information and psychological operation” meant to deceive was imminent. Not only does this confuse the general public, but it also gives governments and members of the international community the plausible deniability to stall, drag out decisions, or resort to lengthy investigations as cover to avoid the hard choice of swift and decisive action.

Thus far, media outlets are still able to access large parts of Ukraine, so the Liar’s Dividend is much harder to deploy. As a result, brave people are able to furnish reports allowing the world to see what’s happening in a way that preempts fabricated narratives and unites it behind the Ukrainian people.

If, in the future, large parts of the territory become inaccessible to media, I caution you to be mindful of the Liar’s Dividend and aggressive counter-narratives designed to discredit the flow of user-generated content and muddy the truth we all see today.

Mounir Ibrahim is the VP of Public Affairs & Impact at Truepic.