As the world watches Europe convulse from Russian President Vladimir Putin’s aggression and invasion of neighbouring Ukraine, we are struck with information overload from voluminous data sources broadcasting on televisions and radios, on social media, and in the print media from newspapers to pamphlets.
Some of that sympathy eroded as we empathised with video clips of our African sisters and brothers experiencing racism while fleeing the conflict to safety.
As Putin’s war and the world’s reaction progress differently than anticipated, the prospect of a ramp-up in misleading information looms large.
Already, many supporters of former US President Donald Trump have fallen victim to non-factual pro-Putin propaganda.
In 2021, Karen Santos-d'Amorim and Májory Miranda proposed ways to categorise fake or harmful news and information sharing based on the intentionality behind the information.
They broke harmful information down into three categories. First, disinformation: incorrect or inaccurate information spread with the intent of deceiving an audience, such as propaganda.
Given Putin’s war, we can expect a sharp increase in disinformation propaganda. An example closer to home, the 1994 genocide in Rwanda partially originated from colonial disinformation and misinformation followed later by years of disinformation propaganda falsely dehumanising the ethnic Tutsi population.
The Cambridge Analytica scandal of 2018 exposed political disinformation campaigns around the world and right here in Kenya.
Second, misinformation means false information shared without the intention to fool anyone; it can be spread accidentally.
During the Covid-19 pandemic, many global citizens shared untrue coronavirus and vaccine information believing it to be accurate.
Third, malinformation means real accurate information that is disseminated as a way to cause harm. Examples include publishing embarrassing photos of someone without their permission or leaking confidential corporate financial statements.
Erik Nisbet and Olga Kamenchuk highlight how the human brain becomes overwhelmed with distinguishing accurate information from false or malicious information.
They coined the term informational learned helplessness: when we are bombarded over time with distressing information about situations we can neither avoid nor help resolve, we begin to accept those situations as unchangeable realities of life.
We gain a certain functional fixedness. People with the highest levels of helplessness sadly are the most likely to share fake news.
Psychology authors Susan Nolan and Michael Kimball find that informational learned helplessness leads to conspiracy theories and cognitive exhaustion.
Unfortunately, the exhaustion causes individuals to stop fact-checking news stories and social media gossip.
They cite Erik Nisbet and Olga Kamenchuk’s finding that, because of repetitive government propaganda, millions of Russians erroneously believe that philanthropist Bill Gates funded a secret lab that started Covid-19.
Susan Nolan and Michael Kimball ponder whether news organisations should, therefore, be required to post links to their sources similar to the Wikipedia model.
On an individual level, we must not weary of rigorous fact-checking, making sure we consume accurate, well-intentioned information and do not let ourselves get brainwashed.
The authors recommend becoming self-aware and refraining from liking or sharing online content during times when we feel overwhelmed, waiting instead until we can fight back in favour of a better world.
More directly to us business professionals in East Africa, how do we combat misinformation, disinformation, and malinformation about our own firms? Do we provide transparent data to build trust among our stakeholders?
Social scientist Jacktone Momanyi details that businesses often do not know about misinformation pertaining to their organisations until it is too late.
Proactive, robust social media analysis must be integrated into communications or public relations departments.