The World Economic Forum has published its Global Risks Report 2024. The report is based on data from the Global Risks Perception Survey conducted worldwide.

This year’s survey drew 1,490 participants from academia, business, public administration, the international community, and civil society organizations. In other words, we can safely say the study rests on the views of a broad and well-qualified audience.

We are used to seeing climate change, social instability, or cyber insecurity at the top of risk surveys. This year’s survey, however, produced a major surprise. When asked to rank the most important risks over two-year and ten-year horizons, these professionals placed “misinformation and disinformation” at the very top of the near-term ranking.

Actually, don’t be misled by the word “surprise.” That this risk ranks so high is not surprising at all. Why? The answer is simple. As we have recently seen in abundance in the Israel–Palestine conflict, information can be bent and twisted, framed to confirm a particular viewpoint, or, more bluntly, presented as something other than the truth in order to serve an interest. Images unrelated to the content can be used for this purpose, as can videos and photos generated by artificial intelligence, now within easy reach thanks to technological advances. In such an environment, things get so murky that truth and falsehood, reality and lies, become intertwined.

Opinion leaders in risk management are particularly concerned about the major elections to be held around the world throughout 2024, and this is the main reason the risk ranks first. The fear is that manufactured misinformation and disinformation could steer election results away from the will of the people and impair the mechanisms of democracy.

Does this seem impossible to you? Don’t be so sure.

Let me give an example. As you know, the United States will hold a presidential election this year, and an interesting incident took place in New Hampshire. Some voters received phone calls from someone who sounded very much like President Joe Biden. The call tried to convince New Hampshire residents to stay home during last week’s primaries and “save their votes” for the general election in November. This, of course, makes no sense whatsoever: voters can cast ballots in both elections, so why would Biden tell them otherwise? The answer is simple. Biden had made no such call. These were automated calls generated by artificial intelligence imitating his voice.

Another recent example of fake content involves Taylor Swift, or more precisely, the fabricated obscene images of the artist that circulated online.

These fake images went viral on X and were viewed millions of times. Many users also shared them from their own accounts, and Taylor Swift’s fans clashed with those spreading the images. In other words, even on this smaller scale, fake content scored yet another (!) success in creating polarization.

The examples could go on…

The coming period will be turbulent. Misinformation and disinformation will appear much more frequently from now on.

Governments, and the companies that operate social media platforms and internet services, need to establish strong rules and tools of control to eliminate such content at the source. But of course, not everything can be expected from governments and companies. As individuals, we also need to avoid believing everything we hear at face value, to run it through the filter of logic, and to refrain from commenting on or acting upon such news before checking it against other sources.

Otherwise, it will be inevitable that democracy, already walking with great difficulty, stumbling and out of breath, comes to an end, and that the world is dragged into a dark period.

Our duty is to warn.

Until the next article, stay healthy…

Özgün Çınar