Accessing information has required a great deal of struggle throughout history. Now, however, we are in a digital age, and accessing information is easier than ever before; the flow of information reaches millions in a matter of seconds.
In our daily routine, we often grab our phones, perhaps to follow the national agenda, or perhaps to open the most popular social media platforms and look at our friends' photos. News of conflicts from around the world, the results of sporting events, or political developments in Turkey are right in front of us.
However, this speed and accessibility also bring serious risks. While accessing information is very easy, accessing accurate information is not easy at all. Algorithms choose what we consume in the digital world for us, and the sense of speed suppresses critical thinking. Hate speech and disinformation can deepen social polarization, creating destructive effects on both individuals and society.
The anatomy of polarization
Disinformation does not float around in a vacuum; it often triggers emotions, leading to more dangerous consequences. It is possible to think of this process as a mechanism.
It all starts with a claim being put forward. For example, the claim that “Syrians enter universities without exams.” This may seem like ordinary misinformation on social media; however, the potential for harmful information to cause damage is greater than we think.
In the second step, this claim merges with the feelings of everyday injustice in our lives. When people believe that another group is entering university “privileged,” while their own children have worked hard for years, it ceases to be just “information” and gives way to powerful emotions like anger.
By the third stage, this anger seeks a social target, and the claims evolve into hate speech. In this example, immigrants are collectively blamed. What starts as a single claim circulating as “Syrians are enrolled without exams” can eventually turn into discriminatory rhetoric such as “They are stealing our future.” Thus, individual grievances, economic concerns, or criticisms of the education system are directed at immigrants.
This example demonstrates how false information and hate speech feed off each other. Making this chain visible is critical to understanding why false information is not merely simple "misinformation," but a hate-production mechanism that threatens social peace and democracy and directly fuels polarization.
The solution lies in critical digital literacy
Of course, false information existed before the digital world became an indispensable part of our lives; but the digital age we live in has brought new and complex dimensions to this problem, such as hate speech and polarization. Thus, the skills required for a solution have also become more diverse.
It is precisely at this point that critical digital literacy emerges not as a luxury, but as a survival skill in the digital world. Critical digital literacy refers to individuals' ability to access, manage, understand, integrate, communicate, evaluate, and produce information through digital technologies; as well as their competence to use this information safely, ethically, context-sensitively, and with consideration for its social implications.
It should also be noted that particularly the concepts of “digital media literacy,” “media literacy,” and “media and information literacy” are often used interchangeably. So why do I prefer to use “critical digital literacy” rather than “digital media literacy” in this article? Because the issue is not limited to questioning the accuracy of shared content; it is about understanding the systems that produce and circulate information, the ownership structures of algorithms, and the content they promote. Critical thinking comes into play precisely at this point: this competence empowers users to make informed decisions in the digital world, ensure online safety, and comprehend social impacts.
What are the core skills?
There are many frameworks that describe the skills involved in critical digital literacy. The framework outlined by Juliet Hinrichsen and Antony Coombs from the University of Greenwich consists of five stages and depicts an established literacy model supported by a critical perspective:
1) Decoding
Decoding can be summarized as the ability to see the invisible mechanisms and rules of the digital world. For example, if we constantly see similar content on social media, recognizing that this is not a coincidence but the result of algorithms, and consuming it with that awareness, is decoding in action.
2) Meaning making
Every piece of digital content we encounter gains meaning through our experiences, knowledge, and biases. Therefore, interpretation involves questioning the context, purpose, and emotional impact of digital content. For instance, when encountering a hateful post targeting a group, we must remember that it reinforces biases and exclusionary attitudes.
3) Analyzing
To make conscious choices in the digital environment, we need to be aware of the creator behind the content, that is, the author's or source's aesthetic, ethical, and critical choices.
For example, even if the shared image is accurate, we need to keep in mind that it may have a manipulative context. Analysis requires thinking by asking questions such as, “Who produced this, for what purpose did they put it into circulation, and who could it harm?”
4) Persona
This dimension is about consciously managing our presence in the digital world.
In the online world, information does not circulate alone, but together with our identities. When we comment or share, we are actually building our own digital persona. For example, when we see a hateful comment directed at an athlete, staying silent is a choice; so is politely objecting or offering an alternative voice.
5) Using
The final step points to the ability to use digital tools not only for consumption but also for production and democratic participation. When we see someone we know sharing misinformation, explaining it calmly is a practical application of this skill. Creating a healthier digital environment for both ourselves and the community we are part of requires active participation.
That was the slightly theoretical part. So, how can we use these skills in our daily lives?
What is the source?
Not all information we encounter in the digital world is accurate. Not immediately believing everything we see and maintaining reasonable skepticism allows us to stand more firmly on our feet.
If you encounter a source whose reliability you are unsure of, you can visit independent verification platforms or track down the truth yourself. Remember, verification methods are available to everyone, not just experts. Some verification steps we can take include typing the claim into search engines to see what other sources say about it, checking whether it is current or just an old story being brought up again, and confirming whether the author of the content is an expert on the subject.
Don't forget the algorithms
Remember that the content you encounter is tailored to you by algorithms. When we understand that we live in filter bubbles, fed by AI-powered algorithms with content designed to keep us engaged, we can become more resistant to both manipulation and hate speech.
The results of a study led by researcher Myojung Chung reveal how knowledge about algorithms shapes attitudes and behaviors toward misinformation.
Be aware of your emotions
Anger, shock, or fear... Such emotions can open the door to both manipulation and hate speech. That's why we need an internal brake. Simply being aware of these emotions can keep us from acting on impulse. Taking a step back and asking questions like, "How did this content make me feel, and who could be harmed by it?" brings us one step closer to critical digital literacy.
Think before you share
Speed isn't always our friend. Misinformation spreads much faster than we think, and at the end of the day, polarization can increase. Think the other way around: asking yourself, “If this claim said the exact opposite, would I still believe it?” is a good way to recognize our own biases.
Is responsibility solely on individuals?
Individual reflexes are certainly valuable. These small steps protect us and, when spread to our surroundings, create collective resistance. But responsibility should not be placed solely on individuals.
The solution goes beyond individuals being careful; it requires platform transparency, regulations, education policies, and social awareness. Tech companies must make their algorithms transparent, governments must establish accountable rules, and civil society and academia must empower society through education and ethical journalism. Real change can only happen when these institutions take an active role. Critical digital literacy also involves scrutinizing these structures.
Dr. Gianfranco Polizzi also highlights the responsibility of public institutions, especially the education system, to promote critical digital literacy and enable people to participate more actively in democracy.
According to UNESCO, countries are developing digital literacy policies for different purposes. For example, while the goal in Korea is more transparent public services, Oman aims to bridge the digital divide and create employment for young people. These differences highlight the importance of considering cultural context in combating hate speech and disinformation.
Why shouldn't we give up our reflexes?
The journey to becoming critically digitally literate is a long and arduous one. So, wouldn't it be easier and faster to ask, “Grok, is this true?” and have it check for us?
Critical digital literacy, like a muscle, gets stronger the more you use it. If we ask someone else about everything, we will inevitably lose our own reflexes for questioning and analyzing over time. However, questioning things ourselves makes us active subjects who move consciously in the digital world.
On the other hand, like any single source, artificial intelligence may also give incorrect, incomplete, or manipulated answers. Critical thinking requires consulting multiple sources without losing diversity of information.
There is still hope
In short, critical digital literacy is more than a shield that protects us from manipulation; it is a tool that strengthens social peace and democracy. This path does not involve rejecting all information; it involves taking more conscious steps by first recognizing our own biases and then understanding how the digital world works.
Of course, it is not an easy path; but every questioning glance, every conscious sharing, and every small reflex becomes part of a collective resistance. It is possible to make the digital world a fairer, more inclusive, and safer place. And this transformation is actually in all of our hands.
Digital Media Literacy Series
Algorithmic bias: Platform capitalism, data and reality - Tirşe Erbaysal Filibeli
How to approach "accurate" information in the digital age? - Koray Kaplıca
Journalists' rights and obligations in digital media - Nihan Güneli
Our Media
IPS Communication Foundation/bianet is among the partners of the EU-funded "Our Media: Civil Society Movement for the Multiplication of Media Literacy and Activism, Prevention of Polarization, and Promotion of Dialogue" project, which will run for three years, from 2023 to 2025.
The project's initial focus will be on building the capacity of NGOs, media professionals, young activists, and the public in the Balkans and Turkey to address trends and challenges related to media freedom, development, and sustainability.
The partners of the "Our Media" project, funded by the EU and covering the years 2023–2025, are as follows:
South East Europe Network for Professionalization of Media (SEENPM)
Albanian Media Institute (Tirana)
Mediacentar Foundation (Sarajevo)
Kosovo Press Council
Montenegro Media Institute (Podgorica)
Macedonia Media Institute (Skopje)
Novi Sad School of Journalism (Novi Sad)
Peace Institute (Ljubljana)
bianet (Turkey).
The researcher for the “Our Media” project on behalf of the IPS Communication Foundation/bianet is Sinem Aydınlı, the foundation's research coordinator.
A new civil society initiative: 'Our Media'
Scope of the project
The project begins with research aimed at identifying key trends, risks, and opportunities for media sustainability and mapping good practices in media activism to support media freedom and media and information literacy (MIL). The research findings will be used to strengthen the capacities of NGOs and other stakeholders in the media field to address challenges in the media.
Advocacy activities will be carried out to understand the capacities of journalists, media organizations, and media institutions within the scope of “Our Media.” Local and national media and other actors will be encouraged to carry out media activism work on gender inequalities in the media. Within the scope of the project, young leaders will be empowered to oppose discrimination and gender stereotypes and to support gender equality through various activities.
The project will reach local communities through financial support provided to NGOs in urban and rural areas, with the aim of developing citizens' MIL skills, supporting media freedom and integrity, and countering polarization caused by propaganda, hate speech, and disinformation.

The regional program “Our Media: A civil society action to generate media literacy and activism, counter polarisation and promote dialogue” is implemented with the financial support of the European Union by partner organizations SEENPM, Albanian Media Institute, Mediacentar Sarajevo, Press Council of Kosovo, Montenegrin Media Institute, Macedonian Institute for Media, Novi Sad School of Journalism, Peace Institute and bianet.
This article was produced with the financial support of the European Union. Its contents are the sole responsibility of IPS Communication Foundation/bianet and do not necessarily reflect the views of the European Union.
(ÖHK/SA/VC/VK)

