Digital Media Literacy Beyond the Classroom

By Seden Anlar and Maria Luís Fernandes, 16 January 2025

In the third and last part of their article series on AI and digital literacy, Seden Anlar and Maria Luís Fernandes focus on non-formal education. Such initiatives outside of the school system aim to reach wider audiences and provide practical, accessible tools for navigating the complexities of the digital world. 'Ensuring young people are equipped to navigate misinformation and critically engage with content is not just an educational priority but a political one.'

Artificial intelligence (AI) has transformed the way information is created, shared, and consumed, creating both opportunities and challenges in the digital world. Tools like ChatGPT have made it easier to produce content, but they have also accelerated the spread of misinformation. AI-generated deepfakes and manipulated media have already influenced elections and swayed public opinion, raising concerns about the growing impact of disinformation. Efforts by social media platforms and news organisations to mitigate these risks – such as labelling AI-generated content and encouraging community-driven fact-checking – have seen limited success. For young people, in particular, the sheer volume of digital content makes distinguishing fact from fiction increasingly difficult. Studies reveal that individuals aged 15–24 are especially vulnerable to believing and sharing false information, often relying on intuition rather than critical evaluation.

As we described in the first article of this series, the European Union has responded to these issues with initiatives like the Digital Education Action Plan and the Media Literacy Guidelines, designed to help individuals navigate today’s media landscape. However, the implementation of these policies varies significantly between member states. Taking a deeper look at Belgium and Portugal, with their differing approaches to digital media literacy in schools, offers valuable insights into how countries are tackling the challenges posed by AI-driven content. In the second article, we analysed policy documents and reports, reviewed existing initiatives, and conducted interviews with teachers, students, and civil society members in both countries to better understand these efforts.

Higher Education

While primary and secondary schools continue to grapple with the challenges of integrating AI and media literacy into curricula, higher education institutions in Belgium and Portugal are taking notable steps towards education on AI.

For instance, in Belgium, Vrije Universiteit Brussel (VUB) set a national precedent by launching the country’s first academic bachelor’s program in artificial intelligence in 2022. This milestone was followed by the creation of advanced master’s programs, such as KU Leuven’s interdisciplinary master’s degree in AI and Ghent University’s AI research initiatives. These programs primarily focus on practical applications of AI, aiming for innovation in fields ranging from healthcare to robotics and sustainability.

The focus of these programs remains largely on technical aspects rather than on digital media literacy or combating AI-driven misinformation.

Portugal’s universities have also expanded their offerings. ISCTE – University Institute of Lisbon provides a bachelor’s degree in Digital Technologies and Artificial Intelligence, while the University of Porto offers a degree in Artificial Intelligence and Data Science. In addition, the University of Coimbra offers a master’s program which delves into cutting-edge AI technologies, including machine learning and natural language processing.

Despite these developments in higher education, the focus of these programs remains largely on technical aspects rather than on digital media literacy or combating AI-driven misinformation.

Beyond the Classroom

As concerns over disinformation and the need for digital media literacy grow, a variety of non-formal education initiatives across Europe are stepping in to fill gaps in formal education. These initiatives focus on learning opportunities outside traditional classrooms, engaging young people through workshops, online programs, and interactive activities. By extending digital media literacy education beyond the confines of formal schooling, they aim to reach wider audiences and provide practical, accessible tools for navigating the complexities of the digital world.

Can we rely on corporations like Google to foster critical reflection about the technologies they profit from?

For instance, Europe Code Week 2024, an online event held in October, engaged more than 4 million participants across Europe. It gave young people opportunities to take part in coding exercises, learn basic computer science principles, and explore AI literacy. By promoting creativity and problem-solving, the program aimed to help participants better understand the digital tools shaping their lives and equip them to use technology responsibly.

In Portugal, an initiative called ‘Data Defenders’ uses video games as a medium for teaching media literacy. By gamifying complex topics like algorithmic bias and data ethics, it makes abstract concepts relatable and engaging for younger audiences. The Experience AI program, developed by the Raspberry Pi Foundation in collaboration with TUMO Portugal, represents another non-formal initiative. Funded by Google.org, it aims to train over 2,000 young people in AI literacy by 2025 through culturally relevant materials, teacher training, and parental involvement.

However, the involvement of big tech companies like Google in AI education raises important questions. In a world where corporations like Google control and store vast amounts of information, and where conversations about society and politics increasingly take place on their platforms, can we rely on them to foster critical reflection about the technologies they profit from? Initiatives funded by companies with vested interests in the digital landscape may shape curricula in ways that avoid scrutinising their practices. Independent oversight and diverse perspectives are therefore essential to keep AI education balanced and objective.

Journalists at the Frontline of Truth

While some of these formal and non-formal programs can play a significant role in helping young people critically navigate the digital landscape, there is a key player in the field who is often overlooked in discussions about enhancing media literacy: the journalist. With their ability to verify sources, identify bias, and provide context, journalists are uniquely positioned to bridge critical gaps in media education and contribute significantly to improving digital media literacy.

Journalists are uniquely positioned to bridge critical gaps in media education and contribute significantly to improving digital media literacy.

For instance, Lie Detectors – a Europe-wide, journalist-driven non-profit that helps teenagers, pre-teens, and teachers tell news facts from news fakes and understand ethical journalism – bridges the gap between schools and media professionals, enabling journalists to conduct workshops that teach students how to verify sources, assess credibility, and recognise bias. These partnerships not only empower students but also support teachers in integrating media literacy concepts into their classrooms.

We spoke to Juliane Von Reppert-Bismarck, Executive Director of Lie Detectors, who explained: ‘Journalists are very well placed to hand on these skills to young people and teachers. They are champions of truth and fact. Until teachers are fully equipped to integrate media literacy into classrooms, involving external experts such as journalists and librarians – who also deal with truth and fact – will remain a necessary part of the solution.’

Digital Literacy Starts at Home

When it comes to enhancing AI literacy beyond the classroom, other key figures can be found even closer to home: parents. Research from 2018 showed that young people often develop their digital habits and skills by observing family members. Active parental involvement – whether through co-viewing media, discussing online content, or establishing healthy screen-time habits – can therefore significantly bolster critical thinking and self-regulation in the digital sphere.

However, the research also revealed that many parents feel unprepared for this role, unsure how to balance constructive technology use with concerns over screen time and online safety. It found that parents with less familiarity with technology tend to impose stricter limits, fearing negative effects like reduced concentration or social isolation. By contrast, digitally literate parents are more comfortable supporting their children’s use of technology and often engage with them directly.

Schools can play a supportive role in helping parents develop as digital mentors. Axel Baeyens, a Belgian primary school teacher at Sint-Lievenscollege Wereldwijs who teaches students aged 10 to 12 across subjects like math, languages, sciences, arts, and religion, provided insights into how schools can support parents’ digital literacy. ‘That's a bit of a grey zone,’ he admitted, ‘because essentially, when you're going into that, it sounds like you're telling parents how to parent, and some could take it that way. But schools like ours try to bridge this gap by employing social workers who help parents unfamiliar with the school system, language, or digital tools.’

‘What we really advocate for is that media literacy – critical digital media literacy – should be recognised as a core literacy alongside basic skills like reading and writing and counting.’ (Juliane Von Reppert-Bismarck)

Axel described a program where parents with low-level digital skills participated in four or five lessons. ‘We started from scratch, showing them how their kids use school platforms, how to track the school bus online, book a movie, or even navigate Google Maps,’ he explained. ‘We also have an introduction to our school system every year, and for fifth and sixth grades, we have sessions where parents and students define daily screen time limits together – something we even note in the school diary.’

Yet, he acknowledged the variability in parental approaches: ‘Some are very anti-social media, and some don’t care at all, giving kids unlimited screen time. We see the effects – students coming to school exhausted because they stayed up all night watching videos.’

Core Literacy for the Digital Age

Civil society advocates argue that digital media literacy should be treated as an essential skill, integral to education systems at every level. Juliane from Lie Detectors highlighted this vision: ‘What we really advocate for is that media literacy – critical digital media literacy – should be recognised as a core literacy alongside basic skills like reading and writing and counting. Once it’s acknowledged as a core literacy, everything else follows. It becomes part of all curricula, teacher training, and every subject area.’

Juliane humorously frames Lie Detectors’ ultimate goal as becoming obsolete: a future where media literacy is seamlessly embedded in education, eliminating the need for external organisations. However, she acknowledges that this vision will take time. Until then, external experts will remain indispensable in helping schools build capacity and adapt to evolving challenges.

Striking the right balance is crucial – teaching young people to question information critically without fostering an excessive cynicism that undermines trust in credible sources.

As for the scope of digital media literacy education: while some suggest that teaching basic principles of critical questioning may suffice, experts caution against oversimplifying a complex and dynamic media landscape. This is why Safa Ghnaim, Associate Programme Director and Project Lead for the Data Detox Kit at Tactical Tech, expressed scepticism toward one-size-fits-all solutions: ‘Whenever somebody offers a simple solution for a complex problem, I’m suspicious. Complex problems don’t have simple solutions.’

Tactical Tech, an international non-profit focused on helping individuals and communities navigate the societal impacts of digital technology, emphasises the importance of nuanced approaches. Helderyse Rendall, Senior Project Coordinator of What The Future Wants, Tactical Tech's youth-focused project, echoed this sentiment. ‘Digital media literacy spans interconnected areas – safety, critical thinking, content creation – and it connects with our lives in increasingly complex ways,’ she said, emphasising the need for frameworks that balance clarity with adaptability.

Navigating Risks

There is also the worry that digital media literacy education, if not carefully implemented, could inadvertently make young people overly sceptical or paranoid, leading them to distrust credible sources and even fall into conspiracy theories. This risk is particularly significant – and somewhat ironic – given the role of misinformation in fueling conspiracy theories, which contributed to significant polarisation and radicalisation during the pandemic. Therefore, striking the right balance is crucial – teaching young people to question information critically without fostering an excessive cynicism that undermines trust in credible sources.

‘Young people are digital natives. They’re stakeholders and contributors who deserve a say in the conversation about technology.’ (Helderyse Rendall)

Safa from Tactical Tech suggests focusing on what makes a source credible, such as adherence to standards like those of the International Fact-Checking Network. She notes: ‘If we can give people a place they feel they can rely on and explain why it’s credible, they’re less likely to fall into distrust or conspiracies.’

Helderyse from Tactical Tech stresses the value of peer learning spaces, where individuals can discuss and challenge their perspectives. Tactical Tech’s Everywhere, All the Time exhibition exemplifies this approach, inviting young people to explore the societal impacts of AI and engage in discussions that encourage reflection and dialogue. ‘Spaces for dialogue push people toward a middle ground,’ Helderyse explains. ‘They help combat confirmation bias and extreme responses, which are often signs of misinformation.’

Beyond Protection

Another dynamic that concerns experts is the infantilisation of young people. Media literacy for young people is often framed as a protective measure, focused on shielding them from harm. While safeguarding is critical, Helderyse argues this approach can veer into paternalism. ‘Young people are digital natives,’ she said. ‘They’re stakeholders and contributors who deserve a say in the conversation about technology.’

Tactical Tech’s co-creation model challenges this type of paternalism, engaging young people as active participants in designing educational tools and discussions. By involving them directly, the model respects their insights while equipping them with critical skills to navigate and shape the digital landscape. ‘Young people aren’t just recipients of technology – they’re contributors,’ Helderyse added. ‘By positioning them as partners, we empower them to take ownership of their digital futures.’

The Cost of Inaction

Doing nothing about AI literacy, however, may pose the biggest threat. Recent research shows that failing to address media literacy carries significant global consequences: the Global Risks Perception Survey ranks misinformation among the top ten global threats, and according to the Global Risks Report 2024, accessible AI tools have fueled a surge in falsified content, from deepfakes to counterfeit websites, undermining public trust.

Ensuring young people are equipped to navigate misinformation and critically engage with content is not just an educational priority but a political one.

Furthermore, research from the Centre for Emerging Technology and Security (CETaS) at The Alan Turing Institute highlights the dangers of AI-enabled misinformation. While recent elections in Europe and the UK have not seen major disruptions, concerns are growing over targeted smear campaigns and voter confusion caused by AI-generated content. Women politicians, in particular, face disproportionate attacks, underscoring the need for proactive strategies to counter these trends.

The stakes are particularly high as young people – tomorrow’s voters – become increasingly immersed in digital spaces. Ensuring they are equipped to navigate misinformation and critically engage with content is not just an educational priority but a political one, safeguarding the democratic process for future generations.

Looking Ahead

The future of media and digital literacy education in Europe is set to evolve further, with increasing emphasis on equipping individuals with critical digital skills.

Lie Detectors anticipates more coordinated efforts across Europe. ‘We still have to wait for confirmation from the European Commission, but they’ve certainly already made it a banner issue, including in the allocation of portfolios within the Commission,’ Juliane shared. ‘I think we’re going to see a lot more initiatives that span the whole of Europe. More financing also, and some really important work on impact analysis, so that Europe can create standards that those implementing and designing curricula can aim for and integrate into their daily work in education.’

Encouraging national ownership will also play a crucial role. Juliane explained, ‘I think we will also see European institutions becoming very active in encouraging member states to take ownership of this effort – to make it something that every country can be proud of within their own education structures.’

Moving forward, the challenge lies in ensuring these plans translate into practical, impactful changes. With support from educators, policymakers, and external organisations, there is potential for Europe to build more cohesive and adaptable approaches to media and digital literacy – while recognising the varied needs of its member states. As these efforts take shape, the hope is to foster a generation better equipped to critically engage with an increasingly digital world.

This text is part of the Come Together Fellowship Program, a training program for young journalists led by the cultural journal Gerador. The text was written under the guidance of rekto:verso.

This article was published in the context of Come Together, a project funded by the European Union.