
Ask an Expert

We live in a complex information system, and it can be really hard to know what is accurate and reliable, and how to navigate the information storm. That’s why we have a team of Be Media Smart experts on hand to answer your questions. Take a look below for some common questions, or ask your own question.

Meet the experts

Dr. Claire McGuinness
Ciaran O’Connor
Dr. Marta Bustillo
Dr. Ricardo Castellini Da Silva
Dr. Eileen Culloty
Stephen McDermott
Jane McGarrigle, Webwise
Dr. Eugenia Siapera

Submit your Question

Do you have a question for our Experts?

Submit your question here.

Claims that the army had been deployed to Dublin city centre on the night of Thursday 23 November were shared in the midst of a live, developing situation, when many things were happening at once and rumours, misinformation and lies were circulating in real time, both online and offline. Many people believed the images were real and current.

Adding to the misleading nature of the claim, this was not the only image posted online claiming the army was being deployed to the city. In the context of a live situation like this, it’s not surprising that the claim was shared as true and deceived many people online.

The particular image was especially deceptive because the original tweet featuring it, which went viral, made the explicit (and false) claim that it showed tanks on the streets of Dublin that night. The image depicted a scene at night, which lined up with real-world events happening after dark, but beyond that there were no details to suggest the date the photo was taken.

To verify the location, there were some helpful clues. In the background was a white building with a yellow sign. By zooming in on the image, you could just about make out that it said “Doyle.” You could then have Googled “Doyle” and “Dublin”, and you probably would have got a lot of results about the two different Doyle’s pubs, on College St and in Phibsborough.

If you opened Google Maps and searched “Doyle” over Dublin, you would have received a list of many businesses with Doyle in the name. There are a lot of businesses but, by process of elimination, you could have checked each one and reviewed the front of its building to see if it was a match for the building in the photo.

One of the results was JP & M Doyle, a real estate agent in Terenure. Their building features the same yellow “Doyle” sign as seen in the image, verifying the image was taken in Dublin.

This, of course, is a time-consuming process, one usually carried out by journalists and researchers; ordinary online users may not have the time to devote to verification like that. If you were following events online, you could also have done your own search for something like “Dublin” “army” and reviewed the posts from others to see if someone else had commented on or debunked the claim that the army was deployed.

If you searched that on X (formerly known as Twitter) on Thursday, you might have seen a tweet stating the photo was taken in Terenure, while also helpfully noting that army vehicles pass through the area regularly on their way to and from the nearby Cathal Brugha Barracks.

That still doesn’t confirm when the image was taken, which was really at the heart of this false claim. For that, it’s important to rely on credible, trusted sources of news and information or official organisations at the heart of the claim.

In this instance, with an army deployment, you could expect the Defence Forces to issue a statement announcing an operation, or confirming the image was unrelated to November 23. That’s what they did, with a post on Twitter stating the images were “from a separate routine operation and have no connection to this evening’s events.”

Misinformation is typically defined as false, misleading, or manipulated information that is shared without critical evaluation but without the intent to misinform others. Think of the kind of messages you may have seen “forwarded many times” on WhatsApp during the early days of COVID-19, with different claims or rumours of things that never happened.

Disinformation is typically defined as false, misleading, or manipulated information intentionally spread with the aim of deceiving, manipulating, or misleading an audience. This might be done for political, economic, social or personal gain and is usually done through the use of technology and social media.

It can be hard to distinguish between misinformation and disinformation when you don’t know what the intention is, but some ways to recognise the difference include:

  • The repeated posting, publishing, sharing or dissemination of false or misleading information or supporting others doing the same. Activity that is calculated instead of casual.
  • Clearly demonstrable manipulation of media (sophisticated or not) or misinterpretation of factual information. This may mean producing deepfakes (sophisticated) or creating fake tweets (not so sophisticated).
  • The person/group responsible claims that powerful organisations or figures in government, the media or other parts of the establishment are deliberately and covertly working together to obscure the truth, influence public opinion or restrict people’s freedoms.

There are so many ways that false, misleading or inaccurate information circulates, but here are some of the most common.

  1. Imposter websites or accounts that mimic professional news organisations or journalists produce misleading content designed to misinform, enrage or provoke a reaction.
  2. Anonymous accounts online often share memes that are only intended to make people angry or upset and generate, or exploit, feelings of grievance among one group against members of another.
  3. Automated accounts or click farms are used to try and manipulate trending sections on social media platforms to artificially boost content or influence recommendation systems that can, in turn, expose more online users to false or misleading information.
  4. States have become interested in using the digital world to influence and shape public opinion at home and abroad. Ahead of the 2016 US presidential election, online accounts linked to Russian military intelligence posed as American citizens online and used Facebook and other platforms to spread false and misleading information to foster polarisation and division.
  5. Private and semi-private messaging apps or group chats are highly popular. Such online spaces operate away from public scrutiny meaning they can be used to disseminate false and misleading information, encourage others to share this content or add new members to such chats.

Put simply, everyone is. Misinformation, disinformation, conspiracy theories, rumours and so on all have the potential to mislead or misinform us. False and misleading claims are often attractive because they confirm a person’s understanding of the world, even if some or all of the claim is not based in fact.

Belief in false and misleading information is linked to certain psychological needs that are not being fulfilled in a person’s life. For example, people’s need for knowledge and certainty fuels their desire to understand why certain events are taking place. In times of crisis, such as a pandemic, uncertainty is widespread, and this can result in more people being drawn to narratives or claims that may be false but offer certainty.

False or misleading information can give people a sense of knowledge or control during crisis situations. If people aren’t satisfied with the official explanations for certain events, misinformation and disinformation that confirms their biases can seem more appealing. These are often simple narratives that allow people to easily recognise who is on the side of ‘good’ and who isn’t, stripping out the complexity of a situation. This is where disinformation is most successful.

Conspiracy theories are based on the assumptions that nothing happens by accident, and that harmful or tragic events are caused by the actions of small, powerful groups of controlling (often government) elites, who are pulling the strings. There is usually an alleged, secret plot (e.g., ‘Plandemic’); a cabal of shadowy conspirators (e.g., QAnon); some cherry-picked or falsified ‘evidence’ to support the theory; a belief that coincidences don’t exist; the division of the world into “good” and “bad” sides; and the identification of scapegoats, who become targets for the conspiracists (European Commission). Conspiracy theories can be recognised by extraordinary, unsupported claims that defy established scientific knowledge, and excessive use of ad hominem (personal) attacks against anyone who attempts to debunk the theory (the “enemy”). They are often highly emotive, exploiting people’s fears, suspicions, or outrage to garner support.

While common explanations centre on lack of education or low intelligence, conspiracy theories are also linked to vulnerability. It has been shown that they thrive during traumatic times, when people may feel powerless over their circumstances – such as a global pandemic. Conspiracies can help people make sense of chaotic situations, by offering apparently simplistic explanations that may be easier to accept than reality. They also enable people to feel part of ‘insider’ communities, who know the “truth” about the imaginary enemies behind the state of affairs. Social media is also critical – the combination of echo chambers, viral sharing, algorithmic content delivery, anonymity, and a lack of fact-checking has created a fertile environment for conspiracy theories, often expedited by “bad actors” who push them. There is no universal personality type that is prone to believing in conspiracy theories – anyone can fall prey.

This is a challenging issue. While conspiracy theories have always existed – e.g., the claim that the 1969 moon landing was faked in a studio – they seem to have become even more pervasive in recent years. This was apparent during the COVID-19 pandemic, when multiple conspiracy theories relating to vaccines, lockdowns and the origin of the virus were shared widely on the Internet and social media. While conspiracy theories can be entertaining, they are also dangerous, especially when “going down the rabbit hole” leads to cataclysmic outcomes, such as the January 6th US Capitol attack, or preventable deaths from viruses.

Simply confronting a believer with the facts can be ineffective, as they may be interpreted as further “proof” of the conspiracy, causing them to dig in deeper. It’s important to show empathy, and not ridicule your family member – remember, the reasons for their belief can be complex. One approach is to encourage open debate, and to show interest, e.g., by asking questions about the theory which may trigger self-reflection and allow you to gently introduce some counter-information. If possible, asking a former conspiracy theorist to help with debunking can also be a useful strategy. Ultimately, the greatest weapon against conspiracy theories is media and information literacy, which empowers people to critically evaluate the information they encounter, seek out reputable sources, and engage in thoughtful discussions.
Some helpful sources of information can be found below:

  • COMPACT Guide to Conspiracy Theories
  • The Conspiracy Theory Handbook
  • Identifying conspiracy theories (EU Commission)

One of the more interesting recent changes in news reporting has emerged from the rise of social media, which has empowered ordinary members of the public – in this context, non-journalists like your teenager – to become content creators and what have been called “citizen journalists”. Citizen journalism is more formally defined as “the reporting of news events by members of the public using the Internet to spread the information” (Techopedia), but is probably more recognisable to us as livestreams or video clips of events taken by bystanders and shared on YouTube, Facebook, TikTok, blogs and other platforms. Some of these clips also find their way into mainstream news media channels, which may not yet have been able to get their own reporters on the ground. While it’s not a completely new phenomenon – think of the famous Zapruder film of the JFK assassination in 1963 – the citizen journalism trend has been accelerated by technology. Most of us now own smartphones equipped with increasingly high-quality cameras that enable people to capture and share news events as they happen. Easy-to-use editing software allows people to cut and enhance their videos to create attention-grabbing reports. And it’s not just video – blogs, websites and other open channels are also populated by eager citizen journalists. Now, “Almost anyone with internet access can break a story or even create the news, at least in principle” (Tewkesbury & Rittenberg, 2012). In recent times, public awareness of citizen journalism has soared, most likely due to high-profile news stories such as the 2020 murder of George Floyd, which was captured on video by a 17-year-old citizen journalist who shared her clip on social media, bringing global attention to the killing. The resulting outcry led to a surge of protests against racism and police brutality, which circulated under the #blacklivesmatter hashtag on social media.

In these situations and others, such as reporting from disaster zones, arguments in favour of citizen journalism seem clear – if mainstream news media are not or cannot be present, sharing first-hand reports can draw attention to crimes, injustices and humanitarian crises that might otherwise stay below the public radar. You could argue that citizen journalists contribute to democratising the news landscape, enabling diverse perspectives and local stories to reach broader audiences, even before professional journalists take up the story. In regions where media is under tight government control, this can be critical. However, caution is advised – citizen journalists can also distort the news. As they are not required to adhere to the standards and codes of ethics governing professional journalists, the trustworthiness of citizen reports can be questionable. There are no editorial checks, and content may be manipulated to fit an agenda. Problems with copyright and privacy may arise, when no legal or fact checks are performed. Important context or exposition may be omitted. To address this, one thing that aspiring citizen journalists like your teenager can do is familiarise themselves with the Seven Standards of Quality Journalism – and be careful out there!

The short answer is that the evidence should determine what is and isn’t disinformation. Any person or organisation making a decision about a claim should do so transparently and in reference to the evidence. Of course, the difficulty is that evaluating evidence isn’t always straightforward because there may be a lack of evidence or the nature of the claim isn’t entirely factual. For example, there is a clear scientific consensus on the evidence for the safety and effectiveness of childhood vaccines. Some people may disagree with that, but the scientific evidence is clear. In contrast, many other things we argue about in society are not based on clear-cut facts; they are interpretations of evidence and they concern political and moral ideas about what we should or shouldn’t do. In these cases, it’s often the case that a claim is partially false or decontextualised or distorted in some way and the reasons for coming to that conclusion should be clearly explained.

In recent years, many studies have investigated why people believe, share, or engage with disinformation. For example, when people are tired, stressed, or distracted, they are less likely to think critically about the claims they come across. Related to that, prior knowledge is important: when people have a good foundation of knowledge about a topic, they are less likely to accept false claims about it. Moreover, people may have a bias that colours their judgement. Disinformation often taps into these conditions by presenting attention-grabbing and emotional claims. However, specific scenarios of disinformation and belief vary considerably; any of us could fall for disinformation in the right circumstances. There is also an unfortunate tendency to assume that engagement is the same as belief. People are motivated to share and engage with disinformation for a variety of reasons that don’t necessarily mean they believe it. They may find claims entertaining or thought-provoking, or they might be worried just in case a claim is true.

Disinformation is created by a wide range of actors, including states, corporations, groups, and individuals, and the reasons or motivations for creating it are equally wide-ranging. Motivations could be political, ideological, or financial, or a mix of these. Some disinformation stems from highly organised and well-funded international lobby groups, while other disinformation might be highly local. There is no single model.

The answer to disinformation is not blind trust in institutions or experts. It is not the case that all “mainstream media” is good and all “alternative media” is bad. Trust must be earned, and organisations need to demonstrate that they are trustworthy by following good practices, acting in the public interest, and relying on evidence. Media and information literacy may help the public recognise when different media and official institutions live up to these ideals and whether they are accountable. We are bombarded with information from many different sources, and this places a great burden on individuals to place their trust wisely.

Pre-bunking is a relatively simple idea: it involves warning people in advance about manipulation tactics and strategies. Whereas debunking or fact-checking tries to correct false information after the fact, pre-bunking rests on the idea that if you are forewarned about disinformation strategies, you are more likely to be able to resist them when you see them. For example, pre-bunks might explain that there’s an organised campaign to discredit climate science, or that conspiracy theorists have been promoting claims about crisis actors to distract attention from crimes and atrocities. On the surface, we might think of it as being similar to media literacy, in that both aim to encourage people to think critically about the media they consume, but they have very different origins. The ideas behind pre-bunking come from research in social psychology in the 1960s. In contrast, media literacy has a much broader set of concerns: to protect people from undue media influence, but also to empower people to use media to express themselves and lead full lives. One way to think of the difference is that pre-bunking is trying to put out the fire of disinformation, whereas media literacy is trying to promote good citizenship, participation and inclusion, so fighting disinformation is just one part of it.

The broad term Alt Tech generally refers to smaller platforms that take a more lax approach to content moderation. These platforms emerged or became more popular when platforms such as YouTube, Facebook, Instagram and Twitter (now X) ‘deplatformed’ several personalities of the extreme or far right. This began around 2017–2018, following events such as the Charlottesville Unite the Right rally, the Pittsburgh synagogue shooting and the Christchurch mosque attack, and peaked after the January 6, 2021 events at the US Capitol.

When activists and personalities of the far right found themselves without a platform, they turned to spaces where there was little or no limit on the kind of content posted. Platforms such as Gab, Bitchute, the now-defunct Parler, TruthSocial, Rumble and pockets of Telegram, DLive, Odysee and Discord are all considered to be part of Alt Tech. Substantial evidence suggests that Alt Tech platforms host significantly higher volumes of hate speech, disinformation, conspiracy theories and other forms of toxic content compared to mainstream platforms. There is some hope that stronger policy frameworks at the national and EU levels may address the role of Alt Tech. However, users may want to be vigilant and avoid circulating information generated on these platforms, which can have URLs beginning with (Telegram), (Bitchute), (Rumble) etc. These platforms have few, if any, quality checks on the information circulating, which means it can include anything from hate speech to extremely violent and disturbing content.

As with most social media, Alt Tech platforms rely mainly on users interacting with, commenting on and sharing (mis)information. The key, therefore, is to create engaging narratives that draw people in. A common characteristic of misinformation circulating on Alt Tech platforms is the use of emotional communication, in particular anger, fear and anxiety. By referring to vulnerable groups (for example, children) or highlighting points of tension (for example, around housing), misinformation narratives provoke fear or anxiety.

In a second step, they provide a clear solution by naming and blaming the supposed perpetrators. Stoking fear, anxiety and anger is therefore a typical strategy used in harmful narratives, which then direct these emotions to target and blame specific communities.

As media stories often engage the emotions of audiences, it can be very tricky to determine whether a narrative is manipulative. We live in imperfect societies and there are many grievances and tensions, which can be expressed through anger and anxiety. However, when a narrative blames whole communities of people and presents overly simplistic solutions, this should make us pause, think and question the motives and reasoning behind it.

Research shows that parents are the main source of help when something bothering or upsetting happens to children online, and parents are the primary educators of their children. When it comes to helping children navigate the online world, parents can feel overwhelmed and feel like they don’t know as much about the internet as their kids do.

The most important thing parents can do to help protect their child online is to have a good relationship with them, built through open conversations. There are tonnes of great resources to support parents; take a look at or download the Webwise Parents’ Guide to a Better Internet. Additionally, the National Parents Council Primary offers online and face-to-face training for parents in the area of internet safety, and watch out for messages from your child’s school offering free training in this area.

Digital skills are fundamental to children’s successful engagement with the world through the internet. The following are key points to consider when using technology with younger pupils.

Simplicity: Keep it simple. Use one website or app, and think about creative ways to integrate it meaningfully to develop language learning, or knowledge and understanding of a topic you have planned to explore in your Scéimeanna Coicíse.

Model: Model what you want your pupils to do. If you have access to a laptop or tablet and can connect it to a whiteboard in your classroom, you can introduce your pupils to the features of a new digital resource on a whole class level together. For example, if you wanted to introduce your pupils to a new digital storytelling tool such as Write Reader or Book Creator, you could engage them in a shared digital writing experience to create a class book. This would not only provide you with an opportunity to model how to use the new tool but it would also be an excellent opportunity to focus on the writing process itself.

Clear expectations: Have clear expectations for your pupils to follow in relation to handling devices and their behaviour when working in pairs or in groups. A good idea would be to involve your pupils in creating a simple class “Digital Contract” so that they clearly know what is expected of them.

Digital Citizenship Education is more than one school subject; it crosses subject boundaries and requires a whole-school approach involving the whole-school community. Below are some ways you can embed digital citizenship education into your teaching and learning:

  • Create and agree a class code of conduct with your pupils for using the internet and digital devices in your classroom. This will provide an opportunity to discuss digital citizenship domains such as access and inclusion, rights and responsibilities and e-presence and communication.
  • Use research tasks/projects to teach your pupils media and information literacy skills such as how to safely search the internet to find accurate and trustworthy information and to use critical thinking skills to spot false information online.
  • Bring learning and creativity into your next history lesson by introducing students to film and storytelling. Have pupils create and record a short video on an important figure in history.
  • Explore ethics and empathy online using stories, songs or plays incorporating scenarios or characters’ feelings. Webwise provides free online safety resources that feature stories, songs and short videos, such as Vicky’s Party from the My Selfie and the Wider World resource.
  • Use subjects such as science and geography to discuss with pupils how websites, apps and technology use big data in various ways (for example, GPS signals, location tracking, weather/climate information, artificial intelligence, web browser cookies), and begin conversations on topics such as rights and responsibilities or privacy and security.
  • Safer Internet Day is a great opportunity to discuss being a safe and responsible digital citizen with your pupils and to involve the whole school community. The Webwise Safer Internet Day hub has lots of great ideas and activities to help teach your pupils about creating a positive digital footprint and their rights and responsibilities online as digital citizens.

Digital Citizenship crosses subject boundaries and involves a range of school stakeholders. It requires a whole-school approach to provide consistency and shared expectations across the school community. Opportunities to embed digital citizenship education occur naturally within individual subjects, and Digital Citizenship naturally offers opportunities to support junior students’ development of the 8 key skills outlined in the junior cycle curriculum.

Digital citizenship education also fits perfectly within the digital learning planning process and maps directly to the Digital Learning Framework (DLF). It directly links to SPHE, CSPE and Leaving Cert Computer Science. For young people to become effective digital citizens, there are several competencies they need to develop, including information and data literacy, communication and collaboration, digital content creation, problem solving, and privacy and data literacy.

Schools can access a range of free, curriculum-aligned resources, including the Connected programme. Connected, a Junior Cycle Digital Media Literacy programme, explores Online Wellbeing; News, Information and Problems of False Information; Big Data and the Data Economy; and My Rights Online. Connected aims to empower young people to be effective, autonomous and safe users of technology and online media.

When it comes to helping children navigate the online world, we know many educators can feel overwhelmed. The internet is a big part of children and young people’s lives so it is important to engage with them and support them. But educators need support to do this.

Webwise offers a wide range of resources and supports, from online safety courses for educators to tools that help children manage their online experiences.

Oide Technology in Education provides a wide array of professional learning opportunities to teachers through its online courses, good practice videos, webinars, online learning resources and courses in Education Centres. Bespoke school support is provided by Oide Digital Technologies Professional Learning Leaders.

Youth Work Ireland offers a superb eLearning course to support the youth work sector in keeping young people safe online.

Finally, keep an eye out for upcoming events and webinars.