Evaluating the Impact of Media and Information Literacy on University Students’ Ability to Discern and Share Fake News in the Philippines
1 Introduction
1.1 Social Media Use and Prevalence of Fake News
The prevalence of social media in the Philippines has significantly increased in recent years, reshaping communication and information dissemination. As of 2023, approximately 76% of the Filipino population actively engages with social media platforms, facilitating personal interactions and the spread of information, including news. This heightened usage raises concerns about the dissemination of fake news, which has become increasingly significant in the socio-political context of the country. The digital landscape in the Philippines is particularly vulnerable to misinformation due to a combination of extensive social media adoption, varying digital literacy levels, and political polarization among users [1, 2].
Recent studies indicate that despite high awareness of fake news among users, especially younger demographics, apathy and misinformation continue to thrive. A study conducted in Cebu City highlighted that while millennials are aware of the prevalence of fake news surrounding events like the COVID-19 pandemic, many exhibit a concerning level of disengagement. A significant percentage of respondents reported feeling overwhelmed by the volume of information [1]. This phenomenon, known as information overload, can lead to superficial engagement with content, where users may browse through emotionally charged posts without critically evaluating the validity of the information or the motives of the sources [3].
Further discourse suggests that individuals with strong political ideologies are more likely to perceive fake news as a problem for others, a phenomenon known as the third-person effect. This perception often correlates with greater support for government regulation of fake news, even as they may not recognize similar biases in their own information consumption [2]. The interplay between political affiliation and the spread of misinformation emphasizes the necessity of improving media literacy programs that help individuals critically analyze information, particularly leading up to significant events like elections, an area where surveys have indicated growing concern about disinformation tactics and their impact on electoral integrity [4].
To combat the challenges posed by misinformation in the Philippines, various stakeholders are advocating for enhanced digital and media literacy initiatives. Researchers argue that increasing the public’s ability to recognize misleading information is integral to safeguarding societal discourse [3, 5]. The implementation of Digital Literacy Movements aims to equip individuals, particularly the youth, with the skills necessary to navigate the evolving digital landscape and to foster critical thinking about the content they engage with on social media [6, 7]. Educational frameworks that integrate new media literacy concepts are deemed pivotal in adapting to the rapid evolution of media forms and their associated challenges [7, 8].
As the Philippines grapples with these challenges, the collective effort to improve media literacy will be crucial in mitigating the effects of fake news and fostering a more informed citizenry capable of engaging with digital media responsibly [1, 2, 4].
1.2 Misinformation in Political Decision-making
As individuals turn to online platforms for news, they encounter a deluge of information that may include inaccuracies and distortions, leading to substantial implications for democratic processes and civic engagement. Misinformation can exacerbate political cynicism, as demonstrated by findings from the 2020 U.S. Presidential election. Research indicates that increased exposure to misinformation through social media is positively correlated with higher levels of political cynicism among users [9]. This relationship highlights how misinformation does not merely mislead individuals but also fosters disillusionment with political institutions and processes.
Furthermore, misinformation’s impact extends beyond individual attitudes to influence broader political behaviors and civic engagement. For instance, misinformation about health issues, particularly during crises such as the COVID-19 pandemic, can sway public opinion on policy decisions, affecting electoral participation. In areas where political and health-related misinformation proliferates, there are significant concerns regarding how misinformation distorts citizens’ expectations of political leaders and undermines public trust [10]. This erosion of trust weakens citizens’ commitment to civic responsibilities, including informed voting and participation in democratic processes.
The consequences of misinformation are further amplified by the dynamics of social media, where false narratives can spread rapidly, creating echo chambers that reinforce existing beliefs [11]. The concept of collective influence suggests that individuals are not merely passive recipients of misinformation; they engage in selective sharing based on their partisan biases, which compounds the problem [12]. Misinformation can distort public discussions around crucial policies, steering voters away from rational decision-making and towards ideologically motivated reasoning [13].
Moreover, individuals often exhibit a reluctance to change beliefs even when confronted with corrections to misinformation, a phenomenon known as the continued influence effect [14]. This can have dangerous implications for political decision-making, as voters may remain anchored to false beliefs that misinform their choices. Particularly in polarized environments, the challenge becomes not only discerning accurate information but also overcoming cognitive biases that reinforce the acceptance of misinformation [15].
Overall, the integration of misinformation into political discourse poses significant risks to democratic stability and civic integrity. The rejection of factual information in favor of misleading narratives threatens not only individual decision-making but can ultimately undermine the foundation of democratic governance itself [16]. Combating this requires a multifaceted approach, including enhancing media literacy among citizens to cultivate critical thinking and discernment in the face of online misinformation [17].
1.3 Addressing Misinformation Through Educational Initiatives (MIL)
Addressing misinformation through educational initiatives, particularly media and information literacy (MIL), is increasingly recognized as critical in today’s digital landscape. Misinformation can undermine public health efforts, skew political beliefs, and erode social trust. Educational initiatives aimed at improving literacy in media consumption can empower individuals to navigate the complex information ecosystem and critically evaluate the content they encounter.
One significant finding is that integrating interdisciplinary approaches in education can enhance the effectiveness of misinformation correction. For instance, a study highlighted the importance of community-engaged science communication training, which encourages students to adopt inclusive and participatory methods for addressing science misinformation. Students learned to value building trust alongside correcting misinformation, suggesting that educational programs that merge scientific skills with communication strategies can improve outcomes in combating misinformation. This not only equips students with the necessary tools to discern credible information but also fosters a more informed public capable of participating in discourse regarding scientific and socio-political issues [18].
Moreover, the relationship between education and misinformation acceptance underscores the critical role of MIL in enhancing individuals’ ability to evaluate the veracity of information. Research indicates a widening gap in misinformation acceptance based on educational background, which could exacerbate health and social issues [19]. This suggests that educational systems need to prioritize MIL to help students become more discerning consumers of information. By doing so, they can withstand the surges of misinformation that frequently accompany crises such as health emergencies or political events.
Health misinformation, particularly during the COVID-19 pandemic, has had demonstrably detrimental effects on public health outcomes. Acknowledging the role of health literacy as a form of educational initiative is vital in this context. Programs aimed at improving health literacy can significantly reduce susceptibility to health misinformation by enabling individuals to better understand and seek out reliable sources of health information [20]. Additionally, educating healthcare professionals about recognizing and addressing misinformation can bolster their effectiveness in patient communication [21, 22]. Not only must they convey correct information, but they should also engage with empathy to address existing misconceptions while promoting evidence-based practices.
An equally important aspect of combating misinformation through educational initiatives is the use of technology and media platforms in an increasingly digital age. A literature review noted that social media serves both as a pervasive source of misinformation and as a potential tool for education [23]. By incorporating digital literacy programs that teach individuals how to critically assess the content they encounter on social media, educational initiatives can address the infodemic effectively. Such programs aim to develop skills in identifying misleading information, analyzing sources, and recognizing biases [24].
Finally, the continued exploration of innovative educational approaches is essential, especially in the face of an evolving misinformation landscape. Collaborative efforts, like the Implementing Mitigating Misinformation Toolkit or interprofessional education curricula that address misinformation, represent promising strategies to unify knowledge across disciplines and engage learners [25]. These initiatives aim not just to impart knowledge but also to foster collaborative skills among future leaders tasked with combating misinformation in their respective fields.
2 Research Question
This study investigates the influence of media and information literacy (MIL) skills acquired through education on students’ ability to assess the veracity of news headlines and their intentions to share potentially false information. Specifically, the study addresses the research question:
RQ: How do information literacy skills from MIL education influence students’ evaluation of headline veracity and their intentions to share fake news?
This question seeks to understand whether structured MIL education, particularly courses like LIS 50 and LIS 10 at the University of the Philippines, significantly enhances students’ capacities to discern factual news from misinformation, and whether such educational interventions affect their subsequent sharing behavior. Addressing this research question provides insights into the effectiveness of MIL educational practices and their potential role in mitigating the spread of misinformation on social media platforms.
3 Methods
3.1 Participants
Participants were recruited via Formbricks on the UP SLIS Facebook page and website (N=66). Eligibility criteria included: (a) LIS students who completed LIS 50; (b) non‑LIS students enrolled in the General Education course LIS 10; and (c) non‑LIS students who had not taken LIS 10 or LIS 50 (control group). Demographic data (age, gender, academic program) were collected to describe the sample and check for potential confounds.
3.2 Materials
Headlines: 28 Facebook‐style headlines (14 real, 14 fake) posted between February and June 2025. Real headlines were drawn from the Facebook pages of leading Philippine news outlets (GMA Network, Manila Bulletin, Super Radyo DZBB, Philippine Daily Inquirer, TV5, ABS‑CBN, and The Philippine Star). Fake headlines were obtained from Tsek.Ph, an IFCN‑accredited fact‑checking platform focusing on election misinformation.
3.3 Survey Items
For each headline, two binary items: (a) Accuracy Judgment (“To the best of your knowledge, is this claim in the above headline accurate?”; Yes/No), and (b) Sharing Intention (“Would you consider sharing this story online through Facebook?”; Yes/No).
3.4 Procedure
Participants first completed demographic and social‑media usage questions. Headlines were then presented one at a time in randomized order. After each headline, participants indicated an accuracy judgment and a sharing intention. Finally, open‑ended questions captured rationales for their judgments and sharing decisions.
3.5 Data Analysis
R Packages: Analyses were conducted in R (v4.x) [26] using readxl [27] and dplyr [28].
MIL Exposure Coding: Participants were classified as “MIL” if they had taken LIS 10 or LIS 50, and “No MIL” otherwise.
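As a minimal illustration (not the authors’ actual script), this coding might be expressed in R with dplyr as follows; the course-history column names `took_lis10` and `took_lis50` and the toy rows are assumptions for the sketch, not the study’s actual variables:

```r
library(dplyr)

# Toy rows standing in for the survey export; column names are assumptions
survey <- data.frame(
  id         = 1:4,
  took_lis10 = c("Yes", "No", "No", "No"),
  took_lis50 = c("No", "Yes", "No", "No")
)

# "MIL" if the participant completed LIS 10 or LIS 50, otherwise "No MIL"
survey <- survey %>%
  mutate(MIL_exposure = if_else(took_lis10 == "Yes" | took_lis50 == "Yes",
                                "MIL", "No MIL"))
```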
Flag Computation: For each of the 28 accuracy items, responses were compared against the answer key to generate binary flags (corr1…corr28). For each of the 28 sharing items, responses were coded into share1…share28 (Yes = 1; No = 0).
Score Calculation: A row‐wise sum of corr* flags yielded an accuracy_score (0–28). A row‐wise sum of share* flags yielded a sharing_score (0–28).
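The flag and score computation could be sketched as below, using simulated responses and a placeholder answer key. The raw-response column names `acc1`…`acc28` and `shr1`…`shr28` are assumptions made for this sketch; the study’s actual data and code are available in the repository listed under Transparency and Reproducibility.

```r
library(dplyr)

set.seed(1)
n_items <- 28

# Placeholder answer key and simulated "Yes"/"No" responses for five
# participants (illustrative only, not the study data)
answer_key <- sample(c("Yes", "No"), n_items, replace = TRUE)
resp <- function() sample(c("Yes", "No"), 5, replace = TRUE)
survey <- as.data.frame(c(
  setNames(lapply(seq_len(n_items), function(i) resp()), paste0("acc", seq_len(n_items))),
  setNames(lapply(seq_len(n_items), function(i) resp()), paste0("shr", seq_len(n_items)))
))

# corr* = 1 when the accuracy judgment matches the key; share* = 1 when the
# participant said "Yes" to sharing
for (i in seq_len(n_items)) {
  survey[[paste0("corr", i)]]  <- as.integer(survey[[paste0("acc", i)]] == answer_key[i])
  survey[[paste0("share", i)]] <- as.integer(survey[[paste0("shr", i)]] == "Yes")
}

# Row-wise sums of the flags give accuracy_score and sharing_score (0-28)
survey <- survey %>%
  mutate(
    accuracy_score = rowSums(across(starts_with("corr"))),
    sharing_score  = rowSums(across(starts_with("share")))
  )
```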
Statistical Tests: One-way ANOVAs assessed differences in accuracy_score and sharing_score by MIL exposure. Statistical significance was evaluated at α = .05.
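A minimal sketch of these tests follows, run on simulated placeholder scores rather than the study data:

```r
# Simulated placeholder scores for 62 participants (not the study data)
set.seed(42)
dat <- data.frame(
  MIL_exposure   = factor(rep(c("MIL", "No MIL"), each = 31)),
  accuracy_score = sample(5:15, 62, replace = TRUE),
  sharing_score  = sample(0:10, 62, replace = TRUE)
)

# One-way ANOVA for each outcome by MIL exposure
acc_model   <- aov(accuracy_score ~ MIL_exposure, data = dat)
share_model <- aov(sharing_score  ~ MIL_exposure, data = dat)

summary(acc_model)    # df, sums of squares, F, and p for accuracy scores
summary(share_model)  # df, sums of squares, F, and p for sharing scores
```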
Qualitative Analysis: Open‐ended responses were thematically coded to identify the criteria used for accuracy judgments (e.g., source credibility, language cues, verification strategies) and motivations for sharing or withholding content.
Transparency and Reproducibility: All materials used in this paper’s preprint (datasets, code, instruments) are publicly available at: https://github.com/panda-lab-slis/informationliteracy
4 Results
The analysis investigated whether participation in MIL courses (LIS 10 or LIS 50) influenced students’ abilities to correctly identify fake news and their willingness to share content online. Results are organized into quantitative comparisons and qualitative insights.
4.1 Quantitative Findings
Table 1 displays descriptive statistics for accuracy and sharing scores by MIL exposure. Both groups performed nearly identically in identifying true versus false headlines (MIL: M = 9.91, SD = 1.63; No MIL: M = 9.90, SD = 1.92). Sharing intentions were also similar (MIL: M = 2.59 shares, SD = 2.33; No MIL: M = 2.43 shares, SD = 2.70).
| MIL Exposure | Mean Accuracy | SD Accuracy | Mean Sharing | SD Sharing |
|---|---|---|---|---|
| MIL | 9.91 | 1.63 | 2.59 | 2.33 |
| No MIL | 9.90 | 1.92 | 2.43 | 2.70 |
One-way ANOVAs tested whether MIL exposure explained variance in scores. Table 2 shows the test for accuracy: the sum of squares attributable to MIL was virtually zero (SS = 0.001), producing F(1, 60) = 0.0006, p = 0.989, indicating no significant group difference.
| Term | df | Sum Sq | Mean Sq | F | p |
|---|---|---|---|---|---|
| MIL exposure | 1 | 0.001 | 0.001 | 0 | 0.989 |
| Residuals | 60 | 189.419 | 3.157 | | |
Similarly, Table 3 presents the ANOVA for sharing intentions, with F(1, 60) = 0.063, p = 0.803, indicating no significant effect of MIL exposure.
| Term | df | Sum Sq | Mean Sq | F | p |
|---|---|---|---|---|---|
| MIL exposure | 1 | 0.398 | 0.398 | 0.063 | 0.803 |
| Residuals | 60 | 379.085 | 6.318 | | |
In contrast to studies by Guess et al. [29] and Jones‑Jang et al. [30], which demonstrate that targeted MIL interventions can boost fake-news detection, our findings show no measurable advantage for students who completed LIS 10 or LIS 50. This aligns with Pennycook and Rand’s [31] suggestion that one-off educational exposures may produce only short-lived or subtle gains, and with evidence that sharing behavior is shaped by emotional and social motivations beyond detection skill alone. Possible explanations include:
Ceiling effects: High baseline performance left little room for improvement.
Course content scope: MIL curricula may emphasize source evaluation but not practical sharing judgments.
Measurement sensitivity: Binary forced-choice tasks may not capture nuanced critical thinking.
4.2 Qualitative Themes
Analysis of open-ended responses revealed that students use a combination of cognitive and affective processes when evaluating and deciding to share headlines:
Source Credibility
Trust in recognized news outlets and fact-checkers guided judgments. This supports inoculation theory’s emphasis on equipping learners with heuristics to preempt misinformation [32, 33].
Linguistic Cues
Participants flagged sensational phrasing, exaggerated claims, or grammatical errors as red flags. This mirrors techniques employed in technique-based inoculation, where understanding persuasive tactics fosters resilience [34].
Verification Practices
Many described cross-checking headlines through online searches or dedicated fact-checking sites before sharing. This active strategy reflects deeper analytic engagement and parallels findings that deliberative thinking reduces fake-news belief [35].
Sharing Motivations
Ethical Responsibility: A strong desire to avoid spreading falsehoods discouraged sharing despite recognition of accuracy.
Audience Relevance: Content perceived as irrelevant to one’s network was seldom shared, indicating social utility considerations.
Emotional Engagement: Outrage, amusement, or emotional resonance sometimes overrode accuracy concerns, consistent with research on affect-driven sharing [34].
These qualitative insights underscore that improving fake-news discernment alone may not suffice to change sharing behavior. Educational interventions should integrate reflective components on social and emotional drivers of sharing, aligning critical evaluation with responsible digital citizenship.
5 Conclusion
The findings indicate that completing MIL coursework (LIS 10 or LIS 50) does not produce statistically significant improvements in students’ abilities to discern fake headlines or alter their intentions to share such content on social media. Accuracy scores and sharing scores were virtually identical between those with and without MIL exposure, suggesting that a single course may be insufficient to shift underlying cognitive and behavioral patterns.
5.1 Recommendations
Integrated Learning Modules: Embed MIL principles within broader curricula rather than standalone courses. Reinforce critical evaluation skills across multiple classes and contexts to deepen retention.
Reflective Sharing Exercises: Develop activities where students track and reflect on their real-world sharing behaviors over time, linking accuracy judgments with social and emotional factors that drive sharing.
Scenario-Based Simulations: Use immersive scenarios—such as role-playing or interactive fake-news outbreaks—to practice detection under realistic time pressure and peer influence, mirroring real social media dynamics.
Collaborative Fact-Checking Labs: Create group-based fact-checking projects where students collaborate to verify emerging claims, fostering both analytical skills and social accountability.
Longitudinal Assessment: Implement follow-up evaluations months after instruction to detect longer-term impacts and refine interventions accordingly.
5.2 Limitations
Sample Size and Scope: The modest sample (N = 66) limits statistical power and generalizability. Participants were drawn from a single institution, potentially reflecting shared baseline competencies.
Measurement Constraints: Binary forced-choice items may not capture nuanced evaluative processes. More graded or open-ended accuracy assessments could reveal subtler improvements.
Short-Term Evaluation: Data were collected immediately after course completion. The study does not assess retention or transfer of MIL skills over time.
Self-Reported Intentions: Sharing intentions may not correspond to actual behavior. Tracking real sharing activity on social platforms would enhance ecological validity.
Content Homogeneity: Headlines were limited to election-related items from a narrow timeframe. Including diverse topics and formats (e.g., images, videos) could test MIL applicability across contexts.
Addressing these limitations will strengthen future research and support the design of robust MIL interventions that yield measurable improvements in both critical evaluation and sharing behavior.