REGULATION AND TRENDS OF FALSE INFORMATION IN THE DIGITAL ENVIRONMENT
DOI:
https://doi.org/10.22353/journalism.2025.01.14

Keywords:
Digital Environment, False Information, Content Moderation, Regulation, Artificial Intelligence

Abstract
The rapid development of Artificial Intelligence (AI) and algorithm-driven digital technologies has fundamentally reshaped the nature of false information, transforming it from a problem of misleading content into a systemic challenge embedded within digital information ecosystems. This study examines how AI-supported content generation, algorithmic curation, and platform-based content moderation shape the production, circulation, and societal impact of misinformation in contemporary digital environments. The analysis demonstrates that AI-generated and synthetic content, particularly deepfakes, poses serious risks to individual dignity, psychological security, and public trust. Beyond individual-level harm, the large-scale dissemination of false information undermines democratic processes by distorting public deliberation, weakening rational discourse, and encouraging emotionally driven, fact-independent political decision-making. In this sense, false information constitutes not only a media ethics concern, but also a human security and democratic governance challenge. Through a comparative review of international regulatory frameworks, including Germany’s NetzDG, the EU Digital Services Act, and Singapore’s POFMA, the study identifies a global shift toward recognizing digital platforms as active intermediaries, emphasizing transparency, risk assessment, and platform accountability. The paper analyzes Mongolia’s regulatory responses, noting limitations, particularly the absence of independent risk assessment, and concludes that effective governance requires a risk-based approach balancing harm prevention with fundamental rights.
License
Copyright (c) 2026 Сэтгүүл зүй (Journalism)

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.