Description

Disinformation is a major challenge for society and poses considerable risks. At present, there are hardly any tools available for actively detecting disinformation campaigns. Those affected often learn of their involvement far too late, which limits their ability to respond effectively; often only damage limitation remains. Early detection of such trends would create room for maneuver, for example to prepare appropriate countermeasures. DesinFact aims to advance the state of research on the automatic detection of disinformation trends, to identify gaps in technical, legal, and ethical areas, and to develop suitable approaches for enabling such a system.

Detecting disinformation campaigns and identifying trends requires continuous monitoring of different data sources. In order to then automatically classify this activity as a disinformation campaign, the approaches used must meet the highest quality standards: an erroneous decision that fake news is being disseminated can rebound on the authors concerned or their institutions and considerably damage their reputation. It would likewise damage the reputation of the consortium partners and public trust in such a technology. Trustworthiness, understood as strengthening trust in technical, legal, and ethical respects, is therefore a central focus of the research activities in DesinFact. Methods are to be researched that increase quality and make it measurable and explainable, and that remain understandable for experts and operational staff.

One way to increase accuracy is to combine the analysis of network structures and communication patterns with content-based analysis. To this end, DesinFact will explore methods for detecting dissemination channels and key actors in disinformation networks and combine them with content assessment methods (a purely illustrative sketch of such a combination follows below).

Another focus of DesinFact is research into a possible public provision of a disinformation detection system that would allow citizens to have online content checked for disinformation. DesinFact will explore the socio-technological aspects relevant to an adequate implementation of such a technology. Since assessing disinformation is a highly complex task that depends on numerous factors such as general education and cultural, political, and religious background, controversial decisions can hardly be avoided. Accordingly, both the assessment system and the presentation of its results must be free of misunderstandings. Corresponding interdisciplinary studies are central components of DesinFact.
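As a purely illustrative sketch, and not the project's actual method, combining network-based and content-based signals could look roughly like the following. It assumes NetworkX for the graph analysis; the account names, share data, content scores, and the multiplicative fusion are all hypothetical placeholders for whatever a real monitoring pipeline and content classifier would provide.

```python
# Illustrative sketch only: combining network-structure signals with
# content-based scores to rank accounts that may act as key spreaders.
# All account names, scores, and the fusion scheme are hypothetical and
# do not represent the DesinFact project's actual method.
import networkx as nx

# Hypothetical share events: (resharing_account, original_author, content_score),
# where content_score in [0, 1] is a classifier's estimate that the shared
# post contains disinformation.
shares = [
    ("acct_b", "acct_a", 0.92),
    ("acct_c", "acct_a", 0.88),
    ("acct_d", "acct_b", 0.75),
    ("acct_d", "acct_e", 0.10),
]

# Build a directed graph pointing from resharer to original author and
# weight edges by the content score, so amplification of suspect content
# counts more than amplification of harmless content.
graph = nx.DiGraph()
for resharer, author, score in shares:
    if graph.has_edge(resharer, author):
        graph[resharer][author]["weight"] += score
    else:
        graph.add_edge(resharer, author, weight=score)

# Network signal: weighted PageRank as a rough proxy for "key actors",
# i.e. accounts whose content is widely amplified.
pagerank = nx.pagerank(graph, weight="weight")

# Content signal: mean classifier score over everything an account authored.
per_author = {}
for _, author, score in shares:
    per_author.setdefault(author, []).append(score)
content = {a: sum(s) / len(s) for a, s in per_author.items()}

# Simple fusion: accounts that are both central in the sharing network and
# consistently associated with suspect content end up at the top.
combined = {
    acct: pagerank.get(acct, 0.0) * content.get(acct, 0.0)
    for acct in graph.nodes
}
for acct, value in sorted(combined.items(), key=lambda kv: -kv[1]):
    print(f"{acct}: {value:.3f}")
```

In a real system, the content scores would come from a trained classifier and the graph from continuously monitored data sources; the sketch only shows that a network signal and a content signal can be fused into a single ranking of potentially relevant actors.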

Details

Duration: 02/10/2023 - 01/10/2025
Funding: FFG
Program:
Department: Department for E-Governance and Administration, Center for Infrastructural Security
Principal investigator (University for Continuing Education Krems): Assoz. Prof. Mag. Dr. Walter Seböck, MAS MSc
Project members: