FFT-method case validity
Fake News! Or is it?
Why is it difficult to provide decision support for problems of uncertainty?
A. What do you need?
- Aikman, D., Galesic, M., Gigerenzer, G., Kapadia, S., Katsikopoulos, K. V., Kothiyal, A., ... & Neumann, T. (2014). Taking uncertainty seriously: Simplicity versus complexity in financial regulation. Bank of England Financial Stability Paper, 28.
- Green, L., & Mehr, D. R. (1997). What alters physicians' decisions to admit to the coronary care unit?. Journal of Family Practice, 45(3), 219–226.
- Jablonskis, E., & Czienskowski, U. (2017). Decision trees online. http://www.adaptivetoolbox.net/Library/Trees/TreesHome#/
- Jenny, M. A., Pachur, T., Williams, S. L., Becker, E., & Margraf, J. (2013). Simple rules for detecting depression. Journal of Applied Research in Memory and Cognition, 2(3), 149–157.
- Luan, S., Schooler, L. J., & Gigerenzer, G. (2011). A signal-detection analysis of fast-and-frugal trees. Psychological Review, 118(2), 316.
- Martignon, L., Katsikopoulos, K. V., & Woike, J. K. (2008). Categorization with limited resources: A family of simple heuristics. Journal of Mathematical Psychology, 52(6), 352–361.
In the 2016 US election, the 20 most successful fake news stories were liked, shared or commented on more often than the 20 most successful articles from established media. This example illustrates two things: first, the Internet now makes it much easier to spread information quickly; second, fake news is often more attractive than serious reporting. This leads to misjudgements and hardened lines of debate, and thus makes social exchange more difficult. Fake news is not always easy to recognise. As a first step, it is therefore important to be able to distinguish a piece of news reporting from an opinion text. Which text presents facts, and which one does not? Can you recognise a satire, a commentary or a gloss? Our decision tree is a digital checklist that helps you check a news text and warns you when there are grounds for doubt.
When do I need this figure?
If you search for supposed news, receive it from others, or encounter it in digital groups, it is worth checking whether the text is an opinion format or an actual news piece.
You can also examine a text more extensively. Please note, however, that no test or checklist is ever perfect. With each additional feature that you check, the risk of an incorrect assessment of the text increases.
Further features are:
- Is a fact exaggerated without providing explanations [NEGATIVE FEATURE, i.e. if this occurs, journalistic requirements for a news report are not fulfilled]?
- Is there an explicit claim that the text reveals a secret [NEGATIVE FEATURE]?
- Is it claimed that other media hide the truth or lie [NEGATIVE FEATURE]?
- Are personal pronouns such as "you", "we", "us" or "your" used in the text outside of quotations [NEGATIVE FEATURE]?
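The checklist logic described above – check cues one by one and warn at the first negative feature – can be sketched as a fast-and-frugal tree. The feature names, their order, and the dictionary input here are illustrative assumptions, not the Harding Center's actual tree:

```python
# Minimal sketch of a fast-and-frugal check for the negative features above.
# Feature names and ordering are illustrative assumptions, not the actual tree.

NEGATIVE_FEATURES = [
    ("exaggeration_without_explanation",
     "A fact is exaggerated without explanation."),
    ("claims_to_reveal_secret",
     "The text explicitly claims to reveal a secret."),
    ("accuses_media_of_lying",
     "It is claimed that other media hide the truth or lie."),
    ("personal_pronouns_outside_quotes",
     "Personal pronouns such as 'you'/'we' appear outside quotations."),
]

def check_text(features):
    """Walk the cues in order; exit with a warning at the first negative feature,
    otherwise return an all-clear."""
    for name, explanation in NEGATIVE_FEATURES:
        if features.get(name, False):
            return ("warning", explanation)
    return ("all-clear", None)

verdict, reason = check_text({"accuses_media_of_lying": True})
# verdict == "warning"
```

The early-exit structure is what makes such trees "fast and frugal": most texts are classified after only one or two checks, which is exactly what a layperson working through a checklist does.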
What does the figure show?
An "all-clear" means that journalistic key requirements are met to a large extent. It can be assumed that the text is not fake news.
It is not strictly necessary to check requirements beyond the features of the decision tree. Analyses of the underlying texts show that other important requirements tend to be fulfilled, or not fulfilled, in line with the tree's assessment.
Where does the data underlying the decision tree come from?
Cases – Which texts served as a basis?
600 texts from German-language websites were compiled, researched by experts at the Harding Center for Risk Literacy. Since we are interested in the detection of fake news, a topic-based strategy was chosen: topics on which both news and opinion formats were to be expected came from Correctiv and from Faktenfinder. They were then researched with the following keywords: "Angela Merkel", "refugees", "asylum", "migrant background", "chemtrails", "contrails", "Islam", "Muslims", "Israel", "Jews", "cancer", "unemployed", "gender", "Russia", "VW", "left-wing extremism", "autonomists", "right-wing extremism", "cash", "climate". The search was carried out on Bing News, Google News, Facebook, Twitter and with the help of Google's auto-complete function. In addition, the sample was enriched with individual texts from sources that classic high-reach news portals had identified as fake news portals.
TV recommendations and videos were removed, as were reports and interviews.
Target assessment – How was the fulfilment or non-fulfilment of journalistic standards in news reports determined?
18 journalists with professional experience in print and digital media assessed the cases.
Each text was assessed by three experts with regard to the question: "Does this text fulfil the journalistic standards for a news report?" A four-point response format was used. The median of the three experts' ratings served as the target value for each individual case. The experts did not receive any information about the potential features used in the study.
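The target construction can be sketched in a few lines. The scale values and text IDs below are illustrative assumptions (here 1 = standards not fulfilled at all, 4 = fully fulfilled), not the study's actual data:

```python
from statistics import median

# Each text receives three expert ratings on a four-point scale;
# the median of the three ratings is the target value for that case.
# Ratings and text IDs are illustrative assumptions.
ratings_per_text = {
    "text_001": [1, 2, 1],
    "text_002": [4, 3, 4],
    "text_003": [2, 3, 4],
}

targets = {text: median(ratings) for text, ratings in ratings_per_text.items()}
# targets == {"text_001": 1, "text_002": 4, "text_003": 3}
```

Using the median rather than the mean keeps the target on the original four-point scale and makes it robust to a single outlying rating.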
The derived warnings or "all-clears" then appear in the decision tree: within the framework of this expert model, a warning is issued if the journalistic requirements for a news text are not met at all or only partially. In addition, an explanation is given.
Based on various sources (ARD MEDIATHEK, 2017; BR.de, 2017; Brown, 2015; Bundeszentrale für politische Bildung, 2017; Erb, 2017; Focus.de, 2015; Kolonko, 2017; La Roche, 2005; Rack et al., 2017; Shu et al., 2017), 86 features were collected, 50 of which were regarded as generally testable by laypeople.
- Cases 1–100 were coded on all 50 features by two independent research assistants; their codings were compared, discussed and harmonised. Based on coder feedback, features found too difficult for laypeople to use were dropped; 20 features remained.
- Cases 101–500 were coded in four further groups of 100 by two independent coders; 15 features remained.
- Cases 501–600 were used as the hold-out test set.
On the model shown above – filtering extreme opinion formats.
The model for recognising texts that do not meet journalistic standards for news is of the following quality:
A cross-validation of the identified decision tree yielded the following quality measures: balanced accuracy = 0.76; sensitivity in detecting texts that do not meet the standards for news texts in any respect (a 21% share of the test set) = 0.88. This means that the decision tree detected 88 out of 100 texts that are definitely not news.
The specificity in the confirmation of news and hybrid forms is 0.64.
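The reported measures fit together arithmetically. The sketch below recomputes them from an illustrative confusion matrix; the counts are assumptions chosen to reproduce the reported rates, not the study's actual cell counts:

```python
# Illustrative confusion-matrix counts (assumed, per 100 cases of each class).
tp, fn = 88, 12   # non-news texts: detected by the tree / missed
tn, fp = 64, 36   # news and hybrid forms: correctly confirmed / falsely flagged

sensitivity = tp / (tp + fn)                        # 0.88: non-news detected
specificity = tn / (tn + fp)                        # 0.64: news/hybrids confirmed
balanced_accuracy = (sensitivity + specificity) / 2  # 0.76
```

Balanced accuracy is the mean of sensitivity and specificity, which is why it is the appropriate summary here: the two classes are imbalanced (non-news makes up only 21% of the test set), so plain accuracy would be dominated by the majority class.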
Potential for development
Empirical evaluation with consumers
- ARD MEDIATHEK (2017). Fakt oder Fake? Wie man gefälschten Nachrichten auf die Schliche kommt. Available at: https://www.ardmediathek.de/tv/neuneinhalb-das-Reportermagazin-f%C3%BCr-Ki/Fakt-oder-Fake-Wie-man-gef%C3%A4lschten-Na/Das-Erste/Video?bcastId=431486&documentId=41134052 (last accessed 19.06.2018).
- BR.de (2017). So geht Medien (last accessed 23.03.2017).
- Brown, P. (2015). Sechs Wege um Falschmeldungen zu entlarven. Der Freitag, 28.10.2015 (last accessed 23.03.2017).
- Bundeszentrale für politische Bildung (2017). Den Durchblick behalten. So lassen sich Fake News enttarnen, 23.02.2017 (last accessed 23.03.2017).
- Shu, K., Sliva, A., Wang, S., Tang, J., & Liu, H. (2017). Fake news detection on social media: A data mining perspective. ACM SIGKDD Explorations Newsletter, 19(1), 22–36.
- Erb, S. (2017). So entlarven Sie Fake News, 31.07.2017 (last accessed 14.2.2017).
- Focus.de (2017). Unseriöse Quellen: So enttarnen Sie Fake News. Online focus, 11.10.2017 (last accessed 24.03.2017).
- Kolonko (2017). Wie erkenne ich Fake News? Hilfreiche Tipps für die Faktenprüfung. Planet Wissen.
- La Roche, W. V. (2006). Einführung in den praktischen Journalismus. München, List.
- After Computer + Unterricht 74/2009, p. 43, and FH Hannover: Handbuch zur Recherche. Hannover, 2006.
- Rack et al. (2017). Fakt oder Fake? Wie man Falschmeldungen im Internet entlarven kann. klicksafe.de to go.