DoCMA is an interdisciplinary research project in which algorithms process huge amounts of newspaper articles and social media texts in order to recognize patterns. The results enable our team to identify emerging trends in social networks, to track the development of a topic, or to compare how a topic is reported in different media or countries.
Why we are here
When he thinks back to his first semester at the Institute for Journalism, Prof. Henrik Müller also remembers TTIP. It was around the turn of 2013 and 2014; Müller had been newly appointed to the professorship for economic-policy journalism, and he asked the students in his seminar which topics they considered neglected in the media. "The TTIP negotiations," replied one, referring to the planned transatlantic trade and investment partnership between Europe and the United States. "I was completely taken aback," recalls Müller. After all, he had just come from practice: before his appointment at the Technical University Dortmund, he was deputy editor-in-chief of the German "manager magazin". "TTIP was actually no issue for us. We believed it was all about technical questions - boring stuff." A short time later, everyone who read newspapers or watched television knew the abbreviation. The media were full of TTIP - especially the protests against it. It was an important moment for Müller, because the debate about the free trade agreement, largely unnoticed by the general public, was first conducted on social media. The student in Müller's seminar had also become aware of the topic there. "The protest movement against TTIP was built there until it reached the traditional media. This phenomenon has been repeated on various topics, but the periods until they reach the classic media are getting shorter and shorter," says Müller.
Müller had long dreamed of using the media as a seismograph to identify social developments earlier. In his first lecture, he sketched the idea of cooperation between economics and journalism research. With DoCMA, this has become reality: the Dortmund Center for data-based Media Analysis has existed since 2015. It is a virtual institute under which four university professors work together. As an economist and communications scientist, Henrik Müller is in the minority: in addition to him, the team consists of two statisticians and a computer scientist, along with their staff.
In order to examine public communication, possibly even in real time, and to derive trends and developments from it, one needs to analyze huge amounts of text - and, in turn, techniques that turn this mass of raw data into readable, usable results. In other words, you need data mining. "I'm an economist - at the time I had no idea about this," says Müller. "I only knew frequency analysis - pure word counting: how often does the word 'recession' appear in the reporting? But data mining can do much more. And it is much more complicated than I thought."
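The "pure word counting" Müller mentions can be illustrated with a few lines of code. The sketch below is not from the DoCMA project itself; the function name and sample texts are purely illustrative, and real analyses would run over large article corpora rather than a handful of strings:

```python
from collections import Counter
import re

def term_frequency(articles, term):
    """Count how often a term appears across a collection of article texts.

    Uses word boundaries so that e.g. 'recessions' is not counted as 'recession'.
    """
    pattern = re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)
    return sum(len(pattern.findall(text)) for text in articles)

# Illustrative mini-corpus (not real reporting data)
articles = [
    "Economists warn of a recession as exports slow.",
    "The recession debate dominates the business pages.",
    "Trade talks continue despite the protests.",
]

print(term_frequency(articles, "recession"))  # → 2
```

As the quote suggests, this only measures how often a word appears, not what is said about it; data-mining methods such as topic modeling go beyond such counts, which is where the statisticians and computer scientists on the team come in.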