When I first opened the door to the records office of the special parliamentary committee on the 2016 Berlin attack in 2018, I glanced at 4,500 folders holding 690,000 pages of investigation records, emails, and police files. That first glimpse dates back two years, to when I started working for two members of parliament who were jointly scrutinizing the most severe Islamist attack in Germany. Today, I am sitting in front of an air-gapped computer, fanning it thoroughly to keep it from overheating while it runs my NLP-based analysis of all the documents, mapping the social network around the terrorist. The sheer volume of files indicates how much (more) data security agencies have to process in a digitised democracy. And it shows that parliamentary control over these authorities will require automated data analysis if it is to persist in the future.
Both insights – the challenge that digitization poses for security authorities, as well as the need for effective control over their work – typify my academic and professional interest: How should national security work in the digitised democracy?
Studying Public Policy at Columbia University's School of International and Public Affairs (SIPA) in New York City and at the Hertie School in Berlin, as well as working as an advisor to public players in the German security sector, I came to understand that the intersection of national (cyber) security, (open-source) intelligence, and data analytics is pivotal to addressing my questions. Inspired by my studies, freelance projects, and practical experience, I wish to make the digitised democracy safer – without restricting freedom or curtailing the rule of law.
Not only since the killing of Walter Lübcke, the terror attack on a synagogue in Halle, and the shooting spree in Hanau has far-right, anti-Semitic, and anti-refugee crime been surging in Germany. NGOs and journalists are cooperating to paint a more granular picture of these hate crimes throughout Germany. Based on ~6,900 observations from the "arvig" dataset (2014 – today) and ~16,000 observations from the "tatort rechts" (rechte gewalt, rege) dataset (2000 – today), this R-based ShinyApp visualizes data at the individual and county level. I enriched the county-level aggregation with a 600-variable panel dataset on demographic, social, and economic factors from the German Federal Statistical Office.
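The county-level enrichment boils down to a join of aggregated incident counts against a panel of covariates. A minimal sketch of that step in Python (the column names and toy values here are assumptions for illustration, not the actual "arvig" or "tatort rechts" schemas):

```python
import pandas as pd

# Hypothetical incident-level records (schema is an assumption)
incidents = pd.DataFrame({
    "county_id": ["01001", "01001", "02000"],
    "year": [2018, 2019, 2018],
})

# Aggregate incidents to county-year counts
counts = (incidents
          .groupby(["county_id", "year"])
          .size()
          .reset_index(name="n_incidents"))

# Hypothetical slice of a county-level panel (e.g. demographics)
panel = pd.DataFrame({
    "county_id": ["01001", "01001", "02000"],
    "year": [2018, 2019, 2018],
    "population": [90000, 90500, 1840000],
})

# Left-join on county and year so every county-year with incidents is kept,
# then normalize counts by population
merged = counts.merge(panel, on=["county_id", "year"], how="left")
merged["incidents_per_100k"] = merged["n_incidents"] / merged["population"] * 1e5
print(merged)
```

A left join keeps county-years without matching covariates (they simply get missing values), which makes gaps in the panel visible rather than silently dropping observations.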
"CYsyphus" (pronounced SIGH-si-fis) is a decision-support tool, that provides users with an easy-to-search online database on existing cyber reports and recommendations. CYsyphus facilitates the discovery of past wisdom to avoid repetition and enable leapfrogging to new insights and recommendations in support of policy makers, congressional staffers, journalists and students. The project uses NLP-driven classification and categorization algorithms to corroborate and expand the existing collection of approx. 1,200 recommendations from 130 reports.
All-source intelligence analysts need improved modeling, analytic tools, and data visualization to understand dense urban areas and enhance situational awareness more effectively. POLassist helps users understand location data in urban areas to improve situational awareness and response allocation. The POLassist prototype was developed as part of "Hacking4Defense" at Columbia University in the City of New York in 2020.
Read some of my latest opinion pieces, triggered by course-related questions and issues that pop up in everyday policy work.
R- and Python-powered quantitative research. As this work is preliminary, conclusions should be handled with care – as always!
Miscellaneous texts about policy issues. Sometimes more analytical, sometimes more normative, sometimes just something.