Research Papers

The TRUSTS initiative aims to make an impact on everyday lives with its research. Therefore, information about our research is provided on this page.

Practice and Challenges of (De-)Anonymisation for Data Sharing

June 2020

Authors: Alexandros Bampoulidis, Alessandro Bruni, Ioannis Markopoulos, Mihai Lupu

Personal data is a necessity in many fields for research and innovation purposes, and when such data is shared, the data controller carries the responsibility of protecting the privacy of the individuals contained in their dataset. The removal of direct identifiers, such as full name and address, is not enough to secure the privacy of individuals as shown by de-anonymisation methods in the scientific literature. Data controllers need to become aware of the risks of de-anonymisation and apply the appropriate anonymisation measures before sharing their datasets, in order to comply with privacy regulations. To address this need, we defined a procedure that makes data controllers aware of the de-anonymisation risks and helps them in deciding the anonymisation measures that need to be taken in order to comply with the General Data Protection Regulation (GDPR). We showcase this procedure with a customer relationship management (CRM) dataset provided by a telecommunications provider. Finally, we recount the challenges we identified during the definition of this procedure and by putting existing knowledge and tools into practice.

Read the full paper on ResearchGate
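To illustrate the point made in the abstract above, that removing direct identifiers is not enough, here is a minimal sketch (not the procedure defined in the paper): combinations of quasi-identifiers can still single out individuals, and coarse generalisation, a k-anonymity-style measure, reduces that risk. The column names, toy records, and generalisation rules are illustrative assumptions, not data or methods from the paper.

```python
# Minimal sketch (illustrative only, not the paper's procedure):
# even with direct identifiers removed, quasi-identifier combinations
# can uniquely identify rows; generalisation increases the group sizes.
import pandas as pd

# Hypothetical CRM-like records; all values and column names are assumptions.
records = pd.DataFrame({
    "name":     ["A. Smith", "B. Jones", "C. Brown", "D. White", "E. Green", "F. Black"],
    "postcode": ["1010", "1015", "1020", "1025", "1030", "1035"],
    "age":      [34, 36, 41, 44, 29, 27],
    "gender":   ["F", "F", "M", "M", "F", "F"],
    "plan":     ["prepaid", "contract", "contract", "prepaid", "contract", "prepaid"],
})

# Step 1: remove the direct identifier.
anonymised = records.drop(columns=["name"])

quasi_identifiers = ["postcode", "age", "gender"]

def k_anonymity(df, columns):
    """Smallest group size over the quasi-identifier combination (the k of k-anonymity)."""
    return df.groupby(columns).size().min()

# k == 1: every row is still unique, so re-identification remains possible.
print("k before generalisation:", k_anonymity(anonymised, quasi_identifiers))

# Step 2: generalise quasi-identifiers (truncate postcode, bucket age into decades).
generalised = anonymised.assign(
    postcode=anonymised["postcode"].str[:2] + "**",
    age=(anonymised["age"] // 10 * 10).astype(str) + "s",
)

# k == 2: each remaining combination now covers at least two individuals.
print("k after generalisation:", k_anonymity(generalised, quasi_identifiers))
```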

Robustness of Meta Matrix Factorization Against Strict Privacy Constraints

March 2021

Authors: Peter Muellner, Dominik Kowald, Elisabeth Lex

In this paper, we explore the reproducibility of MetaMF, a meta matrix factorization framework introduced by Lin et al. MetaMF employs meta learning for federated rating prediction to preserve users’ privacy. We reproduce the experiments of Lin et al. on five datasets, i.e., Douban, Hetrec-MovieLens, MovieLens 1M, Ciao, and Jester. Also, we study the impact of meta learning on the accuracy of MetaMF’s recommendations. Furthermore, in our work, we acknowledge that users may have different tolerances for revealing information about themselves. Hence, in a second strand of experiments, we investigate the robustness of MetaMF against strict privacy constraints. Our study illustrates that we can reproduce most of Lin et al.’s results. Plus, we provide strong evidence that meta learning is essential for MetaMF’s robustness against strict privacy constraints.

Read the full paper on ResearchGate
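For readers unfamiliar with the rating-prediction task that MetaMF addresses, the sketch below shows plain matrix factorization trained with stochastic gradient descent on a toy rating matrix. It is background only: MetaMF adds meta learning and a federated, privacy-preserving setup on top of this basic idea, none of which is reproduced here, and the latent dimensionality, learning rate, and toy data are assumptions for illustration.

```python
# Minimal sketch of plain matrix factorization for rating prediction
# (illustrative background only; not MetaMF and not the paper's experiments).
import numpy as np

rng = np.random.default_rng(0)

# Tiny toy user-item rating matrix; 0 marks an unobserved rating.
R = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

n_users, n_items = R.shape
n_factors = 2                                          # latent dimensionality (assumption)
P = 0.1 * rng.standard_normal((n_users, n_factors))    # user factor matrix
Q = 0.1 * rng.standard_normal((n_items, n_factors))    # item factor matrix

lr, reg = 0.01, 0.05
observed = [(u, i) for u in range(n_users) for i in range(n_items) if R[u, i] > 0]

# Stochastic gradient descent on the observed ratings only.
for epoch in range(2000):
    for u, i in observed:
        err = R[u, i] - P[u] @ Q[i]
        P[u] += lr * (err * Q[i] - reg * P[u])
        Q[i] += lr * (err * P[u] - reg * Q[i])

# Predict a rating for a user-item pair that was not observed during training.
print("predicted rating (user 0, item 2):", round(float(P[0] @ Q[2]), 2))
```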

Why open government data initiatives fail to achieve their objectives: categorizing and prioritizing barriers through a global survey

May 2021

Authors: Anneke Zuiderwijk, Mark de Reuver

Existing overviews of barriers for openly sharing and using government data are often conceptual or based on a limited number of cases. Furthermore, it is unclear what categories of barriers are most obstructive for attaining open data objectives. This paper aims to categorize and prioritize barriers for openly sharing and using government data based on many existing Open Government Data Initiatives (OGDIs).

Read the full paper on Emerald Insight