BSides Tallinn 2025

Stella Goldman

Stella Goldman is Co-founder and Privacy Expert at Damus and Lead Legal Counsel – Privacy & Product at Veriff. She is an International Association of Privacy Professionals (IAPP) accredited Certified Information Privacy Manager (CIPM) and a member of the Estonian Bar Association (currently inactive due to her in-house position).

Stella is a trusted cross-functional partner to product, information security and business teams, translating complex legal requirements into compliant, actionable and scalable business-enabling solutions.

https://www.linkedin.com/in/stella-raudsepp/


Session

09-25
13:30
60min
Privacy by Design in the Age of AI: Key to Anonymisation and Lessons from Real-World Security Incidents
Margot Arnus, Stella Goldman

In the age of AI and large-scale data processing, it’s tempting to assume that applying security practices equals good privacy. But as multiple real-world breaches have shown—from Estonia’s Asper Biogene genetic data exposure to pharmacy data leaks at Allium UPI—insufficient security controls and a lack of privacy by design can expose organizations to significant privacy risks.

This interactive workshop is tailored for security and privacy professionals whose organizations work with sensitive or large datasets, especially in the context of AI/ML training or internal analytics. We’ll break down the differences and overlaps between infosec and personal data breaches, demystify what anonymisation and pseudonymisation really mean under the GDPR, and explore how to make data useful and safe. Participants will also gain practical insights into breach response basics and how to act when things go wrong.

We’ll wrap with a practical group exercise where attendees get to “anonymise” a fictional database based on publicly available data—and see if their efforts withstand real-world re-identification threats.


KEY TOPICS:
1. How large datasets fuel AI innovation while also creating regulatory risk. Why effective privacy compliance is not a checklist task but an active daily practice.
2. Key differences between infosec incidents and personal data breaches (and when they overlap).
3. Legal definitions of anonymisation and pseudonymisation, with a hands-on practical task to understand both the value and the risks of these measures (see the sketch after this list).
4. Case study examples:
4.1. Asper Biogene (genetic data breach)
4.2. Allium UPI (pharmacy breach)
4.3. European Data Protection Board’s recent recommendations:
4.3.1. Guidelines 01/2025 on Pseudonymisation
4.3.2. Opinion 28/2024 on certain data protection aspects related to the processing of personal data in the context of AI models
5. What to do when a breach happens: notify, assess, contain, communicate.
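To make the anonymisation/pseudonymisation distinction from topic 3 concrete, here is a minimal Python sketch. It is not part of the workshop materials; the record fields, key handling and token length are hypothetical. Pseudonymised data remains personal data under the GDPR because a key or lookup table still allows re-identification, whereas anonymisation aims to remove that link altogether.

```python
# Minimal sketch: pseudonymisation vs. a step towards anonymisation.
# All record fields and values are hypothetical examples.
import hashlib
import secrets

records = [
    {"name": "Mari Tamm", "birth_year": 1987, "postcode": "10115", "diagnosis": "asthma"},
    {"name": "Jaan Kask", "birth_year": 1991, "postcode": "10117", "diagnosis": "diabetes"},
]

# Pseudonymisation: replace the direct identifier with a keyed token.
# Whoever holds SECRET_KEY (or a mapping table) can re-identify the subject,
# so the output is still personal data.
SECRET_KEY = secrets.token_hex(16)

def pseudonymise(record):
    token = hashlib.sha256((SECRET_KEY + record["name"]).encode()).hexdigest()[:12]
    return {"subject_id": token, **{k: v for k, v in record.items() if k != "name"}}

# Towards anonymisation: drop identifiers and generalise quasi-identifiers
# so records no longer single out an individual. Whether this meets the
# legal standard depends on a residual-risk assessment against auxiliary data.
def generalise(record):
    return {
        "birth_decade": record["birth_year"] // 10 * 10,  # generalisation
        "postcode_area": record["postcode"][:3] + "**",   # masking
        "diagnosis": record["diagnosis"],
    }

print([pseudonymise(r) for r in records])
print([generalise(r) for r in records])
```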


PRACTICAL WORKSHOP EXERCISE:
Each team is expected to have at least one device. Teams are given a dataset for a machine learning exercise; their task is to:
1. Anonymise the dataset using privacy-enhancing techniques (masking, generalization, suppression, etc.), as illustrated in the sketch after this list.
2. Switch files between teams and evaluate potential for re-identification based on auxiliary data.
3. Determine whether their approach met the standard of anonymisation or only pseudonymisation.
4. Present each team’s anonymisation strategy and summarize a residual risk assessment. Discuss the potential consequences of a leak of such data: would it be merely a security incident or a personal data breach?
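The sketch below is an illustration rather than the actual exercise: it shows one way a team might apply masking, generalisation and suppression with pandas and then use a simple k-anonymity count as a proxy for re-identification risk. The column names and data are invented for the example.

```python
# Rough sketch of the exercise flow on a hypothetical dataset;
# the real workshop data and required techniques may differ.
import pandas as pd

df = pd.DataFrame({
    "name": ["Mari Tamm", "Jaan Kask", "Anna Sepp", "Peeter Rebane"],
    "birth_date": ["1987-04-02", "1991-11-23", "1987-06-30", "1960-01-15"],
    "postcode": ["10115", "10117", "10118", "76901"],
    "lab_result": ["positive", "negative", "positive", "negative"],
})

# Step 1: privacy-enhancing transformations.
anon = pd.DataFrame({
    # Suppression: the direct identifier ("name") is dropped entirely.
    # Generalisation: keep only the birth decade.
    "birth_decade": pd.to_datetime(df["birth_date"]).dt.year // 10 * 10,
    # Masking: keep only the leading postcode digits.
    "postcode_area": df["postcode"].str[:3] + "**",
    "lab_result": df["lab_result"],
})

# Step 2: estimate re-identification risk. A simple proxy is k-anonymity:
# how many records share each combination of quasi-identifiers?
quasi_identifiers = ["birth_decade", "postcode_area"]
k = anon.groupby(quasi_identifiers).size()
print(anon)
print(f"smallest group size (k) = {k.min()}")
# Records with k == 1 are unique and may be re-identifiable when joined
# with auxiliary data (e.g. public registers), which is exactly what the
# opposing team will attempt in step 2 of the exercise.
```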


LEARNING OBJECTIVES:
1. Understand how anonymisation supports safe AI use and data reuse.
2. Recognize when a breach is a security issue, a privacy issue, or both.
3. Learn to evaluate anonymisation effectiveness using legal and technical criteria.
4. See how access control gaps can escalate into reportable personal data breaches.
5. Get hands-on anonymisation experience and peer feedback.


SPEAKERS:
Margot Arnus - CIPP/US, Co-founder and Privacy Expert at Damus, Senior Legal Counsel at Veriff
Stella Goldman - CIPM, Co-founder and Privacy Expert at Damus, Lead Legal Counsel at Veriff

Workshop