2026-06-01, Room 1
Cloud detection rules break silently. You write one, deploy it, and the only feedback is silence — which could mean no attacks, or could mean your rule stopped working three months ago. Maintaining rules across multiple cloud providers compounds the problem: each provider has its own telemetry and its own log formats. We wanted a better feedback loop, so we reached for attack simulation.
We started with Threatest, Datadog's open-source framework for pairing cloud attack execution with detection validation. It gave us a solid foundation — but we hit its limits fast. We run a different SIEM, we needed custom attack techniques, and we wanted tighter control over execution and results. So we extended it. This talk is about what we built, what surprised us along the way, and the mistakes we'd avoid if we were starting over.
If your team is weighing whether to build something like this or buy a commercial alternative, we hope our experience gives you an honest picture of what that investment actually looks like.
Pavel is a Senior Security Engineer on the Detection and Response team at Confluent, where he builds detection rules, designs logging pipelines, and investigates security incidents across multiple cloud providers. With nearly a decade of experience in cybersecurity, he has developed a deep focus on cloud security, detection engineering, and incident response.