Illuminating Shadow AI: An Open-Source Tool for CustomGPT Risk Assessment
2026-04-25, Track 2

How comfortable are you knowing your company is using custom LLMs, like CustomGPT, with zero visibility into the sensitive data flowing through them?

Organizations are racing to adopt AI, creating new blind spots faster than they can secure them. The result is hundreds of shadow AI instances through which employees inadvertently expose company IP and PII every day.

This session introduces GCI (CustomGPT Compliance Insights), a new open-source tool built to solve this exact visibility gap. We will jump directly into the research behind the tool, dissecting the attack surface of administrative APIs and the specific regex patterns we developed to hunt for secrets in unstructured chat logs. We will demonstrate how the tool identifies high-risk exposures, from hardcoded credentials to sensitive PII, that standard DLP solutions often miss.
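To give a flavor of the approach, here is a minimal sketch of regex-based secret and PII hunting over unstructured chat logs. The patterns below are illustrative placeholders (a well-known AWS access key ID shape, a generic key-assignment pattern, and an email matcher), not the tool's actual rule set, and `scan_chat_log` is a hypothetical helper name.

```python
import re

# Illustrative patterns only -- stand-ins for the kinds of regexes used to
# hunt for secrets and PII in chat logs, not GCI's actual detection rules.
PATTERNS = {
    # AWS access key IDs start with "AKIA" followed by 16 uppercase chars/digits.
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    # Generic hardcoded credential: api_key/secret followed by an assignment.
    "generic_api_key": re.compile(
        r"(?i)\b(?:api[_-]?key|secret)\s*[:=]\s*['\"]?([A-Za-z0-9_\-]{16,})"
    ),
    # Simple email matcher as a basic PII example.
    "email_address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_chat_log(text: str) -> list[tuple[str, str]]:
    """Return (finding_type, matched_text) pairs for each hit in a log."""
    findings = []
    for name, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            findings.append((name, match.group(0)))
    return findings

# Example: a leaked key pasted into a prompt is flagged.
print(scan_chat_log("please debug key AKIAABCDEFGHIJKLMNOP for me"))
```

A real scanner would layer entropy checks and context filtering on top of raw regexes to cut false positives, but the core hunting loop looks much like this.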

You will leave with the source code in hand and a practical method to run your own audits and minimize these risks immediately.

Sharon has a strong background in defensive research, especially around emerging AI technologies and GCP environments. Off the clock, she channels that same curiosity into cooking, taking on kitchen challenges like beef Wellington and lemon pie.