2025-08-29 – FORGE (Room 1)
What if you could see network packets?
You can’t see Wi-Fi. Or Bluetooth. Or radio waves. Your eyes just weren’t built for that. But what if they were?
Our ancestors evolved eyes to spot sudden threats and the promise of food. We have fast, ambient, spatial awareness that runs on almost no conscious effort: our visual processing tracks the background environment and immediately draws our attention to small but important anomalies in the space around us. In today's jungle of invisible wireless signals, however, we've gone blind to both threats and environmental conditions. In this talk, I'll explore how to give ourselves new kinds of vision: artificial eyes that can see the last meter of network activity in real time.
I'll cover how augmented and virtual reality can spatialize packet sniffing, letting you see network traffic flowing through the room, and why your ancient monkey brain is way better at spotting surprises in 3D space than at reading logs or dashboards. I'll show videos of XRShark, a prototype for mixed-reality network introspection, and demonstrate how it uncovers unexpected patterns and activity in real time, in ways that would be difficult to replicate with traditional tools.
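To make the spatialization idea concrete, here is a minimal sketch of the kind of mapping such a tool needs: turning each sniffed packet into a renderable 3D segment between the positions of the sender and receiver. This is an illustration only, not XRShark's actual implementation; the MAC addresses, positions, and `packet_to_flow` helper are all invented for the example.

```python
# Sketch: turning packet metadata into 3D flow segments for an AR scene.
# In a real system, device positions would come from a spatial-mapping step
# (e.g., tagging devices in the room); here they are hard-coded.

from dataclasses import dataclass

@dataclass
class Flow:
    start: tuple  # (x, y, z) of the sender, in room coordinates (meters)
    end: tuple    # (x, y, z) of the receiver
    size: int     # payload bytes, e.g., mapped to particle count or line width

# Hypothetical room map: MAC address -> position.
DEVICE_POSITIONS = {
    "aa:bb:cc:00:00:01": (0.0, 1.0, 2.0),   # e.g., a smart bulb
    "aa:bb:cc:00:00:02": (3.0, 0.5, 1.0),   # e.g., a hub
}
ROUTER_POS = (1.5, 2.5, 0.0)  # fallback anchor for unknown/external endpoints

def packet_to_flow(src_mac, dst_mac, length):
    """Map one sniffed packet to a renderable 3D segment."""
    start = DEVICE_POSITIONS.get(src_mac, ROUTER_POS)
    end = DEVICE_POSITIONS.get(dst_mac, ROUTER_POS)
    return Flow(start=start, end=end, size=length)

# A live pipeline could feed this from a capture loop, e.g., with scapy:
#   sniff(prn=lambda p: render(packet_to_flow(p.src, p.dst, len(p))))
flow = packet_to_flow("aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02", 128)
print(flow.start, flow.end, flow.size)
```

The interesting design question is the fallback: traffic to endpoints you can't localize (cloud servers, neighbors' devices) still has to land *somewhere* in the scene, which is why even a sketch needs a catch-all anchor.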
This is especially powerful for Internet-of-Things applications, where one physical action can ripple invisibly through the air, triggering local communications, cloud calls, and downstream controls. In cyber-physical systems, seeing only the network or only the physical isn't enough to understand what is happening. Mixed reality lets you see the complete system.
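The "ripple" through the air can also be sketched in code: given a timestamped physical action and a packet log, collect the traffic that follows within a short window. This is a simplified illustration of the correlation idea, not the talk's tooling; the window length, log format, and device names are assumptions.

```python
# Sketch: correlating a physical action with the network activity it triggers.
# A button press at t=10.0s is followed, within a short window, by the chain of
# local, cloud, and downstream-control messages described in the abstract.

WINDOW_S = 2.0  # assumed correlation window; real systems would tune this

def ripple(action_ts, packets, window=WINDOW_S):
    """Return packets observed within `window` seconds after the action."""
    return [p for p in packets if action_ts <= p["ts"] <= action_ts + window]

log = [
    {"ts": 10.1, "src": "switch", "dst": "hub"},        # local communication
    {"ts": 10.4, "src": "hub", "dst": "cloud"},         # cloud call
    {"ts": 11.0, "src": "cloud", "dst": "thermostat"},  # downstream control
    {"ts": 30.0, "src": "laptop", "dst": "cloud"},      # unrelated traffic
]
print([p["dst"] for p in ripple(10.0, log)])  # → ['hub', 'cloud', 'thermostat']
```

In mixed reality, each correlated hop could be rendered in place, so the causal chain is visible in the room rather than buried in a log.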
Along the way, we’ll detour into some other attempts to render the invisible visible, from hyperspectral cameras based on radio astronomy principles to volumetric RF field visualizations. If you want to learn how to build systems that will let you literally see what your network is doing, this talk is for you!
Dr. Meghan Clark is a Principal Scientist, IoT Systems at Resideo Technologies, Inc. She has worked for over a decade on improving interactions with the built environment. Previously, she was a postdoctoral fellow at the University of California, Berkeley, where she worked on novel methods for capturing and visualizing wireless network traffic. Before that, she completed her PhD in computer science at the University of California, Berkeley, with a focus on smart buildings. She received a Master's in computer science and engineering (and an unofficial degree in hacking things) at the University of Michigan. Her initial exposure to embedded systems and networking was at the Army Research Laboratory's Night Vision and Electronic Sensors Directorate, where she developed an interoperability protocol for converting proprietary sensors into plug-and-play devices, enabling rapid deployment of early-warning systems. Her work has earned several awards, including the NSF Graduate Research Fellowship, the Microsoft Graduate Women's Scholarship, and the Rackham Merit Fellowship. Her work on mixed-reality network inspection won the Best Poster Award at ACM MobiCom 2022 and the David Wessel Best Demo Award at CONIX 2019.