Machine Learning Workshop
07-22, 08:30–12:00 (US/Eastern), PH 203N

Demystify machine learning buzzwords by learning how to train and use your own neural network in this interactive workshop. We'll cover the foundational principles that underpin modern machine learning and demonstrate how Julia makes it easy and fast.


Interest and excitement in machine learning (ML) have skyrocketed in recent years thanks to its proven successes across many disparate domains. Julia is uniquely positioned as a strong language for ML due to its high performance, ease of use, and groundbreaking research in differentiable programming.
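
As a small taste of what differentiable programming looks like in practice, the sketch below uses Flux's automatic differentiation to take the derivative of an ordinary Julia function (the function and input value here are purely illustrative):

```julia
using Flux

# Differentiate ordinary Julia code: d/dx (x^2 + 3x) = 2x + 3, so 7 at x = 2
Flux.gradient(x -> x^2 + 3x, 2.0)   # returns (7.0,)
```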

In this interactive workshop you will learn the core concepts that underpin modern machine learning techniques. The first half incrementally introduces key ML terminology and concepts as you build and train your first neural network with Flux, covering data representations, models, gradient descent, training, and testing along the way. The second half steps back to explore the wide array of applications of machine learning through a handful of demonstrations of different tasks and models.
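
For a flavor of the workflow the first half builds toward, here is a minimal sketch of defining and training a small network with Flux. The layer sizes, toy data, and hyperparameters are illustrative assumptions, and the API shown assumes a recent Flux release:

```julia
using Flux

# Toy regression task: learn y = x1 + x2 from random inputs
X = rand(Float32, 2, 100)    # 100 two-feature samples, one per column
Y = sum(X; dims=1)           # 1×100 matrix of targets

# A small feed-forward model: 2 inputs → 8 hidden units → 1 output
model = Chain(Dense(2 => 8, relu), Dense(8 => 1))

# Mean-squared-error loss over a batch
loss(m, x, y) = Flux.mse(m(x), y)

# Train by gradient descent using the Adam optimiser
opt_state = Flux.setup(Adam(0.01), model)
for epoch in 1:200
    Flux.train!(loss, model, [(X, Y)], opt_state)
end

loss(model, X, Y)  # should be small after training
```

Each `train!` call performs one gradient-descent pass over the data; in practice one would also hold out a test set to check that the model generalises.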

Matt Bauman is a Senior Research Scientist at Julia Computing, focusing on teaching and training as well as continuing to improve Julia's array infrastructure. He’s been contributing to both the core language and multiple packages since 2014. At his previous position as a Data Science Fellow at the University of Chicago’s Center for Data Science and Public Policy, he longed for dot-broadcasting in Python. He recently defended his PhD in Bioengineering at the University of Pittsburgh, focusing on neural prosthetics.
