Matt Bauman is a Senior Research Scientist at Julia Computing, where he focuses on teaching and training while continuing to improve Julia's array infrastructure. He has been contributing to both the core language and multiple packages since 2014. In his previous position as a Data Science Fellow at the University of Chicago's Center for Data Science and Public Policy, he longed for dot-broadcasting in Python. He recently defended his PhD in Bioengineering at the University of Pittsburgh, focusing on neural prosthetics.
Demystify machine learning buzzwords by learning how to train and use your own neural network in this interactive workshop. We'll cover the foundational principles that underpin modern machine learning and demonstrate how Julia makes these techniques easy and fast.
Parallel computing is hard; Julia can make it much easier. In this workshop, we discuss modern trends in high-performance computing, how they've converged on multiple types of parallelism, and how to use each of these types most effectively in Julia.