PyConDE & PyData Berlin 2024

Mojo 🔥 - Is it Python's faster cousin or just hype?
2024-04-24, B09

On 2023-05-02, the tech sphere buzzed with the release of Mojo 🔥, a new programming language developed by Chris Lattner, renowned for his work on Clang, LLVM, and Swift. Billed as "Python's faster cousin" and "the programming language for all AI developers", Mojo promised up to a 68,000x speedup over Python (on a Mandelbrot benchmark) and a familiar Pythonic syntax.

As Mojo approaches its first anniversary, we unpack its journey towards that ambitious promise. This talk delves into our practical experience of developing a Large Language Model interpretability library in Mojo as part of an AI Safety Camp project. We cast a critical eye over its performance, evaluate its usability, and explore its potential as a Python superset. Against a backdrop where alternatives like Rust, PyPy and Julia dominate performant programming for AI, we question whether Mojo can carve out its niche or whether it will languish as another "could-have-been" in the programming language pantheon.


Background & Motivation

The introduction of Mojo by Chris Lattner captured the attention of the Python community with the allure of dramatic performance enhancements and a syntax that would not alienate current Python developers. As Mojo progresses beyond its infancy, it's critical to assess its evolution and its capacity to disrupt the programming ecosystem, particularly within artificial intelligence and machine learning domains.

Objective & Scope

This presentation will share findings from an AI Safety Camp project that used Mojo to build a Large Language Model mechanistic interpretability and activation engineering library. Through our exploration, we aim to provide a candid narrative of Mojo's strengths and limitations, assess its performance claims, and probe its likelihood of adoption for AI development.

Content Overview

Introduction to Mojo: Brief overview of Mojo's conception, ethos, and intended use-cases.
Performance Claims: A closer look at the purported 68,000x speed-up over Python, including benchmark comparisons and real-world application data.
Language Design: An analysis of Mojo's syntax and semantics, drawing parallels and contrasts with Python (see the sketch after this list), and the implications for developers transitioning to or adopting Mojo.
Case Study: A detailed account of writing a Large Language Model interpretability library in Mojo, highlighting the challenges and breakthroughs experienced.
Ecosystem Overview: Examination of the current state of Mojo's ecosystem, its community support, and the availability of tooling and libraries.
Discussion: Engaging the audience in a discussion of Mojo's potential future, its fit within existing projects, and its prospects of becoming a primary language for AI development.
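
To make the Python/Mojo contrast concrete, here is a minimal sketch based on Mojo's early-2024 syntax (Mojo is pre-1.0 and evolving quickly, so details may have changed, and the function names here are purely illustrative). Mojo accepts Python-style dynamic "def" functions, while its "fn" functions declare argument and return types, which is the starting point for the compiler optimisations behind the headline performance numbers:

    # Python-style dynamic "def": also valid Mojo, with object-based,
    # Python-like semantics.
    def add(a, b):
        return a + b

    # Mojo "fn": declared argument and return types let the compiler
    # emit specialised machine code.
    fn add_typed(a: Int, b: Int) -> Int:
        return a + b

    def main():
        print(add(1, 2))        # dynamic, Python-like dispatch
        print(add_typed(3, 4))  # statically typed and compiled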

Conclusion
We'll wrap up with predictions for Mojo's trajectory based on our experiences and broader industry trends: will it capture the "mojo" it needs to triumph, or become a footnote in the annals of programming language history?


Expected audience expertise: Domain –

Intermediate

Expected audience expertise: Python –

Intermediate

Abstract as a tweet (X) or toot (Mastodon) –

"Chris Lattner's Mojo promised to revolutionize AI dev with 68k times speed & Python ease. One year later, we dissect its reality—can it outshine Rust & Julia, or is it just hype? #PyData #MojoLanguage #PythonCousin"

Public link to supporting material, e.g. videos, Github, etc. –

https://github.com/jcoombes/pyconde-slides/

I'm a Machine Learning Engineer with 3 years of Python and PyTorch development experience. I've provided ML expertise to startups and the UK government, and I am interested in beneficial AI applications.

I spoke at EuroPython in Prague last summer, and I have prior speaking experience from my role as a Science Teacher with Teach First. My background is in Physics and then Atmospheric Physics, interpreting large tropical cyclone datasets at Imperial College London.