# JuliaCon 2022 (Times are UTC)

Graph computing is an innovative technology that allows developers to build applications and systems as directed acyclic graphs (DAGs). Graph computing offers generic solutions to some of the most fundamental challenges in enterprise computing such as scalability, transparency and lineage. In this workshop, we survey the available graph computing tools in Julia, then walk through a few hands-on examples of building real world applications and systems using graph computing.

A three-hour introductory workshop for newcomers to Julia and machine learning. Participants will have training in some technical domain, for example, in science, economics or engineering. While no prior experience with Julia or machine learning is needed, it is assumed participants have Julia 1.7 installed on their computer.

Statistics is a domain where some early-stage package development, and some early applications, have come about in Julia. We think of this mini-symposium as a combination of (a) a report on many interesting recent developments in this field and (b) a bird's-eye view for people interested in the field, helping them assess its state of maturity so as to decide whether Julia is appropriate for their statistics work.

The JuliaMolSim community is hosting a minisymposium! Come hear about AtomsBase, our project to create a unified interface for representing atomic geometries, as well as packages for simulation (both quantum mechanical and classical particles-based) and machine learning on atomistic systems. Do you have an idea for a package you think the community needs? Participate in our “quick pitch” session and find co-developers to help build it!

Come get your feet wet with Oceananigans.jl, a fast, friendly, flexible and fun native-Julia ocean model. In the first half of this workshop, we will help participants run and analyze one of several simple ocean problems. The problems relate to state-of-the-art challenges in climate science and computational science. In the second half, we will examine the Julia language features, packages and design choices that enable Oceananigans.jl.

This session has moved to Zoom. Please join with Zoom ID: 6376486897 at 2PM Israel time.

This (Hebrew language) workshop provides an introduction to the Julia language for machine learning engineers, data-scientists, and statisticians. Attendees will gain a solid entry point for using Julia as their preferred data analysis tool.

Makie.jl is a Julia-native interactive data visualization library. In this workshop, participants will learn how to create complex interactive and static plots, using the full range of tools Makie has to offer. Topics could include writing custom recipes, understanding the scene graph, mastering the layout system, handling complex observable structures and tweaking visual styles. The workshop will also be an opportunity to learn about the architecture and underlying ideas of Makie.
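
As a small taste of the API described above, here is a hedged sketch of a static plot plus an Observable-driven layer (package and function names follow recent Makie releases; CairoMakie offers the same API for static output):

```julia
using GLMakie  # or CairoMakie for static output

fig = Figure()
ax = Axis(fig[1, 1], xlabel = "x", ylabel = "y")

xs = range(0, 2pi, length = 200)
lines!(ax, xs, sin.(xs))

# Interactivity via Observables: `ys` is recomputed whenever `freq`
# changes, and the plot re-renders automatically.
freq = Observable(1.0)
ys = @lift(sin.($freq .* xs))
lines!(ax, xs, ys, color = :red)
freq[] = 2.0

fig
```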

This minisymposium will feature the use of the differentiable programming paradigm applied to Earth System Models (ESMs). The goal is to exploit derivative information and seamlessly combine PDE-constrained optimization and scientific machine learning (SciML). Speakers will address (1) Why differentiable programming for ESMs; (2) What ESM applications are we targeting?; and (3) How are we realizing differentiable ESMs? Target ESMs include ice sheet, ocean, and solid Earth models.

This minisymposium aims to provide researchers in astronomy and astrophysics an opportunity to share how Julia has enhanced their science and the challenges they have encountered. We aim to identify shared needs (e.g., opportunities for new/upgraded packages) that could significantly accelerate the adoption of Julia among the astronomical research community. A secondary goal is to help strengthen the community of Julia developers active in astronomical research.

Catalyst.jl is a modeling package for analysis and high performance simulation of chemical reaction networks (CRNs). It defines symbolic representations for CRNs, which can be created programmatically or specified via a domain specific language. Catalyst provides tooling to analyze models, and to translate CRNs to ModelingToolkit-based ODE, SDE, and jump process models. In this workshop we will overview how to generate, analyze, and efficiently solve such models across a variety of applications.
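
As a flavor of the domain specific language mentioned above, a hedged sketch of a birth-death model translated to an ODE (exact DSL details vary between Catalyst versions):

```julia
using Catalyst, OrdinaryDiffEq

# A birth-death chemical reaction network, written in the DSL
rn = @reaction_network begin
    b, X --> 2X   # birth at rate b
    d, X --> 0    # death at rate d
end

# Translate the CRN to an ODE model and simulate it
u0 = [:X => 10.0]
ps = [:b => 1.0, :d => 0.5]
prob = ODEProblem(rn, u0, (0.0, 10.0), ps)
sol = solve(prob, Tsit5())
```

The same network can instead be compiled to an `SDEProblem` or a jump process, which is the workflow the workshop walks through.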

DataFrames.jl provides a comprehensive set of functions for transforming tabular data using an operation specification language. This language lets users specify columns from a source data frame, a function to apply to them, and column names under which to store the result in the target data frame. In this workshop, I will explain the functionality it provides. Here you can find workshop materials.
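
The operation specification language described above takes the shape `source_columns => function => target_name`; a minimal sketch:

```julia
using DataFrames

df = DataFrame(group = ["a", "a", "b"], x = [1, 2, 3])

# source column => function => target column
transform(df, :x => (v -> v .^ 2) => :x2)

# the same mini-language drives grouped aggregation
res = combine(groupby(df, :group), :x => sum => :total)
```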

The "Julia for HPC" minisymposium aims to gather current and prospective Julia practitioners in the field of high-performance computing (HPC) from multidisciplinary applications. We invite participation from industry, academia, and government institutions interested in Julia’s capabilities for supercomputing. The goal is to provide a venue for Julia enthusiasts to share best practices, discuss current limitations, and identify future developments in the scientific HPC community.

DFTK.jl is a framework for the quantum-chemical simulation of materials using Density Functional Theory. Many relevant physical properties of materials, such as interatomic forces, stresses or polarizability, depend on the derivatives of quantities of interest with respect to input data. To perform such computations efficiently, automatic differentiation (AD) has been implemented in DFTK using both forward and reverse modes.

We present a Julia toolchain for the adaptive simulation of hyperbolic PDEs such as flow equations on complex domains. It begins with HOHQMesh.jl to create a curved, unstructured mesh. This mesh is then used in Trixi.jl, a numerical simulation framework for conservation laws. We visualize the results using Julia’s plotting packages. We highlight select features in Trixi.jl, like adaptive mesh refinement (AMR) or shock capturing, useful for practical applications with complex transient behavior.

The package VectorEngine.jl enables the use of the SX-Aurora Tsubasa Vector Engine (VE) as an accelerator in hybrid Julia programs. It builds on GPUCompiler.jl, leveraging the VEDA API as well as the LLVM-VE compiler and the Region Vectorizer. Even though the VE is very different from GPUs, using only a few cores and arithmetic units with a very long vector length, the package enables programmers to use it in a very similar way, simplifying the porting of Julia applications.

Could Julia be uniquely well-suited for rapidly developing new approaches to simulating the brain? What if neuroscientists could use a composable set of tools to craft models of ion channels, compartmentalized neuronal morphology, networks of LIF or conductance-based neurons, reinforcement learning, and everything in between?

Join the discussion on the bof-voice channel on Discord.

This talk will present the UnitJuMP package, which allows the user to include units when modelling optimization problems in JuMP. Units can be specified for variables and constraints, as well as for parameters in the objective and constraints. When different units are combined, functionality from Unitful is used to check for compatibility and perform automatic conversions.
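
The Unitful machinery underpinning those checks can be sketched independently of JuMP (a hedged example; UnitJuMP's own macros are not shown here):

```julia
using Unitful

# Compatible units are converted and combined automatically
total = 1.0u"m" + 50.0u"cm"        # promoted to a common unit
as_cm = uconvert(u"cm", 1.5u"m")   # explicit conversion

# Incompatible units raise a DimensionError instead of silently mixing
err = try
    1.0u"m" + 1.0u"s"
catch e
    e
end
```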

Is life possible without for-loops? This talk reviews some syntactic constructs available in Julia, especially the do-block, and relates them to programming language theory concepts.
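
For readers unfamiliar with the construct, the do-block passes a trailing anonymous function as the first argument of a call; a minimal sketch:

```julia
# An explicit anonymous function ...
squares = map(x -> x^2, 1:4)

# ... and its do-block spelling: the block becomes map's first argument
squares_do = map(1:4) do x
    x^2
end

# do-blocks shine for resource handling: `open` closes the file
# even if the block throws
open("hello.txt", "w") do io
    write(io, "hello")
end
```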

Industry scale optimization problems are often large and sparse, and problem construction time can rival solution time. The default containers and macros in JuMP present some challenges for this class of problems, in particular some performance gotchas.

We present SparseVariables.jl for simple sparse modelling, and demonstrate performance and conciseness with a supply chain optimization example, benchmarking both problem construction time and LOC for multiple modelling approaches.

I am a tenured professor at the University, where I teach several Computer Science courses. In this 10-minute talk I will present how I use Julia as a resource for teaching. I will show how I have used Pluto/PlutoSliderServer to explain concepts and to allow students to check some implementations. I will also show an online tool for easily creating online quizzes for Moodle. Finally, I have used Julia to prototype algorithms that the students later implement themselves.

We present ToQUBO.jl, a Julia package to reformulate general optimization problems into QUBO instances. This tool aims to convert JuMP problems for straightforward application in physics and physics-inspired solution methods whose normal form is equivalent to QUBO. It automatically maps between source and target models, providing a smooth JuMP modeling experience.

We also present a simple interface to connect various annealers and samplers as QUBO solvers bundled in another package, Anneal.jl.

Turbulence in the atmosphere is often studied using 3D simulations as a virtual laboratory. Most experiments require code modification by users, but this is hard, because most codes are written in low-level languages such as Fortran and C++. MicroHH.jl, a Julia port of the dynamical core of MicroHH, has been designed to solve this problem. It is built on MPI.jl, LoopVectorization.jl, CUDA.jl, and HDF5.jl for IO. Interaction with running simulations is made possible via user scripts.

In this talk, we describe a Julia implementation of RipQP, a regularized interior-point method for convex quadratic optimization. RipQP is able to solve problems in several floating-point formats, and can also start in a lower precision as a form of warm-start. The algorithm uses sparse factorizations or Krylov methods from the Julia package Krylov.jl. We present an easy way to use RipQP to solve problems modeled with QuadraticModels.jl and LLSModels.jl.

We present a GitHub organization, CUPofTEAproject, hosting a Franklin.jl website, cupoftea.earth, and a suite of Julia packages. The organization's goal is to host versioned analyses and interactive web visualizations (using WGLMakie.jl) of scientific studies about exchanges between terrestrial ecosystems and the atmosphere.

Enzyme is a new LLVM-based differentiation framework capable of creating fast derivatives in a variety of languages. In this talk we will showcase improvements in Enzyme.jl, the Julia-language bindings for Enzyme, which enable us to differentiate through parallelism (Julia tasks, MPI.jl, etc.), mutable memory, and JIT constructs, all while maintaining performance. We will also showcase Enzyme's new forward-mode capabilities in addition to its existing reverse-mode features.

ProtoSyn.jl is an open-source alternative to molecular manipulation and simulations software, built with a modular architecture and offering a clean canvas where new protocols and models can be tested and benchmarked. By delivering good documentation, ProtoSyn.jl aims to lower the entry barrier to inexperienced scientists and allow a “plug-and-play” experience when implementing modifications. Learn more on the project’s GitHub page:

https://github.com/sergio-santos-group/ProtoSyn.jl

In the Fall Semester 2021 at ETH Zurich, we designed and taught a new course: **Solving PDEs in parallel on GPUs with Julia**. We present the technical and teaching experience we gained: we look at our tech stack, `CUDA.jl`, `ParallelStencil.jl` and `ImplicitGlobalGrid.jl` for GPU computing, and `Franklin.jl`, `Literate.jl`, `IJulia.jl`/Jupyter for web, slides, and exercises. We also look into the Julia crash course, teaching software engineering (git, CI) and project-based student evaluations.

The Planning Domain Definition Language (PDDL) is a formal specification language for symbolic planning problems and domains that is widely used by the AI planning community. This talk presents PDDL.jl, a fast and flexible interface for planning over PDDL domains. It aims to be what PyTorch is to deep learning, or what PPLs are to Bayesian inference: A general high-performance platform for contemporary AI applications and research programs that leverage automated symbolic planning.

In this talk, we show how to use Julia to build the system software for a medical imaging device. Such a device is a distributed system that has to coordinate the handling of real-time signals and asynchronous tasks. The talk will highlight the key parts and design patterns of our software stack. We show how we used a variety of Julia features to implement the control logic of the entire imaging device and the coordination and communication with the large number of sub-devices it controls.

Examples are an essential part of package documentation. In this talk, I'll introduce DemoCards.jl, a plugin package for Documenter that manages your demo files. I'll explain its design and show how it is used to build the demos in JuliaImages and JuliaPlots. Package authors and documentation writers are potential users of this package.

The talk will introduce Clarabel.jl, a conic convex optimization solver in pure Julia. Clarabel.jl uses an interior point technique with a novel homogeneous embedding and can solve LPs, QPs, SOCPs, SDPs or exponential cone programs. The talk will highlight the solver's performance advantages relative to competing solvers, discuss algorithmic and software design ideas drawn from existing solvers, and highlight extensibility features leveraging Julia's multiple dispatch system.

The accelerating outflow of ice in Antarctica or Greenland due to a warming climate or the geodynamic processes shaping the Earth share common computational challenges: extreme-scale high-performance computing (HPC) which requires the next-generation of numerical models, parallel solvers and supercomputers. We here present a fresh approach to modern HPC and share our experience running Julia on thousands of graphical processing units (GPUs).

In 2019 the Event Horizon Telescope (EHT) Collaboration produced the first image of a black hole. This talk details how Julia has been an essential tool for EHT black hole imaging and the advancement of black hole science. I will demonstrate how Julia’s features such as multiple dispatch, differentiable programming, and composability have enabled orders of magnitude performance improvement, moving black hole imaging from clusters to a single laptop.

Julia Gender Inclusive is an initiative that supports gender diversity in the Julia community. Over the last year, we have worked toward doing so through meetups and workshops for community building and education. In this Birds-of-a-Feather session, we hope to discuss current and future initiatives with other people of underrepresented genders, as well as supportive allies.

Join the discussion on the bof-voice channel.

JuMP 1.0 was released in March 2022. I'll present the state of JuMP today, how we got here, what the JuMP community should know about the 1.0 release, and what's next on the roadmap.

JunctionTrees.jl implements the junction tree algorithm: an efficient method to perform Bayesian inference in discrete probabilistic graphical models. It exploits Julia's metaprogramming capabilities to separate the algorithm into a compilation phase and a runtime phase. This opens a wide range of optimization possibilities in the compilation stage. The non-optimized runtime performance of JunctionTrees.jl is similar to that of analogous C++ libraries such as libDAI and Merlin.

ModalDecisionTrees.jl offers a set of symbolic machine learning algorithms that extend classical decision tree learning algorithms, and are able to natively handle time series and image data. *Modal Decision Trees* leverage *modal logics* to perform a primitive-but-powerful form of entity-relation reasoning; this allows them to capture temporal and spatial patterns, and makes them suitable to natively deal (= no need for feature extraction) with data such as multivariate time-series and images.

Using HawkesProcesses.jl, I'll introduce the theory behind Hawkes processes and show how they can be used across many different applications. Hawkes processes in Julia benefit from the speed of the language and the composability of the different libraries, as you can easily extend the Hawkes process using other packages without too much difficulty. Plus, by using Pluto notebooks, I can build simple interactions that demonstrate the underlying mechanics of the Hawkes process.
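
For context, the conditional intensity of a self-exciting Hawkes process with the commonly used exponential kernel is:

```latex
\lambda(t) = \mu + \sum_{t_i < t} \alpha \, e^{-\beta (t - t_i)}
```

where $\mu$ is the background rate and each past event at time $t_i$ temporarily raises the intensity by $\alpha$, decaying at rate $\beta$.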

With Gridap, Julia has a Finite Element package that allows writing expressions that closely mimic the mathematical notation of the weak form of an equation and automate the assembly of the linear system from there. Rather than using macros, the equations are interpreted as regular Julia expressions. This approach is similar to what has been done in C++, e.g. in the Coolfluid 3 CFD code. In this talk, both methods will be compared, showing how Julia really is "C++ done right" for this use case.

Dithering algorithms are a group of color quantization techniques that create the illusion of continuous color in images with small color palettes by adding high-frequency noise or patterns. Traditionally used in printing, they are now mostly used for stylistic purposes.

DitherPunk.jl implements a wide variety of fast and extensible dithering algorithms. Using its example, I will demonstrate how packages for creative coding can be built on top of the JuliaImages ecosystem.
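
A minimal sketch of the API, assuming the `dither` entry point and the `FloydSteinberg`/`Bayer` algorithm types of recent DitherPunk releases:

```julia
using DitherPunk, TestImages

img = testimage("fabio_gray_256")

# Error-diffusion dithering to a binary palette
binary = dither(img, FloydSteinberg())

# Ordered dithering with a Bayer threshold matrix
ordered = dither(img, Bayer())
```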

Depending on the application, the requirements for a multivariate polynomial library may include efficient computation of products, division, substitution, evaluation, gcd, or even Gröbner bases. It is well understood that the concrete representation to use for these polynomials depends on whether they are sparse or not. In this talk, we show that in Julia, the choice of representation also depends on whether to specialize the compilation on the variables.
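
As a point of reference, DynamicPolynomials.jl keeps variables as runtime values, while TypedPolynomials.jl encodes them in the type so the compiler specializes on them; a hedged sketch of the former:

```julia
using DynamicPolynomials  # variables are runtime values, no specialization

@polyvar x y
p = 2x^2 * y + 3x + 1

# Evaluation by substituting values for variables
val = p(x => 1, y => 2)   # 2*1^2*2 + 3*1 + 1 = 8
```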

I will introduce ElectrochemicalKinetics.jl, a package that implements a variety of models for electrochemical reaction rates (such as Butler-Volmer or Marcus-Hush-Chidsey). It can also fit model parameters and construct nonequilibrium phase diagrams. While the package has already been of great use in electrochemical research applications, I will focus more on the design choices as well as the challenges that have come up in implementing automatic differentiation support.

In this lightning talk, I will present an example workflow that leverages a Kubernetes cluster of Raspberry Pis to perform a parallel search for the best AutoML pipeline in a given classification task. While many Raspberry Pi applications target IoT usage, a K8s cluster of Raspberry Pis running Julia can be aimed at more complex problems, and I will provide examples of the cluster's performance running AutoMLPipeline applications.

The FdeSolver.jl package solves fractional differential equations in the sense of Caputo and is suitable for nonlinear and stiff ordinary differential systems. It has been used for describing memory effects in microbial community dynamics and complex systems. With some practical examples, I will present why (and how) developing such a computational package as open-source software is important (and useful).

PHCpack is a software package for solving polynomial systems via homotopy continuation methods. Our interface exports the functionality of PHCpack either via the executable or the shared object file, via its C interface. The software is free and open source and we have a cloud server that hosts the application at phcpack.org.

Our talk will also explore a specific application area in the design of mechanisms.

Astronomers have detected nearly a thousand exoplanets by precisely charting the radial velocity (RV) wobble of their host stars. The RVSpectML family of packages is a new, open-source, modular and performant pipeline for measuring radial velocities and stellar variability indicators from spectroscopic time-series. This talk aims to give potential users and/or developers an overview of the component packages and their status.

This talk discusses ScotBen, a microsimulation tax-benefit model for Scotland written in Julia. ScotBen lets you analyse how changes to the tax system affect revenues, inequality and poverty.

The Risk-Aware Market Clearing (RAMC) project investigates the quantification and management of risk in power systems, thereby bridging machine learning, optimization and risk analysis. This talk will discuss the team's experience --from a user perspective-- on using Julia and JuMP within an academic project and a multi-disciplinary team. This will include the motivation for using these tools, as well as hurdles encountered along the way, and practical experience on industrial-size systems.

In this talk, I will introduce Comonicon, a CLI generator designed for Julia. Unlike other CLI generators such as Fire, ArgParse, and so on, Comonicon does not only parse command-line arguments but also provides a full solution for building CLI applications (via PackageCompiler), packing tarballs, generating shell auto-completion, installing CLI applications, and mitigating CLI latency. I'll also talk about ideas arising from this development about future official Julia applications.
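
Comonicon's entry point is the `@main` macro, which derives the CLI (arguments, options, and `--help` text) from a function signature and its docstring; a hedged sketch under that assumption:

```julia
using Comonicon

"""
Greet someone from the command line.

# Args

- `name`: the person to greet

# Options

- `--count <int>`: number of repetitions
"""
@main function greet(name::String; count::Int = 1)
    for _ in 1:count
        println("Hello, ", name, "!")
    end
end
```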

JuliaGeo is a community that contains several related Julia packages for manipulating, querying, and processing geospatial geometry data. We aim to provide a common interface between geospatial packages. In 2022 there has been a big push to have parity with the Python geospatial packages, such as rasterio and geopandas. In this 10 minute talk, we'd like to show these improvements---both in code and documentation---during a tour of the geospatial ecosystem.

If you want to simulate something, sooner or later you’re going to come across partial differential equations. But solving PDEs is hard, right? It doesn’t have to be! In this talk we'll cut to the chase: how do I copy paste the textbook description of my PDE into Julia symbolic syntax and get a solution? MethodOfLines.jl is the answer, and in this talk we'll show you how to do it!
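
A hedged sketch of that workflow for the 1D heat equation (names follow the MethodOfLines tutorials; details may differ between versions):

```julia
using ModelingToolkit, MethodOfLines, OrdinaryDiffEq, DomainSets

@parameters t x
@variables u(..)
Dt = Differential(t)
Dxx = Differential(x)^2

# Textbook form: u_t = u_xx, with Dirichlet boundaries
eq = Dt(u(t, x)) ~ Dxx(u(t, x))
bcs = [u(0, x) ~ sin(pi * x),
       u(t, 0) ~ 0.0,
       u(t, 1) ~ 0.0]
domains = [t ∈ Interval(0.0, 1.0),
           x ∈ Interval(0.0, 1.0)]

@named pdesys = PDESystem(eq, bcs, domains, [t, x], [u(t, x)])

# Finite-difference discretization in x; t stays continuous
disc = MOLFiniteDifference([x => 0.05], t)
prob = discretize(pdesys, disc)
sol = solve(prob, Tsit5())
```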

Computational efficiency is vital when estimating macroeconomic models for use in policy analysis. We introduce the models contained within DSGE.jl and overview how to estimate them. We provide details on two estimation methods, adaptive Metropolis-Hastings and sequential Monte Carlo, and discuss how they can provide more efficiency during the estimation process.

We present a new collection of algorithms dedicated to computing the basins of attraction of any complex rational map. This is a relevant matter in holomorphic dynamics, and also a way to visualize and study amazing fractal objects like Julia sets. These algorithms solve many computational problems that often arise in numerical analysis, like overflows or mathematical indeterminations, and provide more information about the dynamics of the system than traditional algorithms generally do.

ASML is a 30,000+ employee company and the world leader in photolithographic systems, which are crucial for semiconductor manufacturing. Over the last two years, a community of Julia enthusiasts has been running pilot projects to assess the opportunities Julia offers for rapid development of early proofs-of-concept and, subsequently, rapid deployment in prototypes and, where possible, actual products. Similar to other robotic systems, ASML lithography systems have hard real-time....

We present an efficient approach for writing architecture-agnostic parallel high-performance stencil computations in Julia. Powerful metaprogramming, costless abstractions and multiple dispatch enable writing a single code that is usable both for productive prototyping on a single CPU and for production runs on GPU or CPU workstations or supercomputers. Performance similar to CUDA C is achievable, which is typically a large improvement over the performance reachable with `CUDA.jl` array programming.

COPT (Cardinal Optimizer) is a mathematical optimization solver for large-scale optimization problems. It includes high-performance solvers for LP, MIP, SOCP, convex QP, convex QCP and other mathematical programming problems. In this talk we will give an overview of COPT and introduce its Julia interface.

We present a straightforward approach for distributed parallelization of stencil-based Julia applications on a regular staggered grid using GPUs and CPUs. The approach allows leveraging remote direct memory access and was shown to enable close-to-ideal weak scaling of real-world applications on thousands of GPUs. The communication performed can easily be hidden behind computation.

Standardized data objects can greatly support the collaborative development of new data science methods. In particular, commonly agreed data standards will provide improved efficiency and reliability in complex data integration tasks. We demonstrate the application of this framework in the context of microbiome research.

We will showcase our efforts building Julia proxy applications, or mini-apps, targeting the Summit and Frontier supercomputers. We developed XSBench.jl to simulate on-node CPU and GPU scalability of a Monte Carlo computational kernel, and RIOPA.jl for parallel input/output (I/O) strategies. We will share the lessons learned from Julia's fresh approach to performance and productivity as a viable language, alongside Fortran, C and C++, for high-performance computing (HPC) systems.

This talk will describe how the JuMP and HiGHS teams have worked together to deliver the best open-source linear optimization solvers to the Julia community, and present some high-profile use cases.

We make machines that make chips; the hearts of the devices that keep us informed, entertained and safe.

Metalenz is commercializing metasurface technology and transforming optical sensing in consumer electronics and automotive markets.

Opening remarks

Keynote: Erin LeDell

Julia Computing's mission is to develop products that bring Julia's superpowers to its customers. Julia Computing's flagship product is JuliaHub, a secure, software-as-a-service platform for developing Julia programs, deploying them, and scaling to thousands of nodes.

Amazon Web Services sponsor talk

The virtual poster session will take place in the designated area in Gather.town. See the full poster list on the JuliaCon website.

Interoperability of Julia with C++ is essential for the use of the Julia programming language in fields with a large legacy of code written in C++. We will show how the generation of a Julia interface to a C++ library can be automated using CxxWrap.jl and a tool that generates the required C++ glue code. The concept is demonstrated with a prototype called WrapIt!, https://www.github.com/grasph/wrapit, which is based on Clang and already well advanced.

Pajarito is an outer approximation solver for mixed integer conic problems. We have redesigned and rewritten Pajarito in version 0.8 to support MathOptInterface (finally!). Pajarito now has a generic cone interface that allows adding support for new convex cones through a small list of oracles. PajaritoExtras.jl extends Pajarito by defining several cones supported by the continuous conic solver Hypatia. We illustrate with applied examples, including mixed integer polynomial problems.

We introduce PastaQ.jl, a computational toolbox for simulating, designing, and benchmarking quantum hardware. PastaQ relies on a tensor network description of quantum processes, built on top of ITensors.jl, a leading library for efficient tensor network algorithms. Leveraging recent developments in tensor network differentiation in ITensor, PastaQ provides access to a broad range of computational tools to tackle tasks routinely encountered when building quantum computers.

This is a use-case scenario of using Julia for planning and optimization of production in one of the largest bicycle manufacturing plants in Europe. The optimization model has been implemented using JuMP and custom-made heuristics. The Julia solution has increased the profitability of the manufacturing plant by over 10% (compared to the previous approach), and the optimal part allocation made it possible to increase bike production volume by 25%.

AI has inaugurated a new era in bioinformatics, to the point where contemporary language models can extract structural information from processing single protein sequences. Contributing to this field, we built TintiNet.jl, a 100% open-source, open-data, Julia-based portable language model to predict 1D protein structural properties. Our model achieves top computational and predictive performance when compared to other modern algorithms, with only a fraction of the parameter count.

We present Scylla, a primal heuristic for mixed-integer optimization. It uses matrix-free approximate LP solving with specialized termination criteria and parallelized fix-and-propagate procedures blended with feasibility pump-like objective updates. Besides the presentation of the method and results, we will go over lessons learned on experimentation and implementation tricks including overhead reduction, asynchronous programming, and interfacing with natively-compiled libraries.

Materials computations, especially of the *ab initio* kind, are intrinsically complex. These difficulties have inspired us to develop an extensible, lightweight, high-level workflow framework, `Express.jl`, to automate long and extensive sequences of *ab initio* calculations. In this talk, we would like to share some experience we gained in building a software framework and multifunctional scientific tools with Julia's versatility.

Many new high-level programming languages have emerged in recent years. Julia is one of these languages; it offers the speed of C, the macro capabilities of Lisp, and the user-friendliness of Python. However, its library ecosystem is still limited compared to languages such as Python. We propose extending PyJL, an open-source transpilation tool, to speed up the conversion of libraries to Julia.

Join the sponsored forum here.

Conventional ptychography is a lensless imaging technique that captures a sequence of light diffraction patterns to solve the optical phase problem. The resulting datasets are large and typically cannot be solved directly. Instead, iterative reconstruction algorithms with a low runtime memory footprint are employed.

Here we present PtyLab.jl, a software package for ptychographic data analysis, and demonstrate how a functional programming style in Julia allows for performant iterative algorithms.

A key aspect of operating noisy intermediate-scale quantum (NISQ) computers is developing compact circuits to implement quantum algorithms, given the hardware's architectural constraints. Efficient formulations and algorithms to solve such hard optimization problems with optimality guarantees are key to designing such NISQ devices. This talk provides an overview of QuantumCircuitOpt.jl, a software package developed at LANL for provably optimal synthesis of architectures for quantum circuits.

Cropbox.jl provides a domain specific language for developing crop models.

Chemical prediction has gained ground in the last decade as a way to automate the streamlining of the chemical reactivity of multiple substrates. This procedure requires modeling interatomic potentials, which can be done by fitting these potentials to data obtained at the quantum-mechanical level. The aim of this work is therefore to propose GapTrain.jl, a fast, automatic and broadly applicable tool to develop Gaussian approximation potentials based on hundreds or thousands of data points.

We will highlight new features in the Julia VS Code extension that shipped in the last year and give a preview of some new work. The new features from last year that we will highlight are: 1) a new profiler UI, 2) a new table viewer UI, 3) a revamped plot gallery, 4) cloud indexing infrastructure, and 5) integration of JuliaFormatter. We will also present some brand-new features, in particular an entirely new test explorer UI integration.

Previously, traditional modeling tools were used to provide the acausal modeling framework, which could be statically compiled and integrated with our distributed software. But with this comes the two-language problem and friction with model research and development. With ModelingToolkit.jl, the tools needed to transition away from traditional modeling frameworks are now available. This talk will cover our approach and success in rewriting our Hydraulic Crash Simulation system model in pure Julia.

QuantumAnnealing.jl provides a toolkit for performing simulations of Adiabatic Quantum Computers on classical hardware. The package includes functionality for rapid simulation of the Schrödinger evolution of the system, processing annealing schedules used by real-world annealing hardware, implementing custom annealing schedules, and more.

Observational health research is a domain of health informatics centered around the use of what is known as "Real World Data". This data comes in several different modalities, standards, and levels of quality. Through efforts done in JuliaHealth, JuliaInterop, and associated communities, the ability to work with this data is now fully realized. Through this talk, viewers will see how an observational health study can be conducted with Julia and how similar tools can be adapted to their research.

Deep learning using neural networks is increasingly popular, but neural networks come with few built-in guarantees of correctness. This talk will discuss how I use JuMP to compute verified upper bounds on the error of a neural network trained as an inverse model. Using JuMP together with Distributed.jl allows me to run a large number of verification queries in parallel with minimal time spent on non-research development.

Complex numbers appear in a variety of optimization problems such as AC optimal power flow problems (AC-OPF) and quantum information optimization. This talk presents the integration of complex numbers in JuMP. We first describe how to create complex variables and constraints with complex coefficients in JuMP. Then, we show how this addition makes use of the extensible design of MathOptInterface and JuMP. We illustrate this with examples from PowerModels.jl and SumOfSquares.jl.

In radio astronomy, "Fast Radio Bursts" are short, high-energy signals of unknown origin. So far, relatively few have been discovered as many telescopes weren't designed to observe radio transients. Additionally, searching real-time spectral data is an expensive task, for which there are only a few aging packages to automate. In this talk, we'll look at using CUDA.jl and the Julia ecosystem to accelerate the hunt for these mysterious sources and the integration into an FRB detection pipeline.

Privacy is an important aspect of the internet today. Providing privacy protection, however, is a difficult problem, especially when you work with many data processes and systems. To solve this problem holistically, privacy needs to be a built-in feature, not an afterthought. I will talk about how to solve this problem with the idea of context and capabilities.

Join us on Gather.town for a social hour.

Keynote - Jeremy Howard

Quiqbox.jl is a Julia package that allows highly customizable Gaussian-type basis set design for electronic structure problems in quantum chemistry and quantum physics. The package provides a variety of useful functions around basis set generation such as RHF and UHF methods, standalone 1-electron and 2-electron integrals, and most importantly, variational optimization for basis set parameters. It supports Linux, Mac OS, and Windows.

Mathematica is a powerful tool for many purposes, but it can be cumbersome to work with. This is especially clear for more automated tasks.

In this short talk, I will introduce MathLink and MathLinkExtras, which enable interoperability between Julia and Mathematica.

I will introduce the basic syntax of MathLink and discuss an application of automated computation of nested integrals.

Julia's DateTime type is limited to millisecond precision, while the Time type supports nanoseconds. This talk introduces NanoDates.jl and the NanoDate type, which works like DateTime but with higher precision. CompoundPeriods behave more smoothly and are available as an operational design element for developers.
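The precision gap described above can be seen directly in the standard Dates library (a minimal illustration using only stdlib, not NanoDates.jl itself):

```julia
using Dates

# DateTime is backed by a millisecond count, so a millisecond is the
# smallest period it can represent:
dt = DateTime(2022, 7, 27) + Millisecond(1)

# Time, by contrast, counts nanoseconds since midnight:
t = Time(12, 0, 0) + Nanosecond(1)

# CompoundPeriods normalize mixed units, e.g. 1_000_000_001 ns:
cp = Dates.canonicalize(Nanosecond(1_000_000_001))  # 1 second, 1 nanosecond
```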

The study of audio circuits is interdisciplinary. It combines DSP, analog circuits, differential equations, and semiconductor theory. Mathematical tools like Fourier Transforms and standard circuit analysis cannot explain the behavior of stateful nonlinearities. A complete description of a circuit can only be obtained through time-domain (or ‘transient’, in SPICE terms) simulation. ModelingToolkit.jl enables rapid design iteration and combines features that traditionally require multiple tools.

The JuliaML ecosystem introduces an effective way to model natural phenomena with Universal Differential Equations. UDEs enrich differential equations by combining an explicitly known term with a term learned from data via a neural network. Here, we explore what happens when our assumptions about the known term are wrong, making use of the rich interoperability of Julia. The insight we offer will be useful to the Julia community in better understanding strengths and possible shortcomings of UDEs.

Modeling the temporal evolution of complex networks is still an open challenge across many fields. Using the SciML ecosystem in Julia, we train and simplify a Neural ODE on the low-dimensional embeddings of a temporal sequence of networks. In this way, we discover a dynamical system representation of the network that allows us to predict its temporal evolution. In the talk we’ll show how the tight integration of SciML, Network, and Matrix Algebra packages in Julia opens new modeling directions.

This talk focuses on an iterative algorithm, called active learning, to update radial basis function surrogates by adaptively choosing points across its input space. This work extensively uses the SciML ecosystem, and in particular, Surrogates.jl.

In this talk, we present the architecture of the NFFT.jl package, which implements the non-equidistant fast Fourier transform (NFFT). The NFFT is commonly implemented in C/C++ and requires sophisticated performance optimizations to exploit the full potential of the underlying algorithm. We demonstrate how Julia enables a high-performance, generic, and dimension-agnostic implementation with only a fraction of the code required for established C/C++ NFFT implementations.
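For reference, the NFFT evaluates trigonometric sums of the following form at nonequispaced nodes (the standard convention; notation here is generic, not necessarily NFFT.jl's):

```latex
f_j \;=\; \sum_{k \in I_N} \hat{f}_k \, e^{-2\pi \mathrm{i}\, k \cdot x_j},
\qquad j = 1, \dots, M,
```

where the nodes \(x_j \in [-1/2, 1/2)^d\) are arbitrary and \(I_N\) is a \(d\)-dimensional index set; the fast algorithm approximates all \(M\) sums in roughly \(O(N \log N + M)\) operations instead of \(O(N M)\).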

Julia code can be precompiled to save time loading and/or compiling it on first execution. Precompilation is nuanced because Julia code comes in many flavors, including source text, lowered code, type-inferred code, and various stages of optimization to reach native machine code. We will summarize what has (and hasn't) previously been precompiled, some of the challenges posed by Julia's dynamism, the nature of some recent changes, and prospects for near-term extensions to precompilation.

The JuliaGPU community welcomes both long-standing contributors and newcomers to a birds-of-a-feather event on the state of the JuliaGPU ecosystem.

Join the discussion on the bof-voice channel in Discord.

Voice your feedback and experiences.

Ever written code that was too slow because of excessive allocations, but didn't know where in your code they were coming from? Julia 1.8 introduces a new allocation profiler for finding and understanding sources of allocations in your Julia programs, providing stack traces and type info for allocation hotspots. In this talk we will introduce the allocation profiler, cover how to use it, and talk through a small success story in our own codebase.
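A minimal sketch of how the profiler is invoked (assuming Julia 1.8+; the function being profiled is a made-up example):

```julia
using Profile

# A hypothetical allocation-heavy function: pushing Ints into a
# Vector{Any} boxes each element, allocating on every push.
function build()
    v = Any[]
    for i in 1:100
        push!(v, i)
    end
    return v
end

build()  # warm up first so compilation itself isn't recorded

Profile.Allocs.clear()
Profile.Allocs.@profile sample_rate=1 build()   # record every allocation
results = Profile.Allocs.fetch()
# results.allocs holds one record per allocation, each with a
# type, size, and stacktrace pointing at the allocation site.
```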

This talk compares the runtime of Julia's builtin sorting with that of other languages and explains some of the techniques Julia uses to outperform other languages. This is a small part of the larger ongoing effort to equip Julia with state of the art and faster than state of the art performance for all sorting tasks.

Julia already has well-established neural network frameworks -- Flux and Knet. However, certain design elements -- **coupled model and parameters** and **internal mutations** -- associated with these frameworks make them less compiler- and user-friendly. Making changes to address these problems in the respective frameworks would be too disruptive for users. To address these challenges, we designed `Lux`, a NN framework.

Tricks.jl is a package that does cool tricks to do more work at compile time.

It does this by generating (`@generated`) functions that just return "hardcoded" values, and then triggering regeneration when (if) that value changes. This retriggering is done using backedges.

Tricks.jl can, for example, declare Tim Holy traits that depend on whether or not a method has been defined.


We present GraphPPL.jl - a package for user-friendly specification of probabilistic models with variational inference constraints. GraphPPL.jl creates a model as a factor graph and supports the specification of factorization and form constraints on the variational posterior for the latent variables. The package collection GraphPPL.jl, ReactiveMP.jl and Rocket.jl provide together a full reactive programming-based ecosystem for running efficient and customizable variational Bayesian inference.

High-dimensional PDEs cannot be solved with standard numerical methods, as their computational cost increases exponentially in the number of dimensions. This problem, known as the curse of dimensionality, vanishes with HighDimPDE.jl. The package implements novel solvers that can solve non-local nonlinear PDEs in potentially up to 1000 dimensions.
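The exponential cost mentioned above is easy to make concrete with a back-of-the-envelope count (an illustrative sketch, not HighDimPDE.jl's API):

```julia
# A grid-based solver needs n points per dimension, hence n^d in total.
# big() avoids Int64 overflow for large d.
gridpoints(d; n = 10) = big(n)^d

gridpoints(3)     # 1000 points: trivial
gridpoints(10)    # 10^10 points: already at memory limits
gridpoints(100)   # 10^100 points: hopeless for any grid method
```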

Cthulhu.jl is a highly useful tool for performance engineering as well as general debugging of Julia programs. However, as the name implies, one can quickly descend into the abyss that is Julia's compilation pipeline and get lost in the vast amounts of code even modest looking Julia functions may end up generating. I present a combination of Cthulhu.jl with a step-by-step debugger, showing concrete results every step along the type lattice to make compilation more interpretable.

Optimizing Julia isn't hard compared to Python or R, where you have to be an expert in both Python or R and C/C++. I'll describe what type stability is and why it is important for performance. I'll discuss it in the context of performance (raw throughput) and in the context of time to first X (TTFX). Julia is somewhat notorious for having really bad TTFX in certain cases. This talk explains the workflow that you can use to reduce running time and TTFX.
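A minimal sketch of what type stability means (generic illustration, not tied to the talk's specific examples): a function is type-stable when the compiler can infer a single concrete return type from the argument types alone.

```julia
# Type-UNSTABLE: the return type depends on the runtime value
# (Int when x > 0, Float64 otherwise).
unstable(x) = x > 0 ? x : 0.0

# Type-STABLE rewrite: always returns Float64 for Int input.
stable(x) = x > 0 ? float(x) : 0.0

# Base.return_types shows what the compiler infers:
isconcretetype(only(Base.return_types(unstable, (Int,))))  # false: Union type
isconcretetype(only(Base.return_types(stable, (Int,))))    # true
```

Checking inferred return types like this (or with `@code_warntype`) is the first step of the optimization workflow the talk describes.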

In this talk we present a new feature of Gridap.jl focusing on the solution of transient Partial Differential Equations (PDEs). We will show a new API that: a) leads to weak forms with very simple syntax, b) supports automatic differentiation, c) enables the solution of multi-field and DAE systems, and d) can be used in parallel computing through GridapDistributed.jl. We will showcase the novel features for a variety of applications in fluid and solid dynamics.

TuringGLM makes it easy to specify Bayesian **G**eneralized **L**inear **M**odels using formula syntax and returns an instantiated Turing model.

Example:

```julia
@formula(y ~ x1 + x2 + x3)
```

Heavily inspired by brms (uses RStan or CmdStanR) and bambi (uses PyMC3).

Garbage collection is one of those productivity tools that you don't think about until you need to. We will discuss the current state of Julia GC and what can be done to make it better.

Julia Gaussian Processes (Julia GPs) is home to an ecosystem of packages whose aim is to enable research and modelling using GPs in Julia. It specifies a variety of interfaces, code which implements these interfaces in standard settings, and code built on top of these interfaces (e.g. plotting). The composability and modularity of these interfaces distinguishes it from other GP software. This talk will explore the things that you can currently do with the ecosystem, and where it’s heading.

Most (mathematical) optimization problems are subject to bounds on the decision variables. In general, a nonlinear cost function `f(x)` is to be minimized, with the vector `x` constrained by simple bounds `l <= x <= u`. The *Projected Gradient* class of methods is tailored for this very optimization problem. Our package includes various Projected Gradient methods, fully implemented in Julia. We make use of Julia's iterator interface, allowing for user-defined termination criteria.
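The core idea can be sketched in a few lines of plain Julia (a generic illustration with made-up names, not the package's API): take a gradient step, then project back onto the box with `clamp`.

```julia
# Minimal projected-gradient sketch: minimize f subject to l .<= x .<= u.
# grad is the gradient of f; fixed step size and iteration count for brevity.
function projected_gradient(grad, x0, l, u; step = 0.1, iters = 100)
    x = clamp.(x0, l, u)                       # start from a feasible point
    for _ in 1:iters
        x = clamp.(x .- step .* grad(x), l, u) # gradient step, then projection
    end
    return x
end

# Example: minimize sum((x .- 2).^2) subject to 0 <= x <= 1.
# The unconstrained minimizer is [2, 2]; the box clips it to [1, 1].
g(x) = 2 .* (x .- 2)
x = projected_gradient(g, [0.5, 0.5], 0.0, 1.0)  # -> [1.0, 1.0]
```

In the package itself, a real termination criterion (via the iterator interface) would replace the fixed iteration count.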

Introducing TextSegmentation.jl, a package for text segmentation in Julia. Text segmentation divides an unstructured document containing varied content into several parts according to its topics, making it an important technique that supports natural language processing tasks such as summarization, extraction, and question answering. Attendees will learn what text segmentation is, how to use the package, and how to perform it easily.

With the increasing popularity of Julia for memory intensive applications, garbage collection is becoming a performance bottleneck.

Julia currently uses a serial mark-and-sweep collector, in which objects are traced starting from a root set (e.g., thread stacks and global variables) and unreachable objects are then deallocated.

We discuss in this talk how we recently parallelized tracing of live Julia objects and the performance improvements we got so far.
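The mark-and-sweep scheme described above can be sketched with a toy heap (purely illustrative; Julia's actual collector works on raw memory, not structs like this):

```julia
# Toy mark-and-sweep: objects hold references to other objects; anything
# reachable from the roots is marked, everything else is swept.
mutable struct Obj
    refs::Vector{Obj}
    marked::Bool
end
Obj() = Obj(Obj[], false)

function mark!(obj::Obj)
    obj.marked && return          # already visited, stop (handles cycles)
    obj.marked = true
    foreach(mark!, obj.refs)      # trace outgoing references
end

function sweep(heap, roots)
    foreach(mark!, roots)                  # mark phase: trace from roots
    live = filter(o -> o.marked, heap)     # sweep phase: keep marked objects
    foreach(o -> o.marked = false, live)   # reset marks for the next cycle
    return live
end

a, b, c = Obj(), Obj(), Obj()
push!(a.refs, b)                  # a -> b; c is unreachable garbage
live = sweep([a, b, c], [a])      # live == [a, b]; c would be freed
```

Parallelizing the mark phase means several threads run `mark!` concurrently over disjoint parts of the object graph, which is what the talk's work enables.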

A recommender system is a data-driven application that generates personalized content for users. This talk shows how Julia can be a deeply satisfying option to capture the unique characteristics of recommenders, which rely heavily on repetitive matrix computations in multi-stage data pipelines. To build trustworthy systems in terms of not only accuracy and scalability but usability and fairness at large, we particularly focus on API design and evaluation methods implemented in Recommendation.jl.

An introduction to Transformers.jl and related packages for building transformer models.

G-Research is Europe's leading quantitative finance research firm.

The tension between performance and flexibility is always present when developing new systems. Often, poor performance is unacceptable. But poor flexibility hinders experimentation and evolution, which may lead to bad performance later on. In this talk, we show how we used MMTk.io – a toolkit we are developing that provides language implementers with a powerful garbage collection framework – to implement a flexible (unbaking the cake) and performant (and eating it too) memory manager for Julia.

With deep expertise in allied fields of clinical pharmacology, pharmacometrics, drug development, regulations and advanced data analytics including machine learning, Pumas-AI works with companies, laboratories and universities as their healthcare intelligence partner.

Restreaming of the earlier Keynote by Jeremy Howard

oneAPI.jl is a Julia package that makes it possible to use the oneAPI framework to program accelerators like Intel GPUs. In this talk, I will explain the oneAPI framework, which accelerators it supports, and demonstrate how oneAPI.jl makes it possible to work with these accelerators from the Julia programming language.

Julius offers an auto-scaling, low code graph computing solution that allows firms to quickly build transparent and adaptable data analytics pipelines.

Results of the Julia Developer Survey 2022

I present a new package which aims to automate the process of using reinforcement learning to solve discrete-time heterogeneous-agent macroeconomic models. Models with discrete choice, matching, aggregate uncertainty, and multiple locations are supported. The pure-Julia package, tentatively named Bucephalus.jl, also defines a data structure for describing this class of models, allowing new solvers to be easily implemented and models to be defined once and solved many ways.

We developed the open-source software package BlockDates using the Julia programming language to allow the extraction of fuzzy-matched dates from a block of text. The tool leverages contextual information and draws on external date data to find the best date matches. For each identified date, multiple interpretations are proposed and scored to find the best fit. The output includes several record-level variables that help explain the result and prioritize error detection.

PartitionedArrays is a distributed sparse linear algebra engine that allows Julia users to easily prototype and deploy large computations on distributed-memory HPC platforms. The long-term goal is to provide a Julia alternative to the parallel vectors and sparse matrices available in well-known distributed algebra packages such as PETSc. Using PartitionedArrays, application libraries have shown excellent strong and weak scaling results up to tens of thousands of CPU cores.

In JuMP 1.0, support for nonlinear programming is a second-class citizen. You must use the separate `@NL` macros, the automatic differentiation engine is a JuMP-specific implementation that cannot be swapped for alternative implementations, and vector-valued nonlinear expressions are not supported. In this talk, we discuss our plans and progress to address these issues and make nonlinear programming a first-class citizen. This work is supported by funding from Los Alamos National Laboratory.

The Julia HPC community has been growing in recent years, with monthly meetings to coordinate development and to solve problems arising in the use of Julia for high-performance computing.

The Julia in HPC Birds of a Feather is an ideal opportunity to join the community and to discuss your experiences with using Julia in an HPC context.

**Note:** We will host the BoF via Zoom and share the meeting link 15 min before start time in the #hpc channels of JuliaCon Discord and Julia Slack.

Julia's compiler spends almost all of its time generating, optimizing, and compiling LLVM IR. Currently, much of this work is done under one giant lock, which is also held during type inference, reducing compiler throughput in a multithreaded environment. By using finer-grained locking and handling LLVM IR in a threadsafe manner, we can reduce contention of compilation resources. This work also leads into future JIT optimizations such as lazy, parallel, and speculative compilation of Julia code.

A wide range of research on feedforward neural networks requires "bending" the chain rule during backpropagation. The package Bender.jl provides neural network layers (compatible with Flux.jl) which give users more freedom to choose every aspect of the forward mapping. This makes it easy to leverage ChainRules.jl to compose a wide range of experiments, such as training binary neural networks, Feedback Alignment, and Direct Feedback Alignment in just a few lines of code.

MATLAB is a proprietary programming language popular for scientific computing. Calling MATLAB code from Julia via the C API has been supported for many years via MATLAB.jl. The reverse direction is more complex. One approach is to compile Julia via the C++ MEX API as in Mex.jl. In MATDaemon.jl (https://bit.ly/3JxTFFU), we instead communicate by writing data to .mat files. This method is robust across Julia and MATLAB versions, and easy to use: just download jlcall.m from the GitHub repository.

In this talk, updates on the development of a GPU backend for Apple hardware (specifically the M-series chipset) will be presented along with a brief showcase of current capabilities and interface. The novel compilation flow will be explained and compared to the other GPU backends as well as the benefits and limitations of both a unified memory model and Apple's Metal capabilities. A brief overview of Apple's non-GPU hardware accelerators and their potential will also be discussed.

ArrayAllocators.jl uses the standard array interface to allow faster `zeros` with `calloc`, allocation on specific NUMA nodes of multi-processor systems, as well as aligned memory. The allocators are given as an argument to `Array{T}` in place of `undef`. Overall, this allows Julia to match the allocation performance of popular numerical libraries such as NumPy, which uses some of these techniques.

In this talk, we will also explore some of the unexpected properties of these allocation methods.
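The `calloc` trick can be sketched in plain Julia with `Libc.calloc` and `unsafe_wrap` (this mirrors the underlying idea; it is not ArrayAllocators.jl's implementation):

```julia
# calloc asks the OS for zeroed memory: freshly mapped pages are already
# zero, so no explicit fill pass over the array is needed.
n = 1024
ptr = Ptr{Float64}(Libc.calloc(n, sizeof(Float64)))
A = unsafe_wrap(Array, ptr, n; own = true)  # own=true: Julia's GC frees it

# A is all zeros, like zeros(Float64, n), but without writing zeros twice
# (once by the allocator, once by fill!).
```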

Treating deep neural networks probabilistically comes with numerous advantages including improved robustness and greater interpretability. These factors are key to building artificial intelligence (AI) that is trustworthy. A drawback commonly associated with existing Bayesian methods is that they increase computational costs. Recent work has shown that Bayesian deep learning can be effortless through Laplace approximation. This talk presents an implementation in Julia: `BayesLaplace.jl`.
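For context, the Laplace approximation replaces the intractable posterior over network weights with a Gaussian centred at the MAP estimate (the standard textbook formulation; notation here is generic, not necessarily the package's):

```latex
p(\theta \mid \mathcal{D}) \;\approx\; \mathcal{N}\!\left(\theta \,\middle|\, \hat{\theta}_{\mathrm{MAP}},\; H^{-1}\right),
\qquad
H = -\left.\nabla^2_{\theta} \log p(\theta \mid \mathcal{D})\right|_{\theta = \hat{\theta}_{\mathrm{MAP}}}
```

Only the Hessian \(H\) at the already-trained MAP solution is needed, which is why the approach adds little cost on top of standard training.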

Need to solve Ax=b for x? Then use A\b! Or wait, no. Don't. If you use that method, how do you swap that out for a method that performs GPU offloading? How do you switch between UMFPACK and KLU for sparse matrices? Krylov subspace methods? What does all of this mean and why is A\b not good enough? Find out all of this and more at 11. P.S. LinearSolve.jl is the answer.
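The Base building blocks the talk alludes to look like this (stdlib only; LinearSolve.jl wraps and unifies these choices behind one interface):

```julia
using LinearAlgebra, SparseArrays

A = [4.0 1.0; 1.0 3.0]
b = [1.0, 2.0]

x = A \ b          # dense solve: a factorization chosen for you, hidden away

F = lu(A)          # factor once...
x2 = F \ b         # ...then reuse F for many right-hand sides (b changes, A doesn't)

S = sparse(A)
xs = S \ b         # sparse solve: a different code path (sparse LU) entirely
```

Each `\` above silently commits to one algorithm; swapping in KLU, a Krylov method, or GPU offloading means rewriting the call, which is the gap LinearSolve.jl addresses.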

Mathematical models are crucial to build and predict the behaviour of new biological systems. However, selecting between plausible model candidates or estimating parameters is an arduous job, especially considering the different informative content of experiments. BOMBs.jl is a package to automate model simulations, pseudo-data generation, maximum likelihood estimation and Bayesian inference of parameters (Stan and Turing.jl), and design optimal experiments for model selection and inference.

This work discusses some of the requirements for deploying non-convex nonlinear optimization methods to solve large-scale problems in practice. AC Optimal Power Flow is proposed as a proxy-application for testing the viability of nonlinear optimization frameworks for solving such problems. The current performance of several Julia frameworks for nonlinear optimization is evaluated using a standard benchmark library for AC Optimal Power Flow.

The CALiPPSO.jl package implements a new algorithm for producing *disordered* sphere packings with very high accuracy. The algorithm reaches the critical jamming point of hard spheres through a chain of constrained linear optimization problems. CALiPPSO.jl exploits the functionality of JuMP for modelling and is thus compatible with several optimizers. In collaboration with C. Artiaco, G. Parisi, and F. Ricci Tersenghi.

Inspired by the compile-time features of Zig, we present CompTime.jl, a package that wraps Julia's support for generated functions into a seamless interface between compile-time and runtime semantics. The desire for this came from heavy use of @generated functions within Catlab.jl, and we have found that CompTime.jl makes our code more readable, maintainable, and debuggable. We will give a tutorial and then a brief peek into the implementation.
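For readers unfamiliar with the mechanism being wrapped, here is a plain `@generated` function in Base Julia (a generic illustration, not CompTime.jl's API): the body runs at compile time, once per concrete input type, and returns the expression that becomes the method body.

```julia
# The body sees only the *types* of the arguments (x is the type
# NTuple{N,Int} here, not a value) and builds an unrolled expression.
@generated function scale_by_index(x::NTuple{N,Int}) where {N}
    exprs = [:(x[$i] * $i) for i in 1:N]   # N is known at compile time
    return :(tuple($(exprs...)))           # fully unrolled, no runtime loop
end

scale_by_index((10, 20, 30))  # == (10, 40, 90)
```

CompTime.jl's contribution is letting such compile-time and runtime code share one readable definition instead of hand-writing expression-building bodies like this.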

BanyanONNXRunTime.jl is an open-source Julia package for running PyTorch/TensorFlow models on large distributed arrays. In this talk, we show how you can use BanyanONNXRunTime.jl with BanyanDataFrames.jl for running ML models on tabular data and with BanyanImages.jl for running ML models on image data.

Julia's Continuous Integration pipeline has struggled for many years now as the needs of the community have significantly outgrown the old Buildbot system. In this talk we will detail the efforts of the CI dev team to provide reliability, reproducibility, security, and greater introspective ability in our CI builds. These CI improvements aren't just helping the Julia project itself, but also other related open-source projects, as we continue to generate self-contained, useful building blocks.

We present SpeedyWeather.jl, a global atmospheric model currently developed as a prototype for a 16-bit climate model incorporating machine learning for accuracy and computational efficiency on different hardware. SpeedyWeather.jl is designed for type flexibility with low precision, and automatic differentiation to replace parts of the model with neural networks for a more accurate representation of climate processes and computational efficiency.

Arpack is a library for computing eigenvalues and eigenvectors of a linear operator. It has been used in many technical computing packages. The goal of the `GenericArpack.jl` package is to create a Julia translation of Arpack. Right now, the Julia `GenericArpack.jl` methods produce *bitwise identical* results to the `Arpack_jll` methods for Float64 types in all test cases. The new library has zero dependency on BLAS and supports element types beyond those in Arpack, such as `DoubleFloats.jl`.

In pursuit of interpreting black-box models such as deep image classifiers, a number of techniques have been developed that attribute and visualize the importance of input features with respect to the output of a model.

ExplainableAI.jl brings several of these methods to Julia, building on top of primitives from the Flux ecosystem. In this talk, we will give an overview of current features and show how the package can easily be extended, allowing users to implement their own methods and rules.

In this talk, we present Extremes.jl, a package that provides exhaustive high-performance functions for the statistical analysis of extreme values with Julia. Parameter estimation, diagnostic tools for assessing model accuracy and high quantile estimation are implemented for stationary and non-stationary extreme value models. The functionalities will be illustrated in this talk by reproducing many results from the popular book of Coles (2001).

InfiniteOpt.jl is built on a unifying abstraction for infinite-dimensional optimization problems that enables it to tackle a wide variety of problems in innovative ways. We present recent advances to InfiniteOpt.jl that significantly improve its flexibility to model/solve these challenging problems. We have developed a general transformation API to facilitate diverse solution methodologies, and we have created an intuitive nonlinear interface that overcomes the current shortcomings of JuMP.jl.

Have you ever wondered how many FLOPS your CPU or GPU actually performs when executing (parts of) your Julia code? Or how much data it has read from main memory or a certain cache? Then this talk is for you! I will present LIKWID.jl (Like I Knew What I'm Doing), a Julia wrapper around the same-named performance benchmarking suite, that allows you to analyse the performance of your Julia code by monitoring various hardware performance counters sitting inside your CPU or GPU.

Training artificial neural networks to recapitulate the dynamics of biological neuronal recordings has become a prominent tool to understand computations in the brain. We present an implementation of a recursive-least squares algorithm to train units in a recurrent spiking network. Our code can reproduce the activity of 50,000 neurons of a mouse performing a decision-making task in less than an hour of training time. It can scale to a million neurons on a GPU with 80 GB of memory.

SimpleChains is an open-source pure-Julia machine learning library developed by PumasAI and Julia Computing in collaboration with Roche and the University of Maryland, Baltimore.

It is specialized for relatively small models and NeuralODEs, attaining best-in-class performance for these problems. The performance advantage remains significant when scaling to tens of thousands of parameters, where it is still >5x faster than Flux or PyTorch running on a CPU, even outperforming GPUs.

`Manopt.jl` provides a set of optimization algorithms for problems given on a Riemannian manifold. Built upon a generic optimization framework, together with the interface `ManifoldsBase.jl` for Riemannian manifolds, classical and recently developed methods are provided in an efficient implementation. This talk will also present some algorithms implemented in the package.

Join us on Gather.town for a social hour.

This talk introduces GeometricTheoremProver.jl, a Julia package for automated deduction in Euclidean geometry. The talk will give a short overview of geometric theorem proving concepts and hands-on demos on how to use the package to write and prove statements in Euclidean geometry. A roadmap of the package for future development plans will also be presented.

Congratulations, the time has come for a dedicated forum for Spanish-speaking JuliaLang users.

We will discuss:

- forums and hubs where Julia is used in Spanish

- educational materials (courses, books, articles, video tutorials), and future plans

- diversity, inclusion, and support for Spanish speakers

Join the discussion on the bof-voice channel in Discord.

Enterprise adoption for Julia can be a difficult process for developers and engineers to champion. In this sponsored forum, we invite leading industry experts to talk about the common challenges organizations face when bringing Julia and Julia based solutions onboard. Join here.

OnlineSampling.jl is a Julia package for online Bayesian inference on reactive models, i.e., streaming probabilistic models.

OnlineSampling provides 1) a small macro-based domain-specific language to describe reactive models and 2) a semi-symbolic inference algorithm that combines exact solutions, using belief propagation for trees of Gaussian random variables, with approximate solutions using particle filtering.

Heterogeneous computing resources, such as GPUs, TPUs, and FPGAs, are widely used to accelerate computations, or make them possible, in scientific/technical computing. We will talk about how loose addressing of heterogeneous computing requirements in programming language designs affects portability and modularity. We propose contextual types to answer the underlying research questions, where programs are typed by their execution platforms and Julia's multiple dispatch plays an essential role.

While Julia is great, there is still a lot of useful differentiable Python code in PyTorch, Jax, etc. Given that PyCall.jl is already so seamless, one might wonder what it takes to differentiate through those calls to Python functions. PyCallChainRules.jl aims for that ideal. DLPack.jl is leveraged to pass CPU or GPU arrays without any copies between Julia and Python.

The JSO organization is a set of Julia packages for smooth and nonsmooth optimization and numerical linear algebra, intended to work consistently together and exploit the structure present in problems. It provides modeling facilities and widely useful known methods, either in the form of interfaces or pure Julia implementations, but also unique methods that are the product of active research. We review the main features of JSO, its current status, and hint at future developments.

In the next decade, forthcoming galaxy surveys will provide the astrophysical community with an unprecedented wealth of data. The standard analysis pipelines usually employed to analyze these surveys are quite expensive from a computational point of view.

In this presentation I will show how, using Flux.jl and DifferentialEquations.jl, it is possible to accelerate standard analyses by several orders of magnitude.

We present a preliminary version of a SIMD-vectorized implementation of the sixteenth order 8-stage implicit Runge-Kutta integrator IRKGL16 implemented in the Julia package IRKGaussLegendre.jl. For numerical integrations of typical non-stiff problems performed in double precision, we show that a vectorized implementation of IRKGL16 that exploits the SIMD-based parallelism can clearly outperform high order explicit Runge-Kutta schemes available in the standard package DifferentialEquations.jl.

Automatic Differentiation (AD) is widely applied in many different fields of computer science and engineering to accurately evaluate derivatives of functions expressed in a computer programming language. In this talk we illustrate the use of AD for the solution of Finite Elements (FE) problems with special emphasis on solid mechanics.

Many remote electronic voting systems use the ElGamal re-encryption mixnet as the foundation of their design, motivated by a number of ways authorities can be held accountable. In particular, zero-knowledge proofs of shuffle as implemented in the Verificatum library offer an elegant and well-established solution. In ShuffleProofs.jl, I implement a Verificatum-compatible verifier and prover for non-interactive zero-knowledge proofs of shuffle, making it more accessible, as I shall demonstrate.

Tools for performing autodifferentiation (AD) and dimensional work in Julia are robust, but not always compatible. This talk explores how we can understand rule-based AD in Julia by showing how to make dimensional quantities from `Unitful.jl` compose with `ChainRules.jl`. Combining these two projects produces an intuitive look at the building blocks of AD in Julia using only rudimentary calculus and dimensional analysis.
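The "building blocks of AD from rudimentary calculus" can be illustrated with a hand-rolled forward-mode dual number. The sketch below is independent of both `ChainRules.jl` and `Unitful.jl` and uses no package APIs; it only shows the kind of derivative rules the talk composes:

```julia
# Minimal forward-mode AD with dual numbers: each value carries its
# derivative, and each primitive supplies a rule from basic calculus.
# Hand-rolled sketch -- no ChainRules.jl or Unitful.jl APIs involved.
struct Dual
    val::Float64   # primal value
    der::Float64   # derivative with respect to the input
end

Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
Base.sin(a::Dual) = Dual(sin(a.val), cos(a.val) * a.der)

# d/dx [x * sin(x)] = sin(x) + x * cos(x); evaluate it at x = 2.
x = Dual(2.0, 1.0)   # seed the input's derivative with 1
y = x * sin(x)
y.der                # equals sin(2) + 2 * cos(2)
```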

We present LowRankArithmetic.jl and LowRankIntegrators.jl. Together, these packages form the backbone of a computational infrastructure that enables simple and non-intrusive use of dynamical low-rank approximation for on-the-fly compression of large matrix-valued data streams, or for the approximate solution of otherwise intractable matrix-valued ODEs. We showcase the utility of these packages for the quantification of uncertainty in scientific models.

MagNav.jl is an open-source Julia package that contains a full suite of tools for aeromagnetic compensation and airborne magnetic anomaly navigation. This talk will describe the high-level functionalities of the package, then provide a brief tutorial using real flight data that is available within the package. The functionalities can be divided into the four essential components of MagNav: sensors (flight data), magnetic anomaly maps, aeromagnetic compensation models, and navigation algorithms.

Why did `exp10` get 2x faster in Julia 1.6? One reason is that, unlike most other languages, Julia doesn't use the operating system's implementations of the math library (libm). This talk will give an overview of improvements in Julia's math library since version 1.5, and areas for future improvement. Topics will include computing optimal polynomials, table-based implementations, and bit-hacking for peak performance.
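Much of this work boils down to evaluating carefully chosen polynomials with Horner's scheme, which Base exposes as `evalpoly`. The sketch below uses plain Taylor coefficients of `exp` purely for illustration; the actual library kernels use specially fitted minimax coefficients plus argument reduction:

```julia
# Fast math kernels are largely polynomial evaluations via Horner's
# scheme, which Base exposes as `evalpoly`. For illustration we use
# truncated Taylor coefficients of exp; real library kernels use
# specially fitted minimax coefficients and argument reduction.
coeffs = ntuple(k -> 1.0 / factorial(k - 1), 10)   # 1, 1, 1/2!, ..., 1/9!

exp_poly(x) = evalpoly(x, coeffs)

exp_poly(0.5)   # agrees closely with exp(0.5) near zero
```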

In this presentation, we showcase a new optimization infrastructure within JuliaSmoothOptimizers for PDE-constrained optimization problems in Julia. We introduce PDENLPModels.jl, a package that discretizes PDE-constrained optimization problems using finite element methods via Gridap.jl. The resulting problem can then be solved by solvers tailored for large-scale optimization and implemented in pure Julia, such as DCISolver.jl and FletcherPenaltyNLPSolver.jl.

Optimization.jl seeks to bring together all of the optimization packages it can find, local and global, into one unified Julia interface. This means you learn one package and you learn them all! Optimization.jl adds a few high-level features, such as integration with automatic differentiation, to make its usage fairly simple for most cases, while exposing all of the options in a single unified interface.

How do we trust that a given fluid model is suitable for simulating water waves as they approach and wash over the land? This talk presents some of the benchmark tests used to validate a tsunami model. Using our Julia implementation of a fluid model, we check how well it conserves mass, matches analytical solutions, and reproduces laboratory experiments.

Data visualization with intuitive interactions is an essential feature of many scientific investigations. I propose to go over use cases and examples of why and how to develop reactive dashboards in Julia using Pluto.jl. Pluto provides a way to isolate cells in a separate page whose style is editable as regular HTML/CSS. Alongside PlutoUI's experimental layout feature, this is a powerful tool for creating immersive interactive experiences for users.

JCheck is a native Julia implementation of a randomized property testing (RPT) framework. It aims to integrate as seamlessly as possible with the Test.jl package in order to enable developers to easily use RPT alongside more "traditional" approaches. Although a fair number of generators are included, designing novel ones for custom data types is a straightforward process. Additional features such as shrinkage and specification of "special" non-random inputs are available.
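The core loop of randomized property testing is simple to sketch in plain Julia. The following is a hand-rolled illustration of the idea only, not JCheck's actual API (which integrates with Test.jl and adds shrinkage):

```julia
# The essence of randomized property testing, hand-rolled in a few
# lines. Illustration of the idea only -- not JCheck's actual API.
function check_property(prop, gen; trials = 100)
    for _ in 1:trials
        x = gen()
        prop(x) || return x   # report a counterexample if one is found
    end
    return nothing            # the property held on every sampled input
end

# Property: reversing a vector twice yields the original vector.
gen = () -> rand(Int8, rand(0:10))
check_property(v -> reverse(reverse(v)) == v, gen)   # nothing: no counterexample
```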

To study the cosmos, astronomers examine images captured of light exceeding human-visible colors and dynamic range. AstroImages.jl makes it easy to load, manipulate, and visualize astronomical data intuitively and efficiently using arbitrary color-schemes, stretched color scales, RGB composites, PNG rendering, and plot recipes. Come to our talk to see how you too can create beautiful images of the universe!

We present a Julia package (DisjunctiveProgramming.jl) that extends the functionality in JuMP to allow modeling problems via logical propositions and disjunctive constraints. Logical propositions are converted into algebraic expressions by converting the Boolean expressions to Conjunctive Normal Form and then to algebraic inequalities. The package allows the user to specify the technique to reformulate the disjunctions (Big-M or Convex-Hull reformulation) into mixed-integer constraints.

JuliaSyntax.jl is a new Julia language frontend designed for precise error reporting, speed and flexibility. In this talk we'll tour the JuliaSyntax parser implementation and tree data structures, highlighting benefits for users and tool builders. We'll discuss how to losslessly map Julia source text for character-precise error reporting and how a "parse stream" abstraction cleanly separates the parser from syntax tree creation while being 10x faster than Julia's reference parser.

This talk will present a deep dive into Juliaup, the upcoming new official Julia installer and version multiplexer. The talk will give a brief presentation of the features of Juliaup, then dive into design decisions, integration with existing system package managers, and an outlook on planned future work.

Microbiome.jl is a Julia package to facilitate analysis of microbial community data. BiobakeryUtils.jl is built on top of Microbiome.jl, and provides utilities for working with a suite of command line tools (the bioBakery) that are widely used for converting raw metagenomic sequencing data into tables of taxon and gene function counts. Together, these packages provide an effective way to link microbial community data with the power of Julia’s numerical, statistical, and plotting libraries.

The goal of this talk is to show members of the Julia ecosystem how they can make an impact by contributing to open source with technical writing. While this talk is targeted at beginners, there will be something for even the more experienced members.

Join us on Gather.town for a social hour.

We present JustSayIt.jl, a software package and high-level API for offline, low-latency, and secure translation of human speech to computer commands or text, leveraging the Vosk Speech Recognition Toolkit. The API includes an unprecedented, highly generic extension to the Julia programming language that allows arguments in standard function definitions to be declared as obtainable by voice. As a result, it empowers any programmer to quickly write new commands that take arguments from human voice.

As generalizations of classical calculus and differential equations, fractional calculus and fractional differential equations have been important areas since their invention. To provide a comprehensive differential equations package, SciFracX is here to explore the fractional-order area with Julia. At JuliaCon 2022, we will talk about the progress we have made in FractionalDiffEq.jl and FractionalCalculus.jl, and how Julia helped us speed up fractional-order modeling and computation.

On paper, Julia and its ecosystem are a perfect match for space engineering. We have all the cool tools at our disposal, and the number of people working on great Julia-based solutions for space engineering is increasing year over year. What else do we need to gain orbital velocity?

Let's have a discussion, come up with a plan, and let's go light this candle!

Join the discussion on the bof-voice channel in Discord.

Quarto is an open-source scientific and technical publishing system that builds on standard markdown with features essential for scientific communication. The system has support for reproducible embedded computations, equations, citations, crossrefs, figure panels, callouts, advanced layout, and more. In this talk we'll explore the use of Quarto with Julia, describing both integration with IJulia and the Julia VS Code extension, as well as areas for future improvement and exploration.

We present a Julia package for differentiating through functions that are defined implicitly. It can be used to compute derivatives for a wide array of "black box" procedures, from optimization algorithms to fixed point iterations or systems of nonlinear equations.

Since it mostly relies on defining custom chain rules, our code is lightweight and integrates nicely with Julia's automatic differentiation and machine learning ecosystem.
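The underlying mathematics is the implicit function theorem: if F(x, θ) = 0 defines x(θ), then dx/dθ = -(∂F/∂x)⁻¹ ∂F/∂θ, regardless of how the solver found x. Below is a minimal plain-Julia sketch of that idea with hand-written partial derivatives; it is not the package's API:

```julia
# Implicit function theorem, hand-coded (not the package's API): if
# F(x, θ) = 0 defines x(θ), then dx/dθ = -(∂F/∂x)⁻¹ ∂F/∂θ, no matter
# how the inner solver found x.
F(x, θ) = x^2 - θ            # the solution is x(θ) = √θ

# A "black box" solver: a few Newton iterations.
function solve_x(θ; iters = 20)
    x = 1.0
    for _ in 1:iters
        x -= F(x, θ) / (2 * x)
    end
    return x
end

# Partial derivatives of F, written by hand for this toy example.
dFdx(x, θ) = 2 * x
dFdθ(x, θ) = -1.0

function dx_dθ(θ)
    x = solve_x(θ)                   # solve without tracking derivatives
    return -dFdθ(x, θ) / dFdx(x, θ)  # differentiate through the solution
end

dx_dθ(4.0)   # → 0.25, matching d√θ/dθ = 1/(2√θ) at θ = 4
```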

We introduce a new plotting package allowing users to easily create publication-quality figures in W.E.B. Du Bois’s unique style of data visualizations. A groundbreaking sociologist and historian, Du Bois collected data on Black Georgia residents in the late 19th century and designed over 60 eye-catching graphs to depict these data at the 1900 Paris Exposition. We showcase our package by replicating the original figures exactly and by revisiting them with new data.

Based on the Makie library, AlgebraOfGraphics offers visualizations for common analyses (frequency table, 1- or 2-D histogram and kernel density, linear and non-linear regression...), as well as functions to express how the data should be processed, grouped, styled, and visualized. These building blocks can be combined using the `*` and `+` operators, thus forming a rich algebra of visualizations. The unified syntax layer simplifies the creation of AlgebraOfGraphics-based UIs for data analysis.

Various ways of calculating three-dimensional optical point spread functions (PSFs) are presented. The methods account for the vector nature of the optical field as well as phase aberrations. Quantitative comparisons in terms of speed and accuracy will be presented.

When WhereTraits.jl was published 2 years ago, the key missing feature was handling ambiguities between trait function definitions. It is implemented now!

If you as a user encounter a trait conflict, you are now prompted with a concrete example resolution. You simply specify an ordering between the traits and everything is resolved automatically for you.

This feature is only available in WhereTraits: even normal Julia functions cannot do this.

I will talk about which methods get called by `which(methods)` calls. How does Julia decide? How fast does it decide? And when does it figure it all out?

Let’s take a lightning fast dive together into the complexities of the method selection algorithm we affectionately call multiple-dispatch.
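As a warm-up for the dive, Base's reflection tools already let you ask the dispatcher which method it would select for a given argument-type tuple (the `area` function below is a made-up example, not from the talk):

```julia
# Base's reflection tools let you ask the dispatcher directly which
# method it would pick for a given argument-type tuple.
area(r::Real) = π * r^2    # general method
area(r::Int)  = π * r^2    # a more specific method for Int

m = which(area, (Int,))    # dispatch selects the most specific match
m.sig                      # the signature of the chosen method
```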

A critical component of any programming language’s potential for impact is the diversity of its community! A supportive, inclusive community draws in new learners, brings fresh perspectives to package development, and ultimately expands the reach a language has.

We are a team of scientists and engineers working together to solve the social, economic and environmental issues that we face in the world today.

This session hosts all of this year's experience talks.

QuEra is a neutral-atom based quantum computing startup located in the heart of Boston near Harvard University.

For many years we simply accepted the two language problem at our company and spent our time converting MATLAB/Python prototypes into C/C++/Java production code. But during the last two years we have been growing an internal Julia community from 3 initial enthusiasts to over 300 Julians. We would like to share our ongoing journey with you and inspire other Julians who want to kickstart similar communities at their company.

In this talk, I intend to discuss the use of Firebase in Julia through Firebase.jl

https://github.com/ashwani-rathee/Firebase.jl

Many databases are well supported in Julia, but support for Firebase is rather limited, which is an issue. We want to attract more young people to the Julia community, and a big chunk of these people prefer to use Firebase in their relatively small projects. Through this talk, I want to demonstrate how to use Firebase.jl for project development.

Keynote - Husain Attarwala, Moderna

At RelationalAI, we are building the world’s fastest, most scalable, most expressive, most open knowledge graph management system, built on top of the world’s only complete relational reasoning engine that uses the knowledge and data captured in enterprise databases to learn and reason.

An update on Julia from the core development team.

Closing remarks

The Julia language is uniquely suitable for control-systems analysis and design. Features like a mathematical syntax, powerful method overloading, strong and generic linear algebra, and arbitrary-precision arithmetic, all while allowing high performance, create a compelling platform on which to build a control ecosystem. We will present the JuliaControl packages and illustrate how they make use of Julia to enable novel and sophisticated features while keeping implementations readable and maintainable.

Dagger.jl is a Julia library aiming to improve the way Julia users do distributed programming. With its functional task-focused API, distributed table and array implementations, and intelligent scheduler, Dagger is quickly becoming the de-facto distributed programming interface for many parts of our ecosystem.

This talk is focused on Dagger's development over the last year, and where we see Dagger going over the next few years. I'll also provide examples of how to use Dagger's new features.

DiffOpt aims at differentiating optimization problems written in MathOptInterface. Moreover, everything “just works” in JuMP. The current framework is based on existing techniques for differentiating the solution of optimization problems with respect to the input parameters. We will show the current state of the package that supports Quadratic Programs and Conic Programs. Moreover, we will highlight how other packages are used to keep the library generic and efficient.

Genie provides a powerful set of features for fast and easy creation of interactive data dashboards, helping data and research scientists to design, build, and publish production ready interactive apps and dashboards using pure Julia. In this talk we'll explain and demonstrate how to build a production ready, powerful data dashboard, going from 0 to live in 20 minutes!

Real-world problems require sophisticated methodologies providing feasible and efficient solutions. Metaheuristics are algorithms proposed to approximate those optimal solutions in a short time, making them suitable for applications where saving time is important. Metaheuristics.jl package implements relevant state-of-the-art algorithms for constrained, multi-, many-objective and bilevel optimization. Moreover, performance indicators are implemented in this package.

We believe that Julia is uniquely well-positioned to pioneer new approaches to dataflow orchestration that are currently dominated by monolithic frameworks. In this BoF, Julia's nascent Data Engineering community will swap experiences and identify opportunities to collaborate on open-source next-generation data engineering tools.

Join the discussion on the bof-voice channel in Discord.

At RelationalAI, we are building the world’s fastest, most scalable, most expressive, most open knowledge graph management system, built on top of the world’s only complete relational reasoning engine that uses the knowledge and data captured in enterprise databases to learn and reason. Join the sponsored forum here.

DTables.jl is a distributed table implementation based on Dagger.jl. It aims to provide distributed and out-of-core tabular data processing for the Julia programming language. The DTables package consists of data structures and distributed algorithms, and is built to be compatible with our rich data-processing ecosystem.

The talk covers a quick intro on how to use the DTable, what functionality is currently available and what are our plans for the future!

The modified Bessel function of the second kind, provided by `SpecialFunctions.jl` as `besselk`, is an important function in several fields. Despite its significance, convenient numerical implementations of its derivatives with respect to the order parameter are not easily available. In this talk, we discuss a solution to this problem that leverages Julia's exceptional automatic differentiation ecosystem to provide fast and accurate derivatives with respect to order.

In this talk, we will give an annual update on the current diversity and inclusion efforts underway in the community. We will also present stats from Google Analytics showing aggregate country of origin, gender, and age. These stats will help provide additional context for the continued challenge of D&I in the Julia community and will set the stage for the Julia Inclusive BoF session.

SQL is far from the only declarative paradigm for specifying the dynamics of data! The field of graph transformation formalizes a generalization of term rewriting systems that is visual, intuitive, and applicable to a wide array of data structures, including Catlab's ACSet datatypes. We will describe the basic theory of graph transformation and show how our implementation in Catlab.jl can be applied to e-graph equality saturation and general agent-based model simulations.

We propose `CounterfactualExplanations.jl`: a package for explaining black-box models through counterfactuals. Counterfactual explanations are based on the simple idea of strategically perturbing model inputs to change model predictions. Our package is novel, easy-to-use, and extensible. It can be used to explain custom predictive models, including those developed and trained in other programming languages.

We present InferOpt.jl, a generic package for combining combinatorial optimization algorithms with machine learning models. It has two purposes:

- Increasing the expressivity of learning models thanks to new types of structured layers.
- Increasing the efficiency of optimization algorithms thanks to an additional inference step.

Our library provides wrappers for several state-of-the-art methods in order to make them compatible with Julia's automatic differentiation ecosystem.

ParametricOptInterface.jl is a MathOptInterface extension that helps users deal with parameters in MOI/JuMP. The package started as a GSOC project in 2020 and has seen some new developments in recent months. The goal of this talk is to show the current state of ParametricOptInterface amid the JuMP ecosystem as well as to show some interesting use cases of the package.

Risk budgeting is a portfolio strategy where each asset contributes a pre-specified amount to the total portfolio risk. We propose a numerical framework in JuMP that uses only simulations of returns for estimating risk budgeting portfolios, and provide a Sample Average Approximation algorithm. We leveraged automatic differentiation and JuMP's modeling flexibility to build clear and concise code. We also report on memory issues encountered when solving for every day in a 14-year horizon.
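For context, the risk-contribution identity that risk budgeting targets can be checked in a few lines of Julia. This is a sketch of the Euler decomposition only, not the JuMP/SAA model from the talk, and the covariance matrix and weights below are made-up illustrative numbers:

```julia
using LinearAlgebra

# Euler's risk decomposition, the identity risk budgeting is built on:
# total risk σ = √(wᵀΣw) splits into per-asset contributions wᵢ(Σw)ᵢ/σ.
# The covariance matrix and weights are made-up illustrative numbers.
Σ = [0.04  0.006;
     0.006 0.09]
w = [0.6, 0.4]

σ = sqrt(dot(w, Σ * w))
contributions = w .* (Σ * w) ./ σ   # per-asset risk contributions

sum(contributions) ≈ σ   # the contributions always sum to the total risk
```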

In this talk, we will present the JuliaCon proceedings, and the purpose, scope, and target audience of this venue. The proceedings are a community-driven initiative to publish articles of interest to the research and developer communities gathered by JuliaCon; they require neither article processing fees nor a paywall, making both producing and accessing the articles possible for all. We will then give a quick tour of the reviewing and publication process, which happens transparently in

This work is focused on the development of an open-source Julia package for the stochastic characterization and study of chaotic motion in astrodynamics. We focus on the computation of various chaos indicators, among them Fast Lyapunov Indicators (FLI), Finite-Time Lyapunov Exponents (FTLE), and the Mean Exponential Growth factor of Nearby Orbits (MEGNO).

Fantasy Premier League is an online fantasy sports game where you select a team of 15 players and score points based on their performance each week. You have a finite budget, each player costs a certain amount, and there are a number of other constraints, which makes this an optimisation problem that JuMP can solve. In this talk I will work through this problem and show how each constraint is translated into the JuMP language. It will be a fun introduction to optimisation in an alternative domain.

Broadly, there are two paradigms of interfacing with a UI library to create a Graphical User Interface (GUI) - Retained-Mode (RM) and Immediate-Mode (IM). This talk is for anyone who wants to understand how to make an immediate-mode GUI from scratch. I will explain the inner workings of an immediate-mode UI library and show one possible way to implement simple widgets like buttons, sliders, and text-boxes from scratch.

Link: https://github.com/Sid-Bhatia-0/SimpleIMGUI.jl
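To make the contrast with retained mode concrete, here is a minimal sketch of an immediate-mode button in plain Julia. It is a simplified illustration of the pattern, not the SimpleIMGUI.jl API, and it omits drawing entirely:

```julia
# Minimal immediate-mode button: declared, "drawn", and queried every
# frame, with no retained widget object or callback registration.
# Simplified sketch of the pattern -- not the SimpleIMGUI.jl API.
struct MouseState
    x::Int
    y::Int
    clicked::Bool
end

inside(m::MouseState, x, y, w, h) = (x <= m.x < x + w) && (y <= m.y < y + h)

function button(mouse::MouseState, x, y, w, h)
    hovered = inside(mouse, x, y, w, h)
    # a real library would emit draw commands for the button here
    return hovered && mouse.clicked   # true only on the click frame
end

# Three "frames" of input: hover, in-bounds click, out-of-bounds click.
frames = (MouseState(5, 5, false), MouseState(5, 5, true), MouseState(50, 50, true))
count(m -> button(m, 0, 0, 10, 10), frames)   # → 1: only the in-bounds click fires
```

The caller reacts to the return value inline each frame, so application state lives in the application rather than in the widget, which is the defining trade-off versus retained-mode callbacks.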

A major part of Julia's success as a language has come from a large community of user advocates. User advocacy continues to be one of the most effective outreach mechanisms and this talk aims to help improve the approach of those seeking to advocate for Julia.

In this talk, we will address the problem of data-driven estimation and approximation of completely or partially unknown systems using DataDrivenDiffEq.jl.

We will start by giving a short introduction to the field of symbolic regression in general followed by an example of its practical use.

Here we learn how to:

- set up a DataDrivenProblem,
- use ModelingToolkit.jl to incorporate prior knowledge,
- use different algorithms to recover the underlying equations.
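The flavor of the recovery step can be sketched without the package: regress derivative data on a library of candidate terms and threshold small coefficients. This is a toy "SINDy-style" illustration in plain Julia, not the DataDrivenDiffEq.jl API:

```julia
using LinearAlgebra

# Toy sparse-regression recovery sketch (not the DataDrivenDiffEq.jl
# API): fit derivative data against a library of candidate terms by
# least squares, then threshold small coefficients to enforce sparsity.
x  = collect(0.1:0.1:2.0)
dx = -2 .* x                          # synthetic data for ẋ = -2x

Θ = hcat(ones(length(x)), x, x .^ 2)  # candidate library: [1, x, x²]
ξ = Θ \ dx                            # least-squares coefficients
ξ[abs.(ξ) .< 1e-8] .= 0.0             # hard thresholding

ξ   # ≈ [0.0, -2.0, 0.0]: only the x term survives
```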

We present MarkovBounds.jl -- a meta-package to SumOfSquares.jl which enables the computation of guaranteed bounds on the optimal value of a large class of stochastic optimal control problems via a high-level, practitioner-friendly interface.

In this talk, I present ProjectionPursuit.jl, a package designed to address a limitation of PCA: its lack of flexibility for dimension reduction. I also discuss the background of projection pursuit and why the results of PCA can be misleading, and compare projection pursuit with PCA on some data examples.

An alternative title for this talk is “Bring Your Own Objective Function: Why PCA Can Be a Bad Idea”.

"Half the money I spend on advertising is wasted; the trouble is I don't know which half." (J.Wanamaker, 19th-century retailer)

Optimizing marketing spend is still difficult, but this talk introduces a modern marketing analysis: Media Mix Modelling (MMM).

We will combine the strength of Julia with Bayesian decision-making to optimize marketing spend for a hypothetical business.
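A typical MMM ingredient is the geometric adstock transform, which models how past advertising spend keeps contributing to demand with some decay rate. The sketch below is a generic illustration of that transform, not code taken from the talk's repository:

```julia
# Geometric adstock, a standard media-mix-modelling transform: past
# advertising spend keeps contributing to demand with decay rate θ.
# Generic illustration -- not code from the talk's repository.
function adstock(spend::Vector{Float64}, θ::Float64)
    out = similar(spend)
    carry = 0.0
    for t in eachindex(spend)
        carry = spend[t] + θ * carry   # today's spend plus decayed history
        out[t] = carry
    end
    return out
end

adstock([100.0, 0.0, 0.0, 0.0], 0.5)   # → [100.0, 50.0, 25.0, 12.5]
```

In a Bayesian MMM the decay rate θ would itself be a parameter to infer; here it is fixed for illustration.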

Find more details in the associated GitHub repository.

StatsModels.jl provides the `@formula` mini-language for conveniently specifying table-to-matrix transformations for statistical modeling. RegressionFormulae.jl extends this mini-language with additional syntax that users coming from other statistical modeling ecosystems, such as R, may be familiar with. This package also serves as a template for developers who wish to expand the StatsModels.jl `@formula` syntax in their own packages.

Random utility models are widely used in social science. While most statistical software, including Julia, has some facilities for estimating multinomial logit models, more advanced models such as mixed logit models and models with different utility functions for different outcomes generally require specific choice modeling software. This presentation describes a new package, DiscreteChoiceModels.jl, which provides flexible and high-performance multinomial and forthcoming mixed logit estimation.

Join us on Gather.town for a social hour.

This talk introduces GeneDrive.jl, a package designed to study the effect of biotic and abiotic interactions on metapopulations, outlining functionalities and use cases. GeneDrive.jl is a 3-part framework for building and analyzing simulations wherein organisms are subjected to anthropogenic and environmental change. It includes: (1) Data models that exploit the power of Julia's type system. (2) Dynamic models that build on DifferentialEquations.jl. (3) Decision models that employ JuMP.jl.

BanyanDataFrames.jl is an open-source library for processing massive Parquet/CSV/Arrow datasets in your Virtual Private Cloud. One of the key goals of the project is to match the API of DataFrames.jl as closely as possible. In this talk, we will provide an overview of BanyanDataFrames.jl and discuss challenges and successes so far in achieving massively scalable data analytics with the Julia language.

We propose a prototype of a vectorized modeler written in pure Julia, targeting the resolution of large-scale nonlinear optimization problems. The prototype has been designed to evaluate the problem's expression tree seamlessly on GPU accelerators. We discuss the implementation and the challenges we have encountered, as well as preliminary results comparing our prototype with JuMP's AD backend.

We introduce RandomizedPreconditioners.jl, a package for preconditioning linear systems using randomized numerical linear algebra. Crucially, our preconditioners do not require a priori knowledge of structure present in the linear system, making them especially useful for general-purpose algorithms. We demonstrate significant speedups of positive semidefinite linear system solves, which we use to build fast constrained optimization solvers.
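The key primitive behind such randomized methods is range-finding: a Gaussian sketch A·Ω captures the dominant column space of A. Below is a generic plain-Julia illustration of that idea, not RandomizedPreconditioners.jl's API:

```julia
using LinearAlgebra, Random

# Randomized range-finding, the primitive behind randomized low-rank
# methods: a Gaussian sketch A*Ω captures A's dominant column space.
# Generic illustration -- not RandomizedPreconditioners.jl's API.
Random.seed!(0)
n, r = 50, 3
A = let B = randn(n, r); B * B' end   # a rank-3 PSD matrix

Ω = randn(n, r + 2)                   # Gaussian sketch with oversampling
Q = Matrix(qr(A * Ω).Q)               # orthonormal basis for the range
Â = Q * (Q' * A)                      # low-rank approximation Q*Qᵀ*A

norm(A - Â) / norm(A)                 # ~machine precision: range captured
```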

The ODE solver spit out `dt < dtmin`: what do you do? You got a `MethodError` involving `Dual{...}`: what does it mean? Plenty of people ask these questions every day. In this talk I'll walk through the steps of debugging Julia simulation codes and help you get something working!

In this talk, we will be discussing some of the state of the art techniques to scale training of ML models beyond a single GPU, why they work and how to scale your own ML pipelines. We will be demonstrating how we have scaled up training of Flux models both by means of data parallelism and by model parallelism. We will be showcasing ResNetImageNet.jl and DaggerFlux.jl to accelerate training of deep learning and scientific ML models such as PINNs and the scaling it achieves.

Join us on Gather.town for a social hour.