Mridul Seth

I am currently working on the NetworkX open source project (work funded through a grant from the Chan Zuckerberg Initiative!). I also collaborate with folks from the Scientific Python project (Berkeley Institute for Data Science) and Anaconda Inc. Before this I worked on the GESIS notebooks and gesis.mybinder.org.
I am also interested in the development and maintenance of the open source data and scientific software ecosystem, and I try to help out around the scientific open source ecosystem wherever possible. To share my love of Python and network science, I have presented workshops at multiple conferences, including PyCon, (Euro)SciPy, PyData London, and many more!


Institute / Company

NetworkX

GitHub

github.com/mriduls

Homepage

mriduls.com

Twitter handle

@Mridul_Seth


Sessions

08-14
08:30
90min
Network Analysis Made Simple (and fast!)
Mridul Seth

Through the use of NetworkX's API, tutorial participants will learn about the basics of graph theory and its use in applied network science. Starting with a computationally oriented definition of a graph and its associated methods, we will build up to progressively more advanced concepts (path and structure finding). We will also discuss recent work to speed up NetworkX code by dispatching to alternate computation backends like GraphBLAS. This will be a hands-on tutorial, so stretch your muscles and get ready to go through the exercises!
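
To give a flavour of the material, here is a minimal sketch of the kind of NetworkX workflow the tutorial covers, including the optional backend dispatching. The backend name "graphblas" assumes the optional graphblas-algorithms package is installed and registered; treat it as an illustration rather than required setup.

    import networkx as nx

    # A small built-in social network to experiment with
    G = nx.karate_club_graph()

    # Basic graph-theoretic queries
    print(G.number_of_nodes(), G.number_of_edges())
    print(nx.shortest_path(G, source=0, target=33))

    # Structure finding with the default pure-Python implementation
    pr = nx.pagerank(G)

    # The same call dispatched to an alternate backend (assumes the
    # graphblas-algorithms backend is installed and registered as "graphblas")
    # pr_fast = nx.pagerank(G, backend="graphblas")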

Data Science and Visualisation
Aula
08-17
10:30
90min
Interoperability in the Scientific Python Ecosystem
Joris Van den Bossche, Tim Head, Olivier Grisel, Franck Charras, Mridul Seth, Sebastian Berg

This slot will cover ongoing efforts around interoperability in the scientific Python ecosystem. Topics:

  • Using the Array API for array-producing and array-consuming libraries
  • DataFrame interchange and namespace APIs
  • Apache Arrow: connecting and accelerating dataframe libraries across the PyData ecosystem
  • Entry Points: Enabling backends and plugins for your libraries

Using the Array API for array-producing and array-consuming libraries

Already using the Array API or wondering if you should in a project you maintain? Join this maintainer track session to share your experience and exchange knowledge and tips around building array libraries that implement the standard or libraries that consume arrays.
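
As a reference point for the discussion, here is a hedged sketch of array-consuming code written against the Array API standard. It assumes the array-api-compat helper package for obtaining a standard-compliant namespace; the softmax function itself is a hypothetical example, not part of any library.

    import array_api_compat
    import numpy as np

    def softmax(x, axis=-1):
        # Obtain the standard-compliant namespace of whatever array we received
        # (NumPy, CuPy, PyTorch, ... as long as it supports the Array API)
        xp = array_api_compat.array_namespace(x)
        shifted = x - xp.max(x, axis=axis, keepdims=True)
        e = xp.exp(shifted)
        return e / xp.sum(e, axis=axis, keepdims=True)

    print(softmax(np.asarray([1.0, 2.0, 3.0])))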

DataFrame-agnostic code using the DataFrame API standard

The DataFrame Standard provides you with a minimal, strict, and predictable API for writing code that will work regardless of whether the caller uses pandas, polars, or some other library.
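
To illustrate what "dataframe-agnostic" means here, below is a hedged sketch loosely following the standard's published examples. The __dataframe_consortium_standard__ entry point, the api_version string, and the need for the dataframe-api-compat compatibility package (for pandas and polars) are assumptions about the current state of the standard, not settled API.

    def standardise(df):
        # Opt in to the standard API, whether `df` is pandas, polars, or another library
        df_std = df.__dataframe_consortium_standard__(api_version="2023.11-beta")
        for name in df_std.column_names:
            col = df_std.col(name)
            df_std = df_std.assign(((col - col.mean()) / col.std()).rename(f"{name}_scaled"))
        # Return the native dataframe object of whatever library the caller used
        return df_std.dataframe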

DataFrame Interchange protocol and Apache Arrow

The DataFrame interchange protocol and Arrow C Data interface are two ways to interchange data between dataframe libraries. What are the challenges and requirements that maintainers encounter when integrating this into consuming libraries?
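
For context, this is roughly what consuming an arbitrary dataframe looks like through the interchange protocol and through Arrow; it assumes reasonably recent pandas (>= 1.5), polars, and pyarrow (>= 11) installs.

    import pandas as pd
    import polars as pl
    import pyarrow.interchange

    pl_df = pl.DataFrame({"a": [1, 2, 3], "b": [4.0, 5.0, 6.0]})

    # A consuming library can accept any object implementing __dataframe__
    # without depending on polars directly:
    pd_df = pd.api.interchange.from_dataframe(pl_df)

    # The Arrow route: build a pyarrow Table from the same protocol object
    table = pyarrow.interchange.from_dataframe(pl_df)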

Entry Points: Enabling backends and plugins for your libraries

In this talk, we will discuss how NetworkX used entry points to enable more efficient computation backends to plug into the library.
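
A minimal sketch of the mechanism: a backend package declares an entry point in its packaging metadata, and the consuming library discovers and loads it at runtime. The group name "networkx.backends" and the discovery helper below are illustrative assumptions, not the exact NetworkX implementation.

    # A backend package would declare something like this in its pyproject.toml
    # (group name is an assumption for illustration):
    #
    #   [project.entry-points."networkx.backends"]
    #   graphblas = "graphblas_algorithms.interface:Dispatcher"

    from importlib.metadata import entry_points

    def discover_backends(group="networkx.backends"):
        # entry_points(group=...) requires Python 3.10+
        return {ep.name: ep.load() for ep in entry_points(group=group)}

    print(discover_backends())  # e.g. {"graphblas": <backend interface>} if installed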

Scientific Applications
HS 119 - Maintainer track