Devconf.US

Aakanksha Duggal

Aakanksha Duggal is a Senior Data Scientist in the Emerging Technologies Group at Red Hat. She is part of the Data Science team and works on developing open source software that applies AI and machine learning to solve engineering problems.


Sessions

08-15
16:00
80min
LLMs 101: Introductory Workshop
Surya Pathak, Hema Veeradhi, Aakanksha Duggal

Are you curious to learn about Large Language Models (LLMs), but unsure how and where to begin? This workshop is designed specifically with you in mind. LLMs have emerged as powerful tools in natural language processing, yet their implementation poses challenges, particularly in managing computational resources effectively.

During this workshop, we will delve into the fundamentals of LLMs and guide you in selecting the appropriate open source models for your requirements. We will discuss the concept of self-hosted LLMs and introduce containerization technologies such as Kubernetes, Docker, and Podman. Through illustrative use cases such as retrieval-augmented generation (RAG), text generation, and speech recognition, you will learn how to set up LLMs locally on your laptop and build container images for the models using Podman. We will also explore model serving and inference methods, including interacting with the model through a simple UI application. Finally, the workshop will cover model evaluation techniques and introduce metrics that can be used to measure the performance and quality of model outputs.
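
To give a sense of what "running an LLM locally" looks like in practice, here is a minimal sketch using the Hugging Face transformers library. The TinyLlama model named below is only an illustrative choice and is not necessarily one of the models used in the workshop.

    # Minimal sketch: local text generation with an open source model.
    # Requires the transformers library and a backend such as PyTorch.
    from transformers import pipeline

    # Download a small instruction-tuned model and run generation locally (CPU is fine).
    generator = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

    result = generator(
        "Explain what a large language model is in one sentence.",
        max_new_tokens=60,
    )
    print(result[0]["generated_text"])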

Attendees will gain practical knowledge and skills to effectively harness the capabilities of LLMs in real-world applications. They will understand the challenges associated with managing computational resources and learn how to overcome them. By the end of the workshop, participants will be equipped with the tools to set up and deploy LLMs, evaluate model performance, and implement them in various natural language processing tasks.
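
As a taste of the evaluation techniques mentioned above, here is a minimal sketch that scores a model output against a reference using the Hugging Face evaluate library; ROUGE is just one example metric among the many covered in the workshop.

    # Minimal sketch: comparing a model output to a reference with ROUGE.
    import evaluate

    rouge = evaluate.load("rouge")
    scores = rouge.compute(
        predictions=["The cat sat on the mat."],
        references=["A cat was sitting on the mat."],
    )
    print(scores)  # dictionary of ROUGE scores between 0 and 1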

Artificial Intelligence and Data Science
Terrace Lounge (capacity 48)
08-16
14:05
35min
Self-Hosted LLMs: A Practical Guide
Hema Veeradhi, Aakanksha Duggal

Have you ever considered deploying your own large language model (LLM), but held back because the process seemed too complex? The complexities of deploying and managing LLMs often pose significant challenges. This talk provides a comprehensive introductory guide, enabling you to begin your LLM journey by hosting your own models on your laptop using open source tools and frameworks.

We will discuss the process of selecting appropriate open source LLM models from HuggingFace, containerizing the models with Podman, and creating model serving and inference pipelines. For newcomers and developers delving into LLMs, self-hosted setups offer advantages such as increased flexibility in model training, enhanced data privacy, and reduced operational costs. These benefits make self-hosting an appealing option for those seeking a user-friendly approach to exploring AI infrastructure.
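
To illustrate the serving step, here is a minimal sketch of a self-hosted inference endpoint that wraps a transformers pipeline in FastAPI. The model name, route, and file layout are illustrative assumptions, not the exact stack presented in the talk.

    # Minimal sketch: a local inference endpoint around an open source model.
    # Requires fastapi, uvicorn, transformers, and a backend such as PyTorch.
    from fastapi import FastAPI
    from transformers import pipeline

    app = FastAPI()
    generator = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

    @app.post("/generate")
    def generate(prompt: str) -> dict:
        # Run local inference and return the generated continuation.
        output = generator(prompt, max_new_tokens=100)[0]["generated_text"]
        return {"completion": output}

Assuming the sketch is saved as app.py, it can be started locally with "uvicorn app:app" and queried by POSTing a prompt to /generate; the same service can then be packaged into a container image with Podman for a fully self-hosted setup.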

By the end of this talk, attendees will possess the necessary skills and knowledge to navigate the exciting path of self-hosting LLMs.

Artificial Intelligence and Data Science
Conference Auditorium (capacity 260)