2025-07-23 – Main Room 3
In my work, I explore a novel Neural ODE simulation of tumor growth under immune-system control. I integrate skip connections and a physics-informed loss to capture real-world metastatic behavior under various treatments. By calibrating the model against experimental data, I uncover insights into optimizing combination therapies that could ultimately improve patient outcomes.
In this presentation, I share my journey in modeling the complex interplay between tumor growth and the immune response using Julia. I developed an enhanced neural ODE pipeline that integrates two neural networks—immune_nn for representing the immune reaction and correction_nn to capture adaptive, metastatic growth patterns. The backbone of the model is a logistic growth ODE, parameterized by per-group estimates of growth rate and carrying capacity (or their logarithms), allowing me to incorporate variations across different cell lines and treatment conditions.
function make_ode_fun(i::Int)
    # θ_immune / θ_corr are the flattened network parameters and
    # re_immune / re_corr the reconstructors from Flux.destructure;
    # ngroups and alpha are set elsewhere in the pipeline.
    len_imm = length(θ_immune)
    len_corr = length(θ_corr)
    len_nn = len_imm + len_corr
    return function (du, u, p, t)
        p_immune = p[1:len_imm]
        p_corr = p[len_imm+1:len_nn]
        local_immune = re_immune(p_immune)
        local_corr = re_corr(p_corr)
        # Exponentiate the per-group log-parameters for positivity
        r_val = exp(p[len_nn + i])
        K_val = exp(p[len_nn + ngroups + i])
        vol = u[1]  # may go negative; penalized in the loss rather than clamped
        # Logistic backbone
        logistic = r_val * vol * (1.0 - vol / K_val)
        # Evaluate the two neural networks on the current volume
        imm_val = local_immune(reshape([vol], 1, 1))[1]
        corr_val = local_corr(reshape([vol], 1, 1))[1]
        du[1] = logistic - alpha * imm_val + corr_val
    end
end

The code above defines the universal differential equation (UDE) I use to model volume growth: a logistic backbone corrected by the immune and adaptive-growth networks.
Inspired by real-world data from studies available in the tumorgrowth repository, I worked with datasets that include identifiers like CellLine (e.g., C1, C2) and Treatment (e.g., T1, T2). This helped me to tailor the simulation to reflect how different treatments can affect tumor progression. For example, I built a residual block in Flux to enable skip connections in the neural networks, ensuring that the model captures both the inherent biological dynamics and the corrective mechanisms brought by immunotherapy.
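As a sketch of the skip-connection idea, a residual block can be built with Flux's built-in SkipConnection; the layer sizes and names below are illustrative assumptions, not the exact architecture from the talk:

```julia
using Flux

# Hypothetical residual block: the output of the inner Dense layers is
# added back to the block's input via Flux's SkipConnection.
resblock(n) = SkipConnection(Chain(Dense(n, n, tanh), Dense(n, n)), +)

# Illustrative immune network with one skip connection; Flux.destructure
# yields the flat parameter vector and a reconstructor, which is how the
# networks are threaded through the ODE parameter vector p.
immune_nn = Chain(Dense(1, 8, tanh), resblock(8), Dense(8, 1))
θ_immune, re_immune = Flux.destructure(immune_nn)
```

The `+` connection keeps gradients flowing around the inner layers, which helps the small correction network train stably inside the ODE solve.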
My code leverages advanced techniques such as physics-informed loss functions—combining volume and derivative mismatches—to ensure that the simulated tumor dynamics adhere to biological principles. I also apply L2 regularization and a negative volume penalty to enforce realistic constraints. The model is trained over 20,000 epochs using AdamW and adaptive solvers like Tsit5, making full use of Julia’s high-performance computing capabilities.
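A minimal sketch of such a combined loss, assuming `pred` and `obs` are the simulated and observed volume series, `θ` the flat parameter vector, and `λd`, `λreg` illustrative weights (my actual pipeline's terms and weights differ):

```julia
# Physics-informed loss sketch: volume mismatch + derivative mismatch
# + L2 regularization + a penalty on negative volumes.
function physics_informed_loss(pred, obs, θ; λd=0.1, λreg=1e-4)
    l_vol   = sum(abs2, pred .- obs)               # fit observed volumes
    l_deriv = sum(abs2, diff(pred) .- diff(obs))   # match finite-difference slopes
    l_reg   = λreg * sum(abs2, θ)                  # L2 weight decay
    l_neg   = sum(abs2, min.(pred, 0.0))           # penalize negative volumes
    return l_vol + λd * l_deriv + l_reg + l_neg
end
```

The derivative term is what makes the loss "physics-informed": it forces the simulated growth *rate*, not just the volume, to track the data.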
I then use the model to simulate tumor growth under different scenarios: one with a full immune reaction and another where I disable the immune component to observe the impact. This dual-simulation approach reveals the delicate balance between tumor expansion and immune suppression, and highlights how strategic combination therapies might shrink tumors more effectively while reducing toxicity.
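The dual-simulation idea can be illustrated with a package-free toy model: forward-Euler integration of logistic growth minus an immune kill term, run once with the immune term on and once off. The values of r, K, α and the linear kill form α·v are assumptions for this sketch; the talk's actual simulations use the trained networks and Tsit5.

```julia
# Toy dual simulation: full immune reaction vs. immune knockout.
function simulate(; r=0.3, K=10.0, α=0.5, v0=0.1, dt=0.01, T=30.0)
    v = v0
    for _ in 1:round(Int, T / dt)
        dv = r * v * (1 - v / K) - α * v   # logistic growth minus immune suppression
        v = max(v + dt * dv, 0.0)          # clamp the toy model at zero volume
    end
    return v
end

v_immune   = simulate(α=0.5)   # full immune reaction
v_knockout = simulate(α=0.0)   # immune component disabled
```

Comparing the two endpoints quantifies how much tumor burden the immune term is suppressing, which is exactly the contrast the knockout scenario in the talk makes visible.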
Throughout the talk, I walk through several clear visualizations, such as scatter plots of observed versus predicted tumor volumes, growth-rate curves, and comparative bar charts of the estimated per-cell-line parameters (r and K), to guide the audience through the model's insights. I also discuss how sensitivity analysis of the model parameters can inform the design of personalized treatment strategies. By simulating various "what-if" scenarios, I demonstrate how my framework can serve as a virtual laboratory for optimizing cancer treatment regimens.
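One simple way to run such a sensitivity analysis is one-sided finite differences. This sketch (function names assumed, not from the talk) perturbs each parameter in turn and measures the change in any scalar model output `f`, e.g. final tumor volume:

```julia
# Finite-difference sensitivity of a scalar model output f(θ) to each
# entry of the parameter vector θ.
function sensitivity(f, θ; h=1e-4)
    base = f(θ)
    return [(f(θ .+ h .* (eachindex(θ) .== j)) - base) / h for j in eachindex(θ)]
end
```

Ranking parameters by the magnitude of these sensitivities shows which growth or immune parameters dominate a given scenario, which is what guides the personalized "what-if" exploration.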
I invite you to join me as I detail how this computational approach not only deepens our understanding of tumor-immune dynamics but also paves the way for innovative therapeutic strategies. My goal is to show that with Julia, we can transform complex biological systems into actionable insights that may one day lead to better, more personalized cancer therapies—all through the power of high-performance computing.
I'm a computer science enthusiast with a love for AI and machine learning. Currently, I'm working through a B.Tech in AI & ML at Netaji Subhash Engineering College in Kolkata and a BS in Data Science & Applications at IIT Madras. These studies have given me a solid grounding in math, data analytics, and machine learning.
I've had some opportunities along the way—like working on large language models to simplify language for kids with learning disabilities at Jadavpur University, and developing a machine learning solution at Doyen Diagnostics to predict disease burden and craft personalized health scores.
I've also dabbled in projects ranging from EEG-based schizophrenia classification to highway traffic flow analysis and accident detection with computer vision.
Now, I'm diving into Julia and looking forward to sharing ideas and collaborating with the Julia community at JuliaCon on all things computational and innovative.