Julia for Battery Model Parameter Estimation
2019-07-24, Room 349

High-fidelity battery modeling requires the estimation of numerous physical parameters in order to properly capture the physics of the electrochemical, thermodynamic, and chemical processes that underlie the system. Using Julia, the code was sped up to the point that a Markov chain Monte Carlo approach (Hamiltonian Monte Carlo), combined with a high-performance computing cluster, could be used to estimate the model parameters by sampling the vast search domain and reaching the global error minimum.


The particular battery model employed in this application predicts the voltage, temperature, state-of-charge, and degradation (i.e., lithium lost due to aging) of the cell. Due to the complex interactions among these properties, along with other dynamic, codependent cell properties, the behavior of the cell over the course of an arbitrary load cannot be accurately characterized from an initial state without simulating these interactions over time.

As a result, the model implementation steps the cell discretely through discharge and charge with a 2-second time-step, predicting the state properties forward in time. The only time-dependent input is a load profile, given either as power over time or current over time, which describes the discharge due to the load and the charging protocol. Beyond that, user inputs are only required for the initial cell state.
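
To make the structure concrete, a minimal sketch of such a fixed-step simulation loop is shown below. The `CellState` fields, the `load_current` profile, and the `step_cell` update are hypothetical placeholders, not the actual model's API.

```julia
const DT = 2.0  # time-step in seconds

# Hypothetical container for the per-step state properties.
struct CellState
    soc::Float64          # state-of-charge
    temperature::Float64  # cell temperature (K)
    voltage::Float64      # terminal voltage (V)
end

# Hypothetical load profile: current (A) as a function of time (s);
# positive for discharge, negative for the charging protocol.
load_current(t) = t < 1800 ? 2.0 : -1.0

# Placeholder single-step update (assumes a 1 Ah cell); the real model
# predicts each property as outlined in the following paragraphs.
step_cell(s::CellState, I, dt) = CellState(s.soc - I * dt / 3600, s.temperature, s.voltage)

function simulate(initial::CellState, t_end; dt = DT)
    state = initial
    history = CellState[state]
    for t in 0.0:dt:(t_end - dt)
        I = load_current(t)             # the only time-dependent input
        state = step_cell(state, I, dt) # advance the cell by one time-step
        push!(history, state)
    end
    return history
end

history = simulate(CellState(1.0, 298.0, 4.2), 3600.0)
```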

Looking at an individual step, the mole fractions of lithium in different parts of the cell are found using either the initial conditions or the prediction from a prior step. Calculating the open circuit potential for both the anode and cathode depends on these mole fractions and the current cell temperature. Following this step, the mole fractions for the next state are calculated by approximating their rate of change, which relies on the input current, and multiplying that rate by the 2-second time-step. The state-of-charge for the next step is also calculated at this point.
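
As an illustration, one forward-Euler pass over the mole fractions and state-of-charge might look like the sketch below; the open circuit potential fits, parameter values, and rate expressions are assumptions standing in for the model's actual equations.

```julia
const F = 96485.0  # Faraday constant (C/mol)

# Hypothetical open circuit potential fits (V) as functions of mole fraction
# and temperature; the real model uses empirical expressions for each electrode.
ocp_anode(x, T)   = 0.2 - 0.1 * x + 1e-4 * (T - 298.0)
ocp_cathode(y, T) = 4.2 - 0.8 * y - 5e-5 * (T - 298.0)

# Forward-Euler update of the anode/cathode mole fractions and state-of-charge.
# `I` is the applied current (A); `n_a`, `n_c` are assumed moles of host sites.
function update_mole_fractions(x_a, y_c, soc, I, T;
                               dt = 2.0, n_a = 0.05, n_c = 0.06, capacity = 3600.0)
    U_a = ocp_anode(x_a, T)        # needed later for the voltage prediction
    U_c = ocp_cathode(y_c, T)

    dx_dt = -I / (n_a * F)         # rate of change driven by the input current
    dy_dt =  I / (n_c * F)

    x_next   = x_a + dx_dt * dt    # step forward over the 2-second interval
    y_next   = y_c + dy_dt * dt
    soc_next = soc - I * dt / capacity

    return (x_next, y_next, soc_next, U_a, U_c)
end
```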

The cell voltage depends on a set of overpotentials on top of the open circuit potential already estimated. As with the mole fractions, these come from initial conditions or a previous step. To predict the overpotentials for the next state, properties from the current state are used to calculate their rate of change, which is then multiplied by the time-step. Temperature is predicted for the next state in a similar manner, as is the cell voltage.
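
Continuing the sketch, the overpotentials, temperature, and voltage could be stepped forward in the same way; the rate expressions and coefficients below are placeholders rather than the physically derived rates used in the model.

```julia
# Forward-Euler update of the overpotential, temperature, and voltage for the
# next state. All coefficients here are illustrative assumptions.
function update_overpotential_and_thermal(eta, T, I, U_a, U_c;
                                          dt = 2.0, R_film = 0.02, tau = 30.0,
                                          m_cp = 50.0, h = 0.5, T_amb = 298.0)
    # Overpotential relaxes toward an ohmic-like target at a finite rate.
    deta_dt  = (I * R_film - eta) / tau
    eta_next = eta + deta_dt * dt

    # Lumped thermal balance: ohmic heating minus convective loss.
    dT_dt  = (I^2 * R_film - h * (T - T_amb)) / m_cp
    T_next = T + dT_dt * dt

    # Terminal voltage: open circuit potential difference minus overpotential.
    V_next = (U_c - U_a) - eta_next

    return (eta_next, T_next, V_next)
end
```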

The computational challenge derives in part from the vast parameter space needed to fit the model to a real cell based on testing data. The model depends on roughly 20 parameters for a single discharge-charge cycle to predict the state over time. On top of this, keeping track of cell degradation requires an additional 5 parameters.
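
One way these fit parameters could be grouped in Julia is sketched below; the specific names and defaults are illustrative placeholders for the roughly 20 per-cycle parameters and 5 additional degradation parameters.

```julia
# Hypothetical per-cycle parameters (a handful of the ~20 shown).
Base.@kwdef struct CycleParameters
    diffusivity_anode::Float64    = 1e-14
    diffusivity_cathode::Float64  = 1e-14
    film_resistance::Float64      = 0.02
    exchange_current::Float64     = 1.0
    heat_transfer_coeff::Float64  = 0.5
    # ... remaining per-cycle parameters
end

# Hypothetical aging parameters (a few of the ~5 shown).
Base.@kwdef struct DegradationParameters
    sei_rate_constant::Float64       = 1e-7
    plating_rate_constant::Float64   = 1e-8
    resistance_growth_rate::Float64  = 1e-6
    # ... remaining degradation parameters
end
```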

Working from the state-of-charge model described above, a state-of-health model can be set up using these additional parameters and running the discharge model for hundreds of cycles, updating the input parameters at the beginning of each cycle. At each step during an individual cycle, the amount of lithium lost, either to reactions with the electrolyte or to isolation as inactive lithium metal, is added to a running total for the cycle. After a cycle completes, this total is removed from the lithium initially available to the cell. Resistance and diffusivity also change over multiple cycles, and their decay is likewise maintained as a running total within each cycle.
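
A rough sketch of this cycling loop is shown below, with hypothetical per-step loss rates standing in for the actual degradation physics.

```julia
# Run the single-cycle model repeatedly, accumulating lithium loss, resistance
# growth, and diffusivity decay within each cycle, then updating the inputs for
# the next cycle. The per-step increments are placeholders.
function run_aging(n_cycles; lithium = 1.0, resistance = 0.02, diffusivity = 1e-14)
    lithium_history = Float64[]
    for cycle in 1:n_cycles
        lithium_lost      = 0.0
        resistance_growth = 0.0
        diffusivity_loss  = 0.0

        # one discharge-charge cycle at 2-second resolution (placeholder steps)
        for k in 1:5000
            lithium_lost      += 1e-9   # electrolyte reactions + isolated lithium
            resistance_growth += 1e-10
            diffusivity_loss  += 1e-22
        end

        # apply the running totals before the next cycle starts
        lithium     -= lithium_lost
        resistance  += resistance_growth
        diffusivity -= diffusivity_loss
        push!(lithium_history, lithium)
    end
    return lithium_history
end
```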

Since local minima are pervasive in this parameter space, and error-minimizing strategies are too strongly influenced by initial guesses, a Monte Carlo implementation is necessary to properly train the model. This becomes computationally prohibitive in Matlab, where the model was first implemented, because each simulated cycle spans up to 10,000 seconds, and up to 2,000 cycles can be required to compare the aging model to the available aging data.

The search space defined by the parameters requires that the Monte Carlo sampler be able to perform several thousand iterations. Under the Matlab implementation, each Monte Carlo iteration took approximately 0.03 seconds, meaning the algorithm can perform 1 million iterations in 500 minutes, or about 8 hours. While this seems sufficient, there are 20 parameters, so on average each variable is changed only 50,000 times, which is likely not enough iterations per variable to properly sample the space. In addition, more complex Monte Carlo methods such as Hamiltonian Monte Carlo take significantly more time per iteration, further limiting the number of iterations that can be run.
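
A quick check of the arithmetic implied by these timings:

```julia
iter_time_matlab = 0.03   # seconds per Monte Carlo iteration in Matlab
n_iterations     = 1_000_000
n_parameters     = 20

total_seconds = iter_time_matlab * n_iterations        # 30,000 s
total_hours   = total_seconds / 3600                   # ≈ 8.3 hours
updates_per_parameter = n_iterations ÷ n_parameters    # 50,000 changes per variable
```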

By implementing the same code in Julia, the algorithm gained a significant speedup in addition to other benefits. Compared to the Matlab implementation, the Julia implementation completed one Monte Carlo iteration in about 0.003 seconds, a 10x speedup that allows 10x more iterations to be completed in the same time. Thus, in about 8 hours, 10 million Monte Carlo iterations could be performed. In addition, Julia enabled the code to be run in parallel on Arjuna, a high-performance computing cluster at Carnegie Mellon University. This means that in 8 hours, several of these searches can run in parallel, each performing a phase-space search using 10 million iterations. Since each search has enough iterations to properly sample the space, the minimum error found across the collection of Monte Carlo simulations can be assumed to be the global minimum of the search space. The large number of Monte Carlo iterations also allows the algorithm to use a simulated annealing schedule, which helps it avoid getting stuck in local minima.
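
A minimal sketch of this parallel strategy, using Julia's Distributed standard library with a placeholder error function standing in for the battery model:

```julia
using Distributed
addprocs(4)  # on the cluster this would be many more workers

@everywhere begin
    # Placeholder error surface over a 20-parameter space; the real objective
    # compares the simulated cell to test data.
    model_error(p) = sum(abs2, p .- 0.5)

    # Independent simulated-annealing Monte Carlo search.
    function anneal_search(n_iter; n_params = 20, T0 = 1.0)
        p = rand(n_params)
        e = model_error(p)
        p_best, e_best = copy(p), e
        for i in 1:n_iter
            T = T0 * (1 - i / n_iter) + 1e-6    # annealing schedule
            q = p .+ 0.01 .* randn(n_params)    # random-walk proposal
            eq = model_error(q)
            # accept downhill moves always, uphill moves with annealed probability
            if eq < e || rand() < exp(-(eq - e) / T)
                p, e = q, eq
                if e < e_best
                    p_best, e_best = copy(p), e
                end
            end
        end
        return e_best, p_best
    end
end

# Run independent searches in parallel and keep the lowest error found overall.
results = pmap(_ -> anneal_search(1_000_000), 1:nworkers())
e_min, p_min = results[argmin(first.(results))]
```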

Matt is a second-year Ph.D. student under Venkat Viswanathan studying the dynamics of batteries as they relate to systems. He has been involved with numerous projects, including studying how platooning (the convoying of trucks) affects the energy requirements of electric semi-trucks and creating a charger placement algorithm called INCEPTS that hinges on the coupling of battery dynamics and vehicle dynamics, as well as the locality of the simulation, including weather and traffic flow. Matt received his undergraduate degrees in Mechanical Engineering and Energy Engineering from the University of California, Berkeley, and has had numerous industry internships with companies such as SunPower.

I am a researcher in the Viswanathan group at Carnegie Mellon University.