Cell Cooling & Cell Degradation
We have all had some experience with placing a laptop or mobile phone under the air-conditioning, or even on a bed of ice, in a desperate attempt to cool the device down, because instinctively we KNOW that hot electronics are bad. This is even more true for lithium-ion battery (LIB) systems, where thermal management is critically important. LIBs operate effectively within a narrow temperature window, from around room temperature to slightly above it. Anything above or below that… well, expect reduced cycle life, degraded performance, and accelerated degradation.
While we can take matters into our own hands for small, portable devices like laptops and mobile phones, we can’t submerge an electric vehicle in a bath of coolant when it gets too hot cruising down the highway on a mid-summer afternoon. Luckily, battery systems are fitted with cooling systems that, more often than not, rely on liquid cooling. These manage LIB temperatures and prevent rapid cell degradation. From a cell engineer’s perspective, we WANT this thing operating 24/7 to hold the cell temperature constant. From a driver’s and a vehicle engineer’s perspective… we don’t want it operating at all, as it complicates pack design and drains precious cell energy.
So here comes the obvious follow-up question: what is the most effective cell cooling method?
First, the work of Shen Li et al. must be referenced: an investigation into the ‘perfect tab design’ for cylindrical cells in terms of current uniformity and temperature gradients. The gist is that increasing the number of contact points between the terminal and the electrodes shortens the current path (i.e., lowers resistance), thereby reducing Ohmic heating in the parts and the thermal gradients within the electrodes. This work underlines that, to minimize heat generation, part-level considerations and tab/terminal design are crucial.
(And also that the logic behind Tesla’s tab-less design is not ONLY to do with cost savings)
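To make the Ohmic-heating argument concrete, here is a minimal back-of-the-envelope sketch in Python. The current and resistance values, and the assumption that the effective collector resistance scales roughly as 1/(number of tabs), are my own illustrative placeholders, not figures from the Shen Li paper:

```python
# Back-of-the-envelope sketch: Ohmic heating vs. number of tab contact points.
# Assumption (mine, not from the paper): the effective current-collector
# resistance scales roughly as 1/n_tabs, because parallel contact points
# shorten the average in-foil current path. All values are illustrative.

I_CELL = 10.0       # discharge current, A (assumed)
R_ONE_TAB = 0.020   # effective foil + tab resistance with one tab, ohm (assumed)

for n_tabs in (1, 2, 4, 8):
    r_eff = R_ONE_TAB / n_tabs    # crude parallel-path approximation
    p_ohmic = I_CELL**2 * r_eff   # Joule heating, P = I^2 * R
    print(f"{n_tabs} tab(s): R_eff = {r_eff * 1000:.1f} mOhm, "
          f"Ohmic heat = {p_ohmic:.2f} W")
```

Crude as it is, the scaling makes the point: every extra contact point cuts the I²R heat generated in the current collection path, which is exactly the heat that sets up the in-cell thermal gradient.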
Second, the work of Rachel Carter et al., where the researchers revealed just how sensitive LIBs are to thermal gradients inside the cell. They found that even a small temperature difference of ~2℃ between the cathode and anode, at an ambient temperature of 35℃, is enough to trigger lithium plating and accelerated degradation. Considering that current LIBs for EV applications consist of tens of cathode-anode pairs, and that cooling plates are usually placed in direct contact with the cells, we can imagine this thermal-gradient-driven degradation affecting the whole LIB system. Because the cooling plates sit on the face of the cell in a pouch system, the effect is compounded: the outermost electrodes (usually anodes) are the first to be cooled, as shown schematically in Figure 1.
Figure 1) Schematic of the temperature gradient that forms in a LIB pouch cell.
The thermal gradient between electrodes causes slight differences in kinetics (Li-ion conductivity, electrolyte diffusivity, etc., all of which are temperature-sensitive parameters), which lead to irregularities in current distribution and differences in SOC both between electrode sheets AND across a single electrode sheet. These effects compound to accelerate the degradation of the LIB system. Therefore, a brute-force approach to LIB cooling will only exacerbate the temperature irregularities; the thermal gradient must be managed to prevent accelerated cell degradation.
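To get a feel for just how temperature-sensitive these kinetic parameters are, here is a hedged sketch using a generic Arrhenius dependence. The activation energy is an assumed, typical-order value; real values vary by chemistry and by which process (conductivity, diffusivity, charge transfer) you look at:

```python
import math

# Minimal sketch: how a small temperature gradient skews local kinetics.
# A generic Arrhenius dependence k ~ exp(-Ea / (R*T)) is assumed, with an
# illustrative activation energy; real values vary by chemistry and process.

R_GAS = 8.314   # gas constant, J/(mol*K)
EA = 50e3       # activation energy, J/mol (assumed, typical order of magnitude)

def arrhenius_ratio(t_warm_c, t_cool_c):
    """Ratio of a rate constant at t_warm_c vs t_cool_c (Celsius)."""
    t_warm, t_cool = t_warm_c + 273.15, t_cool_c + 273.15
    return math.exp(-EA / (R_GAS * t_warm)) / math.exp(-EA / (R_GAS * t_cool))

# The ~2 C cathode-anode gradient reported by Carter et al. at 35 C ambient:
print(f"36 C vs 34 C: kinetics ~{arrhenius_ratio(36, 34):.2f}x faster when warm")
# The ~10 C gradient seen in the surface-cooled case discussed below:
print(f"40 C vs 30 C: kinetics ~{arrhenius_ratio(40, 30):.2f}x faster when warm")
```

Under these assumptions, even a 2℃ split makes the warm side roughly 10-15% faster, and a 10℃ split nearly doubles the rate, so it is easy to see how current preferentially funnels through the warm regions.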
Side bar - prismatic cells do have cooling plates, but they are usually placed only perpendicular to the electrode stack. This may reduce the impact of thermal gradients, since the gradient forms bottom-to-top rather than between entire electrode sheets (but I have yet to find any papers that confirm this).
Third (and to finally get something close to an answer to the question posed above), we move on to the work by Yan Zhao et al., a model comparison between surface-cooled and tab-cooled systems. The respective systems are compared in Figure 2.
The investigation applied a discharge current of 6C for up to 550 seconds and monitored the differences in temperature, local C-rate, and SOC between the two systems. The surface-cooled system developed a large temperature gradient of ~10℃, while the tab-cooled system showed a gradient of only ~1℃. The large gradient in the surface-cooled system altered the kinetics within the cell: the cooler parts of the electrode experienced much lower local C-rates than the warmer parts, leaving the SOC of the cooler parts higher than that of the warmer parts (i.e., the discharge occurred preferentially in the warmer parts of the electrode). By contrast, the tab-cooled system, with its small temperature difference, showed a much more uniform distribution of C-rate and end-of-discharge SOC (i.e., the discharge occurred quite uniformly throughout the electrode).
Figure 2) Schematic comparison between surface-cooled (right) and tab-cooled (left) systems.
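The mechanism can be caricatured with a toy model: treat the cell as two regions in parallel, one warm and one cool, give each region an Arrhenius-style temperature-dependent resistance, split the total current in inverse proportion to resistance, and compare the resulting SOCs. Every number here (capacity, resistance, activation energy) is an assumed placeholder, and the fixed current split ignores the OCV rebalancing a real cell would see, so I stop well short of full discharge; this is emphatically NOT a reconstruction of Zhao et al.’s model:

```python
import math

# Toy model: two cell regions in parallel, one warm and one cool, sharing a
# constant total current in inverse proportion to their resistance. All
# parameter values are assumed placeholders, NOT from Zhao et al.

R_GAS, EA = 8.314, 50e3       # gas constant; assumed activation energy (J/mol)
R_REF, T_REF = 0.010, 308.15  # assumed region resistance (ohm) at 35 C (in K)

def region_resistance(t_c):
    """Arrhenius-style resistance: warmer region -> lower resistance."""
    t_k = t_c + 273.15
    return R_REF * math.exp((EA / R_GAS) * (1.0 / t_k - 1.0 / T_REF))

def discharge_socs(delta_t, t_mean=35.0, c_rate=6.0, seconds=300):
    """SOC of the (warm, cool) regions after a constant-current discharge.

    The current split is held fixed (no OCV rebalancing), so we stop at
    300 s rather than the paper's 550 s to keep the caricature sensible.
    """
    cap_ah = 2.5                            # assumed capacity per region, Ah
    i_total = 2 * cap_ah * c_rate           # 6C on the whole two-region cell
    temps = (t_mean + delta_t / 2, t_mean - delta_t / 2)
    g = [1.0 / region_resistance(t) for t in temps]   # region conductances
    return tuple(1.0 - (i_total * gj / sum(g)) * seconds / (cap_ah * 3600)
                 for gj in g)

for dt, label in ((10.0, "surface-cooled-like"), (1.0, "tab-cooled-like")):
    warm, cool = discharge_socs(dt)
    print(f"dT = {dt:>4.1f} C ({label}): SOC warm = {warm:.3f}, cool = {cool:.3f}")
```

Even this crude split reproduces the qualitative picture: with a ~10℃ gradient the warm region ends up far more deeply discharged than the cool one, while with ~1℃ the two regions stay nearly in lockstep.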
Finishing thoughts… the conundrum is that tab cooling is extremely difficult to implement: the tabs/terminals are the points of current flow, and the cooling system must contact them to transfer heat, yet must not hinder current flow or raise safety concerns. Despite this, it’s interesting to think that the simple act of turning on the cooling system in an EV may itself be a cause of cell degradation, and of the driving range fading faster than it should.
References
1) Shen Li et al., 2021, Journal of Power Sources, 492, 229594
2) Rachel Carter et al., 2021, Cell Reports Physical Science, 2, 100351
3) Yan Zhao et al., 2018, Journal of The Electrochemical Society, 165, A3169