Whenever a system does work, waste heat is generated. For example, diesel engines typically waste 55-70% of the stored chemical energy of the fuel. For an electric vehicle (EV), waste heat is also generated in the motors, in the conductive cables and, for our purposes here, in the lithium-ion (Li-ion) cells.
At Xerotech, we use predictive modelling, built on experimental tests and numerical simulations, to determine how the cells within our battery packs behave under different conditions.
In general, it is accepted that Li-ion cells will lose 5% of the power demand to waste heat at a discharge current equal to the cell capacity, i.e., 1C. This tapers down to approximately 1.5% at a discharge current one quarter of the cell capacity, or 0.25C.
To give an example: take a battery pack of approximately 40Ah capacity with a current demand of 40A (i.e., 1C) at a nominal voltage of 230V. This gives a power demand of 9.2kW, and 5% of that power gives a typical heat generation rate of 0.46kW.
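As a rough illustration of that arithmetic, the short Python sketch below estimates the waste-heat rate from the pack capacity, current demand and nominal voltage. The linear interpolation of the heat-loss fraction between 1.5% at 0.25C and 5% at 1C is an assumption made purely for illustration; your cell data should always take precedence.

```python
def heat_loss_fraction(c_rate):
    """Roughly interpolate the fraction of power lost to heat.

    Assumption: ~1.5% at 0.25C rising linearly to ~5% at 1C,
    per the rule-of-thumb figures quoted above.
    """
    c_rate = max(0.25, min(c_rate, 1.0))   # clamp to the quoted range
    return 0.015 + (0.05 - 0.015) * (c_rate - 0.25) / (1.0 - 0.25)

def heat_generation_kw(capacity_ah, current_a, nominal_voltage_v):
    """Estimate the waste-heat rate (kW) of a pack at a given current."""
    power_kw = current_a * nominal_voltage_v / 1000.0   # electrical power demand
    return heat_loss_fraction(current_a / capacity_ah) * power_kw

# Worked example from the text: 40Ah pack drawing 40A (1C) at 230V nominal
print(heat_generation_kw(40, 40, 230))   # ~0.46 kW
```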
If you don’t intend to use the battery continuously, then you don’t need to size your thermal management system (TMS) based on this value. Instead, you can allow the cells to rise in temperature during periods of high charge/discharge rates and then allow them to cool during periods of low charge/discharge rates. Indeed, allowing the cells to rise slightly in temperature can reduce their internal resistance, which directly affects the heat generated (see below).
Additionally, it should be noted that Li-ion cells like to stay within a Goldilocks Zone of 20°C–35°C: not too hot and not too cold. When operating below 20°C, the internal resistance of the cells begins to rise significantly, which leads to more of the stored energy being lost to waste heat. Crucially, Li-ion cells should never be charged below 0°C, as this can inflict long-term damage on them.
For this reason, our battery management system (BMS) does not allow charging below this temperature. Likewise, above 35°C, the cells can suffer long-term degradation effects such as accelerated aging or capacity loss, so a TMS that can remove waste heat from the battery before it reaches high temperatures is advised.
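The kind of temperature gating described above can be sketched in a few lines. The thresholds below mirror the figures quoted in the text, but the function names and structure are purely illustrative; this is not our BMS implementation.

```python
# Illustrative temperature gating, mirroring the limits quoted above.
# Not Xerotech's BMS code: names and structure are hypothetical.
CHARGE_CUTOFF_C = 0.0      # never charge below 0°C
IDEAL_MIN_C, IDEAL_MAX_C = 20.0, 35.0

def charging_allowed(cell_temp_c):
    """Block charging below 0°C to avoid long-term cell damage."""
    return cell_temp_c >= CHARGE_CUTOFF_C

def needs_heating(cell_temp_c):
    """Below 20°C internal resistance rises, so heating helps efficiency."""
    return cell_temp_c < IDEAL_MIN_C

def needs_cooling(cell_temp_c):
    """Above 35°C long-term degradation accelerates, so remove heat."""
    return cell_temp_c > IDEAL_MAX_C
```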
If there’s a possibility that the battery pack may reach temperatures below 20°C due to ambient conditions, then it is worth including a TMS that can operate as a heat pump as well as a cooling system. This is done using a separate refrigeration loop, in a similar fashion to an air conditioning unit. With such a system, our battery packs can comfortably operate in ambient temperatures from -40°C to +60°C.
Thus, when sizing your TMS, you should consider the ambient conditions that the pack will be operating in and the expected use profile or typical charge/discharge cycle.
To go into further detail, there are four key mechanisms that cause heat generation within a Li-ion cell: Ohmic heating, reversible (entropic) heating, electrochemical reactions, and side reactions. The dominant one is Ohmic heating, a power loss quantified by the formula Ploss = I²R, where Ploss is the power lost to heat, I is the current through the cell and R is the internal resistance of the cell. This resistance is a combination of the Direct Current Internal Resistance (DCIR), which acts immediately upon the current being applied, and, over longer discharges, a gradual increase in resistance caused by electrochemical processes within the cell.
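To make the Ohmic term concrete, the sketch below evaluates Ploss = I²R for a single cell. The DCIR values at 25°C and 0°C are placeholders chosen only to show the trend of resistance rising as the cell gets colder; they are not measured data for any of our cells.

```python
def ohmic_loss_w(current_a, dcir_ohm):
    """Ohmic heat generation of one cell: Ploss = I^2 * R."""
    return current_a ** 2 * dcir_ohm

# Hypothetical DCIR values for illustration only (not measured data):
dcir_25c = 0.020   # 20 mOhm at 25°C
dcir_0c = 0.045    # 45 mOhm at 0°C -- resistance rises as the cell cools

for label, dcir in [("25°C", dcir_25c), ("0°C", dcir_0c)]:
    print(label, ohmic_loss_w(40.0, dcir), "W per cell at 40A")
```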
At Xerotech, we conduct in-house testing on all the cell types we offer and generate DCIR curves for each one at different temperatures (DCIR varies with the temperature of the cell). As you can see in the graph below, the internal resistance increases as the temperature decreases, and it varies throughout the open circuit voltage range of the cell.

Variation of DCIR for our energy cell during charging at different temperatures across the open circuit voltage (OCV) range of the cell.
Separately, we use Computational Fluid Dynamics (CFD) software to predict the heat transfer coefficient at the cell-cooling duct interface, which is in turn validated against experimental results.
All of this is then fed into a 1-D model that incorporates both the electrical and thermal aspects and predicts the heat rejection required.
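A heavily simplified, lumped version of such a model is sketched below: the cell is treated as a single thermal mass heated by I²R losses and cooled through a convective film to the coolant. The parameter values (mass, heat capacity, heat transfer coefficient, wetted area) are placeholders, not the values used in our actual 1-D model.

```python
# Minimal lumped electro-thermal sketch (illustrative parameters only).
# dT/dt = (Q_gen - h*A*(T - T_coolant)) / (m * cp)
def simulate_cell_temp(current_a, dcir_ohm, t_coolant_c,
                       m_kg=1.0, cp_j_per_kgk=1000.0,
                       h_w_per_m2k=150.0, area_m2=0.02,
                       t0_c=25.0, dt_s=1.0, duration_s=1800):
    temps = [t0_c]
    for _ in range(int(duration_s / dt_s)):
        q_gen = current_a ** 2 * dcir_ohm                            # Ohmic heat, W
        q_out = h_w_per_m2k * area_m2 * (temps[-1] - t_coolant_c)    # heat to coolant, W
        temps.append(temps[-1] + (q_gen - q_out) * dt_s / (m_kg * cp_j_per_kgk))
    return temps

# Example: 40A through a hypothetical 20 mOhm cell with 25°C coolant
history = simulate_cell_temp(40.0, 0.020, 25.0)
print(f"cell temperature after 30 min: {history[-1]:.1f} °C")
```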
As mentioned before, we generally recommend that the cells be maintained within the ideal temperature range of 20°C–35°C and that the temperature difference between the hottest and coldest cell be kept within 5°C.
The 5°C temperature difference is a limit we encourage because it helps the cells age at similar rates over time.
It should be noted that as the cells age, their internal resistance will rise, so it’s always prudent to include this in your sizing.
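One hedged way of folding aging into the sizing is to scale the Ohmic loss by an assumed end-of-life resistance growth factor; the 1.3× figure below is purely illustrative, and the real factor should come from aging data for the specific cell.

```python
# Illustrative only: scale the beginning-of-life (BOL) heat load by an
# assumed end-of-life (EOL) DCIR growth factor when sizing the TMS.
EOL_DCIR_GROWTH = 1.3   # hypothetical 30% rise in internal resistance over life

def eol_heat_load_w(current_a, dcir_bol_ohm, growth=EOL_DCIR_GROWTH):
    """Per-cell Ohmic heat load at end of life, for sizing margin."""
    return current_a ** 2 * dcir_bol_ohm * growth

print(eol_heat_load_w(40.0, 0.020))   # ~41.6 W per cell vs ~32 W at BOL
```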
Additionally, you need to consider the type of coolant used. We generally specify a 50/50 water/glycol mixture; however, depending on the operating conditions you expect, you may need to use a different coolant. You will need a pressure drop curve from Xerotech to size the pump that circulates the coolant. All our packs use cooling ducts that run in parallel through the pack to ensure an optimum temperature difference between the cells.
The predicted pressure drop through our packs, calculated with a 50/50 water/glycol mixture, is given below as an example; if you need values for specific conditions, you can contact us directly.

Predicted pressure drop of all Xerotech packs. Note: this can vary with temperature and type of coolant.
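The sketch below shows one way a pressure drop curve like this can be used for pump sizing: fit ΔP ≈ k·Q² to a few points read off the curve, then find where the pack curve crosses the pump's head curve. All numbers here are made up for illustration; use the actual curve for your pack and the datasheet of your pump.

```python
# Illustrative pump operating point calculation (all numbers hypothetical).
# Pack pressure drop is approximated as dP = k * Q^2 (Q in L/min, dP in kPa).
flow_lpm = [5.0, 10.0, 15.0]   # points "read off" a pressure drop curve
dp_kpa = [4.0, 16.0, 36.0]     # made-up values following dP ~ Q^2

# Least-squares fit of k for dP = k * Q^2
k = sum(q * q * p for q, p in zip(flow_lpm, dp_kpa)) / sum(q ** 4 for q in flow_lpm)

def pump_head_kpa(q_lpm):
    """Hypothetical pump curve: head falls off linearly with flow."""
    return 60.0 - 3.0 * q_lpm

# Find the crossing point of the pack curve and the pump curve by bisection
lo, hi = 0.0, 20.0
for _ in range(60):
    mid = (lo + hi) / 2
    if k * mid ** 2 < pump_head_kpa(mid):
        lo = mid
    else:
        hi = mid
print(f"operating point: ~{lo:.1f} L/min at ~{k * lo ** 2:.1f} kPa")
```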