03 Nov 2014

By now you are familiar with the limitations of a rechargeable lithium-ion battery. Three parameters that I have covered so far describe the general performance of a lithium-ion battery: energy density (which dictates capacity), charge rate (which dictates charge time), and cycle life (which dictates warranty).

Without the use of more sophisticated battery management algorithms, one can achieve excellent performance on two of these three parameter axes, but not all three. So battery vendors and device manufacturers often resort to compromises that are rapidly becoming quite limiting. Let’s review some of these design compromises:

1) Sacrifice cycle life:

This has been the most common of these compromises, primarily because carriers and operators did not historically specify or enforce an actual figure for cycle life. It was commonly understood that 500 cycles was sufficient, but Verizon Wireless moved the goalposts to 800 cycles, which made life far more challenging for the battery vendors.

So for this newest crop of smartphones with over 3,000 mAh of capacity (often in a thin profile that drives the energy density to 600 Wh/l or above), device manufacturers are trying to get by with 500 cycles. But what if you need to ship to Verizon and meet their 800-cycle specification? What other compromise do you implement? As a consumer, look for the fine print on your product warranty. If it says that the battery warranty is limited to one (1) year, then you have every reason to suspect that your battery will not get much past 500 cycles.

2) Sacrifice charge time:

That’s right. If you can’t make 800 cycles, then drop the charge rate, i.e., increase the charge time. Charging more slowly is an old trick in the book for increasing cycle life, but slower charge times irritate end users, and this trick is beginning to lose steam. How slow will the charge time be? How about pushing 3 hours or more?
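To see why a lower charge rate stretches charge times, here is a rough back-of-the-envelope sketch. It assumes ideal constant-current charging and ignores the slower constant-voltage taper near full, which adds even more time in practice:

```python
def charge_time_hours(c_rate):
    """Ideal constant-current time for a full charge.

    At 1C the charging current numerically equals the cell's capacity,
    so a full charge takes roughly one hour; at 0.3C it takes roughly
    1/0.3, or a bit over 3 hours. Real charging takes longer because
    of the constant-voltage taper at the end.
    """
    return 1.0 / c_rate

for c in (1.0, 0.7, 0.3):
    print(f"{c:.1f}C -> about {charge_time_hours(c):.1f} hours")
```

At 0.3C the estimate already exceeds 3 hours, which is where the "3 hours or more" figure above comes from.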

3) Sacrifice energy density:

Precisely! Drop the energy density and lose capacity, and that buys the device maker and battery vendor some extra room. How low should one drop it, and what is the penalty? To increase the charge rate from 0.3–0.4C to about 0.7C, the energy density drops to about 550 Wh/l, and to increase the charge rate to 1C or above, the energy density has to drop to 500 Wh/l or even lower… that means you can fit less than 2,500 mAh in the same volume where you originally were able to fit 3,000 mAh. That’s not progress! For a discussion on C-rates, read this previous post.
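The capacity penalty above is simple arithmetic: at a fixed cell volume (and nominal voltage), capacity scales linearly with volumetric energy density. A quick sketch using the 3,000 mAh and 600 Wh/l figures from the text:

```python
def capacity_at_density(reference_mah, reference_wh_per_l, new_wh_per_l):
    """Capacity that fits in the same volume at a different energy density.

    Holding cell volume and nominal voltage fixed, capacity scales
    linearly with volumetric energy density (Wh/l).
    """
    return reference_mah * new_wh_per_l / reference_wh_per_l

# A 3,000 mAh cell at 600 Wh/l, with density dropped to allow faster charging:
print(capacity_at_density(3000, 600, 550))  # ~2750 mAh at about 0.7C
print(capacity_at_density(3000, 600, 500))  # ~2500 mAh at 1C, and less below 500 Wh/l
```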

4) Sacrifice depth of charge:

Depth of what? Yes, this is the first time I introduce this new term and parameter. It is not typically used for mobile devices, but it is very prevalent in electric vehicles. When one talks about a certain capacity, say 3,000 mAh, and a corresponding energy density, it is implicitly assumed that the battery will be fully charged, i.e., the voltage of a single battery cell will rise to its highest safe limit, most commonly 4.35 V (though in some cases only 4.2 V). This is the true definition of 100% full. But what if we fill the battery to only, say, 80%, corresponding to a voltage about 0.1 to 0.15 V below the maximum allowed? In other words, the usable energy is actually only 80% of the nominal figure. This is called the depth of discharge. To use an analogy: just because your bucket can hold 10 gallons of water does not mean you actually have 10 gallons. If your bucket is only 80% full (your depth of discharge), then you only have 8 gallons to use. I will talk more in future posts about the impact of depth of discharge on cycle life and battery performance.
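Returning to the bucket analogy, the usable amount under a limited depth of discharge is just the nominal capacity scaled by that fraction. A minimal sketch:

```python
def usable_capacity(nominal, depth):
    """Usable amount when the cell (or bucket) is only filled to `depth`.

    `depth` is a fraction between 0 and 1, e.g. 0.8 for an 80% fill.
    """
    if not 0.0 <= depth <= 1.0:
        raise ValueError("depth must be a fraction between 0 and 1")
    return nominal * depth

print(usable_capacity(10, 0.8))    # 8.0 gallons from the 10-gallon bucket
print(usable_capacity(3000, 0.8))  # 2400.0 mAh from a nominal 3,000 mAh cell
```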
