It’s all over the media: Moore’s Law just turned 50! What is Moore’s Law? It’s more an observation than a law, but it has stuck around for 50 years now — so long that we treat it as a law, like gravity.
On 19 April 1965, Gordon Moore, then head of R&D at Fairchild Semiconductor and later the CEO of Intel, made an observation that turned into a prophecy. He predicted at the time* that the number of components and transistors on an integrated circuit (IC, or chip) would double approximately every 18 months while the cost of the chip held constant. In layman’s terms, the industry would be able to double the complexity, and hence the computational power, of these chips roughly every 1.5 years without increasing the cost of the function. And for 50 years, this prediction held remarkably well and has been celebrated across the tech and semiconductor industry.
Its implications are just spectacular. Next time you hold a smartphone in your hand, pause for a moment and consider that it is thousands of times more capable than the Apollo Guidance Computer that landed Neil Armstrong and Buzz Aldrin on the moon. Moore’s Law has become, in some ways, the Bible of the tech industry, with an implicit expectation that all new technologies ought to follow the same trend. But is that true? And, specifically, is it true, and will it remain true, of energy storage and battery systems?
Left: processor performance, measured in MIPS, on a logarithmic scale. Right: battery energy density in Wh/l.
The answer in a nutshell is a big fat NO! Whereas history has shown that semiconductors follow Moore’s Law, that same history shows that batteries progress more like gastropods — hence, Snail’s Law. The two figures above illustrate the difference. From 1995 to 2015, the computational power of Intel processors increased roughly 300X, effectively doubling every two years or so. In contrast, over those same 20 years, the energy density of lithium-ion batteries increased only about 4X, or roughly 7% annually. No one disputes this: most consumers complain about their batteries, and few, if any, complain about the processor or the electronics in their devices. But why is that? Both involve materials and manufacturing, yet the differences are stark.
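The arithmetic behind these two growth rates is easy to check. A short sketch (the 300X and 4X multiples are the figures quoted above; everything else is just compounding math):

```python
import math

def cagr(multiple: float, years: float) -> float:
    """Compound annual growth rate implied by an overall multiple."""
    return multiple ** (1 / years) - 1

def doubling_time(multiple: float, years: float) -> float:
    """Years needed to double at the implied growth rate."""
    return years * math.log(2) / math.log(multiple)

# Processors: ~300X over 20 years (1995-2015)
print(f"Processors: {cagr(300, 20):.1%}/yr, doubles every {doubling_time(300, 20):.1f} yr")
# → Processors: 33.0%/yr, doubles every 2.4 yr

# Batteries: ~4X over the same 20 years
print(f"Batteries:  {cagr(4, 20):.1%}/yr, doubles every {doubling_time(4, 20):.1f} yr")
# → Batteries:  7.2%/yr, doubles every 10.0 yr
```

In other words, at these historical rates, batteries take roughly a decade to do what processors did every couple of years.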
It boils down to a balance between the laws of physics and economics. The laws of physics dictate how much technical improvement is possible for a given scientific or engineering problem. In the case of semiconductors, these were the laws governing the shrinking of transistor dimensions on a silicon chip. Back in 1965, transistors did not operate anywhere near the fundamental limits of materials or equipment. So physics was not the limiting factor in this balance; economics was. In other words, the R&D and manufacturing costs of shrinking transistors had to grow more slowly than the technical performance those transistors delivered. Under these circumstances, the added costs were amortized over rapidly increasing performance, and hence the benefit of Moore’s Law: more performance at the same cost point. Said differently: shrink the dimensions more, get more benefit, and the equation becomes a seemingly virtuous circle… that is, until it starts to hit the limits of physics, at which point the balance tips, something the industry may soon be facing.
For batteries, that balance between technical limits and economics was never really in place. First, the cost of R&D and manufacturing was not offset by increasing performance, in particular energy density. In other words, every increase in energy density initially showed up as an increase in unit cost, so there never was an equivalent to Moore’s Law’s cost constancy. In fact, examining the economics of lithium-ion batteries over the past 20 years, we find that their cost declines as a function of cumulative production volume, not annual production volume. This is a much slower cost curve, and it is partly why raising R&D investment for battery research does not make much sense to battery manufacturers. Further supporting evidence: battery vendors live on single-digit gross margins, whereas many semiconductor companies enjoy gross margins close to 50%, i.e., much better profitability.
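The cumulative-volume cost decline described here is often modeled as an experience curve (Wright’s Law): cost falls by a fixed fraction with every doubling of cumulative production. A sketch with purely illustrative numbers — the $1000 first-unit cost and the 20% learning rate are assumptions for demonstration, not measured battery data:

```python
import math

def experience_curve_cost(cumulative_units: float,
                          first_unit_cost: float,
                          learning_rate: float) -> float:
    """Unit cost under Wright's Law: cost drops by `learning_rate`
    (e.g. 0.2 = 20%) with every doubling of cumulative production."""
    exponent = math.log2(1 - learning_rate)  # negative
    return first_unit_cost * cumulative_units ** exponent

# Illustrative only: $1000 first unit, 20% cost drop per doubling.
for cum in [1, 2, 4, 8, 16]:
    print(cum, round(experience_curve_cost(cum, 1000, 0.2)))
# → 1 1000 / 2 800 / 4 640 / 8 512 / 16 410
```

Note what this implies: each successive cost reduction requires doubling everything produced so far, which is why the curve flattens so much more slowly than an annual doubling of performance would.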
Second, lithium-ion batteries are already hitting serious material and physical limits. The material systems presently used in lithium-ion batteries appear to saturate right around 650–700 Wh/l. Going above these figures requires higher R&D and manufacturing investments in new materials, and those costs are difficult to amortize.
The result is that energy density levels off, or improves at an ever slower rate. Yes, a breakthrough from a university research program or an innovative company may change that, but history shows that such breakthroughs come not from wishful thinking but from years of effort and billions of dollars in research, both of which are growing scarce in the battery field.
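One way to picture this leveling off is a saturating curve approaching a ceiling, here the ~700 Wh/l figure mentioned above. The curve shape and all other parameters below are illustrative assumptions, not fitted data:

```python
import math

def saturating_density(year: float,
                       ceiling: float = 700.0,    # assumed ceiling, Wh/l
                       midpoint: float = 2005.0,  # illustrative midpoint year
                       rate: float = 0.1) -> float:
    """Logistic curve: energy density approaches `ceiling` asymptotically,
    so each decade's gain is smaller than the last past the midpoint."""
    return ceiling / (1 + math.exp(-rate * (year - midpoint)))

for y in (1995, 2005, 2015, 2025):
    print(y, round(saturating_density(y)), "Wh/l")
# → 1995 188 / 2005 350 / 2015 512 / 2025 617
```

Past the midpoint, each additional decade buys less density than the one before, which is exactly the flattening the text describes.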
But that may not be such a bad thing. When technologies begin to level off, cost pressures rise immensely as better manufacturing methods are introduced and as more competitors, especially in low-cost geographies, join the fray. That means costs will drop rapidly; this is perhaps the implicit corollary, and inverse, of Moore’s Law. In mathematical form, we can anecdotally write this as Snail’s Law = 1/[Moore’s Law].
In summary, I believe the battery industry is entering a new era of accelerating cost pressure, accompanied by a shift toward improving the performance of the entire battery system: the individual cells, the control electronics, the algorithms and the software. And that will bode well for companies skilled in this system-integration exercise, regardless of the application and end market.