The omnipresent but once much-derided metric levelized cost of energy is set for a much-deserved return. The reasons: the energy transition is delivering – at scale – both storage and flexible demand.
[This slightly technical post is part of my clean energy teaching.]
Key takeaways:
- Levelized cost of energy (LCOE) has played a central role in utility planning and, more recently, in public understanding of the cost of different sources of new electricity generation.
- During the initial rise of renewables, LCOE received criticism for setting up inappropriate comparisons by failing to capture the importance of intermittency, the need for dispatchable power, and the rising role of system costs.
- With the explosive growth of utility-scale storage and steadily increasing demand flexibility, LCOE deserves to return as a central way to compare new generation assets – making intermittent resources increasingly dispatchable.
My earliest memory of levelized cost of energy (LCOE) dates from 2004, when I served on my local utility’s citizen advisory committee for an update to the Integrated Electric Resource Plan. The comparison of different resources in 2004 was somewhat anticlimactic: we weren’t going to build any new hydro or nuclear; wind was still novel and growing only sporadically, and utility-scale solar didn’t quite exist yet; a few legacy biomass resources continued in play; and we were always going to get the vast bulk of our power on preferential terms from Bonneville Power Administration, per the Northwest Power Act of 1980. LCOE guided the math, but it didn’t cause controversy. And ultimately we settled on the obviously most cost-effective primary strategy for accommodating our (very slow) load growth: energy efficiency.
Fast-forward to when I started teaching clean energy finance in 2013, and LCOE was a ubiquitous lightning rod. Excellent summary metric? Badly misleading? LCOE was everywhere, while at the same time ink was regularly spilled to skewer its weaknesses.
Yet from where we sit today, I see a shift in the winds coming.
A short history will set the stage.
LCOE’s emergence amidst ‘peak boring’
From the late 1970s until fairly recently, the electric power system was mostly uninteresting: slow load growth and little change in the composition of capacity additions. Sure, the oil price shocks in the 1970s purged most of the petroleum from the mix, and the last new nuclear power plants (all started before Three Mile Island) came on line throughout the 1980s and early 1990s. But consider the net additions during the period of peak boring.

Still, there was a need to make decisions, and it was clear – from fuel price fluctuations and the rise of natural gas – that we needed a life-cycle cost metric. Standard net present value (NPV) calculations didn’t set up easy comparisons, and anyway the utility world wanted its own way to do things. I haven’t been able to find the origin of the term levelized, but I suspect it came from engineers who didn’t feel beholden to business nomenclature. Regardless, the essence of LCOE is two-fold: (a) it is a time value of money calculation and (b) it generates a price per unit of energy, so it is recognizable to people dealing with power markets (or for that matter, anyone paying attention to the utility bill). In short:
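The original figure with the formula is not reproduced here; a standard way to write it, consistent with the discussion that follows, is:

```latex
\mathrm{LCOE} =
  \frac{\displaystyle \sum_{t=1}^{n} \frac{I_t + M_t + F_t}{(1+r)^{t}}}
       {\displaystyle \sum_{t=1}^{n} \frac{E_t}{(1+r)^{t}}}
```

Here $I_t$, $M_t$, and $F_t$ are investment, operations-and-maintenance, and fuel expenditures in year $t$; $E_t$ is the energy generated in year $t$; $r$ is the discount rate; and $n$ is the expected lifetime of the system. Note that both the numerator and the denominator are discounted sums.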

My only quibble with this Wikipedia summary is that we should see the word “discounted” in both the numerator and denominator, just as present-value discounting appears in the math itself: the ratio is the sum of discounted costs divided by the sum of discounted generation. But the essence is there: all of the costs and benefits, discounted accordingly, spitting out a cost per megawatt-hour.
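To make the definition concrete, here is a minimal sketch in Python; the function is a bare-bones illustration, and all of the plant numbers are hypothetical:

```python
# Minimal LCOE sketch: discounted costs divided by discounted generation.
# All plant numbers below are hypothetical, purely for illustration.

def lcoe(costs, energy_mwh, rate):
    """Sum of discounted annual costs over sum of discounted annual MWh."""
    disc_costs = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
    disc_energy = sum(e / (1 + rate) ** t for t, e in enumerate(energy_mwh))
    return disc_costs / disc_energy

# $100M built in year 0, $2M/yr O&M, 250,000 MWh/yr over a 25-year life, 7% rate:
costs = [100e6] + [2e6] * 25
energy = [0] + [250_000] * 25
print(lcoe(costs, energy, 0.07))  # roughly $42 per MWh for these inputs
```

The discount rate does a lot of work here: the same plant at a higher cost of capital produces a meaningfully higher LCOE, which is one reason capital-intensive resources are so sensitive to financing terms.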
[For the true nerds, here’s a slightly longer history of LCOE, including a timeline; relatedly, consider EIA’s 2025 assessment of LCOE, LCOS, and LACE.]
Since changes to the power grid were uninteresting and therefore off the radar for most people at this time, LCOE attracted little attention outside of utilities’ integrated resources plans and deliberations by public utility commissions.
Then the winds changed. Literally.
Rise of renewables and consternation
Wind then threw a wrench in the works. In just a few years, utility-scale wind grew from almost nothing into an important new source of generation – the first intermittent renewable energy source to reach meaningful scale.

At low levels of grid penetration, the aforementioned intermittency didn’t matter much. Sure, the wind would blow, and then stop blowing, and then blow again. At just a few percent of a regional grid, it was easy for dispatchable resources to accommodate this variability.
A major driver for this shift: wind power’s low cost. “Look!” the advocates said, “such a low LCOE!” Indeed, it was true: between 2009 and 2014, the unsubsidized levelized cost of new wind generation fell by more than half, plunging well below the price of new gas generation. On top of this rapid technological improvement, Congress was still (most years) offering a production tax credit for wind, making it the best deal on much of the U.S. electric grid.
Friends of fossil fuels and enemies of change grew increasingly livid. After mocking these “wind mills” for a few years, they came to see the momentum of this new industry as an annoyance and perhaps even a bit of a threat. They complained that LCOE masked wind’s intermittency.
“LCOE for wind vs. LCOE for gas? Apples and oranges!” they cried. In fairness, they had, and have, a point. Yes, utility-scale wind quickly became, and has for nearly two decades remained, the lowest-cost source of new energy. Still, you can’t tell the wind when to blow, and the grid needs power when it needs power, so LCOE tells you only the cost of a generic megawatt-hour.
The tricky part is to disentangle LCOE’s legitimate use (for life-cycle costing) from its abuse (pretending an intermittent apple is like a dispatchable orange).
Unfortunately, things were going to get worse before they got better.
Solar exacerbates the situation
Just as the shale gas revolution was gathering momentum and gas generation got its real building boom underway, solar started to take off. Its cost decrease was even more spectacular, with the cost of utility-scale solar falling by roughly 90% between 2009 and 2019. Its growth from a near-zero starting point was rapid, not to mention grid-defining in the places with the most rapid adoption.
(A quick digression: lest you worry about poor old natural gas, keep in mind that it represents the largest source of electricity generation on the U.S. grid, providing over 40% of energy. The big loser has been coal, with its share of electricity generation down from 51% in 2001 to under 15% in 2024.)
So, is LCOE a worse metric for solar than it is for wind? They’re both bad, according to those expecting apples-to-apples comparisons from LCOE. Yes, solar is more “reliable” than wind in the sense that the sun has, shall we say, a rather well-understood cadence across the day and year. At the same time, the near-perfect correlation (i.e., temporal coincidence) of the solar resource in a particular region – and solar is still somewhat concentrated in a few sunny places, notably 40% of it in California and Texas – creates a mid-day feast and a night-time famine, with a challenging ‘ramp’ down in the morning and, in particular, back up in the evening. This inconvenience has given rise to the so-called duck curve, named for the vaguely duck-shaped pattern of ‘net load’ (load minus small- and utility-scale solar).
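The duck-curve mechanics are easy to sketch with made-up hourly numbers – everything below is hypothetical, chosen only to show the shape:

```python
# Hypothetical hourly shapes (GW) to illustrate the duck curve.
# Net load = load minus solar; all numbers are made up for illustration.
load = [28, 27, 26, 26, 27, 29, 32, 34, 35, 35, 34, 34,
        33, 33, 33, 34, 36, 39, 42, 44, 43, 40, 35, 30]
solar = [0, 0, 0, 0, 0, 0, 1, 4, 8, 11, 13, 14,
         14, 13, 11, 8, 4, 1, 0, 0, 0, 0, 0, 0]
net = [l - s for l, s in zip(load, solar)]       # the duck's belly at midday
ramps = [net[h + 1] - net[h] for h in range(23)]
print(max(ramps))  # steepest hourly climb (GW) comes in the late afternoon
```

The point is that the system must be able to follow that steep evening climb, hour after hour, regardless of how cheap the midday megawatt-hours were.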

It is important to understand why this analytical and practical tension has become more acute. Intermittent utility-scale wind and solar have grown from 3.5% of U.S. generation in 2012 (97% of which was wind) to over 17% in 2024. Integrating a few percent of intermittent renewables into the grid is easy; once those intermittent resources represent a large part of total generation in a particular place and at particular times, the challenge is far greater – and solar has been disproportionately concentrated in a relative handful of regions. All of a sudden, a once-peripheral phenomenon is now a central tension of the grid. (For example, EIA has already documented the beginnings of a duck curve in New England.)
Complaints gushed in response. Lower LCOE, yes, but now the grid as a whole – and certain grid regions in particular – had to absorb wind and solar in the worst possible way: gas and coal plants had to wait around for intermittent power to come off line. Increasingly, these plants’ capacity factors were being eroded by near-zero-marginal-cost power that provided none of the grid reliability, year-round availability, or peaking capacity that the fossil sources could offer.
These complaints were, of course, entirely reasonable. Until they weren’t any longer.
Storage and flexibility rise…and LCOE returns?
As we were approaching 2020, utility-scale storage started to gather momentum. In the few years since, it has become a central part of the grid through astronomical growth.

Note that the graph above shows additions, i.e., not merely an increase but an acceleration in installed storage capacity.
California is the caricature and leader, with utility-scale or front-of-the-meter (FTM) storage “expanding nearly twentyfold from 0.6 GW in 2020 to 11.7 GW in 2024 – making up nearly half (45%) of total national utility battery capacity” and likely hitting 17 GW by the end of 2025. The context: peak demand in 2025 for California (technically for CAISO, the California Independent System Operator, whose boundary is 95% the same as the state’s) was 44.5 GW. Texas isn’t far behind, with over 15 GW of storage relative to annual peaks in the 75-85 GW range.
This large-scale storage is a distinct capacity resource; it is also increasingly combined with solar on site, with about 20% of 2024’s installed solar coupled with storage. LBNL’s 2025 Utility-Scale Solar Update further notes that 47% of solar nationally in the interconnection queue (the various regional waiting lists for planned resources seeking grid connections) is paired with storage, and 93% of the California queue has storage.
In other words, storage – as a grid-scale player – has arrived.
Yet this recent rise of battery storage has coincided with the other long-running shift that will make LCOE truly relevant: flexible demand. This realm has fuzzy boundaries – and has almost none of the punchy summary stats of storage – but consider the following trends in tandem:
- Growth of distributed energy resources (DERs) that have inherent flexibility (such as EVs) or are designed entirely to deliver flexibility (behind-the-meter storage).
- Virtual power plants (VPPs) and utility programs that are, or contain elements of, VPPs, as well as entirely new business models (such as Base Power) that perform similar functions.
It’s challenging to make the case quickly and clearly, but it’s easy to get a strong sense of the direction we’re headed: according to Berkeley Lab’s VPP Profiles and Inventory, there is already a large and growing ‘inventory’ of VPPs organized and managed by a wide array of entities. Again, it’s tricky to summarize, and the Inventory attempts to combine several different data sets, but consider the following:
- Over 15 million utility customers currently participate in over 180 programs with a potential capacity of 19 GW.
- Some ‘programs’ are old-style demand response, while others are new-fangled tech-enabled flexibility with additional nimbleness and elements of automation.
- The utility types include IOUs, municipal utilities, rural cooperatives, corporations, and every possible combination of those.
- Many are clearly just a start in one area – only EV charging, smart thermostats, PV+battery, even a few grid-interactive buildings – with clear ambition to expand.
On that final point, it’s worth acknowledging that the lion’s share of activity (both capacity and participants) is with a relative handful of VPPs, but there are many small VPPs that are either (a) in big places with the opportunity for growth or (b) in small places, thereby demonstrating that these can work at 5 or 10 MW. The vast majority of utility programs were enacted in just the last five years.
In short, while storage has fully arrived, flexibility is still arriving – but steadily, and all over the place.
So…LCOE returns?
Let’s return to the complaints against LCOE as renewables were on the rise: the metric focused too much on the cost of any megawatt-hour, without capturing how megawatt-hours are worth different amounts at different times of day, in different seasons, and under extreme circumstances. The complaint is based on essential truths: demand is not constant, and energy can be more or less scarce at different moments. Intermittency was never particularly well aligned with those needs, so the low LCOE of intermittent renewables was misleading.
That’s now changing due to these fundamental shifts in the energy system. Need to move megawatt-hours around during the day? Storage can do that, at scale, charging with solar or wind and discharging back into the grid at peak times or when the sun goes down and the wind stops blowing. Need to reshape demand? We’re building the systems – the utility programs, the embedded tech in devices, the right pricing – to make that happen too, charging EVs and running dishwashers and heating water when power is abundant and cheap.
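The shifting that storage performs can be captured in a toy dispatch sketch – the function and all numbers are hypothetical, and a real dispatch model would also account for round-trip efficiency, prices, and state of charge:

```python
# Toy dispatch sketch (hypothetical function and numbers): a battery charges at
# full power in the lowest net-load hours and discharges in the highest ones.

def shift(net_load, power, storage_hours):
    order = sorted(range(len(net_load)), key=lambda h: net_load[h])
    out = list(net_load)
    for h in order[:storage_hours]:
        out[h] += power    # charging looks like extra load in the trough
    for h in order[-storage_hours:]:
        out[h] -= power    # discharging serves load at the peak
    return out

# Midday solar glut followed by an evening peak (GW, made up):
net = [22, 20, 19, 20, 26, 32, 38, 44, 43, 35]
print(shift(net, power=2, storage_hours=2))  # the 44 GW peak falls to 42
```

Even this crude version shows the economics: the battery raises the trough and shaves the peak without adding a single megawatt-hour of generation, which is exactly what makes cheap intermittent energy usable when it is needed.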

(A quick aside: there might even be other benefits, such as increased reliability due to solar and storage. Texas has found that, as solar and storage have come on line, the state has a far lower risk of summer blackouts.)
Of course, there are system-wide issues to figure out; for example, as we coordinate more and more distributed resources, we have to worry a lot more about transmission and distribution capacity. Until recently, that has meant building high-voltage long-distance transmission, but now we need to focus in a hyper-local way, finding the congestion and bottlenecks at the level of the county, the city, even the substation.
Still, overall, storage and flexibility now allow us to chase the lowest-cost resources, and that means (largely) the ones with the lowest LCOE. As we see in the 2025 data, we seem to be doing just that: new generation continues to be overwhelmingly solar and wind, with storage still rising quickly.

This is all exciting, but let’s keep our cool: the energy transition is still just getting started. Our electric system, encompassing the transmission grid and all of our generation and storage assets, is a massive piece of infrastructure, and we will need at least a couple of decades of sustained investment. Furthermore, the task of integrating renewables, storage, and flexibility will surely pose challenges demanding the same continued effort and long attention span. And lest we forget, there are still political headwinds (indeed, some folks are forcing uneconomic coal plants to run past their planned retirement dates in Indiana, Colorado, Washington, and Michigan – but that’s a story for another time).
Still, I hope you will savor the excitement. In particular, I hope you will carry the message that those low-LCOE resources really are turning out to be a good deal after all.