According to data from the Energy Information Administration (EIA), more than 20 gigawatts (GW) of battery capacity have been added to the US electric grid in the last four years. That is roughly equivalent to the output of 20 nuclear reactors, and the expansion is crucial for averting power disruptions, especially in states that rely heavily on intermittent renewable sources such as wind and solar.
Storage has two relevant metrics: how fast it can charge/discharge, in GW, and how much energy it holds, in GWh. Grid batteries typically have these two numbers equal, while other storage technologies can usually discharge a large number of GWh only at a slow rate. The discharge rate is often limited by the available line capacity as well.
They’re only equal if the battery runs at a 1C discharge rate. LFP cells, which are stable and good for safety, can support higher discharge rates of 5C up to 25C, which would mean the energy capacity is much smaller for the same power rating. To compare apples to apples, it’d be much better if they gave both the GW and GWh numbers.
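To make the apples-to-apples point concrete, here is a minimal sketch of the arithmetic in Python (the installation sizes are hypothetical; the only relationships illustrated are power = energy × C-rate and full-discharge duration = 1 / C-rate):

```python
def power_gw(energy_gwh: float, c_rate: float) -> float:
    """Maximum continuous power (GW) for a given energy capacity and C-rate."""
    return energy_gwh * c_rate

def duration_hours(c_rate: float) -> float:
    """Time to fully discharge at the rated C-rate, in hours."""
    return 1.0 / c_rate

# Hypothetical 1 GWh installation at different C-rates:
for c in (0.25, 1, 5, 25):
    print(f"1 GWh at {c}C -> {power_gw(1.0, c):g} GW for {duration_hours(c):g} h")
# At 1C the GW and GWh figures coincide; at 25C the same 1 GWh of energy
# can deliver 25 GW, but only for 1/25 of an hour (~2.4 minutes).
```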
Yep, the two numbers together give the real picture. What good is a GW of power if it only lasts a second, so to speak.
No, they’re equal if the battery is designed to provide 1 hour of coverage.
A 1 GWh battery will last 1 hour if its discharge rate is 1 GW.
It’s the timeframe of 1 hour that makes these two measures numerically equal.
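Put another way, the coverage time is just energy divided by power. A tiny sketch with made-up figures:

```python
def coverage_hours(energy_gwh: float, power_gw: float) -> float:
    """Hours of full-power coverage implied by a (GWh, GW) pair."""
    return energy_gwh / power_gw

print(coverage_hours(1.0, 1.0))  # 1 GWh at 1 GW -> 1.0 h: the "1-hour" case above
print(coverage_hours(4.0, 1.0))  # 4 GWh at 1 GW -> 4.0 h: a 4-hour (0.25C) system
```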
That’s what was said: for some applications 1C is good, for others 0.5C or even 0.25C is better. It depends on your use case. Frequency regulation is often 1C, while if you are primarily concerned with duration (depth of storage), you could choose another configuration. It is also partly dependent on chemistry.
As an example: a 100 kWh pack can be rated at either a 1C discharge rate (100 kW) or 0.5C (50 kW). The 50 kW (0.5C) version is usually cheaper because there is less need for power hardware (and, I believe, less risk of thermal runaway).
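A rough sketch of that trade-off, using the same arithmetic as the earlier snippets (the 100 kWh figure is just the example above; cost and thermal behaviour are not modelled, only the power/duration split):

```python
pack_kwh = 100.0  # hypothetical pack size from the example above

for c_rate in (1.0, 0.5, 0.25):
    power_kw = pack_kwh * c_rate   # inverter and cabling must be sized for this
    duration_h = 1.0 / c_rate      # time to discharge fully at rated power
    print(f"{pack_kwh:g} kWh at {c_rate:g}C -> {power_kw:g} kW for {duration_h:g} h")
# 100 kWh at 1C   -> 100 kW for 1 h
# 100 kWh at 0.5C ->  50 kW for 2 h (smaller power electronics, hence cheaper)
```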
That’s what 1C means. If the same 1 GWh battery were designed to provide 25 GW instead, it would only last 1/25 of an hour, and that would be 25C.