The POWER Podcast


The POWER Podcast provides listeners with insight into the latest news and technology poised to affect the power industry. POWER’s Executive Editor Aaron Larson conducts interviews with leading industry experts and gets updates from industry insiders.

POWER


    • Latest episode: May 21, 2025
    • New episodes: every other week
    • Average duration: 25 minutes
    • Episodes: 193



    Latest episodes from The POWER Podcast

    191. Modular Geothermal Power: Gradient's Scalable Solution for Oil and Gas Sites

    May 21, 2025 • 22:16


    As the world transitions toward renewable energy sources, geothermal power has emerged as one of the most promising, yet underutilized, options in the clean energy portfolio. Unlike solar and wind, geothermal offers consistent baseload power generation capacity without intermittency challenges, making it an increasingly attractive component in the renewable energy mix. The geothermal sector has shown increasing potential in recent years, with technological innovations expanding its possible applications beyond traditional volcanic regions. These advances are creating opportunities to tap into moderate-temperature resources that were previously considered uneconomical, potentially unlocking gigawatts of clean, renewable power across the globe. It's within this expanding landscape that companies like Gradient Geothermal are pioneering new approaches. As a guest on The POWER Podcast, Ben Burke, CEO of Gradient Geothermal, outlined his company's innovative approach to geothermal energy extraction that could transform how we think about energy recovery from oil and gas operations.

    Modular and Mobile Geothermal Solutions

    Gradient Geothermal differentiates itself in the geothermal marketplace through its focus on modular, portable equipment designed specifically for oil field operations, geothermal operators, and potentially data centers. Unlike traditional geothermal installations that require permanent infrastructure, Gradient's equipment can be moved every six to 18 months as needed, allowing clients to adjust their thermal capacity by adding or removing units as requirements change. “The advantage of mobility and modularity is really important to oil and gas operators,” Burke said. The company's solution consists of two main components: an off-the-shelf organic Rankine cycle (ORC) unit and a primary heat exchanger loop. This system can handle various ratios of oil, gas, and water—even “dirty” water containing sand, brines, and minerals—and convert that heat into usable power. One of the most compelling aspects of Gradient's technology is its ease of installation. “Installation takes one day,” Burke explained. “It's two pipes and three wires, and it's able to sit on a gravel pad or sit on trailers.” This quick setup contrasts sharply with traditional geothermal plants that can take years to construct. The units come in three sizes: 75 kW, 150 kW, and 300 kW. The modular nature allows for flexible configurations, with units able to be connected in series or parallel to handle varying water volumes and temperatures.
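    For a back-of-the-envelope feel for the modularity Burke describes, the sketch below combines the three unit ratings mentioned in the episode (75 kW, 150 kW, and 300 kW) to meet a target capacity. The unit sizes come from the text; the greedy selection logic, names, and targets are illustrative assumptions, not Gradient Geothermal's sizing method.

```python
# Illustrative only: a simple sizing helper for the modular ORC unit ratings
# mentioned in the episode (75 kW, 150 kW, 300 kW). The selection logic is a
# hypothetical greedy sketch, not Gradient Geothermal's configuration tool.

UNIT_SIZES_KW = [300, 150, 75]  # available module ratings, largest first

def size_modular_plant(target_kw: float) -> dict[int, int]:
    """Return a count of modules per rating that meets or exceeds target_kw."""
    remaining = target_kw
    counts = {size: 0 for size in UNIT_SIZES_KW}
    for size in UNIT_SIZES_KW:
        while remaining >= size:
            counts[size] += 1
            remaining -= size
    if remaining > 0:                      # cover any leftover with one smallest unit
        counts[UNIT_SIZES_KW[-1]] += 1
    return {s: n for s, n in counts.items() if n > 0}

if __name__ == "__main__":
    for target in (450, 1000):
        config = size_modular_plant(target)
        total = sum(s * n for s, n in config.items())
        print(f"target {target} kW -> {config} = {total} kW installed")
```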

    190. What Trump's First 100 Days Have Meant to the Power Industry

    Apr 30, 2025 • 38:47


    U.S. President Donald Trump was sworn into office for the second time on Jan. 20, 2025. That means April 30 marks his 100th day back in office. A lot has happened during that relatively short period of time. The Trump administration has implemented sweeping changes to U.S. energy policy, primarily focused on promoting fossil fuels while curtailing renewable energy development. The administration declared a “national energy emergency” to expedite approvals for fossil fuel infrastructure and lifted regulations on coal plants, exempting nearly 70 facilities from toxic pollutant rules. Coal was officially designated a “critical mineral,” with the Department of Justice directed to investigate regulatory bias against the industry. Additionally, the administration ended the Biden-era pause on approvals for new liquefied natural gas (LNG) export facilities, signaling strong support for natural gas expansion. On the environmental front, U.S. Environmental Protection Agency (EPA) Administrator Lee Zeldin announced 31 deregulatory actions designed in part to “unleash American energy.” The administration is also challenging the 2009 EPA finding that greenhouse gases endanger public health—a foundational element of climate regulation. President Trump announced the U.S.'s withdrawal from the Paris Climate Agreement, effective in early 2026, and terminated involvement in all climate-related international agreements, effectively eliminating previous emissions reduction commitments. Renewable energy has faced significant obstacles under the new administration. A six-month pause was imposed on offshore wind lease sales and permitting in federal waters, with specific projects targeted for cancellation. The administration issued a temporary freeze on certain Inflation Reduction Act (IRA) and Bipartisan Infrastructure Law (BIL) funds designated for clean energy projects. Policies were implemented to weaken federal clean car standards, potentially eliminate electric vehicle (EV) tax credits, and halt funding for EV charging networks—indirectly affecting power generation by potentially reducing electricity demand from EVs. Yet, the administration's tariff policy may end up impacting the power industry more than anything else it has done. “One thing in particular that I think would be hard to argue is not the most impactful, and that's the current status of tariffs and a potential trade war,” Greg Lavigne, a partner with the global law firm Sidley Austin, said as a guest on The POWER Podcast. In April, President Trump declared a national emergency to address trade deficits, imposing a 10% tariff on all countries and higher tariffs on nations with large trade deficits with the U.S. These tariffs particularly affect solar panels and components from China, potentially increasing costs for renewable energy projects and disrupting supply chains. Meanwhile, the offshore wind energy industry has also taken a hard hit under the Trump administration. “My second-biggest impact in the first 100 days would certainly be the proclamations pausing evaluation of permitting of renewable projects, but particularly wind projects, on federal lands,” said Lavigne. “That is having real-world impacts today on the offshore wind market off the eastern seaboard of the United States.” Despite the focus on traditional energy sources, the Trump administration has expressed support for nuclear energy as a tool for energy dominance and global competitiveness against Russian and Chinese nuclear exports. 
Key appointees, including Energy Secretary Chris Wright, have signaled a favorable stance toward nuclear power development, including small modular reactors. All these actions remain subject to ongoing legal and political developments, with their full impact on the power generation industry yet to unfold.

    189. Optimizing Supply Chain Processes to Ensure a Reliable Electric Power System

    Apr 24, 2025 • 19:40


    The power industry supply chain is facing unprecedented strain as utilities race to upgrade aging infrastructure against a backdrop of lengthening lead times and increasing project complexity. This supply chain gridlock arrives precisely when utilities face mounting pressure to modernize systems. As the industry confronts this growing crisis, innovations in procurement, manufacturing, and strategic planning are essential. “Utilities can optimize their supply chain for grid modernization projects by taking a collaborative approach between the services themselves and how they can support the projects, as well as having a partner to be able to leverage their sourcing capabilities and have the relationships with the right manufacturers,” Ian Rice, senior director of Programs and Services for Grid Services at Wesco, explained as a guest on The POWER Podcast. “At the end of the day, it's how can the logistical needs be accounted for and taken care of by the partnered firm to minimize the overall delays that are going to naturally come and mitigate the risks,” he said. Headquartered in Pittsburgh, Pennsylvania, Wesco is a leading global supply chain solutions provider. Rice explained that through Wesco, utilities gain access to a one-stop solution for program services, project site services, and asset management. The company claims its tailored approach “ensures cost reduction, risk mitigation, and operational efficiencies, allowing utilities to deliver better outcomes for their customers.” “We take a really comprehensive approach to this,” said Rice. “In the utility market, we believe pricing should be very transparent.” To promote a high level of transparency, Wesco builds out special recovery models for its clients. “What this looks like is: we take a complete cradle-to-grave approach on the lifecycle of the said project or program, and typically, it could be up to nine figures—very, very large programs,” Rice explained. “It all starts with building that model and understanding the complexity. What are the inputs, what are the outputs, and what constraints are there in the short term as well as the long term? And, really, what's the goal of that overall program?” The answers to those questions are accounted for in the construction of the model. “It all starts with demand management, which closely leads to a sourcing and procurement strategy,” Rice said. “From there, we can incorporate inventory control, and set up SOPs [standard operating procedures] of how we want to deal with the contractors and all the other stakeholders within that program or project. And that really ties into what's going to be the project management approach, as well in setting up all the different processes, or even the returns and reclamation program. We're really covering everything minute to minute, day to day, the entire duration of that project, and tying that into a singular model.” But that's not all. Rice said another thing that sets Wesco apart from others in the market is when it takes this program or project approach, depending on the scale of it, the company remains agnostic when it comes to suppliers. “We're doing procurement on behalf of our customers,” he said. “So, if they have direct relationships, we can facilitate that. If they're working with other distributors, we can also manage that. The whole idea here is: what's in the best interest of the customer to provide the most value.”

    188. DOE's Loan Programs Office Offers Game-Changing Possibilities

    Apr 10, 2025 • 25:06


    As the presidential inauguration loomed on the horizon in January this year, the U.S. Department of Energy's (DOE's) Loan Programs Office (LPO) published a “year-in-review” article, highlighting accomplishments from 2024 and looking ahead to the future. It noted that the previous four years had been the most productive in the LPO's history. “Under the Biden-Harris Administration, the Office has announced 53 deals totaling approximately $107.57 billion in committed project investment––approximately $46.95 billion for 28 active conditional commitments and approximately $60.62 billion for 25 closed loans and loan guarantees,” it said. Much of the funding for these investments came through the passing of the Bipartisan Infrastructure Law (BIL) and the Inflation Reduction Act (IRA). The LPO reported that U.S. clean energy investment more than doubled from $111 billion in 2020 to $236 billion in 2023, creating more than 400,000 clean energy jobs. The private sector notably led the way, enabled by U.S. government policy and partnerships. “There were 55 deals that we got across the finish line,” Jigar Shah, director of the LPO from March 2021 to January 2025, said as a guest on The POWER Podcast, while noting there were possibly 200 more projects that were nearly supported. “They needed to do more work on their end to improve their business,” he explained. That might have meant they needed to de-risk their feedstock agreement or their off-take agreement, for example, or get better quality contractors to do the construction of their project. “It was a lot of education work,” Shah said, “but I'm really proud of that work, because I think a lot of those companies, regardless of whether they used our office or not, were better for the interactions that they had with us.”

    A Framework for Success

    When asked about doling out funds, Shah viewed the term somewhat negatively. “As somebody who's been an investor in my career, you don't dole out money, because that's how you lose money,” he explained. “What you do is you create a framework. And you tell people, ‘Hey, if you meet this framework, then we've got a loan for you, and if you don't meet this framework, then we don't have a loan for you.'” Shah noted that the vast majority of the 400 to 500 companies that the LPO worked closely with during his tenure didn't quite meet the framework. Still, most of those that did have progressed smoothly. “Everything that started construction is still under construction, and so, they're all going to be completed,” said Shah. “I think all in all, the thesis worked. Certainly, there are many people who had a hard time raising equity or had a hard time getting to the finish line and final investment decision, but for those folks who got to final investment decision and started construction, I think they're doing very well.”

    Notable Projects

    When asked which projects he was most excited about, Shah said, “All of them are equally exciting to me. I mean, that's the beauty of the work I do.” He did, however, go on to mention several that stood out to him. Specifically, he pointed to the Wabash, Montana Renewables, EVgo, and Holtec Palisades projects, which were all supported under the LPO's Title 17 Clean Energy Financing Program, as particularly noteworthy. Perhaps the most important of the projects Shah mentioned from a power industry perspective was the Holtec Palisades endeavor.
Valued at $1.52 billion, the loan guarantee will allow upgrading and repowering of the Palisades nuclear plant in Covert, Michigan, a first in U.S. history, which has spurred others to bring retired nuclear plants back online. “[It's] super exciting to see our first nuclear plant being restarted, and as a result, the Constellation folks have decided to restart a nuclear reactor in Pennsylvania, and NextEra has decided to restart a nuclear reactor in Iowa. So, it's great to have that catalytic impact,” said Shah.

    187. TVA's Clinch River Nuclear Power Project: Where Things Stand Today

    Apr 2, 2025 • 23:09


    The Tennessee Valley Authority (TVA) has for many years been evaluating emerging nuclear technologies, including small modular reactors, as part of technology innovation efforts aimed at developing the energy system of the future. TVA—the largest public power provider in the U.S., serving more than 10 million people in parts of seven states—currently operates seven reactors at three nuclear power plants: Browns Ferry, Sequoyah, and Watts Bar. Meanwhile, it's also been investing in the exploration of new nuclear technology by pursuing small modular reactors (SMRs) at the Clinch River Nuclear (CRN) site in Tennessee. “TVA does have a very diverse energy portfolio, including the third-largest nuclear fleet [in the U.S.],” Greg Boerschig, TVA's vice president for the Clinch River project, said as a guest on The POWER Podcast. “Our nuclear power plants provide about 40% of our electricity generated at TVA. So, this Clinch River project and our new nuclear program is building on a long history of excellence in nuclear at the Tennessee Valley.” TVA completed an extensive site selection process before choosing the CRN site as the preferred location for its first SMR. The CRN site was originally the site of the Clinch River Breeder Reactor project in the early 1980s. Extensive grading and excavation disturbed approximately 240 acres on the project site before the project was terminated. Upon termination of the project, the site was redressed and returned to an environmentally acceptable condition. The CRN property is approximately 1,200 acres of land located on the northern bank of the Clinch River arm of the Watts Bar Reservoir in Oak Ridge, Roane County, Tennessee. The CRN site has a number of significant advantages, which include two existing power lines that cross the site, easy access off of Tennessee State Route 58, and the fact that it is a brownfield site previously disturbed and characterized as a part of the Clinch River Breeder Reactor project. The Oak Ridge area is also noted to have a skilled local workforce, including many people familiar with the complexities of nuclear work. “The community acceptance here is really just phenomenal,” said Boerschig. “The community is very educated and very well informed.” TVA began exploring advanced nuclear technologies in 2010. In 2016, it submitted an application to the Nuclear Regulatory Commission (NRC) for an Early Site Permit for one or more SMRs with a total combined generating capacity not to exceed 800 MW of electricity for the CRN site. In December 2019, TVA became the first utility in the nation to successfully obtain approval for an Early Site Permit from the NRC to potentially construct and operate SMRs at the site. While the decision to potentially build SMRs is an ongoing discussion as part of the asset strategy for TVA's future generation portfolio, significant investments have been made in the Clinch River project with the goal of moving it forward. OPG has a BWRX-300 project well underway at its Darlington New Nuclear Project site in Clarington, Ontario, with construction expected to be complete by the end of 2028. While OPG is developing its project in parallel with the design process, TVA expects to wait for more design maturity before launching its CRN project. “As far as the standard design is concerned, we're at the same pace, but overall, their project is about two years in front of ours,” said Boerschig. 
“And that's by design—they are the lead plant for this effort.” In the meantime, there are two primary items on TVA's to-do list. “Right now, the two biggest things that we have on our list are completing the standard design work, and then the construction permit application,” Boerschig said, noting the standard design is “somewhere north of 75% complete” and that TVA's plan is to submit the construction permit application “sometime around mid-year of this year.”

    186. How Virtual Power Plants Enhance Grid Operations and Resilience

    Mar 19, 2025 • 27:35


    A virtual power plant (VPP) is a network of decentralized, small- to medium-scale power generating units, flexible power consumers, and storage systems that are aggregated and operated as a single entity through sophisticated software and control systems. Unlike a traditional power plant that exists in a single physical location, a VPP is distributed across multiple locations but functions as a unified resource. VPPs are important to power grid operations because they provide grid flexibility. VPPs help balance supply and demand on the grid by coordinating many smaller assets to respond quickly to fluctuations. This becomes increasingly important as more intermittent renewable energy sources—wind and solar—are added to the grid. “A virtual power plant is essentially an aggregation of lots of different resources or assets from the grid,” Sally Jacquemin, vice president and general manager of Power & Utilities with AspenTech, said as a guest on The POWER Podcast. “As a whole, they have a bigger impact on the grid than any individual asset would have on its own. And so, you aggregate all these distributed energy resources and assets together to create a virtual power plant that can be dispatched to help balance the overall system supply to demand.” VPPs provide a way to effectively integrate and manage distributed energy resources such as rooftop solar, small wind turbines, battery storage systems, electric vehicles, and demand response programs. VPPs can reduce strain on the grid during peak demand periods by strategically reducing consumption or increasing generation from distributed sources, helping to avoid blackouts and reducing the need for expensive peaker plants. Other benefits provided by VPPs include enhancing grid resilience, enabling smaller energy resources to participate in electricity markets that would otherwise be inaccessible to them individually, and reducing infrastructure costs by making better use of existing assets and reducing peak demand. VPPs enable consumers to become “prosumers,” that is, both producers and consumers of energy, giving them more control over their energy use and potentially reducing their costs. “Virtual power plants are becoming important, not only for utilities, but also in the private sector,” Jacquemin explained. “Because of the commercial value of electricity rising and the market system rates, it's now profitable for these virtual power plants in many markets due to the value of power that they can supply during these periods of low supply.” AspenTech is a leading industrial software partner, with more than 60 locations worldwide. The company's solutions address complex environments where it is critical to optimize the asset design, operation, and maintenance lifecycle. AspenTech says its Digital Grid Management solutions “enable the resilient, sustainable, and intelligent utility of the future.” “At AspenTech Digital Grid Management, our software is in control rooms of utilities around the world,” said Jacquemin. “All utilities know they need to be investing in their digital solutions and modernizing their control room technology in order to meet the demands of the energy transition. So, utilities need to be focusing more time and more money to ensure that their software and their systems are capable of enabling that utility of the future.”
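    The aggregation-and-dispatch idea Jacquemin describes can be illustrated with a minimal sketch: pool the available headroom of many small assets and split a balancing request across them so the fleet responds as one plant. Everything here (asset names, numbers, the pro-rata rule) is an illustrative assumption, not AspenTech's Digital Grid Management software.

```python
# A minimal sketch of VPP aggregation: many small distributed resources are
# pooled and dispatched as a single "virtual" plant. Assets and numbers are
# illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class DER:
    name: str
    available_kw: float   # headroom this asset can contribute right now

def dispatch_vpp(assets: list[DER], request_kw: float) -> dict[str, float]:
    """Allocate a grid balancing request across assets, pro rata to headroom."""
    total = sum(a.available_kw for a in assets)
    if total == 0:
        return {a.name: 0.0 for a in assets}
    served = min(request_kw, total)
    return {a.name: served * a.available_kw / total for a in assets}

fleet = [DER("rooftop_solar_cluster", 400), DER("battery_site_A", 900),
         DER("demand_response_block", 700)]
print(dispatch_vpp(fleet, request_kw=1500))
# Each asset contributes a share of the 1,500 kW request in proportion to its
# headroom; together the 2,000 kW fleet behaves like one dispatchable plant.
```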

    185. AI-Powered Energy Forecasting: How Accurate Predictions Could Save Your Power Company

    Mar 12, 2025 • 29:31


    Net-demand energy forecasts are critical for competitive market participants, such as in the Electric Reliability Council of Texas (ERCOT) and similar markets, for several key reasons. For example, accurate forecasting helps predict when supply-demand imbalances will create price spikes or crashes, allowing traders and generators to optimize their bidding strategies. It's also important for asset optimization. Power generators need to know when to commit resources to the market and at what price levels. Poor forecasting can lead to missed profit opportunities or operating assets when prices don't cover costs. Fortunately, artificial intelligence (AI) is now capable of producing highly accurate forecasts from the growing amount of meter and weather data that is available. The complex and robust calculations performed by these machine-learning algorithms are well beyond what human analysts are capable of, making advanced forecasting systems essential to utilities. Plus, they are increasingly valuable to independent power producers (IPPs) and other energy traders making decisions about their positions in the wholesale markets. Sean Kelly, co-founder and CEO of Amperon, a company that provides AI-powered forecasting solutions, said using an Excel spreadsheet as a forecasting tool was fine back in 2005 when he got started in the business as a power trader, but that type of system no longer works adequately today. “Now, we're literally running at Amperon four to six models behind the scenes, with five different weather vendors that are running an ensemble each time,” Kelly said as a guest on The POWER Podcast. “So, as it gets more confusing, we've got to stay on top of that, and that's where machine learning really kicks in.” The consequences of being ill-prepared can be dire. Having early and accurate forecasts can mean the difference between a business surviving or failing. Effects from Winter Storm Uri offer a case in point. Normally, ERCOT wholesale prices fluctuate from about $20/MWh to $50/MWh. During Winter Storm Uri (Feb. 13–17, 2021), ERCOT set the wholesale electricity price at its cap of $9,000/MWh due to extreme demand and widespread generation failures caused by the storm. This price remained in effect for approximately 4.5 days (108 hours). This 180-fold price increase had devastating financial impacts across the Texas electricity market. The financial fallout was severe. Several retail electricity providers went bankrupt, most notably Griddy Energy, which passed the wholesale prices directly to customers, resulting in some receiving bills of more than $10,000 for just a few days of power. “Our clients were very appreciative of the work we had at Amperon,” Kelly recalled. “We probably had a dozen or so clients at that time, and we told them on February 2 that this was coming,” he said. With that early warning, Kelly said Amperon's clients were able to get out in front of the price swing and buy power at much lower rates. “Our forecasts go out 15 days, ERCOT's forecasts only go out seven,” Kelly explained. “So, we told everyone, ‘Alert! Alert! This is coming!' Dr. Mark Shipham, our in-house meteorologist, was screaming it from the rooftops. So, we had a lot of clients who bought $60 power per megawatt. So, think about buying 60s, and then your opportunity is 9,000. So, a lot of traders made money,” he said. “All LSEs—load serving entities—still got hit extremely bad, but they got hit a lot less bad,” Kelly continued.
“I remember one client saying: ‘I bought power at 60, then I bought it at 90, then I bought it at 130, then I bought it at 250, because you kept telling me that load was going up and that this was getting bad.' And they're like, ‘That is the best expensive power I've ever bought. I was able to keep my company as a retail energy provider.' And, so, those are just some of the ways that these forecasts are extremely helpful.”
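    The arithmetic behind those numbers is worth seeing on paper. The sketch below uses the figures quoted in the episode ($9,000/MWh cap, roughly 108 hours, a $60/MWh hedge, and normal prices around $50/MWh) with a hypothetical 100-MW retail load to show the scale of the exposure; the load figure is an assumption for illustration only.

```python
# Back-of-the-envelope arithmetic using the figures quoted above. The load
# figure is a made-up example to show the scale of the exposure.
CAP_PRICE = 9_000      # $/MWh, ERCOT price cap during Winter Storm Uri
HEDGE_PRICE = 60       # $/MWh, example forward purchase price from the episode
NORMAL_PRICE = 50      # $/MWh, upper end of typical ERCOT prices cited
HOURS = 108            # ~4.5 days at the cap
LOAD_MW = 100          # hypothetical retail provider load

spot_cost   = CAP_PRICE    * HOURS * LOAD_MW
hedged_cost = HEDGE_PRICE  * HOURS * LOAD_MW
normal_cost = NORMAL_PRICE * HOURS * LOAD_MW

print(f"Exposure at the cap:      ${spot_cost:,.0f}")    # $97,200,000
print(f"Cost if hedged at $60:    ${hedged_cost:,.0f}")  # $648,000
print(f"Same hours at ~$50 spot:  ${normal_cost:,.0f}")  # $540,000
print(f"Price ratio vs. $50:      {CAP_PRICE / NORMAL_PRICE:.0f}x")  # 180x
```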

    184. Nuclear Power Renaissance Underway in West Texas

    Mar 4, 2025 • 35:13


    When you think of innovative advancements in nuclear power technology, places like the Idaho National Laboratory and the Massachusetts Institute of Technology probably come to mind. But today, some very exciting nuclear power development work is being done in West Texas, specifically, at Abilene Christian University (ACU). That's where Natura Resources is working to construct a molten salt–cooled, liquid-fueled reactor (MSR). “We are in the process of building, most likely, the country's first advanced nuclear reactor,” Doug Robison, founder and CEO of Natura Resources, said as a guest on The POWER Podcast. Natura has taken an iterative, milestone-based approach to advanced reactor development and deployment, focused on efficiency and performance. This started in 2020 when the company brought together ACU's NEXT Lab with Texas A&M University; the University of Texas, Austin; and the Georgia Institute of Technology to form the Natura Resources Research Alliance. In only four years, Natura and its partners developed a unique nuclear power system and successfully licensed the design. The U.S. Nuclear Regulatory Commission (NRC) issued a construction permit for deployment of the system at ACU last September. Called the MSR-1, ACU's unit will be a 1-MWth molten salt research reactor (MSRR). It is expected to provide valuable operational data to support Natura's 100-MWe systems. It will also serve as a “world-class research tool” to train advanced reactor operators and educate students, the company said. Natura is not only focused on its ACU project, but it is also moving forward on commercial reactor projects. In February, the company announced the deployment of two advanced nuclear projects, which are also in Texas. These deployments, located in the Permian Basin and at Texas A&M University's RELLIS Campus, represent significant strides in addressing energy and water needs in the state. “Our first was a deployment of a Natura commercial reactor in the Permian Basin, which is where I spent my career. We're partnering with a Texas produced-water consortium that was created by the legislature in 2021,” said Robison. One of the things that can be done with the high process heat from an MSR is desalinization. “So, we're going to be desalinating produced water and providing power—clean power—to the oil and gas industry for their operations in the Permian Basin,” said Robison. Meanwhile, at Texas A&M's RELLIS Campus, which is located about eight miles northwest of the university's main campus in College Station, Texas, a Natura MSR-100 reactor will be deployed. The initiative is part of a broader project known as “The Energy Proving Ground,” which involves multiple nuclear reactor companies. The project aims to bring commercial-ready small modular reactors (SMRs) to the site, providing a reliable source of clean energy for the Electric Reliability Council of Texas (ERCOT).

    183. Geothermal Energy Storage: The Clean Power Solution You Haven't Heard Of

    Feb 24, 2025 • 22:54


    Geothermal energy has been utilized by humans for millennia. While the first-ever use may be a mystery, we do know the Romans tapped into it in the first century for hot baths at Aquae Sulis (modern-day Bath, England). Since then, many other people and cultures have found ways to use the Earth's underground heat to their benefit. Geothermal resources were used for district heating in France as far back as 1332. In 1904, Larderello, Italy, was home to the world's first experiment in geothermal electricity generation, when five lightbulbs were lit. By 1913, the first commercial geothermal power plant was built there, which expanded to power the local railway system and nearby villages. However, one perhaps lesser-known geothermal concept revolves around energy storage. “It's very much like pumped-storage hydropower, where you pump a lake up a mountain, but instead of going up a mountain, we're putting that lake deep in the earth,” Cindy Taff, CEO of Sage Geosystems, explained as a guest on The POWER Podcast. Sage Geosystems' technology utilizes knowledge gleaned from the oil and gas industry, where Taff spent more than 35 years as a Shell employee. “What we do is we drill a well. We're targeting a very low-permeability formation, which is the opposite of what oil and gas is looking for, and quite frankly, it's the opposite of what most geothermal technologies are looking for. That low permeability then allows you to place a fracture in that formation, and then operate that fracture like a balloon or like your lungs,” Taff explained. “When the demand is low, we use electricity to power an electric pump. We pump water into the fracture. We balloon that fracture open and store the water under pressure until a time of day that power demand peaks. Then, you open a valve at surface. That fracture is naturally going to close. It drives the water to surface. You put it through a Pelton turbine, which looks like a kid's pinwheel. You spin the turbine, which spins the generator, and you generate electricity.” Unlike more traditional geothermal power generation systems that use hot water or steam extracted from underground geothermal reservoirs, Sage's design uses what's known as hot dry rock technology. To reach hot dry rock, drillers may have to go deeper to find desired formations, but these formations are much more common and less difficult to identify, which greatly reduces exploration risks. Taff said traditional geothermal energy developers face difficulties because they need to find three things underground: heat, water, and high-permeability formations. “The challenge is the exploration risk, or in other words, finding the resource where you've got the heat, the large body of water deep in the earth, as well as the permeability,” she said. “In hot dry rock geothermal, which is what we're targeting, you're looking only for that heat. We want a low-permeability formation, but again, that's very prevalent.” Sage is now in the process of commissioning its first commercial energy storage project in Texas. “We're testing the piping, and we're function testing the generator and the Pelton turbine, so we'll be operating that facility here in the next few weeks,” Taff said. Meanwhile, the company has also signed an agreement with the California Resources Corporation to establish a collaborative framework for pursuing commercial projects and joint funding opportunities related to subsurface energy storage and geothermal power generation in California. 
It also has ongoing district heating projects in Lithuania and Romania, and Taff said the U.S. Department of Defense has shown a lot of interest in the company's geothermal technology. Additionally, Meta signed a contract for a 150-MW geothermal power generation system to supply one of its data centers.
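    As a very rough illustration of the mechanism Taff describes, the hydraulic energy recovered from a pressurized fracture is on the order of pressure times volume, discounted by turbine losses. All of the numbers below are assumed for illustration; they are not Sage Geosystems figures.

```python
# Rough order-of-magnitude sketch of the storage concept described above:
# water pumped into a fracture is held under pressure, then released through
# a Pelton turbine. Hydraulic energy is roughly pressure x volume. Every
# number here is an illustrative assumption, not Sage Geosystems data.
SURFACE_PRESSURE_PA = 20e6     # assumed ~20 MPa (about 2,900 psi) at the wellhead
STORED_VOLUME_M3 = 4_000       # assumed volume of water cycled per discharge
TURBINE_EFFICIENCY = 0.80      # assumed Pelton turbine plus generator efficiency

energy_joules = SURFACE_PRESSURE_PA * STORED_VOLUME_M3 * TURBINE_EFFICIENCY
energy_mwh = energy_joules / 3.6e9          # 1 MWh = 3.6e9 J

print(f"Recoverable energy: ~{energy_mwh:.1f} MWh per discharge cycle")
# ~17.8 MWh with these assumptions, enough to run a few-MW generator for
# several hours, which matches the daily-cycling role the episode describes.
```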

    182. Space-Based Solar Power: The Future of 24/7 Clean Energy Generation

    Feb 18, 2025 • 43:58


    Imagine a field of solar panels floating silently in the endless day of Earth's orbit. Unlike their terrestrial cousins, this space-based solar array never faces nighttime, clouds, or atmospheric interference. Instead, they bathe in constant, intense sunlight, converting this endless stream of energy into electricity with remarkable efficiency. But the true innovation lies in how this power is transmitted to power grids on Earth. The electricity generated in space is converted into invisible beams of microwaves or laser light that pierce through the atmosphere with minimal losses. These beams are precisely aimed at receiving stations on Earth—collections of antennas or receivers known as “rectennas” that capture and reconvert the energy back into electricity that can be supplied to the power grid. This isn't science fiction—it's space-based solar power (SBSP), a technology that could revolutionize how clean energy is generated and distributed. While conventional solar panels on Earth can only produce power during daylight hours and are at the mercy of weather conditions, orbital solar arrays could beam massive amounts of clean energy to Earth 24 hours a day, 365 days a year, potentially transforming the global energy landscape.

    181. A New Paradigm for Power Grid Operation

    Feb 10, 2025 • 41:18


    Power grids operate like an intricate ballet of energy generation and consumption that must remain perfectly balanced at all times. The grid maintains a steady frequency (60 Hz in North America and 50 Hz in many other regions) by matching power generation to demand in real-time. Traditional power plants with large rotating turbines and generators play a crucial role in this balance through their mechanical inertia—the natural tendency of these massive spinning machines to resist changes in their rotational speed. This inertia acts as a natural stabilizer for the grid. When there's a sudden change in power demand or generation, such as a large factory turning on or a generator failing, the rotational energy stored in these spinning masses automatically helps cushion the impact. The machines momentarily speed up or slow down slightly, giving grid operators precious seconds to respond and adjust other power sources. However, as we transition to renewable energy sources like solar and wind that don't have this natural mechanical inertia, maintaining grid stability becomes more challenging. This is why grid operators are increasingly focusing on technologies like synthetic inertia from wind turbines, battery storage systems, and advanced control systems to replicate the stabilizing effects traditionally provided by conventional power plants. Alex Boyd, CEO of PSC, a global specialist consulting firm working in the areas of power systems and control systems engineering, believes the importance of inertia will lessen, and probably sooner than most people think. In fact, he suggested stability based on physical inertia will soon be the least-preferred approach. Boyd recognizes that his view, which was expressed while he was a guest on The POWER Podcast, is potentially controversial, but there is a sound basis behind his prediction. Power electronics-based systems utilize inverter-based resources, such as wind, solar, and batteries. These systems can detect and respond to frequency deviations almost instantaneously using fast frequency response mechanisms. This actually allows for much faster stabilization compared to mechanical inertia. Power electronics reduce the need for traditional inertia by enabling precise control of grid parameters like frequency and voltage. While they decrease the available physical inertia, they also decrease the amount of inertia required for stability through advanced control strategies. Virtual synchronous generators and advanced inverters can emulate inertia dynamically, offering tunable responses that adapt to grid conditions. For example, adaptive inertia schemes provide high initial inertia to absorb faults but reduce it over time to prevent oscillations. Power electronic systems address stability issues across a wide range of frequencies and timescales, including harmonic stability and voltage regulation. This is achieved through multi-timescale modeling and control techniques that are not possible with purely mechanical systems. Inverter-based resources allow for distributed coordination of grid services, such as frequency regulation and voltage support, enabling more decentralized grid operation compared to centralized inertia-centric systems. Power electronic systems are essential for grids with a high penetration of renewable energy sources, which lack inherent mechanical inertia. These systems ensure stability while facilitating the transition to low-carbon energy by emulating or replacing traditional generator functions. 
“I do foresee a time in the not-too-distant future where we'll be thinking about how do we actually design a system so that we don't need to be impacted so much by the physical inertia, because it's preventing us from doing what we want to do,” said Boyd. “I think that time is coming. There will be a lot of challenges to overcome, and there'll be a lot of learning that needs to be done, but I do think the time is coming.”
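    A textbook swing-equation sketch helps show why inertia buys operators time, and why fast-responding inverter-based resources can stand in for it. The system size, inertia constants, and contingency below are assumed example values, not figures from PSC or the episode.

```python
# A textbook-style sketch (not PSC's analysis) of why inertia buys time: the
# swing equation says frequency falls at a rate proportional to the power
# imbalance and inversely proportional to stored rotating energy.
# df/dt = f0 * delta_P / (2 * H * S), with H in seconds and S in MVA.
F0 = 60.0          # Hz, nominal frequency (North America)
S_SYSTEM = 50_000  # MVA, assumed system rating
DELTA_P = 1_000    # MW, assumed sudden generation loss

def initial_rocof(h_seconds: float) -> float:
    """Initial rate of change of frequency (Hz/s) right after the loss."""
    return F0 * DELTA_P / (2 * h_seconds * S_SYSTEM)

for h in (5.0, 2.0):   # high-inertia vs. low-inertia (renewable-heavy) system
    rocof = initial_rocof(h)
    seconds_to_59_5 = 0.5 / rocof   # time to fall 0.5 Hz if nothing responds
    print(f"H = {h} s: frequency falls {rocof:.3f} Hz/s; "
          f"~{seconds_to_59_5:.1f} s to reach 59.5 Hz")
# Fast frequency response from inverter-based resources (batteries, wind,
# solar) can inject power within a few hundred milliseconds, which is why
# Boyd argues physical inertia can matter less as those controls mature.
```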

    180. Data Centers Consume 3% of Energy in Europe: Understand Geographic Hotspots and How AI Is Reshaping Demand

    Jan 31, 2025 • 30:59


    The rapid rise of data centers has put many power industry demand forecasters on edge. Some predict the power-hungry nature of the facilities will quickly create problems for utilities and the grid. ICIS, a data analytics provider, calculates that in 2024, demand from data centers in Europe accounted for 96 TWh, or 3.1% of total power demand. “Now, you could say it's not a lot—3%—it's just a marginal size, but I'm going to spice it up a bit with two additional layers,” Matteo Mazzoni, director of Energy Analytics at ICIS, said as a guest on The POWER Podcast. “One is: that power demand is very consolidated in just a small subset of countries. So, five countries account of over 60% of that European power demand. And within those five countries, which are the usual suspects in terms of Germany, France, the UK, Ireland, and Netherlands, half of that consumption is located in the FLAP-D market, which sounds like a fancy new coffee, but in reality is just five big cities: Frankfurt, London, Amsterdam, Paris, and Dublin.” Predicting where and how data center demand will grow in the future is challenging, however, especially when looking out more than a few years. “What we've tried to do with our research is to divide it into two main time frames,” Mazzoni explained. “The next three to five years, where we see our forecast being relatively accurate because we looked at the development of new data centers, where they are being built, and all the information that are currently available. And, then, what might happen past 2030, which is a little bit more uncertain given how fast technology is developing and all that is happening on the AI [artificial intelligence] front.” Based on its research, ICIS expects European data center power demand to grow 75% by 2030, to 168 TWh. “It's going to be a lot of the same,” Mazzoni predicted. “So, those big centers—those big cities—are still set to attract most of the additional data center consumption, but we see the emergence of also new interesting markets, like the Nordics and to a certain extent also southern Europe with Iberia [especially Spain] being an interesting market.” Yet, there is still a fair amount of uncertainty around demand projections. Advances in liquid cooling methods will likely reduce data center power usage. That's because liquid cooling offers more efficient heat dissipation, which translates directly into lower electricity consumption. Additionally, there are opportunities for further improvement in power usage effectiveness (PUE), which is a widely used data center energy efficiency metric. At the global level, the average PUE has decreased from 2.5 in 2007 to a current average of 1.56, according to the ICIS report. However, new facilities consistently achieve a PUE of 1.3 and sometimes much better. Google, which has many state-of-the-art and highly efficient data centers, reported a global average PUE of 1.09 for its facilities over the last year. Said Mazzoni, “An expert in the field told us when we were doing our research, when tech moves out of the equation and you have energy engineers stepping in, you start to see that a lot of efficiency improvements will come, and demand will inevitably fall.” Thus, data center load growth projections should be taken with a grain of salt. “The forecast that we have beyond 2030 will need to be revised,” Mazzoni predicted. “If we look at the history of the past 20 years—all analysts and all forecasts around load growth—they all overshoot what eventually happened. 
The first time it happened when the internet arrived—there was obviously great expectations—and then EVs, electric vehicles, and then heat pumps. But if we look at, for example, last year—2024—European power demand was up by 1.3%, U.S. power demand was up by 1.8%, and probably weather was the main driver behind that growth.”
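    PUE is simply total facility energy divided by IT equipment energy, so efficiency gains translate directly into lower grid demand for the same computing. The sketch below applies the PUE values cited in the episode to a hypothetical 10-TWh IT load; the load figure is an assumption for illustration.

```python
# PUE (power usage effectiveness) = total facility energy / IT equipment
# energy; 1.0 would mean zero overhead. The PUE values below are those cited
# in the episode; the IT load is an assumed example.
IT_LOAD_TWH = 10.0   # hypothetical annual IT equipment consumption

for label, pue in [("2007 global average", 2.50),
                   ("current global average", 1.56),
                   ("typical new build", 1.30),
                   ("Google fleet average", 1.09)]:
    facility_twh = IT_LOAD_TWH * pue
    overhead = facility_twh - IT_LOAD_TWH
    print(f"{label:24s} PUE {pue:.2f}: {facility_twh:.1f} TWh total "
          f"({overhead:.1f} TWh cooling/overhead)")
# The same 10 TWh of computing needs 25 TWh of supply at a 2.5 PUE but only
# ~10.9 TWh at 1.09, one reason the episode suggests long-range demand
# forecasts tend to overshoot.
```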

    179. District Energy Systems: The Invisible Giant of Urban Efficiency

    Jan 22, 2025 • 51:20


    District energy systems employ a centralized facility to supply heating, cooling, and sometimes electricity for multiple buildings in an area through a largely underground, mostly unseen network of pipes. When district energy systems are utilized, individual buildings do not need their own boilers, chillers, and cooling towers. This offers a number of benefits to building owners and tenants. Among them are:

    • Energy Efficiency. Centralized heating/cooling is more efficient than individual building systems, reducing energy use by 30% to 50% in some cases.
    • Cost Savings. Lower operations and maintenance costs through economies of scale and reduced equipment needs per building.
    • Reduced Environmental Impacts. Emissions are lessened and renewable energy resources can often be more easily integrated.
    • Reliability. A more resilient energy supply is often provided, with redundant systems and professional operation.
    • Space Optimization. Buildings need less mechanical equipment, freeing up valuable space.

    The concept is far from new. In fact, Birdsill Holly is credited with deploying the U.S.'s first district energy system in Lockport, New York, in 1877, and many other cities incorporated district systems into their infrastructure soon thereafter. While district energy systems are particularly effective in dense urban areas, they're also widely used at hospitals and at other large campuses around the world. “There's over 600 operating district energy systems in the U.S., and that's in cities, also on college and university campuses, healthcare, military bases, airports, pharma, even our sort of newer industries like Meta, Apple, Google, their campuses are utilizing district energy, because, frankly, there's economies of scale,” Rob Thornton, president and CEO of the International District Energy Association (IDEA), said as a guest on The POWER Podcast. “District energy is actually quite ubiquitous,” said Thornton, noting that systems are common in Canada, throughout Europe, in the Middle East, and many other parts of the world. “But, you know, not that well-known. We're not visible. Basically, the assets are largely underground, and so we don't necessarily have the visibility opportunity of like wind turbines or solar panels,” he said. “So, we quietly do our work. But, I would guess that for the listeners of this podcast, if they went to a college or university in North America, I bet, eight out of 10 lived in a dorm that was supplied by a district heating system. So, it's really a lot more common than people realize,” said Thornton.

    178. Why LVOE May Be a Better Decision-Making Tool Than LCOE for Power Companies

    Dec 19, 2024 • 33:33


    Most POWER readers are probably familiar with levelized cost of energy (LCOE) and levelized value of energy (LVOE) as metrics used to help evaluate potential power plant investment options. LCOE measures the average net present cost of electricity generation over a facility's lifetime. It includes capital costs, fuel costs, operation and maintenance (O&M) costs, financing costs, expected capacity factor, and project lifetime. Meanwhile, LVOE goes beyond LCOE by considering the actual value the power provides to the grid, including time of generation (peak vs. off-peak), location value, grid integration costs and benefits, contributions to system reliability, environmental attributes, and capacity value. Some of the key differences stem from the perspective and market context each provides. LCOE, for example, focuses on pure cost comparison between technologies, while LVOE evaluates actual worth to the power system. Notably, LCOE ignores when and where power is generated; whereas, LVOE accounts for temporal and locational value variations. Concerning system integration, LCOE treats all generation as equally valuable, while LVOE considers grid integration costs and system needs. “Things like levelized cost of energy or capacity factors are probably the wrong measure to use in many of these markets,” Karl Meeusen, director of Markets, Legislative, and Regulatory Policy with Wärtsilä North America, said as a guest on The POWER Podcast. “Instead, I think one of the better metrics to start looking at and using more deeply is what we call the levelized value of energy, and that's really looking at what the prices at the location where you're trying to build that resource are going to be.” Wärtsilä is a company headquartered in Finland that provides innovative technologies and lifecycle solutions for the marine and energy markets. Among its main offerings are reciprocating engines that can operate on a variety of fuels for use in electric power generating plants. Wärtsilä has modeled different power systems in almost 200 markets around the world. It says the data consistently shows that a small number of grid-balancing gas engines in a system can provide the balancing and flexibility to enable renewables to flourish—all while maintaining reliable, resilient, and affordable electricity. Meeusen noted that a lot of the models find engines offer greater value than other technologies on the system because of their flexibility, even though they may operate at lower capacity factors. Having the ability to turn on and off allows owners to capture high price intervals, where prices spike because of scarcity or ramp shortages, while avoiding negative prices by turning off as prices start to dip and drop lower. “That levelized value is one of the things that we think is really important going forward,” he said. “I think what a lot of models and planning scenarios miss when they look at something like LCOE—and they're looking at a single resource added into the system—is how it fits within the system, and what does it do to the value of the rest of their portfolio?” Meeusen explained. “I call this: thinking about the cannibalistic costs. If I look at an LCOE with a capacity factor for a combined cycle resource, and don't consider how that might impact or increase the curtailment of renewable energy—no cost renewable energy—I don't really necessarily see the true cost of some of those larger, inflexible generators on the system. 
And, so, when we think about that, we really want to make sure that what we're covering and capturing is the true value that a generator has in a portfolio, not just as a standalone resource.”
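    The two metrics can be written down in a few lines. Below is a simplified, textbook-style sketch: LCOE as discounted lifetime cost per discounted MWh, and LVOE as the discounted market value captured per MWh actually delivered. The formulas and the toy two-interval price example are illustrative simplifications, not Wärtsilä's models.

```python
# Simplified, textbook-style formulas for the two metrics discussed above.
# LCOE: discounted lifetime cost per discounted MWh delivered. LVOE: the
# discounted market value captured per MWh, weighting price by actual output.
# All inputs are illustrative assumptions, not Wartsila model data.

def lcoe(capex, annual_cost, annual_mwh, years, rate):
    """Levelized cost of energy ($/MWh)."""
    disc_cost = capex + sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))
    disc_mwh = sum(annual_mwh / (1 + rate) ** t for t in range(1, years + 1))
    return disc_cost / disc_mwh

def lvoe(interval_mwh, interval_price, years, rate):
    """Levelized value of energy ($/MWh) for a repeating set of intervals."""
    yearly_value = sum(g * p for g, p in zip(interval_mwh, interval_price))
    yearly_mwh = sum(interval_mwh)
    disc_value = sum(yearly_value / (1 + rate) ** t for t in range(1, years + 1))
    disc_mwh = sum(yearly_mwh / (1 + rate) ** t for t in range(1, years + 1))
    return disc_value / disc_mwh

# Toy two-interval "year": a flexible engine runs only during the scarcity hour.
prices = [20, 250]           # $/MWh in an off-peak and a scarcity interval
flexible = [0, 50]           # MWh: off when prices are low, on during the spike
baseload = [50, 50]          # MWh: runs flat regardless of price

print(f"Example LCOE:   {lcoe(60e6, 2e6, 100_000, 20, 0.07):.0f} $/MWh")  # ~77
print(f"Flexible LVOE:  {lvoe(flexible, prices, 20, 0.07):.0f} $/MWh")    # 250
print(f"Baseload LVOE:  {lvoe(baseload, prices, 20, 0.07):.0f} $/MWh")    # 135
```

    In this toy case the flexible unit captures the scarcity price in full, while the always-on unit's value is diluted by its off-peak output, which is the portfolio effect Meeusen describes.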

    177. How Nuclear Power Could Help Decarbonize Industrial Steam Needs

    Dec 12, 2024 • 32:29


    Steam is used for a wide variety of critical processes across many industrial sectors. For example, pulp and paper facilities use steam to power paper machines, dry paper and wood products, and provide heat for chemical recovery processes. Steam is used by metal and mining companies, as well as in the food and beverage industry, petroleum refining, pharmaceutical manufacturing, textile production, and many other industrial processes. “About 20% of global carbon emissions come from the industrial heat sector, and virtually all of that industrial heat today is produced by burning hydrocarbons—coal and natural gas—and emitting carbon into the atmosphere,” Clay Sell, CEO of X-energy, said as a guest on The POWER Podcast. “With our technology, we have the opportunity to replace hydrocarbons and use nuclear-generated carbon-free steam to dramatically decarbonize these so-called hard-to-decarbonize sectors.” X-energy is a nuclear reactor and fuel design engineering company. It is developing Generation-IV high-temperature gas-cooled nuclear reactors and what's known as TRISO-X fuel to power them. The company's Xe-100 small modular reactor (SMR) is an 80-MWe reactor that can be scaled into a four-pack (320-MWe power plant) that can grow even larger as needed. “The most significant advantages that we have over large-scale traditional nuclear power plants is the evolution of our technology, our safety case, and the smaller, more simplified designs that can be built with much less time and much less money,” Sell said. “We're a high-temperature gas-cooled reactor using a TRISO fuel form—that's ceramic, encapsulated fuel in a round pebble that flows through the reactor like gumballs through a gumball machine.” The Xe-100 design's intrinsic safety makes it especially unique. “This is a plant that cannot melt down under any scenario that one could imagine affecting the plant. So, that extraordinary safety case allows us to operate on a very small footprint,” said Sell. The simplified design has fewer subsystems and components, less concrete, less steel, and less equipment than traditional nuclear power plants. As noted previously, X-energy's SMR is capable of producing high-quality steam, which is especially attractive for use in industrial processes. As such, Dow Inc., one of the world's leading materials science companies, has agreed to deploy the first Xe-100 unit at its Union Carbide Corp. Seadrift Operations, a sprawling chemical materials manufacturing site in Seadrift, Calhoun County, Texas. “Our first project is going to be deployed in a public-private partnership with the U.S. government and Dow Inc., the large chemical manufacturer, at a site southwest of Houston, Texas, that will come online around the end of this decade,” Sell reported. Currently, X-energy is in the final stages of its design effort. Once complete, the next step will be to submit a construction permit application to the Nuclear Regulatory Commission (NRC). If all goes according to plan, the application should be approved by the NRC in early 2027, which would allow construction to start around that time. “We anticipate construction on the plant to be about a three- to three-and-a-half-year process, which will then bring it online in the early 2030s,” Sell explained. Beyond that, X-energy has an agreement to supply Amazon with 5 GW of new SMR projects (64 units) by 2039, starting with an initial four-unit 320-MWe Xe-100 plant with regional utility Energy Northwest in central Washington. 
Sell believes the deal positions X-energy to quickly apply lessons learned from its first-of-a-kind project with Dow, replicate and repeat the effort to achieve scale, and reach a favorable nth-of-a-kind cost structure faster than anyone else in the SMR market today. Said Sell, “When we imagine a future of a decarbonized economy with reliable power supporting dramatic growth at a reasonable cost, I believe X-energy is going to be a central technology to that future.”
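    The unit counts Sell cites line up with simple arithmetic on the 80-MWe Xe-100 rating, as the quick check below shows (figures taken from the text; the script itself is only an illustration).

```python
# Quick arithmetic on the figures cited above: the Xe-100 is an 80-MWe unit,
# a standard plant is a "four-pack," and the Amazon agreement targets 5 GW.
UNIT_MWE = 80
FOUR_PACK_MWE = 4 * UNIT_MWE                 # 320 MWe, as stated
AMAZON_UNITS = 64
print(f"Four-pack: {FOUR_PACK_MWE} MWe")
print(f"64 units:  {AMAZON_UNITS * UNIT_MWE / 1000:.2f} GWe")   # 5.12 GWe, ~5 GW
print(f"Plants:    {AMAZON_UNITS // 4} four-pack sites")        # 16 sites
```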

    176. Hydrogen Use Cases for the Power Industry

    Dec 4, 2024 • 31:19


    Hydrogen is becoming increasingly important to the electric power generation industry for several reasons. One is that hydrogen offers a promising pathway to decarbonize the power sector. When used in fuel cells or burned for electricity generation, hydrogen produces only water vapor as a byproduct, making it a zero-emission energy source. This is crucial for meeting global climate change mitigation goals and reducing greenhouse gas emissions from power generation. Hydrogen also provides a potential energy storage solution, which is critical for integrating solar and wind energy into the power grid. These renewable resources are intermittent—sometimes they produce more energy than is needed by the grid, while at other times, their output may fall to nearly nothing. Hydrogen can be produced through electrolysis during periods of excess renewable energy production, then stored and used to generate electricity when needed. This helps address the challenge of matching energy supply with demand. Hydrogen is a flexible and versatile fuel that can be used in fuel cells, gas turbines, or internal combustion engines. It can also be blended with natural gas to accommodate existing equipment limitations. The wide range of options makes hydrogen a great backup fuel for microgrids and other systems that require excellent reliability. “We've actually seen quite a bit of interest in that,” Tim Lebrecht, industry manager for Energy Transition and the Chemicals Process Industries with Air Products, said as a guest on The POWER Podcast. Lebrecht noted that hydrogen can serve as a primary fuel in microgrids, or as a backup or supplemental source. “Think of a peaking unit that as temperature goes up during the day, your pricing for power could also be going up,” Lebrecht explained. “At a point, hydrogen may be a peak shave–type situation, where you then maximize the power from the grid, but then you're using hydrogen as a supplement during that time period.” Another hydrogen use case revolves around data centers. “Data centers, specifically, have been really interested in: ‘How do we use hydrogen as a backup type material?' ” Lebrecht said. Air Products is the world's leading supplier of hydrogen with more than 65 years of experience in hydrogen production, storage, distribution, and dispensing. Lebrecht noted that his team regularly works with original equipment manufacturers (OEMs); engineering, procurement, and construction (EPC) companies; and other firms to collaborate on solutions involving hydrogen. “We've got a great history,” he said. “My team has a great amount of experience.”
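    The peak-shaving pattern Lebrecht describes boils down to a price-threshold dispatch rule: take grid power while it is cheap and switch to hydrogen when prices spike. The price series, threshold, and load in the sketch below are illustrative assumptions, not Air Products data.

```python
# A minimal sketch of the peak-shaving pattern described above: buy from the
# grid when prices are low and self-supply with hydrogen when prices spike.
# The price series, hydrogen cost, and load are illustrative assumptions.
HOURLY_PRICES = [35, 40, 48, 95, 140, 180, 120, 60, 42]   # $/MWh over a hot day
H2_GENERATION_COST = 90    # $/MWh equivalent, assumed cost of running on hydrogen
LOAD_MW = 5                # steady facility load

grid_cost = h2_cost = 0.0
for price in HOURLY_PRICES:
    if price > H2_GENERATION_COST:        # peak-shave: self-supply with hydrogen
        h2_cost += H2_GENERATION_COST * LOAD_MW
    else:                                 # otherwise take cheaper grid power
        grid_cost += price * LOAD_MW

baseline = sum(HOURLY_PRICES) * LOAD_MW
print(f"Grid-only cost:      ${baseline:,.0f}")
print(f"With H2 peak shave:  ${grid_cost + h2_cost:,.0f}")
```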

    175. Communication Is Key to Successful Power Projects

    Nov 21, 2024 • 20:07


    Power plant construction and retrofit projects come in all shapes and sizes, but they all generally have at least one thing in common: complexity. There are usually a lot of moving pieces that must be managed. This can include sourcing the right materials and components, getting equipment delivered to the site at the right time, finding qualified contractors, and overseeing handoffs between working groups. Getting a job done on time and on budget is not as easy as some people might think. “It absolutely can be difficult and a lot of things to consider,” Kevin Slepicka, vice president of Sales for Heat Recovery Boilers at Rentech Boiler Systems, said as a guest on The POWER Podcast. “You've got to make sure that communication is ongoing between your suppliers and the end user.” Rentech is a leading manufacturer of boiler systems including package boilers, waste heat boilers, and heat recovery steam generators (HRSGs). Rentech's fabrication facilities are in Abilene, Texas. “We have three shops,” Slepicka explained. “There's 197,000 square feet of manufacturing space under roof. We've got over 100 tons of lift capability with cranes, and we can bring in other cranes for our heavier lifts. Our properties are located on 72 acres, so we have a lot of room for staging equipment, storing equipment, if customers aren't ready to take delivery at the time the units are done.” Moving large boilers from Texas to sites around the country and other parts of the world can be difficult, which is another reason why good communication is imperative. “Shipping is a major consideration on how the unit is constructed, how much is going to be built in the facility, and how large we can ship. So, it really goes hand in hand with the design of the boiler,” Slepicka said. “It really is important that we work with our logistics people and work with our partner companies that do our transportation for us.” Communication with customers on potential future needs is also important. Slepicka said knowing that a retrofit may be required down the road to account for a new environmental regulation, for example, could allow a boiler system to be designed with space to accommodate changes. This could save a lot of money and headaches in the long run. “That's where you've got to be able to work with the customer—make sure you understand the space available and make sure that the unit's going to work properly,” he said. Slepicka said Rentech had a customer recently that faced new formaldehyde restrictions and needed its HRSG system modified. “Luckily, we had the space in the unit where that catalyst could be installed in the right location to address the concern they had, so it was a relatively easy retrofit for them to make.” If the prospect had not been considered up front, the cost and complexity could have been much greater.

    174. Kingston Coal Ash Spill: Cleanup Workers Were the Unfortunate Losers

    Play Episode Listen Later Nov 6, 2024 33:36


    On Dec. 22, 2008, a major dike failure occurred on the north slopes of the ash pond at the Tennessee Valley Authority’s (TVA’s) Kingston Fossil Plant. The failure released approximately 5.4 million cubic yards of coal ash onto adjacent land and into the Emory River. The Kingston spill is considered one of the most significant and costly events in TVA history. A project completion fact sheet issued jointly by the U.S. Environmental Protection Agency (EPA) and the TVA in December 2014 says the cleanup took about six years, required a total of 6.7 million man-hours, and cost $1.178 billion. TVA hired various contractors to perform the post-spill cleanup, removal, and recovery of fly ash at the Kingston site. Perhaps most notable among them was Jacobs Engineering. TVA hired Jacobs in 2009 specifically to provide program management services to assist with the cleanup. Jacobs claims to have “a strong track record of safely managing some of the world’s most complex engineering and environmental challenges.” It has noted that TVA and the EPA’s on-scene coordinator oversaw the worker safety programs for the Kingston cleanup, approving all actions in consultation with the Tennessee Department of Environment and Conservation. Jacobs said TVA maintained rigorous safety standards throughout the cleanup, and that it worked closely with TVA in following and supporting those standards. Jared Sullivan, author of Valley So Low: One Lawyer’s Fight for Justice in the Wake of America’s Great Coal Catastrophe, studied the Kingston cleanup and followed some of the plaintiffs for more than five years while writing his book. As a guest on The POWER Podcast, Sullivan suggested many of the workers felt fortunate to be employed on the Kingston cleanup. The U.S. economy was not thriving at the time; housing and stock markets were in a funk, and unemployment was relatively high. “These workers—these 900 men and women—this disaster is kind of a godsend for them as far as their employment goes, you know. A lot of them needed work. Many of them were very, very pleased to get this call,” Sullivan explained. “The trouble is that after a year or so of working on this job site—of scooping up and hauling off this coal ash muck from the landscape, also from the river—they start feeling really, really terribly,” he said. “At first they kind of write off their symptoms as overworking themselves. In many cases, these workers were working 14-hour shifts and just pushing themselves really, really hard because there’s a lot of overtime opportunities. So, that was good for them—that they could work so much, that this mess was so big,” Sullivan continued. But after a while, some workers began blacking out in their cars, having nosebleeds, and coughing up black mucus, and it became clear to them that the coal ash was the cause. Jacobs reports several contractors’ workers at the Kingston site filed workers’ compensation claims against their employer in 2013. These workers alleged that conditions at the site caused them to experience various health issues that were a result of excessive exposure to coal ash. Jacobs said many of these claims were found to be unsubstantiated and were rejected. Then, many of the same workers filed lawsuits against Jacobs, even though they may not have been Jacobs employees. Jacobs says it stands by its safety record, and that it did not cause any injuries to the workers. “The case resolved early last year, after almost 10 years of litigation,” Sullivan said.
“Jacobs Engineering and the plaintiffs—230 of them—finally settled the case. $77.5 million dollars for 230 plaintiffs. So, it works out to a couple hundred thousand dollars each for the plaintiffs after the lawyers take their fees—so, not tons of money.” In a statement, Jacobs said, “To avoid further litigation, the parties chose to enter into an agreement to resolve the cases.”

    173. Why Data Center Developers Should Think ‘Power First'

    Play Episode Listen Later Oct 30, 2024 42:10


    You don’t need me to tell you how artificial intelligence (AI) is impacting the power grid; you can just ask AI. Claude, an AI assistant created by Anthropic, told POWER, “AI training and inference are driving unprecedented demand for data center capacity, particularly due to large language models and other compute-intensive AI workloads.” It also said, “AI servers, especially those with multiple GPUs [graphics processing units], require significantly more power per rack than traditional servers—often 2–4x higher power density.” So, what does that mean for power grid operators and electricity suppliers? Claude said there could be several effects, including local grid strain in AI hub regions, the need for upgraded transmission infrastructure, higher baseline power consumption, and potential grid stability issues in peak usage periods. Notably, it said AI data centers tend to cluster in specific regions with favorable power costs and regulations, creating “hotspots” of extreme power demand. Sheldon Kimber, founder and CEO of Intersect Power, a clean energy company that develops, owns, and operates a base portfolio of 2.2 GW of solar PV and 2.4 GWh of storage in operation or under construction, understands the challenges data centers present for the grid. As a guest on The POWER Podcast, Kimber suggested the only way to meet the massive increase in power demand coming from data centers is with scalable behind-the-meter solutions. “These assets may still touch the grid—they may still have some reliance on the grid—but they’re going to have to bring with them an enormous amount of behind-the-meter generation and storage and other things to make sure that they are flexible enough that the grid can integrate them without creating such a strain on the grid, on rate payers, and on the utilities that service them,” Kimber said. Yet, data center developers have not traditionally kept power top-of-mind. “The data center market to date has been more of a real estate development game,” Kimber explained. “How close to a labor pool are you? What does it look like on the fiber side? What does the land look like?” He said electric power service was certainly part of the equation, but it was more like part of a “balanced breakfast of real estate criteria,” rather than a top priority for siting a data center. In today’s environment, that needs to change. Kimber said Intersect Power has been talking to data center companies for at least three years, pitching them on the idea of siting data centers behind-the-meter at some of his projects. The response has been lukewarm at best. Most of the companies want to keep their data centers in already well-established hubs, such as in northern Virginia; Santa Clara, California; or the Columbia River Gorge region in Oregon, for example. Kimber’s comeback has been, “Tell us when you’re ready to site for ‘Power First.’ ” What “Power First” means is simple. Start with power, and the availability of power, as the first criterion, and screen out all the sites that don’t have power. “To date, data center development that was not ‘Power First’ has really been focused on: ‘What does the plug look like?’ ” Kimber said. In other words: How is the developer connecting the data center to the power grid—or plugging in? The developers basically assumed that if they could get connected to the grid, the local utility would find a way to supply the electricity needed. However, it’s getting harder and harder for utilities to provide what developers are asking for.
“The realization that the grid just isn't going to be able to provide power in most of the places that people want it is now causing a lot of data center customers to re-evaluate the need to move from where they are. And when they're making those moves, obviously, the first thing that's coming to mind is: ‘Well, if I'm going to have to move anyway, I might as well move to where the binding constraint, which is power, is no longer a constraint,' ” he said.

    172. What Are Microreactors and How Soon Could We See One in Operation

    Play Episode Listen Later Oct 22, 2024 33:56


    Microreactors are a class of very small modular reactors targeted for non-conventional nuclear markets. The U.S. Department of Energy (DOE) supports a variety of advanced reactor designs, including gas, liquid-metal, molten-salt, and heat-pipe-cooled concepts. In the U.S., microreactor developers are currently focused on designs that could be deployed as early as the mid-2020s. The key features of microreactors that distinguish them from other reactor types mainly revolve around their size. Microreactors typically produce less than 20 MW of thermal output. The size obviously allows a much smaller footprint than traditional nuclear power reactors. It also allows for factory fabrication and easier transportability. Among other unique aspects is their self-regulating capability, which could enable remote and semi-autonomous microreactor operation. Their rapid deployability (weeks or months rather than many years) is a huge benefit, too, allowing units to be used in emergency response and other time-sensitive situations. Furthermore, some designs are expected to operate for up to 10 years or more without refueling or significant maintenance, which could be a big benefit in remote locations. A lot of microreactor development work is being done at the Idaho National Laboratory (INL). John H. Jackson, National Technical Director for the DOE’s Office of Nuclear Energy Microreactor program at INL, was a recent guest on The POWER Podcast. On the show, he noted some of the programs and facilities INL has available to assist in proving microreactor concepts. “I like to say it starts with my program, because I’m overtly focused on enabling and accelerating commercial development and deployment of microreactor technology,” Jackson said. “But there are certainly the entities like the National Reactor Innovation Center, or NRIC, which is heavily focused on deployment and enabling deployment of microreactor technology, as well as small modular reactor technology.” POWER has reported extensively on the Pele and MARVEL microreactor projects. Project Pele is a Department of Defense (DOD) project that recently broke ground at INL. Meanwhile, MARVEL, which stands for Microreactor Applications Research Validation and EvaLuation, is funded through the DOE Office of Nuclear Energy’s Microreactor program. Project Pele aims to build and demonstrate a high-temperature gas-cooled mobile microreactor manufactured by Lynchburg, Virginia–headquartered BWXT Advanced Technologies. Fueled with TRI-structural ISOtropic particle fuel, Project Pele will produce 1 MWe to 5 MWe for INL’s Critical Infrastructure Test Range Complex (CITRC) electrical test grid. The DOD noted last month that assembly of the final Pele reactor is scheduled to begin in February 2025, and the current plan is to transport the fully assembled reactor to INL in 2026. The MARVEL design is a sodium-potassium-cooled microreactor that will be built inside the Transient Reactor Test (TREAT) facility at INL. It will generate 85 kW of thermal energy and about 20 kW of electrical output. It is not intended to be a commercial design, but the experience of constructing and operating the unit could be crucial for future microreactor developers and microgrid designers, as future plans are to connect it to a microgrid. “The MARVEL reactor is one of the top priorities, if not the top priority, at the Idaho National Laboratory, along with the project Pele,” Jackson said.
“One or the other—Pele or MARVEL—will be the first reactor built at Idaho National Laboratory in over 50 years.” Still, Jackson was cautious when it came to predicting when the first microreactor might begin operation. “I cringe sometimes when people get a little ahead of themselves and start making bold declarations, like, ‘We're going to have a microreactor next year,' for instance. I think it's important to be excited, but it's also important to stay realistic with respect to timeframes for deployment,” he said.

    171. The Domestic Content Bonus Credit and How to Maximize Incentives for Solar Projects

    Play Episode Listen Later Sep 26, 2024 24:25


    The domestic content bonus credit is available to taxpayers that certify their qualified facility, energy project, or energy storage technology was built with certain percentages of steel, iron, or manufactured products that were mined, produced, or manufactured in the U.S. “What we've seen happen is just a proliferation of investments into U.S. domestic manufacturing,” Mike Hall, CEO of Anza Renewables, said as a guest on The POWER Podcast. Hall said U.S. manufacturers started with the easiest and probably lowest-risk investment in the supply chain, which is module assembly. “You could count on one hand the number of U.S. module options just a couple of years ago,” he said. “Today, I was actually looking at our database, and if you were looking to take delivery in late-2025, there are 17 different manufacturers that are willing to sign POs [purchase orders] today to supply domestically made modules.” Hall suggested most developers that are looking to utilize domestic supplies are trying to solve one or two problems. “Either they're trying to mitigate trade risk—AD/CVD [anti-dumping and countervailing duty] risk—from the various petitions, or risk around detainment by customs due to concerns around UFLPA [Uyghur Forced Labor Prevention Act] violations,” explained Hall. “So, that's one potential problem that customers are trying to solve, and a domestically made module may really help solve that problem,” he said. “The other thing, though, that we increasingly see developers looking to do is to try and access the extra 10% tax credit that you can get if you meet certain minimum standards for domestically manufactured content,” Hall continued. For solar projects, that generally means a domestically manufactured solar cell is needed. “A few years ago, again, there were one, maybe two options for that,” Hall noted. “There's still only a few—we see those options growing over time—but if you're looking at late-2025 deliveries, there's four to five viable options of companies that will actually issue POs today for domestically manufactured cells. So, overall, we're definitely seeing more and more options come to the market, and that's really exciting.” Yet, aside from domestic content, the options available on the market have never been greater than today. “There are more manufacturers selling into the market,” said Hall. “On Anza, we have coverage of 95% of the U.S. supply, and that requires us to have relationships—partnerships in the data pipeline—with over 33 different suppliers. So, if you're doing a mid- or large-scale project, there's over 120 different products that you should be considering. And, so, navigating that, and finding the module or the handful of modules that are actually going to deliver an optimal financial outcome is a big challenge.” Hall suggested maximizing project economics requires having a sound view of the market. Then, developers must compare products, accounting for cost to install, predicted energy production, the value of the energy, and particular project risks and priorities. “One of the things we help developers do is really understand: what is the value in dollars per watt of efficiency and the value for their particular project,” explained Hall. “And that value differs. If you've got a community solar project with a really high priced PPA [power purchase agreement], then efficiency is worth a whole lot. 
If you've got a really low dollar-per-megawatt-hour utility-scale PPA, then efficiency is still worth something, but it might be worth less.” Projecting the longevity of products can be difficult, but Anza tries to factor that in using warranty information. If different manufacturers warranty their equipment for different lengths of time, that can be incorporated into financial models and will impact outcomes.
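
    Hall’s point about the value of efficiency and the domestic content bonus can be reduced to a simple screen. The sketch below is purely illustrative—the module prices, yields, PPA price, and balance-of-system cost are assumptions, not Anza data, and a real model would include degradation, discounting, and far more detail—but it shows how a pricier domestic-cell module can still come out ahead once the extra 10% credit is counted.

        # Hypothetical module screen on a $/W basis (all numbers are placeholders).
        MODULES = [
            {"name": "Module A (imported cell)", "price_per_w": 0.34,
             "yield_kwh_per_kw_yr": 1900, "domestic_content": False},
            {"name": "Module B (domestic cell)", "price_per_w": 0.41,
             "yield_kwh_per_kw_yr": 1880, "domestic_content": True},
        ]

        PPA_PRICE_PER_MWH = 55.0   # assumed power purchase agreement price
        PROJECT_LIFE_YEARS = 25
        BOS_COST_PER_W = 0.80      # balance-of-system cost, assumed identical for both modules
        ITC_BASE = 0.30            # base investment tax credit (assumed eligibility)
        DOMESTIC_BONUS = 0.10      # extra 10% for qualifying domestic content

        def net_value_per_watt(m: dict) -> float:
            capex = m["price_per_w"] + BOS_COST_PER_W
            itc = ITC_BASE + (DOMESTIC_BONUS if m["domestic_content"] else 0.0)
            # Undiscounted lifetime energy value per installed watt (no degradation),
            # kept deliberately simple for the illustration.
            lifetime_mwh_per_w = m["yield_kwh_per_kw_yr"] / 1000.0 * PROJECT_LIFE_YEARS / 1000.0
            return lifetime_mwh_per_w * PPA_PRICE_PER_MWH - capex * (1.0 - itc)

        for m in MODULES:
            print(f'{m["name"]}: net value of about ${net_value_per_watt(m):.2f}/W')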

    170. How Trump or Harris Would Alter the U.S.'s Energy and Power Landscape

    Play Episode Listen Later Aug 28, 2024 31:01


    A new U.S. president will be inaugurated in less than five months. Polls show the race between Donald Trump and Kamala Harris to be very close, with potentially only a few swing states deciding the election. While energy policy may not be a deciding factor for many Americans in choosing who they will vote for, it is very important to power industry professionals. With that in mind, Mary Anne Sullivan, senior counsel with the law firm Hogan Lovells, and Megan Ridley-Kaye, a partner with Hogan Lovells, were interviewed as guests on The POWER Podcast to discuss how the candidates might differ in their areas of focus after the election. Among the most pronounced differences is the rhetoric the two might espouse. “A Trump administration, I think, would talk a lot more about energy security, energy independence, and the need to be friendly to American-made fossil fuels,” Sullivan said. “A Harris administration, I assume, will follow in the footsteps of the Biden administration and focus on the need to respond to climate change and build on what have truly been unprecedented accomplishments under the Infrastructure Investment and Jobs Act and the IRA [Inflation Reduction Act],” she said. Although a Trump administration might seek to repeal all or at least parts of the IRA, Sullivan thought that would be hard to achieve. “I think recent indications are that it [the IRA] has now a fair bit of support in Congress,” she said. Ridley-Kaye agreed. “Obviously, key to what happens there [the fate of the IRA] is what happens in Congress,” she said. “It seems increasingly unlikely that it will be repealed.” And, while the government has made major investments that support energy and power projects, private parties have invested a lot of money too. At this point in the cycle, however, Ridley-Kaye suggested some of her clients are beginning to take a wait-and-see approach, especially if project economics are not viable without tax credits. Still, many other investors are unworried about the possibility of policy changes. “We do have a large group of clients that would say, ‘The train has left the station. Corporate America expects the tax credits. There’s no way that they would be taken away,’ ” Ridley-Kaye said. Meanwhile, there are some areas where the candidates may see eye to eye. “No matter which of them is elected, I think they will both recognize the need for more power transmission and more power generation,” said Sullivan. “Although the Biden administration has talked a good game about greening power generation, they have also very much pursued an all-of-the-above approach to generation resources. And I would expect that to continue in a Harris administration, just because there are so many new demands for electricity—the data centers, AI [artificial intelligence], vehicle electrification, the sort of ‘electrify everything’ movement that some people talk about,” she said. Two other areas where Trump and Harris might support similar policies are nuclear power, and carbon capture and storage. “The two administrations might have different motivations for pursuing that, but I think either one will support further technology development there,” Sullivan surmised. Sullivan would expect a more light-handed approach to regulation under a Trump administration, specifically, as applied to permitting energy infrastructure projects. “But that more light-handed regulation on permitting helps the carbon-free power projects as much as the carbon-intensive power projects. It cuts both ways,” she said.
Depending on how the election plays out, the energy and power landscape could change very quickly. “Trump's team seems much more ready to move on policy than it did when he ran the last time. I think they're thinking about it in advance. They're building a desired set of policies,” Sullivan said. “I do expect them to be more ready to move on their policy objectives.”

    169. Fuel Cells: What They Are, How They Work, and Why They're Important

    Play Episode Listen Later Aug 6, 2024 32:06


    Fuel cells are not a new technology. In fact, most history books credit the invention of the fuel cell to Welsh chemist and physicist William Grove, who, in the late 1830s and early 1840s, conducted experiments proving that electric current could be produced from an electrochemical reaction between hydrogen and oxygen over a platinum catalyst. Yet, fuel cells never really took off as a mainstream source of power. Why is that? “I think the real reason is, historically, we’ve been comfortable with less-clean, lower-efficient but less-expensive technologies, because we haven’t been as focused on air quality and on decarbonization as we currently are,” Tony Leo, executive vice president and Chief Technology Officer with FuelCell Energy, said as a guest on The POWER Podcast. However, as people have become more focused on air quality and climate change, Leo suggested fuel cells are now poised to take off. “That’s why you’re seeing such an acceleration in the deployment of fuel cells and that’s why you’re hearing more and more about them these days,” he said. A fuel cell is a device that makes electricity from fuel and air. Instead of burning the fuel to make heat to drive a mechanical generator, fuel cells react the fuel and air electrochemically, without combustion. The electrochemical approach avoids pollutants that are created by high flame temperatures, and it is a more direct and efficient way to make power from a fuel. Reacting fuel and air electrochemically involves delivering fuel to a set of negative electrodes (called anodes) and delivering air to a set of positive electrodes (called cathodes). The electrochemical reaction of fuel produces electrons. The electrochemical reaction of oxygen in air consumes electrons. Connecting the two produces a current of usable electrical power. Fuel cells are configured in stacks of individual cells connected in series. FuelCell Energy’s carbonate stacks have up to 400 cells per stack and produce between 250 kW and 400 kW of power. FuelCell Energy’s standard MW-scale module contains four stacks, nets about 1.4 MW of power, and can make electricity for sites such as universities, hospitals, and data centers. The modular design of fuel cell plants allows them to scale up to a specific site’s energy needs. “One big advantage is they’re quiet,” said Leo. “Since they don’t have a big spinning machine and this big spinning generator, they’re quiet compared to traditional power generation, so you can site them in population centers. We have a 15-MW fuel cell right in the middle of downtown Bridgeport, Connecticut, for example, and that just makes a really good neighbor.” The lack of harmful emissions is also a benefit. Another advantage is that while fuel cells are making electricity, they’re also making heat that can be used to produce hot water or steam, or to drive chilling operations. “That further enhances the sustainability because you get to avoid burning fuel in a boiler, for example, if you can use the heat coming off the fuel cell,” said Leo. Additionally, fuel cells don’t require a lot of maintenance or a large operations staff. “They’re unmanned—we monitor them remotely—and so, they take care of themselves and just generate value,” Leo explained.
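
    As a rough worked example of how the stack and module figures quoted above fit together (the parasitic-load fraction is our assumption for illustration, not a FuelCell Energy figure):

        # Back-of-the-envelope stack-to-module arithmetic.
        STACK_OUTPUT_KW = 400.0      # top of the quoted 250-400 kW per-stack range
        STACKS_PER_MODULE = 4

        gross_kw = STACK_OUTPUT_KW * STACKS_PER_MODULE     # 1,600 kW gross
        parasitic_fraction = 0.125                         # pumps, blowers, power conversion (assumed)
        net_kw = gross_kw * (1.0 - parasitic_fraction)     # about 1,400 kW

        print(f"Gross module output: {gross_kw:.0f} kW")
        print(f"Net module output:   {net_kw:.0f} kW (consistent with the ~1.4 MW quoted)")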

    168. Landrieu: Natural Gas Is ‘Not the Enemy, It Is Part of the Solution' to Achieving Climate Goals

    Play Episode Listen Later Jul 25, 2024 38:10


    Former U.S. Sen. Mary Landrieu (D-La.), who is now a senior policy advisor for the law firm Van Ness Feldman and co-chair of the Natural Allies Leadership Council, is keen on natural gas and believes it is part of the solution to reaching both domestic and global climate goals. “Natural gas in America is not the enemy,” Landrieu said as a guest on The POWER Podcast. “The majority of the emissions reductions of the United States in the last 10 years are directly attributed to more natural gas being used and less coal,” she said. Yet, that doesn't mean Landrieu is opposed to renewable energy. She believes in an “all-of-the-above” strategy. “As natural gas has replaced coal as the number one producer of electricity in this country, our emissions have been reduced substantially, that is, in addition and in collaboration with—in partnership with—the increase in wind [and] the increase in solar,” said Landrieu. There are many reasons to support natural gas, according to Landrieu. For one, America has a lot of it. “We have over a hundred-year supply,” she claimed. “Number two: we have an amazing pipeline infrastructure that can move gas from where we find it to the people that need it,” she added. “But also, what's so important is natural gas, because it's relatively inexpensive, we can keep the cost of electricity lower. So, it's available, it's plentiful, it's affordable, and when connected with wind and solar, we can really build a modern and low-emissions electric grid for the country.” Landrieu has a sound basis for her views, having served three terms in the U.S. Senate (1997–2015) where she chaired the prominent Senate Energy and Natural Resources Committee and she advocated for her home state of Louisiana, which is America's fourth-largest energy-producing state. Still, Landrieu pushes back when people suggest she only promotes natural gas because Louisiana produces it. “No, I promote natural gas because we produce it, but we also use a lot of it. So, my goal is to keep it plentiful [and] keep the price low and stable,” she said. Another form of energy that Landrieu supports is nuclear power. “Although our coalition doesn't promote nuclear, we recognize the power of nuclear power. We want to see more nuclear power in this country,” she said. “Nuclear provides about 18% of our electricity—it was about 20—if we could get that up to 25 or even 30%, it would really help. Natural gas can provide a lot, more wind, more solar, and as batteries come along, that's going to be, I think, the combination we're looking for.” The Natural Allies Leadership Council calls itself “a coalition of interested stakeholders that recognize the vital role natural gas and its infrastructure must play in the energy mix.” The group says natural gas partnered with renewable energy “can accelerate our path to a clean energy future—ensuring affordability and reliability while reducing carbon emissions domestically and internationally.” Landrieu co-chairs the group with Kendrick Meek (D-Fla.), who served southern Florida in Congress from 2002 to 2010; Michael Nutter, who served as Philadelphia's 98th Mayor from 2008 to 2016; and Tim Ryan (D-Ohio), who served 10 terms in Congress from 2003 to 2023. “We're talking to Democrats—we're happy always to talk with Republicans as well—but we're talking to Democratic leaders and saying, ‘If you want prices low, if you want your people employed, if you want jobs in your community, natural gas is for you.' 
And we're happy to partner with renewables, nuclear, batteries, and let's build a future together,” said Landrieu.

    167. Shifting from Coal to Gas: One Co-op's Award-Winning Journey

    Play Episode Listen Later Jul 17, 2024 50:38


    In 2018, Cooperative Energy, a generation and transmission co-op headquartered in Hattiesburg, Mississippi, had an issue to deal with. Several years earlier, it had joined the Midcontinent Independent System Operator (MISO), giving the power provider access to a competitive market. However, Cooperative Energy's R.D. Morrow Sr. Generating Station, a 400-MW two-unit coal-fired facility that had opened about 40 years earlier, was not being dispatched as the co-op would have liked. In fact, the facility's capacity factor in those days was running at only about 3%. “We could not compete in the MISO market due to the cost of the unit, the lack of flexibility, [and] startup time—when you're bidding the unit into a day-ahead market, a 42-hour startup time is not a good place to be,” Mark Smith, senior vice president of Power Generation with Cooperative Energy, explained as a guest on The POWER Podcast. Smith continued: “We had high transportation costs. Our coal came in by rail and the route from the mine to the plant was roughly 440 miles one way. So, the transportation cost was excessive. Environmental regulations—the goal post seems to keep moving and things keep ratcheting down—we didn't know where we were heading. At the point that we did decommission, we were well within compliance, but the future was uncertain. It was going to require a lot of capital investment in the coal unit.” With that as a backdrop, Cooperative Energy made the decision to build a new gas-fired unit to take the place of the coal units. Cooperative Energy took a somewhat unconventional approach for the project, utilizing many of its own people to manage the job, rather than opting for a turnkey EPC (engineering, procurement, and construction) contractor. “There were several reasons for us to choose what we call the multi-contract approach, as opposed to utilizing an EPC contractor,” Trey Cannon, director of Generation Projects with Cooperative Energy, said on the podcast. “Probably the one that was most important to us is just having that full transparency and full control of the entire project, including technology selections and equipment procurement, selection of construction contractors, and things of that nature,” Cannon explained. There was also a cost savings involved. “We estimated that we probably saved at least 15% on the total budget by utilizing the self-build self-manage approach,” said Cannon. The results were phenomenal. The project finished well ahead of schedule and well under budget. Yet, Cannon admitted that a lot of the savings was due to circumstances. “The market conditions and the timing of the project couldn't have been better,” he said. The market for power plants in 2018 was down, so Cooperative Energy was able to get very competitive pricing on the gas turbine and a lot of other equipment. As construction work kicked into full swing in 2020, the market took another dip with COVID and other factors pushing projects to the back burner. Cooperative Energy, however, pressed on and was able to cherry pick the best contractors and the best workers. To underscore how the project benefited from the quality of personnel it was able to attract, Smith noted, “The weld rejection rate for our mechanical contractor was 0.41%, which was remarkable.” Today, the repowered Morrow plant is the heavy-load-carrying unit in Cooperative Energy's fleet. “Since we went commercial, I think we're carrying a 90-plus-percent capacity factor on the unit,” said Cannon. 
“If it's not the most-efficient plant in MISO South, it's very close,” added Smith. “And, needless to say, if the unit is available—we're not in a planned outage—it's operating and it's typically baseloaded. In MISO, the name of the game is flexibility, efficiency, and reliability. The Morrow repower has checked all of those boxes for us and has Cooperative Energy in a great position for many years to come.”

    166. Analyst Says Nuclear Industry Is ‘Totally Irrelevant' in the Market for New Power Capacity

    Play Episode Listen Later Jul 8, 2024 43:13


    Nuclear power has consistently provided about 19% to 20% of total annual U.S. electricity generation since 1990. It provides significant amounts of electricity in many other countries as well. According to data from The World Nuclear Industry Status Report (WNISR), a total of 414 reactors were operating in 32 countries, as of July 1, 2024. Preliminary data says China generated the second-most electricity from nuclear power in 2023 (behind the U.S.), while France came in third and had the highest percentage share of national power generation from nuclear power at 65%. Many power industry experts and environmental activists consider nuclear power an important component in the world's transition to carbon-free energy. Yet, Mycle Schneider, an independent international analyst on energy and nuclear policy, and coordinator, editor, and publisher of the annual WNISR, said, “in [new] capacity terms, the nuclear industry, from what is going on, on the ground, is totally irrelevant.” Schneider was speaking as a guest on The POWER Podcast and prefaced his statement by comparing nuclear power additions to solar power additions in recent years. “Let's look at China, because China is the only country that has been massively building nuclear power plants over the past 20 years,” he said. “China connected one reactor to the grid in 2023—one gigawatt. In the same year, they connected, and the numbers vary, but over 200 gigawatts of solar alone. Solar power generates more electricity in China than nuclear power since 2022. And, of course, wind power generates more than nuclear power in China for a decade already,” Schneider said. Furthermore, he noted, the disparity has gone “completely unnoticed by the general public or even within the energy professionals that are in Europe or often also in North America.” Schneider said the media often gives the impression that the nuclear industry is booming, but the facts suggest otherwise. “Over the past 20 years—2004 to 2023—104 reactors were closed down and 102 started up,” Schneider said. “But here is important that almost half, 49 of those new reactors started, were in China [where none closed], so the balance outside China is minus 51.” Some nuclear advocates might suggest that things are changing. They might argue that small modular reactors (SMRs) or other advanced designs are poised to reinvigorate the industry. But Schneider disagrees. He noted that since the construction start of the second unit at Hinkley Point C in the UK in 2019—almost five years ago—there have been 35 nuclear project construction starts in the world. Twenty-two of those were in China and the other 13 were all implemented by the Russian nuclear industry in a few different countries. “Nothing else. Not an SMR here or an SMR there, or a large reactor here or a large reactor there by any other player,” reported Schneider. Schneider noted that the vast majority of new capacity being added to the grid is from solar and wind energy. “These guys are building tens of thousands of wind turbines, and literally hundreds of millions of solar cells, so the learning effect is just absolutely stunning,” he said. “On the nuclear side, we're talking about a handful. That's very difficult. Very, very difficult—very challenging—to have a learning effect with so few units.” Schneider said the nuclear discussion in general needs a “really thorough reality check.” He suggested the possibilities and feasibilities must be investigated. “Then, choices can be made on a solid basis,” he said.
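
    For readers following the arithmetic in Schneider’s tally, the quoted figures check out as follows:

        # Reactor start-ups vs. closures, 2004-2023, using the figures quoted above.
        startups_world, closures_world = 102, 104
        startups_china, closures_china = 49, 0

        startups_outside_china = startups_world - startups_china   # 53
        closures_outside_china = closures_world - closures_china   # 104
        print(startups_outside_china - closures_outside_china)     # -51, the "minus 51" balance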

    165. How to Improve U.S. Power Distribution System Reliability

    Play Episode Listen Later Jun 13, 2024 17:47


    The U.S. Energy Information Administration (EIA) reports SAIDI (System Average Interruption Duration Index) and SAIFI (System Average Interruption Frequency Index) values in its Electric Power Annual report, which is regularly released in October each year. In the most recent report, the U.S. distribution system’s average SAIDI value including all events was 335.5 minutes per customer in 2022. If major event days were excluded, which is often a worthwhile exercise to get accurate long-term trends because hurricanes and severe winter storms, for example, can skew the numbers quite dramatically in a given year, the figure dropped to 125.7 minutes per customer. Notably, this is the highest SAIDI value tallied in the past decade, and it continued what has effectively been a steady year-over-year decline in performance from 2013 through 2022. (2017 saw a brief improvement over 2016, but every year before and since has been worse than the previous year during the timespan covered by the report.) For comparison, in 2013, the SAIDI value was 106.1 minutes per customer. SAIFI values do not vary as noticeably as SAIDI, but still have been worsening. In 2022, the U.S. distribution system’s average SAIFI value including all events was 1.4 power interruptions per customer. With major events excluded, SAIFI was 1.1 interruptions per customer in the U.S. While this was not substantially worse than values reported in other years over the past decade (every year from 2013 onward has been 1.0, except for 2016 when the value was also 1.1), it seems to confirm that the system hasn’t been improving. Yet, Mike Edmonds, Chief Operating Officer for S&C Electric Company, said several things can be done to improve the reliability and resiliency of the power distribution system. “The grid looks different depending on what state you’re in,” Edmonds said as a guest on The POWER Podcast. “We’ve got great experience with Florida Power & Light [FPL],” he said. “We’ve helped them create a resilient grid. So, that’s not only a grid that is reliable, but a grid that can actually weather the storms and all the challenges thrown at the grid.” Notably, FPL reported in March that it had provided “the most reliable electric service in company history in 2023.” Over the past two decades, FPL said its customers have realized a remarkable 45% improvement in reliability. In its Sustainability Report 2023, NextEra Energy (the parent company of FPL) reported FPL’s SAIDI was 47.1 and SAIFI was 0.85, confirming markedly better results than the U.S. averages noted earlier. Furthermore, FPL said this is the ninth time in the past 10 years that it achieved “its best-ever reliability rating.” To better understand some of the innovative new equipment S&C Electric Company offers, Edmonds provided an example. “We have some technology that does something called ‘pulse finding,’ and what Florida Power & Light does, it just lets our equipment do what it does best. If there’s a problem, it’ll pulse to see if the problem is there or not on the grid, if it’s not, it reenergizes,” he said. “This technology is available to really change how the grid operates.” Edmonds said S&C Electric Company invented the fuse 115 years ago, and he noted fuses have served the industry well since that time. However, today there is better technology available that doesn’t require a lineworker to respond to an outage to replace a fuse. “Let’s take fuses off the grid and have a fuseless grid, and have much more intelligent devices that can actually re-energize,” Edmonds declared.
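
    For readers unfamiliar with the two indices, SAIDI is total customer-minutes of interruption divided by the total number of customers served, and SAIFI is total customer interruptions divided by customers served. The short sketch below computes both from a hypothetical outage log (the records are invented for illustration):

        # Minimal SAIDI/SAIFI calculation from a made-up outage log.
        OUTAGES = [
            {"customers_affected": 1_200, "duration_min": 95},
            {"customers_affected": 400,   "duration_min": 310},
            {"customers_affected": 2_500, "duration_min": 42},
        ]
        TOTAL_CUSTOMERS_SERVED = 50_000

        saidi = sum(o["customers_affected"] * o["duration_min"] for o in OUTAGES) / TOTAL_CUSTOMERS_SERVED
        saifi = sum(o["customers_affected"] for o in OUTAGES) / TOTAL_CUSTOMERS_SERVED

        print(f"SAIDI: {saidi:.1f} minutes per customer")        # 6.9 for this toy log
        print(f"SAIFI: {saifi:.2f} interruptions per customer")  # 0.08 for this toy log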

    164. Why the U.S. Government Should Fund Cybersecurity Efforts to Protect Power Grid

    Play Episode Listen Later Jun 6, 2024 25:38


    FBI Director Christopher Wray, while speaking at the Vanderbilt Summit on Modern Conflict and Emerging Threats in Nashville, Tennessee, in April, warned that U.S. critical infrastructure is a prime target of the Chinese government. “The fact is, the PRC's [People's Republic of China's] targeting of our critical infrastructure is both broad and unrelenting,” he said. Wray also noted that the immense size and expanding nature of the Chinese Communist Party's hacking program isn't just aimed at stealing American intellectual property. “It's using that mass, those numbers, to give itself the ability to physically wreak havoc on our critical infrastructure at a time of its choosing,” he said. Wray noted that during the FBI's recent Volt Typhoon investigation, the Bureau found that the Chinese government had gained illicit access to networks within America's “critical telecommunications, energy, water, and other infrastructure sectors.” Some cybersecurity experts have likened this activity to an act of war, although NATO hasn't defined it as such just yet. In any case, it is a serious threat to national security. “In this country, critical infrastructure is operated by the private sector, most of which are publicly traded companies,” said Alex Santos, CEO of Fortress Information Security, a company that specializes in cyber supply chain security for organizations that operate critical infrastructure including utilities and government agencies. Santos was speaking as a guest on The POWER Podcast. “Somehow, the private sector has taken on the responsibility to defend these acts of war, which I was always taught is the responsibility of the government,” he said. “I think what's really the point here is that the government is asking us to do more. We're being attacked more by the adversaries. Regulations are coming in. It's becoming more and more complicated with technology change. And, our budgets are being cut,” said Santos. Thus, while Wray can be commended for pointing out the national security problem Chinese hackers present to critical infrastructure, his words fall flat if the government doesn't put its money where its mouth is, Santos suggested. That's not to say money isn't being spent by the U.S. government. “The government is spending a lot on cybersecurity to help companies, but it's going to research and universities,” Santos said. “How many research studies do we need to tell us that cybersecurity is a problem? How many research studies do we need to tell us that we don't have enough cybersecurity workers? How much research do we need to give us 10 recommendations for how to increase the capability of our cybersecurity workforce? At some point, we need to actually do the work.” Santos suggested money could be better spent helping companies repair vulnerabilities or by getting small businesses to install basic security precautions like endpoint protection and network monitoring. “Does the government study how to build a tank or do they build tanks?” Santos asked rhetorically. “The government builds tanks and they buy bullets,” he answered. “So, think of it that way. We need to buy more tanks and bullets, and less research studies on which tanks, how many tanks, what kind of tanks—tanks with wheels, tanks with tracks—you know, let's buy some tanks,” he said.

    163. Effective Training and Mentoring Programs Are Critical to Power Project Success

    Play Episode Listen Later May 30, 2024 17:01


    The power industry has long been lamenting its aging workforce. While turnover has been happening for years, there remains a large percentage of power professionals on the verge of retirement. Furthermore, the U.S. Bureau of Labor Statistics predicts faster than average job growth for engineering occupations. That means experienced workers with the skills needed by the power industry are in high demand and can be choosy when looking for new opportunities. They can also demand higher compensation to make a change. Meanwhile, relative youngsters coming out of college and trade schools, while often having the fundamental knowledge to do power jobs, don't usually have the experience needed to add immediate value to an organization. The situation is forcing companies to implement workforce development strategies. Mechanical Dynamics & Analysis (MD&A) is a company that offers a full-service alternative to original equipment manufacturer services, parts, and repairs for steam, gas, and industrial turbines and generators. Like other power industry companies, MD&A has found it challenging to recruit experienced engineers. “When we started out back in the early 80s, we started out as a company who tended to hire engineers who were very experienced. And back around 2009, we started to realize that those people were becoming a little harder to find,” Charles Monestere, general manager for Technical Services with MD&A, said as a guest on The POWER Podcast. “So, we started hiring a few engineers a year—some years one person, some years two or three people, maybe even a little bit more—and we developed an in-house program where we would bring in generally recent graduates, within a year or two or three out of school, and put them through some classroom training, but then a structured on-the-job training where we would have weekly meetings reviewing the activities on the job sites,” he explained. “And we'd put the young engineers with very experienced project managers and technical directors that are at the sites—the field engineers who have been doing this for many years.” Called the Engineers in Training (EIT) program, the instruction tasked learners with becoming proficient at and gaining knowledge on many different technical aspects of the job. “A good part of the work is on the job sites; however, there is some structured classroom training, which is integrated into it,” Monestere said. In recent years, finding experienced people has become even more difficult, leading MD&A to increase its hiring into the EIT program. “We're actually targeting about 10 people a year now,” said Monestere. “We're just hiring in five more this summer, and then, probably another five or so at the end of the year. So, that's the direction we're heading.” Colin Baker, one of MD&A's newest field engineers, participated in the program and found it very worthwhile. “Working with all these really great and really smart engineers, you get all of their experience firsthand, and you learn what's right and what's wrong,” he said. “Also, with all these classes that you're put through, you use all of that knowledge and you learn where to apply it when you're actually out in the field.” Meanwhile, Baker said the program also offered him an opportunity to network within the industry and in the company. Baker said he now has multiple experts he can contact when he runs into problems. “Especially with MD&A, you can always reach out to anyone for help. 
Everyone is pretty much readily available for any kind of questions or something of that matter,” he said. “I'm still very new in the industry and I'm not going to know everything. I know people who do know most things, so it's good to get these kinds of resources.”

    162. How PG&E Is Reducing Wildfire Risks Using Satellite Imagery

    Play Episode Listen Later May 20, 2024 47:59


    Wildfires have had a devastating impact on California and on the state’s largest utility company, Pacific Gas and Electric (PG&E). Potential wildfire liabilities exceeding $30 billion led PG&E to file for bankruptcy in January 2019. The company emerged from bankruptcy on July 1, 2020, with a renewed focus on mitigating wildfires within its 70,000-square-mile service territory in northern and central California. “A lot has changed,” Andy Abranches, senior director of Wildfire Preparedness and Operations with PG&E, said as a guest on The POWER Podcast. “We really saw the devastation that could occur from these wildfires, and so, that was the point that PG&E started really making a big pivot to addressing the wildfire risk. The way we address the wildfire risk is really through what we consider our layers of protection. We started initially learning as much as we could from San Diego Gas and Electric [SDG&E], and put in place the public safety power shutoff program.” High-fire-threat district maps were important in understanding risks. About half of PG&E’s service territory falls in high-fire-threat areas. “We have 25,000 distribution miles that run through the high-fire-threat districts and 5,000 transmission miles,” said Abranches. Vegetation plays a critical role in the risk, and while precisely quantifying the number of trees in and around those risky transmission and distribution lines is difficult, Abranches estimated it’s in the range of eight to 10 million. With such a large area and so many trees to monitor, PG&E turned to Planet Labs, a San Francisco-based provider of global, daily satellite imagery and geospatial solutions, for help. Planet’s satellite-derived data on vegetation, including canopy height, cover, and proximity to electric-system infrastructure, is used by PG&E to prioritize the mitigation of vegetation-associated risks.

    Quantifying Threats and Consequences

    Abranches explained PG&E’s risk characterization process by likening it to a bowtie. “The first part of your risk bowtie is: ‘How do you quantify and in a probabilistic way build a risk model to predict ignitions are going to happen?’ ” He noted that the biggest source of ignitions is through contact with vegetation, such as a tree falling on a line or a branch coming into contact with a line on a windy day, but birds and other animals can also cause ignitions. “The second half of the bowtie is the consequence,” said Abranches. “If an ignition occurs at a particular location, if the vegetation around it is just not there, that ignition will never spread.” The fire triangle requires heat (or a spark), oxygen, and fuel. The fuel is the vegetation bed around the line where the ignition event occurs. If there happens to be a lot of dry fuel, that’s when an ignition becomes a wildfire. Depending on the oxygen, which can be heavily influenced by wind conditions, it could become a catastrophic fire, Abranches explained. “As we built our risk models, you needed to understand the vegetation dimension on two levels. One level is for probability of ignitions: ‘How do we get better at predicting where we expect vegetation ignitions to occur?’ And the data that we’re able to get from Planet every year helps improve and keeps those models updated,” said Abranches. “The second piece of it is the consequence of the ignition—understanding the fuel layer. That also—data from Planet—helps inform and continually refreshes that information to make sure it’s most current.
So, the risk model actually uses the Planet data on both sides of the bowtie, because it's probability of ignition times the consequence of ignition gives you the risk event.”
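
    A stripped-down version of the bowtie arithmetic Abranches describes might look like the sketch below. The segments, probabilities, and weights are invented for illustration—PG&E’s actual models are far richer—but the structure is the same: for each asset, multiply the modeled probability of ignition by the modeled consequence, then rank.

        # Toy risk bowtie: risk = P(ignition) x consequence if ignited.
        # Both factors would be informed by vegetation data (canopy height, cover,
        # proximity to conductors) such as the satellite-derived layers described above.
        SEGMENTS = [
            {"id": "seg-001", "p_ignition_per_yr": 0.004, "fuel_index": 0.9, "structures_nearby": 120},
            {"id": "seg-002", "p_ignition_per_yr": 0.010, "fuel_index": 0.2, "structures_nearby": 15},
            {"id": "seg-003", "p_ignition_per_yr": 0.002, "fuel_index": 0.8, "structures_nearby": 300},
        ]

        def consequence(seg: dict) -> float:
            """Assumed consequence score: dry fuel around the ignition point,
            scaled by how much is exposed nearby (proxied here by structures)."""
            return seg["fuel_index"] * seg["structures_nearby"]

        ranked = sorted(SEGMENTS, key=lambda s: s["p_ignition_per_yr"] * consequence(s), reverse=True)
        for seg in ranked:
            print(f'{seg["id"]}: risk score {seg["p_ignition_per_yr"] * consequence(seg):.2f}')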

    161. How Regulatory Burdens and Misguided Incentives Are Degrading Power System Reliability

    Play Episode Listen Later May 14, 2024 29:21


    It's no secret that the U.S. electric power system has undergone a remarkable transition that continues today. Coal-fired generation, which was the leading source of power generation during the 20th century, often providing more than half of the country's electricity supply, fell to about 16.2% of the mix in 2023. Meanwhile, the U.S. solar market installed 32.4 GWdc of electricity-generation capacity last year, a 51% increase from 2022, and the industry's biggest year by far, exceeding the 30-GWdc threshold for the first time. Solar accounted for 53% of all new electricity-generating capacity added to the U.S. grid in 2023, far greater than natural gas and wind, which were second and third on the list, accounting for 18% and 13% of new additions, respectively. But, how is the shift in resources affecting power system reliability? Some experts say it's not good. “We've got a lot of warning lights that appear to be flashing today,” Todd Snitchler, president and CEO of the Electric Power Supply Association (EPSA), said as a guest on The POWER Podcast. “I say that not just from our perspective, but from NERC [the North American Electric Reliability Corp.]—the reliability coordinator—or from FERC [the Federal Energy Regulatory Commission], who has also expressed concerns, and all of the grid operators around the country have raised concerns about the pace of the energy transition.” EPSA is the national trade association representing America's competitive power suppliers. It believes strongly in the value of competition and the benefits competitive markets provide to power customers. “Our members have every incentive to be the least-cost, most-reliable option that's available, because if you are that resource, you're going to be the resource that's selected to run,” said Snitchler. Yet, not all markets are providing a level playing field, according to Snitchler. “The challenge we're seeing is that there are a number of resources that are either having regulatory burdens that are placed on them that make them less competitive in comparison to resources that are not facing the same challenges, or there are resources that are highly subsidized, and as a result of those subsidies, it creates an economic disadvantage to unsubsidized resources, and that puts economic pressure on units that would otherwise be able to run and would earn a sufficient amount of revenue to remain on the system,” he explained. “We're also seeing a pretty significant acceleration in retirements off of the system of dispatchable resources,” Snitchler continued. “What does that mean? So, of course, it means the coal plants that have been on the system for decades, as a result of economics and environmental policies, are retiring and moving off of the system. You're seeing some of the older gas units experience the same kind of financial and regulatory pressures, and that is forcing some of them off of the system. And we're seeing a large penetration of new renewable resources come onto the system that, frankly, are good energy resources, but don't have the same performance characteristics that the dispatchable resources have. “And so, we're having to fill a gap, or as I call it, the delta between aspirational policy goals and operational realities of the system, because too much retirement of dispatchable resources without sufficient resources that can replicate or deliver the same types of services that those dispatchable resources can provide, creates reliability concerns,” said Snitchler.

    160. How Grid Enhancing Technologies Are Expanding Electric Power Transmission System Capabilities

    Play Episode Listen Later May 8, 2024 17:13


    It's no secret that power grids around the world need to expand to accommodate more renewable energy and the so-called “electrification of everything.” The latter, of course, refers to the growing trend of using electricity to power various sectors and applications that have traditionally relied on fossil fuels, such as natural gas or petroleum-based products. The electrification of everything includes the push toward electric vehicles; the transition from fossil fuel–based heating and cooling systems to electric alternatives, as well as the adoption of electric appliances; and the shift to more electric motors, furnaces, and other electric-powered equipment in manufacturing processes. Add to that the expected power needed to supply data centers and the growth of artificial intelligence-related computing, and current estimates of 50% load growth by 2050 could be vastly understated. Yet, getting new transmission lines planned, approved, and constructed is a daunting task, often taking a decade or longer to complete. So, how can the world more quickly add transmission capacity to the system without investing enormous time and money in the process? The answer: grid enhancing technologies, or GETs. “GETs are exciting to us because they are technologies that help us unlock quickly the additional headroom or additional capability of the grid to carry energy across the system,” Alexina Jackson, vice president of Strategic Development with AES Corp., said as a guest on The POWER Podcast. “This is something that is very important, because today, we are not making the fullest use of the electricity system as it's built.” The system is operated below its maximum capacity for very good reasons, specifically, to maintain reliability, but by implementing GETs, it can be operated closer to its true limits without risk of failure. “Once we have these technologies, such as dynamic line rating, which helps us visualize the dynamic and full headroom of the electrical grid, and then technologies like storage as transmission, advanced power flow control, topology optimization—they all allow us to operate the grid in its dynamic capability. By doing both these things—visualization and operation dynamically—we're able to start making fuller use of that carrying capacity for energy, which will allow us to add additional energy more quickly, serve our customer needs more efficiently, and ultimately decarbonize faster,” Jackson said. To read AES's white paper, visit: https://www.aes.com/sites/aes.com/files/2024-04/Smarter-Use-of-the-Dynamic-Grid-Whitepaper.pdf

    159. Navigating the Interconnection Queue Is One of Many Challenges Clean-Energy Projects Face

    Play Episode Listen Later Apr 19, 2024 26:45


    There are several obstacles to overcome when building a clean-energy project, but perhaps the biggest is getting through the generator interconnection queue (GIQ). Every regional transmission organization (RTO) and independent system operator (ISO) in the U.S. has a significant backlog in its GIQ, and processing interconnection requests can take years to complete. This has created a significant barrier to deploying renewable energy, as companies often face long wait times and high costs for new transmission lines and other upgrades when the local grid is near or at capacity. Part of the problem is the complexity of the interconnection process, which involves multiple studies. The Midcontinent Independent System Operator (MISO) reports that historically about 70% of projects submitted to its queue ultimately withdraw, resulting in extensive rework and delays, as studies must be redone when projects withdraw. MISO recognizes change is necessary and has implemented some reforms. On Jan. 19, 2024, the Federal Energy Regulatory Commission (FERC) accepted MISO’s filing (ER24-340) to increase milestone payments, adopt an automatic withdrawal penalty, revise withdrawal penalty provisions, and expand site control requirements. These provisions were designed to help expedite the GIQ process, and maximize transparency and certainty. MISO said the filing was developed through extensive collaboration in the stakeholder process, including multiple discussions in the Planning Advisory Committee and Interconnection Process Working Group. MISO expects these reforms to reduce the number of queue requests withdrawing from the process. It said the fewer projects in studies, the quicker the evaluations can be completed, and the fewer projects that withdraw, the more certain phase 1 and 2 study results are. Still, it’s likely that more needs to be done to improve the GIQ process. The Clean Grid Alliance (CGA), a nonprofit organization that works to advance renewable energy in the Midwest, conducted a survey of 14 clean energy developers who’ve had solar, wind, hybrid, and battery storage projects in the MISO interconnection queue over the last five years to better understand the challenges they’ve faced. Aside from interconnection queue challenges, the CGA survey also identified other hindrances to clean-energy project development. Beth Soholt, CGA’s executive director, explained that a lot of development work is done face to face. COVID prevented that, which was a big problem that had a ripple effect. Some leases that developers had negotiated began to expire, so they had to go back out to communities and renegotiate. “Siting in general is getting more difficult, as we do more volume, as we do transmission in the MISO footprint,” said Soholt. “We need new generation to be sited, we need new transmission, and we have to find a pathway forward on that community acceptance piece,” she said. Among other challenges, Soholt said some projects saw generator interconnection agreements (GIAs) timing out and needing MISO extensions. Meanwhile, transmission upgrade delays also presented problems, not only the large backbone transmission upgrades, but also the transmission owners building interconnections for individual projects to connect breakers, transformers, and other equipment. Soholt said longer and longer component lead times presented timing challenges, which were also problematic for developers. These were all important takeaways from the CGA survey, and items the group will work to resolve.
Yet, for all the difficulties, Soholt seemed optimistic that MISO would continue to find ways to improve the process. “When we get overwhelmed, we really step back and say, ‘What's going to be the best thing to work on to really make a difference?' So far, that really has been the big things like transmission planning. We feel good about where that's at in MISO—they are doing good long-range planning,” Soholt said.

    158. Molten Salt Reactor Technology Solves Several Nuclear Industry Problems

    Play Episode Listen Later Apr 9, 2024 37:44


    Today, molten salt reactors (MSRs) are experiencing a resurgence of interest worldwide, with numerous companies and research institutions actively developing various designs. MSRs offer several potential advantages, including enhanced safety, reduced waste generation, and the ability to utilize thorium as a fuel source. “There are several molten salt reactor companies that are in the process of cutting deals and getting MOIs [memorandums of intent] with foreign countries,” Mike Conley, author of the book Earth Is a Nuclear Planet: The Environmental Case for Nuclear Power, said as a guest on The POWER Podcast. Conley is a nuclear energy advocate and strong believer in MSR technology. He called MSRs “a far superior reactor technology” compared to light-water reactors (LWRs). The thorium fuel cycle is a key component in at least some MSR designs. The thorium fuel cycle is the path by which fertile thorium is transmuted into uranium fuel ready for fission. Thorium-232 (Th-232) absorbs a neutron, becoming Th-233. Th-233 beta decays to protactinium-233 (Pa-233), which in turn undergoes a second beta-minus decay to become uranium-233 (U-233). This is how natural, abundant Th-232 is turned into something fissionable. Because U-233 is not found in nature yet makes an ideal nuclear reactor fuel, the thorium cycle is much sought after. “The best way to do this is in a molten salt reactor, which is an incredible advance in reactor design. And the big thing is, whether you're fueling a molten salt reactor with uranium or thorium or plutonium or whatever, it's a far superior reactor technology. It absolutely cannot melt down under any circumstances whatsoever period,” said Conley. Conley suggested that most of the concern people have about nuclear power revolves around the spread of radioactive material. Specifically, no matter how unlikely it is, if an accident occurred and contamination went airborne, the fact that it could spread beyond the plant boundary is worrisome to many people who oppose nuclear power. “The nice thing about a molten salt reactor is: if a molten salt reactor just goes belly up and breaks or gets destroyed or gets sabotaged, you'll have a messed-up reactor room with a pancake of rock salt on the floor, but not a cloud of radioactive steam that's going to go 100 miles downwind,” Conley explained. And the price for an MSR could be much more attractive than the cost of currently available GW-scale LWR units. “The ThorCon company is predicting that they will be able to build for $1 a watt,” said Conley. “That's one-fourteenth of what Vogtle was,” he added, referring to Southern Company's nuclear expansion project in Georgia, which includes two Westinghouse AP1000 units. Of course, projections do not always align with reality, so MSR pilot projects will be keenly watched to validate claims. There is progress being made on MSR projects. For example, in February 2022, TerraPower and Southern Company announced an agreement to design, construct, and operate the Molten Chloride Reactor Experiment (MCRE)—the world's first critical fast-spectrum salt reactor—at Idaho National Laboratory (INL). Since then, Southern Company reported successfully commencing pumped-salt operations in the Integrated Effects Test (IET), signifying a major achievement for the project. The IET is a non-nuclear, externally heated, 1-MW multiloop system, located at TerraPower's laboratory in Everett, Washington.
“The IET will inform the design, licensing, and operation of an approximately 180-MW MCFR [Molten Chloride Fast Reactor] demonstration planned for the early 2030s timeframe,” Southern Company said.
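For readers who want to trace the transmutation steps Conley describes, here is a minimal, illustrative Python sketch of the Th-232-to-U-233 breeding chain. The half-lives are approximate published values and the script simply converts them into decay constants; it is not drawn from the podcast or from any MSR vendor's design data.

    import math

    # Th-232 -> U-233 breeding chain as described above. Half-lives are
    # approximate published values (Th-233 ~21.8 minutes, Pa-233 ~27 days).
    CHAIN = [
        ("Th-232", "absorbs a neutron", None),
        ("Th-233", "beta-minus decay", 21.8 / 60.0 / 24.0),   # half-life in days
        ("Pa-233", "beta-minus decay", 27.0),                 # half-life in days
        ("U-233",  "fissile product", None),
    ]

    def decay_constant_per_day(half_life_days):
        """lambda = ln(2) / t_half, the fraction decaying per unit time."""
        return math.log(2) / half_life_days

    for nuclide, step, t_half in CHAIN:
        if t_half is None:
            print(f"{nuclide:7s} {step}")
        else:
            lam = decay_constant_per_day(t_half)
            print(f"{nuclide:7s} {step}: half-life {t_half:.3f} days "
                  f"(decay constant {lam:.3f}/day)")

The roughly 27-day half-life of Pa-233 is why the chain takes weeks, not minutes, to yield usable U-233, regardless of the reactor design around it.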

    157. How Utilities Are Planning for Extreme Weather Events and Mitigating Risks

    Play Episode Listen Later Mar 13, 2024 21:58


    In mid-January, scientists who maintain the world's temperature records announced that 2023 was the hottest year on record. NASA researchers say extreme weather across the planet, including heat extremes, wildfires, droughts, tropical cyclones, heavy precipitation, floods, high-tide flooding, and marine heat waves, will become more common and severe as the planet warms. That's a big problem for power grids, because extreme weather often causes outages and damage to grid assets. Michael Levy, U.S. Networks lead and Global Head of Asset Resilience at Baringa Partners, a global management consulting firm, is highly focused on extreme weather risks and developing plans to help mitigate the threats. He suggested accurately forecasting dollars of risk at the asset level from extreme weather events is very important to his clients. “Every facility all across the U.S. is having a heightened awareness of some of these extreme weather events, and more importantly, how they can protect themselves and their customers against those in the future,” Levy said as a guest on The POWER Podcast. “Utilities have always been really good, generally, at keeping the lights on and maintaining a fair level of reliability,” said Levy. “In general, they're making the right investments—they have the right ambitions—but what's challenging about these extreme weather events is that because they're so infrequent at individual locations, and the impacts are so severe, what we find is that utility clients often are really challenged to estimate those high-impact, low-frequency events, and integrate them into their investment plans.” However, Levy said advances in attribution climate science are helping utilities overcome some of the challenges. “Scientists are now able to associate, with reasonable level of accuracy, what increasing warming means physically for the rest of the world in terms of how the frequency and severity of these extreme weather events may change,” he explained. “One of the big things that we focus on with our utility clients is converting those climate forecasts into dollars of risk, and that way, it gives them an adjustable baseline that they can substantiate spend against,” said Levy. “If you're undergrounding lines to protect them against wildfire, elevating substations to protect them against flooding, all of those things cost money, and we're increasingly seeing regulators—they want to see the benefits, they want to see that the money is being spent prudently. So, that's what we're talking to our clients about today,” he said. And utilities have proven that sound planning does pay off. Levy pointed to actions taken in Florida following particularly active and intense hurricane seasons in 2004 and 2005. Soon thereafter, the Florida Public Service Commission adopted extensive storm hardening initiatives. Wooden pole inspection and replacement programs were adopted, and vegetative remediation solutions were implemented, vastly improving grid reliability. Additionally, investor-owned electric utilities were ordered to file updated storm hardening plans for the commission to review every three years. However, the proof is in the pudding, and for Florida, grid hardening has tasted very good. Levy compared the effects experienced from Hurricane Michael in 2018 to those of Hurricane Ian in 2022. 
“When Ian came, despite being a bigger and stronger hurricane, they had no transmission lines down, which, of course, are very costly and time intensive to replace, and they were able to restore customers three times as fast, despite having more customers out. So, they're experiencing what we like to call at Baringa ‘the rewards of resilience,' because investing in resilience is a fraction of restoration costs,” said Levy.
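The “dollars of risk” idea Levy describes can be illustrated with a back-of-the-envelope expected-annual-loss calculation. The sketch below is not Baringa's model; the hazard classes, event frequencies, and damage costs are hypothetical placeholders meant only to show how frequency and severity combine into an annual dollar figure for a single asset.

    # Hypothetical hazards for one substation: (events per year, expected damage per event in $).
    # The numbers are illustrative placeholders, not real forecasts.
    hazards = {
        "flooding":    (0.02, 4_000_000),   # a 1-in-50-year event
        "wildfire":    (0.01, 9_000_000),
        "severe wind": (0.25,   300_000),
    }

    def expected_annual_loss(hazard_table):
        """Sum of frequency x severity across hazards, in dollars per year."""
        return sum(freq * damage for freq, damage in hazard_table.values())

    baseline = expected_annual_loss(hazards)

    # A climate-adjusted scenario might scale frequencies upward, e.g. flooding twice as likely.
    adjusted = dict(hazards, flooding=(0.04, 4_000_000))
    print(f"baseline risk:         ${baseline:,.0f}/yr")
    print(f"climate-adjusted risk: ${expected_annual_loss(adjusted):,.0f}/yr")

Scaling the event frequencies for projected warming is what turns a static estimate into the “adjustable baseline” Levy says utilities can use to substantiate resilience spending.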

    156. Community Solar Projects Bring Renewable Energy to the Masses

    Play Episode Listen Later Mar 7, 2024 27:00


    The National Renewable Energy Laboratory (NREL) explains that community solar, also known as shared solar or solar gardens, is a distributed solar energy deployment model that allows customers to buy or lease part of a larger, off-site shared solar photovoltaic (PV) system. It says community solar arrangements allow customers to enjoy advantages of solar energy without having to install their own solar energy system. The U.S. Department of Energy says community solar customers typically subscribe to—or in some cases own—a portion of the energy generated by a solar array, and receive an electric bill credit for electricity generated by their share of the community solar system. It suggests community solar can be a great option for people who are unable to install solar panels on their roofs because they are renters, or because their roofs or electrical systems aren't suited to solar. The Solar Energy Industries Association (SEIA) reports 6.5 GW of community solar capacity has been installed in the U.S. through the 1st quarter of 2024. Furthermore, SEIA predicts more than 6 GW of community solar capacity will be added over the next five years. It says 41 states, plus the District of Columbia, have at least one community solar project online. “These programs are very attractive and provide a lot of benefit to a whole range of consumers,” Nate Owen, CEO and founder of Ampion, said as a guest on The POWER Podcast. Ampion currently manages distributed generation projects for developers in nine states, with new states being added as more programs become active. “It's fundamentally a different way of developing energy assets,” Owen said. “These things [community solar farms] are their own asset class. They produce a very significant value because they are generally located closer to load, and so, they fortify and strengthen local distribution networks quite a bit. And right now, they are very popular—there's quite a bit of development going on in states across the country that have put programs in place.” Owen specifically mentioned Colorado, Illinois, Maine, Maryland, Massachusetts, Minnesota, New Jersey, and New York as states with active community solar programs. “There's a lot of activity going on in a lot of states right now,” he said. According to Owen, community solar saves customers money. “The contract structure of community solar means that, ultimately, everybody's guaranteed savings,” he said. “Nearly every community solar contract we've ever done has been provided at a percent off the value of the utility bill credit. So, at its essence, we are selling dollars' worth of utility bill credits for 90 cents, and so, you automatically save money.” Contract terms often vary from project to project and state to state. “I think residential customers these days are generally signing contracts that are at least a year, if not three or five in some cases,” explained Owen. He noted that some states, such as Maine and New York, have a statutory 90-day termination notice clause for residential customers, so it doesn't really matter how long the term is because subscribers have the right to terminate deals when they choose. In such cases, Owen said the “replaceability feature” of community solar is vital to success. “We can drop a customer and replace them—and we do,” he said.
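Owen's “selling dollars' worth of utility bill credits for 90 cents” structure is easy to see with a small worked example. The subscription size and 10% discount below are hypothetical and will vary by program and contract.

    # Illustrative numbers only: a subscriber allocated $100/month of utility bill
    # credits who pays the community solar provider 90% of that credit value.
    monthly_credit_value = 100.00   # dollars of bill credits from the subscriber's share
    discount = 0.10                 # "10% off" contract structure

    payment_to_provider = monthly_credit_value * (1 - discount)
    net_monthly_savings = monthly_credit_value - payment_to_provider

    print(f"pay ${payment_to_provider:.2f} for ${monthly_credit_value:.2f} of credits "
          f"-> save ${net_monthly_savings:.2f}/month (${net_monthly_savings * 12:.2f}/year)")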

    155. Improving Nuclear Plant Construction Processes: How to Build Projects More Efficiently

    Play Episode Listen Later Feb 15, 2024 30:32


    If you have paid any attention to nuclear power plant construction projects over the years, you know that there is a long history of cost overruns and schedule delays on many of them. In fact, many nuclear power plants that were planned in the 1960s and 1970s were never completed, even after millions (or billions) of dollars were spent on development. As POWER previously reported, by 1983, several factors including project management deficiencies prompted the delay or cancellation of more than 100 nuclear units planned in the U.S.—nearly 45% of total commercial capacity previously ordered. Yet, at least one construction expert believes nuclear power plants can be built on time and on budget. “To me, nuclear should be far, far more competitive than it is,” Todd Zabelle, a 30-plus-year veteran of the construction industry and author of the book Built to Fail: Why Construction Projects Take So Long, Cost Too Much, and How to Fix It, said as a guest on The POWER Podcast. Owners have a big role to play in the process. “The owner has to get educated on how to deliver these projects, because the owner gets the value out of any decisions that are made,” Zabelle said. “You cannot just hand it over to a construction management firm and hope for the best, or EPCM [engineering, procurement, construction, and management firm]. It's just not going to work.” “What it boils down to is a lot of people doing a lot of administrative work—people watching the people doing the technical work or the craft work—and we become an industry of bureaucracy and administration,” said Zabelle. “Everyone's forgot about ‘How do we actually do the work?' That has huge implications because of the disconnect between those two.” According to Zabelle, the problem can be solved by implementing a production operations mentality. “My proposal in all this is: we need way more thinking about operations management, specifically operations science,” he said. “Not that it's what happens after the asset's delivered, but it's actually a field of knowledge that assists with how to take inputs and make their outputs. The construction industry doesn't understand anything about operations—they don't understand the fundamentals.” In Zabelle's book, he provides a more thorough explanation of the concept. “Operations science is the study of how to improve and optimize processes and systems to achieve the desired objectives. It involves the use of mathematical models and other techniques to analyze and optimize systems,” he wrote. “It is used to improve efficiency and reduce costs, while ensuring that the quality of the output remains high. Operations science is used to improve the effectiveness of operations, while also reducing waste and improving customer satisfaction.” Near the end of his book, Zabelle noted that the time for business as usual is rapidly closing. “The pain of the status quo in construction is going to increase exponentially as our capacity to develop and execute projects falls short of expectations,” he wrote. “Until we recognize projects as production systems and use operations science to drive project results, we are doomed to failure. We need to free ourselves from the prior eras and instead focus on a new era of project delivery, one in which projects will be highly efficient production systems that utilize the bounty of the technology (AI [artificial intelligence], robotics, data analytics, etc.) we are privileged to have access to.” Zabelle sounded hopeful about the future of nuclear power construction. 
“I truly believe—I would actually throw down the gauntlet—we can make the Westinghouse AP1000 financially viable,” he said. “I'm happy to work with anybody on how to make nuclear competitive because I think it should be and could be.”

    154. Hydrogen: ‘The Swiss Army Knife of Decarbonization'

    Play Episode Listen Later Feb 1, 2024 34:33


    It seems everywhere you go, both inside and outside of the power industry, people are talking about hydrogen. Last October, the U.S. Department of Energy (DOE) announced an investment of $7 billion to launch seven Regional Clean Hydrogen Hubs (H2Hubs) across the nation and accelerate the commercial-scale deployment of “low-cost, clean hydrogen.” Hydrogen is undoubtedly a valuable energy product that can be produced with zero or near-zero carbon emissions using renewable energy and electrolyzers. The Biden administration says it “is crucial to meeting the President's climate and energy security goals.” “Hydrogen is one of the hottest topics in the energy transition conversation right now, and that's because it really is a super versatile energy carrier. A lot of folks refer to it as ‘the Swiss Army knife of decarbonization,' including our founder, Mr. Gates,” Robin Millican, senior director of U.S. Policy and Advocacy at Breakthrough Energy, said as a guest on The POWER Podcast. Breakthrough Energy is a network of entities and initiatives founded by Bill Gates, which include investment funds, philanthropic programs, and policy efforts linked by a common commitment to scale the technologies needed to achieve a path to net-zero emissions by 2050. “If you think about the ways that you can use hydrogen, you can use it as a feedstock for industrial materials, you can combine it with CO2 to make electrofuels [also known as e-fuels], you can use it for grid balancing if you're storing it and then deploying that hydrogen when it's needed, so it can be used a lot of different ways, which is great,” Millican said. “But actually, to us, the more salient question that we should be asking ourselves is: you can use hydrogen in a lot of these different ways, but should you be using hydrogen in all of those different applications?” Millican said there's a simple framework that she uses to answer that question. “If there's a way that you can electrify a process, in almost all cases, that's going to be cheaper and more efficient from an energy conversion standpoint than using hydrogen,” she said. Millican suggested electrification is a better option than hydrogen for most building and light-duty transportation applications. While noting that hydrogen could be a suitable option for aviation e-fuels, she said biofuels might be an even better fit. However, when it comes to fertilizers and ammonia, clean hydrogen is very likely the best pathway to reducing emissions in that particular sector, she said. Breakthrough Energy isn't the first group to think about hydrogen in this way. Millican noted that Michael Liebreich's “Hydrogen Ladder” has been focusing on the best possible uses for hydrogen for years. According to Liebreich, hydrogen shouldn't routinely be used in power systems to generate power because the cycle losses—going from power to green hydrogen, storing it, moving it around, and then using it to generate electricity—are too large. However, he says, “The standout use for clean hydrogen here is for long-term storage.” Yet, Millican said there is a scenario where hydrogen could be extremely affordable at scale. She said “geologic hydrogen” is something Breakthrough Energy is very interested in. “There are companies out there that are working on identifying where hydrogen exists naturally in the subsurface, and then trying to extract that hydrogen, which could be super affordable, because again, it's abundant in some areas,” she explained. 
“If we're thinking about hydrogen in that scenario, we might want to use it a lot more ubiquitously.”
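The “cycle losses” Liebreich points to can be roughed out by multiplying stage efficiencies. The figures below are commonly cited approximations, not numbers from the podcast, and real-world values vary widely by electrolyzer, storage method, and reconversion technology.

    # Approximate, commonly cited stage efficiencies for a power-to-hydrogen-to-power
    # cycle; actual values vary widely by technology and are assumptions here.
    stages = {
        "electrolysis":            0.70,
        "compression and storage": 0.90,
        "reconversion to power":   0.55,   # fuel cell or hydrogen turbine
    }

    round_trip = 1.0
    for name, eff in stages.items():
        round_trip *= eff

    print(f"round-trip efficiency ~ {round_trip:.0%}")
    # Roughly a third of the input energy comes back, which is why hydrogen is
    # usually discussed for long-duration storage rather than routine daily cycling.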

    153. PGE Leans into an All-of-the-Above Strategy to Decarbonize Its Power System

    Play Episode Listen Later Jan 23, 2024 20:50


    Climate change has led many states and countries to set targets for reducing greenhouse gas (GHG) emissions from power systems. Oregon, for example, has set targets for all power sold to retail customers in the state to have GHG emissions cut by 80% by 2030, 90% by 2035, and 100% by 2040. It's a challenging task, but Portland General Electric (PGE), a fully integrated energy company that generates, transmits, and distributes electricity to roughly half of Oregon's population and serves about 75% of its commercial and industrial activity, is working hard to achieve those objectives. PGE was the first utility in the U.S. to sign The Climate Pledge, an initiative co-founded by Amazon and Global Optimism in 2019 that has since grown to 464 signatories committed to reaching net-zero carbon emissions by 2040, and the company is leading the way toward a cleaner energy future. Kristen Sheeran, senior director of sustainability, strategy, and resources planning at PGE, said the process is pretty straightforward in some ways. “In order to reduce carbon on our system, we have to back out fossil fuels that we currently rely on to generate power for our customers, and we have to replace that with non-emitting alternatives,” she said as a guest on The POWER Podcast. To date, that has primarily been done with wind, solar, and batteries, and it's not a new undertaking for PGE. The company's first wind farm—the Biglow Canyon site—began operation in 2007. Meanwhile, in 2012, PGE opened the Camino del Sol Solar Station, an interstate highway solar project. Since then, the company has partnered with schools, government agencies, and corporations to grow solar energy throughout Oregon. In partnership with NextEra Energy Resources, it also opened North America's first major renewable energy facility to combine wind, solar, and battery storage in one location—the Wheatridge Renewable Energy Facility in Morrow County. Today, PGE boasts having more than 1 GW of wind power capacity in service in the Northwest, and it aims to procure between 3.5 GW and 4.5 GW of new non-emitting resources and storage between now and 2030. Perhaps more difficult than decarbonizing the system, however, is doing so while also maintaining reliability, affordability, and an equitable system for all its customers. “It's a very interesting point in time—an inflection point for the industry,” Sheeran said. “How do you balance affordability? How do you balance reliability with emissions reduction?” she asked. PGE closed its last Oregon-based coal-fired power plant in October 2020, 20 years ahead of schedule, as part of an agreement with stakeholders, customer groups, and regulators to significantly reduce air emissions from power production in Oregon. PGE still receives a small amount of coal-fired power from the Colstrip plant, which is located near Billings, Montana. The company has an ownership stake in the facility but plans to exit Colstrip no later than 2029. Brett Greene, PGE's senior director of clean energy origination and structuring, suggested striking the right energy balance will take more than just wind and solar, however. “We are supportive of all technology. We really think it takes a lot of innovation and creativity to hit that net-zero goal in 2040,” he said. Greene noted that resources such as hydro, pumped storage, offshore wind, and even nuclear, hydrogen, and carbon capture technologies may ultimately be needed to fully decarbonize PGE's power mix.

    152. A Boiler for Any Occasion

    Play Episode Listen Later Dec 28, 2023 18:45


    Boilers obviously play an important role in the power generation industry, providing the mechanism to convert heat produced by burning fuel into steam that can be used to drive a turbine to generate electricity. But many other industries also use boilers to produce steam for a variety of purposes. Boilers are commonly used for space heating in industrial facilities, including in factories, warehouses, and office buildings, as well as on university campuses and in large medical complexes. Boilers often provide hot water or steam, which is then distributed throughout buildings using radiators, convectors, or underfloor heating systems to heat the space. Many industrial processes utilize high-temperature steam for manufacturing operations. Boilers are regularly used for processes such as chemical manufacturing, food processing, paper production, and textile manufacturing. Boilers are also essential in petroleum refineries for processes like distillation, cracking, and reforming. Steam can also be used as a source of energy for industrial processes such as sterilization, cleaning, and drying. In some cases, cogeneration (also called combined heat and power) systems are used to generate electricity first, with extraction steam then diverted for other purposes. This can greatly improve the overall system efficiency, saving money and reducing emissions. Rentech Boiler Systems Inc. is one of the leading manufacturers of custom water tube and waste heat recovery boilers. The company is headquartered in Abilene, Texas, but sells its boilers around the world. “We have shipped boilers to about 35 countries in the world. So, we're a company known globally,” Gerardo Lara, vice president of Fired Boiler Sales with Rentech, said as a guest on The POWER Podcast. “I think our best feature at Rentech is that we build only custom solutions,” Jon Backlund, senior sales engineer with Rentech, said on the podcast. “We don't have a catalog of standard sizes or standard designs. So, we will basically custom fit the application, and that means, we will read the specifications carefully, talk to the client about special needs, special fuels, any kind of space constraints, delivery issues, and design our system to fit exactly what they require.” Rentech typically manufactures boilers with capacities ranging from about 40,000 lb/hr to 600,000 lb/hr of steam. Moving boiler systems of that size—which can weigh up to half a million pounds—from a manufacturing facility to a site can be challenging, but Lara suggested Rentech is very proficient at the task. “There is a wide range of logistics that have to be studied, and yes, we live in the middle of Texas, but we certainly are very well versed on how to get a big boiler to Australia, if need be,” he said. “If we can do that, we certainly can get one to any state here within the U.S., or even Canada or Mexico.” The fuel used to fire boilers can vary widely. Natural gas is very common in the U.S. because it is highly available and relatively inexpensive, but many other fuels are also suitable for industrial boilers. Backlund said there are a lot of “opportunity fuels” available in different locations. For example, landfill gas can be captured and utilized at many landfills. Likewise, biogas from brewing or sewage treatment processes is also usable. Many experts believe hydrogen will be an important fuel as the world transitions to greater carbon-free energy resources. Backlund said hydrogen has been burned in boilers for decades.
“There's a lot of talk about equipping our boilers to burn hydrogen in the future, but this is not a new technology in the boiler business,” he said. “Those kinds of plants have been around for generations.” Where the hydrogen comes from and how it is produced may change, but today's boilers are already capable of utilizing hydrogen efficiently.
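To put the steam capacities mentioned above in power-plant terms, the quick sketch below converts pounds per hour of steam into an approximate thermal duty. It assumes a round 1,000 Btu of heat added per pound of steam, which is only a rough rule of thumb; the true enthalpy rise depends on pressure, temperature, and feedwater conditions.

    BTU_PER_LB = 1_000.0          # assumed enthalpy rise, Btu per lb of steam (rough round number)
    BTU_PER_MWH = 3_412_142.0     # Btu in one megawatt-hour

    for lb_per_hr in (40_000, 600_000):
        duty_mwt = lb_per_hr * BTU_PER_LB / BTU_PER_MWH
        print(f"{lb_per_hr:,} lb/hr of steam ~ {duty_mwt:.0f} MW thermal")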

    151. Microgrids a Win for Both Owners and Grid Operators

    Play Episode Listen Later Dec 14, 2023 24:44


    According to a guidebook issued by Sandia National Laboratories, a U.S. Department of Energy (DOE) multi-mission laboratory, microgrids are defined as a group of interconnected loads and distributed energy resources (DERs) that act as a single controllable entity. A microgrid can operate in either grid-connected or island mode, which includes some entirely off-grid applications. A microgrid can span multiple properties, generating and storing power at a dedicated/shared location, or it can be contained on one privately owned site. The latter condition, where all generation, storage, and conduction occur on one site, is commonly referred to as “behind-the-meter.” Microgrids come in a wide variety of sizes. Behind-the-meter installations are growing, especially as entities like hospitals and college campuses are installing their own systems. Where some once served a single residence or building, many now power entire commercial complexes and large housing communities. “Today, there's a whole new way to do DER management, which is a significant component of microgrids,” Nick Tumilowicz, director of Product Management for Distributed Energy Management with Itron, said as a guest on The POWER Podcast. “There is a way now to do that in a very local, automated, and cost-effective way just by leveraging what utilities have already deployed—hundreds of thousands of meters and the mesh networks that are communicating with those meters.” Tumilowicz said a variety of factors can influence if and/or when a microgrid gets deployed. Sometimes, a company is focused on running cleaner and greener operations. Other times, the grid a company is connected to may have reliability challenges that are affecting business adversely, or the company may just want to be energy independent, so the decision is frequently case specific. “The customer has this motivation to have this backup concept known as resiliency—if the grid's not there for me, I'll be there for me,” he said. “Generally speaking, nationally, we're well above 99.9% grid reliability,” Tumilowicz noted. Yet, even when power outages are rare, a microgrid can still provide value. “It can provide flexible services, such as capacity or resource adequacy, or energy services back to the distribution and the transmission up to the market operator level,” explained Tumilowicz. “So, this is a whole other way to be able to start thinking about how we participate with microgrids when 99-plus percent of the time they're grid connected, but they're also there for when the grid is not connected—in that very low probability of time.” However, the return on investment for microgrid systems is highly affected by location. “If you're in Australia, the equation is different than if you're in Hawaii, versus if you're in the northeast U.S.—one of the better-known accelerated paybacks to do this,” said Tumilowicz. For example, in areas where the market operator, such as an independent system operator or regional transmission organization, places a high value on peak power reductions within its system, the economics for microgrid owners can be greatly improved. But regardless of what may have driven the initial decision to create a microgrid, Tumilowicz said being flexible is important. 
“You might deploy your microgrid to satisfy three use cases and market mechanisms that exist in the beginning of 2024, but you need to be open and receptive—and this is where the innovation comes in—to add use cases over time, because the system is going through a significant energy transition, and you need to be dynamic and accommodating to do that,” he said.
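Tumilowicz's “well above 99.9% grid reliability” remark can be made concrete by converting an availability percentage into expected outage hours per year, as in this quick sketch (the availability levels shown are illustrative).

    HOURS_PER_YEAR = 8760

    for availability in (0.999, 0.9999, 0.99999):
        outage_hours = HOURS_PER_YEAR * (1 - availability)
        print(f"{availability:.3%} available -> ~{outage_hours:.2f} hours of outage per year")

Even at three nines, that is nearly nine hours of potential outage per year, which is why backup resiliency can still carry real value for critical facilities.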

    150. How Coal Fly Ash Is Reducing CO2 Emissions and Improving Concrete

    Play Episode Listen Later Nov 30, 2023 19:48


    Concrete is the most widely used construction material in the world. One of the key ingredients in concrete is Portland cement. The American Concrete Institute explains that Portland cement is a product obtained by pulverizing material consisting of hydraulic calcium silicates to which some calcium sulfate has usually been provided as an interground addition. When first made and used in the early 19th century in England, it was termed Portland cement because its hydration product resembled a building stone from the Isle of Portland off the British coast. Without going into detail, it suffices to say that a great deal of energy is required to produce Portland cement. The chemical and thermal combustion processes involved in its production are a large source of carbon dioxide (CO2) emissions. According to Chatham House, a UK-based think tank, more than 4 billion tonnes of cement are produced each year, accounting for about 8% of global CO2 emissions. However, fly ash from coal-fired power plants is a suitable substitute for a portion of the Portland cement used in most concrete mixtures. In fact, substituting fly ash for 20% to 25% of the Portland cement used in concrete mixtures has been proven to enhance the strength, impermeability, and durability of the final product. Therefore, using fly ash for this purpose rather than placing it in landfills or impoundments near coal power plants not only reduces waste management at sites, but also reduces CO2 emissions and improves concrete performance. Rob McNally, Chief Growth Officer and executive vice president with Eco Material Technologies, explained as a guest on The POWER Podcast that the ready-mix concrete industry has been reaping the benefits of using fly ash for years. “In terms of economics, fly ash was typically cheaper than Portland cement. It also has beneficial properties that typically makes it stronger long term and reduces permeability, which keeps water out of the concrete mixture and helps concrete to last longer. And, then, it's also environmentally friendly, because they're using what is a waste product as opposed to more Portland cement—and Portland cement is highly CO2 intensive. For every tonne of Portland cement produced, it's almost a tonne of CO2 that's introduced into the atmosphere. So, they have seen those benefits for years with the use of fresh fly ash,” McNally said. However, as climate change concerns have grown, many power companies have come under pressure to retire coal-fired power plants. As plants are retired, fresh fly ash has become less and less available. “The availability of fresh fly ash is declining,” said McNally. “In some places—many places actually—around the country, replacement rates that used to be 20% of Portland cement was replaced by fly ash are now down in single digits. But that's a reflection of fly ash availability.” Eco Material Technologies, which claims to be the leading producer of sustainable cementitious materials in the U.S., has a solution, however. It has developed a fly ash harvesting process and has nine fly ash harvesting plants in operation or under development to harvest millions of tons of landfilled ash from coal power plants. Locations include sites in Arizona, Georgia, North Dakota, Oregon, and Texas. “There are billions—with a b—of tons of impounded fly ash around the country, so we have many, many years of supply,” McNally said. Still, Eco Material is not resting its business solely on fly ash harvesting, or marketing fresh fly ash, which it has also done for years. 
“The other piece where we will fill the gap that fresh fly ash leaves behind is with the green cement products. Because with those, we're able to use natural pozzolans, like volcanic ash, and process those and replace 50% plus of Portland cement in concrete mixes. So, we think there's an answer for the decline in fly ash and that's where the next leg of our business is taking.”
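McNally's “almost a tonne of CO2” figure makes it straightforward to estimate the emissions avoided by substitution. The sketch below assumes roughly 0.9 kg of CO2 per kilogram of Portland cement, consistent with that quote, and an illustrative 300 kg of cementitious material per cubic meter of concrete; actual mix designs vary.

    CO2_PER_KG_CEMENT = 0.9   # kg CO2 per kg of Portland cement (approximate, per the quote)
    cement_per_m3 = 300.0     # kg of cementitious material per cubic meter of concrete (illustrative)

    # 20-25% replacement with fly ash, or 50%+ with natural pozzolan blends.
    for substitution in (0.20, 0.25, 0.50):
        cement_avoided_kg = cement_per_m3 * substitution
        co2_avoided_kg = cement_avoided_kg * CO2_PER_KG_CEMENT
        print(f"{substitution:.0%} replacement -> ~{co2_avoided_kg:.0f} kg CO2 avoided per m^3 of concrete")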

    149. DOE Competition Helps College Students Prepare for Cyber Jobs in the Energy Industry

    Play Episode Listen Later Nov 21, 2023 29:18


    There is growing demand for cybersecurity professionals all around the world. According to the “2023 Official Cybersecurity Jobs Report,” sponsored by eSentire and released by Cybersecurity Ventures, there will be 3.5 million unfilled jobs in the cybersecurity industry through 2025. Furthermore, having these positions open can be costly. The researchers said damages resulting from cybercrime are expected to reach $10.5 trillion by 2025. In response to the escalating demand for adept cybersecurity professionals in the U.S., the Department of Energy (DOE) has tried to foster a well-equipped energy cybersecurity workforce through a hands-on operational technology cybersecurity competition with real-world challenges. On Nov. 4, the DOE hosted the ninth edition of its CyberForce Competition. The all-day event, led by DOE's Argonne National Laboratory (ANL), drew 95 teams—with nearly 550 students total—from universities and colleges across the nation. This year, the focus was on distributed energy resources, including solar panels and wind turbines. “The CyberForce Competition comes out of the Department of Energy's Office of Cybersecurity, Energy Security, and Emergency Response, which is CESER for short,” Amanda Theel, group leader for workforce development at ANL, said as a guest on The POWER Podcast. “Their main goal for this is really to help develop the pipeline of qualified cybersecurity applicants for the energy sector. And I say that meaning, we really dive heavily on the competition and looking at the operational technology side, along with the information technology side.” Theel said each team gets about six or seven virtual machines (VMs) that they have to harden and defend to the best of their ability. Besides monitoring and protecting the VMs, which include normal business systems such as email and file servers, the teams also have to defend grid operations and other energy resources. “We have a Red Team that's constantly trying to either come into the system from your regular attack-defend penetration. We also have a portion of our Red Team that we like to call our ‘assumed breach,' so we assume that adversary is already in the system,” Theel explained. “The Blue Team, which is what we call our college students, their job is to work to try to get those Red Team members out.” She said they also have what they call “our whack-a-mole,” which are vulnerabilities built into the system for the Blue Team members to identify and patch. Besides the college students, ANL brings in volunteers—high school students, parents, grandparents, people from the lab, and people from the general public—to test websites and try to pay pretend bills by logging in and out of the simulated systems. Theel said this helps students understand that while security is important, they must also ensure that owners, operators, and end-users can still get in and use the systems as intended. “So, you have to kind of play the balance of that,” she said. Other distractions are also incorporated into the competition, such as routine meetings and requests from supervisors, for example, to review a forensics file and check the last time a person in question logged into the system. The intention is to overload the teams with tasks so evaluators can see if the most critical items are prioritized and remedied. For the second year in a row, a team from the University of Central Florida (UCF) won first place in the competition. They received a score of 8,538 out of 10,000.
Theel said the scores do vary quite significantly from the top-performing teams to lower-ranked groups. “What we've found is obviously teams that have returned year after year already have that—I'll use the word expectation—of already knowing what to expect in the competition,” explained Theel. “Once they come to year two, we've definitely seen massive improvements with teams.”

    148. Advanced Nuclear Fuel Approved for Installation at Plant Vogtle

    Play Episode Listen Later Oct 31, 2023 15:31


    Southern Nuclear, Southern Company's nuclear power plant operations business, announced in late September that it had received “first-of-a-kind approval” from the Nuclear Regulatory Commission (NRC) to use advanced fuel—accident tolerant fuel (ATF)—exceeding 5% enrichment of uranium-235 (U-235) in Plant Vogtle Unit 2. The fuel is expected to be loaded in 2025 and will have enrichments up to 6 weight % U-235. The company said this milestone “underscores the industry's effort to optimize fuel, enabling increased fuel efficiency and long-term affordability for nuclear power plants.” “5 weight % was deeply ingrained in all of our regulatory basis, licensing basis for shipment containers, licensing basis for the operation of the plants—it was somewhat of a line drawn in the sand,” Johnathan Chavers, Southern Nuclear's director of Nuclear Fuels and Analysis, explained as a guest on The POWER Podcast. “Testing of the increased enrichment component has been a licensing and regulatory exercise to see how we would move forward with existing licensing infrastructure to install weight percents above that legacy 5 weight %,” Chavers told POWER. Chavers said ATF became a focal point for the industry in March 2011 following the magnitude 9.0 Tohoku-Oki earthquake and resulting tsunami, which caused a crisis at the Fukushima nuclear power plant. “In 2012, Congress used the term ‘accident tolerant fuel' for the first time in an Appropriations Act, and that's where it all began,” Chavers explained. “It was really for the labs and the DOE [Department of Energy] to incentivize enhanced safety for our fuel in response to the Fukushima incident.” In 2015, the DOE issued a report to Congress outlining details of its accident tolerant fuel program. The report, titled “Development of Light Water Reactor Fuels with Enhanced Accident Tolerance,” set a target for inserting a lead fuel assembly into a commercial light water reactor by the end of fiscal year 2022. Notably, Southern Company achieved the goal four years early. “We were the first in the world to install fueled accident tolerant fuel assemblies of different technologies that were developed by GE at our Hatch unit in 2018,” Chavers noted. The following year, Southern Nuclear installed four Framatome-developed GAIA lead fuel assemblies containing enhanced accident-tolerant features applied to full-length fuel rods in Unit 2 at Plant Vogtle. “This is the third set that we're actually installing that is a Westinghouse-developed accident tolerant fuel, which also includes enrichments that exceed the historical limits of 5 weight %,” Chavers explained. While enhanced safety is perhaps the most significant benefit provided by ATF, advanced nuclear fuel is also important in lowering the cost of electricity. “Our ultimate goal is to enable 24-month [refueling] cycles for all U.S. nuclear power plants, to improve the quality of life for our workers, to lower the cost of electricity,” said Chavers. “Fundamentally, [nuclear power] is a clean green power source—carbon-free. The more we can keep it running—that's something we're trying to go after,” noted Chavers. “We see a lot of positives in this program in that not only are we improving safety, lowering the cost, but we're also increasing the amount of megawatts electric we can get out of the nuclear assets.”

    147. Five Key Transformations Required to Achieve Net-Zero in the U.S.

    Play Episode Listen Later Oct 13, 2023 26:07


    During President Biden's first year in office, his administration published a document titled “The Long-Term Strategy of the United States: Pathways to Net-Zero Greenhouse Gas Emissions by 2050.” The document says all viable routes to net-zero involve five key transformations. They are:
    • Decarbonize electricity.
    • Electrify end uses and switch to other clean fuels.
    • Cut energy waste.
    • Reduce methane and other non-CO2 emissions.
    • Scale up CO2 removal.
    Which of the key transformations will play the biggest role in reaching the U.S.'s net-zero goal is still up for debate. “The first step—decarbonize electricity—is critical and may be one of the most important steps in achieving net-zero emissions,” Brendan O'Brien, business development manager, and strategy and sales leader with Burns & McDonnell, said as a guest on The POWER Podcast. “That transition is going to include a lot of things that we're probably familiar with today, like clean energy driven by solar and wind, but also it'll look to the future for decarbonized technologies and decarbonized solutions.” O'Brien noted that the U.S. is targeting 100% clean energy by 2035, and he suggested the transition is already well underway. “It's been occurring and even accelerating in recent years,” he said. “It's been driven by plummeting costs in key technologies, like solar, onshore wind, offshore wind, and batteries, which you're seeing more and more as deployed technology of the utilities in the United States. All that's being bolstered by policies and regulation that has been enacted by various governments. And then also—the final—the big push is really coming from the consumer. More and more consumers are demanding clean energy and clean power, and the power generation market in the United States has been reacting to it.” Complexity is added to the equation with the second key transformation, that is, electrifying end uses. O'Brien said the transportation sector's shift from internal combustion engines to electric vehicles will require a 65% increase in power generation. That's on top of other load growth from manufacturers reshoring operations, as well as the need to replace retiring power generation units, specifically coal plants. “I think there's going to be quite a fun challenge of figuring out what the energy mix is going to look like over the next 10 to 25 years to meet these targets,” said Megan Reusser, hydrogen technology manager with Burns & McDonnell, who also participated on the podcast. “What we really need to be looking at is the whole picture,” she said, noting that there are many sectors trying to electrify, including industrial applications, agriculture, and forestry, among others. “Transportation is one piece, but when we start putting all the pieces together, it's going to be large amounts of generation required,” said Reusser. Meanwhile, cutting energy waste is a no-brainer. Likewise, reducing methane and other non-CO2 emissions follows a similar thought pattern. Lastly, scaling up CO2 capture is important. “We cover a wide range of these different technologies. So, we're looking at carbon capture and sequestration, whether that is amine technology or membrane technologies—doing a lot of work in the direct air capture, or DAC, markets. So, looking to essentially remove CO2 from the atmosphere that's already there, and then sequester that with various technologies,” Reusser explained. In the end, it's likely an integrated approach will be necessary to reach the U.S.'s net-zero target successfully.
“There's not just going to be a single solution that's going to get us there. If you dive a little bit more into the U.S. strategy that we were talking about today, it really lays out the groundwork of how to get there. And as you dive into that, you'll see that it doesn't just focus on one single industry or one single technology, it's really across the value chain on how we can accomplish this by working together,” concluded Reusser.

    146. Reducing Carbon Intensity with Renewable Propane

    Play Episode Listen Later Oct 4, 2023 19:35


    Most propane used in the U.S. today is produced as a byproduct of natural gas processing and crude oil refining, which are not considered “green” technologies. However, renewable propane availability is growing. Renewable propane, like its conventional counterpart, is commonly made as a byproduct of other fuel production, in its case, often renewable diesel and sustainable aviation fuels (SAFs). Renewable diesel and SAF are primarily produced from plant and vegetable oils, animal fats, and used cooking oil. Renewable propane has the exact same features as conventional propane, which include excellent reliability, portability, and power, as well as reduced carbon emissions on a per-unit-of-energy basis compared to many other fossil fuels. While the scale of renewable propane production is fairly small at present, most experts agree that it has the potential to ramp up quickly. “Looking at what we've done for the past five years is we were shipping about 40 million gallons [of renewable propane]. By the end of this year, we're going to be close to 100 million gallons, and by the end of 2024, we should be close to 200 million gallons. So, the scalability is coming up—there's more refineries coming on,” Jim Bunsey, director of commercial business development with the Propane Education & Research Council (PERC), said as a guest on The POWER Podcast. One way to judge the environmental impact of a fuel is through its carbon intensity (CI) score. The concept was brought to many people's attention in 2009, when the California Air Resources Board approved the state's Low Carbon Fuel Standard (LCFS) regulation. The LCFS set annual CI standards, or benchmarks, which reduce over time, for gasoline, diesel, and the fuels that replace them. CI is expressed in grams of carbon dioxide equivalent per megajoule of energy (gCO2e/MJ) provided by a fuel. CI takes into account the greenhouse gas (GHG) emissions associated with all of the steps of producing, transporting, and consuming a fuel—also known as the “complete lifecycle” of the fuel. According to Bunsey, conventional propane has a CI of about 79, but renewable propane is much lower. “We can have renewable propane having a carbon intensity of seven or up to 20.5,” he said. “There's a range—it depends on the feedstock that's available.” Notably, both conventional and renewable propane compare quite favorably to the U.S. power grid's average CI, which is about 130, according to Bunsey. While California has been a leader nationally in the push for GHG reductions, other jurisdictions are following its example. The Pacific Coast Collaborative, a regional agreement between California, Oregon, Washington, and British Columbia, is one example. Over time, collaborative member LCFS programs are expected to build an integrated West Coast market for low-carbon fuels that will create greater market pull, increased confidence for investors in low-carbon alternative fuels, and synergistic implementation and enforcement programs. Other regions of Canada and Brazil are also using California as a model to develop LCFS-like performance standards for transportation fuels. Suppliers are also finding interest in renewable propane in the northeastern U.S. The first delivery of renewable propane in Massachusetts was received with a ceremony at the NGL Supply Wholesale Springfield terminal in West Springfield on Sept. 12. “The cost is just very slightly more than traditional propane today, but we anticipate as more of it is produced that that cost is going to come down.
And if you think about the added benefit that you get by knowing you're helping the climate and helping the planet by using renewables, I think a lot of people are willing to spend just a little bit more to get that,” Leslie Anderson, president and CEO of Propane Gas Association of New England, told WWLP-22News, a western Massachusetts multimedia company.
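The carbon-intensity figures Bunsey quotes can be translated into absolute emissions for a given amount of delivered energy. The sketch below uses the CI values from the episode; the 1,000-MJ delivery amount is arbitrary and chosen only for illustration.

    # Carbon-intensity values in gCO2e/MJ, as quoted in the episode.
    CI = {
        "grid average (quoted)":    130.0,
        "conventional propane":      79.0,
        "renewable propane (high)":  20.5,
        "renewable propane (low)":    7.0,
    }

    energy_mj = 1_000.0   # an arbitrary 1,000 MJ (~0.95 MMBtu) of delivered energy

    for fuel, ci in CI.items():
        kg_co2e = ci * energy_mj / 1_000.0   # grams -> kilograms
        print(f"{fuel:28s} ~{kg_co2e:6.1f} kg CO2e per {energy_mj:,.0f} MJ")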

    145. How Power Companies Benefit from Accurate Weather Forecasts

    Play Episode Listen Later Sep 26, 2023 31:08


    It's pretty easy to understand how the weather affects certain forms of power generation and infrastructure. Sunlight is obviously needed to generate solar power, wind is required to produce wind energy, and extreme storms of all kinds can wreak havoc on transmission and distribution lines, and other energy-related assets. Therefore, having accurate and constantly updated weather information is vital to power companies. “First and foremost, utilities need to understand as best as possible the forecast of the environmental resources that are supplying these generation sources. It's ultra-critical, because even small, slight changes in wind speed or solar radiation can have pretty substantial impacts as far as the capacity factor that a renewable generator is operating at,” Nic Wilson, director of product management for weather and climate risk with DTN, said as a guest on The POWER Podcast. Wilson highlighted some of the weather-related applications that utilities are integrating into their operations. “One of the focal points for DTN is working with utility emergency preparedness teams in order to help them better understand and forecast at-risk weather environmental hazards that are going to impact their overhead distribution operations, and understanding and communicating appropriately the outage impact risks,” he said. “Another application is asset inspection,” said Wilson. “After a storm goes through, how does the utility prioritize where it's going to do inspection along its lines for potential damage?” One way could be using DTN's tools. Wilson suggested, for example, a company responsible for the operations and maintenance of wind farms could use DTN data to identify turbines that may have experienced blade damage during a weather event. With that insight, the company could proactively inspect for compromises to the fiberglass blades before the damage turned catastrophic. Load forecasting is another important use case for DTN's data. Many things must be considered to develop load forecasts including historical trends and current events. Wilson suggested temperature, precipitation, cloud cover, time of day, time of year, and more will affect not only the renewable energy production, but also demand for electricity. With accurate forecasts, power companies can plan appropriately to take advantage of any given situation. If they anticipate a surplus, units could be taken offline for scheduled maintenance, but if the supply is expected to be tight, they can issue orders to increase plant readiness. “Then, there's some emerging applications, such as capital planning, where utilities are trying to climate-adjust the age, and understand the performance and condition monitoring of their assets in order to prioritize resiliency investments,” Wilson said. DTN's products are constantly being refined too. Wilson said artificial intelligence and machine learning are behind many of the improvements. “We are consistently doing what we call retraining. So, as new data becomes available from the utility, whether that's outage management system data, or condition monitoring information, or satellite- or LIDAR [light detection and ranging]-derived vegetation datasets, we're incorporating that into our models and updating them as frequently as possible in order to ensure that our predictions are as representative of the current environment as possible,” he said. 
Wilson said DTN is making some forays into climate modeling and trying to understand how different environmental factors of interest to utilities are going to evolve in not only the next three to six months on a seasonal basis, but also out to 30 years in the future. This is important information for power companies because they are often making investments with a 50-year time horizon in mind.
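As a toy illustration of how weather data feeds a load forecast, the sketch below fits daily load to heating and cooling degree values with ordinary least squares. The data is synthetic and the model is far simpler than anything DTN or a utility would actually deploy; it only shows the basic idea of regressing demand on temperature-derived features.

    import numpy as np

    # Synthetic daily history: average temperature (deg F) and observed load (MW).
    temps = np.array([20, 35, 50, 65, 72, 80, 88, 95], dtype=float)
    load = np.array([950, 830, 720, 640, 660, 740, 850, 980], dtype=float)

    BASE = 65.0                              # balance-point temperature
    hdd = np.maximum(BASE - temps, 0.0)      # heating degrees below the balance point
    cdd = np.maximum(temps - BASE, 0.0)      # cooling degrees above the balance point

    # Fit load = b0 + b1*HDD + b2*CDD by ordinary least squares.
    X = np.column_stack([np.ones_like(temps), hdd, cdd])
    coef, *_ = np.linalg.lstsq(X, load, rcond=None)

    # Forecast for a hypothetical 92F day.
    forecast_temp = 92.0
    forecast = (coef[0]
                + coef[1] * max(BASE - forecast_temp, 0.0)
                + coef[2] * max(forecast_temp - BASE, 0.0))
    print(f"coefficients: {coef.round(2)}, forecast at {forecast_temp:.0f}F: ~{forecast:.0f} MW")

Real load models add many more inputs, such as humidity, cloud cover, day of week, and holidays, but the temperature-driven shape is the backbone of most of them.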

    144. Environmental Justice: What It Is and Why It's Important to Power Projects

    Play Episode Listen Later Sep 7, 2023 29:11


    The U.S. Department of Energy (DOE) defines environmental justice as: “The fair treatment and meaningful involvement of all people, regardless of race, color, national origin, or income, with respect to the development, implementation, and enforcement of environmental laws, regulations, and policies.” It says “fair treatment” means that no population bears a disproportionate share of negative environmental consequences resulting from industrial, municipal, and commercial operations or from the execution of federal, state, and local laws; regulations; and policies. “Meaningful involvement,” meanwhile, “requires effective access to decision makers for all, and the ability in all communities to make informed decisions and take positive actions to produce environmental justice for themselves,” according to the DOE. Environmental justice (EJ) has become a very important consideration when it comes to siting and/or expanding energy projects, including power plants. While many people associated with the power industry tend to focus on the benefits provided to communities when a project is developed, such as well-paying jobs and an increase in the tax base, people in the affected community may have a different view. They may be more focused on the negative effects, which could include an increase in harmful emissions, water usage, and heavy-haul traffic. “Communities are weighing the pros and cons of having industry there—having a job creator—and that, of course, generating additional economic activity. On the flip side, there are actual or perceived environmental or health issues,” Erich Almonte, a senior associate with King & Spalding, said as a guest on The POWER Podcast. King & Spalding is a full-service law firm with more than 1,300 lawyers and 23 offices globally, including a large team focused on energy-related matters. “It's important to note that there really isn't any ‘Environmental Justice Law.' What we have instead are a use of current statutes and regulations that were perhaps designed for something else to try to achieve environmental justice ends,” Almonte said. The impact EJ could have on a project is quite substantial. “A company could meet all of its environmental permitting requirements, but still have a permit denied, if there were disparate impacts that weren't mitigated properly, under Title VI of the Civil Rights Act,” Almonte explained. “This came out in a guidance document in April 2022, and since, it's featured a couple of times in subsequent guidance documents that the administration has put out,” he added. While Almonte said he wasn't aware of a permit being denied in that fashion to date, it's a major consideration for companies when planning projects. Another potential show-stopper could be triggered through Section 303 of the Clean Air Act. This section provides “emergency powers” to the Environmental Protection Agency (EPA). “When there's an environmental threat that poses an imminent and substantial endangerment to the public, or to the environmental welfare, then EPA can essentially stop that activity or file a lawsuit against it,” Almonte explained. “This is true even if the activity that's causing the supposed endangerment is allowed by the permit.” According to Almonte, the EPA has only used this authority 14 times in the past five decades, but four of those occurrences have been in the past two years. This suggests it could become a regular tool used by the administration to achieve its EJ goals.

    143. Power Grid Investments Improve Reliability and Make Blackouts Less Likely

    Play Episode Listen Later Aug 30, 2023 24:39


    While power outages are not uncommon in the U.S., widespread blackouts that last more than a couple of hours are pretty rare. However, this summer marks the 20th anniversary of one of the most significant blackouts in North American history. The incident affected not just the U.S., but also major parts of Canada. The blackout occurred on Aug. 14, 2003. The History Channel reports it began at 4:10 p.m. EDT, when 21 power plants shut down in just three minutes. Fifty million people were affected, including residents of New York City, Cleveland, and Detroit, as well as Toronto and Ottawa, Canada, among others. Although power companies were able to resume some service in as little as two hours, power remained off in other places for more than a day. The outage stopped trains and elevators, and disrupted everything from cellular telephone service to operations at hospitals and traffic at airports. “It was close to quitting time in the afternoon, and given the warm weather in the middle of the summer and thunderstorm season, our system was holding up well. I was looking forward to actually leaving on time for a change,” Paul Toscarelli, senior director of Electric Transmission and Distribution (T&D) Operations for the Palisades Division with Public Service Electric and Gas (PSE&G), New Jersey's largest utility, said as a guest on The POWER Podcast. Toscarelli was an engineer assigned to one of PSE&G's regional distribution divisions at the time and was in the distribution dispatch office when the incident occurred. He recalled the event quite vividly. “We were coming up around the second anniversary of 9/11, as I recollect, and just about everyone's gut feel—instinctive feel—was this was another kind of terrorist attack,” Toscarelli said. “Looking back at it, it was very strange to recollect how relieved we were to find out it was just a widespread system outage of epic proportions.” Of the 750,000 PSE&G customers who lost power that day, nearly three-quarters were back online within five hours and virtually all had service by noon the next day. PSE&G said diversification and design protections helped to contain the outage, and the company was able to safely reenergize the system circuit by circuit. “The industry learned a lot about the electric system vulnerabilities,” said Toscarelli. Based on studies of the incident, the North American Electric Reliability Corporation (NERC) enhanced its standards in an effort to prevent future blackouts. Since the 2003 blackout, PSE&G has spent billions of dollars to further enhance the reliability and resiliency of its T&D systems with the aim of mitigating future outages. In fact, the company's planned capital expenditures this year are the largest in the utility's history—more than $3.5 billion. Among the projects PSE&G expects to complete in 2023 is the Newark Switch Rebuild Project. The Newark Switching Station is the heart of the company's Newark T&D network. The $350 million project will modernize aging infrastructure that was put into service in 1957. Another example is the $550 million Roseland-Pleasant Valley Project, which was completed in May and was one of PSE&G's largest transmission projects to date. The 51-mile undertaking replaced transmission facilities that were, on average, about 90 years old. “Infrastructure continually ages.
It's our job as the stewards of our system to monitor the usage of our equipment, inspect it, maintain it, and replace it where it's deemed necessary, in a timely manner, and continuously repeat that process,” said Toscarelli. “We have an asset management model that involves risk assessment and risk scoring, and it lets us stay in the forefront of this.”

    142. Nuclear Power, Electrification, and Carbon-Free Fuel Are Key to INL Achieving Net-Zero by 2031

    Play Episode Listen Later Aug 23, 2023 45:41


    In 2021, Idaho National Laboratory (INL) Director John Wagner set a lofty goal for the lab to achieve net-zero carbon emissions within 10 years. An uninformed observer might think that would be an easy task for an organization as focused on energy as INL, but it's important to recognize that the lab is spread over nearly 900 square miles—about three-quarters the size of the state of Rhode Island. To shuttle the lab's nearly 5,400 employees everywhere they need to go across that vast territory, INL has a fleet of about 85 motor coaches with an operating schedule that runs 24 hours a day, seven days a week. With all the transportation and 357 buildings to heat and cool throughout the year, achieving net-zero is a significant challenge. Jhansi Kandasamy, INL's net-zero program director, explained that more than half of the lab's carbon emissions come from purchased electricity. That means INL has to work with Idaho Power to cut much of its emissions. “Probably 60 to 80% is already pretty clean—carbon-free—because they have hydro as a majority electricity generation,” Kandasamy said as a guest on The POWER Podcast, but that still leaves a fairly large gap to fill. “With my background in nuclear and nuclear being dependable, secure, 24/7, we've worked with Idaho Power to say, ‘We'd like to include nuclear as the generation,' ” Kandasamy said. “If we accomplish that—if we get nuclear—that addresses the 54% of carbon emissions that we get from purchasing electricity. Without doing anything else, we would have reduced our carbon emissions by 70%.” The Carbon Free Power Project, spearheaded by Utah Associated Municipal Power Systems (UAMPS), with NuScale Power's VOYGR small modular reactor technology at its heart, seems like a logical fit for Idaho Power's needs. The six-module plant will be built on INL property. Kandasamy said INL helped get some potential project partners, including folks from UAMPS, NuScale, Idaho Power, Idaho Falls Power, and the Department of Energy (DOE), in a room to talk about the project and what needed to be done to ensure it is operational within the next decade. “It's a collaboration effort instead of competition. It's all collaboration—getting all the people that are the experts in the room and kind of working through it. And it's been great in that they're all coming up with these different ideas,” she said. In addition to motor coaches, INL has more than 600 other vehicles in its transportation fleet. Kandasamy said there are plans to electrify much of the fleet, as well as to add some hydrogen-fueled vehicles and to run others on carbon-free fuels, such as R99 (renewable diesel), all of which will help cut carbon emissions. Still, getting the vehicles poses a challenge. INL is required to source its vehicles through the DOE, and the DOE's supply of electric and hydrogen-fueled models is lacking. “The Executive Order says by 2027 we need to have all of our light-duty vehicles transition to electric. That's not far away. We have 240 vehicles—light-duty vehicles—that we need to transition. We've gotten 24,” Kandasamy said. Yet, employees may be the real key to success. Kandasamy said the staff at INL has really gotten behind the initiative. “The big push is really the cultural shift across the entire laboratory. So, the communication becomes a really huge part of saying, ‘Here's what we're doing for each scope. Here's how each of the employees contributes to getting us to net-zero,' ” she said. 
“We've been putting in all these communications about how we're transitioning. The other part is for the employees to tell their story on how they are achieving net-zero,” said Kandasamy. “That has been huge. Now, it's like, everybody wants to have their story. So, they start talking about how they are transforming in their personal life, as well as how they're commuting to work, and so on, with net-zero stories.”

    141. CTOTF Conference: ‘Best One-Stop Shop to Hit It All'

    Play Episode Listen Later Aug 9, 2023 18:12


    The Combustion Turbine Operations Technical Forum (CTOTF) is the longest continuously active gas turbine industry organization driven by users, for users. CTOTF offers week-long conferences twice annually in the spring and fall. The conferences provide a balance of technical information, user-to-user interaction, and professional development and mentoring for the group's nationwide user base. CTOTF's 2023 Fall Conference will be held September 24–28 at the Mystic Lake Casino Hotel in Prior Lake, Minnesota. As a guest on The POWER Podcast, Dave Tummonds, senior director of Project Engineering with Louisville Gas and Electric (LG&E) and chairman of the board for the CTOTF, talked about the group and some of the things he's looking forward to during the upcoming event. “The biggest thing for me is, when you look at our agenda and what we strive to accomplish over the course of a week-long conference, we hit a lot of things that admittedly some other conferences hit, but we tend to be the best one-stop shop to hit it all,” Tummonds said. Sessions encourage interaction from all attendees and offer an intimate setting where newcomers don't get lost in the crowd. The agenda begins with opening presentations that often dive into industry trends, among other things. This fall, Aron Patrick, director of Research and Development (R&D) with PPL Corp., parent company of LG&E, will give a presentation focused on the energy transition. “On our kickoff day—Monday morning—we're going to have an update from my company's R&D director, who's going to go over some of the things that are being done in the heart of coal country—in Kentucky and similar areas—in preparation for the decarbonization effort,” said Tummonds. “What makes this interesting, I believe, is his analysis, and his group's analysis, which really points out that as we seriously look to decarbonize, we've got to do that with more backup from gas-fired megawatts as opposed to less. It's just a necessity to make up for the times when those renewable megawatts are not available.
“The other thing I would mention associated with his presentation is he's going to touch on some efforts in the area of hydrogen blending that his group is specifically looking at, as well as carbon capture and sequestration, that again, when you look at the unique perspective of the heart of coal country, I think serves as an important note for us all.” On the podcast, Tummonds touched on many of the other sessions and activities that are planned this fall too. Among the highlights are presentations by original equipment manufacturers, topical discussions with third-party suppliers and other experts, technical education sessions, leadership development roundtables, environmental updates, and plenty of time for networking and fun.
