Amir Cohen, founder & CEO of EGM (Electrical Grid Monitoring, Inc), argues that ‘multi-sensing’ technologies deployed on the current grid, accurately analysing many parameters in real-time, could greatly increase grid capacity and improve performance, reducing the need for spending on new infrastructure


The current energy security and climate crises are creating a huge demand for new power grid infrastructure and a rising risk of gridlock. Electricity grids will need to dramatically increase capacity to fulfil the growing demand for everything from electric cars to heating systems and to incorporate more renewable power. Yet the enormous grid expansion that is projected could create unsustainable additional costs to be borne by taxpayers and consumers at a time of soaring inflation, and cause disruption to local ecosystems and communities. And few realise that much of the demand for new grid capacity is being driven by wasteful and inefficient usage of existing infrastructure.

Outdated methods of monitoring network conditions and capacity mean that global networks are suffering major power losses and running significantly below their true capacity, creating needless congestion and constraints on renewable generation. Cumulatively, this is hampering the electrification of the economy and decarbonisation of power and driving excessive and unsustainable demand for new infrastructure.

The capacity crunch

A looming power grid capacity crunch is impeding the transition to more secure, affordable, and clean energy. Record amounts of future renewable capacity are caught in interconnection queues while capacity constraints are forcing grid operators to curtail supply from existing renewable generators. This not only hampers the energy transition but creates a growing risk that soaring demand from electric cars and heating could cause power shortages and outages.

Some reports estimate that the energy transition requires a century of electric grid infrastructure development in just a decade. Even excluding net zero targets, some analysts project that future generation and demand would need at least $14 trillion worth of new transmission and distribution infrastructure by 2050. And global grid expansion is already far behind target due to spiralling supply chain costs, cumbersome permitting processes and land-use conflicts with local communities and environmental groups. The speed and scale of infrastructure envisaged are already facing major resistance due to the potential impact on everything from flora and fauna to water pollution.

Blind spots on the network

Widespread network inefficiencies are wasting enormous amounts of existing renewable power and network capacity that could avert the need for excessive new infrastructure construction. Many utilities have very limited oversight of primary circuits using basic SCADA (supervisory control and data acquisition) systems and little to no visibility of the secondary circuits beyond their substations. Operators currently rely on rough estimates of network conditions or capacity from a few narrow parameters such as weather conditions or faults, leaving major data blind spots across their networks. This denies them the opportunity to pinpoint and prevent electricity losses, or to see where they could share loads between lines. Crucially, limited data prevents them from taking advantage of favourable weather to increase power flows without exceeding safe conductor temperatures.

For example, grid operators currently rely on crude, inaccurate calculations based on limited parameters such as windspeed to estimate ampacity, the maximum current that overhead power lines can carry without overheating. This means they often set excessively cautious capacity limits and curtailments that fail to take into account the greater thermal capacity of lines during cooler conditions. As a result, many power grids are running at 20% below their true capacity. Underestimating ampacity also leads to renewable generators being needlessly curtailed and replaced with dirty power sources to lighten the loads on long-distance transmission lines during periods of peak demand. Resolving this would therefore help operators achieve electricity decarbonisation targets and increase capacity with existing infrastructure.
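The thermal logic behind a weather-dependent rating can be illustrated with a simplified steady-state heat balance, loosely modelled on the IEEE 738 approach: the allowable current is whatever makes resistive heating plus solar gain equal convective plus radiative cooling at the conductor's temperature limit. All coefficients below are illustrative assumptions for a generic conductor, not the method used by any particular operator or vendor:

```python
import math

def ampacity_simplified(t_ambient_c, wind_ms, solar_wm2,
                        t_max_c=75.0, diameter_m=0.028,
                        r_ac_ohm_per_m=7.3e-5,
                        emissivity=0.8, absorptivity=0.8):
    """Toy steady-state thermal rating (loosely after IEEE 738).

    Heat balance at the maximum allowed conductor temperature:
        I^2 * R + q_solar = q_convective + q_radiative
    Every coefficient here is illustrative, not a real line model.
    """
    sigma = 5.67e-8                # Stefan-Boltzmann constant, W/m^2K^4
    t_s = t_max_c + 273.15         # conductor surface temperature, K
    t_a = t_ambient_c + 273.15     # ambient temperature, K

    # Radiative cooling per metre of conductor (grey-body exchange)
    q_r = emissivity * sigma * math.pi * diameter_m * (t_s**4 - t_a**4)

    # Crude forced-convection cooling per metre: h ~ a + b*sqrt(v)
    h = 10.0 + 30.0 * math.sqrt(max(wind_ms, 0.1))  # W/m^2K, illustrative
    q_c = h * math.pi * diameter_m * (t_max_c - t_ambient_c)

    # Solar heating per metre (projected area = conductor diameter)
    q_s = absorptivity * solar_wm2 * diameter_m

    # Current that balances heating against cooling
    return math.sqrt(max(q_c + q_r - q_s, 0.0) / r_ac_ohm_per_m)

# A cool, windy day supports a much higher rating than a hot, still one:
hot_still = ampacity_simplified(t_ambient_c=40, wind_ms=0.5, solar_wm2=1000)
cool_windy = ampacity_simplified(t_ambient_c=10, wind_ms=5.0, solar_wm2=200)
```

Even this toy model shows why a single static rating is so conservative: the same line can safely carry far more current in cool, windy conditions than the worst-case assumptions allow for.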

The lack of network visibility also means utilities are missing other opportunities to maximise efficiency and output without new infrastructure. For example, the lack of current, comprehensive data on loads across feeder lines is hampering operators from balancing loads between parallel lines to enhance grid reliability and flexibility. And outdated grid monitoring systems relying on limited, late information from call centres or technicians mean that power loss and theft are going undetected across many networks.

A pioneering new approach

Faced with these issues, some pioneering utilities are now harnessing ‘multi-sensing’ technologies that accurately analyse multiple parameters in real-time to minimise power losses, optimise usage and maximise grid capacity. This could help operators enhance grid performance while deferring or reducing capital investment in new infrastructure.

A recent pilot saw the world's first deployment of multi-sensing systems on transmission lines to enable dynamic line rating, which allows power flows to be increased during cooler weather conditions without exceeding safe conductor temperature limits.

The Meta-Alert™ Grid Operation and Management System, installed by EGM (Electrical Grid Monitoring, Inc) on a 161 kV line in the Middle East, was operational within minutes of installation and is analysing data on over 60 parameters, from voltage to temperature, helping accurately predict the maximum current that overhead lines can carry several days ahead. This will help operators harness favourable weather conditions to safely increase capacity, relieve congestion, and integrate more renewable power into networks without unnecessary extra infrastructure. By unlocking spare capacity on long-distance high-voltage transmission lines, this could also remove the need to curtail distant renewable energy generators during peak periods.

The same ‘multi-sensing’ technology can also use feeder or meter sensors to identify low power levels or leaking grid components and even find common causes of power loss across multiple sites, helping networks conserve power. This could ultimately be combined with machine learning systems to create smart ‘self-healing grids’ that can anticipate and avert power loss or other faults before they occur.
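One simple first-pass screen of this kind compares the energy a feeder delivers with the sum of its downstream meter readings; a persistent shortfall beyond normal technical losses points to leakage or theft. The function, threshold, and topology mapping below are hypothetical illustrations, not a description of any vendor's actual method:

```python
def flag_loss(feeder_kwh, meter_kwh_list, tolerance=0.05):
    """Flag a feeder whose delivered energy exceeds the sum of its
    downstream meter readings by more than a tolerance fraction.

    A crude energy-balance screen for losses or theft; the 5%
    threshold is an illustrative assumption, not an industry value.
    """
    metered = sum(meter_kwh_list)
    loss_fraction = (feeder_kwh - metered) / feeder_kwh
    return loss_fraction, loss_fraction > tolerance

# 1,000 kWh delivered, 940 kWh metered: a 6% gap trips the 5% screen.
frac, suspicious = flag_loss(1000.0, [240.0, 310.0, 390.0])
```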

The technology can also accurately analyse load measurements from each section of feeder lines to help operators find opportunities to balance loads between parallel power lines, increasing the combined capacity of the network. The system can offer load balancing recommendations and even model new load conditions to find the optimal ways of maximising capacity using existing lines. It has additional applications from predictive maintenance to balancing renewable supply and demand in real-time and even improving grid resilience against extreme weather.
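The core arithmetic of such a load-balancing recommendation can be sketched as follows: given the measured load and rated capacity of two parallel lines, find the transfer that equalises their utilisation. This is a deliberately simplified illustration under assumed numbers; a real study would also model impedances, switching constraints, and voltage limits:

```python
def balance_recommendation(load_a_mw, cap_a_mw, load_b_mw, cap_b_mw):
    """Suggest a transfer (MW, positive = shift from line A to B) that
    equalises the utilisation of two parallel lines.

    Solves (load_a - x)/cap_a == (load_b + x)/cap_b for x.
    Illustrative only; real networks need full power-flow studies.
    """
    x = (load_a_mw * cap_b_mw - load_b_mw * cap_a_mw) / (cap_a_mw + cap_b_mw)
    util_a = load_a_mw / cap_a_mw   # utilisation before the transfer
    util_b = load_b_mw / cap_b_mw
    return x, util_a, util_b

# Line A at 90% and line B at 40% of a 100 MW rating:
transfer, ua, ub = balance_recommendation(90.0, 100.0, 40.0, 100.0)
# Shifting 25 MW leaves both lines at 65% utilisation.
```

Equalising utilisation in this way maximises the headroom of the most heavily loaded line, which is what allows the combined circuit to absorb more demand without new construction.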

Cumulatively, this could drive a sea-change in network capacity, with power grids using comprehensive, current data on everything from voltage to temperatures to dynamically boost capacity and reduce power loss across networks. This could create smart, versatile networks that continuously increase flexibility and capacity, helping integrate more renewable energy without excessive new infrastructure.

With grids rapidly approaching gridlock and construction lagging far behind targets, increasing network efficiency represents the most economically and environmentally sustainable path to boosting global capacity.

This article also appeared in Modern Power Systems magazine.