When all electricity comes from a single source—perhaps a coal-fired or nuclear power plant—figuring out how to distribute the output is a complex but largely solved problem. But with a growing suite of energy production sources, including wind and solar, now being deployed, fine-tuning the delivery to the end consumer becomes more challenging: What percentage should flow from wind power? How do we store solar energy? When should a “dirty” power plant—such as a coal-fired plant that emits greenhouse gases and may contribute to air and water pollution—kick in, if at all?
According to the Renewable Energy Policy Network, “additions in installed renewable power capacity set new records in 2016, with 161 gigawatts (GW) installed, increasing total global capacity by almost 9 percent over 2015, to nearly 2,017 GW. Solar PV accounted for around 47 percent of the capacity added, followed by wind power at 34 percent and hydropower at 15.5 percent.” With combating climate change becoming ever more pressing, the world is turning to renewable solutions to meet energy demand. Technological innovations are helping to answer the complex questions about managing, monitoring, and optimizing this more sustainable array of assets all along the production and distribution chain.
Today’s electricity landscape is chaotic with different kinds of distributed energy resources, or DERs: EV charging stations, solar panels, grid-scale batteries, and home batteries are just some of the many ways we receive power. What’s more, DERs often come with their own data-gathering interfaces. “Smart” grids and “smart” IoT meters that monitor energy consumption in real time and generate large amounts of data are everywhere. These smart devices make energy consumption more transparent, helping to pave a smoother two-way street between utility companies and consumers.
Kyle Garton, Principal Product Manager at AutoGrid, a California-based company that builds software to show energy consumption and forecast future needs, noted that “nowadays it’s a lot more cost-effective to implement a lot more sensors throughout [a] network.” These sensors and IoT devices generate large volumes of data about energy consumption, peak loads, energy production from specific assets, and so on. When analyzed well, these data points can paint a clear picture of how energy capacity and consumption are flowing, and even indicate when assets need to be taken off the grid for repairs.
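One way sensor data can flag assets that need to come off the grid for repairs is simple statistical outlier detection across a fleet. The sketch below is an illustrative assumption about how such a check might look, not AutoGrid's actual analytics:

```python
from statistics import mean, stdev

def flag_anomalous_assets(readings, threshold=2.0):
    """Flag assets whose average output deviates strongly from the fleet.

    readings: dict mapping asset id -> list of recent output samples (kW).
    Returns asset ids whose mean output is more than `threshold` standard
    deviations from the fleet-wide mean, as candidates for inspection.
    """
    asset_means = {asset: mean(samples) for asset, samples in readings.items()}
    fleet_mean = mean(asset_means.values())
    fleet_std = stdev(asset_means.values())
    if fleet_std == 0:
        return []
    return [
        asset
        for asset, m in asset_means.items()
        if abs(m - fleet_mean) / fleet_std > threshold
    ]
```

A production system would of course account for expected variation (weather, time of day) before flagging anything, but the principle of comparing each asset against fleet-wide behavior is the same.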
AutoGrid’s two application suites are built upon the AutoGrid Energy Internet Platform, and use proprietary AutoGrid Predictive Analytics. The AutoGrid Energy Internet Platform is the foundational IoT platform that uses Hadoop, Redis, Kafka, and platform services written in Python and Java. AutoGrid Predictive Analytics is the data science engine, primarily composed of Python models and services, leveraging Spark for speed and scale. The AutoGrid Flex and Engage application layers are primarily Ruby on Rails with AngularJS. Everything is containerized with Docker and managed with Kubernetes.
Such software is also playing an important role in empowering consumers in the renewable energy equation, and can be seamlessly integrated into today’s “smart” grid. Case in point: The AutoGrid Flex platform interfaces with a wide variety of IoT devices, from residential to industrial-scale energy applications. In addition to energy-consumption data, typical residential appliances may also provide telemetry about air temperature, humidity, water temperature, and occupancy. Industrial devices often generate a variety of interesting process-specific data, but some of the most common and useful measurements include wind speed, solar irradiance, and thermal limits. These data streams can be leveraged by the AutoGrid machine learning algorithms to enhance forecasting and optimization of flexible energy resources throughout the network.
Since the sun doesn’t always shine on a given solar farm, nor does the wind reliably blow, solar and wind sources can’t be counted on to provide electricity 24/7—excess energy generated during peak production must be stored and dispatched as needed. Efficient storage of excess energy is critical if renewables are to play a significant role in meeting energy demands. But with store-and-deploy methods come additional questions. How much energy can be stored? When will it need to be released back into the grid? When does the utility have to switch from one type of DER to another? Effective software helps here, too—it can balance the grid as the industry moves toward accommodating greater percentages of renewable resources. Integrated flexibility management software can be designed to act like a benevolent overlord, utilizing advanced analytics for real-time optimization and asset dispatching.
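The store-and-deploy balancing described above can be sketched as a merit-order dispatch: serve demand from renewables first, then from stored energy, and only then fall back to a fossil "peaker" plant. This is a deliberately simplified model (one interval, no conversion losses), not any vendor's actual algorithm:

```python
def dispatch(demand_kw, solar_kw, wind_kw, battery_kwh, peaker_capacity_kw,
             interval_hours=1.0):
    """Merit-order dispatch for one interval: renewables first, then stored
    energy, then a fossil 'peaker' plant as a last resort.

    Returns a dict of how much each source supplied (kW) plus the battery's
    remaining charge (kWh). Surplus renewable output charges the battery.
    """
    supplied = {}
    renewable = solar_kw + wind_kw
    if renewable >= demand_kw:
        # Surplus: store the excess in the battery (losses ignored here).
        supplied["renewable"] = demand_kw
        battery_kwh += (renewable - demand_kw) * interval_hours
        supplied["battery"] = 0.0
        supplied["peaker"] = 0.0
    else:
        supplied["renewable"] = renewable
        shortfall = demand_kw - renewable
        from_battery = min(shortfall, battery_kwh / interval_hours)
        supplied["battery"] = from_battery
        battery_kwh -= from_battery * interval_hours
        supplied["peaker"] = min(shortfall - from_battery, peaker_capacity_kw)
    supplied["battery_kwh_remaining"] = battery_kwh
    return supplied
```

Run over successive intervals, a loop like this answers the questions above in miniature: how much is stored, when it is released, and when the utility must switch from one DER to another.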
Such constant monitoring and optimization is the key driver behind OhmConnect, a mobile app widely adopted in California that rewards customers for saving energy when a dirty power plant is about to be pressed into service.
OhmConnect’s basic stack is Python, Flask, Vue.js, MySQL, and Apache. It runs on AWS, using many of their services, as well as other third-party services.
OhmConnect constantly monitors the California ISO, which oversees the state’s grid and power supply. A spike in wholesale energy prices at the ISO indicates increased consumer demand, which might lead to the deployment of dirty power plants. When spikes occur, OhmConnect sends a message to its consumers, encouraging them to decrease their energy consumption. And because utilities actually have to pay more to use dirty power plants, they reward good behavior—positive actions following energy-saving requests are recognized with money. If, for example, an OhmConnect consumer saves one kilowatt hour (kWh) of electricity, the California ISO will reward OhmConnect as if that consumer generated one kWh. OhmConnect in turn passes a significant portion of that savings to its end user.
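The notify-and-reward loop just described can be sketched in a few lines. All the specific numbers here (the spike threshold, the payout share) are illustrative assumptions, not OhmConnect's actual parameters:

```python
def notify_and_reward(price_history, current_price, kwh_saved,
                      spike_multiplier=2.0, payout_share=0.8):
    """Detect a wholesale price spike and compute a user's reward.

    A price well above the recent average suggests dirty peaker plants may
    be dispatched; if so, users are asked to save energy, and each kWh
    saved is credited as if it had been generated, with most of that
    payment passed through to the end user.
    """
    baseline = sum(price_history) / len(price_history)
    spike = current_price >= baseline * spike_multiplier
    # Credit from the grid operator for "generating" the saved energy.
    user_payout = kwh_saved * current_price * payout_share if spike else 0.0
    return spike, round(user_payout, 2)
```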
Curtis Tongue, co-founder and CMO of OhmConnect, said that this demand response is based on a concept called energy sharing, and compares the process to that of companies like Lyft and Airbnb: “Instead of your car or your home as a resource, it’s your electricity.”
By connecting their utility account to OhmConnect, consumers give the company access to their smart meter data, which can be parsed into 15-minute intervals to calibrate and visualize their energy consumption. A mobile app offers the end consumer a direct pathway to the ISO—and the company has a way to contextualize user participation. “When we have direct access to smart meter data, we can quantify what carbon reductions are. Climate change is a really nebulous challenge for a lot of people,” Tongue said. But keeping things close to home can give consumers a sense of control, and Tongue highlighted the value of geographically localized impact. “There’s this dirty power plant that is metaphorically in your backyard that you are helping keep offline. Your individual contribution is making an impact.”
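Quantifying those carbon reductions from 15-minute smart meter intervals amounts to comparing actual usage against an expected baseline. A minimal sketch, assuming a placeholder grid carbon intensity (0.3 kg CO2/kWh is illustrative, not a published figure for California's grid):

```python
def carbon_savings(baseline_kwh, actual_kwh, grid_intensity_kg_per_kwh=0.3):
    """Estimate CO2 avoided from 15-minute smart meter intervals.

    baseline_kwh / actual_kwh: per-interval consumption, with the baseline
    representing expected usage had no energy-saving request been sent.
    Returns (kWh saved, kg CO2 avoided).
    """
    saved_kwh = sum(
        max(expected - used, 0.0)  # only count intervals where usage dropped
        for expected, used in zip(baseline_kwh, actual_kwh)
    )
    return saved_kwh, saved_kwh * grid_intensity_kg_per_kwh
```

Computing the baseline itself—what a household *would* have used—is the genuinely hard part, and is where the smart meter's historical interval data comes in.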
A microgrid is a smaller network of energy supply, storage, and distribution assets that is connected to a central grid but which can also operate autonomously.
Also making waves in the renewable energy sector are companies facilitating more efficient management of microgrids.
Depending on its specifications, an individual microgrid can consist of a wind farm or solar energy panels, energy storage based on batteries, and perhaps a diesel generator as bulletproof backup. “Once you have selected those things, actually having them communicate and work together is […] very complicated,” said Francisco Morocz, CEO of Heila Technologies. “Usually you have big companies such as Siemens come in, and, for a handsome fee, they will customize what we call a central controller or SCADA—supervisory control and data acquisition.”
The central controller and the individual assets all run proprietary firmware and communicate with each other using industrial protocols. This system requires careful integration so that the central controller can manage these assets, which makes installation a complex and expensive engineering project. Similarly, once the microgrid is operating, any changes to it—such as adding or removing assets—require another integration project, as these changes introduce dependent changes on other assets in the microgrid.
Heila aims to make this process easier. It installs a small box called a Heila IQ in front of a specific bank of assets, which then talks to the central controller and optimizes the output from each individual energy asset. With Heila IQ, every asset in the microgrid is abstracted and standardized. It doesn’t matter if the box is connected to a battery or a solar panel or an inverter—the software aggregates the assets within the microgrid so they can talk to a central controller.
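The abstraction idea—battery, solar panel, or inverter, it's all the same to the controller—is essentially a common interface that every asset type implements. The sketch below is a hypothetical API for illustration, not Heila's actual software:

```python
from abc import ABC, abstractmethod

class Asset(ABC):
    """Common interface behind which any microgrid asset can sit, so the
    central controller never needs device-specific integration code."""

    @abstractmethod
    def available_kw(self) -> float:
        """Power the asset can currently supply."""

    @abstractmethod
    def set_output_kw(self, kw: float) -> None:
        """Request an output level; the asset clamps to its own limits."""

class Battery(Asset):
    def __init__(self, charge_kwh, max_kw):
        self.charge_kwh, self.max_kw, self.output_kw = charge_kwh, max_kw, 0.0

    def available_kw(self):
        return self.max_kw if self.charge_kwh > 0 else 0.0

    def set_output_kw(self, kw):
        self.output_kw = min(kw, self.available_kw())

class SolarArray(Asset):
    def __init__(self, irradiance_kw):
        self.irradiance_kw, self.output_kw = irradiance_kw, 0.0

    def available_kw(self):
        return self.irradiance_kw

    def set_output_kw(self, kw):
        self.output_kw = min(kw, self.irradiance_kw)  # can only curtail

def total_available(assets):
    """A controller can now treat every asset identically."""
    return sum(a.available_kw() for a in assets)
```

Adding a new asset type means writing one adapter class, rather than re-integrating the whole microgrid.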
Heila’s local pieces run on Linux, use Java and Python, and leverage several open-source libraries for communication protocols. Cloud-based pieces run on AWS, using AWS and other third-party services.
The Heila IQ box runs powerful software that presents an abstract view to the operator. Instead of directly controlling the individual assets, the operator describes higher-level goals and constraints such as “reduce emissions” or “avoid using gas-based generators because they are expensive.” Then, as the microgrid is operating, the Heila IQ automatically controls the assets to try to optimize for these goals and satisfy the constraints. Later, if the operator adds new assets to the microgrid, they don’t need to configure the individual assets or try to rebalance the system. As long as they specify the higher-level goals and constraints, the Heila IQ-based microgrid continues to control the assets appropriately. You can then have two microgrids with two central controllers talking to each other through what is called an aggregator.
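Controlling assets from high-level goals rather than directly can be illustrated with a toy dispatcher: the operator's goals become cost scores (gas generators penalized, for instance), and the system allocates demand to the cheapest assets first. A real controller would run a proper constrained optimization; this greedy sketch just shows the shape of the idea:

```python
def optimize_dispatch(demand_kw, assets):
    """Allocate demand across assets, cheapest (by goal-derived score) first.

    assets: list of (name, capacity_kw, cost_score) tuples, where cost_score
    encodes operator goals like "avoid gas generators."
    Returns {name: kw} allocations.
    """
    plan = {}
    remaining = demand_kw
    for name, capacity_kw, _cost in sorted(assets, key=lambda a: a[2]):
        take = min(remaining, capacity_kw)
        plan[name] = take
        remaining -= take
        if remaining <= 0:
            break
    if remaining > 0:
        raise ValueError(f"demand exceeds capacity by {remaining} kW")
    return plan
```

Note that adding a new asset here only means appending a tuple—the goals and the dispatch logic are untouched, which mirrors the plug-and-play property described above.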
On top of that, there’s a system operator, the utility. “Instead of being centralized with a system operator trying to control a hundred million things at the same time, it’s all done level by level, where the complexities of one level don’t transpire to the complexities of the level above or below,” Morocz said. In Heila-powered microgrids, every asset is equipped with a Heila IQ optimizer, which forms a distributed intelligent network with the other Heila IQs and can talk to any controller. It’s part of a decentralized optimization strategy for energy assets. Because the network can monitor all parts of the system, it can avoid disruptions and kick in or dial back assets to maintain a constant flow of energy. Such equipment provides nimble and modular solutions as the power grid swings between various sources of energy.
Drones are useful monitoring tools, especially in precarious conditions where employing humans can be dangerous and expensive. Wind turbines, for example, tower overhead with blades like enormous airplane propellers, and sending crews on inspection missions can be nerve-wracking; installing solar panels en masse may first require surveying large swaths of land. Instead of subjecting human employees to this type of challenge and risk, drones can inspect wind turbines and solar panels to ensure optimal performance and monitor for damaged assets. The software packaged on these drones helps gather and analyze data from DERs so strategic decisions can be made about their use.
An example of how this is developing can be found in DroneDeploy, which builds software that does three things: enables users to fly drones, processes data into maps and 3D models using photos that are collected, and analyzes the data. Programs like DroneDeploy’s can be used by solar companies to provide topographic maps of the land under survey and, once the solar panels are installed, to monitor for defects. Mike Winn, founder and CEO of DroneDeploy, pointed to the example of SunPower, a solar industry player that uses DroneDeploy software to streamline the prospecting and design layout process when installing new solar farms.
Winn said that solar plant operators can also attach thermal cameras to drones to help identify solar cells that are less efficient, perhaps even broken: A solar cell that’s absorbing all the energy and producing electricity is going to be much cooler than one that is not.
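That thermal signature lends itself to a simple screening rule: a cell that isn't converting light to electricity sheds that energy as heat, so it reads hotter than its neighbors. A minimal sketch (the 10 °C threshold is an illustrative assumption, not an industry standard):

```python
from statistics import median

def hot_cells(cell_temps_c, delta_c=10.0):
    """Flag solar cells whose thermal-image temperature is well above the
    panel median, marking them as candidates for repair or replacement.

    cell_temps_c: dict of cell id -> temperature in degrees Celsius.
    """
    typical = median(cell_temps_c.values())
    return sorted(c for c, t in cell_temps_c.items() if t - typical > delta_c)
```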
Avitas is a cloud-based platform built on GE’s Predix.
Proactive monitoring of renewable energy equipment helps keep systems healthy and efficient. This is a strong suit for companies like GE’s Avitas Systems, whose software targets specific points of inspection and develops paths to collect data in the form of images and video for a variety of robotics, including drones. (The system also works with robotic crawlers and autonomous underwater vehicles, or AUVs.)
The software creates paths, driven by 3D models, that can be repeated from the same angles and locations. The paths’ repeatability means a wide variety of images captured over time can be input into the Avitas Systems cloud-based platform, where advanced image analytics can detect changes and measure exact defects—such as cracks and corrosion—on industrial assets. The platform can then rate the severity of the defects, which are often not visible to the human eye, enabling earlier resolution of potential issues.
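Why path repeatability matters becomes clear if you think about change detection at its crudest: two images shot from the same angle and location can be compared pixel by pixel. The toy sketch below differs co-registered grayscale grids; Avitas Systems' actual analytics use learned models rather than raw differencing, so this is only an illustration of the underlying idea:

```python
def changed_regions(before, after, threshold=30):
    """Toy change detection between two co-registered grayscale images
    (nested lists of 0-255 ints) captured from the same inspection path.
    Returns (row, col) coordinates where intensity shifted beyond
    `threshold`, i.e., candidate defect locations to inspect more closely.
    """
    return [
        (row, col)
        for row, (r_before, r_after) in enumerate(zip(before, after))
        for col, (p0, p1) in enumerate(zip(r_before, r_after))
        if abs(p1 - p0) > threshold
    ]
```

Without a repeatable path, the two images wouldn't line up, and every comparison would first require expensive registration—which is exactly what the 3D-model-driven paths avoid.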
The Avitas Systems inspection platform centralizes and stores the data, allowing for archival searches of inspection records. Avitas then fuses inspection data, from both manual and autonomous inspection, with asset performance data, external data sources (e.g., weather reports), and new inputs from subsequent inspections. Advanced algorithms automatically detect asset defects and anomalies. As more data is ingested across diverse sources, the deep learning models retrain for smarter actionable insights. The platform uses predictive analytics to recommend targeted, risk-based inspection scheduling and planning.
Dominique Mann, Communications Manager at Avitas Systems, noted that automated defect recognition involves AI training. Avitas Systems’ data scientists build neural networks for image classification, along with generative adversarial networks (GANs) that minimize the amount of work involved in labeling captured images. The software is trained on many different images, across a variety of models, until it is ready to identify defects on its own.
The road toward renewable energy solutions can be bumpy, as grids have to work with a range of distributed energy resources, whether their source is solar, wind, coal, or beyond. But software processes are working to make it smoother. Sensitive sensors with IoT technology are talking to software programs that monitor and fine-tune operations and optimize delivery options. And software platforms are leveraging the power of big data to provide continuous real-time monitoring and fine-tuning of critical infrastructure. It’s a big task, but there are teams building software platforms that are up for the challenge.
“What are the two most complex machine systems ever made?” AutoGrid’s Garton asked. “People either say the internet or the power grid. We’re putting the two together. We’re turning big data into power.”