What lies below the curve? What options are available to researchers looking to gain insight into household end-use loads?
Finally! The quality of load profile data from advanced metering infrastructure (AMI) systems is sufficient to provide reliable data for advanced utility analytics. But what if the researcher wants to get below the total premise load profile curve to understand and quantify the major household end uses that make up that total load? Possible, yes, but accomplishing this next level of investigation depends on how much funding the researcher has and what level of accuracy the researcher requires.
End-Use Load Shapes and Aggregated Load Curve
The first option, direct measurement, is the costliest and most intrusive but likely the most accurate. Direct measurement has been with us a long time and typically involves instrumenting individual household appliances and major end uses with specialized metering and data-logging equipment that captures energy consumption at a relatively high frequency, typically 15-minute or even shorter intervals. End uses can be instrumented either at the end use itself or at the customer panel. Metering at the customer panel requires specialized split-core current transformers and further requires the end-use load to be isolated on a single circuit. You can see the problem with this option already! Although direct measurement gives the researcher the most accurate data, it is the costliest option and can run in the ten- to fifteen-thousand-dollar-per-household range. Moreover, the technology is highly intrusive and requires a high degree of interaction with the end customer.
So what’s next? Well, if you have some budget for hardware and resources to install equipment, and are willing to sacrifice a little accuracy to save significant dollars, maybe you should investigate non-intrusive load monitoring (NILM). NILM technology comes in two flavors. The first is business-to-consumer (B2C) products designed to be installed and used by consumers looking to get a fuller understanding of their energy consumption. These technologies are designed not only to monitor end-use utilization but to act on inefficiencies via snazzy mobile apps, which give the user the ability to control or modify specific end uses using remote-control switches. At the other end of the spectrum, we have utility-grade hardware and analytical software systems designed with the utility researcher in mind. These devices are designed to provide quality end-use data at a reasonable cost.
So how does it work? Fundamentally, both the B2C and utility-grade systems rely on high-frequency load measurements at the premise level that identify and record transitions in energy consumption. These transitions, or edges, are stored and transmitted back to a central processing system, where proprietary algorithms convert the collected data into individual load-shape components. The level of accuracy is very reasonable, with an expectation of 80% or better on major loads. Of course, accuracy can vary and depends on knowledge of the types of loads present and their associated patterns. The output available from the various technologies depends on the sophistication of a vendor’s measurement device, the intricacy of its processing algorithms, and its ability to correctly “label” a load.
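The edge-detection step described above can be sketched in a few lines. This is a minimal, illustrative version only: it flags any step change between successive premise-level readings that exceeds a threshold, where the threshold, the function name, and the sample series are all assumptions for the example, not any vendor’s actual algorithm.

```python
def detect_edges(power_watts, threshold=50.0):
    """Flag transitions (edges) where premise-level power changes by more
    than `threshold` watts between successive samples. Each edge records
    the sample index and the signed step size (on-events positive,
    off-events negative)."""
    edges = []
    for i in range(1, len(power_watts)):
        delta = power_watts[i] - power_watts[i - 1]
        if abs(delta) >= threshold:
            edges.append((i, delta))
    return edges

# Illustrative premise load: a 200 W baseline with a 1,500 W appliance
# cycling on and then off.
samples = [200, 200, 1700, 1700, 1700, 200, 200]
print(detect_edges(samples))  # [(2, 1500), (5, -1500)]
```

In a real NILM system, these signed edges would then be matched against known appliance signatures to attribute each transition to a specific end use; the proprietary part is in that matching, not in the simple differencing shown here.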
B2C technologies have an edge in labeling loads, as most allow the end user to train the system by turning loads on and off and labeling them directly once they are recognized. Utility-grade systems don’t have this luxury, but some use direct measurement to capture certain loads, helping reduce the overall uncertainty. By definition, utility researchers use NILM technology because they don’t want to go inside the consumer’s home, i.e., they want to be non-intrusive. To compensate, utility-grade vendors give the researcher a high degree of flexibility in system configuration by providing initial reference load-shape libraries that can be calibrated for a given region or household type.
Individual equipment and installation costs for a B2C device, along with a specified amount of cloud-based data storage and a mobile application, are low compared to their direct-measurement counterpart. The cost of a B2C deployment ranges from a one-time fee of $200 to $300, with additional monthly fees based on selected functionality. Utility-grade systems are a bit costlier, running around $1,500 for the measurement device plus licensing fees for the vendor’s data collection and analytical software. Installation costs depend on whether the device is installed at the customer panel or at the utility meter, where the measurement device sits between the utility meter and the customer service panel.
A new technology is emerging in the NILM category that uses harmonic signatures as the basis for identifying end uses. This technology is very promising but unfortunately requires individual loads to be turned on and off for the system to label specific loads, which makes the technology somewhat more intrusive to use.
Statistics-Based Algorithms
Statistics-based load disaggregation has been with us for some time now. These algorithms range from simple subtraction strategies to more sophisticated conditional demand analysis (CDA). Historically, CDA used multivariate regression models, with annual energy consumption and saturation-survey data as inputs, to generate reasonable annual energy shares for major end uses. More recently, the availability of high-frequency AMI data and powerful machine-learning algorithms has opened the possibility that a similar approach can be applied to develop end-use load shapes. The clear advantage is a lower investment in hardware, with a higher investment in software. Clearly, end-use profiles provide the researcher with more robust information than simple energy shares developed from annual consumption. However, the jury is still out on how well these techniques can generate accurate end-use load shapes, though there is some promise that using higher-frequency premise-level data (1-minute versus 15-minute) will provide reasonably accurate results.
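The classic CDA idea can be shown in miniature: regress each household’s annual kWh on appliance-ownership indicators from a saturation survey, so the fitted coefficients estimate the annual energy attributable to each end use. Everything here is an illustrative assumption — the two appliance categories, the six synthetic households, and their consumption figures are invented for the example, not real survey data.

```python
import numpy as np

# Design matrix from a (synthetic) saturation survey.
# Columns: intercept (base load), has_electric_water_heater, has_central_AC
X = np.array([
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
    [1, 0, 0],
    [1, 1, 1],
    [1, 0, 0],
], dtype=float)

# Annual consumption (kWh) for the same six households (invented numbers)
y = np.array([12_000, 9_000, 10_000, 5_000, 12_500, 5_200], dtype=float)

# Ordinary least squares: coefficients are the estimated annual kWh
# contributed by base load and by each end use.
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
base, water_heater, central_ac = coeffs
print(f"base ≈ {base:.0f} kWh, water heater ≈ {water_heater:.0f} kWh, "
      f"central AC ≈ {central_ac:.0f} kWh")
```

The machine-learning variants described above follow the same logic but swap annual totals for high-frequency AMI interval data and replace the linear model with more flexible estimators.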
The bottom line is that it all boils down to the amount of money you’re willing to invest and the accuracy you require. In future blogs, we will explore work being done to combine a variety of approaches to create an end-use data development strategy that is both accurate and cost-effective.
This blog has been co-authored with DNV GL’s Curt Puckett.