Friday 1 April 2011

NIALM as a combinatorial optimisation problem

NIALM can be formulated as a combinatorial optimisation problem as follows:

For each time instant, minimise the function f:

    f(I) = | Pagg - Σ_{i=1..N} I_i · P_i |

where:

  • Pagg - household aggregate power
  • N - number of appliances
  • I - vector of appliance binary indicators (1 if appliance is on, 0 otherwise)
  • P - vector of appliance power when appliance is on
This approach was first proposed by Hart in his seminal 1992 paper on NIALM. Hart asserts that this is an NP-complete "weighted set" problem, and notes that the number of combinations of 'on' appliances scales exponentially with the number of appliances. However, despite its computational difficulty, the optimal solution can be found within seconds for a realistic number of appliances, e.g. 30.
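
To make the formulation concrete, below is a minimal brute-force sketch in Python. The appliance names and power values are invented for illustration, and the exhaustive enumeration is purely a statement of the objective, not how the problem is solved in practice (an optimiser is used instead):

    from itertools import product

    def disaggregate(p_agg, powers):
        """Exhaustively search for the indicator vector I that minimises
        f(I) = | p_agg - sum_i I[i] * powers[i] |.
        Only practical for small N, since all 2^N combinations are tried."""
        best_i, best_f = None, float("inf")
        for indicators in product([0, 1], repeat=len(powers)):
            f = abs(p_agg - sum(i * p for i, p in zip(indicators, powers)))
            if f < best_f:
                best_i, best_f = indicators, f
        return best_i, best_f

    # Hypothetical appliance power demands (watts) and one aggregate reading
    appliance_powers = [2000, 150, 60, 900]       # e.g. kettle, fridge, lamp, microwave
    print(disaggregate(2150, appliance_powers))   # -> ((1, 1, 0, 0), 0)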

I have recently been investigating the performance of this technique using simulated data. 24 hours' worth of truth data was generated using 30 appliance profiles, each with realistic power demands and usage patterns. The true household aggregate was calculated by summing the power consumed by each appliance at each time instant. No noise was added at either the appliance or household level.
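
As a rough illustration of this kind of simulation, here is a sketch. The sampling interval, the on-probability model and the power ranges are all my own assumptions, and I've used 10 appliances rather than 30 so that the brute-force sketch above remains tractable:

    import numpy as np

    rng = np.random.default_rng(0)

    N_APPLIANCES = 10          # fewer than the 30 used in the post, for speed
    N_STEPS = 24 * 60          # assume one reading per minute over 24 hours

    # Invented appliance profiles: a fixed power demand when on, and a
    # probability of being on at any given time instant.
    power_when_on = rng.uniform(50, 3000, size=N_APPLIANCES)   # watts
    prob_on = rng.uniform(0.05, 0.5, size=N_APPLIANCES)

    # Ground-truth on/off indicator matrix, shape (N_STEPS, N_APPLIANCES)
    truth = rng.random((N_STEPS, N_APPLIANCES)) < prob_on

    # Noise-free household aggregate: sum of the power drawn by each
    # appliance that is on at each time instant
    aggregate = truth.astype(float) @ power_when_on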

In particular, I have been investigating how the performance of this technique decays as the number of appliances in the household increases. The optimisation was applied at 1-hour intervals across the 24 hours of data, and the accuracy was calculated as the average fraction of correct on/off classifications.
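
A sketch of such an evaluation loop, reusing disaggregate(), truth, aggregate and power_when_on from the sketches above (how the hourly instants are chosen is again an assumption on my part):

    # Apply the optimisation at 1-hour intervals across the 24-hour simulation
    accuracies = []
    for t in range(0, N_STEPS, 60):                 # one time instant per hour
        estimated, _ = disaggregate(aggregate[t], list(power_when_on))
        correct = sum(int(e) == int(a) for e, a in zip(estimated, truth[t]))
        accuracies.append(correct / N_APPLIANCES)

    # Overall accuracy: average fraction of correct on/off classifications
    print(sum(accuracies) / len(accuracies))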

Below is a graph showing the accuracy of this technique against the number of appliances in the house.


The technique is very accurate when there are fewer than 10 appliances in the household. Although the accuracy then starts to drop rapidly, it seems to converge around 0.5 or 0.6. However, further tests with larger numbers of appliances are necessary to confirm this.

Obvious future work is to investigate how the performance of this approach degrades when appliance-level and aggregate-level noise is introduced into the data.

2 comments:

  1. Interesting! You had asked me about my research focus: I mainly focus on gas appliances. Have you considered using any genetic algorithms? If you have n appliances you need to disaggregate, you need to try 2^n - 1 combinations; with 30 appliances, which is typical for electricity, you are looking at over a billion different combinations, most of which will never actually occur. I looked into it briefly, and I thought I'd mention it.

  2. Thanks for your comments! I'd be really interested to know more about what approaches you've investigated. Although gas isn't the primary focus of my topic, it is clearly closely related. There's significant potential for similar techniques to be used in both domains, or even collaborative techniques, e.g. using patterns in electricity readings to help disaggregate gas consumption.

    I haven't investigated using genetic algorithms myself, although I have come across them in the literature. At the moment I'm using the CPLEX optimiser, which can optimise this function for 30 appliances in just a few seconds. However, I do agree that this is intrinsically an exponential time problem and will never scale well for large numbers of appliances.

    Do you mind if I ask where you're studying and how far through your PhD you are? Feel free to drop me an email to op106 [at] soton.ac.uk for a more detailed discussion.

