Advantages
- The HMM is a well-studied probabilistic graphical model, for which algorithms are known for exact and approximate learning and inference
- HMMs are able to represent the variance of appliances' power demands through probability distributions
- HMMs capture the dependencies between consecutive measurements, which Hart defined as the switch continuity principle (see the sketch after this list)
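To make these points concrete, here is a minimal sketch (not from the original post) using the hmmlearn library: a hypothetical two-state appliance (e.g. a fridge that is either "off" near 0 W or "on" near 90 W) is simulated with noisy readings, a Gaussian HMM is learnt from the trace via EM, and the most likely state sequence is recovered by exact Viterbi inference. The power levels, noise, and state count are illustrative assumptions only.

```python
# Minimal sketch: modelling a hypothetical two-state appliance with hmmlearn.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)

# Simulate a power-demand trace: blocks of "off" and "on" readings with
# Gaussian noise, so each state is a distribution rather than a single value.
off = rng.normal(loc=2.0, scale=1.0, size=200)
on = rng.normal(loc=90.0, scale=5.0, size=100)
trace = np.concatenate([off, on, off, on]).reshape(-1, 1)

# Learn a 2-state Gaussian HMM from the trace (Baum-Welch / EM).
model = hmm.GaussianHMM(n_components=2, covariance_type="diag", n_iter=50)
model.fit(trace)

# Exact inference: recover the most likely hidden state sequence (Viterbi).
states = model.predict(trace)

print("State means (W):", model.means_.ravel())
print("Transition matrix:\n", model.transmat_)
```

The learnt means and covariances capture the variance of each state's power demand, while the transition matrix encodes the dependency between consecutive measurements.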
Disadvantages
- HMMs represent the behaviour of an appliance using a finite number of static distributions, and therefore fail to represent appliances with a continuously varying power demand
- Due to their Markovian nature, they do not take into account the sequence of states leading into any given state
- Again, due to their Markovian nature, the time spent in a given state is not captured explicitly. However, the hidden semi-Markov model does capture such behaviour (a small sketch follows this list)
- Features other than the observed power demand (e.g. time of day) are not captured. However, the input-output HMM allows such additional inputs to be modelled
- Dependencies between appliances cannot be represented. However, the conditional HMM can capture such dependencies
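As a small illustration of the state-duration point (my own sketch, not part of the original post): a plain HMM's self-transition probability implies geometric dwell times, whereas a hidden semi-Markov model draws an explicit duration for each state visit. The Poisson duration distribution below is an illustrative assumption only.

```python
# Sketch: dwell times implied by a plain HMM vs explicit HSMM-style durations.
import numpy as np

rng = np.random.default_rng(1)

def sample_hmm_durations(stay_prob=0.9, n=10_000):
    """Dwell times implied by a self-transition probability: geometric."""
    return rng.geometric(1.0 - stay_prob, size=n)

def sample_hsmm_durations(mean_duration=10.0, n=10_000):
    """Explicit dwell times drawn from a chosen duration distribution."""
    return 1 + rng.poisson(mean_duration - 1.0, size=n)

hmm_d = sample_hmm_durations()
hsmm_d = sample_hsmm_durations()
print(f"HMM  dwell times: mean={hmm_d.mean():.1f}, std={hmm_d.std():.1f}")
print(f"HSMM dwell times: mean={hsmm_d.mean():.1f}, std={hsmm_d.std():.1f}")
```

Both give roughly the same mean dwell time, but the HMM's durations are far more dispersed; an appliance with a characteristic cycle length (e.g. a washing machine phase) is therefore better described by the semi-Markov extension.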
In summary, the basic HMM provides a useful model for many appliances. However, the appliances it can represent are limited by the intrinsic structure of the model. Many extensions exist that increase the representational power of the HMM, although the additional parameters required often complicate the learning and inference tasks.