STORE - Staffordshire Online Repository

Anomaly Detection Using Hierarchical Temporal Memory in Smart Homes

Alshammari, Nasser Owaid (2018) Anomaly Detection Using Hierarchical Temporal Memory in Smart Homes. Doctoral thesis, Staffordshire University.

AlshammariNO_PhD thesis.pdf
Available under License All Rights Reserved.


Abstract or description

This work focuses on unsupervised, biologically-inspired machine learning techniques and algorithms that can detect anomalies. Specifically, the aim is to investigate the applicability of Hierarchical Temporal Memory (HTM) theory to detecting anomalies in the smart home domain. HTM theory proposes a neuron model that, according to current neuroscience understanding, is more faithful to biological neurons than the units used in conventional Artificial Neural Networks (ANNs). HTM theory has several algorithmic implementations, the most prominent being the Cortical Learning Algorithm (CLA). A CLA model typically consists of three main regions: the encoder, the spatial pooler and the temporal memory. Studying the performance of the CLA in the smart home domain revealed an issue with the standard encoders on high-dimensional datasets. In this domain, it is typical to have a high-dimensional feature space representing the collection of smart devices, whereas the standard CLA encoders, which handle categorical and scalar data types, are better suited to low-dimensional datasets.
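To make the encoder-concatenation approach concrete, here is an illustrative Python sketch. This is not the NuPIC/CLA API; the function names, bit widths and feature ranges are assumptions chosen for illustration. It shows how a standard scalar encoder maps a value to a fixed block of active bits, and how per-device encodings are concatenated into one input vector, as is standard for multi-feature HTM input:

```python
def scalar_encode(value, minval, maxval, n=64, w=8):
    """Toy scalar encoder: represent a value as n bits with a
    contiguous run of w active bits whose position tracks the value."""
    value = max(minval, min(maxval, value))
    frac = (value - minval) / (maxval - minval)
    start = int(round(frac * (n - w)))
    return [1 if start <= i < start + w else 0 for i in range(n)]


def encode_record(record, specs):
    """Concatenate per-feature encodings into one input vector,
    the standard approach for datasets with many features."""
    bits = []
    for name, (lo, hi) in specs.items():
        bits.extend(scalar_encode(record[name], lo, hi))
    return bits
```

With many devices, the concatenated vector grows linearly in the number of features while each feature's active bits occupy only its own small segment, which is one way the representations of successive timesteps can end up looking very similar.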
A novel Hash-Indexed Sparse Distributed Representation (HI-SDR) encoder was proposed and developed to overcome the high-dimensionality issue. The HI-SDR encoder creates unique representations of the data from which the rest of the CLA regions can learn. The standard approach when building HTM models for datasets with many features is to concatenate the output of each encoder. This work concludes that, at every timestep, the standard encoders produced representations of the input that were too similar and insufficiently distinguishable. This output similarity confuses the HTM model and makes it hard to learn meaningful representations. The proposed novel encoder manages to capture the required properties in terms of sparsity and distinctiveness of representation.
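The hash-indexing idea can be sketched as follows. This is an assumed minimal reconstruction, not the thesis's HI-SDR implementation; the vector size, bits-per-feature count and hashing scheme are illustrative choices. Each (feature, value) pair deterministically activates a few bit positions, so the overall representation stays sparse and distinct even for high-dimensional records:

```python
import hashlib


def hi_sdr_encode(record, n=2048, bits_per_feature=4):
    """Sketch of a hash-indexed SDR: hash each (feature, value) pair
    to a handful of bit positions in a large, mostly-zero vector."""
    active = set()
    for name, value in sorted(record.items()):
        for k in range(bits_per_feature):
            h = hashlib.sha256(f"{name}={value}|{k}".encode()).hexdigest()
            active.add(int(h, 16) % n)
    sdr = [0] * n
    for i in active:
        sdr[i] = 1
    return sdr
```

Because the active positions depend on the hashed feature values rather than on fixed per-feature segments, two records that differ in even one device state map to visibly different bit patterns, while sparsity stays bounded by features × bits_per_feature regardless of dimensionality.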
To investigate and validate the performance of a proposed machine learning technique, a representative dataset is needed. The smart home literature offers many real-world datasets that allow researchers to validate their models; however, most existing datasets were created for the classification and recognition of Activities of Daily Living (ADL). The lack of datasets for anomaly detection applications in the smart home domain motivated the development of a simulation tool.
OpenSHS (Open Smart Home Simulator) was developed as an open-source, 3D, cross-platform smart home simulator that offers a novel hybrid approach to dataset generation. The tool allows researchers to design a smart home and populate it with the needed smart devices; participants can then use the designed smart home to simulate their habits and patterns.
Anomaly detection in the smart home domain is highly contextual and dependent on the inhabitant's activities. One inhabitant's anomaly could be another's norm, so the definition of anomalies is a complex consideration. Using OpenSHS, seven participants were invited to generate forty-two datasets of their activities. Moreover, each participant defined their own anomalous pattern that they would like the model to detect. Thus, the resulting datasets are annotated with contextual anomalies specific to each participant.
The proposed encoder was evaluated and compared against the standard CLA encoders and several state-of-the-art unsupervised anomaly detection algorithms using the Numenta Anomaly Benchmark (NAB). The HI-SDR encoder scored 81.9% accuracy on the forty-two datasets, a 17.8% increase in accuracy over the k-NN algorithm and a 47.5% increase over the standard CLA encoders. Using the Principal Component Analysis (PCA) algorithm as a preprocessing step proved beneficial to some of the tested algorithms: the k-NN algorithm scored 39.9% accuracy without PCA and 64.1% with PCA, and similarly the Histogram-Based Outlier Score (HBOS) algorithm scored 28.5% accuracy without PCA and 61.9% with PCA.
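The PCA-then-k-NN baseline described above can be sketched in a few lines of NumPy. This is an assumed illustration of the general technique, not the thesis's evaluation code; the component count, neighbourhood size and scoring rule (mean distance to the k nearest neighbours) are common conventions rather than details from the source:

```python
import numpy as np


def pca_knn_scores(X, n_components=2, k=3):
    """Anomaly score per row: mean distance to the k nearest
    neighbours, computed after projecting onto the top principal
    components (PCA via SVD of the centred data matrix)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:n_components].T
    # Pairwise Euclidean distances in the reduced space.
    d = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    d.sort(axis=1)
    # Column 0 is each point's zero self-distance, so skip it.
    return d[:, 1:k + 1].mean(axis=1)
```

Points far from any cluster receive the largest scores; the projection step discards low-variance directions before the distance computation, which is the kind of preprocessing the abstract reports as improving the distance- and histogram-based detectors.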
The HTM-based models empirically showed good potential and outperformed several algorithms even without the HI-SDR encoder. However, the HTM-based models still lack an optimisation algorithm for their parameters when performing anomaly detection.

Item Type: Thesis (Doctoral)
Faculty: School of Computing and Digital Technologies > Computing
Depositing User: Jeffrey HENSON
Date Deposited: 03 Apr 2018 12:45
Last Modified: 03 Apr 2018 12:45
