Multi-head Attention Networks for Nonintrusive Load Monitoring
Lin, Nan (1,2); Zhou, Binggui (1,2); Yang, Guanghua (1); Ma, Shaodan (2)
Source Publication: ICSPCC 2020 - IEEE International Conference on Signal Processing, Communications and Computing, Proceedings
Abstract: In this paper, we propose two multi-head attention neural networks for Nonintrusive Load Monitoring (NILM). The proposed networks are better suited to processing sequential data, using the attention mechanism to learn complex patterns and long-term dependencies. Compared with existing neural NILM schemes, the proposed multi-head attention networks achieve better disaggregation accuracy for different domestic appliances, are more robust to the dynamics of the aggregated data, and are more efficient to train.
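The paper's exact architectures are not reproduced on this record page. As a generic illustration of the multi-head attention mechanism the abstract refers to, here is a minimal NumPy sketch of scaled dot-product multi-head self-attention over an embedded mains-power sequence; all dimensions, weights, and the toy input are hypothetical and not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, Wq, Wk, Wv, Wo, n_heads):
    """Scaled dot-product multi-head self-attention.

    x: (seq_len, d_model) embedded aggregate-load sequence.
    Wq, Wk, Wv, Wo: (d_model, d_model) projection matrices.
    """
    seq_len, d_model = x.shape
    d_head = d_model // n_heads

    # project, then split into heads: (n_heads, seq_len, d_head)
    def project(W):
        return (x @ W).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)

    q, k, v = project(Wq), project(Wk), project(Wv)
    # attention scores between every pair of time steps: (n_heads, L, L)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)
    # weighted sum of values, heads concatenated back to d_model
    out = (attn @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo

# toy example: 8 time steps of a 16-dim embedded mains signal, 4 heads
rng = np.random.default_rng(0)
L, d = 8, 16
x = rng.standard_normal((L, d))
W = [rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(4)]
y = multi_head_attention(x, *W, n_heads=4)
print(y.shape)  # → (8, 16)
```

Each head attends over the full sequence, which is how such models can capture the long-term dependencies in appliance usage patterns that sliding-window convolutional or recurrent NILM models handle less directly.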
Keywords: Energy disaggregation; Multi-head attention; Neural network; NILM
Scopus ID: 2-s2.0-85097928152
Document Type: Conference paper
Collection: University of Macau
Corresponding Author: Yang, Guanghua
Affiliations:
1. Institute of Physical Internet, Jinan University, Zhuhai Campus, Zhuhai, 519070, China
2. University of Macau, Department of Electrical and Computer Engineering, 999078, Macao
First Author Affiliation: University of Macau
Recommended Citation
GB/T 7714
Lin, Nan; Zhou, Binggui; Yang, Guanghua; et al. Multi-head Attention Networks for Nonintrusive Load Monitoring[C], 2020.
Related Services
Google Scholar
Similar articles in Google Scholar
[Lin,Nan]'s Articles
[Zhou,Binggui]'s Articles
[Yang,Guanghua]'s Articles

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.