Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture
Chen, C. L. Philip; Liu, Zhulin
2018-01
Source Publication: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Volume: 29  Issue: 1  Pages: 10-24
Abstract: This paper proposes the Broad Learning System (BLS), which offers an alternative to learning in a deep structure. Deep structures and their learning suffer from a time-consuming training process because of the large number of connecting parameters in filters and layers; moreover, they require complete retraining if the structure is insufficient to model the system. The BLS is established as a flat network, in which the original inputs are transferred and placed as "mapped features" in feature nodes, and the structure is expanded in the wide sense through "enhancement nodes." Incremental learning algorithms are developed for fast remodeling in broad expansion, without retraining, when the network needs to be expanded. Two incremental learning algorithms are given: one for the increment of feature nodes (filters in a deep structure) and one for the increment of enhancement nodes. The designed model and algorithms are versatile for rapid model selection. In addition, another incremental learning algorithm handles new incoming inputs for a system that has already been modeled; the system can thus be remodeled incrementally without retraining from the beginning. Model reduction using singular value decomposition is applied to simplify the final structure, with satisfactory results. Compared with existing deep neural networks, experimental results on the Modified National Institute of Standards and Technology (MNIST) database and the NYU NORB object recognition benchmark dataset demonstrate the effectiveness of the proposed BLS.
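The flat architecture described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' reference implementation: it assumes random linear feature nodes, tanh enhancement nodes, and a ridge-regularized pseudoinverse for the output weights; the node counts, activation, and regularization constant are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def bls_fit(X, Y, n_feature=20, n_enhance=40, reg=1e-3):
    """Fit a minimal flat BLS-style network (illustrative sketch)."""
    # Random linear "mapped features" placed in feature nodes.
    Wf = rng.standard_normal((X.shape[1], n_feature))
    Z = X @ Wf
    # Nonlinear "enhancement nodes" expand the network in width.
    We = rng.standard_normal((n_feature, n_enhance))
    H = np.tanh(Z @ We)
    # Flat layer: concatenate feature and enhancement nodes.
    A = np.hstack([Z, H])
    # Ridge-regularized pseudoinverse solves the output weights in one shot.
    W = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ Y)
    return Wf, We, W

def bls_predict(model, X):
    Wf, We, W = model
    Z = X @ Wf
    A = np.hstack([Z, np.tanh(Z @ We)])
    return A @ W
```

The incremental variants in the paper avoid refitting this solve from scratch: adding feature or enhancement nodes appends columns to A, and new training inputs append rows, so the pseudoinverse can be updated rather than recomputed.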
Keywords: big data; big data modeling; broad learning system (BLS); deep learning; incremental learning; random vector functional-link neural networks (RVFLNN); single-layer feedforward neural networks (SLFN); singular value decomposition (SVD)
DOI: 10.1109/TNNLS.2017.2716952
Indexed By: SCI
Language: English
WOS Research Area: Computer Science; Engineering
WOS Subject: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS ID: WOS:000419558900002
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Source of Article: WOS
Cited Times [WOS]: 78
Document Type: Journal article
Collection: University of Macau
Recommended Citation
GB/T 7714: Chen, C. L. Philip, Liu, Zhulin. Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29(1): 10-24.
APA: Chen, C. L. Philip, & Liu, Zhulin. (2018). Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 29(1), 10-24.
MLA: Chen, C. L. Philip, et al. "Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture." IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 29.1 (2018): 10-24.
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.