Guided learning: A new paradigm for multi-task classification
Authors: Fu J. (1); Zhang L. (1); Zhang B. (2); Jia W. (3)
2018
Source Publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10996 LNCS
Pages: 239-246
Abstract: A prevailing problem in many machine learning tasks is that the training and test data follow different distributions (i.e., they are not i.i.d.). Previous methods for this problem, known as Transfer Learning (TL) or Domain Adaptation (DA), are one-stage models. In this paper, we propose a new, simple but effective paradigm, Guided Learning (GL), for multi-stage progressive training. The paradigm is motivated by the "tutor guides student" learning mode in the human world. Further, under the GL framework, a Guided Subspace Learning (GSL) method is proposed for domain disparity reduction, which aims to learn an optimal, invariant and discriminative subspace through the guided learning strategy. Extensive experiments on various databases show that our method outperforms many state-of-the-art TL/DA methods.
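For readers unfamiliar with subspace-based domain disparity reduction, the sketch below illustrates the general idea with a classic baseline (subspace alignment: mapping the source PCA basis onto the target PCA basis so both domains are compared in a shared low-dimensional space). This is not the GSL objective proposed in the paper; the function names, the dimensionality d, and the toy data are illustrative assumptions only.

    import numpy as np

    def pca_basis(X, d):
        """Top-d principal directions (columns) of the centered data X (n_samples x n_features)."""
        Xc = X - X.mean(axis=0, keepdims=True)
        # SVD of the centered data; rows of Vt are the principal directions
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Vt[:d].T  # shape: (n_features, d)

    def subspace_align(Xs, Xt, d=10):
        """Illustrative subspace-based domain adaptation (subspace alignment),
        not the paper's Guided Subspace Learning: the source PCA basis is aligned
        to the target PCA basis, reducing the disparity between the two domains."""
        Xs_c = Xs - Xs.mean(axis=0, keepdims=True)
        Xt_c = Xt - Xt.mean(axis=0, keepdims=True)
        Ps = pca_basis(Xs, d)          # source subspace basis
        Pt = pca_basis(Xt, d)          # target subspace basis
        M = Ps.T @ Pt                  # alignment matrix from source basis to target basis
        Zs = Xs_c @ Ps @ M             # source features in the aligned subspace
        Zt = Xt_c @ Pt                 # target features in their own subspace
        return Zs, Zt

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Toy source/target data with a mild covariate shift between domains
        Xs = rng.normal(size=(200, 50))
        Xt = rng.normal(loc=0.5, size=(150, 50))
        Zs, Zt = subspace_align(Xs, Xt, d=10)
        print(Zs.shape, Zt.shape)      # (200, 10) (150, 10)

A classifier trained on Zs and evaluated on Zt would then operate in a space where the domain gap is reduced; the paper's GL/GSL approach differs in that it performs multi-stage, guided training rather than a single one-stage projection.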
Keywords: Domain disparity; Guided learning; Subspace learning
DOI: 10.1007/978-3-319-97909-0_26
Language: English
Document Type: Conference paper
Collection: University of Macau
Affiliations:
1. Chongqing University
2. Universidade de Macau
3. Hefei University of Technology
Recommended Citation
GB/T 7714
Fu J., Zhang L., Zhang B., et al. Guided learning: A new paradigm for multi-task classification[C], 2018: 239-246.