Multi-source fast transfer learning algorithm based on support vector machine.

Peng Gao, Weifei Wu, Jingmei Li
Author Information
  1. Peng Gao: College of Computer Science and Technology, Harbin Engineering University, Harbin, China.
  2. Weifei Wu: College of Computer Science and Technology, Harbin Engineering University, Harbin, China.
  3. Jingmei Li: College of Computer Science and Technology, Harbin Engineering University, Harbin, China.

Abstract

In transfer learning, knowledge from a source domain can support training and classification in a target domain where only few data are available. Therefore, for the setting in which the target domain contains only a small amount of unlabeled data while multiple source domains contain a large amount of labeled data, this paper proposes a new Multi-source Fast Transfer Learning algorithm based on the support vector machine (MultiFTLSVM). Following the idea of multi-source transfer learning, knowledge from several source domains is used to train the target-domain learning task and improve classification performance. At the same time, a representative data set drawn from each source domain is used to speed up training and improve the efficiency of the algorithm. Experimental results on several real data sets demonstrate the effectiveness of MultiFTLSVM and show advantages over the benchmark algorithms.
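For readers who want a concrete picture of the workflow the abstract describes, the following Python sketch shows one plausible arrangement: train an SVM on a representative subset of each labeled source domain and combine the resulting classifiers to label the unlabeled target data. The function names, the subset-selection rule, and the probability-averaging step are illustrative assumptions only and do not reproduce the actual MultiFTLSVM formulation from the paper.

```python
# Illustrative sketch of a multi-source SVM transfer scheme (assumptions, not
# the paper's MultiFTLSVM algorithm). Assumes all source domains share the
# same label set and feature space.
import numpy as np
from sklearn.svm import SVC


def representative_subset(X, y, max_per_class=100, seed=0):
    """Pick a small, class-balanced subset of one source domain.
    Stand-in for the representative-data-set selection step."""
    rng = np.random.default_rng(seed)
    keep = []
    for label in np.unique(y):
        idx = np.flatnonzero(y == label)
        keep.append(rng.choice(idx, size=min(max_per_class, idx.size), replace=False))
    keep = np.concatenate(keep)
    return X[keep], y[keep]


def multi_source_svm(sources, X_target_unlabeled):
    """Train one SVM per source domain on its representative subset,
    then label the target data by averaging the sources' class probabilities."""
    classifiers = []
    for X_s, y_s in sources:                     # sources: list of (X, y) pairs
        X_r, y_r = representative_subset(X_s, y_s)
        clf = SVC(kernel="rbf", probability=True).fit(X_r, y_r)
        classifiers.append(clf)
    # Combine source classifiers: average predicted probabilities over domains.
    probs = np.mean(
        [clf.predict_proba(X_target_unlabeled) for clf in classifiers], axis=0
    )
    return classifiers[0].classes_[np.argmax(probs, axis=1)]
```

The per-domain subsampling mirrors the efficiency idea in the abstract (training on representative data rather than the full source sets), while the averaging step is one simple way to pool knowledge from multiple source domains; the paper's actual combination rule may differ.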


