Multi-Source Deep Transfer Neural Network Algorithm.

Jingmei Li, Weifei Wu, Di Xue, Peng Gao
Author Information
  1. Jingmei Li: College of Computer Science and Technology, Harbin Engineering University, No.145 Nantong Street, Harbin 150001, China. lijingmei@hrbeu.edu.cn.
  2. Weifei Wu: College of Computer Science and Technology, Harbin Engineering University, No.145 Nantong Street, Harbin 150001, China. wuweifei@hrbeu.edu.cn.
  3. Di Xue: College of Computer Science and Technology, Harbin Engineering University, No.145 Nantong Street, Harbin 150001, China. dixue@hrbeu.edu.cn.
  4. Peng Gao: College of Computer Science and Technology, Harbin Engineering University, No.145 Nantong Street, Harbin 150001, China. gaopeng1979@hrbeu.edu.cn.

Abstract

Transfer learning can improve classification performance in a target domain with insufficient training data by exploiting related knowledge from source domains. Nowadays, it is common for two or more source domains to be available for knowledge transfer, which can further improve learning tasks in the target domain. However, mismatches between the probability distributions of the source and target domains degrade classification performance in the target domain. Recent studies have shown that deep learning can resist this mismatch by building deep structures that extract more effective features. In this paper, we propose a new multi-source deep transfer neural network algorithm, MultiDTNN, based on convolutional neural networks and multi-source transfer learning. In MultiDTNN, joint probability distribution adaptation (JPDA) reduces the mismatch between each source domain and the target domain, enhancing the transferability of source-domain features in deep neural networks. A convolutional neural network is then trained on the dataset of each source domain together with the target domain, yielding a set of candidate classifiers. Finally, a selection strategy picks the classifier with the smallest classification error on the target domain from this set to assemble the MultiDTNN framework. The effectiveness of the proposed MultiDTNN is verified by comparing it with other state-of-the-art deep transfer learning methods on three datasets.
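The per-source training and smallest-error selection described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: toy nearest-centroid classifiers stand in for the per-source CNNs, a linear-kernel MMD (distance between feature means) stands in for JPDA's distribution-distance term, and all function names and the synthetic data are hypothetical.

```python
import numpy as np

def mmd_linear(xs, xt):
    """Squared distance between feature means (linear-kernel MMD).
    A toy stand-in for the distribution mismatch that JPDA reduces."""
    return float(np.sum((xs.mean(axis=0) - xt.mean(axis=0)) ** 2))

def train_centroid_classifier(x, y):
    """Toy nearest-centroid classifier standing in for a per-source CNN."""
    classes = np.unique(y)
    centroids = np.stack([x[y == c].mean(axis=0) for c in classes])
    def predict(xq):
        # Assign each query point to the class of the nearest centroid.
        d = ((xq[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        return classes[np.argmin(d, axis=1)]
    return predict

def select_best_classifier(sources, x_tgt, y_tgt):
    """Train one classifier per source domain and keep the one with the
    smallest classification error on the labelled target sample."""
    best_clf, best_err = None, np.inf
    for x_src, y_src in sources:
        clf = train_centroid_classifier(x_src, y_src)
        err = float(np.mean(clf(x_tgt) != y_tgt))
        if err < best_err:
            best_clf, best_err = clf, err
    return best_clf, best_err

# Synthetic demo: source A matches the target distribution, source B is shifted.
rng = np.random.default_rng(0)
def make_domain(c0, c1, n=100):
    x = np.vstack([rng.normal(c0, 0.3, size=(n, 2)),
                   rng.normal(c1, 0.3, size=(n, 2))])
    y = np.array([0] * n + [1] * n)
    return x, y

x_a, y_a = make_domain([0, 0], [3, 3])      # well-matched source domain
x_b, y_b = make_domain([10, 10], [13, 13])  # badly shifted source domain
x_t, y_t = make_domain([0, 0], [3, 3])      # target domain

clf, err = select_best_classifier([(x_a, y_a), (x_b, y_b)], x_t, y_t)
```

On this synthetic data, the selection strategy discards the shifted source (whose mean-distance mismatch to the target is large) and keeps the classifier trained on the matched source, mirroring how MultiDTNN assembles its final framework from the lowest-error candidate.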


