An Information-Theoretic Perspective on Proper Quaternion Variational Autoencoders

Eleonora Grassucci, Danilo Comminiello, Aurelio Uncini
Author Information
  1. Eleonora Grassucci: Department of Information Engineering, Electronics and Telecommunications, Sapienza University of Rome, Via Eudossiana 18, 00184 Rome, Italy.
  2. Danilo Comminiello: Department of Information Engineering, Electronics and Telecommunications, Sapienza University of Rome, Via Eudossiana 18, 00184 Rome, Italy.
  3. Aurelio Uncini: Department of Information Engineering, Electronics and Telecommunications, Sapienza University of Rome, Via Eudossiana 18, 00184 Rome, Italy.

Abstract

Variational autoencoders are deep generative models that have recently received considerable attention for their ability to model the latent distribution of virtually any kind of input, including images and audio signals. A variational autoencoder in the quaternion domain H, namely the QVAE, has recently been proposed, leveraging the augmented second-order statistics of H-proper signals. In this paper, we analyze the QVAE from an information-theoretic perspective, studying how well the H-proper model approximates improper distributions as well as built-in H-proper ones, and quantifying the loss of entropy due to the improperness of the input signal. We conduct experiments on a substantial set of quaternion signals, for each of which the QVAE models the input distribution while learning its improperness and increasing the entropy of the latent space. The proposed analysis shows that proper QVAEs can be employed with a good approximation even when the quaternion input data are improper.
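As a concrete illustration of the properness condition and the entropy loss the abstract refers to, the sketch below (helper names are hypothetical, not taken from the authors' code) checks whether the four real components of a Gaussian quaternion signal are mutually uncorrelated with equal variance, the defining condition of H-properness, and compares the differential entropy of a proper signal against an improper one with the same per-component power:

```python
import numpy as np

def is_h_proper(q, tol=0.05):
    """Test H-properness of quaternion samples, given as an (N, 4) array
    of real components (a, b, c, d).

    A quaternion random variable is H-proper when its four real components
    are mutually uncorrelated and share the same variance, i.e. the 4x4
    real covariance matrix equals sigma^2 * I.
    """
    c = np.cov(q, rowvar=False)          # 4x4 real covariance
    sigma2 = np.trace(c) / 4.0           # average component variance
    return bool(np.allclose(c, sigma2 * np.eye(4), atol=tol * sigma2))

def gaussian_entropy(q):
    """Differential entropy (nats) of a 4-D Gaussian fitted to the samples:
    h = 0.5 * ln((2*pi*e)^4 * det(C)).
    """
    c = np.cov(q, rowvar=False)
    _, logdet = np.linalg.slogdet(c)
    return 0.5 * (4.0 * np.log(2.0 * np.pi * np.e) + logdet)

rng = np.random.default_rng(0)
n = 200_000

# Proper signal: i.i.d. unit-variance components.
proper = rng.normal(size=(n, 4))

# Improper signal: correlate two components while preserving their
# per-component power (0.8^2 + 0.6^2 = 1), which breaks circularity.
improper = proper.copy()
improper[:, 1] = 0.8 * improper[:, 0] + 0.6 * improper[:, 1]
```

For a fixed total power, the covariance determinant is maximized when the components are uncorrelated, so the improper signal necessarily has lower differential entropy than the proper one; this is the entropy loss the analysis studies.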

Keywords

properness; quaternion neural networks; second-order circularity; variational autoencoders

Grants

  1. RG11916B88E1942F/Sapienza Università di Roma

