Unpaired Underwater Image Synthesis with a Disentangled Representation for Underwater Depth Map Prediction

Qi Zhao, Zhichao Xin, Zhibin Yu, Bing Zheng
Author Information
  1. Qi Zhao: College of Information Science and Engineering, Ocean University of China, Qingdao 266100, China.
  2. Zhichao Xin: College of Information Science and Engineering, Ocean University of China, Qingdao 266100, China.
  3. Zhibin Yu: College of Information Science and Engineering, Ocean University of China, Qingdao 266100, China.
  4. Bing Zheng: College of Information Science and Engineering, Ocean University of China, Qingdao 266100, China.

Abstract

As one of the key requirements for underwater exploration, underwater depth map estimation is of great importance in underwater vision research. Although significant progress has been achieved in image-to-image translation and depth map estimation, a gap remains between depth map estimation in air and depth map estimation underwater. Moreover, building a mapping function that converts a single underwater image into an underwater depth map is a great challenge due to the lack of paired data, and the ever-changing underwater environment further intensifies the difficulty of finding an optimal mapping solution. To address these bottlenecks, we developed a novel image-to-image framework for underwater image synthesis and depth map estimation under underwater conditions. To overcome the lack of paired data, we first translated hazy in-air images (with depth maps) into underwater images, thereby obtaining a paired dataset of underwater images and corresponding depth maps. To enrich this synthesized underwater dataset, we further translated hazy in-air images into a series of continuously changing underwater images with a specified style. For depth map estimation, we included a coarse-to-fine network to produce precise depth maps. We evaluated the efficiency of our framework on a real underwater RGB-D dataset. The experimental results show that our method can generate diverse underwater images and achieves the best depth map estimation precision.
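The paper itself uses a GAN-based disentangled translation to turn hazy in-air RGB-D images into paired underwater samples. As a minimal illustration of why a depth map makes such synthesis possible at all, the sketch below applies a simple depth-dependent attenuation model (a common underwater image-formation approximation, not the authors' network): each color channel of the in-air image is mixed toward a veiling backlight according to its per-channel attenuation coefficient and the scene depth. The coefficient and backlight values here are illustrative assumptions.

```python
import numpy as np

def synthesize_underwater(rgb, depth, beta=(0.35, 0.05, 0.02),
                          backlight=(0.05, 0.35, 0.45)):
    """Toy underwater synthesis via a depth-dependent attenuation model:
    I_c = J_c * exp(-beta_c * d) + B_c * (1 - exp(-beta_c * d)).
    rgb: HxWx3 in-air image in [0, 1]; depth: HxW range in meters."""
    beta = np.asarray(beta).reshape(1, 1, 3)
    B = np.asarray(backlight).reshape(1, 1, 3)
    t = np.exp(-beta * depth[..., None])  # per-channel transmission map
    return rgb * t + B * (1.0 - t)

# Usage: a uniform gray in-air patch plus a depth map yields a paired
# (underwater image, depth map) training sample.
rgb = np.full((2, 2, 3), 0.8)
depth = np.array([[1.0, 2.0], [5.0, 10.0]])
uw = synthesize_underwater(rgb, depth)
```

Red attenuates fastest (largest `beta`), so distant pixels shift toward the blue-green backlight, which is the characteristic underwater color cast the translation network must learn to reproduce. Sweeping `beta` and `backlight` is one simple way to mimic the "continuously changing underwater styles" idea.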



Grants

  1. ZDKJ202017/technology project of Hainan province of China
  2. 61701463/Natural Science Foundation of Ningbo
