An interpretable approach for automatic aesthetic assessment of remote sensing images.

Jingru Tong, Guo Zhang, Peijie Kong, Yu Rao, Zhengkai Wei, Hao Cui, Qing Guan
Author Information
  1. Jingru Tong: School of Remote Sensing and Information Engineering, Wuhan University, Wuhan, China.
  2. Guo Zhang: State Key Laboratory of Information Engineering in Surveying, Mapping, and Remote Sensing, Wuhan University, Wuhan, China.
  3. Peijie Kong: School of Remote Sensing and Information Engineering, Wuhan University, Wuhan, China.
  4. Yu Rao: School of Remote Sensing and Information Engineering, Wuhan University, Wuhan, China.
  5. Zhengkai Wei: School of Remote Sensing and Information Engineering, Wuhan University, Wuhan, China.
  6. Hao Cui: State Key Laboratory of Information Engineering in Surveying, Mapping, and Remote Sensing, Wuhan University, Wuhan, China.
  7. Qing Guan: State Key Laboratory of Information Engineering in Surveying, Mapping, and Remote Sensing, Wuhan University, Wuhan, China.

Abstract

The proliferation of remote sensing images in recent decades has extended their use to non-scientific fields such as environmental protection, education, and art. This creates a need for aesthetic assessment of remote sensing imagery, a topic that has received little research attention. Studies of the human brain's attention mechanism show that certain areas of an image trigger visual stimuli during aesthetic evaluation. Inspired by this, we use a convolutional neural network (CNN), a deep learning model resembling the human neural system, and propose an interpretable approach for automatic aesthetic assessment of remote sensing images. First, we created the Remote Sensing Aesthetics Dataset (RSAD): we collected remote sensing images from Google Earth, designed four criteria for remote sensing image aesthetic quality (color harmony, light and shadow, prominent theme, and visual balance), and labeled the samples based on expert photographers' judgments against these criteria. Second, we fed RSAD into a ResNet-18 architecture for training. Experimental results show that the proposed method accurately identifies visually pleasing remote sensing images. Finally, we provide a visual explanation of the aesthetic assessment by adopting Gradient-weighted Class Activation Mapping (Grad-CAM) to highlight the image regions that influenced the model's decision. Overall, this paper is the first to propose and realize automatic aesthetic assessment of remote sensing images, contributing to non-scientific applications of remote sensing and demonstrating the interpretability of deep-learning-based image aesthetic evaluation.
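The Grad-CAM step described in the abstract computes a class-discriminative heatmap from a convolutional layer's activations and their gradients. The following is a minimal numpy sketch of that computation only, not the paper's implementation: it assumes the activations and gradients of the chosen layer (e.g., the last conv block of ResNet-18) have already been extracted, and the function name `grad_cam` is illustrative.

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    """Grad-CAM heatmap from one conv layer's activations and gradients.

    feature_maps: (C, H, W) activations of the chosen conv layer
    gradients:    (C, H, W) gradients of the class score w.r.t. those activations
    """
    # Channel weights: global-average-pool the gradients over the spatial dims
    weights = gradients.mean(axis=(1, 2))              # shape (C,)
    # Weighted sum of feature maps, then ReLU to keep only positive influence
    cam = np.tensordot(weights, feature_maps, axes=1)  # shape (H, W)
    cam = np.maximum(cam, 0.0)
    # Normalize to [0, 1] so the map can be upsampled and overlaid on the image
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam
```

In practice the resulting (H, W) map is resized to the input resolution and overlaid on the remote sensing image to show which regions drove the aesthetic-quality prediction.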
