Effect of AI chatbot emotional disclosure on user satisfaction and reuse intention for mental health counseling: a serial mediation model.

Gain Park, Jiyun Chung, Seyoung Lee
Author Information
  1. Gain Park: Department of Journalism and Media Studies, New Mexico State University, 2915 McFie Circle, Milton Hall 158, Las Cruces, NM 88003, USA.
  2. Jiyun Chung: Convergence and Open Sharing System-Artificial Intelligence, Sungkyunkwan University, 25-2 Sungkyunkwan-Ro, 50212 Hoam Hall, Jongno-Gu, Seoul 03063, South Korea.
  3. Seyoung Lee: Department of Media and Communication, Sungkyunkwan University, 25-2 Sungkyunkwan-Ro, 50505 Hoam Hall, Jongno-Gu, Seoul 03063, South Korea.

Abstract

This study explored the effect of chatbot emotional disclosure on user satisfaction and reuse intention for a chatbot counseling service. It also examined the independent and serial mediating roles of user emotional disclosure intention and perceived intimacy with a chatbot in the relationship between chatbot emotional disclosure, user satisfaction, and reuse intention for chatbot counseling. In total, 348 American adults were recruited to participate in a mental health counseling session with one of two types of artificial intelligence-powered mental health counseling chatbots: a chatbot disclosing factual information only or a chatbot disclosing humanlike emotions. The results revealed that chatbot emotional disclosure significantly increased user satisfaction and reuse intention for the chatbot counseling service. They further revealed that user emotional disclosure intention and perceived intimacy with the chatbot independently and serially mediated the effect of chatbot emotional disclosure on user satisfaction and reuse intention. These findings indicate positive effects of artificial emotions and their disclosure in the context of chatbot-moderated mental health counseling. Practical implications and psychological mechanisms are discussed.
