Full metadata record

DC Field: Value (Language)
dc.contributor.author: Casey Bennett
dc.date.accessioned: 2022-11-22T01:40:28Z
dc.date.available: 2022-11-22T01:40:28Z
dc.date.issued: 2022-02
dc.identifier.citation: MULTIMODAL TECHNOLOGIES AND INTERACTION, v. 6, no. 2, article no. 16, pp. 1-18 (en_US)
dc.identifier.issn: 2414-4088 (en_US)
dc.identifier.uri: https://www.mdpi.com/2414-4088/6/2/16 (en_US)
dc.identifier.uri: https://repository.hanyang.ac.kr/handle/20.500.11754/177145
dc.description.abstract: The development of new approaches for creating more “life-like” artificial intelligence (AI) capable of natural social interaction is of interest to a number of scientific fields, from virtual reality to human–robot interaction to natural language speech systems. Yet how such “Social AI” agents might be manifested remains an open question. Previous research has shown that both behavioral factors related to the artificial agent itself and contextual factors beyond the agent (i.e., interaction context) play a critical role in how people perceive interactions with interactive technology. As such, there is a need for customizable agents and customizable environments that allow us to explore both sides simultaneously. To that end, we describe here the development of a cooperative game environment and Social AI using a data-driven approach, which allows us to simultaneously manipulate different components of the social interaction (both behavioral and contextual). We conducted multiple human–human and human–AI interaction experiments to better understand the components necessary for the creation of a Social AI virtual avatar capable of autonomously speaking and interacting with humans in multiple languages during cooperative gameplay (in this case, a social survival video game) in context-relevant ways. (en_US)
dc.description.sponsorship: This research was supported through funding by a grant from the National Research Foundation of Korea (NRF) (Grant number: 2021R1G1A1003801). (en_US)
dc.language: en (en_US)
dc.publisher: MDPI (en_US)
dc.subject: human-robot interaction (en_US)
dc.subject: social cognition (en_US)
dc.subject: cooperative games (en_US)
dc.subject: speech systems (en_US)
dc.subject: virtual avatar (en_US)
dc.subject: autonomous agents (en_US)
dc.title: Exploring Data-Driven Components of Socially Intelligent AI through Cooperative Game Paradigms (en_US)
dc.type: Article (en_US)
dc.relation.no: 2
dc.relation.volume: 6
dc.identifier.doi: 10.3390/mti6020016 (en_US)
dc.relation.page: 1-18
dc.relation.journal: MULTIMODAL TECHNOLOGIES AND INTERACTION
dc.contributor.googleauthor: Bennett, Casey
dc.contributor.googleauthor: Weiss, Benjamin
dc.contributor.googleauthor: Suh, Jaeyoung
dc.contributor.googleauthor: Yoon, Eunseo
dc.contributor.googleauthor: Jeong, Jihong
dc.contributor.googleauthor: Chae, Yejin
dc.sector.campus: S
dc.sector.daehak: College of Engineering (공과대학)
dc.sector.department: Major in Data Science (데이터사이언스전공)
dc.identifier.pid: cabennet
dc.identifier.orcid: https://orcid.org/0000-0003-2012-9250



