A Perspective on Building Ethical Datasets for Children's Conversational Agents
- PMID: 34056578
- PMCID: PMC8155711
- DOI: 10.3389/frai.2021.637532
Abstract
Artificial intelligence (AI)-powered technologies are becoming an integral part of youth's environments, impacting how they socialize and learn. Children (12 years of age and younger) often interact with AI through conversational agents (e.g., Siri and Alexa) that they speak with to receive information about the world. Conversational agents can mimic human social interactions, and it is important to develop socially intelligent agents appropriate for younger populations. Yet it is often unclear what data are curated to power many of these systems. This article applies a sociocultural developmental approach to examine child-centric intelligent conversational agents, including an overview of how children's development influences their social learning in the world and how that relates to AI. Examples are presented that reflect potential data types available for training AI models to generate children's conversational agents' speech. The ethical implications for building different datasets and training models using them are discussed as well as future directions for the use of social AI-driven technology for children.
Keywords: artificial intelligence; children; conversational agents; datasets; ethics.
Copyright © 2021 Bailey, Patel and Gurari.
Conflict of interest statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.