Front Artif Intell. 2021 May 13;4:637532. doi: 10.3389/frai.2021.637532. eCollection 2021.

A Perspective on Building Ethical Datasets for Children's Conversational Agents

Jakki O Bailey et al. Front Artif Intell.

Abstract

Artificial intelligence (AI)-powered technologies are becoming an integral part of youth's environments, impacting how they socialize and learn. Children (12 years of age and younger) often interact with AI through conversational agents (e.g., Siri and Alexa) that they speak with to receive information about the world. Conversational agents can mimic human social interactions, and it is important to develop socially intelligent agents appropriate for younger populations. Yet it is often unclear what data are curated to power many of these systems. This article applies a sociocultural developmental approach to examine child-centric intelligent conversational agents, including an overview of how children's development influences their social learning in the world and how that relates to AI. Examples are presented that reflect potential data types available for training AI models to generate children's conversational agents' speech. The ethical implications for building different datasets and training models using them are discussed as well as future directions for the use of social AI-driven technology for children.

Keywords: artificial intelligence; children; conversational agents; datasets; ethics.


Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
