CARE-Kids: Fostering Critical Thinking and Gender Bias Recognition in Children

CARE-Kids logo

The widespread adoption of generative AI risks increasing the reproduction of stereotyped digital content and, therefore, spreading gender biases to a broader audience. This has prompted growing interest in building children’s AI literacy and mitigating the potential negative impact of biased AI-generated content on their well-being. CARE-Kids tackles this challenge by empowering children (aged 8 to 12) to think critically about AI-generated content, educating them to recognise stereotypes in these images, and helping them become aware of their beliefs about themselves and others. CARE-Kids aims to deconstruct the stereotypes perpetuated by biased AI algorithms (and by society) by fostering children’s active engagement, thereby mitigating their impact on children’s lives.

We will develop a web app that enhances children’s critical thinking by enabling them to interact with AI-generated content and recognise the gender biases it may contain. The project’s innovation lies in using AI-generated images to raise children’s awareness of gender biases, with the aim of:

  1. Conducting research to identify existing gender biases in AI-generated images.
  2. Improving understanding of how to promote critical thinking and gender bias recognition.

Following a participatory design approach, CARE-Kids will address two main questions:

  1. How does the app enhance children’s critical understanding and recognition of gender biases in AI-generated images?
  2. In what ways does the app contribute to broader efforts to promote digital equity, meaningful digital inclusion, and AI literacy in schools located in marginalised areas?

Through collaborative partnerships with local communities and a holistic approach, we envision creating a lasting impact on children’s digital literacy and societal attitudes by providing a meaningful experience. The interdisciplinary team comprises a computer scientist with expertise in AI and a child-computer interaction scholar, covering all the skills needed to achieve the project objectives.

Outcomes

Building on the success of the initial programme, CARE-Kids has been extended to allow further workshops in the two participating schools, further co-design sessions with the teachers, and refinement of the prototype and the storytelling generative AI tool.

Phase one outputs include:

  • A literature review covering gender stereotypes in children’s media, AI literacy, biases in generative AI, critical thinking, and generative AI tools for children
  • The CareKids tool: an image-generation AI tool to be used by children
  • A questionnaire to assess children’s knowledge and endorsement of gender stereotypes (Wood et al., 2022)
  • A dataset of 216 images created by researchers using a standard generative AI tool and classified by 111 adults as feminine/masculine/neutral
  • A dataset of 1,789 images created by children using the CareKids tool, associated with female/male traits, emotions, and hobbies, and classified by four experts as feminine/masculine/neutral (algorithmic auditing)
  • A paper accepted at the ACM CHI conference

Team

Dr Elisa Rubegni – Senior Lecturer, Lancaster University, School of Computing and Communication