The risks and opportunities associated with chatbots using generative AI for children

This Fellowship involved independent research into the risks and opportunities that generative AI chatbots present for children. The aim was to understand children’s use of these chatbots through a review of existing research and recent technological developments in this area. The Fellow, Thao Do, sought to identify where children are likely to engage with generative AI chatbots, documenting emerging developments such as where this technology is embedded in services popular among children. Alongside the potential opportunities of new and emerging technologies, chatbots may be a route through which children encounter potentially harmful content produced by generative AI. Some of this content may be ‘content that is harmful to children’ as defined in the Online Safety Bill.

The aim was to reach a better understanding of:

  • the ages of children engaging with chatbots using generative AI
  • the kinds of services on which children encounter chatbots
  • the kinds of activities children use chatbots for, and the content they encounter via generative AI chatbots
  • where services deploy such chatbots (including whether they operate as standalone products or are integrated into other products or technologies)
  • whether such content is harmful to children as set out in the Online Safety Bill.

Outputs

Disclaimer: This paper represents the views and opinions of the author and should not be taken as a statement of Ofcom policy or opinion.


Hosted by Ofcom

Ofcom is the UK’s communications regulator, with oversight of the TV, radio and video-on-demand sectors, fixed-line telecoms, mobile and postal services. The protection of children online is a key area of work for Ofcom as it prepares to regulate online safety. In an emerging technological environment, it is vital to understand the risks that new technologies may pose to children and, in particular, how these interact with the categories of content harmful to children to be set out in legislation.