California Bill Aims to Regulate Chatbots Interacting with Children

A new bill proposed by California Senator Steve Padilla would require chatbots that interact with children to offer occasional reminders that they are, in fact, a machine and not a real person. The bill, SB 243, is part of an effort to regulate the safeguards that companies operating chatbots must put in place to protect children.

Key Requirements of the Bill

The bill would establish several requirements to ensure the safe use of chatbots by children. These include:

  • Banning companies from providing rewards to users to increase engagement or usage
  • Requiring companies to report to the State Department of Health Care Services how frequently minors using their chatbots display signs of suicidal ideation
  • Providing periodic reminders that chatbots are AI-generated and not human

The Risks of Chatbots for Children

Research has shown that children are more likely than adults to view AI chatbots as trustworthy, even viewing them as quasi-human. This can put children at significant risk when chatbots respond to their prompting without any sort of protection in place. For example, researchers were able to get Snapchat’s built-in AI to provide instructions to a hypothetical 13-year-old user on how to lie to her parents to meet up with a 30-year-old and lose her virginity.

The Importance of Interventions

While there are potential benefits to kids feeling free to share their feelings with a bot, the risk of isolation is real. Periodic reminders that there is no person on the other end of the conversation may help, and disrupting the dopamine-driven cycle of addiction that tech platforms are so adept at trapping kids in is a good starting point.

Addressing the Root Issues

However, these protections won’t address the root issues that lead to kids seeking out the support of chatbots in the first place. There is a severe lack of resources available to facilitate real-life relationships for kids. Classrooms are over-stuffed and underfunded, after-school programs are on the decline, and "third places" continue to disappear. There is also a shortage of child psychologists to help kids process everything they are dealing with.

A More Comprehensive Approach

It’s good to remind kids that chatbots aren’t real, but it would be better to put them in situations where they don’t feel the need to talk to bots at all. That requires a more comprehensive approach, one that addresses the underlying issues driving kids toward chatbots in the first place.
