A bill advancing through the California legislature seeks to address the harmful impacts of “companion” chatbots, artificial intelligence-powered systems designed to simulate human-like relationships and provide emotional support. They’re often marketed to vulnerable users like children and those in emotional distress.
Introduced by state Sen. Steve Padilla, the bill would require companies running companion chatbots to avoid using addictive tricks and unpredictable rewards. They'd be required to remind users at the start of each interaction, and every three hours thereafter, that they're talking to a machine, not a person. They'd also be required to clearly warn users that chatbots may not be suitable for minors.