{"id":63771,"date":"2026-03-19T09:17:51","date_gmt":"2026-03-19T02:17:51","guid":{"rendered":"https:\/\/hbbgroup.net\/coalition-urges-openai-to-scrap-ai-ballot-measure-over-child-safety-concerns\/"},"modified":"2026-03-19T09:17:51","modified_gmt":"2026-03-19T02:17:51","slug":"coalition-urges-openai-to-scrap-ai-ballot-measure-over-child-safety-concerns","status":"publish","type":"post","link":"https:\/\/hbbgroup.net\/vi\/coalition-urges-openai-to-scrap-ai-ballot-measure-over-child-safety-concerns\/","title":{"rendered":"Coalition Urges OpenAI to Scrap AI Ballot Measure Over Child Safety Concerns"},"content":{"rendered":"<div>\n<div>\n<h4 color=\"#333\">In brief<\/h4>\n<ul>\n<li>A coalition of advocacy groups asks OpenAI to withdraw a California AI safety ballot initiative.<\/li>\n<li>Critics say the measure would limit legal accountability and weaken protections for children.<\/li>\n<li>While OpenAI has paused the campaign, the coalition claims it retains control of the initiative ahead of key deadlines.<\/li>\n<\/ul>\n<\/div>\n<p>A coalition of advocacy groups is urging ChatGPT developer OpenAI to withdraw a California ballot initiative that critics say could weaken protections for children and limit legal accountability for AI companies.<\/p>\n<p>In a letter sent to OpenAI on Wednesday, reviewed by <i>Decrypt<\/i>, the group argues that the measure would lock in narrow child-safety protections, limit families\u2019 ability to sue, and restrict California\u2019s ability to strengthen AI laws in the future.<\/p>\n<p>The letter, signed by more than two dozen organizations including AI policy non-profit Encode AI, the Center for Humane Technology, and the Electronic Privacy Information Center, asks OpenAI to dissolve its ballot committee and step back from the proposal while lawmakers work on legislation.<\/p>\n<p>\u201cThe main demand here is for OpenAI to withdraw from the ballot,\u201d Adam Billen, co-executive director of Encode AI, told 
<i>Decrypt.<\/i><\/p>\n<p>The dispute centers on a proposed \u201cParents &#038; Kids Safe AI Act,\u201d a California ballot <a href=\"https:\/\/www.commonsensemedia.org\/press-releases\/common-sense-media-openai-join-forces-on-strongest-youth-ai-safety-measure-in-us\" target=\"_blank\" rel=\"noopener nofollow external\">initiative<\/a> backed by OpenAI and Common Sense Media that would establish rules for how AI chatbots interact with minors, including safety requirements and compliance standards.<\/p>\n<p>In the letter, the groups argue that those rules fall short. They say the measure defines harm too narrowly, limits enforcement, and restricts families&#8217; ability to bring claims when children are harmed.<\/p>\n<p>But OpenAI controls the actual ballot initiative, Billen said.<\/p>\n<p>\u201cOpenAI has the power to withdraw it or put the money in for signatures. All of the legal authority rests in their hands,\u201d he said. \u201cThey have not actually withdrawn the initiative from the ballot. This is a common tactic in California, where you put an initiative up and put money in the committee.\u201d<\/p>\n<p>The letter points to the initiative\u2019s definition of \u201csevere harm,\u201d which focuses on physical injury tied to <a href=\"https:\/\/decrypt.co\/353927\/google-character-ai-settle-us-lawsuit-teens-suicide\" target=\"_blank\" rel=\"noopener\">suicide<\/a> or violence, excluding a range of <a href=\"https:\/\/decrypt.co\/287925\/character-ai-safety-rules-teen-user-commits-suicide\" target=\"_blank\" rel=\"noopener\">mental health<\/a> impacts that researchers and families have raised as concerns.<\/p>\n<p>It also highlights provisions that would bar parents and children from bringing claims under the initiative and limit enforcement tools available to state and local officials.<\/p>\n<p>Another concern centers on how the proposal treats user data. 
The groups argue that its definition of encrypted user content could make it harder to access chatbot conversations that have served as key evidence in recent lawsuits.<\/p>\n<p>\u201cWe read that as an attempt to block families from being able to disclose their dead children\u2019s chat logs in court,\u201d Billen said.<\/p>\n<p>The letter also warns that the measure could be difficult to revise if passed: amending it would require a two-thirds vote in the legislature, and future changes would be tied to standards such as supporting \u201ceconomic progress,\u201d which advocates say could limit lawmakers\u2019 ability to respond to new risks.<\/p>\n<p>Billen said the initiative remains a factor in ongoing negotiations in Sacramento, even as OpenAI has paused its efforts to qualify it for the ballot.<\/p>\n<p>\u201cThey have $10 million in the committee, and then you say to the legislature, if you don&#8217;t do what we want, we&#8217;ll put the money in and get the signatures and put this on the ballot, and if it passes, it will override whatever the legislature does,\u201d he said. \u201cSo essentially, what&#8217;s happening now is they&#8217;re trying to steer and control what state legislators do through the use of the initiative as a threat they&#8217;re leaving on the table.\u201d<\/p>\n<p>OpenAI is not the only company facing scrutiny over chatbot-related harms. Earlier this month, the family of Jonathan Gavalas <a href=\"https:\/\/decrypt.co\/359966\/google-gemini-ai-pushed-florida-man-suicide-lawsuit\" target=\"_blank\" rel=\"noopener\">sued<\/a> Google, claiming that Gemini fueled a delusion that escalated into violence and, ultimately, his suicide.
Billen, however, said OpenAI\u2019s approach reflects a broader pattern in the tech industry.<\/p>\n<p>\u201cThe lobbying playbook that\u2019s getting used on AI from these big guys in particular\u2014the Googles, the Metas, Amazons\u2014is the same strategy that was used previously on other tech issues,\u201d he said.<\/p>\n<p>For now, the coalition is focused on getting OpenAI to withdraw the measure and allow lawmakers to move forward through the legislative process.<\/p>\n<p>\u201cIt\u2019s really important, particularly for the companies that are putting that technology out there, to not be the ones who are writing the rules that regulate them, because that\u2019s not meaningful protections,\u201d Billen said.<\/p>\n<p>OpenAI did not immediately respond to <i>Decrypt&#8217;s<\/i> request for comment.<\/p>\n<\/div>","protected":false},"excerpt":{"rendered":"<p>In brief A coalition of advocacy groups asks OpenAI to withdraw a California AI safety ballot initiative.
Critics say the [&hellip;]<\/p>","protected":false},"author":5,"featured_media":63773,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[220],"tags":[],"class_list":["post-63771","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-tien-dien-tu"],"acf":[],"_links":{"self":[{"href":"https:\/\/hbbgroup.net\/vi\/wp-json\/wp\/v2\/posts\/63771","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/hbbgroup.net\/vi\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/hbbgroup.net\/vi\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/hbbgroup.net\/vi\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/hbbgroup.net\/vi\/wp-json\/wp\/v2\/comments?post=63771"}],"version-history":[{"count":0,"href":"https:\/\/hbbgroup.net\/vi\/wp-json\/wp\/v2\/posts\/63771\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/hbbgroup.net\/vi\/wp-json\/wp\/v2\/media\/63773"}],"wp:attachment":[{"href":"https:\/\/hbbgroup.net\/vi\/wp-json\/wp\/v2\/media?parent=63771"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/hbbgroup.net\/vi\/wp-json\/wp\/v2\/categories?post=63771"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/hbbgroup.net\/vi\/wp-json\/wp\/v2\/tags?post=63771"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}