Privacy experts who spoke to WIRED described Rumble, Quora, and WeChat as unusual suspects, but declined to speculate on why they were included in the investigation. Josh Golin, executive director of the nonprofit Fairplay, which advocates for digital safety for children, says the concerns aren't always obvious in advance. Few advocacy groups were concerned about Pinterest, he says, until the case of a British teenager who died of self-harm after being exposed to sensitive content on the platform.
Paxton's press release last month called his new investigation "a critical step toward ensuring that social media and artificial intelligence companies comply with our laws designed to protect children from exploitation and harm."
The US Congress has never passed comprehensive privacy legislation and has not significantly updated children’s online safety laws in a quarter of a century. This has given state legislators and regulators a big role to play.
Paxton's investigation focuses on compliance with the Texas Securing Children Online through Parental Empowerment Act, or SCOPE Act, which went into effect in September. It applies to any website or application that has a social media or chat function and registers users under 18, going beyond federal law, which covers only services for users under 13.
The SCOPE Act requires covered services to ask users' ages and to give parents or guardians control over children's account settings and user data. Companies are also prohibited from selling information collected about minors without parental permission. In October, Paxton sued TikTok for violating the law by providing inadequate parental controls and disclosing data without consent. TikTok has denied the allegations.
The investigation announced last month also cited the Texas Data Privacy and Security Act, or TDPSA, which went into effect in July and requires parental consent before processing data about users under 13. The legal demands issued under the SCOPE Act and the TDPSA were obtained through public records requests.
In all, the companies must answer eight questions by next week, including how many users counted as Texas minors they have banned for entering an incorrect date of birth. They must also turn over lists of the parties to which minors' data is sold or shared. It could not be learned whether any company has already responded to the demands.
Tech lobby groups are challenging the SCOPE Act's constitutionality in court. In August, they scored an initial, partial victory when a federal judge in Austin, Texas, blocked enforcement of provisions requiring companies to take steps to prevent minors from viewing offensive and harmful content.
But even a complete victory may not be a panacea for tech companies. States including Maryland and New York are expected to implement similar laws later this year, says Arielle Fox Johnson, an attorney and director of Smart Digital Law and Policy Advisors. And state attorneys general can resort to pursuing more limited cases under their tried-and-true laws barring deceptive business practices. "What we see often is that information is shared or sold or disclosed in ways that families don't expect or understand," Johnson says. As more laws imposing stricter requirements take effect, she adds, it becomes clearer that not everyone is following them.