Earlier this week, two senators introduced a bill that would obligate online companies to act in the best interests of children and to prevent or mitigate certain harms.
Sens. Richard Blumenthal (D-Connecticut) and Marsha Blackburn (R-Tennessee), the chair and ranking member of the Senate Commerce subcommittee on consumer protection, introduced the Kids Online Safety Act. If the measure became law, it would have a major impact on how platforms like Facebook parent Meta, Snap, Google, and TikTok are designed.
Former Facebook employee Frances Haugen, who also testified before the panel, provided the subcommittee with tens of thousands of pages of internal documents. According to those documents, the company conducted research on its platforms’ effects on children’s mental health and concluded that some of those effects were negative. Lawmakers who questioned Facebook executives, including Instagram head Adam Mosseri, were frustrated that the company had not done more to change its services in light of the research findings. The Kids Online Safety Act aims to strengthen protections for children under the age of 16 by imposing requirements on online platforms that are “reasonably expected to be used” by minors.
These companies would have to provide tools, easily accessible to kids or their guardians, that let them “manage their experience and personal data.” That includes platform settings that allow children to limit others’ ability to find them online, restrict the amount of data that may be collected about them, and opt their data out of algorithms that use it to make recommendations.
“I think we are on the threshold of a new age for Big Tech that is imposition of a sense of responsibility that has been absolutely missing so far,” Blumenthal told reporters at a news conference on Wednesday, saying the approach is not just doable and practical but has been proven effective. He pointed to European standards that experts described during one of the subcommittee’s hearings.
Blumenthal says he is open to hearing from the tech corporations that would be affected by the law, but he is skeptical of their motives. Too often in the past, he said, they have used their armies of attorneys and lobbyists to block legislation. He compared the bill to other kinds of product safety laws.
Blumenthal said that for “far too long,” the internet has been treated as distinct from other products. The bill would put guardrails and safety measures in place online, helping both children and their parents stay secure from online predators. Under the legislation, platforms must also make the strongest version of these safety measures the default, and services would be prohibited from encouraging children to disable them.
Covered platforms would be required to issue annual public reports based on independent, third-party assessments of the risks of harm to children. The bill would also give researchers certified by the National Telecommunications and Information Administration access to platform data in order to conduct public-interest research on the harms kids face online. Unlike some other measures aimed at online platforms, the act includes no minimum size threshold for accountability. Toxic content does not have to be seen by large numbers of people to be harmful, according to Blumenthal.
The bill also directs government agencies to determine the best ways to protect young people on these services. The FTC, for example, is tasked with creating guidance for covered platforms on how to conduct market and product-focused research on minors. The National Telecommunications and Information Administration (NTIA) is also required to study how platforms can most effectively and reliably verify the ages of their users. Under the legislation, the Commerce Secretary would assemble a new council of parents, experts, tech representatives, law enforcement officials, and young people to provide advice on carrying out the law’s provisions. The Federal Trade Commission and state attorneys general would be responsible for enforcing it.
Blumenthal said he hopes the Commerce Committee will move quickly to mark up the bill so the full Senate can debate it. Blackburn said she and her colleagues frequently hear concerns about children’s internet safety from their constituents. There is a constant chorus of people demanding, “There needs to be something done about this,” she added.