Lawmakers are starting to take action against social media companies for the perceived effects of their platforms on children, targeting algorithms that recommend explicit content to underage users — but some worry these efforts might fall short.
An investigation by The Wall Street Journal published in July found that young users on TikTok were being recommended images and videos of drug use, pornography and links to adult websites, including OnlyFans. The National Center on Sexual Exploitation (NCOSE) conducted an investigation of its own into the same phenomenon, also finding that children were being recommended sexually explicit content.
Earlier this month, testimony and revelations from former Facebook employee Frances Haugen shed light on how Instagram’s engagement-based ranking algorithm recommends content to young users that may negatively affect their mental health.
Lawmakers have sought to address these issues by floating possible solutions and proposing legislation that takes aim at algorithms that recommend potentially harmful content. Executives from TikTok, Snap and YouTube are due to testify Tuesday before the Senate Commerce Committee on the harms their platforms may pose to young users, the latest in a series of hearings examining social media companies' impact on children.
“The bombshell reports about Facebook & Instagram—their toxic impacts on young users & lack of truth or transparency—raise serious concerns about Big Tech’s approach toward kids across the board,” Democratic Sen. Richard Blumenthal of Connecticut said in a statement announcing the hearing.
House Democrats proposed a bill earlier this month stripping Section 230 liability protections from social media platforms for “personalized” recommendations of content that contribute to “physical or severe emotional injury.” If enacted, the law would allow users to sue social media companies for recommending harmful content to them.
Republican Sen. John Thune of South Dakota advocated for a bipartisan bill he co-sponsored with Blumenthal and Republican Sen. Marsha Blackburn of Tennessee, known as the Filter Bubble Transparency Act, that would allow users to opt out of newsfeeds tailored by algorithms and require platforms to notify users on how they recommend content.
Blumenthal and Democratic Sen. Ed Markey of Massachusetts, along with Democratic Rep. Kathy Castor of Florida, reintroduced the Kids Internet Design and Safety (KIDS) Act in late September in reaction to Haugen’s leaks. The bill, among other things, prohibits platforms that are “directed to children” from amplifying sexually explicit content or content depicting select types of violence through algorithms.
Some child advocates, while supportive of the lawmakers' efforts, believe the focus on algorithms is misguided and suggest a more holistic approach.
“It would be a mistake to focus there, or think that’s going to be a solution to these wider problems,” Lina Nealon, director of corporate and strategic initiatives at the NCOSE, told the Daily Caller News Foundation.
Nealon said she strongly favors the KIDS Act but believes lawmakers should also seek to involve parents in making platforms safer for children. She favors regulations that would require more parental involvement when a child registers for a social media account, as well as parental controls over certain functions like direct messaging.
“You need to give parents more control and oversight of the content their kids are able to access, as well as the people they’re interacting with online,” Nealon said.
Josh Golin, executive director of child advocacy organization Fairplay, told the DCNF that while the focus on algorithms is well-intentioned and a step in the right direction, enhancing platform transparency and imposing stricter privacy protections for children online must be part of any solution.
“You need to address both ends. You need to address this both from a privacy perspective and a design perspective,” Golin said. “If you limit the amount of data collected, you can limit the kind of personalized content that’s harming kids.”
Golin also backed the KIDS Act, and he suggested strengthening privacy requirements for teens by raising the age of individuals covered by the Children’s Online Privacy Protection Act (COPPA). COPPA imposes limits on what data tech companies can collect from children under the age of 13.
“I think it’s insane that a 13-year-old is treated like an adult online,” Golin said.
Social media platforms base their personalized content recommendations on data collected from users by tracking their behavior, often across multiple platforms. Golin argued that restricting how much data social media companies can collect on children would help to mitigate the ease with which they may be targeted by recommendation algorithms.
Solutions that avoid direct regulation of algorithms may also be more politically viable, as restricting how social media companies amplify and distribute content could affect free speech.
“Every time a court has looked at an attempt to limit the distribution of particular kinds of speech, they’ve said, ‘This is exactly the same as if we had banned that speech outright. We recognize no distinction,’” Daphne Keller, director of the Program on Platform Regulation at Stanford University’s Cyber Policy Center, told The Washington Post earlier in October.
Nealon and Golin also both backed the CAMRA Act, a bipartisan bill that seeks to shed more light on the effects of social media platforms on child and teen users. If enacted, the law would fund a National Institutes of Health research project into how technology impacts children and adolescents.
“It seems like a no-brainer to know how these platforms affect children. These platforms are so opaque, and transparency would go a long way,” Nealon said.
LifeNews Note: Ailan Evans writes for Daily Caller. Content created by The Daily Caller News Foundation is available without charge to any eligible news publisher that can provide a large audience.