Docuseries Highlights Need for Legal Protections for Kid Influencers

A new Netflix docuseries explores the unseen complexities and dark possibilities of child influencing in the modern internet age, raising urgent questions about what happens when a child's internet presence becomes work: a full-time job that, at times, financially supports their family. The series highlights the critical need for legal protections for kid influencers.

Released last week, “Bad Influence: The Dark Side of Kidfluencing” shares how YouTube star Piper Rockelle, who began posting videos at eight years old and garnered 12 million subscribers and about 1.87 billion views, and her “Squad” of fellow preteen social media influencers worked and lived in a toxic environment under Rockelle's “momager,” Tiffany Smith, and Smith's boyfriend, Hunter Hill.


The three-part exposé dives into the harsh, manipulative, and complex working conditions that “Squad” members experienced while working with Smith and Hill, who created a physically, mentally, and emotionally unsafe environment for the underage content creators.

In 2022, eleven former “Squad” members filed a complaint against Smith and Hill for “emotional, verbal, physical, and, at times, sexual abuse” when they were active members of the Squad. The child abuse lawsuit was settled in October 2024 for $1.85 million, far short of the $22 million originally sought, with all parties specifically disclaiming any liability.

All former “Squad” members who have spoken out remain deeply affected by the trauma caused by Smith and Hill: some have seen their online careers irreparably damaged, and others live with long-term post-traumatic stress. Attorney Matt Sarelson observed in the documentary that, “In many ways, a lawsuit is where justice goes to die.”

The viral series explains how managers of influencers have been able to circumvent child labor laws and protections put in place for children in the entertainment industry.

“These abuse allegations against Tiffany, which include battery and child labor violations, are not unique to the Piper Rockelle/Tiffany case,” journalist Taylor Lorenz said in the series. “These are common forms of abuse that are rampant in the ‘kidfluencer’ industry.”

Several culture experts in the series criticize how disconnected many political figures are from pop culture, emphasizing the importance of understanding it and acknowledging its significant impact on individuals and groups.

“‘Kidfluencing’ right now is the wild, wild west. I mean, there’s no regulations that keep these influencers safe,” said Brandon Stewart, content strategist and CEO of Brandon Studios.

“It’s an unregulated frontier of the entertainment industry,” shared Attorney Jeremiah D. Graham. “When a child is treated like this, they shouldn’t have to go out and hire private attorneys in order to vindicate their rights.”

“The government has absolutely no appetite to implement any sort of meaningful regulations in this industry. They still treat this industry as a joke,” said Lorenz. “Lawmakers are often 70 to 80 years old. They don’t take this world seriously at all. They make fun of it. They mock it…And until we start taking this industry seriously, until we start viewing influencing as labor, these kids are screwed.”

Legal Protections for Child and Teenage Influencers

Quit Clicking Kids, founded by Chris McCarty, who was featured in the docuseries, advocates for legislation that protects the well-being of child influencers. The initiative looks to expand protections for child actors to child influencers.

In 2022, McCarty worked with Washington State Rep. Emily Wicks (D) to craft and introduce HB 2023. The bill would require guardians to set aside a percentage of social media earnings for children featured in the content and, once they reach the age of 18, allow former child influencers to request the removal of content in which they appear. In 2023, the bill was reintroduced as HB 1627 by Washington State Rep. Kristine Reeves (D) with no changes.

“I think one of the biggest misconceptions is not seeing it as work, especially for the kids,” commented McCarty. “It is very much not a hobby for many of these influencers. It is a job. And in some cases, it’s the primary or even the only source of income for these families. That has the potential to place an undue burden on these children to create content.”

In 2023, Governor J.B. Pritzker (D) signed SB 1782 into law, making Illinois the first state to implement financial protections for child influencers.

In 2024, California Gov. Gavin Newsom (D) signed two bills that protect child and teenage influencers from financial abuse:

AB 1880 expanded the Coogan Law, which requires employers of child performers to set aside at least 15% of the child's gross earnings in a trust accessible once the child reaches adulthood, so that it also financially protects underage content creators.

SB 764 requires parents or guardians of minors featured in monetized online content to set aside a percentage of the minors' earnings in trust accounts.

Despite the growing call for legal protections, the pressing question remains:

How can we increase safety regulations to protect the entire well-being, not just the financial well-being, of child and teenage influencers?

Currently, advocates call for educating audiences and reforming internet culture to be more skeptical about child-centered content and to be more concerned for the well-being of children featured in monetized content. Others point to social media platforms, stating that it is their responsibility to rethink their business models and prioritize the safety of children.

“We really just need to educate people. We need to change the culture. We need to change norms around parenting,” stated Lorenz. “The fundamental problem is the business model of these platforms and these capitalist incentives.”

Belén Dumont is a freelance reporter and associate editor at The Fulcrum.

