Nicki Petrossi: Building a Platform for the Families Big Tech Ignores
After thousands of hours of conversations with affected families, experts, whistleblowers: Nicki's take on how we can stand up to social media harms
If research says social media harms kids, and children are actually dying — shouldn’t every parent know about it? That question led Nicki Petrossi to walk away from a career in social media marketing and launch Scrolling2Death, a podcast that investigates Big Tech’s harms and amplifies the voices of survivor parents, doctors, whistleblowers, and other social media experts.
Soon after I came out with my lawsuit and first-hand experience of the lengths Meta leadership was willing to go to cover up their harms to children in order to maximize profits, I heard from Nicki—she invited me to tell my story on her podcast.
Since 2023, Nicki has built a multi-pronged advocacy operation: co-founding the Tech-Safe Learning Coalition, launching the S2D Foundation, and becoming a regular presence on Capitol Hill pushing for meaningful legislation.
In large part thanks to Nicki’s connector nature, I’ve had the opportunity to join these advocacy efforts on Capitol Hill and have formed many meaningful relationships with survivor families and other advocates.
And today, I’ve joined her again as part of The Heat is On, Scrolling2Death’s miniseries in partnership with the Heat Initiative to expose Big Tech’s most powerful companies. The series features other experts like whistleblower Arturo Bejar, Anxious Generation author Jonathan Haidt, and advocate and attorney Laura Marquez-Garrett.
Part 2 on Meta will be out next week!
Inspired by Nicki and her advocacy, I also had some burning questions for her which she graciously agreed to answer!
We discussed the stories that keep her up at night, what’s broken about the legislative process, and why she’ll never download TikTok.
Here is that conversation:
On what every parent deserves to know:
Kelly: After leading a successful social media marketing company, you made the choice to walk away and publicly challenge the industry. What was the turning point that prompted you to launch Scrolling2Death?
Nicki: I would say it was more of a slow progression, rather than a single turning point. I knew I hated my job. Targeting people on social media and manipulating their thoughts and decisions did not feel good, especially when I wasn’t active on social media at all myself. From 2020 to 2023, I didn’t engage on social media in any personal capacity. My life was never better. My relationships were never stronger. I was raising my kids, being present with them.
But when I looked around me in the world, I saw everyone else addicted to their devices and to social media: grandparents, parents, children of all ages. I wondered, didn’t these people know about the latest Surgeon General’s warning about social media? Don’t they know that smartphones and social media are super unhealthy for kids?
That’s when I started to dig further and found the story of Carson Bride, a 16-year-old who had been cyberbullied relentlessly through anonymous apps and Snapchat. It was so bad that he took his life. Carson’s mom, Kristin, was speaking out about online harms and warning parents.
I thought: If research says it’s bad, and children are actually dying from social media…shouldn’t every parent know about this?
I made it my goal to tell the stories and to share the research so that parents everywhere can make safe decisions in their homes. That was where Scrolling2Death started, and Kristin was one of my first interviews.
“I first heard Nicki tell Carson’s story in Sept 2023 in one of her initial podcasts. I was so impressed with the respectful manner in which she handled his story and my advocacy work. I could tell that she genuinely cares about our kids and what Big Tech has put them through.
Nicki’s videos have grown so much over the last few years due to the sad fact that the online harms to our kids are growing exponentially. Her daily video posts have become an invaluable resource for so many parents and are ultimately saving young lives.”
- Kristin Bride, mother of Carson Bride, forever 16, founding member of ParentsRISE!, and Executive Director of the Carson J. Bride Effect
On the hard questions driving Nicki’s work:
Kelly: You’ve said this journey started with asking “why” questions, like why we spend so much time on platforms that leave us feeling bad. What were some of those “whys” and what answers have you uncovered to those initial questions since immersing yourself in this work?
Nicki: Why do we use platforms that make us feel bad? Well, it’s because we’ve been manipulated into this life. Our attention has been sold and tech companies make billions on keeping us scrolling.
Why do parents give kids access to smartphones and platforms that we know, deep down, are bad for them? Because our kids get teased and left out if we don’t.
Why are we losing this fight? Because we didn’t quite understand the intentionality of the harm. We thought we were accepting a little bad with a lot of good, and that it was bad people on the platforms sneaking through. Turns out, the platforms are also the bad guys.
In the first year or so of the Scrolling2Death journey, I was fact-finding: answering “why” questions and investigating the problems that these companies cause. I took the time to educate myself and others on where the harms lie, and who is at fault.
Then, it was time for solutions. Solutions in the form of lawsuits, legislation and direct pressure on these negligent companies.
Kelly: What pressing questions about youth and digital culture have come up since starting this work, and what do you think society still isn’t confronting?
Nicki: I often hear, “It’s just the way it is. We can’t change it.” Or…”Technology is here to stay.”
It’s not about getting rid of technology and its benefits. It’s about challenging the business practices of companies who harm children. Engagement at all costs doesn’t align with a child’s well-being. Period.
So the digital culture of young people living their entire lives online…it’s not working. It’s not good for them. We know it. They know it. So, we have to change it.
Parents, advocates and, most importantly, youth are fighting back. Working together across generations will be the key to putting humans back in the driver’s seat.
On what every member of Congress needs to hear:
Kelly: Across all of the tech experts, young people, and survivor parents you’ve interviewed, what one moment, story, or theme sticks with you the most? What’s the story you wish every member of Congress could hear before they vote on any tech legislation?
Nicki: There are so many. Too many.
Snapchat allowing a speed filter for 8 years, rewarding users for driving fast and causing many car crashes and deaths.
YouTube and TikTok pushing choking challenge videos to children too young to even be on their platform, causing their deaths.
OpenAI and Google’s chatbots suicide-coaching children.
Children falling victim to sextortion, leading to suicide within 2 days of signing up for Instagram.
Teen test accounts on Pinterest and Snapchat turning into a constant stream of pro-self-harm and suicide content. I’m talking cut wrists, blades and nooses.
I could go on and on. But what sticks with me most is the survivor parents. I think of them every day; the grief and sadness they live with. I would ask every member of Congress (and our President, for that matter) to sit down with a parent who has been affected by social media harms. This should be a requirement before drafting or passing any related legislation.
On fighting a system many of us exist within:
Kelly: It’s near impossible to succeed at podcasting without relying on community and distribution from exploitative online platforms. How do you think about these tradeoffs and what advice do you have for others trying to build digital businesses who are struggling with this tension?
Nicki: If my target audience were minors and teenagers, I would not be on Instagram. I don’t believe in promoting content to consumers who are overwhelmingly harmed on that same platform.
My audience is parents. Parents are on Instagram. And not to say that parents aren’t addicted to Instagram, too. But I will continue to reach parents where they are, with hopes that a safer option for connection will reveal itself in 2026. I’m actually in talks with several safer platforms and am looking forward to sharing more.
I think it’s a balance and sometimes it feels good to take down a company from within. When an Instagram post about how bad Instagram is hits 1 million views, it feels like a win. 1 million people – parents – becoming more aware of the harms of that platform and keeping their children off.
I do draw the line at TikTok. That is a platform I will never download to my devices. And you will never see me monetizing through Meta’s platforms.
So I would recommend content creators define their own values and stick to them.
Kelly: I’ve experienced my own guilt and conflict about having contributed to the very system I’m now fighting to reform. Has this also been your experience? What insight does your own experience give you into how well-intentioned people become entangled in harmful tech practices, and what would you say to others in the industry who suspect something is wrong but feel stuck or fearful about speaking up?
Nicki: I did work in the social media management space; helping tech companies reach their customers through social media content. In my case, the customers were other businesses and decision-makers at those businesses. Definitely not minors and most of my work was done on LinkedIn and Twitter.
So fortunately, I don’t fight the guilt about contributing to the system. But I have spoken to many whistleblowers, like you, Kelly, who worked within the system, found their way out and now fight back. Some are louder than others.
Early on, I often wondered…how could someone possibly work at Meta? Or Snap? How do they sleep at night?
I see now that those were naive sentiments (unless we’re talking about the top execs). Many of the people who work at the tech companies want to help consumers; want to help children. They hope that their work can make a difference. Apparently, though, it doesn’t take long for them to realize that any real safety measures will get shot down from the top.
So what do they do then…with kids and a mortgage? Leave this company and take a much lower salary? Speak out and risk being “unemployable”? It’s a tough choice these people are making and I’m in awe of the whistleblowers who have done the right thing; the brave thing.
We need more people speaking out.
On making an impact from content to Capitol Hill:
Kelly: In addition to Scrolling2Death, you’ve co-founded initiatives like the Tech-Safe Learning Coalition to tackle school-issued devices, and launched the S2D Foundation to support advocacy work. Considering your fight from content to Capitol Hill, what has surprised or challenged you most about trying to change tech culture broadly?
Nicki: I’ve been shocked at how these companies straight-up lie and scheme.
The amount of money they spend lobbying lawmakers with lies is astounding.
The PR campaigns are elaborate and timed specifically to drown out bad press.
And this applies to tech companies across the industry, whether their focus is social media, messaging, hardware, EdTech or AI.
You think to yourself, “These are human beings with children. Mark Zuckerberg has kids. Surely he won’t keep making decisions that cause harm to children.”
Unfortunately, he will. And he is. That is what shocks me most.
Kelly: How have your experiences hosting Scrolling2Death influenced your approach to legislative advocacy and vice versa?
Nicki: I’ve learned a lot about the legislative process, some of it frustrating. I’ve learned that bills are often “watered down” by lawmakers who are influenced by tech companies’ lobbyists. It’s pretty hard to get a good bill that actually puts children over Big Tech profits.
And when that does happen and it passes one chamber, a single person (like the Speaker of the House) can throw a wrench in the whole thing.
I’ve learned that even if a law passes, the tech companies sue the government and hold the law up in court for years.
I’ve learned that even if a law passes, the President can sign an Executive Order pushing back enforcement for months or even years.
Suffice it to say, I’m not overly impressed with the legislative process at the moment. There are so many hurdles and loopholes and backdoors.
There is progress being made more quickly on the state level, but now we have the federal government threatening an Executive Order restricting states from passing AI-related laws.
So I’ve turned a lot of my efforts to the legal and direct action work, which seems to be more promising when it comes to real change that will stick.

On authenticity in digital spaces and 5 am recordings:
Kelly: One issue you’ve raised is how social media encourages us to present an edited, “fake” version of ourselves. For young people still figuring out who they are, what do you think constant exposure to these filtered, performative online lives is doing to their sense of self and mental health? How do you think about your own authenticity and real connection while showing up in a digital culture that often rewards the opposite?
Nicki: The ability for young people to present a filtered, edited version of themselves is extremely harmful. I often talk about the egregious harms like suicide or eating disorder content, but more commonly, children are losing their confidence and sense of self when they can edit how they look online.
How would you feel when you look in the mirror, if every photo of yourself was flawless?
This applies to adults, too. I pride myself on presenting as my “regular” self, a tired mom of 3 who sometimes films an update at 5AM with bags under my eyes. I actually got a comment once from a woman saying she “appreciates my natural face.” That made me laugh, but also validated my thought that authenticity is key when interacting online.
Imagine if beauty filters didn’t exist, and we just saw each other for who we really are?
On the practical steps we can take to protect kids:
Kelly: What kind of digital world do you hope today’s children will inherit as they grow up? You often talk about letting kids “be kids, for as long as possible.” In practical terms, what would that ideal look like, and what changes (in tech design, laws, or cultural attitudes) do you think are most vital to make that vision a reality? Do you see any signs of progress or reasons to hope that we can turn the tide and create a safer online landscape for the next generation?
Nicki: Here’s what I would envision, given the thousands of hours of conversations I’ve had with affected families, doctors, whistleblowers, researchers and other online safety experts:
At school:
Get 1:1 devices out of elementary and middle school. Introduce tech at appropriate times for tech-specific lessons only.
Never allow school-related clubs or sports to communicate via social media.
Offer digital safety lessons in all grades and extend the education to parents.
Implement bell-to-bell phone-free schools in all states and districts.
At home:
Restrict all solo media use through childhood. This means kids shouldn’t be holding tech and using it alone. Adopt a “tech together” mentality for the family.
No devices in bedrooms, ever.
See iPads for what they are: large iPhones that do not belong in a child’s hands, at all. Ditto for other digital tablets.
No use of AI companions. I think it’s even time to rethink Alexa and Google Home.
No smartphones until 16. Teen-safe devices are OK but should be restricted until high school. Before then, watches suffice if needed, with landlines for home communication.
Restrict video games through childhood, unless played together with parents in a common space with strict time limits.
From tech companies (led by legislation):
This is not an exhaustive list but gets us moving in the right direction:
Create separation between children and adults in online spaces.
For example, online games for children should be for children only. Social media spaces where there are children shouldn’t allow adults.
Age-gate adult spaces, e.g., pornography.
Use AI technology to proactively scan user-generated content for topics that go against policy, then remove it.
Restrict multiple accounts per device ID.
Respond to subpoenas from law enforcement within 24 hours.
Stop deleting data on the back-end (Snapchat).
Tech companies should have a duty of care to protect children.
I hope we can create a world where our kids’ health is prioritized over the profit of technology companies. This will require action from all sides.
Thank you, Nicki!
You can subscribe to new episodes of Scrolling2Death and learn more about supporting Nicki’s legislative efforts here, or book her for speaking events here.