When Parents Beg the Government to Raise Their Children for Them
Dismantling the App Store Accountability Act
There’s a moment in every coordinated deception when the mask slips just enough to reveal what’s really happening underneath.
For the “App Store Accountability Act,” that moment came in a Louisiana Senate Finance Committee hearing in 2024, when Digital Childhood Alliance Executive Director Casey Stefanski squirmed under questioning from Senator Jay Morris. When asked whether tech companies funded her organization, she deflected, claimed discomfort, and eventually admitted they received tech funding but refused to name which companies were writing the checks (cough, Meta).
That awkward silence told you everything.
But a recent piece titled “Making the Case for the App Store Accountability Act” by the “Digital Childhood Alliance” wants you to ignore that moment entirely. They want you to believe you’re powerless to protect your own children, that only government regulation can save kids from Big Tech, and that Apple and Google bear responsibility for harm they didn’t create and can’t control.
As you’re about to see, every claim is wrong.
It’s a coordinated campaign to convince parents they’re incompetent while the company actually destroying children’s mental health funds legislation that lets them off the hook.
The fundamental truth they’re desperate to obscure:
You already have complete power to protect your children. The solution isn’t more regulation. It’s taking the damn phone away.
The Digital Childhood Alliance claims “the current system places an impossible burden on parents.” Uh, excuse me? According to them, expecting parents to monitor what apps their children download represents some insurmountable challenge only government can solve.
Reading an app description before your child downloads it is not an “impossible burden.” You know what’s actually impossible?
Expecting Apple to police every piece of content in millions of apps while Meta continues serving algorithmic poison directly to your child’s brain.
If spending 30 seconds reading an app description feels too burdensome, forcing Apple to send you a digital permission slip won’t transform you into an engaged parent (you’ll click “approve” without reading it, exactly like you breeze through every Terms of Service agreement).
The article claims parents can’t be “responsible for every minute of content their kids consume.”
But that’s literally parenting, is it not?
When your 12-year-old asks for Instagram, you can say no
When they beg for TikTok, you can refuse
When they want a smartphone, you can hand them a flip phone instead
Pretending you’re helpless is the surrender Meta is banking on.
The Digital Childhood Alliance points to Instagram’s 12+ rating and cheerful app description, then reveals it mentions nothing about sexual content, violence, or eating disorder tutorials.
Their “gotcha” question: “Does this sound like an app that is appropriate for children?”
Here’s the answer:
Don’t let your child download it.
This may come as a shocker, but Instagram cannot install itself on your child’s phone.
It requires your permission, your payment method, your WiFi, your device. You control every point of access. But notice the sleight of hand? Instagram’s rating is misleading because Meta makes it misleading.
So why regulate Apple instead of Meta?
☑️ It’s Meta’s content
☑️ It’s Meta’s algorithm
☑️ It’s Meta’s platform causing documented harm
So, if you’re gonna do something, why not make Meta verify ages at the point where they serve content?
But nope. Instead, they compare digital parenting with building codes and allergen warnings, claiming “even the most diligent adults can’t know every danger in every product.”
This is dishonest at best.
You need building codes because you can’t inspect structural engineering. You need allergen warnings because you can’t perform chemical analysis. But deciding whether your 12-year-old should have Instagram doesn’t require specialized expertise.
It requires the willingness to say “no” and enforce boundaries.
What ASAA Actually Does (Spoiler: Nothing)
Requirement #1: Verified parental approval for each app download.
Three scenarios matter:
If you’re an engaged parent monitoring your child’s phone, ASAA gives you zero new control.
If you’re a checked-out parent who gave your kid unrestricted device access, you’ll click “approve” without reading. ASAA changes nothing.
If your child accesses social media through a friend’s phone, a web browser, or the thousand workarounds teenagers have perfected, ASAA does nothing, because it only covers app store downloads.
The only thing ASAA accomplishes?
Forcing Apple and Google to build expensive verification systems for Meta’s harmful products. Meanwhile, Meta serves whatever content maximizes engagement, but now has a liability shield because “the parent approved the download.”
Meta creates the harm. Apple handles the paperwork.
Meta faces zero accountability.
Requirement #2: App stores use age verification they “already have.”
This misses the point.
Storing birthdates doesn’t prevent Meta from serving eating disorder content to vulnerable 14-year-olds after they open the app.
But when Idaho targeted pornography sites directly, those sites had to verify ages before serving explicit content. No hiding behind app stores. Verification happened where it mattered: at the point of content access.
ASAA does the opposite: Verify at download, create false security, then let Meta serve whatever their algorithm decides.
Requirement #3: “Accurate” age ratings that reflect actual content.
Question: who gets to decide what’s “accurate”?
Will Apple review every Instagram post? Every TikTok video? Meta serves billions of pieces of content daily through algorithms Apple can’t see.
The real solution: make Meta (and similar platforms) accountable for what they serve. If Instagram’s algorithm pushes pro-anorexia content to middle schoolers, that’s Meta’s responsibility. They built it. They profit from it.
But ASAA lets Meta operate freely while forcing Apple to rate content they don’t create and can’t monitor.
When the Excuses Are Worse Than the Problem
The Digital Childhood Alliance apparently thinks you were born yesterday. So let’s examine their defenses for ASAA and watch them collapse under the weight of their own absurdity.
“Age verification protects privacy without storing data.”
Meta already collects massive amounts of personal data when you create an Instagram account. They know your age, location, interests, relationships, and psychological vulnerabilities. They use this to serve targeted content designed to maximize engagement regardless of impact. But somehow we should worry about Apple knowing your birthdate?
Meta wants app stores handling verification so Meta faces no direct accountability.
“App stores handle verification, NOT developers, protecting smaller companies.”
Meta creates the harmful content. Meta designs algorithms that research shows damage teenage girls’ mental health. Meta profits from engagement patterns their own internal documents acknowledge cause psychological harm. But with the ASAA, Meta takes zero responsibility because they’ve outsourced all liability to Apple and Google.
Compare that to Idaho’s age verification law: Pornhub must verify ages before serving content. Not Apple. Not Google. The company profiting from potentially harmful material bears the compliance burden.
“Children as young as 18 months are hypnotized by screens.”
Yes… because you hand them screens instead of parenting.
An 18-month-old didn’t download Instagram or purchase a smartphone. Every step required your choices.
“Children are targeted by predators through social media and gaming platforms.”
Yes… because you give them unrestricted access to platforms designed for public interaction with strangers.
The solution isn’t Apple verifying ages. It’s not giving your 8-year-old unrestricted online access where strangers can message them.
You control device access, screen time, app downloads, and whether devices live in bedrooms overnight.
Where Idaho Got It Right
Idaho didn’t regulate internet providers, app stores, or device manufacturers.
We went straight to the source: websites profiting from explicit content must verify ages before serving that content. The law hit hard enough that Pornhub geoblocked Idaho entirely rather than comply.
That’s real deterrence and actual accountability.
Because we recognized a principle Meta desperately wants ignored: those who create and profit from potentially harmful content should prevent minors from accessing it. Not third parties. The actual source.
ASAA embraces the opposite: Meta designs algorithms pushing eating disorder content to teenage girls, but Apple must somehow rate apps containing content they can’t see or control.
When you hand a 12-year-old an unrestricted smartphone with social media accounts and bedroom device access, you made deliberate choices with predictable consequences.
Rising depression
Skyrocketing self-harm among young girls
Suicide attempts completely out of control
Meta’s own internal research, exposed through whistleblower documents, proved Facebook executives knew their platforms devastated young people’s mental health. Instagram made body image issues worse for one in three teen girls. The platform worsened eating disorders and suicidal thoughts.
But instead of fixing their platforms, Meta hired lobbyists, spent $24 million on lobbying in 2024 (outspending defense contractors), and backed organizations pushing legislation that shifts the compliance burden to their competitors.
The tobacco industry blamed consumers for addiction. The alcohol industry resisted age restrictions. But the difference is…
Parents weren’t handing cigarettes to their children while demanding the government regulate the convenience store.
That’s exactly what’s happening with ASAA. You’re giving children access to platforms research proves harmful, then demanding Apple or Google prevent the harm you’re enabling.
When your child asks for Instagram, say no.
When they claim “everyone has it,” explain why popularity doesn’t mean healthy. When they argue they need it for social connection, teach them about face-to-face relationships without algorithmic mediation.
These are parenting decisions.
The “anxious generation” wasn’t created by app stores failing to verify ages.
It was created by algorithms deliberately designed to maximize engagement regardless of psychological impact, combined with parents who surrendered authority and demanded someone else fix problems they enabled.
When Casey Stefanski (executive director of the Digital Childhood Alliance) refused to name her Big Tech backers in Louisiana, she was protecting coordinated deception. Meta identified a brilliant strategy: weaponize parental anxiety to wage regulatory war against competitors while preserving their ability to serve algorithmic content to minors.
Stop asking state government to be mommy and daddy.
You’re the parent.
Your child lives in your home. Uses devices you purchased. Accesses apps you have authority to permit or prohibit. You have all the power, but only if you use it.
The App Store Accountability Act is theater funded by the company destroying children’s mental health. It accomplishes nothing except shifting burden to Apple and Google while Meta operates harmful algorithms without restriction.
Take the damn phone away.
Be the parent. Set the boundaries. Monitor their lives. Say no and mean it.
That’s the job you signed up for, right?

Laws and penalties — even ones Idaho passed — are not the best answer to protecting kids online. Technology changes too quickly for any statute to stay useful long-term. Otherwise you’re just playing a game of legislative whack-a-mole.
Has any parent or kid lied about their age, ever? YES.
Do we want to provide sensitive information — including photos and other biometrics, Social Security numbers (last 4), or birthdates — to websites? NOPE.
You provided the perfect answer when you wrote: “You already have complete power to protect your children. The solution isn’t more regulation. It’s taking the damn phone away.”
Let’s go several steps further (some of which you already mentioned). Flip phones only, no GPS, minimal texting features, no Internet access. That’s all kids and most adults need for mobile use. Search the web for these phone features. You will find them! What’s more, the simpler phones are cheaper, more robust, emit less harmful electromagnetic radiation, and are more reliable too!
For everything else Internetsky, use your desktop or laptop at home or office (or a “family owned and controlled” smart phone), and don’t have any of it in the kids’ bedrooms unless they are older (e.g., late high school) and/or proven extraordinarily responsible.
Finally, turn off Wi-Fi / internet at bedtime to protect everyone against electromagnetic radiation and sleep disruption.
This is what we should be doing to protect kids from bad actors on the Internet. And we should be doing it without any whining from parents or kids!