Safety Features Nobody Uses Aren't Really Safety Features
There's a pair of statistics that's been sitting with me for weeks.
On platforms like Discord and Snapchat, fewer than 1% of minors have parents using the built-in parental monitoring tools. On Instagram, fewer than 10% of teens had turned on parental controls by the end of 2022. The tools exist. They're built into the platforms. Parents just aren't using them.
The easy interpretation is to blame parents. They're not paying attention. They're not tech-savvy enough. They should be doing more.
But I don't think that's it.
Ian and I are both parents. We know what it's like to be exhausted at the end of the day, trying to figure out whether our children are safe online whilst also trying to get dinner made and homework checked. We know the reality of trying to stay on top of multiple platforms, each with its own settings buried in different menus, each updating its interface regularly so that what you learned last month no longer works.
When a safety feature requires parents to navigate complex settings, remember passwords for multiple platforms, and stay current with constant interface changes, that's not really a safety feature. That's a compliance checkbox. It exists so the platform can say "we offer parental controls" without actually making them usable.
The gap between existence and adoption is a design failure.
Real safety tools work by default. They don't require parents to opt in, hunt through menus, or become platform experts. They're embedded in the architecture so deeply that using the platform safely is easier than using it unsafely.
As we build oodlü, we're thinking about this differently. Adults aren't optional add-ons to safety; they're central to it. Parents, guardians, or teachers create the groups children are part of. They're notified when those groups form and can see who's in them. If a child reports an issue, it goes directly to their actual adult, someone who knows them and the other children involved, not to a distant moderation team.
We're building a system where adult oversight is the default state. No activation required. No maintenance needed.
Does this mean fewer children can access the platform? Probably. Does it mean less viral growth? Almost certainly. But those trade-offs are the point. We're not trying to maximise user numbers. We're trying to build something families can actually trust without needing to become platform experts first.
The 1% adoption rate tells us something important about design. When safety features require expertise and effort, most people won't use them. We think there's a better way.
We'd love to hear your thoughts on this. Find us on the social channels linked at the top of the page.