For years, questions about how digital platforms shape behaviour, especially in children, have been discussed in research, in policy circles, and quietly among practitioners. What is different now is that these questions are no longer peripheral. They are being examined openly, including in courtrooms, and acknowledged by the companies themselves, Meta among them.
That shift matters. But it is difficult to ignore what has brought us here: it has taken legal action, and the lived experiences of children, to bring these design choices into full view. That should concern everyone involved in digital product design.
We Are Beyond Awareness
We are now past the point of simply recognising the problem. The focus must move to action. As argued in our analysis of platform restrictions for teenagers, awareness on its own is not a safeguarding strategy. If digital systems can shape children’s behaviour, and the evidence increasingly suggests they can, then responsibility cannot sit outside the design process. It must be embedded within it.
This is where the conversation on safe and ethical AI for children becomes broader than AI alone. It applies to feeds, recommendation systems, safety settings, social mechanics, and every design choice that influences how children engage with technology.
What Must Change in Product Design
This moment requires a fundamental recalibration of priorities for companies building child-facing or child-impacted systems:
- Wellbeing must become a core design principle, not a secondary consideration
- Age-appropriate experiences should be the default, not an exception
- Success metrics need to move beyond engagement toward healthy, sustainable use
- Accountability must be present at every level, from product design to leadership
These are not abstract ideals. They are practical product and governance questions for teams working on online safety and regulatory readiness, and for organisations reviewing how their systems affect children in practice.
The Real Question Now
Technology will continue to play a defining role in how the next generation learns, connects, and understands the world. The question is no longer whether it has influence, but how that influence is directed. The same urgency appears in our work on children and chatbots, where design decisions shape trust, dependency, and advice in deeply human ways.
This moment demands that we build systems where children’s wellbeing is not merely a consideration, but the foundation. The next phase is not more discussion. It is redesign. For those shaping digital platforms, the question is simple: what are you changing now?