Generation AI: what the numbers tell us
A new report reveals a picture of two generations living in the same technological moment but experiencing it very differently. The report, by Common Sense Media, surveyed 1,244 parents and 800 young people aged 12–17 across the United States, and it sets out a series of challenges for everyone involved in children’s lives.
Young people are using AI far more than adults realise
Two-thirds of young people (67%) say they use AI tools, chatbots or programmes at least sometimes, compared with just 49% of parents who say the same. But it’s not just the frequency gap that matters; it’s the ‘why’ gap.
When parents were asked what they thought young people mainly use AI for, 46% said creating images or videos and 23% said companionship. The reality, according to ‘Generation AI’ themselves, is rather different. The top uses are searching for information or facts (59%) and getting help with homework or school assignments (55%). Companionship? Just 8% of young people cite it as a primary use.
This matters because parents – and by extension, many teachers – imagine AI use as something frivolous or socially concerning, when in practice it has already embedded itself in the core of how young people learn and seek knowledge. The worry about a lonely teenager forming an attachment to a chatbot may be misplaced. For millions of young people, AI has instead become the default first port of call when they need to understand something or complete a task.
The generation gap is real – and it runs in surprising directions
The report reveals a consistent optimism among Generation AI that adults simply don’t share. While 56% of 12–17-year-olds think AI will help society in the short term, parents are almost evenly split – 46% say it will help, 45% say it will hurt. Over the long term, young people remain cautiously positive (57% say it will help), while parents again hover at near-deadlock (45% help, 43% hurt).
We might be tempted to put this down to naivety on the part of young people, but dismissing it as that would be too simplistic. The report also shows that 65% of parents of children under 18 believe AI will change life as dramatically as the internet or electricity. Young people aren’t blind to the risks – 47% say they feel worried about AI’s impact on their economic future. They are, however, more likely to see themselves as active participants in an AI-driven world rather than passive victims of it.
That confidence may be partly grounded in practical experience. 70% of Generation AI believe they can tell the difference between AI-generated content and human-produced content. Only 42% of parents believe their children can make that distinction. Whether the young people are right to be confident is a separate question – but dismissing their self-assessment entirely could be a mistake.
The creativity and critical thinking problem
Adults and young people find significant common ground on creativity – and it’s not reassuring. 70% of parents believe that using AI tools will make young people less creative, and 62% of young people agree with them. That number is worth thinking about – nearly two-thirds of young people are using AI regularly while simultaneously believing it’s undermining creativity. It’s not ignorance so much as a kind of uncomfortable self-awareness.
The concern about critical thinking runs even deeper. 83% of both parents and young people agree that young people need to learn to think critically for themselves without AI support – a figure consistent across both groups, with 51% of parents and 46% of young people strongly agreeing. Yet 71% of parents and 60% of young people believe that by the time today’s teenagers are adults, people will be so dependent on AI that they won’t be able to function without it.
These two positions sit in tension with each other and navigating that tension is probably the most pressing challenge for schools in the coming decade.
The school question – and a telling divide
On the ethics of AI in schoolwork, the report exposes a sharp generational split. Exactly 52% of parents believe using AI for school assignments is unethical and deserves consequences. An identical 52% of young people believe it is innovative and should be encouraged.
What unites both groups is more revealing – 68% of young people and 52% of parents agree that schools have a responsibility to teach students how to use AI responsibly. That’s a mandate for teachers, clearly expressed by the young people sitting in their classrooms. It also signals that while young people embrace AI, they’re not asking for a free-for-all – they want guidance, frameworks and honest conversations about what responsible use looks like.
Worth noting too – when asked whether they would want their child to use AI to significantly improve a college or job application, 65% of parents of under-18s said yes. Principles and practice, as so often, rather diverge…
Safety, privacy, and a trust deficit
Perhaps the most noteworthy finding in the report is not about AI use itself but about trust in those building it. 64% of parents are not confident that AI companies are prioritising the safety of young people. Among young people, 56% share that lack of confidence. This is the majority view across both generations.
The specific concerns are wide-ranging but consistent – 84% of parents are concerned about AI misusing young people’s personal data; 82% are concerned about AI being used to impersonate young people; 78% are concerned about young people receiving inaccurate information from AI. Among the young people themselves, 76% are concerned about AI collecting their information without their knowledge and 72% about impersonation.
Most worryingly, 58% of parents say they know little or nothing about the safety features in AI products their teenager might be using – a knowledge gap that urgently needs closing.
What you can do: practical steps for adults
The report’s findings point clearly to where the most useful adult intervention lies:
Ask, don’t assume. The gap between what parents think young people are doing with AI and what Generation AI are actually doing is significant. Ask directly, regularly and without judgment. Young people are primarily using AI as a research and homework tool – so start the conversation there.
Teach AI literacy as you would media literacy. The 68% of young people who want schools to teach responsible AI use are telling us something. This isn’t a subject to leave to chance or hope young people will figure out alone. Schools should be explicit about how AI works, where it fails and what responsible use looks like in an academic context.
Talk about data and privacy. Young people are concerned about this themselves – 76% worry about data collection without their knowledge. This is a natural entry point for your conversation. Help young people understand what they are sharing when they use AI tools – and with whom.
Model honest uncertainty. 65% of parents believe AI will be transformative on the scale of electricity or the internet, yet most adults are still working out what that means in practice. Modelling a bit of humility – “I don’t know how to navigate this either, but let’s think about it together” – is more useful than false certainty in either direction.
Don’t conflate AI use with cheating. The ethics of AI in school work are complex and contested – the 52/52 split in the report demonstrates this. Blanket prohibition without explanation is unlikely to be effective and may simply drive use underground. Clearer guidance on what counts as ‘acceptable use’, and why, is more constructive.
Advocate for guardrails. The report found that 75% of parents and 72% of young people support establishing a government oversight body for AI safety. If you’re concerned about the pace at which AI is being deployed in children’s lives without adequate safeguards, say so – to schools, to policymakers, and in public conversation.
The young people I’ve dubbed ‘Generation AI’ in this report come across strongly not as passive recipients of AI but as active, largely unguided users of possibly one of the most transformative technologies ever built. The gap between their confidence and adults’ understanding of it is the space in which harm is most likely to occur – and we all need to work together to close it.