Children, screens and wellbeing: what the latest research tells us

Apr 14, 2025 | children online

The question of how social media and screen time affect children’s mental and physical health is far from new – but a fresh European Commission report brings urgency, nuance and depth to the conversation. The interdisciplinary review not only pulls together the latest science (right up to late 2024) but also reframes screens and wellbeing as an issue not just of individual behaviour, but of platform responsibility, digital design, and policy action.

The report, ‘Minors’ Health and Social Media: An Interdisciplinary Scientific Perspective’, synthesises evidence discussed at high-level expert roundtables hosted by the European Centre for Algorithmic Transparency (ECAT). The conversations involved leading voices from neuroscience, paediatrics, mental health, media psychology, and education.

For parents, teachers, child safety experts and mental health professionals, it provides an important overview of where we are now on children’s screens and wellbeing – and where we need to go next.

A more complex and nuanced picture

One of the strongest messages to emerge from the report is that the impacts of screen time and social media use are not universally good or bad – they depend on how, why, when, and by whom screens are used.

This nuanced view moves beyond debates over “how many hours is too many?” and instead looks at context: the kind of content being consumed, the design of the platform, the time of day, and the child’s individual vulnerabilities or resilience.

In this sense, the report echoes and builds on a growing consensus: screen time totals alone don’t tell us much. We must understand digital experience more broadly.

Mental health: vulnerability, comparison, and connection

Social media is often discussed in relation to mental health, and rightly so. Children and teens now spend an average of 3 hours a day online, according to data cited from Smahel et al. (2020). But the effects of that time are hugely variable.

Negative effects can include:

• Depression and anxiety, especially linked to cyberbullying, fear of missing out (FoMO), and negative social comparison.

• Performance pressure, particularly when teens compare themselves to idealised or curated content online.

• Exposure to harmful content, like self-harm imagery – which algorithms can serve up with startling speed. According to the Center for Countering Digital Hate (2022), TikTok can begin showing self-harm content to new users in as little as 2.6 minutes, and a user’s entire feed can be saturated within 15 minutes.

But the report is also careful to highlight positive aspects:

• Platforms can foster a sense of belonging, especially for marginalised or isolated youth.

• Exposure to body-positive content can promote pro-social attitudes and more inclusive standards of beauty.

• Authentic self-expression, creative sharing, and building friendships online all support what researchers call “flourishing”.

This dual perspective – acknowledging both potential for harm and opportunities for wellbeing – is one of the report’s most valuable contributions, and marks a shift from past literature that has sometimes been alarmist or, conversely, too dismissive.

Self-harm and moderation failures

One area where the report breaks new ground on screens and wellbeing is in its exposure of moderation failures by social platforms. Using test profiles of minors, researchers found that explicit self-harm content – including razor blades, blood and messages encouraging injury – was not removed after a week on Instagram. Even more worrying, the platform’s algorithm actively recommended that child users connect with other members of self-harm groups.

This evidence points to an urgent need for algorithmic accountability when considering screens and wellbeing – something the report ties directly to the EU’s Digital Services Act (DSA), which legally obliges platforms to assess and mitigate risks to children’s health and safety.

Body image and early exposure to perfection

A standout theme is the increasing exposure of very young children – sometimes under 10 – to idealised and sexualised content. Researchers noted a rise in eating disorders, including anorexia, emerging in this age group.

The “fitspiration” trend (idealised fitness and body content) was flagged as particularly problematic. Children and teens aren’t just passive consumers of this material – they increasingly create and share it, often using filters or editing apps. Such content promotes unattainable beauty ideals and increases pressure to modify one’s appearance offline, including through cosmetic procedures.

The report also highlights new evidence linking these online behaviours to rising perfectionism, performance anxiety, and low self-esteem.

Addiction, brain changes and digital design

While terms like “social media addiction” are still debated scientifically, the report makes a case for treating problematic use seriously. Drawing from research on gaming and gambling disorders, it describes behaviours like compulsive checking, loss of control, and functional impairment.

In a particularly noteworthy section, the report discusses recent neuroimaging studies showing that high levels of social media use can correlate with:

• Disruption in areas of the brain linked to emotional regulation, decision-making, and reward processing.

• Possible early brain maturation in frequent users aged 12–15, which coincides with the developmental window for many psychiatric conditions.

However, the authors are careful not to overstate the case: these findings reflect correlation, not causation, and often come from US-based samples that may not generalise globally.

Physical health: vision, sleep and offline consequences

The report also brings new focus to the physical health consequences of screen time – an area that tends to receive less public attention than mental health.

Key findings include:

• A rise in digital eye strain among children, linked to longer screen use and poor ergonomics.

• No strong evidence yet that screens cause myopia directly, but compelling data suggests outdoor time helps delay its onset.

Sleep disruption is one of the most consistent harms identified: screen use (especially interactive content at night) delays sleep onset, lowers sleep quality, and reduces overall sleep time.

Studies show that more than 60% of teens use their phones between midnight and 6am (Radesky et al., 2023), and a significant number are awakened by notifications or even set alarms to check messages during the night.

Notably, research cited in the report shows that removing phones from bedrooms improves sleep by up to 30 minutes per night – offering a clear, actionable change for families and schools.

So what can be done?

While the report is rich in diagnosis, it also offers a host of forward-thinking recommendations – some familiar, others quite new in their specificity and emphasis.

For teachers and schools:

• Prioritise digital literacy, not just for children but also for teachers, many of whom feel underprepared for the evolving tech landscape.

• Include online wellbeing, emotional regulation, and critical content analysis in the curriculum.

• Work with tech companies to access tools and resources that support safe online behaviours.

For parents and carers:

• Encourage balanced digital habits rather than outright bans – support young people to engage critically, creatively, and socially online.

• Model good screen use behaviour and promote “no devices in bedrooms” policies.

• Learn to recognise signs of distress linked to online activity – such as withdrawal, disrupted sleep, or body image concerns.

For tech platforms:

• Improve age verification systems and transparency about algorithm design.

• Proactively moderate harmful content, including self-harm and disordered eating material.

• Design features that support digital empathy and critical reflection (e.g. prompts before posting).

• Engage young people themselves in the design of safer, healthier digital experiences.

For policymakers and researchers:

• Fund longitudinal, EU-based studies to avoid relying solely on U.S. data.

• Introduce test phases for new regulations to monitor unintended consequences.

• Enforce transparency requirements for platforms, including data-sharing with independent researchers.

A healthier digital childhood

This new report on screens and wellbeing doesn’t offer easy answers – but it does offer a realistic, evidence-based, and action-oriented framework for understanding the full impact of digital platforms on our children’s lives.

What makes it stand out is its balance. It resists fearmongering while refusing complacency. It centres children not as passive victims or reckless users, but as diverse individuals navigating complex systems – systems that must now be held to a higher standard.