Social media isn’t simply watching your children. It’s learning them.
Every like, scroll and share feeds an invisible machine designed to track their habits, predict their interests and keep them hooked for as long as possible. Platforms like TikTok, Instagram and YouTube collect huge quantities of data from young users, often without clear consent or understanding. That data isn't just stored. It's monetised, shaping what children see, what they buy and even how they think.
The problem goes deeper than targeted ads. Weak age verification makes it easy for children under 13 to sign up, slipping past regulations like COPPA that are meant to protect them. Meanwhile, privacy policies are so dense that even adults struggle to decode them. And when data leaks happen, children, who may not even realise how much personal information they've given away, become easy targets for identity theft and exploitation.
Collateral damage
Tech companies claim they prioritise safety, but fines for violations are often just a cost of doing business. While regulators scramble to catch up, children's digital footprints continue to grow, fuelling an industry that profits from their attention. In the race for engagement, their privacy is often seen as collateral damage.
On March 3, 2025, the Information Commissioner's Office (ICO) announced that it had launched investigations into how social media and video sharing platforms use UK children's personal information.
The ICO is the UK's independent authority responsible for regulating and enforcing the data protection and freedom of information regimes in the UK.
Its investigations are looking into how TikTok, Reddit and Imgur protect the privacy of their child users in the UK.
Its investigation of TikTok is considering how the platform uses the personal information of 13- to 17-year-olds in the UK to make recommendations and deliver suggested content to their feeds.
This is in light of growing concerns about social media and video sharing platforms using data generated by children's online activity in their recommender systems, which can lead to young people being served inappropriate or harmful content.
The ICO's investigations into Imgur and Reddit are considering how the platforms use UK children's personal information and their use of age assurance measures. Age assurance plays an important role in keeping children, and their personal information, safe online. These are tools or approaches that can help estimate or verify a child's age, which then allow services to be tailored to their needs or access to be restricted.
The investigations are part of the regulator's efforts to ensure companies are designing digital services that protect children.
At this stage, the ICO is investigating whether there have been any infringements of data protection legislation. If it finds there is sufficient evidence that any of these companies have broken the law, it will put this to them and obtain their representations before reaching a final conclusion.
ICO research conducted in February 2025 by Savanta revealed that almost half of British parents (42%) feel they have little or no control over the information social media and video sharing platforms collect about their children – and the same number again feel unable to explain it to their children.
A quarter of the public (23%) say they, or their children, have stopped using particular platforms and channels because they are concerned about how this data is used or collected.
80% of the public agreed that platforms should be more transparent with people about how they use personal information to recommend content, while 52% of respondents feel they have little or no control over the information that social media and video sharing platforms collect about them.
John Edwards, UK Information Commissioner, says: “We welcome the technology and innovation that companies like social media bring to the UK and want them to thrive in our economy. But this cannot be at the expense of children’s privacy.
“My message is simple. If social media and video sharing platforms want to benefit from operating in the UK they must comply with data protection law.
“The responsibility to keep children safe online lies firmly at the door of the companies offering these services, and my office is steadfast in its commitment to hold them to account.
“I also want to take this opportunity to assure children, parents and carers in the UK that we are working on their behalf to make the online world a safer place.
“In announcing these investigations, we are making it clear to the public what action we are currently taking to ensure children’s information rights are upheld. This is a priority area, and we will provide updates about any further action we decide to take.”
William Richmond-Coggan, a partner specialising in data protection and technology disputes at Freeths LLP, isn’t surprised that the ICO has chosen to focus its investigatory efforts on TikTok.
The platform is already appealing against a £12.7m ICO fine, issued in 2023, for misusing children’s data.
With limited resources, the ICO will want to ensure that any intervention is likely to benefit the greatest number of users, and that any conclusions it reaches will attract as much attention as possible.
He says: “But it would be a mistake for organisations that aren’t of the scale of TikTok or Meta to assume that this means they can operate with impunity. A range of very serious obligations are imposed in relation to the safety of young people, and the protection of their personal data, under statutory guidance like the ICO’s Children’s Code and under legislation like the Online Safety Act.”
Those obligations apply to any business that operates using significant quantities of children’s data, or which offers products and services that might be expected to be of interest to young people (whether or not targeted at them).
“It may be that smaller businesses will be able to escape direct regulatory scrutiny, at least unless they suffer a breach or other incident,” Richmond-Coggan adds. “But we have already seen some litigation in the UK and elsewhere focused on the potentially harmful impact of TikTok’s algorithms on young people.”
Harmful processing
But he warns that concerned parents, and the young people directly affected by harmful or careless processing, are not likely to wait for regulatory processes to run their course before taking action against what they perceive to be the worst offenders.
Our free interactions on social media platforms come with a trade-off, explains Emily Keaney, the ICO’s deputy commissioner for regulatory policy.
“From the moment a young person opens an app or plays a video, a large amount of data begins to be gathered to potentially shape the content they are served,” she says.
“These are known as recommender systems, and they can work well, for example to suggest a fun dance routine. But we have concerns where the profile formed from a child’s personal information may recommend content that is not appropriate for children to see. And we’re not the only ones – 80% of the public told us they agree that platforms should be more transparent with people about how they use personal information to recommend content.”
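The feedback loop Keaney describes can be illustrated with a deliberately simplified sketch. This is a hypothetical toy, not any platform's actual system: each engagement signal adds weight to a topic in the user's profile, and the heaviest weight then dictates what is served next, which is why a profile built from a child's activity can keep narrowing what they see.

```python
# Toy illustration of a recommender profile (hypothetical, not any
# platform's real algorithm): engagement signals accumulate into topic
# weights, and the dominant topic drives the next recommendation.
from collections import Counter


def update_profile(profile: Counter, topic: str, dwell_seconds: float) -> None:
    """Weight a topic by how long the user engaged with it."""
    profile[topic] += dwell_seconds


def recommend(profile: Counter, catalogue: dict) -> str:
    """Serve an item matching the profile's heaviest topic."""
    if not profile:
        return next(iter(catalogue))  # cold start: arbitrary item
    top_topic, _ = profile.most_common(1)[0]
    for item, topic in catalogue.items():
        if topic == top_topic:
            return item
    return next(iter(catalogue))


profile = Counter()
update_profile(profile, "dance", 40.0)  # watched a dance clip for 40s
update_profile(profile, "news", 5.0)    # skipped a news clip after 5s

catalogue = {"clip_a": "news", "clip_b": "dance", "clip_c": "dance"}
print(recommend(profile, catalogue))  # → clip_b
```

Even in this toy version, one long viewing session is enough to dominate the profile and steer every subsequent recommendation toward the same topic, which mirrors the narrowing effect the ICO is concerned about.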
Keaney notes that regulating how children’s data is used by digital services is one aspect of a complex international and national online safety ecosystem.
The ICO works closely with Ofcom, as the Online Safety Act regulator, and other international organisations to ensure children in the UK have a better digital experience.
“We will continue our work to drive changes and, where necessary, we will use the full force of our regulatory powers to ensure young people can both benefit from and be safe within the online world,” she says.
Photo by Gaelle Marcel on Unsplash