December 15, 2025
What might social media look like if it were designed to be safer for young people? Prompted by a comment from Hannah Keal, I have been thinking about what that could look like. Then I remembered we already have one large social media attempt at this: YouTube Kids (not perfect, but an attempt). So it is possible for big tech companies to do better. Here is my attempt at a wish list for a child/adolescent social media platform, with an explanation of why I want each feature.
Remove the like/dislike button and counter
Remove the views counter
Limit the number of replies to any post to a set number (say 5-10)
Remove the ability to post any photos of people. Art and cartoons yes, photos of nature/landscape yes, but no selfies and no full-body shots of people.
Each child account needs to be publicly connected to the sponsoring adult who takes responsibility for supporting that child or adolescent.
All friends/connections need to be approved by an adult
Limit the number of connections someone can have to 50, maybe 100, with no friends counter.
All friend connections have to be made in the real world with two accounts on two different devices being paired via bluetooth.
No algorithmic way to scroll an endless stream of content from strangers. You can only see what friends post. No repost/reshare function.
All accounts can publish a limited number of posts, say 5 a day.
No adverts
Remove the like/dislike button and counter
Remove the views counter
Limit the number of replies to any post to a set number (say 5-10)
These three are all about reducing the social media pressure to get lots of engagement: that unhealthy obsession with getting a dopamine hit from numbers going up, up, up.
Remove the ability to post any photos of people. Art and cartoons yes, photos of nature/landscape yes, but no selfies and no full-body shots of people.
The social media pressure to look a certain way is huge. Modern social media is very good at "reading" the content of an image, so most of the time a platform could catch and prevent images of people being posted. That would mean less body image impact, less body shaming, less comparison, and so on. It wouldn't be perfect, but the rule would change how users think about images and hopefully cultivate a healthier culture. Please share art, share photos of beautiful flowers and sunsets. Express creativity, but with no need for selfies.
Each child account needs to be publicly connected to the sponsoring adult who takes responsibility for supporting that child or adolescent.
All friends/connections need to be approved by an adult
The individual value of this depends on the attitude of the adult, but it fosters accountability and helps users know which adult they might want to speak to when online arguments break out (they can always happen, like in any playground). It would hopefully also reduce the risk of adults posing as children. With the drive to verify 18+ accounts growing online anyway, companies could require that only accounts with verified ID can sponsor a child account. The aim is to ensure every child account is linked to an adult whom authorities can contact if needed but, more importantly, that the children and sponsoring adults can all see how everyone is connected.
Limit the number of connections someone can have to 50, maybe 100, with no friends counter.
This is again about reducing the competition for the highest number, but it is also a way of capping connections so that no one with a child account can become an online influencer with 10,000 followers. It makes social media much more manageable. A limit of 50 means a child could be friends with everyone in their class, a couple of relatives their age and a few people from after-school clubs. Maybe increase it to 100 a year before they get an adult account, or make it even lower when they are younger.
All friend connections have to be made in the real world, with accounts on different devices being paired via Bluetooth. The children and sponsoring adults would all need to agree to the new connection.
This could be the most significant safety feature, as it means all connections have to be with real-life contacts. It gives everyone greater confidence that they know who a child is talking to. However, this could also be a major limitation for young people (I am thinking 13+) who want to meet people who share interests, situations or needs that none of their real-life contacts share. I am thinking of the positive potential for friendships to bloom from shared interests in uncommon topics, or of young people with specific challenges (health, social, etc.) that none of their peers can relate to. So I am not sure if this is a net positive or negative for young people, but it is something tech companies could do, and it might improve safety online.
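The approval rule described above can be sketched in code. This is a minimal, purely illustrative model under my own assumptions; the account and request objects are hypothetical names, not any real platform's API. The idea is that a connection only becomes active once the devices have paired in person and both sponsoring adults have signed off:

```python
from dataclasses import dataclass, field

@dataclass
class ChildAccount:
    """Hypothetical child account, publicly linked to a verified adult sponsor."""
    name: str
    sponsor: str  # the verified adult responsible for this account

@dataclass
class ConnectionRequest:
    """A friend request that can only be created by an in-person Bluetooth pairing."""
    a: ChildAccount
    b: ChildAccount
    paired_in_person: bool            # set true by the device-to-device handshake
    approvals: set = field(default_factory=set)

    def approve(self, adult: str) -> None:
        # Only the two sponsoring adults can approve this connection.
        if adult in (self.a.sponsor, self.b.sponsor):
            self.approvals.add(adult)

    def is_confirmed(self) -> bool:
        # Active only when the devices met in the real world AND
        # both sponsoring adults have approved.
        return self.paired_in_person and self.approvals == {self.a.sponsor, self.b.sponsor}
```

For example, a request paired in person but approved by only one adult stays pending; once both sponsors approve, `is_confirmed()` returns `True`.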
No algorithmic way to scroll an endless stream of content from strangers. You can only see what friends post. No repost/reshare function.
All accounts can publish a limited number of posts, say 5 a day.
One of the identified worries about social media is how much of a time sink it can be. Normally this isn't from endlessly posting your own content but from scrolling the 'for you' or 'trending' stream of never-ending content. Even if someone had the maximum of 100 friends, each posting the maximum of 5 posts a day, that would be at most 500 posts they could see each day. Statistics about how many posts people see on social media vary by source, but over 1,000 a day seems a baseline in many studies, with lots of people going massively over that number. If we remove the endless scrolling of algorithmic content, we can improve sleep and address some of the biggest worries.
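The back-of-envelope arithmetic above is worth making explicit. The limits below are the proposed numbers from this wish list, not any real platform's settings:

```python
# Proposed caps from the wish list above (assumed values, not a real platform's).
MAX_FRIENDS = 100        # hard cap on connections
MAX_POSTS_PER_DAY = 5    # per-account daily publishing limit

# Worst case: every friend posts their full daily allowance.
max_visible_posts_per_day = MAX_FRIENDS * MAX_POSTS_PER_DAY
print(max_visible_posts_per_day)  # 500
```

So even in the absolute worst case a user's feed is capped at 500 posts a day, comfortably under the 1,000+ posts a day that many studies suggest is typical on algorithmic feeds.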
These are just my initial thoughts. I am still reflecting on direct messaging and wondering whether it should be part of a child-friendly social media platform. It is a key way people communicate in the modern world. Do young people need a safe place to explore this and learn online communication, or is it too risky? I would also ban all adverts, of course.
So why would a tech company build this? Firstly, it builds long-term brand loyalty and familiarity with their services, just like YouTube Kids. It is a way to ensure the next generation knows their products, so that by the time they become adults who can be sold content, they are already users.
Secondly, it is good public relations and could drive adult users back to their service. Adults would be required to be users in order to sponsor their kids' accounts, and if the platform is liked and proven to be safer, some of those adults may stick around (nothing is 100% safe; no bike is, yet we encourage children to ride kids' bikes). The first company to do this well can milk it in the press as the socially responsible option.
I grew up with text messaging, MSN Messenger on dial-up and online forums. Were they perfect? No. Did we use them? Yes. Why? Because they met a need to build social connections. I grew up in a village; I couldn't chat with my school friends after 6pm. We lived too far away and needed lifts to connect. Yes, I could go outside (and frequently did), but I was limited in how late I could stay out as a child and a teenager, plus I could only connect with people who lived within walking distance. Many parents are more nervous now about letting their kids 'roam the streets' to just hang out than they were 30 years ago. Many children have fewer and fewer opportunities to be social after school. This risks loneliness and isolation. Would a highly limited social media platform be their preferred option when they know adult accounts can do loads more? No. But would they use it? I think yes.