Facebook, Instagram and YouTube: Government forcing companies to protect you online
We often talk about the risks you might find online and whether social media companies need to do more to make sure you don't come across inappropriate content.
Well, now media regulator Ofcom is getting new powers to make sure companies protect both adults and children from harmful content online.
The media regulator makes sure everyone in media, including the BBC, is keeping to the rules.
Harmful content refers to things like violence, terrorism, cyber-bullying and child abuse.
The new rules will likely apply to Facebook - which also owns Instagram and WhatsApp - as well as Snapchat, Twitter, YouTube and TikTok, and will cover things like comments, forums and video-sharing.
Platforms will need to ensure that illegal content is removed quickly, and may also have to "minimise the risks" of it appearing at all.
These plans have been talked about for a while now.
The idea of new rules to tackle 'online harms' was originally set out by the Department for Digital, Culture, Media and Sport in May 2018.
The government has now decided to give Ofcom these new powers following a public consultation on online harms, carried out in the UK in 2019.
Plans to put Ofcom in charge of regulating social media were first reported in August last year.
Ofcom is the UK's communications regulator, looking after things like TV, mobile phones, and the internet.
It already regulates television and radio broadcasters, including the BBC, and deals with complaints about them.
But with these new powers, it will have a greater role in dealing with complaints about social media and the internet too.
The government will officially announce these new powers for Ofcom on Wednesday 12 February.
But we won't know right away exactly what new rules will be introduced, or what will happen to tech or social media companies who break the new rules.
Children's charity the NSPCC has welcomed the news. It says trusting companies to keep children safe online has failed.
"Too many times social media companies have said: 'We don't like the idea of children being abused on our sites, we'll do something, leave it to us'," said chief executive Peter Wanless.
"Thirteen self-regulatory attempts to keep children safe online have failed."
The UK government's digital secretary, Baroness Nicky Morgan, said: "There are many platforms who ideally would not have wanted regulation, but I think that's changing."
"I think they understand now that actually regulation is coming."
In many countries, social media platforms are allowed to regulate themselves, as long as they stick to local laws on illegal material.
But some, including Germany and Australia, have introduced strict rules to force social media platforms to do more to protect users online.
In Australia, social media companies can be made to pay big fines if they break the rules, and bosses can even be sent to prison.
For more information and tips about staying safe online, go to BBC Own It, and find out how to make the internet a better place for all of us.