UK regulator Ofcom has set out new guidelines requiring video-sharing platforms (VSPs) like Twitch to protect users from harmful content.
As reported by BBC News, these platforms – which include Twitch, TikTok, Snapchat, and Vimeo – must take “appropriate measures” to protect users from content related to terrorism, child sexual abuse, and racism, with the regulator claiming, “a third of users have seen hateful content on such sites.”
Under the new rules, sites like Twitch will have to provide and effectively enforce clear rules for uploading content, make the reporting and complaints process easier, and restrict access to adult content with robust age-verification.
Those found breaching the guidelines will be subject to fines, while serious cases may result in suspension of the entire service.
Ofcom said one of its main priorities in the coming year would be to work with VSPs to reduce the risk of child sexual abuse material being uploaded.
Minors (under-18s) must also be protected from material which might “impair their physical, mental or moral development”.
While this does not mean Ofcom will be assessing individual videos, and the regulator acknowledges it’s impossible to prevent every instance of harm, it has nonetheless promised a “rigorous but fair” approach to its new duties.
“Online videos play a huge role in our lives now, particularly for children,” said chief executive Dame Melanie Dawes. “But many people see hateful, violent or inappropriate material while using them. The platforms where these videos are shared now have a legal duty to take steps to protect their users.”
The full guidance can be found on Ofcom’s website.
Elsewhere, the entirety of Twitch has reportedly been hacked and leaked, including its source code, user payment information, and encrypted passwords. Users of the platform are advised to change their passwords and enable two-factor authentication.