Joint Parliamentary Committee in UK Publishes Report on Improving Online Safety Bill

The joint parliamentary committee for the draft Online Safety Bill has published a 193-page report calling for an end to the self-regulation of big tech, with several recommendations on how the law can hold online service providers accountable for what happens on their platforms.

The committee, established in July 2021 to scrutinise the Online Safety Bill and propose improvements before it goes to Parliament for final approval in 2022, said it had reached a unanimous conclusion to “call time on the Wild West online”.

“For too long, big tech has gotten away with being the land of the lawless. A lack of regulation online has left too many people vulnerable to abuse, fraud, violence and, in some cases, even loss of life,” said committee chair Damian Collins.

“The companies are clearly responsible for services they have designed and profit from, and need to be held to account for the decisions they make.”

Major changes recommended in the report include imposing criminal sanctions on senior managers for “repeated and systemic failings” to protect users from harm online, and expanding Ofcom’s powers to investigate, audit and fine non-compliant tech companies.

“The committee has set out recommendations to bring more offences clearly within the scope of the Online Safety Bill, give Ofcom the power in law to set minimum safety standards for the services it will regulate, and to take enforcement action against companies if they don’t comply,” said Collins.

As former chair of the House of Commons Digital, Culture, Media and Sport (DCMS) Select Committee, Collins previously led an inquiry into disinformation and “fake news”, which likewise concluded by calling for an end to the self-regulation of social media firms.

As it currently stands, the draft Online Safety Bill would impose a statutory “duty of care” on technology companies that host user-generated content or allow people to communicate. This means they would be legally obliged to proactively identify, remove and limit the spread of both illegal content and legal but harmful content, such as material relating to child sexual abuse, terrorism or suicide. Failure to do so could result in fines of up to 10% of turnover, imposed by the online harms regulator, which was confirmed in December 2020 to be Ofcom.

Following its inquiry, the committee said in its report that “in seeking to regulate large multinational companies with the resources to undertake legal challenges”, Parliament should provide more clarity around the overarching duty of care through specific duties that explicitly set out what service providers should be doing to prevent harm online.
