Ofcom’s chief executive, Sharon White, has published a discussion document stating that 12 million people in the UK have experienced harassment, fraud and abuse online.
Ofcom suggested that principles from broadcasting regulation could be relevant as the government begins to consider how to legislate to tackle this abuse.
The home secretary has warned technology companies to do more to tackle abuse on the web.
“I’m not just asking for change, I’m demanding it,” Sajid Javid said, adding: “I will not be afraid to take action.”
Echoing an EU proposal that social media companies should be fined if they do not remove terror content within an hour, Ofcom’s suggestions include target times for removing offensive content.
Such time requirements have drawn criticism from free speech campaigners, who are concerned that social media companies are not equipped to judge whether content is legally acceptable.
Last year, YouTube deleted potential evidence of war crimes in Syria following pressure to tackle terrorism propaganda on its platform.
Rachel Coldicutt, chief executive of think tank DotEveryone, told Sky News that she believed the independence of any internet regulator would be paramount.
“The problem with a single body is that the internet touches everything, so a single regulator would have enormous power,” she explained.
A cross-party panel discussion in the House of Lords in July agreed on the need for independent internet regulation, although the panel stressed strengthening existing regulatory capacity rather than creating a new watchdog.
The aim should be to develop the infrastructure for regulation, Ms Coldicutt said at the time, echoing an independent report by Mark Bunting, a partner at Communications Chambers, which was commissioned by Sky.
Mr Bunting’s report warned of the risk of “regulation by outrage” and cited communications regulator Ofcom, which stated: “Effective regulation requires a clear definition of the services that are to be regulated, a specific account of the potential harm to be addressed, and hence a clear rationale for the specific regulation.”
Ms Coldicutt suggested that post hoc removal of harmful content online was problematic because it relied either on algorithms or on people being exposed to horrific material.
“Really, rather than focusing on take down times, we need to be focused on how content is uploaded,” she said.