Social media platform X has pledged to review UK reports of suspected illegal hate and terrorist content within 24 hours on average, under commitments accepted by Ofcom.
The Elon Musk-owned company said this would apply to content flagged through its illegal content reporting tool.
Ofcom's online safety director Oliver Griffiths called the commitments a "step forward" and said they were of particular importance after recent religiously-motivated crimes targeting Jewish communities in the UK.
Ofcom said a separate investigation into X's AI tool Grok, over concerns it was used to create sexualised images, is ongoing.
The BBC has contacted X for comment.
The announcement follows the launch of an Ofcom compliance programme in December, assessing whether the biggest social media companies have adequate systems and processes for dealing with reports of illegal hate and terror material.
Griffiths said the regulator had evidence that terrorist content and illegal hate speech was "persisting on some of the largest social media sites".
He said Ofcom was challenging the platforms to tackle the issue and take firmer action.
X will submit performance data to Ofcom every three months for a year, so the regulator can monitor whether it is meeting the targets.
While it has set a target to average less than 24 hours for its reviews, it has also promised to assess at least 85% of reports within 48 hours.
Ofcom also set out two further commitments from X to better protect UK users from illegal hate and terror content.
Under the first, the company will engage with experts about reporting systems for such content.
Ofcom said this followed concerns from some organisations that they had flagged "multiple pieces" of suspected illegal hate and terrorist content to X, but were unclear whether the reports had been received or acted upon.
X's second commitment is to withhold UK access to accounts reported for posting terrorist content that is illegal in the UK, if it determines they are operated by, or on behalf of, a terrorist organisation proscribed in the UK.
Danny Stone, chief executive of the Antisemitism Policy Trust, said the action was a "good start" but that there was still more to do.
"X is failing in so many regards to tackle open racism on its platform," he said.
"We know where this online harm leads, so for the sake and safety of all of us in Britain, I hope Ofcom will hold X to account for what it has promised the regulator it will do."
The UK has seen a series of recent attacks targeting Jewish communities, including the Heaton Park Synagogue attack in Manchester in October 2025, an attack in Golders Green in April, and recent arson attempts on Jewish sites in London.
Iman Atta, director of Tell Mama, a national project which records anti-Muslim incidents in the UK, welcomed the updated targets, saying they signalled "a more accountable approach".
She said the group was "particularly encouraged" by the commitment to take action against accounts operated by or on behalf of terrorist organisations proscribed in the UK.
"This sends an important message that no platform or body operating in this country is above scrutiny," she said, adding that the test was "not what is promised, but what is delivered".