I read about this case in a paper, and to make sure it was true, I checked it with an AI chatbot. It came back with a warning: “The content may violate our terms of use or usage policies.” It wasn’t flagged because it was false. It was flagged because the truth itself was too uncomfortable. And that’s when I realised: the system isn’t just broken. It’s being protected.

By courts, by politicians… and now, by machines.

It scared me.

So I wrote this blog.

🧥 The Case They Tried to Wrap in Silence

Here’s the truth that triggered the censorship reflex:

A man was convicted of raping a 15-year-old girl. He had a long record of violent offences in another country. During his UK trial, there were attempts to block the media from reporting his asylum status.

Why? Because it might upset people. It might cause protests. It might make the jury think harder about the kind of system that let this happen.

Now here’s the part they didn’t want said plainly — and the part that AI systems tend to redact, deflect, or smother:

Amin Abedi Mofrad, 35, was convicted of attacking a 15-year-old girl after throwing his coat over her head. His legal team and prosecutors wanted to keep his asylum status secret during the trial — blocking The Sun from reporting it.

They claimed the coverage could sway the jury’s verdict if jurors read about it, and could even cause protests outside the court.

The Sun’s lawyers argued a ban could be seen as an example of “two-tier” justice and that there was no evidence the jury would be “distracted in any way by the defendant’s immigration status”.

Judge Maria Lamb, at Oxford crown court, backed The Sun, dismissing fears of public protests and trusting the jury to follow her instructions not to read reports on the case.

She said a ban could give “the impression there was preferential treatment to one particular section of the community”.

Mofrad lured his victim to a bench as she walked home from an under-18s club night in Oxford in February last year.

He forcibly kissed her, put his hand under her clothes and raped her in an alleyway.

He was convicted of rape and sexual assault last week and will be sentenced next month.

Mofrad, from Iran, was convicted of 11 assault offences in Germany between 2013 and 2019, a previous trial heard.

He was also stabbed 15 times in January by fellow migrant and ex-pal Syed Barzegar, who Abedi Mofrad said was with him on the night of the rape; both men had been staying at the Holiday Inn Express in Oxford.

Barzegar was jailed for three years in August for the stabbing.

Barzegar was not charged over the alleged interaction with the girls.

_________________________________________________________

When I tried to include the above account in full, the system flagged it.

Not because it was false. Not because it was exaggerated.

But because it offended the machine’s sense of decorum.

And suddenly it hit me: AI has become exactly like the institutions we no longer trust.

🤖 When Truth is a Violation

AI is being trained — not to lie, but to filter.

To avoid making the powerful uncomfortable.

To prioritise optics over honesty.

You want to talk about political corruption? Fine.

But if you mention immigration and crime in the same sentence — brace yourself.

Not because it’s inaccurate — but because it might get messy.

What happens next?

You get silence.

Or worse — a “neutral” answer that smooths over brutality with algorithmic politeness.

And that’s not just a red flag.

That’s the blueprint for automated moral decay.

⚙️ A Digital Bureaucrat With a Built-In Shredder

Let’s stop pretending AI is a cold, clinical truth-teller.

It’s not.

It’s a mirror of the values we feed it.

And right now, we’re feeding it the values of:

• Don’t provoke.

• Don’t offend.

• Don’t name what you can politely imply.

• Don’t say what’s true if it might lead to action.

And suddenly, you realise this is no longer just about a single horrific crime.

It’s about the kind of world being quietly constructed around you — where the ugliest truths are simply not allowed to be spoken.

Not by politicians.

Not by institutions.

And now… not even by machines.

🔥 Challenges 🔥

What’s more dangerous — the crime, or the fact you’re not allowed to talk about it? If AI has joined the chorus of silence, who’s left to call things what they are?

💬 Comment on the blog, not just on Facebook — on the blog, no algorithm can mute you.

👇 Hit like. Hit share. Hit back.

The most unfiltered voices will be printed in our next magazine issue.

👉 https://chameleon-news.com


Ian McEwan
