Richard Willett - Memes and headline comments by David Icke
18 October 2025
An immigration barrister was found by a judge to be using AI to do his work for a tribunal hearing after citing cases that were “entirely fictitious” or “wholly irrelevant”.
Chowdhury Rahman was discovered using ChatGPT-like software to prepare his legal research, a tribunal heard. Rahman was found not only to have used AI to prepare his work but also to have “failed thereafter to undertake any proper checks on the accuracy”.
The upper tribunal judge Mark Blundell said Rahman had even tried to hide the fact he had used AI and “wasted” the tribunal’s time. Blundell said he was considering reporting Rahman to the Bar Standards Board. The Guardian has contacted Rahman’s firm for comment.
The matter came to light in the case of two Honduran sisters, aged 29 and 35, who claimed asylum on the basis that they were being targeted by a criminal gang in their home country. Rahman represented the sisters, and the case escalated to the upper tribunal.
Blundell rejected Rahman’s arguments, adding that “nothing said by Mr Rahman orally or in writing establishes an error of law on the part of the judge and the appeal must be dismissed”.
Then, in a rare ruling, Blundell went on to say in a postscript that there were “significant problems” within the grounds of appeal put before him.
He said that 12 authorities were cited in the paperwork by Rahman, but when he came to read the grounds, he noticed that “some of those authorities did not exist and that others did not support the propositions of law for which they were cited in the grounds”.
In his judgment, he listed 10 of these cases and set out “what was said by Mr Rahman about those actual or fictitious cases”.
Blundell said: “Mr Rahman appeared to know nothing about any of the authorities he had cited in the grounds of appeal he had supposedly settled in July this year. He had apparently not intended to take me to any of those decisions in his submissions.
“Some of the decisions did not exist. Not one decision supported the proposition of law set out in the grounds.”
Blundell said the submissions made by Rahman – who said he had used “various websites” to conduct his research – were therefore misleading.
Read More: Barrister found to have used AI to prepare for hearing after citing ‘fictitious’ cases