How doctors can remove damaging news articles from Google search
When you're trying to rebuild your life after a conviction, one of the hardest challenges is escaping your digital past.
For one junior doctor, the internet wouldn’t let him move on even after serving his sentence, being cleared to practise, and receiving support from his peers. Despite rehabilitation and professional endorsement, his name remained linked online to articles that painted a skewed version of events.
How being sent illegal images through WhatsApp led to a strict liability conviction
How damaging headlines continued to affect a rehabilitated doctor
How to respond when Google rejects your right to be forgotten due to 'public interest'
How we challenged Google’s decision using GDPR
How delisting articles can change your life
When and why we escalate delisting requests
How being sent illegal images through WhatsApp led to a strict liability conviction
This case involved a junior doctor who had pleaded guilty to a strict liability offence involving possession of illegal images. The images had been sent to him via a WhatsApp group chat; the group included dozens of young men from his local gym and rugby club. These chats often featured vulgar humour and inappropriate content, shared freely among members.
Crucially, the doctor did not solicit or engage with the images. He had been focused on an upcoming medical exam and had not even noticed the files until questioned by police about a separate matter. When asked, he handed over his phone voluntarily, believing he had nothing to hide.
During that search, police discovered one image of a child and three images classed as extreme pornography, all automatically downloaded by WhatsApp without his knowledge. The court accepted that he had not actively sought out the images. Nonetheless, due to the nature of the offence and the UK’s strict liability laws, he was advised to plead guilty to avoid a more severe sentence.
He received a community order and was placed on the Sex Offenders Register for five years. Despite these facts, and despite his later clearance to practise medicine, the digital record of the offence continued to cause significant harm long after the legal and professional systems had judged the matter closed.
How damaging headlines continued to affect a rehabilitated doctor
The client was not a public figure in the typical sense, but because he was a doctor and the charges involved indecent images, the media gave the story significant coverage.
Several national and local newspapers featured his full name, occupation, and photograph prominently in their headlines and articles. These stories created a lasting association between his name and a narrative that lacked important context about the nature of the offence and his rehabilitation.
Although others in the WhatsApp group had also received the offending images, none faced the same level of public scrutiny or consequence. The client, meanwhile, had to deal with ongoing damage to his personal and professional life.
After his medical licence was reinstated, he restarted his hospital training and earned strong feedback from his supervisors, yet he remained plagued by the fear of being searched online.
That fear materialised when a medical student assigned to shadow him made a complaint after finding the articles through a simple Google search. His superiors stood by him, reaffirming his right to practise, but the psychological toll of constantly fearing exposure remained immense.
How to respond when Google rejects your right to be forgotten due to 'public interest'
In this case, Google initially refused the right to be forgotten (RTBF) request, arguing that the client’s status as a medical professional justified ongoing public access to the articles. This is a common yet flawed argument.
Holding a position of responsibility does not automatically nullify a person's right to digital privacy and rehabilitation. Google often relies on 'public interest' to justify keeping links online, but it tends to apply this term broadly without assessing the actual impact or relevance of the information.
A more accurate interpretation considers whether the offence directly involved the individual's professional role and whether the public still needs to know about the event for safety or trust purposes.
In this instance, the offence was not committed within the course of the doctor’s professional duties. It was unrelated to his work, and his regulatory bodies, such as the General Medical Council and the Disclosure and Barring Service, fully reviewed the case. They concluded he posed no risk to the public and was fit to continue practising.
That professional clearance undermines the claim that public access to the outdated articles is still warranted. Therefore, citing public interest based solely on someone's job title is legally and ethically unsound. Our approach demonstrated how to dismantle such arguments using evidence, legal principles, and the findings of those best qualified to assess risk.
When faced with a ‘public interest’ rejection, the key is to show that the relevant legal bodies have already resolved the matter and that search engines have no mandate to go beyond them.
How we challenged Google’s decision using GDPR
The next step was a detailed GDPR notice, which became the turning point of the case. We argued that the continued indexing of these outdated articles violated several key principles of data protection law.
First, the information being processed lacked accuracy. The articles presented a snapshot of events that omitted essential context, such as the fact that the images were automatically downloaded and not solicited, that the client had handed over his phone voluntarily, and that multiple regulatory bodies had judged him fit to return to medical practice.
Second, we challenged the proportionality of Google's processing of the data. The client had already served his sentence, complied with all legal requirements, and undergone a professional review process. The continued publication of these links served no ongoing public interest and inflicted disproportionate harm.
Third, we addressed the principle of purpose limitation. The regulatory and judicial systems had already served their purpose in protecting the public and administering justice. Google, by continuing to process this outdated content, was acting outside the boundaries of lawful necessity.
Ultimately, our comprehensive and legally grounded submission led Google to delist several of the most damaging URLs, significantly improving our client’s digital footprint across the UK and EU.
How delisting articles can change your life
The emotional relief was immediate. The client described it as having a weight lifted off his shoulders. With the most damaging articles gone from the first page of search results, he could finally focus on his work without the fear of being blindsided by prejudice or misunderstanding.
His mother, a university academic, and father, a trainee teacher, also expressed deep gratitude, having previously faced reputational damage themselves due to association by surname.
When and why we escalate delisting requests
Google often rejects initial delisting applications for sexual offences, regardless of nuance or context. This is particularly frustrating in cases involving strict liability offences, where intent is not a legal factor.
In our experience, initial reviews by Google are conducted by relatively junior caseworkers who may not be legally trained and are more likely to make a blanket rejection based on the nature of the offence alone.
Escalation becomes essential at this stage. When a GDPR notice is served, it requires Google to consider specific legal principles, including accuracy, necessity, and proportionality. This notice often reaches more senior legal professionals within Google, who are better equipped to evaluate complex issues in law and balance the right to privacy with public interest.
If the GDPR notice is accompanied by legal reasoning and relevant evidence, particularly evidence that the individual has been rehabilitated and cleared by regulatory authorities, it can shift the balance toward delisting.
Where Google continues to resist, the next step is the preparation of court proceedings. This involves the drafting of a legal claim, including arguments under data protection and privacy laws. Crucially, we always include an application for anonymity at this stage to ensure that no further damage is caused by the court process itself.
This helps protect the individual from renewed media interest and reinforces the argument that further processing of the data is no longer justified or necessary.
Our experience shows that legal escalation is often not only effective but essential. It helps overcome systemic barriers in search engine delisting, particularly where stigma surrounding the offence hides the truth of rehabilitation and lawful reintegration.
Lawyers’ thoughts on the case
This case brings to light the pressing issue of how messaging platforms like WhatsApp and Telegram can unintentionally expose users to illegal content.
In our client’s situation, an image was automatically downloaded onto his phone through a group chat. He did not request it, did not share it, and was unaware of its presence. Yet because of the way these apps operate, automatically downloading files by default, he ended up convicted under a strict liability law.
This meant that the mere presence of the file on his device, regardless of intent, constituted an offence. What followed was a media firestorm and an overzealous reliance on search engines to continue punishing him long after the courts and professional regulators had closed the matter.
Google’s argument that the public should continue to have access to the articles because of his profession as a doctor is not only misguided, it is legally unsound. Regulatory authorities including the GMC had all the facts and determined that he was safe to practise. That decision came after detailed investigations and hearings.
Google, which did not hear the evidence or understand the full context, should not be allowed to override those conclusions. Search engines are not courts, nor are they regulatory bodies. They should not continue to perpetuate outdated or misleading portrayals of individuals who have already been rehabilitated, especially when the offence arose from the automatic function of a social media app rather than malicious intent.
By delisting these articles, we simply restored balance, ensuring that search engines reflect rehabilitation, not just condemnation.
I am proud to have stood up to Google’s initial refusal and to have presented a compelling legal case rooted in data protection, privacy rights, and fairness. The successful outcome of this case serves as a reminder that the right to be forgotten is not a loophole, but a vital mechanism to protect individuals from enduring digital punishment where justice has already been served.