Dr Amir has been left mortified by the video (Image: Gareth Cattermole, Getty Images)
Dr Amir Khan is a respected NHS GP who frequently appears on ITV's Lorraine to provide medical guidance and address topical issues. However, he has alerted his followers to a recent clip, flagged to him by Paul Ince, a content marketing strategist and creator, which showed him endorsing a product he had never encountered, let alone recommended.
In the disturbingly convincing clip, Dr Amir appeared to be promoting a product aimed at helping with erections in gentlemen over 40, attempting to exploit people who may be feeling vulnerable. But naturally, the footage was an alarmingly realistic deepfake of Dr Amir Khan, who was utterly dismayed to discover what had been created in his likeness.
In his Instagram post, he stated: "Scam alert! Thank you, @bizpaul, for alerting me to this deepfake of me selling some kind of bull**** prostate thing. It's not me, and I would never put my name to anything like this. There's going to be a lot more deepfakes of trusted people in the coming months and years. Please be careful. Also, the most offensive thing about this is they've made me sound like a southerner – no offence, but I'm proud of my northern roots!"
The fraudulent, AI-generated Dr Amir claimed: "If your prostate hurts and you don't have morning erections, and sex lasts less than five minutes, it's not age, it's poison destroying you, and all of this is leading to prostate cancer.

"I'm going to tell you about one ingredient that eliminates 96% of the problem within 24 hours, without surgery, right at home, and without side effects. Just promise you won't ignore it."
The fraudulent deep-fake portrayal of Dr Amir had already made several extraordinary assertions at the beginning of the footage, seemingly designed to maintain viewers’ attention, exploiting his reputation as a trusted medical professional whose guidance people value.
The footage proceeded: "In November 2025, our laboratory discovered that men over 40 during andropause, male menopause, begin to secrete a toxic substance that causes prostatitis and prostate cancer, destroys erections, leads to erectile dysfunction, reduces the size of the penis and testicles, and triggers many other symptoms.

"Race, height, physical activity – none of this matters. This toxin was found in 99% of men. We have found a remedy, an ingredient that removes the toxin from the body in just a few hours, unblocks blood vessels and restores normal erections."
The scam then characterised the product as "completely natural", describing it as the "first remedy in human history for male diseases", which has "passed all clinical trials". Every one of these claims is entirely false.
The fraudulent Dr Amir then claimed the product was available at a reduced price as "part of a government programme", attempting to lend credibility to the deceptive footage. "I will personally pay you $1000 (£734.48) if you do not see results within 24 hours," the troubling video went on to state.
Further assertions followed. According to the video, not only would prostate cancer risk supposedly disappear entirely, but one's "vitality will return", whilst "joint pain" would vanish.

"Headaches and blood pressure problems will go away. Some patients report feeling 20 years younger thanks to improved blood circulation and cleansed blood vessels," the footage alleged.

It then encouraged viewers not to "wait for the product to appear in pharmacies with a 100% mark-up", but instead advised people to "order now and tell your friends and relatives".
How can you identify if an online advertisement is fraudulent?
Identifying fraudulent online advertisements can prove challenging, but several warning signs can help safeguard you. One particularly prevalent indicator is manufactured urgency.
Fraudsters frequently attempt to create pressure through phrases such as "limited time offer", "only a few left", or "act now before it's gone".
This tactic aims to force hasty decisions without adequate consideration or investigation. Another significant red flag is an abnormally low price. If the product or service is available at a cost that appears too good to be true, considerably cheaper than comparable offerings elsewhere, it’s a clear indicator that something isn’t right.
Identifying a deepfake can be difficult, particularly as the technology advances, but there remain several revealing indicators to watch for if you suspect a video or audio recording might be digitally manufactured. One of the most frequent giveaways is unnatural facial movements or expressions.
Deepfake technology frequently struggles to authentically reproduce the nuanced movements of our faces, particularly around the eyes and mouth. You may observe that the individual’s blinking appears unusual, either excessively frequent, insufficiently frequent, or out of synchronisation with their speech.
Occasionally, the mouth movements don’t properly align with the words being uttered, or the lips appear to hover unnaturally on the face. Smiles, frowns, and other expressions may appear rigid, delayed, or overemphasised.
You may notice peculiar shadows, highlights that don’t correspond with the direction of the light source, or portions of the face that appear unusually illuminated or darkened compared to the environment. Similarly, skin texture may appear excessively smooth, artificial, or inconsistent with the individual’s neck, ears, or hands.
Occasionally, accessories such as glasses, earrings, or hats can malfunction or seem distorted, particularly if the subject moves their head rapidly.
Listen carefully to the audio as well. Deepfake audio, or voice cloning, may sound mechanical, flat, or lack the natural inflexions and emotional subtleties of genuine speech.
There could be awkward pauses, unusual intonations, or background noise that doesn’t correspond with the environment depicted in the footage. Occasionally, the voice doesn’t quite align with the individual’s known accent or speech patterns, or it sounds marginally off in pitch or tone.
Context is equally vital. If you encounter footage of a prominent figure saying or doing something startling, uncharacteristic, or controversial, particularly if it hasn’t been covered by reputable news sources, remain sceptical.
Deepfakes are frequently used to disseminate misinformation or generate sensational content, so it’s essential to verify the video’s source. Examine whether the footage was uploaded by a trusted account and whether other reliable outlets have reported the same incident.
Ultimately, trust your instincts and remember that as deepfake technology improves, it’s wise to approach sensational or suspicious videos with scepticism. If something feels off, whether it’s the way someone looks, sounds, or acts, it’s worth taking extra steps to verify its authenticity before believing or sharing it, or buying any product from it.