Some of Britain’s best-known TV doctors are increasingly having their names and likenesses used to sell fraudulent products to social media users, a new study warns.
The phenomenon, known as deepfaking, uses artificial intelligence to create sophisticated digital replicas of real people. In these fake videos, one person’s head is superimposed onto another’s body, or their voice is recreated to sound realistic.
The research, published in Wednesday’s issue of the BMJ, found that general practitioners Hilary Jones and Rangan Chatterjee, as well as health broadcaster Michael Mosley, who died last month, have had their likenesses used to promote products without their consent.
In Jones’ case, that means unwittingly promoting blood pressure and diabetes cure-alls and hemp gummies.
Jones, 71, known for his TV appearances on “Good Morning Britain” and other shows, said he has hired social media experts to track down deepfake videos that misrepresent his views and have them removed.
“We’ve seen a massive increase in this type of activity,” Jones said. “Even if it gets taken down, it pops up the next day under a different name.”
It can be difficult to tell which videos are fake. Recent research found that between 27% and 50% of people could not distinguish authentic videos on scientific subjects from deepfakes.
It can be even more difficult when the video features a trusted medical professional who has been in the media for many years.

John Cormack, a retired British doctor, worked with the BMJ to try to understand just how widespread the deepfake doctor phenomenon is on social media.
“After all, it’s much cheaper to spend money on producing videos than it is to research, develop and bring new products to market the traditional way,” Cormack said in the article. “They appear to have found a way to print money.”
Cormack said the platforms that host such content, including Facebook, Instagram, X, YouTube and TikTok, should be held responsible for the computer-generated videos they carry.
A spokesperson for Meta, which owns and operates Facebook and Instagram, told the BMJ it would investigate the cases highlighted in the study.
“We do not allow content that knowingly misleads or attempts to defraud others, and we are constantly working to improve our detection and enforcement,” the spokesperson said. “If you see content that may violate our policies, please report it so we can investigate and take action.”
What to do if you discover a deepfake video
- Examine the video and audio carefully to see whether your suspicions are warranted.
- Contact the person who appears to be endorsing the product to verify whether the video, images and audio are genuine.
- Comment on the post to question its veracity.
- Share your concerns using the platform’s built-in reporting tools.
- Report the user or account that shared the post.
