Dangers of AI in Aesthetics.
AI is everywhere
I don’t think it’s an overstatement to say that this industry loves technology and has always been on the cutting edge of it. From dictation to skin analysis and marketing tools, the cosmetics industry has embraced AI wholeheartedly. And who can blame them? We are all looking for efficiency, an edge, or a way to grow our clinics and influence.
There is, however, a danger with early adoption.
AI is already in aesthetics
I make it part of my visual diet to look at trends and the marketing of clinical photographs in this industry, and unfortunately in the last six months I’ve started to notice more of what I’d call ‘super-real’ images. They resemble stock photographs, or the highly retouched images that are already all too common in this industry, but they have a certain perfection to them that’s uncanny.
Fig 1: Currently being sued by BTL.
People are using AI for deceptive marketing
You have to apply Occam’s razor to the question of why this is happening: clinics outsource their marketing to agencies, and most agencies outsource their marketing in turn. Along the way standards can be eroded, and in the spirit of ‘let’s just get it done’, brands arrive at marketing that is unethical. I was shocked to see a device brand at this year’s ASCD conference advertise using AI models, but the more I thought about it, the more I realised it’s just the direction of the tide.
Fig 2: Looks like a stock image, but is really AI.
Obviously the greatest danger is setting unrealistic expectations amongst patients. I would go one step further and say that using AI images as ‘results’ is pure fraud, and it should be treated as such.
It’s up to the industry and the regulators, and in many ways there is an opportunity now to come down hard on this early, unlike stock photographs, which took two decades to be outlawed.
Ahpra
4.4 Advertising must not create an unreasonable expectation of beneficial treatment. The claims of beneficial treatment can range from unsubstantiated scientific claims through to miracle cures. Advertising of treatments or services must not encourage or promote unreasonable expectations.
Whilst the regulations are still in flux and AI is not specifically mentioned, I believe it would be covered under section 4.4. It will be interesting to see if the latest guidelines, set to arrive soon, address the new wave of AI-generated misinformation.
Meridian Lawyers have published a great summary of the rules, and Avant has a similar opinion:
Take care with images used in your advertising. ‘Before and after’ photographs may be misleading and could be seen to create unreasonable expectations. Images must be of a real patient who has undergone the advertised procedure in your practice.
Stock images are acceptable in your advertising provided they are not used to infer a false claim or mislead.
The US currently relies on federal laws to regulate AI but plans to introduce dedicated legislation and a national regulator; meanwhile, AI developers face a growing patchwork of state and local rules (b). Australia is still developing its own approach to regulation. (c)
There are, however, platforms that are self-regulating, such as a new standard called Content Authentication that is built into real cameras.
I’m proud that we are already across this, as the image format we use in our installations has content authentication built in, so you know a photograph is a photograph.
The irony is that ‘bad photography’ effectively does the same thing as AI: it misinforms and costs opportunities to educate, convert and retain clients. Media and technology literacy is historically poor within the cosmetics industry and the wider public, and critical thinking and informed consent are vital when making decisions about invasive and permanent treatments. Clinicians need to start from a place of trust, authenticity and honesty when showing their results, and standardised, objective, clear photography is still the best way to do this.
Not sure something is AI? Try these tips.
Imperfections usually mean someone is real, so look out for models who are too symmetrical.
The reflections in the patient’s eyes are exactly the same, which is impossible in real life.
The skin has a haze to it, and if you zoom in you’ll notice a really fine squiggled-line texture left behind by the generator.
The images are typically very polished, with minimal errors such as flyaway hairs or lint on a T-shirt.
The before/after text is usually in a very generic font, at odds with the rest of the clinic’s branding.
And if you’re still not sure, you can use an AI detector such as https://undetectable.ai/ai-image-detector
I think ethical practitioners will remember that this is medicine, and there is a point where technology needs boundaries. Don’t wait for regulations to tell you what’s right or wrong.
And hopefully you can tell from my poor grammar, that ChatGPT didn’t write this for me.
Sources
(a) “Advertising guidelines when promoting your practice,” Avant (Australia) – guidance based on AHPRA advertising rules, 2023.
(b) https://www.whitecase.com/insight-our-thinking/ai-watch-global-regulatory-tracker-united-states

