1. Deepfake clinicians and AI-shaped wellness content are blurring the line between medical advice and marketing in ways that patients may struggle to recognize.
2. Rising cases of cosmetic botulinum toxin–related illness underscore the importance of harm-reduction counseling and clear guidance on avoiding unregulated injectables.
The current wave of enthusiasm for weight loss and aesthetic enhancement is unfolding against the backdrop of a sophisticated misinformation ecosystem. A detailed misinformation report describes how AI-generated videos and images are now impersonating real clinicians to promote unregulated products on social media and video platforms. These pieces show that deepfakes do not have to be perfect to be persuasive; they need only capture familiar visual cues such as white coats, exam rooms, and a confident tone. At the same time, coverage of new devices from CES illustrates how AI-enabled wellness platforms can package automated assessments and recommendations in a way that feels clinically authoritative even when evidentiary support is thin.

In parallel, there is a documented rise in individuals self-injecting botulinum toxin obtained from online sources that lack verified ties to authorized manufacturers. A case review summarizes episodes of cosmetic iatrogenic botulism in which patients developed cranial nerve palsies and generalized weakness after receiving injections in homes, salons, or informal group settings. A county-level health advisory has described suspected botulism linked to possibly counterfeit or mishandled product, with several patients requiring hospitalization and antitoxin therapy. Many of these individuals believed they were accessing the same agents used in dermatology clinics, reassured by influencer testimonials and polished marketing materials.

When they present to your office, they may be embarrassed, frightened, and unsure how to disclose the details of what they used. A harm-reduction approach that focuses on stabilization, nonjudgmental history taking, and clear counseling about safer options is more likely to shift future behavior than confrontation alone.
It can be helpful to explicitly invite patients to raise online products with you before trying them, particularly now that deepfake endorsements can be difficult to distinguish from genuine clinician communication. Incorporating a brief explanation of how to check official regulatory websites, recognize legitimate public health alerts, and spot the hallmarks of counterfeit products can turn routine visits into opportunities for digital literacy coaching.
Image: PD
©2026 2 Minute Medicine, Inc. All rights reserved. No works may be reproduced without express written consent from 2 Minute Medicine, Inc. Inquire about licensing here. No article should be construed as medical advice and is not intended as such by the authors or by 2 Minute Medicine, Inc.