As AI-powered fitness tools boom, a troubling pattern has emerged: motivation for some, obsession for others. ATN spoke with personal trainers to find out what they're seeing
AI fitness tools promise smarter workouts, but a new survey reveals they may also be fueling harmful behaviors. Nearly half (46%) of personal trainers report seeing more clients skip meals, overtrain and struggle with anxiety tied to digital tracking.
The survey, conducted by Levity, a digital health company, polled 900 fitness enthusiasts and 100 trainers on how calorie-tracking apps, wearables and AI-driven goal-setting programs are affecting mental health.
The picture isn't all bad: some users say the tools keep them motivated and on track. But for others, the same apps fuel anxiety and unhealthy habits.
The deeper AI embeds itself in fitness, wellness and nutrition platforms, the more urgent the question becomes: do we need to rethink how these tools are designed, marketed and monitored?
But first, the findings:
Thirty percent of users said they often prioritize an app's goals over their body's needs. Nearly half (45%) admitted to skipping meals or overtraining to stay within app limits. Sixty-one percent reported feeling anxious after missing a day of tracking, and 13% said it made them feel like they had failed. More than a quarter (27%) stopped using an app because it hurt their mental health.
Levity's findings also show a generational split. Sixty-six percent of Millennials and 61% of Gen Z respondents reported anxiety when they missed tracking, compared with 37% of Gen X and Baby Boomers. Gen Z, a group highly attuned to health and wellness, was also the most likely to view calorie tracking itself as unhealthy, with 17% expressing concern.
Trainers are noticing the fallout. According to Levity, 46% of fitness professionals said AI tools are contributing to unhealthy or disordered habits among clients, and 20% said these tools make it harder for people to trust their own bodies. Perhaps the most startling stat? 79% reported having to re-educate clients after they followed harmful or inaccurate AI advice.
Brandon Grant, a certified personal trainer and owner of Notorious Fitness in Las Vegas, says he sees this problem with his own clients.
"Over-dependence on AI fitness tools and technology causes issues because people may stop listening to their body and just push through to do whatever the AI recommends," Grant says. "I've heard from clients that 'calories don't matter' and that you can 'out-train a bad diet.' The research shows that this isn't the case, and reaching their fitness goals requires a balance of nutrition and training."

Grant adds that while AI can provide useful information, it needs to be used in the right way to help people succeed.
"Working with a personal trainer will always give people the accountability aspect that helps them stay motivated and focused on their goals," he adds.
Are People Becoming Too Reliant on Digital Fitness Guidance?
Some trainers said they feel professionally undermined, with 53% reporting that clients trust apps more than their input. Yet 32% acknowledged that clients often achieve better results when AI tools are paired with professional guidance. The motivations driving users are familiar, Levity found: 66% turn to apps to stay motivated, 47% to manage weight or body fat and 43% to improve performance or recovery.
Marshall Weber, a certified personal trainer and owner of Jack City Fitness, says AI apps often create more problems than they solve. With a background in exercise science and psychology, Weber warns that generic AI-driven programs often miss the nuance needed for safe training.
"I can definitely see how people can pick up bad habits or even unrealistic training goals from AI-driven apps," Weber says. "These apps are often more cookie-cutter than a bespoke workout experience. This can lead to overtraining or worse, injury."

His biggest concern, though, is with nutrition-focused platforms.
"Watch out for apps pushing extreme calorie cuts or fad-style diets that just aren't sustainable," he warns. "That stuff confuses people more than it helps."
Still, Weber, like Grant, acknowledges that the technology can have an upside. "Apps can really motivate some clients because they make workouts feel accessible," he says. "Sometimes just getting your foot in the door is the hardest part, and being excited about an app can bridge that gap for some. I believe that at the end of the day, tech should support fitness, not replace your trainer or dietitian."
Why the Human Touch Can't Be (Fully) Replaced
Jenny Liebl, a senior product developer and master trainer at the International Sports Sciences Association (ISSA), says the issue with AI fitness tools is less about overtly harmful advice and more about gaps that leave people confused.
"You can get all this information, and it can distill it down to something you can actually use, but AI doesn't help you stay motivated," Liebl says. "AI doesn't help you figure out where to exercise, how to use a machine or what an exercise even is. There are a lot of missing pieces in some of these tools, and that becomes the issue."
While the arrival of AI has workers in nearly every sector wondering what will be left of their careers in a few years, Liebl makes a compelling case that in fitness and wellness, a human touch won't just be necessary but also welcomed for oversight and, above all, for genuine motivation.
She recalls meeting a woman who had generated a months-long workout plan with an AI tool but never started it.
"She had created it several weeks earlier but hadn't done a single workout," Liebl says. "That's the problem. AI can give you a program, but it can't make you stick with it."

Liebl said the most common misinformation she sees clients bring in is simply "worrying about things they don't need to be worried about." With so much information available, clients often assume every piece of advice applies to them when, in reality, only a small fraction may be relevant.
She believes the best path forward is a balanced one that pairs AI with human oversight.
"Maybe there's a way where a program can be generated by AI, but then a fitness professional reviews it before it's handed off to a client," Liebl says. She compared it to Hudl, a video platform that uses AI to track sports stats but always has a human review the results before sending them back.
"It really comes down to prompting," she adds. "We have to prompt AI correctly to get the right information, and then have the right guardrails in place to make sure people are using it safely."
All of this unfolds against a backdrop where AI isn't going anywhere. The global AI-in-mobile-apps market, valued at $27.7 billion in 2025, is projected to reach $322 billion by 2034, growing at a rate of more than 31% annually, according to Research and Markets. Fitness and wellness apps are expected to be the fastest-growing segment, driven by rising demand for personalized fitness tracking, predictive analytics and AI-powered virtual coaching.
The question now may be whether the industry can deliver that growth without fueling more anxiety along the way.