Explainer · Accuracy
Why wearables overestimate calories burned
The short version: wrist-worn heart rate is noisy, MET tables are approximations, and the two errors compound. What accuracy-focused nutrition apps do differently is ignore the burn side and get rigorous on the intake side.
Wrist-worn wearables overestimate active calorie burn by roughly 10 to 30 percent compared with laboratory reference methods. The two causes are (a) noisy PPG heart-rate sensing during wrist motion and (b) MET-table-based calorie estimation that averages across body types. If you want precision, don't try to fix the burn side — focus on intake. Good intake apps report single-digit percent error.
The two ends of a calorie balance equation
Energy balance has two sides: burn (what your body expended) and intake (what you ate). Both have to be measured to predict weight change. Commercial wearables have spent a decade trying to measure burn precisely. They've gotten better, but the precision ceiling for wrist-worn hardware is pretty low — for physiological reasons, not engineering ones.
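To make that concrete, here's a minimal sketch in Python of how the two sides combine into a weight-change prediction. The 7,700 kcal-per-kilogram figure is a rough population-level rule of thumb, not a personal constant, so treat the output as directional:

```python
def predicted_weight_change_kg(intake_kcal: float, burn_kcal: float,
                               days: int) -> float:
    """Predict weight change from a sustained daily energy balance."""
    KCAL_PER_KG_BODY_MASS = 7700.0  # rough rule of thumb, not a personal constant
    daily_balance = intake_kcal - burn_kcal  # positive = surplus, negative = deficit
    return daily_balance * days / KCAL_PER_KG_BODY_MASS

# A 300 kcal daily deficit held for 30 days:
print(predicted_weight_change_kg(2200, 2500, 30))  # about -1.17 kg
```

Note the sensitivity: a 20 percent error on the 2,500 kcal burn term is 500 kcal/day, larger than the deficit most people target. That's why the error analysis below matters.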
Why the burn side is hard
The chain of inference looks like this:
- Photoplethysmography (PPG) sensor on the back of the watch measures blood-volume changes under the skin at the wrist.
- A signal-processing pipeline extracts an estimated heart rate.
- Activity classification identifies what you're doing (running, cycling, walking).
- A MET (metabolic equivalent of task) table maps heart rate × activity to a rate of calorie burn per kilogram of body weight.
- Your weight, age, sex, and resting heart rate scale the estimate.
Each step introduces error. PPG is vulnerable to motion artefacts, especially for activities where the wrist moves independently of the torso (weightlifting, boxing, rowing). Activity classification is usually fine for running and cycling and often poor for everything else. MET tables are population averages — they bake in the assumption that a typical 70 kg adult at 140 bpm is burning ~11 kcal/min. You may not be that person.
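To see how the last two steps combine, here is a sketch using the standard MET formula (kcal/min = MET × 3.5 × weight in kg ÷ 200). The MET values are illustrative population averages in the style of the Compendium of Physical Activities, not any vendor's actual table:

```python
# activity -> MET (metabolic equivalent of task); illustrative averages
MET_TABLE = {
    "walking": 3.5,
    "running": 9.8,   # roughly 6 mph
    "cycling": 7.5,   # moderate effort
}

def burn_rate_kcal_per_min(activity: str, weight_kg: float) -> float:
    """Standard MET formula: kcal/min = MET * 3.5 * weight_kg / 200."""
    return MET_TABLE[activity] * 3.5 * weight_kg / 200.0

# The 70 kg adult from the paragraph above, classified as "running":
print(burn_rate_kcal_per_min("running", 70))  # ~12 kcal/min
```

Every upstream error lands here: if the classifier labels rowing as walking, or the MET value doesn't match your physiology, the output is off by the same multiplicative factor.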
What studies have shown
A widely cited 2017 Stanford study (Shcherbina et al., J. Pers. Med.) compared seven commercial wearables against indirect calorimetry. Errors on active energy expenditure ranged from about 27 percent for the best device to over 90 percent for the worst; none got within 20 percent of the reference. Follow-up studies have moved those numbers a few points in either direction as firmware improved, but the shape has not changed: heart-rate accuracy has gotten better, calorie estimation has not kept pace.
This doesn't mean wearables are useless — they're excellent at trends (harder day than yesterday? they'll tell you). They just aren't precision instruments for absolute calorie accounting.
Why intake is easier to measure well
Intake is a classification and lookup problem. Given a plate of food, how many calories is that? An answer exists — it's in a database somewhere (USDA FoodData Central for whole foods, NCCDB for prepared foods, Nutritionix for restaurant items). The remaining problem is portion estimation.
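Structurally, that's a lookup plus a scale factor, something like the sketch below (the per-100 g values are illustrative stand-ins, not actual USDA records):

```python
# per-100 g calorie values, as a food database would store them
FOOD_DB_KCAL_PER_100G = {
    "apple, raw": 52.0,
    "chicken breast, roasted": 165.0,
    "white rice, cooked": 130.0,
}

def intake_kcal(food: str, portion_g: float) -> float:
    """Look up the food, then scale by the estimated portion size."""
    return FOOD_DB_KCAL_PER_100G[food] * portion_g / 100.0

# Once the lookup is right, all remaining error lives in portion_g:
print(intake_kcal("apple, raw", 180))  # a "medium apple", ~94 kcal
```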
Portion estimation is where the better intake apps have made real progress. PlateLens's published methodology, for instance, uses an AI photo pipeline (image identification + monocular depth-based portion estimation) validated against lab-weighed reference portions. The published error is ±1.2% on calories. That's not a small improvement — it's a different order of magnitude from wrist-worn burn estimation.
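Without claiming anything about PlateLens's internals beyond the stages it publishes, depth-based portion estimation can be sketched conceptually: integrate food height above the plate plane into a volume, then convert volume to mass with a per-food density. Every constant below is a hypothetical placeholder:

```python
import numpy as np

def portion_grams(depth_map: np.ndarray, plate_depth_m: float,
                  pixel_area_m2: float, density_g_per_cm3: float) -> float:
    """Integrate per-pixel food height above the plate plane into a
    volume, then convert to mass via an assumed per-food density."""
    # food sits closer to the camera than the plate, so height = plate - depth
    height_m = np.clip(plate_depth_m - depth_map, 0.0, None)
    volume_cm3 = float(height_m.sum()) * pixel_area_m2 * 1e6
    return volume_cm3 * density_g_per_cm3

# Toy geometry: a 10x10-pixel mound, 2 cm tall, 1 cm^2 per pixel
depth = np.full((10, 10), 0.48)  # camera-to-food distance in metres
print(portion_grams(depth, 0.50, 1e-4, 0.6))  # ~120 g
```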
MyFitnessPal, Cronometer, Lose It!, and Yazio all use manual entry with database lookups. Error there is larger and user-dependent (how well do you estimate a "medium apple"?) but the ceiling is still higher than wrist burn estimation.
What to do about it
A few practical takeaways, assuming you want to use a wearable plus a nutrition app to manage weight:
- Stop treating the watch burn number as ground truth. It's directional. Use it to compare days, not to count back calories one-for-one.
- Invest the precision budget on the intake side. That's where the accuracy ceiling is higher. Pick an app whose accuracy you trust and log consistently.
- Eat to outcomes. If the scale isn't moving the way you expected, adjust intake by 150 to 250 calories and observe over 7 to 10 days (see the sketch after this list). Don't chase daily numbers.
- If you care about HR accuracy (e.g., for training zones), use a chest strap paired to the watch. It fixes the HR step of the chain but doesn't fix the MET-table step.
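Here's a sketch of the eat-to-outcomes loop from the third bullet: compare a week-plus of morning weigh-ins against your target trend and nudge intake, capped at the 250 kcal the bullet suggests. The 7,700 kcal/kg constant is the same rough rule of thumb used earlier:

```python
def suggested_intake_adjustment_kcal(daily_weights_kg: list[float],
                                     target_kg_per_week: float) -> float:
    """Daily intake adjustment from an observed weight trend. Expects
    7-10 days of morning weights to smooth out water-weight noise."""
    days = len(daily_weights_kg)
    observed_kg_per_week = (daily_weights_kg[-1] - daily_weights_kg[0]) / days * 7
    gap_kg_per_week = observed_kg_per_week - target_kg_per_week
    adjustment = -gap_kg_per_week * 7700.0 / 7.0  # kcal/day to close the gap
    return max(-250.0, min(250.0, adjustment))    # cap per the bullet above

# Ten days holding flat while aiming to lose 0.25 kg/week:
weights = [82.0, 82.1, 81.9, 82.0, 82.1, 82.0, 81.9, 82.0, 82.0, 82.0]
print(suggested_intake_adjustment_kcal(weights, -0.25))  # -250.0 kcal/day
```

The first-minus-last trend is deliberately crude; a linear fit over the window would be more robust, but the point is the same: react to the trend, not to any single day.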
What the next generation might fix
Some of the hardware problems are being worked on. Multi-path PPG, skin-contact ECG leads, and temperature-corrected metabolic models are all active areas of research. But MET-table-based estimation is still the foundation, and replacing it requires either direct gas-exchange measurement (impractical outside a lab) or a personalized model calibrated against your own metabolic data. Consumer wearables are a long way from that.
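As a toy illustration of what calibration against your own data could mean (the linear form and every number below are assumptions for illustration, not any vendor's method): collect sessions where the watch's estimate can be compared to a trusted reference, fit a correction, and apply it to future readings:

```python
import numpy as np

watch_kcal = np.array([520.0, 610.0, 430.0, 700.0, 380.0])      # device estimates
reference_kcal = np.array([410.0, 495.0, 350.0, 560.0, 300.0])  # lab-measured

# Fit reference ~ a * watch + b by ordinary least squares
a, b = np.polyfit(watch_kcal, reference_kcal, 1)

def calibrated_burn(watch_estimate: float) -> float:
    """Apply the personal correction to a new watch reading."""
    return a * watch_estimate + b

print(round(calibrated_burn(550.0)))  # ~441 kcal after correction
```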
Related reading
- Wearables × nutrition apps compatibility matrix
- Best calorie tracking apps for Apple Watch (2026)
- Whoop 4.0 strain + calorie tracking workflow
FAQ
- By how much do wearables typically overestimate?
- 10 to 30 percent on active energy expenditure, per validation literature.
- Why are wearables less accurate than they feel?
- Noisy PPG + population-average MET tables, compounded.
- Which wearable is most accurate on burn?
- Chest-strap HR + watch beats any wrist-only setup.
- Is Apple Watch better than Garmin on calories?
- Not meaningfully. Both in the same error range.
- Can I trust intake apps more than burn estimates?
- Often yes — intake can hit single-digit percent error with rigorous logging.
- What do accuracy-focused nutrition apps do differently?
- Cross-reference USDA/NCCDB/Nutritionix and validate portion estimation against weighed references.
- Should I stop trusting my watch's calorie number?
- De-weight it. Use for trend, not for precision accounting.
- How much better is a chest strap?
- HR accuracy improves to ±2 bpm; calorie estimate improves but retains MET-table error.