Vision Pro · Thought piece

Spatial nutrition apps: what is plausible on Vision Pro

What spatial computing could realistically unlock for food logging, grounded in what Apple has already exposed in visionOS — not in science fiction.

By Ryan Costello, Editor
TL;DR

Spatial computing will not replace the phone for food capture — the hardware isn't designed for it and probably won't be for several years. Where Vision Pro (and future headsets) could genuinely change the nutrition-tracking experience is in the review and coaching layer: visualising a week of meals alongside HRV, sleep, and strain; making patterns legible in a way that flat phone dashboards don't. That's useful, not revolutionary. Marketing copy touting "spatial nutrition" should be taken with a generous pinch of salt.

Framing

Every time new hardware ships, a category of article gets written: "Product X will transform how you do Y." Usually it's wrong — not because Product X is bad, but because the writer conflated capability with fit. Vision Pro is a genuinely remarkable piece of hardware. Nutrition tracking is a specific workflow with specific demands. The two overlap in places that matter, and in places that don't. This post is about separating them.

What food logging actually requires

Strip away the apps and look at the workflow. To log a meal, you need:

  1. A capture device that's with you at mealtimes (home, work, restaurant, travel).
  2. A way to identify what's on the plate (photo, barcode, or manual text).
  3. A way to estimate portion (photo + depth, weighed measurement, or user input).
  4. A way to persist and review the resulting entry.

The phone (plus a watch) already covers steps 1-3 reasonably well. A headset covers step 4 better than anything else — but only during the moments you're wearing it.
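Portion estimation (step 3) is the one place where depth sensing does real work. The underlying arithmetic is simple: given a top-down depth image of the plate with food and a reference depth for the empty plate surface, the volume is the sum of per-pixel height differences times the pixel footprint. A minimal sketch of that idea — the function name, inputs, and units are illustrative, not any shipping API:

```python
def estimate_volume_ml(food_depth, plate_depth, pixel_area_cm2):
    """Estimate food volume from two aligned top-down depth maps.

    food_depth, plate_depth: 2D lists of sensor-to-surface distances in cm,
        so food sitting on the plate yields food_depth < plate_depth.
    pixel_area_cm2: ground-plane area one pixel covers, in cm^2.
    Returns volume in millilitres (1 cm^3 == 1 ml).
    """
    volume_cm3 = 0.0
    for food_row, plate_row in zip(food_depth, plate_depth):
        for f, p in zip(food_row, plate_row):
            height = p - f            # cm of food above the plate surface
            if height > 0:            # ignore readings below the plate (noise)
                volume_cm3 += height * pixel_area_cm2
    return volume_cm3


# Toy example: a 2x2 depth grid, plate at 30 cm, food raising two
# pixels to 29 cm, each pixel covering 1 cm^2 -> 2 ml.
print(estimate_volume_ml([[29.0, 30.0], [29.0, 30.0]],
                         [[30.0, 30.0], [30.0, 30.0]], 1.0))
```

The hard parts in practice are everything this sketch assumes away: segmenting food from plate, rectifying an off-axis camera, and mapping volume to grams and calories per food type.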

Where Vision Pro helps today

Review

Laying out a week of meals as floating cards and looking at them in space is a fundamentally different experience from scrolling a phone timeline. You can see variety at a glance, group by meal type, compare days side-by-side without scrolling. PlateLens's shipping implementation is a reasonable first cut.

Coaching visualisations

Coaching workflows benefit from showing multiple data streams together: calories + HRV + sleep + strain. A phone dashboard can show any two; spatial UI can show four or five without feeling cramped. This is where I expect the most innovation in the next two years.

Where Vision Pro is not the right tool

Capture at mealtime

You're not going to put on a Vision Pro to photograph your lunch. Even if visionOS relaxed camera-access restrictions (which would have privacy trade-offs), the headset is too heavy, too expensive, and too indoor-biased to be the everyday capture device. The phone wins for the foreseeable future.

Ambient context awareness

An always-on wearable can notice patterns (high-HRV days, post-workout recovery) and prompt you with relevant nutrition cues. A headset you wear 30 to 60 minutes a day can't. The ambient layer belongs to watches and rings.

What plausibly ships by 2027-28

Extrapolating from currently announced visionOS capabilities, reasonable bets include:

  • Native visionOS builds from the top three nutrition apps (PlateLens has it; MFP is beta; Cronometer would round out the set).
  • Spatial weekly-review flows that combine nutrition with activity and sleep. PlateLens has a first cut; competitors will follow.
  • Room-scanning-based restaurant-meal capture. Apple's Scene Reconstruction API could theoretically map a plate's volume; someone will ship a demo.
  • Coaching avatars that use visionOS's Persona or Digital Human capabilities. This is the most science-fictional item on the list and also the one I'm least confident about.

What I'd actually want

Selfishly, two things:

  1. A "Sunday review" workflow in Vision Pro that shows my entire week of meals laid out spatially, overlaid with Oura readiness and Garmin training load, letting me see at a glance whether my nutrition matched my training. That's close to what spatial UI is good for.
  2. A way to drag and drop meals onto the coming week as a planning tool. Meal-prep planning is a natural spatial activity; the phone is bad at it.

Neither of these requires new hardware. Both could ship on visionOS today. I'm expecting someone to build them this year.

FAQ

Will spatial computing replace calorie apps?
Unlikely in this generation. Spatial computing changes the review layer, not the capture one.
Can Vision Pro scan a plate?
Technically possible; camera-access restrictions make it impractical for daily use.
What's the most realistic near-term feature?
Spatial weekly review plus coaching visualisations.
Will Meta Quest matter for this?
Possibly for niche cases. visionOS is where data-rich nutrition apps ship first.
What would make spatial logging actually useful?
All-day lightweight hardware. Three to five years out minimum.
Are nutrition apps investing here?
PlateLens has shipped, MFP is in beta, and the rest are behind.
Is this different from AR glasses?
Yes — AR glasses remove the indoor constraint but aren't shipping in 2026.
Why write about this now?
To separate "shipping today" from "future of the category." A lot of coverage conflates them.