Kinaesthetic AI
Investor evidence

Evidence: signal -> alert -> recovery.

The system reads body-state signals, gives one short command, measures recovery, and collects validation labels. Raw video and audio are never stored.
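The signal -> alert -> recovery loop above can be sketched in a few lines. Everything here is illustrative: the weights, the 0.6 threshold, and the command text are hypothetical placeholders, not the production values.

```typescript
// Minimal sketch of the signal -> alert -> recovery loop.
// All names, weights, and thresholds are illustrative assumptions.

type BodySignals = {
  jawTension: number;        // 0..1, derived in-browser; raw frames never leave the client
  shoulderElevation: number; // 0..1
  postureCollapse: number;   // 0..1
};

// Hypothetical weighting: fold the derived signals into one tilt-risk score.
function tiltRisk(s: BodySignals): number {
  return Math.min(
    1,
    0.4 * s.jawTension + 0.3 * s.shoulderElevation + 0.3 * s.postureCollapse
  );
}

// One short command when risk crosses a threshold; null means "stay silent".
function command(risk: number): string | null {
  return risk >= 0.6 ? "Unclench your jaw. One slow breath." : null;
}

// Recovery delta: how far risk dropped after the command was issued.
function recoveryDelta(riskBefore: number, riskAfter: number): number {
  return riskBefore - riskAfter;
}
```

The recovery delta is what later feeds the tester labels (felt tilt / helped / false alert) described below.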

Storage: local (offline). Targets: 100+ testers, 150 labels.

OFFLINE: no live signal. Start /play for real proof.

Latest proof: not enough live data yet.

Start /play or /demo to collect recovery proof.

What we measure
  • Jaw tension, shoulder elevation, posture collapse.
  • Tilt risk, readiness, recovery delta.
  • Tester labels: felt tilt, helped, false alert.
What we never store
  • Raw video.
  • Raw audio.
  • Camera frames.
Validation: 0/150 labels

Goal: prove early tension detection and command usefulness with testers.

Beta database: 0/100 testers

Every consenting player session saves derived samples, labels, and proof metadata. Raw media is excluded.

Auto-save pipeline: local JSONL

Browser CV -> /api/signals -> local storage -> optional Supabase mirror.

Stored per check
  • Tilt risk, readiness, recovery.
  • Jaw/shoulder/posture signals.
  • Session id, tester id, game, timestamp.
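A sketch of what one stored check could look like in the local JSONL file, based on the field list above. The record shape and field names are assumptions for illustration; only derived values appear, never raw media.

```typescript
// Hypothetical shape of one derived sample appended to the local JSONL store.
// Field names are illustrative; raw video/audio/frames are never part of it.

type CheckRecord = {
  sessionId: string;
  testerId: string;
  game: string;
  timestamp: string; // ISO 8601
  tiltRisk: number;  // 0..1
  readiness: number; // 0..1
  recovery: number;  // delta since the last command
  signals: { jaw: number; shoulder: number; posture: number };
};

// JSONL: one JSON object per line, so appends are cheap and each line
// stays independently parseable even if a write is cut short.
function toJsonl(records: CheckRecord[]): string {
  return records.map((r) => JSON.stringify(r)).join("\n") + "\n";
}

function fromJsonl(text: string): CheckRecord[] {
  return text
    .trim()
    .split("\n")
    .map((line) => JSON.parse(line) as CheckRecord);
}
```

The optional Supabase mirror would replay the same records; the JSONL file stays the source of truth when offline.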
Live timeline (series: tilt risk, readiness, recovery)

No timeline yet.

Cloud coach fallback

Critical alerts work locally even when the LLM is unavailable.

Data moat: 0

Derived samples and labels accumulate without storing raw media.

Privacy passport

Ready to export investor package. No raw media included.

Investor next step: the public link is ready to view.

If the backend is not connected, the site honestly shows a public demo mode instead of faking results. Live camera analysis stays local in the browser, and the proof package can be downloaded here.

Watch demo · Contact founder