FormFix πŸ€

AI-powered basketball shot form analyzer. Upload one clip, get real coaching cues in plain English, then dive into the tracked replay.


Why FormFix?

Most shot analysis tools dump raw numbers. FormFix gives you coaching, not metrics.

| Traditional Tools | FormFix |
| --- | --- |
| "Elbow angle: 78°" | "Keep your elbow tighter at the set point" |
| "Knee flexion: 115°" | "Sink a bit deeper; you're losing power" |
| "Release height: 2.1m" | "Your release is clean, matches a textbook set shot" |

One clean rep is all you need.


✨ Features

🎯 Quick Coaching First

Plain-language cues appear in seconds, before any heavy visuals render. No waiting around.

🔬 Full Pose + Hand Tracking

MediaPipe Holistic extracts 33 body landmarks plus 21 hand landmarks per hand. We see your wrist snap, finger roll-through, and follow-through extension.
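Those landmark counts pin down a fixed-width feature vector per frame: 33 pose + 2×21 hand landmarks = 75 points, or 225 floats with (x, y, z) coordinates. A minimal sketch of packing one frame into such a vector, with zero-padding for an undetected hand (illustrative only, not FormFix's actual code; `flatten_frame` is a hypothetical helper):

```python
# Hypothetical sketch: pack one frame's landmarks into a flat feature vector.
# MediaPipe Holistic yields 33 pose + 21 left-hand + 21 right-hand landmarks,
# each an (x, y, z) point -> 75 points, 225 floats per frame.

POSE_LANDMARKS = 33
HAND_LANDMARKS = 21

def flatten_frame(pose, left_hand, right_hand):
    """pose/left_hand/right_hand: lists of (x, y, z) tuples, or None.
    An undetected hand is zero-padded so every frame has the same width."""
    def pad(points, n):
        points = list(points or [])
        points += [(0.0, 0.0, 0.0)] * (n - len(points))
        return points

    frame = (pad(pose, POSE_LANDMARKS)
             + pad(left_hand, HAND_LANDMARKS)
             + pad(right_hand, HAND_LANDMARKS))
    return [coord for point in frame for coord in point]

# Right hand tracked, left hand missing (padded with zeros).
vec = flatten_frame([(0.5, 0.4, 0.0)] * 33, None, [(0.1, 0.2, 0.3)] * 21)
```

A fixed width like this is what lets downstream stages (phase detection, style matching) treat a clip as a plain numeric matrix.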

📊 Automatic Phase Detection

The shot is segmented into Load → Set → Rise → Release → Follow-through, with each phase analyzed independently.
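To make the idea concrete, here is a toy 1-D version of phase segmentation, assuming only a wrist-height trajectory: the wrist dips through Load, climbs through Rise, and peaks at Release. The real analyzer works from full-body landmarks; this sketch collapses Set (the pause at the dip) into Load for brevity.

```python
# Toy sketch of phase detection from a 1-D wrist-height trajectory.
# Simplistic assumption: deepest dip = end of Load, highest point = Release.

def detect_phases(wrist_y):
    """wrist_y: wrist height per frame (larger = higher).
    Returns {phase: (start_frame, end_frame)}; Set is folded into Load here."""
    lowest = min(range(len(wrist_y)), key=wrist_y.__getitem__)   # deepest dip
    highest = max(range(len(wrist_y)), key=wrist_y.__getitem__)  # release peak
    return {
        "load": (0, lowest),
        "rise": (lowest, highest),
        "release": (highest, highest),
        "follow_through": (highest, len(wrist_y) - 1),
    }

phases = detect_phases([1.4, 1.2, 1.0, 1.1, 1.6, 2.1, 2.0, 1.9])
```

Analyzing each phase independently then reduces to slicing the landmark matrix by these frame ranges.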

🎨 Style Comparison

Your shot is matched against an archetype library. See which style family you're closest to, what traits you already share, and what to borrow next.
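A minimal sketch of how archetype matching could work, assuming a shot is summarized as a few named features scored against reference profiles by distance. The feature names, archetype values, and scoring formula below are illustrative guesses, not FormFix's actual library or engine:

```python
# Hypothetical sketch of style matching: score a shooter's summary features
# against a tiny reference library and report the closest style family.
import math

ARCHETYPES = {
    "textbook_set":  {"elbow_angle": 79, "knee_flexion": 122, "release_height": 2.1},
    "quick_release": {"elbow_angle": 85, "knee_flexion": 135, "release_height": 1.9},
}

def fit_score(features, archetype):
    """Map Euclidean distance in feature space onto a 0-100 'fit' score."""
    dist = math.dist([features[k] for k in archetype], list(archetype.values()))
    return max(0.0, 100.0 - dist)

def best_match(features):
    return max(ARCHETYPES, key=lambda name: fit_score(features, ARCHETYPES[name]))

match = best_match({"elbow_angle": 78, "knee_flexion": 120, "release_height": 2.0})
```

In a real engine the features would be normalized first so that degrees and meters contribute on comparable scales; see docs/comparison_engine.md for how the shipped matcher actually works.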

🎬 Slow-Mo Evidence Replay

Native slow-motion uploads are preferred. FormFix preserves the master clip, renders a guided replay, and builds cue clips, proof frames, and frame strips so the feedback feels earned.

📱 Native Slow-Mo First

The ideal upload is the original 4K 240 fps or 1080p 120/240 fps clip from your phone. In-browser recording still works, but it is treated as a fallback path.


🚀 Quick Start

One-Command Setup

Terminal 1 (Backend):

cd backend && ./back.sh

Terminal 2 (Frontend):

cd frontend && ./front.sh

Open http://localhost:3000 → upload or record a shot → get feedback.


Manual Setup

Requirements

  • Python 3.10–3.12 (MediaPipe doesn't support 3.13 yet)
  • ffmpeg for video encoding
# macOS
brew install ffmpeg

# Ubuntu/Debian
sudo apt install ffmpeg

Backend

python3.11 -m venv .venv
source .venv/bin/activate
pip install -r backend/requirements.txt
python -m uvicorn backend.src.main:app --reload --port 8000

Frontend

cd frontend
python3 -m http.server 3000

πŸ— Architecture

formfix/
├── backend/
│   ├── src/
│   │   ├── main.py              # FastAPI endpoints
│   │   ├── schemas.py           # Pydantic models
│   │   ├── data/
│   │   │   ├── reference_profiles.json   # Archetype library
│   │   │   └── research_bank.json        # Biomechanics research
│   │   └── services/
│   │       ├── analyzer.py      # Pose extraction + phase detection
│   │       ├── comparison.py    # Style matching engine
│   │       └── video_utils.py   # Frame extraction + encoding
│   └── requirements.txt
├── frontend/
│   └── index.html               # Single-page app (vanilla JS)
├── tools/
│   └── reference_pipeline/      # Train your own archetype library
│       ├── export_features.py
│       ├── train_reference_library.py
│       └── build_player_profiles.py
└── docs/
    ├── comparison_engine.md     # How style matching works
    └── datasets_references.md   # Data sources + research

βš™οΈ How It Works

┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│   Upload    │ ──▶ │    Pose     │ ──▶ │   Phase     │
│   Video     │     │  Extraction │     │  Detection  │
└─────────────┘     └─────────────┘     └─────────────┘
                                               │
       ┌───────────────────────────────────────┘
       ▼
┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│   Quick     │ ──▶ │   Style     │ ──▶ │   Deep      │
│  Coaching   │     │   Match     │     │  Breakdown  │
└─────────────┘     └─────────────┘     └─────────────┘
  1. Video Upload: native slow-motion is preferred; 4K 240 fps is the gold-standard capture
  2. Media Inspection: ffprobe classifies frame density, resolution, and capture tier
  3. Coarse Scan: a reduced rendition locates the shot rhythm and likely cue windows
  4. Dense Refinement: high-detail windows sharpen load, release, and follow-through timing
  5. Style Match + Coaching: the strongest findings turn into plain-language cues
  6. Evidence Pack: guided replay, cue clips, proof frames, and frame strips are rendered as URL-based artifacts
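Step 2 can be sketched with a real ffprobe call plus a small classifier. The tier thresholds and the "high_detail"/"standard" names below are assumptions for illustration; only "ultra_detail" appears in the API examples in this README, and ffprobe must be on your PATH:

```python
# Hypothetical sketch of media inspection: read the frame rate with ffprobe,
# then map it onto a capture tier. Thresholds here are illustrative guesses.
import json
import subprocess

def probe_fps(path):
    """Return the first video stream's average frame rate (requires ffmpeg/ffprobe)."""
    out = subprocess.check_output([
        "ffprobe", "-v", "error", "-select_streams", "v:0",
        "-show_entries", "stream=avg_frame_rate", "-of", "json", path,
    ])
    num, den = json.loads(out)["streams"][0]["avg_frame_rate"].split("/")
    return float(num) / float(den)

def classify_tier(fps):
    if fps >= 180:
        return "ultra_detail"   # native 240 fps slow-mo
    if fps >= 100:
        return "high_detail"    # 120 fps capture
    return "standard"           # ordinary 30/60 fps video
```

The tier then steers the rest of the pipeline: a denser clip earns tighter cue windows in the refinement pass.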

📡 API

POST /analysis-jobs

Queue an async analysis job.

Request:

Content-Type: multipart/form-data

file: video file (.mp4, .mov, .webm, etc.)
shot_type: optional, "free_throw" | "spot_up" | "pull_up"
shooting_hand: optional, "left" | "right"

Response:

{
  "job_id": "uuid",
  "status": "queued",
  "stage": "queued",
  "progress_message": "Clip uploaded. Checking whether this is a high-detail slow-motion read.",
  "media_profile": {
    "fps": 240.0,
    "detail_tier": "ultra_detail"
  }
}

GET /analysis-jobs/{job_id}

Poll for status and fetch the final URL-based result payload.
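A minimal stdlib-only client sketch for this polling flow. The endpoint path comes from this README; the terminal status set is an assumption based on the examples shown (the failure status name may differ in the actual API):

```python
# Hypothetical client: poll GET /analysis-jobs/{job_id} until a terminal state.
import json
import time
import urllib.request

TERMINAL_STATES = {"completed", "failed"}  # assumed terminal statuses

def is_terminal(status):
    return status in TERMINAL_STATES

def poll_job(base_url, job_id, interval=2.0, timeout=300.0):
    """Fetch the job every `interval` seconds; return it once it finishes."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        with urllib.request.urlopen(f"{base_url}/analysis-jobs/{job_id}") as resp:
            job = json.load(resp)
        if is_terminal(job["status"]):
            return job
        time.sleep(interval)
    raise TimeoutError(f"job {job_id} did not finish within {timeout}s")
```

Usage would look like `poll_job("http://localhost:8000", job_id)` after the initial POST returns a `job_id`.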

POST /analyze

Blocking compatibility wrapper for quick local testing.

Request:

Content-Type: multipart/form-data

file: video file (.mp4, .mov, etc.)
shot_type: optional, "free_throw" | "spot_up" | "pull_up"
return_visuals: "true" to include replay assets

Response:

{
  "job_id": "uuid",
  "status": "completed",
  "result": {
    "phases": [...],
    "issues": [...],
    "coaching_cues": [...],
    "comparison": {
      "style_family": "textbook_set",
      "fit_score": 82,
      "aligned_traits": [...],
      "borrow_next": [...]
    },
    "media_profile": {...},
    "artifacts": {
      "annotated_replay_url": "/media/jobs/<id>/artifacts/annotated-replay.mp4",
      "cue_clip_urls": [...],
      "cue_still_urls": [...],
      "frame_strip_urls": [...]
    },
    "playback_script": [...]
  }
}

GET /health

Service health check.


🔧 Reference Pipeline

Train your own archetype library from labeled clips:

# 1. Export features from your clip library
python tools/reference_pipeline/export_features.py \
  --input clips/ --output features.jsonl

# 2. Cluster into archetypes
python tools/reference_pipeline/train_reference_library.py \
  --input features.jsonl --output trained_library.json

# 3. (Optional) Build player profiles
python tools/reference_pipeline/build_player_profiles.py \
  --input features.jsonl --output player_profiles.json

Override the default library:

export FORMFIX_REFERENCE_LIBRARY=/path/to/trained_library.json

See docs/comparison_engine.md for the full training roadmap.


📚 Research References

  • Biomechanical analysis of basketball shooting (KDU study): optimal knee ~122°, elbow ~79° at the free throw
  • Kinematics of Arm Joint Motions (ScienceDirect): shoulder rotation drives vertical velocity; elbow/wrist drive horizontal velocity + backspin
  • MediaPipe Holistic: 33 pose + 21×2 hand landmarks

📄 License

MIT
