A sports-media founder wanted to ship per-player highlight reels for full football matches, automatically, with no human editor in the loop.
The thesis was simple: parents will pay for a personalized three-minute reel of their kid's best touches, but only if it lands the same evening as the match. The product had to ingest a single broadcast-style camera feed, identify every player on the pitch, follow the ball, and stitch each player's involvement into a watchable cut.
We embedded for six months across vision, applied ML, and product. The first end-to-end render ran on real match footage in week three. By month five the pipeline was producing 22 personalized reels per match, in less time than the match itself.
// what we built
Four layers, one product.
[01]py · torch · paddleocr
Player ID & re-identification
Jersey-number OCR + appearance embeddings, robust to occlusion and pile-ups.
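A minimal sketch of how OCR and appearance cues can be fused for re-identification. The function name, the vote/gallery structures, and the `0.6` weighting are illustrative assumptions, not the production code: each track accumulates jersey-number OCR confidence votes, and an appearance embedding breaks ties when numbers are occluded.

```python
import numpy as np

def assign_identity(ocr_votes, track_emb, gallery, ocr_weight=0.6):
    """Fuse jersey-number OCR votes with appearance-embedding similarity.

    ocr_votes: {jersey_number: summed OCR confidence} for one track
    track_emb: L2-normalised appearance embedding of the track
    gallery:   {jersey_number: L2-normalised reference embedding}
    """
    scores = {}
    for number, ref_emb in gallery.items():
        ocr_score = ocr_votes.get(number, 0.0)
        app_score = float(np.dot(track_emb, ref_emb))  # cosine similarity
        # Weighted blend: OCR dominates when legible, appearance
        # carries the decision through occlusions and pile-ups.
        scores[number] = ocr_weight * ocr_score + (1 - ocr_weight) * app_score
    return max(scores, key=scores.get)
```

The key design point is that neither signal is trusted alone: OCR fails in pile-ups, embeddings drift across lighting changes, but a confidence-weighted blend stays stable across a full match.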
[02]yolo · bytetrack
Ball + event tracking
Single-camera ball trajectory with possession + event classification.
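One way possession can be derived from the tracks, sketched under assumptions: the helper name, the `players` dict shape, and the two-metre radius are hypothetical, but the nearest-player-within-a-radius rule is a common baseline for single-camera possession assignment.

```python
import math

def possession(ball_xy, players, max_dist=2.0):
    """Assign possession to the nearest tracked player within
    max_dist metres of the ball; return None for a loose ball.

    players: {player_id: (x, y)} in pitch coordinates.
    """
    best_id, best_d = None, max_dist
    for pid, (px, py) in players.items():
        d = math.hypot(ball_xy[0] - px, ball_xy[1] - py)
        if d < best_d:
            best_id, best_d = pid, d
    return best_id
```

Smoothing per-frame assignments over a short window (rather than trusting any single frame) is what turns this raw signal into usable possession spans for event classification.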
[03]vlm · rules
Highlight scoring
Per-player involvement score on every clip; ranked and length-budgeted.
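The rank-then-budget step can be sketched as a greedy selection; the function and the clip-tuple shape are assumptions for illustration, with the three-minute default taken from the reel length stated below.

```python
def select_clips(clips, budget_s=180.0):
    """Greedy highlight selection: take the highest-scoring clips for a
    player until the reel budget (default 3 minutes) is filled, then
    re-sort by kickoff time so the reel plays chronologically.

    clips: list of (start_s, duration_s, score) tuples.
    """
    chosen, used = [], 0.0
    for clip in sorted(clips, key=lambda c: c[2], reverse=True):
        if used + clip[1] <= budget_s:
            chosen.append(clip)
            used += clip[1]
    return sorted(chosen, key=lambda c: c[0])
```

Greedy selection is not optimal in the knapsack sense, but with near-uniform clip lengths it gets within one clip of optimal and keeps the per-player step trivially fast across 22 reels.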
[04]ffmpeg · aws
Auto-edit & delivery
Cut, music bed, branded titles, push to parent on match end.
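A sketch of the final render step, assuming the selected cuts are (start, end) pairs in seconds. The helper is hypothetical and the music bed and branded titles are omitted, but the `trim`/`concat` filtergraph shown is standard ffmpeg `filter_complex` syntax for cutting and joining segments in one pass.

```python
def ffmpeg_reel_cmd(source, cuts, out_path):
    """Build one ffmpeg command that trims each (start_s, end_s) cut
    from the match recording and concatenates them into a single reel."""
    parts, labels = [], []
    for i, (start, end) in enumerate(cuts):
        # Trim video and audio for this cut, resetting timestamps so
        # segments butt together cleanly.
        parts.append(
            f"[0:v]trim={start}:{end},setpts=PTS-STARTPTS[v{i}];"
            f"[0:a]atrim={start}:{end},asetpts=PTS-STARTPTS[a{i}];"
        )
        labels.append(f"[v{i}][a{i}]")
    graph = "".join(parts) + "".join(labels) + f"concat=n={len(cuts)}:v=1:a=1[v][a]"
    return ["ffmpeg", "-i", source, "-filter_complex", graph,
            "-map", "[v]", "-map", "[a]", out_path]
```

Building the whole reel as one filtergraph avoids writing intermediate segment files, which matters when 22 reels have to fit inside a sub-90-minute render budget.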
// outcomes
The receipts.
22
personalized reels per match
<90m
render budget · match-length SLA
94%
jersey-number OCR accuracy
3 min
median reel length per player
6mo
concept to first paying league
0
human editors in the loop
"
They wrote production code in week 4 and refused to write a slide for six months. The first time we saw the system render a real reel, I felt so proud.
Guy — founder, Match Cuts
// faq · match cuts
What clients ask about Match Cuts.
Producing 22 personalized highlight reels per football match from a single broadcast camera, in under 90 minutes, with no human editor in the loop. Player re-identification under occlusion was the hardest sub-problem.