In this video, Roboflow Engineer Felipe Tomino walks us through a custom computer vision application he built to help people practice guitar scales. The application uses a webcam to detect a guitar in real time, mapping out the fretboard, sound hole, and other features. From there, the application overlays an interactive diagram for practicing scales.
You'll hear how Felipe used Roboflow's Auto Label feature to quickly annotate his dataset, train a custom model, and then power the application with Roboflow's Serverless Video API, which processes live video streams over WebRTC. After that, you'll see a demonstration of the application running and learn how Felipe translates JSON predictions into a canvas overlay for an interactive user experience.
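To give a flavor of the prediction-to-overlay step described above, here is a minimal sketch of mapping detection JSON onto canvas coordinates. It assumes Roboflow-style detection fields (`x`/`y` as box centers in source-image pixels, plus `width`, `height`, `class`, and `confidence`); the `toCanvasBox` helper and its scaling logic are illustrative, not Felipe's actual code.

```typescript
// A single detection, shaped like a Roboflow detection result
// (x/y are the box CENTER in source-image pixels).
interface Prediction {
  x: number;
  y: number;
  width: number;
  height: number;
  class: string;
  confidence: number;
}

// A box ready to draw on the canvas (top-left origin).
interface CanvasBox {
  left: number;
  top: number;
  width: number;
  height: number;
  label: string;
}

// Convert one prediction to canvas space: shift from center to
// top-left coordinates, then scale from the source video frame
// size to the overlay canvas size.
function toCanvasBox(
  p: Prediction,
  srcW: number, srcH: number,
  dstW: number, dstH: number
): CanvasBox {
  const sx = dstW / srcW;
  const sy = dstH / srcH;
  return {
    left: (p.x - p.width / 2) * sx,
    top: (p.y - p.height / 2) * sy,
    width: p.width * sx,
    height: p.height * sy,
    label: `${p.class} ${(p.confidence * 100).toFixed(0)}%`,
  };
}
```

In the browser, each `CanvasBox` could then be drawn with `ctx.strokeRect(box.left, box.top, box.width, box.height)` and `ctx.fillText(box.label, ...)` on a canvas positioned over the video element.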
= Additional Resources =
Roboflow Serverless API:
https://docs.roboflow.com/deploy/hosted-api
Roboflow Auto Label:
https://docs.roboflow.com/annotate/auto-label
Getting Started with Roboflow:
https://docs.roboflow.com/
= Chapters =
00:00 - Introduction: Why build a vision app to practice guitar?
02:27 - Preparing the dataset with Auto Label
05:32 - Powering the app with the Serverless Video API
07:44 - Code overview: Connecting WebRTC to a Canvas Overlay
09:31 - Live Demo: Practicing Guitar Scales in Real-Time