Currently, the musical JoyCons project relies on MIDI files as the input source for vibration playback.
This issue proposes adding support for device audio, captured in real time or read from a file, as an alternative input source alongside MIDI. The goal is to make the system easier to use: users can play any audio directly and have it translated into JoyCon vibration patterns.
Motivation
Using MIDI files introduces friction:
- Requires preprocessing or finding compatible MIDI files
- Limits usage to structured musical data
- Prevents quick experimentation with arbitrary audio
By using device audio directly, we can:
- Support any music or sound source
- Enable plug-and-play usage
- Improve UX and accessibility for non-technical users
Proposed Approach
1. Audio Input Pipeline
- Capture device audio (system output or selected input source)
- Convert audio stream into analyzable signal frames
- Perform buffering suitable for low-latency vibration output
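As a sketch of the framing step (capture itself is platform-specific, e.g. via a library such as `sounddevice`; here the audio is assumed to already be a mono NumPy array, and the frame/hop sizes are illustrative choices, not fixed requirements):

```python
import numpy as np

def frame_signal(samples: np.ndarray, frame_size: int = 1024, hop: int = 512) -> np.ndarray:
    """Split a mono audio signal into overlapping frames for analysis.

    Small frames with 50% overlap (~21 ms at 48 kHz) keep latency low
    enough for near-real-time vibration output.
    """
    if len(samples) < frame_size:
        samples = np.pad(samples, (0, frame_size - len(samples)))
    n_frames = 1 + (len(samples) - frame_size) // hop
    return np.stack([samples[i * hop : i * hop + frame_size] for i in range(n_frames)])

# Example: 1 second of audio at 48 kHz
audio = np.random.randn(48000).astype(np.float32)
frames = frame_signal(audio)
print(frames.shape)  # (92, 1024)
```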
2. Signal Processing
Process audio into a simplified representation suitable for JoyCon haptics:
- Noise removal / filtering
  - High-pass / low-pass filtering
  - Optional smoothing
- Frequency analysis (FFT or equivalent)
- Amplitude normalization for consistent vibration strength
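The filtering, FFT, and normalization steps above might look roughly like this NumPy-only sketch (the sample rate and the 60–1200 Hz pass band are illustrative assumptions, not fixed choices):

```python
import numpy as np

def analyze_frame(frame, sample_rate=48000, low_hz=60.0, high_hz=1200.0):
    """Band-limit one audio frame, take its FFT, and normalize amplitude."""
    # Hanning window reduces spectral leakage at frame edges
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    # Crude band-pass: zero out bins outside the useful haptic range
    spectrum[(freqs < low_hz) | (freqs > high_hz)] = 0.0
    # Peak-normalize so vibration strength stays consistent across frames
    peak = spectrum.max()
    if peak > 0:
        spectrum = spectrum / peak
    return freqs, spectrum
```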
3. Primary + Secondary Note Extraction
Use dimensionality reduction to identify dominant musical components:
- Apply PCA (Principal Component Analysis) or a similar technique
- Extract a primary and a secondary note per frame

Goals:
- Reduce dense audio into two meaningful vibration channels
- Preserve musical feel without overwhelming the haptics
- Avoid jitter from minor frequencies
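As a concrete starting point, a simple two-peak picker already yields the two channels this step calls for; PCA or a similar technique could replace it later. The minimum-separation threshold is an illustrative assumption, there to suppress jitter from near-duplicate bins:

```python
import numpy as np

def top_two_components(freqs, spectrum, min_separation_hz=30.0):
    """Pick primary and secondary spectral peaks for the two JoyCon channels.

    Returns two (frequency, intensity) pairs; the secondary is (0, 0)
    if no sufficiently separated second peak exists.
    """
    order = np.argsort(spectrum)[::-1]  # bins sorted by descending intensity
    primary = order[0]
    secondary = None
    for idx in order[1:]:
        # Require minimum frequency separation to avoid near-duplicate peaks
        if abs(freqs[idx] - freqs[primary]) >= min_separation_hz:
            secondary = idx
            break
    first = (freqs[primary], spectrum[primary])
    second = (freqs[secondary], spectrum[secondary]) if secondary is not None else (0.0, 0.0)
    return first, second
```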
4. JoyCon Mapping
- Map component intensity → vibration amplitude
- Map frequency band → vibration frequency
- Smooth transitions between frames to avoid harsh changes
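A minimal mapping sketch, assuming normalized intensities in [0, 1] and the Joy-Con HD rumble's roughly 40–1250 Hz frequency band; the exponential-moving-average smoothing factor is an illustrative choice:

```python
JOYCON_FREQ_MIN, JOYCON_FREQ_MAX = 40.0, 1250.0  # approx HD-rumble range

class HapticMapper:
    """Map (frequency, intensity) pairs to smoothed JoyCon rumble values."""

    def __init__(self, smoothing=0.3):
        self.smoothing = smoothing  # EMA factor: lower = smoother transitions
        self._amp = 0.0

    def map_frame(self, freq_hz, intensity):
        # Clamp frequency into the JoyCon's supported band
        freq = min(max(freq_hz, JOYCON_FREQ_MIN), JOYCON_FREQ_MAX)
        # Exponential moving average avoids harsh amplitude jumps
        target = min(max(intensity, 0.0), 1.0)
        self._amp += self.smoothing * (target - self._amp)
        return freq, self._amp
```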
Notes
This change should be designed so it does not break existing MIDI functionality. Audio input should be an additional mode selectable by the user.