An attempt at using the Web Audio API on a phone with a TRRS-jack headset to detect button presses on the headset.
If the headset shows up as a possible input stream in the Web Audio API, it should be possible to detect the input pattern of a button press.
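A quick way to check whether the headset is visible at all is to enumerate the audio input devices. This is a minimal sketch, not part of the project; `filterAudioInputs` is a hypothetical helper name:

```javascript
// Hypothetical helper: keep only audio inputs from an enumerateDevices() result.
function filterAudioInputs(devices) {
  return devices.filter((d) => d.kind === 'audioinput');
}

// Sketch: list audio inputs; labels are hidden until mic permission is granted.
async function listAudioInputs() {
  const devices = await navigator.mediaDevices.enumerateDevices();
  for (const d of filterAudioInputs(devices)) {
    console.log(d.label || '(label hidden until permission granted)', d.deviceId);
  }
}
```

Whether the TRRS headset mic appears as its own device or is merged into the default input depends on the phone and browser.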
An AnalyserNode with getByteTimeDomainData buffers sampled in a requestAnimationFrame loop should be enough to tell.
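The loop above might be sketched as follows, assuming the headset mic is reachable through getUserMedia. `detectPulse` and its threshold are placeholder names and values for illustration, not the project's actual detection logic:

```javascript
// Hypothetical detector: true when the byte-encoded time-domain buffer
// deviates strongly from the 128 midpoint (silence in byte encoding).
function detectPulse(buffer, threshold = 32) {
  for (const sample of buffer) {
    if (Math.abs(sample - 128) > threshold) return true;
  }
  return false;
}

// Sketch: wire the mic stream into an AnalyserNode and poll it each frame.
async function watchHeadsetButton() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  ctx.createMediaStreamSource(stream).connect(analyser);

  const buffer = new Uint8Array(analyser.fftSize);
  const tick = () => {
    analyser.getByteTimeDomainData(buffer);
    if (detectPulse(buffer)) console.log('possible button press');
    requestAnimationFrame(tick);
  };
  requestAnimationFrame(tick);
}
```

A real detector would likely need to distinguish a button pulse from ordinary microphone noise, which is exactly what the experiment below tries to observe.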
This should bridge the gap between native APIs and web APIs. Both Google's Android and Apple's iPhone provide programmatic ways to listen for headphone button presses in an application:
| Conductor | Channel (Nokia) | Channel (Apple) |
|-----------|-----------------|-----------------|
| Tip       | Left Out        | Left Out        |
| Ring      | Right Out       | Right Out       |
A House of Marley Liberate XLBT (midnight) headset with Google Chrome 64 on Android does not show any specific pattern when the headphone button is pressed. It appears the pulse is either too short to notice, or the phone recognizes the press by matching the sound recorded from the microphone. The phone itself is able to interpret the button press.
Serve the page locally, e.g. `python -m SimpleHTTPServer` (Python 2) or `python3 -m http.server` (Python 3).
Use an Android phone on the same network with a TRRS jack.
Connect the phone over USB and open chrome://inspect in desktop Chrome to use Chrome DevTools against the page running on the phone.
You may need to enable Developer Mode on the phone as well as USB debugging.
Ensure you have an up-to-date version of Chrome on the phone.
See the development plan.
See the development log.