Exploring New Haptics Features in Android 12
A deep dive into key haptics developments in Android 12
Google has released the third developer preview of Android 12, and it appears that this new version of the OS will offer some important enhancements for developers who want to deliver higher-quality haptics on Android devices and game controllers. Although the source code is not yet accessible and there are still a few months until the actual release, the developer preview provides API documentation for new classes, along with information on other changes, that offers insight into what Android 12 will deliver.
Let’s take a look at some of the key haptic-related features expected with Android 12 and examine how they might help developers who want to add or augment the haptic feedback delivered with their apps and games.
Audio-coupled haptics with the HapticGenerator
The biggest haptic-related change in Android 12 is the addition of what Google calls an “audio-coupled haptic effect.” At Lofelt, we would describe this capability as real-time audio-to-haptics conversion. In Android 12, a new class, HapticGenerator (HG), converts any audio stream that has an audio session ID into haptic data on the fly.
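In practice, attaching HG to an audio session should only take a few lines. Here is a minimal sketch based on the preview API documentation, assuming a MediaPlayer as the audio source (details may change before the final release):

```kotlin
import android.media.MediaPlayer
import android.media.audiofx.HapticGenerator

// Attach a HapticGenerator to a MediaPlayer's audio session so its
// audio output is converted to haptic data on the fly (API level 31+).
// Returns null if the device doesn't support the effect. The caller
// should call release() on the generator when playback is done.
fun enableAudioCoupledHaptics(player: MediaPlayer): HapticGenerator? {
    if (!HapticGenerator.isAvailable()) return null
    val generator = HapticGenerator.create(player.audioSessionId)
    generator.setEnabled(true)
    return generator
}
```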
Unlike the standard Android Vibrator API, HG can control the frequency of the vibration. The Vibrator API is limited to changing only the amplitude of a vibration waveform. The ability to change the frequency allows for much more nuanced vibrotactile feedback that is tightly coupled with audio content.
Behind the scenes, it’s likely that HG converts the audio PCM stream to a haptic PCM stream, which is then sent directly to the haptic driver IC and actuator. If that is the case, it operates similarly to an already-existing and under-documented Android feature, which uses Ogg files to create haptic patterns.
The new HG API also allows for very good sync between audio and haptics. Since the haptic stream is created by post-processing the audio stream, the timing of both streams should match. In other words, the haptic stream will likely reach the actuator at the same time the audio stream reaches the speaker. That should be the case as long as the post-processing algorithms don’t introduce noticeable latency due to buffering or some other issue.
While HG aims to address some limitations of the Android Vibrator API with real-time haptic rendering and audio-haptic sync, it has two inherent shortcomings that will limit its adoption.
First, HG needs to be supported by the smartphone. Of the Google phones, only the Pixel 4 and Pixel 4 XL are supported. The Pixel 5 and Pixel 4a are not. The Pixel 4 and Pixel 4 XL are the same smartphones that support the previously mentioned Ogg playback, which may be because only these phones can send PCM streams of haptic data to the actuator. Until more smartphones are supported, HG will not be universally useful.
Second, HG does not provide playback of pre-authored haptic clips; it only takes audio as input. For that, developers would need to use the Android Vibrator API, which has its own significant limitations. For example, it does not support frequency control, audio-haptic sync, or streaming.
Fixed timing bug in Android Vibrator Waveform API
For optimal vibrotactile feedback, a haptic clip should play exactly simultaneously with its matching audio clip. Even an offset as small as 12 ms between the audio and haptics is noticeable.
While working on our Lofelt SDK for Android, we noticed a problem with the sync between the audio and haptics: The haptic clips would stretch and play longer than they were supposed to, becoming more and more out of sync with the audio over time.
To explore this issue, the Lofelt SDK team recorded a vibration by attaching a contact microphone to a smartphone. The haptic clip was designed to last 10 seconds, but it played for longer than 11 seconds. The motor also turned off for about half a second toward the end, producing a glitch in the playback.
We determined the likely root cause by looking at the Android source code and reported the bug to the Android bug tracker, and Google fixed it quickly. As a result, Android 12 should play back waveforms through the Android Vibrator API with the correct timing, without turning off the motor toward the end of the waveform.
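For illustration, here is a sketch of the kind of waveform playback the bug affected, using the long-standing VibrationEffect.createWaveform() API (the 10-second clip below is a made-up example, not the clip from our recording):

```kotlin
import android.content.Context
import android.os.VibrationEffect
import android.os.Vibrator

// Play a 10-second amplitude-modulated waveform. On Android 11 and
// earlier, the timing bug could stretch such a clip past its intended
// duration and briefly turn the motor off near the end.
fun playTenSecondWaveform(context: Context) {
    val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    val timings = LongArray(100) { 100L }                        // 100 segments x 100 ms = 10 s
    val amplitudes = IntArray(100) { i -> ((i % 10) + 1) * 25 }  // ramps of 25..250 (max 255)
    vibrator.vibrate(VibrationEffect.createWaveform(timings, amplitudes, -1))  // -1 = no repeat
}
```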
While the bug fix improves audio-haptic sync, the Android Vibrator API still leaves much to be desired in the area of synchronization. Sync requires starting the audio and haptics at the same time, as well as maintaining that sync until the end of the clip. Unfortunately, the Vibrator API offers no way to start a vibration at the same time as an audio stream. Triggering the API calls to start the audio and haptics at the same time is not enough: the audio doesn't start immediately, due to the latency caused by buffering and other effects.
In addition, Android uses the system clock when playing back the vibration waveform. Although this is much less of an issue now that the bug is fixed, the vibration waveform can still drift slowly out of sync with the audio stream. As mentioned in a previous blog post, syncing haptics to audio is one area in which Android requires additional work.
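As a concrete illustration of that point, a naive approach might simply trigger both playback calls back to back; the vibration starts almost immediately, while the audio is delayed by output buffering (MediaPlayer is just an assumed audio source here):

```kotlin
import android.media.MediaPlayer
import android.os.VibrationEffect
import android.os.Vibrator

// Naive "sync": trigger audio and haptics back to back. The vibration
// starts almost immediately, while the audio is delayed by output
// buffering, so the two streams begin out of sync.
fun playNaivelySynced(player: MediaPlayer, vibrator: Vibrator, effect: VibrationEffect) {
    player.start()
    vibrator.vibrate(effect)
}
```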
Improved support for haptics on game controllers
Android has supported very basic haptics on game controllers since Android 4.1 (“Jelly Bean”), which was released in 2012. Using the InputDevice::getVibrator() API, it is possible to obtain a Vibrator for a game controller, which can then be used to play simple buzzes or waveforms. However, effects or primitives are not supported.
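As a sketch of that pre-Android 12 path, a game might rumble the first connected controller like this (device discovery is simplified here):

```kotlin
import android.os.VibrationEffect
import android.view.InputDevice

// Find the first input device that reports a vibrator and play a
// simple one-shot buzz on it (the only kind of effect controllers
// supported before Android 12).
fun rumbleFirstController(durationMs: Long) {
    for (deviceId in InputDevice.getDeviceIds()) {
        val device = InputDevice.getDevice(deviceId) ?: continue
        if (device.vibrator.hasVibrator()) {
            device.vibrator.vibrate(
                VibrationEffect.createOneShot(durationMs, VibrationEffect.DEFAULT_AMPLITUDE)
            )
            return
        }
    }
}
```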
Android 12 will introduce a few new features that will improve haptics on game controllers.
Support for game controllers with multiple actuators
The existing InputDevice::getVibrator() API limits the functionality to one Vibrator per game controller, but many modern game controllers, such as the Sony DualShock, have multiple haptic actuators. With Android 12, these actuators can now be controlled independently by using the new VibratorManager, which can be accessed with InputDevice::getVibratorManager().
VibratorManager enables developers to list and access all of the device's Vibrators. As a result, developers can control each actuator independently or trigger the same vibration on all actuators simultaneously.
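Based on the preview API documentation, per-actuator control might look like the following sketch; the new CombinedVibration class is used to target individual actuators (details may change before release):

```kotlin
import android.os.CombinedVibration
import android.os.VibrationEffect
import android.view.InputDevice

// Drive the actuators of a controller through the new per-device
// VibratorManager (Android 12 preview API).
fun rumbleActuators(device: InputDevice) {
    val manager = device.vibratorManager
    val ids = manager.vibratorIds
    if (ids.size >= 2) {
        // Different intensities on two actuators, played in parallel.
        val combined = CombinedVibration.startParallel()
            .addVibrator(ids[0], VibrationEffect.createOneShot(200, 255))
            .addVibrator(ids[1], VibrationEffect.createOneShot(200, 80))
            .combine()
        manager.vibrate(combined)
    } else {
        // Same effect on every actuator at once.
        manager.vibrate(
            CombinedVibration.createParallel(
                VibrationEffect.createOneShot(200, VibrationEffect.DEFAULT_AMPLITUDE)
            )
        )
    }
}
```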
Support for changing vibration intensity
In Android 11 and earlier, amplitude control for waveforms played on game controllers is not supported, limiting controller haptics to a single intensity. A new patch set adds amplitude control, but it has not yet been merged, so it's unclear whether this capability will make it into Android 12.
Automatic redirection of vibrations to the game controller
While the Android OS has supported haptics on game controllers since 2012, it required app developers to explicitly code the haptics using InputDevice::getVibrator(). Many app developers probably never bothered to support game controllers and focused only on vibrating smartphones. Now Android can automatically redirect smartphone vibrations to a game controller when one is connected.
See also this excellent article on XDA Developers.
Better support for the DualSense controller
A recently merged set of patches from a Sony employee adds Android support for the Sony PlayStation 5 DualSense controller, including support for haptics. An app can now control the left and right actuators independently, though only the vibration intensity can be changed. The controller itself also supports changing the vibration frequency, using a PCM stream as the input for the vibration pattern; the driver, however, does not support the PCM stream, due to the limitations of the Linux force feedback API. The adaptive triggers are not supported yet either.
Conclusion
From what we can tell so far, Android 12 introduces some important new features that could help developers deliver better haptics for Android smartphones and game controllers. These features definitely take a step in the right direction and suggest that Google is taking haptics seriously.
However, a disjointed, piecemeal approach to haptics could make developers’ lives more complicated, especially for those who are creating cross-platform apps and games. The Lofelt SDK can help developers make the most of the Android interfaces. In addition, the Lofelt VTX haptic framework that will be available with the Qualcomm Snapdragon mobile platform should help many developers overcome Android limitations, enabling them to quickly and easily develop haptics-rich games and apps.