Multi-Zone Audio in Android 9

Although Android 9 does not support multi-zone audio, the Android audio team explored several approaches to it. This section describes those approaches, which might be helpful to system implementers building rear-seat entertainment (RSE) solutions on Android 9.

Use cases

  • Radio in rear seats plays simultaneously with different media sources in front seats.
  • Front seat passenger hears a different media source than the driver (e.g. passenger plays a game on their own screen while driver views navigation on the main screen).
  • Four different, independent audio zones: Driver, front seat passenger, rear seat 1, rear seat 2.


Limitations

Android 9 doesn't support multiple audio stacks (zones) or different priorities natively due to the following limitations:

  • Android 9 does not provide APIs that enable applications to target a specific zone. Instead, applications must target an audio type (media, announcement, and so on) selected from a pre-defined set provided by Android. For example, Android 9 does not support defining the audio type as media for target zone 2.
  • Physical Streams (provided by AudioFlinger, the internal mixer) do not transport Context information (such as the tagging within Logical Streams) after mixing, which prevents the Audio HAL from routing specific Logical Streams to different zones.
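To illustrate the first limitation, the closest an application can get with the public Android 9 APIs is declaring the *type* of audio it plays through AudioAttributes; there is no way to express a target zone. A minimal sketch (the class name is illustrative):

```java
import android.media.AudioAttributes;
import android.media.MediaPlayer;

public final class MediaPlaybackExample {
    // An application can only declare the type of audio it produces
    // (media, in this case); no API field exists for a target zone.
    public static MediaPlayer createMediaPlayer() {
        AudioAttributes attributes = new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                .build();
        MediaPlayer player = new MediaPlayer();
        player.setAudioAttributes(attributes);
        // There is no builder method such as setZone(2); routing the
        // resulting Physical Stream to a zone must happen below the HAL.
        return player;
    }
}
```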

Scenario: Use multiple instances

This scenario uses multiple instances of Android Automotive to achieve multi-zone audio.

  • Each zone has its own Android Automotive instance that independently manages zone content. Hardware below the HAL combines and coordinates the output of multiple instances.
  • Instances can exist on distinct hardware (for example, tablets in the rear seats) or share physical hardware via a hypervisor.
  • Output is either statically assigned to vehicle speakers using a single primary zone, or assigned dynamically below the HAL.
  • First-party apps (installed in every instance) collaborate via a proprietary protocol to coordinate and route sounds to specific zones. Alternatively, Chromecast functionality can be used to communicate across different instances and even devices.

Scenario: Target secondary zones

This scenario uses first-party applications to explicitly target secondary zones (which are ignored by Android).

  • OEM defines additional output audio device ports in audio_policy_configuration.xml.
  • First-party applications that implicitly know the vehicle configuration can enumerate the available output ports and explicitly target any one of them using the AudioTrack.setPreferredDevice() API.
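The second bullet could be sketched as follows: assuming the OEM has defined an additional bus output device port in audio_policy_configuration.xml, a bundled app enumerates the output devices and pins its AudioTrack to the one matching a known address. The bus address string and class name are assumptions for illustration.

```java
import android.media.AudioDeviceInfo;
import android.media.AudioManager;
import android.media.AudioTrack;

public final class RearZoneRouter {
    // Attempts to route the given track to the OEM-defined output bus
    // for a secondary zone. The address "bus100_rear_seat" is hypothetical;
    // a first-party app would know the real value from the vehicle config.
    public static boolean routeToRearZone(AudioManager audioManager,
            AudioTrack track) {
        AudioDeviceInfo[] outputs =
                audioManager.getDevices(AudioManager.GET_DEVICES_OUTPUTS);
        for (AudioDeviceInfo device : outputs) {
            if (device.getType() == AudioDeviceInfo.TYPE_BUS
                    && "bus100_rear_seat".equals(device.getAddress())) {
                return track.setPreferredDevice(device);
            }
        }
        return false; // No matching port; fall back to default routing.
    }
}
```

Because the zone-to-address mapping lives only in the app, this approach is limited to first-party software built for a specific vehicle configuration.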

Scenario: Use audio policy rules

This scenario uses audio policy rules to dynamically add route-specific UIDs to additional audio devices.

  • The audio routing engine defines routing rules based on the UID of the requesting application.
  • A system-level service or launcher adds rules to send the output of a specific application (UID) to a specific device associated with a secondary zone.
  • These specific devices are defined in addition to those provided for routing of the predefined audio contexts.