Android sensors give applications access to a mobile device's underlying base sensors: accelerometer, gyroscope, and magnetometer. Manufacturers develop the drivers that define additional composite sensor types from those base sensors. For instance, Android offers both calibrated and uncalibrated gyroscopes, as well as a geomagnetic rotation vector and a game rotation vector. This variety gives developers flexibility in tuning applications for battery life and accuracy.
The Sensors Hardware Abstraction Layer (HAL) API is the interface between the hardware drivers and the Android framework; the Sensors Software Development Kit (SDK) API is the interface between the Android framework and the Java applications. Note that the Sensors HAL API described in this documentation is not identical to the Sensors SDK API described on developer.android.com. For example, some sensors that are deprecated in the SDK may still exist in the HAL, and vice versa.
Similarly, audio recorders, Global Positioning System (GPS) devices, and accessory (pluggable) sensors are not supported by the Android Sensors HAL API described here. This API covers only sensors that are physically part of the device. See the Audio, Location Strategies, and Accessories sections for information on those devices.
At the application framework level is the app code, which utilizes the android.hardware APIs to interact with the sensors hardware. Internally, this code calls corresponding JNI glue classes to access the native code that interacts with the sensors hardware.
The JNI code associated with android.hardware is located in the frameworks/base/core/jni/ directory. This code calls the lower level native code to obtain access to the sensor hardware.
The native framework is defined in frameworks/native/ and provides a native equivalent to the android.hardware package. The native framework calls the Binder IPC proxies to obtain access to sensor-specific services.
The Binder IPC proxies facilitate communication over process boundaries.
The Hardware Abstraction Layer (HAL) defines the standard interface that sensor services call into and that you must implement to have your sensor hardware function correctly. The sensor HAL interfaces are located in hardware/libhardware/include/hardware.
The sensors driver interacts with the hardware and your implementation of the HAL. The HAL is driver-agnostic.
Sensor axis definition
The sensor event values are expressed in a specific frame that is static relative to the device. This API is relative only to the NATURAL orientation of the screen. In other words:
- The axes are not swapped when the device's screen orientation changes.
- Higher-level services may perform this transformation.
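As an illustration (this helper is not part of the HAL), a higher-level service could remap natural-orientation event values for a 90-degree screen rotation along these lines. The struct, function name, and sign convention here are assumptions for the sketch; the real HAL reports events in `sensors_event_t` and never performs this swap itself:

```c
#include <assert.h>

/* Hypothetical event-vector type; the real HAL uses sensors_event_t. */
typedef struct { float x, y, z; } sensor_vec_t;

/* Remap natural-orientation axes for a screen rotated 90 degrees.
   The sign convention is illustrative; the HAL itself never swaps axes. */
static sensor_vec_t remap_for_rotation_90(sensor_vec_t v) {
    sensor_vec_t out = { -v.y, v.x, v.z }; /* z (out of the screen) is unchanged */
    return out;
}
```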
The sensors included by the manufacturer must be accurate and precise to meet the expectations of application developers. The sensors included in Android devices are tested for sensor interaction and accuracy as part of the Android Compatibility program starting in the Android 4.4 release. Testing will continue to be improved in future releases. See the Sensors section of the Android Compatibility Definition Document (CDD) for the exact requirements.
Some defined sensor types are higher power than others; others are lower power by design and should be implemented as such, with their processing done in the hardware. This means they should not require the application processor to be running. Low-power sensor types are accompanied by a low-power icon in the Sensor summary table.
These sensor types cannot be implemented at high power as their primary benefit is low battery use. It is better to not implement a low-power sensor at all rather than implement it as high power.
Composite low-power sensor types, such as the step detector, must have their processing conducted in the hardware; power use is much lower than if done in software. Power use is low on small microprocessors and lower still on application-specific integrated circuits (ASICs). A hardware implementation of composite sensor types can also make use of more raw sensor data and better synchronization between sensors.
HAL release cycle
Functionality is tied to versions of the API, and Android supports only the two latest versions of the Sensors HAL API at a time. For instance, if version 1.0 was the latest and version 1.1 is released, versions prior to 1.0 are no longer supported upon that release.
Android sensors must work independently of one another. Activating one sensor shall not deactivate another sensor. Activating one shall not reduce the rate of another. This is a key element of compatibility testing.
Interaction with suspend mode
Unless otherwise noted, an enabled sensor shall not prevent the system on a chip (SoC) from going into suspend mode. It is the responsibility of applications to keep a partial wake lock should they wish to receive sensor events while the screen is off. While in suspend mode, and unless otherwise noted (batch mode and sensor particularities), enabled sensors' events are lost.
Note that conceptually, the sensor itself is not deactivated while in suspend mode. Instead, the data it returns is missing. The oldest data is dropped to accommodate the latest data. As soon as the SoC gets out of suspend mode, operations resume as normal.
Most applications should either hold a wake lock to ensure the system doesn't go to suspend, or unregister from the sensors when they do not need them, unless batch mode is active. When batching, sensors must continue to fill their internal FIFO. (See the documentation of batch mode to learn how suspend interacts with batch mode.)
Wake-up sensors are a notable exception to the above. Wake-up sensors must wake up the SoC to deliver events. They must still let the SoC go into suspend mode, but must also wake it up when an event is triggered.
Sensor fusion and virtual sensors
Many composite sensor types are or can be implemented as virtual sensors from underlying base sensors on the device. Examples of composite sensor types include the rotation vector sensor, orientation sensor, step detector, and step counter.
From the point of view of this API, these virtual sensors must appear as real, individual sensors. It is the responsibility of the driver and HAL to make sure this is the case.
In particular, all sensors must be able to function concurrently. For example, if defining both an accelerometer and a step counter, then both must be able to work concurrently.
These are the common sensor calls expected at the HAL level:
- getSensorList() - Gets the list of all sensors.
- activate() - Starts or stops the specified sensor.
- batch() - Sets parameters to group event data collection and optimize power use.
- setDelay() - Sets the event's period in nanoseconds for a given sensor.
- flush() - Adds an event to the end of the "batch mode" FIFO for the specified sensor and flushes the FIFO.
- poll() - Returns an array of sensor data.
Note that the implementation must be thread safe and allow these functions to be called from different threads.
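To make the call surface concrete, here is a minimal sketch of how a HAL might wire these entry points into a function-pointer table. The struct layout and names are simplified assumptions for illustration, not the exact definitions from hardware/sensors.h:

```c
#include <assert.h>
#include <stdint.h>

/* Simplified stand-in for the real sensors_poll_device_t. */
struct poll_device_sketch {
    int (*activate)(struct poll_device_sketch *dev, int handle, int enabled);
    int (*setDelay)(struct poll_device_sketch *dev, int handle, int64_t period_ns);
    /* batch(), flush(), and poll() would be wired up the same way. */
};

/* Stub entry points; a real driver would talk to the hardware here.
   Each returns 0 on success or a negative errno code. */
static int stub_activate(struct poll_device_sketch *dev, int handle, int enabled) {
    (void)dev; (void)handle; (void)enabled;
    return 0;
}

static int stub_set_delay(struct poll_device_sketch *dev, int handle, int64_t period_ns) {
    (void)dev; (void)handle; (void)period_ns;
    return 0;
}

static struct poll_device_sketch g_device = { stub_activate, stub_set_delay };
```

The sensor service then invokes these through the table, e.g. `g_device.activate(&g_device, handle, 1)`. Because the service may call in from different threads, a real implementation would guard any shared state with a mutex.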
Provides the list of sensors implemented by the HAL.
Developers may then make multiple calls to get sensors of different types or use Sensor.TYPE_ALL to get all the sensors. See getSensorList() defined on developer.android.com for more details.
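A HAL commonly exposes its sensor list as a static table. The struct fields, sensor entries, and function name below are simplified assumptions for this sketch (the real element type is sensor_t):

```c
#include <assert.h>

/* Hypothetical, trimmed-down version of sensor_t. */
typedef struct {
    const char *name;
    int handle; /* unique, greater than SENSORS_HANDLE_BASE */
    int type;
} sensor_info_t;

/* Example entries; a real HAL lists the sensors its hardware provides. */
static const sensor_info_t kSensorList[] = {
    { "Example 3-axis Accelerometer", 1, 1 /* accelerometer */ },
    { "Example Gyroscope",            2, 4 /* gyroscope */     },
};

/* Points *list at the static table and returns the number of sensors. */
static int get_sensors_list_sketch(const sensor_info_t **list) {
    *list = kSensorList;
    return (int)(sizeof(kSensorList) / sizeof(kSensorList[0]));
}
```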
int (*activate)(struct sensors_poll_device_t *dev, int handle, int enabled);
Activates or deactivates the sensor with the specified handle. Handles must be higher than SENSORS_HANDLE_BASE and must be unique. A handle identifies a given sensor. The handle is used to activate and/or deactivate sensors. In this version of the API, there can only be 256 handles.
The handle is the handle of the sensor to change. The enabled argument is set to 1 to enable or 0 to disable the sensor.
Unless otherwise noted in the individual sensor type descriptions, an activated sensor never prevents the SoC from going into suspend mode; that is, the HAL shall not hold a partial wake lock on behalf of applications.
One-shot sensors deactivate themselves automatically upon receiving an event, and they must still accept being deactivated through a call to activate(..., ..., 0).
If "enabled" is 1 and the sensor is already activated, or "enabled" is 0 and the sensor is already deactivated, this function is a no-op and succeeds. It returns 0 on success and a negative errno code otherwise.
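The idempotent enable/disable contract can be sketched as follows; the bookkeeping array and function name are assumptions for illustration, not the real driver logic:

```c
#include <assert.h>
#include <errno.h>
#include <stdbool.h>

#define MAX_HANDLES 256 /* this API version allows only 256 handles */

static bool g_active[MAX_HANDLES];

/* Enable (enabled == 1) or disable (enabled == 0) a sensor.
   Activating an already-active sensor, or deactivating an
   already-inactive one, is a successful no-op. */
static int activate_sketch(int handle, int enabled) {
    if (handle < 0 || handle >= MAX_HANDLES)
        return -EINVAL; /* negative errno on failure */
    g_active[handle] = (enabled != 0);
    return 0;
}
```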
batch(sensor, batching parameters)
int (*batch)(struct sensors_poll_device_1* dev, int handle, int flags, int64_t period_ns, int64_t timeout);
Sets parameters to group event data collection and reduce power use. Batching can enable significant power savings by allowing the application processor to sleep rather than awake for each notification. Instead, these notifications can be grouped and processed together. See the Batching section for details.
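As a rough sketch of the parameter contract (argument validation only; a real driver would reprogram the hardware FIFO), under the assumption that a timeout of 0 requests immediate delivery:

```c
#include <assert.h>
#include <errno.h>
#include <stdint.h>

/* period_ns is the requested sampling period; timeout_ns is the maximum
   time events may be buffered before delivery. A timeout of 0 is taken
   here to mean events are delivered as they arrive (no batching). */
static int batch_sketch(int handle, int flags, int64_t period_ns, int64_t timeout_ns) {
    (void)flags;
    if (handle < 0 || period_ns < 0 || timeout_ns < 0)
        return -EINVAL;
    /* A real implementation would reconfigure the sensor's hardware
       FIFO here without losing already-buffered events. */
    return 0;
}
```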
int (*setDelay)(struct sensors_poll_device_t *dev, int handle, int64_t period_ns);
Sets the event's period in nanoseconds for a given sensor. What the period_ns parameter means depends on the specified sensor's trigger mode:
- Continuous: setDelay() sets the sampling rate.
- On-change: setDelay() limits the delivery rate of events.
- One-shot: setDelay() is ignored and has no effect.
- Special: See specific sensor type descriptions.
For continuous and on-change sensors, if the requested value is less than sensor_t::minDelay, then it is silently clamped to sensor_t::minDelay unless sensor_t::minDelay is 0, in which case it is clamped to >= 1 ms (1,000,000 ns).
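The clamping rule can be sketched as below. Note this is an illustration only: the real sensor_t::minDelay field is expressed in microseconds, while this sketch works in nanoseconds throughout for simplicity, and the function name is an assumption:

```c
#include <assert.h>
#include <stdint.h>

#define NS_PER_MS 1000000LL

/* Clamp a requested sampling period to the sensor's minimum delay.
   A minimum of 0 falls back to a 1 ms floor, per the rule above. */
static int64_t clamp_period_ns(int64_t requested_ns, int64_t min_delay_ns) {
    int64_t floor_ns = (min_delay_ns == 0) ? NS_PER_MS : min_delay_ns;
    return (requested_ns < floor_ns) ? floor_ns : requested_ns;
}
```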