Interaction in Android
This page explains how Android processes the various inputs it receives from
the keyboard, sensors, and more.
Haptics
The Android haptics subsystem comprises the hardware and software features
that create stimuli through the sense of touch. This section provides guidance
and compliance instructions for the best use of the Android haptics APIs.
Input
The Android input subsystem nominally consists of an event pipeline that
traverses multiple layers of the system. At the lowest layer, the physical input
device produces signals that describe state changes such as key presses and
touch contact points.
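As a toy illustration of that pipeline (not actual AOSP code; all class and method names below are invented for this sketch), a raw key signal can be structured into an event at a lower layer, queued, and later dispatched at a higher layer:

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Toy sketch of an input event pipeline. The names here are
// illustrative only; they are not AOSP identifiers.
public class InputPipelineSketch {
    // Driver layer: a raw signal structured into an event.
    record KeyEvent(int keyCode, boolean pressed) {}

    // Simulates the pipeline: raw signals enter at the bottom, are
    // queued as events, and are drained to a log at the top.
    static String run() {
        Queue<KeyEvent> queue = new ArrayDeque<>();
        // Lowest layer: the physical device reports state changes.
        queue.add(new KeyEvent(30, true));   // key press
        queue.add(new KeyEvent(30, false));  // key release
        // Higher layer: dispatch queued events to the "app".
        StringBuilder log = new StringBuilder();
        while (!queue.isEmpty()) {
            KeyEvent e = queue.poll();
            log.append(e.keyCode()).append(e.pressed() ? "+" : "-");
        }
        return log.toString();
    }

    public static void main(String[] args) {
        System.out.println(run()); // prints "30+30-"
    }
}
```

The real pipeline involves many more layers (kernel drivers, EventHub, InputReader, InputDispatcher); this sketch only shows the bottom-to-top direction of flow.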
Neural Networks API
The Android Neural Networks API (NNAPI) runs computationally intensive
operations for machine learning. This document provides an overview of how to
implement a Neural Networks API driver for Android 9.
Peripherals and accessories
Using a suite of standard protocols, you can implement compelling peripherals
and other accessories that extend Android capabilities in a wide range of
Android-powered devices.
Sensors
Android sensors give apps access to a mobile device's underlying
physical sensors. They are data-providing virtual devices defined by
sensors.h, the sensor Hardware Abstraction Layer (HAL).
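As a conceptual sketch of what "data-providing virtual device" means (the names below are invented for illustration and are not the real types from sensors.h), one physical hardware reading can back more than one virtual sensor:

```java
// Conceptual sketch of virtual sensors as data providers wrapping a
// physical reading. Names are invented; they do not match the real
// sensors.h HAL types.
public class VirtualSensorSketch {
    record SensorEvent(String sensorName, float value) {}

    // Virtual sensor 1: reports the raw hardware value directly.
    static SensorEvent readAccelerometer(float rawHardware) {
        return new SensorEvent("accelerometer", rawHardware);
    }

    // Virtual sensor 2: derived from the same physical reading,
    // reporting 1.0 only when the value exceeds a threshold.
    static SensorEvent readSignificantMotion(float rawHardware) {
        return new SensorEvent("significant_motion",
                rawHardware > 10.0f ? 1.0f : 0.0f);
    }

    public static void main(String[] args) {
        SensorEvent a = readAccelerometer(9.81f);
        SensorEvent m = readSignificantMotion(9.81f);
        System.out.println(a.sensorName() + "=" + a.value()); // accelerometer=9.81
        System.out.println(m.sensorName() + "=" + m.value()); // significant_motion=0.0
    }
}
```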
Context Hub Runtime Environment
Context Hub Runtime Environment (CHRE) provides a common platform for running
system-level apps on a low-power processor, with a simple, standardized,
embedded-friendly API. CHRE makes it easy for device OEMs to offload processing
from the applications processor to save battery, improve various areas of the
user experience, and enable a class of always-on, contextually aware features.
Content and code samples on this page are subject to the licenses described in the Content License. Java and OpenJDK are trademarks or registered trademarks of Oracle and/or its affiliates.
Last updated 2025-06-12 UTC.