Migrate to Camera2

This page identifies the differences between the Extended View System (EVS) and Camera2. It also describes how to set up your Camera2 implementation.

Open and close the camera

EVS

openCamera combines opening the device and configuring a single stream.

Camera2

To open and close a device with Camera2:

  1. To open a camera device, select one of these modes: CameraManager.openCamera() (Java) or ACameraManager_openCamera() (NDK).

  2. To configure streams, create a capture session with the relevant output surfaces, for example surfaces from an ImageReader or SurfaceView, with CameraDevice.createCaptureSession() (Java) or ACameraDevice_createCaptureSession() (NDK).

    Camera2 supports multiple simultaneous streams. Create multiple streams for purposes such as preview, recording, and image processing. Streams serve as parallel pipelines that process raw frames from the camera sequentially.

  3. To close a camera device, use CameraDevice.close() (Java) or ACameraDevice_close() (NDK).

Consider these sample code snippets:

Java

CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
    manager.openCamera(cameraId, new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            // Camera opened, now create session
        }
        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {}
        @Override
        public void onError(@NonNull CameraDevice camera, int error) {}
    }, handler);
} catch (CameraAccessException e) {
    // Handle exception
}

NDK

// cameraId and deviceStateCallbacks (an ACameraDevice_StateCallbacks struct)
// are assumed to be defined by the caller.
ACameraManager *cameraManager = ACameraManager_create();
ACameraDevice *cameraDevice = nullptr;
camera_status_t status = ACameraManager_openCamera(
    cameraManager, cameraId, &deviceStateCallbacks, &cameraDevice);
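
To illustrate step 2, here's a minimal Java sketch (not part of the original samples) that configures a single ImageReader stream and creates a capture session. The size, format, and handler are placeholders, and camera is the CameraDevice delivered to onOpened() above.

// Sketch: one ImageReader output stream; size and format are illustrative.
ImageReader reader = ImageReader.newInstance(1280, 720, ImageFormat.YUV_420_888, /* maxImages= */ 4);
try {
    camera.createCaptureSession(Arrays.asList(reader.getSurface()),
            new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                    // Session is ready; build and submit capture requests here.
                }
                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                    // Handle configuration failure.
                }
            }, handler);
} catch (CameraAccessException e) {
    // Handle exception
}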

Stream camera data

This section describes how to stream camera data.

EVS

On EVS:

  1. To initiate streaming, use startVideoStream.
  2. To stop streaming, use stopVideoStream.

Camera2

On Camera2 (a Java sketch follows these steps):

  1. To create a CaptureRequest suitable for preview, use TEMPLATE_PREVIEW with CameraDevice.createCaptureRequest() (Java) or ACameraDevice_createCaptureRequest() (NDK).

  2. To submit the request for continuous streaming, use CameraCaptureSession.setRepeatingRequest() (Java) or ACameraCaptureSession_setRepeatingRequestV2() (NDK).

  3. To stop streaming, use CameraCaptureSession.stopRepeating() (Java) or ACameraCaptureSession_stopRepeating() (NDK).
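
The following is a rough Java sketch of these steps; previewSurface is assumed to be a surface registered when the session was created, and session is the configured CameraCaptureSession:

try {
    // Build a preview request, stream it continuously, then stop.
    CaptureRequest.Builder builder = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    builder.addTarget(previewSurface);  // Surface registered when the session was created
    session.setRepeatingRequest(builder.build(), /* listener= */ null, handler);  // Start streaming
    // ...
    session.stopRepeating();  // Stop streaming
} catch (CameraAccessException e) {
    // Handle exception
}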

Buffer management

  • On EVS, setMaxFramesInFlight controlled the buffer count, which could be changed mid-stream. When camera streaming started, EVS furnished a buffer ID for each image frame that correlated to the same hardware buffer address in memory.

  • On Camera2, the maximum number of images for an AImageReader or ImageReader is set with AImageReader_new or ImageReader.newInstance when a session is initialized, and it can't be changed once the session has started. To get a buffer ID for each frame, clients can maintain a map that correlates the hardware buffer address, obtained from the Image object, to a unique identifier, as shown in the sketch after this list.
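
The following Java sketch illustrates this approach; HardwareBuffer.getId() (available in API level 31 and higher) stands in for the hardware buffer address, and the reader size, format, and handler are placeholders:

// The maximum image count is fixed when the reader is created.
ImageReader reader = ImageReader.newInstance(1280, 720, ImageFormat.YUV_420_888, /* maxImages= */ 4);

// Correlate the underlying hardware buffer to a client-side frame ID.
Map<Long, Integer> bufferIds = new HashMap<>();
reader.setOnImageAvailableListener(r -> {
    try (Image image = r.acquireLatestImage()) {
        if (image == null) return;
        HardwareBuffer buffer = image.getHardwareBuffer();
        int frameId = bufferIds.computeIfAbsent(buffer.getId(), id -> bufferIds.size());
        buffer.close();
        // Use frameId the way an EVS client used its buffer ID.
    }
}, handler);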

Pause and resume streaming

Camera parameters

Consider these code samples:

Java

CaptureRequest.Builder builder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
builder.set(CaptureRequest.CONTROL_EFFECT_MODE, CaptureRequest.CONTROL_EFFECT_MODE_MONO);
// Add an output target, then submit the request, for example with setRepeatingRequest()

NDK

int32_t effectMode = ACAMERA_CONTROL_EFFECT_MODE_MONO;  // Matches the Java sample above
ACaptureRequest_setEntry_i32(captureRequest, ACAMERA_CONTROL_EFFECT_MODE, 1, &effectMode);

Logical cameras

  • EVS: For logical cameras, such as Surround View, the EVS Manager opened all associated physical cameras, initiated the video streams, and provided a cohesive array of images.

  • Camera2: Apps must manage logical cameras themselves, which requires you to do the following (see the sketch after this list):

    • Identify physical sub-cameras associated with a logical camera.
    • Open each necessary physical camera.
    • Initiate streams on each camera.
    • Synchronize frames, if required. Optimally, this is handled at the HAL for hardware-level synchronization.
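
When the physical cameras are exposed behind a single Camera2 logical multi-camera device, the first and third tasks can look like the following Java sketch; logicalCameraId, surfaceForPhysicalCamera, executor, and sessionStateCallback are placeholders. If the cameras aren't grouped behind a logical device, open each physical camera individually instead.

try {
    // Discover the physical sub-cameras behind the logical camera ID.
    CameraCharacteristics characteristics = manager.getCameraCharacteristics(logicalCameraId);
    Set<String> physicalIds = characteristics.getPhysicalCameraIds();

    // Bind an output surface to one specific physical sub-camera.
    OutputConfiguration output = new OutputConfiguration(surfaceForPhysicalCamera);
    output.setPhysicalCameraId(physicalIds.iterator().next());

    // Create the session with one OutputConfiguration per stream; frames can then
    // be matched by timestamp if synchronization is required.
    SessionConfiguration sessionConfig = new SessionConfiguration(
            SessionConfiguration.SESSION_REGULAR,
            Collections.singletonList(output),
            executor,
            sessionStateCallback);
    logicalCamera.createCaptureSession(sessionConfig);
} catch (CameraAccessException e) {
    // Handle exception
}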

We'll provide a compatibility library (shim layer) to existing EVS clients to facilitate the transition, with the aim of supporting Camera2 APIs with minimal code changes.

Permissions

This section describes changes to permissions.

EVS

Access is restricted to privileged unique identifiers (UIDs), such as AID_AUTOMOTIVE_EVS. Deprecated permissions include android.car.permission.USE_CAR_EVS_CAMERA.

Camera2

Camera2 requires android.permission.CAMERA. For special cases, safety-critical camera apps must follow the Google built-in pre-grant policies provided in Design for Driving.
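
For example, a minimal Java check before opening the camera (the permission request flow itself is app-specific):

if (context.checkSelfPermission(Manifest.permission.CAMERA)
        != PackageManager.PERMISSION_GRANTED) {
    // Request android.permission.CAMERA before calling CameraManager.openCamera().
}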

Primary and secondary clients

For shared camera access:

  • EVS offered explicit APIs, setPrimaryClient and forcePrimaryClient, to manage the primary client, which had the authority to modify parameters.

  • With Camera2, when the camera is opened in shared mode (Android 16 and higher), the priority of the client accessing the camera determines the primary client. The highest-priority client (typically the foreground app) can modify capture request parameters. There are no direct APIs to force primary status; the framework manages it.

System cameras

To restrict access to a camera device to system or first-party (1P) apps only, declare the ANDROID_REQUEST_AVAILABLE_CAPABILITIES_SYSTEM_CAMERA capability in the Camera HAL for that device. Clients must hold android.permission.SYSTEM_CAMERA, in addition to android.permission.CAMERA, to connect to this camera device.
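
As a hedged Java sketch, a privileged client can confirm the capability as follows; REQUEST_AVAILABLE_CAPABILITIES_SYSTEM_CAMERA is a system API, and system cameras aren't enumerated for unprivileged clients:

try {
    CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
    int[] capabilities = characteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
    boolean isSystemCamera = false;
    for (int capability : capabilities) {
        if (capability == CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_SYSTEM_CAMERA) {
            isSystemCamera = true;  // Opening this camera also requires SYSTEM_CAMERA.
        }
    }
} catch (CameraAccessException e) {
    // Handle exception
}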

Rear view camera

EVS

EVS previously enabled camera access before Android boot, a critical capability for features such as rear view cameras. Vehicle OEMs are responsible for compliance and certification with the regulations provided in Federal Motor Vehicle Safety Standard (FMVSS) No. 111, Rear Visibility. In addition, vehicle OEMs must comply with other rear view camera regulations.

Compliance depends on the hardware, HAL implementation, and overall system integration. After Android boots on the reference platform, EVS typically needs four to six seconds to become operational and grant camera access.

Camera2

A privileged client, identified by the AID_AUTOMOTIVE_EVS UID, can use Camera2 APIs for camera access before the Android boot process completes. This early access is limited to system cameras on the exterior of the vehicle. Camera2 meets the same performance KPIs for early camera access as EVS: camera access typically becomes available within four to six seconds after Android boot.

For a consistent and unobstructed display of the rear view camera, particularly during user transitions or when other apps might obscure the preview, we recommend these guidelines when implementing a rear view camera with Camera2:

  1. Designate the rear view camera as a System Camera to restrict third-party app access.

  2. Run the service or app accessing the camera as user 0 using the CAMERA_HEADLESS_SYSTEM_USER permission. This ensures uninterrupted camera streaming, regardless of foreground user switching.

  3. Add the app to the Camera Privacy Allowlist to grant camera access even when the user-controlled camera privacy toggle is enabled.

CarEVSManager and CarEVSService

CarEVSManager previously provided Java apps with camera access. The transition to Camera2 replaces this feature with standard android.hardware.camera2.CameraManager.

We plan to deprecate CarEVSService, an optional service that monitors the GEAR_SELECTION VHAL property and starts an OEM-specified rear view camera activity. OEMs who use this feature must transition the associated logic to an OEM-owned app that does the following (a sketch follows the list):

  • Monitor the GEAR_SELECTION VHAL property.
  • Launch the rear view camera activity when the reverse gear is activated.
  • Use Camera2 APIs to display the camera feed.
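
The following is a rough Java sketch of that logic using the Car API; RearviewCameraActivity is a placeholder for the OEM activity, and the client needs the Car permission that guards GEAR_SELECTION:

// Connect to the car service and watch the gear selection property.
Car car = Car.createCar(context);
CarPropertyManager propertyManager = (CarPropertyManager) car.getCarManager(Car.PROPERTY_SERVICE);

propertyManager.registerCallback(new CarPropertyManager.CarPropertyEventCallback() {
    @Override
    public void onChangeEvent(CarPropertyValue value) {
        if ((Integer) value.getValue() == VehicleGear.GEAR_REVERSE) {
            // Launch the OEM rear view camera activity, which renders the feed with Camera2.
            context.startActivity(new Intent(context, RearviewCameraActivity.class));
        }
    }
    @Override
    public void onErrorEvent(int propertyId, int areaId) {}
}, VehiclePropertyIds.GEAR_SELECTION, CarPropertyManager.SENSOR_RATE_ONCHANGE);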

Display rendering

EVS display and automotive display service

These are deprecated.

Camera2

Use the standard Android rendering methods with Surface, android.hardware.display.DisplayManager, and android.view.Display.

For scenarios needing early camera display, the Camera2 ImageReader can provide direct access to the hardware buffer so you can integrate it with existing DRM-based display implementations for rendering.

This early camera access is permitted exclusively for privileged clients that have the AID_AUTOMOTIVE_EVS UID and is limited to system cameras located on the exterior of a vehicle.
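
The following is a minimal Java sketch of the ImageReader path described above; the usage flag assumes API level 33 or higher, and handOffToDrmCompositor() is a placeholder for the OEM's DRM-based presentation code:

// Request composer-overlay-capable buffers so frames can be posted to a DRM plane.
ImageReader reader = ImageReader.newInstance(1280, 720, ImageFormat.PRIVATE,
        /* maxImages= */ 4, HardwareBuffer.USAGE_COMPOSER_OVERLAY);

reader.setOnImageAvailableListener(r -> {
    try (Image image = r.acquireLatestImage()) {
        if (image == null) return;
        HardwareBuffer buffer = image.getHardwareBuffer();
        // Placeholder: hand the buffer to the OEM DRM-based display path.
        // In production, keep the Image/HardwareBuffer alive until the display releases it.
        handOffToDrmCompositor(buffer);
    }
}, handler);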

Emulator HAL (EVS mock HAL)

We plan to deprecate EVS Mock HAL. Instead, OEMs should use the Camera2 emulated camera HAL, hardware/google/camera/devices/EmulatedCamera/, in which we plan to support:

  • Configurable number of cameras.
  • Color bar test patterns.
  • Video file emulation.

To include this HAL in the build:

# In device.mk
PRODUCT_SOONG_NAMESPACES += hardware/google/camera/devices/EmulatedCamera
PRODUCT_PACKAGES += com.google.emulated.camera.provider.hal

Appropriate security-enhanced Linux (SELinux) policies are also required to allow cameraserver to interact with the Emulated Camera HAL service.

V4L2 UVC Camera HAL

We plan to deprecate EVS V4L2 HAL. Use Camera2 external camera support for USB cameras (UVC). To learn more, see External USB Cameras.

Ultrasonics APIs

We plan to deprecate the EVS Ultrasonics APIs. Instead, use these VHAL properties introduced in Android 15 for ultrasonic sensor detections.

Property | Type | Definition

ULTRASONICS_SENSOR_POSITION | Static | {<x>, <y>, <z>}
In millimeters, each value represents the position of the sensor along the associated axis relative to the AAOS sensor coordinate frame.

ULTRASONICS_SENSOR_ORIENTATION | Static | {<qw>, <qx>, <qy>, <qz>}
The quaternion rotation of the sensor relative to the AAOS sensor coordinate frame: $$w+xi+yj+zk$$

ULTRASONICS_SENSOR_FIELD_OF_VIEW | Static | {<horizontal>, <vertical>}
In degrees, the horizontal and vertical field of view of the sensor.

ULTRASONICS_SENSOR_DETECTION_RANGE | Static | {<minimum>, <maximum>}
In millimeters, the sensor's detection range.

ULTRASONICS_SENSOR_SUPPORTED_RANGES | Static | {<range_min_1>, <range_max_1>, <range_min_2>, <range_max_2>}
In millimeters, inclusive, an array of the sensor's supported detection ranges.

ULTRASONICS_SENSOR_MEASURED_DISTANCE | Continuous | {<distance>, <distance_error>}
In millimeters, the sensor's measured distance and distance error. If only a range is supported, this is the minimum distance in the detected range.