camera3.h File Reference
#include <system/camera_metadata.h>
#include "camera_common.h"


Data Structures

struct  camera3_stream
struct  camera3_stream_configuration
struct  camera3_stream_buffer
struct  camera3_stream_buffer_set
struct  camera3_jpeg_blob
struct  camera3_error_msg
struct  camera3_shutter_msg
struct  camera3_notify_msg
struct  camera3_capture_request
struct  camera3_capture_result
struct  camera3_callback_ops
struct  camera3_device_ops
struct  camera3_device


Typedefs

typedef enum camera3_stream_type camera3_stream_type_t
typedef struct camera3_stream camera3_stream_t
typedef struct camera3_stream_configuration camera3_stream_configuration_t
typedef enum camera3_buffer_status camera3_buffer_status_t
typedef struct camera3_stream_buffer camera3_stream_buffer_t
typedef struct camera3_stream_buffer_set camera3_stream_buffer_set_t
typedef struct camera3_jpeg_blob camera3_jpeg_blob_t
typedef enum camera3_msg_type camera3_msg_type_t
typedef enum camera3_error_msg_code camera3_error_msg_code_t
typedef struct camera3_error_msg camera3_error_msg_t
typedef struct camera3_shutter_msg camera3_shutter_msg_t
typedef struct camera3_notify_msg camera3_notify_msg_t
typedef enum camera3_request_template camera3_request_template_t
typedef struct camera3_capture_request camera3_capture_request_t
typedef struct camera3_capture_result camera3_capture_result_t
typedef struct camera3_callback_ops camera3_callback_ops_t
typedef struct camera3_device_ops camera3_device_ops_t
typedef struct camera3_device camera3_device_t


Enumerations

enum  camera3_buffer_status { CAMERA3_BUFFER_STATUS_OK = 0, CAMERA3_BUFFER_STATUS_ERROR = 1 }
enum  { CAMERA3_JPEG_BLOB_ID = 0x00FF }
enum  camera3_msg_type { CAMERA3_MSG_ERROR = 1, CAMERA3_MSG_SHUTTER = 2, CAMERA3_NUM_MESSAGES }
enum  camera3_error_msg_code { CAMERA3_MSG_ERROR_DEVICE = 1, CAMERA3_MSG_ERROR_REQUEST = 2, CAMERA3_MSG_ERROR_RESULT = 3, CAMERA3_MSG_ERROR_BUFFER = 4, CAMERA3_MSG_NUM_ERRORS }
enum  camera3_request_template { CAMERA3_TEMPLATE_PREVIEW = 1, CAMERA3_TEMPLATE_STILL_CAPTURE = 2, CAMERA3_TEMPLATE_VIDEO_RECORD = 3, CAMERA3_TEMPLATE_VIDEO_SNAPSHOT = 4, CAMERA3_TEMPLATE_ZERO_SHUTTER_LAG = 5, CAMERA3_TEMPLATE_MANUAL = 6, CAMERA3_VENDOR_TEMPLATE_START = 0x40000000 }

Typedef Documentation


camera3_buffer_status_t

The current status of a single stream buffer.


camera3_capture_request_t

A single request for image capture/buffer reprocessing, sent to the Camera HAL device by the framework in process_capture_request().

The request contains the settings to be used for this capture, and the set of output buffers to write the resulting image data in. It may optionally contain an input buffer, in which case the request is for reprocessing that input buffer instead of capturing a new image with the camera sensor. The capture is identified by the frame_number.

In response, the camera HAL device must send a camera3_capture_result structure asynchronously to the framework, using the process_capture_result() callback.


camera3_capture_result_t

The result of a single capture/reprocess by the camera HAL device. This is sent to the framework asynchronously with process_capture_result(), in response to a single capture request sent to the HAL with process_capture_request(). Multiple process_capture_result() calls may be performed by the HAL for each request.

Each call, all with the same frame number, may contain some subset of the output buffers, and/or the result metadata. The metadata may only be provided once for a given frame number; all other calls must set the result metadata to NULL.

The result structure contains the output metadata from this capture, and the set of output buffers that have been/will be filled for this capture. Each output buffer may come with a release sync fence that the framework will wait on before reading, in case the buffer has not yet been filled by the HAL.


In CAMERA_DEVICE_API_VERSION_3_2 or newer, the metadata may instead be provided multiple times for a single frame number. The framework will accumulate the final result set by combining each partial result into the total result set.

If an input buffer is given in a request, the HAL must return it in one of the process_capture_result calls, and the call may be to just return the input buffer, without metadata and output buffers; the sync fences must be handled the same way they are done for output buffers.

Performance considerations:

Applications will also receive these partial results immediately, so sending partial results is a highly recommended performance optimization: results that are known early in the pipeline can be delivered without waiting out the total pipeline latency.

A typical use case might be calculating the AF state halfway through the pipeline; by sending the state back to the framework immediately, the latency for that result is roughly halved, improving the perceived responsiveness of auto-focus.

camera3_error_msg_code_t

Defined error codes for CAMERA3_MSG_ERROR


camera3_error_msg_t

Message contents for CAMERA3_MSG_ERROR


camera3_jpeg_blob_t

Transport header for compressed JPEG buffers in output streams.

To capture JPEG images, a stream is created using the pixel format HAL_PIXEL_FORMAT_BLOB. The buffer size for the stream is calculated by the framework, based on the static metadata field android.jpeg.maxSize. Since compressed JPEG images are of variable size, the HAL needs to include the final size of the compressed image using this structure inside the output stream buffer. The JPEG blob ID field must be set to CAMERA3_JPEG_BLOB_ID.

The transport header should be at the end of the JPEG output stream buffer. That means the jpeg_blob_id must start at byte[buffer_size - sizeof(camera3_jpeg_blob)], where buffer_size is the size of the gralloc buffer. Any HAL using this transport header must account for it in android.jpeg.maxSize. The JPEG data itself starts at the beginning of the buffer and should be jpeg_size bytes long.


camera3_msg_type_t

Indicates the type of message sent, which specifies which member of the message union is valid.


camera3_notify_msg_t

The message structure sent to camera3_callback_ops_t.notify()


camera3_request_template_t

Available template types for camera3_device_ops.construct_default_request_settings()


camera3_shutter_msg_t

Message contents for CAMERA3_MSG_SHUTTER


camera3_stream_buffer_set_t

The complete set of gralloc buffers for a stream. This structure is given to register_stream_buffers() to allow the camera HAL device to register/map/etc. newly allocated stream buffers.


Deprecated (and not used). In particular, register_stream_buffers is also deprecated and will never be invoked.


camera3_stream_buffer_t

A single buffer from a camera3 stream. It includes a handle to its parent stream, the handle to the gralloc buffer itself, and its sync fences.

The buffer does not specify whether it is to be used for input or output; that is determined by its parent stream type and how the buffer is passed to the HAL device.


camera3_stream_configuration_t

A structure of stream definitions, used by configure_streams(). This structure defines all the output streams and the reprocessing input stream for the current camera use case.


camera3_stream_t

A handle to a single camera input or output stream. A stream is defined by the framework by its buffer resolution and format, and additionally by the HAL with the gralloc usage flags and the maximum in-flight buffer count.

The stream structures are owned by the framework, but pointers to a camera3_stream passed into the HAL by configure_streams() are valid until the end of the first subsequent configure_streams() call that does not include that camera3_stream as an argument, or until the end of the close() call.

All camera3_stream framework-controlled members are immutable once the camera3_stream is passed into configure_streams(). The HAL may only change the HAL-controlled parameters during a configure_streams() call, except for the contents of the private pointer.

If a configure_streams() call returns a non-fatal error, all active streams remain valid as if configure_streams() had not been called.

The endpoint of the stream is not visible to the camera HAL device. In DEVICE_API_VERSION_3_1, this was changed to share consumer usage flags on streams where the camera is a producer (OUTPUT and BIDIRECTIONAL stream types); see the usage field below.


camera3_stream_type_t

The type of the camera stream, which defines whether the camera HAL device is the producer or the consumer for that stream, and how the buffers of the stream relate to the other streams.

Enumeration Type Documentation

anonymous enum

Definition at line 1554 of file camera3.h.


enum camera3_buffer_status

The current status of a single stream buffer.

CAMERA3_BUFFER_STATUS_OK
The buffer is in a normal state, and can be used after waiting on its sync fence.

CAMERA3_BUFFER_STATUS_ERROR
The buffer does not contain valid data, and the data in it should not be used. The sync fence must still be waited on before reusing the buffer.

Definition at line 1394 of file camera3.h.

enum camera3_error_msg_code

Defined error codes for CAMERA3_MSG_ERROR

CAMERA3_MSG_ERROR_DEVICE
A serious failure occurred. No further frames or buffer streams will be produced by the device. The device should be treated as closed; the client must reopen the device to use it again. The frame_number field is unused.

CAMERA3_MSG_ERROR_REQUEST
An error has occurred in processing a request. No output (metadata or buffers) will be produced for this request. The frame_number field specifies which request has been dropped. Subsequent requests are unaffected, and the device remains operational.

CAMERA3_MSG_ERROR_RESULT
An error has occurred in producing an output result metadata buffer for a request, but output stream buffers for it will still be available. Subsequent requests are unaffected, and the device remains operational. The frame_number field specifies the request for which result metadata won't be available.

CAMERA3_MSG_ERROR_BUFFER
An error has occurred in placing an output buffer into a stream for a request. The frame metadata and other buffers may still be available. Subsequent requests are unaffected, and the device remains operational. The frame_number field specifies the request for which the buffer was dropped, and error_stream contains a pointer to the stream that dropped the frame.

CAMERA3_MSG_NUM_ERRORS
Number of error types

Definition at line 1598 of file camera3.h.


enum camera3_msg_type

Indicates the type of message sent, which specifies which member of the message union is valid.

CAMERA3_MSG_ERROR
An error has occurred. camera3_notify_msg.message.error contains the error information.

CAMERA3_MSG_SHUTTER
The exposure of a given request has begun. camera3_notify_msg.message.shutter contains the information for the capture.

CAMERA3_NUM_MESSAGES
Number of framework message types

Definition at line 1574 of file camera3.h.


enum camera3_request_template

Available template types for camera3_device_ops.construct_default_request_settings()

CAMERA3_TEMPLATE_PREVIEW
Standard camera preview operation with 3A on auto.

CAMERA3_TEMPLATE_STILL_CAPTURE
Standard camera high-quality still capture with 3A and flash on auto.

CAMERA3_TEMPLATE_VIDEO_RECORD
Standard video recording plus preview with 3A on auto, torch off.

CAMERA3_TEMPLATE_VIDEO_SNAPSHOT
High-quality still capture while recording video. The application will include preview, video record, and full-resolution YUV or JPEG streams in the request. Must not cause stuttering on the video stream. 3A on auto.

CAMERA3_TEMPLATE_ZERO_SHUTTER_LAG
Zero-shutter-lag mode. The application will request preview and full-resolution data for each frame, and reprocess it to JPEG when a still image is requested by the user. Settings should provide the highest-quality full-resolution images without compromising the preview frame rate. 3A on auto.

CAMERA3_TEMPLATE_MANUAL
A basic template for direct application control of capture parameters. All automatic control is disabled (auto-exposure, auto-white balance, auto-focus), and post-processing parameters are set to preview quality. The manual capture parameters (exposure, sensitivity, etc.) are set to reasonable defaults, but should be overridden by the application depending on the intended use case.

CAMERA3_VENDOR_TEMPLATE_START
First value for vendor-defined request templates

Definition at line 1730 of file camera3.h.


enum camera3_stream_type

The type of the camera stream, which defines whether the camera HAL device is the producer or the consumer for that stream, and how the buffers of the stream relate to the other streams.

CAMERA3_STREAM_OUTPUT
This stream is an output stream; the camera HAL device will be responsible for filling buffers from this stream with newly captured or reprocessed image data.

CAMERA3_STREAM_INPUT
This stream is an input stream; the camera HAL device will be responsible for reading buffers from this stream and sending them through the camera processing pipeline, as if the buffer was a newly captured image from the imager.

The pixel format for an input stream can be any format reported by android.scaler.availableInputOutputFormatsMap. The pixel format of the output stream that is used to produce the reprocessing data may be any format reported by android.scaler.availableStreamConfigurations. The supported input/output stream combinations depend on the camera device capabilities; see android.scaler.availableInputOutputFormatsMap for stream map details.

This kind of stream is generally used to reprocess data into higher quality images (that otherwise would cause a frame rate performance loss), or to do off-line reprocessing.

A typical use case is Zero Shutter Lag (ZSL); see S8.1 for more details.

CAMERA3_STREAM_BIDIRECTIONAL
This stream can be used for input and output. Typically, the stream is used as an output stream, but occasionally one already-filled buffer may be sent back to the HAL device for reprocessing.

This kind of stream is meant generally for Zero Shutter Lag (ZSL) features, where copying the captured image from the output buffer to the reprocessing input buffer would be expensive. See S8.2 for more details.

Note that the HAL will always be reprocessing data it produced.

CAMERA3_NUM_STREAM_TYPES
Total number of framework-defined stream types

Definition at line 1188 of file camera3.h.