typedef vec<uint8_t> CameraMetadata


typedef bitfield<BufferUsage> BufferUsageFlags


typedef bitfield<Dataspace> DataspaceFlags


enum StreamType: uint32_t


The type of the camera stream, which defines whether the camera HAL device is the producer or the consumer for that stream, and how the buffers of the stream relate to the other streams.

OUTPUT = 0
This stream is an output stream; the camera HAL device must fill buffers from this stream with newly captured or reprocessed image data.
INPUT = 1
This stream is an input stream; the camera HAL device must read buffers from this stream and send them through the camera processing pipeline, as if the buffer was a newly captured image from the imager.
The pixel format for an input stream can be any format reported by android.scaler.availableInputOutputFormatsMap. The pixel format of the output stream that is used to produce the reprocessing data may be any format reported by android.scaler.availableStreamConfigurations. The supported input/output stream combinations depend on the camera device capabilities; see android.scaler.availableInputOutputFormatsMap for stream map details.
This kind of stream is generally used to reprocess data into higher quality images (that would otherwise cause a frame rate performance loss), or to do off-line reprocessing.
The typical use cases are OPAQUE (typically ZSL) and YUV reprocessing; see S8.2, S8.3 and S10 for more details.


enum StreamRotation: uint32_t


The required counterclockwise rotation of the camera stream.

ROTATION_0 = 0
No rotation
ROTATION_90 = 1
Rotate by 90 degrees counterclockwise
ROTATION_180 = 2
Rotate by 180 degrees counterclockwise
ROTATION_270 = 3
Rotate by 270 degrees counterclockwise


enum StreamConfigurationMode: uint32_t


This defines the general operation mode for the HAL (for a given stream configuration), where modes besides NORMAL have different semantics, and usually limit the generality of the API in exchange for higher performance in some particular area.

NORMAL_MODE = 0
Normal stream configuration operation mode. This is the default camera operation mode, where all semantics of HAL APIs and metadata controls apply.
CONSTRAINED_HIGH_SPEED_MODE = 1
Special constrained high speed operation mode for devices that cannot support high speed output in NORMAL mode. All streams in this configuration are operating at high speed mode and have different characteristics and limitations to achieve high speed output. The NORMAL mode can still be used for high speed output if the HAL can support high speed output while satisfying all the semantics of HAL APIs and metadata controls. It is recommended for the HAL to support high speed output in NORMAL mode (by advertising the high speed FPS ranges in android.control.aeAvailableTargetFpsRanges) if possible.
This mode has the following limitations/requirements:
1. The HAL must support up to 2 streams with sizes reported by android.control.availableHighSpeedVideoConfigurations.
2. In this mode, the HAL is expected to output up to 120fps or higher. This mode must support the targeted FPS range and size configurations reported by android.control.availableHighSpeedVideoConfigurations.
3. The HAL must support IMPLEMENTATION_DEFINED output stream format.
4. To achieve efficient high speed streaming, the HAL may have to aggregate multiple frames together and send them to the camera device for processing, where the request controls are the same for all the frames in this batch (batch mode). The HAL must support the max batch size, and the max batch size requirements defined by android.control.availableHighSpeedVideoConfigurations.
5. In this mode, the HAL must override aeMode, awbMode, and afMode to ON, ON, and CONTINUOUS_VIDEO, respectively. All post-processing block mode controls must be overridden to be FAST. Therefore, no manual control of capture and post-processing parameters is possible. All other controls operate the same as when android.control.mode == AUTO. This means that all other android.control.* fields must continue to work, such as:
android.control.aeTargetFpsRange, android.control.aeExposureCompensation, android.control.aeLock, android.control.awbLock, android.control.effectMode, android.control.aeRegions, android.control.afRegions, android.control.awbRegions, android.control.afTrigger, android.control.aePrecaptureTrigger
Outside of android.control.*, the following controls must work:
android.flash.mode (TORCH mode only; automatic flash for still capture must not work since aeMode is ON), android.lens.opticalStabilizationMode (if it is supported), android.scaler.cropRegion, android.statistics.faceDetectMode (if it is supported)
6. To reduce the amount of data passed across process boundaries at high frame rate, within one batch, the camera framework only propagates the last shutter notify and the last capture results (including partial results and the final result) to the app. The shutter notifies and capture results for the other requests in the batch are derived by the camera framework. As a result, the HAL can return empty metadata except for the last result in the batch.
For more details about high speed stream requirements, see android.control.availableHighSpeedVideoConfigurations and CONSTRAINED_HIGH_SPEED_VIDEO capability defined in android.request.availableCapabilities.
This mode only needs to be supported by HALs that include CONSTRAINED_HIGH_SPEED_VIDEO in the android.request.availableCapabilities static metadata.
VENDOR_MODE_0 = 0x8000
A set of vendor-defined operating modes, for custom default camera application features that can't be implemented in the fully flexible fashion required for NORMAL_MODE.
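The mandated control overrides in requirement 5 can be sketched as a small helper. The enums and struct below are simplified stand-ins for the real android.control.* metadata tags (illustrative assumptions, not the actual HAL types):

```cpp
#include <cstdint>

// Simplified stand-ins for the android.control.* enums (assumed shapes).
enum class AeMode : uint32_t { OFF = 0, ON = 1 };
enum class AwbMode : uint32_t { OFF = 0, ON = 1 };
enum class AfMode : uint32_t { OFF = 0, AUTO = 1, CONTINUOUS_VIDEO = 3 };

struct RequestControls {
    AeMode aeMode;
    AwbMode awbMode;
    AfMode afMode;
    bool postProcessingFast;  // stand-in for the post-processing block mode controls
};

// In CONSTRAINED_HIGH_SPEED_MODE the HAL must force aeMode/awbMode to ON,
// afMode to CONTINUOUS_VIDEO, and all post-processing blocks to FAST,
// regardless of what the request asked for.
RequestControls applyHighSpeedOverrides(RequestControls in) {
    in.aeMode = AeMode::ON;
    in.awbMode = AwbMode::ON;
    in.afMode = AfMode::CONTINUOUS_VIDEO;
    in.postProcessingFast = true;
    return in;
}
```

All other controls pass through unchanged, matching the "operate the same as android.control.mode == AUTO" behavior described above.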


enum BufferStatus: uint32_t


The current status of a single stream buffer.

OK = 0
The buffer is in a normal state, and can be used after waiting on its sync fence.
ERROR = 1
The buffer does not contain valid data, and the data in it must not be used. The sync fence must still be waited on before reusing the buffer.


enum CameraBlobId: uint16_t


Transport header for camera blob types; generally compressed JPEG buffers in output streams.

To capture JPEG images, a stream is created using the pixel format HAL_PIXEL_FORMAT_BLOB and dataspace HAL_DATASPACE_V0_JFIF. The buffer size for the stream is calculated by the framework, based on the static metadata field android.jpeg.maxSize. Since compressed JPEG images are of variable size, the HAL needs to include the final size of the compressed image using this structure inside the output stream buffer. The camera blob ID field must be set to CameraBlobId::JPEG.

The transport header must be at the end of the JPEG output stream buffer. That means the jpegBlobId must start at byte[buffer_size - sizeof(CameraBlob)], where buffer_size is the size of the gralloc buffer. Any HAL using this transport header must account for it in android.jpeg.maxSize. The JPEG data itself starts at the beginning of the buffer and must be blobSize bytes long.

JPEG = 0x00FF
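The blob-header placement described above can be sketched in a few lines. The struct below mirrors the CameraBlob fields, but its exact packing and the helper names are illustrative assumptions, not the HAL ABI:

```cpp
#include <cstdint>
#include <cstring>

// Mirrors the CameraBlob layout described above (packed here for a
// deterministic sizeof; the real transport layout is defined by the HAL).
struct CameraBlob {
    uint16_t blobId;    // CameraBlobId::JPEG == 0x00FF
    uint32_t blobSize;  // size of the JPEG data at the start of the buffer
} __attribute__((packed));

// HAL side: after compressing a JPEG of jpegSize bytes into the start of
// the buffer, write the transport header at the tail of the gralloc buffer.
void writeJpegBlobHeader(uint8_t* buffer, size_t bufferSize, uint32_t jpegSize) {
    CameraBlob blob{0x00FF, jpegSize};
    std::memcpy(buffer + bufferSize - sizeof(CameraBlob), &blob, sizeof(blob));
}

// Framework side: recover the JPEG size from the end of the buffer,
// returning 0 if the blob ID does not identify a JPEG.
uint32_t readJpegSize(const uint8_t* buffer, size_t bufferSize) {
    CameraBlob blob;
    std::memcpy(&blob, buffer + bufferSize - sizeof(CameraBlob), sizeof(blob));
    return blob.blobId == 0x00FF ? blob.blobSize : 0;
}
```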


enum MsgType: uint32_t


Indicates the type of message sent, which specifies which member of the message union is valid.

ERROR = 1
An error has occurred. NotifyMsg::Message::Error contains the error information.
SHUTTER = 2
The exposure of a given request or processing a reprocess request has begun. NotifyMsg::Message::Shutter contains the information about the capture.


enum ErrorCode: uint32_t

Defined error codes for MsgType::ERROR

ERROR_DEVICE = 1
A serious failure occurred. No further frames or buffer streams must be produced by the device. The device must be treated as closed. The client must reopen the device to use it again. The frameNumber field is unused.
ERROR_REQUEST = 2
An error has occurred in processing a request. No output (metadata or buffers) must be produced for this request. The frameNumber field specifies which request has been dropped. Subsequent requests are unaffected, and the device remains operational.
ERROR_RESULT = 3
An error has occurred in producing an output result metadata buffer for a request, but output stream buffers for it must still be available. Subsequent requests are unaffected, and the device remains operational. The frameNumber field specifies the request for which result metadata won't be available.
ERROR_BUFFER = 4
An error has occurred in placing an output buffer into a stream for a request. The frame metadata and other buffers may still be available. Subsequent requests are unaffected, and the device remains operational. The frameNumber field specifies the request for which the buffer was dropped, and errorStreamId indicates the stream that dropped the frame.


enum RequestTemplate: uint32_t


Available template types for ICameraDevice::constructDefaultRequestSettings()

PREVIEW = 1
Standard camera preview operation with 3A on auto.
STILL_CAPTURE = 2
Standard camera high-quality still capture with 3A and flash on auto.
VIDEO_RECORD = 3
Standard video recording plus preview with 3A on auto, torch off.
VIDEO_SNAPSHOT = 4
High-quality still capture while recording video. Applications typically include preview, video record, and full-resolution YUV or JPEG streams in the request. Must not cause stuttering on the video stream. 3A on auto.
ZERO_SHUTTER_LAG = 5
Zero-shutter-lag mode. The application typically requests preview and full-resolution data for each frame, and reprocesses it to JPEG when a still image is requested by the user. Settings must provide the highest-quality full-resolution images without compromising preview frame rate. 3A on auto.
MANUAL = 6
A basic template for direct application control of capture parameters. All automatic control is disabled (auto-exposure, auto-white balance, auto-focus), and post-processing parameters are set to preview quality. The manual capture parameters (exposure, sensitivity, etc.) are set to reasonable defaults, but may be overridden by the application depending on the intended use case.
VENDOR_TEMPLATE_START = 0x40000000
First value for vendor-defined request templates.


struct Stream {int32_t id; StreamType streamType; uint32_t width; uint32_t height; PixelFormat format; BufferUsageFlags usage; DataspaceFlags dataSpace; StreamRotation rotation}


A descriptor for a single camera input or output stream. A stream is defined by the framework by its buffer resolution and format, and additionally by the HAL with the gralloc usage flags and the maximum in-flight buffer count.

If a configureStreams() call returns a non-fatal error, all active streams remain valid as if configureStreams() had not been called.

Stream ID - a nonnegative integer identifier for a stream.
The identical stream ID must reference the same stream, with the same width/height/format, across consecutive calls to configureStreams.
If a previously-used stream ID is not used in a new call to configureStreams, then that stream is no longer active. Such a stream ID may be reused in a future configureStreams with a new width/height/format.
The type of the stream (input vs output, etc).
The width in pixels of the buffers in this stream
The height in pixels of the buffers in this stream
The pixel format for the buffers in this stream.
If IMPLEMENTATION_DEFINED is used, then the platform gralloc module must select a format based on the usage flags provided by the camera device and the other endpoint of the stream.
The gralloc usage flags for this stream, as needed by the consumer of the stream.
The usage flags from the producer and the consumer must be combined together and then passed to the platform gralloc HAL module for allocating the gralloc buffers for each stream.
The HAL may use these consumer flags to decide stream configuration. For streamType INPUT, the value of this field is always 0. For all streams passed via configureStreams(), the HAL must set its own additional usage flags in its output HalStreamConfiguration.
The usage flags for an output stream may be a bitwise combination of usage flags for multiple consumers, for the purpose of sharing one camera stream between those consumers. The HAL must fail the configureStreams call with ILLEGAL_ARGUMENT if the combined flags cannot be supported due to incompatible buffer format, dataSpace, or other hardware limitations.
A field that describes the contents of the buffer. The format and buffer dimensions define the memory layout and structure of the stream buffers, while dataSpace defines the meaning of the data within the buffer.
For most formats, dataSpace defines the color space of the image data. In addition, for some formats, dataSpace indicates whether image- or depth-based data is requested. See for details of formats and valid dataSpace values for each format.
The HAL must use this dataSpace to configure the stream to the correct colorspace, or to select between color and depth outputs if supported. The dataspace values are set using the V0 dataspace definitions.
The required output rotation of the stream.
This must be inspected by the HAL along with stream width and height. For example, if the rotation is 90 degrees and the stream width and height are 720 and 1280 respectively, the camera service must supply buffers of size 720x1280, and the HAL must capture a 1280x720 image and rotate the image by 90 degrees counterclockwise. The rotation field must be ignored when the stream type is input.
The HAL must inspect this field during stream configuration and return IllegalArgument if the HAL cannot perform such a rotation. The HAL must always support ROTATION_0, so a configureStreams() call must not fail for unsupported rotation if the rotation field of all streams is ROTATION_0.
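The width/height handling for rotated streams can be illustrated with a small helper. Dims and captureDimsFor are hypothetical names; only the StreamRotation values come from the enum above:

```cpp
#include <cstdint>

enum class StreamRotation : uint32_t { ROTATION_0 = 0, ROTATION_90 = 1,
                                       ROTATION_180 = 2, ROTATION_270 = 3 };

struct Dims { uint32_t width; uint32_t height; };

// For 90- and 270-degree rotations the HAL captures with swapped dimensions
// and rotates into the stream's buffers; for 0/180 the capture size matches
// the stream size directly.
Dims captureDimsFor(const Dims& stream, StreamRotation rotation) {
    if (rotation == StreamRotation::ROTATION_90 ||
        rotation == StreamRotation::ROTATION_270) {
        return {stream.height, stream.width};
    }
    return stream;
}
```

For the example in the text, a 720x1280 stream with ROTATION_90 yields a 1280x720 capture.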


struct StreamConfiguration {vec<Stream> streams; StreamConfigurationMode operationMode}


A structure of stream definitions, used by configureStreams(). This structure defines all the output streams and the reprocessing input stream for the current camera use case.

An array of camera stream pointers, defining the input/output configuration for the camera HAL device.
At most one input-capable stream may be defined. At least one output-capable stream must be defined.
The operation mode of streams in this configuration. The HAL can use this mode as an indicator to set the stream property (e.g., HalStream::maxBuffers) appropriately. For example, if the configuration is CONSTRAINED_HIGH_SPEED_MODE, the HAL may want to set aside more buffers for batch mode operation (see android.control.availableHighSpeedVideoConfigurations for the batch mode definition).


struct HalStream {int32_t id; PixelFormat overrideFormat; BufferUsageFlags producerUsage; BufferUsageFlags consumerUsage; uint32_t maxBuffers}


The camera HAL's response to each requested stream configuration.

The HAL may specify the desired format, maximum buffers, and usage flags for each stream.

Stream ID - a nonnegative integer identifier for a stream.
The ID must be one of the stream IDs passed into configureStreams.
An override pixel format for the buffers in this stream.
The HAL must respect the requested format in Stream unless it is IMPLEMENTATION_DEFINED, in which case the override format here must be used by the client instead, for this stream. This allows cross-platform HALs to use a standard format since IMPLEMENTATION_DEFINED formats often require device-specific information. In all other cases, the overrideFormat must match the requested format.
When HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED is used, the platform gralloc module must select a format based on the usage flags provided by the camera device and the other endpoint of the stream.
The gralloc usage flags for this stream, as needed by the HAL.
For output streams, these are the HAL's producer usage flags. For input streams, these are the HAL's consumer usage flags. The usage flags from the producer and the consumer must be combined together and then passed to the platform graphics allocator HAL for allocating the gralloc buffers for each stream.
If the stream's type is INPUT, then producerUsage must be 0, and consumerUsage must be set. For other types, producerUsage must be set, and consumerUsage must be 0.
The maximum number of buffers the HAL device may need to have dequeued at the same time. The HAL device may not have more buffers in-flight from this stream than this value.
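A sketch of how a HAL might fill one HalStream entry, using simplified stand-ins for the HIDL types. The format constants and makeHalStream are illustrative assumptions:

```cpp
#include <cstdint>

// Field subsets of the HIDL structs, simplified for illustration.
enum class StreamType : uint32_t { OUTPUT = 0, INPUT = 1 };
constexpr uint32_t kImplementationDefined = 0x22;  // HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED
constexpr uint32_t kYcbcr420_888 = 0x23;           // a concrete format the HAL might pick

struct Stream { int32_t id; StreamType streamType; uint32_t format; uint64_t usage; };
struct HalStream { int32_t id; uint32_t overrideFormat;
                   uint64_t producerUsage; uint64_t consumerUsage; uint32_t maxBuffers; };

// Build the HAL's per-stream response: keep the requested format unless it is
// IMPLEMENTATION_DEFINED (then substitute a concrete choice), and set exactly
// one of producerUsage/consumerUsage depending on the stream direction.
HalStream makeHalStream(const Stream& s, uint64_t halUsage, uint32_t maxBuffers) {
    HalStream h{};
    h.id = s.id;
    h.overrideFormat = (s.format == kImplementationDefined) ? kYcbcr420_888 : s.format;
    if (s.streamType == StreamType::INPUT) {
        h.consumerUsage = halUsage;  // the HAL consumes input buffers
        h.producerUsage = 0;
    } else {
        h.producerUsage = halUsage;  // the HAL produces output buffers
        h.consumerUsage = 0;
    }
    h.maxBuffers = maxBuffers;
    return h;
}
```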


struct HalStreamConfiguration {vec<HalStream> streams}


A structure of stream definitions, returned by configureStreams(). This structure defines the HAL's desired parameters for each stream.

All streams that were defined in the input to configureStreams() must have a corresponding entry in this structure when returned by configureStreams().



struct StreamBuffer {int32_t streamId; uint64_t bufferId; handle buffer; BufferStatus status; handle acquireFence; handle releaseFence}


A single buffer from a camera3 stream. It includes a handle to its parent stream, the handle to the gralloc buffer itself, and sync fences.

The buffer does not specify whether it is to be used for input or output; that is determined by its parent stream type and how the buffer is passed to the HAL device.

The ID of the stream this buffer is associated with. -1 indicates an invalid (empty) StreamBuffer, in which case buffer must also point to null and bufferId must be 0.
The unique ID of the buffer within this StreamBuffer. 0 indicates this StreamBuffer contains no buffer. For StreamBuffers sent to the HAL in a CaptureRequest, this ID uniquely identifies a buffer. When a buffer is sent to the HAL for the first time, both bufferId and the buffer handle must be filled. The HAL must keep track of the mapping between bufferId and the corresponding buffer until the corresponding stream is removed from the stream configuration or until the camera device session is closed. After the first time a buffer is introduced to the HAL, in the future the camera service must refer to the same buffer using only bufferId, and keep the buffer handle null.
The graphics buffer handle to the buffer.
For StreamBuffers sent to the HAL in a CaptureRequest, if the bufferId has not been seen by the HAL before, this buffer handle is guaranteed to be a valid handle to a graphics buffer, with dimensions and format matching those of the stream. If the bufferId has been sent to the HAL before, this buffer handle must be null and the HAL must look up the actual buffer handle to use from its own bufferId-to-buffer-handle map.
For StreamBuffers returned in a CaptureResult, this must be null, since the handle to the buffer is already known to the client (since the client sent it in the matching CaptureRequest), and the handle can be identified by the combination of frame number and stream ID.
Current state of the buffer. The framework must not pass buffers to the HAL that are in an error state. In case a buffer could not be filled by the HAL, it must have its status set to ERROR when returned to the framework with processCaptureResult().
The acquire sync fence for this buffer. The HAL must wait on this fence fd before attempting to read from or write to this buffer.
In a buffer included in a CaptureRequest, the client may set this to null to indicate that no waiting is necessary for this buffer.
When the HAL returns an input or output buffer to the framework with processCaptureResult(), the acquireFence must be set to null. If the HAL never waits on the acquireFence due to an error in filling or reading a buffer, when calling processCaptureResult() the HAL must set the releaseFence of the buffer to be the acquireFence passed to it by the client. This allows the client to wait on the fence before reusing the buffer.
The release sync fence for this buffer. The HAL must set this to a valid fence fd when returning the input buffer or output buffers to the client in a CaptureResult, or set it to null to indicate that no waiting is required for this buffer.
The client must set this to null for all buffers included in a processCaptureRequest call.
After signaling the releaseFence for this buffer, the HAL must not make any further attempts to access this buffer as the ownership has been fully transferred back to the client.
If this is null, then the ownership of this buffer is transferred back immediately upon the call of processCaptureResult.
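The fence rules above can be condensed into a small helper. This is a sketch with plain ints standing in for fence handles (-1 meaning "no fence"), not the real HIDL handle type:

```cpp
#include <cstdint>

enum class BufferStatus : uint32_t { OK = 0, ERROR = 1 };

// Simplified StreamBuffer; ints stand in for fence fds (-1 == no fence,
// mirroring the null-handle convention described above).
struct StreamBuffer {
    int32_t streamId;
    BufferStatus status;
    int acquireFence;
    int releaseFence;
};

// Prepare a buffer for return in processCaptureResult(): acquireFence must
// always be cleared; if the HAL errored out without ever waiting on the
// acquire fence, that fence becomes the release fence so the client can
// still wait on it before reusing the buffer.
void prepareForReturn(StreamBuffer& b, bool filled, bool waitedOnAcquire) {
    if (!filled) {
        b.status = BufferStatus::ERROR;
        if (!waitedOnAcquire) {
            b.releaseFence = b.acquireFence;
        }
    }
    b.acquireFence = -1;  // acquireFence must be null on return
}
```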


struct CameraBlob {CameraBlobId blobId; uint32_t blobSize}


struct ErrorMsg {uint32_t frameNumber; int32_t errorStreamId; ErrorCode errorCode}


Message contents for MsgType::ERROR

Frame number of the request the error applies to.0 if the frame number isn't applicable to the error.
The ID of the stream that had a failure. -1 if the stream isn't applicable to the error.
The code for this error.
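The per-code field conventions can be expressed as a quick consistency check. The enum values are assumed to match the HAL's ErrorCode numbering, and isWellFormed is a hypothetical helper, not part of the interface:

```cpp
#include <cstdint>

enum class ErrorCode : uint32_t {
    ERROR_DEVICE = 1, ERROR_REQUEST = 2, ERROR_RESULT = 3, ERROR_BUFFER = 4,
};

struct ErrorMsg { uint32_t frameNumber; int32_t errorStreamId; ErrorCode errorCode; };

// errorStreamId is only meaningful for ERROR_BUFFER (where it must name the
// stream that dropped the frame); for all other codes it must be -1.
// frameNumber is ignored here since it is unused for ERROR_DEVICE.
bool isWellFormed(const ErrorMsg& m) {
    switch (m.errorCode) {
        case ErrorCode::ERROR_BUFFER:
            return m.errorStreamId >= 0;
        default:
            return m.errorStreamId == -1;
    }
}
```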


struct ShutterMsg {uint32_t frameNumber; uint64_t timestamp}


Message contents for MsgType::SHUTTER

Frame number of the request that has begun exposure or reprocessing.
Timestamp for the start of capture. For a reprocess request, this must be the input image's start of capture. This must match the capture result metadata's sensor exposure start timestamp.


struct NotifyMsg {MsgType type; union Message msg}


The message structure sent to ICameraDevice3Callback::notify()

The message type.


struct CaptureRequest {uint32_t frameNumber; uint64_t fmqSettingsSize; CameraMetadata settings; StreamBuffer inputBuffer; vec<StreamBuffer> outputBuffers}


A single request for image capture/buffer reprocessing, sent to the Camera HAL device by the framework in processCaptureRequest().

The request contains the settings to be used for this capture, and the set of output buffers to write the resulting image data into. It may optionally contain an input buffer, in which case the request is for reprocessing that input buffer instead of capturing a new image with the camera sensor. The capture is identified by the frameNumber.

In response, the camera HAL device must send a CaptureResult structure asynchronously to the framework, using the processCaptureResult() callback.

The frame number is an incrementing integer set by the framework to uniquely identify this capture. It needs to be returned in the result call, and is also used to identify the request in asynchronous notifications sent to ICameraDevice3Callback::notify().
If non-zero, read settings from the request queue instead (see ICameraDeviceSession.getCaptureRequestMetadataQueue). If zero, read settings from the settings field.
If fmqSettingsSize is zero, the settings buffer contains the capture and processing parameters for the request. As a special case, an empty settings buffer indicates that the settings are identical to the most-recently submitted capture request. An empty buffer cannot be used as the first submitted request after a configureStreams() call.
This field must be used if fmqSettingsSize is zero. It must not be used if fmqSettingsSize is non-zero.
The input stream buffer to use for this request, if any.
An invalid inputBuffer is signified by a null inputBuffer::buffer, in which case the values of all other members of inputBuffer must be ignored.
If inputBuffer is invalid, then the request is for a new capture from the imager. If inputBuffer is valid, the request is for reprocessing the image contained in inputBuffer, and the HAL must release the inputBuffer back to the client in a subsequent processCaptureResult call.
The HAL is required to wait on the acquire sync fence of the input buffer before accessing it.
An array of at least one stream buffer, to be filled with image data from this capture/reprocess. The HAL must wait on the acquire fences of each stream buffer before writing to them.
The HAL takes ownership of the handles in outputBuffers; the client must not access them until they are returned in a CaptureResult.
Any or all of the buffers included here may be brand new in this request (having never before been seen by the HAL).
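The interplay between fmqSettingsSize and the settings field can be summarized as a small decision helper. SettingsSource and settingsSourceFor are illustrative names, not part of the HAL interface:

```cpp
#include <cstdint>
#include <vector>

using CameraMetadata = std::vector<uint8_t>;  // matches the typedef above

// Where the settings for a request come from: the metadata FMQ when
// fmqSettingsSize is non-zero, otherwise the in-struct settings buffer;
// an empty in-struct buffer means "repeat the previous request's settings".
enum class SettingsSource { FMQ, INLINE, REPEAT_LAST };

SettingsSource settingsSourceFor(uint64_t fmqSettingsSize,
                                 const CameraMetadata& settings) {
    if (fmqSettingsSize > 0) return SettingsSource::FMQ;
    if (settings.empty())    return SettingsSource::REPEAT_LAST;
    return SettingsSource::INLINE;
}
```

Note that REPEAT_LAST is invalid for the first request after configureStreams(), as stated above; a real implementation would reject it there.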


struct CaptureResult {uint32_t frameNumber; uint64_t fmqResultSize; CameraMetadata result; vec<StreamBuffer> outputBuffers; StreamBuffer inputBuffer; uint32_t partialResult}


The result of a single capture/reprocess by the camera HAL device. This is sent to the framework asynchronously with processCaptureResult(), in response to a single capture request sent to the HAL with processCaptureRequest(). Multiple processCaptureResult() calls may be performed by the HAL for each request.

Each call, all with the same frame number, may contain some subset of the output buffers, and/or the result metadata.

The result structure contains the output metadata from this capture, and the set of output buffers that have been/will be filled for this capture. Each output buffer may come with a release sync fence that the framework must wait on before reading, in case the buffer has not yet been filled by the HAL.

The metadata may be provided multiple times for a single frame number. The framework must accumulate together the final result set by combining each partial result together into the total result set.

If an input buffer is given in a request, the HAL must return it in one of the processCaptureResult calls, and the call may be to just return the input buffer, without metadata and output buffers; the sync fences must be handled the same way they are done for output buffers.

Performance considerations:

Applications receive these partial results immediately, so sending partial results is a highly recommended performance optimization: it avoids waiting out the total pipeline latency before sending results for what is known very early on in the pipeline.

A typical use case might be calculating the AF state halfway through the pipeline; by sending the state back to the framework immediately, we get a 50% performance increase in the perceived responsiveness of the auto-focus.

The frame number is an incrementing integer set by the framework in the submitted request to uniquely identify this capture. It is also used to identify the request in asynchronous notifications sent to ICameraDevice3Callback::notify().
If non-zero, read the result from the result queue instead (see ICameraDeviceSession.getCaptureResultMetadataQueue). If zero, read the result from the result field.
The result metadata for this capture. This contains information about the final capture parameters, the state of the capture and post-processing hardware, the state of the 3A algorithms, if enabled, and the output of any enabled statistics units.
If there was an error producing the result metadata, result must be an empty metadata buffer, and notify() must be called with ErrorCode::ERROR_RESULT.
Multiple calls to processCaptureResult() with a given frameNumber may include (partial) result metadata.
Partial metadata submitted must not include any metadata key returned in a previous partial result for a given frame. Each new partial result for that frame must also set a distinct partialResult value.
If notify has been called with ErrorCode::ERROR_RESULT, all further partial results for that frame are ignored by the framework.
The completed output stream buffers for this capture.
They may not yet be filled at the time the HAL calls processCaptureResult(); the framework must wait on the release sync fences provided by the HAL before reading the buffers.
The StreamBuffer::buffer handle must be null for all returned buffers; the client must cache the handle and look it up via the combination of frame number and stream ID.
The number of output buffers returned must be less than or equal to the matching capture request's count. If this is less than the buffer count in the capture request, at least one more call to processCaptureResult with the same frameNumber must be made, to return the remaining output buffers to the framework. This may only be zero if the structure includes valid result metadata or an input buffer is returned in this result.
The HAL must set the stream buffer's release sync fence to a valid sync fd, or to null if the buffer has already been filled.
If the HAL encounters an error while processing the buffer, and the buffer is not filled, the buffer's status field must be set to ERROR. If the HAL did not wait on the acquire fence before encountering the error, the acquire fence must be copied into the release fence, to allow the framework to wait on the fence before reusing the buffer.
The acquire fence must be set to null for all output buffers.
This vector may be empty; if so, at least one other processCaptureResult call must be made (or have been made) by the HAL to provide the filled output buffers.
When processCaptureResult is called with a new buffer for a frame, all previous frames' buffers for that corresponding stream must have been already delivered (the fences need not have yet been signaled).
Buffers for a frame may be sent to the framework before the corresponding SHUTTER-notify call is made by the HAL.
Performance considerations:
Buffers delivered to the framework are not dispatched to the application layer until a start of exposure timestamp has been received via a SHUTTER notify() call. It is highly recommended to dispatch that call as early as possible.
The handle for the input stream buffer for this capture, if any.
It may not yet be consumed at the time the HAL calls processCaptureResult(); the framework must wait on the release sync fence provided by the HAL before reusing the buffer.
The HAL must handle the sync fences the same way they are done for outputBuffers.
Only one input buffer is allowed to be sent per request. Similarly to output buffers, the ordering of returned input buffers must be maintained by the HAL.
Performance considerations:
The input buffer should be returned as early as possible. If the HAL supports sync fences, it can call processCaptureResult to hand it back with sync fences set appropriately. If sync fences are not supported, the buffer can only be returned when it is consumed, which may take a long time; the HAL may choose to copy this input buffer to make the buffer return sooner.
In order to take advantage of partial results, the HAL must set the static metadata android.request.partialResultCount to the number of partial results it sends for each frame.
Each new capture result with a partial result must set this field to a distinct inclusive value between 1 and android.request.partialResultCount.
HALs not wishing to take advantage of this feature must not set android.request.partialResultCount or partialResult to a value other than 1.
This value must be set to 0 when a capture result contains buffers only and no metadata.
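The partialResult accounting can be sketched from the framework side. ResultAccumulator is a hypothetical helper: it only checks that each partial differs from the previous one, where a full implementation would track every value seen, and string keys stand in for metadata tags:

```cpp
#include <cstdint>
#include <map>
#include <string>

// Framework-side sketch: fold successive partial results for one frame into
// a single accumulated result set, rejecting out-of-range or immediately
// repeated partialResult values.
struct ResultAccumulator {
    uint32_t partialResultCount;           // android.request.partialResultCount
    uint32_t lastPartial = 0;
    std::map<std::string, int64_t> total;  // accumulated metadata (keys are stand-ins)

    bool addPartial(uint32_t partialResult,
                    const std::map<std::string, int64_t>& metadata) {
        // partialResult must be in [1, partialResultCount] and distinct
        // (sketch: only compared against the previous value).
        if (partialResult < 1 || partialResult > partialResultCount ||
            partialResult == lastPartial) {
            return false;
        }
        lastPartial = partialResult;
        total.insert(metadata.begin(), metadata.end());
        return true;
    }
};
```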


struct BufferCache {int32_t streamId; uint64_t bufferId}


A list of cached bufferIds associated with a certain stream. Buffers are passed between the camera service and the camera HAL via bufferId, except the first time a new buffer is passed to the HAL in a CaptureRequest. The camera service and camera HAL therefore need to maintain a cached map of bufferId and the corresponding native handle.

The ID of the stream this list is associated with.
A cached buffer ID associated with streamId.
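The bufferId caching contract described for StreamBuffer and BufferCache can be sketched as a small HAL-side map. BufferIdCache and its method names are illustrative assumptions, with a plain pointer standing in for the native buffer handle:

```cpp
#include <cstdint>
#include <map>
#include <utility>

using BufferHandle = const void*;  // stand-in for the native buffer handle

// HAL-side sketch of the (streamId, bufferId) -> handle cache: the first
// time a buffer arrives its handle is non-null and gets cached; afterwards
// the service sends only the bufferId and the HAL looks the handle up.
class BufferIdCache {
    std::map<std::pair<int32_t, uint64_t>, BufferHandle> cache_;
  public:
    BufferHandle resolve(int32_t streamId, uint64_t bufferId, BufferHandle handle) {
        auto key = std::make_pair(streamId, bufferId);
        if (handle != nullptr) {
            cache_[key] = handle;    // first time seen: remember the handle
            return handle;
        }
        auto it = cache_.find(key);  // repeat use: handle was sent as null
        return it == cache_.end() ? nullptr : it->second;
    }
    // Drop all cached entries for a stream removed from the configuration.
    void evictStream(int32_t streamId) {
        for (auto it = cache_.begin(); it != cache_.end();) {
            if (it->first.first == streamId) it = cache_.erase(it);
            else ++it;
        }
    }
};
```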