In the previous version of the Exterior View System (EVS), the IEvsCameraStream interface defined a single callback method that delivered only captured video frames. While this simplified EVS service client implementations, it also made it difficult for clients to identify streaming events and, therefore, to handle them properly. To improve the EVS development experience, AOSP now includes an additional callback that delivers stream events.
package android.hardware.automotive.evs@1.1;

import @1.0::IEvsCameraStream;

/**
 * Implemented on client side to receive asynchronous video frame deliveries.
 */
interface IEvsCameraStream extends @1.0::IEvsCameraStream {
    /**
     * Receives calls from the HAL each time a video frame is ready for inspection.
     * Buffer handles received by this method must be returned via calls to
     * IEvsCamera::doneWithFrame_1_1(). When the video stream is stopped via a call
     * to IEvsCamera::stopVideoStream(), this callback may continue to happen for
     * some time as the pipeline drains. Each frame must still be returned.
     * When the last frame in the stream has been delivered, STREAM_STOPPED
     * event must be delivered. No further frame deliveries may happen
     * thereafter.
     *
     * @param buffer a buffer descriptor of a delivered image frame.
     */
    oneway deliverFrame_1_1(BufferDesc buffer);

    /**
     * Receives calls from the HAL each time an event happens.
     *
     * @param event EVS event with possible event information.
     */
    oneway notify(EvsEvent event);
};
This method provides an EvsEventDesc, which consists of three fields:

- The type of the event.
- A string that identifies the origin of the event.
- Four 32-bit words that can carry additional event information.
/**
 * Structure that describes informative events occurred during EVS is streaming
 */
struct EvsEvent {
    /**
     * Type of an informative event
     */
    EvsEventType aType;

    /**
     * Device identifier
     */
    string deviceId;

    /**
     * Possible additional information
     */
    uint32_t[4] payload;
};
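For illustration, a camera HAL implementation might fill in these three fields and push the event through the registered stream callback. The following is only a sketch, not AOSP code: the EvsCameraImpl class, its mStream member (the client's registered v1.1 IEvsCameraStream proxy), and kCameraId are assumed names.

// Hypothetical sketch: report a changed camera parameter to the client.
// mStream is assumed to hold the registered V1_1::IEvsCameraStream callback,
// and kCameraId is an assumed string identifying this camera device.
void EvsCameraImpl::notifyParameterChanged(uint32_t parameterId, int32_t value) {
    EvsEvent event = {};
    event.aType = EvsEventType::PARAMETER_CHANGED;      // type of the event
    event.deviceId = kCameraId;                         // origin of the event
    event.payload[0] = parameterId;                     // changed parameter ID
    event.payload[1] = static_cast<uint32_t>(value);    // its new value

    if (!mStream->notify(event).isOk()) {
        ALOGE("Failed to deliver a PARAMETER_CHANGED event");
    }
}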
In addition, to avoid any divergence in graphics buffer descriptions between EVS and other Android graphics components, BufferDesc has been redefined to use HardwareBuffer, imported from the android.hardware.graphics.common@1.2 interface. HardwareBuffer contains a HardwareBufferDescription, the HIDL counterpart of the Android NDK's AHardwareBuffer_Desc, along with a buffer handle.
/**
 * HIDL counterpart of AHardwareBuffer_Desc.
 *
 * An AHardwareBuffer_Desc object can be converted to and from a
 * HardwareBufferDescription object by memcpy().
 *
 * @sa +ndk libnativewindow#AHardwareBuffer_Desc.
 */
typedef uint32_t[10] HardwareBufferDescription;

/**
 * HIDL counterpart of AHardwareBuffer.
 *
 * AHardwareBuffer_createFromHandle() can be used to convert a HardwareBuffer
 * object to an AHardwareBuffer object.
 *
 * Conversely, AHardwareBuffer_getNativeHandle() can be used to extract a native
 * handle from an AHardwareBuffer object. Paired with AHardwareBuffer_Desc,
 * AHardwareBuffer_getNativeHandle() can be used to convert between
 * HardwareBuffer and AHardwareBuffer.
 *
 * @sa +ndk libnativewindow#AHardwareBuffer.
 */
struct HardwareBuffer {
    HardwareBufferDescription description;
    handle nativeHandle;
};

/**
 * Structure representing an image buffer through our APIs
 *
 * In addition to the handle to the graphics memory, we need to retain
 * the properties of the buffer for easy reference and reconstruction of
 * an ANativeWindowBuffer object on the remote side of API calls.
 * (Not least because OpenGL expects an ANativeWindowBuffer* for use as a
 * texture via eglCreateImageKHR().)
 */
struct BufferDesc {
    /**
     * HIDL counterpart of AHardwareBuffer_Desc. Please see
     * hardware/interfaces/graphics/common/1.2/types.hal for more details.
     */
    HardwareBuffer buffer;

    /**
     * The size of a pixel in the units of bytes
     */
    uint32_t pixelSize;

    /**
     * Opaque value from driver
     */
    uint32_t bufferId;

    /**
     * Unique identifier of the physical camera device that produces this buffer.
     */
    string deviceId;

    /**
     * Time that this buffer is being filled
     */
    int64_t timestamp;

    /**
     * Frame metadata. This is opaque to EVS manager
     */
    vec<uint8_t> metadata;
};
Note: HardwareBufferDescription is defined as an array of ten 32-bit words. You may want to cast it to the AHardwareBuffer_Desc type and fill in its contents.
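For example, a client that receives a BufferDesc can cast the description back to AHardwareBuffer_Desc to read the buffer properties, and can wrap the accompanying handle in an AHardwareBuffer with AHardwareBuffer_createFromHandle(), as the HardwareBuffer comments above describe. The fragment below is only a sketch: the helper name and the BufferDesc_1_1 alias are assumptions, and it relies on the VNDK libnativewindow header being available to vendor code.

// Hypothetical sketch: interpret a received BufferDesc on the client side.
#include <vndk/hardware_buffer.h>

using BufferDesc_1_1 = ::android::hardware::automotive::evs::V1_1::BufferDesc;

bool wrapAsAHardwareBuffer(const BufferDesc_1_1& bufDesc, AHardwareBuffer** out) {
    // The HIDL description array shares its layout with AHardwareBuffer_Desc,
    // so pDesc->width, pDesc->height, pDesc->format, etc. are directly readable.
    const AHardwareBuffer_Desc* pDesc =
        reinterpret_cast<const AHardwareBuffer_Desc*>(&bufDesc.buffer.description);

    // Clone the handle so the buffer remains valid after it is returned
    // with IEvsCamera::doneWithFrame_1_1().
    return AHardwareBuffer_createFromHandle(
               pDesc, bufDesc.buffer.nativeHandle,
               AHARDWAREBUFFER_CREATE_FROM_HANDLE_METHOD_CLONE, out) == 0;
}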
EvsEventDesc is a structure composed of an enum EvsEventType, which lists several streaming events, and a payload of 32-bit words in which developers can place additional event information. For example, a developer can put an error code into the payload of a stream error event.
/**
 * Types of informative streaming events
 */
enum EvsEventType : uint32_t {
    /**
     * Video stream is started
     */
    STREAM_STARTED = 0,

    /**
     * Video stream is stopped
     */
    STREAM_STOPPED,

    /**
     * Video frame is dropped
     */
    FRAME_DROPPED,

    /**
     * Timeout happens
     */
    TIMEOUT,

    /**
     * Camera parameter is changed; payload contains a changed parameter ID and
     * its value
     */
    PARAMETER_CHANGED,

    /**
     * Master role has become available
     */
    MASTER_RELEASED,
};
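For instance, the enum documents that a PARAMETER_CHANGED event carries the changed parameter ID and its value in the first two payload words. A client might unpack them as in the hypothetical sketch below; the StreamHandler method name is assumed.

// Hypothetical sketch: unpack the payload of a PARAMETER_CHANGED event.
void StreamHandler::handleParameterChange(const EvsEvent& event) {
    const uint32_t parameterId = event.payload[0];                  // changed parameter ID
    const int32_t  value = static_cast<int32_t>(event.payload[1]);  // its new value
    ALOGI("Camera parameter %u is changed to %d", parameterId, value);
}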
Frame delivery
Along with the new BufferDesc, IEvsCameraStream also introduces new callback methods for receiving frames and stream events from the service implementation.
/**
 * Implemented on client side to receive asynchronous streaming event deliveries.
 */
interface IEvsCameraStream extends @1.0::IEvsCameraStream {
    /**
     * Receives calls from the HAL each time video frames are ready for inspection.
     * Buffer handles received by this method must be returned via calls to
     * IEvsCamera::doneWithFrame_1_1(). When the video stream is stopped via a call
     * to IEvsCamera::stopVideoStream(), this callback may continue to happen for
     * some time as the pipeline drains. Each frame must still be returned.
     * When the last frame in the stream has been delivered, STREAM_STOPPED
     * event must be delivered. No further frame deliveries may happen
     * thereafter.
     *
     * A camera device delivers the same number of frames as the number of
     * backing physical camera devices; that is, a physical camera device
     * always sends a single frame, and a logical camera device sends as many
     * frames as the number of backing physical camera devices.
     *
     * @param buffer Buffer descriptors of delivered image frames.
     */
    oneway deliverFrame_1_1(vec<BufferDesc> buffer);

    /**
     * Receives calls from the HAL each time an event happens.
     *
     * @param event EVS event with possible event information.
     */
    oneway notify(EvsEventDesc event);
};
The newer version of the frame callback method is designed to deliver multiple buffer descriptors. Therefore, an EVS camera implementation that manages multiple sources can forward multiple frames with a single call.
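Accordingly, a client-side deliverFrame_1_1() handler should iterate over the vector it receives and eventually return every buffer. The sketch below is illustrative only: it assumes a StreamHandler class with an mCamera member pointing to the v1.1 IEvsCamera proxy, a hypothetical processFrame() helper, a BufferDesc_1_1 alias for the v1.1 BufferDesc, and that doneWithFrame_1_1() accepts the same vector of descriptors.

// Hypothetical sketch: consume a batch of frames and return every buffer.
// mCamera is assumed to be the sp<V1_1::IEvsCamera> that produces this stream.
Return<void> StreamHandler::deliverFrame_1_1(const hidl_vec<BufferDesc_1_1>& buffers) {
    ALOGD("Received %zu buffers in a single delivery", buffers.size());

    for (const auto& buffer : buffers) {
        // buffer.deviceId identifies which physical camera produced the frame
        // when a logical camera device is in use.
        processFrame(buffer);   // assumed application-specific handler
    }

    // Every received buffer handle must be handed back to the camera.
    if (!mCamera->doneWithFrame_1_1(buffers).isOk()) {
        ALOGW("Failed to return the buffers");
    }
    return Void();
}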
In addition, the previous protocol for signaling the end of a stream (delivering a null frame) has been deprecated and replaced by the STREAM_STOPPED event.
Figure 1. Event notification sequence diagram
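On the service side, this means the capture loop's shutdown path now ends with a STREAM_STOPPED notification instead of an empty frame. The fragment below is a hypothetical sketch of that tail end; the class, mStreamState, RUNNING, forwardNextFrame(), mStream, and kCameraId names are all assumptions.

// Hypothetical sketch of a capture loop's shutdown path under the 1.1 protocol.
void EvsCameraImpl::collectFrames() {
    while (mStreamState == RUNNING) {
        forwardNextFrame();   // assumed helper that calls deliverFrame_1_1()
    }

    // The pipeline has drained; a STREAM_STOPPED event, rather than the
    // deprecated null frame, tells the client that no more frames will arrive.
    EvsEvent event = {};
    event.aType = EvsEventType::STREAM_STOPPED;
    event.deviceId = kCameraId;
    if (!mStream->notify(event).isOk()) {
        ALOGW("Failed to notify the client of the end of the stream");
    }
}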
Use the event and frame notification mechanisms
Identify the IEvsCameraStream version implemented by the client
The service can identify the version of the incoming IEvsCameraStream interface implemented by the client by attempting to downcast it:
using IEvsCameraStream_1_0 =
    ::android::hardware::automotive::evs::V1_0::IEvsCameraStream;
using IEvsCameraStream_1_1 =
    ::android::hardware::automotive::evs::V1_1::IEvsCameraStream;

Return<EvsResult> EvsV4lCamera::startVideoStream(
        const sp<IEvsCameraStream_1_0>& stream) {

    sp<IEvsCameraStream_1_0> aStream = stream;
    // Try to downcast. This will succeed if the client implements
    // IEvsCameraStream v1.1.
    sp<IEvsCameraStream_1_1> aStream_1_1 =
        IEvsCameraStream_1_1::castFrom(aStream).withDefault(nullptr);
    if (aStream_1_1 == nullptr) {
        ALOGI("Start a stream for v1.0 client.");
    } else {
        ALOGI("Start a stream for v1.1 client.");
    }

    // Start a video stream
    ...
}
notify() callback
An EvsEvent is delivered through the notify() callback; the client can then identify its type from the event-type discriminator and handle it accordingly, as shown below:
Return<void> StreamHandler::notify(const EvsEvent& event) {
    ALOGD("Received an event id: %u", static_cast<unsigned>(event.aType));
    // Handle each received event.
    switch (event.aType) {
        case EvsEventType::STREAM_STOPPED:
            // Do something to handle the end of the video stream
            ...
            break;

        [More cases]
    }
    return Void();
}
Use BufferDesc
AHardwareBuffer_Desc is the Android NDK data type that represents a native hardware buffer, which can be bound to EGL/OpenGL and Vulkan primitives. It contains most of the buffer metadata from the previous EVS BufferDesc and therefore replaces it in the new BufferDesc definition. However, because it is defined as an array in the HIDL interface, its member variables cannot be indexed directly. Instead, you can cast the array to the AHardwareBuffer_Desc type, as shown below:
BufferDesc bufDesc = {};
AHardwareBuffer_Desc* pDesc =
    reinterpret_cast<AHardwareBuffer_Desc *>(&bufDesc.buffer.description);
pDesc->width  = mVideo.getWidth();
pDesc->height = mVideo.getHeight();
pDesc->layers = 1;
pDesc->format = mFormat;
pDesc->usage  = mUsage;
pDesc->stride = mStride;
bufDesc.buffer.nativeHandle = mBuffers[idx].handle;
bufDesc.bufferId = idx;