[[["容易理解","easyToUnderstand","thumb-up"],["確實解決了我的問題","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["缺少我需要的資訊","missingTheInformationINeed","thumb-down"],["過於複雜/步驟過多","tooComplicatedTooManySteps","thumb-down"],["過時","outOfDate","thumb-down"],["翻譯問題","translationIssue","thumb-down"],["示例/程式碼問題","samplesCodeIssue","thumb-down"],["其他","otherDown","thumb-down"]],["上次更新時間:2025-07-27 (世界標準時間)。"],[],[],null,["# SurfaceTexture\n\n`SurfaceTexture` is a combination of a surface and an [OpenGL ES (GLES)](https://www.khronos.org/opengles/)\ntexture. `SurfaceTexture` instances are used to provide surfaces that output to GLES\ntextures.\n\n`SurfaceTexture` contains an instance of `BufferQueue` for which\napps are the consumer. The `onFrameAvailable()` callback\nnotifies apps when the producer queues a new buffer. Then, apps call\n`updateTexImage()`, which releases the previously held buffer,\nacquires the new buffer from the queue, and makes [EGL](https://www.khronos.org/egl) calls to make the buffer available to GLES as an external texture.\n\nExternal GLES textures\n----------------------\n\nExternal GLES textures (`GL_TEXTURE_EXTERNAL_OES`) differ\nfrom traditional GLES textures (`GL_TEXTURE_2D`) in the following\nways:\n\n- External textures render textured polygons directly from data received from `BufferQueue`.\n- External texture renderers are configured differently than traditional GLES texture renderers.\n- External textures can't perform all traditional GLES texture activities.\n\nThe main benefit of external textures is\ntheir ability to render directly from `BufferQueue` data. `SurfaceTexture`\ninstances set the consumer usage flags to `GRALLOC_USAGE_HW_TEXTURE` when it creates\n`BufferQueue` instances for external textures to ensure that the data in the buffer is\nrecognizable by GLES.\n\nBecause `SurfaceTexture` instances interact with an EGL context, an app can only call\nits methods while the EGL context that owns the texture is current on the\ncalling thread. For more information see the [`SurfaceTexture`](https://developer.android.com/reference/android/graphics/SurfaceTexture) class documentation.\n\nTimestamps and transformations\n------------------------------\n\n`SurfaceTexture` instances include the `getTimeStamp()` method, which\nretrieves a timestamp, and `getTransformMatrix()` method, which\nretrieves a transformation matrix. Calling `updateTexImage()`\nsets both the timestamp and the transformation matrix. Each buffer that\n`BufferQueue` passes includes transformation parameters and a timestamp.\n\nTransformation parameters are useful for efficiency. In some cases,\nsource data might be in the incorrect orientation for the consumer.\nInstead of rotating the data before sending it to the consumer, send the data in\nits orientation with a transform that corrects it. The transformation\nmatrix can be merged with other transformations when the data is used,\nminimizing overhead.\n\nThe timestamp is useful for buffer sources that are time dependent. For\nexample, when `setPreviewTexture()` connects\nthe producer interface to the output of the camera, frames from the camera can\nbe used to create a video. Each frame needs to have a presentation\ntimestamp from when the frame was captured, not from when the app received the\nframe. 
Case study: Grafika's continuous capture
----------------------------------------

[Grafika's continuous capture](https://github.com/google/grafika/blob/master/app/src/main/java/com/android/grafika/ContinuousCaptureActivity.java) involves recording frames
from a device's camera and displaying those frames on screen.
To record frames, create a surface with the
[MediaCodec](https://developer.android.com/reference/android/media/MediaCodec) class's
`createInputSurface()` method and pass the surface to the camera. To
display frames, create an instance of `SurfaceView` and pass the surface to
`setPreviewDisplay()`. Note that recording frames and displaying
them at the same time is a more involved process.

The *continuous capture* activity displays video from the camera as
video is being recorded. In this case, encoded video is written to a
circular buffer in memory that can be saved to disk at any time.

This flow involves three buffer queues:

- `App` --- The app uses a `SurfaceTexture` instance to receive frames from the camera, converting them to an external GLES texture.
- `SurfaceFlinger` --- The app declares a `SurfaceView` instance to display the frames.
- `MediaServer` --- The app configures a `MediaCodec` encoder with an input surface to create the video.

In the figure below, the arrows indicate data propagation from the camera.
`BufferQueue` instances are in color (producers are teal, consumers are green).

**Figure 1.** Grafika's continuous capture
activity

Encoded H.264 video goes to a circular buffer in RAM in the app process.
When a user presses the capture button, the `MediaMuxer` class
writes the encoded video to an MP4 file on disk.

All `BufferQueue` instances are handled with a single EGL context in the
app, and the GLES operations are performed on the UI thread. The handling of
encoded data (managing a circular buffer and writing it to disk) is done
on a separate thread.
| **Warning:** If the video encoder locks up and blocks while dequeuing a buffer, the app becomes unresponsive.
| **Note:** Performing `SurfaceView` rendering on the UI thread is discouraged, but it's used in this case study for simplicity. More complex rendering should use a dedicated thread to isolate the GLES context and minimize interference with the app's UI rendering.

When using the `SurfaceView` class, the `surfaceCreated()` callback creates the `EGLContext` and `EGLSurface` instances for the display and the video encoder. When a new frame arrives, `SurfaceTexture` performs four activities:

1. Acquires the frame.
2. Makes the frame available as a GLES texture.
3. Renders the frame with GLES commands.
4. Forwards the transform and timestamp for each instance of `EGLSurface`.

The encoder thread then pulls the encoded output from `MediaCodec` and stashes
it in memory.
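The encoder leg of this flow can be sketched in Java, loosely following the approach Grafika takes. The helper class, method names, and encoder parameters below are illustrative rather than taken from the case study code:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLExt;
import android.opengl.EGLSurface;
import android.view.Surface;
import java.io.IOException;

// Hypothetical helper showing how the encoder becomes a second EGLSurface target.
public class CaptureSurfaces {
    // Configure an H.264 encoder whose input is a surface, then wrap that
    // surface in an EGLSurface so GLES can render into it.
    public static EGLSurface createEncoderSurface(EGLDisplay display, EGLConfig config,
            int width, int height) throws IOException {
        MediaFormat format =
                MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000); // illustrative values
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // Producer side of the encoder's BufferQueue.
        Surface inputSurface = encoder.createInputSurface();
        encoder.start();

        int[] surfaceAttribs = { EGL14.EGL_NONE };
        return EGL14.eglCreateWindowSurface(display, config, inputSurface, surfaceAttribs, 0);
    }

    // Render the latched camera frame into one EGLSurface target and submit it.
    public static void drawFrame(EGLDisplay display, EGLContext context,
            EGLSurface target, long timestampNs) {
        EGL14.eglMakeCurrent(display, target, target, context);
        // ... issue GLES draw calls sampling the external texture here ...
        // For the encoder surface, this timestamp flows into the encoded stream.
        EGLExt.eglPresentationTimeANDROID(display, target, timestampNs);
        EGL14.eglSwapBuffers(display, target);
    }
}
```

Calling `drawFrame()` once for the display `EGLSurface` and once for the encoder `EGLSurface`, passing the value from `SurfaceTexture.getTimestamp()`, is what carries the capture-time presentation timestamps described earlier into the recorded video.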
Secure texture video playback
-----------------------------

Android supports GPU post-processing of protected video content. This
lets apps use the GPU for complex, nonlinear video effects (such as warps),
mapping protected video content onto textures for use in general graphics scenes
(for example, using GLES), and virtual reality (VR).

**Figure 2.** Secure texture video playback

Support is enabled using the following two extensions:

- **EGL extension** --- ([EGL_EXT_protected_content](https://www.khronos.org/registry/egl/extensions/EXT/EGL_EXT_protected_content.txt)) Enables the creation of protected GL contexts and surfaces, which can both operate on protected content.
- **GLES extension** --- ([GL_EXT_protected_textures](https://www.khronos.org/registry/gles/extensions/EXT/EXT_protected_textures.txt)) Enables tagging textures as protected so they can be used as framebuffer texture attachments.

Android enables `SurfaceTexture` and ACodec
(`libstagefright.so`) to send protected content even if
the window's surface doesn't queue to `SurfaceFlinger`,
and provides a protected video surface for use within a protected context. This
is done by setting the protected consumer bit
(`GRALLOC_USAGE_PROTECTED`) on surfaces created in a protected
context (verified by ACodec).

Secure texture video playback sets the foundation for strong DRM
implementation in the OpenGL ES environment. Without a strong DRM implementation,
such as Widevine Level 1, many content providers don't allow rendering of
their high-value content in the OpenGL ES environment, preventing important VR
use cases such as watching DRM-protected content in VR.

AOSP includes framework code for secure texture video playback. Driver
support is up to OEMs. Device implementers must implement the
`EGL_EXT_protected_content` and
`GL_EXT_protected_textures` extensions. When using your own codec
library (to replace `libstagefright`), note the changes in
`/frameworks/av/media/libstagefright/SurfaceUtils.cpp` that allow
buffers marked with `GRALLOC_USAGE_PROTECTED` to be sent to
`ANativeWindow` (even if `ANativeWindow` doesn't queue directly to the window
composer) as long as the consumer usage bits contain
`GRALLOC_USAGE_PROTECTED`. For detailed documentation on implementing
the extensions, refer to the Khronos registries
([`EGL_EXT_protected_content`](https://www.khronos.org/registry/egl/extensions/EXT/EGL_EXT_protected_content.txt)
and
[`GL_EXT_protected_textures`](https://www.khronos.org/registry/gles/extensions/EXT/EXT_protected_textures.txt)).
| **Note:** Device implementers may need to make hardware changes to ensure that protected memory mapped onto the GPU remains protected and unreadable by unprotected code.
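From the app side, requesting a protected context means checking for the extension and passing the attribute defined in the `EGL_EXT_protected_content` spec at context creation. A minimal Java sketch, assuming the attribute value from the extension spec (the Java EGL bindings don't define a constant for it):

```java
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;

// Illustrative only: the constant value comes from the EGL_EXT_protected_content
// spec; availability must be checked via the EGL_EXTENSIONS string.
public class ProtectedContextFactory {
    private static final int EGL_PROTECTED_CONTENT_EXT = 0x32C0;

    public static EGLContext createProtectedContext(EGLDisplay display, EGLConfig config) {
        String extensions = EGL14.eglQueryString(display, EGL14.EGL_EXTENSIONS);
        if (extensions == null || !extensions.contains("EGL_EXT_protected_content")) {
            throw new UnsupportedOperationException("EGL_EXT_protected_content not supported");
        }
        int[] attribs = {
                EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
                EGL_PROTECTED_CONTENT_EXT, EGL14.EGL_TRUE, // request a protected context
                EGL14.EGL_NONE
        };
        return EGL14.eglCreateContext(display, config, EGL14.EGL_NO_CONTEXT, attribs, 0);
    }
}
```

Per the extension spec, the same attribute can also be passed to `eglCreateWindowSurface()` so that the surface, like the context, operates on protected content.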