Layers and displays
Layers and displays are two primitives that represent composition work and interactions with the display hardware.
Layers
A layer is the most important unit of composition. A layer is a combination of a surface and a SurfaceControl instance. Each layer has a set of properties that define how it interacts with other layers. Layer properties are described in the following table:
| Property | Description |
| --- | --- |
| Positional | Defines where the layer appears on its display. Includes information such as the positions of the layer's edges and its Z order relative to other layers (whether it is in front of or behind other layers). |
| Content | Defines how the content displayed on the layer should be presented within the bounds set by the positional properties. Includes information such as crop (to expand a portion of the content to fill the layer's bounds) and transform (to show rotated or flipped content). |
| Composition | Defines how the layer should be composited with other layers. Includes information such as the blending mode and a layer-wide alpha value for alpha compositing. |
| Optimization | Provides information that isn't strictly necessary to composite the layer correctly, but that the Hardware Composer (HWC) device can use to optimize how it performs composition. Includes information such as the layer's visible region and which portion of the layer has been updated since the previous frame. |
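The interplay between the positional and composition properties can be sketched with a toy model. The following is a simplified illustration, not the real SurfaceControl API: each layer carries a Z order and a layer-wide alpha, and the compositor blends the layers back-to-front with the standard "over" operator.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Toy model of layer composition (not the real SurfaceControl API).
public class LayerModel {
    static class Layer {
        final int z;          // positional: Z order relative to other layers
        final double alpha;   // composition: layer-wide alpha in [0, 1]
        final double color;   // content: a single gray value in [0, 1]
        Layer(int z, double alpha, double color) {
            this.z = z; this.alpha = alpha; this.color = color;
        }
    }

    // Blend layers back-to-front using the "over" operator:
    // result = src * alpha + dst * (1 - alpha), starting from black.
    static double composite(List<Layer> layers) {
        List<Layer> sorted = new ArrayList<>(layers);
        sorted.sort(Comparator.comparingInt(l -> l.z));
        double dst = 0.0;
        for (Layer l : sorted) {
            dst = l.color * l.alpha + dst * (1.0 - l.alpha);
        }
        return dst;
    }

    public static void main(String[] args) {
        double result = composite(List.of(
                new Layer(0, 1.0, 0.2),    // wallpaper, at the back
                new Layer(1, 1.0, 0.8),    // opaque app window
                new Layer(2, 0.5, 0.4)));  // translucent dialog on top
        System.out.printf("composited gray value: %.2f%n", result); // 0.60
    }
}
```

The opaque app window completely hides the wallpaper behind it, while the translucent dialog's alpha of 0.5 mixes its own content with the window below, which is exactly the role the composition properties play in the table above.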
Displays
A display is another important unit of composition. A system can have multiple displays, and displays can be added or removed during normal system operation. Displays are added or removed at the request of the HWC or at the request of the framework.
The HWC device requests that displays be added or removed when an external display is connected to or disconnected from the device, which is called hotplugging. Clients request virtual displays, whose contents are rendered into an off-screen buffer instead of to a physical display.
Virtual displays
SurfaceFlinger supports an internal display (built into the phone or tablet), external displays (such as a television connected through HDMI), and one or more virtual displays that make composited output available within the system. Virtual displays can be used to record the screen or send it over a network. Frames generated for a virtual display are written to a BufferQueue.
Virtual displays can share the same set of layers as the main display (the layer stack) or have their own set. There is no VSYNC for a virtual display, so the internal display's VSYNC triggers composition for all displays.
On HWC implementations that support virtual displays, a virtual display can be composited with OpenGL ES (GLES), the HWC, or both GLES and the HWC. On implementations without such support, virtual displays are always composited using GLES.
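The layer-stack relationship can be illustrated with a small sketch. This is a simplified model, not SurfaceFlinger's actual data structures: each display selects the layers it composites by matching a layer-stack ID, so a mirroring virtual display simply reuses the main display's ID, while an independent virtual display uses its own.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified model of layer stacks (not SurfaceFlinger's real structures).
public class LayerStacks {
    record Layer(String name, int layerStack) {}
    record Display(String name, int layerStack) {}

    // A display composites exactly the layers whose layer-stack ID matches its own.
    static List<String> visibleLayers(Display display, List<Layer> layers) {
        List<String> visible = new ArrayList<>();
        for (Layer l : layers) {
            if (l.layerStack() == display.layerStack()) visible.add(l.name());
        }
        return visible;
    }

    public static void main(String[] args) {
        List<Layer> layers = List.of(
                new Layer("status-bar", 0),
                new Layer("app", 0),
                new Layer("presentation", 1)); // content meant only for a secondary display
        Display internal = new Display("internal", 0);
        Display mirror = new Display("virtual-mirror", 0); // shares the main layer stack
        Display own = new Display("virtual-own", 1);       // has its own layer stack
        System.out.println(visibleLayers(internal, layers)); // [status-bar, app]
        System.out.println(visibleLayers(mirror, layers));   // [status-bar, app]
        System.out.println(visibleLayers(own, layers));      // [presentation]
    }
}
```

The mirroring display sees exactly what the internal display sees because the two share one layer stack; this is the mechanism the screenrecord case study below relies on.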
Case study: screenrecord
The screenrecord command lets the user record everything that appears on the screen as an .mp4 file on disk. To implement this, the system receives composited frames from SurfaceFlinger, writes them to the video encoder, and then writes the encoded video data to a file. The video codecs are managed by a separate process (mediaserver), so large graphics buffers have to move around the system. To make things more challenging, the goal is to record 60 fps video at full resolution. The key to doing this efficiently is BufferQueue.
The MediaCodec class lets an app provide data either as raw bytes in buffers or through a surface. When screenrecord requests access to a video encoder, the mediaserver process creates a BufferQueue, connects itself to the consumer side, and then passes the producer side back to screenrecord as a surface.
The screenrecord utility then asks SurfaceFlinger to create a virtual display that mirrors the main display (that is, it has exactly the same layers) and directs it to send its output to the surface that came from the mediaserver process. In this case, SurfaceFlinger is the producer of buffers rather than the consumer.
After configuration is complete, screenrecord is triggered when the encoded data appears. As apps draw, their buffers travel to SurfaceFlinger, which composites them into a single buffer that is sent directly to the video encoder in the mediaserver process. The screenrecord process never sees the full frames. Internally, the mediaserver process has its own way of moving buffers around, one that also passes data by handle, minimizing overhead.
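The buffer flow described above can be sketched as a producer/consumer pipeline. This is a conceptual model, not the real BufferQueue implementation: a bounded queue carries only small buffer handles from the producer (playing the SurfaceFlinger role) to the consumer (playing the encoder role), so the pixel data itself is never copied through the queue.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Conceptual model of the screenrecord pipeline (not real Android APIs):
// only buffer handles cross the queue; pixel data is passed by reference.
public class BufferQueueModel {
    static final int END_OF_STREAM = -1;

    static List<Integer> run(int frames) throws InterruptedException {
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(3); // bounded, like BufferQueue
        List<Integer> encoded = Collections.synchronizedList(new ArrayList<>());

        Thread producer = new Thread(() -> { // plays the SurfaceFlinger role
            try {
                for (int handle = 0; handle < frames; handle++) queue.put(handle);
                queue.put(END_OF_STREAM);
            } catch (InterruptedException ignored) { }
        });

        Thread consumer = new Thread(() -> { // plays the mediaserver encoder role
            try {
                int handle;
                while ((handle = queue.take()) != END_OF_STREAM) encoded.add(handle);
            } catch (InterruptedException ignored) { }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
        return encoded;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run(5)); // [0, 1, 2, 3, 4]
    }
}
```

The bounded queue also models back-pressure: if the encoder falls behind, the producer blocks in put() instead of allocating unbounded buffers.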
Case study: simulate secondary displays
WindowManager can ask SurfaceFlinger to create a visible layer for which SurfaceFlinger acts as the BufferQueue consumer. It can also ask SurfaceFlinger to create a virtual display for which SurfaceFlinger acts as the BufferQueue producer.
If you connect the virtual display to the visible layer, a closed loop is created in which the composited screen appears in a window. That window is now part of the composited output, so on the next refresh the composited image inside the window shows the window's contents as well. To see this in action, enable Developer options in Settings, select Simulate secondary displays, and enable a window. To see the secondary display in action, use screenrecord to capture the act of enabling the display, then play it back frame by frame.
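The feedback effect can be sketched in a few lines. This is a hypothetical model, not real compositor code: each refresh recomposites the screen, and because the window shows the previous composited frame, every refresh nests one more copy of the screen inside the window.

```java
// Toy model of the closed loop between a mirroring virtual display and a
// visible window: each refresh nests the previous frame inside the window.
public class MirrorLoop {
    static String refresh(String previousFrame) {
        return "screen[window=" + previousFrame + "]";
    }

    public static void main(String[] args) {
        String frame = "empty";
        for (int i = 0; i < 3; i++) {
            frame = refresh(frame);
        }
        // After three refreshes the screen contains three nested copies:
        System.out.println(frame); // screen[window=screen[window=screen[window=empty]]]
    }
}
```

This is why stepping through a screenrecord capture frame by frame shows the "hall of mirrors" effect deepening by one level per refresh.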
The content and code samples on this page are subject to the licenses described in the Content License. Java and OpenJDK are trademarks or registered trademarks of Oracle and/or its affiliates.
Last updated (UTC): 2025-03-26.