Layers and displays
Layers and displays are two primitives that represent composition work and interactions with the display hardware.
Layers
A layer is the most important unit of composition. A layer is a combination of a surface and an instance of SurfaceControl. Each layer has a set of properties that define how it interacts with other layers. The following table describes the layer properties, and a short code sketch after the table shows how some of them are set:
Property | Description
Positional | Defines where the layer appears on its display. Includes information such as the positions of a layer's edges and its Z order relative to other layers (that is, whether it should be in front of or behind other layers).
Content | Defines how the content displayed on the layer should be presented within the bounds defined by the positional properties. Includes information such as crop (to expand a portion of the content to fill the bounds of the layer) and transform (to show rotated or flipped content).
Composition | Defines how the layer should be composited with other layers. Includes information such as the blending mode and a layer-wide alpha value for alpha compositing.
Optimization | Provides information that isn't strictly necessary to composite the layer correctly, but that the Hardware Composer (HWC) device can use to optimize how it performs composition. Includes information such as the visible region of the layer and which portion of the layer has been updated since the previous frame.
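For a concrete feel for these properties, here is a minimal sketch using the public android.view.SurfaceControl API (API level 29+). The layer name and buffer size are illustrative; in a real app the SurfaceControl handle would usually come from a SurfaceView rather than being built directly:

```java
import android.view.SurfaceControl;

// Build a standalone SurfaceControl. In an app this handle is usually
// obtained from a SurfaceView via getSurfaceControl() instead.
SurfaceControl layer = new SurfaceControl.Builder()
        .setName("demo-layer")
        .setBufferSize(1280, 720)
        .build();

// Atomically update properties corresponding to rows of the table above.
new SurfaceControl.Transaction()
        .setLayer(layer, 1)          // positional: Z order relative to sibling layers
        .setAlpha(layer, 0.5f)       // composition: layer-wide alpha for blending
        .setVisibility(layer, true)  // include the layer in composition
        .apply();
```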
Displays
A display is another important unit of composition. A system can have multiple displays, and displays can be added or removed during normal system operation. Displays are added or removed at the request of the HWC or of the framework. The HWC device requests that a display be added or removed when an external display is connected to or disconnected from the device, which is called hotplugging. Clients request virtual displays, whose contents are rendered into an off-screen buffer instead of to a physical display.
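Apps don't see HWC hotplug events directly, but the framework reflects display additions and removals through DisplayManager. As a rough illustration (the context variable is assumed to be an available android.content.Context):

```java
import android.hardware.display.DisplayManager;
import android.os.Handler;
import android.os.Looper;

DisplayManager dm = context.getSystemService(DisplayManager.class);
dm.registerDisplayListener(new DisplayManager.DisplayListener() {
    @Override public void onDisplayAdded(int displayId) {
        // An external or virtual display was attached.
    }
    @Override public void onDisplayRemoved(int displayId) {
        // A display was disconnected, for example an HDMI unplug.
    }
    @Override public void onDisplayChanged(int displayId) {
        // A display's properties (size, state, and so on) changed.
    }
}, new Handler(Looper.getMainLooper()));
```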
Virtual displays
SurfaceFlinger supports an internal display (built into the phone or tablet), external displays (such as a television connected through HDMI), and one or more virtual displays that make composited output available within the system. Virtual displays can be used to record the screen or send it over a network. Frames generated for a virtual display are written to a BufferQueue.
Virtual displays may share the same set of layers as the main display (the layer stack) or have their own set. There's no VSync for a virtual display, so the VSync for the internal display triggers composition for all displays.
On HWC implementations that support them, virtual displays can be composited with OpenGL ES (GLES), HWC, or both GLES and HWC. On nonsupporting implementations, virtual displays are always composited using GLES.
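Here is a minimal sketch of requesting a virtual display through the public DisplayManager API, with an ImageReader acting as the off-screen consumer. The size, density, and name are illustrative, and context is again an assumed Context:

```java
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.ImageReader;

// Consumer side: frames for the virtual display land in this off-screen
// BufferQueue rather than on a physical panel.
ImageReader reader = ImageReader.newInstance(
        1280, 720, PixelFormat.RGBA_8888, /* maxImages= */ 2);

// Producer side: SurfaceFlinger composites into the reader's surface.
DisplayManager dm = context.getSystemService(DisplayManager.class);
VirtualDisplay vd = dm.createVirtualDisplay(
        "offscreen-demo", 1280, 720, /* densityDpi= */ 320,
        reader.getSurface(), /* flags= */ 0);
```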
Case study: screenrecord
The screenrecord command lets the user record everything that appears on the screen as an MP4 file on disk. To implement this, the system receives composited frames from SurfaceFlinger, writes them to the video encoder, and then writes the encoded video data to a file. The video codecs are managed by a separate process (mediaserver), so large graphics buffers have to move around the system. To make it more challenging, the goal is to record 60 fps video at full resolution. The key to making this work efficiently is BufferQueue.
The MediaCodec class allows an app to provide data as raw bytes in buffers, or through a surface. When screenrecord requests access to a video encoder, the mediaserver process creates a BufferQueue, connects itself to the consumer side, and then passes the producer side back to screenrecord as a surface.
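The surface-input path looks roughly like this with the public MediaCodec API; the resolution and bitrate values are illustrative, not necessarily what screenrecord uses:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

// Configure an AVC encoder for surface input (createEncoderByType can
// throw IOException, omitted here for brevity).
MediaFormat format = MediaFormat.createVideoFormat(
        MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 6_000_000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 60);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

MediaCodec encoder = MediaCodec.createEncoderByType(
        MediaFormat.MIMETYPE_VIDEO_AVC);
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// The producer end of the encoder's BufferQueue, handed back as a surface.
Surface inputSurface = encoder.createInputSurface();
encoder.start();
```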
The screenrecord utility then asks SurfaceFlinger to create a virtual display that mirrors the main display (that is, it has all of the same layers) and directs it to send its output to the surface that came from the mediaserver process. In this case, SurfaceFlinger is the producer of buffers rather than the consumer.
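screenrecord does this wiring through SurfaceFlinger's internal display APIs, which aren't available to ordinary apps. As a loose app-level analogue (an unprivileged app would obtain mirroring rights through MediaProjection instead), the encoder's input surface from the previous sketch could be handed to a virtual display:

```java
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;

// SurfaceFlinger now produces composited frames directly into the encoder.
VirtualDisplay recordingDisplay = context.getSystemService(DisplayManager.class)
        .createVirtualDisplay("recording", 1280, 720, /* densityDpi= */ 320,
                inputSurface, /* flags= */ 0);
```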
After the configuration is complete, screenrecord triggers when the encoded data appears. As apps draw, their buffers travel to SurfaceFlinger, which composites them into a single buffer that's sent directly to the video encoder in the mediaserver process. The full frames are never seen by the screenrecord process. Internally, the mediaserver process has its own way of moving buffers around that also passes data by handle, minimizing overhead.
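Draining the encoded output as it appears follows the standard MediaCodec pattern. A sketch, assuming the encoder from the earlier fragment and an illustrative output path (MediaMuxer's constructor can throw IOException, omitted here):

```java
import android.media.MediaCodec;
import android.media.MediaMuxer;
import java.nio.ByteBuffer;

MediaMuxer muxer = new MediaMuxer("/sdcard/demo.mp4",
        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int track = -1;
boolean done = false;
while (!done) {
    int index = encoder.dequeueOutputBuffer(info, /* timeoutUs= */ 10_000);
    if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        track = muxer.addTrack(encoder.getOutputFormat());
        muxer.start();
    } else if (index >= 0) {
        ByteBuffer data = encoder.getOutputBuffer(index);
        // Skip codec-config buffers; the muxer gets that data from the format.
        if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0
                && info.size > 0) {
            muxer.writeSampleData(track, data, info);
        }
        encoder.releaseOutputBuffer(index, /* render= */ false);
        done = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
    }
}
muxer.stop();
muxer.release();
```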
Case study: simulate secondary displays
WindowManager can ask SurfaceFlinger to create a visible layer for which SurfaceFlinger acts as the BufferQueue consumer. It's also possible to ask SurfaceFlinger to create a virtual display, for which SurfaceFlinger acts as the BufferQueue producer.
If you connect a virtual display to a visible layer, a closed loop is created where the composited screen appears in a window. That window is now part of the composited output, so on the next refresh the composited image inside the window shows the window's contents as well. To see this in action, enable Developer options in Settings, select Simulate secondary displays, and enable a window. To see secondary displays in action, use screenrecord to capture the act of enabling the display, then play it back frame by frame.
[[["容易理解","easyToUnderstand","thumb-up"],["確實解決了我的問題","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["缺少我需要的資訊","missingTheInformationINeed","thumb-down"],["過於複雜/步驟過多","tooComplicatedTooManySteps","thumb-down"],["過時","outOfDate","thumb-down"],["翻譯問題","translationIssue","thumb-down"],["示例/程式碼問題","samplesCodeIssue","thumb-down"],["其他","otherDown","thumb-down"]],["上次更新時間:2025-07-27 (世界標準時間)。"],[],[],null,["# Layers and displays are two primitives that represent composition work\nand interactions with the display hardware.\n\nLayers\n------\n\nA *layer* is the most important unit of composition. A layer is a\ncombination of a [surface](/docs/core/graphics/arch-sh) and an instance of\n[`SurfaceControl`](https://developer.android.com/reference/android/view/SurfaceControl).\nEach layer has a set of properties that define how it interacts with other layers. Layer\nproperties are described in the following table:\n\n| Property | Description |\n|--------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n| Positional | Defines where the layer appears on its display. Includes information such as the positions of a layer's edges and its *Z order* relative to other layers (whether it should be in front of or behind other layers). |\n| Content | Defines how content displayed on the layer should be presented within the bounds defined by the positional properties. Includes information such as crop (to expand a portion of the content to fill the bounds of the layer) and transform (to show rotated or flipped content). |\n| Composition | Defines how the layer should be composited with other layers. Includes information such as blending mode and a layer-wide alpha value for [alpha compositing](https://en.wikipedia.org/wiki/Alpha_compositing#Alpha_blending). |\n| Optimization | Provides information not strictly necessary to correctly composite the layer, but that can be used by the Hardware Composer (HWC) device to optimize how it performs composition. Includes information such as the visible region of the layer and which portion of the layer has been updated since the previous frame. |\n\nDisplays\n--------\n\nA *display* is another important unit of composition. A system can\nhave multiple displays and\ndisplays can be added or removed during normal system operations. Displays are\nadded or removed at the request of the HWC or at the request of the framework.\nThe HWC device requests displays be added or removed when an external\ndisplay is connected or disconnected from the device, which is called\n*hotplugging* . Clients request *virtual displays*, whose contents\nare rendered into an off-screen buffer instead of to a physical display.\n\n### Virtual displays\n\n[SurfaceFlinger](/docs/core/graphics/arch-sf-hwc#surfaceflinger)\nsupports an internal display (built into the phone or tablet), external displays\n(such as a television connected through HDMI), and one or more virtual displays\nthat make composited output available within the system. Virtual displays can\nbe used to record the screen or send the screen over a network. Frames generated\nfor a virtual display are written to a BufferQueue.\n\nVirtual displays may share the same set of layers as the main display\n(the layer stack) or have their own set. 
There's no VSync for a virtual display,\nso the VSync for the internal display triggers composition for all\ndisplays.\n\nOn HWC implementations that support them, virtual\ndisplays can be composited with OpenGL ES (GLES), HWC, or both GLES and HWC.\nOn nonsupporting implementations, virtual displays are always composited using\nGLES.\n\nCase study: screenrecord\n------------------------\n\nThe [`screenrecord` command](https://android.googlesource.com/platform/frameworks/av/+/android16-release/cmds/screenrecord/) allows the user to\nrecord everything that appears on the screen as an MP4 file on\ndisk. To implement this, the system receives composited frames from\nSurfaceFlinger, writes them to the video encoder, and then writes the encoded\nvideo data to a file. The video codecs are managed by a separate process\n(`mediaserver`), so large graphics buffers have to move around the\nsystem. To make it more challenging, the goal is to record 60 fps video at\nfull resolution. The key to making this work efficiently is BufferQueue.\n\nThe `MediaCodec` class allows an app to provide data as raw bytes in buffers,\nor through a surface. When `screenrecord` requests access to a video\nencoder, the `mediaserver` process creates a BufferQueue, connects\nitself to the consumer side, then passes the producer side back to\n`screenrecord` as a surface.\n\nThe `screenrecord` utility then asks SurfaceFlinger to create a\nvirtual display that mirrors the main display (that is, it has all of the same\nlayers), and directs it to send output to the surface that came from the\n`mediaserver` process. In this case, SurfaceFlinger is the producer\nof buffers rather than the consumer.\n\nAfter the configuration is complete, `screenrecord` triggers when\nthe encoded data appears. As apps draw, their buffers travel to SurfaceFlinger,\nwhich composites them into a single buffer that's sent directly to the video\nencoder in the `mediaserver` process. The full frames are never\nseen by the `screenrecord` process. Internally, the\n`mediaserver` process has its own way of moving buffers around that\nalso passes data by handle, minimizing overhead.\n\nCase study: simulate secondary displays\n---------------------------------------\n\nThe WindowManager can ask SurfaceFlinger to create a visible layer for which\nSurfaceFlinger acts as the BufferQueue consumer. It's also possible to ask\nSurfaceFlinger to create a virtual display, for which SurfaceFlinger acts as\nthe BufferQueue producer.\n\nIf you connect a virtual display to a visible layer, a closed loop is created\nwhere the composited screen appears in a window. That window is now part of the\ncomposited output, so on the next refresh the composited image inside the window\nshows the window contents as well. To see this in action, enable\n[Developer options](https://developer.android.com/studio/debug/dev-options) in **Settings** , select\n**Simulate secondary displays** , and enable a window. To see\nsecondary displays in action, use `screenrecord` to capture the act\nof enabling the display then play it back frame by frame."]]