The Rendering Architecture: mpv + OpenGL
HorangPlayer's rendering is built on a single, highly optimized path: mpv renders directly through OpenGL into a CAOpenGLLayer. No intermediate copies, no context switches, no Metal fallbacks.
The mpv OpenGL Pipeline
mpv handles the entire rendering pipeline internally — YUV to RGB conversion, scaling, subtitle compositing, tone mapping — in a single GPU pass via GLSL shaders.
This is the same approach used by IINA, and it is extremely efficient. mpv's renderer provides:
- Automatic BT.601/BT.709/BT.2020 color space detection
- Hardware-accelerated decoding via VideoToolbox
- Native HDR tone mapping (tone-mapping=auto)
- Built-in ASS subtitle rendering via libass
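Most of these capabilities map directly to mpv options. A minimal sketch of the equivalent configuration — the option names are mpv's own, but whether HorangPlayer sets exactly these values is an assumption:

```
# Sketch of the mpv options behind the features above.
hwdec=videotoolbox      # hardware-accelerated decoding via VideoToolbox
tone-mapping=auto       # native HDR tone mapping
sub-ass=yes             # ASS subtitle rendering via libass
# Color space detection (BT.601/709/2020) is automatic; no option needed.
```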
How It Works
The render loop follows IINA's battle-tested pattern:
- mpv signals a new frame via mpv_render_context_set_update_callback
- We dispatch to a dedicated render queue (mpvGLQueue, .userInteractive QoS)
- mpv_render_context_update() confirms the frame is ready
- We read the actual FBO and viewport from OpenGL state
- mpv_render_context_render() draws the frame into the CAOpenGLLayer
- mpv_render_context_report_swap() tells mpv the frame was displayed
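A condensed Swift sketch of that loop. The libmpv function names are real; the surrounding glue (the layer's `queue` property, the exact FBO handling) is illustrative, not HorangPlayer's actual code:

```swift
// Sketch only: assumes mpvRenderContext and an MPVVideoLayer exist elsewhere.
// Step 1: mpv signals a new frame; the C callback hops onto the render queue.
// (The callback must not capture context, so the layer is passed as userdata.)
mpv_render_context_set_update_callback(mpvRenderContext, { userdata in
    let layer = Unmanaged<MPVVideoLayer>.fromOpaque(userdata!).takeUnretainedValue()
    layer.queue.async { layer.display() }   // triggers draw(inCGLContext:)
}, UnsafeMutableRawPointer(Unmanaged.passUnretained(videoLayer).toOpaque()))

// Steps 2–5, inside the layer's draw callback, running on mpvGLQueue:
if mpv_render_context_update(mpvRenderContext)
    & UInt64(MPV_RENDER_UPDATE_FRAME.rawValue) != 0 {
    var fbo: GLint = 0
    glGetIntegerv(GLenum(GL_FRAMEBUFFER_BINDING), &fbo)   // read the real FBO
    var mpvFBO = mpv_opengl_fbo(fbo: Int32(fbo),
                                w: Int32(bounds.width),
                                h: Int32(bounds.height),
                                internal_format: 0)
    withUnsafeMutablePointer(to: &mpvFBO) { fboPtr in
        var params = [
            mpv_render_param(type: MPV_RENDER_PARAM_OPENGL_FBO,
                             data: UnsafeMutableRawPointer(fboPtr)),
            mpv_render_param(type: MPV_RENDER_PARAM_INVALID, data: nil)
        ]
        mpv_render_context_render(mpvRenderContext, &params)  // draw the frame
    }
    mpv_render_context_report_swap(mpvRenderContext)          // frame displayed
}
```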
The key components:
- MPVDecoder — Wraps the libmpv C API, manages the render context
- MPVVideoLayer — A CAOpenGLLayer subclass that provides the OpenGL surface
- MPVVideoView — A SwiftUI NSViewRepresentable wrapper
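A skeletal view of how these components fit together. The class names match the list above; the bodies are illustrative sketches (in particular, `MPVDecoder.render()` as the wrapper around mpv_render_context_render is an assumption):

```swift
import Cocoa
import SwiftUI

// Sketch: class names match the components above; bodies are illustrative.
final class MPVVideoLayer: CAOpenGLLayer {
    weak var decoder: MPVDecoder?

    override func draw(inCGLContext ctx: CGLContextObj,
                       pixelFormat pf: CGLPixelFormatObj,
                       forLayerTime t: CFTimeInterval,
                       displayTime ts: UnsafePointer<CVTimeStamp>?) {
        decoder?.render()   // mpv_render_context_render under the hood
        super.draw(inCGLContext: ctx, pixelFormat: pf,
                   forLayerTime: t, displayTime: ts)  // flushes the context
    }
}

struct MPVVideoView: NSViewRepresentable {
    let decoder: MPVDecoder

    func makeNSView(context: Context) -> NSView {
        let view = NSView()
        let layer = MPVVideoLayer()
        layer.decoder = decoder
        view.layer = layer        // layer-hosting: assign layer first,
        view.wantsLayer = true    // then opt in to layer backing
        return view
    }

    func updateNSView(_ nsView: NSView, context: Context) {}
}
```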
Why OpenGL Instead of Metal?
mpv's render API is built around OpenGL. While Apple has deprecated OpenGL on macOS, it remains fully functional and mpv's GLSL shader pipeline is heavily optimized for it. The alternative — extracting pixel buffers from mpv and re-rendering through Metal — would add an unnecessary copy and negate mpv's rendering optimizations.
We silence the deprecation warnings with GL_SILENCE_DEPRECATION=1 and accept the tradeoff: proven, battle-tested rendering performance over a "modern" API that would actually be slower for our use case.
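One way to set that flag in an Xcode project, shown here in xcconfig form (the setting name is Xcode's; whether HorangPlayer uses an xcconfig file or the build settings UI is an assumption):

```
// Target build settings (xcconfig form): silence OpenGL deprecation
// warnings from the C headers.
GCC_PREPROCESSOR_DEFINITIONS = $(inherited) GL_SILENCE_DEPRECATION=1
```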
Render Thread Isolation
All rendering happens on a dedicated dispatch queue (mpvGLQueue) with .userInteractive QoS. The main thread handles UI updates only.
This ensures the SwiftUI interface remains responsive even during heavy 4K playback — a lesson learned from IINA, which found that main-thread rendering caused side panel animations to stutter.
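The queue itself is a single Dispatch declaration; the label below is illustrative, while the QoS matches the text:

```swift
import Dispatch

// Dedicated render queue; label is illustrative, QoS matches the text.
let mpvGLQueue = DispatchQueue(label: "com.horangplayer.mpv.gl",
                               qos: .userInteractive)

// All mpv render calls are funneled through it:
mpvGLQueue.async {
    // mpv_render_context_update() / _render() / _report_swap() run here,
    // never on the main thread.
}
```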
Video Filters
Video filters (brightness, contrast, saturation, sharpen, deband) are applied through mpv's built-in filter system. mpv processes these in the same GPU pass as the main rendering — no separate filter pipeline, no additional overhead.
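A sketch of how such adjustments look through libmpv's property API — the property names (brightness, contrast, saturation, deband) are mpv's; the helper function and the handle variable `mpv` are illustrative:

```swift
// Sketch: adjusting mpv's built-in equalizer options at runtime.
// The setFilter helper is illustrative, not HorangPlayer's actual API.
func setFilter(_ name: String, to value: Int64, on handle: OpaquePointer) {
    var v = value
    mpv_set_property(handle, name, MPV_FORMAT_INT64, &v)
}

setFilter("brightness", to: 10, on: mpv)   // range -100…100
setFilter("contrast",   to: 5,  on: mpv)
setFilter("saturation", to: -8, on: mpv)
mpv_set_property_string(mpv, "deband", "yes")
```

Because these are mpv options rather than an external filter graph, they ride along in the same GPU pass as the rest of the rendering.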
Filter settings are saved per-file in SQLite, so your adjustments persist across sessions.
HDR Tone Mapping
For HDR content (PQ/HDR10, HLG), mpv handles all tone mapping natively:
- tone-mapping=auto — Adaptive tone mapping algorithm
- hdr-compute-peak=auto — Dynamic peak detection
- target-colorspace-hint=auto — Display hint for HDR-capable monitors
HorangPlayer's HDRManager observes mpv's video-params/primaries and video-params/gamma properties to detect HDR content and update the UI state, but all the actual tone mapping work is done by mpv.
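The observation side can be sketched in a few lines — the property names are mpv's, while the reply IDs and the helper are illustrative:

```swift
// Sketch: register for HDR-relevant property changes.
// Reply userdata values (1, 2) are illustrative.
mpv_observe_property(mpv, 1, "video-params/primaries", MPV_FORMAT_STRING)
mpv_observe_property(mpv, 2, "video-params/gamma", MPV_FORMAT_STRING)

// In the event loop: a PQ or HLG transfer function marks HDR content.
func isHDR(gamma: String) -> Bool {
    gamma == "pq" || gamma == "hlg"
}
```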
The Result
This single-path architecture keeps things simple and fast. mpv's renderer is purpose-built for video — it handles color conversion, scaling, filtering, subtitles, and tone mapping in one GPU pass. By not adding intermediate layers or framework abstractions, HorangPlayer matches IINA's rendering performance while keeping the codebase lean.