Lions, tigers, and high-DPI, oh my

Captain's Log: Stardate 78275.5

While continuing to work on making the new 3D rendering engine and its associated 2D GUI code more robust, I eventually came to a long-standing TODO: double-check that things worked well when the little Auto-Scale Plug-In Window item was un-checked in Ableton Live:

This obscure little option ended up taking me down a multi-day rabbit hole learning about high-DPI handling in Windows.

Taking a step back, the basic history here is that around 2011, Apple introduced the Retina display, which essentially was a computer screen with double the DPI of a normal monitor. Here DPI means "dots per inch," that is, the number of pixels in a row an inch long. It didn't take super long for high-DPI displays to catch on with non-Apple PC manufacturers.

The difficulty with high-DPI displays is that historically, desktop applications tend to have their GUI layouts written using pixels as the main measurement. So, a button might be sized as 64x128 pixels. When most monitors had similar DPIs, this worked reasonably well. However, when that application is run on a monitor where each pixel is physically half as wide, that 64x128 pixel button might be much too small. Perhaps a better example is font rendering, where if a text area is defined in terms of pixels, the font might become so small as to be unreadable on a high-DPI display.

Apple's solution to this problem was clever: by standardizing their Retina displays at exactly double the DPI of a normal display, the pixel dimension calculations in old apps could simply be scaled up by a factor of 2 by the OS. Because this is an exact doubling, even when things like bitmaps are scaled, they come out sharp (just lower resolution). And for things like fonts, or other drawing primitives, the OS-level rendering engine can rasterize things at the full 2x pixel resolution behind the scenes to get high-resolution results.

Microsoft had a more difficult problem to solve. Since Microsoft has much less control over what hardware its OS runs on, Windows has to provide a variety of scaling factors, for compatibility with monitors at various DPIs. In practice this is a 100% - 500% slider that users can adjust to change the size of the Windows GUI:

This brings about two major issues.

First, because this scaling factor is not an integer (like Apple's 2x scale), old apps written before high-DPI monitors were a thing look awful when they're scaled up by the OS. There's just no way around it; things like fonts in particular are just God-awful when you blow up 1 pixel to, say, 1.75 pixels. It's impossible to make this look good. So if you're ever running an older Windows app with this scale set to something that isn't an integer multiple of 100%, this is why it looks blurry and awful. (And this is why the settings menu will recommend 200% or 300% depending on your monitor, because those will at least scale somewhat better.)

The second issue is much more subtle. Because Windows runs on such heterogeneous hardware, it provides different scaling factors for each monitor. Without this, if a user had two displays where one had double the DPI of the other, they could never get apps to appear at a good size on both monitors.

From what I can tell, having come into all this after the fact, Microsoft went through several iterations of poor support for high-DPI monitors. This started with an API for applications to declare their "DPI Awareness" at the process level. Basically, by default apps are registered as UNAWARE of DPI issues, and Windows will automatically bitmap-scale them up, so they'll be large enough to read but awfully blurry. The DPI-awareness API allows a newer application to register itself as SYSTEM_AWARE, meaning, "I know how to scale myself up and down in a way that will look sharp, based on a global DPI setting." On a single monitor system, this would largely work, with a big caveat I'll discuss below.

But, not all systems are single-monitor. So Microsoft introduced another DPI-awareness setting called PER_MONITOR_AWARE. This is more complex for an application to handle, because it has to respond to scale changes if, for example, the user drags the window from a low-DPI monitor to a high-DPI monitor. In other words, there's no longer just one global DPI for the application to scale to.

As an absolutely wonderful tidbit of history, the original PER_MONITOR_AWARE setting was actually buggy and failed to account for a bunch of important scenarios. And thus Microsoft had to introduce yet another setting called PER_MONITOR_AWARE_V2 that fixed these issues.

(There's even a super edge-case setting UNAWARE_GDISCALED for things like fonts in GDI apps to look slightly less horrible even though they are bitmap-scaled. But let's ignore that.)

Okay, so this is already all a huge mess, and obviously very complicated for Windows applications to deal with. It is much more complex than Apple's simple "blow everything up 2x" strategy.

But wait, there's more!™

Yes, actually things get more complicated, and this is where the history starts to tie in to problems for Anukari. You see, Anukari will primarily be used as a VST plugin in host applications (DAWs) like Ableton Live. Now, think about this: a host application like Ableton Live itself will have some DPI-awareness setting for its own GUI. But VST plugins are DLLs loaded into the host process, and those VST plugins might have different DPI-awareness. So what if the host is DPI-aware and the user loads an old VST plugin that is not DPI-aware?

Yes, this is an actual nightmare, and Windows added an API to handle it: it is now possible to declare the DPI-awareness not just for a process, but for a specific thread within that process. This means that different threads can have different DPI-awareness, and the way it works is that when a native OS window is created, it inherits the DPI-awareness of the thread that created it. So now each process, thread, and window in a Windows application has its own associated DPI-awareness from the list of 5 different awareness modes.

(I am pretty sure that Gary Larson did a comic about Satan welcoming a software engineer to hell, and their job was to deal with DPI-awareness in Windows applications. "Your room is right in here, Maestro.")

So, back to the little Auto-Scale Plug-In Window menu item in Ableton Live. This is a per-VST-plugin setting, and it is on by default. What "on" means is that Ableton will set up the main thread for the VST plugin with a DPI-awareness of UNAWARE, and will let the OS scale it up in an ugly way. That's right, Ableton makes new, fancy, DPI-aware plugins look terrible by default.

Disabling this setting makes Ableton set up the main thread for the VST plugin in a way that is DPI-aware, allowing the VST plugin to scale itself in a way that looks nice and sharp. But, obviously, the plugin needs to actually be DPI-aware for this to work. If not, it will render weird, possibly only drawing its GUI to a part of the window, unscaled, with black bars around it.

In the case of Anukari, most of the GUI scaled itself up and looked good, but the 3D renderer did not scale up. Actually, something weirder happened: the 3D renderer window was scaled to the proper size, but the viewport within it, to which the 3D scene was drawn, was not scaled up, and thus occupied only a sub-portion of the window.

This was weird to me, because I thought my code was taking into account the DPI scaling amount. And it turns out that it was, but there was a deeper, more demented issue. I joked about hell before with the Windows DPI-awareness APIs, but this issue turned out to be in the much worse category of eldritch driver bug horrors.

After banging my head against the wall, I finally found that the Vulkan API was, when given a window with correct pixel dimensions for a high-DPI monitor, generating a swap chain for that window with pre-scaling dimensions. In concrete terms, this bug means that in a DPI-aware context with 125% scaling, if I give Vulkan a 1250x1250 pixel window, it will give me back a swap chain of 1000x1000 pixels, and refuse to give one with the correct number of pixels even if I try to force it. Which is flat-out broken.

Now, why am I seemingly the only person to have noticed this? It's a huge gaping hole that breaks any Vulkan application. Well... this bug only happens if the Vulkan client code has a per-thread DPI-awareness set that does not match the native window's DPI-awareness. In other words, the NVIDIA driver doesn't correctly handle per-window DPI-awareness. Let me spell that out: the NVIDIA driver has not been updated to work correctly with a Windows API that was introduced 8 years ago in 2016. I reported the bug to NVIDIA but doubt I'll get a response, as I found reports of similar issues with OpenGL from 4 years ago with no resolution.

Fortunately there is a simple workaround, which is to change the rendering thread's DPI-awareness to that of the window it is rendering to, and the Filament folks were quick to accept my PR to implement this. It's not perfect, because an application could technically request Vulkan swap chains for multiple windows that have different DPI-awareness contexts, but... it will do.


© 2024 Anukari LLC, All Rights Reserved