There’s a large-screen iMac that I use for a lot of development. It has a 5K screen, which is handy: when I’m working on a project that runs at 4K resolution I can develop it on this machine and still have some pixels left over for dev tools. But the machine never had a great GPU. It pushes a lot of pixels, yet Apple used a mobile GPU (which is usually lower powered), and it’s an older machine besides. The solution: an external GPU. It took a bit of finagling, but I got an Nvidia GTX 1080 working on the machine. Then, when I tried testing some WebGL shaders on the computer at full resolution, the performance was awful! What was going on?
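To put a number on “awful,” a quick frame counter on the render loop is enough. Here’s a minimal sketch, assuming a canvas with id glCanvas (a placeholder for whatever your page actually uses) and a trivial clear standing in for the real shader work:

```ts
// Count how many frames per second a WebGL canvas actually renders.
// "glCanvas" is a placeholder id; the clear() stands in for real shader work.
const canvas = document.getElementById("glCanvas") as HTMLCanvasElement;
const gl = canvas.getContext("webgl");
if (!gl) throw new Error("WebGL not supported");

let frames = 0;
let last = performance.now();

function tick(now: number): void {
  gl.clearColor(0, 0, 0, 1);
  gl.clear(gl.COLOR_BUFFER_BIT);

  frames++;
  if (now - last >= 1000) {
    console.log(`${frames} fps`); // far below 60 at 4K hints the wrong GPU is rendering
    frames = 0;
    last = now;
  }
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
```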
The performance suggested that the external GPU wasn’t being used at all. I assumed that starting Chrome on the secondary (external) screen would solve the problem. It didn’t. Thankfully the Windows 10 Task Manager also shows GPU usage, and it confirmed that no matter where I placed the window, the internal (weak) GPU was the one doing the work.
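If you’d rather ask the browser than squint at Task Manager, a WebGL context can report which GPU it landed on via the WEBGL_debug_renderer_info extension. A sketch (note that some browsers mask or coarsen this string for fingerprinting reasons):

```ts
// Ask a WebGL context which GPU it is actually running on.
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl");
if (!gl) throw new Error("WebGL not supported");

// The extension exposes the unmasked renderer string; fall back to the
// regular (possibly generic) RENDERER string if it is unavailable.
const ext = gl.getExtension("WEBGL_debug_renderer_info");
const renderer = ext
  ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL)
  : gl.getParameter(gl.RENDERER);

// On this machine you'd want to see something like "NVIDIA GeForce GTX 1080"
// rather than the iMac's internal mobile GPU.
console.log("WebGL renderer:", renderer);
```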
My hypothesis was that Chrome creates its rendering surfaces on the GPU that drives the primary display. To test this I made one of the screens attached to the external GPU the primary one. The results were promising: I got much better performance. But there was still a significant difference between running full screen on the built-in screen and on the external one. Task Manager showed that the Desktop Window Manager was using a lot more cycles when the Chrome window was on the internal screen. Essentially, the system was rendering with the external GPU and then copying each frame over to the internal GPU for display. For my purposes this needed no further solution: I connected the eGPU to a 4K display and did my testing from there.
If you’ve got multiple GPUs and want WebGL to run on a specific one, for now you’ll have to make that GPU drive your primary display.
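For completeness, the WebGL spec does expose one knob: the powerPreference context attribute, which asks the browser for the high-performance GPU on multi-GPU machines. It’s only a hint, though; browsers are free to ignore it, and I wouldn’t assume it overrides the primary-display behavior described above in an eGPU setup like this one:

```ts
// Request the high-performance GPU when creating the context.
// This is a hint only; the browser may still pick the GPU that
// drives the primary display.
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl", {
  powerPreference: "high-performance",
});
```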