I'm learning Three.js and made a simple app with it, expecting it to run on my dedicated GPU, but Task Manager shows 0% GPU utilization (across all engines) while the scene is rendering, and 80-90% usage on the integrated graphics. I tried a few sites that use WebGL and got the same results. Here's what I know:
- Hardware acceleration is enabled in the browser, and my computer's power plan is set to high performance.
- Passing `powerPreference: 'high-performance'` to the renderer seemingly does nothing.
- `edge://gpu/` seems to detect my GPU: Ctrl+F "nvidia" finds some values set to the name of my card. I don't really know what to look for on that page, though; it's mostly values I don't understand.
- https://alteredqualia.com/tmp/webgl-maxparams-test/ shows
  `ANGLE (Intel, Intel(R) HD Graphics 530 Direct3D11 vs_5_0 ps_5_0, D3D11-21.20.16.4550)`
  under "Unmasked Renderer".
- I'm using Edge on Windows 10, and my dedicated card is an NVIDIA GeForce GTX 960M. Chrome doesn't behave any differently as far as I can tell, and Firefox also uses the integrated graphics, except with lower FPS for some reason.
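For what it's worth, the "Unmasked Renderer" string that site reports can be queried directly through the `WEBGL_debug_renderer_info` extension. A minimal sketch (the function name is mine, and some browsers hide or omit this extension):

```javascript
// Returns the unmasked renderer string for a WebGL context, or null if the
// WEBGL_debug_renderer_info extension is unavailable in this browser.
function getUnmaskedRenderer(gl) {
  const ext = gl.getExtension('WEBGL_debug_renderer_info');
  return ext ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL) : null;
}

// In a browser:
// const gl = document.createElement('canvas').getContext('webgl');
// console.log(getUnmaskedRenderer(gl));  // e.g. the ANGLE string above
```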
My question is: How does the browser decide which graphics processor to use? Is there a way to control this choice, both for the end user and the website developer?
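For reference, this is roughly how I'm passing the hint (a sketch; as I understand it, `powerPreference` maps to a standard WebGL context attribute that the browser is free to treat as a hint only):

```javascript
// Options for THREE.WebGLRenderer. powerPreference is only a hint --
// the browser/OS may still pick the integrated GPU regardless.
const rendererOptions = {
  antialias: true,
  powerPreference: 'high-performance',
};

// In a page with Three.js loaded:
// const renderer = new THREE.WebGLRenderer(rendererOptions);
// document.body.appendChild(renderer.domElement);
```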