Three.js/WebGL running on integrated GPU instead of dedicated

I'm learning Three.js and created a simple app with it, expecting it to run on my dedicated GPU, but Task Manager shows 0% GPU utilization (across all engines) while the scene is rendering, and 80-90% usage on the integrated graphics. I tried a few other sites that use WebGL and got the same results. Here's what I know:

  • Hardware acceleration is enabled in the browser, and my computer's power plan is set to high performance.
  • Passing powerPreference: 'high-performance' to the renderer seemingly does nothing (see the sketch after this list).
  • edge://gpu/ seems to detect my GPU: Ctrl+F "nvidia" finds some values set to the name of my GPU. I don't really know what to look for there, though; it's mostly values I don't understand.
  • https://alteredqualia.com/tmp/webgl-maxparams-test/ shows ANGLE (Intel, Intel(R) HD Graphics 530 Direct3D11 vs_5_0 ps_5_0, D3D11-21.20.16.4550) under "Unmasked Renderer" (the same string can be queried directly; see the snippet at the end of this question).
  • I'm using Edge on Windows 10 and I have an NVIDIA GeForce GTX 960M. I also tried Chrome, which doesn't behave any differently as far as I can tell, and Firefox uses the integrated graphics too, except with lower FPS for some reason.
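
For context, powerPreference has to be passed when the renderer (and its underlying WebGL context) is created; it can't be changed afterwards, and it's only a hint that the browser/OS may ignore. A minimal sketch of how it's passed, assuming a standard Three.js setup (the antialias option is just for illustration):

    import * as THREE from 'three';

    // The hint only applies at context creation time; changing it later
    // has no effect because the WebGL context already exists.
    const renderer = new THREE.WebGLRenderer({
      antialias: true,
      powerPreference: 'high-performance', // request the discrete GPU
    });

    // Equivalent hint with raw WebGL, no Three.js involved:
    const gl = document.createElement('canvas')
      .getContext('webgl2', { powerPreference: 'high-performance' });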

My question is: How does the browser decide which graphics processor to use? Is there a way to control this choice, both for the end user and the website developer?
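
For reference, the "Unmasked Renderer" value the test page above reports can be read directly from the browser console via the WEBGL_debug_renderer_info extension; a minimal sketch:

    const gl = document.createElement('canvas').getContext('webgl');
    const info = gl.getExtension('WEBGL_debug_renderer_info');
    if (info) {
      console.log(gl.getParameter(info.UNMASKED_VENDOR_WEBGL));   // e.g. "Google Inc. (Intel)"
      console.log(gl.getParameter(info.UNMASKED_RENDERER_WEBGL)); // the ANGLE(...) string
    } else {
      // Some browsers have removed the extension and return the
      // unmasked string from RENDERER directly.
      console.log(gl.getParameter(gl.RENDERER));
    }

If the discrete GPU were being used, the NVIDIA adapter would show up here instead of the Intel one.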

Wardroom answered 18/12, 2021 at 17:3 Comment(5)
I was debating whether this question belongs more on Super User, but I figured it's useful knowledge for web developers.Wardroom
I was going to suggest turning on hardware acceleration as outlined in this article, but it sounds like you've already tried that. The Alteredqualia results suggest it's using your Intel integrated graphics and not your Nvidia graphics card. You might find something by digging through your Windows system settings. I recommend also asking on Super User.Epicureanism
Weird, I seem to have the same problem (GTX 960M on a Dell 9550)Bohannon
What kind of computer are you using? Could you add some system specs to the question? Desktop towers sometimes have an HDMI output on the integrated motherboard graphics and a secondary HDMI output on the discrete graphics card, so you have to make sure the cable is plugged into the graphics card. Also, Device Manager in the system settings might give you some info on the status of your GPU; maybe it's been disabled?Epicureanism
I'm using a laptop. The GPU works normally in e.g. games, it seems that it's just the browser defaulting to integrated graphics for some reason.Wardroom

Probably the easiest way to accomplish this is to disable the integrated graphics adapter. If you are only interested in your own computer, you can do it through Device Manager: right-click "This PC", select "Manage", then choose Device Manager in the left panel. In the right panel, expand "Display adapters" to show all of them, right-click the built-in (integrated) adapter, and select "Disable device". A confirmation window will pop up.

Once you disable it, the only graphics interface available will be the discrete adapter instead of the integrated one.

You can see where I got the information from: https://www.evga.com/support/faq/FAQdetails.aspx?faqid=58534

Bly answered 29/12, 2021 at 23:35 Comment(0)
