How can I modify the xorg.conf file to force the X server to run on a specific GPU? (I am using multiple GPUs) [closed]

I'm running 2 GPUs and I'm trying to force the X server to run on one of them. According to this website: http://nvidia.custhelp.com/app/answers/detail/a_id/3029/~/using-cuda-and-x, here is how I should proceed:

The X display should be forced onto a single GPU using the BusID parameter in the relevant "Display" section of the xorg.conf file. In addition, any other "Display" sections should be deleted. For example: BusID "PCI:34:0:0"

Here is my xorg.conf file:

# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig:  version 304.64  (buildmeister@swio-display-x86-rhel47-12)  Tue Oct 30 12:04:46 PDT 2012

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0"
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
EndSection

Section "Files"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Unknown"
    HorizSync       28.0 - 33.0
    VertRefresh     43.0 - 72.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

So I tried putting the correct BusID in the "Display" subsection, but it still does not work. I also tried putting it in the "Device" section.

Does anyone know how I could do that?

Mealworm answered 22/8, 2013 at 13:45

If you have 2 NVIDIA GPUs, get the BusID parameters for both. The doc you linked explains a couple of ways to do that, but nvidia-smi -a is pretty easy.

You will need to figure out which GPU you want to keep for display, and which you want to keep for CUDA. Again, this should be pretty obvious from the nvidia-smi -a output.
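If you just want the bus IDs without scanning the whole report, a short sketch (assuming a driver recent enough to support the query interface; nvidia-smi --help-query-gpu lists the available fields):

# List index, name and PCI bus id for every GPU (CSV output)
nvidia-smi --query-gpu=index,name,pci.bus_id --format=csv
# Or just grep the full report for the bus ids
nvidia-smi -a | grep "Bus Id"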

Let's suppose your nvidia-smi -a includes a section like this:

PCI
    Bus                     : 0x02
    Device                  : 0x00
    Domain                  : 0x0000
    Device Id               : 0x06D910DE
    Bus Id                  : 0000:02:00.0

Then modify the device section like this:

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BusID          "PCI:2:0:0"
EndSection

Then reboot. Make sure the one you are keeping for display is the one with the display cable attached!
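After the reboot, a couple of quick sanity checks (a sketch; /var/log/Xorg.0.log is the usual log location, but rootless X setups keep the log under ~/.local/share/xorg/ instead):

# Xorg should show up in the process list of the display GPU only
nvidia-smi
# The X log records which PCI devices were probed and which one the driver claimed
grep -i "PCI" /var/log/Xorg.0.log | head -n 20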

You may also be interested in reading the nvidia driver readme and searching it for "BusID" for additional tips.

The document you linked references a "Display" section, but that should be the "Device" section.

Ting answered 22/8, 2013 at 14:4 Comment(2)
In the end, the problem was that the 2nd GPU was defective: it was detected but was unable to display. Your answer was helpful though, thank you! – Mealworm
One very important note: if there are many GPUs installed, you will get hex values from lspci or nvidia-smi, like 0000:0A:00.0. You have to either convert it to decimal, like 10:00:0, or skip the leading zero(s), like A:00:0 (notice 0A is now just A). Credit goes to ossifrage at #ethereum-mining on Freenode. – Unrestraint
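For the record, that conversion can be done directly in the shell. A minimal sketch in bash (printf accepts the 0x hex prefix; the 0x0A bus value is just the example from the comment above):

# lspci/nvidia-smi report the bus number in hex (e.g. 0000:0A:00.0); xorg.conf wants decimal
printf 'BusID "PCI:%d:%d:%d"\n' 0x0A 0x00 0x0
# prints: BusID "PCI:10:0:0"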

Since I cannot add comments to the answer above due to the reputation restriction, I will just leave my solution here.

I followed the solution provided by @Robert Crovella, but it still did not work for me until I changed the BusID to decimal format.

Here are the details:

  • Two GPUs: a GTX 1080 Ti (device 0) and a GTX 960 (device 1). I want to use the GTX 1080 Ti (device 0) as the compute card and the GTX 960 (device 1) for the Xorg display.

  • Find their BusIDs: you can find them via the command lspci | grep VGA, which gives the following:

03:00.0 VGA compatible controller: NVIDIA Corporation Device 1b06 (rev a1)

82:00.0 VGA compatible controller: NVIDIA Corporation GM206 [GeForce GTX 960] (rev a1)

So we get BusID 03:00.0 for device 0 and 82:00.0 for device 1, but these are hexadecimal numbers, so convert 0x03 and 0x82 to the decimal numbers 3 and 130, respectively (a one-liner that automates this conversion is shown after this list).

  • Add the BusID to the Device section in the xorg.conf file:

    Section "Device"

       Identifier     "Device1"
       Driver         "nvidia"   
       VendorName     "NVIDIA Corporation"
       BusID          "PCI:130:0:0"
    

    EndSection

Pay attention to the BusID format: it uses colons throughout (e.g. "PCI:130:0:0"), not the dot from the lspci output ("82:00.0"). Also reference the same device in the "Screen" section:

Section "Screen"

   Identifier     "Screen0"
   Device         "Device1"   
   ...

EndSection

  • Connect the monitors to the display GPU, then reboot your computer.
  • I found this solution when I read the comment above by @Piotr Dobrogost and double-checked the BusID format: the xorg.conf file needs the decimal form, which is different from the BusID reported by the lspci command.
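To avoid doing the hex-to-decimal conversion by hand (see the second step above), here is a sketch of a one-liner that prints an xorg.conf-style BusID for every NVIDIA VGA controller; it assumes GNU awk is installed, since strtonum is a gawk extension:

lspci | gawk -F'[ :.]' '/VGA compatible controller: NVIDIA/ {
    # $1, $2, $3 are the hex bus, device and function numbers reported by lspci
    printf "PCI:%d:%d:%d    %s\n", strtonum("0x" $1), strtonum("0x" $2), strtonum("0x" $3), $0
}'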
Wandawander answered 12/10, 2018 at 18:29
