DirectX 11: simultaneous use of multiple adaptors

We need to drive 8 to 12 monitors from one PC, all rendering different views of a single 3D scene graph, so we have to use several graphics cards. We're currently running on DX9 and are looking to move to DX11 to hopefully make this easier.

Initial investigations suggest that the obvious approach doesn't work: performance is lousy unless we drive each card from a separate process. Web searches are turning up nothing. Can anybody suggest the best way to go about using several cards simultaneously from a single process with DX11?

End answered 16/5, 2012 at 19:44 Comment(4)
Are you using the vendor libraries at all? What does VRAM usage and transfer look like with one process vs. many? Are you running fullscreen exclusive on each card, and does that change anything if not? Are you sending frames to all simultaneously (multiple render threads)? – Trauner
@peachykeen: This is using DirectX from C++. We've done some initial tests but won't be starting the core work for a week or so, so I thought I'd ask this question now to see if anyone out there just happened to know the answer - we haven't done any in-depth profiling or debugging yet. We need to run in windowed mode as we're a desktop app, and each window is being rendered independently from its own thread (which renders slowly) or process (which renders fast). We can split our rendering over multiple processes, but that shouldn't be necessary. – End
Do you have VSync enabled? Actually, I noticed some severe slowdowns in (almost) the same use case as you. Without VSync on it's pretty OK. – Compotation
@catflier: No. Our best guess is that the effect was caused by resource locking that stalled the render threads (btw, we're rendering in windows, not fullscreen, which could be part of the problem). This effect simply doesn't occur with processes running the same code. Our final solution was to simply use multiple processes for our rendering (it's a client/server design anyway, so that it can be distributed across many PCs if we need it to, so this was trivial to achieve). – End

I see that you've already come to a solution, but I thought it'd be good to throw in my own recent experiences for anyone else who comes across this question...

Yes, you can drive any number of adapters and outputs from a single process. Here's some information that might be helpful:

In DXGI and DX11:

Each graphics card is an "Adapter". Each monitor is an "Output". See here for more information about enumerating through these.
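
A minimal enumeration sketch might look like the following (error handling and most Release() calls trimmed for brevity):

    // Enumerate every adapter (graphics card) and its outputs (monitors).
    #include <dxgi.h>
    #include <vector>
    #pragma comment(lib, "dxgi.lib")

    std::vector<IDXGIAdapter*> EnumerateAdapters()
    {
        std::vector<IDXGIAdapter*> adapters;
        IDXGIFactory1* factory = nullptr;
        if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1),
                                      reinterpret_cast<void**>(&factory))))
            return adapters;

        IDXGIAdapter* adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            adapters.push_back(adapter);

            // Each adapter exposes one IDXGIOutput per attached monitor.
            IDXGIOutput* output = nullptr;
            for (UINT j = 0; adapter->EnumOutputs(j, &output) != DXGI_ERROR_NOT_FOUND; ++j)
            {
                DXGI_OUTPUT_DESC desc;
                output->GetDesc(&desc);   // desc.DeviceName, desc.DesktopCoordinates, ...
                output->Release();
            }
        }
        factory->Release();
        return adapters;
    }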

Once you have pointers to the adapters that you want to use, create a device (ID3D11Device) using D3D11CreateDevice for each of the adapters. Maybe you want a different thread for interacting with each of your devices. This thread may have a specific processor affinity if that helps speed things up for you.
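
Roughly, device creation for a specific adapter looks like this; note that when an explicit adapter is supplied, the driver type has to be D3D_DRIVER_TYPE_UNKNOWN:

    // Sketch: one device (and immediate context) per adapter.
    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    HRESULT CreateDeviceForAdapter(IDXGIAdapter* adapter,
                                   ID3D11Device** device,
                                   ID3D11DeviceContext** context)
    {
        return D3D11CreateDevice(
            adapter,
            D3D_DRIVER_TYPE_UNKNOWN,   // required when an explicit adapter is given
            nullptr,                   // no software rasterizer module
            0,                         // creation flags, e.g. D3D11_CREATE_DEVICE_DEBUG
            nullptr, 0,                // default feature level list
            D3D11_SDK_VERSION,
            device,
            nullptr,                   // obtained feature level (not needed here)
            context);
    }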

Once each adapter has its own device, create a swap chain and render target for each output. You can also create a depth stencil view for each output while you're at it.

The process of creating a swap chain will require your windows to be set up: one window per output. I don't think there is much benefit in driving your rendering from the window that contains the swap chain. You can just create the windows as hosts for your swap chain and then forget about them entirely afterwards.
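
A rough sketch of the per-output setup, assuming you already have the host window (hwnd) and its client size (width and height are placeholders here), plus the factory and device from the previous steps:

    // Sketch: create a swap chain and render target view for one output.
    IDXGISwapChain* CreateOutputSwapChain(IDXGIFactory1* factory,
                                          ID3D11Device* device,
                                          HWND hwnd, UINT width, UINT height,
                                          ID3D11RenderTargetView** outRtv)
    {
        DXGI_SWAP_CHAIN_DESC scd = {};
        scd.BufferCount       = 1;
        scd.BufferDesc.Width  = width;
        scd.BufferDesc.Height = height;
        scd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
        scd.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
        scd.OutputWindow      = hwnd;   // the host window sitting on this output
        scd.SampleDesc.Count  = 1;
        scd.Windowed          = TRUE;   // the question is about windowed mode

        IDXGISwapChain* swapChain = nullptr;
        if (FAILED(factory->CreateSwapChain(device, &scd, &swapChain)))
            return nullptr;

        // Wrap the back buffer in a render target view.
        ID3D11Texture2D* backBuffer = nullptr;
        swapChain->GetBuffer(0, __uuidof(ID3D11Texture2D),
                             reinterpret_cast<void**>(&backBuffer));
        device->CreateRenderTargetView(backBuffer, nullptr, outRtv);
        backBuffer->Release();
        return swapChain;
    }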

For rendering, you will need to iterate through each output of each device. For each output, bind the render target view that you created for that output using OMSetRenderTargets on the device's immediate context. Again, you can be running each device on a different thread if you'd like, so each thread/device pair will have its own iteration through outputs for rendering.
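
Per frame, the loop on each device's thread might look like the sketch below; OutputTarget is just a hypothetical struct bundling what was created per output above:

    // Hypothetical per-output bundle built during setup.
    struct OutputTarget
    {
        ID3D11RenderTargetView* rtv;
        ID3D11DepthStencilView* dsv;
        IDXGISwapChain*         swapChain;
    };

    // One frame on one device's immediate context.
    void RenderFrame(ID3D11DeviceContext* context,
                     const std::vector<OutputTarget>& outputs)
    {
        const float clearColor[4] = { 0.0f, 0.0f, 0.0f, 1.0f };
        for (const OutputTarget& o : outputs)
        {
            // Bind this output's render target and depth buffer to the pipeline.
            context->OMSetRenderTargets(1, &o.rtv, o.dsv);
            context->ClearRenderTargetView(o.rtv, clearColor);
            context->ClearDepthStencilView(o.dsv, D3D11_CLEAR_DEPTH, 1.0f, 0);

            // ... draw this output's view of the scene ...

            o.swapChain->Present(0, 0);   // sync interval 0 = no vsync
        }
    }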

Here are a bunch of links that might be of help when going through this process:

Display Different images per monitor directX 10

DXGI and 2+ full screen displays on Windows 7

http://msdn.microsoft.com/en-us/library/windows/desktop/ee417025%28v=vs.85%29.aspx#multiple_monitors

Good luck!

Devilish answered 16/3, 2013 at 21:38 Comment(0)

Maybe you don't need to upgrade DirectX.
See this article.

Reactive answered 31/5, 2012 at 16:15 Comment(1)
Thanks, but now that DX9 is nearly 200 years old we really want to stick a nail in it and start using DX11 features, avoid having to install DX9 onto all our Win7 target machines, etc. – End

Enumerate the available adapters with IDXGIFactory, create an ID3D11Device for each, and then feed them from different threads. Should work fine.
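
A compact sketch of that pattern, assuming the adapters have already been enumerated (see the other answer): one std::thread and one device per adapter, with each thread owning its own immediate context and the swap chains for that adapter's outputs.

    #include <d3d11.h>
    #include <thread>
    #include <vector>
    #pragma comment(lib, "d3d11.lib")

    void RunAdapter(IDXGIAdapter* adapter)
    {
        ID3D11Device*        device  = nullptr;
        ID3D11DeviceContext* context = nullptr;
        if (FAILED(D3D11CreateDevice(adapter, D3D_DRIVER_TYPE_UNKNOWN, nullptr, 0,
                                     nullptr, 0, D3D11_SDK_VERSION,
                                     &device, nullptr, &context)))
            return;

        // ... create a swap chain per output of this adapter, then loop:
        //     clear, draw, Present() each swap chain ...

        context->Release();
        device->Release();
    }

    void RunAllAdapters(const std::vector<IDXGIAdapter*>& adapters)
    {
        std::vector<std::thread> threads;
        threads.reserve(adapters.size());
        for (IDXGIAdapter* a : adapters)
            threads.emplace_back(RunAdapter, a);
        for (std::thread& t : threads)
            t.join();
    }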

Tawanda answered 24/5, 2012 at 20:55 Comment(7)
Thanks Axel. The problem is that when we do this, we get about 30 frames per second from a demo scene, but if we drive each adapter from a separate process we get about 600 fps for the same rendering load. It's possible that our initial quick trials had a bug, or that in a real-world example it'll actually work fine, but we're concerned that we could put a lot of effort into a renderer only to find that it has to be split multi-process to get the full performance out of the adapters. As we have a bit of lead time, I thought I'd ask here and come back to it in a few days... – End
Is this with D3D9 or D3D11? Could also be a driver issue that only creates one instance of itself per process instead of per thread. What cards do you use? – Tawanda
This is DX11, using brand new cards like the nVidia GT 220 and ATI Radeon HD 6700. – End
Note: the GT220 is NOT a new card. Not even close. – Jecon
@Brendan: Sure. Poor wording on my part. I meant current, very capable, DX11 graphics cards from two different, reasonably reliable manufacturers, using the latest drivers. It's unlikely to be a driver/hardware bug because we get the same results with different cards, and we get excellent performance when driving from multiple processes. More likely we're driving DirectX in the wrong way. It appears this is something nobody (here) has actually tried. I'll just have to get my team to investigate further and then report back if we suss anything out. – End
Is the GT220 DX11-capable? I'm not too sure, but I assume it is if you're using it. Just trying to make sure you're not using incompatible cards. – Jecon
@Brendan: The GT220 is DX10.1, so you can use it with DX11 at the 10_1 feature level. – Compotation
