"Activate dual displays" in the Nvidia Control Panel's SLI area doesn't seem to use both GPUs as advertised.

So I was playing a video game on one monitor while watching a video on the other.

And I noticed the video stuttering in spots, generally when the on-screen action in the game got busy. I have two 1070 Ti GPUs in this system; this really shouldn't be happening…

Complete specs for the system are on PCPartPicker.

Anyway, the setting I have enabled in the Nvidia Control Panel doesn't seem to be working as advertised. Take a look:

Note the text at the bottom: "Your GPUs will drive all connected displays while optimizing rendering performance whenever possible using SLI or multi-GPU rendering."

Awesome, sign me up… so why does it say SLI is disabled, then?

I thought I should dig in more, so I enabled the GPU activity icon in my notification area in Windows 10.

When I click on it, though, no matter which app is on which monitor, it reports no activity on one of the GPUs…

But check this out on the 2nd tab:

So why is everything on one GPU?!

Task Manager seems to back this up: consumed video RAM is 200 MB on the GPU with no apps running, and 2 GB on the one running all the apps.
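
If you'd rather not eyeball Task Manager, you can confirm the same thing by polling nvidia-smi, which ships with the Nvidia driver. Here's a minimal Python sketch, assuming nvidia-smi is on the PATH; the query fields are standard nvidia-smi options, nothing custom:

# Quick sanity check of per-GPU load and VRAM without relying on Task Manager.
# Assumes nvidia-smi is on the PATH (it ships with the Nvidia driver).
import subprocess

def gpu_snapshot():
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=index,name,utilization.gpu,memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True).stdout
    for line in out.strip().splitlines():
        idx, name, util, used, total = [f.strip() for f in line.split(",")]
        print(f"GPU {idx} ({name}): {util}% busy, {used} MiB / {total} MiB VRAM")

if __name__ == "__main__":
    gpu_snapshot()

Run that while the game and the movie are both going; if one GPU sits at 0% the whole time, that matches what the activity icon is showing.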

The only time I see activity on the 2nd GPU is when I take a screenshot with Snagit (presumably because Snagit is capturing what that GPU is displaying? Dunno).

So is this enough proof/diagnostic info/screen scraps and graphs to analyze? Not really.

So I fire up WPRUI from the Windows Performance Toolkit.

Note what I have selected in the options area:

  • 1st level triage
  • CPU
  • Disk I/O activity
  • Video glitches
  • HTML Responsiveness analysis

(HTML Responsiveness because my movie is playing off a plex.tv host on my LAN.)

And I hit start, run a game intro, watch some Rodney Dangerfield for a minute or three, and then stop the trace.
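
For what it's worth, the same capture can be scripted instead of clicking through WPRUI, using wpr.exe from the same toolkit. A rough sketch follows; the profile names are my best guess at the command-line equivalents of the checkboxes above, so treat them as assumptions and check `wpr -profiles` on your ADK build:

# Roughly the same capture, driven from the command line with wpr.exe instead of WPRUI.
# Profile names below are assumed mappings of the WPRUI checkboxes; run "wpr -profiles"
# to see what your ADK build actually supports. Needs an elevated prompt.
import subprocess, time

PROFILES = ["GeneralProfile", "CPU", "DiskIO", "VideoGlitches", "HTMLResponsiveness"]

def record_trace(etl_path="dualgpu.etl", seconds=180):
    start = ["wpr"]
    for profile in PROFILES:
        start += ["-start", profile]
    subprocess.run(start, check=True)                        # begin recording
    time.sleep(seconds)                                      # reproduce the stutter here
    subprocess.run(["wpr", "-stop", etl_path], check=True)   # write out the .etl

if __name__ == "__main__":
    record_trace()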

Interestingly, when I try to open the trace in WPA, it hangs…

Warning: EventSink {13399e05-4afd-48fd-ba25-6b673a7a2b92} signaled an Invalid Event:
Event#: 25707928 (T#0:#25707928)
TimeStamp: 120768768, Process: 2916, Thread: 20636, Cpu: 4
ClassicEventGuid: {01853a65-418f-4f36-aefc-dc0f1d2fd235}
ClassicEventDescriptor: 0x0c 0x00 0x0002
MofLength: 112
InitializeSession: OnEnd: Finished pass 1
InitializeSession: OnBegin: Starting pass 2

 

This is on Windows 10 Pro Insider Preview (slow ring), with the preview ADK as well… more to come.

Hyper-V Pro Tip: Don’t cross the streams

Update: with a single GPU, same results. It's not the unsupported dual-GPU problem; something else in the guest install is hosing my Hyper-V host.

The results, well, sucked. Nuked the system totally. So don’t cross the streams!

barf

In my lab I was building an MDT share so I could mass-deploy endpoints. I'm a Hyper-V guy, a gamer, and an Ethereum miner, so I have a couple of video cards in my rig. Now, I know multiple video cards aren't supported with Hyper-V RemoteFX. I get it. I don't know why; it sounds like some code laziness going on, but whatever. I don't use RemoteFX acceleration in my guest VMs, so I didn't think much of it…

Until I stood up an updated Windows 7 Professional SP1 install and injected the Hyper-V integration components into it as part of the normal MDT process.

[youtube https://www.youtube.com/watch?v=ytP-RfhdBWY&w=560&h=315]

Apparently the guest driver initialization hosed the host? I didn't think that was supposed to be possible. I'd file a bug, but I'm a Pro customer now, not Enterprise level, so I can't.

How do you find this? Well, GFL looking in the System event log, where crashes are typically recorded. It's empty.

The esoteric Microsoft-Windows-Hyper-V-VMMS/Admin log, though? That was pretty helpful. You just have to think to look down in there…
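
If digging through Event Viewer isn't your thing, you can pull the recent entries from that channel at the command line with wevtutil. A small sketch, with the caveat that the channel name below is my assumption of how the Event Viewer log is registered (run `wevtutil el` and look for VMMS if it doesn't match on your build). The errors it surfaced for me are quoted just below:

# Pull the most recent entries from the Hyper-V VMMS Admin log.
# The channel name is an assumption of how the Event Viewer log is registered;
# verify it with "wevtutil el" if the query fails.
import subprocess

CHANNEL = "Microsoft-Windows-Hyper-V-VMMS-Admin"

def recent_vmms_events(count=20):
    out = subprocess.run(
        ["wevtutil", "qe", CHANNEL, f"/c:{count}", "/rd:true", "/f:text"],
        capture_output=True, text=True, check=True).stdout
    print(out)

if __name__ == "__main__":
    recent_vmms_events()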

“The required GPU resources could not be accessed. This server cannot run as a RemoteFX host without a GPU. Verify that the GPU is correctly installed.”

gpuerror1

Followed quickly by “The machine contains non-identical GPUs. This is not a supported configuration for RemoteFX.”

gpuerror2

So I get it: "You shouldn't be running in this config." Fair enough.

Should Hyper-V allow a guest in this configuration to crash the host? Probably not.

Another fun fact: I don't use the GPUs for the guests anyway…

gfl
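
If you want to double-check that on your own host, the Hyper-V PowerShell module can list any RemoteFX 3D adapters attached to guests. A minimal sketch, assuming the stock Hyper-V module is installed and this runs elevated on the host; the cmdlets are the built-in Get-VM and Get-VMRemoteFx3dVideoAdapter, nothing custom:

# List any RemoteFX 3D adapters attached to guests on this Hyper-V host.
# Shells out to the stock Hyper-V PowerShell module; run elevated on the host.
import subprocess

PS_COMMAND = "Get-VM | Get-VMRemoteFx3dVideoAdapter"

def list_remotefx_adapters():
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", PS_COMMAND],
        capture_output=True, text=True)
    if result.returncode != 0:
        print(result.stderr)
    elif result.stdout.strip():
        print(result.stdout)
    else:
        print("No RemoteFX 3D adapters attached to any VM.")

if __name__ == "__main__":
    list_remotefx_adapters()

In my case that comes back empty, which is exactly the point: no guest is using RemoteFX, and the host still went down.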