
Does 2008 R2 Enterprise "RemoteFX Virtualisation Host" need two GPUs?


Hey guys,

I'm testing a new setup for my boss, so far I've managed to get things working as follows:

1x WS2008 R2 SP1 with Hyper-V

1x WS2008 R2 SP1 (virtual) as RD Session Host

1x Win7 SP1 Ultimate client

Hardware used:

Phenom II X4 840, 3.2 GHz (SLAT enabled)

8 GB RAM

ATI Radeon HD 5570, 1 GB

Results were better than expected, to be honest :) but now I'd like to take the next step and virtualize a Win7 Ultimate SP1 machine.

I've managed to get that working, sort of, but performance is horribly sluggish compared to the Session Host setup when RemoteFX is enabled on the RD session. I'm using a clean Win7 install with only SP1 and updates applied, firewall off, etc.

Now I've come across this website:

http://studiohub.wordpress.com/2011/02/24/enable-remotefx-on-hyper-v-server-sp1/

And one of the comments (from Sean) says:

"The one thing I wish I had noticed before I started down this path though is thatRemoteFX requires two GPU’s, even if one is crappy (read: integrated) so that it can load the video capture driver on the unused GPUand the standard driver on the discrete GPU."

Is this correct? And if so, how do I tell Windows to use the integrated card for the video capture and the discrete card for the actual rendering? Or am I missing something else?

Ironically, I've been jumping through hoops to get the onboard GPU disabled, because Windows kept trying to use it as the RemoteFX GPU (since I had no monitor attached in the test build, lol), but that was before I discovered the capture driver needed to be installed...
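In case it helps: this is roughly how I've been checking which display adapters Windows actually reports and which driver each one is running. It's just a quick sketch using Python with the third-party wmi module and the standard Win32_VideoController WMI class, nothing RemoteFX-specific, so treat it as a starting point rather than anything official:

import wmi

# Connect to the local WMI service (root\cimv2 by default).
c = wmi.WMI()

# List every display adapter Windows knows about, with its current driver,
# so you can see whether the onboard chip or the Radeon HD 5570 is active.
for gpu in c.Win32_VideoController():
    print("Name       :", gpu.Name)
    print("Status     :", gpu.Status)
    print("Driver ver :", gpu.DriverVersion)
    print("PNP device :", gpu.PNPDeviceID)
    print("-" * 40)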

Any help would be appreciated! Thanks
