Use NVidia GPU from VirtualBox?

How do I make the VirtualBox guest use the NVidia graphics?



Host setup:




  • Windows 7 x64


  • NVidia Optimus



  • In NVIDIA Control Panel, I explicitly selected High-performance NVIDIA processor
    for:

     C:\Program Files\Oracle\VirtualBox\VirtualBox.exe
     C:\Program Files\Oracle\VirtualBox\VBoxSVC.exe


  • When VirtualBox is running, the NVidia software does not list it as an application
    that uses the NVidia GPU. Therefore, I assume that VirtualBox indeed does not use
    the NVidia GPU.



Guest:




  • Windows 7 x64 (i.e. same as host)


  • Guest Additions installed


  • 3D acceleration enabled in VirtualBox settings: Display / Video / Enable 3D
    Acceleration



  • What Rhinoceros, an OpenGL-capable application, reports as the video adapter:



    Humper
    Chromium
    OpenGL version: 2.1 Chromium 1.9
    Render version: 2.0
    Shading Language: 1.40 - Intel Build 9.17.10.3517
    Driver Date: NA
    Driver Version: NA
    Maximum Texture size: 8192 x 8192
    Z-Buffer depth: 32bits
    Maximum Viewport size: 8192 x 8192
    Total Video Memory: 64 MB


    To me it looks like the virtual machine does use the host's 3D hardware
    acceleration, but unfortunately through the Intel GPU.
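
One way to cross-check on the host: the NVidia driver ships with a command-line tool,
nvidia-smi, which lists the processes currently running on the NVidia GPU. The path
below is the usual default install location; it may differ on your system:

    "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe"

If VirtualBox.exe never appears in its process list while the guest is rendering 3D
content, the guest's OpenGL calls are presumably not reaching the NVidia GPU. (On some
Optimus driver versions per-process details are not reported, so treat a negative
result with caution.)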












windows-7 virtualbox virtual-machine nvidia-graphics-card optimus

asked Jul 9 '14 at 20:41 by feklee; edited Mar 12 '16 at 22:31 by Hennes
  • VMs don't get direct hardware access - they get a virtual GPU ... related
    question: superuser.com/questions/395245/…
    – ernie, Jul 9 '14 at 21:03






  • @ernie VirtualBox can give guests direct access to the OpenGL API of the host's
    GPU. As far as I can tell, that works on my machine; it's just that VirtualBox is
    using the Intel GPU instead of the NVidia GPU. Quote from the VirtualBox manual:
    "With this feature, if an application inside your virtual machine uses 3D features
    through the OpenGL or Direct3D 8/9 programming interfaces, instead of emulating
    them in software (which would be slow), VirtualBox will attempt to use your host's
    3D hardware."
    – feklee, Jul 9 '14 at 21:10











  • @feklee The title of the question is misleading: you want to use the NVidia GPU
    for VirtualBox. 'From' is also possible but means something different: hardware
    passthrough, i.e. accessing the host's GPU directly from within the VirtualBox
    guest.
    – larkey, Dec 30 '15 at 8:18











  • askubuntu.com/questions/139320/…
    – Ciro Santilli 新疆改造中心 六四事件 法轮功, Mar 2 '17 at 15:14
















2 Answers


















I realize a few years have passed, but I wanted to answer since this post shows up pretty high when you google for "virtualbox 3d multiple GPU". In the time that has passed, things have gotten a lot simpler and better.



People who stumble upon this thread will likely land here because they have a laptop or PC with two GPUs, which is quite common these days -- especially on gaming laptops. The onboard Intel GPU is used for rendering windows and general applications, while applications that make use of 3D GPU functionality should do so via the higher-performing NVidia GPU.



Today, I was building an Ubuntu VM on my laptop to do some cross-platform development, and everything was fine except that the guest VM was extremely slow, with no apparent explanation: CPU, memory, and disk were all showing low utilization.



It didn't take long to figure out that video performance was causing the problem. Launching applications, maximizing/minimizing windows -- anything that we take for granted in 2019 but that needs 3D acceleration to work at any reasonable speed -- was going through GPU 0.



It was easy to determine this because Windows 10 can now show GPU utilization in Task Manager, on the "Performance" tab. I could see that as I moved, maximized, and minimized windows, the work was being done by GPU 0 on the host. That GPU is the integrated Intel HD GPU; I wanted to use the NVidia GTX 1050 Ti, which was GPU 1.



After searching around, I didn't really find anywhere to specify which GPU to use. But this thread, and some others, reminded me that on these kinds of setups you have to go into the NVidia Control Panel, then "Manage 3D settings", then the "Program Settings" tab.



You likely won't find "VirtualBox" in the list, but you can press the "Add" button and add VirtualBox.exe. You may have to drill down to the drive/path where your VirtualBox installation is. Once you have added it, in the settings below, make sure that item 2, "Select the preferred graphics processor for this program", is set to the GPU that you want it to use -- in my case, "High-performance NVIDIA processor".
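
For reference, the default installation path (as listed in the question above) is:

    C:\Program Files\Oracle\VirtualBox\VirtualBox.exe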



Don't set it to auto, and certainly don't set it to integrated. Of course, you need the VM's settings to have the 3D acceleration box ticked, and you need the Guest Additions installed in the guest. Once you have set the host's 3D settings as described above, shut down the guest VM, exit VirtualBox, and then relaunch VirtualBox and the VM.
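
The 3D acceleration setting can also be applied from the command line with VBoxManage. A sketch, with the VM name as a placeholder; run it while the VM is powered off:

    vboxmanage modifyvm "Ubuntu Dev VM" --accelerate3d on --vram 128

Here --accelerate3d ticks the same 3D acceleration box as the GUI, and --vram 128 sets the guest's video memory to 128 MB, the maximum VirtualBox allows (see the comment below).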



If you use Task Manager's "Performance" tab, look at the "VirtualBox Manager" process, and watch which GPU gets used when you navigate the guest VM's UI, you should see it using the better GPU now. See the image below.



All that said, don't expect to be able to run games in a guest VM; 3D acceleration pass-through is still not quite that far along. But you can expect a modern OS and UI in your guest and an acceptable experience. You would be able to play older games in the guest VM, like anything based on DirectX 9. Unfortunately, as the ability to virtualize GPUs evolves, 3D gaming technology evolves more quickly.



Screenshot






answered Jan 18 at 20:23 by Thomas Carlisle

  • I did a little research today on the 3D acceleration capabilities of VirtualBox
    and VMware, and as of Jan 2019, if GPU acceleration in the guest VM is important
    to you, then VMware is your best bet -- especially if your guest OS is Windows.
    VirtualBox's GPU acceleration capabilities are a bit less developed: you can only
    give the guest VM a max of 128 MB of video RAM, while VMware allows up to 2 GB to
    be allocated to video memory. VMware supports DirectX 10, and VirtualBox supports
    DirectX 9. As I said in the post, DirectX 9 is quite dated.
    – Thomas Carlisle, Jan 19 at 18:07











  • " "Select the preferred graphics processor for this program" is set to the GPU that you want it to use" I don't see the above option. I see Ambinetn, Ansiotropic Filtering...etc but nothing on "select preferred grapics processor"

    – moondra
    Feb 17 at 5:07



















Giving the guest full GPU access is probably not possible. If a virtual machine had direct access to your GPU while your host was using it, Bad Things™ would happen, because sharing memory between two effectively different computers is not a thing; pointers and addresses and whatnot would be very different between them. (No consumer-available card supports servicing two computers at once.)



There are, however, some things you can try. First, set your preferred graphics processor to the good one in the NVidia Control Panel (3D Settings > Manage 3D settings > Preferred graphics processor). That might make VirtualBox go with the NVidia card for OpenGL.



If that doesn't help, try installing Guest Additions in Safe Mode on the guest.



Finally, on Linux hosts, you can try to pass the GPU through to the virtual machine, but this will only work for PCI cards (I wasn't able to find out whether yours is), and even so, you stand a good chance of ripping the GPU away from the host or causing other problems. First, find the PCI address (bus, device, and function) of the good card. Set your VM's chipset to ICH9; this didn't immediately break anything when I tried it. Then use the VBoxManage utility to attach the card:



vboxmanage modifyvm "Your VM Name" --pciattach BB:DD.F@bb:dd.f


Replace Your VM Name as appropriate. BB is the bus number of your GPU on the host; DD is the device; F is the function. After the @, enter the PCI address the card should appear at in the guest. For example:



vboxmanage modifyvm "Windows 7 x64" --pciattach 01:00.0@03:01.0
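
On the host side, the bus/device/function numbers come from lspci (part of the pciutils package on most distributions), and the ICH9 chipset switch mentioned above can be scripted the same way. A sketch, with the VM name as a placeholder:

    lspci -nn | grep -i vga    # e.g. "01:00.0 VGA compatible controller: NVIDIA ..."
    vboxmanage modifyvm "Your VM Name" --chipset ich9

Per the manual chapter linked in the comments below, PCI passthrough is experimental, works on Linux hosts only, and also requires the VirtualBox Extension Pack and an IOMMU (Intel VT-d or AMD-Vi) enabled in the host firmware.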


In general, GPU passthrough is more likely to be possible on a Linux host. See How to setup a gaming machine with GPU passthrough.






answered Dec 31 '15 at 17:15 by Ben N; edited Jan 31 at 17:08

  • The premise that attaching a GPU to a VM is a "Bad Thing" is incorrect; there is
    a large community of people that want to do it to create gaming VMs on bare-metal
    hypervisors. If one wanted to do this in Linux, it would require blacklisting the
    PCIe GPU so the kernel only goes to the chipset graphics; this prevents the
    "shared memory" problems. Preventing Windows from picking up the card is probably
    the hardest issue to solve for this question, and it would likely require
    disabling the device or uninstalling Optimus and the NVidia driver.
    – Tim G, Aug 4 '17 at 1:26











  • @TimG Good point, I've clarified my post slightly. Thanks!
    – Ben N, Aug 4 '17 at 1:28











  • @BenN Great answer, but I need one point clarified. Have you tried this yourself
    on a Windows host? The manual says this is available only on Linux hosts
    (virtualbox.org/manual/ch09.html#pcipassthrough), and I myself didn't have any
    luck getting it to work with a Windows host.
    – Baris Demiray, Jan 22 at 9:32













  • @BarisDemiray If I remember correctly, the command didn't report any errors when
    I tried it, but I don't think I tested that the VM was able to use the device. It
    was a long time ago, though, sorry.
    – Ben N, Jan 30 at 4:12











  • Hi @BenN, thanks a lot for taking the time to reply. That's exactly what happened
    when I tried: no errors on the command line, yet no graphics card in the VM.
    – Baris Demiray, Jan 31 at 8:49









