What does HackerNews think of vgpu_unlock?
Unlock vGPU functionality for consumer grade GPUs.
Given that some folks did manage to unlock vGPU functionality on consumer GPUs [1], the silicon is certainly capable, but I guess Nvidia doesn't want to cannibalise their data centre sales.
[0] - https://edc.intel.com/content/www/us/en/design/ipla/software...
I haven't tried it myself yet (maybe it doesn't even work anymore?), and it might not be an option for businesses.
What would really drive me to support them directly would be (para)virtualization support via SR-IOV/GVT-g. Especially given Microsoft's recent stunts, an increasing number of people (including me and close friends) are evaluating a move to Linux while keeping a Windows VM around, but both NVIDIA and AMD seem adamant about keeping those features gated to enterprise cards (even though it's a software limitation, see community efforts to reverse this [1]) so they can charge businesses ludicrous prices for them.
My opinion is that if Intel wants to get their foot in the door, they should win over enthusiasts by breaking patterns such as this one rather than trying to compete exclusively on performance.
No, the FP64 units aren't physically present on the silicon in high numbers on the non-xx100 dies.
However, some limitations are enforced purely by the driver and its firmware (a small sysfs check follows the list):
- GPU virtualisation (see: https://github.com/DualCoder/vgpu_unlock)
- NVENC video encoder limited to 2 simultaneous streams on consumer cards
- Lite Hash Rate (LHR) limiter to make GPUs less attractive to miners
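To make the "it's just the driver gating it" point concrete, here is a minimal Python sketch that checks whether the host driver exposes any vGPU (mediated device) profiles at all, using the standard Linux VFIO/mdev sysfs layout; whether anything shows up on a consumer card depends on the unlock actually working:

```python
from pathlib import Path

# Standard VFIO mediated-device (mdev) sysfs layout: every PCI device whose
# driver supports vGPU exposes an mdev_supported_types directory, with one
# entry per vGPU profile.
PCI_DEVICES = Path("/sys/bus/pci/devices")

def list_vgpu_types() -> None:
    for dev in sorted(PCI_DEVICES.iterdir()):
        types_dir = dev / "mdev_supported_types"
        if not types_dir.is_dir():
            continue  # this device's driver exposes no mediated-device types
        for t in sorted(types_dir.iterdir()):
            name = (t / "name").read_text().strip()
            avail = (t / "available_instances").read_text().strip()
            print(f"{dev.name}: {t.name} -> {name} ({avail} instances available)")

if __name__ == "__main__":
    list_vgpu_types()
```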
https://github.com/DualCoder/vgpu_unlock
Looks like it can enable a virtual GPU so you don't need to actually use 2 GPUs, but I haven't tried it; it looks pretty cool if you only want to run one GPU and switch.
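For context on what "enabling a virtual GPU" amounts to on the host side, this is a hedged Python sketch of the standard mdev create step; the PCI address and profile name below are placeholders for illustration, not values taken from the project:

```python
import uuid
from pathlib import Path

# Placeholder values: the PCI address and profile name depend entirely on
# your card and on which vGPU types the (unlocked) driver actually exposes.
GPU_PCI_ADDR = "0000:01:00.0"
VGPU_TYPE = "nvidia-259"  # hypothetical profile name, for illustration only

def create_vgpu(pci_addr: str, vgpu_type: str) -> str:
    """Create one mediated vGPU instance and return its UUID (needs root)."""
    dev_uuid = str(uuid.uuid4())
    create_node = Path(
        f"/sys/bus/pci/devices/{pci_addr}/mdev_supported_types/{vgpu_type}/create"
    )
    create_node.write_text(dev_uuid + "\n")
    # The new device now appears under /sys/bus/mdev/devices/<uuid>.
    return dev_uuid

if __name__ == "__main__":
    print("created vGPU:", create_vgpu(GPU_PCI_ADDR, VGPU_TYPE))
```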
You can theoretically game on Linux and Windows at the same time with just a single GPU (2080 Ti). I mainly use it because I only have a single GPU and don't want to lose hardware acceleration on Linux. The NixOS module both applies the vGPU unlocker for consumer graphics cards (so you can split one GPU into multiple virtual GPUs) https://github.com/DualCoder/vgpu_unlock
and merges it with the GeForce drivers so the host GPU doesn't lose its display output.
Then I just pass through the vGPU and the Xbox controllers to the VM with QEMU, along with my main Windows NVMe disk (Windows just works for me both inside and outside a VM, dual boot).
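To illustrate the "vGPU plus controllers plus raw Windows NVMe disk into QEMU" part, here is a rough Python sketch of assembling such a QEMU invocation; every identifier (mdev UUID, disk path, USB IDs) is a placeholder, and a real setup will need more (OVMF firmware, hugepages, CPU pinning, etc.):

```python
import subprocess

# Placeholder identifiers; substitute your own mdev UUID, disk and USB IDs.
VGPU_UUID = "00000000-0000-0000-0000-000000000000"
WINDOWS_NVME = "/dev/disk/by-id/nvme-your-windows-disk"
XBOX_VENDOR, XBOX_PRODUCT = "045e", "02ea"  # example Microsoft controller IDs

qemu_cmd = [
    "qemu-system-x86_64",
    "-enable-kvm",
    "-machine", "q35",
    "-cpu", "host",
    "-smp", "8",
    "-m", "16G",
    # The unlocked vGPU is handed to the guest as a VFIO mediated device.
    "-device", f"vfio-pci,sysfsdev=/sys/bus/mdev/devices/{VGPU_UUID}",
    # Raw passthrough of the existing Windows NVMe install (same disk dual-boots).
    "-drive", f"file={WINDOWS_NVME},format=raw,if=virtio,cache=none",
    # USB passthrough of the controller by vendor/product ID.
    "-device", "qemu-xhci",
    "-device", f"usb-host,vendorid=0x{XBOX_VENDOR},productid=0x{XBOX_PRODUCT}",
]

if __name__ == "__main__":
    subprocess.run(qemu_cmd, check=True)
```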
As for PCIe VFs for virtualized GPUs, AMD does _not_ provide them either outside of their datacenter line.
Note that if you just pass through the whole GPU, that works out of the box on NVIDIA cards.
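For that whole-GPU passthrough case, the usual preparation is rebinding the card's PCI functions to vfio-pci before the VM starts; a rough Python sketch of the standard sysfs driver_override approach, with placeholder addresses:

```python
from pathlib import Path

# Placeholder addresses: a GPU and its HDMI-audio function typically share a slot.
GPU_FUNCTIONS = ["0000:01:00.0", "0000:01:00.1"]

def bind_to_vfio(pci_addr: str) -> None:
    """Unbind a PCI function from its current driver and rebind it to vfio-pci (needs root)."""
    dev = Path(f"/sys/bus/pci/devices/{pci_addr}")
    driver_link = dev / "driver"
    if driver_link.exists():
        (driver_link / "unbind").write_text(pci_addr)
    (dev / "driver_override").write_text("vfio-pci")
    Path("/sys/bus/pci/drivers_probe").write_text(pci_addr)

if __name__ == "__main__":
    for fn in GPU_FUNCTIONS:
        bind_to_vfio(fn)
```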
(btw, for some NV cards, there is: https://github.com/DualCoder/vgpu_unlock to crack that protection)