Enabling VMXNET 3 on TrueNAS Core VM?

TNC Version: TrueNAS-13.0-U6.1
ESXi v8.0.2 Enterprise Plus

Trying to figure out how to enable VMXNET 3 on my TNC VM. It’s currently using the legacy E1000 driver, and unlike my other VMs there is no option for it in the adapter drop-down. I’d like this enabled so the VM can take advantage of my upgraded 10G network.

The VM has open-vm-tools 11.3.5 build 18557794 installed. I read somewhere that this has something to do with vmtools, but not how to fix it.
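For reference, the guest-side install can be sanity-checked from the TrueNAS shell; a minimal sketch, assuming the stock open-vm-tools build:

```
# Report the installed tools version
vmware-toolbox-cmd -v

# Confirm the guest daemon is running (the bracket keeps grep from matching itself)
ps ax | grep '[v]mtoolsd'
```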

E1000 is supposedly the most stable option.

There’s a common misconception that a virtual E1000 is somehow limited to 1G. It isn’t. There is no hardware interface here, only an emulation of a particular interface that was only ever produced and sold at 1G speed.

So the FreeBSD kernel has this hardcoded string that says “hey, I discovered an Intel 1G frob!” Likewise, the driver only offers 10M, 100M, and 1G to the API that e.g. ifconfig uses, but again: there is no hardware here.

The network will go as fast as your underlying real hardware, the hypervisor host, the FreeBSD guest, and your CPU and memory can manage. Probably not the full 10G, but definitely more than 1G.
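If you want to see this for yourself, measure raw TCP throughput between the guest and another 10G host, which takes SMB and the disks out of the picture. A quick sketch with iperf3 (it ships with TrueNAS CORE; the hostname is a placeholder):

```
# On the TrueNAS guest: start a listener
iperf3 -s

# On another 10G machine: run a 30-second test with 4 parallel streams
iperf3 -c truenas.local -t 30 -P 4
```

If that lands well above 1G, the E1000 emulation isn’t your bottleneck.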

I appreciate the response. I should have added more information and the research I did; I was about to go to bed and wanted to get a post out before I went to sleep and got busy in the morning.

So I had been researching it, and while that is true in some cases, it hasn’t been that way for everyone. There are some old posts on the old forum discussing it that show differences ranging from a smidge over 1 Gbps to over 14 Gbps in transfer speed (that example was from a 20G setup).

In my own tests transferring media from my old R710 server to my new R730xd, I’m getting right around gigabit or lower transfer rates. Both servers have a 10G NIC, and both connect over SFP+ to a CRS305 running SwOS. So for my use case something appears to be capping it at some level. That was with threaded transfers of 10 files at a time, each 500 MB to over 1 GB.

I also noticed that when I enabled VMXNET 3 on my Windows Server 2022 VM I was actually getting above my rated ISP fiber speeds: around 120 MB/s, where before I’d usually max out at around 90-95 MB/s.

Also, to show the vSwitch is using the 10G NIC (MTU is 1500):
Basically I’d just like to test it out for myself and see if anything changes. Except I don’t know how to enable that driver, even though FreeBSD seems to come with it.

[screenshot: vSwitch configuration showing the 10G uplink]
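From what I can tell, FreeBSD 12/13 already includes the vmx(4) driver in the GENERIC kernel, so the guest side shouldn’t need anything extra once the hypervisor presents the adapter. This is how I checked from the TrueNAS shell (interface name assumes the first adapter):

```
# A VMXNET 3 adapter shows up as vmx0, vmx1, ... once attached
ifconfig vmx0

# See whether the kernel probed a VMXNET3 device at boot
dmesg | grep -i vmx

# List PCI devices; a VMXNET3 NIC shows a VMware VMXNET3 entry
pciconf -lv | grep -B2 -i vmxnet
```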

It’s been a while since we ran a TrueNAS VM, but let me spin one up in our ESX environment real fast and see what I can figure out.

Was this VM upgraded from an older version of ESX\TrueNAS? In creating this test VM, with a specified guest OS of FreeBSD 12.0, our system defaulted to VMXNET3. The system booted after OS install and connected to the network just fine.

I spun this up on ESXi 8 Update 1e, VM hardware version 19, TrueNAS Core 13.0-U5.

I’m not sure when it changed, but vCenter\ESXi stopped defaulting to the E1000 NIC quite a while ago. The only place we still use the E1000 NIC is for some customers running medical records software based on SCO Unix from 1997, for which no vmtools is available.

You need to add a new NIC to the VM.
VMXNET 3 shows up just fine.


[screenshot: adapter type list showing VMXNET 3]

Add New Device → Network Adapter,
then select Adapter Type.
[screenshot: Add New Device → Network Adapter, selecting the adapter type]

You can’t change the adapter type on an existing NIC.
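If you’d rather script it than click through the UI, the same change can be made with VMware’s govc CLI. A rough sketch (the VM and portgroup names are placeholders):

```
# Add a new NIC with the vmxnet3 adapter type
govc vm.network.add -vm truenas-vm -net "VM Network" -net.adapter vmxnet3

# Then find and remove the old E1000 device
govc device.ls -vm truenas-vm
govc device.remove -vm truenas-vm ethernet-0
```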

That’s the issue. I can’t select VMXNET 3; there is no option.

Same thing powered off, or when adding a second NIC.
[screenshot: adapter drop-down with no VMXNET 3 option]

The VM was a fresh install on a freshly installed ESXi. Neither has been updated/upgraded.

That was my issue.

I had the guest OS wrong the entire time. :man_facepalming:
Normally ESXi barks at you about the wrong guest OS being selected, but that didn’t happen this time.
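For anyone else who hits this: the guest OS type can be corrected without rebuilding the VM, either in the UI (Edit Settings → VM Options → Guest OS) or from the CLI. A sketch with govc (the VM name is a placeholder; freebsd12_64Guest is the ID ESXi uses for 64-bit FreeBSD 12):

```
# Power off, correct the guest OS identifier, power back on
govc vm.power -off truenas-vm
govc vm.change -vm truenas-vm -g freebsd12_64Guest
govc vm.power -on truenas-vm
```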

Now if I could only get SMB speeds above 900 Mbps, from my Windows VM to the TNC SAS3 pool, or from one pool to another over SMB. Guess the driver didn’t do anything in that regard either.

I haven’t tested server-to-server over 10G yet, but if I can’t get over gigabit from inside the same hypervisor, I doubt it will over the wire.
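One way to narrow that down is to take SMB and the pool out of the test: run iperf3 between the Windows VM and the TrueNAS VM on the same host (sketch below; iperf3 ships with TrueNAS CORE and Windows builds exist, and the address is a placeholder). If raw TCP runs near 10G but SMB doesn’t, the limit is SMB or the disks, not the NIC.

```
# On TrueNAS: start the listener
iperf3 -s

# On the Windows VM: test raw TCP throughput to the NAS
iperf3.exe -c 192.168.1.50 -t 30 -P 4
```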

Edit:

Never mind: I’m getting 200 MB/s to 1 GB/s, big B’s, so the driver is working. Going away from E1000 made a crazy improvement, from 900 Mbps to close to 10G.
[screenshot: transfer running at around 1 GB/s]