I have the x16 PCIe slot bifurcated to 4x4x4x4, and the 4 NVMe drives are pooled together in RAID 0.
So anyway, running CrystalDiskMark I'm getting 116 MB/s read and 117 MB/s write.
Running NASTester I get similar results.
Steam download speed on the NAS tops out around 50 MB/s, but around 400 MB/s on my computer.
Load times for Steam games off the NAS are outrageous, literally 20 times longer than on my computer. Heck, Steam itself takes 5 times longer to start up from the NAS.
Copying files I generally get 100 MB/s, but as soon as another computer is accessing the NAS, the speed drops to 10 MB/s and it's virtually unusable.
How is the RAID 0 set up? Are you using ZFS or something else?
The second computer, what speed does it get when accessing the NAS? Is it about 80-90 MB/s? Is your network a complete 10Gbps network?
What kind of speed were you expecting? It sounds like your test results are good, for a 1Gbps network.
How are the Steam games served? SMB? You are leaving out a bit of information, because however you are hosting these Steam games sounds like the problem.
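For reference, the math behind "good for a 1Gbps network": 116-117 MB/s is almost exactly a saturated gigabit link. A quick sketch (the ~6% protocol overhead figure is an assumption for Ethernet/IP/TCP framing; SMB loses a bit more):

```python
# Practical throughput ceiling of a network link, minus protocol overhead.
# The 6% overhead is an assumed figure (Ethernet + IP + TCP headers);
# real-world SMB transfers typically lose slightly more.

def ceiling_mb_per_s(link_gbps: float, overhead: float = 0.06) -> float:
    """Practical payload ceiling in megabytes per second."""
    bits_per_s = link_gbps * 1_000_000_000
    return bits_per_s / 8 / 1_000_000 * (1 - overhead)

print(f"1 GbE  ~ {ceiling_mb_per_s(1):.0f} MB/s")   # roughly 117 MB/s
print(f"10 GbE ~ {ceiling_mb_per_s(10):.0f} MB/s")  # roughly 1175 MB/s
```

So seeing 116-117 MB/s over the network is the signature of a gigabit path somewhere, not of slow disks.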
Looking at the NVMe expansion card, you should not need a bifurcated PCIe bus, unless I'm looking at the wrong one (HYPER M.2 X16 CARD V2).
Also, are the NVMe drives in a RAID 0 handled by the card, or was each drive individually visible within TrueNAS so you created a stripe of the four drives yourself? This makes a big difference.
I once also tried to go down the bifurcation route, since PCIe expansion cards with a PLX switch chip cost at least 250€.
However, due to all the issues that plague bifurcation, I eventually had to bite the bullet and get a card with a PLX chip, which honestly works great.
That said, the slow speeds you are seeing should not be caused by the bifurcation card.
From what test exactly?
How did you run that test? From a networked PC over SMB?
It looks like you hit the 1Gbps networking limit - are you sure everything is running at 10Gbps?
So you run LanCache on TrueNAS, I guess?
Is that 50 MB/s for games that are not in the cache yet (first download), or for games that are already cached?
Did you check the CPU load on the TrueNAS system?
LanCache is mostly single-threaded, so if you see one core on the server stuck at 100%, then you have found your bottleneck for the LanCache.
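If you want to watch per-core load without installing anything, here is a sketch that diffs two `/proc/stat` snapshots (this assumes a Linux-based TrueNAS SCALE box; on CORE/FreeBSD `top -P` shows the same thing). One core pinned near 100% while the average looks idle is the classic single-threaded-bottleneck signature:

```python
# Per-core busy % from two /proc/stat snapshots (Linux only).
import time

def parse_proc_stat(text: str) -> dict:
    """Return {core_name: (busy_ticks, total_ticks)} for cpu0, cpu1, ..."""
    cores = {}
    for line in text.splitlines():
        fields = line.split()
        if fields and fields[0].startswith("cpu") and fields[0] != "cpu":
            ticks = [int(x) for x in fields[1:]]
            idle = ticks[3] + ticks[4]  # idle + iowait columns
            cores[fields[0]] = (sum(ticks) - idle, sum(ticks))
    return cores

def per_core_busy(interval: float = 1.0) -> dict:
    """Sample /proc/stat twice and return busy % per core."""
    with open("/proc/stat") as f:
        before = parse_proc_stat(f.read())
    time.sleep(interval)
    with open("/proc/stat") as f:
        after = parse_proc_stat(f.read())
    return {c: 100 * (after[c][0] - before[c][0]) /
               max(1, after[c][1] - before[c][1]) for c in after}

# Usage on the NAS shell (while a download is running):
#   for core, pct in per_core_busy().items(): print(core, f"{pct:.0f}%")
```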
So you have a Windows VM on the server?
Again, please check the CPU load per core on the TrueNAS system.
Threadripper is great for multithreaded workloads, but Steam itself and the decompression of the chunks it downloads are mostly single-threaded. So if you see one core pinned at 100% load, then that is your bottleneck.
For clarification, I have the 4 NVMe drives together in one pool, striped, with LZ4 compression, using TrueNAS.
Don't you need to bifurcate the x16 PCIe slot to 4x4x4x4 on the mobo when using an NVMe expansion card? Otherwise only one NVMe drive will be recognized.
The NAS, the switch, and my main PC all use 10G SFP+ ports, so I should be expecting speeds up to 1,250 MB/s, but I'm only getting 117 MB/s.
Everything else is connected with 1G Ethernet, but as soon as I begin copying files or watching movies on another device, copy speeds from my main PC to the NAS drop to 10 MB/s.
I'm not running a virtual machine or LanCache. I simply have Steam installed on the NAS network location; I open it from that location on my main PC, and it boots slowly and game load times are really slow. That was my main reason for building the NAS in the first place. I have multiple computers and want to upgrade my switch (to one with more than 3 10G ports) and the other computers so everything uses 10G ports.
Watching movies with my Raspberry Pi works well, and playing 20-year-old games is fine, I guess, but that's basically all I can use it for. That, and storage, but since my pool is striped I back everything up to an HDD, which takes ages.
Lastly, I believe my NAS is set up for SMB, but I can't find that option anywhere.
Oh, also, while running CrystalDiskMark the CPU runs between 1% and 4%, and I see all 16 cores in use; it spiked briefly to 25% while testing the write speed.
Launching Doom (2016) from Steam, the CPU runs between 0% and 2% and all 16 cores are in use. The load time for the main menu took about 90 seconds across 2 different loading screens. Launching Doom on my PC, with Steam installed on a second drive (a 1TB SSD), the load time between the two screens was literally about 2 seconds; I could hardly tell there was a load screen at all.
It does appear something is bottlenecking me to 1G networking speeds, but that doesn't explain the horrendous Steam load times; even at 125 MB/s it should load faster than that.
If the card is the one you named above, then no: it has some sort of PLX chip built in and handles bifurcation onboard. Turn off bifurcation and see if that makes any difference at all. If you lose all but one of your NVMe drives when TrueNAS boots up, then you do need bifurcation and the board is not the one I looked up.
Megabytes (MB/s) or megabits (Mb/s)? This is a huge difference. Terminology matters.
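The conversion, for anyone following along, is a factor of 8:

```python
# Megabits per second (Mb/s) vs megabytes per second (MB/s): 8 bits per byte.
def mbit_to_mbyte(mbps: float) -> float:
    return mbps / 8

print(mbit_to_mbyte(1000))   # a 1 Gb/s link is at most 125.0 MB/s of payload
print(mbit_to_mbyte(10000))  # a 10 Gb/s link is at most 1250.0 MB/s
```

Disk benchmarks like CrystalDiskMark report MB/s, while NICs and switches are rated in Mb/s, which is where these threads usually go sideways.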
Unplug the 1G card, because from what I read, the traffic is going via the 1G link.
Or use the IP address (if it is in a different IP range) to force it to use the 10G network.
If none of that does it, get matching NIC cards. But I think the above is the reason.
I undid the bifurcation setting on the mobo, from 4x4x4x4 back to x16, and now the mobo and TrueNAS only see one NVMe drive.
I tried removing all the 1G Ethernet cables from the switch; still seeing the same speeds.
I then tried directly connecting my PC's 10G port to the second 10G port on the NAS, changed the IP settings on both, and ran a speed test with borderline the same results.
I’ve been around computers since they were untapped electricity in the wild
The other thing I can think of is that, if you are using copper, a pair or two in the cable may be broken. Try replacing the cable. Also, sulfidation of the contacts: pull and push every connector in and out a few times to clean the contacts.
…now that, paints a picture.
Ok, plug the optical SFP+ straight from your NAS to your main PC.
If that is good, the problem is with the QNAP switch.
If that is slow, the two cards may not like each other. Or one just broke. It happens.
Also, is this a new development, where it was good before but is not now, or is this a new setup?
Also, if the 2nd PC gets SFP+ speeds, is the problem in your PC only? Can you swap those PCs? Can you swap the ports they are plugged into on the router?
Router ports can go bad independently; they all work but one. It happens.
Also, is that a managed or unmanaged switch? Because if it is managed, it may be a configuration issue.
That gives you enough to troubleshoot with. It may look like a lot, but these are standard procedures.
I tried directly connecting the PC already with the same speeds.
Actually, I've had this setup for 3 years now and it's always been slow. I've just been busy with life and hadn't had time to figure out why. Now I'm getting back into gaming and it's really bothering me, especially with the amount of money put into the NAS for the simple reason of making it fast.
The Hyper card does need bifurcation, but there are cards with a switch chip on them where you don't need bifurcation.
The bifurcation implementation on many mainboards isn't very stable; if you run into issues like an NVMe disk suddenly disappearing, you should consider upgrading to a card that comes with a PLX chip (this happened to me).
That said, I doubt that this is the cause for your problems.
As for the architecture of your setup (I don't think you answered this yet?): you use an SMB share to access your Steam library, which is located on your NAS, is that correct?
May I ask what was the reason for that choice?
I toyed around with this a couple of years ago, and there are situations (games) that do not behave well when their data is located on an SMB share. That ranges from stutter while loading data from the library during gameplay to games not launching at all.
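Part of the reason is latency, not bandwidth: game startup issues thousands of small reads, and over SMB each one pays a network round trip. A back-of-envelope shows how a 90-second load can coexist with a "fast" link (the request count and latency figures below are illustrative assumptions, not measurements):

```python
# Why small, serial reads blow up load times: per-request latency dominates.
# All numbers here are illustrative assumptions, not measurements.

def load_time_s(requests: int, avg_read_kib: float,
                latency_ms: float, bandwidth_mb_s: float) -> float:
    """Total time = raw transfer time + accumulated per-request latency."""
    transfer = requests * avg_read_kib / 1024 / bandwidth_mb_s
    round_trips = requests * latency_ms / 1000
    return transfer + round_trips

# Assume 50,000 reads of 16 KiB each during startup:
local_nvme = load_time_s(50_000, 16, 0.05, 3000)  # ~0.05 ms per local request
smb_share  = load_time_s(50_000, 16, 1.5, 117)    # ~1.5 ms per SMB round trip
print(f"local NVMe: {local_nvme:.0f} s, SMB share: {smb_share:.0f} s")
```

Under these assumed numbers the SMB case lands in the tens of seconds purely from round trips, which is in the same ballpark as the 90-second Doom load reported above.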
Besides that, coming back to the slow speed: did you check what network/link speed is reported by both the NAS and the PC?
Do they report 10Gbps ?
I have seen cases where I had to force a NIC to the correct link speed, because it would not auto-negotiate the fastest speed with a certain switch (well, in my case it was an SFP+/RJ45 transceiver).
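On a Linux-based NAS you can read the negotiated speed straight from sysfs; `ethtool <iface>` reports the same "Speed:" value. A sketch (the interface name in the usage comment is an assumption, substitute your own):

```python
# Read the negotiated link speed (in Mb/s) the kernel reports for a NIC.
# Linux-only: reads /sys/class/net/<iface>/speed, same value as ethtool's
# "Speed:" line. The kernel reports -1 when the link is down.
from pathlib import Path
from typing import Optional

def link_speed_mbps(iface: str, sysfs: str = "/sys/class/net") -> Optional[int]:
    """Return negotiated speed in Mb/s, or None if down/unknown/missing."""
    try:
        speed = int(Path(sysfs, iface, "speed").read_text().strip())
        return speed if speed > 0 else None
    except (OSError, ValueError):
        return None

# Example (interface name is hypothetical):
#   link_speed_mbps("enp5s0") -> 10000 on a healthy 10GbE link,
#   or 1000 if it silently auto-negotiated down to gigabit.
```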
Lastly, I tried removing my pool of drives and setting up each of the 4 NVMe drives as separate storage. I then tested the speed of the 4 different drives and got about the same speed on all 4 with CrystalDiskMark, about 0.928 Gbit/s; my thought was that one was bad and bottlenecking the rest.
Although CrystalDiskMark seems to just show the speed of a 1G connection, for whatever reason.