One disk in raided boot volume keeps failing

ElectricEel-24.10.0.2 installed Nov 24
data-pool = 2 mirrored VDevs with 2x6TB drives = 11TB total storage
boot-pool = Mirrored DogFish 64GB SSDs

My boot-pool devices are sde and sdf. sde has been steady, but sdf has failed and been replaced four times. In each case, I’ve attempted to use Windows DISKPART to test the failed drive and found it could be read but no changes could be written to the disk. I got “access is denied” when attempting a CLEAN ALL, and DELETE PARTITION reported success without actually removing the partition. I attempted to remove write protection but got an error doing that, too. So, clearly the drives had failed.
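(For what it’s worth, you can check a suspect drive in place from the TrueNAS shell with smartctl instead of pulling it into Windows. A sketch; /dev/sdf here is an assumption — substitute whatever letter the failing disk currently has:)

```shell
# Overall SMART health verdict for the suspect boot SSD
# (/dev/sdf is an assumption; confirm the device letter first)
smartctl -H /dev/sdf

# Raw attributes: look at Reallocated_Sector_Ct and total writes
smartctl -A /dev/sdf

# Errors the drive itself has logged
smartctl -l error /dev/sdf

# Kick off a short self-test, then read the result a few minutes later
smartctl -t short /dev/sdf
smartctl -l selftest /dev/sdf
```

If the drive is genuinely locking itself read-only, the SMART attributes and error log usually show it well before DISKPART does.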

Disk writes appear to be consistent on both drives:

I’ve moved sdf to a different SATA port on the MOBO and replaced the SATA cable. So far, this has had no impact. The only other thing that has not been replaced is the power connection which is daisy chained from the other SSD and one of the drives in data-pool.

I know the DogFish SSDs are cheap and not recommended but I use the very same boot-pool configuration in two Proxmox nodes installed last year and have not had to replace a single drive.

Does anyone have any idea why the same device keeps failing?

Do you keep replacing it with “Dogfish” brand SSDs? I would say stick with reputable brands.

Could be TrueNAS just pushes them harder than Proxmox does…

Sure, I get that. But I hesitate because it’s always /dev/sdf that fails. /dev/sde is the primary in the boot volume and it’s never recorded a single error while I’ve gone through a bunch of other drives on /dev/sdf.

It’s just the boot volume. On FreeNAS, I used to use mirrored USB sticks and rarely had to replace them. Any suggestions on a better SSD that does not break the bank?

Maybe earlier models of the drive had better quality, and the newer ones fail? Some of these brands just buy up the chips they can get at the moment.

And just a reminder: drive names like sdf and sde can move around after a reboot.
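(You can check which physical disk currently holds each letter with the persistent by-id names, which embed the model and serial number and don’t change across reboots — a quick sketch:)

```shell
# Map stable by-id names (model + serial) to the current sdX letters;
# filter out the per-partition links to keep the output short
ls -l /dev/disk/by-id/ | grep -v part

# Confirm which disks the boot pool is actually built from right now
zpool status boot-pool
```

That way you can tell whether it really is the same physical SSD failing each time, or just whichever one lands on sdf.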

Not sure if a bad cable or constant overheating could cause the problems you see; very likely not.

It is pretty quick and easy to just reinstall TrueNAS, if you can live with a short downtime. So the cheapest option might be to back up the configuration regularly and just use one disk. In the case of NVMe drives, this might free up a valuable M.2 slot.
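(If you go the single-disk route, the config backup is easy to automate. TrueNAS keeps its configuration in the SQLite database /data/freenas-v1.db; a minimal cron-job sketch — the destination dataset path is just an example, adjust it to your pool:)

```shell
#!/bin/sh
# Nightly copy of the TrueNAS config DB onto the data pool
# (destination path is an assumption; point it at your own dataset)
DEST=/mnt/data-pool/backups/config
mkdir -p "$DEST"
cp /data/freenas-v1.db "$DEST/freenas-v1-$(date +%Y%m%d).db"

# Keep only the 30 most recent copies
ls -t "$DEST"/freenas-v1-*.db | tail -n +31 | xargs -r rm
```

With a current copy of that file (or a config export from the UI), a reinstall plus config restore gets you back in minutes.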

Any small drive from a known chip manufacturer might be the safest bet, but everything with memory might seem to break the bank at the moment.

For me, “like new” NVMe drives pulled from laptops have been a good and reasonably cheap option, but offers vary and you have to trust the seller to a degree.

Ya, for me, if you want a cheap SSD: Western Digital, SanDisk, Kingston, and Samsung all have budget drives.

I have used TeamGroup SSDs and, “knock on wood,” have not had any issues yet.