Install didn't work it seems. Mounting from zfs:boot-pool/ROOT/default failed with error 2

Hi,

This is a repost as for some reason it went to Apps/Virtualisation - deleted and reposted here.

New install of TrueNAS Core
Oldish I5 12gb RAM
Oldish 80gb SATA drive for boot, 2 500gb SATA drives will be for data.
Installed from bootable ISO on USB stick TrueNAS CORE 13.0-U6.2

Seems to install ok. At the end it says to reboot and remove the installation media.

Do so and it seems to boot ok initially, then at some point I get
Mounting from zfs:boot-pool/ROOT/default failed with error 2

Repeated the installation, reformatting the boot drive first. No change to behaviour.

I have no idea why this happens or what 'error 2' is. I do use Linux somewhat, but not an expert. I have another TrueNAS Core system at my own home and it's been trouble free for years - and it's using a USB stick for boot media.
I tried doing this with a USB stick and got strange errors so I found a usable (no SMART errors) 80gb drive and used that. There is nothing wrong with the drive.
The mainboard has RAID capability, but I have it in AHCI mode, in BIOS boot mode.
It's capable of UEFI but I don't want to mess with it.
Install was from a 32gb Verbatim StoreNGo with Ventoy.
It’s weird issues like this that make me hate linux at times. (Ok, yeah, I know it’s BSD, but still a 'Nix ok?)
Any help getting this working would be much appreciated. It's intended to be a 500gb mirror (RAID 1 by TrueNAS if I can ever get it to work - not hardware RAID 1) just for storing movies and music everyone can access. Nothing dramatic.

Anyone know what’s going on here?

I had to create a new topic because there seems to not be one with that error in the system - though there was one on the old 'read only' forums, but it was from 2012 so…

Appreciate any help. I’ve had 40 years experience with computers, but mostly DOS/Windows (every version) Novell Netware, VMS and DragonOS Focal on a couple of PCs.

Further testing.

Tried TrueNAS Scale. That was even worse. Kernel Panic in Installer.
Too much garbage text spat out to actually read, but something about syncing I think.
So… changed hard drives. Much newer 500gb drive - not going to waste it as a boot device, but figured it would do to prove whether the drive was the problem.
Evidently it isn't. Same outcome - the TrueNAS Core install completes normally, but it fails to load the ZFS root with 'error 2' (whatever that is).
I noticed that the boot drive was not plugged into SATA 0 and the BIOS would try and rearrange the boot order to boot from that first, so I replugged everything so that the boot drive I'm trying to install to became ada0 instead of ada2.
Fresh Install. No change to behaviour.
"Mounting from zfs:boot-pool/ROOT/default failed with error 2"
I have no idea what this means or what to do about it.
The hardware (aside from the disk drives) is IDENTICAL to the system I have at home that's had TrueNAS Core on it; in fact it was initially built on FreeNAS, so I know the hardware is compatible.
Can some kind soul please tell me WTH is going on here. And WTH does 'error 2' mean anyway?

Thanks in advance

Regards

Geoff
ETWebs
VK5GDR

After much searching, found this talking about a clean install of something else.

"The installer does not add zfs_load="YES" to /boot/loader.conf.
Adding it manually after the installation will fix this issue."

Clean install of 21.02 on ZFS filesystem not booting | Netgate Forum
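For what it's worth, the fix from that Netgate post amounts to appending one line to /boot/loader.conf. A minimal sketch of doing that without duplicating the line, shown against a scratch copy since on the real system you'd need to reach the file from a rescue or live environment; the path and the zfs_load line come from the quote above, the scratch-file handling is mine:

```shell
# Sketch only: operate on a scratch copy of loader.conf.
# On the real system the target would be /boot/loader.conf,
# reached from a rescue/live environment (assumption).
conf=$(mktemp)
printf 'autoboot_delay="5"\n' > "$conf"   # stand-in for existing contents

# Append zfs_load="YES" only if no zfs_load line is present already
grep -q '^zfs_load=' "$conf" || echo 'zfs_load="YES"' >> "$conf"

cat "$conf"
```

The grep guard makes the edit idempotent, so running it twice doesn't leave two copies of the line.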

Ok, not sure if the circumstance is the same, but 'error 2' is apparently 'Unknown File System'. Is there an intelligent reason it doesn't just say 'Unknown File System' instead of 'Error 2'? Or is there an aversion to plain language?

If we can assume this error doesn't occur during upgrades (since an upgrade presumably leaves the boot stuff alone), could the fix be to try this? The problem is I have no idea how to do that from the post-halt command line.

Any help appreciated.

Arrrgggh.

Ok I can't figure out how to do anything from the mountroot or db> prompts (well, nothing that will let me edit loader.conf anyway, or even look at it so I can see if that's the issue).
So… took it out and put it in my DragonOS (Lubuntu) system. A *nix FS should be easy to mount in Linux, right? Wrong.
Ubuntu doesn't speak ZFS by default, it seems.
After an hour of research I managed to get the right utils loaded, import the pool and go through the ridiculous sequence of commands necessary to set a mount point etc.
Only you can't mount it. It has something called 'canmount' set to off/no, so nothing you do will mount it (well, maybe something would, I haven't got that far, but it would probably screw it up as a boot disk without another install).
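For anyone retracing this, the sequence on the Ubuntu side was roughly the standard OpenZFS-on-Linux one. Sketched from memory, not a verified transcript; the pool and dataset names come from the error message, `-N` imports without mounting anything and `-R` sets a temporary altroot, both standard `zpool import` flags:

```
$ sudo apt install zfsutils-linux              # Ubuntu doesn't speak ZFS out of the box
$ sudo zpool import                            # list pools visible on the attached disk
$ sudo zpool import -f -N -R /mnt boot-pool    # import without auto-mounting anything
$ zfs get canmount,mountpoint boot-pool/ROOT/default
$ sudo zpool export boot-pool                  # export cleanly before moving the disk back
```

If the dataset reports mountpoint=legacy (common for FreeBSD boot environments), a plain `mount -t zfs boot-pool/ROOT/default /some/dir` is the usual way to mount it by hand; I haven't verified that works on this particular pool.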

Any suggestions? Because 'Install Windows 10, use the hardware raid driver and just share the damn drives from that' is starting to look way better than messing with this for hours on end. I mean, if it won't even do a clean install because of inherent errors in the installer, I'm wondering what else is broken.

Very frustrated.

Did you try messing with this?

No… I can't see the point. It boots from the installed version fine, it just can't mount the root file system because apparently the loader doesn't know what the root file system (ZFS) is. That's what 'error 2' means. So booting legacy or UEFI wouldn't affect that as near as I can see.
The installer was booted from a Ventoy-based USB stick and runs fine, completing successfully, and after removal and reboot the system starts up from the onboard HD, so it's not a boot issue as such. It takes around thirty seconds of loading stuff to get to the point where it tries to mount the root FS and falls over with error 2, leaving me at the mountroot prompt, where you can't do anything to force it to recognise it's allowed to mount ZFS. You can manually specify the root FS but it of course fails with the same error.

This suggests the problem is that the installer creates loader.conf without the necessary line to allow it to mount ZFS, as I suggested further up the thread. Problem is I can't, from mountroot, either look at or edit loader.conf, and the drive will not mount in a Linux box even after I persuaded it to speak ZFS. There's a canmount parameter set to NO/OFF so I can't even get at it.

I think someone needs to look at the installer and ensure the generated loader.conf contains the line zfs_load="YES" because apparently, without that, it won't mount the root FS. I can't believe this got released with this obvious bug.
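For reference, the two prompts involved look roughly like this. This is from memory of stock FreeBSD behaviour, not verified on this exact build; the mountroot retry syntax matches the error message above, and escaping to the loader's OK prompt is the standard way to load a module by hand (on older FreeBSD you needed `load opensolaris` before `load zfs`):

```
mountroot> ?                            # list the devices the kernel can see
mountroot> zfs:boot-pool/ROOT/default   # retry the mount by hand (fails with the same error 2)

(reboot, then at the boot menu choose "3. Escape to loader prompt")
OK load zfs                             # load the module the missing loader.conf line would have
OK boot
```

If booting succeeds after a manual `load zfs`, that would strongly implicate the generated loader.conf.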
And just for laughs, I tried SCALE on the same system. Didn't like it. The installer has a kernel panic despite the hardware supposedly being able to support it (i5 with 12gb). I'm wondering what else is broken, but clearly the installer is. If you're doing an update from within TrueNAS, it probably works fine; this would only show up on fresh installs, I'm pretty sure.

Thanks for responding.

Regards

Could you provide a detailed hardware listing like the one in Stux's signature? Just click on the black arrow in his post to expand the listing. It's currently hidden. It's a good example of what may help.

Um, I suppose. It doesn’t appear to me to be a hardware issue.
Oldish Intel server mainboard with an i5 and 12gb RAM.
Has onboard Intel HW raid, but it’s not in RAID mode, it’s in AHCI mode.
Oldish 80gb SATA drive for boot, 2 500gb SATA drives will be for data.
I tried substituting the newer 500gb SATA drive for testing and it made no difference.
Installed from a 32gb Verbatim StoreNGo using Ventoy, with the TrueNAS CORE 13.0-U6.2 ISO.
Using the recommended legacy BIOS boot rather than UEFI, though the system is capable of it.
The ISO has been checked against the checksum and is correct. The installer completes successfully and the installation boots off the media fine; it looks to me to be purely a software issue, i.e. the loader can't mount the ZFS root because it doesn't know what the file system is (the meaning of 'error 2').
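(For completeness, the checksum check is the usual compare-against-published-hash routine. A runnable sketch, with a scratch file standing in for the real .iso and the "published" value computed on the spot rather than taken from the vendor's download page:)

```shell
# Sketch: verify a download against a published SHA256.
# A scratch file stands in for the real TrueNAS .iso here.
iso=$(mktemp)
printf 'pretend ISO contents\n' > "$iso"

# In real life this value comes from the vendor's download page
published=$(sha256sum "$iso" | awk '{print $1}')

actual=$(sha256sum "$iso" | awk '{print $1}')
if [ "$actual" = "$published" ]; then
    echo "checksum OK"
else
    echo "checksum MISMATCH"
fi
```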

I don't have the mainboard details readily available, but if you need them, I'll see if I can find them on the mainboard. This is a retired Win 10 'Server' and I can confirm installing Win10 on it goes without a hitch, in BIOS legacy boot and AHCI mode. There is nothing wrong with any of the hardware.

Hope that helps and thank you for responding.

I’m not clear on how this could be HW at all, given it boots, but the loader simply can’t speak ZFS it seems.
If there’s more detailed information that will help, please ask and I’ll dig it up.

Regards

FWIW, when I install CORE in a SCALE VM, I have to boot the installer with Legacy BIOS, but I have to run the actual installed OS with UEFI.

Consider trying UEFI.

Um… Ok. I guess I could try that.
I have no idea how that would make any difference; as I said, it's not that it doesn't boot, it does, it just won't load the root file system because to the loader ZFS is an 'unknown file system'. But why the hell not; at least I'll know that's not a factor for sure.
Thanks for responding.

Yeah. I’m wondering if some sort of compatibility layer is getting in the way of the loader finding the right sectors it needs etc.

Agreed.
This looks like a hardware compatibility issue; if it wasn't, the forum would be drowning in threads like this, and it's not.

Testing UEFI seems to be a sensible thing to do, at least as a break from having spent countless hours trying other avenues unsuccessfully.

I’m uncertain how it could be hardware given it boots off the drive.
Tried 3 different sizes/makes of SATA drive. Same effect.

However, I tried UEFI and had no trouble booting the installer off the USB stick; it ran and completed without any issue.
Removed it at the end and rebooted; again it booted off the SATA drive without delay, but it drops to the mountroot prompt with exactly the same 'error 2' failure trying to mount ROOT using ZFS.
I'd have to say this does not surprise me at all. I can't imagine how it could be hardware, to be honest. Not saying it's impossible, I've seen too many strange outcomes with computers for that, but it doesn't seem at all likely given it's just saying the file system is an unknown type - not that it can't find it.
Linux will try and mount it, but it has canmount set to OFF, so you can't mount it on another machine. I also tried installing DragonOS Focal (basically Lubuntu Linux) and that went on without an issue as well.
I can find no hardware issue, nor can I envisage how it could be involved. The loader clearly doesn't know what the ZFS file system is, that's what error 2 means, so this appears to be a BSD installerism of some kind, but I have no idea how to make the changes to loader.conf (assuming that's what the problem is - I can't even look at the darn file) that may fix it. I'm open to suggestions.
I’d perhaps suggest the reason the forums are not full of it is that it likely only occurs on new installations, not upgrades and most people in the forums are long term users - maybe others tried new installs and just binned it when it didn’t work. No way to be sure, but I’m fairly confident this is not a hardware issue and that it’s related to ZFS and incorrect settings in loader.conf the installer generates. If that’s not it, I have no other idea what else it could be. I’ve tried it on three different drives now and I’m going to try it on an oldish Shuttle with an I5 just as a test, it doesn’t have enough SATA ports to make it server material, but it works reliably with both Win10 and DragonOS Focal as well.
IF you have any other ideas, I’d be very pleased to look into them, within the limits of my ability.
One other thing, I have an identical system at home, which dates to when it was FreeNAS and it’s been constantly updated to the current version without an issue, only difference is make and size of HDs.
Never put a foot wrong and it’s only got the minimum 8gb of ram.

Only other thing, this is an install on bare metal, not a VM. Really don’t like VMs much.

Thanks again for responding, it’s appreciated.

Regards

Ok, I guess we need to completely eliminate the possibility of a hardware issue.
I’ll post a detailed hardware description of the mainboard and drives, but it’s an Intel Server Board from circa 2012 with an earlyish Intel I5. I’ll list all the HDs models as well.
I’m out and about until tomorrow, so I’ll post it then.

I had one other thought, is it possible one of you gentlemen could find the time to test the installation on spare hardware, just to see what happens?
It might be a way to eliminate hardware as an issue, or perhaps implicate it.
Just a thought.

Meantime, thank you to all that have given advice, it’s much appreciated.

Regards

I just installed Core TrueNAS-13.0-U6.2 on an old laptop with 4GB of RAM. The first boot after install, choosing the "4. Shutdown" option and removing the USB, did give an error (couldn't find boot?) but CTRL+ALT+DEL and it did a successful boot. Turned off the computer for a few minutes and it boots fine afterwards.

Ancient Dell Inspiron 1525. Service Tag H8W82H1 From around 2009, per Dell service ending.

I just did the computer in my sig with TrueNAS-SCALE-24.04.2 a few days ago, fresh install. The only problem was I had to choose UEFI instead of BIOS in the installer, otherwise I got a 'not a block device' error upon trying to install to SSD or HDD. The computer is from around 2011 (when service ended per Dell).

Ok, thanks for that.
I’m going to try it on the Shuttle and see what happens.
So it sounds like something peculiar to this particular system.
BIOS Version? I’ll see if there’s any updates available.
Why it only shows up with TrueNAS is puzzling - and it’s not as if it doesn’t boot, it does, it just won’t load ROOT FS as part of the boot process.
And how does it cause error 2? I can’t figure out why it does this rather than just crash and burn if it’s a hardware issue. Weird.
When I tried SCALE on it, I got a kernel panic. That was booted in legacy bios mode though. Might try it in UEFI with SCALE and see what happens.
I can’t figure out what could be creating this effect if the hardware isn’t actually broken.
One clue. I've just noticed there is a bunch of text that briefly flashes on the screen before the usual BIOS dialog on power up. It's way too fast for me to read, literally blink and you miss it; first time I've noticed it, so I might see if I can pause it and catch that. Might be complaining about something. CMOS battery? Who knows. I'll see if I can catch it. Is there a way to capture all the BIOS messages from power on? I don't know of any. It's there for a second at most.

I’m also going to try removing the 2nd bank of 4gb RAM and see what happens with the original 8gb, maybe a memory error of some kind I’m missing… That might make sense if it somehow scrambled the FS id in ram I suppose.

At least I can now be reasonably sure it's the specific system that is the issue and not a broader issue with the installer.

Thanks for doing that. Appreciate the time you have given it.
Regards

Ok, I tried the 80gb SATA drive in the I5 Shuttle with 8gb and it worked.
Booted the installer, did a clean install, rebooted and it worked, loaded the ZFS Root without an issue and eventually wound up at the Console screen.
So, it's the hardware. I've tried changing RAM around (don't have a lot of the DDR3 this board needs), but some combinations seemed to make it worse: a couple caused a kernel panic in the installer, others worked fine, including 4gb of known good RAM I borrowed from a Win10 box. But none of them would load the ZFS root after the install finished and the system was rebooted.
Either pretty much all the RAM I have (maybe a dozen sticks of various sizes that fit this board) is faulty or it’s the mainboard, based on available evidence I’m going for the mainboard. Which is annoying as I don’t have another board around that will take an I5 or better that has enough SATA drive ports to make a usable system and/or physical room for 3 hdds (1 boot, 2 x RAID1). So I might have to see if I can get another mainboard from somewhere.
I’ve gone over it with a magnifier and can’t see any obvious issues, no popped caps, corrosion etc, it all looks fine, but clearly something isn’t right.

Thanks for all your help; it seems your first thought, that it was hardware, was in fact correct, though it's the most obscure hardware issue I can recall seeing, to be quite honest. I'm going to ramtest all the RAM I can find just on the off chance it's faulty (stranger things have happened), but I'll do it in another system and go from there. At this point I think we can leave it as a 'hardware broken in some strange way' issue and move on from here.

Thank you again for all your help and patience, much appreciated.

Regards

Should we choose UEFI or BIOS settings during installation?

Both seem to work depending on your hardware. I just managed a successful install in BIOS (on an old IBM Pentium Duo server with only 6 gig of RAM) and on a Shuttle i5 using UEFI for both installer and installed system. The system I had all the problems with seems to have subtle mainboard or possibly RAM issues, but it made no difference booting in BIOS or UEFI.

Cheers
Geoff