Lost zvol data after TrueNAS restore

Hello

I lost my VMs in Proxmox, one of which was a TrueNAS VM with two passed-through disks. I was forced to re-create a fresh TrueNAS VM from scratch, and luckily I had a backup of the TrueNAS settings from the lost 22.04 VM.

I passed through the two old disks, which were previously used as a ZFS pool, imported the dataset, and restored the TrueNAS settings database file.

Now my datasets and the 2 TB zvol (which was previously used as an iSCSI share to a Windows server) are back, but when I configured iSCSI and accessed it via Windows, the disk shows as a GPT Protective Partition and I'm unable to access the data inside it.

Note that the dataset was previously encrypted; during the restoration of the dataset I provided the key file to decrypt it.

I tried investigating with the TrueNAS shell (with ChatGPT's help), and it stated that the zvol is currently showing as empty with no data inside, meanwhile the TrueNAS GUI shows it has 616 GB of data written.

Note that the dataset restore went smoothly, with no errors, after setting up the new VM.

Is there any way I can access the data inside the zvol?

I would really appreciate it if you could help me out.

My guess is that this is the issue: somehow Windows is mounting the iSCSI LUN incorrectly.

I’d suggest doing two things:

  1. Making it clear in the title that the issue is Windows iSCSI mounting the zvol

  2. Documenting the settings used for both the TrueNAS iSCSI share and your Windows iSCSI initiator

Thank you very much!!

I went back through all the settings (extents, initiators, portals, and targets), then realized the issue was caused by a block size mismatch.

My newly created iSCSI extent used a 512-byte logical block size, while my original (restored) zvol had actually been created with 4096-byte blocks. Because of this mismatch, Windows was detecting the LUN as a GPT Protective Partition and couldn't read the data.

After updating the iSCSI extent to use a 4096-byte block size, Windows immediately recognized the disk correctly and all of my data became accessible again.
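For anyone who wants to check for the same mismatch before touching the GUI, this is a rough sketch of the comparison from the TrueNAS shell. The pool/zvol path `tank/iscsi-zvol` is a placeholder for your own, and the `midclt` query assumes a TrueNAS SCALE system (the `blocksize` field on an extent is its logical block size):

```shell
# Block size the zvol was created with, as recorded by ZFS (e.g. 4K, 16K).
zfs get -H -o value volblocksize tank/iscsi-zvol

# Logical block size each iSCSI extent presents to initiators.
# midclt talks to the TrueNAS middleware; jq just trims the output.
midclt call iscsi.extent.query | jq '.[] | {name, blocksize}'
```

If the two values disagree (e.g. the zvol says `4K` but the extent says `512`), editing the extent's Logical Block Size to match the zvol, as described above, is what resolved it in my case.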

Hopefully this follow-up helps anyone else who runs into a similar issue.


Well done!