I’m experiencing almost the same issue. Shares show up as blank, and come back after some time - from 30 or 40 seconds to a couple of minutes. Local users (not AD), all workstations running Windows 11. It is not related to a single client.
P.S.: This system had been working flawlessly on 24.10 - the problem described above appeared within the first few hours on 25.10.
P.S.2: rolled back to 25.04.2.5 - all seems well, but it is a bit early to say for sure.
In short, the autostart is intentionally disabled and there is a manual process documented for migrating your Containers VM volumes to zvols that can then be used to create regular VMs.
It looks more complicated than it really is. It’s mainly just locating the existing volume, running a zfs rename to move it out of the Incus-managed dataset, and placing it as a standard zvol on your pool. From there you just create a VM and set the zvol as its disk.
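As a rough sketch of that flow (the pool name `tank`, the VM name `myvm`, and the exact `.ix-virt` paths here are assumptions - check `zfs list` on your own system before renaming anything):

```shell
# Locate the Incus-managed zvol backing the VM (names and paths are examples)
zfs list -t volume -o name,volsize

# Optionally create a destination dataset to hold standalone VM disks
zfs create tank/vm-disks

# Move the zvol out of the Incus-managed hierarchy so it becomes a plain zvol
zfs rename tank/.ix-virt/virtual-machines/myvm.block tank/vm-disks/myvm
```

From there, create a VM in the UI and attach the renamed zvol as its disk.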
That said, unless you absolutely need to maintain the existing VM and its storage, destroy and rebuild is a fine solution.
This process is not actually correct: it leaves the wrong volmode in place and doesn’t correct the cache settings, assuming the goal is for the zvol to end up with the same properties it would have had if it had been created outside Incus in the first place.
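The thread doesn’t spell out the exact target values, but if the goal is a zvol with the same properties as one created outside Incus, one conservative approach (an assumption on my part, not a documented procedure) is to inspect what Incus set and then reset those properties to inherit from the parent dataset:

```shell
# Inspect the properties Incus set on the renamed zvol (path is a placeholder)
zfs get volmode,primarycache,secondarycache tank/vm-disks/myvm

# Reset them to inherit the pool defaults, matching a zvol created outside Incus
zfs inherit volmode tank/vm-disks/myvm
zfs inherit primarycache tank/vm-disks/myvm
zfs inherit secondarycache tank/vm-disks/myvm
```

Note that a volmode change may not take effect until the volume is renamed again, the pool is re-imported, or the system reboots, so verify with `zfs get` afterwards.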
Not “reckless” - there’s this thing called a “release candidate” where you can actually test pre-release. Maybe you’ve heard of it.
It’s pretty common for people to update to the release version upon release - especially people desperate for the new features. But just because that initial version may be deemed “early adopter”, that should not dictate future intentions. And “Update Profile” does signal future intention, because it decides what updates are shown to you in the future (updates tend to happen in the future, unless the arrow of time somehow flipped), or whether it instead says “System is up to date!”
And so that’s the mistake that Update Profile makes as it stands currently - it conflates present with future. The fix would be to not blank out any of the update profiles - the current running version is irrelevant. Anyway it’s moot and little more than an annoyance, given that the software is not auto-updating, and I can pick and choose what future updates I wish to deploy.
Yes. And what iX calls “release x.0” would charitably be called a beta anywhere else. Upgrading to .0 on release day on a production server is barely, if at all, shy of reckless. iX themselves say so.
I’m not outright disagreeing with you, other than to qualify that if thorough RC testing has been conducted and the new features are a must-have, then the risk vs. reward is acceptable on a per-case basis. And “production” can mean a lot of things - one box out of many in an F5 load-balancing pool, where if one goes down there’s negligible impact, vs. a mission-critical DB with very long restore times, etc.
Plus we need brave souls to go forth and shake out the remaining issues, for the sake of everyone else.
I upgraded to 25.10 Beta1 immediately, so there was no reference documentation. I assumed the .ix-virt dataset would contain a zvol corresponding to the Incus VM, identified the likely zvol by its size, copied it, and created a new virtual machine.
This zvol was used to run OpenWrt, which obviously involved multiple network cards. OpenWrt binds the first network card it recognizes to eth0 and assigns it to the LAN zone, so I initially added only one network card to ensure OpenWrt was reachable from the LAN. After confirming the configuration was correct, I added a second network card and set it to PPPoE.
I have a box running 24.10. I did not upgrade to 25.04 because I have several VMs, and release notes at the time recommended against upgrading. And apparently upgrading to 25.04 causes loss of virtual machines (is this still the case?).
See bug report “I upgrade from 24.10 to 25.04 and my virtual machine is gone” (sorry, I can’t paste links here).
What is the appropriate upgrade path to 25.10? Ideally I don’t want to lose my virtual machines, is there an upgrade path for them?
I tried looking here at the 25.10 upgrade paths page (again can’t use links):
But it seems to suggest that I need to go via 25.04, and if 25.04 doesn’t support my VMs, that doesn’t seem correct.
But maybe my information is dated, as I see 25.04.2 has this in its release notes: “(Updated for 25.04.2) Reintroduced “classic virtualization” with the [Virtual Machines] feature.”
Does this mean it is not safe for me to upgrade to 25.04 and then 25.10?
SMART monitoring appears to have been automagically migrated to cron commands. Judging from the cron command syntax, this makes what was otherwise an easy thing to configure orders of magnitude more complicated.
This morning, I received the following error message.
The command:
midclt call disk.smart_test SHORT '["{serial_lunid}W523L9JK_5000c5009c3c2932", "{serial_lunid}Z529VH6X_5000c500c5869d37", "{serial_lunid}Z529VHB9_5000c500c58672e0", "{serial_lunid}Z52ARP5B_5000c500c7f646e3"]'
Produced the following output:
null
If you don't wish to receive these e-mails, please go to your Cron Job options and check "Hide Standard Output" and "Hide Standard Error" checkboxes.
Is that all I have to do to hide the output? If I do, will that also suppress actual error messages when a self-test fails?
I could have sworn I saw an announcement that 25.10 would feature a New and Improved™, graphical, web-based installer, so the days of “download the ISO and dd it” were behind us. But the docs still say “download the ISO and dd it,” and the 25.10 installer looks identical to the last several releases. What gives?
Thanks for the suggestion. I followed the process as written before we published it and ended up with a working VM, so I don’t believe the volmode change is needed to “activate” the zvol as you say, unless there was a later change after you initially tested it. But I’ll look into it and see if these actions should be added.
I think the post you’re referring to was intended as an early teaser of the web-based install process in TrueNAS Connect from before that project had been formally announced. There was no intention to rewrite the installer in TrueNAS itself for this version.
The web-based installer is part of TrueNAS Connect (currently in beta), and TN Connect depends on 25.10 (it can only monitor and install 25.10 and newer TN systems).
The easiest way to add a system to TN Connect is via a new icon in the top menu (between the TrueCommand and Running Jobs icons).