Different backup solutions use different storage layouts. For example, a vanilla Windows Backup & Restore backup consists of many ~200 MB ZIP archives, so the sizes of the original files don't matter much.
Yeah, listing would be faster, even though keeping 10k pictures in a single directory is usually not the greatest idea. Also, an sVDEV should be enough for just a listing speed-up.
It doesn’t work that way. It has nothing to do with the file size; special_small_blocks is compared against the block size, and big files usually consist of multiple blocks. Also, the maximum value of special_small_blocks for a dataset is 1M (at least in the GUI).
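As a rough sketch of how the property is used (the pool/dataset names here are made up for illustration), it is set per dataset and compared against each block, whose size is capped by recordsize:

```shell
# Hypothetical pool/dataset "tank/pictures" used only as an example.
# Blocks no larger than special_small_blocks land on the special vdev;
# the comparison happens per block, not per file.
zfs set recordsize=1M tank/pictures
zfs set special_small_blocks=64K tank/pictures

# Verify the current values:
zfs get recordsize,special_small_blocks tank/pictures
```

With these settings, a 10 MB photo is stored as roughly ten 1M blocks, so none of its data blocks qualify for the sVDEV; metadata still goes there regardless, which is what speeds up directory listings.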
Perhaps this thread can be useful for explanations: