

So it looks like sometimes it's worth it and sometimes not. With improved detection it could possibly be made worth it almost every time, but I'm not intending to push this idea any further. But it seems that the weaker machines you are worried about would be the least punished in the cases where it doesn't work as well.

To dive deeper I tried constraining the process to a single thread while compressing my downloads. So for a mixed bag it definitely works and is worth it; for everything else, the way they determine whether to use Deflate or Store does not look efficient enough to be worth it, at least on my machine.
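
(For reproducibility: one way to approximate the single-thread constraint is to pin the archiver to one logical CPU via processor affinity. The snippet below is only an assumption about tooling, not how the test above was actually run, and the Bandizip command line shown is hypothetical.)

```python
import subprocess
import psutil  # third-party: pip install psutil

# Hypothetical command line -- the thread doesn't say how the archiver was invoked.
proc = subprocess.Popen(["Bandizip.exe", "c", "downloads.zip", "Downloads"])
psutil.Process(proc.pid).cpu_affinity([0])  # restrict the archiver to a single logical CPU
proc.wait()
```
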
With this out of the way I decided to try some well compressible data, like source code with almost no binaries (1 029 669 KB). After that I added some video files and archives to the mix.

Regarding data, I performed a quick and dirty experiment using Bandizip to compress my downloads folder (4 946 928 KB), containing mostly poorly compressible files (installers, videos and so on), with what they call High-speed archiving active and not, using Deflate set to the maximum available compression (except for the HSA switch).

PS: do you have any data whatsoever that suggests this method of 'smart' compression would benefit anyone, or is it just an idea?

It's a nice idea, but in reality it's not practical: it's going to require way too many man-hours to implement, the UX is going to be too hard for the average normie to understand, and it's going to make the likes of your mom compressing family photos more stressed, because their budget laptop is going to be wasting resources to meet an arbitrary store-instead-of-deflate threshold instead of just archiving with the intent to compress.
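
For readers wondering what 'store instead of deflate' selection would even look like, here is a minimal Python sketch of the probe-a-sample-then-decide approach being debated. This is not Bandizip's or Windows' actual logic; SAMPLE_BYTES, MIN_SAVINGS, and the function names are made-up illustrations.

```python
import zlib
import zipfile
from pathlib import Path

SAMPLE_BYTES = 1 << 20   # probe only the first 1 MiB of each file (arbitrary assumption)
MIN_SAVINGS = 0.05       # require at least 5% size reduction before bothering to deflate

def probe_method(path: Path) -> int:
    """Deflate a small sample of the file and decide Store vs Deflate."""
    with open(path, "rb") as f:
        sample = f.read(SAMPLE_BYTES)
    if not sample:
        return zipfile.ZIP_STORED
    compressed = zlib.compress(sample, level=6)
    savings = 1 - len(compressed) / len(sample)
    return zipfile.ZIP_DEFLATED if savings >= MIN_SAVINGS else zipfile.ZIP_STORED

def smart_archive(folder: str, out: str) -> None:
    with zipfile.ZipFile(out, "w") as zf:
        for path in Path(folder).rglob("*"):
            if path.is_file():
                zf.write(path, path.relative_to(folder),
                         compress_type=probe_method(path))

if __name__ == "__main__":
    smart_archive("Downloads", "downloads.zip")
```

The whole disagreement in this thread is essentially about where those two constants come from and whether the probe pass pays for itself on weaker hardware.
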
Do you want to be responsible for the hack that throws away and recalculates archive dictionaries over a late and very much arbitrary Deflate/Store selection? Also note that an optimal multi-threaded implementation would have an even harder time rolling back. Making compression inherently slower by forcing an additional large read of the section after throwing away CPU time on a deflate pass, just because the result didn't match an arbitrary (probably hard-coded 'sane') compression ratio, deployed across the most potato spinning disks, Intel Shitrons and the like, just to save a few milliseconds-to-seconds when deflating massive blobs on high-end machines, is stupid. The current implementation is already problematic in enough ways.
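
To make the objection concrete, here is a deliberately naive sketch of the 'deflate first, roll back to store if the ratio is bad' pattern criticized above. The names and the 0.95 threshold are invented, and a real streaming archiver could not buffer whole files in memory like this, which is exactly where the extra re-read would come from.

```python
import zlib
import zipfile
from pathlib import Path

THRESHOLD = 0.95  # arbitrary "sane" ratio: keep deflate only if it shaves off at least 5%

def add_with_rollback(zf: zipfile.ZipFile, path: Path) -> None:
    data = path.read_bytes()
    deflated = zlib.compress(data, level=9)      # full deflate pass up front
    if len(deflated) < THRESHOLD * len(data):
        # Good enough: writestr deflates the data again internally,
        # so the probe pass above was duplicated CPU work.
        zf.writestr(path.name, data, compress_type=zipfile.ZIP_DEFLATED)
    else:
        # Not good enough: the deflate output is thrown away entirely
        # and the original bytes are written as-is.
        zf.writestr(path.name, data, compress_type=zipfile.ZIP_STORED)
```
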
Windows Build Number

Adding smart compression that skips compression of files that don't reduce in size enough, and double-click-to-extract

The zip compression tool in Explorer is so painfully slow that for most devs it is unusable. Unzip should be very fast, so that a developer can stay in the flow and not be slowed down by unnecessary wait times.

Uncompress a zip file, especially one containing many small files.

Expected Behavior

Want to quickly look at an archive of zipped log files from something? Might as well make a coffee while you wait.

For me the process is often like this: start the unzip from Explorer, get frustrated that it takes so long, open 7-Zip (or similar), uncompress there again; the third-party tool finishes the unzip while Explorer is still at around 10% done.

There is no real diagnosis needed here: the library doing the unzipping is from 1998 and nobody at Microsoft knows how it works.

Developers work with zip files very often. This is a very well-known issue (, ) and has been for years.

The zip uncompression in Windows Explorer is not very performant. It does not utilize modern multi-core CPUs and does IO in incredibly inefficient ways. Depending on the zip archive, it is so painfully slow that it is unusable.
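
As an illustration of what 'utilize multi-core CPUs' could mean for extraction, here is a minimal Python sketch that decompresses members of an archive in parallel. This is not how Explorer works and not a proposed fix, just a demonstration that zip members are independent and can be inflated concurrently; zlib releases the GIL while inflating, so even threads overlap the work.

```python
import zipfile
from concurrent.futures import ThreadPoolExecutor

def extract_chunk(archive: str, names: list[str], dest: str) -> None:
    # Each worker opens its own handle, so readers don't contend on one file object.
    with zipfile.ZipFile(archive) as zf:
        for name in names:
            zf.extract(name, dest)

def parallel_extract(archive: str, dest: str, workers: int = 8) -> None:
    with zipfile.ZipFile(archive) as zf:
        names = zf.namelist()
    # Deal members out round-robin across the workers.
    chunks = [names[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(extract_chunk, archive, chunk, dest) for chunk in chunks]
        for f in futures:
            f.result()  # re-raise any extraction error

if __name__ == "__main__":
    parallel_extract("logs.zip", "extracted_logs")
```
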
