Welcome to Emulationworld




Subject: Re: ZipMax Suggestions
Posted by: PacFan
Posted on: 02/05/04 02:35 PM



With my proposed changes to .50 (Roman will be reviewing them shortly), I added some INI flags to skip files with either the Archive or ReadOnly bit set. This doesn't affect file sizes by adding zip comments or anything, and it gives you two flags to determine whether ZipMax should attempt to rezip a file.
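Roughly, the skip decision works like this (a minimal sketch; the actual INI flag names and attribute handling in ZipMax are my assumptions, not the real implementation):

```python
# Standard Windows file-attribute bits (defined here rather than taken
# from the stat module, which only guarantees them on Windows).
FILE_ATTRIBUTE_READONLY = 0x01
FILE_ATTRIBUTE_ARCHIVE = 0x20

def should_skip(attrs, skip_archive=True, skip_readonly=True):
    """Return True if a file's attribute bits say the rezipper should leave it alone."""
    if skip_archive and attrs & FILE_ATTRIBUTE_ARCHIVE:
        return True
    if skip_readonly and attrs & FILE_ATTRIBUTE_READONLY:
        return True
    return False
```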

However, what is the context in which you need ZipMax to skip files? Realistically, if you have a 2.0 GHz or faster computer, the time spent recompressing the 20-60 merged archives that change with each version (except with BJWFlate) is minuscule compared to the human time needed to find and download them in the first place.

Assuming you have a complete .XX distribution verified with CLRMamePro, you should burn all the files to read-only media at least once to establish a good base point.

Then, when .XX+1 comes out, first clear the ARCHIVE BIT on all the files. Next, use CMP to fix up all the files that changed. Only the files whose archive bit is now ON need to be moved to a separate folder (temporarily) and rezipped, because *something* changed in them.
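The archive-bit trick above can be sketched as follows (assuming you can read each set's attribute bits after the CMP pass):

```python
FILE_ATTRIBUTE_ARCHIVE = 0x20  # standard Windows attribute bit

def sets_needing_rezip(attrs_by_set):
    """After clearing every archive bit and letting CMP rebuild the sets,
    any zip whose archive bit is back ON was rewritten and needs rezipping."""
    return sorted(name for name, attrs in attrs_by_set.items()
                  if attrs & FILE_ATTRIBUTE_ARCHIVE)
```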

That's the quickest way I know to determine which sets need rezipping.

However, if you're asking for a way to determine whether EACH SINGLE file in a zip has changed, then that, my friend, is a more complex task. Yes, zip comments could help, but realistically they would in many cases bloat the zip back up to the size it was before you tried to rezip it in the first place. Why? Because you'd have to store the version and command line of each packer used: if you ever change a packer version or the order in the INI file, the stored record would no longer match. It's just not realistic to do that, at least inside the ZIP file itself.
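For illustration, a marker that actually stayed valid would have to fingerprint every packer version and command line, in INI order, something like this (purely hypothetical; ZipMax stores no such record):

```python
import hashlib

def packer_fingerprint(packers):
    """Hash the ordered list of (packer version, command line), so the stored
    marker is invalidated the moment any version, option, or ordering changes."""
    h = hashlib.sha1()
    for version, cmdline in packers:
        h.update(f"{version}|{cmdline}\n".encode())
    return h.hexdigest()
```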

Roman has had a comment for a while about keeping track of the smallest size for each file in a database. That would be great, as you could set it up relationally and map command lines and versions to sizes. The problem is, you'd also have to compute SHA-1/MD5 hashes (not just CRC32s) for the files, and all the ZIP format itself stores is a CRC, which adds more complexity.
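Such a database could be as simple as one table keyed by the SHA-1 of the uncompressed file (a sketch under my own made-up schema, not anything Roman has specified):

```python
import hashlib
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE best (sha1 TEXT PRIMARY KEY, size INTEGER,"
           " packer TEXT, cmdline TEXT)")

def record_best(db, data, packer, cmdline, packed_size):
    """Keep only the smallest known packed size per file content.
    Keyed by SHA-1 of the raw data, because ZIP itself only stores a CRC32."""
    sha1 = hashlib.sha1(data).hexdigest()
    row = db.execute("SELECT size FROM best WHERE sha1 = ?", (sha1,)).fetchone()
    if row is None or packed_size < row[0]:
        db.execute("REPLACE INTO best (sha1, size, packer, cmdline)"
                   " VALUES (?, ?, ?, ?)",
                   (sha1, packed_size, packer, cmdline))
    return sha1
```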

It would be great, at some point, to have a SETI-like distributed way of rezipping everything to the smallest size: each MAMEr could take a couple of files a day, and in no time we'd have no additional compression left to squeeze out of all twenty-some zip algorithms and 65K command-line variations out there.

As I toyed around with BJWFlate, I found that different block-size options (e.g. the default of 120, or -b64, -b8192, -b3) can produce a smaller file in some cases. Of course, it comes down to whether the savings are worth the effort at that point. I'm sure MAME will shortly implement support for Deflate64/BZip2, if not the RAR format, which compress much further than "classic zip" even with the power compression tools available, so the point of using 50+ zippers and command-line options will be moot.
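The same "try several settings, keep the smallest" idea can be shown with plain zlib standing in for BJWFlate (a sketch; zlib's level/strategy knobs are not the same thing as -bN block sizes, but the principle is identical):

```python
import zlib

def smallest_deflate(data, levels=(6, 9),
                     strategies=(zlib.Z_DEFAULT_STRATEGY, zlib.Z_FILTERED)):
    """Compress with every combination of settings and keep the smallest output.
    wbits=-15 produces a raw deflate stream, as stored inside a ZIP entry."""
    best = None
    for level in levels:
        for strategy in strategies:
            co = zlib.compressobj(level, zlib.DEFLATED, -15, 9, strategy)
            out = co.compress(data) + co.flush()
            if best is None or len(out) < len(best):
                best = out
    return best
```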


> I think to save processing time in ZipMAX there should be an option for a small
> Comment being added to the zip file to prevent ZipMax from reprocessing the file
> again.. this could also be a URL comment to increase visability..
>
> i understand that this would increase the size of the zip file but this would
> just be an option so don't bitch about filesize

-
Entire Thread
Subject | Posted by | Posted on
* ZipMax Suggestions | kiczek | 01/10/04 01:16 AM
.. Re: ZipMax Suggestions | PacFan | 02/05/04 02:35 PM
.. * Re: ZipMax Suggestions | [Pi] | 02/05/04 04:14 PM
. * Re: ZipMax Suggestions | Roman | 01/10/04 02:51 PM