I don't know why, but I've always had this idea about compression, as in archive compression like RAR/ZIP. I never understood why you couldn't compress one thing any better than another. I don't know how things are really done, so this idea may be so far out of the park that it can't be done, probably due to CPU and processing time. But here is my idea.
Stages of compression. Stage one would be a 25% shrink of the file size, no matter what type of file it is. Take all the combinations of 2 binary digits: 11, 10, 00, 01. If it's 11, compress that into 1; if 00, then 0; if 10, then A; if 01, then B. Then the program would have a table of decompression orders, so stage 1 would just run backwards. There would be multiple stages of compression, each one taking more time than the one before it, but I figure with this method entire servers of images and MP3s and whatever you want could be compressed to half the size. Simple compression. But I don't know how much CPU/time that would take... does anyone have an article or something on why this wouldn't or can't work?
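To make the stage-1 idea concrete, here is a quick Python sketch of the pair-to-symbol table as described (function and table names are mine, just for illustration). One thing the sketch makes visible: since there are four possible input pairs, the output alphabet also has four symbols, and storing one of four symbols still takes 2 bits, so the symbol string isn't actually smaller when written back to disk.

```python
# Stage-1 mapping as described in the post:
# each 2-bit pair becomes one of four symbols.
PAIR_TO_SYMBOL = {"11": "1", "00": "0", "10": "A", "01": "B"}
# The "table of decompression orders" is just the mapping run backwards.
SYMBOL_TO_PAIR = {v: k for k, v in PAIR_TO_SYMBOL.items()}

def stage1_compress(bits: str) -> str:
    """Replace each pair of bits with a single symbol (assumes even length)."""
    return "".join(PAIR_TO_SYMBOL[bits[i:i + 2]] for i in range(0, len(bits), 2))

def stage1_decompress(symbols: str) -> str:
    """Run the table backwards to recover the original bits."""
    return "".join(SYMBOL_TO_PAIR[s] for s in symbols)

# The string is half as many characters, but each character now needs
# 2 bits to store (4 possibilities), so the total is still the same:
# 8 bits in -> 4 symbols x 2 bits = 8 bits out.
print(stage1_compress("11001001"))                      # "10AB"
print(stage1_decompress(stage1_compress("11001001")))   # "11001001"
```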