GZipStream and DeflateStream produce bigger files
I'm trying to use deflate/gzip streams in C#, but it appears that the files are bigger after compression than before.

For example, when I compress a 900 KB docx file, it produces a 1.4 MB one!

And it does this for every file I tried.

Maybe I am wrong in the way I'm doing it? Here is my code:

    using System;
    using System.IO;
    using System.IO.Compression;

    FileStream input = File.OpenRead(Environment.CurrentDirectory + "/file.docx");
    FileStream output = File.OpenWrite(Environment.CurrentDirectory + "/compressedfile.dat");

    GZipStream comp = new GZipStream(output, CompressionMode.Compress);

    // Copy the source through the compressing stream one byte at a time.
    while (input.Position != input.Length)
        comp.WriteByte((byte)input.ReadByte());

    input.Close();

    comp.Close(); // automatically calls Flush before closing
    output.Close();
Sadler answered 5/10, 2010 at 13:27 Comment(7)
You do realize that a compression method that shrinks every arbitrary input by at least one byte cannot exist? So especially if you are trying to compress data that is close to random already, e.g. pre-compressed data, you may see a size increase.Turley
.docx is already compressed using ZIP compression (try renaming it to .zip and having a look inside). I'd be surprised if a second level of compression yielded any benefit.Kehoe
it should effectively do compression only on the flush, so it shouldn't change a thingSadler
@Kehoe > didn't know that, I'll try with another file formatSadler
Have you tried compressing a .txt file?Ensue
well, it works with a txt. didn't know docx was already a compressed formatSadler
There was a bug opened with Microsoft covering this phenomenon, in which DeflateStream increases the size of a previously compressed data stream: connect.microsoft.com/VisualStudio/feedback/details/93930/… It's currently marked "Closed - External". I don't know what that means.Dumps

Such a big difference seems strange to me, but you should keep in mind that docx is itself ZIP-compressed, so there is no reason to compress it again; the result is usually bigger.

Ess answered 5/10, 2010 at 13:32 Comment(1)
yes thanks, I didn't know that, and that's why it didn't work :) tried with .txt and other formats and it seems better. but it still doesn't work on a home-made serialized file type ... but it doesn't matter in the end, just wanted to see how to use those compression streams :)Sadler

Firstly, deflate/gzip streams are remarkably bad at compression when compared to zip, 7z, etc.

Secondly, docx (and all of the MS document formats with an 'x' at the end) are just .zip files anyway. Rename a .docx to .zip to reveal the smoke and mirrors.

So when you run deflate/gzip over a docx, it will actually make the file bigger. (It's like doing a zip with a low level of compression over a zipped file with a high level of compression.)

However, if you run deflate/gzip over HTML, a text file, or anything else that is not already compressed, it will actually do a pretty good job.
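
The effect is easy to reproduce in a few lines. The sketch below (my own illustration, not from the original posts) gzips a highly repetitive buffer and a pseudo-random buffer standing in for already-compressed data such as a docx:

```csharp
using System;
using System.IO;
using System.IO.Compression;

class GzipDemo
{
    // Compress a byte array with GZipStream and return the compressed bytes.
    public static byte[] Gzip(byte[] data)
    {
        var ms = new MemoryStream();
        using (var gz = new GZipStream(ms, CompressionMode.Compress))
            gz.Write(data, 0, data.Length);
        return ms.ToArray(); // ToArray is still valid after the stream is closed
    }

    static void Main()
    {
        // Highly repetitive buffer: a stand-in for plain text or HTML.
        byte[] text = new byte[100_000];
        for (int i = 0; i < text.Length; i++) text[i] = (byte)('a' + i % 4);

        // Pseudo-random buffer: a stand-in for pre-compressed data (docx, jpg, zip).
        byte[] random = new byte[100_000];
        new Random(42).NextBytes(random);

        Console.WriteLine($"repetitive: {text.Length} -> {Gzip(text).Length} bytes");
        Console.WriteLine($"random:     {random.Length} -> {Gzip(random).Length} bytes");
    }
}
```

The repetitive buffer shrinks to a small fraction of its size, while the random one comes out slightly larger than the input because of the gzip framing and block overhead.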

Detruncate answered 5/10, 2010 at 13:39 Comment(3)
yep thanks, as said in other comment didn't know that docx was already compressed. and sure 7z and other libraries are better, but just wanted to try these out to see what they were able to doSadler
This seems like a totally invalid comment: deflate/gzip streams are remarkably bad at compression when compared to zip, 7z, etc. Fact is, 99% of zip files use DEFLATE as the compression format. So zip can be no better than DEFLATE, because it augments the compressed stream with metadata.Dumps
The phenomenon in which a DeflateStream actually increases the size of the previously compressed data is the topic of a bug that was opened with Microsoft in 2006: connect.microsoft.com/VisualStudio/feedback/details/93930/…Dumps

Although it is true, as others have indicated, that the example file you specified was already compressed, the biggest issue is understanding that, unlike most compression utilities, the DeflateStream and GZipStream classes simply try to tokenize/compress a data stream without recognizing that all the additional tokens (overhead) are actually increasing the amount of data required. Zip, 7z, etc. are smart enough to know that if the data is largely random entropy (virtually incompressible), they simply store the data "as-is" ("stored", not compressed) instead of attempting to compress it further.
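
That store-or-compress decision can be sketched in a few lines of C#. The `Pack` method and its one-byte header below are hypothetical (not part of any real archive format): it deflates the input and keeps the raw bytes whenever compression does not pay off:

```csharp
using System;
using System.IO;
using System.IO.Compression;

class StoreOrDeflate
{
    // Hypothetical 1-byte header: 0 = stored as-is, 1 = deflate-compressed.
    public static byte[] Pack(byte[] data)
    {
        var ms = new MemoryStream();
        using (var def = new DeflateStream(ms, CompressionMode.Compress))
            def.Write(data, 0, data.Length);
        byte[] compressed = ms.ToArray();

        // Keep whichever representation is smaller -- the "store" decision
        // that archivers make and a raw DeflateStream does not.
        bool worthIt = compressed.Length < data.Length;
        byte[] payload = worthIt ? compressed : data;
        byte[] packed = new byte[payload.Length + 1];
        packed[0] = (byte)(worthIt ? 1 : 0);
        payload.CopyTo(packed, 1);
        return packed;
    }
}
```

With this scheme, incompressible input costs only the single header byte instead of growing by the deflate block overhead, while compressible input still gets the full benefit.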

Photometry answered 5/10, 2010 at 14:21 Comment(3)
This is not true: Zip, 7z, etc. are smart enough to know that if data is largely random entropy (virtually uncompressable), that they simply store the data "as-is" (store, not compressed), instead of attempting to compress it further. ZIP is merely a file format. It does not "know" anything. A program that produces a ZIP file may do what you describe, but the ZIP format does not.Dumps
The phenomenon in which DeflateStream actually inflates the size of previously compressed data is the topic of a bug that has been opened with Microsoft: connect.microsoft.com/VisualStudio/feedback/details/93930/…Dumps
Wasn't talking about the format (good grief). Was talking about the compression utilities that write data in their corresponding formats.Photometry

I had the same issue compressing databases containing jpg data. I tried DotNetZip as a drop-in replacement and got decent compression (it supports the Compact Framework too!):

MS : 10MB -> 10.0MB
DNZ: 10MB ->  7.6MB
Omnivore answered 11/10, 2011 at 14:53 Comment(0)

I don't think GZipStream and DeflateStream are intended to compress files. You would probably have better luck with a file compressor like SharpZipLib.

Decompress answered 5/10, 2010 at 13:32 Comment(6)
they are made to compress and decompress. I'm currently reading MCTS 70-536 certification book and they are used like that there ^^Sadler
and what are they for? msdn.microsoft.com/en-us/library/… "GZipStream Class Provides methods and properties used to compress and decompress streams."Ess
They're perfectly good at compressing files and for many cases handier than zip since they work straight on the file rather than creating an archive, and you can output them straight from a webserver instead of compressing on the fly every time. Appending .gz to the name (after the original extension rather than replacing it) is common with gzip files. Not to say that SharpZipLib isn't better in a lot of cases though.Crossed
@kite: I worked at Microsoft PSS and helped develop some of the testing. If it's done in an MS certification book, it's equally likely to be a HORRIBLE way of doing things :) Having said that, there is no compressor that can make an already-compressed file smaller.Decompress
@Dave Swersky: That's a rather bold statement. One could use Huffman coding to compress a file, and then zip it to make it even smaller. Depending on how bad your first compressing technique is, a second compressen technique could make it better or worse.Caseworm
@Excel: I stand corrected. I suppose combining two different types of compression could increase the ratio overall, but I will say using ZIP twice will not work.Decompress

© 2022 - 2024 — McMap. All rights reserved.