pdftk compression option

13

120

I use pdftk to compress a pdf using the following command line

pdftk file1.pdf output file2.pdf compress

It works: the size of my file decreased.

Are there [options] to change the compression level?

Or are there other ways to compress my file? It is large because some graphics have a lot of points. Is there a way to convert these graphs to JPG, for instance, and adjust the compression?

Bacteriolysis answered 14/3, 2011 at 9:16 Comment(1)
From my experience, it depends on what is inside your PDF. If it is a graph with many dots, for instance, the best solution is to convert the graph to PNG and include this PNG in the PDF.Bacteriolysis
147

I had the same problem and found two different solutions (see this thread for more details). Both reduced the size of my uncompressed PDF dramatically.

  • Pixelated (lossy):

    convert input.pdf -compress Zip output.pdf
    
  • Unpixelated (lossless, but may display slightly differently):

    gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/screen -dNOPAUSE -dBATCH  -dQUIET -sOutputFile=output.pdf input.pdf
    

Edit: I just discovered another option (for lossless compression), which avoids the nasty gs command. qpdf is a neat tool that converts PDFs (compression/decompression, encryption/decryption), and is much faster than the gs command:

qpdf --linearize input.pdf output.pdf
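
If linearization alone doesn't shrink your file, qpdf can also be asked to recompress the streams themselves. This is an untested sketch; the flags below exist in recent qpdf versions, so check qpdf --help for yours:

qpdf --compress-streams=y --object-streams=generate --recompress-flate input.pdf output.pdf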
Catlett answered 2/5, 2011 at 12:56 Comment(12)
Awesome. gs worked for me, converting a 4MB file to 339K. There was a loss of quality, but it served my purpose sufficiently.Lavinia
You can use "printer" PDF setting for a better quality: gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.5 -dPDFSETTINGS=/printer -dNOPAUSE -dQUIET -dBATCH -sOutputFile=output.pdf input.pdfPrinciple
To adjust quality (and therefore size), vary PDFSETTINGS value. See ghostscript.com/doc/current/Ps2pdf.htm#OptionsDiscomfiture
qpdf didn't change anything for me, the zip compress looks horrible, but the gs with printer setting works wellSaulsauls
Note that the gs command in the answer is not exactly lossless, since it lowers the resolution and quality of embedded JPGs. But it is lossless re. text, keeping it as text, while the convert command converts it to raster graphics.Scissel
With graphicsmagick I had to gm convert, not just convert.Fortuneteller
Setting option -dPDFSETTINGS= to /ebook gives a very nice output for me: sure, it's compressed and some jpg artifacts are visible, but it's totally readable for a reasonable size. Thanks!Adam
gs worked marvelously well; however, the ImageMagick convert in.pdf -compress zip out.pdf increased the size of my file quite dramatically.Turanian
the link to the thread seems to be dead, but it's available in the wayback machine.Ironmonger
convert will turn your text PDF into a bunch of crappy images. This solution is waaaay destructive :-)Numerable
The gs command does not transform the PDF objects, only the pictures. Let me explain: my PDF, created with PDFTK, has some form fields, so I use the flatten option (in PDFTK) to convert these fields into simple text so that they can no longer be edited. But this flattened text does not seem to be transformed by the gs command, so the compression of the file is not complete. The solution would be to convert the PDF to a format that could be processed completely by the gs command. But what format...?Font
The link posted by @Discomfiture to the list of PDFSETTINGS options seems to be dead. This one is working now: ghostscript.readthedocs.io/en/latest/…Bechler
48

I tried to compress a PDF I made from 400 ppi TIFFs (mostly 8-bit, a few 24-bit, with PackBits compression), assembled with tiff2pdf using Zip/Deflate compression. One problem I had with every one of these methods: none of them preserved the bookmarks TOC that I painstakingly created by hand in Acrobat Pro X, not even the recommended ebook setting for gs (a possible workaround is sketched after the results below). Sure, I could just open a copy of the original with the TOC intact and do a Replace Pages, but unfortunately none of these methods did a satisfactory job to begin with: either they reduced the size so much that the quality was unacceptably pixellated, or they didn't reduce the size at all, and in one case actually increased it despite quality loss.

pdftk compress:

no change in size
bookmarks TOC are gone

gs screen:

takes a ridiculously long time and 100% CPU
errors:
    sfopen: gs_parse_file_name failed.
    | ./base/gsicc_manage.c:1651: gsicc_set_device_profile(): cannot find device profile
74.8MB-->10.2MB hideously pixellated
bookmarks TOC are gone

gs printer:

takes a ridiculously long time and 100% CPU
no errors
74.8MB-->66.1MB
light blue background on pages 1-4
bookmarks TOC are gone

gs ebook:

errors:
    sfopen: gs_parse_file_name failed.
      ./base/gsicc_manage.c:1050: gsicc_open_search(): Could not find default_rgb.ic 
    | ./base/gsicc_manage.c:1651: gsicc_set_device_profile(): cannot find device profile
74.8MB-->32.2MB
badly pixellated
bookmarks TOC are gone

qpdf --linearize:

very fast, a few seconds
no size change
bookmarks TOC are gone

pdf2ps:

took very long time
output_pdf2ps.ps 74.8MB-->331.6MB

ps2pdf:

pretty fast
74.8MB-->79MB
very slightly degraded, with a slightly bluish background
bookmarks TOC are gone
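
One possible way to get the bookmarks back afterwards (an untested sketch; pdftk's dump_data/update_info do carry bookmark entries) is to dump the metadata from the original and re-apply it to the compressed file:

pdftk original.pdf dump_data output metadata.txt
pdftk compressed.pdf update_info metadata.txt output compressed_with_toc.pdf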
Sorrows answered 7/9, 2014 at 22:22 Comment(2)
This is extremely valuable research (thank you!) but it's also so much not an answer that for a moment I thought about down-voting.Lodger
How is it not an answer?Sorrows
39

This procedure works pretty well:

pdf2ps large.pdf very_large.ps

ps2pdf very_large.ps small.pdf

Give it a try.

Form answered 25/3, 2012 at 21:38 Comment(9)
This is not a general solution. In many cases, the resulting pdf is larger.Protolanguage
This worked the best out of all mentioned solutions for me. A few large images went down from 23MB to 1.4MB with by far the least quality loss.Leister
@Protolanguage There probably is no general solution because there are different types of documents. However I see your point. It would be nice to have software figuring what works best for us.Bryce
Thanks, this worked for me, while qpdf and gs did not reduce the size of the output file.Sontich
As mentioned here another drawback to this method is that it will break URL links inside the document.Lesialesion
I tried several solutions mentioned on this page and this one worked the best with the best quality. Went from 22.1MB PDF to 220.2MB PS and in the end got a 3.8MB PDF with little perceived quality loss.Phototelegraph
31.6 MB vs. 8.8 MB. This one won for me.Alanaalanah
I suspect that using just ps2pdf large.pdf small.pdf has a chance of giving the same result. (This is not a misprint: ps2pdf accepts PDF input as well.)Shuping
did return exactly the same file size :-(Numerable
34

If the file size is still too large, it could help to use ps2pdf to downscale the resolution of the produced PDF file:

pdf2ps input.pdf tmp.ps
ps2pdf -dPDFSETTINGS=/screen -dDownsampleColorImages=true -dColorImageResolution=200 -dColorImageDownsampleType=/Bicubic tmp.ps output.pdf

Adjust the value of the -dColorImageResolution option to achieve a result that fits your needs (the value is the image resolution in DPI). If your input file is grayscale, replacing Color with Gray, or using both sets of options in the above command, could also help. Further fine-tuning is possible by changing the -dPDFSETTINGS option to /default or /printer. For explanations of all the possible options, consult the ps2pdf manual.
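
For example, a grayscale variant of the above could look like this (an untested sketch; the Gray* parameters are documented alongside the Color* ones in the Ghostscript manual):

pdf2ps input.pdf tmp.ps
ps2pdf -dPDFSETTINGS=/screen -dDownsampleGrayImages=true -dGrayImageResolution=200 -dGrayImageDownsampleType=/Bicubic tmp.ps output.pdf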

Hilde answered 17/2, 2015 at 13:28 Comment(3)
Thanks for tip. With -dPDFSETTINGS I could reduce the size of my scanned PDFDetection
THANK YOU. I don't think there is a general solution for everyone's use case - but I tried almost every solution on this thread and this is the only one that worked for me!!! Being able to "tune" the dColorImageResolution parameter was key - had to get the doc size small enough for this government site to accept it but big enough to be legible. Thanks, uncle Sam, for yet another painful hoop to jump through :)Poultryman
THANK YOU! This reduced the size 10x with no visible image loss :-) `gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/screen -dNOPAUSE -dBATCH -dQUIET -dDownsampleColorImages=true -dColorImageResolution=200 -dColorImageDownsampleType=/Bicubic -sOutputFile=output.pdf input.pdf`Numerable
7

The one-line pdf2ps option (by Lee) actually increased the PDF size. However, the two-step one did better, and the two steps can be combined into a single line using redirection to and from standard input/output and a pipe:

pdf2ps large.pdf - | ps2pdf - small.pdf

This reduced a PDF generated by xsane from 18 MB to 630 KB!

Links are lost, but for the present example that's not a concern... and this was the easiest way to achieve the desired result.

Burnet answered 4/7, 2018 at 21:1 Comment(1)
You could try ps2pdf instead, see my comment to @Lee's answer.Ironmonger
5

pdf2ps large.pdf small.pdf is enough, instead of two steps

pdf2ps large.pdf very_large.ps 
ps2pdf very_large.ps small.pdf

However, ps2pdf large.pdf small.pdf is a better choice.

  • ps2pdf is much faster
  • without additional parameters specified, pdf2ps sometimes produces a larger file.
Impinge answered 8/5, 2018 at 15:26 Comment(11)
Where did you find this option? Is it a feature in some recent version? It did not work for me. Even though I named the output file out.pdf, it became a PS file (mimetype out.pdf says out.pdf: application/postscript).Ironmonger
Mine is the most recent version, 9.xx; not sure about yours.Impinge
I'm using the debian stable ("stretch") packaged version, which is 9.25. Could you check if you indeed have a pdf file by typing mimetype small.pdf?Ironmonger
the output of mimetype small.pdf is small.pdf: application/pdf. I think the program can determine the filetype automatically according to the suffix.Impinge
my pdf2ps is part of ghostscript 9.25Impinge
That's strange. Does your pdf2ps look different than this one?Ironmonger
Oh, ps2pdf works for me to convert pdf-to-pdf. Did you maybe confuse ps2pdf and pdf2ps?Ironmonger
I used pdf2ps. It's the same as in the link. ps2pdf may also work. I didn't try.Impinge
Would be great if you could try if ps2pdf makes any difference to you (maybe by comparing hashes, or by using ”diffpdf“). In my understanding, ps2pdf should always work, since it uses the pdfwrite driver, just like the highest-voted gs answer. — I suggest you to state in your answer that ps2pdf might work as well (or even that it might work in more cases). (If you'll do so, you'll get +1 from me ;).)Ironmonger
@Ironmonger yeah, I made tests. ps2pdf is better.Impinge
Thanks! ps2pdf did reduce the file size 2x. This is worse than gs, which did 10x compression with no image loss, but ps2pdf is easier to remember :-) :-) :-)Numerable
3

After trying gpdf as nullglob suggested, I found that I got the same compression results (a ~900 MB file down to ~30 MB) by just using the cups-pdf printer. This might be easier or preferable if you are already viewing a document and only need to compress one or two documents.

In Ubuntu 12.04, you can install this by

sudo apt-get install cups-pdf

After installation, be sure to go to System Tools > Administration > Printing, right-click 'PDF', and set it to 'enabled'.

By default, the output is saved into a folder named PDF in your home directory.
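
Once the queue is enabled, you can also print to it from the command line (a sketch, assuming the queue is named PDF as above):

lpr -P PDF input.pdf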

Wenwenceslaus answered 7/11, 2012 at 17:17 Comment(3)
i got here because cups-pdf did not compress my pdf.. it made it 5x bigger :-)Numerable
Perhaps it is an issue with the version that you are using, as I am guessing I was using a different version when I wrote this.Wenwenceslaus
It converted a 182 MB PDF into a 237 MB PDF for me. Unfortunately there don't seem to be any relevant options to set.Kickshaw
2

I know there are already many replies to this post, but I had the same problem with a PDF created with PDFTK which I wanted to reduce in size.

As I said in the comments, the gs command was not suitable for my case.

And as had already been said in the comments, the result of the convert command was too degraded for some people.

But in reality, no: the convert command can give a correct PDF with a fairly small size.

With this command, the visual quality is correct, with a compression ratio of 74% on my PDF:

convert -density 125 original_file.pdf -quality 100 -compress Zip compress_file.pdf

With this command, the visual quality is a little lower, but with a compression ratio of 81% on my PDF:

convert -density 100 original_file.pdf -quality 100 -compress Zip compress_file.pdf
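
If Zip output is still too large for image-heavy pages, a JPEG variant of the same command would be worth trying (an untested sketch; lossy, so check the result visually):

convert -density 125 original_file.pdf -quality 85 -compress JPEG compress_file.pdf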

Font answered 22/7, 2022 at 14:1 Comment(2)
This is a really poor option because you remove all the vector graphics, not just the embedded graphics. You essentially convert the pages into compressed bitmaps. That means the crisp text is gone and you could just as well have been collecting jpegs.Motch
Maybe it's a poor option in your case but it worked for the problem I had. So of course I'm posting it here so I can help. And it's not like pasting jpegs.Font
1

After trying all the answers listed here, the best result I have obtained for a PDF with lots of graphics is

pdftocairo input.pdf output.pdf -pdf

I discovered this by opening a PDF with Evince in GNOME and then printing to file. This resulted in better compression and better quality than all the other answers for my PDF file. It seems Cairo graphics is used in the background when printing to a file this way: running pdfinfo on the resulting file reveals

Producer: cairo 1.16.0 (https://cairographics.org)

Midiron answered 11/9, 2021 at 12:24 Comment(2)
I tried it with a 10MB file and it increased it to 13MBPillbox
pdftocairo -pdf -paper A4 input.pdf output.pdfTailgate
0

Okular's Print to PDF

I just turned a 140 MB PDF produced with Keynote into 2.8 MB using Okular's Print to PDF. Text was converted to raster, and zooming in too much clearly shows pixels, but images were kept pretty sharp and it's usable for messaging apps.

Building answered 3/5, 2021 at 16:15 Comment(0)
-2

I didn't see a lot of reduction in file size using qpdf. The best way I found is, after pdftk is done, to use Ghostscript to convert the PDF to PostScript and then back to PDF. In PHP you would use exec:

$ps = $save_path . '/psfile.ps';      // temporary PostScript file
exec('ps2ps2 ' . $pdf . ' ' . $ps);   // PDF -> PostScript (Ghostscript reads PDF input)
unlink($pdf);                         // remove the original PDF
exec('ps2pdf ' . $ps . ' ' . $pdf);   // PostScript -> PDF, written back to the original path
unlink($ps);                          // remove the temporary PostScript file

I used this a few minutes ago to take pdftk output from 490k to 71k.

Sconce answered 24/1, 2012 at 22:48 Comment(1)
PHP adds a completely unnecessary complexity and narrows the applicability of this answerLodger
-4

I had the same issue, and I used this code to compress individual pages, which reduced the file size by up to 1/3 of the original size.

// theDoc is the PDF document object of whichever PDF library the answer is using
// (the library is not named); Flatten() merges each page's content into a single
// layer, which is what reduces the file size here.
for (int i = 1; i <= theDoc.PageCount; i++)
{
    theDoc.PageNumber = i;   // select the page to work on
    theDoc.Flatten();        // flatten its content
}
Translation answered 11/4, 2012 at 15:33 Comment(1)
are you using the C++ library of pdftk?Bacteriolysis
-4

In case you want to compress a PDF which contains a lot of selectable text, on Windows you can use NicePDF Compressor and choose the "Flate" option. After trying everything (cpdf, pdftk, gs), it finally helped me to compress my 1360-page PDF from 500 MB down to 10 MB.

Chenee answered 3/10, 2016 at 11:9 Comment(0)
