Does not leverage previous compilations? #38
The "Journaux et fichiers de sortie > Nettoyer le cache des fichiers" menu entry ("Logs and output files > Clear the file cache") does not seem to agree, as it implies that previous compilations are cached.
Related to Improve_websites_thanks_to_open_source/issues/727.
Still compiling on PLMLaTeX after 8 minutes when not in draft mode.
It seems to have taken about 10 minutes.
With draft mode it takes 9 seconds.
The resolution does not seem to be to blame, but rather the lossless compression.
Compilation time in seconds of the article (as of commit 7c9d421303ca83f287950bc1c05e610e7851dd42): see #40#issuecomment-3599.
See the Bash script to monitor compilations in benjaminloison/kile/issues/24.
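For rough measurements, a minimal way to time a compilation (a sketch, not the referenced script; `main.tex` is a placeholder for the article's entry point):

```bash
# Time a full non-draft compilation; main.tex is a hypothetical entry point.
time pdflatex -interaction=nonstopmode main.tex > /dev/null
```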
Should first make sure that the actual figures are to blame by generating single-color images at the same resolution.
PNG and JPG files account for 60.9 MB of the 62.7 MB total.
So shrinking them to get down to about 1.8 MB (the remaining non-image data: 62.7 − 60.9 = 1.8) would mean about 34x less binary data (62.7 / 1.8 ≈ 35).
DuckDuckGo search Linux generate single color PNG.
Bash script:
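A minimal sketch of such a script, assuming ImageMagick's `convert` (the resolution and color are placeholders):

```bash
#!/usr/bin/env bash
# Generate a single-color PNG; the resolution is a placeholder.
convert -size 7680x4320 xc:yellow single_color.png
```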
It works fine to generate a yellow image.
Locally, not in draft mode, compilation is then about 0.74 seconds, so this shows that the lossless compression is to blame.
Spaces in the resolution passed to `-size` work fine.
Maybe the slowness is due to `.svg`.
Bash script:
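A minimal sketch of such a script, assuming ImageMagick and that the figures are the `.png` files of the current directory (the originals are overwritten):

```bash
#!/usr/bin/env bash
# Replace each PNG with a single-color canvas of the same resolution.
for image in *.png; do
    resolution="$(identify -format '%wx%h' "$image")"
    convert -size "$resolution" xc:yellow "$image"
done
```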
All the generated images are about 13 KB.
Being able to specify the quality as a percentage, as `convert` allows for JPG, would be nice; otherwise one has to manually compute and enforce such a target file size fraction.
DuckDuckGo search Linux convert PNG lossy compress.
https://pngquant.org
pngquant has 5.3k stars.
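A hypothetical usage (the quality range is an assumption):

```bash
# Lossy-compress a PNG with pngquant into a new file.
pngquant --quality=40-60 --force --output figure_compressed.png -- figure.png
```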
For testing, can try generating the PNG as `.jpg`.
Unclear whether JPG compression is wanted, or rescaling the image without rescaling the size used in the article.
Note that rescaling means interpolating.
Can append `.jpg` to ease the JPG compression.
Bash script:
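A minimal sketch, assuming ImageMagick and that the figures are the `.png` files of the current directory (saved as, say, `to_jpg.sh`; the default quality is an assumption):

```bash
#!/usr/bin/env bash
# Convert every PNG to a JPG at the given quality (first argument, default 75).
quality="${1:-75}"
for image in *.png; do
    convert "$image" -quality "$quality" "${image%.png}.jpg"
done
```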
0.77s of compilation locally with quality 1.
How to have lossy compression while having transparency?
Note that JPG compression is useless for images that are already JPG, assuming they are all lossy compressed.
0.68s for quality 25.
0.70s for quality 50.
0.72s for quality 75.
0.95s for quality 100.
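These figures can be reproduced with a sweep over quality values (a sketch reusing the hypothetical `to_jpg.sh` above; GNU time and `main.tex` are assumptions, and the article is assumed to pick up the generated `.jpg` files):

```bash
# Re-encode at each quality and time a compilation.
for quality in 1 25 50 75 100; do
    ./to_jpg.sh "$quality"
    /usr/bin/time -f "quality $quality: %e s" pdflatex -interaction=nonstopmode main.tex > /dev/null
done
```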
If less than a second, then it seems fine.
This looks astonishingly fast but the PDF image quality seems to match.
Maybe JPG data can be written as-is into the PDF, contrary to PNG.
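This can be checked with poppler's `pdfimages` (`main.pdf` is a placeholder):

```bash
# List embedded images; an "enc" value of "jpeg" means the JPG stream is embedded as-is.
pdfimages -list main.pdf
```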
DuckDuckGo and Google search JPG compression with transparency.
Just replacing transparency with white would meet my needs.
DuckDuckGo and Google search Linux convert png to jpg with transparent pixels to white.
`-background white` does not help.
Maybe an intermediate PNG with transparency set to white could be used.
https://imagemagick.org/script/mogrify.php
`-alpha remove` works fine to get white instead of transparency (black by default); source: the Stack Overflow answer 8437562.
Now PLMLaTeX compiles in 8.09 s (that is 14x faster than previously on my laptop and 31x faster than previously on PLMLaTeX).
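Putting both steps together, a sketch of the per-image conversion (the quality value is an assumption):

```bash
# Flatten transparency to white, then lossy-compress as JPG.
convert figure.png -background white -alpha remove -quality 75 figure.jpg
```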
Note that previously the PDF was about 81.6 MB and now it is about 84.9 MB. So the PDF size does not seem to matter much for the compilation time.
Related to Benjamin_Loison/firefox/issues/73.
0.85s locally with initial JPG.
3.46s on PLMLaTeX.
That is respectively 32x and 72x compared to #issuecomment-3623.
Related to Improve_websites_thanks_to_open_source/issues/733#issuecomment-3114463.
Note that the LaTeX code could be modified to actually use `.jpg` instead of `.png`, but as it is in theory a temporary measure, let us stick with `.png` but JPG-encoded.
Specifying `svg-inkscape/` to work around #42 does not seem to involve a delay.
I guess that the server workload is quite significant, as it now takes 20 seconds on PLMLaTeX.
Maybe PNG transparency, or only a few of the considered PNGs, are to blame.
Can probably leverage this JPG trick with `beamer`.