
.7z image format: Idea


Bleek II


I had an idea to use an existing lossless compression format for images. This is not a request as I can get along without it and I don't plan on making it myself.

I've found that the smallest lossless file I can get isn't an image format at all. Using 7-Zip (http://www.7-zip.org/, most know of it) I can save 24-bit .BMP files and compress them down to a .7z file that is 20% to 60% the size of the equivalent .PNG. I know I can't save alpha values this way, but maybe a plugin based on 7-Zip could. I don't know whether there is a way for a plugin to call upon 7-Zip, or to build it into one, but it's an idea. With over 300 GB of storage I don't need this plugin myself, but I thought someone might care to grab this bull by the horns, so to speak.
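The comparison above is easy to reproduce with Python's standard library: zlib is the same Deflate codec PNG uses internally, and the lzma module is 7-Zip's default codec. This is a sketch on synthetic data, so the exact sizes are illustrative only; real images will differ.

```python
import lzma
import zlib

# Synthetic 24-bit "image": a smooth gradient, 256x256 RGB, as raw pixel bytes.
width, height = 256, 256
raw = bytes(
    c
    for y in range(height)
    for x in range(width)
    for c in (x % 256, y % 256, (x + y) % 256)
)

# Deflate is what PNG uses (without PNG's row filters here);
# LZMA is what a .7z archive would use on a BMP's pixel data.
deflate_size = len(zlib.compress(raw, 9))
lzma_size = len(lzma.compress(raw, preset=9))

print(f"raw: {len(raw)}  deflate: {deflate_size}  lzma: {lzma_size}")
```

Which codec wins depends heavily on the image content, which is exactly the point of the discussion that follows.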

Link to comment
Share on other sites

Have you noticed that Alpha makes up 25% of the ARGB color mode? ;)

I thought most modern formats didn't include data that stayed the same - or have I just thought of the greatest way to compress images?


I thought most modern formats didn't include data that stayed the same - or have I just thought of the greatest way to compress images?

Yes, let me give two examples:

Image 1: .PNG = 1.49 MB, .7z = 847 KB

Image 2: .PNG = 6.19 MB, .7z = 1.57 MB


Clearly, you have not heard of OptiPNG and PNGOUT. Granted, LZMA compression is better than what PNG can support, but still, not enough to warrant its own FileType plugin. If you want to compress your files with 7-zip, use 7-zip. :roll:


Clearly, you have not heard of OptiPNG and PNGOUT. Granted, LZMA compression is better than what PNG can support, but still, not enough to warrant its own FileType plugin. If you want to compress your files with 7-zip, use 7-zip. :roll:

:lol: You are very right. It's just a fun idea like a cow with wings carrying a knight into battle.


One thing I haven't seen mentioned is performance. Yes, LZMA has better compression. However, it also takes a lot longer to achieve that. Do you want every image you work on that's larger than 800x600 to take 2 minutes to save?

The Paint.NET Blog: https://blog.getpaint.net/

Donations are always appreciated! https://www.getpaint.net/donate.html


Is the extension the same? If so, that wouldn't be possible.

That is a feature suggestion I haven't got round to starting a thread for (support for multiple filetypes with the same extension).


Is the extension the same? If so, that wouldn't be possible.

That is a feature suggestion I haven't got round to starting a thread for (support for multiple filetypes with the same extension).

Does this regard JPEG-LS? If so: no, it uses ".jls". JPEG/JPEG 2000 seem to have a lossless mode too (?), but JLS is a different thing and is truly lossless. It is especially good for the pictures PNG is not made for (high detail etc.).


  • 2 weeks later...
JPG itself is very near lossless if you save at 100%..

While this is not that related to the discussion of a new - lossless - file format implementation, it is wrong. With PDN's implementation of JPEG, you'll get horrible results especially with red image parts even if you save at 100. With GIMP, you can access enhanced options of JPEG and achieve a "near lossless" quality, but you still get artifacts, visible when zooming into the picture.


Well I didn't claim it was lossless, which it of course isn't

Near enough for me though (but I like PNG better)

I would write plugins, if I knew what kind of plugins were needed.. :(


  • 2 weeks later...
Well I didn't claim it was lossless, which it of course isn't

Near enough for me though (but I like PNG better)

What I mean is not that 100 JPEG in general is bad, the problem is the lack of JPEG options in Paint.NET (especially in 3.30 compared to PNG or TGA - I don't know how many people actually use TGA).

You should look at the thread I linked, and especially scroll down to the second example ("This is an example").

viewtopic.php?f=12&t=20431&p=100540

Here's a screenshot of GIMP's JPEG options (because the one linked in the thread is 404ed):

GIMP_JPEG.png

The subsampling has the most important impact on quality.
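To see why subsampling matters so much, it helps to count the raw samples involved. The sketch below (my own illustration, not from GIMP) compares 4:4:4 (no chroma subsampling) against 4:2:0 (chroma halved in both directions) for an 8-bit YCbCr image:

```python
# Raw plane sizes (bytes) for an 8-bit YCbCr image at a given subsampling.
def ycbcr_bytes(width, height, h_sub, v_sub):
    """One full-resolution luma plane plus two chroma planes
    downsampled by h_sub horizontally and v_sub vertically."""
    luma = width * height
    chroma = (width // h_sub) * (height // v_sub)
    return luma + 2 * chroma

full = ycbcr_bytes(1920, 1080, 1, 1)  # 4:4:4 -- chroma at full resolution
half = ycbcr_bytes(1920, 1080, 2, 2)  # 4:2:0 -- each chroma plane is 1/4 size

print(full, half)  # 4:2:0 carries half the samples of 4:4:4
```

So before the DCT and quantization even run, 4:2:0 has already discarded half of the data, which is why it is the most visible quality knob.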


If PDN doesn't have a custom JPG exporter yet then it might need one for all those settings.

Image.Save(args) probably isn't that flexible

And that would be a lot of work..


  • 1 year later...

Hi everybody,

I'm new to this forum, but I've done some research with PNG and 7-Zip compression :)

First, let me clear up some misconceptions:

1. PNG does not always compress better than a 7-Zip-compressed BMP. 7-Zip compresses many images 25% better (nothing to do with alpha, metadata, etc.), even compared to a PNG that has been optimized with OptiPNG, PNGOut and DeflOpt. However, on some images 7-Zip performs just horribly (due to the lack of pre-compression filters and an ill-suited compression method).

2. LZMA is not the optimal compression method for images. Yes, it is often better than Deflate (5-30%). However, PPMd is better still (another 5-15%) than LZMA. If LZMA compresses a BMP better than PPMd does, the result is usually bigger than a PNG (because the row filtering is the real deal then).

3. The time argument is just wrong. Optimizing a PNG with OptiPNG, PNGOut and DeflOpt (for highest compression) takes something like ten times longer than writing and compressing a BMP with PPMd, and still the result is often larger. Hey, we are talking about the highest possible compression here … no one expects it to complete in a split second!

Also, JPEG on highest settings is still far from lossless (especially regarding color information). There is a special Lossless JPEG codec, but from what I've heard, it's mostly unsupported and its efficiency is somewhere around PNG's. I would be interested in reading some facts on that, though.

My idea:

If a PNG is smaller than a PPMd/LZMA-compressed BMP, this is only because of PNG's row filtering. If, on the other hand, a PPMd/LZMA-compressed BMP is smaller than a PNG, this is only because of the superior compression method. So, can't we combine the advantages of both?

Yes, we can :) We can save an image as a PNG (so the image data is pre-filtered) but without compression and then compress the whole file with PPMd (which then operates on uncompressed, pre-filtered data). Such images are always smaller than a PNG (15 - 50 % smaller than a PNG on highest compression level, 10 - 35 % smaller than a PNG optimized with OptiPNG, PNGOut and DeflOpt). They are also always smaller than BMPs compressed with 7-Zip (0.5 to 60 %).
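The idea above separates PNG's two stages: row filtering (which exposes redundancy) and entropy coding (where Deflate loses to modern codecs). Python's stdlib has no PPMd, so this sketch substitutes LZMA, and it applies only PNG's simple "Up" filter rather than searching for the best filter per row; it is a toy demonstration of the principle, not the proposed format:

```python
import lzma

# Toy 8-bit grayscale "photo": a smooth vertical gradient with mild texture.
width, height = 200, 200
rows = [bytes((y + (x * y) % 7) % 256 for x in range(width)) for y in range(height)]
raw = b"".join(rows)

# PNG-style "Up" filter: each byte minus the byte directly above it (mod 256).
filtered = bytearray(rows[0])  # first row has no row above; left as-is
for y in range(1, height):
    prev, cur = rows[y - 1], rows[y]
    filtered += bytes((cur[x] - prev[x]) & 0xFF for x in range(width))

# Compress the same pixels with and without pre-filtering.
plain = len(lzma.compress(raw, preset=9))
pre_filtered = len(lzma.compress(bytes(filtered), preset=9))

print(f"lzma on raw pixels: {plain}  lzma on Up-filtered pixels: {pre_filtered}")
```

On gradient-like content the filtered stream has far lower entropy, which is exactly why filtering plus a strong compressor can beat both plain PNG and a 7-zipped BMP.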

The whole process is slow, but still not as slow as optimizing a PNG to the max, and it yields higher compression. It consumes a lot of memory, but PNG optimization does, too.

The most time-consuming problem here is finding the best filters … brute-force, it's comparable to the run-time of OptiPNG -o7.

Since only one image at a time is compressed, one could drop some of the 7z archive features (multi-file and directory handling, file names, modification dates etc.) and call the whole thing ".7png".

I have no experience with, or any particular interest in, developing a plugin. However, anyone who regards this as an opportunity to compress images to the max should consider these findings. I have also made some benchmarks, if anyone is interested.


Is the extension the same? If so, that wouldn't be possible.

How does the OptiPNG plugin work, then?

 


Images need to be quick to load and save. Using excessive compression defeats that purpose. Storage is cheap, and network speed is always improving. Unless you're trying to cram a bunch of images onto a 3.5" floppy disk, I really don't see the point.

Paint.NET uses GZIP for compressing the bitmap data in .PDN files. One of the reasons for using GZIP is because it's simple, fast, and provides a reasonable amount of compression. This is mostly important so that featureless layers (e.g. a blank layer) don't have to contribute much to the file's size. The simplicity of the algorithm is a genuine advantage. If necessary, I could probably implement it myself instead of relying on the SharpZipLib DLL -- this is important in the event that a critical bug (or legal issue) is discovered in their implementation. There's no way I could implement something like LZMA myself, however.
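The blank-layer point is easy to demonstrate with Python's gzip module (the same Deflate-based format, though this sketch is my illustration, not PDN's actual code): a featureless layer is pure redundancy and collapses to almost nothing.

```python
import gzip

# A "blank" 800x600 32-bit BGRA layer: all zero bytes,
# as a fully transparent layer's pixel buffer would be.
blank_layer = bytes(800 * 600 * 4)  # 1,920,000 bytes

compressed = gzip.compress(blank_layer, compresslevel=6)
print(len(blank_layer), "->", len(compressed))
```

Even at a middling compression level, nearly two megabytes of empty layer shrink to a few kilobytes, so the simple, fast codec already does the job that matters.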

If you really need to compress an image further, JPEG is available, which gives you the ability to trade quality for storage. For PNGs, tools such as PNGOUT and OptiPNG are good for final post-processing before uploading; they are not practical to use while authoring the images, however, as they take a significant amount of time to process an image.

So, while your idea is a decent one, it simply falls victim to the good-enough-is-good-enough strategies that have already been ubiquitously implemented.


@rick: I don't think gzip is meaningfully easier or harder to implement than most other algorithms.

@???: I think BWT-based compressors have greater potential for compression ratios than LZMA.

BZip2 only falls short because of legal issues: the author switched from arithmetic coding to Huffman coding because he feared patents (arithmetic coding can encode a symbol in a fractional number of bits, e.g. 1.2 bits, while Huffman coding can only assign each symbol an integer number of bits).

But there are other compressors using that algorithm that work far better.

From what I remember, most of the compressors that beat it are Russian compressors that have fun trying to crush every last bit of data (think 12 hours and 2 GB of RAM to compress your 2 MB file; probably exaggerated, but not by much) using several advanced statistical modeling techniques and blended probabilities (predicting the input is what compression is all about, after all), etc.

But if you're after compression ratio, you'd better do something like: data-specific transform -> BWT -> range coding.

As for the legal issues: the bzip author didn't feel that range coding was safe legally, yet Wikipedia says arithmetic coding is encumbered while range coding is not, which is absurd to anyone with a clue, because it's the same algorithm (it's only a precision issue). But patents are usually absurd, so it might be true... or not. Either way, he didn't take the risk.
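The fractional-bits point can be made concrete with Shannon entropy. For a skewed two-symbol source, the entropy is well under one bit per symbol, yet a Huffman code cannot assign less than one whole bit to a symbol (this is standard information theory, sketched here in Python):

```python
import math

# A heavily skewed binary source: 'a' appears 90% of the time.
p = {"a": 0.9, "b": 0.1}

# Shannon entropy: the theoretical minimum bits per symbol,
# achievable (in the limit) by arithmetic/range coding.
entropy = -sum(q * math.log2(q) for q in p.values())

# Huffman must give every symbol a whole number of bits, so with
# only two symbols the best possible code is 1 bit per symbol.
huffman_rate = 1.0

print(f"entropy: {entropy:.3f} bits/symbol vs Huffman: {huffman_rate} bits/symbol")
```

Here arithmetic coding could approach roughly 0.47 bits per symbol, less than half of Huffman's rate, which is exactly the gap the bzip author gave up to avoid the patent question.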


  • 11 years later...

Lovely topic about JPEG algorithms ;)


From the JPEG Group we have:
1. JPEG (JPG/JPEG), the ITU T.81 standard, which does specify a LOSSLESS mode.
But JPEG's lossless compression is not so good compared to other "state-of-the-art" compression algorithms.

2. JPEG-LS (JLS), the ITU T.87 standard, with a very good compression level (8-bit images down to 20~30% of the original) relative to CPU computation power.

JPEG-LS supports both LOSSLESS and NEAR-LOSSLESS compression (with the NEAR factor set to 3 I get very good images at only 1.5% of the original size).

3. JPEG 2000 (J2K), ITU T.8(00~15), which has awesome compression levels but is a very CPU-intensive "state-of-the-art" codec (not sure what "art" stands for there. They named it that, not me).

This time the surprise comes from Microsoft:
4. JPEG XR (JXR), ITU T.8(30-49): quite recent and very fast, with good compression levels too.


Unfortunately, many developers opt to implement only the first (many libraries support just the JPEG baseline, not lossless encoding, i.e. only part of the standard). Many do not implement higher bit depths, larger image support (int instead of short for width and height), and so on.

But so far I have seen that the impossible is possible, so even very slow algorithms can be improved to run faster.

I've managed to reassemble a JPEG-LS encoder/decoder for 16-bit/8-bit grayscale images, supporting both LOSSLESS and NEAR-LOSSLESS, in pure .NET (well, using unsafe code and a little of the native heap, but all the code is actually C# ;) ). It outruns in performance tests both the free open-source lib CharLS (C/C++) (beaten by 40%) and the paid LEADTOOLS JLS codec (native+managed C/C++) (beaten by 70%). So I am guessing someone out there probably has a very fast J2K/JXR codec that we only need to discover. libjpeg-turbo is a nice choice, but it tends to lose image quality for speed, so if the task is best image quality I would avoid it (though it is a nice starting point for optimizations).
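For anyone curious what makes JPEG-LS both fast and effective: the heart of it is the LOCO-I median edge detector (MED) predictor, which picks a prediction for each pixel from its left (a), above (b) and above-left (c) neighbours. This is a sketch of just that predictor (in Python for brevity, not the C# code described above), not of the full codec with context modeling and Golomb coding:

```python
def med_predict(a, b, c):
    """JPEG-LS / LOCO-I median edge detector.
    a = left neighbour, b = above neighbour, c = above-left neighbour."""
    if c >= max(a, b):
        # c is the largest: likely an edge; predict the smaller neighbour.
        return min(a, b)
    if c <= min(a, b):
        # c is the smallest: likely an edge the other way; predict the larger.
        return max(a, b)
    # Smooth region: planar (gradient) prediction.
    return a + b - c

# In a smooth gradient the planar case predicts exactly,
# so the residual to encode is zero.
print(med_predict(10, 14, 12))
```

The encoder then only has to code the (usually tiny) residuals between prediction and actual pixel, which is where the speed/compression balance comes from.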

 

Recently I have discovered some article on subject of image lossless algorithms, and comparison test results that might interest you all:
https://res.mdpi.com/d_attachment/electronics/electronics-09-00360/article_deploy/electronics-09-00360-v2.pdf

 

JPEG-LS tends to be among the fastest, with average compression, but surprise: MS JXR beats it for 16-bit images in encoding/decoding speed.

While the compression levels of the two are very similar for 8-bit images, 16-bit images are compressed better by JXR, so JXR wins this round as well.
The table at the end of the document gives results for different trade-offs between compression ratio, encoding time and decoding time, which might be useful to everyone.

JXR is next on my list to try; I only have to figure out how to implement it and measure its speed.
So if anyone is voting on what to implement in paint.net, I would suggest both JXR and JLS.


6 hours ago, S o L a R said:

Recently I have discovered some article on subject of image lossless algorithms, and comparison test results that might interest you all:
https://res.mdpi.com/d_attachment/electronics/electronics-09-00360/article_deploy/electronics-09-00360-v2.pdf

 

This is a study entitled:  "Optimization of Public Transport Services to Minimize Passengers’ Waiting Times and Maximize Vehicles’ Occupancy Ratios".

It is not related to images in any way.

