
Jpegli: A new JPEG coding library

lxgr
33 replies
23h40m

They'll do literally anything rather than implementing JPEG XL over AVIF in Chrome, huh?

I mean, of course this is still valuable (JPEG-only consumers will probably be around for decades, just like MP3-only players), and I realize Google is a large company, but man, the optics on this...

refulgentis
9 replies
23h20m

I have no love for Google, at all.

It's really hard to say this in public, because people are treating it like a divisive "us or them" issue that's obvious, but the JPEG-XL stuff is _weird_.

I've been in codecs for 15 years, and have never seen behavior as unconstructive as the JPEG-XL work. If I had infinite time and money and it came across my plate, we'd have a year or two of constructive work to do, so that we didn't just rush something in with obvious issues and missed opportunities.

It turned into "just figure out how to merge it in, and if you don't like it, that's malfeasance!" Bread and circuses for commentators, maybe, but it actively prevented even the foundational elements of a successful effort.

lxgr
5 replies
22h55m

To be honest, at Google scale, if there's an objectively good new codec with some early signs of excitement and plausible industry traction, and even Apple managed to deploy it to virtually all of their devices (and Apple isn't exactly known as an "open codec forward" company), not integrating it does seem like either malfeasance or gross organizational dysfunction to me.

refulgentis
4 replies
22h53m

Completely hypothetical scenario: what if the technical reaction was so hostile they invested in it themselves to fix the issues and make it sustainable?

In light of the recent security incident, I'd see that completely hypothetical situation as more admirable.

lxgr
1 replies
21h35m

Hm, are you suggesting they're currently in the process of reimplementing it in a safer and/or more maintainable way as part of Chrome?

In that case, that would just be extremely bad messaging (which I also wouldn't put past Google). Why agitate half of the people on here and in other tech-affine parts of the Internet when they could have just publicly stated that they're working on it and to please have some patience?

Public support by Google, even if it's just in the form of a vague "intent to implement", would be so important for a nascent JPEG successor.

refulgentis
0 replies
19h50m

See comment on peer (TL;DR: I agree, and that's the substance of the post we're commenting on)

LeoNatan25
1 replies
21h14m

Your posts here seem of the “just asking questions” variety—no substance other than being counterculture. Do you have any proof, or even a semblance of a logical reason, to think this?

refulgentis
0 replies
19h51m

It's a gentle joke; it happened, that's TFA. (e.g. see the other threads re: it starting from the JPEG XL repo).

I use asking questions as a way to keep contentious discussions on track without being boorish. And you're right, it can easily be smarmy instead of Socratic without tone, a la the classic internet sarcasm problem.

Gentle note: I only asked one question, and only in the post you replied to.

F3nd0
1 replies
22h33m

Whatever are you referring to? JPEG XL had already been merged into Chromium, prior to being removed again (without a proper reason ever given). As far as I know, the JPEG XL developers have offered to do whatever work was necessary for Chromium specifically, but were never taken up on the offer.

Same thing with Firefox, which has had basic support merged into Nightly, and a couple more patches gathering dust due to lack of involvement from the side of Firefox. Mozilla has since decided to take a neutral stance on JPEG XL, seemingly without doing any kind of proper evaluation. Many other programs (like GIMP, Krita, Safari, Affinity, darktable) already support JPEG XL.

People are not getting upset because projects don’t invest their resources into supporting JPEG XL. People are getting upset because Google (most notably), which has a decisive say in format interoperability, is flat out refusing to give JPEG XL a fair consideration. If they came up with a list of fair conditions JPEG XL has to meet to earn their support, people could work towards that goal, and if JPEG XL failed to meet them, people would easily come to terms with it. Instead, Google has chosen to apply double standards, present vague requirements, and refuse to elaborate. If anyone is ‘preventing even foundational elements of a successful effort’, it’s Google, or more specifically, the part that’s responsible for Chromium.

rockdoe
0 replies
7h4m

has had basic support merged

I read the parent post as saying that this is the problem, i.e. that "complete" support is a mess, because AFAIK even the reference implementation is incomplete and buggy, and that then getting angry at the consumers of it is besides the point and won't lead anywhere (which is what we see in practice).

Browsers supporting a format "a little" is almost worse than not supporting it at all, because it makes the compatibility and interoperability problems worse.

ksec
0 replies
16h43m

"just figure out how to merge it in, and if you don't like it, that's malfeasance!"

It isn't that they merely didn't accept it, or that they weren't hostile. That is completely untrue.

They actively pushed against JPEG XL, despite all the data, even prior to 1.0, suggesting it is or could be better than AVIF in many cases. To the point where they even made up false benchmarks to downplay JPEG XL.

Companies were even willing to pay (I won't name them) and put resources into getting JPEG XL in, because they saw it as that good. But Google still refused.

It is at this point that people thought something dodgy was going on. And then not only did Google not explain themselves, they became even more hostile.

So why the extra hate? Well, partly because this is the company that gave us an overpromised and underdelivered WebP.

whywhywhywhy
8 replies
23h24m

If you do creative work, countless tools just don't support WebP, AVIF or HEIF.

Running into files you can't open in your tools is so common that I have a right-click "convert to PNG" context menu.

jiggawatts
7 replies
23h10m

They don’t support it because Chromium doesn’t.

Because Chromium doesn’t support it, Electron doesn’t.

Because Electron doesn’t, Teams and other modern web apps and web sites don’t either, etc…

If Google just added JPEG XL support instead then it would be… a supported alternative to JPEG.

You're saying working on that is a waste of time because… it's not supported.

modeless
3 replies
22h20m

There's a lot more to format support than Chromium. There's a pretty strong meme out there on the Internet that WebP is evil, despite it being supported in all browsers for years, because there's still a lot of software out there that never added support, and people get annoyed when an image fails to open.

redeeman
0 replies
22h7m

Maybe proprietary software just isn't so good?

lxgr
0 replies
21h37m

I don't think it's evil, but I just don't think it's very good either.

And a graphics format better be damn good (i.e. much, not just a little bit, better than what it's hoping to replace) if it aspires to become widely supported across applications, operating systems, libraries etc.

jug
0 replies
19h42m

At least now with Jpegli, this will surely be the nail in the coffin for WebP?

The article mentions 35% compression improvements over JPEG, and that's at least as much as is usually thrown around when discussing WebP.

n2d4
1 replies
22h59m

Chromium does support WebP and AVIF, yet parent's tools don't.

The_Colonel
0 replies
20h14m

Maybe because WebP and AVIF are actually not that great as image formats. WebP has been critiqued as a very mediocre improvement over JPEG ever since its introduction.

These formats are in Chromium because of Google politics, not because of their technical merit.

whywhywhywhy
0 replies
6h58m

I'm talking about things that don't pass through web tech: Photoshop, After Effects, Illustrator, Final Cut, DaVinci, 3D software, rendering engines, etc.

ur-whale
8 replies
23h11m

They'll do literally anything rather than implementing JPEG XL over AVIF in Chrome, huh?

Before making that kind of claim, I would spend some time looking at the names of the folks who contributed heavily to the development of JPEG XL and the names of the folks who wrote jpegli.

lxgr
7 replies
22h54m

By "they" I mean "Google, the organization", not "the authors of this work", who most likely have zero say in decisions concerning Chrome.

JyrkiAlakuijala
6 replies
22h38m

Chrome advised and inspired this work in their position statement about JPEG XL.

Here: https://www.mail-archive.com/blink-dev@chromium.org/msg04351...

"can we optimize existing formats to meet any new use-cases, rather than adding support for an additional format"

It's a yes!

Of course full JPEG XL is quite a bit better still, but this helps old compatible JPEG to support HDR without 8-bit banding artefacts or gainmaps, gives a higher bit depth for other uses where more precision is valuable, and quite a bit better compression, too.

simon_o
3 replies
7h32m

It's a yes!

Reminds me of "You Scientists Were So Preoccupied With Whether Or Not You Could, You Didn't Stop To Think If You Should."

The arithmetic coding feature was already painful enough. I'm simply not in need of yet another thing that makes jpeg files more complicated to deal with.

After weighing the data, we’ve decided to stop Chrome’s JPEG XL experiment and remove the code associated with the experiment.

We'll work to publish data in the next couple of weeks.

Did that ever happen?

JyrkiAlakuijala
2 replies
6h57m

I don't see any downsides with Jpegli. Your Linux distro admin exchanges the lib for you; you never need to think about it, and you only get smaller and more beautiful files. If you use commercial software (Apple, Adobe, mobile phone cameras, Microsoft, ...) hopefully they migrate by themselves.

If they don't, literally nothing happens.

I fail to see a major downside. Perhaps open up your thinking on this?

Yes, Chrome published data.

simon_o
1 replies
5h26m

you never need to think about it, only get smaller and more beautiful files

People said the same thing last time and it took more than 10 years until decoding worked reliably. I'm simply not interested in dealing with another JPEG++.

Perhaps open up your thinking on this?

Nah, I'm fine. I went JXL-only for anything new I'm publishing, and if people need to switch browsers to see it – so be it.

JyrkiAlakuijala
0 replies
4h49m

It's not a new JPEG++. It creates old JPEGs, fully 100% compatible.

(Of course JXL is better still.)

lxgr
0 replies
21h30m

"can we optimize existing formats to meet any new use-cases, rather than adding support for an additional format"

Only within pretty narrow limits.

Classic JPEG will never be as efficient given its age, in the same way that LAME is doing incredible things for MP3 quality, but any mediocre AAC encoder still blows it out of the water.

This is in addition to the things you've already mentioned (HDR) and other new features (support for lossless coding).

And I'd find their sentiment much easier to believe if Google/Chrome weren't hell-bent on making WebP (or more recently AVIF) a thing themselves! That's two formats essentially nobody outside of Google has ever asked for, yet they're part of Chrome and Android.

IshKebab
0 replies
22h16m

Despite the answer being yes, IMO it's pretty clear that the question is disingenuous, otherwise why did they add support for WebP and AVIF? The question applies equally to them.

dchest
2 replies
23h8m

Some authors of this are also the authors of JPEG XL.

lxgr
1 replies
22h59m

I saw that. It's the tragedy of Google in a nutshell: Great things are being worked on in some departments, but organizational dysfunction virtually ensures that the majority of them will not end up in users' hands (or at least not for long).

ksec
0 replies
16h52m

This is work from Google Research outside the US. You could even call it a different company with the same name. It was Google US who made those AOM / AVIF decisions.

out_of_protocol
1 replies
15h54m

Why no blame on Mozilla for ignoring the format as well?

simon_o
0 replies
7h39m

Mozilla only does what Google tells them.

underlines
22 replies
20h31m

JPEGLI = A small JPEG

The suffix -li is used in Swiss German dialects. It forms a diminutive of the root word, by adding -li to the end of the root word to convey the smallness of the object and to convey a sense of intimacy or endearment.

This obviously comes out of Google Zürich.

Other notable Google projects using Swiss German:

https://github.com/google/gipfeli high-speed compression

Gipfeli = Croissant

https://github.com/google/guetzli perceptual JPEG encoder

Guetzli = Cookie

https://github.com/weggli-rs/weggli semantic search tool

Weggli = Bread roll

https://github.com/google/brotli lossless compression

Brötli = Small bread

billyhoffman
12 replies
19h42m

Google Zürich also did Zopfli, a DEFLATE-compliant compressor that gets better ratios than gzip by taking longer to compress.

Apparently Zopfli = small sweet bread

https://en.wikipedia.org/wiki/Zopfli

codetrotter
10 replies
19h29m

They should do XZli next :D

And write it in Rust

tialaramex
7 replies
19h15m

All of the data transformation (codecs, compression etc.) libraries should be in WUFFS. That's exactly what it's for, and unlike the C++ this was written in, or indeed Rust, it's able to provide real compile-time safety guarantees for the very affordable price of a loss of generality (that is, you can't use WUFFS to write your video game, web browser, word processor, operating system or whatever).

For example, in C++ array[index] has Undefined Behaviour on a bounds miss. Rust's array[index] will panic at runtime on a bounds miss; at least we know what will happen, but what happens isn't great... WUFFS' array[index] will not compile if it could incur a bounds miss: you must show the compiler why index will always be in bounds at the point where the indexing occurs.
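To make the three behaviours concrete, here's a minimal C++ sketch (my own illustration, not WUFFS or jpegli code); the WUFFS case can only appear as a comment, since its bounds check is a compile-time proof obligation rather than executable code:

    #include <array>
    #include <cstdio>
    #include <stdexcept>

    int main() {
      std::array<int, 4> a{1, 2, 3, 4};
      int i = 7;  // out of bounds

      // a[i]    -> Undefined Behaviour in C++: anything may happen.
      // a.at(i) -> defined, but only as a runtime failure (throws).
      try {
        std::printf("%d\n", a.at(i));
      } catch (const std::out_of_range&) {
        std::puts("bounds miss caught at runtime, like Rust's panic");
      }

      // In WUFFS the equivalent indexing expression simply would not
      // compile unless the program proves 0 <= i < a.size() beforehand.
      return 0;
    }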

nigeltao
3 replies
16h8m

Yeah, it's just a coincidence (†), but I started working on Wuffs' LZMA and XZ decoders last December. It works well enough to decode the Linux source code tarball correctly (producing the same output as /usr/bin/xz).

    $ git clone --quiet --depth=1 https://github.com/google/wuffs.git
    $ gcc -O3 wuffs/example/mzcat/mzcat.c -o my-mzcat
    $ ./my-mzcat     < linux-6.8.2.tar.xz | sha256sum 
    d53c712611ea6cb5acaf6627a84d5226692ae90ce41ee599fcc3203e7f8aa359  -
    $ /usr/bin/xz -d < linux-6.8.2.tar.xz | sha256sum 
    d53c712611ea6cb5acaf6627a84d5226692ae90ce41ee599fcc3203e7f8aa359  -
(†) Also, I'm not "Jia Tan"! You're just going to have to trust me on both of those claims. :-/

kuschku
2 replies
7h50m

Also, I'm not "Jia Tan"! You're just going to have to trust me on both of those claims. :-/

No need to trust – it's actually easily verified :) Your activity pattern (blue) is entirely different from Jia Tan's (orange): https://i.k8r.eu/vRRvVQ.png

(Each day is a row, each column is an hour in UTC. A pixel is filled if a user made a commit, wrote a comment, etc during that hour)

Squeeeez
1 replies
5h14m

So, if one person were to login to one account for a certain time, and then switch accounts for a few hours... Hmmm :o)

kuschku
0 replies
2h56m

Then they'd still need to sleep at some time ;)

asveikau
1 replies
18h23m

The xz backdoor was not about safety. Nor was it really about compilation or compile time checks -- they slipped an extra object file to the linker.

nigeltao
0 replies
16h4m

You're right that Wuffs' memory-safety isn't relevant for this attack.

Still, Wuffs doesn't use autotools, and if you're pulling the library from the https://github.com/google/wuffs-mirror-release-c repository then that repo doesn't even contain any binary-data test files.

JyrkiAlakuijala
1 replies
19h21m

Brotli:11 gets within 0.6 % of LZMA density but decodes 3–5x faster.

codetrotter
0 replies
19h15m

Yeah. But it seems to be most widely used in web browsers.

I’ve never seen a .tar.br file, but I frequently download .tar.xz files.

And therefore, a Rust implementation by Google of xz compression and decompression would be most welcome :)

occamrazor
0 replies
10h4m

Zopf means “braid” and it also denotes a medium-size bread type, made with some milk and glazed with yolk, shaped like a braid, traditionally eaten on Sunday.

codetrotter
6 replies
19h36m

The suffix -li is used in Swiss German dialects

Seems similar to -let in English.

JPEGlet

Or -ito/-ita in Spanish.

JPEGito

(Joint Photographers Experts Grupito)

Or perhaps, if you want to go full Spanish

GEFCito

(Grupito de Expertos en Fotografía Conjunta)

sa-code
4 replies
12h55m

Or JPEGchen in high German

7bit
2 replies
10h45m

Or JPEGle in Swabian German. -le as in left, not as in Pebble

mrbluecoat
1 replies
4h4m

Or JPEGito in Spanish

codetrotter
0 replies
1h10m

We already said that one :D

ale42
0 replies
5h13m

Or JPEGino in Italian

cout
1 replies
13h8m

Interesting, I was expecting there to be some connection to the deblocking jpeg decoder knusperli.

JyrkiAlakuijala
0 replies
12h21m

That would give additional savings.

kloch
21 replies
23h45m

10+ bits. Jpegli can be encoded with 10+ bits per component.

If you are making a new image/video codec in 2024 please don't just give us 2 measly extra bits of DR. Support up to 16 bit unsigned integer and floating point options. Sheesh.

JyrkiAlakuijala
10 replies
23h32m

We insert/extract about 2.5 bits more info from the 8-bit JPEGs, leading to about 10.5 bits of precision. There is quite some handwaving necessary here. Basically it comes down to coefficient distributions that have very high probabilities around zero. Luckily, this is the case for all smooth, noiseless gradients where banding could otherwise be observed.

vlovich123
9 replies
23h23m

Does the decoder have to be aware of it to properly display such an image?

spider-mario
8 replies
22h58m

To display it at all, no. To display it smoothly, yes.

JyrkiAlakuijala
7 replies
22h50m

From a purely theoretical viewpoint, 10+ bit encoding will lead to slightly better results even if rendered using a traditional 8-bit decoder. One source of error has been removed from the pipeline.

vlovich123
2 replies
22h20m

How does the data get encoded into 10.5 bits but displayable correctly by an 8 bit decoder while also potentially displaying even more accurately by a 10 bit decoder?

lonjil
0 replies
22h5m

8-bit JPEG actually uses 12-bit DCT coefficients, and traditional JPEG coders accumulate errors from frequently rounding to 8 bits, while Jpegli always uses floating point internally.

JyrkiAlakuijala
0 replies
22h9m

Through non-standard API extensions you can provide a 16-bit data buffer to jpegli.

The data is carefully encoded in the DCT coefficients. They are 12 bits, so in some situations you can get even 12-bit precision. Quantization errors, however, sum up, and the worst case is about 7 bits. Luckily that occurs only in the noisiest content; in smooth slopes we can get 10.5 bits or so.
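A toy back-of-the-envelope C++ sketch (illustration only, not jpegli code) of where the fractional bits live: for a flat 8x8 block the quantized DC coefficient stores 8x the mean sample value, so sub-8-bit levels survive in the coefficient even though an 8-bit output buffer rounds them away (assuming a DC quantization step of 1):

    #include <cmath>
    #include <cstdio>

    int main() {
      // "True" sample value of a flat 8x8 block, with more than 8-bit precision.
      const double v = 100.3;

      // Orthonormal 2D DCT-II: for a constant block, DC = 8 * mean, all ACs = 0.
      const double dc = 8.0 * v;          // 802.4
      const long dc_q = std::lround(dc);  // stored as the integer coefficient 802

      // Inverse transform of the DC alone gives the reconstructed sample value.
      const double decoded = static_cast<double>(dc_q) / 8.0;  // 100.25

      std::printf("true value        : %.3f\n", v);
      std::printf("8-bit decoder out : %ld\n", std::lround(decoded));  // 100
      std::printf("float decoder out : %.3f (keeps the fractional level)\n", decoded);
      return 0;
    }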

qingcharles
1 replies
11h32m

Has there been any outreach to get a new HDR decoder for the extra bits into any software?

I might be wrong, but it seems like Apple is the primary game in town for supporting HDR. How do you intend to persuade Apple to upgrade their JPG decoder to support Jpegli?

p.s. keep up the great work!

Sesse__
1 replies
21h24m

Ideally, the decoder should be dithering, I suppose. (I know of zero JPEG decoders that do this in practice.)

lonjil
0 replies
20h5m

Jpegli, of course, does this when you ask for 8 bit output.

Timon3
4 replies
23h40m

From their Github:

Support for 16-bit unsigned and 32-bit floating point input buffers.

"10+" means 10 bits or more.

johnisgood
3 replies
23h34m

Would not ">10" be a better way to denote that?

mkl
2 replies
20h4m

That means something different, but "≥10" would be better IMHO. Really there's an upper limit of 12, and 10.5 is more likely in practice: https://news.ycombinator.com/item?id=39922511

johnisgood
0 replies
9h19m

Yeah, >=, my bad.

JyrkiAlakuijala
0 replies
13h37m

I decided to call it 10.5 bits based on rather fuzzy theoretical analysis and a small amount of practical experimentation using jpegli for HDR, where more bits are good to have. My thinking is that in the slowest, smoothest gradients (where banding would otherwise be visible) only three quantization decisions generate error: the (0,0), (0,1) and (1,0) coefficients. The others are close to zero. I consider these as adding stochastic variables with uniform error. On average they start to behave a bit like a Gaussian distribution, but each block samples those distributions 64 times, so there are going to be some more and some less lucky pixels. Assume that every block has one maximally unlucky corner pixel which gets all three wrong.

log(4096/3)/log(2) = 10.41

So, very handwavy analysis.

Experimentally it seems to roughly hold.

Retr0id
3 replies
23h41m

It's not a new codec, it's a new encoder/decoder for JPEG.

whywhywhywhy
1 replies
23h26m

This should have been in a H1 tag at the top of the page. Had to dig into a paragraph to find out Google wasn’t about to launch another image format supported in only a scattering of apps yet served as Image Search results.

Retr0id
0 replies
23h23m

It is. (well, h3 actually)

Introducing Jpegli: A New JPEG Coding Library

JyrkiAlakuijala
0 replies
22h13m

I consider codec to mean a pair of encoder and decoder programs.

I don't consider it to necessarily mean a new data format.

One data format can be implemented by multiple codecs.

Semantics and nomenclature within our field are likely underdeveloped, and the use of these terms varies.

bufferoverflow
0 replies
17h15m

For pure viewing of non-HDR content 10 bits is good enough. Very few humans can tell the difference between adjacent shades among 1024 shades. Gradients look smooth.

16 bits is useful for image capture and manipulation. But then you should just use RAW/DNG.

simonw
20 replies
23h40m

High quality results. When images are compressed or decompressed through Jpegli, more precise and psychovisually effective computations are performed and images will look clearer and have fewer observable artifacts.

Does anyone have a link to any example images that illustrate this improvement? I guess the examples would need to be encoded in some other lossless image format so I can reliably view them on my computer.

tedunangst
3 replies
22h50m

What are the file sizes for those two?

edflsafoiewq
1 replies
22h46m

Edit: I'm dumb.

tedunangst
0 replies
22h37m

I would hope the jpegs compress better than png does.

simonw
0 replies
22h26m

The zip file doesn't have the originals, just the PNGs.

pseudosavant
1 replies
20h35m

Perhaps try quality settings in the 70 range, and comparable output file sizes. 95 will be high-quality by definition.

IshKebab
1 replies
22h28m

They're far too high quality to tell anything. There's no point comparing visually lossless images (inb4 "I am amazing and can easily tell...").

deanresin
0 replies
15h36m

Right? I had all 3 open, quickly flipped between them, and saw no difference. Maybe I'm just uncultured.

modeless
0 replies
22h22m

You shouldn't compare the same quality setting across encoders as it's not standardized. You have to compare based on file size.

n2d4
1 replies
23h7m

I can't blame you, my comment originally didn't have the word "linked", I edited that in after I realized the potential misunderstanding. Maybe you saw it before the edit. My bad.

masfuerte
0 replies
22h56m

Ha ha! No worries. I thought it had changed but I frequently skim read and miss things so I wasn't sure.

andrewla
4 replies
22h37m

As an aside, jpeg is lossless on decode -- once encoded, all decoders will render the same pixels. Since this library produces a valid jpeg file, it should be possible to directly compare the two jpegs.

nigeltao
2 replies
15h43m

all decoders will render the same pixels

Not true. Even just within libjpeg, there are three different IDCT implementations (jidctflt.c, jidctfst.c, jidctint.c) and they produce different pixels (it's a classic speed vs quality trade-off). It's spec-compliant to choose any of those.

A few years ago, in libjpeg-turbo, they changed the smoothing kernel used for decoding (incomplete) progressive JPEGs, from a 3x3 window to 5x5. This meant the decoder produced different pixels, but again, that's still valid:

https://github.com/libjpeg-turbo/libjpeg-turbo/commit/6d91e9...

andrewla
0 replies
13m

I was not aware of that; I thought that it was pretty deterministic.

Nonetheless, for this particular case, comparing jpegs decoded into lossless formats is unnecessary -- you can simply compare the two jpegs directly based on the default renderer in your browser.

JyrkiAlakuijala
0 replies
14h7m

Moritz, the author of that improvement, implemented the same for jpegli.

I believe the standard does not specify what the intermediate progressive renderings should look like.

I developed that interpolation mechanism originally for Pik, and Moritz was able to formulate it directly in the DCT space so that we don't need to go into pixels for the smoothing to happen, but he computed it using a few of the low frequency DCT coefficients.

JyrkiAlakuijala
0 replies
22h32m

It is approximately correct. The rendering is standards compliant without pixel perfection and most decoders make different compromises and render slightly different pixels.

littlestymaar
12 replies
23h41m

Why is it written in C++, when Google made Wuffs[1] for this exact purpose?

[1]: https://github.com/google/wuffs

Sesse__
4 replies
22h0m

Wuffs is for the exact _opposite_ purpose (decoding). It can do simple encoding once you know what bits to put in the file, but a JPEG encoder contains a lot of nontrivial machinery that does not fit well into Wuffs.

(I work at Google, but have nothing to do with Jpegli or Wuffs)

nigeltao
1 replies
19h33m

Encoding is definitely in Wuffs' long term objectives (it's issue #2 and literally in its doc/roadmap.md file). It's just that decoding has been a higher priority. It's also a simpler problem. There's often only one valid decoding for any given input.

Decoding takes a compressed image file as input, and those have complicated formats. Roughly speaking, encoding just takes a width x height x 4 pixel buffer, with a very regular structure. It's much easier to hide something malicious in a complicated format.

Higher priority means that, when deciding whether to work on a Wuffs PNG encoder or a Wuffs JPEG decoder next, when neither existed at the time, I chose to have more decoders.

(I work at Google, and am the Wuffs author, but have nothing to do with Jpegli. Google is indeed a big company.)

littlestymaar
0 replies
12h35m

Thanks for the answer!

Hackbraten
1 replies
21h47m

The first paragraph in Wuffs's README explicitly states that it's good for both encoding and decoding?

Sesse__
0 replies
21h20m

No, it states that wrangling _can be_ encoding. It does not in any way state that Wuffs is actually _good_ for it at the current stage, and I do not know of any nontrivial encoder built with Wuffs, ever. (In contrast, there are 18 example decoders included with Wuffs. I assume you don't count the checksum functions as encoding.)

ramrunner0xff
3 replies
23h29m

This is a valid question; why is it being downvoted?

SunlitCat
2 replies
22h18m

Maybe because the hailing for yet another "safe" language starts to feel kinda repetitive?

Java, C#, Go, Rust, Python, modern C++ with smartpointer,...

I mean, concepts for handling files in a safe way are an awesome (and really needed) thing, but inventing a whole new programming language around a single task (even if it's just a transpiler to C)?

vinkelhake
0 replies
22h9m

One of the advantages with wuffs is that it compiles to C and wuffs-the-library is distributed as C code that is easy to integrate with an existing C or C++ project without having to incorporate new toolchains.

littlestymaar
0 replies
21h31m

Maybe because the hailing for yet another "safe" language starts to feel kinda repetitive?

Ah yeah, because the endless stream of exploits and “new CVE allows for zero-click RCE, please update ASAP” doesn't feel repetitive?

I mean a concepts for handling files in a safe way are an awesome (and really needed) thing, but inventing a whole new programming language around a single task (even if it's just a transpiler to c)?

It's a “single task” in the same way “writing compilers” is a single task. And just as we're happy that LLVM IR exists, having a language dedicated to writing codecs (of which there are dozens) is a worthwhile goal, especially since they are both security critical and have stringent performance needs for which existing languages (be they managed languages or Rust) aren't good enough.

vanderZwan
1 replies
23h10m

This is pure speculation, but I'm presuming Wuffs is not the easiest language to use during the research phase, but more of a thing you would implement a format in once it has stabilized. And this is freshly published research.

Probably would be a good idea to get a port though, if possible; improving both safety and performance sounds like a win to me.

nigeltao
0 replies
19h36m

Yeah, you're right. It's not as easy to write Wuffs code during the research phase, since you don't just have to write the code, you also have to help the compiler prove that the code is safe, and sometimes refactor the code to make that tractable.

Wuffs doesn't support global variables, but when I'm writing my own research phase code, sometimes I like to just tweak some global state (without checking the code in) just to get some experimental data: hey, how do the numbers change if I disable the blahblah phase when the such-and-such condition (best evaluated in some other part of the code) holds?

Also, part of Wuffs' safety story is that Wuffs code cannot make any syscalls at all, which implies that it cannot allocate or free memory, or call printf. Wuffs is a language for writing libraries, not whole programs, and the library caller (not callee) is responsible for e.g. allocating pixel buffers. That also makes it harder to use during the research phase.

Blackthorn
0 replies
23h4m

Google is a big company.

jug
9 replies
19h56m

Their claims about Jpegli seem to make WebP obsolete regarding lossy encoding? Similar compression estimates as WebP versus JPEG are brought up.

Hell, I question if AVIF is even worth it with Jpegli.

It's obviously "better" (higher compression) but wait! It's 1) a crappy, limited image format for anything but basic use with obvious video keyframe roots and 2) terribly slow to encode AND 3) decode due to not having any streaming decoders. To decode, you first need to download the entire AVIF to even begin decoding it, which makes it worse than even JPEG/MozJPEG in many cases despite their larger sizes. Yes, this has been benchmarked.

JPEG XL would've still been worth it though because it's just covering so much more ground than JPEG/Jpegli and it has a streaming decoder like a sensible format geared for Internet use, as well as progressive decoding support for mobile networks.

But without that one? Why not just stick with JPEGs, then?

ksec
6 replies
16h57m

I share a similar view. I'd even go as far as to say jpegli (and the potential with XYB ICC) makes JPEG XL just not quite good enough to be worth the effort.

The good thing is that the author of XL (Jyrki) claims there is potential for 20-30% bitrate savings at the low end. So I hope the JPEG XL encoder continues to improve.

JyrkiAlakuijala
2 replies
13h29m

You can always use JPEG XL lossless JPEG1 recompression to get some savings in the high end quality, too — if you trust the quality decision heuristics in jpegli/guetzli/other jpeg encoder more than the JPEG XL encoder itself.

We also provide a ~7000 lines-of-code libjxl-tiny that is more similar to jpeg encoders in complexity and coding approach, and a great starting point for building a hardware encoder.

ksec
1 replies
12h4m

JPEG XL lossless JPEG1 recompression

This reminded me of something. I so wish iOS 18 could support JPEG XL out of the box rather than Safari only. I have 80GB of photos on my iPhone. The vast majority of them were sent over WhatsApp (JPEG). If iOS could simply recompress those into JPEG XL I would instantly gain ~10GB+ of storage.

JyrkiAlakuijala
0 replies
11h40m

What happens if you recompress them losslessly manually to JPEG XL?

jug
1 replies
9h18m

Yes, I agree, and I think there is a hurdle in mucking with file formats at all, because it always affects interoperability somewhere in the end. This also needs to be accounted for: the advantages need to outweigh this downside, because it is a downside. I still kind of want JPEG XL, but I'm starting to question how much of that is simply me being a geek who wants tech to be as good as possible rather than taking a pragmatic view, and I didn't question this as much before Jpegli.

JyrkiAlakuijala
0 replies
9h10m

It can be a question when your uncle's/daughter's/etc. phone is full of photos and they ask for advice on how to make more space.

It can be a question of whether the photo fits as an email attachment, etc.

'Zillions' of seconds of aggregate latency are spent each day waiting for web sites to load. Back-of-the-envelope calculations suggest that the value of reducing that waiting time can be in the hundreds of billions over the whole lifetime of the deployment. Bandwidth cost to users and energy use may also be significant factors.

themerone
0 replies
14h10m

It's not just about the compression ratio. JPEG XL's improvements in generational loss are reason enough for it to be the default format for the web.

lonjil
1 replies
19h35m

Their claims about Jpegli seem to make WebP obsolete regarding lossy encoding? Similar compression estimates as WebP versus JPEG are brought up.

I believe Jpegli beats WebP for medium to high quality compression. I would guess that more than half of all WebP images on the net would definitely be smaller as Jpegli-encoded JPEGs of similar quality. And note that Jpegli is actually worse than MozJPEG and libjpeg-turbo at medium-low qualities. Something like libjpeg-turbo q75 is the crossover point I believe.

Hell, I question if AVIF is even worth it with Jpegli.

According to another test [1], for large (like 10+ Mpix) photographs compressed with high quality, Jpegli wins over AVIF. But AVIF seems to win for "web size" images. Though, as for point 2 in your next paragraph, Jpegli is indeed much faster than AVIF.

JPEG XL would've still been worth it though because it's just covering so much more ground than JPEG/Jpegli and it has a streaming decoder like a sensible format geared for Internet use, as well as progressive decoding support for mobile networks.

Indeed. At a minimum, JXL gives you another 20% size reduction just from the better entropy coding.

[1] https://cloudinary.com/blog/jpeg-xl-and-the-pareto-front

ksec
0 replies
17h3m

I would guess that more than half of all WebP images on the net would definitely be smaller as Jpegli-encoded JPEGs of similar quality.

That was what I expected a long time ago, but it turns out to be a false assumption. According to Google, with data from Chrome, 80%+ of images on the web are at 1.0+ bpp.

aendruk
9 replies
21h20m

Looks like it’s not very competitive at low bitrates. I have a project that currently encodes images with MozJPEG at quality 60 and just tried switching it to Jpegli. When tuned to produce comparable file sizes (--distance=4.0) the Jpegli images are consistently worse.

ruuda
7 replies
20h30m

What is your use case for degrading image quality that much? At quality level 80 the artifacts are already significant.

aendruk
4 replies
20h28m

Thumbnails at a high pixel density. I just want them up fast. Any quality that can be squeezed out of it is a bonus.

londons_explore
2 replies
17h53m

JPEG has a fixed macroblock size (16x16 pixels), which negatively affects high resolution low bitrate images.

If you must use JPEG, I suspect you might get better visual quality by halving the resolution and upsampling on the client.

By doing so, you are effectively setting the lower and right halves of the DCT to zero (losing all high-resolution info), but you get 32x32-pixel macroblocks, which lets you make better use of low-frequency spatial patterns.

egorfine
1 replies
9h58m

Oh, that's interesting. I typically serve thumbnails at 2x resolution and heavily compressed. Should I try to instead compress them less but serve at 0.5x resolution?

londons_explore
0 replies
6h54m

I'd say it's worth a try.

lonjil
0 replies
19h34m

I recently noticed that all the thumbnails on my computer are PNG, which I thought was funny.

egorfine
0 replies
9h59m

Thumbnails. I typically serve them at 2x resolution but extremely heavily compressed. Still looks good enough in browser when scaled down.

Brian_K_White
0 replies
16h50m

I apologize that this will seem like (well, it frankly IS) more reaction than is really justified, sorry for that. But this question is an example of a thing people commonly do that I think is not good, and I want to point it out once in a while when I see it:

There are infinite use-cases for everything beside one's own tiny personal experience and imagination. It's not remarkable that someone tested for the best version of something you personally don't have a use for.

Pretend they hadn't answered the question. The answer is it doesn't matter.

They stated a goal of x, and compared present-x against testing-x and found present-x was the better-x.

"Why do they want x when I only care about y?" is irrelevant.

I mean you may be idly curious and that's not illegal, but you also stated a reason for the question which makes the question not idle but a challenge (the "when I only care about y" part).

What I mean by "doesn't matter" is, whatever their use-case is, it's automatically always valid, and so it doesn't change anything, and so it doesn't matter.

Their answer happened to be something you probably agree is a valid use-case, but that's just happenstance. They don't have to have a use-case you happen to approve of or even understand.

JyrkiAlakuijala
0 replies
19h7m

I believe that they should be roughly the same density on a photography corpus at quality 60. Consider filing an issue if some image is worse with jpegli.

theanonymousone
7 replies
23h59m

Is that from Google Zürich?

JyrkiAlakuijala
6 replies
23h53m

Yes!

veselin
3 replies
23h43m

When I saw the name, I knew immediately this is Jyrki's work.

082349872349872
2 replies
23h14m

I'm waiting for huaraJPEG...

actionfromafar
1 replies
22h6m

what is that?

mensi
0 replies
22h1m

a much ruder but just as stereotypically Swiss German thing as the "-li" suffix ;)

thaliaarchi
1 replies
23h1m

I’ve never heard of a Jpegli bread, but Zöpfli and Brötli sure are yummy :)

JyrkiAlakuijala
0 replies
22h30m

I thought clarity was more important.

Otherwise it would be called ... Pumpernikkeli.

mgraczyk
7 replies
22h32m

Jpegli can be encoded with 10+ bits per component.

How are the extra bits encoded?

Is this the JPEG_R/"Ultra HDR" format, or has Google come up with yet another metadata solution? Something else altogether?

Ultra HDR: https://developer.android.com/media/platform/hdr-image-forma...

JyrkiAlakuijala
2 replies
21h24m

Ultra HDR can have two jpegs inside, one for the usual image and another for the gain-map.

Hypothetically, both jpegs can be created with jpegli.

Hypothetically, both Ultra HDR jpegs can be decoded with jpegli.

In theory jpegli would remove the 8 bit striping that would otherwise be present in Ultra HDR.

I am not aware of jpegli-based Ultra HDR implementations.

My personal preference would be a single jpegli JPEG and very fast, high-quality local tone mapping (HDR source, tone mapping to SDR). Some industry experts are excited about Ultra HDR, but I consider it likely too complicated to get right in editing software and automated image processing pipelines.

Zardoz84
1 replies
11h58m

What is the point of that complexity if JPEG XL can store HDR images?

JyrkiAlakuijala
0 replies
11h30m

The main idea why Ultra HDR is done like that is that the content creator (photographer) can control the local tone mapping. I think.

lonjil
1 replies
22h16m

It's regular old JPEG1. I don't know the details, but it turns out that "8 bit" JPEG actually has enough precision in the format to squeeze out another 2.5 bits, as long as both the encoder and the decoder use high precision math.

actionfromafar
0 replies
22h7m

Wow, this is the first time I heard about that. I wonder if Lightroom uses high precision math.

donatzsky
1 replies
22h8m

This has nothing to do with Ultra HDR. It's "simply" a better JPEG encoder.

Ultra HDR is a standard SDR JPEG + a gain map that allows the construction of an HDR version. Specifically it's an implementation of Adobe's Gain Map specification, with some extra (seemingly pointless) Google bits. Adobe gain Map: https://helpx.adobe.com/camera-raw/using/gain-map.html

mgraczyk
0 replies
22h6m

Thanks, I was on the team that did Ultra HDR at Google so I was curious if it was being used here. Didn't see anything in the code though so that makes sense.

lars_thomas
5 replies
22h2m

Sorry for perhaps missing it but it states "It provides both a fully interoperable encoder and decoder complying with the original JPEG standard". Does that mean that jpegli-encoded images can be decoded by all jpeg decoders? But it will not have the same quality?

lonjil
4 replies
22h0m

Jpegli encoded images decode just fine with any JPEG decoder, and will still be of great quality. All the tests were done with libjpeg-turbo as the decoder. Using Jpegli for decoding gives you a bit better quality and potentially higher bit depth.

lars_thomas
2 replies
21h51m

Thanks, sounds great! Not sure if you are part of the research team but a follow up question nevertheless. Learning from JpegXL, what would it take to develop another widely supported image format? Would the research stage already need to be carried out as a multi-corporate effort?

lonjil
0 replies
21h35m

Not sure if you are part of the research team but a follow up question nevertheless.

I am not.

what would it take to develop another widely supported image format? Would the research stage already need to be carried out as a multi-corporate effort?

I believe JXL will be very successful sooner or later, it already has a lot more support than many other attempts at new image formats.

But in general, the main way to get fast adoption on the web is to have Chromium's codec team be the main developers.

JyrkiAlakuijala
0 replies
21h38m

A multi-corporate effort would likely need to start by first agreeing on what image quality is.

Image quality folks are more cautious and tradition-centric than codec devs, so quite an initial effort would be needed to use something as advanced and risky as butteraugli, ssimulacra or XYB. With traditional objective metrics it would be very difficult to make a competing format, as it would start with a 10–15% disadvantage.

So, I think it is not easy and would need substantial investment.

01HNNWZ0MV43FF
0 replies
21h56m

Sort of like the quality vs. speed settings on libx264, I suppose jpegli aims to push the Pareto boundary on both quality and speed without changing the decode spec

bArray
5 replies
10h5m

In order to quantify Jpegli's image quality improvement we enlisted the help of crowdsourcing raters to compare pairs of images from Cloudinary Image Dataset '22, encoded using three codecs: Jpegli, libjpeg-turbo and MozJPEG, at several bitrates.

Looking further [1]:

It consists in requiring a choice between two different distortions of the same image, and computes an Elo ranking (an estimate of the probability of each method being considered higher quality by the raters) of distortions based on that. Compared to traditional Opinion Score methods, it avoids requiring test subjects to calibrate their scores.

This seems like a bad way to evaluate image quality. Humans can tend towards liking more highly saturated colours, which would be a distortion of the original image. If it was just a simple kernel that turned any image into a GIF cartoon, and then I had it rated by cartoon enthusiasts, I'm sure I could prove GIF is better than JPEG.

I think that to produce something more fair, it would need to be "Given the following raw image, which of the following two images appears to better represent the above image?" The allowed answers should be "A", "B" and "unsure".

ELO would likely be less appropriate. I would also like to see an analysis regarding which images were most influential in deciding which approach is better and why. Is it colour related, artefact related, information frequency related? I'm sure they could gain some deeper insight into why one method is favoured over the other.

[1] https://github.com/google-research/google-research/blob/mast...
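For context, each pairwise "A looks better than B" vote can be folded into a rating with the ordinary chess-style Elo update; a minimal C++ sketch (my own, and not necessarily the exact variant the authors used):

    #include <cmath>
    #include <cstdio>

    // Generic Elo update for one pairwise rating: score = 1.0 if the rater
    // preferred A, 0.0 if they preferred B, 0.5 if unsure.
    void elo_update(double& ra, double& rb, double score, double k = 32.0) {
      const double expected_a = 1.0 / (1.0 + std::pow(10.0, (rb - ra) / 400.0));
      ra += k * (score - expected_a);
      rb += k * ((1.0 - score) - (1.0 - expected_a));
    }

    int main() {
      double jpegli = 1500.0, mozjpeg = 1500.0;
      elo_update(jpegli, mozjpeg, 1.0);  // one rater preferred the jpegli image
      std::printf("jpegli %.1f  mozjpeg %.1f\n", jpegli, mozjpeg);  // 1516.0  1484.0
      return 0;
    }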

zond
1 replies
7h44m

The next sentence says "The test subject is able to flip between the two distortions, and has the original image available on the side for comparison at all times.", which indicates that the subjects weren't shown only the distortions.

Permik
1 replies
8h44m

One other thing to control for is the subpixel layout of their display which is almost always forgotten in these studies.

bArray
0 replies
6h27m

I did think about this - but then I thought the variation in displays/monitors and people would enhance the experiment.

geraldhh
0 replies
6h34m

Humans can tend towards liking more highly saturated colours, which would be a distortion of the original image.

android with google photos did/does this whereas apple went with enhanced contrast.

as far as i can tell, they're both wrong but one mostly notices the 'distortion' if used to the other.

asicsarecool
5 replies
23h4m

Awesome google. Disable zooming on mobile so I can't see the graph detail.

You guys should up your web game

politelemon
4 replies
23h2m

Many sites do this out of a misguided notion of what web development is.

FWIW Firefox mobile lets you override zooming on a site.

pandemic_region
1 replies
22h15m

FWIW Firefox mobile lets you override zooming on a site.

What in heaven's name, and why is this not a default option? Thankee Sai!

modeless
0 replies
22h12m

Chrome also has an option for this and it's great

_carbyau_
1 replies
13h30m

Having just searched a little for this, to turn this on you need about:config.

Apparently about:config is still not available on the Firefox Mobile main release.

Supposedly available on Dev/Beta/Nightly or similar - unverified statement though.

Annoying.

kbrosnan
0 replies
12h28m

There is no need to go to about:config.

Firefox 3 dot menu -> Settings -> Accessibility -> Zoom on all websites

p0nce
4 replies
23h59m

Wonder how it compares to guetzli, which is good albeit slow (by Google also!).

JyrkiAlakuijala
2 replies
23h53m

I believe guetzli is slightly more robust around quality 94, but jpegli is likely better than or equal to it at lower qualities, like below 85. Jpegli is likely about 1000x faster and still good.

ronjouch
1 replies
16h4m

That's my experience, yes. Just tested it on a 750kB 1080p image with detailed areas and areas with gradients. Highly unscientific, N=1 results:

- Guetzli at q=84 (the minimum allowed by Guetzli) takes 47s and produces a 403kB image.

- Jpegli at q=84 takes 73ms (MILLIseconds) and produces a mostly-indistinguishable 418kB image. "Mostly" because:

A. it's possible to find areas with subtle color gradients where Guetzli does a better job at keeping it smooth over a large area.

B. "Little specks" show a bit more "typical JPG color-mush artifacting" around the speck with Jpegli than Guetzli, which stays remarkably close to the original

Also, compared to the usual encoder I'm used to (e.g. the one in GIMP, libjpeg maybe?), Jpegli seems to degrade pretty well going into lower qualities (q=80, q=70, q=60). Qualities lower than q=84 are not even allowed by Guetzli (unless you do a custom build).

I'm immediately switching my "smallify jpg" Nautilus script from Guetzli to Jpegli. The dog-slowness of Guetzli used to be tolerable when there was no close contender, but now it feels unjustified in comparison to the instant darn excellent result of Jpegli.

JyrkiAlakuijala
0 replies
13h55m

Thank you for such a well-informed report!

With guetzli I manually added overprovisioning for slow, smooth gradients. If you have an example where guetzli is better with gradients, you could post an issue with a sample image. That would help us potentially fix it for jpegli, too.

donatj
0 replies
19h8m

I was wondering the same thing. We have a system that guetzli's some of our heavily used assets but it takes SOOOOOO long

ctz
4 replies
23h4m

From the people who brought you WebP CVE-2023-41064/CVE-2023-4863...

vanderZwan
3 replies
23h1m

No, these are not the WebP people. These are Google's JXL people.

JyrkiAlakuijala
2 replies
22h54m

I designed WebP lossless and implemented the first encoder for it. Zoltan who did most of the implementation work for jpegli wrote the first decoder.

vanderZwan
1 replies
4h42m

Ok well, in that case I'll label you two as the Renaissance men of image formats

JyrkiAlakuijala
0 replies
3h25m

Haha. Thank you!

xfalcox
3 replies
1d

Has anyone compiled this to WASM? I'm currently using MozJPEG via WASM for a project and would love to test replacing it with Jpegli.

jeffbee
1 replies
21h18m

Just for my own edification, why would there be any trouble compiling portable high-level libraries to target WASM?

tedunangst
0 replies
20h52m

Maybe it uses threads.

tambourine_man
3 replies
23h7m

Is there an easy way for us to install and test it?

aendruk
0 replies
23h3m

nixpkgs unstable (i.e. 24.05) has it at ${libjxl}/bin/cjpegli

publius_0xf3
2 replies
20h36m

Can we just get rid of lossy image compression, please? It's so unpleasant looking at pictures on social media and watching them degrade over time as they are constantly reposted. What will these pictures look like a century from now?

npteljes
0 replies
10h41m

It's not the social network's job to preserve image quality. That would be mixing up the concerns.

adamzochowski
0 replies
20h1m

Please, keep lossy compression. The web is already unusable, with websites too big as it is.

What should happen: websites/applications shouldn't recompress images if they already deliver a good pixel bitrate, and shouldn't recompress images just to add their own watermarks.

jeffbee
0 replies
21h16m

Did you mean other than the jpeg decode that is already in wuffs?

greenavocado
0 replies
23h25m

Google has a direct conflict of interest: the AVIF team.

atdt
2 replies
17h34m

When using Jpegli as a drop-in replacement for libjpeg-turbo (i.e., with the same input bitmap and quality setting), will the output produced by Jpegli be smaller, more beautiful, or both? Are the space savings the result of the Jpegli encoder being able to generate comparable or better-looking images at lower quality settings? I'd like to understand whether capitalizing on the space efficiency requires any modification to the caller code.

pizza
0 replies
17h21m

I think the main benefit is a better decorrelation transform so the compression is higher at the same quality parameter. So you could choose whether you want better accuracy for the same quality parameter, or lower the quality parameter and get better fidelity than you would have otherwise. Probably to get both most of the time, just use JPEGXL

JyrkiAlakuijala
0 replies
14h4m

The output will be smaller after replacing libjpeg-turbo or mozjpeg with jpegli. You don't need to do any code changes.
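As a sketch of what "no code changes" means in practice: the ordinary libjpeg-style compression sequence below is standard libjpeg API, nothing jpegli-specific (the function and buffer names here are just placeholders); this is the kind of caller code that keeps working unchanged, and only the library you link against changes.

    #include <cstdio>
    #include <jpeglib.h>

    // Standard libjpeg calling sequence; linking jpegli instead of
    // libjpeg-turbo is meant to leave caller code like this untouched.
    void write_jpeg(const char* path, unsigned char* rgb, int w, int h) {
      struct jpeg_compress_struct cinfo;
      struct jpeg_error_mgr jerr;
      FILE* f = std::fopen(path, "wb");

      cinfo.err = jpeg_std_error(&jerr);
      jpeg_create_compress(&cinfo);
      jpeg_stdio_dest(&cinfo, f);

      cinfo.image_width = w;
      cinfo.image_height = h;
      cinfo.input_components = 3;
      cinfo.in_color_space = JCS_RGB;
      jpeg_set_defaults(&cinfo);
      jpeg_set_quality(&cinfo, 85, TRUE);

      jpeg_start_compress(&cinfo, TRUE);
      while (cinfo.next_scanline < cinfo.image_height) {
        JSAMPROW row = &rgb[cinfo.next_scanline * w * 3];
        jpeg_write_scanlines(&cinfo, &row, 1);
      }
      jpeg_finish_compress(&cinfo);
      jpeg_destroy_compress(&cinfo);
      std::fclose(f);
    }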

therealmarv
1 replies
22h58m

oh, probably we will get it soon in ImageOptim then https://imageoptim.com/

vladstudio
0 replies
22h30m

Thanks in advance!

pizlonator
1 replies
21h28m

Very interesting that new projects like this still use C++, not something like Rust.

mkl
0 replies
19h57m

When your aim is maximum adoption and compatibility with existing C++ software, C++ or C are the best choice. When you're building on an existing C++ codebase, switching language and doing a complete rewrite is very rarely sensible.

jbverschoor
1 replies
23h39m

Google sure did a shitty job of explaining the whole situation.

JPEG XL was kicked out. This thing is added, but the repo and code seem to be from jxl.

I'm very confused.

JyrkiAlakuijala
0 replies
23h30m

It was just easiest to develop in the libjxl repo. All the test workers etc. are already set up there. This was done by a very small team...

edent
1 replies
21h58m

I'd love to know what "Paradigms of Intelligence" means in this context.

larodi
0 replies
21h45m

I would've loved to see a side-by-side comparison... after all, we're talking visuals here, right? As the old saying goes: a hand to touch, an eye to see.

Not underestimating the value in this, but the presentation is very weak.

Waterluvian
1 replies
23h49m

This is the kind of realm I'm fascinated by: taking an existing chunk of something, respecting the established interfaces (ie. not asking everyone to support yet another format), and seeing if you can squeeze out an objectively better implementation. It's such a delicious kind of performance gain because it's pretty much a "free lunch" with a very comfortable upgrade story.

terrelln
0 replies
21h53m

I agree, this is a very exciting direction. We shouldn’t let existing formats stifle innovation, but there is a lot of value in back porting modern techniques to existing encoders.

Mr_Minderbinder
1 replies
11h15m

It would help if the authors explained how exactly they used the Elo rating system to evaluate quality, since this seems like a non-standard and rather unusual use case for this. I am guessing that if an image is rated better than another that counts as a "win"?

Finally, writing "ELO" instead of "Elo" is incorrect (this is just one of my pet peeves but indulge me nevertheless). This is some guy's name not an abbreviation, nor a prog rock band from the 70's! You would not write "ELO's" rating system for the same reason you wouldn't write "DIJKSTRA's" algorithm.

LeoNatan25
1 replies
21h23m

10+ bits. Jpegli can be encoded with 10+ bits per component. Traditional JPEG coding solutions offer only 8 bit per component dynamics causing visible banding artifacts in slow gradients. Jpegli's 10+ bits coding happens in the original 8-bit formalism and the resulting images are fully interoperable with 8-bit viewers. 10+ bit dynamics are available as an API extension and application code changes are needed to benefit from it.

So, instead of supporting JPEG XL, this is the nonsense they come up with? Lock-in over a JPEG overlay?

spider-mario
0 replies
19h7m

This is from the same team as JPEG XL, and there is no lock-in or overlay. It’s just exploiting the existing mechanics better by not doing unnecessary truncation. The new APIs are only required because the existing APIs receive and return 8-bit buffers.

cactusplant7374
0 replies
1d

These heuristics are much faster than a similar approach originally used in guetzli.

I liked guetzli but it's way too slow to use in production. Glad there is an alternative.