[HN Gopher] Jpegli: A new JPEG coding library
___________________________________________________________________
Jpegli: A new JPEG coding library
Author : todsacerdoti
Score : 186 points
Date : 2024-04-03 17:51 UTC (5 hours ago)
(HTM) web link (opensource.googleblog.com)
(TXT) w3m dump (opensource.googleblog.com)
| xfalcox wrote:
| Has anyone compiled this to WASM? I'm currently using
| MozJPEG via WASM for a project and would love to test
| replacing it with Jpegli.
| simonw wrote:
| A WASM demo of this would be fantastic, would make it much
| easier for people to try it out.
|
| Maybe a fork of https://squoosh.app/ ? The code for that is
| https://github.com/GoogleChromeLabs/squoosh
| jeffbee wrote:
| Just for my own edification, why would there be any trouble
| compiling portable high-level libraries to target WASM?
| tedunangst wrote:
| Maybe it uses threads.
| cactusplant7374 wrote:
| > These heuristics are much faster than a similar approach
| originally used in guetzli.
|
| I liked guetzli but it's way too slow to use in production. Glad
| there is an alternative.
| theanonymousone wrote:
| Is that from Google Zurich?
| JyrkiAlakuijala wrote:
| Yes!
| veselin wrote:
| When I saw the name, I knew immediately this is Jyrki's work.
| 082349872349872 wrote:
| I'm waiting for huaraJPEG...
| actionfromafar wrote:
| what is that?
| mensi wrote:
| a much ruder but just as stereotypically Swiss German
| thing as the "-li" suffix ;)
| thaliaarchi wrote:
| I've never heard of a Jpegli bread, but Zopfli and Brotli
| sure are yummy :)
| JyrkiAlakuijala wrote:
| I thought clarity was more important.
|
| Otherwise it would be called ... Pumpernikkeli.
| p0nce wrote:
| Wonder how it compares to guetzli, which is good albeit slow (by
| Google also!).
| JyrkiAlakuijala wrote:
| I believe guetzli is slightly more robust around quality 94,
| but jpegli is likely equal or better at lower qualities, say
| below 85. Jpegli is likely about 1000x faster and still good.
| Waterluvian wrote:
| This is the kind of realm I'm fascinated by: taking an existing
| chunk of something, respecting the established interfaces (ie.
| not asking everyone to support yet another format), and seeing if
| you can squeeze out an objectively better implementation. It's
| such a delicious kind of performance gain because it's pretty
| much a "free lunch" with a very comfortable upgrade story.
| terrelln wrote:
| I agree, this is a very exciting direction. We shouldn't let
| existing formats stifle innovation, but there is a lot of value
| in back porting modern techniques to existing encoders.
| kloch wrote:
| > 10+ bits. Jpegli can be encoded with 10+ bits per component.
|
| If you are making a new image/video codec in 2024 please don't
| just give us 2 measly extra bits of DR. Support up to 16 bit
| unsigned integer _and_ floating point options. Sheesh.
| Retr0id wrote:
| It's not a new codec, it's a new encoder/decoder for JPEG.
| whywhywhywhy wrote:
| This should have been in an H1 tag at the top of the page. I
| had to dig into a paragraph to find out Google wasn't about to
| launch another image format supported in only a scattering of
| apps yet served as Image Search results.
| Retr0id wrote:
| It is. (well, h3 actually)
|
| > Introducing Jpegli: A New JPEG Coding Library
| JyrkiAlakuijala wrote:
| I consider codec to mean a pair of encoder and decoder
| programs.
|
| I don't consider it to necessarily mean a new data format.
|
| One data format can be implemented by multiple codecs.
|
| Semantics and nomenclature within our field are likely
| underdeveloped, and the use of these terms varies.
| Timon3 wrote:
| From their Github:
|
| > Support for 16-bit unsigned and 32-bit floating point input
| buffers.
|
| "10+" means 10 bits or more.
| johnisgood wrote:
| Would not ">10" be a better way to denote that?
| mkl wrote:
| That means something different, but ">=10" would be better
| IMHO. Really there's an upper limit of 12, and 10.5 is more
| likely in practice:
| https://news.ycombinator.com/item?id=39922511
| JyrkiAlakuijala wrote:
| We insert/extract about 2.5 bits more info from the 8 bit
| jpegs, leading to about 10.5 bits of precision. There is quite
| some handwaving necessary here. Basically it comes down to
| coefficient distributions where the distributions have very
| high probabilities around zeros. Luckily, this is the case for
| all smooth noiseless gradients where banding could be otherwise
| observed.
| vlovich123 wrote:
| Does the decoder have to be aware of it to properly display
| such an image?
| spider-mario wrote:
| To display it at all, no. To display it smoothly, yes.
| JyrkiAlakuijala wrote:
| From a purely theoretical viewpoint, 10+ bits encoding
| will lead to slightly better results even if rendered
| using a traditional 8-bit decoder. One source of error
| has been removed from the pipeline.
| vlovich123 wrote:
| How does the data get encoded into 10.5 bits but
| displayable correctly by an 8 bit decoder while also
| potentially displaying even more accurately by a 10 bit
| decoder?
| JyrkiAlakuijala wrote:
| Through non-standard API extensions you can provide a 16
| bit data buffer to jpegli.
|
| The data is carefully encoded in the DCT coefficients.
| They are 12 bits, so in some situations you can get even
| 12-bit precision. Quantization errors, however, sum up, and
| the worst case is about 7 bits. Luckily that occurs only in
| the noisiest environments; in smooth slopes we can get
| 10.5 bits or so.
| lonjil wrote:
| 8-bit JPEG actually uses 12-bit DCT coefficients, and
| traditional JPEG coders have lots of errors due to
| rounding to 8 bits quite often, while Jpegli always uses
| floating point internally.
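The arithmetic behind this is easy to sketch: an 8x8 block whose samples have fractional 8-bit levels can survive a round trip through integer DCT coefficients with less error than per-sample 8-bit rounding causes. A toy numpy illustration of the principle (not jpegli's actual code; the gradient and the unit quantization step are made up for the demo):

```python
import numpy as np

# Orthonormal 8-point DCT-II matrix (the transform used on JPEG's 8x8 blocks).
N = 8
n = np.arange(N)
T = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
T[0] /= np.sqrt(2.0)

# A smooth gradient with *fractional* 8-bit levels (step of 1/8 level),
# i.e. the kind of data where an 8-bit pipeline shows banding.
i, j = np.meshgrid(n, n, indexing="ij")
block = 0.125 * (i + j)

# Path A: classic 8-bit pipeline -- round every sample to an integer level.
banded = np.round(block)

# Path B: keep samples in float, round only the DCT coefficients to
# integers (quantization step 1, as at very high JPEG quality), then
# inverse-transform in float. The integer coefficients are exactly what
# an ordinary JPEG file can store, yet they retain sub-level detail.
coeffs = np.round(T @ block @ T.T)
recon = T.T @ coeffs @ T

err_8bit = np.abs(banded - block).max()  # 0.5: a whole half-level of banding
err_dct = np.abs(recon - block).max()    # noticeably smaller on this block
```

On this block the coefficient-domain error comes out several times smaller than the half-level banding error, which is the effect the extra ~2.5 bits refer to.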
| Sesse__ wrote:
| Ideally, the decoder should be dithering, I suppose. (I
| know of zero JPEG decoders that do this in practice.)
| lonjil wrote:
| Jpegli, of course, does this when you ask for 8 bit
| output.
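The effect of dithering before 8-bit rounding is easy to demonstrate: sub-level noise added before the round makes the quantized output's local average track the true gradient instead of collapsing into bands. A toy sketch (illustrative only; it says nothing about the specific dither Jpegli uses):

```python
import numpy as np

rng = np.random.default_rng(0)

# A shallow gradient spanning only half of one 8-bit level: plain
# rounding flattens it into a single band, losing the ramp entirely.
ramp = np.linspace(0.0, 0.5, 4096, endpoint=False)  # in 8-bit level units

banded = np.round(ramp)                                       # all zeros
dithered = np.round(ramp + rng.uniform(-0.5, 0.5, ramp.shape))

# Both outputs are 8-bit integers, but only the dithered one keeps the
# gradient on average: the last quarter of the ramp averages ~0.44.
tail_true = ramp[3072:].mean()
tail_band = banded[3072:].mean()    # 0.0 -- the gradient is gone
tail_dith = dithered[3072:].mean()  # close to tail_true
```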
| littlestymaar wrote:
| Why is it written in C++, when Google made Wuffs[1] for this
| exact purpose?
|
| [1]: https://github.com/google/wuffs
| ramrunner0xff wrote:
| This is a valid question; why is it being downvoted?
| SunlitCat wrote:
| Maybe because the hailing of yet another "safe" language
| starts to feel kinda repetitive?
|
| Java, C#, Go, Rust, Python, modern C++ with smart pointers, ...
|
| I mean, concepts for handling files in a safe way are an
| awesome (and really needed) thing, but inventing a whole new
| programming language around a single task (even if it's just
| a transpiler to C)?
| vinkelhake wrote:
| One of the advantages of Wuffs is that it compiles to C,
| and wuffs-the-library is distributed as C code that is easy
| to integrate with an existing C or C++ project without
| having to incorporate new toolchains.
| littlestymaar wrote:
| > Maybe because the hailing of yet another "safe" language
| starts to feel kinda repetitive?
|
| Ah yeah, because the endless stream of exploits and "new
| CVE allows for zero-click RCE, please update ASAP" doesn't
| feel repetitive?
|
| > I mean, concepts for handling files in a safe way are an
| awesome (and really needed) thing, but inventing a whole
| new programming language around a single task (even if it's
| just a transpiler to C)?
|
| It's a "single task" in the same way "writing compilers" is
| a single task. And like we're happy that LLVM IR exists,
| having a language dedicated to writing codecs (of which
| there are dozens) is a worthwhile goal, especially since
| they are both security critical and have stringent
| performance needs for which existing languages (be it
| managed languages or Rust) aren't good enough.
| vanderZwan wrote:
| This is pure speculation, but I'm presuming Wuffs is not the
| easiest language to use during the _research_ phase, but more
| of a thing you would implement a format in once it has
| stabilized. And this is freshly published research.
|
| Probably would be a good idea to get a port though, if
| possible, improving both safety and performance sounds like a
| win to me.
| nigeltao wrote:
| Yeah, you're right. It's not as easy to write Wuffs code
| during the research phase, since you don't just have to write
| the code, you also have to help the compiler prove that the
| code is safe, and sometimes refactor the code to make that
| tractable.
|
| Wuffs doesn't support global variables, but when I'm writing
| my own research phase code, sometimes I like to just tweak
| some global state (without checking the code in) just to get
| some experimental data: hey, how do the numbers change if I
| disable the blahblah phase when the such-and-such condition
| (best evaluated in some other part of the code) holds?
|
| Also, part of Wuffs' safety story is that Wuffs code _cannot
| make any syscalls at all_, which implies that it cannot
| allocate or free memory, or call printf. Wuffs is a language
| for writing libraries, not whole programs, and the library
| caller (not callee) is responsible for e.g. allocating pixel
| buffers. That also makes it harder to use during the research
| phase.
| Blackthorn wrote:
| Google is a big company.
| Sesse__ wrote:
| Wuffs is for the exact _opposite_ purpose (decoding). It can do
| simple encoding once you know what bits to put in the file, but
| a JPEG encoder contains a lot of nontrivial machinery that does
| not fit well into Wuffs.
|
| (I work at Google, but have nothing to do with Jpegli or Wuffs)
| Hackbraten wrote:
| The first paragraph in Wuffs's README explicitly states that
| it's good for both encoding and decoding?
| Sesse__ wrote:
| No, it states that wrangling _can be_ encoding. It does not
| in any way state that Wuffs is actually _good_ for it at
| the current stage, and I do not know of any nontrivial
| encoder built with Wuffs, ever. (In contrast, there are 18
| example decoders included with Wuffs. I assume you don't
| count the checksum functions as encoding.)
| nigeltao wrote:
| Encoding is definitely in Wuffs' long term objectives (it's
| issue #2 and literally in its doc/roadmap.md file). It's just
| that decoding has been a higher priority. It's also a simpler
| problem. There's often only _one_ valid decoding for any
| given input.
|
| Decoding takes compressed image files as input, and those
| have complicated formats. Roughly speaking, encoding just
| takes a width x height x 4 pixel buffer, with very regular
| structure.
| It's much easier to hide something malicious in a complicated
| format.
|
| Higher priority means that, when deciding whether to work on
| a Wuffs PNG encoder or a Wuffs JPEG decoder next, when
| neither existed at the time, I chose to have more decoders.
|
| (I work at Google, and am the Wuffs author, but have nothing
| to do with Jpegli. Google is indeed a big company.)
| simonw wrote:
| > High quality results. When images are compressed or
| decompressed through Jpegli, more precise and psychovisually
| effective computations are performed and images will look clearer
| and have fewer observable artifacts.
|
| Does anyone have a link to any example images that illustrate
| this improvement? I guess the examples would need to be encoded
| in some other lossless image format so I can reliably view them
| on my computer.
| n2d4 wrote:
| You can find them in the mucped23.zip file linked here (encoded
| as PNG): https://github.com/google-research/google-
| research/tree/mast...
| simonw wrote:
| Thanks - I downloaded that zip file (460MB!) and extracted
| one of the examples into a Gist: https://gist.github.com/simo
| nw/5a8054f18f9ea3c560b628b16b00f...
|
| Here's an original: https://gist.githubusercontent.com/simonw
| /5a8054f18f9ea3c560...
|
| And the jpegli-q95- version: https://gist.githubusercontent.c
| om/simonw/5a8054f18f9ea3c560...
|
| And the same thing with mozjpeg-a95 https://gist.githubuserco
| ntent.com/simonw/5a8054f18f9ea3c560...
| tedunangst wrote:
| What are the file sizes for those two?
| edflsafoiewq wrote:
| Edit: I'm dumb.
| tedunangst wrote:
| I would hope the jpegs compress better than png does.
| simonw wrote:
| The zip file doesn't have the originals, just the PNGs.
| IshKebab wrote:
| They're far too high quality to tell anything. There's no
| point comparing visually lossless images (inb4 "I am
| amazing and can easily tell...").
| modeless wrote:
| You shouldn't compare the same quality setting across
| encoders as it's not standardized. You have to compare
| based on file size.
| pseudosavant wrote:
| Perhaps try quality settings in the 70 range, and
| comparable output file sizes. 95 will be high-quality by
| definition.
| simonw wrote:
| There are some 70s and 65s in the gist: https://gist.gith
| ub.com/simonw/5a8054f18f9ea3c560b628b16b00f...
| masfuerte wrote:
| You said linked but like a fool I went looking for a zip in
| the repository. This is the link:
|
| https://cloudinary.com/labs/cid22/mucped23.zip (460MB)
| n2d4 wrote:
| I can't blame you, my comment originally didn't have the
| word "linked", I edited that in after I realized the
| potential misunderstanding. Maybe you saw it before the
| edit. My bad.
| masfuerte wrote:
| Ha ha! No worries. I thought it had changed but I
| frequently skim read and miss things so I wasn't sure.
| andrewla wrote:
| As an aside, jpeg is lossless on decode -- once encoded, all
| decoders will render the same pixels. Since this library
| produces a valid jpeg file, it should be possible to directly
| compare the two jpegs.
| JyrkiAlakuijala wrote:
| That is approximately correct. The rendering is standards
| compliant without being pixel perfect, and most decoders make
| different compromises and render slightly different pixels.
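One minimal example of such a compromise: the JPEG spec leaves sample rounding behavior to the implementation, so two compliant decoders can already disagree by one level on the same coefficient data just by breaking rounding ties differently (the value below is contrived; fixed-point vs floating-point IDCT approximations are another, larger source of differences):

```python
import math

# An IDCT output landing exactly between two 8-bit levels.
sample = 128.5

decoder_a = math.floor(sample + 0.5)  # rounds half up      -> 129
decoder_b = round(sample)             # rounds half to even -> 128

# Same bitstream, two spec-compliant results one level apart.
```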
| JyrkiAlakuijala wrote:
| https://twitter.com/jyzg/status/1622890389068718080
|
| Some earlier results. Perhaps these were with XYB color space,
| I don't remember ...
| kaladin-jasnah wrote:
| I wonder if Google will make a JPEG library with
| https://github.com/google/wuffs at some point.
| greenavocado wrote:
| Google has a direct conflict of interest: the AVIF team.
| jeffbee wrote:
| Did you mean other than the jpeg decode that is already in
| wuffs?
| lxgr wrote:
| They'll do literally anything rather than implementing JPEG XL
| over AVIF in Chrome, huh?
|
| I mean, of course this is still valuable (JPEG-only consumers
| will probably be around for decades, just like MP3-only players),
| and I realize Google is a large company, but man, the optics on
| this...
| whywhywhywhy wrote:
| If you do creative work countless tools just don't support
| webp, AVIF or HEIF.
|
| Running into files you can't open in your tools is so common
| that I have a right-click "convert to PNG" context menu
| jiggawatts wrote:
| They don't support it because Chromium doesn't.
|
| Because Chromium doesn't support it, Electron doesn't.
|
| Because Electron doesn't, Teams and other modern web apps and
| web sites don't either, etc...
|
| If Google just added JPEG XL support instead then it would
| be... a supported alternative to JPEG.
|
| You're saying working on that is a waste of time because...
| it's not supported.
| n2d4 wrote:
| Chromium does support WebP and AVIF, yet parent's tools
| don't.
| The_Colonel wrote:
| Maybe because WebP and AVIF are actually not that great
| image formats. WebP has been critiqued as a very mediocre
| improvement over JPEG ever since its introduction.
|
| These formats are in Chromium because of Google politics,
| not because of their technical merit.
| modeless wrote:
| There's a lot more to format support than Chromium. There's
| a pretty strong meme on the Internet that webp is evil,
| despite its being supported in all browsers for years,
| because there's still a lot of software out there that never
| added support, and people get annoyed when an image fails to
| open.
| redeeman wrote:
| maybe proprietary software just isnt so good?
| lxgr wrote:
| I don't think it's evil, but I just don't think it's very
| good either.
|
| And a graphics format better be damn good (i.e. much, not
| just a little bit, better than what it's hoping to
| replace) if it aspires to become widely supported across
| applications, operating systems, libraries etc.
| jug wrote:
| At least now, with Jpegli, this will surely be the nail in
| the coffin for WebP?
|
| The article mentions 35% compression improvements over JPEG,
| and that's at least as much as is usually thrown around when
| discussing WebP.
| refulgentis wrote:
| I have no love for Google, at all.
|
| It's really hard to say this in public, because people are
| treating it like a divisive "us or them" issue that's obvious,
| but the JPEG-XL stuff is _weird_.
|
| I've been in codecs for 15 years, and have never seen
| behavior as unconstructive as the JPEG-XL work. If I had
| infinite time and money and it came across my plate, we'd have
| a year or two of constructive work to do, so we didn't just
| rush something in with obvious issues and opportunities.
|
| It turned into "just figure out how to merge it in, and if you
| don't like it, that's malfeasance!" Bread and circuses for
| commentators, maybe, but it actively prevented even the
| foundational elements of a successful effort.
| lxgr wrote:
| To be honest, at Google scale, if there's an objectively good
| new codec with some early signs of excitement and plausible
| industry traction, and even Apple managed to deploy it to
| virtually all of their devices (and Apple isn't exactly known
| as an "open codec forward" company), not integrating it does
| seem like either malfeasance or gross organizational
| dysfunction to me.
| refulgentis wrote:
| Completely hypothetical scenario: what if the technical
| reaction was so hostile they invested in it themselves to
| fix the issues and make it sustainable?
|
| In light of the recent security incident, I'd see that
| completely hypothetical situation as more admirable.
| lxgr wrote:
| Hm, are you suggesting they're currently in the process
| of reimplementing it in a safer and/or more maintainable
| way as part of Chrome?
|
| In that case, that would just be extremely bad messaging
| (which I also wouldn't put past Google). Why agitate half
| of the people on here and in other tech-affine parts of
| the Internet when they could have just publicly stated
| that they're working on it and to please have some
| patience?
|
| Public support by Google, even if it's just in the form
| of a vague "intent to implement", would be so important
| for a nascent JPEG successor.
| refulgentis wrote:
| See comment on peer (TL;DR: I agree, and that's the
| substance of the post we're commenting on)
| LeoNatan25 wrote:
| Your posts here seem of the "just asking questions"
| variety--no substance other than being counterculture. Do
| you have any proof or semblance of any logical reason
| to think this?
| refulgentis wrote:
| It's a gentle joke, it happened, that's TFA. (ex. see the
| other threads re: it started from the JPEG XL repo).
|
| I use asking questions as a way to keep contentious
| discussions on track without being boorish. And you're
| right, it can easily be smarmy instead of Socratic
| without tone, a la the classic internet sarcasm problem.
|
| Gentle note: I only asked one question, and only in the
| post you replied to.
| F3nd0 wrote:
| Whatever are you referring to? JPEG XL had already been
| merged into Chromium, prior to being removed again (without a
| proper reason ever given). As far as I know, the JPEG XL
| developers have offered to do whatever work was necessary for
| Chromium specifically, but were never taken up on the offer.
|
| Same thing with Firefox, which has had basic support merged
| into Nightly, and a couple more patches gathering dust due to
| lack of involvement from the side of Firefox. Mozilla has
| since decided to take a neutral stance on JPEG XL, seemingly
| without doing any kind of proper evaluation. Many other
| programs (like GIMP, Krita, Safari, Affinity, darktable)
| already support JPEG XL.
|
| People are not getting upset because projects don't invest
| their resources into supporting JPEG XL. People are getting
| upset because Google (most notably), which has a decisive say
| in format interoperability, is flat out refusing to give JPEG
| XL a fair consideration. If they came up with a list of fair
| conditions JPEG XL has to meet to earn their support, people
| could work towards that goal, and if JPEG XL failed to meet
| them, people would easily come to terms with it. Instead,
| Google has chosen to apply double standards, present vague
| requirements, and refuse to elaborate. If anyone is
| 'preventing even foundational elements of a successful
| effort', it's Google, or more specifically, the part that's
| responsible for Chromium.
| ur-whale wrote:
| > They'll do literally anything rather than implementing JPEG
| XL over AVIF in Chrome, huh?
|
| Before making that kind of claim, I would spend some time
| looking at the names of the folks who contributed heavily to
| the development of JPEG XL and the names of the folks who wrote
| jpegli.
| lxgr wrote:
| By "they" I mean "Google, the organization", not "the authors
| of this work", who most likely have zero say in decisions
| concerning Chrome.
| JyrkiAlakuijala wrote:
| Chrome advised and inspired this work in their position
| about JPEG XL.
|
| Here: https://www.mail-archive.com/blink-
| dev@chromium.org/msg04351...
|
| "can we optimize existing formats to meet any new use-
| cases, rather than adding support for an additional format"
|
| It's a yes!
|
| Of course full JPEG XL is quite a bit better still, but
| this helps old compatible JPEG to support HDR without 8-bit
| banding artefacts or gainmaps, gives a higher bit depth for
| other uses where more precision is valuable, and quite a
| bit better compression, too.
| IshKebab wrote:
| Despite the answer being yes, IMO it's pretty clear that
| the question is disingenuous, otherwise why did they add
| support for WebP and AVIF? The question applies equally
| to them.
| lxgr wrote:
| > "can we optimize existing formats to meet any new use-
| cases, rather than adding support for an additional
| format"
|
| Only within pretty narrow limits.
|
| Classic JPEG will never be as efficient given its age, in
| the same way that LAME is doing incredible things for MP3
| quality, but any mediocre AAC encoder still blows it out
| of the water.
|
| This is in addition to the things you've already
| mentioned (HDR) and other new features (support for
| lossless coding).
|
| And I'd find their sentiment much easier to believe if
| Google/Chrome weren't hell-bent on making WebP (or more
| recently AVIF) a thing themselves! That's _two_ formats
| essentially nobody outside of Google has ever asked for,
| yet they're part of Chrome and Android.
| dchest wrote:
| Some authors of this are also the authors of JPEG XL.
| lxgr wrote:
| I saw that. It's the tragedy of Google in a nutshell: Great
| things are being worked on in some departments, but
| organizational dysfunction virtually ensures that the
| majority of them will not end up in users' hands (or at least
| not for long).
| jbverschoor wrote:
| Google sure did a shitty job of explaining the whole situation.
|
| JPEG XL was kicked out. This thing is added, but the repo and
| code seem to be from jxl.
|
| I'm very confused.
| JyrkiAlakuijala wrote:
| It was just easiest to develop in the libjxl repo. All test
| workers etc. are already set up there. This was done by a very
| small team...
| tambourine_man wrote:
| Is there an easy way for us to install and test it?
| aendruk wrote:
| nixpkgs unstable (i.e. 24.05) has it at ${libjxl}/bin/cjpegli
| miragecraft wrote:
| I'm currently using it via XL Converter.
|
| https://codepoems.eu/xl-converter/
| ctz wrote:
| From the people who brought you WebP
| CVE-2023-41064/CVE-2023-4863...
| vanderZwan wrote:
| No, these are not the WebP people. These are Google's JXL
| people.
| JyrkiAlakuijala wrote:
| I designed WebP lossless and implemented the first encoder
| for it. Zoltan, who did most of the implementation work for
| jpegli, wrote the first decoder.
| asicsarecool wrote:
| Awesome google. Disable zooming on mobile so I can't see the
| graph detail.
|
| You guys should up your web game
| politelemon wrote:
| Many sites do this out of a misguided notion of what web
| development is.
|
| FWIW Firefox mobile lets you override zooming on a site.
| pandemic_region wrote:
| > FWIW Firefox mobile lets you override zooming on a site.
|
| What in heaven's name, and why is this not a default option?
| Thankee Sai!
| modeless wrote:
| Chrome also has an option for this and it's great
| therealmarv wrote:
| Oh, then we'll probably get it in ImageOptim soon:
| https://imageoptim.com/
| vladstudio wrote:
| Thanks in advance!
| mgraczyk wrote:
| > Jpegli can be encoded with 10+ bits per component.
|
| How are the extra bits encoded?
|
| Is this the JPEG_R/"Ultra HDR" format, or has Google come up with
| yet another metadata solution? Something else altogether?
|
| Ultra HDR: https://developer.android.com/media/platform/hdr-
| image-forma...
| lonjil wrote:
| It's regular old JPEG1. I don't know the details, but it turns
| out that "8 bit" JPEG actually has enough precision in the
| format to squeeze out another 2.5 bits, as long as both the
| encoder and the decoder use high precision math.
| actionfromafar wrote:
| Wow, this is the first time I heard about that. I wonder if
| Lightroom uses high precision math.
| donatzsky wrote:
| This has nothing to do with Ultra HDR. It's "simply" a better
| JPEG encoder.
|
| Ultra HDR is a standard SDR JPEG + a gain map that allows the
| construction of an HDR version. Specifically it's an
| implementation of Adobe's Gain Map specification, with some
| extra (seemingly pointless) Google bits. Adobe gain Map:
| https://helpx.adobe.com/camera-raw/using/gain-map.html
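The gain-map idea itself is simple: the SDR image carries a per-pixel log2 boost that reconstructs the HDR version. A rough numpy sketch of that structure (simplified from Adobe's specification; the function and parameter names here are illustrative, not the spec's):

```python
import numpy as np

def apply_gain_map(sdr_linear, gain_map, log2_min, log2_max, weight=1.0):
    """Boost linear SDR pixels by the per-pixel gain encoded in the map.

    gain_map holds normalized values in [0, 1]; log2_min/log2_max give
    the log2 gain range; weight in [0, 1] scales the effect for displays
    that can only show part of the HDR headroom.
    """
    log2_gain = log2_min + (log2_max - log2_min) * gain_map
    return sdr_linear * np.exp2(weight * log2_gain)

sdr = np.array([0.1, 0.5, 0.9])   # linear SDR pixel values
gmap = np.array([0.0, 0.5, 1.0])  # normalized gain map, 0 = no boost
hdr = apply_gain_map(sdr, gmap, log2_min=0.0, log2_max=2.0)
# per-pixel gains: 2**0 = 1x, 2**1 = 2x, 2**2 = 4x
```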
| mgraczyk wrote:
| Thanks, I was on the team that did Ultra HDR at Google so I
| was curious if it was being used here. Didn't see anything in
| the code though so that makes sense.
| JyrkiAlakuijala wrote:
| Ultra HDR can have two jpegs inside, one for the usual image
| and another for the gain-map.
|
| Hypothetically, both jpegs can be created with jpegli.
|
| Hypothetically, both Ultra HDR jpegs can be decoded with
| jpegli.
|
| In theory jpegli would remove the 8 bit striping that would
| otherwise be present in Ultra HDR.
|
| I am not aware of jpegli-based Ultra HDR implementations.
|
| A personal preference for me would be a single Jpegli JPEG and
| very fast great local tone mapping (HDR source, tone mapping to
| SDR). Some industry experts are excited about Ultra HDR, but I
| consider it likely too complicated to get right in editing
| software and automated image processing pipelines.
| lars_thomas wrote:
| Sorry if I'm missing it, but it states "It provides both a
| fully interoperable encoder and decoder complying with the
| original JPEG standard". Does that mean that jpegli-encoded
| images can be decoded by all jpeg decoders? But it will not have
| the same quality?
| lonjil wrote:
| Jpegli encoded images decode just fine with any JPEG decoder,
| and will still be of great quality. All the tests were done
| with libjpeg-turbo as the decoder. Using Jpegli for decoding
| gives you a bit better quality and potentially higher bit
| depth.
| 01HNNWZ0MV43FF wrote:
| Sort of like the quality vs. speed settings on libx264, I
| suppose jpegli aims to push the Pareto boundary on both
| quality and speed without changing the decode spec
| lars_thomas wrote:
| Thanks, sounds great! Not sure if you are part of the
| research team but a follow up question nevertheless. Learning
| from JpegXL, what would it take to develop another widely
| supported image format? Would the research stage already need
| to be carried out as a multi-corporate effort?
| JyrkiAlakuijala wrote:
| Multi-corporate effort would likely need to start by first
| agreeing what is image quality.
|
| Image quality folks are more cautious and tradition-centric
| than codec devs, so quite an initial effort would be needed
| to use something as advanced and risky as butteraugli,
| ssimulacra or XYB. With traditional objective metrics it
| would be very difficult to make a competing format as they
| would start with a 10-15 % disadvantage.
|
| So, I think it is not easy and would need substantial
| investment.
| lonjil wrote:
| > Not sure if you are part of the research team but a
| follow up question nevertheless.
|
| I am not.
|
| > what would it take to develop another widely supported
| image format? Would the research stage already need to be
| carried out as a multi-corporate effort?
|
| I believe JXL will be very successful sooner or later, it
| already has a lot more support than many other attempts at
| new image formats.
|
| But in general, the main way to get fast adoption on the
| web is to have Chromium's codec team be the main
| developers.
| edent wrote:
| I'd love to know what "Paradigms of Intelligence" means in this
| context.
| larodi wrote:
| I would've loved to see a side-by-side comparison... after
| all, we're talking visuals here, right? As the old saying
| goes: a hand to touch, an eye to see.
|
| Not underestimating the value in this, but the presentation is
| very weak.
| pizlonator wrote:
| Very interesting that new projects like this still use C++, not
| something like Rust.
| mkl wrote:
| When your aim is maximum adoption and compatibility with
| existing C++ software, C++ or C are the best choice. When
| you're building on an existing C++ codebase, switching language
| and doing a complete rewrite is very rarely sensible.
| LeoNatan25 wrote:
| > 10+ bits. Jpegli can be encoded with 10+ bits per component.
| Traditional JPEG coding solutions offer only 8 bit per component
| dynamics causing visible banding artifacts in slow gradients.
| Jpegli's 10+ bits coding happens in the original 8-bit formalism
| and the resulting images are fully interoperable with 8-bit
| viewers. 10+ bit dynamics are available as an API extension and
| application code changes are needed to benefit from it.
|
| So, instead of supporting JPEG XL, this is the nonsense they come
| up with? Lock-in over a JPEG overlay?
| aendruk wrote:
| Looks like it's not very competitive at low bitrates. I have a
| project that currently encodes images with MozJPEG at quality 60
| and just tried switching it to Jpegli. When tuned to produce
| comparable file sizes (--distance=4.0) the Jpegli images are
| consistently worse.
| ruuda wrote:
| What is your use case for degrading image quality that much? At
| quality level 80 the artifacts are already significant.
| aendruk wrote:
| Thumbnails at a high pixel density. I just want them up fast.
| Any quality that can be squeezed out of it is a bonus.
| lonjil wrote:
| I recently noticed that all the thumbnails on my computer
| are PNG, which I thought was funny.
| publius_0xf3 wrote:
| Can we just get rid of lossy image compression, please? It's so
| unpleasant looking at pictures on social media and watching them
| degrade over time as they are constantly reposted. What will
| these pictures look like a century from now?
| adamzochowski wrote:
| Please, keep lossy compression. The web is barely usable
| already, with websites as big as they are.
|
| What should happen: websites/applications shouldn't recompress
| images if they already deliver a good pixel bitrate, and they
| shouldn't recompress images just to add their own watermarks.
| underlines wrote:
| JPEGLI = A small JPEG
|
| The suffix -li is used in Swiss German dialects. Added to the
| end of a root word, it forms a diminutive, conveying the
| smallness of the object and a sense of intimacy or endearment.
|
| This obviously comes out of Google Zurich.
|
| Other notable Google projects using Swiss German:
|
| https://github.com/google/gipfeli high-speed compression
|
| Gipfeli = Croissant
|
| https://github.com/google/guetzli perceptual JPEG encoder
|
| Guetzli = Cookie
|
| https://github.com/weggli-rs/weggli semantic search tool
|
| Weggli = Bread roll
|
| https://github.com/google/brotli lossless compression
|
| Brotli = Small bread
| billyhoffman wrote:
| Google Zurich also did Zopfli, a DEFLATE-compliant compressor
| that gets better ratios than gzip by taking longer to compress.
|
| Apparently Zopfli = small sweet bread
|
| https://en.wikipedia.org/wiki/Zopfli
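Zopfli's trade (spend far more encode time to get a smaller stream that any existing DEFLATE decoder reads) is the extreme end of the dial zlib already exposes as compression levels; the principle in miniature with the Python stdlib:

```python
import zlib

data = b"the quick brown fox jumps over the lazy dog. " * 400

fast = zlib.compress(data, 1)  # fastest level, biggest output
best = zlib.compress(data, 9)  # slowest standard level, smallest output

# Zopfli goes further still, searching much harder for a better DEFLATE
# stream -- but, like level 9, its output decompresses with plain zlib.
```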
| codetrotter wrote:
| They should do XZli next :D
|
| And write it in Rust
| codetrotter wrote:
| > The suffix -li is used in Swiss German dialects
|
| Seems similar to -let in English.
|
| JPEGlet
|
| Or -ito/-ita in Spanish.
|
| JPEGito
|
| (Joint Photographers Experts Grupito)
|
| Or perhaps, if you want to go full Spanish
|
| GEFCito
|
| (Grupito de Expertos en Fotografia Conjunta)
| jug wrote:
| Their claims seem to make WebP obsolete for lossy encoding?
| The compression estimates brought up are similar to those
| usually cited for WebP versus JPEG.
|
| Hell, I question if AVIF is even worth it with Jpegli.
|
| It's obviously "better" (higher compression), but wait! It's 1) a
| crappy, limited image format for anything but basic use, with
| obvious video-keyframe roots, 2) terribly slow to encode, AND 3)
| slow to decode, since it has no streaming decoders: you need to
| download the entire AVIF before you can even begin decoding it,
| which makes it worse than even JPEG/MozJPEG in many cases
| despite their larger sizes. Yes, this has been benchmarked.
|
| JPEG XL would've still been worth it though because it's just
| covering so much more ground than JPEG/Jpegli and it has a
| streaming decoder like a sensible format geared for Internet use,
| as well as progressive decoding support for mobile networks.
|
| But without that one? Why not just stick with JPEGs, then?
| lonjil wrote:
| > Their claims about Jpegli seem to make WebP obsolete
| regarding lossy encoding? Similar compression estimates as WebP
| versus JPEG are brought up.
|
| I believe Jpegli beats WebP for medium- to high-quality
| compression. I would guess that more than half of all WebP
| images on the net would be smaller as Jpegli-encoded JPEGs of
| similar quality. And note that Jpegli is actually worse than
| MozJPEG and libjpeg-turbo at medium-low qualities. Something
| like libjpeg-turbo q75 is the crossover point, I believe.
|
| > Hell, I question if AVIF is even worth it with Jpegli.
|
| According to another test [1], for large (like 10+ Mpix)
| photographs compressed with high quality, Jpegli wins over
| AVIF. But AVIF seems to win for "web size" images. Though, as
| for point 2 in your next paragraph, Jpegli is indeed much
| faster than AVIF.
|
| > JPEG XL would've still been worth it though because it's just
| covering so much more ground than JPEG/Jpegli and it has a
| streaming decoder like a sensible format geared for Internet
| use, as well as progressive decoding support for mobile
| networks.
|
| Indeed. At a minimum, JXL gives you another 20% size reduction
| just from the better entropy coding.
|
| [1] https://cloudinary.com/blog/jpeg-xl-and-the-pareto-front
___________________________________________________________________
(page generated 2024-04-03 23:01 UTC)