[HN Gopher] Chrome Jpegxl Issue Reopened
___________________________________________________________________
Chrome Jpegxl Issue Reopened
Author : markdog12
Score : 193 points
Date : 2025-11-24 12:23 UTC (10 hours ago)
(HTM) web link (issues.chromium.org)
(TXT) w3m dump (issues.chromium.org)
| markdog12 wrote:
| "Yes, re-opening.".
|
| > Given these positive signals, we would welcome contributions to
| integrate a performant and memory-safe JPEG XL decoder in
| Chromium. In order to enable it by default in Chromium we would
| need a commitment to long-term maintenance. With those and our
| usual launch criteria met, we would ship it in Chrome.
|
| https://groups.google.com/a/chromium.org/g/blink-dev/c/WjCKc...
| bigbuppo wrote:
| LOL. Google, the "yeah that thing we bought six months ago,
| we're killing it off 30 days from 4 weeks ago" company demanding
| "long-term" anything.
| lonjil wrote:
| long term support is actually being provided by google...
|
| just a different team in a different country :D
|
| most jxl devs are at google research in zurich, and already
| pledged to handle long term support
| malfist wrote:
| Just like google pledges long term support for everything
| until the next new and shiny comes along.
| concinds wrote:
| Context: Mozilla has had the same stance and many devs
| (including Googlers) are already working on a Rust decoder
| which has made good progress.
| wizee wrote:
| JPEG-XL provides the best migration path for image conversion
| from JPEG, with lossless recompression. It also supports
| arbitrary HDR bit depths (up to 32 bits per channel) unlike AVIF,
| and generally its HDR support is much better than AVIF. Other
| operating systems and applications were making strides towards
| adopting this format, but until now Google was stubbornly
| holding the web back with their refusal to support JPEG-XL in
| favour of AVIF, which they were pushing. I'm glad to hear they're
| finally reconsidering. Let's hope this leads to resources being
| dedicated to help build and maintain a performant and memory safe
| decoder (in Rust?).
| kps wrote:
| > (in Rust?)
|
| Looks like that's the idea:
| https://issues.chromium.org/issues/462919304
| homebrewer wrote:
| It's not just Google, Mozilla has no desire to introduce a
| barely supported massive C++ decoder for marginal gains either:
|
| https://github.com/mozilla/standards-positions/pull/1064
|
| avif is just better for typical web image quality, it produces
| better looking images and its artifacts aren't as annoying
| (smoothing instead of blocking and ringing around sharp edges).
|
| You also get it for basically free because it's just an av1 key
| frame. Every browser needs an av1 decoder already unless it's
| willing to forego users who would like to be able to watch
| Netflix and YouTube.
| jnd-cz wrote:
| Can AVIF display 10 bit HDR with larger color gamut that any
| modern phone nowadays is capable of capturing?
| arccy wrote:
| if you actually read your parent comment: "typical web
| image quality"
| ansgri wrote:
| Typical web image quality is what it is partly because of
| lack of support. It's literally more difficult to show a
| static HDR photo than a whole video!
| zozbot234 wrote:
| PNG supports HDR with up to 16 bits per channel, see
| https://www.w3.org/TR/png-3/ and the cICP, mDCV and cLLI
| chunks.
| lonjil wrote:
| With incredibly bad compression ratios.
| mort96 wrote:
| HDR should not be "typical web" anything. It's insane
| that websites are allowed to override my system
| brightness setting through HDR media. There's so much
| stuff out there that literally hurts my eyes if I've set
| my brightness such that pure white (SDR FFFFFF) is a
| comfortable light level.
|
| I want JXL in web browsers, but without HDR support.
| magicalhippo wrote:
| There's nothing stopping browsers from tone mapping[1]
| those HDR images using your tone mapping preference.
|
| [1]: https://en.wikipedia.org/wiki/Tone_mapping
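|
| For what it's worth, a minimal sketch of the kind of global
| operator I mean (extended Reinhard, assuming numpy and
| linear-light luminance; the white_point value is just
| illustrative):
|
|     import numpy as np
|
|     def reinhard_tonemap(luminance, white_point=4.0):
|         """Map linear HDR luminance (1.0 = SDR white) into
|         0..1. Highlights near white_point are compressed
|         smoothly instead of clipping, so they stay within
|         the brightness ceiling the user chose."""
|         l = np.asarray(luminance, dtype=np.float64)
|         return l * (1.0 + l / (white_point ** 2)) / (1.0 + l)
|
|     # A pixel at 3x SDR white lands back under 1.0 instead
|     # of blowing past the user's brightness setting.
|     print(reinhard_tonemap(np.array([0.25, 1.0, 3.0])))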
| mort96 wrote:
| What does that achieve? Isn't it simpler to just not
| support HDR than to support HDR but tone map away the HDR
| effect?
|
| Anyway, which web browsers have a setting to tone map HDR
| images such that they look like SDR images? (And why
| should "don't physically hurt my eyes" be an opt-in
| setting anyway instead of just the default?)
| spider-mario wrote:
| How about a user stylesheet that uses
| https://www.w3.org/TR/css-color-hdr-1/#the-dynamic-range-
| lim... ?
| mort96 wrote:
| How about websites just straight up aren't allowed to
| physically hurt me, _by default_?
| spider-mario wrote:
| You asked "which web browsers have a setting to tone map
| HDR images such that they look like SDR images?"; I
| answered. Were you not actually looking for a solution?
| magicalhippo wrote:
| > What does that achieve?
|
| Because then a user who _wants_ to see the HDR image in
| all its full glory can do so. If the base image is not
| HDR, then there is nothing they can do about it.
|
| > And why should "don't physically hurt my eyes" be an
| opt-in setting anyway instead of just the default?
|
| While I very much support more HDR in the online world, I
| fully agree with you here.
|
| However, I suspect the reason will boil down to what it
| usually does: almost no users change the default settings
| ever. And so, any default which goes the other way will
| invariably lead to a ton of support cases of "why doesn't
| this work".
|
| That said, web browsers are dark-mode aware; they could be
| HDR aware too and do what you prefer based on that.
| kps wrote:
| Not everything in the world is passive end-of-the-line
| presentation. JPEG-XL is the only one of these formats that
| tries to be a general-purpose image format.
| asadotzler wrote:
| If that's the case, let it be a feature of image editing
| packages that can output formats that are for the web. It's
| a web standard we're talking about here, not a general-
| purpose image format, so asking browsers to carry that big
| code load seems unreasonable when existing formats do most
| of what we need and want _for the web._
| crote wrote:
| People generally expect browsers to display general-
| purpose image formats. It's why they support formats like
| classical JPEG, instead of just GIF and PNG.
|
| Turns out people _really_ like being able to just drag-
| and-drop an image from their camera into a website -
| being forced to re-encode it first isn't exactly
| popular.
| robertoandred wrote:
| > Turns out people really like being able to just drag-
| and-drop an image from their camera into a website -
| being forced to re-encode first it isn't exactly popular.
|
| That's a function of the website, not the browser.
| jyoung8607 wrote:
| > That's a function of the website, not the browser.
|
| That's hand-waving away quite a lot. The task changes
| from serving a copy of a file on disk, as with every
| other image format in common use, to needing a transcoding
| pipeline more akin to sites like YouTube. Technically
| possible, but lots of extra complexity in return for what
| gain?
| xeeeeeeeeeeenu wrote:
| >avif is just better for typical web image quality,
|
| What does "typical web image quality" even mean? I see lots
| of benchmarks with very low BPPs, like 0.5 or even lower, and
| that's where video-based image codecs shine.
|
| However, I just visited CNN.com and these are the BPPs of the
| first 10 images my browser loaded: 1.40, 2.29, 1.88, 18.03
| (PNG "CNN headlines" logo), 1.19, 2.01, 2.21, 2.32, 1.14,
| 2.45.
|
| I believe people are underestimating the BPP values that are
| actually used on the web. I'm not saying that low-BPP images
| don't exist, but clearly it isn't hard to find examples of
| higher-quality images in the wild.
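|
| (BPP here is just compressed bytes over pixel count; a quick
| sketch of the measurement, assuming Pillow, with a made-up
| example filename:)
|
|     from pathlib import Path
|     from PIL import Image
|
|     def bits_per_pixel(path):
|         """Compressed size in bits divided by pixel count."""
|         p = Path(path)
|         with Image.open(p) as im:
|             width, height = im.size
|         return p.stat().st_size * 8 / (width * height)
|
|     # e.g. a 100 KB JPEG at 800x600 is about 1.7 BPP, nowhere
|     # near the 0.3-0.5 BPP range many codec benchmarks use.
|     print(bits_per_pixel("hero_image.jpg"))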
| lonjil wrote:
| I don't understand what you're trying to say. Mozilla said
| over a year ago that they would support JXL as soon as
| there's a fast memory safe decoder that will be supported.
|
| Google on the other hand never expressed any desire to
| support JXL at all, regardless of the implementation. Only
| now, after the PDF Association announced that PDF would be
| using JXL, did they decide to support JXL on the web.
|
| > avif is just better for typical web image quality, it
| produces better looking images and its artifacts aren't as
| annoying (smoothing instead of blocking and ringing around
| sharp edges).
|
| AVIF is certainly better for the level of quality that Google
| wants you to use, but in reality, images on the web are much
| higher quality than that.
|
| And JXL is pretty good if you want smoothing; in fact,
| libjxl's defaults have gotten so overly smooth recently that
| it's considered a problem, which they're in the process of
| fixing.
| bawolff wrote:
| > I don't understand what you're trying to say. Mozilla
| said over a year ago that they would support JXL as soon as
| there's a fast memory safe decoder that will be supported.
|
| Did they actually say that? All the statements I've seen
| from them have been much more guarded and vague. More of a
| "maybe we will think about it if that happens."
| lonjil wrote:
| > If they successfully contribute an implementation that
| satisfies these properties and meets our normal
| production requirements, we would ship it.
|
| That's what they said a year ago. And a couple of Mozilla
| devs have been in regular contact with the JXL devs ever
| since then, helping with the integration. The patches to
| use jxl-rs with Firefox already exist, and will be merged
| as soon as a couple of prerequisite issues in Gecko are
| fixed.
| magicalist wrote:
| Their standards position is still neutral[1]; what switched
| a year ago was that they said they would be _open_ to
| shipping an implementation that met their requirements.
| The tracking bug hasn't been updated.[2] The patches you
| mention are still part of the intent to prototype (behind
| a flag), similar to the earlier implementation that was
| removed in Chrome.
|
| They're looking at the same signals as Chrome of a format
| that's actually getting use, has a memory safe
| implementation, and that will stick around for decades to
| justify adding it to the web platform, all of which seem
| more and more positive since 2022.
|
| [1] https://mozilla.github.io/standards-positions/#jpegxl
|
| [2] https://bugzilla.mozilla.org/show_bug.cgi?id=1539075
| wizee wrote:
| I disagree about the image quality at typical sizes - I find
| JPEG-XL is generally similar to or better than AVIF at any
| reasonable compression ratio for web images. See this for
| example: https://tonisagrista.com/blog/2023/jpegxl-vs-avif/
|
| AVIF only comes out as superior at extreme compression ratios
| at much lower bit rates than are typically used for web
| images, and the images generally look like smothered messes
| at those extreme ratios.
| bananalychee wrote:
| Even though AVIF decoding support is fairly widespread by
| now, it is still not ubiquitous like JPEG/PNG/GIF. So
| typically services will store or generate the same image in
| multiple formats including AVIF for bandwidth optimization
| and JPEG for universal client support. Browser headers help
| to determine compatibility, but it's still fairly complicated
| to implement, and users also end up having to deal with
| different platforms supporting different formats when they
| are served WebP or AVIF and want to reupload an image
| somewhere else that does not like those formats. As far as I
| can tell, JXL solves that issue for most websites since it is
| backwards-compatible and can be decoded into JPEG when a
| client does not support JXL. I would happily give up a few
| percent in compression efficiency to get back to a single
| all-purpose lossy image format.
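|
| The negotiation side looks roughly like this on the server (a
| deliberately crude sketch; real setups also weigh q-values,
| per-image availability and Vary caching headers):
|
|     def pick_image_format(accept_header):
|         """Pick the best format the client advertises in its
|         Accept header, falling back to plain JPEG."""
|         accept = accept_header.lower()
|         for mime, ext in (("image/jxl", "jxl"),
|                           ("image/avif", "avif"),
|                           ("image/webp", "webp")):
|             if mime in accept:
|                 return ext
|         return "jpg"
|
|     print(pick_image_format(
|         "image/avif,image/webp,image/png,*/*;q=0.8"))  # avif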
| hirako2000 wrote:
| Even Google Photos does not support AVIF.
|
| It's almost as if Google had an interest in increased
| storage and bandwidth. Of course they don't, but as a paying
| Drive user I'm overcharged for the same thing.
| lonjil wrote:
| Some years ago, the Google Photos team asked the Chrome
| team to support JXL, so that they could use it for
| Photos. The request was ignored, of course.
| hirako2000 wrote:
| They could have added support themselves to the app as it
| doesn't use the WebView
| magicalist wrote:
| > _Even Google photo does not support avif_
|
| I have no previous first-hand knowledge of this, but I
| vaguely remember discussions of avif in google photos
| from reddit a while back so FWIW I just tried uploading
| some avif photos and it handled them just fine.
|
| Listed as avif in file info, downloads as the original
| file, though inspecting the network in the web frontend,
| it serves versions of it as jpg and webp, so there's
| obviously still transcoding going on.
|
| I'm not sure when they added support; the consumer
| documentation seems to be more landing site than docs,
| unless I'm completely missing the right page, but the API
| docs list avif support[1], and according to the way back
| machine, "AVIF" was added to that page some time between
| August and November 2023.
|
| [1] https://developers.google.com/photos/library/guides/u
| pload-m...
| hirako2000 wrote:
| You are correct that it is possible to upload AVIF files
| into Google Photos, but you can't view them, and of course
| there's no thumbnail, defeating the whole purpose of putting
| them into Photos.
|
| Given it's an app, they didn't even need Google Chrome to
| add support; AVIF is supported natively on Android.
| magicalhippo wrote:
| > Mozilla has no desire to introduce a barely supported
| massive C++ decoder for marginal gains
|
| On a slightly related note, I wanted to have a HDR background
| image in Windows 11. Should be a breeze in 2025 right?
|
| Well, Windows 11 only supports JPEG XR[1] for HDR background
| images. And my commonly used tools either did not support
| JPEG XR (GIMP, for example) or did not work correctly
| (ImageMagick).
|
| So I had a look at the JPEG XR reference implementation,
| which was hosted on Codeplex but has been mirrored on
| GitHub[2]. And boy, I sure hope that isn't the code that
| lives in Windows 11...
|
| Ok most of the gunk is in the encoder/decoder wrapper code,
| but still, for something that's supposedly still in active
| use by Microsoft... Though not even hosting their own copy of
| the reference implementation is telling enough I suppose.
|
| [1]: https://en.wikipedia.org/wiki/JPEG_XR
|
| [2]: https://github.com/4creators/jxrlib
| infinet wrote:
| Another JPEG XR user is Zeiss. It saves both grayscale and
| color microscope images with JPEG XR compression in a
| container format. Zeiss also released a C++ library
| (libczi) using the reference JPEG XR implementation to
| read/write these images. Somehow Zeiss is moving away from
| JPEG XR - its newer version of microscope control software
| saves with zstd compression by default.
| greenavocado wrote:
| "Marginal Gains"
|
| Generation Loss - JPEG, WebP, JPEG XL, AVIF :
| https://www.youtube.com/watch?v=w7UDJUCMTng
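|
| The effect is easy to reproduce for plain JPEG with a few
| lines (a rough sketch, assuming Pillow and numpy; quality and
| generation count are arbitrary):
|
|     import io
|     import numpy as np
|     from PIL import Image
|
|     def generation_loss(img, generations=20, quality=85):
|         """Re-encode an image N times, return the mean
|         absolute drift from the original per generation."""
|         original = np.asarray(img.convert("RGB"),
|                               dtype=np.int16)
|         current = img.convert("RGB")
|         drift = []
|         for _ in range(generations):
|             buf = io.BytesIO()
|             current.save(buf, format="JPEG", quality=quality)
|             buf.seek(0)
|             current = Image.open(buf).convert("RGB")
|             diff = np.asarray(current,
|                               dtype=np.int16) - original
|             drift.append(np.abs(diff).mean())
|         return drift
|
|     # drift = generation_loss(Image.open("photo.png"))
|     # The video runs the same idea across JPEG, WebP, AVIF
|     # and JPEG XL and compares how the error accumulates.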
| kllrnohj wrote:
| > and generally its HDR support is much better than AVIF
|
| Not anymore. JPEG had the best HDR support with ISO 21496-1
| weirdly enough, but AVIF also just recently got that capability
| with 1.2 ( https://aomedia.org/blog%20posts/Libavif-Improves-
| Support-fo... ).
|
| The last discussion in libjxl about this seemed to take the
| stance that it wasn't necessary since JXL has "native HDR",
| which completely misses the problem space.
| lonjil wrote:
| The JXL spec already has gainmaps...
|
| Also, just because there's a spec for using gainmaps with
| JPEG doesn't mean that it works well. With only 8 bits of
| precision, it really sucks for HDR, gainmap or no gainmap.
| You just get too much banding. JXL otoh is completely immune
| to banding, with or without gainmaps.
| kllrnohj wrote:
| > With only 8 bits of precision, it really sucks for HDR,
| gainmap or no gainmap. You just get too much banding.
|
| This is simply not true. In fact, you get _less_ banding
| than you do with 10-bit bt2020 PQ.
|
| > JXL otoh is completely immune to banding
|
| Nonsense. It has a lossy mode (which is its primary mode so
| to speak), so of course it has banding. Only lossless
| codecs can plausibly be claimed to be "immune to banding".
|
| > The JXL spec already has gainmaps...
|
| Ah looks like they added that sometime last year but
| decided to call it "JHGM" and also made almost no mention
| of this in the issue tracker, and didn't bother updating
| the previous feature requests asking for this that are
| still open.
| spaceducks wrote:
| > Nonsense. It has a lossy mode (which is its primary
| mode so to speak), so of course it has banding. Only
| lossless codecs can plausibly be claimed to be "immune to
| banding".
|
| color banding is not a result of lossy compression*, it
| results from not having enough precision in the color
| channels to represent slow gradients. VarDCT, JPEG XL's
| lossy mode, encodes values as 32-bit floats. in fact,
| image bit depth in VarDCT is just a single value that
| tells the decoder what bit depth it should output to,
| _not_ what bit depth the image is encoded as internally.
| optionally, the decoder can even blue-noise dither it for
| you if your image wants to be displayed in a higher bit
| depth than your display or software supports
|
| this is more than enough precision to prevent _any_ color
| banding (assuming of course the source data that was
| encoded into a JXL didn't have any banding either). if
| you still want more precision for whatever reason, the
| spec just defines that the values in XYB color channels
| are a real number between 0 and 1, and the header
| supports signaling an internal depth up to 64 bit per
| channel
|
| * technically color banding _could_ result from "lossy
| compression" if high bit depth values are quantized to
| lower bit depth values, however with sophisticated
| compression, higher bit depths often compress better
| because transitions are less harsh and as such need fewer
| high-frequency coefficients to be represented. even in
| lossless images, slow gradients can be compressed better
| if they're high bit depth, because frequent consistent
| changes in pixel values can be predicted better than
| sudden occasional changes (like suddenly transitioning
| from one color band to another)
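|
| to make the precision point concrete (a tiny sketch, assuming
| numpy; the gradient range is arbitrary): a shallow gradient
| squeezed into an 8-bit container collapses into a handful of
| visible steps, while the same data kept as floats keeps every
| step
|
|     import numpy as np
|
|     # slow gradient over a narrow luminance range -- the
|     # classic banding case (think dark sky backgrounds)
|     gradient = np.linspace(0.10, 0.14, 1024)
|
|     as_8bit = np.round(gradient * 255) / 255
|     as_10bit = np.round(gradient * 1023) / 1023
|
|     print(len(np.unique(as_8bit)))    # ~11 distinct steps
|     print(len(np.unique(as_10bit)))   # ~42 distinct steps
|     print(len(np.unique(gradient)))   # 1024, no banding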
| twotwotwo wrote:
| Wanted to note https://issues.chromium.org/issues/40141863 on
| making the lossless JPEG recompression a Content-Encoding,
| which provides a way that, say, a CDN could deploy it in a way
| that's fully transparent to end users (if the user clicks Save
| it would save a .jpg).
|
| (And: this is great! I think JPEG XL has a chance of being
| adopted with the recompression "bridge" and fast decoding
| options, and things like progressive decoding for its VarDCT
| mode are practical advantages too.)
| 12_throw_away wrote:
| > performant and memory safe decoder (in Rust?).
|
| Isn't this exactly the case that wuffs [1] is built for? I had
| the vague (and, looking into it now, probably incorrect)
| impression that Google was going to start building all their
| decoders with that.
|
| [1] https://github.com/google/wuffs
| lonjil wrote:
| WUFFS only works for very simple codecs. Basically useless
| for anything complex enough that memory bugs would be common.
| FerritMans wrote:
| Love this, I've been waiting for Google to integrate this.
| From my experience with AVIF and JPEG XL, JPEG XL is much
| more promising for the next 20 years.
| masswerk wrote:
| Nice example of how a standard, like PDF, can even
| persuade/force one of the mighty to adopt a crucial bit of
| technology, so that this may become a common standard in its own
| right (i.e. "cascading standards").
| ChrisArchitect wrote:
| [dupe] https://news.ycombinator.com/item?id=46021179
| crazygringo wrote:
| Dupe. From yesterday (183 points, 82 comments):
|
| https://news.ycombinator.com/item?id=46021179
| markdog12 wrote:
| Ah, I think I searched for "jpegxl", that's why there was no
| match.
| Pxtl wrote:
| > Lossless JPEG recompression (byte-exact JPEG recompression,
| saving about 20%) for legacy images
|
| Lossless recompression is the main interesting thing on offer
| here compared to other new formats... and honestly with only 20%
| improvement I can't say I'm super excited by this, compared to
| the pain of dealing with yet another new image format.
|
| For example, ask a normal social media user how they feel about
| .webp and expect to get an earful. The problem is that even if
| your browser supports the new format, there's no guarantee that
| every other tool you use supports it, from the OS to every site
| you want to re-upload to, etc.
| F3nd0 wrote:
| If I remember correctly, WebP was single-handedly forced into
| adoption by Chrome, while offering only marginal improvements
| over existing formats. Mozilla even worked on an improved JPEG
| encoder, MozJPEG, to show it could compete with WebP very well.
| Then came HEIF and AVIF, which, like WebP, were just repurposed
| video codecs.
|
| JPEG XL is the first image format in a long while that's been
| actually designed for images and brings a substantial
| improvement to quality while _also_ covering a wide range of
| uses and preserving features that video codecs don't have. It
| supports progressive decoding, seamless handling of very large
| image sizes, a potentially large number of channels, is
| reasonably resilient
| against generation loss, and more. The fact that it has no
| major drawbacks alone gives it much more merit than WebP has
| ever had. Lossless recompression is in addition to all of that.
|
| The difference is that this time around, Google has single-
| handedly held back the adoption of JPEG XL, while a number of
| other parties have expressed interest.
| Dwedit wrote:
| Having a PNG go from 164.5K to 127.1K as lossless WEBP is not
| what I'd call "marginal". An improvement of over 20% is huge
| for lossless compression.
|
| Going from lossless WEBP to lossless JXL is marginal though,
| and is not worth the big decode performance loss.
| lonjil wrote:
| Since the person you replied to mentioned MozJPEG, I have
| to assume they meant that WebP's lossy capabilities were a
| marginal improvement.
| F3nd0 wrote:
| In context of the parent comment, 'only 20% improvement' is
| not super exciting, 'compared to the pain of dealing with
| yet another new image format'.
|
| You raise a good point, though; WebP certainly did (and
| continues to do) well in some areas, but at the cost of
| lacking in others. Moreover, when considering a format for
| adoption, one should compare it with other candidates for
| adoption, too. And years before WebP gained widespread
| support in browsers, it had competition from other
| interesting formats like FLIF, which addressed some of its
| flaws, and I have to wonder how it compares to the even
| older JPEG 2000.
| halapro wrote:
| You're not being fair. WebP has been the only choice for
| lossy image compression with an alpha channel. Give it some
| credit.
| F3nd0 wrote:
| Fair point, though not entirely true: you can run an image
| through lossy compression and store the result in a PNG,
| using tools like pngquant [1]. Likely not as efficient for
| many kinds of images, but totally doable.
|
| [1] https://pngquant.org/
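|
| Something like this with Pillow gets you most of the way (a
| rough sketch; pngquant's own quantizer produces noticeably
| better palettes than this):
|
|     from PIL import Image
|
|     img = Image.open("input.png").convert("RGBA")
|
|     # Lossy step: reduce to a 256-color palette. FASTOCTREE
|     # is one of the Pillow quantizers that accepts RGBA, so
|     # transparency can survive via PNG's palette alpha.
|     small = img.quantize(colors=256,
|                          method=Image.Quantize.FASTOCTREE)
|
|     # Lossless container step: store the palette image as PNG.
|     small.save("output.png", optimize=True)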
| tempest_ wrote:
| 20% is massive for those storing all those social media
| images, though.
| Pxtl wrote:
| I get that there are people who _are_ super excited by this
| for very good reasons, but for those of us downstream this is
| just going to be a hassle.
| 7jjjjjjj wrote:
| I think there's a difference here.
|
| If I right click save and get a webp, it was probably converted
| from JPG. Very very few images are uploaded in webp. So getting
| a webp image means you've downloaded an inferior version.
|
| JXL doesn't have this issue because conversion from jpeg is
| lossless. So you've still gotten the real, full-quality image.
| Pxtl wrote:
| Let's be realistic - when most users are upset they got a
| .webp, they're not annoyed because of quality-loss, they're
| annoyed because they can't immediately use it in many other
| services & software.
| spider-mario wrote:
| Since the recompression is lossless, you don't need every tool
| you use to support it, as long as one of them is one that can
| do the decompression back to JPEG. This sounds a bit like
| complaining that you can't upload .7z everywhere.
| Pxtl wrote:
| AFAIK downconverting to jpeg is only an option for legacy
| jpegs that have been upconverted to jpegxl though. Many
| jpegxl images likely _won't_ support downconverting if they
| were created as jxl from the get-go.
|
| Basically, jpeg->jxl->jpeg is perfectly lossless conversion,
| but a newly-made jxl->jpeg is not, even if it doesn't use
| modern jxl-only features like alpha channels.
|
| With that in mind I'd actually prefer if those were treated
| as separate file-formats with distinct file-extensions
| (backwards-compatible jpeg->jxls vs pure-jxl). The former
| could be trivially handled with automated tools, but the
| latter can't.
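|
| (The round trip is easy to sanity-check, by the way; a sketch
| assuming the reference cjxl/djxl CLI tools, and that lossless
| JPEG recompression is their default behaviour when the input
| is a JPEG, which is my understanding:)
|
|     import subprocess
|     from pathlib import Path
|
|     src = Path("photo.jpg")
|     subprocess.run(["cjxl", str(src), "photo.jxl"],
|                    check=True)
|     subprocess.run(["djxl", "photo.jxl", "restored.jpg"],
|                    check=True)
|
|     same = src.read_bytes() == Path("restored.jpg").read_bytes()
|     print("byte-identical after jpeg -> jxl -> jpeg:", same)
|
|     # A jxl made from scratch (say, from a PNG) carries no
|     # JPEG reconstruction data, so going to JPEG from it is a
|     # fresh lossy encode, not the original bytes.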
| spaceducks wrote:
| I'm not sure if that will be an issue in practice. in any
| case, you need a JPEG XL decoder to perform the transition
| from a recompressed-JPEG-JXL to the original JPEG, so
| whatever tool is doing this, it can already handle native-
| JXL too. it could be the conversion happens on the server
| side and the client always sees JPEG, in which case a
| native JXL can _also_ be decoded to a JPEG (or if lossless
| a PNG), though obviously with information loss since JPEG
| is a subset of JXL (to put it lightly)
| spider-mario wrote:
| Well, sure, but wasn't that the use case we were
| discussing?
| Pxtl wrote:
| Right. And that particular use-case sounds nice, but
| realistically this new format will not be _exclusively_
| used in that particular case.
|
| Dealing with basically another .webp-like format in those
| cases (one that _might_ be a backwards-compatible jpeg or
| might not and determining that can only be done by
| inspecting the file contents) doesn't sound super fun.
|
| So ideally, to make up names, I wish they'd used separate
| extensions and so a ".jp3" is a file that can be
| downconverted to a jpg and you could get a browser
| extension to automate that for you if you wanted, and a
| ".jxl" is the new file format that's functionally another
| ".webp"-like thing to deal with and all the pain-points
| that implies.
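|
| For the record, the inspection itself would look something
| like this (assuming my understanding of the container is
| right: ISO BMFF-style boxes, with the JPEG reconstruction
| data in a "jbrd" box):
|
|     import struct
|
|     def looks_reconstructible(path):
|         """Heuristic: does this .jxl carry JPEG reconstruction
|         data? Bare codestreams (starting FF 0A) never do."""
|         data = open(path, "rb").read()
|         if data[:2] == b"\xff\x0a":
|             return False
|         pos = 0
|         while pos + 8 <= len(data):
|             size, kind = struct.unpack(">I4s",
|                                        data[pos:pos + 8])
|             header = 8
|             if size == 1 and pos + 16 <= len(data):
|                 # 64-bit extended box size
|                 size, = struct.unpack(">Q",
|                                       data[pos + 8:pos + 16])
|                 header = 16
|             if kind == b"jbrd":
|                 return True
|             if size == 0:            # box runs to end of file
|                 break
|             pos += max(size, header)
|         return False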
| albert_e wrote:
| > Chrome Jpegxl Issue Reopened
|
| > (this is the tracking bug for this feature)
|
| Is it just me, or is it confusing to use the terms issue /
| bug / feature interchangeably?
| mort96 wrote:
| It's not really used interchangeably: "bug" is used to mean
| "entry in the bug tracker database", while "feature" is used to
| mean what we colloquially think of as a feature of a computer
| program.
|
| It's arguably a slight abuse of a bug tracking system to also
| track progress and discussion on features, but it's not exactly
| uncommon; it's just that many systems would call it an "issue"
| rather than a "bug".
| nandomrumber wrote:
| Maybe more like a heading bug:
|
| https://aviation.stackexchange.com/questions/23166/what-is-t...
| crazygringo wrote:
| Not really -- they're all "potential todos" that need to be
| tracked and prioritized in the same place.
|
| And the difference between a bug and a feature is often in the
| eye of the beholder. I'll very often title a GitHub issue with
| "Bug/Feature Request:" since it's often debatable whether the
| existing behavior was by design or not, and I don't want to
| presume one way or the other.
|
| So I do consider them all pretty interchangeable at the end of
| the day, and therefore not really confusing.
| gen2brain wrote:
| I like how even the new product (jpegli) is a significant
| improvement. I am in the process of converting my comic book
| collection. I save a lot of space and still use JPEG, which is
| universally supported.
___________________________________________________________________
(page generated 2025-11-24 23:00 UTC)