[HN Gopher] Netflix's AV1 Journey: From Android to TVs and Beyond
       ___________________________________________________________________
        
       Netflix's AV1 Journey: From Android to TVs and Beyond
        
       Author : CharlesW
       Score  : 485 points
       Date   : 2025-12-05 00:09 UTC (22 hours ago)
        
 (HTM) web link (netflixtechblog.com)
 (TXT) w3m dump (netflixtechblog.com)
        
       | kvirani wrote:
       | Top post without a single comment and only 29 points. Clearly my
       | mental model of how posts bubble to the top is broken.
        
         | yjftsjthsd-h wrote:
         | IIRC, there's a time/recency factor. If we assume that most
          | people don't browse /newest (without commenting on _should_, I
         | suspect this _is_ true), then that seems like a reasonable way
         | to help surface things; enough upvotes to indicate interest
         | means a story gets a chance at the front page.
        
       | IgorPartola wrote:
       | Amazing. Proprietary video codecs need to not be the default and
       | this is huge validation for AV1 as a production-ready codec.
        
         | raw_anon_1111 wrote:
         | Why does it matter if _Netflix_ is using an open standard if
         | every video they stream is wrapped in proprietary closed DRM?
        
           | chii wrote:
            | because device makers won't care about the DRM, but they do
            | care about which hardware decoder to put into their devices
            | so they can decode Netflix videos. By ensuring this video
            | codec is open, it benefits everybody else: the same device
            | can now hardware decode _more_ videos from different video
            | providers, which in turn makes more video providers choose
            | AV1.
           | 
           | Basically, a network effect for an open codec.
        
             | raw_anon_1111 wrote:
             | You've convinced me... (no snark intended)
        
             | csmpltn wrote:
             | But they still need to decode the proprietary DRM before it
             | can be fed to the hardware decoder... lol
        
               | cm2187 wrote:
               | I think the point is that if you are not Netflix, you can
                | use AV1 as most of your clients' devices support hardware
               | acceleration thanks to the big guys using AV1 themselves.
        
               | nevi-me wrote:
               | I struggle to follow your point. They still need to do
               | that for any codec, and I would think that the DRM
               | decryption would be using algorithms that might also be
               | hardware accelerated.
        
           | cheema33 wrote:
           | > Why does it matter if Netflix is using an open standard if
           | every video they stream is wrapped in proprietary closed DRM?
           | 
           | I am not sure if this is a serious question, but I'll bite in
           | case it is.
           | 
           | Without DRM Netflix's business would not exist. Nobody would
            | license them any content if it were going to be streamed
            | without DRM.
        
             | realusername wrote:
              | I don't think anybody could suggest going back to Blu-ray
              | at this point. If selling online without DRM were the only
              | choice, they would have to comply.
        
             | reddalo wrote:
             | >Without DRM Netflix's business would not exist. Nobody
              | would license them any content if it were going to be
              | streamed without DRM.
             | 
             | I don't agree. If people refused to watch DRM-protected
             | content, they would get rid of it.
             | 
              | For example, Pluto TV is a free streaming service that
              | offers much of its content without DRM. GOG lets you buy
              | DRM-free games. Even Netflix itself lets you stream
              | DRM-free content, albeit in low resolution.
        
               | bobdvb wrote:
                | From previous experience, some platforms are considered
                | a "leakage source" for content, and major rights owners
                | won't put their content there because it's too easy to
                | steal from. The security measures that are put on
                | streaming platforms aren't totally ineffective; they're
                | restrictive, but it's considered worth the trouble
                | because platforms can actually measure the effect of
                | restrictions.
                | 
                | The low resolution option is something many rightsholders
                | accept, but from a product proposition perspective it's
                | difficult to explain to many customers. They're just
                | grumpy that they paid for content and can only watch it
                | in SD, which reduces your customer satisfaction. Better
                | to do nothing than a poor job sometimes.
        
             | IshKebab wrote:
             | I'm not convinced by that. If DRM didn't exist then do you
             | really think studios would be like "nah we'll just miss out
             | on all that money".
             | 
             | They just want DRM because it makes them even _more_ money.
             | Or at least they think it does. I have yet to find a single
              | TV show or film that isn't available on BitTorrent, so I
             | don't think the DRM is actually preventing piracy in the
             | slightest. I guess they want it in order to prevent legal
             | tools from easily working with videos, e.g. for backup,
             | retransmission etc.
        
             | dontlaugh wrote:
             | Why? Everything gets pirated anyway, even with all the DRM.
             | There's no difference.
        
               | jamesnorden wrote:
               | Security theater mostly, makes the executives feel good.
        
               | bobdvb wrote:
               | I've spent >20 years doing content security in various
               | forms at various companies. Until recently I was
               | directing the technology at a major streaming platform.
               | 
               | I can confirm that while there are serious issues with
               | Widevine (and to a lesser extent PlayReady), the
               | protection measures aren't totally ineffective. My work
               | in improving security had measurable results saving
               | significant amounts of money and reducing content
               | leakage. One memorable time my colleague and I had a call
               | with a big rights owner who tracks the piracy of their
               | assets and they said "Can you tell us what you've been
               | doing recently? Because the amount of piracy from your
               | platform has dropped significantly."
               | 
               | Anti-piracy and content security is also a differentiator
               | between platforms when bidding for content deals. Rights
               | owners will absolutely give the best deals to the
                | provider that offers more assurance and avoid platforms
                | that are leaky buckets.
               | 
               | I know that doesn't fit the narrative, but until recently
               | this was literally my job.
        
               | Dylan16807 wrote:
               | Are we talking about pretty mainstream content here?
               | Stuff with at least a hundred thousand views in the first
               | week?
               | 
               | I don't think I've ever looked for a recent show and not
               | seen a pirate version.
        
       | pbw wrote:
       | There's an HDR war brewing on TikTok and other social apps. A
       | fraction of posts that use HDR are just massively brighter than
       | the rest; the whole video shines like a flashlight. The apps are
       | eventually going to have to detect HDR abuse.
        
         | jsheard wrote:
         | Sounds like they need something akin to audio volume
         | normalization but for video. You can go bright, but only in
         | moderation, otherwise your whole video gets dimmed down until
         | the average is reasonable.
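          | 
          | Something as dumb as this might be enough (rough hypothetical
          | Python sketch; assumes per-pixel luminance with 1.0 = SDR
          | reference white, and leaves open what the right "loudness"
          | model for video actually is):
          | 
          |     import numpy as np
          | 
          |     TARGET_AVG = 0.25  # arbitrary "reasonable" average level
          | 
          |     def normalize_clip(frames):
          |         # frames: list of float luminance arrays
          |         avg = np.mean([f.mean() for f in frames])
          |         if avg <= TARGET_AVG:
          |             return frames  # moderate HDR, leave it alone
          |         gain = TARGET_AVG / avg  # dim the whole clip
          |         return [f * gain for f in frames]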
        
           | illiac786 wrote:
            | I was about to write that. The algorithm needs to be chosen;
            | what is mostly used for audio gain normalization? A rolling
            | average?
        
           | solarkraft wrote:
           | Actually I don't even agree with that. I don't want to be
           | flash banged.
        
         | JoshTriplett wrote:
         | That's true on the web, as well; HDR images on web pages have
         | this problem.
         | 
         | It's not obvious whether there's any automated way to
          | _reliably_ detect the difference between "use of HDR" and
         | "abuse of HDR". But you could probably catch the _most_
          | egregious cases, like "every single pixel in the video has
         | brightness above 80%".
        
           | eru wrote:
           | > It's not obvious whether there's any automated way to
           | reliably detect the difference between "use of HDR" and
           | "abuse of HDR".
           | 
           | That sounds like a job our new AI overlords could probably
           | handle. (But that might be overkill.)
        
           | kmeisthax wrote:
            | Funnily enough, HDR displays already have to handle this
            | problem, because most HDR monitors literally do not have the
            | power circuitry or cooling to deliver a full white screen at
            | maximum brightness.
           | 
           | My idea is: for each frame, grayscale the image, then count
           | what percentage of the screen is above the standard white
           | level. If more than 20% of the image is >SDR white level,
           | then tone-map the whole video to the SDR white point.
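            | 
            | A minimal sketch of that check (hypothetical Python;
            | assuming decoded frames are linear-light RGB arrays scaled
            | so 1.0 is SDR reference white; the threshold is arbitrary):
            | 
            |     import numpy as np
            | 
            |     def should_tonemap(frame, max_fraction=0.20):
            |         # frame: HxWx3 linear-light RGB, 1.0 == SDR white
            |         # crude grayscale via Rec. 709 luma weights
            |         w = np.array([0.2126, 0.7152, 0.0722])
            |         luma = frame @ w
            |         frac = np.mean(luma > 1.0)
            |         return frac > max_fraction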
        
             | JoshTriplett wrote:
             | That needs a temporal component as well: games and videos
             | often use HDR for sudden short-lived brightness.
        
             | Koffiepoeder wrote:
             | I now present you: HDRbooster. The tool to boost your image
             | to 19.99% BOOSTED highlights and 80.01% MAX brightness
             | (99.99% of SDR white)!
        
         | munificent wrote:
         | Just what we need, a new loudness war, but for our eyeballs.
         | 
         | https://en.wikipedia.org/wiki/Loudness_war
        
           | eru wrote:
           | Interestingly, the loudness war was essentially fixed by the
            | streaming services. They were in a similar situation to the
            | one TikTok is in now.
        
             | aoeusnth1 wrote:
             | What's the history on the end to the loudness war? Do
             | streaming services renormalize super compressed music to be
             | quieter than the peaks of higher dynamic range music?
        
               | eru wrote:
               | Yes. Basically the streaming services started using a
               | decent model of perceived loudness, and normalise tracks
               | to roughly the same perceived level. I seem to remember
               | that Apple (the computer company, not the music company)
               | was involved as well, but I need to re-read the history
               | here. Their music service and mp3 players were popular
               | back in the day.
               | 
                | So all that music producers got out of compressing their
                | music was clipping, not extra loudness on playback.
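                | 
                | Conceptually it's just "measure loudness, apply a
                | fixed playback gain towards a target" (rough sketch
                | with a plain RMS stand-in; the real measure is an
                | ITU-R BS.1770 / LUFS-style perceptual model, and the
                | -14 target below is only approximate):
                | 
                |     import numpy as np
                | 
                |     # roughly where several services normalize to
                |     TARGET_DB = -14.0
                | 
                |     def playback_gain_db(samples):
                |         # samples: float PCM in [-1, 1]
                |         rms = np.sqrt(np.mean(samples ** 2))
                |         db = 20 * np.log10(max(rms, 1e-9))
                |         # "loud" masters get a negative gain
                |         return TARGET_DB - db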
        
               | cdash wrote:
                | It hasn't really changed much in the mastering process;
                | they are still doing the same old compression. Maybe not
                | to the same extremes, but dynamic range is still usually
                | terrible. They master at a higher LUFS target than the
                | streaming platforms normalize to because each streaming
                | platform has a different limit and could change it at any
                | time, so better to be on the safe side. There's also the
                | fact that the majority of music listening doesn't happen
                | on good speakers or in a good environment.
        
               | account42 wrote:
                | > There's also the fact that the majority of music
                | listening doesn't happen on good speakers or in a good
                | environment.
               | 
                | Exactly this. I usually do not want highly dynamic audio
                | because that means it's either too quiet sometimes or
                | loud enough to annoy the neighbors at other times, or
                | both.
        
             | Demiurge wrote:
             | You would think, but not in a way that matters. Everyone
             | still compresses their mixes. People try to get around
             | normalization algorithms by clever hacks. The dynamics
             | still suffer, and bad mixes still clip. So no, I don't
             | think streaming services fixed the loudness wars.
        
             | irae wrote:
             | I hope they end up removing HDR from videos with HDR text.
              | Recording video in sunlight etc. is OK; it can be sort of
              | "normalized brightness" or something. But HDR text on top
              | is always terrible.
        
           | morshu9001 wrote:
           | What if they did HDR for audio? So an audio file can tell
           | your speakers to output at 300% of the normal max volume,
           | even more than what compression can do.
        
             | Cthulhu_ wrote:
             | Isn't that just by having generally low volume levels? I'm
             | being pedantic, but audio already supports a kind of HDR
              | like that. That said, I wonder if the "volume
              | normalisation" tech that Spotify (and presumably other
              | media apps / players / etc.) has can be tricked into
              | thinking a song is really quiet.
        
         | ElasticBottle wrote:
         | Can someone explain what the war is about?
         | 
         | Like HDR abuse makes it sound bad, because the video is bright?
         | Wouldn't that just hurt the person posting it since I'd skip
         | over a bright video?
         | 
         | Sorry if I'm phrasing this all wrong, don't really use TikTok
        
           | JoshTriplett wrote:
           | > Wouldn't that just hurt the person posting it since I'd
           | skip over a bright video?
           | 
           | Sure, in the same way that advertising should never work
           | since people would just skip over a banner ad. In an ideal
           | world, everyone would uniformly go "nope"; in our world, it's
           | very much analogous to the
           | https://en.wikipedia.org/wiki/Loudness_war .
        
           | johncolanduoni wrote:
           | Not everything that glitters (or blinds) is gold.
        
         | hbn wrote:
         | HDR videos on social media look terrible because the UI isn't
          | in HDR while the video is. So you have this insanely bright
         | video that more or less ignores your brightness settings, and
         | then dim icons on top of it that almost look incomplete or
         | fuzzy cause of their surroundings. It looks bizarre and
         | terrible.
        
           | nine_k wrote:
           | But isn't it the point? Try looking at a light bulb;
           | everything around it is so much less bright.
           | 
            | OTOH pointing a flashlight at your face is at least
            | impolite. I would put a dark filter on top of HDR videos
            | until a video is clicked for watching.
        
             | hbn wrote:
             | I'm not trying to watch videos or read text on my light
             | bulb
        
               | nine_k wrote:
               | A video of a sunrise, or a firework, or metal being cast,
               | etc feels much more real in HDR. There are legitimate
               | uses.
        
           | NathanielK wrote:
           | It's good if you have black text on white background, since
           | your app can have good contrast without searing your eyes.
           | People started switching to dark themes to avoid having their
            | eyeballs seared by monitors with the brightness turned up.
           | 
           | For things filmed with HDR in mind it's a benefit. Bummer
           | things always get taken to the extreme.
        
             | hbn wrote:
             | I only use light themes for the most part, and HDR videos
             | look insane and out of place. If you scroll past an HDR
              | video on Instagram you have an eyeball-searing section of
             | your screen because your eyes aren't adjusted to looking at
             | that brightness, and then once you scroll it off the screen
             | and you have no HDR content, everything looks dim and muted
             | because you just got flashbanged.
        
           | hombre_fatal wrote:
           | Not sure how it works on Android, but it's such amateur UX on
           | Apple's part.
           | 
           | 99.9% of people expect HDR content to get capped / tone-
           | mapped to their display's brightness setting.
           | 
           | That way, HDR content is just magically better. I think this
           | is already how HDR works on non-HDR displays?
           | 
            | For the 0.1% of people who want something different, it
           | should be a toggle.
           | 
           | Unfortunately I think this is either (A) amateur
           | enshittification like with their keyboards 10 years ago, or
           | (B) Apple specifically likes how it works since it forces you
           | to see their "XDR tech" even though it's a horrible
           | experience day to day.
        
             | lern_too_spel wrote:
             | Android finally addressed this issue with the latest
             | release. https://9to5google.com/2025/12/02/the-top-new-
             | features-andro...
        
             | turtletontine wrote:
             | 99% of people have no clue what "HDR" and "tone-mapping"
              | mean, but yes, they are probably weirded out by some videos
              | being randomly way brighter than everything else.
        
           | crazygringo wrote:
           | The alternative is even worse, where the whole UI is blinding
           | you. Plus, that level of brightness isn't meant to be
           | sustained.
           | 
           | The solution is for social media to be SDR, not for the UI to
           | be HDR.
        
             | miladyincontrol wrote:
             | Imo the real solution is for luminance to scale
              | appropriately even in the HDR range, kinda like how
              | gain-map HDR images can. Scaled both with regard to the
              | display's capabilities and the user's/app's intent.
        
               | solarkraft wrote:
               | My personal solution would be to cap video brightness to
               | the brightness I selected.
        
         | crazygringo wrote:
         | This is one of the reasons I don't like HDR support "by
         | default".
         | 
         | HDR is meant to be so much more intense, it should really be
         | limited to things like immersive full-screen long-form-ish
         | content. It's for movies, TV shows, etc.
         | 
         | It's not what I want for non-immersive videos you scroll
         | through, ads, etc. I'd be happy if it were disabled by the OS
         | whenever not in full screen mode. Unless you're building a
         | video editor or something.
        
           | JoshTriplett wrote:
           | Or a photo viewer, which isn't necessarily running in
           | fullscreen.
        
         | dylan604 wrote:
          | sounds like every fad that came before it where it was
          | overused by all of the people copying with no understanding of
          | what it is or why. remember all of the HDR still images that
          | pushed everything to look post-apocalyptic? remember all of
          | the people pushing washed out videos because they didn't know
          | how to grade the images recorded in log and it became a
          | "thing"?
          | 
          | eventually, it'll wear itself out just like every other
          | overuse of the new
        
         | recursive wrote:
         | My phone has this cool feature where it doesn't support HDR.
        
           | illiac786 wrote:
           | Every phone has it, it's called "power save mode" on most
           | devices and provides additional advantages like preventing
           | apps from doing too much stuff in the background. =)
        
         | thrdbndndn wrote:
         | The whole HDR scene still feels like a mess.
         | 
         | I know how bad the support for HDR is on computers
         | (particularly Windows and cheap monitors), so I avoid consuming
         | HDR content on them.
         | 
         | But I just purchased a new iPhone 17 Pro, and I was very
         | surprised at how these HDR videos on social media still look
         | like shit on apps like Instagram.
         | 
         | And even worse, the HDR video I shoot with my iPhone looks like
         | shit even when playing it back on the same phone! After a few
         | trials I had to just turn it off in the Camera app.
        
           | Forgeties79 wrote:
            | The only time I shoot HDR on anything is when I plan on
           | crushing the shadows/raising highlights after the fact. S
           | curves all the way. Get all the dynamic range you can and
           | then dial in the look. Otherwise it just looks like a flat
           | washed out mess most of the time
        
           | johncolanduoni wrote:
           | I wonder if it fundamentally only really makes sense for
           | film, video games, etc. where a person will actually tune the
           | range per scene. Plus, only when played on half decent
           | monitors that don't just squash BT.2020 so they can say HDR
           | on the brochure.
        
             | Dylan16807 wrote:
             | Even without tuning it shouldn't look _worse_ than
              | squishing to SDR at capture time. There are significant
             | ecosystem failures that could be fixed.
        
           | theshackleford wrote:
           | The HDR implementation in Windows 11 is fine. And it's not
           | even that bad in 11 in terms of titles and content officially
            | supporting HDR. Most of the idea that it's "bad" comes from
            | the "cheap monitor" part, not Windows.
           | 
           | I have zero issues and only an exceptional image on W11 with
           | a PG32UQX.
        
             | thrdbndndn wrote:
             | Good to know!
        
             | RealStickman_ wrote:
             | IIRC Windows still uses the sRGB curve for tone mapping of
             | SDR content in HDR, so you have to toggle it on and off all
             | the time.
             | 
              | KDE Wayland went the better route and uses Gamma 2.2.
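              | 
              | The difference is easiest to see near black. Quick sketch
              | of the two decoding curves (hypothetical Python; the
              | piecewise sRGB EOTF is from IEC 61966-2-1):
              | 
              |     def srgb_eotf(v):
              |         # piecewise sRGB decoding curve
              |         if v <= 0.04045:
              |             return v / 12.92
              |         return ((v + 0.055) / 1.055) ** 2.4
              | 
              |     def gamma22_eotf(v):
              |         return v ** 2.2
              | 
              |     for v in (0.05, 0.10, 0.20):
              |         print(v, srgb_eotf(v), gamma22_eotf(v))
              |     # gamma 2.2 is darker near black, so SDR mapped
              |     # with the sRGB curve looks washed out in shadows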
        
             | lwkl wrote:
             | Also if you get flashbanged by SDR content on Windows 11
             | there is a slider in HDR settings that lets you turn down
             | the brightness of SDR content. I didn't know about this at
              | first and had HDR disabled because of this for a long time.
        
         | kmeisthax wrote:
         | I would love to know who the hell thought adding "brighter than
         | white" range to HDR was a good idea. Or, even worse, who the
         | hell at Apple thought implementing that should happen by way of
         | locking UI to the standard range. Even if you have a properly
         | mastered HDR video (or image), and you've got your brightness
         | set to where it doesn't hurt to look at, it still makes all the
         | UI surrounding that image look grey. If I'm only supposed to
         | watch HDR in fullscreen, where there's no surrounding UI, then
         | maybe you should tone-map to SDR until I fullscreen the damn
         | video?
        
           | crazygringo wrote:
           | Yup, totally agreed. I said the same thing in another comment
           | -- HDR should be reserved only for full-screen stuff where
           | you want to be immersed in it, like movies and TV shows.
           | 
           | Unless you're using a video editor or something, everything
           | should just be SDR when it's within a user interface.
        
         | morshu9001 wrote:
         | HDR has a slight purpose, but the way it was rolled out was so
         | disrespectful that I just want it permanently gone everywhere.
         | Even the rare times it's used in a non-abusive way, it can hurt
         | your eyes or make things display weirdly.
        
         | baby_souffle wrote:
         | > The apps are eventually going to have to detect HDR abuse
         | 
         | The latest android release has a setting that is the HDR
         | version of "volume leveling".
        
       | resolutefunctor wrote:
       | This is really cool. Props to the team that created AV1. Very
       | impressive
        
       | tr45872267 wrote:
       | >AV1 sessions use one-third less bandwidth than both AVC and HEVC
       | 
       | Sounds like they set HEVC to higher quality then? Otherwise how
       | could it be the same as AVC?
        
         | dylan604 wrote:
         | definitely reads like "you're holding it wrong" to me as well
        
         | pornel wrote:
         | There are other possible explanations, e.g. AVC and HEVC are
         | set to the same bitrate, so AVC streams lose quality, while AV1
         | targets HEVC's quality. Or they compare AV1 traffic to the sum
         | of all mixed H.26x traffic. Or the rates vary in more complex
         | ways and that's an (over)simplified summary for the purpose of
         | the post.
         | 
         | Netflix developed VMAF, so they're definitely aware of the
         | complexity of matching quality across codecs and bitrates.
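          | 
          | An apples-to-apples version of the claim would be "bitrate at
          | matched quality", e.g. interpolating each codec's
          | rate-quality curve at a fixed VMAF (illustrative sketch only,
          | not Netflix's methodology; a real comparison would use
          | BD-rate over many titles and operating points):
          | 
          |     import numpy as np
          | 
          |     def bitrate_at_quality(kbps, vmaf, target=93.0):
          |         # interpolate bitrate needed to hit the target VMAF
          |         order = np.argsort(vmaf)
          |         kbps = np.asarray(kbps)[order]
          |         vmaf = np.asarray(vmaf)[order]
          |         return np.interp(target, vmaf, kbps)
          | 
          |     def savings_pct(ref, test, target=93.0):
          |         # ref/test: (kbps_list, vmaf_list) for each codec
          |         r = bitrate_at_quality(*ref, target)
          |         t = bitrate_at_quality(*test, target)
          |         return 100 * (1 - t / r)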
        
           | tr45872267 wrote:
           | I have no doubt they know what they are doing. But it's a
            | strange metric no matter how you slice it. Why compare AV1's
            | bandwidth to the average of h.264 and h.265, and without any
           | more details about resolution or compression ratio? Reading
           | between the lines, it sounds like they use AV1 for low
           | bandwidth and h.265 for high bandwidth and h.264 as a
           | fallback. If that is the case, why bring up this strange
           | average bandwidth comparison?
        
             | slhck wrote:
             | Yeah it's a weird comparison to be making. It all depends
             | on how they selected the quality (VMAF) target during
              | encoding. You could easily end up with other results had
             | they, say, decided to keep the bandwidth but improve
             | quality using AV1.
        
       | crazygringo wrote:
       | Wow. To me, the big news here is that ~30% of devices now support
       | AV1 hardware decoding. The article lists a bunch of examples of
       | devices that have gained it in the past few years. I had no idea
       | it was getting that popular -- fantastic news!
       | 
       | So now that h.264, h.265, and AV1 seem to be the three major
       | codecs with hardware support, I wonder what will be the next one?
        
         | JoshTriplett wrote:
         | > So now that h.264, h.265, and AV1 seem to be the three major
         | codecs with hardware support, I wonder what will be the next
         | one?
         | 
         | Hopefully AV2.
        
           | jsheard wrote:
           | H266/VVC has a five year head-start over AV2, so probably
           | that first unless hardware vendors decide to skip it
           | entirely. The final AV2 spec is due this year, so any day
            | now, but it'll take a while to make its way into hardware.
        
             | adgjlsfhk1 wrote:
             | H266 is getting fully skipped (except possibly by Apple).
             | The licensing is even worse than H265, the gains are
             | smaller, and Google+Netflix have basically guaranteed that
             | they won't use it (in favor of AV1 and AV2 when ready).
        
               | johncolanduoni wrote:
               | Did anybody, including the rightsholders, come out ahead
               | on H265? From the outside it looked like the mutually
               | assured destruction situation with the infamous mobile
               | patents, where they all end up paying lawyers to demand
               | money from each other for mostly paper gains.
        
               | tux3 wrote:
               | Why, the patent office did. There are many ideas that
               | cannot be reinvented for the next few decades, and thanks
               | to submarine patents it is simply not safe to innovate
                | without your own small regiment of lawyers.
               | 
               | This is a big victory for the patent system.
        
               | Dylan16807 wrote:
               | The patent office getting $100k or whatever doesn't sound
               | like a win for them either.
               | 
               | I'm not sure what you mean by "patent system" having a
               | victory here, but it's not that the goal of promoting
               | innovation is happening.
        
               | gary_0 wrote:
               | MBAs got to make deals and lawyers got to file lawsuits.
               | Everyone else got to give them money. God bless the
               | bureaucracy.
        
               | TitaRusell wrote:
               | For smart TVs Netflix is obviously a very important
               | partner.
        
             | kevincox wrote:
             | If it has a five year start and we've seen almost zero
             | hardware shipping that is a pretty bad sign.
             | 
             | IIRC AV1 decoding hardware started shipping within a year
             | of the bitstream being finalized. (Encoding took quite a
             | bit longer but that is pretty reasonable)
        
               | jsheard wrote:
               | https://en.wikipedia.org/wiki/Versatile_Video_Coding#Hard
               | war...
               | 
               | Yeah, that's... sparse uptake. A few smart TV SOCs have
               | it, but aside from Intel it seems that none of the major
               | computer or mobile vendors are bothering. AV2 next it is
               | then!
        
             | adzm wrote:
             | VVC is pretty much a dead end at this point. Hardly anyone
              | is using it; its benefits over AV1 are extremely minimal
             | and no one wants the royalty headache. Basically everyone
             | learned their lesson with HEVC.
        
               | ksec wrote:
                | It is being used in China and India for streaming. Brazil
                | chose it with LCEVC for their TV 3.0. The broadcasting
                | industry is also preparing for VVC. So it is not popular
                | for web and internet usage, but it is certainly not dead.
               | 
                | I am eagerly awaiting AV2 test results.
        
             | shmerl wrote:
             | When even H.265 is being dropped by the likes of Dell,
             | adoption of H.266 will be even worse making it basically
             | DOA for anything promising. It's plagued by the same
             | problems H.265 is.
        
               | SG- wrote:
               | Dell is significant in the streaming and media world?
        
               | close04 wrote:
               | Dell and HP are significant in the "devices" world and
               | they just dropped the support for HEVC hardware
               | encoding/decoding [1] to save a few cents per device. You
               | can still pay for the Microsoft add-in that does this.
               | It's not just streaming, your Teams background blur was
               | handled like that.
               | 
               | Eventually people and companies will associate HEVC with
               | "that thing that costs extra to work", and software
               | developers will start targeting AV1/2 so their software
               | performance isn't depending on whether the laptop
               | manufacturer or user paid for the HEVC license.
               | 
               | [1] https://arstechnica.com/gadgets/2025/11/hp-and-dell-
               | disable-...
        
               | nolok wrote:
                | Along the same lines, Synology dropped it on their NAS too
               | (for their video, media etc ... Even thumbnails, they ask
               | the sender device to generate one locally and send it,
               | the NAS won't do it anymore for HEVC)
        
               | shmerl wrote:
               | Also you can just use Linux, Dell / HP have no control
               | over the actual GPU for that, I think they just disabled
                | it at the Windows level. Linux has no gatekeepers for that
               | and you can use your GPU as you want.
               | 
               | But this just indicates that HEVC etc. is a dead end
               | anyway.
        
               | danudey wrote:
               | Dell is dropping it to save 4 cents per device, so users
                | will have to pay Microsoft $1 each instead. Go figure.
        
         | dylan604 wrote:
         | how does that mean "~30% of devices now support AV1 hardware
         | encoding"? I'm guessing you meant hardware decoding???
        
           | crazygringo wrote:
           | Whoops, thanks. Fixed.
        
         | dehrmann wrote:
         | Not trolling, but I'd bet something that's augmented with
         | generative AI. Not to the level of describing scenes with
         | words, but context-aware interpolation.
        
           | randall wrote:
           | for sure. macroblock hinting seems like a good place for
           | research.
        
           | km3r wrote:
           | https://blogs.nvidia.com/blog/rtx-video-super-resolution/
           | 
            | We already have some of the stepping stones for this. But
            | honestly it's much better for upscaling poor quality
            | streams; on a better quality stream it just gives things a
            | weird feeling.
        
           | afiori wrote:
           | AI embeddings can be seen as a very advanced form of lossy
           | compression
        
           | mort96 wrote:
           | I don't want my video decoder inventing details which aren't
           | there. I much rather want obvious compression artifacts than
           | a codec where the "compression artifacts" look like perfectly
           | realistic, high-quality hallucinated details.
        
             | cubefox wrote:
             | In case of many textures (grass, sand, hair, skin etc) it
             | makes little difference whether the high frequency details
             | are reproduced exactly or hallucinated. E.g. it doesn't
             | matter whether the 1262nd blade of grass from the left side
             | is bending to the left or to the right.
        
               | mort96 wrote:
               | And in the case of many others, it makes a very
               | significant difference. And a codec doesn't have enough
               | information to know.
               | 
               | Imagine a criminal investigation. A witness happened to
               | take a video as the perpetrator did the crime. In the
               | video, you can clearly see a recognizable detail on the
               | perpetrator's body in high quality; a birthmark perhaps.
               | This rules out the main suspect -- but can we trust that
               | the birthmark actually exists and isn't hallucinated?
                | Would a non-AI codec have just shown a clearly
               | compression-artifact-looking blob of pixels which can't
               | be determined one way or the other? Or would a non-AI
               | codec have contained actual image data of the birth mark
               | in sufficient detail?
               | 
               | Using AI to introduce realistic-looking details where
               | there was none before (which is what your proposed AI
               | codec inherently does) should _never_ happen
               | automatically.
        
               | cubefox wrote:
               | Maybe there could be a "hallucination rate" parameter in
               | the encoder: More hallucination would enable higher
               | subjective image quality without increased accuracy. It
               | could be used for Netflix streaming, where birthmarks and
               | other forensic details don't matter because it's all just
                | entertainment. Of course, the hallucination parameter
                | would need to be recorded somehow in the output so that
                | its reliability can be determined.
        
               | beala wrote:
               | There's an infamous case of xerox photocopiers
               | substituting in incorrect characters due to a poorly
               | tuned compression algorithm. No AI necessary.
               | 
               | https://en.wikipedia.org/wiki/JBIG2#:~:text=Character%20s
               | ubs...
        
               | mort96 wrote:
               | Yeah, I had that case in mind actually. It's a perfect
               | illustration of why compression artifacts should be
               | obvious and not just realistic-looking hallucinations.
        
               | mapt wrote:
               | > a codec doesn't have enough information to know.
               | 
               | The material belief is that modern trained neural network
               | methods that improve on ten generations of variations of
               | the discrete cosine transform and wavelets, can bring a
               | codec from "1% of knowing" to "5% of knowing". This is
               | broadly useful. The level of abstraction does not need to
               | be "The AI told the decoder to put a finger here", it may
               | be "The AI told the decoder how to terminate the wrinkle
               | on a finger here". An AI detail overlay. As we go from
               | 1080p to 4K to 8K and beyond we care less and less about
               | individual small-scale details being 100% correct, and
               | there are representative elements that existing
               | techniques are just really bad at squeezing into higher
               | compression ratios.
               | 
               | I don't claim that it's ideal, and the initial results
               | left a lot to be desired in gaming (where latency and
               | prediction is a Hard Problem), but AI upscaling is
               | already routinely used for scene rips of older videos
               | (from the VHS Age or the DVD Age), and it's clearly going
               | to happen inside of a codec sooner or later.
        
               | mort96 wrote:
               | I'm not saying it's not going to happen. I'm saying it's
               | a terrible idea.
               | 
               | AI upscaling built in to video players isn't a problem,
               | as long as you can view the source data by disabling AI
               | upscaling. The human is in control.
               | 
               | AI upscaling and detail hallucination built in to video
               | _codecs_ is a problem.
        
               | mapt wrote:
               | The entire job of a codec is subjectively authentic, but
               | lossy compression. AI is our best and in some ways
               | easiest method of lossy compression. All lossy
               | compression produces artifacts; JPEG macroblocks are
               | effectively a hallucination, albeit one that is
               | immediately identifiable because it fails to simulate
               | anything else we're familiar with.
               | 
               | AI compression doesn't have to be the level of
               | compression that exists in image generation prompts,
               | though. A SORA prompt might be 500 bits (~1 bit per
               | character natural English), while a decompressed 4K frame
               | that you're trying to bring to 16K level of simulated
               | detail starts out at 199 million bits. It can be a much
               | finer level of compression.
        
               | amiga386 wrote:
               | > And in the case of many others, it makes a very
               | significant difference.
               | 
               | This is very true, but we're talking about an
               | entertainment provider's choice of codec for streaming to
               | millions of subscribers.
               | 
               | A security recording device's choice of codec ought to be
               | very different, perhaps even regulated to exclude codecs
               | which could "hallucinate" high-definition detail not
               | present in the raw camera data, and the limitations of
               | the recording media need to be understood by law
               | enforcement. We've had similar problems since the
               | introduction of tape recorders, VHS and so on, they
                | always need to be worked out. Even the Phantom of
                | Heilbronn
               | (https://en.wikipedia.org/wiki/Phantom_of_Heilbronn)
               | turned out to be DNA contamination of swabs by someone
               | who worked for the swab manufacturer.
        
               | mort96 wrote:
               | I don't understand why it needs to be a part of the
               | codec. Can't Netflix use relatively low
               | bitrate/resolution AV1 and then use AI to upscale or add
               | back detail in the player? Why is this something we want
               | to do _in the codec_ and therefore set in stone with
               | standard bodies and hardware implementations?
        
               | amiga386 wrote:
               | We're currently indulging a hypothetical, the idea of AI
               | being used to either improve the quality of streamed
               | video, or provide the same quality with a lower bitrate,
                | so the focus is on what both ends of the codec could
               | agree on.
               | 
               | The coding side of "codec" needs to know what the
               | decoding side would add back in (the hypothetical AI
               | upscaling), so it knows where it can skimp and get a good
               | "AI" result anyway, versus where it has to be generous in
               | allocating bits because the "AI" hallucinates too badly
               | to meet the quality requirements. You'd also want it
               | specified, so that any encoding displays the same on any
               | decoder, and you'd want it in hardware as most devices
               | that display video rely on dedicated decoders to play it
                | at full frame rate and/or not drain their battery. If
               | it's not in hardware, it's not going to be adopted. It is
               | possible to have different encodings, so a "baseline"
               | encoding could leave out the AI upscaler, at the cost of
               | needing a higher bitrate to maintain quality, or
               | switching to a lower quality if bitrate isn't there.
               | 
               | Separating out codec from upscaler, and having a
               | deliberately low-resolution / low-bitrate stream be
               | naively "AI upscaled" would, IMHO, look like shit. It's
               | already a trend in computer games to render at lower
               | resolution and have dedicated graphics card hardware "AI
               | upscale" (DLSS, FSR, XeSS, PSSR), because 4k resolutions
               | are just too much work to render modern graphics
                | consistently at 60fps. But the result, IMHO, noticeably
               | and distractingly glitches and errors all the time.
        
           | cubefox wrote:
           | Neural codecs are indeed the future of audio and video
           | compression. A lot of people / organizations are working on
           | them and they are close to being practical. E.g.
           | https://arxiv.org/abs/2502.20762
        
         | snvzz wrote:
         | >So now that h.264, h.265, and AV1 seem to be the three major
         | codecs with hardware support
         | 
         | That'd be h264 (associated patents expired in most of the
         | world), vp9 and av1.
         | 
         | h265 aka HEVC is less common due to dodgy, abusive licensing.
         | Some vendors even disable it with drivers despite hardware
         | support because it is nothing but legal trouble.
        
           | ladyanita22 wrote:
           | I have the feeling that H265 is more prevalent than VP9
        
         | thrdbndndn wrote:
         | I'm not too surprised. It's similar to the metric that "XX% of
         | Internet is on IPv6" -- it's almost entirely driven by mobile
         | devices, specifically phones. As soon as both mainstream
         | Android and iPhones support it, the adoption of AV1 should be
         | very 'easy'.
         | 
         | (And yes, even for something like Netflix lots of people
         | consume it with phones.)
        
         | 0manrho wrote:
         | > To me, the big news here is that ~30% of devices now support
         | AV1 hardware decoding
         | 
         | Where did it say that?
         | 
         | > AV1 powers approximately 30% of all Netflix viewing
         | 
         | Is admittedly a bit non-specific, it could be interpreted as
         | 30% of users or 30% of hours-of-video-streamed, which are very
         | different metrics. If 5% of your users are using AV1, but that
         | 5% watches far above the average, you can have a minority
         | userbase with an outsized representation in hours viewed.
         | 
         | I'm not saying that's the case, just giving an example of how
         | it doesn't necessarily translate to 30% of devices using
         | Netflix supporting AV1.
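          | 
          | Quick made-up numbers to show how far apart the two readings
          | can be:
          | 
          |     # hypothetical: 5% of users on AV1 who watch
          |     # 8x as many hours as the average user
          |     share_of_users = 0.05
          |     relative_viewing = 8
          |     share_of_viewing = (
          |         share_of_users * relative_viewing
          |         / (share_of_users * relative_viewing
          |            + (1 - share_of_users)))
          |     print(share_of_viewing)  # ~0.30 -> ~30% of viewing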
         | 
         | Also, the blog post identifies that there is an
         | effective/efficient software decoder, which allows people
         | without hardware acceleration to still view AV1 media in some
          | cases (the case they defined was Android-based phones). So
          | that kinda complicates what "X% of devices support AV1
          | playback" means, as it doesn't necessarily mean they have
          | hardware decoding.
        
           | sophiebits wrote:
           | "30% of viewing" I think clearly means either time played or
           | items played. I've never worked with a data team that would
           | possibly write that and mean users.
           | 
           | If it was a stat about users they'd say "of users", "of
           | members", "of active watchers", or similar. If they wanted to
           | be ambiguous they'd say "has reached 30% adoption" or
           | something.
        
             | 0manrho wrote:
             | Agreed, but this is the internet, the ultimate domain of
             | pedantry, and they didn't say it explicitly, so I'm not
             | going to put words in their mouth just to have a circular
             | discussion about why I'm claiming they said something they
             | didn't technically say, which is why I asked "Where did it
             | say that" at the very top.
             | 
             | Also, either way, my point was and still stands: it doesn't
              | say 30% of devices have hardware decoding.
        
             | csdreamer7 wrote:
              | I am not in data science so I cannot validate your
              | comment, but I would assume "30% of viewing" means users or
              | unique/discrete viewing sessions and not watched minutes. I
             | would appreciate it if Netflix would clarify.
        
           | endorphine wrote:
           | In either case, it is still big news.
        
           | cogman10 wrote:
           | That was one of the best decisions of AOMedia.
           | 
           | AV1 was specifically designed to be friendly for a hardware
           | decoder and that decision makes it friendly to software
           | decoding. This happened because AOMedia got hardware
           | manufacturers on the board pretty early on and took their
           | feedback seriously.
           | 
           | VP8/9 took a long time to get decent hardware decoding and
           | part of the reason for that was because the stream was more
           | complex than the AV1 stream.
        
             | Neywiny wrote:
             | Hmmm disagree on your chain there. Plenty of easy hardware
             | algorithms are hard for software. For example, in hardware
             | (including FPGAs), bit movement/shuffling is borderline
             | trivial if it's constant, while in software you have to
             | shift and mask and or over and over. In hardware you
             | literally just switch which wire is connected to what on
             | the next stage. Same for weird bit widths. Hardware doesn't
             | care (too much) if you're operating on 9 bit quantities or
             | 33 or 65. Software isn't that granular and often you'll
             | double your storage and waste a bunch.
             | 
             | I think they certainly go hand in hand in that algorithms
             | relatively easier for software vs previously are easier for
             | hardware vs previously and vice versa, but they are good at
             | different things.
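              | 
              | e.g. pulling odd-width fields out of a bitstream in
              | software is all shifting and masking (quick hypothetical
              | Python sketch, MSB-first):
              | 
              |     def read_bits(buf, pos, width):
              |         # shift/mask; this is just wiring in hardware
              |         value = 0
              |         for i in range(width):
              |             byte = buf[(pos + i) // 8]
              |             bit = (byte >> (7 - (pos + i) % 8)) & 1
              |             value = (value << 1) | bit
              |         return value
              | 
              |     data = bytes([0b10110011, 0b01011100])
              |     # a 9-bit field followed by a 3-bit field
              |     print(read_bits(data, 0, 9),
              |           read_bits(data, 9, 3))  # -> 358 5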
        
               | cogman10 wrote:
               | I'm not claiming that software will be more efficient.
               | I'm claiming that things that make it easy to go fast in
               | hardware make it easy to go fast in software.
               | 
               | Bit masking/shifting is certainly more expensive in
               | software, but it's also about the cheapest software
               | operation. In most cases it's a single cycle transform.
               | In the best cases, it's something that can be done with
               | some type of SIMD instruction. And in even better cases,
               | it's a repeated operation which can be distributed across
               | the array of GPU vector processors.
               | 
               | What kills both hardware and software performance is data
               | dependency and conditional logic. That's the sort of
               | thing that was limited in the AV1 stream.
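                | 
                | Toy example of the "conditional logic" point
                | (hypothetical Python): a branchy per-element decision
                | hurts, while the same thing written as a branch-free
                | select vectorizes easily.
                | 
                |     import numpy as np
                | 
                |     x = np.arange(8)
                |     # per-element branch: poison for SIMD/hardware
                |     slow = [v * 2 if v % 2 else v + 1 for v in x]
                |     # same values as a branch-free select
                |     fast = np.where(x % 2, x * 2, x + 1)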
        
             | galad87 wrote:
             | All I read about is that it's less hardware friendly than
             | H.264 and HEVC, and they were all complaining about it. AV2
             | should be better in this regard.
             | 
              | Where did you read that it was designed to make creating a
              | hardware decoder easier?
        
               | cogman10 wrote:
               | It was a presentation on AV1 before it was released. I'll
               | see if I can find it but I'm not holding my breath. It's
               | mostly coming from my own recollection.
               | 
               | Ok, I don't think I'll find it. I think I'm mostly just
               | regurgitating what I remember watching at one of the
               | research symposiums. IDK which one it was unfortunately
               | [1]
               | 
               | [1]
               | https://www.youtube.com/@allianceforopenmedia2446/videos
        
               | danudey wrote:
               | I've heard that same anecdote before, that hardware
               | decoding was front of mind. Doesn't mean that you (we)
               | are right, but at least if you're hallucinating it's not
               | just you.
        
         | vitorgrs wrote:
         | I mean... I bought a Samsung TV in 2020, and it already
         | supported AV1 HW decoding.
         | 
         | 2020 feels close, but that's 5 years.
        
           | cubefox wrote:
           | Two years ago I bought a Snapdragon 8+ Gen 1 phone (TSMC 4nm,
           | with 12 GB LPDDR RAM, 256 GB NAND flash, and a 200 megapixel
           | camera). It still feels pretty modern but it has no AV1
           | support.
        
           | usrusr wrote:
           | Is that supposed to be long-lived for a TV?
           | 
           | I'm running an LG initially released in 2013 and the only
           | thing I'm not happy with is that about a year ago Netflix
           | ended their app for that hardware generation (likely for
           | phasing out whatever codec it used). Now I'm running that
           | unit behind an Amazon fire stick and the user experience is
           | so much worse.
           | 
           | (that LG was a "smart" TV from before they started
           | enshittifying, such a delight - had to use and set up a
           | recent LG once on a family visit and it was even worse than
           | the fire stick, omg, _so_ much worse!)
        
             | windexh8er wrote:
             | If, by chance, you're not running the latest version
             | RootMyTV [0] may be an option. Or downgrade might still be
             | an option [1].
             | 
             | [0] https://github.com/RootMyTV/RootMyTV.github.io [1]
             | https://github.com/throwaway96/downgr8
        
             | StilesCrisis wrote:
             | Fire Stick is the most enshittified device (which is why it
             | was so cheap). AppleTV is fantastic if you're willing to
             | spend $100. You don't need the latest gen; previous gen are
             | just as good.
        
             | Dylan16807 wrote:
             | > Is that supposed to be long-lived for a TV?
             | 
             | I don't see anything in that comment implying such a thing.
             | It's just about the uptake of decoders.
        
         | alex_duf wrote:
         | That's not at all how I read it.
         | 
          | They mentioned they delivered a software decoder on Android
         | first, then they also targeted web browsers (presumably through
         | wasm). So out of these 30%, a good chunk of it is software not
         | hardware.
         | 
         | That being said, it's a pretty compelling argument for phone
         | and tv manufacturers to get their act together, as Apple has
         | already done.
        
           | danudey wrote:
           | This is something that infuriates me to no end - companies
           | forcing software decoding on my devices rather than shipping
           | me a codec my device supports.
           | 
           | When I'm watching something on YouTube on my iPhone, they're
           | usually shipping me something like VP9 video which requires a
           | software decoder; on a sick day stuck in bed I can burn
           | through ten percent of my battery in thirty minutes.
           | 
           | Meanwhile, if I'm streaming from Plex, all of my media is
           | h264 or h265 and I can watch for hours on the same battery
           | life.
        
         | mort96 wrote:
         | > So now that h.264, h.265, and AV1 seem to be the three major
         | codecs with hardware support, I wonder what will be the next
         | one?
         | 
         | Hopefully, we can just stay on AV1 for a long while. I don't
         | feel any need to obsolete all the hardware that's now finally
         | getting hardware decoding support for AV1.
        
       | Eduard wrote:
       | I'm surprised AV1 usage is only at 30%. Is AV1 so demanding that
       | Netflix clients without AV1 hardware acceleration capabilities
       | would be overwhelmed by it?
        
         | adgjlsfhk1 wrote:
         | There are a lot of 10 year old TVs/fire sticks still in use
         | that have a CPU that maxes out running the UI and rely
          | exclusively on hardware decoding for all codecs (i.e. they
          | couldn't software decode h264 either). Imagine a super budget
          | phone from ~2012 and you'll have some idea of the hardware
          | capability we're dealing with.
        
         | FrostKiwi wrote:
          | Thanks to libdav1d's [1] lovingly hand-crafted SIMD assembly
          | it's actually possible to play back AV1 reasonably well
          | without hardware acceleration, but basically yes: hardware
          | decoding starts from the Snapdragon 8 onwards, Google Tensor
          | G3 onwards, and the NVIDIA RTX 3000 series onwards. All
          | relatively new.
         | 
         | [1] https://code.videolan.org/videolan/dav1d
        
           | snvzz wrote:
           | Even RISC-V vector assembly[0].
           | 
           | 0. https://code.videolan.org/videolan/dav1d/-/issues/435
        
           | jeffparsons wrote:
           | It's possible without specific hardware acceleration, but
           | murderous for mobile devices.
        
         | eru wrote:
         | If you are on a mobile device, decoding without hardware
         | assistance might not overwhelm the processors directly, but it
         | might drain your battery unnecessarily fast?
        
         | boterock wrote:
         | tv manufacturers don't want high end chips for their tv sets...
         | hardware decoding is just a way to make cheaper chips for tvs.
        
         | johncolanduoni wrote:
         | Compression gains will mostly be for the benefit of the
         | streaming platform's bills/infra unless you're trying to stream
         | 4K 60fps on hotel wifi (or if you can't decode last-gen codecs
          | on hardware either). Apparently streaming platforms still
         | favor user experience enough to not heat their rooms for no
         | observable improvement. Also a TV CPU can barely decode a PNG
         | still in software - video decoding of any kind is simply
         | impossible.
        
           | solarkraft wrote:
           | > Apparently streaming platforms still favor user experience
           | enough to not heat their rooms for no observable improvement
           | 
           | It's more like "why does Netflix kill my battery within an
           | hour when I used to be able to play for 20"
        
         | dd_xplore wrote:
         | They would be served h.265
        
         | MaxL93 wrote:
         | I'd love to watch Netflix AV1 streams but they just straight up
         | don't serve it to my smart TV or my Windows computers despite
         | hardware acceleration support.
         | 
         | The only way I can get them to serve me an AV1 stream is if I
         | block "protected content IDs" through browser site settings.
         | Otherwise they're giving me an H.264 stream... It's really
         | silly, to say the least
        
         | solarkraft wrote:
         | Absolutely. Playing back any video codec is a terrible
         | experience without acceleration.
        
       | ls612 wrote:
       | On a related note, why are release groups not putting out AV1
       | WEB-DLs? Most 4K stuff is h265 now but if AV1 is supplied without
       | re-encoding surely that would be better?
        
         | Dwedit wrote:
         | Because pirates are unaffected by the patent situation with
         | H.265.
        
           | ls612 wrote:
           | But isn't AV1 just better than h.265 now regardless of the
           | patents? The only downside is limited compatibility.
        
             | BlaDeKke wrote:
             | Encoding my 40TB library to AV1 with software encoding
              | without losing quality would take more than a year, if
              | not multiple years, and consume lots of power while doing
              | it, all to save a little bit of storage. Granted, after a
              | year of non-stop encoding I would save a few TB of space.
              | But I think it is cheaper to buy a new 20TB hard drive
              | than to pay for the electricity used for the encoding.
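              | 
              | (Rough back-of-the-envelope, for illustration only; the
              | wattage, electricity price and drive cost below are
              | assumptions, not figures from this thread:)
              | 
              |   # Is a year of software AV1 encoding cheaper than a new drive?
              |   ENCODE_POWER_W = 150    # assumed extra CPU draw, 24/7
              |   HOURS_PER_YEAR = 24 * 365
              |   PRICE_PER_KWH = 0.30    # assumed electricity price, USD
              |   SPACE_SAVED_TB = 8      # assumed savings on a 40TB library
              |   DRIVE_COST_PER_TB = 15  # assumed bulk HDD price, USD/TB
              | 
              |   energy_kwh = ENCODE_POWER_W * HOURS_PER_YEAR / 1000
              |   electricity_usd = energy_kwh * PRICE_PER_KWH
              |   storage_usd = SPACE_SAVED_TB * DRIVE_COST_PER_TB
              |   print(f"electricity: ~${electricity_usd:.0f}")  # ~$394
              |   print(f"space saved: ~${storage_usd:.0f}")      # ~$120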
        
             | phantasmish wrote:
             | I avoid av1 downloads when possible because I don't want to
             | have to figure out how to disable film grain synthesis and
             | then deal with whatever damage that causes to apparent
             | quality on a video that was encoded with it in mind. Like I
             | just don't want any encoding that supports that, if I can
             | stay away from it.
        
               | coppsilgold wrote:
               | In MPV it's just "F1 vf toggle format:film-grain=no" in
               | the input config. And I prefer AV1 because of this,
               | almost everything looks better without that noise.
               | 
               | You can also include "vf=format:film-grain=no" in the
               | config itself to start with no film grain by default.
        
               | phantasmish wrote:
               | I watch almost everything in Infuse on Apple TV or in my
               | browser, though.
        
               | adgjlsfhk1 wrote:
               | What's wrong with film grain synthesis? Most film grain
               | in modern films is "fake" anyway (The modern VFX pipeline
               | first removes grain, then adds effects, and lastly re-
               | adds fake grain), so instead of forcing the codec to try
               | to compress lots of noise (and end up blurring lots of it
               | away), we can just have the codec encode the noisless
               | version and put the noise on after.
        
               | phantasmish wrote:
               | I watch a lot of stuff from the first 110ish years of
               | cinema. For the most recent 25, and especially 15... yeah
               | I dunno, maybe, but easier to just avoid it.
               | 
               | I do sometimes end up with av1 for streaming-only stuff,
               | but most of that looks like shit anyway, so some (more)
               | digital smudging isn't going to make it much worse.
        
               | adgjlsfhk1 wrote:
               | Even for pre-digital era movies, you want film grain. You
               | just want it done right (which not many places do to be
               | fair).
               | 
               | The problem you see with AV1 streaming isn't the film
               | grain synthesis; it's the bitrate. Netflix is using film
               | grain synthesis to save bandwidth (e.g. 2-5mbps for
                | 1080p, ~20mbps for 4k); 4K Blu-ray is closer to 100mbps.
               | 
               | If the AV1+FGS is given anywhere close to comparable
               | bitrate to other codecs (especially if it's encoding from
               | a non-compressed source like a high res film scan), it
               | will absolutely demolish a codec that doesn't have FGS on
               | both bitrate and detail. The tech is just getting a bad
               | rap because Netflix is aiming for minimal cost to deliver
               | good enough rather than maximal quality.
        
               | Wowfunhappy wrote:
               | With HEVC you just don't have the option to disable film
               | grain because it's burned into the video stream.
        
               | phantasmish wrote:
               | I'm not looking to disable film grain, if it's part of
               | the source.
        
               | Mashimo wrote:
               | Does AV1 add it if it's not part of the source?
        
               | phantasmish wrote:
               | I dunno, but if there is grain in the source it may erase
               | it (discarding information) then invent new grain (noise)
               | later.
        
               | Wowfunhappy wrote:
               | I'm skeptical of this (I think they avoid adding grain to
               | the AV1 stream which they add to the other streams--of
               | course all grain is artificial in modern times), but even
               | if true--like, all grain is noise! It's random noise from
               | the sensor. There's nothing magical about it.
        
               | phantasmish wrote:
               | The grain's got randomness because distribution and size
               | of grains is random, but it's not noise, it's the
               | "resolution limit" (if you will) of the picture itself.
               | The whole picture _is_ grain. The film _is_ grain.
               | Displaying that is accurately displaying the picture.
               | Erasing it for compression's sake is tossing out
               | information, and adding it back later is just an effect
               | to add noise.
               | 
               | I'm ok with that for things where I don't care that much
               | about how it looks (do I give a shit if I lose just a
               | little detail on Happy Gilmore? Probably not) and agree
               | that faking the grain probably gets you a closer look to
               | the original if you're gonna erase the grain for better
               | compression, but if I want actual high quality for a film
               | source then faked grain is no good, since if you're
               | having to fake it you definitely already sacrificed a lot
               | of picture quality (because, again, the grain _is_ the
               | picture, you only get rid of it by discarding information
               | from the picture)
        
               | Wowfunhappy wrote:
               | If you're watching something from the 70s, sure. I would
               | hope synthesized grain isn't being used in this case.
               | 
               | But for anything modern, the film grain was likely added
               | during post-production. So it really is just random
               | noise, and there's no reason it can't be recreated (much
               | more efficiently) on the client-side.
        
             | bubblethink wrote:
              | HW support for AV1 is still behind H265. There's a lot of
              | 5-10 year old hardware that can play H265 but not AV1.
              | Second, there is also a split between Dolby Vision and
              | HDR10(+). Is AV1 + Dolby Vision a thing? Blu-rays are
              | obviously H265. Overall, H265 is the common denominator
              | for all UHD content.
        
               | HelloUsername wrote:
               | > Blu rays are obviously h265
               | 
               | Most new UHD, yes, but otherwise BRD primarily use
               | h264/avc
        
           | homebrewer wrote:
           | Everyone is affected by that mess, did you miss the recent
           | news about Dell and HP dropping HEVC support in hardware they
           | have already shipped? Encoders might not care about legal
           | purity of the encoding process, but they do have to care
           | about how it's going to be decoded. I like using proper
           | software to view my videos, but it's a rarity afaik.
        
         | avidiax wrote:
         | I looked into this before, and the short answer is that release
         | groups would be allowed to release in AV1, but the market seems
         | to prefer H264 and H265 because of compatibility and release
         | speed. Encoding AV1 to an archival quality takes too long,
         | reduces playback compatibility, and doesn't save that much
         | space.
         | 
          | There are also no scene rules for AV1, only for H265 [1].
         | 
         | [1] https://scenerules.org/html/2020_X265.html
        
           | ls612 wrote:
           | Yeah I'm talking about web-dl though not a rip so there is no
           | encoding necessary.
        
           | breve wrote:
           | > _Encoding AV1 to an archival quality takes too long_
           | 
           | With the SVT-AV1 encoder you can achieve better quality in
           | less time versus the x265 encoder. You just have to use the
           | right presets. See the encoding results section:
           | 
           | https://www.spiedigitallibrary.org/conference-proceedings-
           | of...
        
             | dd_xplore wrote:
              | Yeah, is there any good (and simple) guide for SVT-AV1
              | settings? I tried to convert a lot of my stuff to it but
              | you really need to put in a lot of time to figure out the
              | correct settings for your media, and it becomes more
              | difficult if your media is in mixed formats, encodings,
              | etc.
        
               | Drybones wrote:
               | I do a lot of AV1 encoding. Here's a couple of guides for
               | encoding with SVT-AV1 from enthusiast encoding groups:
               | 
               | https://wiki.x266.mov/docs/encoders/SVT-AV1
               | 
               | https://jaded-encoding-thaumaturgy.github.io/JET-
               | guide/maste...
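                | 
                | As a starting point (not a prescription from those
                | guides; the preset and CRF values below are just
                | assumptions to tune per source), a minimal
                | ffmpeg/libsvtav1 call wrapped in Python might look
                | like this:
                | 
                |   import subprocess
                | 
                |   def encode_av1(src, dst, preset=6, crf=28):
                |       # preset: speed/quality trade-off; crf: quality target
                |       subprocess.run([
                |           "ffmpeg", "-i", src,
                |           "-c:v", "libsvtav1",
                |           "-preset", str(preset),
                |           "-crf", str(crf),
                |           "-pix_fmt", "yuv420p10le",  # 10-bit output
                |           "-c:a", "copy",             # keep audio as-is
                |           dst,
                |       ], check=True)
                | 
                |   encode_av1("input.mkv", "output.av1.mkv")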
        
           | aidenn0 wrote:
           | I'm surprised it took so long for CRF to dethrone 2-pass. We
           | used to use 2-pass primarily so that files could be made to
           | fit on CDs.
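            | 
            | (2-pass was mostly about hitting a fixed file size; CRF just
            | targets a constant quality and lets the size float. A quick
            | sketch of the old arithmetic, with an assumed runtime and
            | audio bitrate:)
            | 
            |   # Video bitrate needed to fit a film onto one 700 MB CD.
            |   target_bytes = 700 * 1024 * 1024
            |   duration_s = 110 * 60   # assumed ~110-minute film
            |   audio_kbps = 128        # assumed audio track bitrate
            |   total_kbps = target_bytes * 8 / duration_s / 1000
            |   video_kbps = total_kbps - audio_kbps
            |   print(f"pass-2 video bitrate: ~{video_kbps:.0f} kbps")  # ~762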
        
           | MaxL93 wrote:
           | AV1 is the king of ultra-low bitrates, but as you go higher
           | -- and not even that much higher -- HEVC becomes just as
            | good, if not better. Publicly available AV1 encoders (still)
           | have a tendency to over-flatten anything that is low-contrast
           | enough, while x265 is much better at preserving visual
           | energy.
           | 
           | This problem is only just now starting to get solved in SVT-
           | AV1 with the addition of community-created psychovisual
           | optimizations... features that x264 had over 15 years ago!
        
         | chrisfosterelli wrote:
         | Player compatibility. Netflix can use AV1 and send it to the
         | devices that support it while sending H265 to those that don't.
         | A release group puts out AV1 and a good chunk of users start
         | avoiding their releases because they can't figure out why it
         | doesn't play (or plays poorly).
        
         | mrbluecoat wrote:
         | h.264 has near-universal device support and almost no playback
          | issues at the expense of slightly larger file sizes. h.265
         | and av1 give you 10-bit 4K but playback on even modest laptops
         | can become choppy or produce render artifacts. I tried all
         | three, desperately wanting av1 to win but Jellyfin on a small
         | streaming server just couldn't keep up.
        
         | aidenn0 wrote:
         | I'm not in the scene anymore, but for my own personal encoding,
         | at higher quality settings, AV1 (rav1e or SVT; AOM was crazy
         | slow) doesn't significantly beat out x265 for most sources.
         | 
         | FGS makes a _huge_ difference at moderately high bitrates for
         | movies that are very grainy, but many people seem to _really_
         | not want it for HQ sources (see sibling comments). With FGS
          | off, it's hard to find any sources that benefit at bitrates
         | that you will torrent rather than stream.
        
         | hapticmonkey wrote:
         | I've seen some on private sites. My guess is they are not
         | popular enough yet. Or pirates are using specific hardware to
         | bypass Widevine encryption (like an Nvidia Shield and burning
         | keys periodically) that doesn't easily get the AV1 streams.
        
         | qingcharles wrote:
         | I'm seeing releases pop up on Pirate Bay with AV1 this year.
        
         | Drybones wrote:
         | Smaller PT sites usually allow it
         | 
         | Bigger PT sites with strict rules do not allow it yet and are
          | actively discussing/debating it. Netflix Web-DLs being AV1 is
          | definitely pushing that. The codec has to be a selectable
         | option during upload.
        
       | bofaGuy wrote:
       | Netflix has been the worst performing and lowest quality video
       | stream of any of the streaming services. Fuzzy video, lots of
        | visual noise and artifacts. Just plain bad, and this is on the
        | 4K plan, on 1Gb fiber, on a 4K Apple TV. I can literally tell when
       | someone is watching Netflix without knowing because it looks like
       | shit.
        
         | mtoner23 wrote:
          | Probably some function of your location relative to data
          | centers. I find HBO Max to be abysmal these days. But I've
          | learned to just stop
         | caring about this stuff since no one else in my life does
        
           | jiggawatts wrote:
           | https://xkcd.com/1015/
           | 
           | Now you can be mad about two things nobody else notices.
        
         | odo1242 wrote:
         | This is actually their DRM speaking. If you watch it on a Linux
         | device or basically anything that isn't a smart TV on the
         | latest OS, they limit you to a 720p low bitrate stream, even if
          | you pay for 4K. (See Louis Rossmann's video on the topic.)
        
           | jsheard wrote:
           | OP said they're using an Apple TV, which most definitely
           | supports the 4K DRM.
        
             | array_key_first wrote:
             | The bit rate is unfortunately crushed to hell and back,
             | leading to blockiness on 4K.
        
           | misiek08 wrote:
            | I have the same experience as OP on the newest ATV 4K. Good
            | to know it's not only me wondering how it's possible that
            | they describe such great approaches to encoding, yet the
            | final result is just so bad.
            | 
            | Good that the OCAs really work and are very inspiring in
            | the content delivery domain.
        
         | mapontosevenths wrote:
         | It's not AV1's fault though, I'm pretty sure it's that they
         | cheap out on the bitrate. Apple is among the highest bitrates
         | (other than Sony's weird hardware locked streaming service).
         | 
         | I actually blamed AV1 for the macro-blocking and generally
         | awful experience of watching horror films on Netflix for a long
         | time. Then I realized other sources using AV1 were better.
         | 
          | If you press ctrl-alt-shift-d while the video is playing
          | you'll see that most of the time the bitrate is appallingly low,
         | and also that Netflix plays their own original content using
         | higher bitrate HEVC rather than AV1.
         | 
         | That's because they actually want it to look good. For partner
         | content they often default back to lower bitrate AV1, because
         | they just don't care.
        
         | mulderc wrote:
         | I also find Netflix video quality shockingly bad and oddly
         | inconsistent. I think they just don't prioritize video quality
          | in the same way as, say, Apple or Disney do.
        
         | pcchristie wrote:
         | I cancelled Netflix for this exact reason. 4K Netflix looks
          | worse than 720p YouTube, yet I pay (paid) for Netflix 4K, and at
         | roughly 2x what I paid for Netflix when it launched. It's
         | genuinely a disgrace how they can even claim with a straight
         | face that you're actually watching 4K. The last price rise was
         | the tipping point and I tapped out after 11 years.
        
         | not_a_bot_4sho wrote:
         | Oddly enough, I observe something to the opposite effect.
         | 
         | I wonder if it has more to do with proximity to edge delivery
         | nodes than anything else.
        
         | prhn wrote:
         | Netflix on Apple TV has an issue if "Match Content" is "off"
         | where it will constantly downgrade the video stream to a lower
         | bitrate unnecessarily.
         | 
         | Even fixing that issue the video quality is never great
         | compared to other services.
        
         | bombela wrote:
         | Yep, and they also silently downgrade resolution and audio
          | channels based on an ever-changing and hidden list of
          | browsers/OSes/devices over time.
         | 
         | Meanwhile pirated movies are in Blu-ray quality, with all audio
         | and language options you can dream of.
        
         | deanylev wrote:
         | I was able to improve things somewhat by going to
         | https://www.netflix.com/settings/playback/<myprofileid> and
         | changing "Data usage per screen" from Auto to High
        
       | conartist6 wrote:
       | For a second there I wasn't looking very close and I thought it
       | said that 30% of Netflix was running on .AVI files
        
       | shmerl wrote:
        | Qualcomm seems to be lagging behind and doesn't have an AV1
        | decoder except in high-end SoCs.
        
       | notatoad wrote:
       | I understand that sometimes the HN titles get edited to be less
       | descriptive and more generic in order to match the actual article
       | title.
       | 
        | What's the logic with changing the title here from the actual
        | article title it was originally submitted with, "AV1 -- Now
        | Powering 30% of Netflix Streaming", to the generic and not at
        | all representative title it currently has, "AV1: a modern open
        | codec"? That is neither the article title nor representative of
        | the article content.
        
         | 7e wrote:
         | Also, it's not the whole picture. AV1 is open because it didn't
         | have the good stuff (newly patented things) and as such I also
         | wouldn't say it's the most modern.
        
           | bawolff wrote:
           | Just because something is patented doesn't necessarily mean
            | it's good. I think head-to-head comparisons matter more.
            | (Admittedly I don't know how AV1 holds up.)
        
             | parl_match wrote:
             | Yes, but in this case, it does.
             | 
             | AV1 is good enough that the cost of not licensing might
             | outweigh the cost of higher bandwidth. And it sounds like
             | Netflix agrees with that.
        
           | adgjlsfhk1 wrote:
            | AV1 has plenty of good stuff. AOM (the alliance that developed
           | AV1) has a patent pool
           | https://www.stout.com/en/insights/article/sj17-the-
            | alliance-... comprising video hardware/software patents
           | from Netflix, Google, Nvidia, Arm, Intel, Microsoft, Amazon
           | and a bunch of other companies. AV1 has a bunch of patents
           | covering it, but also has a guarantee that you're allowed to
           | use those patents as you see fit (as long as you don't sue
           | AOM members for violating media patents).
           | 
           | AV1 definitely is missing some techniques patented by h264
           | and h265, but AV2 is coming around now that all the h264
           | innovations are patent free (and now that there's been
           | another decade of research into new cutting edge techniques
           | for it).
        
         | VerifiedReports wrote:
         | Amen. The mania for obscurity in titles here is infuriating.
         | This one is actually replete with information compared to many
         | you see on the front page.
        
           | CyberDildonics wrote:
           | hacker news loves low information click bait titles. The
           | shorter and more vague the better.
        
           | tomhow wrote:
           | If there really was a "mania for obscurity in titles" we'd
           | see a lot more complaints than we do.
           | 
           | Our title policy is pretty simple and attuned for maximum
           | respect to the post's author/publisher and the HN audience.
           | 
           | We primarily just want to retain the title that was chosen by
           | the author/publisher, because it's their work and they are
           | entitled to have such an important part of their work
           | preserved.
           | 
           | The only caveat is that if the title is baity or misleading,
           | we'll edit it, but only enough that it's no longer baity or
           | misleading. That's because clickbait and misleading titles
           | are disrespectful to the audience.
           | 
           | Any time you see a title that doesn't conform to these
           | principles, you're welcome to email us and ask us to review
           | it. Several helpful HN users do this routinely.
        
         | pants2 wrote:
         | Though in the original title AV1 could be anything if you don't
         | know it's a codec. How about:
         | 
         | "AV1 open video codec now powers 30% of Netflix viewing, adds
         | HDR10+ and film grain synthesis"
        
           | nerdsniper wrote:
           | AV1 is fine as-is. Plenty of technical titles on HN would
           | need to be googled if you didn't know it. Even in yours,
           | HDR10+ "could be anything if you don't know it". Play this
            | game if you want, but it's unwinnable. The only people who
           | care about AV1 already know what it is.
        
             | pants2 wrote:
             | Well, I'm interested in AV1 as a videographer but hadn't
             | heard of it before. Without 'codec' in the title I would
             | have thought it was networking related.
             | 
             | Re: HDR - not the same thing. HDR has been around for
             | decades and every TV in every electronics store blasts you
             | with HDR10 demos. It's well known. AV1 is extremely niche
             | and deserves 2 words to describe it.
        
               | cyphar wrote:
               | AV1 has been around for a decade (well, it was released 7
               | years ago but the Alliance for Open Media was formed a
               | decade ago).
               | 
               | It's fine that you haven't heard of it before (you're one
               | of today's lucky 10,000!) but it really isn't that niche.
               | YouTube and Netflix (from TFA) also started switching to
               | AV1 several years ago, so I would expect it to have
               | similar name recognition to VP9 or WebM at this point. My
               | only interaction with video codecs is having to futz
               | around with ffmpeg to get stuff to play on my TV, and I
               | heard about AV1 a year or two before it was published.
        
               | edoceo wrote:
               | I'm old (50) and have heard AV1 before. My modern TV
               | didn't say HDR or HDR10 (it did say 4k). Agree that AV1
               | should include "codec".
               | 
               | One word, or acronym, just isn't enough to describe
                | anything in this modern world.
        
               | notatoad wrote:
               | this is the reason articles exist, and contain more
               | information than the headline does.
               | 
               | you might not know what AV1 is, and that's fine, but the
               | headline doesn't need to contain all the background
               | information it is possible to know about a topic. if you
               | need clarification, click the link.
        
           | efitz wrote:
           | The article barely mentioned "open", and certainly gave no
           | insight as to what "open" actually means wrt AV1.
        
           | lII1lIlI11ll wrote:
           | > Though in the original title AV1 could be anything if you
           | don't know it's a codec.
           | 
           | I'm not trying to be elitist, but this is "Hacker News", not
           | CNN or BBC. It should be safe to assume some level of
           | computer literacy.
        
             | averageRoyalty wrote:
             | Knowledge of all available codecs is certainly not the same
             | tier as basic computer literacy. I agree it doesn't need to
             | be dumbed down to the general user, but we also shouldn't
              | assume everyone here knows every technical abbreviation.
        
         | cortesoft wrote:
         | It is usually Dang using his judgment.
        
           | big-and-small wrote:
           | I really like moderation on HN in general, but honestly this
           | inconsistent policy of editorializing titles is bad. There
            | were plenty of times when submitter-editorialized titles
            | (e.g. for GitHub code dumps of some project) were changed
            | back to original titles that are useless and vague without
            | context.
            | 
            | And now HN administration tends to editorialize in its own
            | way.
        
         | wltr wrote:
         | For me that's a FU moment that reminds me 'TF am I doing here?'
         | I genuinely see this resource as a censoring plus advertising
         | (both for YC, obviously) platform, where there are generic
         | things, but also things someone doesn't want you to read or
          | know. The titles are constantly being changed to gibberish
          | like right here, perfectly adequate comments or posts get
          | killed, yet the absolutely irrelevant or offensive things can
          | stay untouched. Etc.
        
         | tomhow wrote:
         | OK guys, my screwup.
         | 
         | We generally try to remove numbers from titles, because numbers
         | tend to make a title more baity than it would otherwise be, and
         | quite often (e.g., when reporting benchmark test results) a
         | number is cherry-picked or dialed up for maximum baitiness. In
         | this case, the number isn't exaggerated, but any number tends
         | to grab the eye more than words, so it's just our convention to
         | remove number-based titles where we can.
         | 
         | The thing with this title is that the number isn't primarily
         | what the article is about, and in fact it under-sells what the
         | article really is, which is a quite-interesting narrative of
         | Netflix's journey from H.264/AVC, to the initial adoption of
         | AV1 on Android in 2020, to where it is now: 30% adoption across
         | the board.
         | 
         | When we assess that an article's original title is baity or
         | misleading, we try to find a subtitle or a verbatim sentence in
         | the article that is sufficiently representative of the content.
         | 
         | The title I chose is a subtitle, but I didn't take enough care
         | to ensure it was adequately representative. I've now chosen a
         | different subtitle which I do think is the most accurate
         | representation of what the whole article is about.
        
       | endorphine wrote:
        | Is it me, or does this post have LLM vibes?
        
       | VerifiedReports wrote:
       | I had forgotten about the film-grain extraction, which is a
       | clever approach to a huge problem for compression.
       | 
       | But... did I miss it, or was there no mention of any tool to
       | specify grain parameters up front? If you're shooting "clean"
       | digital footage and you decide in post that you want to add
       | grain, how do you convey the grain parameters to the encoder?
       | 
       | It would degrade your work and defeat some of the purpose of this
       | clever scheme if you had to add fake grain to your original
       | footage, feed the grainy footage to the encoder to have it
       | analyzed for its characteristics and stripped out (inevitably
       | degrading real image details at least a bit), and then have the
       | grain re-added on delivery.
       | 
       | So you need a way to specify grain characteristics to the encoder
       | directly, so clean footage can be delivered without degradation
       | and grain applied to it upon rendering at the client.
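        | 
        | To make that concrete: AV1's film-grain metadata is a small
        | per-frame parameter set, roughly along these lines. A loose
        | Python sketch; the field names are simplified and illustrative,
        | not the exact syntax element names from the AV1 spec:
        | 
        |   from dataclasses import dataclass
        | 
        |   @dataclass
        |   class GrainParams:
        |       seed: int               # pseudo-random seed for the pattern
        |       luma_scaling: list      # (intensity, strength) points
        |       ar_coeff_lag: int       # autoregressive neighbourhood size
        |       ar_coeffs_luma: list    # AR coefficients shaping the texture
        |       chroma_from_luma: bool  # reuse luma scaling for chroma
        | 
        |   # A mastering-side "grain track" would just be one such set per
        |   # shot, authored up front rather than estimated by the encoder.
        |   shot_grain = GrainParams(seed=1234,
        |                            luma_scaling=[(0, 20), (128, 40), (255, 20)],
        |                            ar_coeff_lag=2,
        |                            ar_coeffs_luma=[4, -2, 1, 0, 3, -1, 2, 0],
        |                            chroma_from_luma=True)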
        
         | crazygringo wrote:
         | You just add it to your original footage, and accept whatever
         | quality degradation that grain inherently provides.
         | 
         | Any movie or TV show is ultimately going to be streamed in lots
         | of different formats. And when grain is added, it's often on a
         | per-shot basis, not uniformly. E.g. flashback scenes will have
         | more grain. Or darker scenes will have more grain added to
         | emulate film.
         | 
         | Trying to tie it to the particular codec would be a crazy
         | headache. For a solo project it could be doable but I can't
         | ever imagine a streamer building a source material pipeline
         | that would handle that.
        
           | VerifiedReports wrote:
           | Mmmm, no, because if the delivery conduit uses AV1, you can
           | optimize for it and realize better quality by avoiding the
           | whole degrading round of grain analysis and stripping.
           | 
           | "I can't ever imagine a streamer building a source material
           | pipeline that would handle that."
           | 
           | That's exactly what the article describes, though. It's
           | already built, and Netflix is championing this delivery
           | mechanism. Netflix is also famous for dictating technical
           | requirements for source material. Why would they not want the
           | director to be able to provide a delivery-ready master that
           | skips the whole grain-analysis/grain-removal step and
           | provides the best possible image quality?
           | 
           | Presumably the grain extraction/re-adding mechanism described
           | here handles variable grain throughout the program. I don't
           | know why you'd assume that it doesn't. If it didn't, you'd
           | wind up with a single grain level for the entire movie; an
           | entirely unacceptable result for the very reason you mention.
           | 
           | This scheme loses a major opportunity for new productions
           | unless the director can provide a clean master and an
           | accompanying "grain track." Call it a GDL: grain decision
           | list.
           | 
           | This would also be future-proof; if a new codec is devised
           | that also supports this grain layer, the parameters could be
           | translated from the previous master into the new codec. I
           | wish Netflix could go back and remove the hideous soft-focus
           | filtration from The West Wing, but nope; that's baked into
           | the footage forever.
        
             | irae wrote:
             | I believe you are speculating on digital mastering and not
             | codec conversion.
             | 
             | From the creator's PoV their intention and quality is
             | defined in post-production and mastering, color grading and
             | other stuff I am not expert on. But I know a bit more from
             | music mastering and you might be thinking of a workflow
             | similar to Apple, which allows creators to master for their
              | codec with the "Mastered for iTunes" flow, where the
              | creators opt in to an extra step to increase the quality
              | of the encoding
             | and can hear in their studio the final quality after Apple
             | encodes and DRMs the content on their servers.
             | 
             | In video I would assume that is much more complicated,
              | since the video is encoded at many quality levels to
              | allow for slower connections and buffering without
              | interruptions. So I assume the best strategy is the one
              | you mentioned yourself, where AV1 detects the grain
              | level/type/characteristics per scene or keyframe interval
              | and encodes so as to be accurate to the source material
              | for that scene.
             | 
             | In other words: The artist/director preference for grain is
             | already per scene and expressed in the high bitrate/low-
             | compression format they provide to Netflix and competitors.
             | I find it unlikely that any encoder flags would
             | specifically benefit the encoding workflow in the way you
             | suggested it might.
        
               | VerifiedReports wrote:
               | "I believe you are speculating on digital mastering and
               | not codec conversion."
               | 
               | That's good, since that's what I said.
               | 
               | "The artist/director preference for grain is already per
               | scene and expressed in the high bitrate/low-compression
               | format they provide to Netflix and competitors. I find it
               | unlikely that any encoder flags would specifically
               | benefit the encoding workflow in the way you suggested it
               | might."
               | 
               | I'm not sure you absorbed the process described in the
               | article. Netflix is analyzing the "preference for grain"
               | as expressed by the grain detected in the footage, and
               | then they're preparing a "grain track," as a stream of
               | metadata that controls a grain "generator" upon delivery
               | to the viewer. So I don't know why you think this
               | pipeline wouldn't benefit from having the creator provide
               | perfectly accurate grain metadata to the delivery network
               | along with already-clean footage up front; this would
               | eliminate the steps of analyzing the footage and
               | (potentially lossily) removing fake grain... only to re-
               | add an approximation of it later.
               | 
               | All I'm proposing is a mastering tool that lets the
               | DIRECTOR (not an automated process) do the "grain
               | analysis" deliberately and provide the result to the
               | distributor.
        
             | crazygringo wrote:
             | You're misunderstanding.
             | 
             | > _if the delivery conduit uses AV1, you can optimize for
             | it_
             | 
             | You could, in theory, as I confirmed.
             | 
              | > _It's already built, and Netflix is championing this
             | delivery mechanism._
             | 
             | No it's not. AV1 encoding is already built. Not a pipeline
             | where source files come without noise but with noise
             | metadata.
             | 
             | > _and provides the best possible image quality?_
             | 
             | The difference in quality is not particularly meaningful.
             | Advanced noise-reduction algorithms already average out
             | pixel values across many frames to recover a noise-free
             | version that is quite accurate (including accounting for
             | motion), and when the motion/change is so overwhelming that
             | this doesn't work, it's too fast for the eye to be
             | perceiving that level of detail anyways.
             | 
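              | (Toy illustration of that claim for a static, or already
              | motion-compensated, patch; the frame count and noise
              | level are arbitrary assumptions:)
              | 
              |   import numpy as np
              | 
              |   rng = np.random.default_rng(0)
              |   clean = rng.integers(0, 256, (720, 1280)).astype(float)
              |   frames = [clean + rng.normal(0, 10, clean.shape)
              |             for _ in range(16)]
              |   denoised = np.mean(frames, axis=0)
              |   print(np.std(frames[0] - clean))  # ~10, the added noise
              |   print(np.std(denoised - clean))   # ~2.5, i.e. 10/sqrt(16)
              | 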
             | > _This scheme loses a major opportunity for new
             | productions unless the director can provide a clean master
             | and an accompanying "grain track."_
             | 
             | Right, that's what you're proposing. But it doesn't exist.
             | And it's probably never going to exist, for good reason.
             | 
             | Production houses generally provide digital masters in IMF
             | format (which is basically JPEG2000), or sometimes ProRes.
             | At a technical level, a grain track _could_ be invented.
             | But it basically flies in the face of the idea that the
             | pixel data itself is the final  "master". In the same way,
             | color grading and vector graphics aren't provided as
             | metadata either, even though they could be in theory.
             | 
             | Once you get away from the idea that the source pixels are
             | the ultimate source of truth and put additional
             | postprocessing into metadata, it opens up a whole can of
             | worms where different streamers interpret the metadata
             | differently, like some streamers might choose to never add
             | noise and so the shows look different and no longer reflect
             | the creator's intent.
             | 
             | So it's almost less of a technical question and more of a
             | philosophical question about what represents the finished
             | product. And the industry has long decided that the
             | finished product is the pixels themselves, not layers and
             | effects that still need to be composited.
             | 
             | > _I wish Netflix could go back and remove the hideous
              | soft-focus filtration from The West Wing, but nope; that's
             | baked into the footage forever._
             | 
             | In case you're not aware, it's not a postproduction filter
             | -- the soft focus was done with diffusion filters on the
             | cameras themselves, as well as choice of film stock. And
             | that was the creative intent at the time. Trying to
             | "remove" it would be like trying to pretend it wasn't the
             | late-90's network drama that it was.
        
               | VerifiedReports wrote:
               | Nothing in there indicates "misunderstanding." You're
               | simply declaring, without evidence, that the difference
               | in quality "is not particularly meaningful." Whether it's
               | meaningful or not to you is irrelevant; the point is that
               | it's unnecessary.
               | 
               | You are ignoring the fact that the scheme described in
               | the article does not retain the pixel data any more than
               | what I'm proposing does; in fact, it probably retains
               | less, even if only slightly. The analysis phase examines
               | grain, comes up with a set of parameters to simulate it,
               | and then removes it. When it's re-added, it's only a
               | generated simulation. The integrity of the "pixel data"
               | you're citing is lost. So you might as well just allow
               | content creators to skip the pointless
               | adding/analyzing/removing of grain and provide the
               | "grain" directly.
               | 
               | Furthermore, you note that the creator may provide the
               | footage as a JPEG2000 (DCP) or ProRes master; both of
               | those use lossy compression that will waste quality on
               | fake grain that's going to be stripped anyway.
               | 
               | Would they deliver this same "clean" master along with
               | grain metadata to services not using AV1 or similar?
               | Nope. In that case they'd bake the grain in and be on
               | their way.
               | 
               | The article describes a stream of grain metadata to
               | accompany each frame or shot, to be used to generate
               | grain on the fly. It was acquired through analysis of the
               | footage. It is totally reasonable to suggest that this
               | analysis step can be eliminated and the metadata provided
               | by the creator expressly.
               | 
               | And yes I'm well aware that West Wing was shot with
               | optical filters; that's the point of my comment. The
               | dated look is baked in. If the creators or owner wanted
               | to rein in or eliminate it to make the show more
               | relatable to modern audiences, they couldn't. Whether
               | they should is a matter of opinion. But if you look at
               | the restoration and updating of the Star Trek original
               | series, you see that it's possible to reduce the visual
               | cheesiness and yet not go so far as to ruin the flavor of
               | the show.
        
               | crazygringo wrote:
               | I'm not disagreeing with you as to what could be
               | technically built.
               | 
               | I don't need to provide evidence that the resulting
               | difference in quality is negligible because you can play
                | with ffmpeg to verify it yourself if you want. I'm just
               | telling you from experience.
               | 
               | I understand your logic that it could be built, and how
               | it would skip steps and by definition have no loss of
               | quality in that part. But process- and philosophy-wise it
               | just doesn't fit, that's all I'm explaining.
               | 
               | And the fact that JPEG2000/ProRes are lossy is
               | irrelevant. They're encoded at such high quality settings
               | for masters that they become virtually lossless for
               | practical purposes. That's why the source noise is so
               | high-fidelity in the first place.
        
         | bob1029 wrote:
         | Actual film grain (i.e., photochemical) is arguably a valid
          | source of information. You can frame it as noise, but it does
          | provide additional information content that our visual system
         | can work with.
         | 
         | Removing real film grain from content and then recreating it
         | parametrically on the other side is not the same thing as
         | directly encoding it. You are killing a lot of information. It
         | is really hard to quantify exactly how we perceive this sort of
         | information so it's easy to evade the consequences of screwing
         | with it. Selling the Netflix board on an extra X megabits/s per
         | streamer to keep genuine film grain that only 1% of the
         | customers will notice is a non-starter.
        
           | VerifiedReports wrote:
           | Exactly. In the case of stuff shot on film, there's little to
           | be done except increase bitrate if you want maximal fidelity.
           | 
           | In the case of fake grain that's added to modern footage, I'm
           | calling out the absurdity of adding it, analyzing it,
           | removing it, and putting yet another approximation of it back
           | in.
        
       | aperture147 wrote:
        | AV1 is not new anymore and I think most modern devices support
        | it natively. Some devices, like Apple's, even have a dedicated
        | AV1 hardware decoder. Netflix has been pushing AV1 for a while
        | now, so I thought the adoption rate would be more like 50%, but
        | it seems AV1 requires better hardware and newer software, which
        | a lot of people don't have.
        
         | smallstepforman wrote:
          | Don't forget that people also view Netflix on TVs, and a large
          | number of physical TVs were made before AV1 was specced. So
         | 30% overall may also mean 70% on modern devices.
        
         | brnt wrote:
         | > AV1 is not new anymore
         | 
         | Uh what. (Embedded) hardware lasts a long time (and it
          | should!). TVs around the globe are not all built after 2018.
         | H264 is still the gold standard if you want to be sure a random
         | device has hardware acceleration.
         | 
         | I make use of this by taking a USB hard drive with me on trips.
          | Random TVs rarely have issues with my H264 catalogue. It'll be
         | a while before I look at AV1 for this. Sure, I wish I could
         | benefit faster, but I don't want people to throw out perfectly
         | good hardware either!
        
       | shanemhansen wrote:
        | > AV1 streaming sessions achieve VMAF scores[1] that are 4.3 points
       | higher than AVC and 0.9 points higher than HEVC sessions. At the
       | same time, AV1 sessions use one-third less bandwidth than both
       | AVC and HEVC, resulting in 45% fewer buffering interruptions.
       | 
       | Just thought I'd extract the part I found interesting as a
       | performance engineer.
        
         | slhck wrote:
         | This VMAF comparison is to be taken with a grain of salt.
         | Netflix' primary goal was to reduce the bitrate consumption, as
         | can be seen, while roughly keeping the same nominal quality of
         | the stream. This means that, ignoring all other factors and
         | limitations of H.264 with higher resolutions, VMAF scores for
         | all their streaming sessions should roughly be the same, or in
         | a comparable range, because that's what they're optimizing for.
         | (See the Dynamic Optimizer Framework they have publicly posted
         | a few years ago.)
         | 
         | Still impressive numbers, of course.
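          | 
          | If you want to sanity-check numbers like these on your own
          | encodes, ffmpeg's libvmaf filter can score a distorted clip
          | against its reference (assuming your ffmpeg build includes
          | libvmaf). A minimal sketch:
          | 
          |   import subprocess
          | 
          |   def vmaf(distorted, reference):
          |       # First input is the distorted clip, second the reference;
          |       # the per-frame and pooled scores land in vmaf.json.
          |       subprocess.run([
          |           "ffmpeg", "-i", distorted, "-i", reference,
          |           "-lavfi", "libvmaf=log_fmt=json:log_path=vmaf.json",
          |           "-f", "null", "-",
          |       ], check=True)
          | 
          |   vmaf("av1_encode.mkv", "source.mkv")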
        
       | forgotpwd16 wrote:
        | Am I the only one who thought this was an old article, going by
        | the title? AV1 is now 10 years old and AV2 was announced a few
        | months ago for a year-end release. If anything the news is that
        | AV1 powers _only_ 30% by now. At least HEVC, released about the
        | same time, has gotten quite popular in the warez scene
        | (movies/TV/anime) for small encodes, whereas AV1 releases are
        | still considered a rarity. (Though to be fair, 30% of Netflix &
        | YT means AV1 usage in total is much higher.) I would've expected
        | a royalty-free codec to've been embraced more, but it seems the
        | difficulty of playing it on low-power devices for a long time
        | hindered its adoption.
        
       | nrhrjrjrjtntbt wrote:
       | > At Netflix, our top priority is delivering the best possible
       | entertainment experience to our members.
       | 
        | I don't think that is true of any streamers. Otherwise they
        | wouldn't provide the UI equivalent of a shopping centre that tries
       | to get you lost and unable to find your way out.
        
         | sen wrote:
         | Or compression that makes a streamed 4K video look worse than a
         | 1080p video played locally.
        
       | _puk wrote:
       | I imagine that's a big part of the drive behind discontinuing
        | Chromecast support...
       | 
       | https://www.androidcentral.com/streaming-tv/chromecast/netfl...
        
         | StrLght wrote:
         | I doubt that. Netflix has an app on TVs as old as 8-10 years
          | now. The SoCs in such TVs aren't powerful enough to decode
          | AV1. They'll be stuck with H.264 for a long time.
        
         | kotaKat wrote:
         | Nah, that's more "we can't get ad injection working on the old
         | Chromecast client" because it still works on early Chromecasts
         | for ad-free plans.
        
       | vitorgrs wrote:
        | Weirdly, Netflix on my Samsung TV has been using only H264 for
        | a few months now. Not AV1. When they first launched AV1, it worked
       | there...
       | 
       | Honestly not complaining, because they were using AV1 with
        | ~800-900 kbps for 1080p content, which is clearly not enough
       | compared to their 6Mbps h.264 bitrate.
        
         | SG- wrote:
         | they may have determined the decoding of av1 was too poor or
         | that software decoding av1 wasn't a good idea.
        
       | techpression wrote:
       | Compression is great and all, but Netflix is overdoing it and
       | their content looks like an over-sharpened mess with lego blocks
       | in high intensity scenes. And no, it's not my connection, Apple
       | TV does it far better and so does Prime.
       | 
       | It's really sad that most people never get to experience a good
       | 4K Blu-ray, where the grain is actually part of the image as
       | mastered and there's enough bitrate to not rely on sharpening.
        
       | liampulles wrote:
        | I'm a hobbyist video encoder (mostly I like to experiment with
        | backing up my DVD collection), and I recently switched to using
        | AV1 over HEVC.
        | 
        | I've found the trade-off between quality and CPU load to be
        | better, and I've found it is reasonably good at retaining detail
        | rather than smoothing things out when compared to HEVC. And the
        | ability to add generated "pseudo grain" works pretty well to
        | give the perception of detail. The performance of GPU encoders
        | (while still not good enough for my perhaps-stringent standards)
        | is better.
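        | 
        | For the curious: the "pseudo grain" knob is SVT-AV1's film-grain
        | level, reachable through ffmpeg. A minimal sketch; the level and
        | other values are assumptions to tune by eye, and availability
        | depends on your ffmpeg/SVT-AV1 build:
        | 
        |   import subprocess
        | 
        |   subprocess.run([
        |       "ffmpeg", "-i", "dvd_rip.mkv",
        |       "-c:v", "libsvtav1", "-preset", "5", "-crf", "30",
        |       "-svtav1-params", "film-grain=8",  # synthesized grain level
        |       "-c:a", "copy",
        |       "dvd_rip.av1.mkv",
        |   ], check=True)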
        
       | testdelacc1 wrote:
       | Something doesn't quite add up to me. The post says "AV1 powers
       | approximately 30% of all Netflix viewing". Impressive, but I'm
       | wondering why it isn't higher? I'm guessing most devices should
       | support AV1 software decoders. 88% of devices in certified in the
       | last 4 years support AV1, all browsers support AV1 software
       | decoding, Netflix apps on Android (since 2021) and iOS (since
       | 2023) obviously do.
       | 
        | So why isn't AV1 usage higher? The post doesn't say, so we can only
       | speculate. It feels like they're preferring hardware decoding to
       | software decoding, even if it's an older codec. If this is true,
       | it would make sense - it's better for the client's power and
       | battery consumption.
       | 
       | But then why start work on AV2 before AV1 has even reached a
       | majority of devices? I'm sure they have the answer but they're
       | not sharing here.
        
         | jhugo wrote:
         | Smart TVs, TV sticks, and a lot of mobile devices will not be
         | capable of decoding AV1 in software in realtime, given their
         | low-spec CPUs. I imagine that Netflix is only serving AV1 to
         | devices with hardware decoding support.
        
       | philipallstar wrote:
       | This is a great result from Google, Netflix, Cisco, etc.
        
       | ksec wrote:
        | Worth a note: H.264 High Profile is patent-free in most countries
        | and will soon be patent-free in the US too.
        
         | sylware wrote:
         | Isn't AV1 on the level of H.265? And are H.265 and the future
          | H.266 (which will face the upcoming AV2) free of charge forever and
         | wherever like av[12]?
         | 
         | They could do the Big Tech way: make it all 'free' for a good
          | while, extinguish/calm down any serious competition, then make
         | them not 'free' anymore.
         | 
         | In the end, you cannot trust them.
        
           | adzm wrote:
           | VP9 is more on the level of H265 really. VVC/H266 is closer
           | to AV1. It's not an exact comparison but it is close. The
            | licensing for VVC is just awful, similar to HEVC, and now
            | that AV1 has proved itself everyone is pivoting away from
            | VVC/h266, especially on the consumer side. Pretty much all
            | VVC adoption
           | is entirely internal (studios, set top boxes, etc) and it is
           | not used by any major consumer streaming service afaik.
        
             | ksec wrote:
              | I guess most people have forgotten about x264 developer
              | Dark Shikari's post already.
             | 
             | VP9 isn't H.265 level. That is the marketing spin of AOM.
             | And even AOM members admit VVC is better than AV1.
             | 
              | Liking one codec, or whether it is royalty-free, is one
              | thing; whether it performs better is another.
        
           | anordal wrote:
           | Absolutely not.
           | 
           | I wish everyone knew the difference between patents and
           | copyright.
           | 
           | You can download an open source HEVC codec, and use it for
           | all they care according to their copyright. But! You also owe
           | MPEG-LA 0.2 USD if you want to use it, not to mention an
           | undisclosed sum to actors like HEVC Advance and all the other
           | patent owners I don't remember, because they have their own
           | terms, and it's not their problem that you compiled an open
           | source implementation.
        
       | TMWNN wrote:
       | The one big hardware deficiency of my Nvidia Shield TV is its
       | lack of YouTube AV1 support.
        
       ___________________________________________________________________
       (page generated 2025-12-05 23:00 UTC)