[HN Gopher] Replacing HLS/Dash - Live Mass Fanout with Media ove...
___________________________________________________________________
Replacing HLS/Dash - Live Mass Fanout with Media over QUIC
Author : kixelated
Score : 115 points
Date : 2023-11-14 15:10 UTC (7 hours ago)
(HTM) web link (quic.video)
(TXT) w3m dump (quic.video)
| repelsteeltje wrote:
| Hmm. Okay that's nice and all.
|
| But isn't client-side _adaptive bitrate_ switching the point of
| HLS and DASH? Given that server side software (encoders, live and
| VOD origins) and clients (shaka, hls.js, dash reference client,
| set top boxes, smart TVs) exist... doesn't it make more sense
| to set up http/3 and have it all in a well established standard?
| pipo234 wrote:
| Looks like a Will Law (of DASH fame) pet project. Interesting
| to see where it will lead. It seems to have sponsorship from
| some industry heavyweights: Meta, Cisco, Twitch, Google, Akamai,
| Ericsson. But the notable absentee is, of course, Apple (!)
|
| https://datatracker.ietf.org/doc/draft-ietf-moq-transport/
| goeiedaggoeie wrote:
| Delivery on the web on iOS specifically is the PITA. This post,
| however, uses CMAF, and with iOS offering better APIs for
| video from the next release, there is a bit of hope that this
| protocol could be implemented without Apple needing to do it.
| Once Private Relay on iOS supports HTTP/3, all the bits needed
| to do this should be there.
| englishm wrote:
| The main thing we're waiting on is WebTransport in Safari.
|
| Apple has committed to adding it, and there's already been
| some work on it in WebKit, but we don't know when it will
| make its way into a Safari release yet.
| londons_explore wrote:
| Apple accepts external PRs for WebKit, right? Couldn't
| someone external implement it?
| withinboredom wrote:
| At the risk of them rejecting it and wasting everyone's
| time?
| kixelated wrote:
| ABR is absolutely required for distribution. MoQ also supports
| ABR, but additionally gives you the ability to drop the tail
| of a GoP if a rendition switch is too slow, or no lower
| rendition exists.
|
| As for maintaining existing standards, it depends on your
| goals. If you're not trying to push boundaries then absolutely,
| HLS/DASH is great for quickly getting off the ground. But if
| you're looking for something in-between HLS and WebRTC then MoQ
| is compelling.
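|
| A toy sketch of what that can look like on the sending side
| (hypothetical logic and names, not the actual MoQ API):
|
|     // Frames arrive in groups (one group per GoP); the newest
|     // group has the highest priority.
|     interface Group { rendition: string; frames: Uint8Array[]; }
|
|     function pick(cur: Group, next: Group | null, behindMs: number): Group {
|       const MAX_BEHIND_MS = 1000; // illustrative threshold
|       // If delivery has fallen too far behind and a newer group
|       // (possibly a lower rendition) exists, drop the tail of the
|       // old GoP instead of stalling to finish it.
|       if (next && behindMs > MAX_BEHIND_MS) return next;
|       return cur;
|     }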
| KaiserPro wrote:
| But as the article points out, HTTP/3 is designed to be
| reliable. This is why TCP is not really great for realtime/low-
| latency video.
|
| Re-transmission of old frames is the opposite of what you want
| for live streaming. If you can't re-generate the frame from
| forward error correction, then you're shit out of luck. If
| you're hitting network throughput limits, wasting data on
| transmitting frames you can't display is not a good use of a
| limited resource.
| adriancr wrote:
| I've always wondered why the need for ABR...
|
| The article mentions dropping from 1080p to 360p like it's a
| feature, but I would say it's horrible to do that...
|
| If I'm watching a movie or listening to music I don't want random
| switches to poor quality or lower resolution and associated
| glitches.
|
| I'd much rather see the spinning wheel and complain to my
| provider that I'm not getting what I paid for or check what's
| going wrong.
|
| I can understand various resolutions and codecs based on user
| preferences and devices but that's a fixed choice.
|
| Am I in the minority here?
| izacus wrote:
| Yes, absolutely. Vastly. Hugely.
|
| As someone who's worked in video delivery, I can't overstate
| how much the majority prefers not to have stutters in their
| video while sitting on a subway/bus/train/massive SUV.
| JoshTriplett wrote:
| Exactly. I'd much rather wait a minute and buffer more video
| at reasonable resolution and bitrate, rather than switching
| to something unwatchable.
| stilldavid wrote:
| I believe this post covers the most common use case of
| Twitch, which is live streaming. You don't want to be a
| minute behind the video when chat is live.
| eska wrote:
| I'm not sure, since most people don't interact with chat.
|
| Personally I use yt-dlp with mpv and have the buffer set
| very large. This allows me to pause the stream when I
| take a toilet break or similar. When I come back I can
| also skip ahead through ads or idle times on the stream,
| or watch at increased speed to catch up if needed.
| remram wrote:
| You're saying "exactly" but then state the opposite...
| Aurornis wrote:
| > I'd much rather wait a minute and buffer more video at
| reasonable resolution and bitrate, rather than switching to
| something unwatchable.
|
| A full minute of buffering would only give about 5% more
| bandwidth for a typical 22 minute TV show. That's not going
| to transform it from "something unwatchable" to the higher
| bitrate video you want.
|
| It will, however, cause a large majority of people to give
| up on the stream. The number of people who would actually
| prefer 1 minute of buffering for negligibly higher quality
| is almost nonexistent.
|
| This is why buffering is futile: Once a stream gets to the
| point of stopping to buffer, the mismatch between available
| bandwidth and the encoded bitrate is just too great to
| reasonably overcome.
| adriancr wrote:
| I think you're talking about a different use case than the
| OP.
|
| For your point, an example is viewing 1080p on 64kbit/s
| is unreasonable and nothing will fix it.
|
| For OP's point, I can imagine viewing 1080p (10Mbit/s) on
| a 1Gbit/s connection means a little buffer will solve the
| need for ABR as random packet loss or minor drops can be
| recovered with retries seamlessly as opposed to dropping
| quality to ABR.
|
| So buffering is futile only when bandwidth is not
| sufficient, and even there it's questionable; perhaps the user
| might just want to download and view after that's
| complete rather than have poor quality.
| IanCal wrote:
| You still have buffering with ABR.
| aidenn0 wrote:
| This might be true on a wired link, but on wifi, it's not
| unusual to see bandwidth drop well below the mean
| throughput for 10s of seconds. One minute of buffering is
| enough to hide those issues. In many cases the quality
| drops during those times and then never goes back up
| again, so I'm stuck looking at giant macroblock artifacts
| for the rest of the movie (unless I hit stop and start
| playing it over again).
| pricechild wrote:
| This reminds me of Google search rankings historically(?)
| favouring sites which loaded quicker. Stories at the time
| (5-10+ years ago?) reported this to be evident at
| surprisingly small thresholds - people not even waiting a
| second.
|
| I'm not surprised it's a similar tale with video delivery.
|
| In my experience, what frustrated me most was predictable
| stalls every 10s or so. I don't see that much any more.
| izacus wrote:
| Yep, I believe loading times over 300ms or so already
| caused a drop in visitors? Funny how React developers
| completely forgot about that :D
| wpm wrote:
| In Halo Master Chief Collection, viewing any of the video
| terminals that are scattered throughout the game world would
| boot you out of the game over to some stupid app you had to
| download. When I first came across this I was already mad, but
| the kicker was that the app would stream the video, these
| gorgeous, meticulously crafted and headline featured videos,
| over HLS and the first few dozen seconds of video were always
| blocky, blurry messes because my internet at the time was
| rather slow. No option to download all of the videos ahead of
| time. No option to force video quality to some higher
| resolution. Just a big blurry mess. So, I skipped that entire
| feature of the game and figured I'd have to watch them on
| YouTube later, where I could force it into 1080p and let it
| buffer a bit.
|
| More annoying even now are the constant "Hey moron, is your
| video stuttering? Try lowering the quality!" 4head-tapping
| messages I see on Twitch or YouTube when _their CDN_ is taking
| a shit. I pay good money for 1G/1G symmetric, and my PC is 6
| feet from my wireless AP. If you can't feed me 6Mbps of live
| stream without stuttering, I don't think it's a me problem.
|
| If MoQ means an end to, or an improvement of, that terrible ABR
| experience, it can't come _quic_ enough.
| QuercusMax wrote:
| Watching anything on Paramount+ gives me 30 seconds of
| incredibly awful blocky graphics, then it finally gives me
| full-resolution high quality. Usually it's just the
| Paramount+ ads that are this awful, but I really wish there
| was a way to say "spend 20s buffering so I can watch the
| whole thing in HQ".
| Uehreka wrote:
| I think it's fine as long as the user can opt-out and have the
| opt-out stick[0]. My problem with YouTube is that I can't tell
| them "when I'm not on a metered connection, blast my face with
| as many pixels, frames and bits of color as possible" and
| actually have that happen.
|
| [0] I say opt-out not opt-in because the audience for ABR is
| people who wouldn't know where to find an opt-in button or
| wouldn't know that it exists.
| kroltan wrote:
| The "Enhancer for YouTube" browser extension has a setting to
| force assign the resolution to whatever you choose.
|
| I was having issues that even if my player indicated it was
| playing 1440p, it clearly was at most 640p or something like
| that, complete potatovision. Re-clicking the 1440p option
| would fix it, and this extension basically does that for you.
|
| Good luck on mobile, though lol
| KaiserPro wrote:
| > Am I in the minority here?
|
| Probably. I used to work in broadcast, so I have been
| conditioned to seek out the highest quality video I can find.
| So like you, I will accept buffering for higher quality
| playback.
|
| However I still see people who force 4:3 into 16:9 because it
| fills the screen, and people who still have motion smoothing on
| their TV so everything looks like 60fps until the stream
| breaks.
|
| However, on mobile I just want to see the shitty politician
| say the shitty thing.
|
| I think most people want decent audio with mostly moving
| pictures. Most people can't see HD at TV distances anyway, so I
| suspect so long as it sounds good, most people don't notice too
| much.
| andersa wrote:
| TV without motion smoothing is absurdly terrible. Might as
| well watch the movie in Google Slides. Once they get with the
| times and record the movies in 120 fps, we can get rid of it.
| dontlaugh wrote:
| 24 fps panning shots give me headaches. A small amount of
| smoothing is essential.
| andersa wrote:
| Yes, this is the exact issue. My eyes can't even
| interpret this as motion sometimes, so it just looks like
| a juddering mess and I can't really see anything.
| jonathanlydall wrote:
| I think it's very subjective as I personally find injected
| frames makes things look "odd".
|
| It's also worth mentioning that Peter Jackson's The Hobbit
| was shot at double FPS (48 if I recall correctly), and was
| regarded by many as looking "off". Based on the fact
| Hollywood has not made any high FPS movies since, it seems
| that it was generally regarded as a failed experiment.
| andersa wrote:
| Some of the interpolated frames do indeed look odd, but
| that is much preferable to the horrible sideways judder
| when watching 24fps content on a modern OLED tv with
| near-instant pixel response.
|
| I wonder if new display tech is contributing to this
| becoming ever worse of an experience? With the tv
| snapping to the new frames practically instantly and then
| waiting for 50ms with no visible changes, motion blur
| baked into the movie can no longer hide the judder
| effectively.
|
| I haven't seen The Hobbit in cinema, so I can't comment
| on that experience. Maybe they just did it wrong somehow,
| which would be a shame.
| bick_nyers wrote:
| For those interested in learning more:
| https://blurbusters.com/faq/oled-motion-blur/
|
| LCD/OLED follows a "Sample and Hold" paradigm, whereas
| CRT/Plasma don't have the same limitations.
| toast0 wrote:
| > With the tv snapping to the new frames practically
| instantly and then waiting for 50ms with no visible
| changes, motion blur baked into the movie can no longer
| hide the judder effectively.
|
| Film projectors would close the shutter while the film
| moved, and might open the shutter two or three times per
| frame; I wonder if you'd do well with the black frame
| insertion that some of the modern displays do?
| Personally, it seemed pretty awful for me on a Samsung
| QLED; maybe it's more useful on OLED. But I also dislike
| motion smoothing, so clearly we have different taste.
|
| I will agree with you that panning shots in 24 fps aren't
| great. I think there's some rules of thumb for
| cinematography to reduce audience discomfort, but not
| everyone follows those.
| andersa wrote:
| I tried the black frame insertion, but it seems unusable.
| I can see it flickering. Very uncomfortable to look at
| the screen with it on.
| bick_nyers wrote:
| People said the same thing about Avatar 2. I think the
| criticism in the media comes from "film purists" who are
| bothered that the smoothness catches their eye in a
| certain way, and instead of trying to get used to it over
| time will proclaim that 24 fps is some magical golden
| ratio of cinema perfection.
| andersa wrote:
| The jarring part in Avatar 2 is that for some completely
| inexplicable reason it had sections with low fps. I
| noticed every single one of them instantly and it was
| very off putting.
|
| And while 48 fps is a major improvement from the
| prehistoric 24 (apparently it was chosen because of the
| power grid a hundred years ago??), it's still a far cry
| from actually smooth motion at 144 and above fps.
|
| If rendering the whole movie at this frame rate is too
| difficult, it would be quite interesting to explore
| exporting the additional buffers (velocity, depth, etc)
| that allow DLSS 3 frame generation to work near perfectly
| in games. Can cameras even capture these?
| bick_nyers wrote:
| To capture depth the cameras would need to be augmented
| with LIDAR. LIDAR is insanely more expensive than a
| regular camera. Budget friendly LIDAR systems that I've
| seen will have "refresh rates" in the realm of 2Hz or 8Hz
| and are absolutely nowhere near what we would consider 4k
| resolution.
|
| Creating depth maps of existing images/videos is one of
| the interests of the field of Computer Vision, it is not
| a solved problem.
|
| When I watched Avatar 2 in IMAX 3D I didn't notice the
| drop to 24, was that during talking/low motion scenes
| only? I'm sure if I looked for it I could see it, but I
| went into the movie not knowing that refresh rate switch
| occurred.
|
| We have 8k/120Hz cameras now, although I'm not too well
| versed in how increasing the refresh rate affects how the
| camera captures light, and what effects that has
| artistically.
| KaiserPro wrote:
| In VFX they generally don't bother capturing depth
| because it's much noisier than the alternative: replacing
| the subject with a 3D-rendered object. Using markers to
| track motion, and marrying them up to model marker points,
| allows artistic interpretation of the actor's
| movements (or a static object's).
|
| Lidar is noisy, and doesn't have a 1:1 pixel mapping
| between image pixel and depth pixel.
|
| Even when we had stereoscopic recordings for 3D, the depth
| map we could generate was always a bit shit. Granted,
| things have moved on in the last few years.
|
| > We have 8k/120Hz cameras now, although I'm not too well
| versed in how increasing the refresh rate affects how the
| camera captures light, and what effects that has
| artistically.
|
| If it's a RED it has shit colour sensitivity. However,
| running cameras at higher framerates generally requires
| more light, _but_ the shutter speed is generally faster
| than the frame rate would require (when you record video on
| your phone outside in the sun, the shutter speed is 1/1000th
| of a second, even though it's doing 60fps).
| Uehreka wrote:
| I don't think it's "film purists". I remember seeing the
| Hobbit at a time where I knew basically nothing about
| common film framerates and thinking it looked really
| weird. I also remember the first time my family got a
| motion-smoothing TV and I tried to watch a DVD of The
| Matrix, and about 2 minutes in (when Trinity is fighting
| the cops) I (a college student who regularly watched 360p
| rips of movies on my laptop) blurted out "why does it
| look like garbage?!"
| kroltan wrote:
| Also doesn't help that a lot of TVs have screen tearing or
| uneven frame timing.
|
| You would think that a device whose sole purpose is to
| divine a picture from wagging pixies in the air and make
| the little lamps twinkle in a convincing way would be able
| to do it by now.
|
| But I have yet to find a "smart" TV that can handle it
| decently, even well into the mid range it's still a bit hit
| and miss. Get a cheap decoder and plug it into any mid
| range laptop and the pacing is leagues better.
|
| Do they just use the absolute worst SoC that still turns
| enough product to pay the bills, or what?
| andersa wrote:
| Hmm. I don't think I've ever seen screen tearing or
| something similar on my LG G3, or any of the previous
| generations that came before.
| bick_nyers wrote:
| To be fair, it sounds like you purchase the best of the
| best in TVs.
| KaiserPro wrote:
| > Once they get with the times and record the movies in 120
| fps, we can get rid of it.
|
| SuperHighDef (https://en.wikipedia.org/wiki/Ultra-high-
| definition_televisi...) is absolutely brilliant. I saw it
| live at a broadcast show in 2012.
|
| However motion smoothing is an aberration, as is the stupid
| saturation and "artifact removal AI" that basically applies
| a box blur to everything, then applies sharpening to make
| up for the slow LCD response.
|
| The thing that fucks me off about motion smoothing is that
| it's so unreliable: it's smooth for 8 frames, then breaks and
| comes back.
|
| However, I think the reason why most people hate high-
| framerate video is that it is associated with cheap TV. If
| you're young and have never seen crappy low-budget TV, I
| could see that you don't have that association.
|
| The reason why 24FPS feels "expensive" is that it's
| associated with high-budget movies (as does the high-colour
| Kodak look).
| andersa wrote:
| > its so unreliable, its smooth for 8 frames, then breaks
| and comes back
|
| That's unfortunately true, even on the best TVs. But for
| me that is merely slightly jarring, while panning cameras
| at 24 fps are completely unwatchable. So I begrudgingly
| use it and hope that some day they will realize movies in
| 2023 should not be using the same frame rate they had due
| to technical limitations 100 years ago.
|
| A true 120 fps movie would be so, so much better. Just
| like how I am playing all games at 4k / 120 fps on the
| TV.
|
| Something I've been meaning to explore is pre-processing
| a movie with RIFE, which produces a much better result
| than the built-in motion smoothing since it doesn't have
| to run in real time on a subpar processor. Though it
| takes quite a while even with an RTX 4090, and you need
| the actual frames, so you can only do it with pirated movies
| from Blu-ray rips, which is annoying.
| lucideer wrote:
| While the sentiment in this comment is generally correct, I
| don't think this extends to 360p (or even 480p). Most people
| will generally tolerate much lower quality than the current
| state of the art at any given time, but there's still a kind
| of "overton window" of acceptability where the floor of
| what's acceptable raises with time too.
|
| I'd be fairly sure 360p is well below it today.
| giantrobot wrote:
| But streams don't drop from 4K down to 360p. That's
| YouTube's low end. Most streaming services bottom out at
| 540p (quarter of full HD) or 720p. Keep in mind a lot of
| over the air and cable broadcasts are still 720p, so
| streaming low end looks as good/bad as over the air TV.
| makapuf wrote:
| I think it's funny: we have error correction with
| retransmission on the link, quality drops if the buffer runs
| empty, in-order packets with retransmission, etc., just to
| have motion smoothing (or "buffering..." waiting screens)...
|
| It would be interesting to "see" the packets on screen
| arriving and decoded with no error correction or buffering
| (maybe with reconstruction of missing squares...), like an
| analog kind of signal: no latency, full available bandwidth,
| no waiting (but sometimes... less than ideal quality).
| andersa wrote:
| It depends. If I'm watching a movie, I obviously want the
| highest quality, and it really annoys me when the app doesn't
| have a feature to force the quality. Great, that means the
| movie is canceled and I'll have to try again at some arbitrary
| later time, because I'm not going to watch it as a blocky mess
| just because their servers are having a moment.
|
| If I'm watching a live stream, I'll take whatever quality
| currently works. Waiting to buffer it would mean I'm behind
| everyone else...
| jeppesen-io wrote:
| > Am I in the minority here?
|
| Yes, and it's not even close. Metrics are clear -
| stutter/loading will cause many more people to close the window
| vs lower quality.
|
| > I can understand various resolutions and codecs based on user
| preferences and devices but that's a fixed choice.
|
| What are the odds that many of those users who set a preference
| and forget about it are now wondering why YouTube etc. never
| works at their coffee shop, or in the back yard where the
| WiFi is weak? The majority of video on the internet is viewed
| over wireless networks, which always have a chance of varying
| in quality.
| anotherhue wrote:
| I'd much rather the audio keep streaming while the video does
| its thing.
| CYR1X wrote:
| The masses aren't great at distinguishing between 1080p and
| 720p, or 720p and 360p. They don't even know what video codecs
| are. So no, they aren't really going to complain if their 1080p
| video drops to 360p for 2 seconds.
| Aurornis wrote:
| > I'd much rather see the spinning wheel and complain to my
| provider that I'm not getting what I paid for or check what's
| going wrong.
|
| You are deeply in the minority.
|
| People hate buffering because it's rarely a one-time event. Once
| a video starts buffering, it's usually a sign that something is
| wrong and it's going to keep happening. People have
| internalized this and will give up quickly once buffering
| happens.
|
| You're also assuming that the issue would be with your internet
| provider and that complaining to them will resolve it. Neither
| of those assumptions is true most of the time. While there are
| cases of degraded internet downlink performance, the more
| common failure modes involve people's WiFi. Borderline
| connection quality can turn into interruptions when their
| nearby neighbor starts a download on the same WiFi channel, or
| if someone in the same house starts using the WiFi at the same
| time.
| boneitis wrote:
| (This is all to say that I 100% feel the same way.)
|
| I like getting feedback. I don't like when platforms try to
| smooth out the rough edges invisibly... it often ends with me
| experiencing breaking problems without realizing I am
| experiencing them. It extends so much farther than
| having the choice to force a video feed resolution.
|
| I don't like when voice and video channels "seamlessly"
| pitchshift and speed-up/slow-down (at least, without
| indication). I already find it unacceptable how aggressively
| most platforms like to compress sound in their audio codecs,
| presumably in the name of saving bandwidth.
|
| The Slack application goes 1000% out of its way to never let
| you know that it has dropped its connection, so you never know
| if it re-established comms when your net dropped out for a
| moment, or if you need to hard refresh with Ctrl+R, because you
| heard the Slack notification on the phone right next to you,
| but you can't tell whether the desktop app has received the
| latest messages (often, it hasn't).
|
| In all seriousness, I would love something where my computer
| lcd subtly shades red or blue along the edges of the display
| depending on when I'm getting network jitter/latency and when
| it's catching up, and things like that. How cool would that be?
| HumblyTossed wrote:
| > Am I in the minority here?
|
| Yes, actually.
| tschellenbach wrote:
| Replacing LLHLS and WebRTC will be hard since they both work
| quite well.
| belthesar wrote:
| With regards to live video broadcasting (as opposed to VoIP),
| I'd be hard pressed to call WebRTC a technology that works
| well. WebRTC for one-to-many feeds is poor for live video for
| many reasons. For ABR scenarios, there's no concept of having
| multiple renditions, so if one subscriber is having trouble
| receiving video from a publisher, the bitrate of that
| publisher's feed is reduced for all receivers (unless you
| take out that capability in its entirety; see the FTL protocol
| created by the team at Mixer). In addition to this, WebRTC has
| no real concept of segments, which makes building a CDN to
| reduce load and latency at an endpoint a difficult
| proposition at best.
|
| LLHLS is darn good, but it's not as good from a latency
| perspective as WebRTC-powered technologies have shown. Whether
| that latency is required really depends on the use case. For
| non-interactive feeds, it's not really a value add.
|
| That said, the biggest quality improvement MoQ has is how it
| recovers from poor connection issues, which happen frequently
| even on the best of consumer networks and Internet connections,
| let alone on poorer-quality WiFi and cellular networks.
| superkuh wrote:
| Ah yes, QUIC, the protocol that is infeasible to use to connect
| to people you don't know without getting continued approval from
| a corporate CA. Imagine if you couldn't use TCP without getting
| re-approved by a third party corporation every ~90 days. I
| suppose that's fine for corporate for-profit use but it makes
| things extremely brittle in both the technical and social senses.
| conradludgate wrote:
| Can you elaborate on this? This doesn't seem like a fundamental
| issue with QUIC
| johncolanduoni wrote:
| They're probably referring to the fact that QUIC always
| performs TLS1.3 negotiation as part of its transport
| handshake, there's simply no unencrypted version of it.
| However WebTransport actually supports providing a
| certificate hash at connection time on the client and
| ignoring the normal Web PKI altogether. But I doubt that will
| stop anyone who instinctively hates it because it came out of
| Google from continuing to do so.
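|
| For example, a minimal sketch in the browser (the relay URL and
| hash bytes here are placeholders for your own endpoint and your
| self-signed certificate's SHA-256 digest; the spec also requires
| such certificates to be short-lived, roughly two weeks):
|
|     // Connect with WebTransport, pinning a specific certificate
|     // instead of relying on the Web PKI.
|     const certHash = new Uint8Array(32); // fill with the real digest
|     const transport = new WebTransport("https://relay.example:4443/", {
|       serverCertificateHashes: [
|         { algorithm: "sha-256", value: certHash },
|       ],
|     });
|     await transport.ready;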
| conradludgate wrote:
| Ah, I see. Well, I use self-signed TLS certificates just
| fine with QUIC. I also wouldn't log into a livestream
| without using a CA-trusted TLS connection (unless you're ok
| on a free anonymous account with adverts, I guess).
| thedaly wrote:
| > the protocol that is infeasible to use to connect to people
| you don't know without getting continued approval from a
| corporate CA.
|
| What do you mean? Are you saying you can't serve content via
| QUIC with a certificate issued by, let's say, Let's Encrypt's
| certbot?
| giantrobot wrote:
| Self-signed certs in general are a pain in the ass with
| Chrome. It's only a matter of time before they're disallowed
| entirely. With QUIC requiring TLS self-signed certificates
| will only get more difficult to use if not impossible.
| peaBerberian wrote:
| I liked this article very much (and I do share many of the
| points it's making).
|
| However there are some claims in that article that bothered me:
|
| > if you're watching 1080p video and your network takes a dump,
| well you still need to download seconds of unsustainable 1080p
| video before you can switch down to a reasonable 360p.
|
| A client can theoretically detect a bandwidth drop (or even guess
| one) while loading a segment, abort its request (which may close
| the TCP socket, an event that may then be processed server-side,
| or not), and directly switch to a 360p segment instead (or an
| even lower quality). In any case, you don't "need to" wait for a
| request to finish before starting another.
|
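| As a sketch of that client-side behaviour (standard fetch API;
| the URLs and threshold here are made up):
|
|     // Abort an in-flight 1080p segment and fall back to 360p
|     // as soon as the measured throughput looks unsustainable.
|     const controller = new AbortController();
|     const hd = fetch("https://cdn.example/seg_1080p_0042.m4s", {
|       signal: controller.signal,
|     });
|     hd.catch(() => {}); // an AbortError here is expected after abort()
|
|     function onBandwidthEstimate(kbps: number) {
|       if (kbps < 1500) {            // illustrative threshold
|         controller.abort();         // give up on the 1080p segment
|         fetch("https://cdn.example/seg_360p_0042.m4s"); // switch down
|       }
|     }
|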
| > For live media, you want to prioritize new media over old media
| in order to skip old content
|
| From this, I'm under the impression that this article only
| represents the point of view of applications where latency is the
| most important aspect by far, like Twitch I suppose, but I found
| that this is not a generality for companies relying on live
| media.
|
| Though I guess the tl;dr properly recognizes that, I still
| want to make my point, as I found that sentence not precise
| enough.
|
| On my side and the majority of cases I've professionally seen,
| latency may be an important aspect for some specific contents
| (mainly sports - just for the "neighbor shouting before you"
| effect - and some very specific events), but in the great
| majority of cases there were much more important features for
| live contents: timeshifting (e.g. being able to seek back to the
| beginning of the program or the one before it, even if you
| "zapped" to it after), ad-switching (basically in-stream targeted
| ads), different encryption keys depending on the quality, type of
| media AND on the program in question, different tracks, codecs
| and qualities also depending on the program, and surely many
| other things I'm forgetting... All of those are in my case much
| more important aspects of live contents than seeing broadcasted
| content a few seconds sooner.
|
| Not to say that a new way of broadcasting live contents with much
| less latency wouldn't be appreciated there, but to me, that part
| of the article complained about DASH/HLS by just considering the
| ""simpler"" (I mean in terms of features, not in terms of
| complexity) live streaming cases where they are used.
|
| > You also want to prioritize audio over video
|
| Likewise, in the case I encountered, we do largely prefer re-
| buffering over not having video for even less than a second, even
| for contents where latency is important (e.g. football games),
| but I understand that twitch may not have the same need and would
| prefer a more direct interaction (like other more socially-
| oriented media apps).
|
| > LL-DASH can be configured down to +0ms added latency,
| delivering frame-by-frame with chunked-transfer. However it
| absolutely wrecks client-side ABR algorithms.
|
| For live contents where low-latency is important, I do agree that
| it's the main pain point I've seen.
|
| But perhaps another solution here may be to update DASH/HLS or
| exploit some of its features in some ways to reduce that issue.
| As you wrote about giving more control to the server, both
| standards do not seem totally against making the server-side more
| in-control in some specific cases, especially lately with
| features like content-steering.
|
| ---
|
| Though this is just me being grumpy over unimportant bits, we're
| on HN after all! In reality it does seem very interesting and I
| thank you for sharing, I'll probably dive a little more into it,
| be humbled, and then be grumpy about something else I think I
| know :p
| kixelated wrote:
| I'm glad you liked it.
|
| > A client can theoretically detect a bandwidth fall (or even
| guess it) while loading a segment, abort its request (which may
| close the TCP socket, event that then may be processed server-
| side, or not), and directly switch to a 360p segment instead
| (or even a lower quality). In any case, you don't "need to"
| wait for a request to finish before starting another.
|
| HESP works like that as far as I understand. The problem is
| that dialing a new TCP/TLS connection is expensive and has an
| initial congestion control window (slow-start). You would need
| to have a second connection warmed and ready to go, which is
| something you can do in the browser as HTTP abstracts away
| connections.
|
| HTTP/3 gives you the ability to cancel requests without this
| penalty though, so you could utilize it if you can detect the
| HTTP version. Canceling HTTP/1 requests, especially during
| congestion, will never work though.
|
| Oh and predicting congestion is virtually impossible,
| ESPECIALLY on the receiver and in application space. The server
| also has incentive to keep the TCP socket full to maximize
| throughput and minimize context switching.
|
| > From this, I'm under the impression that this article only
| represents the point of view of applications where latency is
| the most important aspect by far, like twitch I suppose, but I
| found that this is not a generality for companies relying on
| live media.
|
| Yeah, I probably should have gone into more detail, but MoQ also
| uses a configurable buffer size. Basically media is delivered
| based on importance, and if a frame is not delivered in X
| seconds then the player skips over it. You can make X quite
| large or quite small depending on your preferences, without
| altering the server behavior.
|
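| Something like this on the player side (an illustrative sketch
| only, not the actual MoQ player code):
|
|     // Frames carry a send timestamp; anything older than the
|     // configured buffer is skipped rather than played late.
|     const BUFFER_MS = 500; // large for TV-style, small for chat
|
|     declare function render(data: Uint8Array): void; // decoder glue
|
|     function maybeRender(frame: { sentAt: number; data: Uint8Array }) {
|       const lateness = Date.now() - frame.sentAt;
|       if (lateness > BUFFER_MS) return; // skip the stale frame
|       render(frame.data);
|     }
|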
| > But perhaps another solution here may be to update DASH/HLS
| or exploit some of its features in some ways to reduce that
| issue. As you wrote about giving more control to the server,
| both standards do not seem totally against making the server-
| side more in-control in some specific cases, especially lately
| with features like content-steering.
|
| A server side bandwidth estimate absolutely helps. My
| implementation at Twitch went a step further and used server-
| side ABR to great effect.
|
| Ultimately, the sender sets the maximum number of bytes allowed
| in flight (e.g. via BBR). By also making the receiver
| independently determine that limit, you can only end up with a
| sub-optimal, split-brain decision. The tricky part is finding
| the right balance between smart client and smart server.
| inthewings wrote:
| Anything that deprecates HLS, which is a piece of randomly
| specified and implemented crap, is welcome.
| HumblyTossed wrote:
| > It's a bold claim I know. But I struggle to think of a single
| reason why you would use TCP over QUIC going forward.
|
| I know that I'm talking about apples vs oranges, but how long
| have we been waiting for IPv6 to take over from IPv4? I don't see
| QUIC taking over from TCP any time soon.
| themerone wrote:
| Corporate firewalls are going to allow QUIC sometime between
| never and the heat death of the universe.
| kixelated wrote:
| QUIC is a major functionality improvement, while IPv6 is a
| capacity improvement. You can do some extremely cool things
| with QUIC that definitely deserve a post of their own.
| JimDabell wrote:
| > I don't see QUIC taking over from TCP any time soon.
|
| HTTP/3 uses QUIC, so there's already a substantial amount of
| production traffic going over it.
|
| https://blog.cloudflare.com/http3-usage-one-year-on/
| cryptonector wrote:
| Switching over all uses of TCP to QUIC will go even slower than
| the IPv6 migration. TCP works for most things and is trivial to
| use, so it will continue to see use.
|
| But unlike IPv6, there are few barriers to adoption of QUIC where
| it really pays, so I expect that QUIC will be adopted by a lot
| of applications pretty quickly.
| favorited wrote:
| > allowed the Apple-controlled HLS player to reduce the bitrate
| rather than pummel a poor megacorp's cellular network
|
| lol the telcos will just throttle you, they don't care that your
| twitch stream constantly pauses to buffer. lower bitrate streams
| make the content playable, period.
| lxgr wrote:
| > If your app delivers video over cellular networks, and the
| video exceeds either 10 minutes duration or 5 MB of data in a
| five minute period, you are required to use HTTP Live Streaming.
|
| Wait, what? Is MPEG-DASH really prohibited by Apple for native
| applications? Or do they just mean "HLS or equivalent; don't do
| progressive downloads"?
|
| What about applications that can't pick their delivery format,
| like web cams? Does this only apply to VOD-like use cases?
| banana_giraffe wrote:
| They mean HLS
|
| When/if they notice you're using some other solution like
| MPEG-DASH instead of HLS, they will raise a stink and force you
| to fix it.
|
| I know this because I had the privilege of changing a video
| playback stack to be 100% HLS over a weekend for a fairly big-
| name app, when it became a Pri 0 issue thanks to Apple noticing
| the app hadn't been using HLS 100% of the time during a minor
| update.
|
| There's also verbiage requiring a laughably low-bitrate stream
| as a fallback. Our users would file bugs when we fell back to
| that stream instead of audio only, since we had a rich audio-
| only mode they preferred, given that the video at whatever
| bitrate (192kbps?) for our content was nothing but a colorful
| mess.
|
| This was years ago; the rules are still in the App Store, but I
| have no idea how well they're enforced anymore.
| remram wrote:
| Are the links on this page rendered exactly the same as other
| bold text? Why would you do this?
| jolmg wrote:
| The links don't appear bold to me.
___________________________________________________________________
(page generated 2023-11-14 23:00 UTC)