[HN Gopher] BBC Subtitle Guidelines
___________________________________________________________________
BBC Subtitle Guidelines
Author : rogual
Score : 105 points
Date : 2022-12-23 12:28 UTC (10 hours ago)
(HTM) web link (www.bbc.co.uk)
(TXT) w3m dump (www.bbc.co.uk)
| sircastor wrote:
| I recently went through Netflix's subtitle guidelines when I was
| trying to figure out how to alter a subtitle file that I'd found
| that was pretty barebones. One of the interesting things about it
| was reading a rule, and then understanding the justification
| associated with the decision. Some of it is relatively arbitrary,
| but after you give it a bit of thought it makes sense.
| wpietri wrote:
| Ooh, nice. I especially appreciate seeing jokes mentioned in
| there as a specific case. I often watch with subtitles on and
| it's frustrating when a joke is spoiled by bad subtitle timing.
| TechBro8615 wrote:
| Something I've wondered... when politicians give speeches they
| often have some hand gesturing from a sign language interpreter
| standing next to them. But I've never understood why this is
| better than subtitles. If you're deaf then wouldn't you rather
| read text than follow sign language?
| jameshart wrote:
| Think about how you learn to read - you 'sound out' words,
| turning letters into sounds to match them to a pronunciation to
| figure out what word is represented.
|
| Now imagine trying to learn how to do that when you have _never
| heard any words spoken out loud_.
|
| People who are deaf from birth often have a lot of difficulty
| with spelling and reading, because both skills are closely
| connected to saying and hearing words. Connecting written words
| to lip movements (which is kind of the closest thing to
| 'phonics' for a deaf person) is lossy - the letter-to-lip
| connections are fuzzier than letter-to-sound, and lip-to-letter
| is very ambiguous.
|
| Subtitles are great for people who are confident and
| comfortable readers - say, people who have become deaf due to
| age - but for some deaf people following subtitles can be like
| asking someone who's dyslexic to quickly read a sentence out
| loud.
| foldr wrote:
| The more salient point is that English is going to be a
| second language for people who grew up deaf (with a
| completely unrelated sign language probably being the first
| language).
| SanjayMehta wrote:
| I would guess that the sign language interpreter is translating
| in near real time in live speeches.
| kvm000 wrote:
| Live closed captions (text only) are very common and standard.
| Usually those are done by an external company listening in to
| an audio feed, and sending the data back. It used to be done
| with regular POTS lines and telnet, but now it's obviously
| more common to use public internet based services like EEG
| iCap[1]
|
| I don't know too much about it but I had read recently that
| ASL sign language can be thought of as a different language,
| rather than a direct equivalent to text subtitles[2].
|
| [1] https://eegent.com/icap [2] https://imanyco.com/closed-
| captions-and-sign-language-not-a-...
| bloak wrote:
| > I had read recently that ASL sign language can be thought
| of as a different language
|
| Yes, it is a different language. I've heard that ASL is
| rather similar to French Sign Language and quite different
| from BSL (British Sign Language). If someone were to
| translate something from English into ASL, and someone else
| were to translate the ASL back into English, I'd expect the
| result to be as different from the original as if they'd
| gone via some other language, like Italian, for example.
| RajT88 wrote:
| There's a variety of running jokes that Italian is half
| sign language anyways.
|
| (Apparently derived from the fact that in Italy, there's
| quite a lot more non-verbal communication with hand
| gestures than other parts of the world)
| sigwinch28 wrote:
| There's more to speaking than just the words. Sign language can
| convey inflection and emotion in ways that closed captions
| cannot.
|
| Watch someone signing: they use their face and body to convey
| emotions like anger, confusion, hesitation, love, joy, and the
| rest of the human range.
| crazygringo wrote:
| But if you're watching the politician speak, you can already
| see all of the emotion in their facial expression and body
| movement.
|
| And that's the "original" emotion, it's not filtered through
| another human being. When you watch a movie without audio and
| with subtitles (like in a bar or on a bus), the emotions of
| the speakers are already awfully clear from the visuals.
| [deleted]
| dragonwriter wrote:
| > But if you're watching the politician speak, you can
| already see all of the emotion in their facial expression
| and body movement.
|
| No, you can't; how much emotion is shown via those things
| vs. tone, volume, and other auditory cues varies from
| speaker to speaker and speech to speech; sometimes,
| speakers demonstrate one emotion through gestures but
| indicate that it is insincere/being mocked/etc. via vocal
| cues, even.
|
| Not to mention the degree to which simultaneously tracking
| face and subtitles makes you likely to miss parts of either
| or both.
| rwmj wrote:
| No I think you're missing the grandparent's point.
| Inflection in spoken language "translates" to facial
| expression in sign language.
| crazygringo wrote:
| No I'm getting that completely. But inflection in spoken
| language is redundant to a large degree with facial
| expression. If you're watching the original speaker,
| you're already getting that.
|
| For example, if we ask a question, it's not just that our
| voice goes up at the end. Our eyes open slightly wider too,
| and our eyebrows and sometimes our cheeks raise.
| mgkimsal wrote:
| Not everything that's spoken in a televised event has a
| single accompanying speaker to watch for cues the entire
| time.
|
| Separately, relatively dry sarcasm can't be visually
| picked up directly, but a signer may be able to suggest
| some of that with body/hand language.
| tshaddox wrote:
| > But inflection in spoken language is redundant to a
| large degree with facial expression.
|
| I don't know how you'd possibly attempt to objectively
| quantify that degree, but my guess is that you're
| understating it. The entire deaf community is probably
| not mistaken about which means of communication are the
| most effective for them.
| bonaldi wrote:
| And you're seeing this nuance while simultaneously
| reading subtitles? From a speaker at a distance? GP is
| correct, sign is a fully expressive language, far richer
| than subtitles.
| thrdbndndn wrote:
| Not who you replied to, but weren't we talking about
| televised speech? (We're talking about adding subtitles.)
| So the "at a distance" point really isn't an issue. And yes,
| I can watch the subtitles and the speaker's face at the same
| time and get their expression.
| crazygringo wrote:
| Yes of course. Haven't you ever watched a movie without
| audio and with subtitles turned on? It's quite easy to
| get the nuance. People's faces are incredibly expressive.
|
| And nobody's at a distance, the cameras are always on
| either a medium or close-up shot when filming politicians
| speaking.
| bonaldi wrote:
| This is a deeply strange line of thought to follow. Do
| you think broadcasters would go to the trouble and
| expense if there was no value for people in it?
|
| If subtitles were equal value or even "good enough",
| they'd be used exclusively. That they aren't should tell
| you something, and you repeatedly protesting that you are
| unable to comprehend the value doesn't mean it isn't
| there.
| sigwinch28 wrote:
| > But if you're watching the politician speak, you can
| already see all of the emotion in their facial expression
| and body movement.
|
| Yes, but a deaf person has trouble hearing how the speaker
| is speaking. They're missing out on the emotion in the
| voice.
|
| Consider all of the emotion that can be conveyed in an
| audiobook, or on a phone call, or through music. Humans
| convey a lot of emotion in sound that is not represented
| visually. Part of sign language is conveying the emotion
| usually present in speech.
| rwmj wrote:
| Since I live with someone whose first language isn't English I
| basically watch everything with subtitles. And live subtitling
| - while it's a thing - isn't that good. Try turning on
| subtitles for something like a news programme or live broadcast
| and it'll usually be quite delayed, with lots of misspellings
| and even outright wrong text. (This is true even for premier
| public broadcasters in the UK such as the BBC; I don't know
| whether this is solved better in other countries.)
|
| Anyway my theory is that sign language interpreters may be much
| better at this because sign language uses the same areas of the
| brain as speaking[1], so they're able to listen and sign much
| more intuitively than they could listen and type. Think if you
| were able to listen
| and speak at the same time without your "speech" drowning out
| what you are listening to.
|
| [1] https://maxplanckneuroscience.org/language-is-more-than-
| spea...
| ClassyJacket wrote:
| Nobody can type fast enough on QWERTY to keep up with human
| speech, so the comparison is versus chorded typing on a
| stenography keyboard, or someone repeating the dialogue into
| voice dictation software which is obviously prone to errors.
| rwmj wrote:
| As someone who watches TV news with subtitles on, whatever
| entry technique they're using, the result is not very good
| [in the UK, can't speak for other countries].
| crazygringo wrote:
| Sure, but the closed captioning is still extremely good. The
| typists use chorded keyboards for speed and yes they
| occasionally make mistakes but everything is generally quite
| clear and accurate.
|
| On the other hand, signing involves actual _translation_ ,
| not just transcription, which is much more likely to drop
| meaning or introduce confusion. Translation is already hard
| enough, and live translation is a whole other level of
| difficulty.
| IanCal wrote:
| Transcription can lose meaning, as text is not a 1:1
| replacement for speech. You lose cadence, stress and
| emotion.
| M2Ys4U wrote:
| >Sure, but the closed captioning is still extremely good.
| The typists use chorded keyboards for speed and yes they
| occasionally make mistakes but everything is generally
| quite clear and accurate.
|
| Live subtitling (on the BBC at least) is mainly done using
| re-speaking and voice recognition, rather than typing.
| tshaddox wrote:
| But it's being done by professional translators, right?
| That might be more expensive than having a human do live
| subtitle transcription, but I bet it's not a huge
| difference especially relative to the production costs of
| any broadcast to a large audience, and based on the live
| subtitles I've seen (mostly on national sports and news
| broadcasts, so I don't know if that's automated or done by
| a human) it's hard to imagine the quality achieved by
| professional signers wouldn't be significantly better.
| blowski wrote:
| I'm not deaf, don't know any sign language, am not close
| friends with anyone who does.
|
| However, I understand that it's much easier to be
| expressive in sign language. Non-verbal language used by
| the speaker - sarcasm, tone, inflection - either translates
| badly or gets lost entirely when transcribed into
| subtitles. A talented sign-language translator is able to
| carry this over much better.
| dfee wrote:
| Yes, but.
|
| Now, you're getting the inflection of the interpreter and
| not the speaker (derivative). Certainly you're not
| hearing inflection on the original orator, either, so
| maybe it's a mixed bag.
| enkid wrote:
| I don't know much about sign language. Does it use a
| different grammar from English?
| IanCal wrote:
| Yes, it's best to consider it an entirely different
| language rather than replacements of words.
| matthewbarras wrote:
| We have subtitles on all the time for my little boy and can
| attest that the BBC is very poor - even on iPlayer.
|
| Interesting side note: if ever subtitles are turned off, or
| we are watching TV elsewhere, me and my wife can't 'hear'
| well. Even if the volume is up. Like we've untrained our
| ability...
| nmstoker wrote:
| Totally agree, the "live" subtitling on BBC is remarkably
| bad.
|
| It's way worse than even using a cheap computer with open
| source solutions - it's strange that no one at the BBC decides
| to resolve this, as it would be easy and could give
| the public a far better result. Even if you took a hit on
| the most obscure words, off the shelf would outperform the
| current process by a country mile.
| [deleted]
| retrac wrote:
| Sign languages are not coded speech. They are languages with
| their own grammar and vocabulary. For example, American SL is
| descended from Old French Sign Language and is partially
| understandable by French SL speakers today, while British SL is
| completely different, not in the same language family. It is
| even possible to write sign language, much as with spoken
| language: the most basic components, akin to phonemes in spoken
| language, form a closed set, and assigning a symbol to each
| allows lossless transcription. This is mostly used by linguists,
| but there are some books written in ASL.
|
| Deaf people who speak sign language natively approach English
| as a second language. And it is hard to learn a spoken language
| when deaf. English literacy rates among ASL native speakers are
| rather low.
| amelius wrote:
| Still, I suppose that many people want the original
| phrasing, not a translation where subtleties might get lost.
| lazyeye wrote:
| This has to be a generational thing. There is no way a deaf
| person growing up now is not going to be using the internet.
| adammarples wrote:
| A speech and language therapist tells me that spelling and
| reading are hard for deaf children because we match phonics
| to text but they don't have access to phonics, so they have
| much lower literacy levels without specialist help.
| retrac wrote:
| Partial literacy is all you need for YouTube or video
| calls. Plenty of hearing people can read well enough for
| that too, or to find what they need at the store, but can't
| read well enough to e.g. summarize the main points of a
| newspaper story. I've seen estimates that something like 20
| - 40% of Americans are functionally illiterate in that way.
| For the Deaf, it is even higher.
| gpvos wrote:
| Sign language is the native language for most deaf people,
| while subtitling is derived from spoken language which is
| usually their second language. Also you can express emotions
| better using it, similar to how it's easier to convey them
| using speech than using text.
| MrJohz wrote:
| The other part of the jigsaw that a lot of people don't
| realise at first is that sign languages are distinct
| languages from spoken languages. Or to put it another way:
| ASL is to American English as Portuguese is to Korean.
| jbms wrote:
| Sign language has a different grammar. At least in British Sign
| Language. Simplistically, you put the object of the sentence
| first so it's clearer what's being talked about.
|
| For someone who is profoundly deaf from birth and who can't
| lipread, the way we speak and write is a massive struggle.
| Cochlear implants before a year old are much more common now,
| while the brain is still more malleable, so there are maybe
| fewer and fewer people who are totally profoundly deaf, and you
| may not realise what it's like for them if you never come
| across them.
| TechBro8615 wrote:
| Interesting perspective. I've never considered how much of
| reading and writing is dependent on first listening and
| speaking. I guess it makes sense, since the first steps to
| reading are "sounding out the words."
| AstixAndBelix wrote:
| It depends: if the speech is IRL-first and video-second, then a
| sign language interpreter is better and cheaper than installing
| some sort of concoction to display live subtitles (which have
| to be typed by a paid stenographer).
|
| But in any case, deaf people still have the need to practice
| reading their language, so removing it from everywhere except
| IRL conversations might be detrimental to them
| aaron695 wrote:
| [dead]
| rozab wrote:
| As I understand it, BSL is a fully different language to spoken
| English, with different grammar and syntax. For someone whose
| 'first' language is BSL, reading subtitles is more like reading
| a second language, where meaning is not conveyed in the same way.
|
| https://www.british-sign.co.uk/what-is-british-sign-language...
| camyule wrote:
| Seems as relevant a place as any to highlight the lack of support
| for subtitles in BBC iPlayer on AppleTV:
| https://www.bbc.co.uk/iplayer/help/questions/accessibility/a....
|
| I work in a related industry so fully understand how these
| situations come to pass, but as a user it's frustrating to say
| the least.
| zinekeller wrote:
| Is it only Apple TV (and not on iPhone for example)? If it's
| Apple-wide I can imagine it, but I'm curious what the possible
| reason is that the BBC can't support having a separate subtitle
| renderer.
| fredoralive wrote:
| I just checked: iPad has it (and uses colour coding, as that
| has come up elsewhere in these threads). On Apple TV a
| subtitle menu appears in the time bar, but it doesn't seem to
| do anything in either the "Automatic" or "CC" options it
| gives?
|
| CC is a bit of an Americanism; perhaps it is running more
| directly through some Apple playback code that is more
| particular about subtitle formats?
|
| I suspect the ultimate cause is probably low usage statistics
| not making it a priority. The icon / BBC logo didn't update
| at the same time as the iOS version, so I suspect it's a
| separate codebase for some reason?
| payamb wrote:
| iPlayer on TV (across all platforms) is a generic web
| application, topped with a custom wrapper app (think
| webview) for each platform, responsible for hooking the
| platform's native APIs up to Web/JS APIs.
|
| I'm guessing there are some complications hooking Apple
| TV's native subtitle APIs to the relevant web APIs, and low
| usage statistics don't help with prioritising a fix.
| Although the rumour is that they are working on a
| completely new Apple TV app.
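|
| To picture the split being described - a shared web player plus
| a thin per-platform wrapper - here's a minimal sketch. Every
| name in it is hypothetical (this is not the actual iPlayer
| bridge); it only illustrates the "hook native APIs to Web/JS
| APIs" idea:
|
|   // bridge.ts - hypothetical shim a platform wrapper could
|   // inject so the shared web player can reach native features.
|   interface NativeSubtitleBridge {
|     setSubtitlesEnabled(enabled: boolean): void;
|     listSubtitleTracks(): Promise<{ id: string; lang: string }[]>;
|     selectSubtitleTrack(id: string): Promise<void>;
|   }
|
|   declare global {
|     // Set by the wrapper app before the web application loads.
|     interface Window { nativeSubtitles?: NativeSubtitleBridge; }
|   }
|
|   // Placeholder for the web player's own (in-page) renderer.
|   function useInPageSubtitleRenderer(lang: string): void {
|     console.log(`rendering subtitles in-page for ${lang}`);
|   }
|
|   export async function enableSubtitles(lang: string) {
|     const bridge = window.nativeSubtitles;
|     if (!bridge) {
|       // No native hook on this platform: fall back to the web
|       // player's own subtitle rendering.
|       useInPageSubtitleRenderer(lang);
|       return;
|     }
|     const tracks = await bridge.listSubtitleTracks();
|     const track = tracks.find((t) => t.lang === lang);
|     if (track) {
|       await bridge.selectSubtitleTrack(track.id);
|       bridge.setSubtitlesEnabled(true);
|     }
|   }
|
| In a model like that, a platform where the native hook never got
| wired up would be left with whatever fallback the web player
| has, which is consistent with one device lagging the others.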
| dbbk wrote:
| For most platforms this is true, but not Apple TV. I
| don't think they even allow it. iPlayer is definitely
| native code.
| fredoralive wrote:
| I think Apple must allow it, either that or Google have
| tried really hard to capture in native code the "shitty
| non-native web app" feel with the YouTube app. But the
| iPlayer app does feel fairly native.
| fiestajetsam wrote:
| Apple TV and AirPlay don't support injecting "out-of-band"
| subtitles into playback or the use of a separate subtitle
| renderer; the subtitles have to be linked in the HLS
| manifest. For a lot of OTT platforms, subtitles are processed
| and handled separately from AV media, so this is difficult to
| fix. Some platforms work around this by doing manifest
| manipulation either server- or device-side; both have
| pitfalls.
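|
| For reference, "linked in the HLS manifest" means the subtitle
| playlist has to be declared alongside the video renditions in
| the master playlist, roughly like this (an illustrative excerpt
| only, with made-up URIs):
|
|   #EXTM3U
|   #EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="en",URI="en.m3u8"
|   #EXT-X-STREAM-INF:BANDWIDTH=2500000,SUBTITLES="subs"
|   video_1080p.m3u8
|
| If subtitles are authored and stored by a separate system after
| the video manifests are packaged, that #EXT-X-MEDIA line is the
| part that has to be stitched in afterwards - which is the
| manifest manipulation mentioned above.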
| rwmj wrote:
| Kind of related (to broadcast TV, not online media) -
| broadcasters are required to subtitle British TV in most cases.
| Tom Scott has a video about it:
| https://www.youtube.com/watch?v=m__OZ3ZsO4Y
| ErikVandeWater wrote:
| Couldn't find anything about the use of (sic) or [sic]. Maybe
| they want leeway on that so as not to be offensive?
| M2Ys4U wrote:
| The default position is that subtitles are a verbatim
| transcription of what was actually _said_ , so there's no need
| for "[sic]".
| operator-name wrote:
| Only recently the UK government style guide was on the front
| page:
|
| A simple guide on words to avoid in government -
| https://news.ycombinator.com/item?id=34104530
| zinekeller wrote:
| One of the big differences between UK subtitling and US
| captioning is the use of colo(u)r or the absence of it (https://
| www.bbc.co.uk/accessibility/forproducts/guides/subti...), which I
| believe boils down to technological differences between
| television systems in the '80s. While the UK-developed teletext
| system is also available in NTSC (in fact CBC have deployed it),
| the US opted instead to develop a separate captioning system
| that only supports basic text and positioning. I'M NOT
| SURE IF THIS IS BECAUSE OF GOOD OLD PROTECTIONISM,
| OR BECAUSE PUTTING A TELETEXT COMPUTER IN EVERY
| STATION WAS IMPRACTICAL IN THE US WHERE, AT THE
| TIME AT LEAST, THERE WERE A DIVERSE RANGE OF
| BROADCASTERS WHICH MIGHT NOT AFFORD THE EQUIPMENT
| NEEDED, SINCE TELETEXT IS AN "ACTIVE" SYSTEM
| WHEREAS THE PBS-DEVELOPED SYSTEM CAN BE USED ON
| EXISTING, CAPTION-UNAWARE SYSTEMS
|
| ... while the UK only had two broadcasters at the time, namely
| the BBC and the IBA (Independent Broadcasting Authority*), so it
| was easier to develop and actually broadcast teletext.
|
| * Despite the name suggesting that it only monitored broadcasts,
| the IBA was actually the broadcaster, with the authority to
| install teletext equipment. The then-ITV companies were only
| program suppliers to the IBA, not broadcasters in their own
| right.
| artandtechnic wrote:
| The Closed Captioning System for the Deaf - which became known
| as "608" - was always capable of displaying captions in color.*
| However, this faculty was largely underutilized, due to a
| variety of historical / technical reasons.**
|
| * Indeed, the first test broadcasts from 1979 included color.
|
| ** This is due to the design of the original 1980 "decoder,"
| which only 'worked well' for color when installed inside an
| NTSC television. For the external adapters, grayscale instead
| of color was displayed. The only way to view color captions in
| color back in 1980/1981 was with a TeleCaption TV - and there
| was only one model ever made.
| kvm000 wrote:
| In my experience with a few big broadcasters like Paramount
| (previously Viacom) and Discovery, for broadcast in Europe/UK
| the signal they generate often has a mix of Teletext and/or DVB
| inserted based on the channel since those signals are
| distributed to a LOT of partners like local satellite and cable
| companies who can decide which parts of the signal to map into
| their system.
|
| In that context, "teletext" just works the same as the North
| American 608 captions and has nothing to do with the older
| full-screen data stuff. There are no restrictions around
| authority for "teletext equipment" - for those channels they
| actually use systems fully based in AWS, with the playout engine
| running on an EC2 instance, so it's all software-only, generating
| the single MPEG transport stream with video/graphics + audio +
| captions + subtitles.
|
| It's also common to send a DVB subtitle multiplex that has a
| number of languages (up to 20+) embedded in the same signal.
| fredoralive wrote:
| I was under the impression the "teletext" style ones still
| used teletext encoding, not EIA-608? It's basically just a
| data steam containing a single page of teletext rather than a
| whole magazine? AFAIK Sky Digital uses this approach (at
| least for SD), with a more modern looking decoder, and it
| certainly has colour support at least[1].
|
| [1] Although DVB can contain a full teletext stream that can
| be reinserted into the SD analogue VBI by the receiver. Sky
| boxes supported that so on some channels you could just go to
| ye olde page 888, although I haven't a clue if any channels
| still do that (I don't have an old SD Sky setup around to
| look).
| kvm000 wrote:
| That's correct - 608/708 is North America only, and
| UK/Europe could use OP42/47 teletext for simple captions
| (not whole magazine), and/or DVB (mostly language
| translation).
| zinekeller wrote:
| To be fair, I'm talking about the old PAL-based teletext and
| not about DVB teletext. Also, computers are now everywhere so
| the 888 page can be inserted with minimal effort, unlike back
| then, when it required coordination between different teams.
| kvm000 wrote:
| I was talking about the older PAL-based teletext as well -
| but you're right, I understand it was different in terms of
| equipment originally with the full-screen data feeds.
|
| Separate from DVB, the old PAL OP42/47 teletext payload
| gets inserted into the MPEG-TS using SMPTE 2038[1]; then,
| when decoded, it goes into the regular ancillary data of
| the uncompressed stream per ITU-R BT.1120-7[2].
|
| The broadcast industry really loves standards[3].
|
| [1] https://ieeexplore.ieee.org/document/7290549
|
| [2] https://www.freetv.com.au/wp-
| content/uploads/2019/08/OP-42-C...
|
| [3] https://xkcd.com/927
| crazygringo wrote:
| That's fascinating, to color-code by character.
|
| For (foreign-language) subtitles that seems distracting and
| unnecessary, since even if you don't see the character
| speaking, you can recognize their voice.
|
| But for closed-captioning (for the hard of hearing), it seems
| like it could add clarity. Yet in the past I've watched movies
| with CC without sound (like on a long-distance bus) and don't
| remember ever having a problem understanding who a line
| belonged to.
|
| Does anybody have actual experience with color-coded captions
| and whether they're more of a help or more of a distraction?
| mynameisvlad wrote:
| Why would it be _distracting_ to know whose line it is
| without color coding? Why would color be the thing that
| distracts you; wouldn't the actual captions be far more
| distracting?
|
| As to why: it's another datapoint to use to reconstruct what
| fully able people can do quickly. Sometimes captions aren't
| well timed. Sometimes a lot of things happen very quickly and
| it's hard to keep track. What if the camera isn't on anyone's
| face? You can probably figure it out by context, but why
| expend the mental energy when it can just be colored?
|
| Just because you never had this issue (and I'm sure you did,
| it's not like we remember the most mundane of details like
| this years later) doesn't mean it's not an issue.
| kvm000 wrote:
| The older "608"[1] system in North America was much simpler. The
| current "708"[2] standard does support specifying colours and
| fonts, but in my experience in the industry nobody uses those
| functions at all; they just use the 708 carriage to embed the
| older 608 payload data within the newer 708 data structure.
|
| In UK/Europe the older/simpler format would be OP-42/OP-47
| Teletext[3], which can be used for captions instead of the full-
| screen data pages, or DVB Subtitles[4], which gets more into
| "subtitles" in the sense of language translation, rather than
| only the "closed caption" use case where it matches the content
| language. DVB subtitles can be sent as pre-rendered bitmaps or
| as data for client-side rendering.
|
| [1] https://en.wikipedia.org/wiki/EIA-608 [2]
| https://en.wikipedia.org/wiki/CTA-708 [3]
| https://en.wikipedia.org/wiki/Teletext [4]
| https://en.wikipedia.org/wiki/Subtitles
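|
| To make "embed the older 608 payload within the newer 708 data
| structure" concrete, here's a toy sketch (my own illustration,
| not broadcast code), assuming the cc_data carriage layout from
| CEA-708 / ATSC A/53: each 3-byte construct is a flags byte
| followed by two data bytes, and the low two bits of the flags
| (cc_type) say whether the pair is 608 field 1/2 data or DTVCC
| (708) packet bytes:
|
|   // ccSplit.ts - split cc_data constructs into their legacy
|   // 608 and DTVCC (708) buckets.
|   type CcSplit = {
|     field1_608: number[];
|     field2_608: number[];
|     dtvcc_708: number[];
|   };
|
|   export function splitCcData(constructs: Uint8Array): CcSplit {
|     const out: CcSplit = {
|       field1_608: [],
|       field2_608: [],
|       dtvcc_708: [],
|     };
|     for (let i = 0; i + 2 < constructs.length; i += 3) {
|       const flags = constructs[i];
|       const ccValid = (flags >> 2) & 0x01;
|       const ccType = flags & 0x03;
|       if (!ccValid) continue; // padding / unused construct
|       const pair = [constructs[i + 1], constructs[i + 2]];
|       if (ccType === 0) out.field1_608.push(...pair); // 608 field 1
|       else if (ccType === 1) out.field2_608.push(...pair); // field 2
|       else out.dtvcc_708.push(...pair); // DTVCC (708) packet bytes
|     }
|     return out;
|   }
|
| On feeds like the ones described above, you would typically find
| the 608 buckets doing all the real work, with little or no native
| 708 styling in the DTVCC bytes - which is the underutilisation
| being described.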
| zinekeller wrote:
| Yeah, I know of the "708" captioning system but it is
| surprisingly underutilized* by broadcasters. I think that
| they don't see any use for e.g. color?
|
| * in terms of 708-only features, not on the pedantic "ATSC
| uses the 708 system"
| kvm000 wrote:
| There are very strong lobbying groups that push for
| accessibility in terms of captions (as well as the "DV"
| described video audio track) but my impression is that
| their focus is on the quantity of content that's covered,
| and the quality (spelling, time-alignment), and I guess
| they don't care as much about text styling.
|
| The requirements are quite high in Canada[1] and have been
| expanding in the US as well[2].
|
| The company I work for makes products for broadcast
| customers, around asset management, linear playout
| automation, and the playout servers that insert the
| captions (from files or live data sources) so working out
| how that all happens is part of every big project.
|
| [1] https://crtc.gc.ca/eng/info_sht/b321.htm [2]
| https://www.fcc.gov/consumers/guides/closed-captioning-
| telev...
| [deleted]
___________________________________________________________________
(page generated 2022-12-23 23:01 UTC)