Broadcast News
10/06/2015
Live Subtitle Quality – A Conundrum Worth Considering
Ofcom, the UK broadcast and communications regulator, has just published its third report into the quality of live subtitles accompanying TV programmes in the UK, writes Andrew Lambourne – Business Development Director for Screen Subtitling Systems.
Working with five main channels including the BBC and ITV, Ofcom has requested feedback on quality and is suggesting moves which could improve subtitle timing and accuracy. After the fourth report later this year, there is a possibility that the regulations will be revised to underpin improvements for viewers. The process is a response to continuing complaints from viewers that live subtitles are late, sometimes too fast to read, and more error-prone than they would like.
With subtitles often displayed on TVs in gyms, pubs and waiting rooms, those unfamiliar with the techniques involved frequently remark on the surprising and sometimes amusing errors that appear. Newspapers and websites regularly feature the "best of the worst", implying that those who produce live subtitles are careless, sloppy or incompetent. Given that this is an incredibly tough job, a manifest lack of awareness of the challenge and difficulty involved in creating an accurate real-time transcript of speech at 180+ words per minute with minimal delay explains, but does not necessarily excuse, such comments.
The workflow for producing subtitles for recorded programmes is little remarked on and probably now taken for granted: a script (if available) is used to generate the dialogue texts, and if not, the dialogue is transcribed. These texts are coloured to indicate changes of speaker, timed against the programme audio, checked (which mostly involves balancing the timing against the audio and visual content) and released as a file for inclusion with the media at transmission. If errors occur, they are likely to be due to human error during checking, or occasionally some technical failure in bringing together the right assets at the right time for transmission. Bearing in mind that subtitling is often seen as an unwelcome cost, and is therefore frequently commissioned on a shoestring budget compared to the rest of the content production process (generally a fraction of a percent of the overall budget), the quality achieved is testament to the professionalism of those involved.
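To make the shape of that offline workflow concrete, here is a minimal sketch in Python of the kind of timed, speaker-coloured cue data it produces, written out as a generic SRT-style file. It is an illustration only, not any broadcaster's actual toolchain; the field names and the colour convention are assumptions made for the example.

from dataclasses import dataclass

@dataclass
class Cue:
    index: int
    start: str            # "HH:MM:SS,mmm"
    end: str
    colour: str           # assumed speaker colour, e.g. "yellow" for a second speaker
    text: str

def write_srt(cues, path):
    # Write the prepared cues as a generic SRT-style subtitle file.
    with open(path, "w", encoding="utf-8") as f:
        for cue in cues:
            f.write(f"{cue.index}\n{cue.start} --> {cue.end}\n")
            f.write(f'<font color="{cue.colour}">{cue.text}</font>\n\n')

write_srt([Cue(1, "00:00:01,000", "00:00:03,500", "white", "Good evening."),
           Cue(2, "00:00:03,600", "00:00:06,200", "yellow", "Thanks for having me.")],
          "programme.srt")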
When it comes to live subtitling, the same levels of professionalism are brought to bear on a much tougher problem. There is no longer time to create and carefully check a transcript using conventional keyboard technology: the text must be produced, reviewed and broadcast by one person in a matter of seconds, literally as the programme proceeds. And in the case of a live chat show (one of the hardest genres) there may be more than one guest speaking at a time in a heated discussion. There is no opportunity to "rewind the tape" and listen again – as a subtitler you get one chance to hear and comprehend what was said.
At the core of the process is the tool used to transcribe text in real time. When live news subtitling first started in the UK this was "fast keyboard" technology, but due to a lack of sufficiently trained operators (training on stenographic keyboards can take two years or more) there has been a transition to the predominant use of speech recognition software. The software is generic out of the box and has to be optimised to the voice and speech patterns of each individual live subtitling re-speaker, as well as improved through the addition of many hundreds of topical words and phrases covering the names of people, places, organisations and things likely to occur in live broadcasts.
Similarly, the re-speaker is carefully trained to listen to, comprehend and re-speak the content of live television programmes in such a way as to render an accurate and meaningful transcript with the minimum of delay, no factual distortion, and utterly clear diction, so as to achieve the required 99%+ recognition accuracy from the speech recognition software. To produce a readable result, this involves speaking all the necessary punctuation as well as remembering and using special verbal cues which help to distinguish between homophones or capitalised forms. Training takes months, and as with any live broadcast operation, you generally only get one shot to get it right; stopping to correct a serious error is possible, but the likelihood is that concentration will be lost and part or all of the next utterance may be missed.
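As a rough illustration of what those verbal cues mean for the software, the hypothetical Python sketch below turns a stream of re-spoken words, with spoken punctuation and a "cap" cue, into readable subtitle text. The cue vocabulary here is invented for the example and is not the command set of any commercial recogniser.

PUNCTUATION = {"comma": ",", "full stop": ".", "question mark": "?"}

def render(respoken_tokens):
    # Turn a stream of re-spoken words plus verbal cues into readable text.
    words, capitalise_next = [], True    # capitalise the start of each sentence
    i = 0
    while i < len(respoken_tokens):
        token = respoken_tokens[i]
        pair = " ".join(respoken_tokens[i:i + 2])
        if pair in PUNCTUATION:              # two-word cue such as "full stop"
            words[-1] += PUNCTUATION[pair]
            capitalise_next = pair in ("full stop", "question mark")
            i += 2
        elif token in PUNCTUATION:           # one-word cue such as "comma"
            words[-1] += PUNCTUATION[token]
            i += 1
        elif token == "cap":                 # hypothetical cue: capitalise the next word
            capitalise_next = True
            i += 1
        else:
            words.append(token.capitalize() if capitalise_next else token)
            capitalise_next = False
            i += 1
    return " ".join(words)

print(render("the cap polish team won comma despite the rain full stop".split()))
# -> The Polish team won, despite the rain.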
The skills, tools and techniques are well understood; what is needed now is two things. Firstly, better education of the audience about the real challenges involved; and secondly, serious consideration of whether live programmes could be delayed by just a few seconds in order to give live subtitlers, working as a team rather than as isolated individuals, time to spot and correct errors and so satisfy those who insist on the highest standards of accuracy.
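For the second suggestion, the sketch below (again purely hypothetical Python, with an arbitrary eight-second figure rather than any recommended delay) illustrates the basic mechanism: subtitles sit in a short correction buffer where a colleague can amend them before the delay expires and they are released to air.

import time
from collections import deque

DELAY_SECONDS = 8.0  # illustrative figure only, not a recommendation

class CorrectionBuffer:
    def __init__(self, delay=DELAY_SECONDS):
        self.delay = delay
        self.pending = deque()               # each entry: [release_time, subtitle_text]

    def add(self, text):
        # A subtitle enters the buffer and is held for the length of the delay.
        self.pending.append([time.monotonic() + self.delay, text])

    def correct(self, old_text, new_text):
        # A second subtitler spots an error and amends it before it reaches air.
        for entry in self.pending:
            if entry[1] == old_text:
                entry[1] = new_text

    def release_due(self):
        # Called repeatedly by the playout loop; returns any subtitles whose
        # correction window has expired and which can now be broadcast.
        released = []
        while self.pending and self.pending[0][0] <= time.monotonic():
            released.append(self.pending.popleft()[1])
        return released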
This article is also available to read at BFV online.
(JP)