Broadcast News
16/07/2014
Subtitling In An Integrated Workflow
Subtitling has remained the responsibility of the broadcaster rather than the original content producer (where the two are different). But given that subtitles appear as part of the programme, it is valid to ask: "What if subtitling were part of the programme production process?" - by Andrew Lambourne, Business Development Director, Screen Systems.
In workflow terms, rather than being 'tacked on at the end' for a recorded programme, or produced by a team outside the newsroom for live material, could and should subtitling become part of an integrated workflow – and what benefits might accrue? Ofcom, the UK's broadcast regulator, is focusing on subtitle quality issues – though these mostly relate to live subtitling, where perfection is by definition impossible.
Subtitling costs are always under scrutiny – but the 1000:1 ratio between a typical production budget (£500,000-£700,000 for a mid-range drama) and the cost of subtitling the programme suggests that there is room to be less parsimonious.
After all, subtitles can deliver an extra 10% viewership as well as significant cultural benefits through help with language learning (see references 1, 2 below).
For recorded programmes, closer ties with production could deliver real benefits to the subtitling workflow. One simple plus would be access to production scripts or, where no accurate script exists, to a dialogue-only audio track. Such a track could be transcribed automatically, and the resulting transcript would also carry the timing information which subtitlers otherwise have to spend time recreating. For live programmes, integration between subtitlers and the newsroom or live production team would minimise the chance of topical vocabulary – proper nouns in particular – not being added to the speech recognition engine which turns respoken text into live subtitles.
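As a rough illustration of that timing benefit, a word-timed transcript can be folded straight into subtitle cues. The sketch below is only indicative: the transcript format, the character limit and the duration limit are assumptions chosen for the example, not figures from the article.

    # Minimal sketch: folding a word-timed transcript into subtitle cues.
    # The transcript format is illustrative; any transcription or alignment
    # tool that reports per-word start/end times could feed this step.

    MAX_CHARS = 74      # roughly two 37-character subtitle lines (assumed limit)
    MAX_DURATION = 6.0  # maximum seconds a cue stays on screen (assumed limit)

    def words_to_cues(words):
        """words: list of (text, start_sec, end_sec) tuples in spoken order."""
        cues, current, cue_start = [], [], None
        for text, start, end in words:
            if cue_start is None:
                cue_start = start
            candidate = " ".join(w for w, _, _ in current + [(text, start, end)])
            if current and (len(candidate) > MAX_CHARS or end - cue_start > MAX_DURATION):
                # Close the current cue using timings taken from the transcript.
                cues.append((cue_start, current[-1][2], " ".join(w for w, _, _ in current)))
                current, cue_start = [], start
            current.append((text, start, end))
        if current:
            cues.append((cue_start, current[-1][2], " ".join(w for w, _, _ in current)))
        return cues

    # Three timed words become one cue whose in/out times come straight from
    # the transcript, so no manual timing pass is needed.
    print(words_to_cues([("Good", 0.0, 0.3), ("evening", 0.3, 0.8), ("everyone.", 0.8, 1.4)]))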
Subtitle quality issues are well understood. For recorded material, the task involves editorial skill: delivering the subtitles at a readable speed, and handling two or three people speaking at once by producing a manageable sequence of texts. There is also the challenge of timing and positioning the texts so that they marry unobtrusively with pictures, shot changes and on-screen faces (obscuring the lips of the speaker is unhelpful). Nevertheless, this is a mature industry, and subtitle presentation guidelines are readily available and well researched. So for recorded material, subtitling can and should attain 100% quality, and if it does not, the issue is either lack of funding or lack of time. Both would be addressed by ensuring that the subtitles were embraced by the same high production standards and tight workflow management that governed the creation of the original content – and indeed, why should they not be? Are deaf and hearing-impaired viewers any less deserving of a quality product?
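The "readable speed" requirement is also easy to check mechanically once cues carry timings. The sketch below is illustrative only; the 160 words-per-minute ceiling is an assumed example threshold, since real presentation guidelines vary by broadcaster and audience.

    # Rough illustration of a reading-speed check over subtitle cues.
    # The 160 words-per-minute ceiling is an assumed example threshold;
    # real presentation guidelines vary by broadcaster and audience.

    def reading_speed_wpm(text, start_sec, end_sec):
        duration_min = (end_sec - start_sec) / 60.0
        return len(text.split()) / duration_min if duration_min > 0 else float("inf")

    def flag_fast_cues(cues, limit_wpm=160):
        """cues: list of (start_sec, end_sec, text); returns those exceeding the limit."""
        return [(s, e, t) for s, e, t in cues if reading_speed_wpm(t, s, e) > limit_wpm]

    cues = [(0.0, 1.4, "Good evening everyone."),
            (1.5, 2.2, "Here is the news where you are tonight.")]
    print(flag_fast_cues(cues))  # the second cue packs 8 words into 0.7 s and is flagged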
For live subtitling, there are far more significant challenges simply due to the requirement to produce live texts within a few seconds of unscripted speech. Achieving 100% textual accuracy is extremely difficult because of the unpredictable nature of real-time events.
A subtitler transcribing live reportage will be using some kind of real-time phonetic transcription system. Most commonly, this will be a speech recognition engine trained to the sound of their voice, though some still use Stenograph machine shorthand systems which work on the same basis: phonetic codes are converted into conventional text by a computer. And crucially, the computer needs to know in advance the vocabulary which may appear, and how it is represented in sound units or 'phonemes'.
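The engine's dependence on a prepared vocabulary can be pictured with a simple lexicon structure. This is purely an illustrative sketch: the phoneme spellings and the add_entry helper are invented for the example and do not correspond to the format of any particular recognition engine.

    # Illustrative pronunciation lexicon for a respeaking engine.
    # The phoneme spellings and the add_entry helper are invented for this
    # example; real engines each have their own lexicon formats and tools.

    lexicon = {
        "Eyjafjallajokull": ["EY", "AH", "F", "Y", "AA", "T", "L", "AH", "Y", "OE", "K", "UH", "T", "L"],
    }

    def add_entry(word, phonemes):
        """Register a spelling and its sound units so the engine can output it."""
        lexicon[word] = phonemes

    # A word the engine has never been told about cannot be produced as text,
    # however clearly it is respoken - hence the need to load topical names
    # (people, places, teams) into the vocabulary before going on air.
    add_entry("Suarez", ["S", "W", "AA", "R", "EH", "Z"])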
Unfortunately, the nature of news and live interviews is that it is not always possible to predict the full scope of the vocabulary in advance, and when the unexpected name of a person, an Icelandic volcano, an organisation or a football team is spoken, there is a split second in which to decide what to do.
If the (typically lone) subtitler were backed up by members of the production team, such problems could be headed off by another team member entering the missing vocabulary 'on the fly' into a place-holder which the subtitler could call up.
This 'buddy' could also spot any serious recognition errors in the text output and rapidly intervene – although to make this feasible there would need to be a delay of a few seconds in the live programme delivery, otherwise the subtitles would be significantly late. So-called antenna delays are resisted by broadcasters for fear of disenfranchising the majority audience or creating loopholes for fraudulent betting, in which case the quality impact on live subtitles has to be taken on the chin and explained to viewers.
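How the place-holder idea might behave can be sketched in a few lines. Everything here is hypothetical workflow glue rather than a feature of any named subtitling product: the buddy binds a correctly spelled term to a slot, and the subtitler's respoken place-holder token is swapped for that term before the text goes to air.

    # Hypothetical sketch of the place-holder idea: a buddy binds a correctly
    # spelled term to a named slot, and the respoken place-holder token is
    # replaced just before the subtitle text is transmitted.

    placeholders = {}

    def bind(slot, text):
        """Buddy action: attach the missing vocabulary to a slot."""
        placeholders[slot] = text

    def expand(respoken_text):
        """Subtitler output passes through here on its way to air."""
        for slot, text in placeholders.items():
            respoken_text = respoken_text.replace(slot, text)
        return respoken_text

    bind("PLACEHOLDER ONE", "Eyjafjallajokull")
    print(expand("ash from PLACEHOLDER ONE is drifting towards UK airspace"))
    # -> ash from Eyjafjallajokull is drifting towards UK airspace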
Achieving 100% timing accuracy is again impossible if texts are produced in real time: the subtitler has to listen to the original utterance, perhaps edit it, formulate a coherent text, speak it to the computer, and have it delivered to air. This can take 5-10 seconds. By being fed with pictures and sound before digital encoding and compression, the subtitler is given a few seconds' start on the audio/video signal received by viewers, which reduces the delay a little. But again, the only way to eliminate it is to incorporate an antenna delay in the delivery chain.
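The arithmetic behind that residual delay is simple. The figures below are assumptions chosen for the example rather than measured values: the lateness the viewer sees is the respeaking delay minus whatever head start the encoding chain (and any deliberate antenna delay) provides.

    # Illustrative latency budget for live respoken subtitles.
    # All figures are example assumptions, not measurements.

    respeak_delay_sec = 7.0      # listen, formulate, respeak, recognise: typically 5-10 s
    encode_head_start_sec = 3.0  # clean feed reaches the subtitler before encoding/compression
    antenna_delay_sec = 0.0      # deliberate broadcast delay, usually zero in practice

    lateness = respeak_delay_sec - encode_head_start_sec - antenna_delay_sec
    print(f"Live subtitles trail the audio by about {lateness:.0f} seconds")
    # Only a deliberate antenna delay of comparable size would bring this to zero.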
As we move further into the multi-platform, multi-modal world, delivering subtitles for web-streamed content becomes even more important. Screen has developed technology which can be used to provide high-quality optional subtitles on live or recorded content delivered to the most popular player technologies. This overcomes the difficulty of synchronising a text channel with an AV channel on the web and, given the extra encoding delay of web delivery, may allow the timing of live texts to be pulled in closer to the original audio.
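For web players the principle can be illustrated by writing standard timed-text cues whose timestamps are pulled back by the extra streaming latency. The sketch below emits WebVTT, a common web caption format; the pull-back value and the generator function itself are illustrative assumptions, not a description of Screen's product.

    # Illustrative WebVTT cue generator for a web player, with live cue
    # timestamps pulled back by an assumed extra streaming/encoding latency.

    def fmt(t):
        h, rem = divmod(t, 3600)
        m, s = divmod(rem, 60)
        return f"{int(h):02d}:{int(m):02d}:{s:06.3f}"

    def to_webvtt(cues, pull_back_sec=0.0):
        """cues: list of (start_sec, end_sec, text); pull_back_sec is an assumed
        correction exploiting the extra delay of the web delivery chain."""
        lines = ["WEBVTT", ""]
        for start, end, text in cues:
            start = max(0.0, start - pull_back_sec)
            end = max(start + 0.5, end - pull_back_sec)
            lines += [f"{fmt(start)} --> {fmt(end)}", text, ""]
        return "\n".join(lines)

    print(to_webvtt([(12.0, 15.5, "ash from the volcano is drifting south")], pull_back_sec=4.0))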
Subtitling is under-funded compared with programme production, and this necessarily impacts on quality. If subtitling were more integrated into programme production, there would be knock-on benefits for the workflow, and likely further benefits from being embraced by the quality standards of the production as a whole.
Were the true commercial and cultural benefits of subtitles more widely understood - delivering a potentially 10% larger audience and helping people with language learning - there would likely be more commitment to delivering the best possible subtitles, rather than the merely acceptable.
References:
1) Mike Armstrong, 'Measuring and improving subtitle quality', BBC R&D, presentation at BVE 2014.
2) European Commission, 'Study on the use of subtitling: the potential of subtitling to encourage language learning and improve the mastery of foreign languages', June 2011.
The article is available to read in BFV online.
(IT/JP)