NEWPORT BEACH: From 2006, when solutions for providing closed captions on inflight entertainment first came within reach, until today, the closed caption capabilities of inflight entertainment systems (IFES) where the video display is under viewer control have gone from virtually zero to nearly half of such systems flying.
Today, thanks to the accomplishments of the inflight entertainment industry, nearly all of the IFES being sold support closed captions of the kind codified by the Airline Passenger Experience Association (APEX) Technology Committee (TC) in 2009. And by the first quarter of 2015, only closed caption-capable IFES are expected to be sold.
This was the message imparted by Russ Lemieux, Executive Director of APEX, and this writer, as invited speakers on November 5, 2014 at the U.S. Department of Transportation (DOT) Disabilities Forum in Washington DC. We were joined by Donna Danielewski, PhD, Director of the National Center for Accessible Media (NCAM), in a panel discussion.
This Forum, attended by 150 stakeholders comprising regulators (including Transportation Secretary Anthony Foxx), airline industry representatives, and representatives of disability organizations, considered a wide range of issues such as wheelchair use, service animals, and kiosks, as DOT also considers regulations to require closed captions on inflight entertainment content.
APEX (formerly known as the World Airline Entertainment Association [WAEA]) has approximately 416 global members, including 90 of the world’s airlines, major equipment manufacturers, virtually all IFES suppliers, content providers (including all six major motion picture studios), major broadcast and media companies, and principal audio-video post-production providers.
APEX, which had begun consideration of closed caption technology as early as 2004 along with NCAM, engaged with DOT in 2006 when the agency released a Notice of Proposed Rulemaking (NPRM) that would have required closed captions on IFE content. In 2009, DOT announced that it could not establish requirements because of the limitations of the technology at the time. Our industry, however, continued to pursue a solution and set out a roadmap later that year.
Given that it was just over seven years ago that possible solutions began to emerge, and only about five years ago that we were able to codify those solutions in a specification that the industry could follow in developing IFES, this represents real progress in the minds of the IFE industry. While we in the IFE community are understandably proud of this progress, from the perspective of the disability community, some may understandably see this as the glass still being half empty.
A 12 to 15-year lifecycle
For those who measure the speed of technology in how frequently a new iPhone comes out, five to seven years may seem slow. But in the inflight entertainment (IFE) industry, the life cycle of an IFES is about 12-15 years. Development time, where technology and features are set, averages about two years. From installation, an airline will expect the system to last 10 to 13 years.
It was recently reported on The Wall Street Journal website that two airlines noted for the quality of their cabin amenities still use one IFES supplier’s oldest IFES with screen technology dating back to the early 1990s.
Why so long? A typical new MPEG-4 digital in-seat IFES costs as much as $5 million per aircraft, depending on the size of the aircraft and the features of the IFES. Among the reasons for such costs is that the FAA is required by law to certify all onboard equipment for aircraft installation and airworthiness, necessitating designs unique to IFE use and substantial costs and time requirements.
This hardware has specific capabilities, and may look like consumer versions externally, but operational features are specific to aircraft use. To ensure ready supplies, hardware design is locked in for a longer period of time than with consumer products, and interfaces and operating software are specific to IFE.
And to replace such IFES, aircraft must be removed from service—at a cost of perhaps hundreds of thousands of dollars per day. Moreover, IFE is not a profit center—it is a cost center—and even IFES that generate revenue rarely if ever recover their costs.
So it is easy to see why an airline might consider such costs to be prohibitive to the replacement of IFES for the singular purpose of enabling closed captions. Despite this, airlines have shown a great desire to make continuous progress and investment in accommodating passengers with disabilities.
Hardware is only half of the solution
But though it might appear that enabling closed captions on IFES is the major issue, the hardware/software platforms are only half of the picture. The content—the movies, TV, etc.—delivered to IFE must also support closed captions. And the closed captions that are delivered must be technologically compatible with the IFES they are being delivered to.
The closed captions used in IFE come from other sources such as digital cinema, packaged media (DVDs, etc.), broadcast television, and even the Internet. But these sources use different standards for closed captions. While the entertainment industry has initiatives underway, of which the APEX Technology Committee is a part, to support interoperable standards like the so-called “Interoperable Master Format (IMF)” supported by the Society of Motion Picture and Television Engineers (SMPTE), these objectives are not fully realized.
So perhaps a Digital Cinema Package delivers closed captions in SMPTE Timed Text 2052, a television broadcaster delivers WebVTT, a DVD supplier delivers a DVB format, and other sources deliver any of a number of other possibilities. The IFE industry needs a common standard for the delivery of closed captions. Defining that standard is the responsibility of a group of well-qualified industry experts, including Geoff Freed of NCAM, who is a member of the W3C Timed Text Working Group.
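As a rough sketch of the ingestion problem described above (the source names, format strings, and the choice of a single delivery profile are illustrative assumptions, not an APEX specification):

```python
# Illustrative only: each content source hands captions over in a
# different format, which must be normalized to one delivery profile
# before the content enters the IFE pipeline.
source_formats = {
    "digital-cinema": "SMPTE Timed Text 2052",
    "broadcast": "WebVTT",
    "dvd": "DVB bitmap",
}

# Assumed common delivery profile for this sketch.
DELIVERY_PROFILE = "SMPTE Timed Text 2052"

def needs_conversion(source: str) -> bool:
    """True if captions from this source must be converted on ingest."""
    return source_formats[source] != DELIVERY_PROFILE

print(sorted(s for s in source_formats if needs_conversion(s)))
# prints ['broadcast', 'dvd']
```

Whatever profile the industry settles on, every source whose native format differs must pass through a conversion step like the one this toy check flags.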
The APEX Technology Committee established its Closed Caption Working Group (CCWG), chaired by Jon Norris of Lumexis, as a result of the 21st Century Communications and Video Accessibility Act (CVAA) which, in 2010, established SMPTE Timed Text 2052 as the “safe harbor” technology for closed captions deliverable under the Act.
In 2009, the APEX Technology Committee had codified a so-called “bitmap” solution, designed specifically for IFES, and which differs significantly from the WebVTT and Timed Text formats used in either U.S. broadcast or Internet television. With roots in DVD technology, bitmap appeared to have a reasonably long life, but emerging technology, a decline in DVD sales, and the adoption of CVAA and its support of Timed Text changed all that. The CCWG is tasked with updating the specification accordingly.
CCWG is considering the available technologies for a digital delivery profile that can be delivered to IFE from the wide range of sources from which IFE ingests its content. Included in its considerations are SMPTE Timed Text 2052 potentially in the UltraViolet Common File Format (CFF) Timed Text (TT) profile, the W3C Simple Delivery Profile, and WebVTT.
But the job is not over. Once this IFE closed caption delivery profile is complete, the closed captions delivered thereunder still have to be repurposed for delivery to two, three or more potential generations of IFES—some of which are limited to bitmap, and some of which can deal with Timed Text, and a few of which may not support either. So the group will also codify the conversion technologies required.
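A minimal sketch of that repurposing step, assuming (hypothetically) three IFES generations with the capabilities described above; the generation names and function are invented for illustration:

```python
# Illustrative mapping of IFES generations to the caption technology
# each can display; the tier names are hypothetical, not real products.
SUPPORTED = {
    "legacy": "bitmap",       # older systems: APEX 2009 bitmap captions
    "current": "timed-text",  # newer systems: SMPTE Timed Text
    "oldest": None,           # a few systems support neither format
}

def repurpose(master_caption: str, generation: str):
    """Pair a master caption file with the format one generation needs."""
    target = SUPPORTED.get(generation)
    if target is None:
        return None  # no caption support on this platform
    return (master_caption, target)

print(repurpose("movie_captions.ttml", "legacy"))
# prints ('movie_captions.ttml', 'bitmap')
```

The real conversion work (Timed Text rendered down to bitmap images, for example) hides behind that one pairing step, which is exactly what the CCWG must codify.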
A considerable amount of IFE content must support subtitles—text that translates the audio language into a different language for viewers who don’t speak the audio language—as well as captions—which convert the audio language into text along with non-verbal sounds like gunshots, screeching tires, or music crescendos.
Today those subtitles are often “burned in” to the video file, i.e., not rendered separately. If the display is to offer both subtitles and captions in shared space, then both must be provided separate from the video—sometimes referred to as “dynamic subtitles” and “closed captions.” But subtitle languages often require Unicode, and some are read right-to-left rather than left-to-right.
Creating digital files capable of rendering right-to-left and left-to-right subtitles and captions dynamically on the same screens may be beyond the current capability of some of the encoders used by specialty IFE post-production providers, potentially necessitating hardware/software upgrades and revised workflows. CCWG participants Andy Rosen and Sam Larkin of a Seattle company called Bitlogic are particularly focused on this area.
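One small piece of that problem can be shown in a few lines: this sketch only detects whether a subtitle line contains a right-to-left script, using standard Unicode bidirectional classes (the full mixed-direction rendering problem the encoders face is far harder).

```python
import unicodedata

def contains_rtl(text: str) -> bool:
    """True if any character belongs to a right-to-left script.

    Unicode bidirectional classes "R" (e.g. Hebrew) and "AL"
    (e.g. Arabic) mark right-to-left characters.
    """
    return any(unicodedata.bidirectional(ch) in ("R", "AL") for ch in text)

print(contains_rtl("Hello"))  # prints False
print(contains_rtl("\u0645\u0631\u062d\u0628\u0627"))  # Arabic: prints True
```

A caption pipeline that mixes such text with left-to-right captions on one screen must apply the full Unicode bidirectional algorithm, not just this detection step.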
What’s more, IFE often requires multiple versions of the content it offers due to cultural concerns and the potential for one viewer to be offended by what is displayed on a neighbor’s screen. A typical IFE movie may be released in a theatrical version, an “airline-edited” version, a “conservative-edited” version, a Middle East edited version, and even an airline-specific edited version.
The caption sets must be edited to match each of these versions, and then each of those caption sets repurposed to the platform requirements of each IFES. A bit of simple math demonstrates how easily the number of versions gets to a dozen or more.
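That arithmetic can be made concrete; the counts below are hypothetical but use the kinds of versions and formats named above:

```python
# Hypothetical counts: five edited versions of one movie, each of which
# must have its caption set repurposed for three IFES platform formats.
edit_versions = [
    "theatrical",
    "airline-edited",
    "conservative-edited",
    "middle-east-edited",
    "airline-specific",
]
platform_formats = ["bitmap", "timed-text", "webvtt"]

deliverables = [(e, f) for e in edit_versions for f in platform_formats]
print(len(deliverables))  # prints 15 -- already "a dozen or more"
```

Add a second caption language, or a fourth platform generation, and the count climbs multiplicatively.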
IFE is a global business, but closed captions are more readily available for English language content of U.S. origin than in other languages. Even English originating outside the U.S. may not have the same closed caption support, or it may come in an additional range of standards.
All these issues make the provision of closed captions both complex and costly.
In the opinion of this writer, outreach such as participation in the DOT Forum is necessary to educate disability groups and governmental agencies that the provision of closed captions for the deaf and hard-of-hearing is not a simple “flip of the switch”.
No other industry is faced with the need to support platforms during a two-year development and 10 to 13-year deployment cycle while receiving content in digital formats that may have changed and evolved multiple times during the lifecycle of the platforms themselves.
No other industry depends solely on content created for other markets that must be ingested from so many different sources and repurposed to the requirements of multiple generations of systems.
This is a story that we must tell.
But the limitations on access to IFE for persons who are deaf, hard-of-hearing, blind or visually impaired must be considered an important part of the passenger experience, one which our industry needs to address with its best efforts.
 The capability referred to is the capability of displaying closed captions in accordance with the specifications codified by the APEX Technology Committee in 2009.
 The National Center for Accessible Media (NCAM) is part of the Media Access Group at Boston’s public broadcasting station WGBH. In 1972, WGBH revolutionized television and video for people who are deaf or hard-of-hearing by providing the first instance of open captions on Julia Child’s “The French Chef.” In 1992, WGBH began researching captioning in movie theaters to enable independent access to films and successfully developed innovative technologies that make it possible to provide captions on movies shown in theaters.
Michael is a content management consultant who has led or participated in a number of industry working groups that have established specifications for the delivery of digital content. He and working group co-chair Pierre Schuberth, currently of Thales, drafted the industry’s response to the DOT NPRM concerning closed caption requirements in IFE in 2006. He is a member of the Society of Motion Picture and Television Engineers (SMPTE) and was given APEX’s Outstanding Contribution Award in 2013 for his contributions to the industry’s move into digital standards.