English is the most widely spoken language in human history. An estimated 1.5 billion people speak this bastard child of French, Latin and German, its spread fuelled first by the expansion of the British Empire and thereafter by the rise of its American successor. As a language, English has come to be associated with all that is modern and culturally avant-garde, and to serve as a bridge from the ‘provinces of silence,’ as the critic George Steiner put it, to opportunities denied those whose vernaculars do not stretch far beyond the borders of their own homelands.

It is also remarkable how alienating the English language can be to the unfamiliar. For one, argues the journalist and author Jacob Mikanowski in his magisterial essay on the subject for the Guardian, the language is as bland as it is complex, the milquetoast of speech, ‘whose only defect […] is that it makes everyone who speaks it sound like a duck’. Then come the conceptual barriers it throws up, most notably its emphasis on the first-person singular, which anchors the individual speaker as the primary reference point of the conversation – a difficult concept for some learners from collectivist societies.

Few of these obstacles are evident to native English speakers, among whom it is commonly assumed that most people on the planet can convey at least something, somehow, in their language through a mixture of boilerplate phrases and gestures – an arrogant attitude, to say the least. This attitude is reflected not only in the proportion of adults in the UK and US capable of speaking a foreign language (38% and 20%, respectively) but also in their airports. Travel beyond international gateways like Heathrow or LAX to regional airports in Scotland, Canada or the continental US, and one will be hard-pressed to find signage in anything other than English.

The experience of travelling through even a major hub for a non-English speaker can be overwhelming. “We have 34 airlines in our building, and 22 million passengers from all over the world,” explains Roel Huinink, chief executive of JFKIAT, the company responsible for the operation of Terminal 4 at JFK International Airport in New York City. “We see people from all over the world, which is fantastic […] but that also means that they need to be supported, because we realise that not everybody speaks the English language.”

For travellers stuck in a linguistic bind, the first port of call is one of Terminal 4’s many service assistants and volunteers. Even so, not all of them can understand Mandarin, Arabic or Vietnamese. Now, using a modified version of the interpreter mode available on Google Nest Hubs installed in the arrivals hall, the help desk in the retail lounge and the post-security information booth, they can conduct conversations in all three.


Tried, tested and translated

Huinink is a veteran of airport management. After spending 15 years climbing the greasy pole of management at the Royal Schiphol Group, he joined JFKIAT as its new chief executive in 2018. His experience helping to run some of Europe’s busiest airports had left him convinced that the industry was “not as fast-moving as some of the other businesses around the world” when it came to adopting new technology. With that in mind, the new boss quickly proposed a deal to high-tech companies working in the New York area: if you have a solution that you think would work in the airport space, come test it out at Terminal 4.

Google was one of the first to respond, suggesting that its new ‘interpreter mode’ for Google Assistant might be usefully deployed at JFK. The feature had debuted at CES 2019, but at the time it was only available on smartphones. Weaving it into other hardware, such as tablets and smart speakers, was the next logical step, and Terminal 4 the ideal test bed. Huinink, a regular user of Google Translate, needed little convincing. “I’ve been using it a lot,” he says. “I know that, even though I’m pretty fluent in English, finding the right words can be difficult.”

That is doubly true for travellers from parts of the world where English is not widely spoken and even a familiarity with the most basic phrases is lacking. “In an airport environment where you are already stressed, you’re really looking for help,” says Huinink. The integration of Google Assistant’s interpreter mode on Nest Hubs managed by service assistants was one way of giving these passengers the confidence to ask for and obtain the assistance they needed.
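In essence, an interpreter-mode conversation is a series of turns, each detected in one language and relayed in the other. Google has not published its implementation, so the sketch below is purely illustrative: a hypothetical two-way phrase table stands in for real machine translation, and the `interpret_turn` helper simply routes each turn in the right direction.

```python
# Illustrative only: Google's actual interpreter-mode pipeline is not
# public. A tiny phrase table stands in for a real translation model.
PHRASES = {
    ("es", "en"): {"¿dónde está mi puerta?": "where is my gate?"},
    ("en", "es"): {"gate 42, to your left": "puerta 42, a su izquierda"},
}

def translate(text, src, dst):
    """Look up a phrase; a real system would call an MT model here."""
    return PHRASES[(src, dst)].get(text.lower(), "[untranslated]")

def interpret_turn(speaker, text, passenger_lang="es", agent_lang="en"):
    """Route one conversational turn in the appropriate direction."""
    if speaker == "passenger":
        return translate(text, passenger_lang, agent_lang)
    return translate(text, agent_lang, passenger_lang)
```

The point of the sketch is the turn-taking structure: the device only needs to know who is speaking (or detect the language of each utterance) to keep a two-way conversation flowing through a single shared screen and speaker.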


In line with JFKIAT’s desire to make Terminal 4 a proving ground for viable new technology, the software giant supplied the most cutting-edge version of its translation program. “Prior to the pilot, Google had been working on this initiative for several months,” explains Steve Tukavkin, vice-president for IT and digital at JFKIAT. Underpinned by the company’s latest research into artificial intelligence, the version of Google Assistant used by the terminal’s staff supports 29 languages, including English, Spanish, Mandarin, Arabic, German and Vietnamese.

Once it was deployed in January 2020, its users found the program had little difficulty in contending with the loud environment of the arrivals hall. “We normally see around 70,000 passengers per day, and the technology even during the pilot stage has worked very effectively,” says Tukavkin. “It was able to eliminate the background noise and understand, quite clearly, the voice from the passenger.”

The application is also more comprehensive in how it translates speech, adds Tukavkin. As well as recognising a multitude of accents in English – “Even the Dutch accent,” Huinink interjects – it can translate idioms, both essential when interpreting passengers who speak fragmented English. Both men knew this, of course, before they deployed the software in the terminal. They left the verdict on its practicality to those using it: the airport’s service agents and their auxiliary corps of elderly volunteers.

In addition to enjoying working in the airport environment, these members of staff love interacting with passengers. They’re also not the savviest bunch when it comes to new technology – which is why Huinink was so pleased at how quickly they took to using the new translation software. “If the technology is not working properly, it will not be used very quickly – right?” he says. Instead, its uptake was more or less seamless. “They could use it directly, and we haven’t heard any negative feedback.”

Further adoption

Numerous airports and airlines around the world have followed Terminal 4’s example and implemented their own machine translation solutions. At Francisco Bangoy International Airport in the Philippines, the airport police began rolling out translation apps to officers to help them communicate with travellers with limited English, while two airlines – the Dutch carrier KLM and Air New Zealand – began deploying their own versions of Google’s automatic translation software.

Other hubs have relied on hardware to augment translation. In January 2020, Edinburgh Airport began its rollout of ‘Pocketalk,’ a miniature device capable of translating between 74 languages. Travellers state their inquiry into the device, which translates it for the benefit of staff on its screen or through a nearby smart speaker. The technology was initially deployed in the airport’s security area.

“The security process is one of the most important, as we need to ensure the safety of all passengers and staff,” said the airport’s chief executive, Gordon Dewar, at launch. “It’s vital that we have the ability to clearly explain the process and help people understand, so we can make that process as positive as possible.”

Both Huinink and Tukavkin are proud of their pioneering role in this space, recognising that translation does not extend to speech alone. In early 2019, JFKIAT partnered with AIRA Access to enhance the experience of visually impaired passengers in Terminal 4 through a dedicated mobile app. After logging in, the traveller can speak to a dedicated service agent, who can direct them to specific facilities inside the building, be they restaurants, gates or exits.

More broadly, both men view these kinds of incremental change as essential to creating a seamless, touchless journey for passengers through Terminal 4 – all the more important in a post-Covid-19 airport environment, where travellers should be confident not only that they are understood, but also that they are free from the danger of infection.

“It’s extremely important to all of us that passengers feel safe when they travel here, and that they are helped,” says Huinink, “whether it’s AIRA for the visually impaired; whether it’s Google’s translator, for people who don’t speak the language; or temperature checks for passengers. Those are all technologies that can help reassure passengers that this is a seamless and safe place to travel to.”

1.5 billion

People across the world who speak English.


74

The number of languages that can be translated by Pocketalk, a device deployed by Edinburgh Airport.
Edinburgh Airport