07 July 2019

Language lab

Talk to the flipper

Tech is opening up an extraordinary new frontier of communication – between humans and animals

By Mark Fletcher

It now seems possible that language-translating algorithms may confirm the ancient belief that animals can have something important to say. The idea has appeal, from St Francis to Doctor Dolittle, from Piglet to a three-eyed raven.

Trust Google to lead the way. In an official video from 2010, a geeky exec introduces Google Translate for Animals, an app for Android. He holds his phone to a pig. A synthesised voice translates: “New person smells good!” comments Bella the pig. Donna the donkey tells the delighted, blushing farmer, “Love you”. “You need to sort out your dress sense”, declares a well-groomed ewe.

It was April Fools’ Day.

Nine years on, the idea of an animal translator app is no longer a spoof. Con Slobodchikoff, a biology professor at Northern Arizona University, is developing Zoolingua, which applies machine-learning algorithms to crack the code embedded in prairie dog calls. Prairie dogs are easily alarmed rodents: “Eagle!”, “Snake!” and “Man in red shirt!” they cry. For years, Slobodchikoff analysed the waveforms of prairie dog calls elicited by cardboard eagles and rubber snakes. His latest aims are modest: a few more calls, or words, at a time. But he hopes eventually to provide a simple translation app to help us understand our pets.

In 2017 a Swedish language-technology company called Gavagai teamed up with dolphin researchers at the KTH Royal Institute of Technology in Stockholm. Gavagai is a spinoff from the Swedish Institute of Computer Science, and its algorithm has already taught itself 40 human languages. Robert Eklund, Jussi Karlgren and their colleagues plan to create a dolphin thesaurus. For the next four years the project will eavesdrop on a group of bottlenose dolphins living in a wildlife park near Stockholm, with algorithms searching for meaning in the day-to-day chatter of whistles and clicks.

Scientists long dismissed the concept of the talking animal as ridiculous and anthropomorphic. But over the past few decades it has become more generally accepted that some animals communicate in ways that can best be described as language.

We now know that killer whales coordinate complicated hunts in squeals and pops; cuttlefish convey passion in flashing colours; and orangutans can plan for the next day. Information is encoded in sound, colour and movement, and rules of syntax and semantics often apply. Sperm whales, like us, have accents and dialects, learned where they grew up.

Now artificial intelligence is opening the door on animal intelligence. Translation software for animals is increasingly conceivable thanks to computers that detect patterns in large amounts of data. Their power lies in codebreaking rather than translating. The algorithm doesn’t start with a dictionary of meanings; instead it turns words into numerical vectors and looks for the associations and substitution patterns that connect them.

“You shall know a word by the company it keeps,” declared the linguist John Rupert Firth in 1957, anticipating the approach of unsupervised machine translators. Banks of computers running artificial neural networks with “deep learning” hierarchies search for repeated uses of sounds or words. Once the contexts within the code are understood, meaning emerges more or less by itself.
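To make the idea concrete, here is a minimal sketch of the distributional approach, assuming an invented corpus of call labels rather than any real dataset: each “word” is represented by counts of its neighbours, and calls that keep similar company end up with similar vectors.

```python
# A toy illustration of Firth's principle: represent each "word"
# (here, invented call labels) by the company it keeps, then compare
# the resulting context vectors. Real systems apply the same idea at
# vastly larger scale, as in word2vec and its successors.
from collections import defaultdict
import math

# Hypothetical call sequences; the labels are made up for illustration.
sequences = [
    ["alarm", "eagle", "scatter"],
    ["alarm", "snake", "freeze"],
    ["alarm", "eagle", "scatter"],
    ["greet", "food", "approach"],
    ["greet", "food", "approach"],
]

# Count how often each call appears alongside each other call.
cooc = defaultdict(lambda: defaultdict(int))
for seq in sequences:
    for i, call in enumerate(seq):
        for j, other in enumerate(seq):
            if i != j:
                cooc[call][other] += 1

def cosine(a, b):
    """Cosine similarity between two sparse context vectors."""
    dot = sum(a.get(k, 0) * b.get(k, 0) for k in set(a) | set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "eagle" and "snake" share the company of "alarm", so they come out
# more alike than "eagle" and "food" do.
print(cosine(cooc["eagle"], cooc["snake"]))  # relatively high
print(cosine(cooc["eagle"], cooc["food"]))   # zero
```

No dictionary is consulted at any point; the similarity falls out of co-occurrence statistics alone, which is why the same machinery can, in principle, be pointed at whistles and clicks.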

An algorithm that deduces meaning from data sets was tested on horses by Christoph Dahl at St Andrews University. He put motion sensors on 26 horses in 2016, and the data was then fed into a supervised learning algorithm. It worked better than expected, he admits, and “yielded surprisingly rich and multi-layered sets of information. In particular, we were able to discriminate identity, breed, sex and some personality traits from the overall movement patterns of freely moving subjects.” The machine learned to recognise the temperament of each horse from body-language data without knowing anything about horses.
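As a rough illustration of the supervised approach Dahl describes, here is a minimal sketch with invented data: summary features from motion sensors go in, labels come out. The features, sample sizes and pipeline are assumptions for illustration, not the study’s actual code.

```python
# A minimal sketch of supervised learning on motion-sensor data.
# All data here is synthetic; a real study would use features
# extracted from the sensor traces of the 26 horses.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-session features: mean acceleration, stride
# regularity, head-movement variance, and so on.
n_horses, n_features = 26, 8
X = rng.normal(size=(520, n_features))    # 520 recording sessions
y = rng.integers(0, n_horses, size=520)   # which horse each one is

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# On real sensor data the same pipeline can recover identity, breed,
# sex and aspects of temperament; on random noise it can only guess.
print(f"accuracy: {clf.score(X_test, y_test):.2f}")
```

The point is that the classifier is told nothing about horses; any structure it finds comes entirely from regularities in the data.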

We can imagine the computer then taking its knowledge a stage further. A robotic horse could converse in body language based on patterns in the sensor data. If a dialogue starts, it generates its own data, and the computer learns much faster.

The search is widening into anything that meows, bleats or howls. Dahl is now moving on to rats, while Klaus Zuberbühler, also at St Andrews, applies the same technology to chimp calls and gibbon songs. “We have used unsupervised learning algorithms to classify animal calls and movements,” he writes, “which I think has great promise.”

To a zoologist, demonstrating that an animal is capable of language helps confirm that animals can be self-aware individuals. Carthage College’s Professor Angela Dassow has fed the sounds of whales, bats, gibbons, dogs and songbirds into a computer and, on discovering unexpected levels of complexity, admitted that “some remarkable similarities between humans and animals emerge”.

Translation software for your mobile isn’t here yet, but there is a waterproof backpack for dolphins, designed by Dr Thad Starner and his team at Georgia Tech. Inside the Cetacean Hearing and Telemetry system, or CHAT, is a computer running language-recognition software to analyse the whistles and clicks that make up a dolphin’s natural communication. Dr Denise Herzing and her team at the Wild Dolphin Project trained dolphins to use specific calls, and in 2014 the computer recognised its first word in real time. One day, while the researchers were swimming with the pod, CHAT translated a dolphin’s whistle as “sargassum”, a common type of seaweed. Imagine the moment. Scientists are listening to an indecipherable stream of high-frequency whistles when suddenly they hear the computer whisper “sargassum”. It’s both amazing and startlingly mundane, like the first translators of Babylonian tablets realising they were almost entirely warehouse inventories.
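CHAT’s internals have not been published in this detail, so what follows is only a sketch of one standard way to spot a trained whistle in a live audio stream: slide a stored template along the incoming signal and flag a high normalised cross-correlation. The signals and threshold here are invented.

```python
# A sketch of template matching for real-time call recognition.
# Both the "whistle" and the background noise are synthetic.
import numpy as np

def normalised_xcorr(signal, template):
    """Slide the template along the signal, returning a correlation
    score in [-1, 1] at each offset."""
    t = (template - template.mean()) / (template.std() + 1e-12)
    n = len(template)
    scores = np.empty(len(signal) - n + 1)
    for i in range(len(scores)):
        w = signal[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        scores[i] = np.dot(w, t) / n
    return scores

rng = np.random.default_rng(1)
template = np.sin(np.linspace(0, 40 * np.pi, 400))  # hypothetical whistle
stream = rng.normal(scale=0.5, size=4000)           # background noise
stream[1500:1900] += template                       # whistle occurs here

scores = normalised_xcorr(stream, template)
if scores.max() > 0.6:                              # detection threshold
    print("whistle detected at sample", scores.argmax())
```

A production system would work on spectrogram features rather than raw samples, and would have to cope with the dolphins’ own variation, which is part of why, as Herzing notes below, CHAT is better thought of as an interface than a translator.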

Herzing plays down the moment. CHAT is not a translator, she insists; it is an interface, like an auditory keyboard. Besides, she and other dolphin researchers have been talking to animals for decades, using signs, gestures, sounds and giant underwater keypads. Discovering that a dolphin can refer to seaweed is nothing compared with the discoveries already made. Dolphins have names that they give themselves, called signature whistles by scientists; they understand that word order matters; and they can grasp references to objects and to past or future events. But before CHAT, to have a conversation with a dolphin, a third language had to be invented, built from arbitrary gestures, signboards and underwater keypads, that both sides could learn.

Despite advances in unsupervised machine learning, using an intermediate language to talk to animals has proved, so far, much more successful than trying to translate an animal’s own language.

In 1966 a chimpanzee called Washoe learnt a modified form of American Sign Language. She was brought up by Beatrix and Allen Gardner as if she were a human child, and was spoken to only in sign language. She learned 350 signs and was able to generate original sentences. After a few years she could make jokes, and was occasionally caught out in a lie. She understood who she was, and seemed able to express her feelings. Washoe’s talent with sign language tells us almost nothing about her own language, if she had one, but a lot about her. Perhaps the most moving part of the story is that she taught other chimps to sign, including her adopted son, Loulis.

While still an infant in the 1980s, a bonobo called Kanzi learnt to communicate using a large board of buttons marked with small pictures or symbols, called lexigrams. He became proficient, able to understand sentences and express himself well, better than many of the researchers who invented the system. After a few years, researchers took to asking Kanzi questions in English, which he also understood, and he replied using the lexigrams, since he lacks a human voice box. Kanzi is far from unique. In the majority of animal-language studies of the past fifty years, the animals became good at understanding us, while we learned almost nothing of their own languages. Similarly, many dogs appear to understand a little simple English, but few people can speak dog to the same standard.

Mining vast amounts of natural animal communication for meaning may not yield coherent translations easily. Each species has an unfathomable blend of sound and movement from many parts of its body. Snakes communicate by smell, bees by dance. And even if we found a language, and established that it carried some sort of meaning, that meaning could be incomprehensible to us. As the philosopher Ludwig Wittgenstein said: “If a lion could speak, we could not understand him.”

The team studying dolphins in Stockholm worry that they, too, will not be able to understand dolphin, even in translation. “It is possible or even likely that much of dolphin-dolphin communication concerns states and aspects of dolphin life which are difficult to observe and may be near impossible for humans to conceptualise.”

Meanwhile, the algorithms have hit a more practical problem. Computers may be able to collect data and deduce meaning from context, but nature is full of complicated patterns that have nothing to do with language. The great codebreaking machines, from the Bletchley bombes to Google Translate, work on the premise that the data they are given encodes a language of some form, with some semantic meaning, and that there is a rough idea of what that meaning is likely to be. With animals it is more difficult. A nightingale’s song is immensely complex and has meaning. The algorithms can break the song down into “words”, but those words may carry no more meaning than musical notes. The same problem arises in translating humpback whale song. Perhaps, like our music, the songs convey truths that cannot easily be translated. No algorithm is likely to tell us what Mozart means.

Data-mining for meaning is becoming ubiquitous. Computer algorithms have the power to decode many aspects of our lives, probably more than is comfortable. But if it leads to an ability to understand animals, the benefits to the planet could be huge.

However, I do foresee one problem. A frank discussion with a rat, a dolphin, a horse or a chimp might dent our confidence in the supremacy and wisdom of mankind. Our innocence and self-belief as a species could be lost.

“Doctor Dolittle has just walked into the room”

The significance of gibbons

 

Studies of gibbons, close ape relatives of ours, may prove fruitful. Gibbons sing together. Their duets contain structure in stereotyped patterns of notes, like music, but within the songs are complex variations called “interlude sequences”, which seem more like conversation. Dr Esther Clarke at MIT breaks down those sounds, looking for patterns. She admits that “to figure out the meaning is very complicated when you are dealing with a completely different species. An algorithm that could not only predict the sequences in gibbons’ songs but also potentially translate them would completely revolutionise the field.” She and her colleagues are working on it.

Codebreaking, pioneered most famously in wartime, is based in part on finding linguistic rules within the code. Zipf’s law, formulated in the 1940s, states that common words are shorter than rarely used ones. Menzerath’s law, published in the 1950s, holds that as a stream of communication gets longer, its component parts get shorter. The first makes for efficiency; the second, clarity. Taken together, the two laws are often the door that lets codebreakers in; what used to take the Bletchley brainboxes weeks, if not months, to dig out, AI now does in seconds. Both laws apply only to communication that encodes meaning: music doesn’t generally follow them, but lyrics do. Researchers have been testing animal communication, searching for patterns that obey these two laws, as sketched below.
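Here is a sketch of the kind of test this implies, run on an invented corpus: check whether frequent call types are shorter (Zipf’s law of abbreviation) and whether longer sequences are built from shorter elements (Menzerath’s law). Real studies run the same checks on thousands of recorded calls and test the correlations statistically.

```python
# Testing two linguistic laws on a hypothetical corpus of animal calls.
# Each sequence is a list of (call_type, duration_in_seconds) pairs;
# the values are invented for illustration.
from collections import Counter

sequences = [
    [("a", 0.1), ("a", 0.1), ("b", 0.3)],
    [("a", 0.1), ("c", 0.5)],
    [("a", 0.1), ("a", 0.1), ("a", 0.1), ("b", 0.2), ("d", 0.2)],
]

# Zipf's law of abbreviation: frequent call types should be shorter.
freq = Counter(call for seq in sequences for call, _ in seq)
durations = {}
for seq in sequences:
    for call, dur in seq:
        durations.setdefault(call, []).append(dur)
for call in sorted(freq, key=freq.get, reverse=True):
    mean = sum(durations[call]) / len(durations[call])
    print(f"{call}: used {freq[call]}x, mean {mean:.2f}s")

# Menzerath's law: longer sequences should use shorter elements.
for seq in sequences:
    mean = sum(dur for _, dur in seq) / len(seq)
    print(f"length {len(seq)}: mean element {mean:.2f}s")
```

If both trends hold across a large corpus, that is evidence the signal is organised for carrying meaning rather than being mere melody.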

Gibbons have the most complex long-distance calls of any ape, so they are a perfect place to look for the laws of language. Their song is the only carrier of information: you can’t see a raised eyebrow or a rude hand gesture from a kilometre away. The most complex gibbon song belongs to the newly discovered Skywalker gibbon, according to Professor Peng-Fei Fan at Sun Yat-sen University. He has just confirmed to me that, in an as-yet-unpublished study, he and his team found that gibbon calls conform to both laws of language. It seems that their songs do have lyrics. The door to decoding what they are saying has opened. Doctor Dolittle has just walked into the room.

My own interest in gibbons involves working with Professor Peng-Fei Fan and his team in China on a documentary about Skywalker gibbons, and on a dating app for them. Gibbons form lifelong relationships, and lovelorn singles check each other out from a distance by singing songs together while a kilometre apart. But their jungle home in southern China is fragmented by roads and farms, so young gibbons are now forced too far apart to hear each other. We have been connecting them through microphones, speakers and a standard chatroom app on the local mobile network, so that they can sing together at any distance. As well as the pleasure of acting as matchmaker and enabling animals to converse over the telephone, we can also eavesdrop on the conversation and try to decode what they say. Our first test was to connect a male gibbon in Beijing Zoo to a wild single female. Neither had heard a potential partner for a decade, if ever.

Long-distance relationships are tricky. Their first call was tentative, but they sang to each other down the phone for about twenty minutes. The second call was almost euphoric, and their mutual enthusiasm obvious. The third call was meant to clinch the deal, and it was hoped that the zoo male might be released back into the wild. But something went badly wrong. He became very alarmed, refused to sing any more, and was clearly upset.

Photography by Getty Images 

Further reading

The literature on this subject is huge. Amongst the standard texts are Con Slobodchikoff’s Chasing Doctor Dolittle: Learning the Language of Animals (2012), and Janine M. Benyus’s The Secret Language of Animals: A Guide to Remarkable Behavior (2014).

On the unexpected connections between human rhetoric and animal behaviour, see Debra Hawhee’s Rhetoric in Tooth and Claw: Animals, Language, Sensation (2016).

To explore the emotional life of animals, try Frans de Waal’s Mama’s Last Hug: Animal Emotions and What They Tell Us about Ourselves (2019).

The philosophical question posed by Wittgenstein about the comprehensibility of lions is addressed, along with many other issues, by Cary Wolfe in Zoontologies: The Question of the Animal (2003).

For the unexpected activist path that led Hugh Lofting to write his Doctor Dolittle books, see Catherine L. Elick’s ‘Anxieties of an Animal Rights Activist: The Pressures of Modernity in Hugh Lofting’s Doctor Dolittle Series’, Children’s Literature Association Quarterly, Volume 32, Number 4, Winter 2007.