“Did you ever try to measure a smell?” Alexander Graham Bell once asked an audience of graduands at a high school in Washington DC.
Then he asked the class of 1914, who were probably confused by the question, whether they could tell that one scent was twice as strong as another, or measure the difference between two odours. Eventually, though, he came to the point: “Until you can measure their likenesses and differences, you can have no science of odour,” Bell said. “If you are ambitious to find a new science, measure a smell.”
Scientists at the time understood that the sound of Bell’s voice and the sight of him on stage could be explained by vibrations in air and different wavelengths, but they had no way to explain the odours in the May air. The mechanics behind smell were a mystery, and in many respects they still are. “Unlike sound or vision – where the wavelength and amplitude clearly map to perceptual properties like tone frequency, colour or intensity – the relationship between a chemical’s structure and the underlying perception is not understood in olfaction,” explains Douglas Storace, assistant professor of neuroscience at Florida State University.
“The first thing to remember is how little attention and work has occurred in olfaction versus other fields,” says Alex Wiltschko, chief executive of olfactory AI startup Osmo, as he recalls the hefty neuroscience textbook he was given as a PhD student. “I took callipers and measured the width of the paper that’s used to teach vision and hearing. It’s about three quarters of an inch for vision. It’s about a half an inch for hearing. It’s maybe 30 pages – a few millimetres – for smell.”
Osmo’s stated purpose is to “give computers a sense of smell”, because while we have learned to digitally encode sights and sounds, we have no way of doing so for scents. Wiltschko is trying to change that, and to usher in a new era of olfactory science, by mapping out how we perceive scents.
The nose is basically a chemical detector. We sniff volatile organic compounds, or VOCs, released by a cup of coffee. “These small VOCs bind to certain olfactory receptors, and this binding basically triggers an electric signal that goes to the brain,” explains Cecília Roque, an associate professor of chemistry at Portugal’s Nova School of Science and Technology.
Being able to replicate that process with machines would be valuable. Some VOCs – such as contaminants in food or carcinogens such as benzene – can be harmful and worth detecting before they reach our noses; others might point to dangers such as gas leaks or concealed explosives; and some can indicate medical problems. If someone’s breath smells like freshly mown clover, it may be a sign of liver failure, while sweat with an odour of freshly plucked feathers could suggest a case of rubella.
Researchers have been developing electronic noses to detect certain compounds since the early 80s. While some of these devices are used in industry today, they are limited. “Demonstrations so far have either been very large analytical instruments, or are very narrowly targeted, or have relatively weak selectivity,” says Jacob Rosenstein, an associate professor of engineering at Brown University, who in 2018 co-developed a low-cost e-nose called Trufflebot.
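The basic idea behind an e-nose can be sketched in a few lines of code. In this toy illustration – all compound names, sensor counts and response numbers are invented for the example – each known VOC leaves a characteristic response pattern across a small array of cross-selective sensors, and an unknown sample is identified by which calibrated signature it most resembles:

```python
import math

# Hypothetical calibration data: each known VOC produces a
# characteristic response pattern across four gas sensors.
SIGNATURES = {
    "ethanol": [0.9, 0.2, 0.4, 0.1],
    "benzene": [0.3, 0.8, 0.1, 0.6],
    "acetone": [0.5, 0.5, 0.9, 0.2],
}

def cosine(a, b):
    """Cosine similarity between two sensor-response vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def identify(sample):
    """Return the known VOC whose signature best matches the sample."""
    return max(SIGNATURES, key=lambda voc: cosine(sample, SIGNATURES[voc]))

# A noisy reading that still looks most like the benzene signature.
print(identify([0.28, 0.75, 0.15, 0.55]))  # benzene
```

The weakness Rosenstein describes is visible even here: a device like this can only recognise the handful of signatures it was calibrated on.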
According to some, what olfactory technology needs is a way of mapping molecules’ structures to their perceived smells. “Some molecules look very similar structurally and smell very different, and some look very different but smell very similar,” says Joel Mainland, a professor at the Monell Chemical Senses Centre in Philadelphia. “You’re constantly trying to build a model to fix that problem.”
“You can’t design anything of meaningful complexity without a specification,” adds Wiltschko. “You can’t build a digital camera without the red, green, blue colour model (RGB). You can’t build a microphone without a low to high frequency space. And so the map has to come before the engineering.”
Wiltschko was a member of Mainland’s research team, which published a study on odour mapping in the first half of this year. Wiltschko began the research while at Google Research. The artificial intelligence used was a graph neural network (GNN), trained on two datasets linking molecular structures to odours. The Leffingwell dataset, created in the early 2000s, pairs 3,523 molecules with descriptions of their smells: acetaldehyde ethyl phenylethyl acetal, for example, is said to smell of lilacs and green leaves.
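A GNN treats a molecule as a graph – atoms as nodes, bonds as edges – and repeatedly lets each atom’s description absorb information from its neighbours before predicting odour labels. The following is only a toy sketch of that message-passing step, not the team’s model: the two-feature atom encoding and the hydrogen-free “ethanol” graph are invented for illustration, and a real network would learn weights rather than simply summing.

```python
# Ethanol sketched as a graph: C-C-O (hydrogens omitted).
atoms = ["C", "C", "O"]
bonds = [(0, 1), (1, 2)]

# Crude starting features per atom: [is_carbon, is_oxygen].
features = {0: [1.0, 0.0], 1: [1.0, 0.0], 2: [0.0, 1.0]}

def message_pass(features, bonds):
    """One round: each node adds its neighbours' feature vectors to its own."""
    updated = {}
    for node, feat in features.items():
        agg = list(feat)
        for a, b in bonds:
            if a == node:
                agg = [x + y for x, y in zip(agg, features[b])]
            elif b == node:
                agg = [x + y for x, y in zip(agg, features[a])]
        updated[node] = agg
    return updated

out = message_pass(features, bonds)
# After one round the middle carbon "knows" it sits next to an oxygen.
print(out[1])  # [2.0, 1.0]
```

Stacking several such rounds lets each atom’s vector reflect its wider chemical context, which is what allows structurally different molecules to end up with similar learned representations when they smell alike.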
The work resulted in a “principal odour map” – the olfactory equivalent of the colour palette you might use on a computer. “Anybody who’s looked at a map of colour in Photoshop knows intuitively what’s going on,” says Mainland, and just as the “colour space” in such a map helps us say that purple is closer to red than to green, the team’s odour map allowed them to locate scents in a kind of multi-dimensional “smell space”.
“RGB is three-dimensional, but you can depict it on a flat piece of paper,” Wiltschko says. “There’s three channels of colour information in our eye, but there’s 350 channels of odour information in our nose.
“Whatever map we were going to find was not going to fit on a flat piece of paper. Therefore, the map-making tools we’ve used as scientists in the past were not going to help us. We needed to wait for software, for artificial intelligence, for statistical analysis of patterns in large datasets.”
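The arithmetic of such a map is the same whether the space has three dimensions or hundreds: perceptual similarity becomes distance between points. Here is a minimal sketch – the 5-dimensional “odour” vectors are pure invention for illustration, and real embeddings are far larger:

```python
import math

def distance(a, b):
    """Euclidean distance between two points in a perceptual space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# In RGB colour space, purple sits closer to red than to green:
red, green, purple = (255, 0, 0), (0, 128, 0), (128, 0, 128)
print(distance(purple, red) < distance(purple, green))  # True

# The same calculation works unchanged in a higher-dimensional
# odour embedding (these vectors are invented for the example):
rose = [0.8, 0.1, 0.3, 0.0, 0.2]
lilac = [0.7, 0.2, 0.4, 0.1, 0.2]
petrol = [0.0, 0.9, 0.1, 0.8, 0.6]
print(distance(rose, lilac) < distance(rose, petrol))  # True
```

What cannot be done in hundreds of dimensions is drawing the map on paper – hence Wiltschko’s point that the project had to wait for statistical tools rather than cartographic ones.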
These technologies now allow researchers to map smells and predict the relationships between them. For the study, the group trained a panel of 15 people to describe scents by rating them against 55 labels, including “buttery”, “earthy”, “sulphurous” and “metallic”, then asked them to apply these to 400 different molecules whose odours the GNN odour map had already predicted. The sample molecules were then passed to Christophe Laudamiel – a master perfumer now working with Osmo – for a more nuanced opinion. Mainland’s favourite of Laudamiel’s assessments, for a molecule that scored highly for descriptors such as musty, ozone, and medicinal, was: “the hot tub is near”.
“Some other ones are really interesting combinations,” Laudamiel adds. “One for instance smells very nice, of saffron and hot metal.”
Impressively, the GNN’s odour predictions for the 400 molecules turned out to be closer to the average human description than an individual panellist’s ratings were, more than 50% of the time. “Basically if you were to take that panel of people and pull one person out and put the model in its place, would you do better or worse at describing this average human perception?” says Mainland. “The answer here for most of the molecules, most of the time, is that it does better.”
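Mainland’s swap-one-person-out test can be sketched as a leave-one-out comparison: remove a panellist, average the remaining ratings, and ask whether the model’s rating vector or the held-out panellist’s lands closer to that average. The ratings below are invented purely to show the mechanics:

```python
import math

def dist(a, b):
    """Euclidean distance between two rating vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def mean(vectors):
    """Component-wise average of a list of rating vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def model_beats_panellist(panel, model, held_out):
    """Leave one panellist out: is the model closer to the remaining
    panel's average description than the held-out rater is?"""
    rest = [r for i, r in enumerate(panel) if i != held_out]
    target = mean(rest)
    return dist(model, target) < dist(panel[held_out], target)

# Invented ratings on three descriptors (say: buttery, earthy, metallic).
panel = [[4.0, 1.0, 0.0], [3.5, 1.5, 0.5], [4.5, 0.5, 0.0], [1.0, 4.0, 2.0]]
model = [4.0, 1.0, 0.2]
print(model_beats_panellist(panel, model, held_out=3))  # True
```

In the study this comparison was run across 400 molecules and 55 descriptors; “more than 50% of the time” means the model won the majority of these head-to-heads.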
Osmo is now continuing the work. “Right now, they’re studying 7bn molecules,” Laudamiel says. “If I or you would spend just five minutes per ingredient to smell and study it, five minutes for 7bn molecules, it means you need 66,590 years.”
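Laudamiel’s figure checks out with simple arithmetic (using a 365-day year and no breaks for sleep):

```python
# Sanity-check: 7bn molecules at five minutes of smelling each.
molecules = 7_000_000_000
minutes = molecules * 5                    # 35bn minutes in total
years = minutes // (60 * 24 * 365)         # minutes in a year
print(years)  # 66590
```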
Having accurate predictions of the odours of so many previously unsmelled compounds would be a boon to those in the flavour and fragrance industries – Laudamiel likens it to having a piano that suddenly gains more keys – and this research is likely to have its biggest initial impact on the search for cheaper, safer, and more appealing scents in perfumes, laundry detergents, and anything else with added odour or flavour. Researchers are hoping that their work can be extended much further. “If you think about what digitising images or digitising sounds has done for us, it’s not a thing that you can say very easily in one sentence, right?” says Mainland.
Wiltschko says that agriculture, food preservation, pandemic tracking, and disease prevention could all benefit from digitising smell. Some progress has been made. Deet, or N,N-Diethyl-m-toluamide, is the oldest and most common insect repellent on the market, but it eats into clothes and plastics, can have adverse side-effects, and there’s evidence that some disease-causing mosquitoes may be developing resistance, becoming less sensitive to Deet’s smell. “We’ve actually published a paper showing we can find molecules that are as potent as Deet in human trials,” says Wiltschko.
For Mainland, one of the most exciting aspects of the research is the possibility of discovering “primary odours”. He hopes that a limited set of odours, combined in the right ratios, could be used to create any smell, just as a printer’s few inks can reproduce any colour in a picture. The discovery of primary odours would not only allow us to recreate any scent that our noses can detect, but it could also breathe new life into old novelty formats such as Smell-O-Vision from the 1950s. “It’s very exciting,” Laudamiel says. “We don’t necessarily know that they exist, but it’s very cool if they do.”
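If primary odours exist, mixing them would be, in map terms, a weighted sum: a handful of basis points in smell space whose combinations can reach any other point. The sketch below makes that concrete – the three “primary” vectors, the 4-dimensional space, and the mixing ratios are all invented for illustration, since no such primaries have actually been found:

```python
# Three hypothetical "primary odour" vectors in a 4-D smell space.
primaries = [
    [1.0, 0.0, 0.0, 0.2],   # invented primary A
    [0.0, 1.0, 0.0, 0.1],   # invented primary B
    [0.0, 0.0, 1.0, 0.5],   # invented primary C
]

def mix(weights):
    """Weighted sum of the primaries: one new point in smell space."""
    return [
        sum(w * p[i] for w, p in zip(weights, primaries))
        for i in range(len(primaries[0]))
    ]

# A target smell as 60% A, 30% B, 10% C (ratios made up):
blend = [round(x, 2) for x in mix([0.6, 0.3, 0.1])]
print(blend)  # [0.6, 0.3, 0.1, 0.2]
```

This is the printer analogy in vector form: the hard empirical question is whether any small set of real odorants spans the space our noses can perceive.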
To do this, however, researchers must first map odours not only to individual compounds but also to the elaborate combinations that make up the complexity of everyday smells. “Think of a smell that smells of only one thing,” Laudamiel points out. “People say, ‘oh, cut grass’. OK. Next time you go and you smell cut grass, whether on the ground or as you’re mowing the lawn, I guarantee you, it’s going to be grassy. It’s going to be mushroomy. It’s going to be earthy. It’s going to be maybe mouldy or musty or appley.”
A common problem with deep-learning AI is that it is a black box. While the results are impressive and potentially useful, they don’t necessarily bring us closer to understanding the biological workings of smell. “Though there are connections, the relationship between chemical structure and qualitative olfactory perception is not directly linked,” says Rachel Herz, of the department of psychiatry and human behaviour at Brown University. “The human level is influenced by a multitude of variables ranging from experience, context, and language to individual differences in the genetic expression of olfactory receptors.”
Ultimately, this may be just one small step towards understanding olfaction, but more than 100 years after Alexander Graham Bell asked whether we can measure the difference between two odours, the answer now appears to be “yes”.