The robot chef has a sense of taste
In Alex Garland’s sci-fi masterpiece Ex Machina, Nathan Bateman (Oscar Isaac) secretly builds an advanced robot with artificial intelligence called Ava (Alicia Vikander). She is, in all the ways that matter, indistinguishable from a human being, save for a robotic body and a – perhaps justified – penchant for violence.
Making machines that can perform tasks and behave similarly to humans is the ultimate goal of robotics, but we have so far come up against the limits of our current technology when we ask robots to perform even relatively simple tasks. Our machines do well enough at highly repeatable or strictly defined tasks, such as exploring the ocean floor or, with the help of human operators, performing surgery. However, things we take for granted, especially those that require real-time feedback from the environment, remain a challenge.
Today, scientists at the University of Cambridge’s Department of Engineering’s Bio-Inspired Robotics Laboratory have taken a delightful step towards building better robots that can perform one of the most critical human tasks: preparing a better breakfast. The results of their work were published in the journal Frontiers in Robotics and AI.
Humans successfully navigate the world through our senses. Taking in environmental stimuli allows us to assess a situation and adjust our behavior. While many robots have rudimentary senses, especially vision and hearing, whole swaths of the world’s sensory experience have remained off-limits to them until now.
Recent experiments by another research team have engineered robotic arms with an impressive sense of touch, ticking another box on the sensory checklist, and this ongoing work adds flavor to the mix. The researchers trained their robotic chef to sample a dish prepared under different conditions to assess how our experience of a meal changes as we eat it.
Professional chefs — and parents hoping to squeeze in a few bites before the family grabs dinner — use a “taste-as-you-go” technique that lets them monitor the flavor profile of the food they are cooking as they prepare it. Not only does this allow a cook to make flavor adjustments before a dish is finished, but it also provides information on how taste changes between taking a bite and swallowing.
Due to changes in texture and the introduction of saliva as we chew, our experience of food evolves as we eat it, and it’s this process that the team wanted their robot to understand. To that end, the researchers fed the robot nine versions of an egg and tomato dish at three different stages of chewing and asked it to produce flavor maps.
In the absence of chewing teeth, the researchers put the dishes in a blender to imitate the different stages of chewing. With no tongue for tasting, they fitted the robot with a conductance probe attached to its arm that acted as a salinity sensor. By poking the probe into the dishes in multiple places, the robot was able to gather data about the taste of the dish and build its flavor maps.
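The probing process described above can be pictured as a simple data-aggregation step: sample a conductance value at several positions across the dish, then average the readings at each position into a per-location saltiness estimate. Here is a minimal sketch of that idea; the positions, readings, and grouping scheme are illustrative assumptions, not the researchers’ actual data or calibration.

```python
# Hypothetical sketch: turning salinity probe readings into a "flavor map".
# All numbers and positions below are made up for illustration.

def build_flavor_map(readings):
    """Group conductance readings by probe position.

    readings: list of (x, y, conductance) tuples sampled across the dish.
    Returns a dict mapping (x, y) -> mean conductance at that point,
    a crude stand-in for a per-location saltiness estimate.
    """
    totals = {}
    counts = {}
    for x, y, value in readings:
        key = (x, y)
        totals[key] = totals.get(key, 0.0) + value
        counts[key] = counts.get(key, 0) + 1
    return {key: totals[key] / counts[key] for key in totals}

# One map per "chewing stage", mirroring the blender stages in the study.
stage_maps = {
    stage: build_flavor_map(samples)
    for stage, samples in {
        "unblended": [(0, 0, 4.2), (0, 1, 5.1), (0, 0, 4.4)],
        "fully_blended": [(0, 0, 4.8), (0, 1, 4.9)],
    }.items()
}
```

Comparing the maps across stages is what lets the system see how apparent saltiness shifts as texture changes, which is the effect the team wanted the robot to capture.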
Using taste maps of foods in different combinations, with different textures, and at different levels of seasoning could allow robots to accurately produce tasty foods based on an individual consumer’s preferences, making the technology a useful complement for commercial and home kitchens. In fact, a previous study by some of the same researchers had their robot making omelets, cookies, pancakes, and pizza. In the case of the omelets, they used human feedback to adjust the robot’s technique and improve its capabilities.
The new built-in tasting capability removes the need for human intervention, at least when it comes to salt, and could eventually result in a fully automated robotic chef that caters to your personal tastes. If this happens, it may be important to remember to tip your metal waiter: a robot that cooks a decent breakfast is better than one that breaks out of its cell and murders you on the way out.