Tiny faces, big expressions: Reading rodent faces

December 19th, 2023

Written by: Abby Lieberman

Although you probably don’t think about your facial expressions every day, they are a fundamental part of human communication. We rely on reading the facial expressions of others to have successful social interactions and to learn how others are feeling1. Despite decades of research on emotions, understanding how our brains generate and process them remains a challenge in neuroscience. For ethical reasons, there are many experiments we can’t perform on people, so to better understand emotions we rely on studies in animal models. But how does one know how an animal is feeling? We can’t ask animals what they are thinking, as much as we would like to! Most present-day scientists agree that animals have “emotion-like states” that have been preserved over the course of evolution, but whether animals have emotions similar to ours, how we define animal emotions, and how we study them are all up for debate2.

It’s a bit easier to tell what animals like cats, dogs, or monkeys may be feeling based on body language like tail wags and bared teeth, and vocalizations like purrs, barks, and aggressive calls (see our recent post for more!). But what about smaller animals like mice? We care about mouse facial expressions because these tiny friends are among the most commonly used animals in neuroscience research. Mice make noticeable facial expressions when they are in pain, and this knowledge has been used to improve how researchers care for mice in their laboratories, leading to the development of an official pain assessment scale called the Mouse Grimace Scale3. Mice may also make more subtle facial expressions when they are experiencing other emotions, but these are hard to see with the naked eye. Neuroscientists will therefore benefit from tools that track mouse facial expressions quickly and precisely. Fortunately, research groups focused on this challenge have recently developed artificial intelligence methods to track changes in mouse facial expressions.

Although they are tiny critters, it turns out that mice do make facial expressions in response to different experiences, and we can see them with help from computers! A groundbreaking paper published in 2020 by neuroscientists at the Max Planck Institute demonstrated that mice make different faces in response to different experiences4. The researchers took videos of mice while they relaxed and compared them to videos taken while the mice tasted sugar water (to trigger pleasure), tasted bitter water (to trigger disgust), or received an injection of a chemical that made them feel mildly sick for a short time. Afterwards, they used artificial intelligence to compare the relaxed mouse faces to the faces the mice made during the different experiences.
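To get a feel for how a computer can “read” a mouse face, here is a minimal sketch of the general idea: each video frame is summarized as a set of image features, and a simple classifier then decides which experience a new frame most resembles. The study itself described each frame with histogram-of-oriented-gradients (HOG) features; everything else below (the simulated frames, parameter values, and nearest-neighbor classifier) is an illustrative stand-in, not the authors’ actual pipeline.

```python
# Sketch: turn face-video frames into feature vectors and compare them to
# labeled examples. Frames, labels, and parameters here are illustrative.
import numpy as np
from skimage.feature import hog
from sklearn.neighbors import KNeighborsClassifier

def frame_to_features(frame):
    """Summarize one grayscale face frame as a HOG descriptor."""
    return hog(frame, orientations=8, pixels_per_cell=(32, 32),
               cells_per_block=(1, 1))

# Hypothetical data: a stack of grayscale frames (n_frames, height, width),
# each labeled by the event that preceded it.
rng = np.random.default_rng(0)
frames = rng.random((60, 256, 256))
labels = np.repeat(["neutral", "sweet", "bitter"], 20)

X = np.array([frame_to_features(f) for f in frames])

# Train on half of the frames, then check whether held-out frames are
# assigned to the correct expression category.
clf = KNeighborsClassifier(n_neighbors=3).fit(X[::2], labels[::2])
print(clf.score(X[1::2], labels[1::2]))
```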

Remarkably, the researchers found that different mice reliably made similar facial expressions in response to the same experiences–just like how we can reliably pick out whether another human is smiling or wincing in disgust. It turns out that for mice, subtle changes in the position of the ears, nose, and (as in humans) mouth are particularly important for understanding what the animal may be feeling. When the mice tasted sweet water, their noses relaxed and pointed downwards while their ears moved forward. When they tasted bitter quinine, they scrunched up their noses and swept their ears straight back. A second exciting discovery was that the animals’ facial expressions became more intense as the experiences they were having became more intense. For example, as mice drank water droplets that became more and more bitter, they looked more disgusted and scrunched their noses harder.

Once the mouse facial expressions had been categorized, the authors wanted to look inside the brain and see which neurons might be related to each facial expression. They used a technique called two-photon calcium imaging, which lets researchers record the activity of individual neurons over time. The authors recorded from a brain region called the insular cortex, which is known to be involved in emotion. They found some neurons in the insular cortex that became active during each of the experiences in the study. They also found other neurons whose activity was strongly correlated with the different faces the mouse made.
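The logic of relating single neurons to facial expressions can be illustrated with a small toy example: for every neuron, compute the correlation between its activity trace and a frame-by-frame score of how strongly the face shows a given expression. The arrays below are simulated stand-ins, not the study’s recordings or its exact analysis.

```python
# Toy illustration: correlate each neuron's calcium trace with a
# frame-by-frame "expression strength" score. All data are simulated.
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_frames = 100, 1000

# Simulated "disgust face" strength over time (e.g., similarity of each video
# frame to a prototypical bitter-taste expression), scaled to [0, 1].
expression_strength = np.clip(rng.normal(0, 1, n_frames).cumsum(), 0, None)
expression_strength /= expression_strength.max() + 1e-9

# Simulated calcium traces: the first 10 neurons track the expression,
# the rest are unrelated noise.
traces = rng.normal(0, 1, (n_neurons, n_frames))
traces[:10] += 2.0 * expression_strength

# Pearson correlation between every neuron and the expression time course.
corr = np.array([np.corrcoef(t, expression_strength)[0, 1] for t in traces])

# Neurons whose activity is most strongly coupled to the facial expression.
print("expression-coupled neurons:", np.where(corr > 0.3)[0])
```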

Since this paper came out, other research groups have developed exciting new tools to track facial movements in mice. One cutting-edge tool, published this year, is called Facemap5. While the Max Planck team looked at brain activity in the insular cortex, they did not examine how facial expressions relate to activity across many brain regions at once. Facemap goes a step further by relating facial expressions to brain activity recorded across many parts of the brain simultaneously. Facemap uses artificial intelligence to track several points across the mouse face (such as the nose and whiskers). While the mice’s faces are being filmed, large-scale brain recordings are performed at the same time. With just a few steps, Facemap can learn to predict the activity of thousands of neurons across the mouse brain from how the mouse moves its face alone! Check out this video to see what the key tracking points look like on a mouse’s face.
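Conceptually, “predicting neurons from face movement” boils down to fitting a model that maps the tracked face keypoints at each video frame onto simultaneously recorded neural activity, then testing that model on held-out frames. Facemap ships its own tracker and deep-network models; the sketch below instead uses made-up keypoint traces and plain ridge regression purely to convey the idea.

```python
# Conceptual sketch: predict neural activity from tracked face keypoints.
# Keypoints, neural data, and the regression model are illustrative stand-ins.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_frames, n_keypoints, n_neurons = 5000, 13, 300

# Hypothetical tracked face keypoints (x/y coordinates per point, per frame).
keypoints = rng.normal(0, 1, (n_frames, n_keypoints * 2))

# Hypothetical simultaneous neural recording, partly driven by face movement.
weights = rng.normal(0, 1, (n_keypoints * 2, n_neurons))
neural = keypoints @ weights + rng.normal(0, 2, (n_frames, n_neurons))

# Fit on most frames, then measure how much held-out neural activity
# the face movements alone can explain.
X_train, X_test, y_train, y_test = train_test_split(
    keypoints, neural, test_size=0.2, random_state=0)

model = Ridge(alpha=1.0).fit(X_train, y_train)
print("variance explained on held-out frames:", model.score(X_test, y_test))
```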

Hopefully, facial tracking will become commonplace in mouse behavioral neuroscience studies. The ability to track and interpret mouse facial expressions, and to pair those expressions with brain activity, will radically transform mouse neuroscience experiments. “Being able to measure the emotion state of an animal can help us identify the ‘how’ and ‘where’ in the brain, and hopefully get hints at how emotions arise in humans too,” explained Dr. Nadine Gogolla, the senior author on the first study6. Ultimately, these techniques will help us apply findings from mouse research to human emotion studies, with the goal of improving care for neuropsychiatric conditions.

References

  1. Erickson, K., & Schulkin, J. (2003). Facial expressions of emotion: a cognitive neuroscience perspective. Brain and cognition, 52(1), 52–60. https://doi.org/10.1016/s0278-2626(03)00008-3
  2. Anderson, D. J., & Adolphs, R. (2014). A framework for studying emotions across species. Cell, 157(1), 187–200. https://doi.org/10.1016/j.cell.2014.03.003
  3. National Centre for the Replacement Refinement & Reduction of Animals in Research (2021). Grimace scale: Mouse. https://www.nc3rs.org.uk/3rs-resources/grimace-scales/grimace-scale-mouse
  4. Dolensek, N., Gehrlach, D. A., Klein, A. S., & Gogolla, N. (2020). Facial expressions of emotion states and their neuronal correlates in mice. Science (New York, N.Y.), 368(6486), 89–94. https://doi.org/10.1126/science.aaz9468
  5. Syeda, A., Zhong, L., Tung, R., et al. (2023). Facemap: a framework for modeling neural activity based on orofacial tracking. Nature Neuroscience. https://doi.org/10.1038/s41593-023-01490-6
  6. Andrew, S. (2020). Mice make different faces depending on how they feel – and that could impact how we treat mood disorders, a new study says. CNN. https://www.cnn.com/2020/04/02/world/mice-facial-expressions-scn-trnd/index.html

Cover photo by Anastasiya Kolpakova on Pixabay
