Social Engineering Blogs

An Aggregator for Blogs About Social Engineering and Related Fields

The Humintell Blog February 24, 2025

Can Artificial Intelligence (AI) Read Animal Emotions?

As technology advances, scientists from around the globe have been investigating the use of AI to help recognize animal pain signals.

Through computerized facial expression analysis, this AI technology can quickly and accurately recognize pain signals in animals. In some cases, the AI performs the task even better than human observers do.

This AI technology has been used in animals from sheep to horses to cats.

One example is the Intellipig system, developed by scientists at the University of the West of England Bristol (UWE) and Scotland's Rural College (SRUC).

Intellipig examines photos of pigs’ faces and notifies farmers if there are signs of pain, sickness, or emotional distress.

Facial Expressions in Animals

Scientists assess an animal's level of pain by looking for telltale muscle movements around the eyes, ears, and other facial features. AI systems make similar judgments by measuring the distances between "landmarks" on the face.

Like humans, animals convey how they're feeling through their facial expressions. In fact, we share 38% of our facial movements with dogs, 34% with cats, and 47% with primates and horses.

But, as an article in Science points out, "the anatomical similarities don't mean we can read animals' faces like those of fellow humans. So, researchers studying animal communication often infer what an animal is experiencing through context."

Pain is one example: researchers can induce mild discomfort or watch for pain signals after an invasive procedure such as castration.

After spending countless hours observing the faces of animals in painful or stressful situations, scientists compare them against the faces of animals that are free of pain or stress.

From these comparisons, scientists developed "grimace scales," which measure how much pain or stress an animal is experiencing based on the movement of its facial muscles.

In addition, experts have adapted the Facial Action Coding System (FACS) used on humans and become skilled at coding facial movements in animals (AnimalFACS).

Remarkably, FACS has so far been adapted for eight different species, and the manuals are freely accessible through the animalfacs.com website:

  • ChimpFACS: common chimpanzees
  • MaqFACS: rhesus macaques
  • GibbonFACS: hylobatid species
  • OrangFACS: orangutans
  • DogFACS: domestic dogs
  • CatFACS: cats
  • EquiFACS: domestic horses
  • CalliFACS: marmoset species

However, coding work is incredibly tedious, and human coders need 2 to 3 hours to code 30 seconds of video.

This is where AI comes in.

AI can do the same task almost instantaneously, but first it must be taught.

Teaching AI to Read Animal Faces

AI systems are becoming faster and more accurate than humans at determining whether an animal is in pain. That’s partly because they can identify the tiniest muscle movements and find new indicators of pain that humans are not even aware of.

At the University of Haifa, researcher Anna Zamansky and her team have been using AI to pinpoint the subtle signs of discomfort in animals' faces.

There are many steps in teaching AI to read animal faces.

These steps include:

  1. Teaching the AI to identify the parts of the face crucial to creating expressions (done by manually flagging the regions associated with specific muscle movements).
  2. Feeding the AI a large set of landmarked photos so it learns to find the landmarks on its own.
  3. Having the AI identify specific facial expressions by analyzing the distances between landmarks.
  4. Cross-referencing those expressions against grimace scales to determine signs of pain or distress (a rough code sketch of steps 3 and 4 follows below).
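
To make these steps concrete, here is a minimal, illustrative Python sketch of steps 3 and 4, not the researchers' actual code: it turns a set of detected landmark coordinates into pairwise-distance features and scores how far they deviate from a pain-free baseline, in the spirit of a grimace scale. The landmark names, coordinates, and weights are all hypothetical placeholders.

    import numpy as np

    # Hypothetical landmark set; a trained detector (step 2) would predict these
    # (x, y) points from a photo of the animal's face.
    LANDMARKS = ["eye_inner", "eye_outer", "ear_base", "ear_tip", "muzzle_top", "muzzle_bottom"]

    def pairwise_distances(points: np.ndarray) -> np.ndarray:
        """Step 3: describe an expression as the distance between every pair of landmarks."""
        diffs = points[:, None, :] - points[None, :, :]
        dists = np.sqrt((diffs ** 2).sum(axis=-1))
        upper = np.triu_indices(len(points), k=1)
        return dists[upper]  # one feature per landmark pair

    def grimace_score(features: np.ndarray, baseline: np.ndarray, weights: np.ndarray) -> float:
        """Step 4 (illustrative): weight deviations from a pain-free baseline, the way a
        grimace scale weights changes such as ear flattening or eye narrowing."""
        return float(weights @ np.abs(features - baseline))

    # Toy usage: a relaxed face vs. one with a narrowed eye and a pulled-back ear tip.
    relaxed = np.array([[0, 0], [2, 0], [3, 2], [4, 4], [1, -2], [1, -3]], dtype=float)
    tense = relaxed.copy()
    tense[1] -= [0.5, 0.0]   # eye appears narrower
    tense[3] -= [0.0, 1.5]   # ear tip pulled back

    baseline = pairwise_distances(relaxed)
    weights = np.ones_like(baseline)  # a real scale would weight features unevenly
    print("grimace-style score:", grimace_score(pairwise_distances(tense), baseline, weights))

In practice, the weighting would be learned from expert-annotated examples rather than set by hand, which is what the training described next accomplishes.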

Zamansky's team trained their AI on photos of Labrador retrievers that were either eagerly anticipating a treat or could see the treat but were prevented from reaching it.

Their AI was able to successfully detect whether the dog was happy or frustrated 89% of the time.
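
For intuition about the kind of result being reported, the task can be framed as ordinary supervised classification: landmark-derived features go in, a binary label ("anticipating" vs. "frustrated") comes out, and accuracy is measured on held-out photos. The sketch below uses synthetic placeholder features and an off-the-shelf scikit-learn classifier purely for illustration; it is not the team's actual model or data.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Placeholder data: in the real study each row would be landmark-based
    # measurements from one photo; here two overlapping clusters stand in for
    # the "anticipating" and "frustrated" conditions so the script runs end to end.
    n_per_class, n_features = 200, 15
    anticipating = rng.normal(loc=0.0, scale=1.0, size=(n_per_class, n_features))
    frustrated = rng.normal(loc=0.8, scale=1.0, size=(n_per_class, n_features))

    X = np.vstack([anticipating, frustrated])
    y = np.array([0] * n_per_class + [1] * n_per_class)  # 0 = anticipating, 1 = frustrated

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y)

    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.0%}")

In the real experiment, of course, the labels come from the controlled treat/no-treat setup and the features from the landmark pipeline described above.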

The AI also successfully differentiated happy and frustrated horses in the same experiment.

Despite some limitations of the technology, Zamansky's team is about to release an AI-based app that will let cat owners scan their pets' faces for 30 seconds and receive easy-to-read messages.

The technology also extends to horses: researchers in the Netherlands have developed a similar app that scans resting horses' faces and bodies to estimate their pain levels.

This app could potentially be used in equestrian competitions to improve animal welfare and fairness in the sport.

The post Can Artificial Intelligence (AI) Read Animal Emotions? first appeared on Humintell | Master the Art of Reading Body Language.

Filed Under: Emotion, Technology

The Humintell Blog October 27, 2020

To Face the Fear, Don’t Press Mute

Guest Blog by AnnMarie Baines

Even in the midst of a pandemic, people still feel the pressure to appear “perfect”. Unfortunately, the pressure to be perfect only increases the fear of public speaking, regardless of a speaker’s experience level.

As a public speaking coach and founder of the non-profit The Practice Space, I have observed more people using the virtual world to hide and avoid that fear completely. When we turn off our cameras and put ourselves on mute, it is easier to opt out of public speaking and observe discussions at a distance rather than be spotlighted and risk judgment and uncertainty.

While it is more equitable to give people the option of whether to turn on video, as a woman of color, I also know it is important not to silence ourselves. For those whose voices are underrepresented in powerful places, including women, youth, and people of color, the fear of public speaking is already entrenched in histories of oppression and discrimination that instruct us to feel that our voice is somehow inferior. It is even more essential to push back on the conditions that are set up to push diverse voices aside.

Rather than pressing mute, facing the fear of public speaking begins with a change in mindset. Public speaking is infinitely scarier when we view it as a test or feel like we are on trial, defending ourselves. If we view communication as a tool for human connection, then public speaking becomes a chance to teach and enhance understanding. Confident communication emerges when we listen, teach, commit to our ideas, and let go when things don't go as planned.

Tip #1: Value your connection to the audience.

Regardless of whether we are online or in person, all the anxiety-coping strategies in the world will not help until a speaker personally reframes the goal of public speaking. When the goal is still to “get through the speech unscathed” or “deliver a presentation without any mistakes” or “deliver everything perfectly from memory”, the irony is that speakers are much more likely to be nervous and unsatisfied with their performance. Instead, public speakers need to frame goals that prioritize the effect they want to have on their audience. For instance, public speaking goals such as teaching new ideas, inspiring connections, communicating content that people remember, and encouraging follow-up conversations do not depend on perfection. Rather than having goals that are all about you, effective communication should value connection over seamless presentation.

Tip #2: Expect and embrace discomfort.

Everything in 2020 is deeply uncomfortable, and communicating over a webcam is no exception. That said, for many, public speaking has always been an uncomfortable and somewhat unnatural experience, even before the pandemic hit. Many speakers and performers use visualization techniques, where they prepare themselves by imagining the result they want. Instead of imagining situations where you don’t feel any nerves at all, it can help to imagine the jitters you might have at the start and then imagine them disappearing as you sink into the moment and connect with your audience. It can also help to embrace the reality that public speaking will sometimes feel awful, but also that the discomfort won’t last forever — sometimes, it is only a few minutes.

Tip #3: When you can, always speak about what excites you.

Given how fearful and anxious some people can feel about public speaking, the discomfort is only worthwhile if your message is personally important to you. Sometimes, when I am faced with a speech that is particularly nerve-wracking to me, I will say to myself, “right now, this work is more important than my fears.” While it is always useful to think about what your audience might want to hear, at the end of the day, every speech should always derive from content that drives, motivates, and excites you. When you talk about what genuinely interests you, it is easier to get lost in your message and drown out evil voices of self-critique and doubt. The byproduct is that your speaking delivery will automatically be better because you are speaking from the heart instead of from a place of stress.

When I interview my students about their growth and confidence, it always surprises me that they never say that their nerves have gone away. Even the most advanced students say that the fear is always there, but that they have learned to embrace it. In the words of one of my high school students, who was a champion public speaker and state champion finalist, “So I’m still kind of afraid of talking in front of people and I try to avoid it as much as possible or get other people to go before me, but I’m just kind of on terms with it now. I can choose to rise above it rather than let it inhibit me.” Before we voluntarily put ourselves on mute, take a moment to reflect on why. If it is to listen deeply and learn from others, then mute away. But if it is to avoid fears, then don’t be the one to silence yourself because there are plenty of people out there who will.

Read a previous guest blog by AnnMarie on how to feel less nervous about speaking in public.

The post To Face the Fear, Don’t Press Mute first appeared on Humintell.

Filed Under: Emotion, Public Speaking, Technology

The Humintell Blog August 17, 2020

Featured on National Geographic! How Facial Expressions Help Robots Communicate With Us

National Geographic Explores: A wrinkled nose, raised eyebrows, a frowning mouth—all can say a lot without uttering a single word.

Facial expressions are the closest thing humans have to a universal language, and they could change our relationship with androids and other human-like robots.

Filed Under: Emotion, Nonverbal Behavior, Psychology, Technology


About

Welcome to an aggregator for blogs about social engineering and related fields. Feel free to take a look around, and make sure to visit the original sites.

If you would like to suggest a site or contact us, use the links below.

Contact

  • Contact
  • Suggest a Site
  • Remove a Site
