Social Engineering Blogs

An Aggregator for Blogs About Social Engineering and Related Fields

The Humintell Blog July 18, 2017

Prenatal Facial Recognition

We already know that faces are incredibly central to human interaction, but facial recognition may also be fundamental to our brain’s development.

Science has long demonstrated that even newborn infants have a strong preference for human faces over other stimuli. Now, a new study from the University of California, Los Angeles, may have found that our preference for faces begins even before birth itself!

These researchers exposed fetuses to triangles of red dots designed to mimic facial structure, representing the triangle that two eyes and a mouth form in a real face. In fact, past research has shown that such triangles act as face-like stimuli for newborn children.

After projecting these dots into a fetus’s peripheral vision, researchers slowly moved them away from its line of sight. Amazingly, ultrasound images showed that a significant number of fetuses turned their heads to follow the dots. While head-turning still occurred in only a minority of total exposures, the fetuses responded almost three times as often to the face-like triangles as to non-face-like ones.

While some critics have said that it is too early to conclude any level of actual facial recognition, the method of projecting images into the womb has itself drawn praise. Scott Johnson, a developmental psychologist not involved in the study, said the method “opens up all kinds of new doors to understand human development,” adding that it was “very, very exciting.”

While it may be premature to conclude a preference for faces at this stage in development, such a conclusion would be consistent with previous research, which has repeatedly found a preference for human faces among newborn babies within minutes of birth.

For example, a 1974 study showed images of faces to newborns only nine minutes after birth. The newborns followed these faces as they moved for longer than they followed comparable but unintelligible images.

Subsequent research found that, within hours, babies could distinguish their mother’s face from a stranger’s, showing a preference for their mother. What is most striking is the speed at which young humans learn to recognize and differentiate faces.

Similar research has even found that newborns, after only a day, show increased preference towards “beautiful faces.” These researchers contended that such faces better represent the stereotypical or “prototypical” human face, helping to explain these surprising results.

If facial recognition is really this deeply ingrained in our brain’s development, it would help explain the notion of universal emotions that span cultures. Followers of this blog will be familiar with the notion of universal basic emotions, and of the idea that these have an evolutionary origin.

For more information on this, check out our relevant blogs here and here!

Filed Under: Emotion, Science

The Humintell Blog June 14, 2017

Anxiety and Ambiguity

Why does that person look so angry?

You don’t need to suffer from chronic anxiety to understand how easy it is to misread other people’s facial expressions. We often interact with strangers, or even friends, and find ourselves unable to read their emotions, fearing that they are unhappy or angry.

In fact, newly published research suggests that feelings of anxiety really do make us misread ambiguous facial expressions. Not only are we less able to accurately determine others’ emotions, but we are more likely to mistakenly conclude that they are angry.

A group of researchers at the University of Bristol sought to investigate the role anxiety plays in impairing emotion recognition. They brought together a group of volunteers and showed them a series of images of the same face wearing 15 different emotional expressions, ranging from surprise and happiness to disgust and anger, and asked the volunteers to identify each emotion. A follow-up study expanded this analysis to 45 images.

Sounds pretty straightforward, but here is where the experiment gets really interesting. Each participant was given a facemask that pumped air into their lungs. Half of these facemasks simply delivered normal air, but the other half delivered air laced with a large amount of carbon dioxide. The carbon dioxide spiked participants’ heart rates and blood pressure, inducing acute anxiety.

Compared with the control group, which received normal air, the participants who inhaled carbon dioxide were about eight percent worse at correctly identifying emotions. Moreover, they tended to perceive anger much more often than happiness.

This is certainly not the first evidence that anxiety biases emotion recognition toward the negative. As a 2016 study outlines, a great deal of research demonstrates that people with social anxiety have trouble recognizing emotion and often attribute anger or sadness to neutral expressions.

Similarly, other disorders, such as depression and eating disorders, impair one’s reading of facial expressions.

All of this is perhaps not surprising, however, given the role of the amygdala in processing ambiguous expressions. The amygdala, which is deeply connected with anxiety and fear, activates when we see people with uncertain expressions, firing in proportion to the level of perceived ambiguity. It is also activated when we attempt to read fear in other people’s faces.

Given that the amygdala is tied to both anxiety and ambiguous expressions, it makes sense that heightened anxiety would interfere with effective emotion recognition.

So in your next uncertain social interaction, don’t rush to conclude that the other person is angry with you. Perhaps they are just distracted or not particularly emotive, leaving their expression ambiguous.

For more information on the neurological underpinnings of emotional recognition, check out our past blogs here and here.

Filed Under: Emotion, Science

The Humintell Blog May 31, 2017

Emotion in an Artificial Intelligence World

It is an almost omnipresent fear these days that technology is degrading human connections, but could we leverage that same technology to foster closer emotional ties?

We already have enough trouble reading each other’s emotions, and this becomes even harder when we communicate over long distances, whether through email, phone calls, or even video chat. We are not face to face and so cannot draw on in-person emotional recognition, as technology, distance, and distracting stimuli add up to interfere with our emotional connections.

Instead, many people compensate with uniform online expressions, like emojis or acronyms conveying laughter or sadness, such as the classic LOL (laughing out loud). While these go some way toward expressing emotion, they cannot replace actual, in-person laughter and lack a great deal of emotional nuance. Moreover, the problem seems intractable, given that modern society forces us to communicate over great distances in order to stay in touch.

That’s where developers like True Emoji, Affectiva, and Brain Power come in. These companies work with a concept called Emotion AI, which aims to give computers and other forms of artificial intelligence the ability to recognize and understand human emotions. Affectiva developed one such platform, and its software has been used in fascinating ways to teach computers to recognize facial expressions and understand emotions.

True Emoji has used this technology to address the problem of online communication. The company has developed an app that reads a user’s facial expression and generates corresponding personalized emojis, allowing users to select an emoji that matches their actual emotion rather than relying on a clichéd list of preset icons.

As True Emoji CEO Sumesh Dugar points out, “The Internet has created a huge divide between emotions and communications… How many times have you shown surprise just by sending an emoticon?”
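None of these companies publish their models, but the basic pipeline behind an app like this is easy to picture: find a face in the image, classify its expression, and map the predicted label to an icon. Below is a minimal Python sketch of that idea. Only the OpenCV face-detection calls are real API; classify_expression is a hypothetical placeholder for a trained emotion model, and the emoji table is purely illustrative, not True Emoji’s or Affectiva’s actual code.

```python
# Illustrative sketch: face detection -> expression label -> emoji.
# Requires opencv-python; classify_expression() is a hypothetical
# stand-in for a trained emotion-recognition model.
import cv2

EMOJI_FOR = {
    "happiness": "😀",
    "surprise": "😮",
    "anger": "😠",
    "sadness": "😢",
    "neutral": "😐",
}

def classify_expression(face_pixels) -> str:
    # Hypothetical placeholder: a real app would run a trained model
    # (e.g., a CNN) on the cropped face and return one of the labels above.
    return "neutral"

def face_to_emoji(image_path: str) -> str:
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # OpenCV ships a pre-trained frontal-face Haar cascade.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return EMOJI_FOR["neutral"]   # no face found; fall back
    x, y, w, h = faces[0]             # use the first detected face
    label = classify_expression(gray[y:y + h, x:x + w])
    return EMOJI_FOR.get(label, EMOJI_FOR["neutral"])

if __name__ == "__main__":
    print(face_to_emoji("selfie.jpg"))   # hypothetical input file
```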

Similarly, Brain Power has put this principle to use in teaching autistic children to better recognize emotions. While autistic children have trouble recognizing emotion, this does not mean that they lack empathy. As we discussed in a previous blog, it just means that they can benefit from being taught emotion recognition skills!

Brain Power’s work focuses on the creation of interactive games that respond to the emotions of the participants. Joey Salisbury, the Brain Power director of software development, describes these as “augmented reality games with the goal of creating new, engaging ways for families to play together and grow closer while learning about emotions.”

These are just two applications of this burgeoning field. By bridging the age-old divide between technology and emotion, we can let the two build on each other, creating emotionally intelligent machines and using technology to foster social ties.

For more information on this subject, see our previous blogs here and here.

Filed Under: Emotion, Science, Technology
