Social Engineering Blogs

An Aggregator for Blogs About Social Engineering and Related Fields

The Humintell Blog June 14, 2017

Anxiety and Ambiguity

Why does that person look so angry?

You don’t have to have any sort of chronic anxiety to understand how easy it is to misunderstand other people’s facial expressions. We often interact with strangers, or even friends, and find ourselves unable to read their emotions, fearing that they are unhappy or angry.

In fact, newly published research suggests that feelings of anxiety do actually make us misread ambiguous facial expressions. Not only are we left unable to accurately determine their emotions, but we are more likely to mistakenly conclude that they are angry.

A group of researchers at the University of Bristol sought to investigate the role that anxiety plays in impairing emotion recognition. They brought together a group of volunteers and exposed them to a series of images showing the same face, but with fifteen different emotional expressions. These ranged from surprise and happiness to disgust and anger, and the volunteers were asked to identify each emotion. A follow-up study expanded this analysis to 45 images.

Sounds pretty straightforward, but here is where the experiment gets really interesting. Each participant was fitted with a facemask that delivered air. Half of the masks delivered normal air, but the other half delivered air containing a large amount of carbon dioxide. The carbon dioxide spiked participants’ heart rates and blood pressure, inducing the physical symptoms of anxiety.

When compared to the control group, which breathed normal air, the participants who inhaled carbon dioxide were about eight percent worse at correctly identifying emotions. Moreover, they tended to perceive anger far more often than happiness.

This is certainly not the first evidence that anxiety biases emotion recognition toward the negative. As this 2016 study outlines, a great deal of research demonstrates that people with social anxiety have trouble recognizing emotion and often attribute anger or sadness to neutral expressions.

Similarly, other disorders, such as depression and eating disorders, can impair one’s ability to read facial expressions.

All of this is perhaps not surprising, however, given the role of the amygdala in the recognition of ambiguous expressions. The amygdala, which is deeply connected with anxiety and fear, is activated when we see people with uncertain expressions, firing according to the level of perceived ambiguity. It is also activated when we attempt to read fear in other people’s faces.

Given that the amygdala is connected with both anxiety and ambiguous expressions, it makes sense that increased levels of anxiety would impair effective emotion recognition.

So perhaps, in your next uncertain social interaction, don’t conclude that the other person is angry with you. Perhaps they are just distracted or not particularly emotive, leading to ambiguous expressions.

For more information on the neurological underpinnings of emotional recognition, check out our past blogs here and here.

Filed Under: Emotion, Science

The Humintell Blog June 7, 2017

Distractions from the Standing Desk

The idea of standing desks is certainly in vogue in the workplace, but is it the best choice?

Proponents point out that sitting for too long can cause serious health problems, and many people even argue that standing at one’s desk can help boost productivity. While there may be some truth in this trendy approach, psychologist Mary Lamia emphasizes the possible downsides of being more in touch with your coworkers.

Anybody who has worked in a cubicle is familiar with how isolating that can feel, putting us out of sight from the rest of the office. According to Dr. Lamia, however, this might be a very good thing, based on our inevitable exposure to other people’s microexpressions.

Essentially, when we are constantly able to look at other people while standing at our desks, we subconsciously read into the expressions of everyone else in the room. Sitting in a cubicle limits our field of vision, but standing not only expands our field of view, it also raises our perspective. Both of these factors increase the amount of emotional information we are forced to process.

As Dr. Lamia says: “You’re like a lightning rod… You don’t just notice your colleagues’ presence—you start to literally imitate their presence.”

With all of these people in our peripheral vision, we subconsciously process their emotions. Even when people are staring blank-faced at a computer screen, they display involuntary microexpressions. As these flit across their faces, our brains seize on the changes, processing the emotions and distracting us from our work.

It is important to remember that reading expressions is not something that is done through careful analysis. Instead, we see a face, and immediately come to recognize what emotion is being displayed. Because this is not a rational or conscious process, it can happen at the subconscious level as well.

As psychologist Dr. Derek Chapman points out, standing desks can also contribute to what is called the “spotlight effect.” This phenomenon distracts us by making us believe that people are paying undue attention to us. It can occur if we are one of only a few people at a standing desk, and it exacerbates Dr. Lamia’s concerns.

That said, despite the risks of empathetic overload, standing desks do have the potential to boost productivity and combat obesity.

Instead of completely rejecting or embracing this new phenomenon, Dr. Chapman urges a level of moderation. He points out that “People perform optimally at a moderate level of arousal… too much and we can’t focus, too little and we’re bored.”

With all of this in mind, we must work to strike a balance, staying conscious of the role that microexpressions play in all of our lives, whether we are aware of them or not.

For more information on how emotional recognition impacts the workplace, see our past blogs here and here.

Filed Under: Emotion

The Humintell Blog May 31, 2017

Emotion in an Artificial Intelligence World

It is an almost omnipresent fear these days that technology is degrading human connections, but could we leverage that same technology to foster closer emotional ties?

We already have enough trouble reading each other’s emotions, and this becomes even harder when we communicate over long distances, whether through email, phone calls, or even video chat. We are not face to face and cannot develop that emotional recognition, as technology, distance, and distracting stimuli add up to interfere with our emotional connections.

Instead, many people compensate with uniform online expressions, like emojis or acronyms conveying laughter or sadness, such as the classic LOL (Laughing Out Loud). While these go some way toward expressing emotion, they cannot replace actual, in-person laughter and lack a great deal of emotional nuance. Moreover, the problem seems intractable, given that modern society forces us to communicate over great distances in order to stay in touch.

That’s where developers like True Emoji, Affectiva, and Brain Power come in. These companies are using a concept called Emotion AI, which attempts to give computers and other forms of artificial intelligence the ability to recognize and understand human emotions. Affectiva developed one such form of Emotion AI, and its software has been used in fascinating ways to teach computers to recognize facial expressions and understand emotions.

True Emoji has used this to address the problem of online communication. They have developed an app that reads a user’s facial expression and develops corresponding personalized emojis. This allows users to select an emoji that fits with their own emotion, rather than relying on a clichéd list of preset icons.

As True Emoji CEO Sumesh Dugar points out: “The Internet has created a huge divide between emotions and communications… How many times have you shown surprise just by sending an emoticon?”

Similarly, Brain Power has put this principle to use trying to teach autistic children how to better recognize emotions. While autistic children have trouble recognizing emotion, this does not mean that they lack empathy. As we discussed in a previous blog, this just means that they can benefit from being taught emotional recognition skills!

Brain Power’s work focuses on the creation of interactive games that respond to the emotions of the participants. Joey Salisbury, the Brain Power director of software development, describes these as “augmented reality games with the goal of creating new, engaging ways for families to play together and grow closer while learning about emotions.”

These are just two applications of this burgeoning field. By abandoning an age-old divide between technology and emotion, we can allow the two to build on each other, creating emotionally intelligent machines and using technology to foster social ties.

For more information on this subject, see our previous blogs here and here.

Filed Under: Emotion, Science, Technology


About

Welcome to an aggregator for blogs about social engineering and related fields. Feel free to take a look around, and make sure to visit the original sites.


© Copyright 2025 Social Engineering Blogs · All Rights Reserved ·