Social Engineering Blogs

An Aggregator for Blogs About Social Engineering and Related Fields

The Humintell Blog July 13, 2013

Mapping Emotions


Courtesy of StockVault

Emotions seem to play a role in most aspects of human interaction and life, yet scientists and philosophers still know relatively little about them. Research on emotions continues to evolve, and Science Codex has reported on one of the newest theories in the science of emotion.

This new theory, “the integrated embodiment theory of emotions,” is outlined in the journal Philosophy and Phenomenological Research. It posits that emotions are formed by integrating bodily perceptions with representations of external objects, events, or states of affairs. That is, emotions are not just perceptions or thoughts but separate mental states that reflect the integration of felt bodily processes with cognitive events.

Prof. Dr. Albert Newen and Dr. Luca Barlassina of the Institute of Philosophy II at the Ruhr-Universität Bochum developed the new theory and argue that it gives a unified and principled account of the relation between emotions and bodily perceptions, the intentionality of emotions, and emotion phenomenology.

The theory is labeled an impure somatic theory of emotions and is contrasted with current pure somatic theories, which posit that emotions are entirely constituted by bodily perceptions: an emotion is nothing but the perception of a bodily state. That is, we do not tremble because we are scared; rather, we are scared because we tremble. “This theory does not, however, consider the cognitive content of many emotions,” says Newen.

The “cognitive theory of emotions” says that emotions are essentially an assessment of the situation based on reason: this dog is dangerous because he is baring his teeth. “This theory is also unsatisfactory,” says Newen, “because it forgets the feelings as a central component of the emotion.” For example, a person can judge that a dog is dangerous and at the same time have no fear because he is an expert in handling dangerous dogs. So the cognitive assessment does not necessarily determine the emotion.

According to Newen and Barlassina, the new theory is superior to the most sophisticated account of emotions to date, that of Jesse Prinz, because Prinz’s theory does not take into account that an emotion can also be directed at an object that is not present or does not even exist.

A related article from Science World Report suggests that scientists may be able to tell exactly how a person feels by mapping their brain: for the first time, researchers have identified which emotion a person is experiencing based solely on brain activity.

This study, published in the journal PLOS ONE, differs from others in that it does not rely on people to describe their own emotional state(s) (i.e., self-report). Instead, it builds on a computational model that identifies individuals’ thoughts of concrete objects.

Amanda Markey, one of the researchers, points out, “Despite manifest differences between people’s psychology, different people tend to neurally encode emotions in remarkably similar ways.”

The researchers also found that emotion signatures aren’t necessarily limited to specific brain regions. Instead, they produce characteristic patterns throughout a number of brain regions. In the future, the researchers plan to use this new identification method to tackle a number of challenging problems in emotion research, including identifying emotions that individuals are actively trying to suppress.
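
In essence, the approach described above treats emotion identification as a pattern-classification problem: summarize each trial’s brain activity as a feature vector spanning many regions, then train a model to predict the emotion label. As a purely illustrative sketch in Python (using simulated data and a generic scikit-learn classifier, not the study’s actual model, features, or recordings), cross-validated decoding might look like this:

# Illustrative sketch only: a cross-validated emotion classifier over
# simulated "brain activity" feature vectors. The feature layout, labels,
# and model choice are hypothetical, not the PLOS ONE study's method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

emotions = ["anger", "disgust", "fear", "happiness", "sadness"]
n_trials_per_emotion = 40
n_features = 200  # e.g., mean activation in 200 regions (hypothetical)

# Simulate data: each emotion gets a distinct distributed activation
# pattern, echoing the idea that signatures span many regions.
X, y = [], []
for label, emotion in enumerate(emotions):
    signature = rng.normal(0.0, 1.0, n_features)
    trials = signature + rng.normal(0.0, 2.0, (n_trials_per_emotion, n_features))
    X.append(trials)
    y.extend([label] * n_trials_per_emotion)
X = np.vstack(X)
y = np.array(y)

# Cross-validated decoding accuracy; chance level here is 1/5 = 20%.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, X, y, cv=5)
print(f"Mean decoding accuracy: {scores.mean():.2f} (chance = {1/len(emotions):.2f})")

The only point of the sketch is that “characteristic patterns throughout a number of brain regions” are exactly what lets a whole-brain feature vector, rather than any single region, carry the information a classifier needs.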

Is this new theory, which treats emotions as separate mental states, superior to the older ones?

Filed Under: Hot Spots, Nonverbal Behavior, Science

The Humintell Blog July 11, 2013

Dr. Matsumoto’s Radio Interview – “View Point” with Ellen Shehadeh


Photo courtesy of StockVault

Listen to the Humintell director’s radio interview on facial expressions, emotions, and culture on 90.5 FM’s “View Point” with Ellen Shehadeh.

“Faces are special because they communicate specific information about our emotional states as well as sometimes our thoughts and our feelings,” Dr. Matsumoto pointed out.

He comments on an investigator’s duty when trying to evaluate truthfulness:

“That’s why being able to read microexpressions, as well as all the other non-verbal as well as the verbal indicators, are aids for an investigator to then follow up, because the follow-up, and how you follow up, and what you’re gonna say, and the content that you’re gonna explore, whether you’re an investigator or psychotherapist or physician or lawyer, that’s the other very important half of the equation of being able to use these kinds of indicators.”

He goes on to talk about the difference between micro and macro facial expressions, specifically the microexpression of fear:

“Now having said that [quote above], I believe that the dynamics of the expressions are gonna be different. For example, if you’re walking into an airport and you’re showing fear, you could be afraid of being caught because you’re carrying some contraband…or you could be afraid of the fact that you forgot where you parked your car or whether you turned off your lights in the garage…so the fact that you’re afraid doesn’t tell you which one that is [what reason you are displaying fear for]. But I do believe that if you’re afraid of being caught, you’re gonna be more likely to hide your fear, whereas if you’re afraid that you forgot to turn off the lights in your car, you’re not gonna be that afraid of wanting to hide that [type of] fear. So the fear is gonna look different, and that’s the difference between micro expressions [trying to conceal the fact that you are afraid] and macro expressions…”

For more information, or to listen to the entire interview, visit KWMR 90.5 FM.

Filed Under: Hot Spots, Nonverbal Behavior, Science

The Humintell Blog July 5, 2013

Infants Recognize Emotions


Photo Courtesy of StockVault

A recent article from Popular Science reports on a new study from psychology professor Ross Flom and colleagues, which found that babies are able to read each other’s emotional expressions as early as 5 months old. The study, published in the journal Infancy, comes right after similar research published by Flom on infants’ ability to understand the moods of dogs, monkeys, and classical music.

Flom explains that while babies are unable to communicate through language, they do learn how to communicate through affect, or emotion. This implies that not only can they read the emotional expressions of their infant peers, but they can also perceive and associate changes in those expressions. Flom points out, “… it is not surprising that in early development, infants learn to discriminate changes in affect.” It is this change in affect that babies are able to “read” in each other, while most adults are left scratching their heads.

The study, conducted at Brigham Young University and co-authored by Professor Lorraine Bahrick and graduate student Mariana Vaillant-Molina of Florida International University, looked at 40 babies ranging from 3.5 to 5 months old.

The study placed baby participants in front of two monitors. One displayed a video of a happy baby and the other displayed a video of an unhappy baby. While the babies watched, researchers played audio of a third baby, either a happy, laughing baby or a sad, crying baby.

Researchers noticed that when the audio featured happy baby noises, the infants focused on the happy-baby video, and when the audio was sad, they looked more at the sad video.

Past studies found that babies (not infants) are able to perceive facial expressions of emotion in familiar adults at 6 months and in all other adults by 7 months. This study takes it a step further, documenting that infants as young as 5 months (but not as young as 3.5 months) have the capability to perceive and recognize emotional expressions in other infants.

Flom notes, “These findings add to our understanding of early infant development by reiterating the fact that babies are highly sensitive to and comprehend some level of emotion.” He goes on to say, “Babies learn more in their first 2 1/2 years of life than they do the rest of their lifespan, making it critical to examine how and what young infants learn and how this helps them learn other things.”

Flom would like to take his recent findings a step further by testing whether infants younger than 5 months are able to demonstrate this same level of perception by watching and hearing clips of themselves.

What do you think? Will babies be able to read emotion even earlier if it’s their own?

Filed Under: Nonverbal Behavior, Science

About

Welcome to an aggregator for blogs about social engineering and related fields. Feel free to take a look around, and make sure to visit the original sites.

If you would like to suggest a site or contact us, use the links below.

Contact

  • Contact
  • Suggest a Site
  • Remove a Site

© Copyright 2025 Social Engineering Blogs · All Rights Reserved