Technology Archives - Social Engineering Blogs
http://www.socialengineeringblogs.com/category/technology/
An Aggregator for Blogs About Social Engineering and Related Fields

Can Artificial Intelligence (AI) Read Animal Emotions?
https://www.humintell.com/2025/02/can-artificial-intelligence-ai-read-animal-emotions/?pk_campaign=rss_feed&pk_kwd=can-artificial-intelligence-ai-read-animal-emotions
Mon, 24 Feb 2025 19:12:58 +0000

The post Can Artificial Intelligence (AI) Read Animal Emotions? first appeared on Humintell | Master the Art of Reading Body Language.

The post Can Artificial Intelligence (AI) Read Animal Emotions? appeared first on Social Engineering Blogs.

As technology advances, scientists from around the globe have been investigating the use of AI to help recognize animal pain signals.

Through computerized facial expression analysis, this AI technology can quickly and accurately recognize pain signals in animals. In some cases, AI is better at this task than some humans!

This AI technology has been used in animals from sheep to horses to cats.

An example includes the Intellipig System developed by scientists at the University of the West of England Bristol (UWE) and Scotland’s Rural College (SRUC).

Intellipig examines photos of pigs’ faces and notifies farmers if there are signs of pain, sickness, or emotional distress.

Facial Expressions in Animals

Scientists assess an animal’s level of pain by looking for telltale muscle movements around the eyes, ears, and other facial features. Artificial intelligence (AI) systems make similar judgments by measuring the distances between “landmarks” on the face.

Like humans, animals convey how they’re feeling through their facial expressions. In fact, humans share 38% of their facial movements with dogs, 34% with cats, and 47% with primates and horses.

But, as an article in Science points out, “the anatomical similarities don’t mean we can read animals’ faces like those of fellow humans. So, researchers studying animal communication often infer what an animal is experiencing through context.”

Pain is one example: researchers can induce mild discomfort in study animals, or watch for pain signals after an invasive procedure such as castration.

After spending countless hours observing the faces of animals in painful or stressful situations, scientists can then compare them against the faces of animals that are pain- and stress-free.

As a result, scientists have developed “grimace scales,” which provide a measure of how much pain or stress an animal is experiencing based on the movement of its facial muscles.

In addition, paralleling the Facial Action Coding System (FACS) used on humans, experts have become skilled at coding facial movements in animals (AnimalFACS).

Amazingly, the FACS system has at present been adapted for 8 different species, and the manuals are freely accessible through the animalfacs.com website:

  • ChimpFACS: common chimpanzees
  • MaqFACS: rhesus macaques
  • GibbonFACS: hylobatid species
  • OrangFACS: orangutans
  • DogFACS: domestic dogs
  • CatFACS: cats
  • EquiFACS: domestic horses
  • CalliFACS: marmoset species

However, coding work is incredibly tedious, and human coders need 2 to 3 hours to code 30 seconds of video.

This is where AI comes in.

AI can do the same task almost instantaneously, but first it must be taught.

Teaching AI to Read Animal Faces

AI systems are becoming faster and more accurate than humans at determining whether an animal is in pain. That’s partly because they can identify the tiniest muscle movements and find new indicators of pain that humans are not even aware of.

At the University of Haifa, scientist Anna Zamansky and her team have been using AI to pinpoint the subtle signs of discomfort in animals’ faces.

There are many steps in teaching AI to read animal faces.

These steps include:

  1. Teaching the AI to identify the parts of the face crucial to creating expressions (done by manually flagging facial regions associated with specific muscle movements).
  2. Feeding the AI a plethora of landmarked photos to teach it to find landmarks on its own.
  3. Having the AI identify specific facial expressions by analyzing the distances between landmarks.
  4. Cross-referencing those expressions against grimace scales to determine signs of pain or distress.
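A minimal sketch of steps 3 and 4, using hypothetical landmark coordinates and made-up feature pairs (for illustration only; this is not the Haifa team’s actual model):

```python
import math

# Hypothetical 2D landmark coordinates for two snapshots of an animal's face.
# In a real pipeline these would come from the landmark-detection model of step 2.
RELAXED = {"eye_outer": (10.0, 20.0), "eye_inner": (18.0, 20.0), "ear_tip": (5.0, 5.0)}
TENSE = {"eye_outer": (10.0, 20.0), "eye_inner": (16.5, 20.5), "ear_tip": (7.0, 8.0)}

def landmark_features(landmarks):
    """Step 3: reduce landmarks to distances between selected pairs."""
    return {
        "eye_width": math.dist(landmarks["eye_outer"], landmarks["eye_inner"]),
        "ear_to_eye": math.dist(landmarks["ear_tip"], landmarks["eye_outer"]),
    }

def grimace_score(landmarks, baseline):
    """Step 4, crudely: total relative deviation from a pain-free baseline.

    Higher scores mean the face has drifted further from its relaxed state.
    """
    feats = landmark_features(landmarks)
    base = landmark_features(baseline)
    return sum(abs(feats[k] - base[k]) / base[k] for k in feats)

print(f"relaxed vs itself: {grimace_score(RELAXED, RELAXED):.3f}")  # 0.000
print(f"tense vs relaxed:  {grimace_score(TENSE, RELAXED):.3f}")
```

A real grimace-scale model would weight specific action units rather than summing raw distance deviations, but the landmark-to-distance-to-score flow is the same.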

Zamansky’s team trained their AI on photos of Labrador retrievers who were either eagerly anticipating a treat or were able to see the treat but were prevented from reaching it.

Their AI was able to successfully detect whether the dog was happy or frustrated 89% of the time.

The AI also successfully differentiated happy and frustrated horses in the same experiment.

Despite some limitations of the technology, Zamansky’s team is about to release an AI-based app that will allow cat owners to scan their pets’ faces for 30 seconds and get easy-to-read messages.

The technology also extends to horses: researchers in the Netherlands have developed a similar app that scans resting horses’ faces and bodies to estimate their pain levels.

This app could potentially be used in equestrian competitions to improve animal welfare and fairness in the sport.

To Face the Fear, Don’t Press Mute
https://www.humintell.com/2020/11/to-face-the-fear-dont-press-mute/?pk_campaign=rss_feed&pk_kwd=to-face-the-fear-dont-press-mute
Tue, 27 Oct 2020 15:20:33 +0000

The post To Face the Fear, Don’t Press Mute first appeared on Humintell.

The post To Face the Fear, Don’t Press Mute appeared first on Social Engineering Blogs.

Guest Blog by AnnMarie Baines

Even in the midst of a pandemic, people still feel the pressure to appear “perfect”. Unfortunately, the pressure to be perfect only increases the fear of public speaking, regardless of a speaker’s experience level.

As a public speaking coach and founder of the non-profit The Practice Space, I have observed more people using the virtual world to hide from that fear and avoid it completely. By turning off our cameras and putting ourselves on mute, it is easier to opt out of public speaking and observe discussions at a distance than to be spotlighted and risk judgment and uncertainty.

While it is more equitable to give people the option of whether to turn on video, as a woman of color, I also know it is important not to silence ourselves. For those whose voices are underrepresented in powerful places, including women, youth, and people of color, the fear of public speaking is already entrenched in histories of oppression and discrimination that instruct us to feel that our voice is somehow inferior. It is even more essential to push back on the conditions that are set up to push diverse voices aside.

Instead of pressing mute, facing the fear of public speaking begins with a change in mindset. Public speaking is far scarier when we view it as a test, or feel we are on trial and must defend ourselves. If we view communication as a tool for human connection, then public speaking becomes a chance to teach and enhance understanding. Confident communication emerges when we listen, teach, commit to our ideas, and let go when things don’t go as planned.

Tip #1: Value your connection to the audience.

Regardless of whether we are online or in person, all the anxiety-coping strategies in the world will not help until a speaker personally reframes the goal of public speaking. When the goal is still to “get through the speech unscathed” or “deliver a presentation without any mistakes” or “deliver everything perfectly from memory”, the irony is that speakers are much more likely to be nervous and unsatisfied with their performance. Instead, public speakers need to frame goals that prioritize the effect they want to have on their audience. For instance, public speaking goals such as teaching new ideas, inspiring connections, communicating content that people remember, and encouraging follow-up conversations do not depend on perfection. Rather than having goals that are all about you, effective communication should value connection over seamless presentation.

Tip #2: Expect and embrace discomfort.

Everything in 2020 is deeply uncomfortable, and communicating over a webcam is no exception. That said, for many, public speaking has always been an uncomfortable and somewhat unnatural experience, even before the pandemic hit. Many speakers and performers use visualization techniques, where they prepare themselves by imagining the result they want. Instead of imagining situations where you don’t feel any nerves at all, it can help to imagine the jitters you might have at the start and then imagine them disappearing as you sink into the moment and connect with your audience. It can also help to embrace the reality that public speaking will sometimes feel awful, but also that the discomfort won’t last forever — sometimes, it is only a few minutes.

Tip #3: When you can, always speak about what excites you.

Given how fearful and anxious some people can feel about public speaking, the discomfort is only worthwhile if your message is personally important to you. Sometimes, when I am faced with a speech that is particularly nerve-wracking to me, I will say to myself, “right now, this work is more important than my fears.” While it is always useful to think about what your audience might want to hear, at the end of the day, every speech should always derive from content that drives, motivates, and excites you. When you talk about what genuinely interests you, it is easier to get lost in your message and drown out evil voices of self-critique and doubt. The byproduct is that your speaking delivery will automatically be better because you are speaking from the heart instead of from a place of stress.

When I interview my students about their growth and confidence, it always surprises me that they never say that their nerves have gone away. Even the most advanced students say that the fear is always there, but that they have learned to embrace it. In the words of one of my high school students, who was a champion public speaker and state champion finalist, “So I’m still kind of afraid of talking in front of people and I try to avoid it as much as possible or get other people to go before me, but I’m just kind of on terms with it now. I can choose to rise above it rather than let it inhibit me.” Before we voluntarily put ourselves on mute, take a moment to reflect on why. If it is to listen deeply and learn from others, then mute away. But if it is to avoid fears, then don’t be the one to silence yourself because there are plenty of people out there who will.

Read a previous guest blog by AnnMarie on how to feel less nervous about speaking in public 

Featured on National Geographic! How Facial Expressions Help Robots Communicate With Us
https://www.humintell.com/2020/08/how-facial-expressions-help-robots-communicate-with-us/?pk_campaign=rss_feed&pk_kwd=featured-on-national-geographic-how-facial-expressions-help-robots-communicate-with-us
Mon, 17 Aug 2020 14:30:41 +0000

The post Featured on National Geographic! How Facial Expressions Help Robots Communicate With Us appeared first on Social Engineering Blogs.

National Geographic Explores: A wrinkled nose, raised eyebrows, a frowning mouth—all can say a lot without uttering a single word.

Facial expressions are the closest thing humans have to a universal language, and that could change our relationship with androids and other human-like robots.

Nonverbal Cues in the 21st Century
https://www.humintell.com/2017/11/nonverbal-cues-in-the-21st-century/?pk_campaign=rss_feed&pk_kwd=nonverbal-cues-in-the-21st-century
Wed, 01 Nov 2017 22:44:27 +0000

The post Nonverbal Cues in the 21st Century appeared first on Social Engineering Blogs.

The more impersonal communication gets, the more we remember the need for personal contact.

While technology has many great features, it can often distill communication down to text messages, emails, or instant messages. These tools really can help manage spread-out workforces or enable people to work from home, but they also prevent us from reading each other’s nonverbal behavior. This does more than hinder effective communication; it can even prevent the development of trusting and empathetic relationships.

A 2012 study comparing impersonal communication with face-to-face interaction found measurably different neurological responses in the brain. Moreover, the study authors concluded that the neurological effects unique to face-to-face dialogue may be crucial to successful interactions.

These neurological findings fit closely with the first-hand experiences of a variety of entrepreneurs. For instance, Max Brown, the founder of Silicon Beach Trust, emphasized the trust-building aspects of in-person interaction: “Overall the biggest value of face time is that it’s really the only legitimate way to build trust with someone.”

This notion of trust proved crucial to other testimonials. Anna Barber, the managing director for Techstars, stressed the need for trust to mediate possible interpersonal conflicts. Barber contended that without trust “you won’t have a basic mutual empathy and understanding to fall back on when you hit the inevitable bumps that arise.”

Barber also emphasized that creative problem solving works much better when people are in the same room than when they rely on phone calls or emails.

With such a wealth of benefits for in-person communication, it is a little concerning to see a tendency towards less personal methods of cooperation. However, the notion that all young people eschew conversation in favor of texting doesn’t seem to be correct.

Perhaps surprisingly, a 2016 survey found that 55 percent of millennials actually do prefer in person communication! That said, this is not a particularly overwhelming majority.

Followers of this blog will have already made the connection between in-person communication and nonverbal behavior or microexpressions. We have found repeatedly that both are critical in really understanding a person, whether by recognizing their underlying emotional states or by more effectively telling whether they are lying to us.

While we can’t make you prioritize in-person communication, you can check out our past blog here about the power of reading into the sound of a voice, or simply get better at handling the face-to-face conversations that are so important.

Emotion in an Artificial Intelligence World
https://www.humintell.com/2017/05/emotion-in-an-artificial-intelligence-world/?pk_campaign=rss_feed&pk_kwd=emotion-in-an-artificial-intelligence-world
Wed, 31 May 2017 21:09:45 +0000

The post Emotion in an Artificial Intelligence World appeared first on Social Engineering Blogs.

It is an almost omnipresent fear these days that technology is degrading human connections, but could we leverage that same technology to foster closer emotional ties?

We already have enough trouble reading each other’s emotions, and this becomes even harder when we communicate over long-distances, whether through email, phone calls, or even video chat. We are not face to face and cannot develop that emotional recognition, as technology, distance, and distracting stimuli add up to interfere with our emotional connections.

Instead, many people compensate with uniform online expressions, like emojis or acronyms conveying laughter or sadness, such as the classic LOL (Laughing Out Loud). While these go some way toward expressing emotion, they cannot replace actual, in-person laughter and lack a great deal of emotional nuance. Moreover, the problem seems intractable, given that modern society forces us to communicate over great distances in order to stay in touch.

That’s where developers like True Emoji, Affectiva, and Brain Power come in. These companies are using a concept called Emotion AI, which attempts to give computers and other forms of artificial intelligence the ability to recognize and understand human emotions. Affectiva developed its own form of Emotion AI, and this software has been used in fascinating ways to teach computers to recognize facial expressions and understand emotions.
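As a toy illustration of the recognition step, here is a rule-based sketch mapping combinations of FACS action units (AUs) to basic emotion labels. The rules and function are simplified assumptions for illustration only; real Emotion AI systems typically use learned models rather than hand-written rules like these:

```python
# Simplified prototype rules: an emotion is reported when all of its
# characteristic FACS action units (AUs) are active in the face.
EMOTION_RULES = {
    "happiness": {"AU6", "AU12"},               # cheek raiser + lip corner puller
    "sadness": {"AU1", "AU4", "AU15"},          # inner brow raiser + brow lowerer + lip corner depressor
    "surprise": {"AU1", "AU2", "AU5", "AU26"},  # raised brows + upper lid raiser + jaw drop
}

def classify(active_aus):
    """Return the first emotion whose AU pattern is fully present, else 'neutral'."""
    active = set(active_aus)
    for label, pattern in EMOTION_RULES.items():
        if pattern <= active:
            return label
    return "neutral"

print(classify({"AU6", "AU12", "AU25"}))  # happiness
print(classify({"AU4"}))                  # neutral
```

The hard part in practice is upstream of this lookup: reliably detecting which action units are active from raw video.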

True Emoji has used this to address the problem of online communication. They have developed an app that reads a user’s facial expression and develops corresponding personalized emojis. This allows users to select an emoji that fits with their own emotion, rather than relying on a clichéd list of preset icons.

As True Emoji CEO Sumesh Dugar points out, “The Internet has created a huge divide between emotions and communications… How many times have you shown surprise just by sending an emoticon?”

Similarly, Brain Power has put this principle to use trying to teach autistic children how to better recognize emotions. While autistic children have trouble recognizing emotion, this does not mean that they lack empathy. As we discussed in a previous blog, this just means that they can benefit from being taught emotional recognition skills!

Brain Power’s work focuses on the creation of interactive games that respond to the emotions of the participants. Joey Salisbury, the Brain Power director of software development, describes these as “augmented reality games with the goal of creating new, engaging ways for families to play together and grow closer while learning about emotions.”

These are just two applications of this burgeoning field. By abandoning an age-old divide between technology and emotion, we can allow the two to build on each other, creating emotionally intelligent machines and using technology to foster social ties.

For more information on this subject, see our previous blogs here and here.

Can We Learn Empathy from Robots?
https://www.humintell.com/2017/03/can-we-learn-empathy-from-robots/?pk_campaign=rss_feed&pk_kwd=can-we-learn-empathy-from-robots
Thu, 23 Mar 2017 18:05:18 +0000

The post Can We Learn Empathy from Robots? appeared first on Social Engineering Blogs.

Many people familiar with science fiction have an ingrained fear of, and repulsion toward, what are seen as cold and unfeeling robots.

The idea of widespread artificial intelligence brings to mind terrifying visions from films such as The Terminator or The Matrix, both of which present an apocalyptic future where artificial intelligence turns on mankind with disastrous results. The basic concern seems to be that robots lack any sense of empathy towards their human creators.

However, many humans already struggle with empathy, and this problem is especially poignant in the field of medicine. Unfortunately, many patients struggle to effectively communicate their pain to doctors, the very people who are able to treat it. Granted, pain is a difficult thing to communicate, but there is some evidence that doctors are even worse at recognizing it than the general population.

This may be born of necessity, as medical professionals are required to distance themselves emotionally from patients in order to conduct treatments in a scientific and objective fashion. That said, it creates problems in trying to understand and diagnose pain conditions.

Dr. Laurel Riek, a professor of computer science at the University of California, San Diego, actually sought to test whether doctors could properly recognize emotional expressions in their patients. In fact, when medical experts and laypeople were exposed to digitally simulated facial expressions, the clinicians proved to be much less accurate at recognizing pain.

While the study analyzed various emotions, including anger and disgust, recognition of pain represented the starkest disparity between the groups. Only 54 percent of medical professionals successfully identified pain as opposed to an 83 percent success rate for laypeople.

This experiment simulated facial expressions not from images of actual humans, but from computer-generated imagery and an actual robot. This robot was created by analyzing a vast video archive depicting human expressions and using face-tracking software to graft those expressions onto the uncannily realistic rubber face of the robot, named Philip K. Dick.

Now, Dr. Riek is trying to use robots like Philip K. Dick to teach doctors how to better understand emotion. There is some precedent for this, as clinicians have often used robots as practice dummies for learning medicine.

But she has pointed out a major flaw in the use of these robotic training tools: “These robots can bleed, breathe, and react to medication… They are incredible, but there is a major design flaw – their face.” She explains that facial expressions are critical in communicating pain to doctors, not just in interacting with the patient but also in quickly diagnosing strokes or adverse reactions to medication.

This entire enterprise may strike many readers as highly ironic, given the cold, calculated image that science fiction has given us of artificial intelligence. Even the robot’s namesake was a prolific writer who dealt with the problem of robots’ lack of empathy. However, Dr. Riek’s work demonstrates the many varied applications such a powerful technology can have in better understanding emotions and facial expressions.

For more research on empathy and facial recognition, check out our past blogs here and here.

Identifying Emotion in Emails
http://www.humintell.com/2016/09/identifying-emotion-in-emails/?pk_campaign=rss_feed&pk_kwd=identifying-emotion-in-emails
Mon, 12 Sep 2016 15:17:27 +0000

The post Identifying Emotion in Emails appeared first on Social Engineering Blogs.

“OMG I just LOVE pizza.” Is this statement sarcastic? Is it heartfelt? As our everyday communication is increasingly text-driven, inferring emotion from messages is an important skill. If the receiver of a message is a friend, they should be able to understand the sender’s emotion better than a complete stranger would. But a recent study by researchers at Chatham University found that friends are no better than complete strangers at correctly interpreting the emotional intent of e-mails.

Monica A. Riordan and Lauren A. Trichtinger (Chatham University) published their findings in the journal Human Communication Research. The researchers conducted three studies to find out the effect of contextual information on the confidence and accuracy of affective communication via e-mail.

In the first two studies, writers wrote two e-mails, indicating the presence or absence of eight different emotions in each e-mail. One e-mail was based on a predetermined scenario, and the other freely written. These e-mails were then read by strangers, who rated each e-mail for those same eight emotions.

The third study tweaked the procedure to test the effect of relationship. Writers wrote two e-mails (one based on a scenario, the other freely written) and indicated whether eight different emotions were present in each e-mail they wrote. Writers then sent these two e-mails to both friends and strangers, each of whom rated the e-mail for the same eight emotions, then wrote response e-mails.
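The comparison behind these studies can be sketched as a simple agreement score. The eight emotion names below and the scoring itself are illustrative assumptions, not the paper’s actual measure:

```python
# Writers and readers each mark eight emotions as present (True) or absent
# (False) in an e-mail; agreement is the fraction of emotions they match on.
# These specific emotion names are illustrative, not taken from the paper.
EMOTIONS = ("anger", "sadness", "disgust", "fear",
            "happiness", "surprise", "contempt", "interest")

def agreement(writer, reader):
    """Fraction of the eight emotions on which writer intent and reader rating agree."""
    matches = sum(writer[e] == reader[e] for e in EMOTIONS)
    return matches / len(EMOTIONS)

writer_intent = dict.fromkeys(EMOTIONS, False) | {"happiness": True, "interest": True}
reader_rating = dict.fromkeys(EMOTIONS, False) | {"happiness": True, "sadness": True}

print(agreement(writer_intent, reader_rating))  # 0.75: six of eight emotions match
```

Comparing average agreement for friend pairs against stranger pairs is what lets the researchers say friends hold no real advantage.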

The researchers found that writers are more confident that friends will correctly interpret their e-mails than strangers, and readers are likewise more confident interpreting e-mails from friends than from strangers. In fact, everyone was highly confident in their e-mail writing and reading abilities. However, this confidence had no relationship with actual accuracy, suggesting people are poor judges of their affect-detection skills. They also found that verbal and nonverbal cues, like emoticons, all caps, or repeated exclamation points, did not have a positive effect on accuracy.

Past research has sought to determine how we communicate our emotions in environments from which facial expressions, vocal intonation, body language, and other cues are absent. But many of these studies are flawed in that they are based on artificial stimuli that third parties are asked to rate. It is difficult to determine whether nonverbal or verbal cues are substitutes for emotion without examining the communication as a whole.

“As e-mail, text messaging, and other forms of computer-mediated communication become more dominant forms of interaction, the communication of affect becomes more difficult, primarily because facial expressions, gestures, vocal intonation, and other forms of expressing emotion are lost,” said Riordan. “It is clear from this study that readers can determine that we are angry, but cannot determine HOW angry. The loss of this subtlety could lead to consequences in many forms, especially in our relationships, where the difference between annoyance and rage can be vast, and a simple misinterpretation of an intended emotion can lead to a drastic alteration in that emotion.”

NSA Hoards Zero Days; Doesn’t Disclose Them all to Vendors
http://www.socialengineeringblogs.com/nsa-hoards-zero-days-doesnt-disclose-them-all-to-vendors/?pk_campaign=rss_feed&pk_kwd=nsa-hoards-zero-days-doesnt-disclose-them-all-to-vendors
Mon, 22 Aug 2016 01:13:19 +0000

The post NSA Hoards Zero Days; Doesn’t Disclose Them all to Vendors appeared first on Social Hax.

The post NSA Hoards Zero Days; Doesn’t Disclose Them all to Vendors appeared first on Social Engineering Blogs.

The NSA does not always disclose the zero day vulnerabilities it finds to unprotected vendors. Some security flaws are kept secret “when they can be used to serve a clear national security or law enforcement need” (Wired).

The US National Security Agency (NSA) was hacked by a suspected Russian hacker group, and many of its exploits and hacking tools were archived. Leaked information made public shows that the NSA collects exploits and does not always disclose them to vulnerable vendors. When vulnerabilities are not disclosed, problems do not get fixed. The NSA appears to operate “on the premise that secrets will never get out. That no one will ever discover the same bug. That no one will ever use the same bug. That there will never be a leak” (Business Insider).

Unfortunately, as we are currently witnessing with this recent leak, other hackers are able to find the same bugs, and those hackers could have more malicious intent than the NSA. When hackers obtain a trove of U.S. secrets, it can put governments and corporations worldwide in a vulnerable position. For example, the leaked data includes information on breaching popular commercial firewalls. Emergency service providers, governments, financial systems, and many businesses all rely on these firewall technologies.

Global networking company Cisco Systems confirmed last week that the NSA exploited an undetected severe vulnerability that allows remote attackers “who have already gained a foothold in a targeted network to gain full control over a firewall” (Ars Technica). The NSA had known about this vulnerability since 2013 and did nothing to get it fixed. Now that the data is leaked, Cisco fears that the information “could be used to breach its Adaptive Security Appliance (ASA) software used in its firewalls. An exploit could allow the attacker to execute arbitrary code and obtain full control of the system or to cause a reload of the affected system.” It can be argued that these exploits would have been patched had the NSA disclosed the vulnerabilities instead of collecting them for its own use.


In Texting, Punctuation Conveys Different Emotions. Period.
http://www.humintell.com/2015/12/in-texting-punctuation-conveys-different-emotions-period/?pk_campaign=rss_feed&pk_kwd=in-texting-punctuation-conveys-different-emotions-period
Sat, 12 Dec 2015 00:26:51 +0000

The post In Texting, Punctuation Conveys Different Emotions. Period. appeared first on Social Engineering Blogs.

By Christina Passariello for the WSJ

Technology is changing language, period

The use of a period in text messages conveys insincerity, annoyance and abruptness, according to a new study from the State University of New York at Binghamton. Omitting the period better communicates the conversational tone of a text message, the study says.

As with any study by university researchers, though, it’s not that simple. The study found that some punctuation expresses sincerity. An exclamation point is viewed as the most sincere. (I overuse exclamation points!)

“It’s not simply that including punctuation implies a lack of sincerity,” said the study’s lead author, Celia Klin, an associate professor of psychology at Binghamton. “There’s something specific about the use of the period.”

The study asked 126 undergrads to evaluate conversations that appeared as text messages and handwritten notes (who uses those anymore?). The exchange started with an invitation, such as, “Dave gave me his extra tickets. Wanna come?” The students were asked to react to one-word responses – “Okay, Sure, Yeah, Yup” – with or without a period.
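The study’s core comparison is simple: average sincerity ratings for replies that end with a period versus those that don’t. A minimal sketch of that kind of between-condition comparison, using entirely hypothetical 1–7 ratings rather than the study’s actual data:

```python
from statistics import mean

# Hypothetical sincerity ratings on a 1-7 scale for the reply "Okay";
# illustrative numbers only, NOT data from the Binghamton study.
ratings = {
    "no_period": [6, 5, 6, 7, 5, 6],  # "Okay"
    "period": [4, 3, 4, 5, 3, 4],     # "Okay."
}

# Average the ratings within each condition.
means = {condition: mean(values) for condition, values in ratings.items()}
for condition, m in means.items():
    print(f"{condition}: mean sincerity = {m:.2f}")

# A lower mean in the "period" condition would mirror the study's finding
# that period-terminated replies read as less sincere.
```

In the actual study, such condition means would be compared across all 126 raters with inferential statistics, not simply eyeballed.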

Grammar is evolving as we use new communication tools. Ms. Klin said she suspects periods in email to be more acceptable than in text messages, for example, because email is less conversational. Text messages are often short one-word replies, she said.

“The rapid exchange of text messaging gives it a speech-like quality,” said Ms. Klin. “It makes sense that texters rely on what they have available to them — emoticons, deliberate misspellings that mimic speech sounds and, according to our data, punctuation.”

Ms. Klin’s study, “Texting insincerely: the role of the period in text messaging,” appeared in the journal Computers in Human Behavior last month.

Susan Constantine Appears on Dr Phil Show about Ashley Summers http://www.humintell.com/2015/05/susan-constantine-appears-on-dr-phil-show-about-ashley-summers/?pk_campaign=rss_feed&pk_kwd=susan-constantine-appears-on-dr-phil-show-about-ashley-summers Fri, 01 May 2015 15:14:23 +0000

The post Susan Constantine Appears on Dr Phil Show about Ashley Summers appeared first on Social Engineering Blogs.

Taken from Humintell Affiliate, Susan Constantine’s website:

Orlando, FL – March 20, 2015 – ATM surveillance photos taken in Warwick, Rhode Island in October 2014 and subsequently released by the city’s police department are currently garnering much attention. There is interest in determining whether they show Ashley Summers, the girl who went missing from her Cleveland home at the age of 14 back on July 9, 2007.

There were no witnesses to offer an account of what happened to Summers, and for eight years no leads yielded answers; her disappearance remained unsolved. Now the FBI has taken an interest in the ATM photos, publicized in relation to a string of ID thefts, after being alerted to them by Summers’ step-grandmother. She believes the woman in the pictures bears a striking resemblance to her missing step-granddaughter. Many agree, and the woman is also said to closely resemble computer-generated images of what Summers would look like now, at age 21.

As the FBI works to analyze the new evidence that may establish Summers is still alive, the Dr. Phil show invited Susan Constantine, MPsy, a leading body language and facial recognition expert, to provide her own analysis while appearing alongside the Summers family. She herself has trained law enforcement personnel in reading body language and detecting deception, and is a frequent guest on national television programs offering body language analysis of public figures, court testimonies and more.

On Monday, March 23, Constantine appeared on the Dr. Phil Show and applied her expertise to comparing these most recent images with photos of the girl taken prior to her disappearance. Drawing on extensive experience in facial mapping and expressions, she reached a conclusion as to whether the young woman seen in these surveillance photos is in fact Ashley Summers, eight years after her mysterious disappearance.

Constantine’s analysis does more than shed light on how these images affect the famously dead-ended case of the missing girl from Cleveland, though. Her appearance also offers fascinating insights into how law enforcement professionals evaluate, compare, contrast, and “read” images to determine whether they offer solid evidence to support theories and hunches.

Watch Constantine decipher the clues by applying facial recognition techniques and examining body language, and hear her verdict on Dr. Phil.

