Computers Can Understand Our Emotions Now?
Once dismissed as dumb machines, computers are now being programmed to understand human emotions.
Body language is one area where social scientists believe behavioral indicators give them access to someone's interior life, motivations, or emotions. The research began with Ray Birdwhistell in 1952 and continues to this day. Birdwhistell coined the term “kinesics” to cover “facial expression, gestures, posture, and gait, as well as visible arm and body movements.” He believed words carried only 30 to 35 percent of the social meaning of a conversation or interaction. Now technology has given the field a new fillip, though one not without its concerns and flaws.
It is commonly believed that people can easily read someone's emotional state from facial movements, also called emotional expressions or simply facial expressions. This assumption informs decisions about law, policy, national security, and education.
It also shapes how psychiatric illnesses are diagnosed and treated, and, more broadly, how people interact with one another. Meanwhile, research in fields like computer vision, neuroscience, and artificial intelligence is probing how far such behavioral indicators really reveal our inner lives.
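
To see what that research looks like in software terms, here is a minimal, illustrative sketch of the first stage of a typical emotion-recognition pipeline: locating a face in an image with OpenCV's bundled Haar-cascade detector. The image file name (portrait.jpg), the parameter values, and the placeholder classification step are assumptions for illustration, not a description of any particular research system.

```python
# Illustrative sketch: stage one of a typical emotion-recognition
# pipeline is simply finding faces. OpenCV ships a pretrained
# Haar-cascade frontal-face detector we can load directly.
import cv2

cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

image = cv2.imread("portrait.jpg")  # hypothetical test image
if image is None:
    raise FileNotFoundError("portrait.jpg not found")

gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Returns candidate face regions as (x, y, width, height) boxes.
faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    face_crop = gray[y:y + h, x:x + w]
    # A second-stage classifier would run on `face_crop` here,
    # typically emitting scores over labels such as "happy",
    # "sad", "angry", "surprised", and "neutral".
    print(f"Face found at ({x}, {y}), size {w}x{h}")
```

The contested part, of course, is that second stage: mapping a cropped face to a discrete emotion label is exactly the assumption the research above calls into question.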