Original article:

Although it may seem difficult for adults to understand what an infant is feeling, a new study from Brigham Young University finds that it's so easy a baby could do it.

Psychology professor Ross Flom's study, published in the academic journal Infancy, shows that infants can recognize each other's emotions by five months of age. This study comes on the heels of other significant research by Flom on infants' ability to understand the moods of dogs, monkeys and classical music.

"Newborns can't verbalize to their mom or dad that they are hungry or tired, so the first way they communicate is through affect or emotion," says Flom. "Thus it is not surprising that in early development, infants learn to discriminate changes in affect."

Infants can match emotions in adults at seven months and in familiar adults at six months. To test infants' perception of their peers' emotions, Flom and his team of researchers tested babies' ability to match emotional infant vocalizations with paired infant facial expressions.
Mariana Vaillant-Molina, Lorraine E. Bahrick, and Ross Flom. Young Infants Match Facial and Vocal Emotional Expressions of Other Infants. Infancy.
Article first published online: 25 MAR 2013, DOI: 10.1111/infa.12017

Abstract: Research has demonstrated that infants recognize emotional expressions of adults in the first half year of life. We extended this research to a new domain, infant perception of the expressions of other infants. In an intermodal matching procedure, 3.5- and 5-month-old infants heard a series of infant vocal expressions (positive and negative affect) along with side-by-side dynamic videos in which one infant conveyed positive facial affect and another infant conveyed negative facial affect. Results demonstrated that 5-month-olds matched the vocal expressions with the affectively congruent facial expressions, whereas 3.5-month-olds showed no evidence of matching. These findings indicate that by 5 months of age, infants detect, discriminate, and match the facial and vocal affective displays of other infants. Further, because the facial and vocal expressions were portrayed by different infants and shared no face–voice synchrony, temporal, or intensity patterning, matching was likely based on detection of a more general affective valence common to the face and voice.