
Infant – Language Emotion Attention Perception

Welcome to the Infant – Language Emotion Attention Perception Lab!

We extend a warm welcome to TYLER MCFAYDEN - our new graduate student in Fall 2016.  Tyler is interested in the intersection of early social development, social motivation, emerging language skills, and autism.  She is a member of the Clinical Science graduate program and is working with Dr. Panneton in the i-LEAP Lab.  So glad to have you here, Tyler!

WANTED: A highly motivated, energetic, and developmentally oriented graduate applicant for Fall 2017!  Please contact Dr. Panneton by email or at (540) 231-5938 if you'd like to learn more about our lab, our Developmental Science program, our department, and our university.


The i-LEAP Laboratory conducts research on infants' language, emotion, attention, and perception during the first two years after birth.  We investigate a variety of research questions about the factors that influence how, when, and where infants direct their attention in situations involving communication partners, objects, or both.

Current Projects:

  • How does the emotion of a speaker affect infants’ ability to learn the relationship between words and objects?  Is this relationship influenced by the mother’s predominant emotional style?
  • Do infants process emotion in faces the same way their mothers do?
  • How does a speaker’s eye gaze control infants’ attention to objects as they are being labeled?
  • How do infants’ learning styles at 3, 6, 9, 12, and 18 months of age relate to their emerging cognitive, social, and language skills? 
  • Is the relationship between learning style and developmental skill mediated by risk factors (e.g., low SES)?
  • How do infants process static vs. dynamic visual and auditory displays of emotion?
  • Does attention control positively predict language skill in a social context?

Current Methods:

  • Behavioral attention: Infants’ visual preferences for or discrimination of various auditory and visual events on a large screen
  • Heart rate activity: Infants’ pattern of heart rate during familiar and novel auditory and visual events (e.g., slower heart rate = more attention)
  • Eye tracking: Infants’ precise scanning patterns of auditory/visual events on a screen (including pupil size)
  • Joint attention: Infants’ interest in pictures and toys that an adult is looking at and/or talking about; also includes free play with mother/father in a room with books and toys

We welcome all infants between the ages of 3 and 18 months to participate in our projects. We are located in the Department of Psychology (Williams Hall) on the campus of Virginia Tech in Blacksburg, Virginia. Please call us at (540) 231-3972 if you are interested, or email us.