- Overview
-
The overarching goal of my research is to understand the neurocognitive processes underlying dynamic interactions between human visual perception and goal-directed action. Some links between perception and action are arbitrary and learned through repeated experience (e.g., seeing "red" at a traffic light, we "stop," and seeing "green," we "keep moving"). Other functional links between perception and action are implicit, automatic, and hard-wired. For example, seeing a group of people with angry faces is naturally associated with avoidance actions, whereas people with happy, friendly faces tend to motivate approach actions. My research investigates how different brain regions coordinate the countless connections between perception and action that mediate motor behaviours in various physical and social contexts. I also study the neurocognitive bases of deficient perception-action links, whether in impaired visuomotor coordination (as observed in many visual or motor deficits) or in maladaptive social behaviours (e.g., excessive avoidance behaviour, a hallmark of many emotional disorders). My research employs psychophysics and neuroimaging methods, such as MEG (magnetoencephalography) and functional MRI, in adults and children with typical and atypical brain development.
- Publications
-
Americans weigh an attended emotion more than Koreans in overall mood judgments
Scientific Reports
DOI: 10.1038/s41598-023-46723-7
2023

The effect of masks on the emotion perception of a facial crowd
Scientific Reports
DOI: 10.1038/s41598-023-41366-0
2023

Inconsistent attentional contexts impair relearning following gradual visuomotor adaptation
Journal of Neurophysiology
Im, H.Y. and Liddy, J.J. and Song, J.-H.
DOI: 10.1152/jn.00463.2021
2022

An explicit investigation of the roles that feature distributions play in rapid visual categorization
Attention, Perception, and Psychophysics
Im, H.Y. and Tiurina, N.A. and Utochkin, I.S.
DOI: 10.3758/s13414-020-02046-7
2021

Differential neurodynamics and connectivity in the dorsal and ventral visual pathways during perception of emotional crowds and individuals: a MEG study
Cognitive, Affective and Behavioral Neuroscience
Im, H.Y. and Cushing, C.A. and Ward, N. and Kveraga, K.
DOI: 10.3758/s13415-021-00880-2
2021

Fast saccadic and manual responses to faces presented to the koniocellular visual pathway
Journal of Vision
Kveraga, K. and Im, H.Y. and Ward, N. and Adams, R.B.
DOI: 10.1167/jov.20.2.9
2020

Magnocellular and parvocellular pathway contributions to facial threat cue processing
Social Cognitive and Affective Neuroscience
Cushing, C.A. and Im, H.Y. and Adams, R.B. and Ward, N. and Kveraga, K.
DOI: 10.1093/scan/nsz003
2019

Spatial and feature-based attention to expressive faces
Experimental Brain Research
Kveraga, K. and De Vito, D. and Cushing, C. and Im, H.Y. and Albohn, D.N. and Adams, R.B.
DOI: 10.1007/s00221-019-05472-8
2019

Differential magnocellular versus parvocellular pathway contributions to the combinatorial processing of facial threat
Progress in Brain Research
Adams, R.B. and Im, H.Y. and Cushing, C. and Boshyan, J. and Ward, N. and Albohn, D.N. and Kveraga, K.
DOI: 10.1016/bs.pbr.2019.03.006
2019

Neurodynamics and connectivity during facial fear perception: The role of threat exposure and signal congruity
Scientific Reports
Cushing, C.A. and Im, H.Y. and Adams, R.B. and Ward, N. and Albohn, D.N. and Steiner, T.G. and Kveraga, K.
DOI: 10.1038/s41598-018-20509-8
2018

Sex-related differences in behavioral and amygdalar responses to compound facial threat cues
Human Brain Mapping
Im, H.Y. and Adams, R.B. and Cushing, C.A. and Boshyan, J. and Ward, N. and Kveraga, K.
DOI: 10.1002/hbm.24035
2018

Correction to: Cross-cultural and hemispheric laterality effects on the ensemble coding of emotion in facial crowds
Culture and Brain
Im, H.Y. and Chong, S.C. and Sun, J. and Steiner, T.G. and Albohn, D.N. and Adams, R.B. and Kveraga, K.
DOI: 10.1007/s40167-017-0058-7
2017

Anxiety modulates perception of facial fear in a pathway-specific, lateralized manner
bioRxiv
Im, H.Y. and Adams, R.B. and Boshyan, J. and Ward, N. and Cushing, C.A. and Kveraga, K.
DOI: 10.1101/141838
2017

Observer's anxiety facilitates magnocellular processing of clear facial threat cues, but impairs parvocellular processing of ambiguous facial threat cues
Scientific Reports
Im, H.Y. and Adams, R.B. and Boshyan, J. and Ward, N. and Cushing, C.A. and Kveraga, K.
DOI: 10.1038/s41598-017-15495-2
2017

Differential hemispheric and visual stream contributions to ensemble coding of crowd emotion
Nature Human Behaviour
Im, H.Y. and Albohn, D.N. and Steiner, T.G. and Cushing, C.A. and Adams, R.B. and Kveraga, K.
DOI: 10.1038/s41562-017-0225-z
2017

Cross-cultural effects on ensemble coding of emotion in facial crowds
bioRxiv
Im, H.Y. and Chong, S.C. and Sun, J. and Steiner, T.G. and Albohn, D.N. and Adams, R.B. and Kveraga, K.
DOI: 10.1101/141861
2017

Long lasting attentional-context dependent visuomotor memory
Journal of Experimental Psychology: Human Perception and Performance
Im, H.Y. and Bédard, P. and Song, J.-H.
DOI: 10.1037/xhp0000271
2016

Grouping by proximity and the visual impression of approximate number in random dot arrays
Vision Research
Im, H.Y. and Zhong, S.-H. and Halberda, J.
DOI: 10.1016/j.visres.2015.08.013
2016

PsiMLE: A maximum-likelihood estimation approach to estimating psychophysical scaling and variability more reliably, efficiently, and flexibly
Behavior Research Methods
Odic, D. and Im, H.Y. and Eisinger, R. and Ly, R. and Halberda, J.
DOI: 10.3758/s13428-015-0600-5
2016

Encoding attentional states during visuomotor adaptation
Journal of Vision
Im, H.Y. and Bédard, P. and Song, J.-H.
DOI: 10.1167/15.8.20
2015

Ensemble statistics as units of selection
Journal of Cognitive Psychology
Im, H.Y. and Park, W.J. and Chong, S.C.
DOI: 10.1080/20445911.2014.985301
2015

Mean size as a unit of visual working memory
Perception
Im, H.Y. and Chong, S.C.
DOI: 10.1068/p7719
2014

The effects of sampling and internal noise on the representation of ensemble average size
Attention, Perception, and Psychophysics
Im, H.Y. and Halberda, J.
DOI: 10.3758/s13414-012-0399-4
2013

Computation of mean size is based on perceived size
Attention, Perception, and Psychophysics
Im, H.Y. and Chong, S.C.
DOI: 10.3758/APP.71.2.375
2009

The Influence of Depth Context on Blind Spot Filling-in
Korean Journal of Cognitive Science
¿¿¿ and ¿¿¿ and ¿¿¿ and Chong, S.C. and ¿¿¿
DOI: 10.19066/cogsci.2007.18.4.002
2007

Ensemble coding of crowd emotion: Differential hemispheric and visual stream contributions
bioRxiv
Im, H.Y. and Albohn, D.N. and Steiner, T.G. and Cushing, C.A. and Adams, R.B. and Kveraga, K.
DOI: 10.1101/101527
2017

- Research
-
Temporal dynamics of neural communication underlying perception-action link
Face perception naturally triggers goal-directed actions based on social and behavioural motivations to either ‘approach for more exploration’ or ‘avoid danger.’ For example, seeing an angry mob or a panicked crowd alerts us to potential danger and motivates immediate avoidance behaviours. Detection of threat from others’ faces must occur rapidly, since delays are often costly and maladaptive. This project examines the groups of brain regions engaged in action-oriented (and time-sensitive) visual processing, and the timing of the neural computations that trigger immediate connections between vision and action. We also examine the effects of anxiety on the functional interactions among the widely distributed brain regions underlying the integration of visual perception, emotion, and goal-directed action.

Visual perception of ensembles and objects facilitated by different action goals
This project examines how the human brain is wired to mediate two different units of perception, individual objects and ensembles, as a means of managing efficient and flexible descriptions of the visual world. The visual system quickly extracts a higher-order summary description (e.g., the average, variance, or numerosity of a set) from an image containing many objects, while it simultaneously perceives a few objects as separate entities. The two representations provide complementary visual information about the parts and the whole of an image. Using fMRI, MEG, and psychophysics, we examine the neural computations underlying these different types of perception. We also investigate when the brain selectively prioritizes global, ensemble perception over individual objects, or the other way around, depending on the current action goal.

Functional connectivity in children with developmental disorders
In this project, we are investigating patterns of functional connectivity among different brain networks in children with developmental disorders (e.g., dyslexia) compared to typically developing children.

MEG decoding of children’s brains
This decoding project aims to discover the brain dynamics of perceiving spoken language during the early period of life (6 months to 4 years). Based on our preliminary MEG decoding data, we expect to be able to characterize the fine-scale time courses of different Japanese speech sounds encoded across the superior temporal plane of children’s brains. We can also determine the functional brain connectivity emerging in children with versus without previous exposure to the Japanese language. In addition to contributing to developmental neuroscience, our work may provide valuable information for speech-language pathology and for designing devices that replay spoken speech sounds encoded in the brain to assist children with hearing or specific language impairments. We also hope to extend this approach to test other perceptual abilities, including understanding facial expressions and reading others’ actions and intentions.

- Research Group Members
-
Akosua Asare, Doctoral Candidate
Max Garson, Research Assistant
Zahra Kheradmandsaadi, Doctoral Student
Enda Tan, Postdoctoral Fellow