Overall research aims
The Visuo-Motor Lab investigates interactions between visual perception and eye movements using psychophysical and eye-tracking methods. Our current focus is on understanding visuo-motor learning and adaptation in tasks consolidated over human evolution (e.g. dealing with re-afferent motion during pursuit), over one’s lifetime (e.g. crossing the road), or over a few hours of training (e.g. gaze-typing). Finally, we attempt to bridge the gap between fundamental and applied research by adapting our paradigms to immersive virtual reality (VR).
Eye movement learning and adaptation
We move our eyes constantly and very efficiently to gather visual information. This happens effortlessly and mostly unconsciously, hiding the continuous adjustment of eye movements that ensures accurate acquisition of visual targets (Souto, Gegenfurtner and Schütz, 2016). We have studied the roles of top-down control and visual uncertainty in saccadic adaptation, showing that task demands alone can generate saccadic adaptation. More recently, we have extended our investigation of eye movement learning to the more complex task of controlling a human-computer interface with gaze.
Saccadic adaptation: Visual and attentional control
Gaze-typing: learning and inhibitory control
Gaze interaction is a form of assistive communication technology that has proved invaluable in enabling efficient communication for people with conditions such as motor neuron disease. We recently obtained funding from the British Academy to study the involvement of inhibitory control in learning to type with gaze. We benefited from the guidance of SpecialEffect and the ACE Centre in Oxford. The preregistered project can be found here (Open Science Framework).
We will be presenting this work at the European Conference on Eye Movements in Alicante (August 2019).
We rely on visual attention to select the objects that enter consciousness and guide action. In the past, we have explored the coupling between visual attention and eye movements, and the perceptual and oculomotor effects of having our attention drawn by visual transients.
Visual attention and eye movements coupling
There is a well-known coupling between the allocation of visual attention resources and the programming of eye movements, as demonstrated by brain functional anatomy and behaviour. In several publications we have explored the extent of this coupling in the context of saccadic and smooth-pursuit eye movements.
Souto, D., & Kerzel, D. (2014). Ocular tracking responses to background motion gated by feature-based attention. Journal of Neurophysiology, 112(5), 1074-1081. doi:10.1152/jn.00810.2013 [full-text] [pdf]
Perceptual and oculomotor effects of exogenous attention
With colleagues at the University of Geneva, we have looked at how distracting visual transients can disrupt ongoing eye movement plans and affect perceptual appearance and discrimination performance.
Kerzel, D., Born, S., & Souto, D. (2010). Inhibition of steady-state smooth pursuit and catch-up saccades by abrupt visual and auditory onsets. Journal of Neurophysiology, 104(5), 2573-2585. [html] [pdf]
Motion processing and perception
Internalized physics and ambiguous motion
Souto, D., Smith, L., & Bloj, M. (2018, March). Where the rubber meets the road: Visually-inferred friction. Poster presented at the Applied Vision Association Meeting, University of Bradford, UK. [Souto.Smith.Bloj.AVA]
Souto, D., & Kerzel, D. (2013). Like a rolling stone: Naturalistic visual kinematics facilitate tracking eye movements. Journal of Vision, 13(2), 1-12. [html] [pdf]
Motion perception during smooth pursuit eye movements
How does the visual system achieve a coherent percept of an object’s motion while the eyes themselves are moving? To address this question, we used a new paradigm based on multiple-aperture arrays, which allow us to test global-motion coherence during pursuit independently of location information. Reflexive eye movements and perceptual judgements indicated a strong asymmetry in the processing of global motion during pursuit, suggesting that the visual system downplays the influence of motion opposite to pursuit, likely because that motion is dominated by re-afferent (self-induced) information.
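The logic of a coherence-controlled display of this kind can be sketched as follows. This is a minimal illustration, not the lab’s actual stimulus code; the function name and parameters are our own: a chosen fraction of apertures drift in a common signal direction while the remainder drift in random directions.

```python
import math
import random

def aperture_directions(n_apertures, coherence, signal_direction, seed=None):
    """Assign a drift direction (radians) to each aperture.

    A `coherence` fraction of apertures share `signal_direction`;
    the rest get uniformly random directions, as in classic
    global-motion displays.
    """
    rng = random.Random(seed)
    n_signal = round(coherence * n_apertures)
    directions = [signal_direction] * n_signal
    directions += [rng.uniform(0.0, 2.0 * math.pi)
                   for _ in range(n_apertures - n_signal)]
    rng.shuffle(directions)  # signal and noise apertures are interleaved
    return directions

# Example: 100 apertures at 50% coherence, signal moving leftward (pi),
# i.e. opposite to a rightward pursuit target.
dirs = aperture_directions(100, 0.5, math.pi, seed=1)
print(sum(d == math.pi for d in dirs))  # 50 signal apertures
```

Varying `coherence` while measuring reflexive tracking or perceptual judgements would then yield the direction-dependent sensitivity described above.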
Video demos can be downloaded from figshare. To see coherent motion in the low-contrast stimulus, you may need to enlarge the video and play it several times. The example below shows coherent motion opposite to, and downward from, the pursuit direction. Compare this to the second video, showing global motion in the direction of pursuit.
Coherent motion in the direction of pursuit
Apparent motion and masking
Motion perception in immersive VR
During her PhD, Jennifer Sudkamp will use immersive VR to investigate gaze control and motion perception during road-crossing.
See also the ambitious interdisciplinary project to reconstruct the Vauxhall Gardens, led by Andrew Hugill, combining museum studies, history and literature, engineering, and psychology.