Research themes

Overall research aims

The Visuo-Motor Lab investigates interactions between visual perception and eye movements using psychophysical and eye-tracking methods. The current focus is on understanding visuo-motor learning and adaptation in tasks consolidated over human evolution (e.g. dealing with re-afferent motion during pursuit), over one’s lifetime (e.g. crossing the road), or over a few hours of training (e.g. gaze-typing). We also attempt to bridge the gap between fundamental and applied research by adapting our paradigms to immersive virtual reality (VR).

Eye movement learning and adaptation

We move our eyes constantly and efficiently to gather visual information. This happens effortlessly and largely unconsciously, concealing the continuous adjustment of eye movements that ensures accurate acquisition of visual targets (Souto, Gegenfurtner and Schütz, 2016). We have studied the roles of top-down control and visual uncertainty in saccadic adaptation, showing that task demands alone can generate saccadic adaptation. More recently, we have extended the investigation of eye movement learning to the more complex task of controlling a human-computer interface with gaze.

Saccadic adaptation: Visual and attentional control

Souto, D., Gegenfurtner, K. R., & Schütz, A. C. (2016). Saccade adaptation and visual uncertainty. Frontiers in Human Neuroscience, 10:227. doi: 10.3389/fnhum.2016.00227 [full-text][pdf]

Schütz, A. C., & Souto, D. (2015). Perceptual task induces saccadic adaptation by target selection. Frontiers in Human Neuroscience, 9:566. doi: 10.3389/fnhum.2015.00566 [full-text][pdf]

Schütz, A. C., & Souto, D. (2011). Adaptation of catch-up saccades during the initiation of smooth pursuit eye movements. Experimental Brain Research, 209(4), 537-549.[html][pdf]

Figure 1 of Souto, Gegenfurtner and Schütz (2016): perception of a change in the position of an eye movement target and eye movement adaptation (panel C) were measured simultaneously.

Gaze-typing: learning and inhibitory control

Gaze-interaction is a form of assistive communication technology that has proved invaluable for enabling efficient communication in conditions such as motor neuron disease. We recently obtained funding from the British Academy to study the involvement of inhibitory control in learning to type with gaze, benefiting from the guidance of Special Effects and the ACE Centre in Oxford. The preregistered project can be found here (Open Science Framework).
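For readers unfamiliar with gaze-typing, the sketch below illustrates dwell-time selection, a mechanism commonly used in such interfaces: a key is selected once gaze rests on it for a set duration. This is only an illustrative outline, not the project's implementation; the function names, the simulated gaze samples and the parameter values are hypothetical.

```python
# Minimal sketch of dwell-time key selection for gaze-typing.
# Gaze samples are simulated here; in practice they would come from an eye tracker.
# All names and parameter values are illustrative.

DWELL_MS = 800    # dwell threshold before a key is selected (hypothetical value)
SAMPLE_MS = 10    # sampling interval, i.e. 100 Hz (hypothetical value)

def key_under_gaze(x, y, keyboard):
    """Return the key whose on-screen rectangle contains the gaze point, if any."""
    for key, (left, top, right, bottom) in keyboard.items():
        if left <= x <= right and top <= y <= bottom:
            return key
    return None

def dwell_select(samples, keyboard):
    """Yield a key each time gaze dwells on it continuously for DWELL_MS."""
    current, elapsed = None, 0
    for x, y in samples:
        key = key_under_gaze(x, y, keyboard)
        if key is not None and key == current:
            elapsed += SAMPLE_MS
            if elapsed >= DWELL_MS:
                yield key
                elapsed = 0            # reset so the key is not re-triggered immediately
        else:
            current, elapsed = key, 0  # gaze moved to a new key (or off the keyboard)

# Example: a two-key "keyboard" and 1.2 s of simulated gaze resting on key 'A'.
keyboard = {"A": (0, 0, 100, 100), "B": (120, 0, 220, 100)}
samples = [(50, 50)] * 120
print(list(dwell_select(samples, keyboard)))   # -> ['A']
```

A shorter dwell threshold speeds up typing but makes unintended selections more likely (the "Midas touch" problem), which is one reason inhibitory control is thought to matter when learning to type with gaze.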

We will be presenting this work at the European Conference on Eye Movements in Alicante (August 2019).

Visual attention

We rely on visual attention to select the objects that enter consciousness and guide action. We have explored the coupling between visual attention and eye movements, as well as the perceptual and oculomotor effects of attention being drawn by visual transients.

Visual attention and eye movements coupling

There is a well-known coupling between the allocation of visual attention resources and the programming of eye movements, as demonstrated by brain functional anatomy and by behaviour. In several publications we have explored the extent of this coupling in the context of saccadic and smooth pursuit eye movements.

Souto, D., & Kerzel, D. (2014). Ocular tracking responses to background motion gated by feature-based attention. Journal of Neurophysiology, 112(5), 1074-1081. doi: 10.1152/jn.00810.2013 [full-text][pdf]

Souto, D., & Kerzel, D. (2011). Attentional constraints on target selection for smooth pursuit eye movements. Vision Research, 51(1), 13-20.[html][pdf]

Kerzel, D., Born, S., & Souto, D. (2009). Smooth pursuit eye movements and perception share target selection, but only some central resources. Behavioural Brain Research, 201(1), 66-73.[html][pdf]

Kerzel, D., Souto, D., & Ziegler, N. E. (2008). Effects of attention shifts to stationary objects during steady-state smooth pursuit eye movements. Vision Research, 48(7), 958-969.[html][pdf]

Souto, D., & Kerzel, D. (2008). Dynamics of attention during the initiation of smooth pursuit eye movements. Journal of Vision, 8(14):3, 1-16.[html][pdf]

Perceptual and oculomotor effects of exogenous attention

With colleagues at the University of Geneva, we have examined how distracting visual transients can disrupt ongoing eye movement plans and affect both perceptual appearance and discrimination performance.

Kerzel, D., Schonhammer, J., Burra, N., Born, S., & Souto, D. (2011). Saliency changes appearance. PLoS One, 6(12), e28292.[html][pdf]

van Diepen, R. M., Born, S., Souto, D., Gauch, A., & Kerzel, D. (2010). Visual flicker in the gamma-band range does not draw attention. Journal of Neurophysiology, 103(3), 1606-1613.[html][pdf]

Kerzel, D., Born, S., & Souto, D. (2010). Inhibition of steady-state smooth pursuit and catch-up saccades by abrupt visual and auditory onsets. Journal of Neurophysiology, 104(5), 2573-2585.[html][pdf]

Motion processing and perception

Internalized physics and ambiguous motion

Souto, D., Smith, L., & Bloj, M. (2018, March). Where the rubber meets the road: Visually-inferred friction. Poster presented at the Applied Vision Association Meeting, University of Bradford, UK.[Souto.Smith.Bloj.AVA]

Souto, D., & Kerzel, D. (2013). Like a rolling stone: Naturalistic visual kinematics facilitate tracking eye movements. Journal of Vision, 13(2), 1-12.[html][pdf]

Motion perception during smooth pursuit eye movements

How does the visual system achieve a coherent perception of an object’s motion while the eyes themselves are moving? To address this question we used a new paradigm based on multiple-aperture arrays, which allow us to test motion integration during pursuit in a way that should be unaffected by location information. Reflexive eye movements and perceptual judgements indicated a strong asymmetry in the processing of global motion during pursuit, suggesting that the visual system downplays motion opposite to the pursuit direction, likely because such motion is dominated by re-afferent (self-induced) information.
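As a rough illustration of how such a stimulus can be parameterised, the sketch below assigns drift directions to an array of aperture elements, with a given proportion (the "coherence") sharing the global signal direction and the remainder drifting in random directions. This is a simplified, hypothetical outline rather than the published stimulus (whose elements are drifting gratings viewed through small apertures); the function name and values are illustrative only.

```python
# Simplified sketch of a multiple-aperture global-motion stimulus:
# a fraction of elements ("coherence") carry the signal direction,
# the rest drift in random directions. Illustrative values only.
import numpy as np

def aperture_directions(n_elements, coherence, signal_direction_deg, seed=None):
    """Assign a drift direction (in degrees) to each aperture element."""
    rng = np.random.default_rng(seed)
    n_signal = int(round(coherence * n_elements))
    directions = np.empty(n_elements)
    directions[:n_signal] = signal_direction_deg                          # signal elements
    directions[n_signal:] = rng.uniform(0, 360, n_elements - n_signal)    # noise elements
    rng.shuffle(directions)   # mix signal and noise elements across positions
    return directions

# Example: 100 apertures, 40% coherence, global motion opposite to rightward pursuit.
dirs = aperture_directions(100, 0.4, signal_direction_deg=180, seed=1)
print((dirs == 180).mean())   # ~0.4 of elements share the leftward signal direction
```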

Video demos can be downloaded from figshare. To see the low-contrast stimulus you will need to enlarge the video and play it several times before you are able to see coherent motion. The example below shows coherent motion opposite to and downward from the pursuit direction. Compare this to the second video, showing global motion in the direction of pursuit.

Souto, D., Chudasama, J., Kerzel, D., & Johnston, A. (2019). Motion integration is anisotropic during smooth pursuit eye movements. Journal of Neurophysiology, 121(5), 1787-1797.[html][pdf]
Coherent motion opposite to pursuit

Coherent motion in the direction of pursuit

Apparent motion and masking

When two dots are presented in succession, we can see illusory motion between them. But to what extent is illusory motion like the real thing? In a discrimination task, we showed that illusory motion can mask a target presented at very specific points along the putative apparent motion path. See more here:

Souto, D., & Johnston, A. (2012). Masking and color inheritance along the apparent motion path. Journal of Vision, 12(7), 1-18.[html][pdf]

Motion perception in immersive VR

During her PhD, Jennifer Sudkamp will use immersive VR to investigate gaze control and motion perception during road crossing.

See also the ambitious interdisciplinary project to reconstruct the Vauxhall Gardens, led by Andrew Hugill, which combines museum studies, history/literature, engineering and psychology.