Daniela Balslev

Ocular Proprioception

How does the brain monitor the direction in which one looks?


There are two main sources of eye position information: the efference copy of the command issued to the extraocular muscles (the corollary discharge) and the afferent information from these muscles (eye proprioception). A role for the corollary discharge in visual localization had long been assumed because, in some animal models, visual localization remains intact after sectioning the nerves that carry proprioceptive input from the eye muscles to the brain. Proprioception was believed to play no role in visual localization, serving instead to recalibrate the corollary discharge over a timescale of days.


I have developed and validated a safe method to test this sensory modality in humans using transcranial magnetic stimulation over the primary somatosensory cortex, mapped its pathways in the central nervous system (cerebral cortex and brainstem), and reported on its role in sensing the direction of one's own gaze, which is a prerequisite for locating visual objects relative to the body and for orienting attention.


This contribution was recognised by a Magstim/University of Oxford Young Investigator Award and by an Attempto Prize at the University of Tuebingen. The research was funded by personal fellowships from the Danish Medical Research Councils and the EU.

Spatial Attention

Spatial Neglect

How do we attend to an object and what happens when this process breaks down in stroke patients with spatial neglect?


Organisms receive a wealth of sensory input that far exceeds neural processing resources. To deal with this bottleneck, only the most important stimuli are selected. This is called attention. To attend to an object, one needs to know its location. All sensory modalities provide location cues; however, their coordinate systems differ: for example, the location of visual objects is coded relative to the center of the retina, while that of sounds is coded relative to the head midline. Retinal information is insufficient for an accurate perception of object location. The same retinal projection corresponds to different locations in physical space, depending on the direction of gaze (Figure). To obtain a single estimate of location, the brain transforms between each sense's coordinate system using information about the relative positions of our sensory organs. Information about the rotation of the eyes in the head is essential for this transformation.
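As a minimal illustration of this coordinate transformation (a sketch only; the function, angles and values below are invented for illustration and are not taken from any published model), the head-centred direction of an object can be obtained by adding the rotation of the eye in the head to the object's retinal location:

```python
# Minimal sketch: the same retinal location maps to different head-centred
# locations depending on where the eye points. Horizontal angles (azimuth)
# in degrees; all names and values are illustrative only.

def retinal_to_head_centred(retinal_azimuth_deg, eye_in_head_deg):
    """Head-centred azimuth = retinal azimuth + eye-in-head rotation."""
    return retinal_azimuth_deg + eye_in_head_deg

# An object imaged 5 deg to the right of the fovea...
retinal_location = 5.0

# ...lies straight ahead of the head if the eye is rotated 5 deg to the left,
# but 15 deg to the right of the head if the eye is rotated 10 deg to the right.
for gaze in (-5.0, 10.0):
    print(gaze, retinal_to_head_centred(retinal_location, gaze))
```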


A former PhD student, Barthel Odoj, and I reported that an eye position signal in the somatosensory cortex (eye proprioception or oculoproprioception) is specific for spatial attention. This signal is less important for locating external objects (where to reach) and more important for allocating internal priorities for perception (where to attend).


This discovery is exciting not only because it shows that a basic sensorimotor signal can play a selective role in cognition rather than in movement control, but also because it indicates a potential disease mechanism in spatial neglect. Spatial neglect is a common disorder in stroke survivors characterized by a lateral displacement in the focus of attention, often to the right of the body midline. We found that when patients with spatial neglect are given an attention cue, they do not attend to its exact location, but rather slightly to the right of it. This error is present even in their right, "normal" hemispace. Could a disease mechanism in spatial neglect be an error in the cortical gaze direction signal? Our research predicts that such an error could selectively impact the brain's attentional priority map (Figure). Could one treat spatial neglect by targeting the oculoproprioceptive signal using repetitive transcranial magnetic stimulation?




Eilidh Sandilands

Hilary Chan

Ocular Alignment

Strabismus

How do the two eyes coordinate to achieve a single, fused image of a visual object? Why can individuals with strabismus (squint or crossed eyes) not achieve ocular alignment?


The two retinae contain corresponding retinal elements (like twin pixels). To achieve a fused image, a visual object must fall on corresponding retinal elements in the two eyes. Therefore, the normal development of these retinal elements and the correct alignment of the two visual axes are both prerequisites for a fused image.
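To make this requirement concrete, here is a simplified geometric sketch (a point-eye approximation; the fusion threshold, distances and angles are invented for illustration and are not clinical values). When the two visual axes intersect at the object, it falls on corresponding retinal points and the residual disparity is near zero; a misalignment of the axes breaks this correspondence.

```python
import math

# Simplified sketch of the fusion requirement: an object is seen as single
# when it falls on (near-)corresponding retinal points in the two eyes, i.e.
# when the residual binocular disparity is small. Point-eye geometry, angles
# in degrees; the threshold below is illustrative, not a clinical value.

def retinal_azimuth(obj, eye, eye_rotation_deg):
    """Horizontal angle of the object on the retina, relative to the eye's visual axis."""
    dx, dy = obj[0] - eye[0], obj[1] - eye[1]
    direction = math.degrees(math.atan2(dx, dy))  # 0 deg = straight ahead, positive = rightward
    return direction - eye_rotation_deg

def is_fused(obj, left_rotation_deg, right_rotation_deg, ipd=0.06, threshold_deg=0.25):
    left_eye, right_eye = (-ipd / 2, 0.0), (ipd / 2, 0.0)
    disparity = (retinal_azimuth(obj, left_eye, left_rotation_deg)
                 - retinal_azimuth(obj, right_eye, right_rotation_deg))
    return abs(disparity) < threshold_deg

target = (0.0, 1.0)                                 # an object 1 m straight ahead
vergence = math.degrees(math.atan2(0.03, 1.0))      # rotation needed for each eye to fixate it
print(is_fused(target, vergence, -vergence))        # True: both visual axes intersect at the target
print(is_fused(target, vergence, -vergence + 3.0))  # False: a 3-deg misalignment breaks fusion
```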

The development of the corresponding retinal elements and the precise control of eye movements both depend on processing sensory signals from the eyes into appropriate oculomotor commands.


Among these signals, oculoproprioception is the least investigated, probably because of the difficulty of stimulating the extraocular muscle proprioceptors in a safe and controlled way.


Our recent functional magnetic resonance imaging (fMRI) study suggests that oculoproprioception contributes to the coupling between the movements of the two eyes. A brief passive lateral deviation of the right eye activates the extraocular motor nuclei that control the lateral movement of the left eye.


In collaboration with colleagues at the University of Glasgow and NHS Fife, I will now investigate the neural mechanisms of the proprioceptive coupling between the two eyes and whether these mechanisms break down in children with some types of strabismus. The project is funded by the Medical Research Council, UK (2025-2028).





Patrick Faria

Alex Mitchell

Eye Dominance

What are the neural mechanisms underlying the (typically stable) preference for one eye? Could a stable eye dominance contribute to precise vision, for instance during reading?


Motor ocular dominance, or eyedness, is the preference for one of the two eyes when sighting, for instance when shooting or playing golf. This preference is typically stable when tested under standard conditions. Eye-care professionals use it when correcting age-related far-sightedness: the non-dominant eye is typically corrected to see close-up objects, while the dominant eye is corrected to see in the distance. An interest in this topic was sparked by findings of less stable eyedness in up to two-thirds of individuals with dyslexia and by the suggestion that the assessment of eyedness could form the basis of an early diagnosis for this condition.


Hardly anything is known about the neural mechanisms that underlie eyedness. The dominant eye is not necessarily the eye with the best vision. A sensory modality that has so far been overlooked by this research is oculoproprioception, the sense of eye position and movement. Pei Wu, a PhD student, and I are now researching whether the dominant eye is the eye with the more reliable proprioception and whether optimal integration of proprioceptive information from the two eyes is important for accurate vision.
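Here, "optimal integration" refers to reliability-weighted averaging, in which each eye's estimate is weighted by the inverse of its variance. The sketch below uses invented numbers purely for illustration:

```python
# Sketch of reliability-weighted ("optimal") cue combination: each eye's
# proprioceptive estimate of gaze direction is weighted by the inverse of
# its variance. All numbers below are invented for illustration.

def combine(estimates, variances):
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    fused_variance = 1.0 / total   # the combined estimate is more reliable than either cue alone
    return fused, fused_variance

# One eye reports 10 deg with low noise, the other 12 deg with high noise:
# the combined estimate lies closer to the more reliable eye.
print(combine([10.0, 12.0], [1.0, 4.0]))   # -> (10.4, 0.8)
```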




Pei Wu

Contact

Daniela Balslev


University of St Andrews

School of Psychology and Neuroscience

St Mary's College

South Street

St Andrews

KY16 9JP

UK


daniela.balslev@st-andrews.ac.uk












Last updated by Daniela on 07.10.2024, 19:46

With thanks to Finn Årup Nielsen, who designed this site.