VESTIBULAR

Dissociating active and passive self-motion from sound-source motion

Jose Garcia Uceda Calvo (Project 2)

The aim of my research is to perform behavioral/psychophysical experiments on cross-sensory (auditory-vestibular) spatial behavior in healthy humans. In particular, the behavioral experiments concern human sound localisation and its relation to the vestibular system. Humans can localise sounds accurately and consistently, and can make fast head movements towards static sound sources. However, real-world sound sources, such as animals, humans and objects, are in constant motion. Moreover, the observer is moving too, rotating and translating the head through space. My first research question is: can the human auditory system localise and accurately track moving sound sources? Second: how does the brain dissociate object motion from self-motion? Third: what is the weight of the vestibular signal in audio-vestibular integration during whole-body rotation and/or static tilt? Fourth: can proprioceptive perturbation of the neck muscles alter the spatial representation of sounds?

Spatial orientation in health and disease

Antonella Pomante (Project 12)

Spatial orientation is the sense of body orientation and self-motion relative to the stationary environment, fundamental to normal waking behavior and to the control of everyday motor actions, including eye movements, postural control, and locomotion. Spatial orientation and self-motion perception depend heavily on how the brain interprets signals mediated by the visual, somatosensory and vestibular systems, in combination with motor signals and internal beliefs. This project aims to understand this process by developing statistical models and testing them in healthy subjects and in patients who lack particular sensory or motor functions.
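
As an illustration of the kind of statistical model involved, the sketch below (Python; all quantities are made-up assumptions, not project data) shows the standard maximum-likelihood prediction that such models are typically tested against: a reliability-weighted fusion of a visual and a vestibular estimate of body tilt.

import numpy as np

# Illustrative only: two noisy cues about body tilt (deg), each with its own
# reliability (1/variance). The maximum-likelihood estimate weights each cue by
# its relative reliability; the fused variance is smaller than either alone.
def fuse_cues(mu_vis, sd_vis, mu_vest, sd_vest):
    w_vis = sd_vest**2 / (sd_vis**2 + sd_vest**2)      # weight of the visual cue
    mu_fused = w_vis * mu_vis + (1 - w_vis) * mu_vest
    sd_fused = np.sqrt(sd_vis**2 * sd_vest**2 / (sd_vis**2 + sd_vest**2))
    return mu_fused, sd_fused

# Hypothetical numbers: vision says 20 deg (sd 5), the vestibular system 30 deg (sd 10).
print(fuse_cues(20.0, 5.0, 30.0, 10.0))                # -> (22.0, ~4.5)

Systematic deviations from such predictions in subjects who lack one of the contributing senses are what make these models testable.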

Integrating ego motion in eye-hand coordination

Johannes Keyser (Project 13)

Voluntary eye-hand coordination primarily relies on signals mediated by our eyes, proprioception, and vestibular system. Acting based on these sensory sources means combining them with motor signals and internal beliefs, creating a closed sensorimotor control loop. Often, manual tasks are performed during whole-body movements, initiated either actively or passively. This project aims to understand this process by developing statistical models, and testing these models in healthy subjects and patients that lack particular functions of sensorimotor integration.
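
To illustrate the closed-loop aspect, the toy Python sketch below (my own illustrative example, not the project's model) combines an internal motor prediction (efference copy) with a noisy sensory measurement in a simple Kalman-filter-style update.

import numpy as np

# Toy 1D example: track hand position during a self-initiated movement by
# combining an internal motor prediction with a noisy sensory measurement.
def kalman_step(x_est, p_est, u, q, z, r):
    x_pred = x_est + u          # predict from the motor command (efference copy)
    p_pred = p_est + q          # prediction uncertainty grows by process noise q
    k = p_pred / (p_pred + r)   # Kalman gain: trust the sensor more when r is small
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0
for step in range(5):
    u = 1.0                                              # intended displacement (made up)
    z = (step + 1) * 1.0 + np.random.normal(0, 0.5)      # noisy sensory measurement
    x, p = kalman_step(x, p, u, q=0.1, z=z, r=0.25)
print(x, p)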


 

AUDITORY

Bimodal hearing in the hearing-impaired

Snandan Sharma (Project 9)

I am Snandan. After a gradual introduction to hearing-aid science through an acoustics-based education, I began my PhD studies on Bimodal Hearing within HealthPAC in early 2015. Bimodal hearing simply means two ways of hearing: electrical hearing through a cochlear implant (CI) and acoustic hearing via a hearing aid (HA). At present I am studying spatial sound perception in people who listen through a CI in one ear and a HA in the same or the contralateral ear. To ensure optimal benefit for an individual using a hearing prosthesis, it is important to understand which spatial sound cues they rely upon to localise sounds. This is studied by performing sound-localisation tasks in a dark, sound-proof room with speakers arranged in 3D space. A participant's response to a stimulus is measured with a head-movement tracker. I hope to pilot the first set of sound-localisation experiments with bimodal listeners very soon.
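
For illustration, responses from such head-pointing localisation experiments are often summarised with a stimulus-response regression; the short Python sketch below uses made-up numbers (not pilot data) to show how a localisation gain and bias could be extracted.

import numpy as np

# Illustrative analysis on fake data (not pilot results): regress head-pointing
# responses on target azimuths. A gain near 1 and a bias near 0 indicate
# accurate localisation; a compressed gain indicates systematic undershoot.
targets = np.array([-75, -50, -25, 0, 25, 50, 75], dtype=float)        # deg azimuth
responses = 0.8 * targets + 5 + np.random.normal(0, 8, targets.size)   # fake responses

gain, bias = np.polyfit(targets, responses, 1)
scatter = np.std(responses - (gain * targets + bias))
print(f"gain = {gain:.2f}, bias = {bias:.1f} deg, scatter = {scatter:.1f} deg")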

Bilateral cochlear implantation

Sebastian Ausili (Project 10)

So far, research concerning the effectiveness of bilateral cochlear implantation has mainly focused on primary auditory outcome measures, such as speech perception, speech perception in noise, and localization abilities. In these domains, bilateral implantation probably has added value over unilateral cochlear implantation in both children and adults. However, the effects seem to be limited by the fact that CIs can transfer information about intensity and amplitude, but fall short in transferring temporal information. New coding strategies such as fine-structure processing may hold some promise, but new technology linking the two speech processors to facilitate synchronisation, as well as microphone technology, also needs to be explored. In the present project, we will evaluate primary auditory capacities in patients with current state-of-the-art devices and strategies, combined with fundamental research on new hardware and strategies. Pilot studies and clinical trials will be designed and conducted to explore and test new possibilities for extending the capacity of implanted patients to improve their auditory performance, making use of the squelch effect, interaural time and level differences (ITDs and ILDs), and spectral cues.
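
As a reference for the binaural cues mentioned above, the Python sketch below (purely illustrative, not part of the clinical protocol) shows one simple way to estimate an ITD and an ILD from a two-channel signal, using cross-correlation and the RMS level ratio.

import numpy as np

# Illustrative estimation of the two binaural cues from a left/right signal pair.
def itd_ild(left, right, fs):
    # ITD: lag of the cross-correlation peak between the two ears (seconds)
    xcorr = np.correlate(left, right, mode="full")
    lag = np.argmax(xcorr) - (len(right) - 1)
    itd = lag / fs
    # ILD: level difference between the ears in dB
    ild = 20 * np.log10(np.sqrt(np.mean(left**2)) / np.sqrt(np.mean(right**2)))
    return itd, ild

fs = 44100
t = np.arange(0, 0.05, 1 / fs)
sig = np.sin(2 * np.pi * 500 * t)
left = np.roll(sig, 22)          # ~0.5 ms interaural delay (toy example)
right = 0.5 * sig                # 6 dB level difference
print(itd_ild(left, right, fs))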


 

VISUAL

Early visual attention: a diagnostic measure for deficits after brain injuries

Ahmed Hisham Gardoh (Project 4)

Hemispatial neglect is a frequent disorder following stroke. Patients with hemispatial neglect fail to attend to sensory stimuli on the contralesional side of space. Importantly, the visual cortex in stroke patients with pure visual neglect is intact, and early visual processing is usually preserved. Strokes frequently damage the connection between the intact visual cortex and higher-level processing areas. Consequently, sensory stimuli on one side of space fail to receive enough attention to exceed the threshold needed to reach awareness. We hypothesize that training voluntary control of attention using synchronous multisensory cues improves the functional outcome during rehabilitation of neglect patients. Information from multiple senses can circumvent the damaged routes, so that the intact visual cortex can be functionally re-engaged. Multisensory stimulation might induce enough attentional modulation of activity in primary sensory cortex to reach awareness. The aim of my project is to test this hypothesis and to perform fundamental laboratory studies investigating multisensory interactions and their interplay with the mechanisms of attentional control over perceptual selection. This provides a framework within which we plan to devise multisensory-stimulation-based rehabilitation tools that facilitate high-level sensory processing (e.g., attentional control) and can be used for hemispatial neglect patients in clinical settings.

Cognitive assessment and rehabilitation of visually impaired patients

Leslie Guadron (Project 5)

Our study will employ computational modeling and behavioral experiments to predict and examine the visuo-motor characteristics of visually impaired patients. Visual input is believed to play an important role in determining an optimal saccade trajectory and in generating the neural command that produces this movement. We aim to investigate how saccade generation changes when visual information is lacking or incomplete, as is the case in patients with tunnel vision, foveal scotomas, or peripheral blind spots. We hope to determine precisely how visual input affects saccade planning and execution. What we learn can be applied to the early diagnosis of those at risk of vision loss and, possibly, to the improvement and/or assessment of retinal implants.
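
One standard benchmark that such saccade modeling builds on is the "main sequence", the stereotyped relation between saccade amplitude and peak velocity; the sketch below (Python, made-up numbers) fits a common saturating form of this relation, against which patient data could be compared.

import numpy as np
from scipy.optimize import curve_fit

# Illustrative fit on made-up numbers: the saturating main-sequence relation
# V_peak = Vmax * (1 - exp(-amplitude / C)) commonly used to characterize
# normal saccades; deviations from it can flag abnormal saccade generation.
def main_sequence(amp, vmax, c):
    return vmax * (1.0 - np.exp(-amp / c))

amps = np.array([2, 5, 10, 15, 20, 30], dtype=float)                  # deg
vpeaks = main_sequence(amps, 500.0, 12.0) + np.random.normal(0, 15, amps.size)

(vmax_fit, c_fit), _ = curve_fit(main_sequence, amps, vpeaks, p0=[400.0, 10.0])
print(f"Vmax ~ {vmax_fit:.0f} deg/s, C ~ {c_fit:.1f} deg")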

Visual rehabilitation after stroke

Anna Geuzebroek (Project 6)

Within the HealthPAC project, I am working on the rehabilitation of visual-field defects after stroke. Patients suffer from these defects on a daily basis, and until recently no effective rehabilitation was offered. Visual restorative treatment is a new development in this field, in which patients train their defective visual field through repetitive visual stimulation. Unfortunately, this treatment is not able to help all patients. My project focuses on developing ways to improve the treatment and to help these patients, and in this light I am working on several studies. In the first, we use a saccadic target-selection task to assess how well the recovered visual field competes with the healthy visual field after restorative treatment. We found that patients show an abnormal behavioral pattern in their 'unaffected' visual field. We therefore studied the saccadic parameters, specifically the reaction times, to find a possible explanation. Our results suggest that this abnormal behavior may indicate the adoption of a new decision-making strategy, in which patients seem to trade off accuracy against the urgency to make a choice. In the second study, I work on a new treatment in cooperation with Philips, using a multimodal approach that combines visual with haptic stimulation to try to induce plasticity.
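
The speed-accuracy interpretation can be illustrated with a toy sequential-sampling simulation (below, Python); it is purely illustrative and not the model fitted to our patient data, but it shows how responding with more urgency, modeled here as a lower decision bound, shortens reaction times at the cost of more errors.

import numpy as np

# Toy drift-diffusion simulation (illustrative only): a lower decision bound
# (more urgency) yields faster but less accurate choices.
def simulate(bound, drift=1.0, noise=1.0, dt=0.001, n_trials=500, seed=0):
    rng = np.random.default_rng(seed)
    rts, correct = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < bound:
            x += drift * dt + noise * np.sqrt(dt) * rng.normal()
            t += dt
        rts.append(t)
        correct.append(x > 0)
    return np.mean(rts), np.mean(correct)

for bound in (1.0, 0.3):        # high bound vs. low ("urgent") bound
    rt, acc = simulate(bound)
    print(f"bound {bound}: mean RT {rt:.2f} s, accuracy {acc:.2f}")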

Top-down attention in health and disease

Andrea Bertana (Project 11)

Visual perceptual learning refers to a reliable improvement in performance on a visual perceptual task due to experience with that task. Many studies have shown that this type of learning is specific to the trained task and feature (orientation, motion direction, spatial frequency). A recent study in our lab has also shown that perceptual learning can improve our ability to sample relevant portions of a stimulus by increasing our attentional window. In my current project I am interested in the neuronal properties underlying this type of learning. I investigate this using functional magnetic resonance imaging (fMRI), focusing on primary visual cortex. The data analysis combines standard fMRI analyses with ad-hoc models to estimate attentional modulations.
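
As a minimal illustration of such model-based estimation (not the actual analysis pipeline), the Python sketch below fits an ordinary least-squares GLM per voxel, with the regression weight on an attended-versus-unattended regressor serving as the estimate of attentional modulation.

import numpy as np

# Minimal illustration: estimate per-voxel attentional modulation with an OLS
# GLM. 'design' contains an attended vs. unattended regressor plus an intercept;
# the first beta per voxel is the estimated modulation amplitude.
n_scans, n_voxels = 200, 1000
attention = np.tile(np.r_[np.ones(10), np.zeros(10)], 10)        # toy block design
design = np.column_stack([attention, np.ones(n_scans)])
signals = 0.5 * attention[:, None] + np.random.normal(0, 1, (n_scans, n_voxels))

betas, *_ = np.linalg.lstsq(design, signals, rcond=None)
print("mean modulation estimate:", betas[0].mean())              # ~0.5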

 

MOTOR

Signatures of impaired inhibition in motor behavior in Parkinson's Disease 

Sonal Sengupta (Project 7)

The predominant model of basal ganglia-thalamo-cortical dysfunction in Parkinson's disease (PD) explains the motor impairments as the result of over-inhibition of motor cortex and brainstem circuits. However, there is also evidence for increased excitability of motor cortex and for abnormal motor behavior resulting from disinhibition. The main aim of the project is to address the effect of impaired inhibition on the motor behavior of PD patients. The manifestation of disinhibition in motor behavior will be investigated using the basic principles of motor control and movement planning. The project will comprise experiments on upper-limb motor control/reaching, gait, and eye movements.

Muscle mechanics and neural control in fine hand motor tasks in health and disease

Sigrid Dupan (Project 8)

Fine motor control of the hand influences daily activities such as grasping, writing, and typing. Involuntary movements of fingers that are not needed for a given task may therefore lead to inefficient movement or even interfere with the intended movement. While most human hand movements involve grasping, some tasks require single-finger movements. Motor control is characterized by the simultaneous control of a large number of mechanical degrees of freedom. Peripheral and neural constraints may simplify the control of certain movements, but also impose limitations on single-finger mobility. The analysis of fine hand motor tasks in movement disorders might provide insight into the underlying mechanisms. Kinematic and EMG data allow the description of fine hand motor control in both health and disease. Hypotheses about central constraints will be tested by analyzing the biophysical responses to cortical perturbations induced by transcranial magnetic stimulation (TMS).
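
As an example of a typical first processing step for such EMG recordings, the Python sketch below computes a linear envelope (band-pass filter, rectification, low-pass filter); the filter settings are my own assumptions, not the project's actual pipeline.

import numpy as np
from scipy.signal import butter, filtfilt

# Illustrative EMG pre-processing (filter settings are assumptions): band-pass
# to remove drift and noise, rectify, then low-pass to obtain the linear
# envelope used for further analysis.
def emg_envelope(emg, fs, band=(20, 450), lp_cutoff=6):
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, emg)
    rectified = np.abs(filtered)
    b, a = butter(4, lp_cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, rectified)

fs = 2000
t = np.arange(0, 2, 1 / fs)
emg = np.random.normal(0, 1, t.size) * (0.2 + 0.8 * (t > 1))     # toy burst at t > 1 s
envelope = emg_envelope(emg, fs)
print(envelope[:5])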