Vision, Hearing, & Touch
Audiovisual Synchrony Perception.
This project investigates temporal processing of auditory and visual
information and the synchronization of those signals. Specifically,
we are looking at the ways in which the brain ensures a continuous
perception of multisensory synchrony in the presence of asynchronous
sensory inputs. We are currently focusing on the perception of synchrony
for ecologically valid stimuli, such as audiovisual speech and non-speech
signals like music, as well as for simpler stimuli.
Timing & Temporal Order Perception.
How do we judge the relative timing of events arising in different
sensory modalities? Because sound, light, and odours take different
amounts of time to reach us through the environment, and because the
peripheral and central nervous systems process different forms of
energy (mechanical, electromagnetic, chemical) in different ways and
to different extents for individual sensory modalities, the question
of relative timing between the senses is critical. This project assesses
the influence of spatial location and divided attention upon the perception
of timing across vision, touch, and audition.
Auditory & Tactile Warning Signals in Real-World Simulations.
Can multisensory spatial cueing, as studied in psychophysical laboratories,
provide behavioural advantages in real-world situations, like driving
a car? Where should auditory or tactile cues be presented in order
to maximise behavioural advantages
when dealing with potentially life-threatening situations like head-on
or rear-impact collisions?
Synaesthesia & Crossmodal Correspondences.
Synaesthesia is a rare condition in which people report, for example,
'seeing' a colour when they hear certain words, like the days of the
week, or numbers. Everyday language also uses cross-modal correspondences
to describe a variety of sensory experiences - tastes can be 'sharp'
and colours can be 'loud', for example. But can such synaesthetic
correspondences be demonstrated in normal individuals for simple stimulus
dimensions such as brightness, size, colour, and motion? This project
investigates a number of questions along these lines.
The Body, Peripersonal Space, & Tool-use.
This project is investigating the crossmodal representation of the
body and the space around the body using a visual-tactile interference
paradigm. Evidence from psychophysics, neuropsychology, and neurophysiology
suggests that the representation(s) of the body and peripersonal space
can be modulated by active tool-use. We are studying this modulation
in more detail.
Visual Capture of Proprioception.
By using prisms, mirrors, and television monitors we can dissociate
the seen and felt locations of individual body parts. To what extent
can the felt location of a body part be 'captured' by visual information
presented at different spatial locations? What kind and quality of
visual information is necessary for this capture to occur? To what
extent can we 'fool' the brain about the locations of individual body parts?
Tactile Numerosity Judgements.
In vision, we can 'count' up to a certain number of objects almost
instantaneously, without having to enumerate or deliberate, just by
looking at the group of objects. Typically, people can do this (known
as 'subitising') for up to around five to seven objects. Can we also do
this for objects presented in other sensory modalities? We are trying
to find out.
Vision & the Chemical Senses
Do We Just Smell/Taste What We See?
The main goal of this project is to understand the interactions between
vision and the perception of odour and taste. In particular, we are
investigating the influence of colour on odour and taste perception, and
the level of processing at which any crossmodal integration may occur.
Besides this, we are interested in the attentional influence of odours
on visual performance.