Current Funded Research

Language-induced event-representation: competition and multiple object instantiation (Dr Yuki Kamide & Dr Anue Kukona)

In order to understand the events around us, our cognitive systems must encode and track the changes that individual objects undergo as these events unfold (or, when we read or hear a description of those events, as the language unfolds): they must encode the 'before' and 'after' states of any object affected by the event. This project explores the conditions under which these alternative encodings of the same object might interfere with one another in memory, and the neural structures that support these encodings. A range of converging methods is used: fMRI to establish the neural underpinnings of these representations, eye tracking to establish the behavioural consequences of the interference postulated to occur between them, and the Speed-Accuracy Tradeoff (SAT) task to explore how such interference affects the time-course with which these representations become activated. The work lays the foundation for a novel understanding of both normal and disordered language and cognitive development (in which the ability to keep different representations of the same object apart is essential), and will further our understanding of how brain maturation shapes such development.


Prof Gerry Altmann (York), Dr Gitte Joergensen (York)

Funding Source:
Economic and Social Research Council (ESRC), 2011-14
(£730,113 (FEC), jointly with York)

Dynamic representations of motion events in sentence processing (Dr Yuki Kamide & Dr Shane Lindsay)

When we hear or read a sentence describing an object moving from one place to another, we have to keep in memory where the object was initially, and then where it is after the movement, in order to understand the event described in the sentence. This project investigates in detail how we construct such representations as the sentence unfolds. It comprises a series of experiments that attempt to uncover potential 'mental simulation' processes in the understanding of linguistically described motion events. Some experiments, for example, investigate whether listeners' attention shifts are modulated by the shape of the trajectory of the object in motion (e.g., 'the ball will be thrown/rolled into the pond'). To do so, various methods are used to track attention shifts: standard 'visual-world' eye-movement tracking techniques to explore listeners' overt attention shifts, as well as other methods that capture more covert attention shifts, with a focus on the time-course of processing to establish how automatic these processes are. Other experiments compare the processes by which people represent motion events in spoken versus written language. Altogether, the research will have implications for wider issues, such as theories of cognitive representation, cognitive development, and language-vision interactions.


Dr Christoph Scheepers (Glasgow), Scott Gilmour (Glasgow)

Funding Source:
Economic and Social Research Council (ESRC), 2011-14
(£452,449 (FEC) + PhD studentship, jointly with Glasgow)

Other Research

Talker adaptation in sentence processing (Dr Yuki Kamide)

First language (L1) syntax and second language (L2) processing (Dr Yuki Kamide)

Prediction in language comprehension in context (Dr Yuki Kamide & Dr Anue Kukona)

Integration of what we see and what we hear (Dr Yuki Kamide & Dr Anue Kukona)

Parsing (syntactic processing) (Dr Yuki Kamide)

Modelling language processing (Dr Anue Kukona)

Novel word learning (Dr Shane Lindsay)

Postgraduate Student Projects:

Processing words as objects (PhD project: Gavin Revie & Dr Yuki Kamide)

Spatial representations in language processing (PhD project, Glasgow: Scott Gilmour, Dr Christoph Scheepers & Dr Yuki Kamide)

Metaphor processing (MSc project: Anna Dobai & Dr Yuki Kamide)

Audience design in dialogue (MSc project: Eva Vousta & Dr Yuki Kamide)