In every conversation, there is an unspoken code – a set of social rules that guides you. When to talk, when to stop talking, when to listen, and where to look. But what happens to people with aphasia, a communication disorder that can occur after a stroke, brain injury or tumour, causing patients to make mistakes and misunderstand language?
As a clinical researcher working in the field of neuro-rehabilitation at the University Hospital of Bern, I have always been interested in the consequences of aphasia in patients with acquired brain lesions. The resulting impairment of language has a severe impact on quality of life as it impedes social interaction.
My latest research points toward a behavioural workaround for those now locked out: the more complex the sentence structure, the harder it is for aphasia patients to respond to cues and take turns in a conversation. To get past the disorder, conversants must use simple sentences, repeat words, and slow down.
Conversation is a complex interplay involving the production and comprehension of speech, along with non-verbal components such as gesture, facial expression and gaze. With a grant from the Swiss National Science Foundation, our team in Bern had the chance to study the interaction of verbal and non-verbal behaviour during conversation, in healthy subjects and in patients with aphasia, by tracking the movement of their eyes. The analysis of eye movements and fixations is a well-established technique for studying the real-time processing of ongoing conversations, because gaze shifts every time one speaker stops and another takes a turn. A speaker's gaze signals to the partner that a statement is concluding, and the partner may chime in.
The conversational turn-taking system has been known for many years. I think of it as a speech-exchange system that organises the opportunities to speak during social interaction. A speaker can either actively pass the turn to the next speaker – or the turn can be actively taken by the listener at the next possible completion. Following these rules ensures that there is only one speaker at a time during a dialogue. Obviously, taking a turn requires that the listener stay attuned to the end of the current conversational turn. This ability relies on knowledge of the structure of the linguistic units, which enables us to project the ending in advance – and is limited in those with aphasia.
What happens to aphasia patients, who have so much trouble jumping in? To find out, we started by looking at sentence complexity, measured through what we call the ‘lexico-syntactic complexity index’ – a formula that takes into account the number of words in each sentence, and the load of verbal complexity in a given statement overall. Word frequency matters here too: more common words are perceived at much lower speech-to-noise ratios than less common words, so rarer vocabulary adds to the load.
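To make the idea concrete, here is a toy sketch of how such a complexity score might be computed. The actual lexico-syntactic complexity index is a research measure whose exact formula is not given here; the weights, the word-frequency list and the function name below are invented purely for illustration.

```python
# Toy illustration of a sentence-complexity score in the spirit of a
# lexico-syntactic complexity index: longer sentences and rarer words
# both raise the score. The weights and word list are invented.

# Hypothetical stand-in for a real word-frequency corpus.
COMMON_WORDS = {"the", "a", "is", "to", "and", "i", "you", "it", "we", "go", "shop"}

def toy_complexity(sentence: str) -> float:
    """Score a sentence: one point per word, plus half a point per rare word."""
    words = sentence.lower().rstrip(".!?").split()
    length_load = len(words)  # more words -> harder to project the ending
    rare_load = sum(1 for w in words if w not in COMMON_WORDS)  # rarer words add load
    return length_load + 0.5 * rare_load

# A short, everyday sentence scores lower than a dense, rare-word one.
print(toy_complexity("We go to the shop."))
print(toy_complexity("Persistent inflationary pressures complicate procurement."))
```

On this toy scale, the everyday sentence scores 5.0 and the dense one 7.5 – a crude stand-in for the real measure, but it captures the intuition that both length and word rarity make a statement harder to track.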
In our study, patients with aphasia and healthy controls observed videos of dialogues. In the videos, two actors stood at a table discussing a theme from daily life, such as cooking, sports or shopping. While the participants watched the videos, we recorded their eye movements, analysing how they changed with speech complexity. Based on the movements, we determined that those with aphasia had trouble predicting the end of a statement; the more complex the sentences became, the greater the challenge that the patient faced. This was in stark contrast to healthy subjects, in whom higher syntactic complexity led to more accurate gaze shifts, indicating an ability to jump in as called for and keep the conversation alive.
The lesson here is that, when speaking to aphasic people, one will communicate better with simpler sentences and even repetition of words. Without this adaptation, the aphasic dialogue partner has no chance to take over the conversation – and communication dries up.
We are taking our studies further. The next step involves examining conversational turn-taking and gaze between aphasic patients and examiners directly. In our new studies, patients and controls no longer observe videos, but engage in dialogue directly. Participants wear a helmet-mounted eye-tracking system that measures gaze as the interaction occurs. Using this technique, we are starting to evaluate how participants explore the faces of their conversation partners for emotions and other subtle cues – and hope to give those with aphasia further communication tools.