February 2020

How do people with dementia and their carers use Alexa-type devices in the home?

We have a fully funded PhD position available (deadline 6th March 2020) to work with me, Prof. Charles Antaki and Prof. Liz Peel, in collaboration with The Alzheimer’s Society, to explore the opportunities, risks and wider issues surrounding the use of AI-based voice technologies such as the Amazon Echo and home automation systems in the lives of people with dementia.

Voice technologies are often marketed as enabling people’s independence. For example, Amazon’s “Sharing is Caring” advert for its AI-based voice assistant Alexa shows an elderly man being taught to use the ‘remind me’ function of an Amazon Echo smart speaker by his young carer. But how accessible are these technologies in practice? How are people with dementia and carers using them in creative ways to solve everyday access issues? And what are the implications for policy given the consent and privacy issues?

The project will combine micro- and macro-levels of analysis and research. On the micro-level, the successful applicant will be trained and/or supported to use video analysis to study how people with dementia collaborate with their assistants to adapt and use voice technologies to solve everyday access issues. On the macro-level, the project will involve working on larger-scale operations and policy issues with Ian Mcreath and Hannah Brayford at The Alzheimer’s Society and within the wider Dementia Choices Action Network (#DCAN).

Through this collaboration, the research will influence how new technologies are used, interpreted and integrated into personalised care planning across health, social care and voluntary, community and social enterprise sectors.

The deadline is 6th March 2020 (see the job ad for application details). All you need to submit for a first-round application is a CV and a short form with a brief personal statement. We welcome applications from people from all backgrounds and levels of research experience (training in specific research methods will be provided where necessary). We especially welcome applications from people with first-hand experience of disability and dementia, or with experience of working as a formal or informal carer/personal assistant.

This research will form part of the Adept at Adaptation project, looking at how disabled people adapt consumer AI-based voice technologies to support their independence across a wide range of impairment groups and applied settings.

The successful applicant will be supported through the ESRC Midlands Doctoral Training Partnership, and will have access to a range of highly relevant supervision and training through the Centre for Research in Communication and Culture at Loughborough University.

Feel free to contact me on s.b.albert@lboro.ac.uk with any informal inquiries about the post.

Adept at Adaptation

AI and voice technologies in disability and social care

There is a crisis in social care for disabled people, and care providers are turning to AI for high-tech solutions. However, research often focuses on medical interventions rather than on how disabled people adapt technologies and work with their carers to enhance their independence.

This project explores how disabled people adapt consumer voice technologies such as the Amazon Alexa to enhance their personal independence, and the wider opportunities and risks that AI-based voice technologies may present for future social care services.

We are using a Social Action research method to involve disabled people and carers in shaping the research from the outset, and conversation analysis to examine how participants work together using technology (in the broadest sense – including language and social interaction) to solve everyday access issues.

The project team includes me, Elizabeth Stokoe, Thorsten Gruber, Crispin Coombs, Donald Hislop, and Mark Harrison.

Background

Voice technologies are often marketed as enabling people’s independence.

For example, a 2019 Amazon ad entitled “Morning Ritual” features a young woman with a visual impairment waking up, making coffee, then standing in front of a rain-spattered window while asking Alexa what the weather is like.

Many such adverts, policy reports and human-computer interaction studies suggest that new technologies and the ‘Internet of Things’ will help disabled people gain independence. However, technology-centred approaches often take a medicalized approach to ‘fixing’ individual disabled people, which can stigmatize them by presenting them as ‘broken’, and tend to favour high-tech, lab-based solutions over more realistic adaptations.

This project explores how voice technologies are used and understood by elderly and disabled people and their carers in practice. We will use applied conversation analysis – a method designed to show, in procedural detail, how people achieve routine tasks together via language and social interaction.

A simple example: turning off a heater

Here’s a simple example of the kind of process we are interested in.

In the illustration below, Ted, who is about to be hoisted out of his bed, gives a command to Alexa to turn off his heater (named ‘blue’) while his carer, Ann, moves around his bed, unclipping the wheel locks so she can move it underneath the hoist’s ceiling track. Before Ann can move the bed, she has to put away the heater, and before she can put it away, it must be switched off.


Ann leaves time and space for Ted to use Alexa to participate in their shared activity.

While Ann could easily have switched off the heater herself before moving it out of the way and starting to push the bed towards the hoist, she pauses her activity while Ted re-does his command to Alexa – this time successfully. You can see this sequence of events as it unfolds in the video below.
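
As an aside for technically minded readers: when a command like ‘Alexa, turn off blue’ succeeds, it typically reaches whatever service controls the heater’s smart plug as a structured directive. The sketch below is illustrative only, assuming a backend built on Amazon’s Alexa Smart Home skill interface; the endpoint ID and the switch_off() helper are hypothetical, though the directive and response shapes follow the published Alexa.PowerController format.

from datetime import datetime, timezone

def handle_directive(event):
    """Entry point for a Smart Home skill directive (e.g. an AWS Lambda handler)."""
    header = event["directive"]["header"]

    if (header["namespace"] == "Alexa.PowerController"
            and header["name"] == "TurnOff"):
        endpoint_id = event["directive"]["endpoint"]["endpointId"]
        switch_off(endpoint_id)  # hypothetical call to the heater's smart plug

        # Report the new state back so Alexa can confirm the change (or, as in
        # Ted's first attempt, apologize and give up when something fails).
        return {
            "context": {
                "properties": [{
                    "namespace": "Alexa.PowerController",
                    "name": "powerState",
                    "value": "OFF",
                    "timeOfSample": datetime.now(timezone.utc).isoformat(),
                    "uncertaintyInMilliseconds": 500,
                }]
            },
            "event": {
                "header": {
                    "namespace": "Alexa",
                    "name": "Response",
                    "payloadVersion": "3",
                    "messageId": header["messageId"] + "-response",
                    "correlationToken": header.get("correlationToken"),
                },
                "endpoint": {"endpointId": endpoint_id},
                "payload": {},
            },
        }

def switch_off(endpoint_id):
    """Hypothetical stand-in for the vendor call that powers down the device."""
    print(f"powering off {endpoint_id}")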

Here are a few initial observations we can make about this interaction.

Firstly, Ann is clearly working with Ted, waiting for him to finish his part of the collaborative task before continuing with hers. By pausing her action, she supports the independence of his part of their interdependent activity.

Secondly, using conversation analytic transcriptions and the automated activity log of the Amazon device, we can study the sequences of events that lead up to such coordination problems. For example, we can see that when Ted says the wake word ‘Alexa’, the device successfully pauses the music and waits for a command. We can then see how Alexa mishears the reference to the heater name ‘blue’, and how, in lines 7 and 8, it apologizes and gives an account for simply giving up on the request and unpausing the music.

Alexa mishears Ted’s reference to the heater ‘blue’, then terminates the request sequence
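
For readers who want a feel for how this kind of cross-referencing could work, here is a rough sketch of pairing transcript timestamps with device log entries. Everything here is a hypothetical stand-in: the CSV exports and column names (‘onset’, ‘timestamp’, ‘action’) are invented for illustration, since in practice the Alexa activity log is retrieved through the Alexa app and CA transcripts are produced by hand.

import csv
from datetime import datetime, timedelta

def load_events(path, time_field, fmt="%H:%M:%S.%f"):
    """Read timestamped rows from a CSV file into (datetime, row) pairs."""
    with open(path, newline="") as f:
        return [(datetime.strptime(row[time_field], fmt), row)
                for row in csv.DictReader(f)]

def align(transcript, device_log, window=timedelta(seconds=2)):
    """Yield each transcript turn with any device events within `window` of it."""
    for turn_time, turn in transcript:
        nearby = [event for event_time, event in device_log
                  if abs(event_time - turn_time) <= window]
        yield turn, nearby

# Hypothetical usage:
# transcript = load_events("transcript.csv", "onset")
# device_log = load_events("alexa_log.csv", "timestamp")
# for turn, events in align(transcript, device_log):
#     print(turn["speaker"], turn["utterance"], "->", [e["action"] for e in events])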

These moments give us insight both into the conversation design of voice and home automation systems, and also into how interactional practices and jointly coordinated care activities can support people’s independence.

Thanks to

The British Academy/Leverhulme Small Research Grants scheme for funding the pilot project.
