
Automated Diagnostic Toolkit for Dementia in 
Ageing Deaf Users of British Sign Language (BSL)


Welcome to the ATDA-BSL project funded by the Dunhill Medical Trust. This is a joint research activity between the Cognitive Computing Research Lab, University of Westminster and the Deafness Cognition and Language Research Centre, University College London.

The ATDA-BSL project aims to develop and validate an automated screening toolkit to improve early screening for dementia among ageing users of BSL, thus assisting clinicians who have limited knowledge of BSL in diagnosing dementia in deaf people.

Around 50,000 Deaf people use British Sign Language (BSL), a full natural language unrelated to English, with its own grammar and vocabulary. Apart from family and friends, older signers may have limited opportunities for communication: carers and those in the wider community usually have little or no experience of signing, and interpreters are not available for daily communication. This not only makes communication with deaf elderly people who suffer from memory loss particularly challenging, but also makes dementia difficult to identify in a context of already limited communication with the wider hearing community (e.g., difficulties in communication may be attributed to deafness rather than to cognitive problems).

Although there is no cure for dementia, a timely diagnosis helps in obtaining necessary support, appropriate medication, and maintenance, as far as possible, of engagement in intellectual, social and physical activities within the community. Early identification also enables support to be provided within the home setting, delaying the need to move to residential care.

WHAT IS BSL?

Sign languages are natural human languages, created by Deaf communities and unrelated to spoken languages. They make use of:
  • Hand actions
  • Face, mouth, and head movements
  • Body movements
British Sign Language (BSL) is the sign language used by Deaf people in Britain. Sign languages are articulated by the face and body as well as the hands, within an envelope of space in front of the signer’s body. The figure on the right shows a dictionary entry from BSL SignBank; normal sign space lies within the grey areas shown.

Technology

Clinical observation suggests that there may be differences in the envelope of sign space and in movements of the face in signers with dementia compared to healthy signers. Signers with dementia may use a restricted sign space and limited facial expression compared to healthy deaf controls. 
  • The first phase of our research work focuses on analysing the sign space envelope, in terms of sign trajectories and sign speed, together with the facial expressions of deaf individuals, using standard 2D videos freely available from the BSL Signbank dataset.
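
As a rough illustration of the features involved, the sketch below summarises a tracked wrist trajectory in terms of its sign space envelope (bounding box) and signing speed. The function name, array shape and frame rate are illustrative assumptions of ours, not the project's code.

    # Illustrative sketch: summarise a tracked 2D wrist trajectory in terms of
    # its sign space envelope (bounding box) and signing speed.
    # Assumes wrist_xy has shape (n_frames, 2) in pixel coordinates.
    import numpy as np

    def envelope_and_speed(wrist_xy, fps=25.0):
        # Sign space envelope approximated by the trajectory's bounding box.
        x_min, y_min = wrist_xy.min(axis=0)
        x_max, y_max = wrist_xy.max(axis=0)
        envelope_area = (x_max - x_min) * (y_max - y_min)

        # Frame-to-frame displacement gives instantaneous speed (pixels/second).
        speed = np.linalg.norm(np.diff(wrist_xy, axis=0), axis=1) * fps

        return {
            "envelope_bbox": (x_min, y_min, x_max, y_max),
            "envelope_area": envelope_area,
            "mean_speed": float(speed.mean()),
            "max_speed": float(speed.max()),
        }

A restricted sign space would show up as a smaller envelope area, and slowed signing as a lower mean speed, which is the kind of difference the clinical observation above describes.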

Software Development

  • Data
    • BSL Corpus video recordings of 60 signers aged over 50 from 8 regions of the UK in conversation and responding to interview questions
    • BSL Cognitive Screen norming data: video recordings as well as cognitive screening data for 250 signers aged between 50 and 80
    • Video recordings of case studies of signers with mild cognitive impairment and early stage dementia.
    • Standard 2D videos of single signs freely available on the BSL Signbank dataset
  • Platform

    The current platform is built on the following open-source software and libraries; a quick environment check follows the list.

    • OpenCV: https://opencv.org/
    • NumPy: http://www.numpy.org/
    • Matplotlib: https://matplotlib.org/
    • TensorFlow GPU: https://www.tensorflow.org/install/gpu
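
    As a quick sanity check of this stack (not part of the project pipeline), the snippet below simply imports each library and reports its version; it assumes a TensorFlow 2.x installation for the GPU query.

      # Environment check only: import the stack listed above and report versions.
      # Assumes a TensorFlow 2.x installation for the GPU device query.
      import cv2               # OpenCV
      import numpy as np       # NumPy
      import matplotlib        # Matplotlib
      import tensorflow as tf  # TensorFlow (GPU build recommended)

      print("OpenCV:", cv2.__version__)
      print("NumPy:", np.__version__)
      print("Matplotlib:", matplotlib.__version__)
      print("TensorFlow:", tf.__version__, "| GPUs:", tf.config.list_physical_devices("GPU"))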

  • Methodology and Development
    • Based on the CRoss-Industry Standard Process for Data Mining (CRISP-DM) standard
    • The software engineering approach is Agile

    Implementation

    The figure below shows the pipeline of our model. Two methods have been applied to extract sign trajectories from the datasets before feeding the extracted features to the machine learning model. The highlighted section and the two dashed boxes indicate the two gesture-tracking methods, each taking an RGB video stream as input. The videos below show results for the two baselines, together with the ground truth used to evaluate them; a rough evaluation sketch follows the list.

    1. An image processing method (second video).
    2. A deep learning model using the OpenPose algorithm (first video).
    3. Ground truth data collected with a Polhemus magnetic tracker, used to measure the accuracy of the two methods (last set of videos).
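
    To make the accuracy comparison concrete, here is a minimal sketch of how a tracked trajectory could be scored against the Polhemus ground truth using root-mean-square error. The function name, and the assumption that both trajectories are aligned to the same coordinate frame and frame count, are ours; the project's actual evaluation procedure is not detailed on this page.

      # Illustrative sketch: score a tracked wrist trajectory against ground truth
      # by root-mean-square error. Assumes both arrays have shape (n_frames, 2)
      # and are already aligned to the same coordinate frame.
      import numpy as np

      def trajectory_rmse(tracked_xy, ground_truth_xy):
          errors = np.linalg.norm(tracked_xy - ground_truth_xy, axis=1)
          return float(np.sqrt(np.mean(errors ** 2)))

      # e.g. compare both baselines against the Polhemus recording:
      # rmse_openpose = trajectory_rmse(openpose_traj, polhemus_traj)
      # rmse_imgproc  = trajectory_rmse(imgproc_traj, polhemus_traj)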
  • Further Experimentation
    • Deep neural network models will be used for incremental improvement of dementia recognition rates, based on differences in patterns in facial and trajectory motion data (a minimal model sketch follows this list)
    • Convolutional Neural Network / Recurrent Neural Network / hybrid architectures
    • Train/validate the results against cognitive screening results
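
    As an indication of what a hybrid CNN/RNN classifier might look like in the TensorFlow stack listed above, here is a minimal Keras sketch. The input shape, layer sizes and binary output (healthy signer vs. possible cognitive impairment) are illustrative assumptions, not the project's settings.

      # Minimal Keras sketch of a hybrid CNN/RNN classifier over per-frame features
      # (e.g. trajectory and facial measurements). Shapes and sizes are assumptions.
      import tensorflow as tf

      N_FRAMES, N_FEATURES = 200, 32   # assumed sequence length and feature size

      model = tf.keras.Sequential([
          tf.keras.Input(shape=(N_FRAMES, N_FEATURES)),
          # 1D convolutions capture short-term motion patterns along the time axis.
          tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
          tf.keras.layers.MaxPooling1D(pool_size=2),
          # A recurrent layer summarises the whole sequence.
          tf.keras.layers.LSTM(64),
          tf.keras.layers.Dense(64, activation="relu"),
          # Binary output: healthy signer vs. possible cognitive impairment.
          tf.keras.layers.Dense(1, activation="sigmoid"),
      ])

      model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    Such a model could then be trained and validated against the BSL Cognitive Screen results, as noted in the last bullet above.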
Figure: Dementia screening pipeline
University of Westminster (UoW), 309 Regent Street, London, W1B 2HW, UK
University College London, Gower Street, London, WC1E 6BT 
The Dunhill Medical Trust, 5th Floor, 6 New Bridge Street, London, EC4V 6A

News & Events

BSL Class for Computer Science Researchers


On 12th November 2018 and for five consecutive weeks thereafter, 11am–1pm, at UCL


In order to gain a better understanding of BSL, our research team members attended an intensive course of six BSL classes for computer science researchers at UCL. By the end of the course we were able to:

  • Understand narrations of simple information presented in BSL
  • Engage in simple conversations with Deaf People

RSLondonSouthEast 2019 Workshop

On Thursday 7th February 2019, 10am–4.30pm, at The Royal Society

We presented our initial findings at the one-day workshop of the Research Software London and South East community.

The talk was on enhancing dementia screening in ageing Deaf signers of British Sign Language via analysis of hand movement trajectories.

Contact Us
