Can natural language processing help detect dementia? This U of T grad student is trying to find out

“Your speech can offer a lot of information and clues into how your brain is functioning,” says Katie Fraser, a PhD candidate in the department of computer science at the University of Toronto. 

“Dementia is often linked to language, and using today’s computational tools we can quickly evaluate a person’s speech.” 

Dementia affects 47.5 million people worldwide, according to the World Health Organization. Research has consistently shown that particular changes in speech and language can signal the early onset of the disease. 

For Fraser, finding a computational solution for the detection of dementia has been the focus of her research and the idea behind the startup — software that uses natural language processing and machine learning technology to detect signs of dementia using speech samples.

“It’s important to get this research out of the academic sphere and into the hands of people who can actually benefit from it,” says Fraser. “I think the best way to do this is to develop a product that people can use.”

Fraser, who specializes in computational linguistics, is co-supervised by computer science Professor Graeme Hirst and by an assistant professor in the department of psychology who is also a scientist with the Rotman Research Institute at the Baycrest Centre for Geriatric Care. 

Several years ago, Hirst, his graduate student Xuan Le and department of English Professor Ian Lancashire, in consultation with Regina Jokel, an assistant professor in the department of speech-language pathology, used natural language processing techniques to examine the writing of authors Iris Murdoch (who is known to have died of Alzheimer’s), P.D. James (who did not have Alzheimer’s) and Agatha Christie. Hirst says Christie’s writing showed a striking decline in vocabulary in her last books but, surprisingly, not in her syntax. They concluded Christie likely had the disease. 

“Our thinking in those days was that what matters is change in a person over time,” says Hirst. “That axiom is not as clear-cut as we thought. Our more recent work is a lot less longitudinal.” 

Hirst says that aging inevitably leads to changes in language. Older people tend to talk more abstractly and have minor word-finding problems. People with dementia, however, show a rapid decline in vocabulary and in their noun-to-verb ratio, and their syntax becomes simpler.  

As part of her thesis research, Fraser works on text and speech processing to look for symptoms of dementia. She collects speech samples and analyzes them with speech recognition software and machine learning classifiers, which are algorithms trained to assign data to a category or class. In this case, the classifiers have been trained to distinguish between healthy individuals and people with dementia. 
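
To make the idea concrete, here is a minimal sketch, in Python with scikit-learn, of how a binary classifier of this kind can be trained and evaluated. The feature names and the synthetic data are illustrative assumptions, not Fraser’s actual features or data.

```python
# A minimal sketch (not Winterlight's or Fraser's actual code) of training a
# machine learning classifier to label speech samples as "healthy" or "dementia"
# from numeric features extracted from each sample. The three features and the
# randomly generated values below are placeholders for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Each row is one speech sample:
# [noun-to-verb ratio, mean pause length (seconds), vocabulary richness]
healthy = rng.normal(loc=[1.4, 0.4, 0.70], scale=0.1, size=(50, 3))
dementia = rng.normal(loc=[1.1, 0.9, 0.55], scale=0.1, size=(50, 3))

X = np.vstack([healthy, dementia])
y = np.array([0] * 50 + [1] * 50)  # 0 = healthy control, 1 = dementia

clf = LogisticRegression()
scores = cross_val_score(clf, X, y, cv=5)  # cross-validated classification accuracy
print(f"Mean accuracy: {scores.mean():.2f}")
```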
 
Fraser founded Winterlight Labs alongside her colleagues who share similar research interests in computer science and healthcare. 

“We had discussed trying to build a company by bringing together all of our expertise.” 

The startup was created in the summer of 2015. Its co-founders include Frank Rudzicz, a status-only assistant professor of computer science, a scientist with the Toronto Rehabilitation Institute and an expert in acoustic processing; a co-founder whose experience includes Alzheimer’s research and who holds both a degree in computer science and a master’s degree from U of T’s Institute of Medical Science; and a computer science master’s student in computational linguistics with a background in software engineering and assistive technologies.

Winterlight’s current prototype works by recording a one-to-five-minute speech sample of a person describing an image. The speech sample is then transcribed using automatic speech recognition. Various features of the speech and text, including acoustic, lexical, syntactic and semantic aspects, are extracted and examined. These features are then fed into the machine learning classifier. The prototype has an 81 per cent accuracy rate in classifying speech samples.
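
As a rough outline, the flow just described can be pictured in code along the following lines. The function and component names here are hypothetical placeholders, sketched under the assumption of an off-the-shelf speech recognizer and a pre-trained scikit-learn-style classifier; they are not Winterlight’s actual implementation.

```python
# Schematic sketch of the prototype flow described above (assumed structure only):
# record an image description -> transcribe it -> extract features -> classify.

def assess_speech_sample(audio_path, asr, featurizer, classifier):
    """Estimate the probability that one recorded image description shows signs of dementia.

    audio_path -- path to the one-to-five-minute recording
    asr        -- a speech recognition component with a transcribe() method (hypothetical interface)
    featurizer -- maps (audio, transcript) to a numeric vector of acoustic, lexical,
                  syntactic and semantic features
    classifier -- a trained binary classifier exposing predict_proba(), e.g. from scikit-learn
    """
    transcript = asr.transcribe(audio_path)             # automatic speech recognition
    features = featurizer(audio_path, transcript)       # feature extraction
    return classifier.predict_proba([features])[0][1]   # probability of the "dementia" class
```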

In an interview with Faculty of Medicine writer Carolyn Morris, Rudzicz explained how the technology is able to detect cognitive impairment through observations of diction and syntax.

“When it comes to word choice and grammar, for example, one of the signs of cognitive impairment would be the use of simple verbs versus gerunds — so “the kid runs” instead of “there’s a kid running.” People with early signs of dementia will often use pronouns instead of more specific nouns. So “she is washing dishes” instead of “The mother is washing dishes.” Then there’s the interpretation of the images. So, for example, in what we call the “cookie theft” image there’s a woman in a kitchen, an overflowing sink and kids reaching up to steal cookies. Someone with cognitive impairment might notice “a kid on a stool,” but not take that next step to point out that “the son is trying to steal cookies.” Or they’ll comment on the overflowing sink, but not on the woman failing to notice it.”
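
To show how word-choice cues like these can become measurable features, here is a toy Python example that counts the rate of pronouns and of “-ing” verb forms in a transcript. The short pronoun list and the crude suffix heuristic are simplifying assumptions for illustration; they are not the features Winterlight actually computes.

```python
# Toy illustration (not the project's feature set): turning word-choice cues such
# as pronoun use and "-ing" verb forms into simple per-word rates.
import re

PRONOUNS = {"he", "she", "it", "they", "him", "her", "them", "his", "hers", "their"}

def word_choice_features(transcript: str) -> dict:
    """Return crude lexical rates for one transcript."""
    words = re.findall(r"[a-z']+", transcript.lower())
    n = len(words) or 1  # avoid division by zero on empty input
    return {
        "pronoun_rate": sum(w in PRONOUNS for w in words) / n,
        "ing_form_rate": sum(w.endswith("ing") for w in words) / n,
    }

print(word_choice_features("She is washing dishes."))
print(word_choice_features("The mother is washing the dishes while the son is stealing cookies."))
```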

“Our technology has a high level of accuracy and the software is applicable not only to screening for dementia but also to early detection and monitoring of the disease over time,” says Rudzicz. “It’s truly inspiring to be part of such an innovative project with such potential to improve the future of health care.”

Neuropsychological testing for dementia can be administered only once every six months because people are able to learn the test and get better at it over time. Winterlight uses different images as part of its testing, diminishing the chance of a learning effect. 

“Being able to get such rich information with such a short, easy and cost-effective test is very exciting for us,” says Fraser. “I find it motivating to work on an issue that affects so many people. Every time I go to a conference or event where I present my work, I always meet people who have family that has been affected by dementia. It’s a terrible condition.”  

Winterlight Labs receives support from three University of Toronto accelerator hubs: one at the Rotman School of Management, the Faculty of Medicine’s Health Innovation Hub (H2i) and the Department of Computer Science Innovation Lab (DCSIL).

The team is starting pilot studies involving seniors living in Toronto retirement homes this summer. The plan is to obtain more data to further extend their research findings and, in the near future, to deploy the software in medical care settings. 

“We are not necessarily limiting ourselves to just dementia,” says Fraser. “But our long-term goal is to be able to use this technology on other mental and cognitive health issues such as aphasia, depression, and developmental disorders like autism. Our goal is to monitor cognition through speech.”  
