Does my cough sound like COVID? There could be an app for that
It may be possible to detect whether someone has COVID-19 simply from the sound of their cough.
That’s the conclusion from tests of an artificial intelligence (AI) algorithm developed at the Massachusetts Institute of Technology (MIT), which was able to detect around 98% of COVID-19 cases, confirmed by coronavirus testing, from forced coughs recorded over a cell phone.
Strikingly, the neural network was also 100% effective in identifying COVID-19 in people who had no symptoms but had tested positive for the virus, according to the MIT researchers, although the trade-off was a false-positive rate of around 17% in this group.
The MIT Open Voice algorithm was put through its paces in more than 5,300 patients, finding a 97.1% accuracy rate overall, with 98.5% sensitivity and 94.2% specificity.
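For context, all three figures come straight from the test set’s confusion matrix. The short Python snippet below is purely illustrative, using made-up counts rather than the study’s data, to show how accuracy, sensitivity and specificity are calculated.

```python
# Illustrative only: how accuracy, sensitivity and specificity are computed.
# The counts below are hypothetical, not the study's actual confusion matrix.
def screening_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),   # correct calls / all cases
        "sensitivity": tp / (tp + fn),                 # share of true cases caught
        "specificity": tn / (tn + fp),                 # share of negatives correctly cleared
    }

# Hypothetical example: 1,000 positive and 1,000 negative subjects.
print(screening_metrics(tp=985, fn=15, tn=942, fp=58))
# -> {'accuracy': 0.9635, 'sensitivity': 0.985, 'specificity': 0.942}
```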
The finding ties in with anecdotal reports that COVID-19 causes a very distinctive-sounding cough, although the algorithm will have to be thoroughly tested in additional studies to establish whether it could be useful as a screening tool.
If its value is confirmed, however, it could provide a way to reduce the logistical burden and expense of coronavirus testing on healthcare systems around the world, according to the scientists, who have published the work in the IEEE Open Journal of Engineering in Medicine and Biology.
“The effective implementation of this group diagnostic tool could diminish the spread of the pandemic if everyone uses it before going to a classroom, a factory, or a restaurant,” says co-author Brian Subirana, a research scientist in MIT’s Auto-ID Laboratory.
The tool was built up from databases of sounds generated by human vocal cords, starting with simple words and sounds, then adding in variations for different emotional states and neurological conditions like Alzheimer’s.
The final stage was to develop a database of cough sounds that could pick up changes in lung and respiratory performance. All the components were then layered together alongside an algorithm to detect muscular degradation by distinguishing strong coughs from weaker ones.
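As a rough illustration of how this kind of cough-audio classifier is typically structured, the sketch below converts a recording into spectral features (MFCCs) and passes them through a small neural network that outputs a probability. It is a minimal hypothetical example in Python, not MIT’s actual model; the feature settings, file name and network layout are all assumptions.

```python
# Hypothetical sketch of a cough-audio classification pipeline (not MIT's model).
# Assumes librosa and PyTorch are installed; all parameters are illustrative.
import librosa
import torch
import torch.nn as nn

def cough_to_features(path: str, sr: int = 16000, n_mfcc: int = 40) -> torch.Tensor:
    """Load a cough recording and convert it to a fixed-size MFCC 'image' for a CNN."""
    audio, _ = librosa.load(path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=n_mfcc)   # shape: (n_mfcc, frames)
    mfcc = librosa.util.fix_length(mfcc, size=100, axis=1)       # pad/trim to 100 frames
    return torch.tensor(mfcc, dtype=torch.float32).unsqueeze(0)  # shape: (1, n_mfcc, 100)

class CoughClassifier(nn.Module):
    """Small CNN that outputs a probability that a cough is COVID-like."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 10 * 25, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.net(x))

# Example usage with a hypothetical recording path:
# features = cough_to_features("cough.wav").unsqueeze(0)  # add batch dimension
# prob = CoughClassifier()(features).item()
# print(f"Estimated probability of a COVID-like cough: {prob:.2f}")
```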
The tool was originally designed to diagnose early-stage Alzheimer’s, but Subirana and colleagues decided to see if it could be repurposed for COVID-19 as the pandemic started to gather pace earlier this year.
“The sounds of talking and coughing are both influenced by the vocal cords and surrounding organs. This means that when you talk, part of your talking is like coughing, and vice versa,” according to Subirana.
“It also means that things we easily derive from fluent speech, AI can pick up simply from coughs, including things like the person’s gender, mother tongue, or even emotional state.”
In future, the tool could be refined for different age groups and regions of the world to improve its accuracy even further, according to the research team.
So far, the researchers have collected more than 70,000 cough recordings, including around 2,500 submitted by people confirmed to have COVID-19.
They are working with an undisclosed company to develop a free pre-screening app based on their AI model, and have agreements with hospitals around the world to collect further cough recordings to train the model and strengthen its accuracy.
Other groups, at Cambridge University, Carnegie Mellon University and the UK health start-up Novoic, have been working on similar projects, according to a BBC report, although some of these are said to be having teething troubles.