AI on par with dermatologists at diagnosing skin cancer
Artificial intelligence is once again proving its value in assisting human doctors with diagnosing disease, this time skin cancer.
The study, carried out by researchers at Stanford University, compared a type of artificial intelligence called a convolutional neural network (CNN) with professional dermatologists on the ability to distinguish between benign and malignant skin lesions.
CNNs are loosely inspired by vision in animals, which combines inputs from a number of ‘neuron bundles’. Each bundle responds to a specific part of an image that slightly overlaps with the portions around it.
The network then combines these overlapping segments into a single representation that the programme can analyse to decide what the image shows. Google is known to use CNNs for the image search capability of its search engine.
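To make the idea concrete, the sketch below shows a minimal CNN image classifier in PyTorch. It is purely illustrative and is not the architecture used in the Stanford study: the layer sizes, image resolution and two-class benign/malignant output are all assumptions for the example.

```python
# Minimal sketch of a CNN image classifier (illustrative only, not the
# network from the Stanford paper).
import torch
import torch.nn as nn

class TinyLesionCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Each convolution slides a small filter over overlapping image
        # patches, mirroring the overlapping 'neuron bundles' described above.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),               # 224 -> 112
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),               # 112 -> 56
        )
        # The pooled feature maps are combined into one representation
        # that the classifier uses to decide what the image shows.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 56 * 56, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = TinyLesionCNN()
dummy = torch.randn(1, 3, 224, 224)        # one RGB image, 224x224 pixels
print(model(dummy).shape)                  # torch.Size([1, 2])
```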
The algorithm was first trained using 129,450 images of skin lesions spanning 2,032 different skin diseases.
Once trained, the algorithm was shown a set of new, unseen digital images of skin lesions whose diagnoses had previously been confirmed by biopsy. A panel of 21 dermatologists was given the same image set.
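The workflow amounts to fine-tuning an image classifier on labelled lesion photos and then scoring it on a held-out, biopsy-confirmed test set. The sketch below, in PyTorch/torchvision, illustrates that train-then-evaluate pattern; the article gives no training details, so the pretrained backbone (ResNet-18), folder paths and hyperparameters here are hypothetical.

```python
# Hedged sketch of the train-then-evaluate workflow described above.
# Backbone, paths and hyperparameters are assumptions, not paper details.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Hypothetical folders: one class per subdirectory.
train_set = datasets.ImageFolder("lesions/train", transform=preprocess)
test_set = datasets.ImageFolder("lesions/test_biopsy_confirmed", transform=preprocess)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)
test_loader = torch.utils.data.DataLoader(test_set, batch_size=32)

# Start from a network pretrained on generic images, then replace the
# final layer so it predicts the lesion classes present in the training set.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in train_loader:          # single pass, for illustration
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()

# Evaluate on previously unseen, biopsy-confirmed images.
model.eval()
correct = total = 0
with torch.no_grad():
    for images, labels in test_loader:
        preds = model(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.numel()
print(f"accuracy on held-out images: {correct / total:.3f}")
```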
Researchers included images of two distinct types of cancer alongside their benign counterparts: keratinocyte carcinomas (the most common form of skin cancer) versus benign seborrheic keratoses, and malignant melanomas (the deadliest form of skin cancer) versus benign nevi (moles).
The team found that the algorithm distinguished malignant from benign lesions in both sets of images, differentiating the most common and most deadly forms of skin cancer from their benign counterparts just as well as the 21 dermatologists.
Susan Swetter, professor of dermatology at the Stanford Cancer Institute and co-author of the paper, said the technology could greatly assist dermatologists in improving diagnoses.
The authors suggest that if the algorithm could be adapted for use in a smartphone app, it would offer a cost-effective route to earlier diagnosis of skin cancer.
[Image: Co-lead author Andre Esteva]
“My main Eureka moment was when I realised just how ubiquitous smartphones will be,” said Andre Esteva, co-lead author of the paper. “Everyone will have a supercomputer in their pockets with a number of sensors in it, including a camera. What if we could use it to visually screen for skin cancer? Or other ailments?”
These latest findings add to the growing list of areas in which artificial intelligence could be applied. In image analysis, companies such as Zebra Medical continue to develop algorithms for diagnosing various diseases, whilst health bodies like the NHS are deploying AI chatbots for non-emergency services.