
AI eggheads: Our cancer-spotting code rivals dermatologists

Next step: Get it working on mobile phones

By Katyanna Quach, 25 Jan 2017

An algorithm that promises to diagnose skin cancer as accurately as dermatologists may one day work with mobile phone cameras, according to a paper published in Nature.

The recent obsession with machine learning and AI in the tech world has boosted the ability of computers to analyze streams of data and classify images. Skin cancer is the latest disease to be identified through software.

Last year, Google Brain claimed its algorithm was as good as ophthalmologists at detecting diabetic retinopathy from retinal scans. Its sister AI company, DeepMind, is also working to tell apart cancerous from healthy tissue for oral cancers.

Skin cancer is usually diagnosed visually before a biopsy. Doctors look for signs of patchy, abnormal skin growth through a dermatoscope – a job that can also be performed by machines.

The new code is based on Google's Inception v3, a deep-learning model built on TensorFlow and pretrained on 1.28 million images from the ImageNet dataset to differentiate between everyday objects – pictures of cats and dogs, say. Now, researchers from Stanford University and the Veterans Affairs Palo Alto Health Care System, both in California, have tweaked it to turn its attention to skin cancer.

A database of 129,450 images containing 2,032 different skin diseases was used for training. Each image was processed as raw pixels associated with a disease label.
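The approach described above is classic transfer learning: keep a pretrained network's feature extractor and retrain its final classification layer on the new labels. The paper fine-tunes Inception v3 in TensorFlow; the sketch below is merely illustrative, using a tiny numpy softmax "head" trained on made-up feature vectors standing in for the backbone's output. All sizes, names, and data here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 8   # stand-in for Inception v3's 2048-dim feature vector
N_CLASSES = 3    # stand-in for the skin-disease label taxonomy

# Fake "extracted features" and disease labels (labels come from a
# hidden linear map, so a softmax head can in principle fit them)
X = rng.normal(size=(60, N_FEATURES))
true_W = rng.normal(size=(N_FEATURES, N_CLASSES))
y = np.argmax(X @ true_W, axis=1)

# New classification head: a single softmax layer, trained from scratch
W = np.zeros((N_FEATURES, N_CLASSES))
b = np.zeros(N_CLASSES)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Plain gradient descent on the cross-entropy loss
for _ in range(300):
    probs = softmax(X @ W + b)
    onehot = np.eye(N_CLASSES)[y]
    grad = probs - onehot                 # d(loss)/d(logits)
    W -= 0.1 * (X.T @ grad) / len(X)
    b -= 0.1 * grad.mean(axis=0)

accuracy = (np.argmax(softmax(X @ W + b), axis=1) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

The backbone stays fixed in this sketch; the Stanford team instead fine-tuned the whole network end to end, which the same gradient-descent idea extends to.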

"There's no huge dataset of skin cancer that we can just train our algorithms on, so we had to make our own," said Brett Kuprel, co-lead author of the paper and a graduate student at Stanford. "We gathered images from the internet and worked with the medical school to create a nice taxonomy out of data that was very messy – the labels alone were in several languages, including German, Arabic and Latin."

After translating the labels into English, the team tested the algorithm against 21 dermatologists. Each person was asked whether they would refer a patient for a biopsy, for a treatment, or reassure them that their skin lesion wasn't cancerous, based on images alone.

Success was measured by comparing how well the dermatologists and the algorithm could each distinguish cancerous from non-cancerous skin.

Tested on three categories of skin disease, the software matched the dermatologists' answers, achieving up to 91 per cent accuracy.
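A head-to-head comparison like this is typically scored on sensitivity (how many cancers are correctly flagged for biopsy) and specificity (how many benign lesions are correctly cleared), not accuracy alone. The sketch below shows how such decisions might be scored; the decisions and labels are invented, not the study's data.

```python
# 1 = malignant (refer for biopsy/treatment), 0 = benign (reassure).
# Both lists are made up purely to illustrate the scoring.
truth     = [1, 1, 1, 0, 0, 0, 1, 0, 0, 1]
predicted = [1, 1, 0, 0, 0, 1, 1, 0, 0, 1]

tp = sum(1 for t, p in zip(truth, predicted) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(truth, predicted) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(truth, predicted) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(truth, predicted) if t == 1 and p == 0)

sensitivity = tp / (tp + fn)   # fraction of cancers correctly flagged
specificity = tn / (tn + fp)   # fraction of benign lesions correctly cleared
accuracy = (tp + tn) / len(truth)

print(sensitivity, specificity, accuracy)  # 0.8 0.8 0.8
```

For a screening tool, sensitivity matters most: a missed melanoma (a false negative) is far costlier than an unnecessary biopsy referral.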

Although the results are promising, "rigorous prospective validation of the algorithm is necessary before it can be implemented in clinical practice, by practitioners and patients alike," said Susan Swetter, coauthor of the paper and professor of dermatology at Stanford University.

Once the code is good enough for clinical use, the researchers hope to bring the technology to mobile phones. "Outfitted with deep neural networks, mobile devices can potentially extend the reach of dermatologists outside of the clinic. It is projected that 6.3 billion smartphone subscriptions will exist by the year 2021, and can therefore potentially provide low-cost universal access to vital diagnostic care," the paper [paywalled] concluded.

There are 5.4 million new cases of skin cancer (all types) in the United States every year, and early-stage detection has a massive impact on survival rates. The five-year survival rate for malignant melanoma is around 97 per cent during initial stages, but drops to approximately 14 per cent in its latest stages.

If the algorithm can be easily used on mobile phones, it would increase the likelihood that skin cancer is detected earlier. ®

The Register - Independent news and views for the tech community. Part of Situation Publishing