A new artificial intelligence system can spot the tell-tale signs of skin cancer as accurately as human doctors, researchers say, and the next step is to get the technology onto a smartphone so that anyone can run a self-diagnosis.
Once the system is refined further and made portable, it could give many more people the chance to be screened at minimal cost, and without having to wait for an appointment with a doctor to confirm the symptoms.
The Stanford University researchers behind the deep learning system say the key to its success is an algorithm that enables it to apply what it knows from its existing database of skin cancer samples to pictures it hasn't seen before.
"We made a very powerful machine learning algorithm that learns from data," says one of the team, Andre Esteva. "Instead of writing into computer code exactly what to look for, you let the algorithm figure it out."
To give the system its smarts, the researchers trained it using 129,450 close-up images of skin lesions covering more than 2,000 different diseases, providing a vast database of examples to learn from.
Next, the team borrowed an algorithm developed by Google to spot the difference between cats and dogs in images, and adapted it to distinguish between types of skin lesions instead.
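Reusing an algorithm trained on one image task for another is known as transfer learning: the pretrained network's layers are kept frozen, and only a small new classifier is trained on top for the new labels. The toy sketch below illustrates the idea under stated assumptions; the frozen random projection, the made-up data, and the refer/all-clear labels are hypothetical stand-ins, not the Stanford team's actual model or dataset.

```python
# Minimal sketch of transfer learning: a frozen "pretrained" feature
# extractor (simulated here by a fixed random projection + ReLU) whose
# weights are never updated, plus a small trainable classifier head.
# All shapes and data are illustrative, not from the study.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the pretrained network's layers; stays frozen throughout.
W_frozen = rng.normal(size=(64, 16))

def extract_features(images):
    # Plays the role of the reused convolutional layers.
    return np.maximum(images @ W_frozen, 0.0)  # ReLU

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy "lesion" data: 64-dim vectors with a planted linear signal.
n = 200
X = rng.normal(size=(n, 64))
true_w = rng.normal(size=64)
y = (X @ true_w > 0).astype(float)  # 1 = refer, 0 = all-clear

# Fine-tuning: gradient descent on the new head only.
feats = extract_features(X)
w_head = np.zeros(16)
b_head = 0.0
for _ in range(500):
    p = sigmoid(feats @ w_head + b_head)
    w_head -= 0.5 * (feats.T @ (p - y) / n)
    b_head -= 0.5 * np.mean(p - y)

preds = (sigmoid(feats @ w_head + b_head) > 0.5).astype(float)
train_accuracy = np.mean(preds == y)
print(f"training accuracy of the new head: {train_accuracy:.2f}")
```

Because only the small head is trained, far less labeled data is needed than training a deep network from scratch, which is why a classifier built for cats and dogs could be repurposed for skin images.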
They then put the system up against 21 qualified dermatologists, who were shown 376 images of skin lesions and asked to judge whether they would refer the patient for further analysis or give them the all-clear.
Across the board, the AI was able to match the success rate of the professionals.
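A head-to-head comparison of refer/all-clear calls like this is typically scored against a ground-truth diagnosis using sensitivity (the fraction of genuinely malignant cases flagged) and specificity (the fraction of benign cases correctly cleared). The figures below are made-up illustrations of the calculation, not results from the paper.

```python
# Scoring binary refer/all-clear decisions against ground truth.
# 1 = malignant / refer, 0 = benign / all-clear. Data is illustrative.

def sensitivity_specificity(truth, decisions):
    tp = sum(t == 1 and d == 1 for t, d in zip(truth, decisions))
    fn = sum(t == 1 and d == 0 for t, d in zip(truth, decisions))
    tn = sum(t == 0 and d == 0 for t, d in zip(truth, decisions))
    fp = sum(t == 0 and d == 1 for t, d in zip(truth, decisions))
    return tp / (tp + fn), tn / (tn + fp)

truth     = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
decisions = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]  # one miss, one false alarm
sens, spec = sensitivity_specificity(truth, decisions)
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
```

Computing the same two numbers for the algorithm and for each dermatologist on the same image set is what allows the "matched the professionals" comparison to be made.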
Read the paper: https://goo.gl/hZpS11