The last few years have seen a number of fascinating case studies in which machine learning has been used to produce medical diagnoses. Two new examples come from a corporate giant and a nimbler startup respectively.
On the giant side of the fence is a traditional player in this space. Researchers at Google have developed an algorithm for scanning our eyes in order to better spot a particularly common form of blindness.
The project follows a familiar path: the algorithm is trained on a large set of retinal images to look for diabetic retinopathy, a condition believed to affect around a third of diabetes patients.
The condition is caused by damage to blood vessels in the retina, which causes the sufferer's vision to deteriorate. As with so many conditions, early detection means it can be successfully treated.
The team hope that by deploying machine learning, they can improve both the accuracy and objectivity of diagnoses, and therefore the quality of eye care.
Google have form in this area, as they are already working with Moorfields Eye Hospital in London to utilize AI in the detection of various eye diseases.
The approach is interesting because it deviates from traditional machine learning, whereby algorithms are trained on specifically labelled images that tell them what is healthy and what is not. Instead, Google's algorithm figured this out for itself.
When the algorithm was put through its paces on over 12,000 images, it performed admirably against leading experts, both in identifying the condition and in correctly grading its severity.
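To give a flavour of the grading task, here is a minimal sketch of a supervised severity grader. Diabetic retinopathy is graded on a scale from 0 (none) to 4 (proliferative); everything else here is an assumption for illustration. The features and data are synthetic placeholders, and the model is a simple softmax classifier rather than the deep neural network Google's researchers actually used.

```python
import numpy as np

# Hypothetical sketch: a softmax classifier that grades retinopathy
# severity (0 = none ... 4 = proliferative) from image-derived feature
# vectors. Real systems train deep CNNs on full retinal photographs;
# the features and data below are synthetic placeholders.

rng = np.random.default_rng(0)
N_CLASSES, N_FEATURES = 5, 16

def make_data(n_per_class):
    """Synthetic data: each severity grade clusters around its own mean."""
    X, y = [], []
    for grade_label in range(N_CLASSES):
        centre = np.zeros(N_FEATURES)
        centre[grade_label * 3:grade_label * 3 + 3] = 2.0  # grade-specific signal
        X.append(rng.normal(centre, 0.5, size=(n_per_class, N_FEATURES)))
        y.append(np.full(n_per_class, grade_label))
    return np.vstack(X), np.concatenate(y)

X_train, y_train = make_data(200)

# One-layer softmax regression trained by plain gradient descent.
W = np.zeros((N_FEATURES, N_CLASSES))
b = np.zeros(N_CLASSES)
onehot = np.eye(N_CLASSES)[y_train]
for _ in range(300):
    logits = X_train @ W + b
    logits -= logits.max(axis=1, keepdims=True)       # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    grad = (probs - onehot) / len(X_train)             # cross-entropy gradient
    W -= 0.5 * (X_train.T @ grad)
    b -= 0.5 * grad.sum(axis=0)

def grade(features):
    """Return the predicted severity grade for one feature vector."""
    return int(np.argmax(features @ W + b))
```

The point of the sketch is the shape of the problem: labelled examples in, a discrete severity grade out, which is exactly what the algorithm was benchmarked on against the experts.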
The researchers are now using the algorithm in a clinical setting in partnership with the famous Aravind Eye Hospital in India, although the results from this trial are not yet known.
Understanding bone density
The second project, from a startup in Israel, looks at the use of machine learning to support patients with osteoporosis. They utilize CT data to better identify patients for bone density screening, thus providing earlier identification of possible issues.
Estimates suggest that around a third of women and a fifth of men over the age of 50 will suffer some form of osteoporotic fracture in their lifetime, with hip fractures the most common instance. The impact can be severe for both quality and length of life, with fewer than a third of patients rehabilitating successfully.
As such, it's a significant burden on the healthcare system, with estimates suggesting it costs $18 billion in the United States alone. In a bid to improve the diagnosis of the condition, a team from Zebra Medical Vision and the Clalit Research Institute set out to develop an algorithm capable of calculating bone density simply from looking at CT scans that are often produced for other purposes. In other words, patients can be tested for osteoporosis risk without having to undergo a specific procedure for it.
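To illustrate why existing CT scans are useful here: CT voxels are stored in Hounsfield units (HU), and the mean HU of trabecular bone in a vertebra correlates with bone mineral density. The sketch below is an assumption-laden illustration of that opportunistic-screening idea, not Zebra Medical Vision's actual method; the threshold is a research heuristic chosen for the example, not clinical guidance.

```python
import numpy as np

# Illustrative sketch (NOT the startup's actual algorithm): average the
# Hounsfield units inside a vertebral region of interest on an existing
# CT slice, and flag low values for a dedicated bone-density (DXA) scan.
LOW_DENSITY_HU = 110  # assumed screening threshold for this sketch

def mean_hu(ct_slice, roi):
    """Mean Hounsfield value inside a boolean region-of-interest mask."""
    return float(ct_slice[roi].mean())

def flag_for_screening(ct_slice, roi, threshold=LOW_DENSITY_HU):
    """True if the ROI's mean HU suggests referring the patient
    for a dedicated bone-density scan."""
    return mean_hu(ct_slice, roi) < threshold

# Synthetic example: a 2-D "slice" with a square vertebral ROI.
slice_ = np.full((64, 64), 40.0)   # soft-tissue background (~40 HU)
roi = np.zeros((64, 64), dtype=bool)
roi[20:40, 20:40] = True
slice_[roi] = 90.0                 # low-density trabecular bone in the ROI
```

The appeal of this approach, as the researchers note, is that the scan already exists: the screen is a by-product of imaging done for other reasons.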
"Since it is common to find existing CT scans for the relevant age groups inside the provider's system, it is a more efficient screening process, enabling the identification of patients at risk without additional expenditures and time investment," the researchers say. "Combining the data obtained from this tool together with demographic and medical data at the institute, will enable the efficient prediction of high fracture risk. This is an excellent example of harnessing analytic capabilities and big-data techniques for the advancement of public health."
They believe that their solution will be increasingly crucial as the population ages and healthcare systems struggle to cope with the rising demands of this ageing population.
The service currently allows patients to upload a couple of CT scans to the site for free, with plans afoot to cater for a broader range of scans in due course.
It's further evidence of the power of machine learning to improve healthcare, provided practitioners have access to the right levels of data both to train their algorithms on and to derive their insights from. That, I suspect, will be the next big challenge for the industry, and it is perhaps no surprise that Barack Obama described the unification of patient data as the next big moonshot for the United States.