Can Images of the Eye Predict Indicators of Heart Disease?
Submitted by Elman Retina Group on October 18, 2019
If you’re thinking about skipping your next eye exam, you might want to reconsider. According to recent research from Google’s Artificial Intelligence unit, an inspection of the back of the eye can predict indicators for heart attack and other cardiovascular disease.
The team at Elman Retina Group breaks down what you need to know about the research here.
Understanding Google’s Research
Cardiovascular disease is a tremendous public health concern, and assessing someone’s risk of the disease is the first step toward reducing the chances that he or she suffers a cardiovascular event in the future.
Most cardiovascular risk calculators use a combination of variables like age, gender, smoking status, blood pressure, Body Mass Index (BMI) and glucose and cholesterol levels to identify patients at risk of a cardiovascular event or cardiac-related mortality.
Doctors can ask a patient about some of these factors (e.g., age or smoking status), but others, like cholesterol levels, require a blood draw.
A team from Google and its health-tech subsidiary Verily Life Sciences discovered that a computer algorithm can predict a person’s risk for cardiovascular disease based on retinal images.
In their results, which were published in Nature Biomedical Engineering, the researchers reported that they used deep learning algorithms trained on data from over 284,000 patients to predict a person’s risk for cardiovascular disease.
The algorithm can’t diagnose cardiovascular disease, but it can identify risk factors associated with it, such as high blood pressure and elevated cholesterol levels. The Google team said their algorithm is designed to discern which patients are likely to suffer a cardiac event in the next five years.
Retinal images can be obtained quickly, cheaply and non-invasively in an outpatient setting (and without drawing blood).
The Implications of Google’s Findings
“This discovery is particularly exciting because it suggests we might discover even more ways to diagnose health issues from retinal images,” said Lily Peng, Google Brain product manager.
Another reason the Google research is so exciting is that the team used attention techniques to examine how the algorithm works. In essence, the team opened up the “black box” of the algorithm to show which parts of the retinal image it relies on when making its predictions. They hope that doctors will have more confidence in the algorithm once they understand how it arrives at its conclusions.
It is unlikely this type of data-based algorithm will ever replace a traditional eye exam. A computer is no substitute for an eye doctor’s clinical judgment and understanding of an individual patient’s history. However, these algorithms are another tool that could help eye doctors improve their diagnostic skill and patient care.
For more information about the connection between ocular and overall health, please contact the Baltimore team at Elman Retina Group today.