A new computer algorithm marks a breakthrough in disease screening: by analyzing the color of a person's tongue, it can detect conditions such as diabetes and stroke with 98% accuracy.
The algorithm was jointly developed by Middle Technical University and the University of South Australia (UniSA). The conditions this algorithm can detect include diabetes, stroke, anemia, asthma, liver and gallbladder issues, COVID-19, and other vascular and gastrointestinal diseases.
“The color, shape, and thickness of the tongue can reveal a litany of health conditions,” said Ali Al-Naji, adjunct Associate Professor at MTU and UniSA.
“Typically, people with diabetes have a yellow tongue; cancer patients a purple tongue with a thick greasy coating; and acute stroke patients present with an unusually shaped red tongue,” he added.
The breakthrough was achieved through a series of experiments using 5,260 images to train machine-learning algorithms to detect tongue color.
To test the trained model, researchers received 60 tongue images from two teaching hospitals in the Middle East, representing patients with diverse health conditions. The AI model matched tongue color with the correct disease in nearly all cases.
The paper, published in Technologies, describes how the system analyzes tongue color to provide real-time diagnoses, demonstrating that AI can significantly advance medical practice.
Al-Naji explained that AI is replicating a 2,000-year-old technique from traditional Chinese medicine, where the tongue’s color, shape, and thickness are used to diagnose health issues.
Beyond those examples, a white tongue can indicate anemia, severe COVID-19 cases are associated with a deep red tongue, and an indigo or violet tongue suggests vascular or gastrointestinal problems or asthma.
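The color-to-condition associations described above can be sketched as a simple lookup. The following Python snippet is purely illustrative: the hue thresholds and helper names are hypothetical assumptions, not the study's trained model, which learned its classification from 5,260 labelled images rather than fixed rules.

```python
import colorsys

# Illustrative colour-to-condition associations reported in the article.
# This table and the hue thresholds below are hypothetical, not the study's model.
COLOR_TO_CONDITION = {
    "yellow": "diabetes",
    "purple": "cancer",
    "red": "stroke or severe COVID-19",
    "white": "anemia",
    "indigo": "vascular/gastrointestinal issues or asthma",
}

def dominant_color_name(r, g, b):
    """Map an average RGB value (0-255 per channel) to a coarse colour name.

    Uses hue/lightness/saturation with hand-picked cutoffs; a real system
    would learn these boundaries from labelled tongue images.
    """
    h, l, s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    if s < 0.15 and l > 0.7:          # pale, washed-out pixels
        return "white"
    deg = h * 360                     # hue in degrees
    if deg < 20 or deg >= 340:
        return "red"
    if 40 <= deg < 70:
        return "yellow"
    if 220 <= deg < 275:
        return "indigo"
    if 275 <= deg < 340:
        return "purple"
    return "unclassified"

def screen(rgb):
    """Return (colour name, associated condition) for an average tongue colour."""
    name = dominant_color_name(*rgb)
    return name, COLOR_TO_CONDITION.get(name, "no match")
```

For instance, `screen((220, 200, 60))` classifies the sample as yellow and returns the diabetes association, while a pale sample such as `(230, 230, 235)` maps to white and anemia.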
The study used cameras placed 20 centimeters from a patient to capture tongue color, and the imaging system predicted health conditions in real-time.
Co-author UniSA Professor Javaan Chahl noted that this technology could eventually be adapted for use with smartphones, making disease screening more accessible.