
Wednesday, 17 May 2017

The role of AI in imaging for healthcare

Source: NVIDIA blog post. Technology has advanced to the point where a medical image like this can be interpreted with deep learning techniques, yielding more accurate conclusions than conventional computer algorithms can reach alone.

Artificial intelligence (AI) is not going to replace doctors, says Daniel Rubin, an associate professor of biomedical data science, radiology, medicine and ophthalmology at Stanford University, at least not yet.

Rubin explained at the GPU Technology Conference that deep learning trained applications are currently interpreting medical images, assessing how disease presents in patients, and monitoring patient responses to treatments, reducing diagnosis error rates and improving clinical decision making. His research focuses on developing applications to improve diagnostic accuracy and clinical effectiveness, by using information in images, combined with clinical and molecular data, to characterise disease more accurately.

“Physicians are caregivers, not computers. They don’t need to be replaced; what they need is help in taking care of patients,” said Rubin, who is director of biomedical informatics at the Stanford Cancer Institute and director of the Scholarly Concentration in Informatics at the Stanford School of Medicine, as quoted in an NVIDIA blog post.

Rubin said that decisions made in medical imaging can rely on subtle cues, such as whether the edges of a tumour are straight or faceted, or whether its shape is contained as opposed to having anemone-like fronds growing from it. During the review process, multiple images must be taken using a variety of methods, from different angles, and over a period of time. The different perspectives help doctors decide whether a disease is responding to treatment, he explained.

Deep learning models, built on convolutional neural networks, are being used to detect breast masses in mammograms, identify early signs of diabetic retinopathy in ophthalmology, and look for signs of cancer during bladder exams.
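The core operation these networks apply to a scan is the convolution: sliding a small learned filter over the image to produce a feature map that highlights patterns such as edges or masses. The sketch below is purely illustrative, not from Rubin's work; it implements a valid-mode 2D convolution (technically cross-correlation, as in most deep learning frameworks) in plain Python, with a hand-picked edge filter standing in for the filters a real network would learn from training data.

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution of a 2D list `image` with `kernel`.
    Each output value is the sum of elementwise products of the kernel
    with the image patch under it."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

def relu(feature_map):
    """Nonlinearity applied after each convolution in a CNN layer."""
    return [[max(0.0, v) for v in row] for row in feature_map]

# A toy 4x4 "scan" with a vertical boundary, and a 2x2 filter that
# responds to left-to-right intensity increases (an edge detector).
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1],
          [-1, 1]]
features = relu(conv2d(image, kernel))
# The feature map peaks where the edge sits:
# [[0.0, 2.0, 0.0], [0.0, 2.0, 0.0], [0.0, 2.0, 0.0]]
```

A production system stacks many such layers, with filter weights learned from labelled images rather than written by hand, typically in a GPU-accelerated framework such as TensorFlow or PyTorch.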

AI-based image diagnosis is now outperforming state-of-the-art computerised diagnosis algorithms, Rubin said, making doctors more confident that they can rely on such information to guide their decision-making and arrive at positive patient outcomes.

Hashtag: #GTC17
