Doctors can look at X-ray images and analyse what they see. Thanks to their expertise, they can diagnose a patient’s condition simply by examining these images.
We are already at the stage where this type of expertise can be stored and automated. With a large number of already classified X-ray images, an artificial intelligence can be trained to diagnose diseases. Thanks to a significant scientific breakthrough called deep learning, there is no need to tell the AI what parts of an image led to the diagnosis; it can discover the diagnostic rules itself.
AI to solve healthcare labour shortage?
“Healthcare is one sector that will see a significant change thanks to artificial intelligence. Work in this field has traditionally required expensive knowledge, but now part of that knowledge can be automated”, says Professor of Practice Leo Kärkkäinen.
He believes that medical diagnoses by machines will be commonplace in the future. Automating repetitive and time-consuming work can help free up expert resources for more demanding tasks in a sector that is plagued by a labour shortage.
Kärkkäinen has participated in a research project on detecting subarachnoid haemorrhages. The arachnoid membrane separates the brain tissue from cavities in the brain. When a blood vessel in one of these cavities starts to leak, the patient may show none of the typical neurological symptoms of a brain haemorrhage apart from a severe headache. In most cases, patients are X-rayed, but a radiologist is not always present to detect possible leaks in the images. This is where a diagnosis made by an artificial intelligence could save a patient’s life.
“This is an AI application at its best. An artificial intelligence does not necessarily do things better than a human, but it can work faster and regardless of the time of day, which is perhaps its greatest advantage”, says Kärkkäinen.
Self-learning neural networks
Deep learning, or neural networks, is a machine learning method inspired by how we believe the human brain works. A neural network consists of a very large number of artificial nerve cells, or neurons, each of which specialises in performing a simple task on the input it receives, either from outside the network or from other neurons. Data moves up through the network’s layers of neurons as the system combines these simple operations, so each successive layer handles an increasingly complicated task.
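To make this concrete, the sketch below shows what such a stack of layers can look like in code. It is a minimal illustration only, not the model used in the research project described here; the library (PyTorch), the image size, the layer widths and the two diagnostic classes are all assumptions chosen for readability.

```python
import torch
import torch.nn as nn

# A stack of layers: each Linear layer is a bank of artificial neurons, each
# neuron computing a simple weighted sum of its inputs; ReLU adds the
# non-linearity that lets later layers combine earlier outputs into more
# complicated features.
model = nn.Sequential(
    nn.Flatten(),                 # turn a 2-D image into a flat vector of pixel values
    nn.Linear(64 * 64, 128),      # first layer: 128 neurons working on raw pixels
    nn.ReLU(),
    nn.Linear(128, 32),           # second layer: combines the first layer's outputs
    nn.ReLU(),
    nn.Linear(32, 2),             # final layer: one score per class, e.g. "haemorrhage" / "no haemorrhage"
)

# One hypothetical 64x64 grayscale image passes layer by layer through the network.
image = torch.rand(1, 1, 64, 64)
scores = model(image)
print(scores)                     # two raw scores, one per diagnostic class
```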
Image recognition is one application that uses deep learning. While traditional machine learning methods require very complex programmatic rules for identifying objects in images, a deep learning system can, with a sufficiently large number of already classified images as input, automatically adjust the operation of its neural network to improve detection accuracy. The system is therefore self-learning, and it can perform increasingly complex tasks as the amount and accuracy of the input data increase.
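The training step that makes such a system “self-learning” can be sketched in the same spirit. The random tensors below merely stand in for a real set of expert-classified X-ray images, and the small network, loss function and optimiser settings are assumptions for illustration, not the setup used in the project.

```python
import torch
import torch.nn as nn

# Placeholder data standing in for expert-classified images: 16 random 64x64
# "images" and a 0/1 label for each (purely illustrative).
images = torch.rand(16, 1, 64, 64)
labels = torch.randint(0, 2, (16,))

model = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 128), nn.ReLU(), nn.Linear(128, 2))
loss_fn = nn.CrossEntropyLoss()                           # how far the predictions are from the labels
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(10):                                    # a few training iterations
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)                 # compare predictions with the expert labels
    loss.backward()                                       # work out how each weight should change
    optimizer.step()                                      # adjust the weights: the "self-learning" step
```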
The university is collaborating with hospitals to get access to large amounts of classified data, such as X-ray images, in order to train deep learning systems properly.
“Aalto University is participating in several research projects where we collaborate with doctors to identify tools that could help healthcare professionals work faster and more effectively.”
This release was first published 28 February 2019 by Aalto University.