Kerala Cancer Crusade, Cancer Literacy Mission and the IRIA Preventive Radiology National Program conducted a webinar series on “Risk Stratification and Cancer Screening in the Digital Era” in association with IRIA Kerala, Swasthi Foundation, Community Oncology, Regional Cancer Centre and Journo Med. The eighth webinar in the series, held on 8th October 2022, discussed “AI in the prediction of breast cancer using mammogram”.
October is “National Breast Cancer Awareness Month”, a campaign introduced in 1985 by the American Academy of Family Physicians, the AstraZeneca Healthcare Foundation, CancerCare and other organizations. “Male Breast Cancer Awareness Week” is also observed in the third week of October. The annual campaign educates people about breast cancer: October is a month to raise awareness, promote education, and encourage screening and early detection of risk. Positive promotions and open conversations about cancer helped spread the word and explain the importance of cancer screening and diagnosis.
Dr Keshav Das, Editor-in-Chief of the Indian Journal of Radiology and Imaging and Deputy Director of Sree Chitra Tirunal Institute for Medical Sciences and Technology, spoke about the IRIA Preventive Radiology National Program, a program run in association with IRIA Kerala, Swasthi Foundation and Community Oncology, Regional Cancer Centre as part of the Kerala Cancer Crusade and Cancer Literacy Mission. Dr Rijo and his team have taken up a noble cause in preventive radiology. In 1995, mammography had only just been introduced; much of the research then focused on Computer Aided Detection (CAD), while Artificial Intelligence (AI) was not yet in wide use. CAD and a few related technologies gained their prominence later. In those early days the emphasis was on picking up early breast carcinoma lesions; organized screening had not yet started, but techniques for detecting such lesions were emerging. Screening has since evolved with new AI techniques, lesion pickup has improved drastically, and the main goal is not to miss a lesion. The topic is therefore very apt: screening now produces a large number of images that are difficult for a radiologist alone to review. Using AI, the presence of lesions can be detected and the lesions studied further by radiologists; Dr Wong discusses lesion detection in detail. Before diving deeply into the scientific session, a couple of points must be noted. The first is the evolution from computer-aided detection to computer-aided diagnosis, where AI tries to reproduce the intuitive thinking a radiologist applies when viewing an image.
The second focus is on screening modalities such as chest X-rays and mammograms, because these are high-volume examinations, often performed where access to tertiary care and advanced technology is unavailable, and AI is hoped to bridge that gap. At last count the FDA had approved 523 AI solutions in medicine, of which 390 are in radiology, so a great deal of work is happening on the ground around FDA-approved algorithms.
Dr Wong Lai Kaun, Associate Professor, Faculty of Computing and Informatics at Multimedia University, Malaysia, spoke on the topic ‘AI in the prediction of breast cancer using mammogram’. First, some background on CureMetrix. CureMetrix is described as the first FDA-cleared workflow optimization tool for triaging cases, helping to reduce the workload of radiologists. Its advanced algorithms help radiologists evaluate mammograms with greater accuracy. When CureMetrix AI scans a woman’s mammogram, it does so with intelligence gathered from millions of mammogram images from women around the world. It adds an extra layer of precision, flagging suspicious cases and reducing reading time for a streamlined workflow, helping radiologists focus on abnormalities that could be missed with older technologies. It then scores the findings, giving the doctor additional intelligence and potentially eliminating unnecessary biopsies, which increase costs and patient anxiety. It can also assess breast tissue density and automatically provide a density classification, and it can estimate a woman’s risk of cardiac disease from the same mammogram and score her risk level so she can track her health over time. At the end of the day, doctors prefer AI technology and patients deserve positive outcomes. AI now leads in the healthcare sector; everyone wants the extra layer of confidence that AI provides as a support to doctors. CureMetrix’s AI-based technology gives more control through faster and more accurate readings.
The Technology behind AI Mammography
AI stands for artificial intelligence, and the specific technology behind today’s popular AI is the artificial neural network. The relationship between the two is worth spelling out. Artificial Intelligence (AI) is an umbrella term; machine learning (ML) is one branch of AI; the artificial neural network is a kind of machine learning technique; and the more recently adopted technology that enabled the tremendous improvement in medical imaging is Deep Learning (DL). DL is built on deep neural networks, and in imaging the workhorse is the convolutional neural network (CNN), a type of artificial neural network. It is this artificial neural network technology that made AI so popular and brought such great improvement.
Artificial Neural Network Technology
Artificial Neural Network Technology is an information processing paradigm inspired by the biological nervous system, i.e. by how the brain processes information; it is a bio-inspired network. For example, consider how humans learn, or how a child learns the alphabet, pictures, names and so on. The human brain processes all of this, and with training over time humans learn; an artificial neural network tries to do the same. In other words, the artificial neural network tries to mimic how the human brain works so that it can be trained much as a child learns to recognize objects.
The human brain is commonly said to contain about 100 billion neurons. Neurons are the processing elements of the brain; each neuron has a soma (cell body), dendrites and an axon that pass information along. Analogously, an artificial neural network processes information the way a brain does when it learns a new skill, knowledge or technology. For example, in an AI system or algorithm, a set of cat and dog images can be supplied; just as the human brain can be trained to recognize them automatically, these images are passed through the artificial network, in this case more specifically a convolutional neural network (CNN). As the images pass through, the network learns the important features that make a cat a cat and a dog a dog. Once the system is trained, the next time a new, unseen cat or dog image is presented it will recognize it and tell us whether it is a cat or a dog. An AI system or CNN can also perform object detection and segmentation: it can detect that this is a human, this is an elephant, this is a cat and so on, and it can segment each object out quite accurately, so in addition to saying “this is a human” it can cut the human out of the image. This is the more advanced application of AI, detecting and segmenting objects. Like the brain, the network first learns a few low-level features analogous to what we perceive: edges, colours, orientations and so on. These low-level features combine into mid-level features corresponding to object parts, and finally, across the multiple layers of the network, high-level features emerge that resemble whole objects. In a face-recognition neural network, for example, the low-level features and the object parts together enable detection of the human face.
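The building block described above, a weighted sum of inputs passed through an activation function, can be sketched in a few lines. This is a minimal illustrative sketch, not any production network; the input values and weights below are made up for the example.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum of its inputs plus a bias,
    squashed through a sigmoid activation into the range (0, 1),
    loosely analogous to a biological neuron's firing rate."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation

# Illustrative values only: two input "features" and hand-picked weights.
output = neuron([0.5, 0.8], [0.4, -0.2], 0.1)
```

Training a network means adjusting the weights and biases of many such neurons, layer by layer, until the outputs match the labels (cat vs dog, lesion vs normal); a CNN additionally shares its weights across image locations so the same feature detector scans the whole image.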
Impact of Deep Learning: Image Classification
In 2010 a group of researchers from Stanford created a large dataset containing millions of images spanning 1000 classes of objects, and an associated challenge, sometimes called the “world cup of computer vision”, in which participants trained networks to recognize those 1000 classes. When the challenge started in 2010 and 2011, traditional computer-vision techniques dominated and the error rate was around 20%. Deep learning then reduced the error rate significantly, and in 2013 the trend changed decisively: participants using deep learning networks brought the error below 10%. In 2014 accuracy increased further, with error down to 4-5%, and today the accuracy of image classification is about 97-98%. The networks are that accurate at recognizing 1000 classes of images.
Impact of Deep Learning: Object Detection
For object detection, deep learning was introduced in 2012. Before that, accuracy measured as mean average precision was quite low, about 40%, but with the introduction of the convolutional neural network (CNN) accuracy rose, reaching about 80% by 2015.
Effect of AI on Mammography
The accuracy of object detection and recognition has played a significant role in medical imaging. To understand how AI mammography works, one needs to understand the research performed by medical experts in cancer diagnosis. Take the diagnosis of cancer from a mammogram image as an example: a large set of mammogram images can be used to train a neural network that eventually recognizes whether an image shows a cancerous lesion or a normal condition. The same approach lets a mammogram model discriminate images based on recognition of cancerous tissue.
AI in Mammography: Cancer Detection
AI is trained to recognize cancerous tissue in high-resolution images. It first identifies candidate mass locations at different scales, then applies multi-scale processing to detect the actual cancerous region. It can go one step further: given the image patch, it can segment the region and give an accurate identification of its extent, so the lesion’s location can be pinpointed. In this way AI can enhance cancer detection and localization in mammography as well.
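The candidate-localization step can be illustrated with a deliberately crude stand-in: slide a window over the image at more than one scale and flag windows whose mean intensity exceeds a threshold. Real systems use learned CNN features rather than raw intensity; the toy image, window sizes and threshold below are all invented for the sketch.

```python
def find_candidates(image, window, threshold):
    """Slide a square window over a 2D intensity grid and return the
    (row, col, window_size) of windows whose mean intensity exceeds the
    threshold -- a crude stand-in for candidate mass localization."""
    h, w = len(image), len(image[0])
    hits = []
    for r in range(h - window + 1):
        for c in range(w - window + 1):
            patch = [image[r + i][c + j]
                     for i in range(window) for j in range(window)]
            if sum(patch) / len(patch) > threshold:
                hits.append((r, c, window))
    return hits

# Toy 4x4 "image" with one bright 2x2 region in the lower right.
img = [
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]
# Scan at two scales, as a multi-scale detector would.
candidates = [hit for size in (2, 3) for hit in find_candidates(img, size, 5.0)]
```

A real pipeline would then crop each candidate patch and pass it to a second network for classification and segmentation, as described above.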
AI in Mammography: Risk Prediction
Early detection is very important, and it is one area where researchers are currently active. Compare mammograms across the breast density categories: fatty, scattered fibroglandular, heterogeneously dense and extremely dense. When breast density was classified once by radiologists and once by AI, the AI was found to provide better consistency. Human interpretation, by contrast, varies widely: it was found that 40% of US-certified breast imaging radiologists perform outside recommended ranges for acceptable false positive rates.
It is found that AI reaches 97% agreement with an expert radiologist, tested on more than 40,000 mammograms, which supports its use for risk prediction. So in risk prediction AI is doing quite well at recognizing mammograms of different densities, which may give an indication that helps with an early cancer diagnosis.
AI Mammography Functionality
According to the research, 34 out of 36 AI systems evaluated were less accurate than a single radiologist, and all were less accurate than the consensus of two or more radiologists. Another group of researchers found that combining the strengths of radiologists and AI for breast cancer screening is a better approach. In the existing pathway, two readings are obtained from two different radiologists and, when the readings differ between the two readers, the case is discussed at a consensus conference. The next pathway is the standalone AI pathway: one of the radiologists is replaced with AI, a threshold is set on the AI output to decide whether the result is suspicious, this is combined with the remaining radiologist’s reading, and disagreements still go to the consensus conference. Interestingly, the AI achieved a sensitivity of 84.2% and a specificity of 89.5% on internal test data, and a sensitivity of 84.6% and a specificity of 91.3% on external test data, but was less accurate than the average unaided radiologist.
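The standalone-AI pathway described above reduces to a small piece of decision logic: threshold the AI score into a recall/no-recall reading, accept agreements with the human reader, and send disagreements to consensus. This sketch uses invented function names and an invented toy consensus rule; the 0.5 threshold is illustrative, not the study’s operating point.

```python
def ai_read(score, threshold=0.5):
    """Binarize a model score into a recall (True) / no-recall (False) reading."""
    return score >= threshold

def screening_decision(radiologist_recall, ai_score, consensus):
    """Standalone-AI pathway (illustrative): one human reading plus one AI
    reading; agreements are final, disagreements go to a consensus function
    standing in for the consensus conference."""
    ai_recall = ai_read(ai_score)
    if radiologist_recall == ai_recall:
        return radiologist_recall
    return consensus(radiologist_recall, ai_recall)

# Toy consensus rule for the sketch: recall if either reader recalls.
decision = screening_decision(True, 0.2, consensus=lambda a, b: a or b)
```

In practice the consensus step is a human discussion, not a boolean rule, and the AI threshold is tuned on validation data to trade sensitivity against specificity.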
Another pathway is the decision-referral approach for integrating AI into the breast cancer screening pathway, whereby the algorithm decides on the basis of its quantification of uncertainty. Using this approach improved radiologist sensitivity by 2.6 percentage points and specificity by 1.0 percentage point, corresponding to a triaging performance of 63.0% on the external dataset; the AUROC was 0.982 on the subset of studies assessed by AI, surpassing radiologist performance.
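The decision-referral idea can likewise be sketched as routing logic: the model decides only when its score is confidently low or high, and everything in between is referred to the radiologist. The function name and the 0.2/0.8 confidence bands below are assumptions made for the example, not the thresholds used in the study.

```python
def decision_referral(ai_score, radiologist_read, low=0.2, high=0.8):
    """Decision-referral pathway (illustrative thresholds): the model
    decides confident cases itself; uncertain cases (low < score < high)
    are referred to the radiologist. Returns (decider, recall)."""
    if ai_score <= low:
        return ("ai", False)                    # confidently normal: no recall
    if ai_score >= high:
        return ("ai", True)                     # confidently suspicious: recall
    return ("radiologist", radiologist_read())  # uncertain: refer to the human

route, recall = decision_referral(0.05, radiologist_read=lambda: True)
```

Widening the uncertain band refers more cases to the radiologist (safer, less workload saved); narrowing it lets the AI decide more cases, which is the sensitivity/workload trade-off the triaging figure above quantifies.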
According to the research, it was concluded that doctors and AI should work together: the combination was 2.6% better at detecting breast cancer than a doctor alone and raised fewer false alarms. Radiologists will not be replaced by AI; rather, in the proposed AI-driven process nearly three-quarters of the screening studies did not need radiologist review, while overall accuracy improved.
AI + Radiologists
- Radiologists and AI systems produce significantly different breast-cancer screening assessments, revealing the potential value of combining human and AI methods in medical diagnosis.
- Every patient in the study continues to have their mammograms interpreted by a radiologist, but the AI flags and prioritizes patients for additional imaging, facilitating the flow of care.