Explainable Artificial Intelligence for Clinical Decision Support in Precision Healthcare Systems | IJCT Volume 10 – Issue 2 | IJCT-V10I2P27

International Journal of Computer Techniques
ISSN 2394-2231
Volume 10, Issue 2  |  Published: April 2023

Author

Md Mehedi Hassan Melon, Abdur Rahim, Md Sazzad Hossain Shehan, Md Fazlay Rabby, Md Mahidur Rahman, MD Monirul Islam, Nayem Miah, MD Mizanur Rahman, Md Azharul Islam, Md Sumon Rana

Abstract

The use of Artificial Intelligence (AI) in healthcare has become a growing trend, helping clinicians diagnose disease, forecast patient outcomes, and enhance treatment planning. However, many highly developed machine learning models operate as black boxes, making it hard to comprehend how a prediction is made. This lack of transparency can hinder the adoption of AI in clinical settings, where interpretability and trust are crucial. To address this difficulty, Explainable Artificial Intelligence (XAI) has emerged as an enticing solution that improves the transparency and readability of AI models. This study puts forward an explainable AI-based framework for clinical decision support in precision healthcare systems that predicts heart disease from clinical patient data. The dataset used in the study is the Heart Disease Cleveland dataset from the UCI Machine Learning Repository, which provides clinical features including age, sex, chest pain type, blood pressure, cholesterol, electrocardiographic findings, and maximum heart rate. Healthcare professionals routinely use these characteristics to assess cardiovascular health. In the proposed method, data cleaning, normalization, and feature selection are performed on the dataset to guarantee data quality and consistency. Machine learning algorithms are then trained to determine whether a patient is at risk of heart disease based on the given clinical characteristics.
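As a rough illustration of the preprocessing-then-classification workflow the abstract describes, the sketch below trains an SVM on normalized data. The synthetic feature values (age, resting blood pressure, cholesterol, maximum heart rate) and the label rule are stand-ins invented for this example, not the paper's actual data or model configuration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for clinical records: age, resting blood pressure,
# serum cholesterol, and maximum heart rate (all values hypothetical).
rng = np.random.default_rng(0)
n = 300
X = np.column_stack([
    rng.normal(54, 9, n),    # age (years)
    rng.normal(131, 17, n),  # resting blood pressure (mm Hg)
    rng.normal(246, 51, n),  # serum cholesterol (mg/dl)
    rng.normal(150, 23, n),  # maximum heart rate (bpm)
])
# Hypothetical risk label loosely tied to higher age and lower max heart rate.
y = ((X[:, 0] > 55) & (X[:, 3] < 150)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y)

# Normalization followed by an SVM classifier, mirroring the described steps.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```

Scaling matters here: SVMs are sensitive to feature magnitudes, so clinical features measured on very different scales (e.g., cholesterol in the hundreds vs. binary indicators) should be standardized before fitting.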

Keywords

Explainable Artificial Intelligence (XAI), Clinical Decision Support Systems, Precision Healthcare, Heart Disease Prediction, Machine Learning in Medical Care and Medical Data Analytics

Conclusion

In this study, an explainable artificial intelligence-based system for assisting clinical decision-making in precision healthcare systems by predicting heart disease with machine learning methods was presented. The study used the Heart Disease Cleveland dataset, which includes a variety of clinical characteristics relevant to cardiovascular health such as age, cholesterol level, chest pain type, blood pressure, electrocardiographic findings, and maximum heart rate during physical activity. Through well-organized data preprocessing, exploratory data analysis, and evaluation of the relationships between features, significant data patterns and clinical indicators of heart disease were identified. A predictive classification model based on the Support Vector Machine (SVM) algorithm was built to classify patients with and without heart disease according to their clinical characteristics. The model analysis revealed good predictive accuracy, as shown by the confusion matrix and the Receiver Operating Characteristic (ROC) curve. The ROC analysis yielded a high area under the curve (AUC) value, indicating strong discrimination ability: the model is capable of identifying cardiovascular risk in patients within the dataset. Beyond predictive accuracy, the study also highlighted the significance of explainable artificial intelligence to healthcare.
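The evaluation steps named in the conclusion (confusion matrix and ROC/AUC) can be sketched as follows. The labels and scores below are hypothetical values made up for illustration, not the study's actual model outputs.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

# Hypothetical true labels and predicted risk scores for ten patients.
y_true = np.array([0, 0, 0, 1, 1, 1, 0, 1, 1, 0])
y_score = np.array([0.1, 0.3, 0.2, 0.8, 0.7, 0.9, 0.4, 0.35, 0.85, 0.15])
y_pred = (y_score >= 0.5).astype(int)  # threshold scores into class labels

# Confusion matrix: rows are true classes, columns are predicted classes.
cm = confusion_matrix(y_true, y_pred)
# AUC measures discrimination over all thresholds, not just 0.5.
auc = roc_auc_score(y_true, y_score)

print("confusion matrix:\n", cm)
print(f"AUC: {auc:.2f}")
```

Note that the confusion matrix depends on the chosen decision threshold, while the AUC summarizes the model's ranking quality across every possible threshold, which is why both are commonly reported together.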


How to Cite This Paper

Md Mehedi Hassan Melon, Abdur Rahim, Md Sazzad Hossain Shehan, Md Fazlay Rabby, Md Mahidur Rahman, MD Monirul Islam, Nayem Miah, MD Mizanur Rahman, Md Azharul Islam, Md Sumon Rana (2023). Explainable Artificial Intelligence for Clinical Decision Support in Precision Healthcare Systems. International Journal of Computer Techniques, 10(2). ISSN: 2394-2231.

© 2026 International Journal of Computer Techniques (IJCT). All rights reserved.
