Open Access
Extraction of human understandable insight from machine learning model for diabetes prediction
Author(s) -
Tsehay Admassu Assegie,
Thulasi Karpagam,
Radha Mothukuri,
R. Lakshmi Tulasi,
Minychil Fentahun Engidaye
Publication year - 2022
Publication title -
Bulletin of Electrical Engineering and Informatics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.251
H-Index - 12
ISSN - 2302-9285
DOI - 10.11591/eei.v11i2.3391
Subject(s) - interpretability, machine learning, artificial intelligence, outcome (game theory), computer science, boosting (machine learning), predictive modelling, artificial neural network, mathematics, mathematical economics
Explaining the reason for a model's output as diabetes positive or negative is crucial for diabetes diagnosis, because reasoning about the predictive outcome helps to understand why the model assigned an instance to the positive or negative class. In recent years, high predictive accuracy and promising results have been achieved with models ranging from simple linear models to complex deep neural networks. However, complex models such as ensembles and deep neural networks involve a trade-off between accuracy and interpretability. In response to the problem of interpretability, different approaches have been proposed to explain the predictive outcome of complex models. However, it is not clear which of the proposed approaches is preferable for diabetes prediction. To address this problem, the authors implemented and compared existing model interpretation approaches, local interpretable model-agnostic explanations (LIME), Shapley additive explanations (SHAP), and permutation feature importance, applied to an extreme gradient boosting (XGBoost) model. Experiments were conducted on a diabetes dataset with the aim of identifying the feature with the greatest influence on the model output. Overall, the experimental results reveal that blood glucose has the highest impact on the model's prediction outcome.
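As an illustration of the three interpretation approaches the abstract compares, the following minimal Python sketch applies SHAP, LIME, and permutation feature importance to an XGBoost classifier. This is not the authors' published pipeline: the file name diabetes.csv, the Outcome label column, and the hyperparameters are assumptions chosen to match the commonly used Pima Indians diabetes dataset.

    # Minimal sketch: comparing SHAP, LIME, and permutation feature importance
    # on an XGBoost classifier. Dataset layout and hyperparameters are assumed.
    import pandas as pd
    import shap
    import xgboost as xgb
    from lime.lime_tabular import LimeTabularExplainer
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    # Load the dataset (assumed CSV with a binary "Outcome" label column).
    data = pd.read_csv("diabetes.csv")
    X, y = data.drop(columns="Outcome"), data["Outcome"]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    # Fit the XGBoost classifier.
    model = xgb.XGBClassifier(n_estimators=100, max_depth=3)
    model.fit(X_train, y_train)

    # 1) SHAP: additive per-feature attributions for each prediction.
    explainer = shap.TreeExplainer(model)
    shap_values = explainer.shap_values(X_test)
    shap.summary_plot(shap_values, X_test)

    # 2) LIME: fit a local linear surrogate around a single test instance.
    lime_explainer = LimeTabularExplainer(
        X_train.values,
        feature_names=X.columns.tolist(),
        class_names=["negative", "positive"],
        mode="classification",
    )
    lime_exp = lime_explainer.explain_instance(
        X_test.iloc[0].values, model.predict_proba, num_features=5
    )
    print(lime_exp.as_list())

    # 3) Permutation feature importance: score drop when a feature is shuffled.
    perm = permutation_importance(
        model, X_test, y_test, n_repeats=10, random_state=42
    )
    for name, score in sorted(
        zip(X.columns, perm.importances_mean), key=lambda t: -t[1]
    ):
        print(f"{name}: {score:.4f}")

Note that SHAP and LIME explain individual predictions (local explanations), while permutation importance summarizes each feature's contribution across the whole test set (a global explanation); comparing the three rankings is what allows a check such as whether blood glucose dominates under every method.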
