Starts at: 2025-03-01 10:45AM
Ends at: 2025-03-01 12:00PM
Abstract:
We investigate classifying wound stages in a diabetic wound imaging dataset using machine learning algorithms that serve as alternatives to standard approaches such as Convolutional Neural Networks (CNNs) and Support Vector Machines (SVMs). Although these methods typically classify with high accuracy, we seek methods that are more explainable for clinicians and patients. One approach previously studied in the literature is a Bayesian Network model, specifically Naive Bayes Nearest Neighbor (NBNN), whose authors aimed for a computationally simple method that remains competitive in image classification. We modify their method by trying alternative estimates of the probabilities in NBNN to improve the model's performance and, equally important, its explainability. In particular, we consider K-Nearest Neighbor (KNN) probability density estimation, which requires no kernel, and compare classic NBNN to our approach using k-fold cross-validation. Ultimately, we address whether Bayesian Networks can be competitive in performance while also providing explainable wound imaging predictions.
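
As a rough sketch of the idea described above, the snippet below implements an NBNN-style decision rule in which each class-conditional descriptor density is estimated with a KNN estimate rather than a kernel. It is not the authors' exact method; the class labels, descriptor dimensionality, and choice of k are illustrative assumptions only.

import numpy as np
from scipy.spatial import cKDTree


def knn_log_density(queries, tree, n_class, k, dim):
    """Log of the KNN density estimate p(x) ~ k / (n_class * V_k(x)),
    where V_k(x) is the volume of the ball reaching the k-th neighbor."""
    dists, _ = tree.query(queries, k=k)
    r = np.atleast_2d(dists)[:, -1]          # distance to the k-th neighbor
    # Log-volume of a dim-dimensional ball of radius r; the constant factor
    # cancels when comparing classes, so it is omitted.
    log_vol = dim * np.log(r + 1e-12)
    return np.log(k) - np.log(n_class) - log_vol


def nbnn_knn_classify(query_descriptors, class_trees, class_counts, k=5):
    """Naive Bayes step: sum per-descriptor log densities for each class,
    then return the class with the highest total (uniform priors assumed)."""
    dim = query_descriptors.shape[1]
    scores = {
        label: knn_log_density(query_descriptors, tree,
                               class_counts[label], k, dim).sum()
        for label, tree in class_trees.items()
    }
    return max(scores, key=scores.get)


# Toy usage with two hypothetical wound-stage classes and random 64-D "descriptors".
rng = np.random.default_rng(0)
train = {
    "stage_1": rng.normal(0.0, 1.0, size=(200, 64)),
    "stage_2": rng.normal(0.5, 1.0, size=(200, 64)),
}
trees = {c: cKDTree(X) for c, X in train.items()}
counts = {c: len(X) for c, X in train.items()}
query = rng.normal(0.5, 1.0, size=(30, 64))   # descriptors from one test image
print(nbnn_knn_classify(query, trees, counts, k=5))

In a real evaluation, the toy data above would be replaced by local descriptors extracted from the wound images, and the classifier would be scored with k-fold cross-validation against classic NBNN as stated in the abstract.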