SMOTE Technique Utilization in Cirrhosis Classification: A Comparison of Gradient Boosting and XGBoost
Keywords:
Gradient Boosting, Integration, Model Accuracy, SMOTE, XGBoost

Abstract
Cirrhosis is a chronic liver disease with significant health implications, responsible for 56,585 deaths annually and ranking as the 9th leading cause of mortality worldwide. Early detection is crucial for effective treatment and better patient outcomes, as cirrhosis can progress to irreversible damage if not addressed in its initial stages. This research develops an integrated method for detecting cirrhosis by combining the Synthetic Minority Over-sampling Technique (SMOTE) with machine learning models, specifically Gradient Boosting and XGBoost. SMOTE is critical in this study because it addresses class imbalance in the dataset, a common challenge in medical diagnosis problems, especially those involving rare or minority conditions such as cirrhosis. Class imbalance can produce biased models that perform poorly on the minority class, which, in this case, could mean missing crucial cirrhosis diagnoses. SMOTE oversamples the minority class by generating synthetic samples interpolated between existing minority instances and their nearest neighbours, yielding a more balanced training set and improving the model's ability to detect cirrhosis accurately. The research further includes a performance comparison between two powerful machine learning algorithms: Gradient Boosting and XGBoost. Gradient Boosting builds its ensemble sequentially, fitting each new learner to the errors of the current ensemble, while XGBoost, an optimized implementation of gradient boosting, is renowned for its speed and efficiency due to parallel processing and built-in regularization.
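The pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' code: the cirrhosis dataset is not available here, so a synthetic imbalanced dataset stands in for it, and SMOTE's nearest-neighbour interpolation is implemented directly in NumPy (a library such as imbalanced-learn would normally be used). The `smote_oversample` helper and all parameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

def smote_oversample(X_min, n_new, k=5, seed=0):
    """Minimal SMOTE sketch: each synthetic point lies on the segment
    between a minority sample and one of its k nearest minority neighbours."""
    rng = np.random.default_rng(seed)
    synthetic = np.empty((n_new, X_min.shape[1]))
    for i in range(n_new):
        j = rng.integers(len(X_min))
        # distances from sample j to every other minority sample
        d = np.linalg.norm(X_min - X_min[j], axis=1)
        neighbours = np.argsort(d)[1:k + 1]          # skip the point itself
        nb = X_min[rng.choice(neighbours)]
        gap = rng.random()                           # interpolation factor in [0, 1)
        synthetic[i] = X_min[j] + gap * (nb - X_min[j])
    return synthetic

# Imbalanced toy data standing in for the (unavailable) cirrhosis dataset
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Balance only the training split -- the test split must stay untouched
X_min = X_tr[y_tr == 1]
n_new = int((y_tr == 0).sum()) - len(X_min)
X_bal = np.vstack([X_tr, smote_oversample(X_min, n_new)])
y_bal = np.concatenate([y_tr, np.ones(n_new, dtype=int)])

# Gradient Boosting on the balanced set; minority-class recall is the
# metric class imbalance hurts most
clf = GradientBoostingClassifier(random_state=0).fit(X_bal, y_bal)
rec = recall_score(y_te, clf.predict(X_te))
print(f"minority-class recall: {rec:.3f}")
```

XGBoost would slot into the same pipeline by replacing `GradientBoostingClassifier` with `xgboost.XGBClassifier`, leaving the SMOTE step unchanged.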
License
Copyright (c) 2024 International Journal of Informatics and Computing
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.