• Title/Summary/Keyword: Machine learning (ML)

Search Results: 277

Development of ML and IoT Enabled Disease Diagnosis Model for a Smart Healthcare System

  • Mehra, Navita; Mittal, Pooja
    • International Journal of Computer Science & Network Security / v.22 no.7 / pp.1-12 / 2022
  • Recent progress in Internet of Things (IoT) and Machine Learning (ML) based technologies has transformed the traditional healthcare system into a smart healthcare system. The incorporation of IoT and ML has changed the way patients are treated and opens up many opportunities in the healthcare domain. In this view, this research article presents a new IoT- and ML-based disease diagnosis model for the diagnosis of different diseases. In the proposed model, vital signs are collected via IoT-based smart medical devices, and the analysis is performed using different data mining techniques to detect possible risks to a person's health status. Recommendations are made based on the results generated by the data mining techniques; for high-risk patients, an emergency alert is sent to healthcare service providers and family members. The model is implemented in an Anaconda Jupyter notebook using various Python libraries. The results show that, among all the data mining techniques, SVM achieved the highest accuracy, 0.897, for the classification of Parkinson's disease on the same dataset.
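
As a rough sketch of the kind of pipeline this abstract describes (classification plus an alert for high-risk predictions), the following uses scikit-learn's SVC on synthetic data; the features, threshold, and dataset are illustrative assumptions, not the paper's.

```python
# Hypothetical sketch: SVM-based risk classification with an alert step,
# on synthetic "vital sign" data (not the paper's dataset or code).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Two classes of synthetic 4-feature vitals with shifted means.
X = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(1.5, 1, (100, 4))])
y = np.array([0] * 100 + [1] * 100)  # 1 = high-risk

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", probability=True).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)  # held-out accuracy

def alert_if_high_risk(sample, threshold=0.8):
    """Flag a sample for an emergency alert when its predicted risk
    probability exceeds the (assumed) threshold."""
    p = clf.predict_proba(sample.reshape(1, -1))[0, 1]
    return p > threshold
```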

A Deep Learning Application for Automated Feature Extraction in Transaction-based Machine Learning

  • Woo, Deock-Chae; Moon, Hyun Sil; Kwon, Suhnbeom; Cho, Yoonho
    • Journal of Information Technology Services / v.18 no.2 / pp.143-159 / 2019
  • Machine learning (ML) is a method of fitting given data to a mathematical model to derive insights or make predictions. In the age of big data, where the amount of available data increases exponentially due to the development of information technology and smart devices, ML delivers high prediction performance by detecting patterns without human bias. Feature engineering, which generates the features that explain the problem to be solved, has a great influence on performance in the ML process, and its importance is continuously emphasized. Despite this importance, it is still considered a difficult task, as it requires a thorough understanding of the domain characteristics and the source data, as well as an iterative procedure. We therefore propose methods that apply deep learning to reduce the complexity and difficulty of feature extraction and to improve the performance of ML models. The most common reason for the superior performance of deep learning on complex unstructured data is that, unlike other techniques, it can extract features from the source data itself. To bring these advantages to business problems, we propose deep learning-based methods that automatically extract features from transaction data or directly predict and classify target variables. In particular, exploiting the structural similarity between transaction data and text data, we applied techniques that show high performance in existing text processing, and we verified the suitability of each method according to the characteristics of the transaction data. Our study not only explores the possibility of automated feature extraction but also provides a benchmark model that delivers a certain level of performance before any human-performed feature extraction. In addition, it is expected to provide guidelines for choosing a suitable deep learning model based on the business problem and the data characteristics.
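
To make the text analogy concrete, here is a minimal, hypothetical sketch of embedding-based feature extraction from a transaction: item IDs are looked up in an embedding table and average-pooled into a fixed-length vector, the way text models pool word embeddings. Random vectors stand in for the learned embeddings a trained deep model would provide.

```python
# Illustrative sketch (assumptions throughout): a variable-length transaction
# is treated as a "sentence" of item IDs; pooling per-item embeddings yields
# a fixed-length feature vector usable by any downstream ML model.
import numpy as np

rng = np.random.default_rng(42)
VOCAB, DIM = 1000, 16
embedding = rng.normal(size=(VOCAB, DIM))  # stand-in for learned item embeddings

def extract_features(transaction):
    """Average-pool item embeddings into one DIM-length feature vector."""
    ids = np.asarray(transaction)
    return embedding[ids].mean(axis=0)

f = extract_features([3, 17, 512])  # a 3-item transaction -> 16-dim features
```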

Analysis of Open-Source Hyperparameter Optimization Software Trends

  • Lee, Yo-Seob; Moon, Phil-Joo
    • International Journal of Advanced Culture Technology / v.7 no.4 / pp.56-62 / 2019
  • Recently, research using artificial neural networks has expanded from improving inference accuracy into neural network optimization and automatic structure search. The performance of a machine learning algorithm depends on how its hyperparameters are configured, so open-source hyperparameter optimization software can be an important step toward improving that performance. In this paper, we review open-source hyperparameter optimization software packages.
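
Hyperparameter optimization tools differ in search strategy, but the core loop they automate looks like the random-search sketch below (a toy objective stands in for a real model's validation score).

```python
# Minimal random-search sketch of what hyperparameter optimization software
# automates: sample configurations, score each, keep the best.
import random

def objective(lr, depth):
    # Toy stand-in for validation accuracy, peaked at lr=0.1, depth=6.
    return 1.0 - (lr - 0.1) ** 2 - 0.01 * (depth - 6) ** 2

random.seed(0)
best_score, best_cfg = float("-inf"), None
for _ in range(200):  # trial budget
    cfg = {"lr": random.uniform(0.001, 1.0), "depth": random.randint(2, 12)}
    score = objective(**cfg)
    if score > best_score:
        best_score, best_cfg = score, cfg
```

Real packages replace the uniform sampling with smarter strategies (Bayesian optimization, bandit-based early stopping), but the interface, a user-supplied objective plus a search space, is the same.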

New Approaches to Xerostomia with Salivary Flow Rate Based on Machine Learning Algorithm

  • Yeon-Hee Lee; Q-Schick Auh; Hee-Kyung Park
    • Journal of Korean Dental Science / v.16 no.1 / pp.47-62 / 2023
  • Purpose: We aimed to investigate objective cutoff values of the unstimulated salivary flow rate (UFR) and stimulated salivary flow rate (SFR) in patients with xerostomia and to present an optimal machine learning model with a classification and regression tree (CART) for all ages. Materials and Methods: A total of 829 patients with oral diseases were enrolled (591 females; mean age, 59.29±16.40 years; range, 8~95 years): 199 patients with xerostomia and 630 without. Salivary and clinical characteristics were collected and analyzed. Results: Patients with xerostomia had significantly lower UFR (0.29±0.22 vs. 0.41±0.24 ml/min) and SFR (1.12±0.55 vs. 1.39±0.94 ml/min) than those without xerostomia (both P<0.001). The presence of xerostomia was significantly negatively correlated with UFR (r=-0.603, P=0.002) and SFR (r=-0.301, P=0.017). In the diagnosis of xerostomia based on the CART algorithm, the presence of stomatitis, candidiasis, halitosis, psychiatric disorder, and hyperlipidemia were significant predictors of xerostomia, and the cutoff ranges for UFR and SFR were 0.03~0.18 ml/min and 0.85~1.6 ml/min, respectively. Conclusion: Xerostomia was associated with decreases in UFR and SFR, and their cutoff values varied depending on the patient's underlying oral and systemic conditions.
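
For illustration, a CART-style split on a single flow-rate variable amounts to scanning candidate thresholds and choosing the one that minimizes weighted Gini impurity; the toy data below are invented, not the study's measurements.

```python
# Sketch of how a CART split finds a flow-rate cutoff (toy data, assumed labels).
def gini(labels):
    """Gini impurity of a binary label list."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_cutoff(flow_rates, has_xerostomia):
    """Return the midpoint threshold minimizing weighted child impurity."""
    pairs = sorted(zip(flow_rates, has_xerostomia))
    best = (float("inf"), None)  # (weighted impurity, cutoff)
    for i in range(1, len(pairs)):
        left = [y for _, y in pairs[:i]]
        right = [y for _, y in pairs[i:]]
        w = (len(left) * gini(left) + len(right) * gini(right)) / len(pairs)
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        best = min(best, (w, cut))
    return best[1]

# Toy example: xerostomia (label 1) clusters at low UFR values.
ufr = [0.05, 0.10, 0.15, 0.20, 0.35, 0.40, 0.45, 0.50]
label = [1, 1, 1, 0, 0, 0, 0, 0]
cut = best_cutoff(ufr, label)  # perfect split between 0.15 and 0.20
```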

Prediction of medication-related osteonecrosis of the jaw (MRONJ) using automated machine learning in patients with osteoporosis associated with dental extraction and implantation: a retrospective study

  • Da Woon Kwack; Sung Min Park
    • Journal of the Korean Association of Oral and Maxillofacial Surgeons / v.49 no.3 / pp.135-141 / 2023
  • Objectives: This study aimed to develop and validate machine learning (ML) models using H2O-AutoML, an automated ML program, for predicting medication-related osteonecrosis of the jaw (MRONJ) in patients with osteoporosis undergoing tooth extraction or implantation. Patients and Methods: We conducted a retrospective chart review of 340 patients who visited Dankook University Dental Hospital between January 2019 and June 2022 and met the following inclusion criteria: female, age ≥55 years, osteoporosis treated with antiresorptive therapy, and recent dental extraction or implantation. We considered medication administration and duration, demographics, and systemic factors (age and medical history). Local factors, such as surgical method, number of operated teeth, and operation area, were also included. Six algorithms were used to generate the MRONJ prediction model. Results: Gradient boosting demonstrated the best diagnostic accuracy, with an area under the receiver operating characteristic curve (AUC) of 0.8283. Validation with the test dataset yielded a stable AUC of 0.7526. Variable importance analysis identified duration of medication as the most important variable, followed by age, number of operated teeth, and operation site. Conclusion: ML models can help predict MRONJ occurrence in patients with osteoporosis undergoing tooth extraction or implantation based on questionnaire data acquired at the first visit.
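
Not H2O-AutoML itself, but the AutoML idea it implements can be sketched as fitting several candidate algorithms on the same split and ranking them on a held-out set. The data, features, and outcome below are synthetic stand-ins.

```python
# Minimal "AutoML leaderboard" sketch with scikit-learn (illustrative only).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))                     # hypothetical patient features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)     # stand-in binary outcome
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

candidates = {
    "logistic": LogisticRegression(max_iter=1000),
    "tree": DecisionTreeClassifier(random_state=1),
    "gbm": GradientBoostingClassifier(random_state=1),
}
# Fit every candidate, score on the held-out split, sort best-first.
leaderboard = sorted(
    ((m.fit(X_tr, y_tr).score(X_te, y_te), name) for name, m in candidates.items()),
    reverse=True,
)
```

Real AutoML systems add cross-validated hyperparameter search and stacked ensembles on top of this loop.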

Machine learning-based techniques to facilitate the production of stone nano powder-reinforced manufactured-sand concrete

  • Zanyu Huang; Qiuyue Han; Adil Hussein Mohammed; Arsalan Mahmoodzadeh; Nejib Ghazouani; Shtwai Alsubai; Abed Alanazi; Abdullah Alqahtani
    • Advances in nano research / v.15 no.6 / pp.533-539 / 2023
  • This study examines four machine learning (ML)-based models for their potential to estimate the splitting tensile strength (STS) of manufactured sand concrete (MSC). The ML models were trained and tested on 310 experimental data points. The impacts of stone nanopowder content (SNPC), curing age (CA), and water-to-cement (W/C) ratio on the STS of MSC were also studied. According to the results, the support vector regression (SVR) model had the highest correlation with the experimental data, although all of the optimized ML models showed promise in estimating the STS of MSC. Both the ML and laboratory results showed that 10% SNPC improved the STS of MSC.
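
A hedged sketch of the SVR approach described: fit support vector regression to (SNPC, curing age, W/C) inputs. The data here follow assumed trends, not the paper's 310 experimental points.

```python
# Illustrative SVR fit on synthetic strength data (all ranges/trends assumed).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(7)
snpc = rng.uniform(0, 20, 200)    # stone nanopowder content (%), assumed range
age = rng.uniform(3, 90, 200)     # curing age (days), assumed range
wc = rng.uniform(0.3, 0.6, 200)   # water-to-cement ratio, assumed range
# Assumed trend: strength rises with age and (up to a point) SNPC, falls with W/C.
sts = 2.0 + 0.05 * snpc - 0.002 * snpc**2 + 0.02 * age - 3.0 * wc
sts += rng.normal(0, 0.1, 200)    # measurement noise

X = np.column_stack([snpc, age, wc])
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize so one RBF scale fits all
model = SVR(kernel="rbf", C=10.0).fit(X, sts)
r2 = model.score(X, sts)  # coefficient of determination on the training data
```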

Surface-Engineered Graphene Surface-Enhanced Raman Scattering Platform with Machine-Learning Enabled Classification of Mixed Analytes

  • Jae Hee Cho; Garam Bae; Ki-Seok An
    • Journal of Sensor Science and Technology / v.33 no.3 / pp.139-146 / 2024
  • Surface-enhanced Raman scattering (SERS) enables the detection of various types of π-conjugated biological and chemical molecules owing to its exceptional sensitivity in obtaining unique spectra, offering nondestructive classification capabilities for target analytes. Herein, we demonstrate an innovative strategy for machine learning (ML)-enabled predictive SERS platforms based on surface-engineered graphene complementarily hybridized with Au nanoparticles (NPs). The hybridized Au NPs/graphene SERS platforms showed exceptional sensitivity (10⁻⁷ M) owing to the strong synergy between the localized electromagnetic effect and the enhanced chemical bonding reactivity. The chemical and physical properties of the demonstrated SERS platform were systematically investigated using microscopy and spectroscopic analysis. Furthermore, an ML strategy is proposed to predict various analytes based on a featured Raman spectral database. Using a customized data-preprocessing algorithm, the feature data for ML were extracted from Raman peak characteristics such as intensity, position, and width in the SERS spectrum data. Additionally, various types of ML classification models were evaluated using k-fold cross-validation (k = 5), showing 99% prediction accuracy.
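
The described preprocessing step, extracting peak intensity, position, and width as ML features, can be sketched with SciPy's peak utilities on a synthetic two-peak spectrum (peak locations and parameters below are assumptions, not the paper's data):

```python
# Extract (intensity, position, width) features from a synthetic Raman spectrum.
import numpy as np
from scipy.signal import find_peaks, peak_widths

x = np.arange(0, 2000.0)  # synthetic wavenumber axis (cm^-1)
# Two Gaussian "Raman peaks" at assumed positions 520 and 1350 cm^-1.
spectrum = (
    1.0 * np.exp(-((x - 520) ** 2) / (2 * 15**2))
    + 0.6 * np.exp(-((x - 1350) ** 2) / (2 * 25**2))
)

peaks, props = find_peaks(spectrum, height=0.1)          # locate peaks
widths = peak_widths(spectrum, peaks, rel_height=0.5)[0]  # FWHM in samples
# One feature row per peak: [intensity, position, width].
features = np.column_stack([props["peak_heights"], x[peaks], widths])
```

Rows of `features` would then feed the classification models mentioned in the abstract.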

Path Loss Prediction Using an Ensemble Learning Approach

  • Beom Kwon; Eonsu Noh
    • Journal of the Korea Society of Computer and Information / v.29 no.2 / pp.1-12 / 2024
  • Predicting path loss is one of the important factors in wireless network design, for example when selecting the installation locations of base stations in cellular networks. In the past, path loss values were measured through numerous field tests to determine the optimal installation location of a base station, which is very time-consuming. To solve this problem, we propose a path loss prediction method based on machine learning (ML). In particular, an ensemble learning approach is applied to improve prediction performance: bootstrap datasets were used to obtain models with different hyperparameter configurations, and the final model was built by ensembling them. We evaluated and compared the performance of the proposed ensemble-based method with various ML-based methods on publicly available path loss datasets. The experimental results show that the proposed method outperforms the existing methods and predicts path loss values accurately.
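
A minimal sketch of the bootstrap-ensemble idea, assuming a log-distance path loss model for the synthetic data: fit one regression per bootstrap resample and average the members' predictions.

```python
# Bootstrap ensemble (bagging) sketch on synthetic path-loss-like data.
import numpy as np

rng = np.random.default_rng(3)
dist = rng.uniform(10, 1000, 150)                         # Tx-Rx distance (m)
loss = 40 + 20 * np.log10(dist) + rng.normal(0, 2, 150)   # assumed log-distance model

def fit_member(d, l):
    # Each ensemble member: least-squares fit of loss = a + b*log10(d).
    return np.polyfit(np.log10(d), l, 1)

members = []
for _ in range(25):
    idx = rng.integers(0, len(dist), len(dist))  # bootstrap resample (with replacement)
    members.append(fit_member(dist[idx], loss[idx]))

def predict(d):
    """Average the members' predictions at distance d (meters)."""
    preds = [np.polyval(coef, np.log10(d)) for coef in members]
    return float(np.mean(preds))

p = predict(100.0)  # true model value at 100 m is 80 dB
```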

Development of ensemble machine learning models for evaluating seismic demands of steel moment frames

  • Nguyen, Hoang D.; Kim, JunHee; Shin, Myoungsu
    • Steel and Composite Structures / v.44 no.1 / pp.49-63 / 2022
  • This study aims to develop ensemble machine learning (ML) models for estimating the peak floor acceleration and maximum top drift of steel moment frames. For this purpose, random forest, adaptive boosting, gradient boosting regression tree (GBRT), and extreme gradient boosting (XGBoost) models were considered. A total of 621 steel moment frames were analyzed under 240 ground motions using OpenSees software to generate the dataset for the ML models. From the results, the GBRT and XGBoost models exhibited the highest performance for predicting peak floor acceleration and maximum top drift, respectively. The significance of each input variable on the prediction was examined using the best-performing models and the Shapley additive explanations (SHAP) approach. It turned out that the peak ground acceleration had the most significant impact on the peak floor acceleration prediction, while the spectral accelerations at 1 and 2 s had the most considerable influence on the maximum top drift prediction. Finally, a graphical user interface module was created, a pioneering step toward applying ML to estimate the seismic demands of building structures in practical design.
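
SHAP itself needs a dedicated library; the simpler permutation-importance idea below illustrates the same goal of ranking input variables, using a linear stand-in model and synthetic data rather than GBRT/XGBoost on the OpenSees dataset.

```python
# Permutation importance sketch: shuffle one feature at a time and measure
# how much the model's error grows (synthetic data, linear stand-in model).
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(400, 3))
y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + 0.0 * X[:, 2]  # feature 0 dominates by design

coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # fitted linear "model"

def mse(X_, y_):
    return float(np.mean((X_ @ coef - y_) ** 2))

base = mse(X, y)
importance = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])  # break the feature-target link
    importance.append(mse(Xp, y) - base)  # error increase = importance
```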

Optimizing shallow foundation design: A machine learning approach for bearing capacity estimation over cavities

  • Kumar Shubham; Subhadeep Metya; Abdhesh Kumar Sinha
    • Geomechanics and Engineering / v.37 no.6 / pp.629-641 / 2024
  • The presence of excavations or cavities beneath the foundation of a building can significantly affect its stability and cause extensive damage. Traditional methods for calculating the bearing capacity and subsidence of foundations over cavities can be complex and time-consuming, particularly under varying conditions. In such situations, machine learning (ML) and deep learning (DL) techniques provide effective alternatives. This study concentrates on constructing a prediction model, based on the performance of ML and DL algorithms, that can be applied in real-world settings. The efficacy of eight algorithms, including Regression Analysis, k-Nearest Neighbor, Decision Tree, Random Forest, Multivariate Regression Spline, Artificial Neural Network, and Deep Neural Network, was evaluated. Using a Python-assisted automation technique integrated with the PLAXIS 2D platform, a dataset containing 272 cases with eight input parameters and one target variable was generated. In general, the DL model performed better than the ML models, and all models except the regression models attained outstanding results, with an R² greater than 0.90. These models can also be used as surrogate models in reliability analysis to evaluate failure risks and probabilities.
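
As a worked example of the R² criterion used to judge models like these (toy numbers, not the PLAXIS dataset):

```python
# R^2 (coefficient of determination): 1 - SS_res / SS_tot.
def r_squared(y_true, y_pred):
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))  # residual sum of squares
    ss_tot = sum((t - mean) ** 2 for t in y_true)               # total sum of squares
    return 1.0 - ss_res / ss_tot

y_true = [100.0, 150.0, 200.0, 250.0, 300.0]  # toy bearing capacities (kPa)
y_pred = [105.0, 145.0, 210.0, 240.0, 295.0]
score = r_squared(y_true, y_pred)  # SS_res=275, SS_tot=25000 -> 0.989
```

A model clearing the 0.90 bar in this metric explains over 90% of the variance in the target, which is why such models can serve as fast surrogates in reliability analysis.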