• Title/Summary/Keyword: Machine Learning, ML

Stroke Disease Identification System by using Machine Learning Algorithm

  • K.Veena Kumari ;K. Siva Kumar ;M.Sreelatha
    • International Journal of Computer Science & Network Security
    • /
    • v.23 no.11
    • /
    • pp.183-189
    • /
    • 2023
  • A stroke is a medical condition in which a blood vessel in the brain ruptures, causing damage to the brain. If the flow of blood and nutrients to the brain is interrupted, symptoms may occur. Stroke is a leading cause of death and of widespread disability. The prevalence of stroke is high in developing countries, with ischemic stroke being the most common type. If the warning signs of a stroke are recognized early, its severity can be reduced. Most earlier stroke detection and prediction models use image examination tools such as CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) scans, which are costly and difficult to use for real-time recognition. Machine learning (ML) is a branch of artificial intelligence (AI) that enables software applications to predict outcomes accurately without being explicitly programmed. In recent times, ML algorithms have attracted considerable attention due to their accurate results in medical fields. Hence, this work presents a stroke disease identification system based on a machine learning algorithm, namely an Artificial Neural Network (ANN). The results of the presented algorithm are compared with those of several other ML algorithms to determine the better approach for stroke identification.
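
The abstract does not include implementation details, so as a hedged illustration only, a minimal feedforward ANN classifier over tabular risk features might look like the sketch below. The feature names, network size, learning rate, and toy data are assumptions for illustration, not taken from the paper.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyANN:
    """One-hidden-layer network trained with plain gradient descent.
    Layer sizes and learning rate are illustrative assumptions."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = random.Random(seed)
        self.w1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)]
                   for _ in range(n_hidden)]
        self.b1 = [0.0] * n_hidden
        self.w2 = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
        self.b2 = 0.0

    def forward(self, x):
        # Hidden activations are cached for use in the backward pass.
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        return sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)

    def train_step(self, x, y, lr=0.5):
        p = self.forward(x)
        d_out = p - y  # dLoss/dz for the logistic (cross-entropy) loss
        for j, h in enumerate(self.h):
            d_h = d_out * self.w2[j] * h * (1 - h)
            self.w2[j] -= lr * d_out * h
            for i, xi in enumerate(x):
                self.w1[j][i] -= lr * d_h * xi
            self.b1[j] -= lr * d_h
        self.b2 -= lr * d_out
        return p

# Toy, synthetic "patient" rows: [age_scaled, bp_scaled]; label 1 = stroke.
data = [([0.9, 0.8], 1), ([0.8, 0.9], 1), ([0.1, 0.2], 0), ([0.2, 0.1], 0)]
net = TinyANN(n_in=2, n_hidden=3)
for _ in range(2000):
    for x, y in data:
        net.train_step(x, y)
preds = [net.forward(x) for x, _ in data]
```

In practice the paper's ANN would be trained on real clinical features rather than this toy two-feature set, but the forward/backward structure is the same.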

Several models for tunnel boring machine performance prediction based on machine learning

  • Mahmoodzadeh, Arsalan;Nejati, Hamid Reza;Ibrahim, Hawkar Hashim;Ali, Hunar Farid Hama;Mohammed, Adil Hussein;Rashidi, Shima;Majeed, Mohammed Kamal
    • Geomechanics and Engineering
    • /
    • v.30 no.1
    • /
    • pp.75-91
    • /
    • 2022
  • This paper shows how several Machine Learning (ML) methods can be used to systematically estimate the TBM penetration rate (TBM-PR). To this end, 1125 datasets including uniaxial compressive strength (UCS), Brazilian tensile strength (BTS), punch slope index (PSI), distance between the planes of weakness (DPW), orientation of discontinuities (alpha angle-α), rock fracture class (RFC), and actual/measured TBM-PRs were established. To evaluate the ML methods' predictive ability, 5-fold cross-validation was employed. Comparing the ML outcomes with the TBM monitoring data indicated that the ML methods have very good potential for predicting the TBM-PR. The long short-term memory model, with a correlation coefficient of 0.9932 and a root mean square error of 2.68E-6, outperformed the remaining six ML algorithms. The backward selection method showed that PSI and RFC were, respectively, the most and least significant parameters for the TBM-PR.
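
The evaluation protocol named in the abstract (5-fold cross-validation, correlation coefficient, root mean square error) can be sketched generically. The split logic and metric formulas below are standard definitions, not the paper's code:

```python
import math

def kfold_indices(n, k=5):
    """Yield (train_idx, test_idx) pairs partitioning range(n) into k folds."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    idx, start = list(range(n)), 0
    for size in fold_sizes:
        test = idx[start:start + size]
        train = idx[:start] + idx[start + size:]
        yield train, test
        start += size

def rmse(y_true, y_pred):
    """Root mean square error between measured and predicted values."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
                     / len(y_true))

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# With 1125 samples, 5-fold CV gives five disjoint test folds of 225 each.
folds = list(kfold_indices(1125, k=5))
```

Each model would be fit on the train indices and scored with `rmse` and `pearson_r` on the held-out fold, averaging over the five folds.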

Machine learning modeling of irradiation embrittlement in low alloy steel of nuclear power plants

  • Lee, Gyeong-Geun;Kim, Min-Chul;Lee, Bong-Sang
    • Nuclear Engineering and Technology
    • /
    • v.53 no.12
    • /
    • pp.4022-4032
    • /
    • 2021
  • In this study, machine learning (ML) techniques were used to model surveillance test data of nuclear power plants from an international database of the ASTM E10.02 committee. Regression modeling was conducted using various techniques, including Cubist, XGBoost, and a support vector machine. The root mean square deviation of each ML model for the baseline dataset was less than that of the ASTM E900-15 nonlinear regression model. With respect to interpolation, the ML methods provided excellent predictions with relatively little computation when applied to the given data range. The effect of the explanatory variables on the transition temperature shift (TTS) was analyzed for the ML methods, and the trends differed slightly from those of the ASTM E900-15 model. The ML methods showed some weakness in extrapolating the fluence compared to ASTM E900-15, although the Cubist method achieved extrapolation to a certain extent. It was confirmed that, to achieve a more reliable prediction of the TTS, advanced techniques should be considered for extrapolation when applying ML modeling.

A Prediction Triage System for Emergency Department During Hajj Period using Machine Learning Models

  • Huda N. Alhazmi
    • International Journal of Computer Science & Network Security
    • /
    • v.24 no.7
    • /
    • pp.11-23
    • /
    • 2024
  • Triage is the practice of accurately prioritizing patients in the emergency department (ED) based on their medical condition so that they receive proper treatment. Variation in triage assessment among medical staff can cause mis-triage, which affects patients negatively. Developing an ED triage system based on machine learning (ML) techniques can lead to accurate and efficient triage outcomes. This study aims to develop a triage system using machine learning techniques to predict ED triage levels from patients' information. We conducted a retrospective study using Security Forces Hospital ED data from 2021 through 2023 during the Hajj period in Saudi Arabia. Using demographics, vital signs, and chief complaints as predictors, two machine learning models were investigated, namely a gradient boosted decision tree (XGB) and a deep neural network (DNN). The models were trained to predict ED triage levels, and their predictive performance was evaluated using the area under the receiver operating characteristic curve (AUC) and the confusion matrix. A total of 11,584 ED visits were collected and used in this study. The XGB and DNN models exhibited strong predictive performance, with AUC scores of 0.85 and 0.82, respectively. Compared to the traditional approach, our proposed system demonstrated better performance and can be implemented in real-world clinical settings. Utilizing ML applications can strengthen triage decision-making, clinical care, and resource utilization.
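
The two evaluation metrics named in the abstract, AUC and the confusion matrix, have standard definitions that can be sketched independently of the study's own code. The rank-based AUC formula and matrix builder below are generic textbook versions, not the authors' implementation:

```python
def auc_roc(labels, scores):
    """AUC via the rank-sum (Mann-Whitney U) formulation for binary labels:
    the probability that a random positive outscores a random negative,
    counting ties as half."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def confusion_matrix(y_true, y_pred, n_classes):
    """m[i][j] = count of samples with true class i predicted as class j.
    For multi-level triage, n_classes would be the number of triage levels."""
    m = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        m[t][p] += 1
    return m

auc = auc_roc([1, 1, 0, 0], [0.9, 0.8, 0.2, 0.1])  # perfectly ranked -> 1.0
```

For a multi-class triage scale, an overall AUC is typically obtained by averaging one-vs-rest AUCs per level; the abstract does not say which averaging the study used.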

Research Status on Machine Learning for Self-Healing of Mobile Communication Network (이동통신망 자가 치유를 위한 기계학습 연구동향)

  • Kwon, D.S.;Na, J.H.
    • Electronics and Telecommunications Trends
    • /
    • v.35 no.5
    • /
    • pp.30-42
    • /
    • 2020
  • Unlike in previous generations of mobile technology, machine learning (ML)-based self-healing research trends are currently attracting attention as a way to provide high-quality, effective, and low-cost 5G services that must operate in HetNet scenarios where various wireless transmission technologies are added. Self-healing plays a vital role in detecting and mitigating faults, and there is still room for improvement. We analyzed research trends in self-healing frameworks and in ML-based fault detection, fault diagnosis, and fault compensation. We propose that, to ensure self-healing is proactive rather than reactive, an ML-based self-healing framework must be designed and a suitable ML algorithm selected for fault detection, diagnosis, and outage compensation.

Research Status of Machine Learning for Self-Organizing Network - I (Self-Organizing Network에서 기계학습 연구동향-I)

  • Kwon, D.S.;Na, J.H.
    • Electronics and Telecommunications Trends
    • /
    • v.35 no.4
    • /
    • pp.103-114
    • /
    • 2020
  • In this study, machine learning (ML) algorithms are analyzed and summarized as self-organizing network (SON) realization technologies that can minimize expert intervention in the planning, configuration, and optimization of mobile communication networks. First, the basic concepts of ML algorithms and the areas of the SON to which they are applied are briefly summarized. In addition, the requirements and performance metrics for ML are summarized from the SON perspective, and the performance of the ML algorithms that have hitherto been applied to SONs is assessed in terms of those SON performance metrics.

Machine Learning Application to the Korean Freshwater Ecosystems

  • Jeong, Kwang-Seuk;Kim, Dong-Kyun;Chon, Tae-Soo;Joo, Gea-Jae
    • The Korean Journal of Ecology
    • /
    • v.28 no.6
    • /
    • pp.405-415
    • /
    • 2005
  • This paper considers the advantages of Machine Learning (ML) applied to freshwater ecosystem research. Many studies have been carried out to find patterns of environmental impact on community dynamics in aquatic ecosystems. Ecological models widely adopted by researchers have been a means of information processing for dealing with the dynamics of various ecosystems. The current trend in ecological modelling is partly turning toward the application of ML to explain specific ecological events in complex ecosystems and to overcome the need for complicated data manipulation. This paper briefly introduces ML techniques applied to freshwater ecosystems in Korea. The manuscript provides promising information for ecologists who utilize ML to elucidate complex ecological patterns and to model the spatial and temporal dynamics of communities.

A Pragmatic Framework for Predicting Change Prone Files Using Machine Learning Techniques with Java-based Software

  • Loveleen Kaur;Ashutosh Mishra
    • Asia pacific journal of information systems
    • /
    • v.30 no.3
    • /
    • pp.457-496
    • /
    • 2020
  • This study extensively analyzes the performance of various Machine Learning (ML) techniques for predicting the version-to-version change-proneness of Java source code files. Seventeen object-oriented metrics were utilized to predict change-prone files using 31 ML techniques, and the proposed framework was implemented on consecutive releases of two Java-based software projects available as plug-ins. Ten-fold and inter-release validation methods were employed to validate the models, and statistical tests provide supplementary information regarding the reliability and significance of the results. The experiments conducted in this article indicate that the ML techniques perform differently under different validation settings. The results also confirm the proficiency of the selected ML techniques for developing change-proneness prediction models, which could aid software engineers in the initial stages of software development in classifying change-prone Java files of a software system, in turn aiding in the estimation of change-proneness trends over future versions.

From Machine Learning Algorithms to Superior Customer Experience: Business Implications of Machine Learning-Driven Data Analytics in the Hospitality Industry

  • Egor Cherenkov;Vlad Benga;Minwoo Lee;Neil Nandwani;Kenan Raguin;Marie Clementine Sueur;Guohao Sun
    • Journal of Smart Tourism
    • /
    • v.4 no.2
    • /
    • pp.5-14
    • /
    • 2024
  • This study explores the transformative potential of machine learning (ML) and ML-driven data analytics in the hospitality industry. It provides a comprehensive overview of this emerging method, from explaining ML's origins to introducing the evolution of ML-driven data analytics in the hospitality industry. The present study emphasizes the shift embodied in ML, moving from explicit programming towards a self-learning, adaptive approach refined over time through big data. Meanwhile, as an industry-specific example, social media analytics has progressed from simplistic metrics to deriving nuanced qualitative insights into consumer behavior. Additionally, this study explores innovative applications of these technologies in the hospitality sector, including demand forecasting, personalized marketing, and predictive maintenance. The study also examines the integration of ML and social media analytics, discussing implications such as enhanced customer personalization, real-time decision-making capabilities, optimized marketing campaigns, and improved fraud detection. In conclusion, ML-driven hospitality data analytics has become indispensable in the strategic and operational machinery of contemporary hospitality businesses, and the study projects these technologies' continued significance in propelling data-centric advancements across the industry.

Trend in eXplainable Machine Learning for Intelligent Self-organizing Networks (지능형 Self-Organizing Network를 위한 설명 가능한 기계학습 연구 동향)

  • D.S. Kwon;J.H. Na
    • Electronics and Telecommunications Trends
    • /
    • v.38 no.6
    • /
    • pp.95-106
    • /
    • 2023
  • As artificial intelligence has become commonplace in various fields, the transparency of AI in its development and implementation has become an important issue. In safety-critical areas, the explainability and understandability of artificial intelligence are being actively studied. On the other hand, machine learning has been applied to make self-organizing networks (SONs) intelligent, but transparency in this application has been neglected despite the critical decision-making involved in operating mobile communication systems. We describe the concepts of eXplainable machine learning (ML), along with research trends, major issues, and research directions. After summarizing ML research on SONs, we analyze the research directions for the explainable ML required in the intelligent SONs of beyond-5G and 6G communication.