• Title/Summary/Keyword: Machine Learning Algorithm


Development of Semi-Active Control Algorithm Using Deep Q-Network (Deep Q-Network를 이용한 준능동 제어알고리즘 개발)

  • Kim, Hyun-Su;Kang, Joo-Won
    • Journal of Korean Association for Spatial Structures / v.21 no.1 / pp.79-86 / 2021
  • The control performance of a smart tuned mass damper (TMD) depends mainly on its control algorithm. Many control strategies have been proposed for semi-active control devices, and machine learning has recently begun to be applied to the development of vibration control algorithms. In this study, reinforcement learning, one of the machine learning techniques, was employed to develop a semi-active control algorithm for a smart TMD composed of a magnetorheological (MR) damper. For this purpose, an 11-story building structure with a smart TMD was selected to construct the reinforcement learning environment, and time history analyses of the example structure subjected to earthquake excitation were conducted during the learning procedure. A deep Q-network (DQN), one of several reinforcement learning algorithms, was used as the learning agent; the command voltage sent to the MR damper is determined by the action produced by the DQN. Parametric studies on the hyperparameters of the DQN were performed through numerical simulation. After sufficient training of the DQN model with appropriate hyperparameters, a DQN model for controlling the seismic responses of the example structure with the smart TMD was developed. The developed model effectively controls the smart TMD to reduce the seismic responses of the example structure.
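
A minimal sketch of the DQN component described above, assuming TensorFlow (the paper does not state its toolchain). The state features, voltage discretization, and network sizes are illustrative assumptions, not the paper's values, and the structural simulation that supplies transitions is omitted.

```python
# Hedged sketch: a DQN whose actions are discretized MR damper command voltages.
import numpy as np
import tensorflow as tf

N_STATE = 4                               # e.g., story displacement/velocity features (assumed)
VOLTAGES = np.linspace(0.0, 5.0, 11)      # discretized command voltages in volts (assumed)
GAMMA = 0.99                              # discount factor

def build_q_net():
    """Small MLP mapping a state to one Q-value per voltage level."""
    return tf.keras.Sequential([
        tf.keras.Input(shape=(N_STATE,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(len(VOLTAGES)),
    ])

q_net, target_net = build_q_net(), build_q_net()
target_net.set_weights(q_net.get_weights())
opt = tf.keras.optimizers.Adam(1e-3)

def act(state, eps=0.1):
    """Epsilon-greedy choice of a voltage index; VOLTAGES[idx] is the command."""
    if np.random.rand() < eps:
        return np.random.randint(len(VOLTAGES))
    return int(np.argmax(q_net(state[None, :])[0]))

@tf.function
def train_step(s, a, r, s2, done):
    """One Bellman-target update on a replay batch (s, a, r, s2, done)."""
    q_next = tf.reduce_max(target_net(s2), axis=1)
    target = r + GAMMA * (1.0 - done) * q_next
    with tf.GradientTape() as tape:
        q = tf.gather(q_net(s), a, batch_dims=1)   # Q(s, a) for the taken actions
        loss = tf.reduce_mean(tf.square(target - q))
    opt.apply_gradients(zip(tape.gradient(loss, q_net.trainable_variables),
                            q_net.trainable_variables))
    return loss
```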

Machine Learning Model to Predict Osteoporotic Spine with Hounsfield Units on Lumbar Computed Tomography

  • Nam, Kyoung Hyup;Seo, Il;Kim, Dong Hwan;Lee, Jae Il;Choi, Byung Kwan;Han, In Ho
    • Journal of Korean Neurosurgical Society / v.62 no.4 / pp.442-449 / 2019
  • Objective : Bone mineral density (BMD) is an important consideration in fusion surgery. Although dual-energy X-ray absorptiometry is considered the gold standard for assessing BMD, quantitative computed tomography (QCT) provides more accurate data on spinal osteoporosis. However, QCT carries the disadvantages of additional radiation exposure and cost. The present study demonstrates the utility of artificial intelligence and a machine learning algorithm for assessing osteoporosis using Hounsfield units (HU) from preoperative lumbar CT coupled with QCT data. Methods : We reviewed 70 patients who underwent both QCT and conventional lumbar CT for spine surgery. The T-scores of 198 lumbar vertebrae were assessed with QCT, and the HU of the vertebral body at the same level was measured on conventional CT through the picture archiving and communication system (PACS). A multiple regression algorithm was applied to predict the T-score using three independent variables (age, sex, and HU of the vertebral body on conventional CT) coupled with the T-score from QCT. Next, a logistic regression algorithm was applied to classify vertebrae as osteoporotic or non-osteoporotic. TensorFlow and Python were used as the machine learning tools, and a TensorFlow user interface developed at our institute was used for easy code generation. Results : The multiple regression model estimated T-scores similar to the QCT data. HU showed results similar to QCT, with discordance in only one non-osteoporotic vertebra that was indicated as osteoporotic. On the training set, the model classified lumbar vertebrae into two groups (osteoporotic vs. non-osteoporotic) with 88.0% accuracy. On a test set of 40 vertebrae, classification accuracy was 92.5% at a learning rate of 0.0001 (precision, 0.939; recall, 0.969; F1 score, 0.954; area under the curve, 0.900). Conclusion : This study presents a simple machine learning model applicable to the spine research field. The model can predict T-scores and identify osteoporotic vertebrae solely by measuring the HU of conventional CT, which would help spine surgeons avoid underestimating osteoporosis of the spine preoperatively. If applied to a larger dataset, the predictive accuracy of the model should increase further. We propose that machine learning is an important modality in medical research.
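
A hedged sketch of the two-stage model: multiple regression to predict the T-score from (age, sex, HU), then logistic regression for the osteoporotic/non-osteoporotic split. The paper used TensorFlow; scikit-learn and synthetic data stand in here, the sex coding and coefficients are assumptions, and the -2.5 cutoff is the standard WHO osteoporosis threshold.

```python
# Synthetic stand-in for the QCT/CT measurements of 198 vertebrae.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 198
age = rng.uniform(40, 85, n)
sex = rng.integers(0, 2, n)            # 0 = male, 1 = female (assumed coding)
hu = rng.uniform(50, 250, n)           # vertebral-body HU on conventional CT
# Toy relationship standing in for the QCT ground truth.
t_score = 0.015 * hu - 0.04 * age - 0.3 * sex - 1.0 + rng.normal(0, 0.3, n)

X = np.column_stack([age, sex, hu])
X_tr, X_te, t_tr, t_te = train_test_split(X, t_score, random_state=0)

reg = LinearRegression().fit(X_tr, t_tr)            # stage 1: predict T-score
clf = LogisticRegression().fit(X_tr, t_tr <= -2.5)  # stage 2: WHO cutoff split
print("regression R^2:", reg.score(X_te, t_te))
print("classification accuracy:", clf.score(X_te, t_te <= -2.5))
```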

Design of a ParamHub for Machine Learning in a Distributed Cloud Environment

  • Su-Yeon Kim;Seok-Jae Moon
    • International Journal of Internet, Broadcasting and Communication / v.16 no.2 / pp.161-168 / 2024
  • As big data models grow in size, distributed training is becoming an essential element of large-scale machine learning tasks. In this paper, we propose ParamHub for distributed data training. During training, this agent uses the provided data to adjust various aspects of the model's parameters, such as the model structure, learning algorithm, hyperparameters, and bias, aiming to minimize the error between the model's predictions and the actual values. Furthermore, it operates autonomously, collecting and updating data in a distributed environment, thereby reducing the load-balancing burden that arises in a centralized system. Through communication between agents, resource management and learning processes can be coordinated, enabling efficient management of distributed data and resources. This approach enhances the scalability and stability of distributed machine learning systems while providing the flexibility to be applied in various learning environments.
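
The abstract describes an agent architecture rather than a concrete API. The toy below illustrates only the general parameter-hub pattern it builds on: workers compute gradients on their local data shards and a hub aggregates them, so no single node carries the whole load. All names and the least-squares task are illustrative assumptions, not the paper's ParamHub interface.

```python
# Toy parameter-hub: workers push local gradients, the hub averages and updates.
import numpy as np

class ParamHub:
    """Serves the shared parameter vector and aggregates worker gradients."""
    def __init__(self, dim):
        self.w = np.zeros(dim)
    def pull(self):
        return self.w.copy()
    def push(self, grads, lr=0.1):
        self.w -= lr * np.mean(grads, axis=0)   # aggregate, then update

def worker_grad(w, X, y):
    """Local least-squares gradient on one worker's data shard."""
    return 2 * X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0])
shards = []
for _ in range(4):                              # four workers, disjoint shards
    X = rng.normal(size=(50, 2))
    shards.append((X, X @ true_w + rng.normal(0, 0.1, 50)))

hub = ParamHub(dim=2)
for _ in range(200):
    w = hub.pull()                              # workers fetch current params
    hub.push(np.stack([worker_grad(w, X, y) for X, y in shards]))
print("learned:", hub.pull())                   # converges to ~[2, -1]
```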

IoT-based systemic lupus erythematosus prediction model using hybrid genetic algorithm integrated with ANN

  • Edison Prabhu K;Surendran D
    • ETRI Journal / v.45 no.4 / pp.594-602 / 2023
  • The Internet of Things (IoT) is commonly employed to detect various diseases in the health sector. Systemic lupus erythematosus (SLE) is an autoimmune illness that occurs when the body's immune system attacks its own connective tissues and organs. Because of the complicated interconnections between disease-trigger exposure levels over time, humans have trouble predicting the severity of SLE symptoms. To solve this issue, an effective automated machine learning model that takes IoT data as input was created to forecast SLE symptoms. IoT has several advantages in the healthcare industry, including interoperability, information exchange, machine-to-machine networking, and data transmission. An SLE symptom-prediction model was designed by integrating the hybrid marine predator algorithm and atom search optimization with an artificial neural network. The network is trained on the Gene Expression Omnibus dataset, and patient data are used as input to predict symptoms. The experimental results demonstrate that the proposed model's accuracy, at approximately 99.70%, is higher than that of state-of-the-art prediction models.
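
The abstract names a marine predator / atom search hybrid optimizer; the sketch below substitutes a plain genetic algorithm for that hybrid and synthetic data for the Gene Expression Omnibus features, so it shows only the general pattern of metaheuristic ANN training: evolve weight vectors by fitness instead of backpropagation.

```python
# Hedged sketch: a genetic algorithm evolving the weights of a tiny ANN.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                    # stand-in for gene features
y = (X[:, 0] + X[:, 1] > 0).astype(float)        # stand-in for SLE labels

def forward(w, X):
    """Unpack a flat weight vector into an 8-10-1 network and run it."""
    W1, b1, W2 = w[:80].reshape(8, 10), w[80:90], w[90:]
    h = np.tanh(X @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2)))           # sigmoid output

def fitness(w):
    return -np.mean((forward(w, X) - y) ** 2)    # negative MSE, higher is better

pop = rng.normal(size=(40, 100))                 # population of weight vectors
for gen in range(100):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]      # selection: keep the top 10
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0, 0.1, (30, 100))
    pop = np.vstack([parents, children])         # elitism + Gaussian mutation

best = pop[np.argmax([fitness(w) for w in pop])]
print("accuracy:", np.mean((forward(best, X) > 0.5) == y))
```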

Comparative Analysis of Machine Learning Techniques for IoT Anomaly Detection Using the NSL-KDD Dataset

  • Good, Zaryn;Farag, Waleed;Wu, Xin-Wen;Ezekiel, Soundararajan;Balega, Maria;May, Franklin;Deak, Alicia
    • International Journal of Computer Science & Network Security / v.23 no.1 / pp.46-52 / 2023
  • With billions of IoT (Internet of Things) devices populating various emerging applications across the world, detecting anomalies on these devices has become incredibly important. Advanced intrusion detection systems (IDS) are trained to detect abnormal network traffic, and machine learning (ML) algorithms are used to create the detection models. In this paper, the NSL-KDD dataset was adopted to comparatively study the performance and efficiency of IoT anomaly detection models. The dataset was developed for various research purposes and is especially useful for anomaly detection. It was used with typical machine learning algorithms, including eXtreme Gradient Boosting (XGBoost), Support Vector Machines (SVM), and Deep Convolutional Neural Networks (DCNN), to identify and classify anomalies present within IoT applications. Each algorithm was assessed on accuracy, precision, recall, and F1 score. Our results show that XGBoost outperformed both SVM and DCNN, achieving the highest accuracy. We also obtained interesting results on the execution time of each algorithm when running anomaly detection: XGBoost was 425.53% faster than SVM and 2,075.49% faster than DCNN. According to our experimental testing, XGBoost is the most accurate and efficient method.
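
As a rough illustration of the comparison protocol, the following sketch times XGBoost against SVM on synthetic 41-feature data (NSL-KDD records have 41 features; loading the real dataset, and the DCNN, are omitted for brevity). It assumes the xgboost and scikit-learn packages are available.

```python
# Hedged sketch: accuracy/precision/recall/F1 plus wall-clock timing per model.
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from sklearn.svm import SVC
from xgboost import XGBClassifier

# Synthetic stand-in with NSL-KDD's feature dimensionality.
X, y = make_classification(n_samples=5000, n_features=41, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, model in [("XGBoost", XGBClassifier(eval_metric="logloss")),
                    ("SVM", SVC())]:
    t0 = time.perf_counter()
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    elapsed = time.perf_counter() - t0
    p, r, f1, _ = precision_recall_fscore_support(y_te, pred, average="binary")
    print(f"{name}: acc={accuracy_score(y_te, pred):.3f} "
          f"P={p:.3f} R={r:.3f} F1={f1:.3f} time={elapsed:.2f}s")
```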

A Study on the Insider Behavior Analysis Using Machine Learning for Detecting Information Leakage (정보 유출 탐지를 위한 머신 러닝 기반 내부자 행위 분석 연구)

  • Kauh, Janghyuk;Lee, Dongho
    • Journal of Korea Society of Digital Industry and Information Management / v.13 no.2 / pp.1-11 / 2017
  • In this paper, we design and implement PADIL (Prediction And Detection of Information Leakage), a system that predicts and detects insider information-leakage behavior by analyzing network traffic and applying a variety of machine learning methods. We defined a five-level information-leakage model (Reconnaissance, Scanning, Access and Escalation, Exfiltration, Obfuscation) by referring to the cyber kill-chain model. To perform machine learning for detecting information leakage, the PADIL system extracts various features by analyzing network traffic, extracts behavioral features by comparing them with personal profile information, and extracts information-leakage-level features. We tested various machine learning methods; the decision tree algorithm showed excellent performance in information-leakage detection, and we showed that performance can be further improved by fine-grained feature selection.
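
A minimal sketch of the detection stage described above: a decision tree classifier over per-session behavioral features. The feature names and labeling rule below are illustrative assumptions, not the PADIL feature set.

```python
# Hedged sketch: decision-tree leak detection on synthetic behavioral features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Assumed columns: scan_rate, bytes_out, off_hours_ratio, dst_entropy.
X = rng.random((1000, 4))
# Toy labeling rule standing in for labeled leak/no-leak sessions.
y = (0.5 * X[:, 0] + 1.5 * X[:, 1] + X[:, 2] > 1.4).astype(int)

clf = DecisionTreeClassifier(max_depth=5, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

clf.fit(X, y)
# Inspect which behavioral features drive the leak/no-leak split,
# mirroring the paper's point about feature selection improving performance.
print("feature importances:", clf.feature_importances_)
```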

Pipeline wall thinning rate prediction model based on machine learning

  • Moon, Seongin;Kim, Kyungmo;Lee, Gyeong-Geun;Yu, Yongkyun;Kim, Dong-Jin
    • Nuclear Engineering and Technology / v.53 no.12 / pp.4060-4066 / 2021
  • Flow-accelerated corrosion (FAC) of carbon steel piping is a significant problem in nuclear power plants. The basic process of FAC is now understood relatively well; however, the accuracy of models that predict the wall-thinning rate in an FAC environment is not yet reliable. Herein, we propose a methodology for constructing pipe wall-thinning rate prediction models using artificial neural networks and a convolutional neural network, confined to a straight pipe without geometric changes. Furthermore, a methodology for generating training data is proposed to train the neural network efficiently for the development of a machine learning-based FAC prediction model. We conclude that machine learning can be used to construct pipe wall-thinning rate prediction models and to optimize the number of training datasets used to train the algorithm. The proposed methodology can be applied to efficiently generate a large dataset from an FAC test and to develop a wall-thinning rate prediction model for a real situation.
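
A hedged sketch of the regression task: a neural network mapping FAC environment variables to a thinning rate, with scikit-learn standing in for the paper's unspecified toolchain and synthetic data replacing the FAC test measurements. The input variables, ranges, and rate formula are assumptions for illustration.

```python
# Hedged sketch: MLP regression of wall-thinning rate from FAC conditions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Assumed columns: temperature [C], pH, flow velocity [m/s], dissolved O2 [ppb].
X = rng.uniform([120, 8.5, 1.0, 1.0], [250, 10.0, 6.0, 10.0], size=(2000, 4))
# Toy thinning-rate relationship standing in for FAC test data.
rate = 0.02 * X[:, 2] / (1 + 0.5 * X[:, 3]) + rng.normal(0, 0.002, 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, rate, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32),
                                   max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", model.score(X_te, y_te))
```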

Selecting Machine Learning Model Based on Natural Language Processing for Shanghanlun Diagnostic System Classification (자연어 처리 기반 『상한론(傷寒論)』 변병진단체계(辨病診斷體系) 분류를 위한 기계학습 모델 선정)

  • Young-Nam Kim
    • 대한상한금궤의학회지 / v.14 no.1 / pp.41-50 / 2022
  • Objective : The purpose of this study is to explore the machine learning model algorithm best suited to Shanghanlun diagnostic system classification using natural language processing (NLP). Methods : A total of 201 data items were collected from 『Shanghanlun』 and 『Clinical Shanghanlun』; 'Taeyangbyeong-gyeolhyung' and 'Eumyangyeokchahunobokbyeong' were excluded to prevent oversampling or undersampling. The data were preprocessed using a Twitter Korean tokenizer and trained with logistic regression, ridge regression, lasso regression, naive Bayes classifier, decision tree, and random forest algorithms, and the accuracy of the models was compared. Results : Ridge regression and the naive Bayes classifier showed an accuracy of 0.843, logistic regression and random forest showed an accuracy of 0.804, the decision tree showed an accuracy of 0.745, and lasso regression showed an accuracy of 0.608. Conclusions : Ridge regression and the naive Bayes classifier are suitable NLP machine learning models for Shanghanlun diagnostic system classification.
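
The model comparison could be reproduced along the following lines with scikit-learn. The study tokenized with a Twitter Korean tokenizer (e.g., KoNLPy's Okt); character n-grams stand in here so the sketch runs without that dependency, and the clauses and labels are placeholders, not the study's 201 items.

```python
# Hedged sketch: TF-IDF features fed to ridge and naive Bayes classifiers.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import RidgeClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Placeholder clinical clauses and diagnosis labels (not the study's data).
texts = ["두통과 오한이 있다", "복통이 심하다",
         "열이 나고 땀이 난다", "속이 메스껍다"] * 25
labels = [0, 1, 0, 1] * 25

for name, clf in [("ridge", RidgeClassifier()), ("naive Bayes", MultinomialNB())]:
    # Character n-grams substitute for morpheme tokenization for portability.
    pipe = make_pipeline(TfidfVectorizer(analyzer="char", ngram_range=(1, 2)),
                         clf)
    print(name, "accuracy:", cross_val_score(pipe, texts, labels, cv=5).mean())
```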


A Feasibility Study on the Improvement of Diagnostic Accuracy for Energy-selective Digital Mammography using Machine Learning (머신러닝을 이용한 에너지 선택적 유방촬영의 진단 정확도 향상에 관한 연구)

  • Eom, Jisoo;Lee, Seungwan;Kim, Burnyoung
    • Journal of radiological science and technology / v.42 no.1 / pp.9-17 / 2019
  • Although digital mammography is a representative method for breast cancer detection, it is limited in detecting and classifying breast tumors because of superimposed structures. Machine learning, a part of the artificial intelligence field, is a method for analyzing large amounts of data using complex algorithms, recognizing patterns, and making predictions. In this study, we proposed a technique to improve the diagnostic accuracy of energy-selective mammography by training a machine learning algorithm on dual-energy measurements. Dual-energy images obtained from a photon-counting detector were used as the input data, and we analyzed the accuracy of the predicted tumor thickness to verify the algorithm. The results showed that the classification accuracy of tumor thickness was above 95% and improved as the amount of input data increased. Therefore, we expect that the diagnostic accuracy of energy-selective mammography can be improved by using machine learning.
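
A hedged sketch of the idea: classify tumor thickness from two photon-counting energy bins. The dual-energy signals are simulated with Beer-Lambert-style attenuation using illustrative, uncalibrated coefficients, and scikit-learn stands in for the paper's unspecified tools.

```python
# Hedged sketch: tumor-thickness classification from dual-energy signals.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
thickness = rng.integers(0, 5, 3000)      # class label = thickness bin (assumed)
mu_lo, mu_hi = 0.8, 0.5                   # illustrative attenuation coefficients

# Simulated low- and high-energy bin intensities with measurement noise.
low = np.exp(-mu_lo * thickness) + rng.normal(0, 0.01, 3000)
high = np.exp(-mu_hi * thickness) + rng.normal(0, 0.01, 3000)

X = np.column_stack([low, high])
X_tr, X_te, y_tr, y_te = train_test_split(X, thickness, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                    random_state=0).fit(X_tr, y_tr)
print("thickness classification accuracy:", clf.score(X_te, y_te))
```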

Android Malware Detection using Machine Learning Techniques KNN-SVM, DBN and GRU

  • Sk Heena Kauser;V. Maria Anu
    • International Journal of Computer Science & Network Security / v.23 no.7 / pp.202-209 / 2023
  • Android malware is on the rise because of the growing popularity of the Android operating system. Machine learning models can classify unknown Android malware using characteristics gathered from the dynamic and static analysis of Android applications. Conventional anti-virus software simply searches a specific program for the signatures of known virus instances while scanning; such products keep these signatures in large databases and examine each file against all existing virus and malware signatures. The proposed model aims to provide a machine learning method that addresses the inability of signature-based detection to recognize new malware apps and thereby improves phone users' security and privacy. The system tracks numerous permission-based characteristics and events collected from Android apps and analyzes them with a classifier model to determine whether the program is goodware or malware. The machine learning techniques KNN-SVM, DBN, and GRU were used, yielding the following accuracies: KNN 87.20%, SVM 91.40%, naive Bayes 85.10%, and DBN-GRU 97.90%. In this paper we employ standard machine learning techniques; in future work, we will attempt to improve these algorithms to develop a better detection algorithm.
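
A minimal sketch of the permission-based classification step with KNN and SVM (the DBN and GRU stages are omitted). The binary permission vectors and the malware-labeling rule below are synthetic stand-ins for features extracted from real APK manifests.

```python
# Hedged sketch: KNN and SVM over binary permission-request vectors.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_apps, n_perms = 2000, 50                     # assumed feature dimension
X = rng.integers(0, 2, (n_apps, n_perms))      # 1 = permission requested
risky = X[:, :5].sum(axis=1)                   # toy "risky permission" count
y = (risky >= 3).astype(int)                   # 1 = malware (synthetic rule)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=5)), ("SVM", SVC())]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", clf.score(X_te, y_te))
```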