• Title/Summary/Keyword: Dynamic Analysis Machine Learning

Search Results: 71

Prediction of dynamic soil properties coupled with machine learning algorithms

  • Dae-Hong Min;Hyung-Koo Yoon
    • Geomechanics and Engineering
    • /
    • v.37 no.3
    • /
    • pp.253-262
    • /
    • 2024
  • Dynamic properties are pivotal in soil analysis, yet their experimental determination is hampered by complex methodologies and the need for costly equipment. This study predicts dynamic soil properties from static properties that are relatively easier to obtain, using machine learning techniques. The static properties considered are soil cohesion, friction angle, water content, specific gravity, and compressional strength; the dynamic properties of interest are the compressional- and shear-wave velocities. Data are sourced from 26 boreholes in a geotechnical investigation report database, comprising 130 data points in total. An importance analysis based on the random forest algorithm is conducted to evaluate the significance of each static property, and the prediction of dynamic properties prioritizes the static properties identified as most influential. Prediction quality is quantified with the coefficient of determination, which reaches an exceptionally high 0.99 in both the training and testing phases when all input properties are considered. The outcomes are also compared with the conventional method of estimating dynamic properties from the Standard Penetration Test (SPT); the error ratio decreases by approximately 0.95, validating the reliability of the proposed technique. This research marks a significant advancement in the indirect estimation of the relationship between static and dynamic soil properties through machine learning.
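The workflow this abstract describes can be sketched as follows: rank the static soil properties with a random-forest importance analysis, then predict a dynamic property from them and score the fit with the coefficient of determination. Everything here is a hypothetical illustration — the data are synthetic, and only the feature names and the 130-point count are taken from the abstract.

```python
# Sketch: random-forest importance analysis over static soil properties,
# then prediction of shear-wave velocity (synthetic stand-in data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
features = ["cohesion", "friction_angle", "water_content",
            "specific_gravity", "compressional_strength"]
X = rng.normal(size=(130, len(features)))          # 130 points, as in the study
# Synthetic target: friction angle dominates, compressional strength contributes.
vs = 2.0 * X[:, 1] + 0.5 * X[:, 4] + rng.normal(scale=0.1, size=130)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, vs)
importance = dict(zip(features, model.feature_importances_))
ranked = sorted(importance, key=importance.get, reverse=True)  # most influential first
r2 = r2_score(vs, model.predict(X))                # coefficient of determination
```

The ranked list plays the role of the paper's importance analysis: prediction would then be repeated with only the top-ranked static properties as inputs.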

Dynamic Asset Allocation by Applying Regime Detection Analysis (Regime 탐지 분석을 이용한 동적 자산 배분 기법)

  • Kim, Woo Chang
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.38 no.4
    • /
    • pp.258-261
    • /
    • 2012
  • In this paper, I propose a new asset allocation framework to cope with the dynamic nature of the financial market. Investment performance can be much improved by protecting capital from market crashes, and such crashes can be pre-identified with high probability by regime detection analysis using a specialized unsupervised machine learning technique.
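A minimal sketch of the regime-detection idea, assuming a Gaussian mixture as the unsupervised model (the paper's actual technique is not specified in this abstract): fit two components to daily returns and treat the higher-variance component as the crash/turbulent regime.

```python
# Sketch: unsupervised regime detection on synthetic daily returns.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
calm  = rng.normal(0.0005, 0.01, 750)    # calm market returns
crash = rng.normal(-0.002, 0.05, 250)    # turbulent market returns
returns = np.concatenate([calm, crash]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=1).fit(returns)
high_vol = int(np.argmax(gmm.covariances_.ravel()))  # crash regime index
regime = gmm.predict(returns)                        # per-day regime label
crash_share = float((regime[-250:] == high_vol).mean())
```

An allocation rule could then shift capital out of risky assets whenever the current day is labeled with the high-volatility regime.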

Machine Learning based Seismic Response Prediction Methods for Steel Frame Structures (기계학습 기반 강 구조물 지진응답 예측기법)

  • Lee, Seunghye;Lee, Jaehong
    • Journal of Korean Association for Spatial Structures
    • /
    • v.24 no.2
    • /
    • pp.91-99
    • /
    • 2024
  • In this paper, machine learning models were applied to predict the seismic response of steel frame structures. Both geometric and material nonlinearities were considered in the structural analysis, and nonlinear inelastic dynamic analysis was performed. The ground acceleration response of the El Centro earthquake was applied to obtain the displacement of the top floor, which was used as the dataset for the machine learning methods. Learning was performed using two methods: Decision Tree and Random Forest, and their efficiency was demonstrated through application to 2-story and 6-story 3-D steel frame structure examples.
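The comparison in this abstract can be illustrated with a small sketch: train a Decision Tree and a Random Forest to map ground-motion features to peak top-floor displacement. The inputs and target below are synthetic stand-ins, not results of the paper's nonlinear inelastic dynamic analysis.

```python
# Sketch: Decision Tree vs. Random Forest regression of top-floor displacement.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
X = rng.uniform(size=(300, 3))          # hypothetical: PGA, duration, stiffness
y = 5 * X[:, 0] - 2 * X[:, 2] + rng.normal(scale=0.2, size=300)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=2)

tree = DecisionTreeRegressor(random_state=2).fit(X_tr, y_tr)
forest = RandomForestRegressor(random_state=2).fit(X_tr, y_tr)
scores = {"tree": r2_score(y_te, tree.predict(X_te)),
          "forest": r2_score(y_te, forest.predict(X_te))}
```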

A Study on Variant Malware Detection Techniques Using Static and Dynamic Features

  • Kang, Jinsu;Won, Yoojae
    • Journal of Information Processing Systems
    • /
    • v.16 no.4
    • /
    • pp.882-895
    • /
    • 2020
  • The amount of malware increases exponentially every day and poses a threat to networks and operating systems. Most new malware is a variant of existing malware, and such variants are difficult to handle because they bypass existing signature-based malware detection. Thus, research on automated methods of detecting and processing variant malware has been conducted continuously. This paper proposes a method of extracting feature data from files and detecting malware using machine learning. Feature data were extracted from 7,000 malware and 3,000 benign files using static and dynamic malware analysis tools. A malware classification model was constructed using a multi-layer DNN, XGBoost, and Random Forest, and its performance was analyzed. The proposed method achieved up to 96.3% accuracy.
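The classification step can be sketched as follows: concatenate per-file static-analysis and dynamic-analysis feature vectors and train one of the classifiers named in the abstract (Random Forest here). The features and labels are synthetic placeholders; the real study used 7,000 malware and 3,000 benign samples.

```python
# Sketch: classifying files from combined static + dynamic feature vectors.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
static_feats  = rng.normal(size=(1000, 8))   # hypothetical header/section stats
dynamic_feats = rng.normal(size=(1000, 8))   # hypothetical runtime behavior stats
y = (static_feats[:, 0] + dynamic_feats[:, 0] > 0).astype(int)  # toy labels
X = np.hstack([static_feats, dynamic_feats])                    # one vector per file

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=3)
clf = RandomForestClassifier(n_estimators=100, random_state=3).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```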

Android Malware Detection using Machine Learning Techniques KNN-SVM, DBN and GRU

  • Sk Heena Kauser;V.Maria Anu
    • International Journal of Computer Science & Network Security
    • /
    • v.23 no.7
    • /
    • pp.202-209
    • /
    • 2023
  • Android malware is now on the rise because of the growing interest in the Android operating system. Machine learning models can classify unknown Android malware using characteristics gathered from dynamic and static analysis of Android applications. Signature-based anti-virus software, by contrast, detects malware by scanning a program for signs of known virus instances, which it keeps in large databases and checks against every file. The proposed model aims to provide a machine learning based detection method that addresses the inability of existing Android malware detection to identify malicious apps, and to improve phone users' security and privacy. The system collects numerous permission-based characteristics and events from Android apps and analyses them with a classifier model to determine whether a program is goodware or malware. The machine learning techniques KNN, SVM, DBN, and GRU were applied and their accuracies compared: KNN achieves 87.20%, SVM 91.40%, Naive Bayes 85.10%, and DBN-GRU 97.90%. This paper employs standard machine learning techniques; in future work, we will attempt to improve these algorithms to develop a better detection method.
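Two of the classifiers the abstract compares, KNN and SVM, can be sketched on permission-based features. The permission vectors and the "dangerous combination" labeling rule below are invented for illustration; the reported accuracies come from the paper's own dataset.

```python
# Sketch: KNN and SVM on binary Android-permission feature vectors.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
X = rng.integers(0, 2, size=(600, 8))       # 8 hypothetical permission flags
y = (X[:, 0] & X[:, 1]).astype(int)         # toy rule: two flags together => malware

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=4)
results = {
    "knn": KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr).score(X_te, y_te),
    "svm": SVC(kernel="rbf").fit(X_tr, y_tr).score(X_te, y_te),
}
```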

Machine Learning Based Malware Detection Using API Call Time Interval (API Call Time Interval을 활용한 머신러닝 기반의 악성코드 탐지)

  • Cho, Young Min;Kwon, Hun Yeong
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.30 no.1
    • /
    • pp.51-58
    • /
    • 2020
  • Malware continues to be used in cyber threats and will remain a major attack method even as IT technology advances. Therefore, research on detecting such malicious code is continually attempted in various ways. Recently, with the development of AI-related technology, many machine learning studies have been conducted to detect malware. In this paper, we propose a method to detect malware using machine learning. For detection, we build features from the intervals between successive API calls (i.e., the Time Interval) observed in dynamic analysis data, and then apply the result to machine learning techniques.
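The feature step this abstract describes can be sketched as: take the timestamps of API calls from a dynamic-analysis trace, compute the time intervals between consecutive calls, and summarize them into a fixed-length vector for a downstream classifier. The trace below is invented, and the four summary statistics are an assumption, not the paper's exact feature set.

```python
# Sketch: turning API-call timestamps into interval-based features.
import numpy as np

def interval_features(timestamps):
    """Summary statistics of the time intervals between consecutive API calls."""
    t = np.sort(np.asarray(timestamps, dtype=float))
    gaps = np.diff(t)                              # inter-call intervals
    return np.array([gaps.mean(), gaps.std(), gaps.min(), gaps.max()])

trace = [0.00, 0.01, 0.03, 0.50, 0.51, 2.00]       # seconds of each API call
feats = interval_features(trace)                   # one feature vector per trace
```

A classifier would then be trained on one such vector per executed sample.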

Machine Learning Based Architecture and Urban Data Analysis - Construction of Floating Population Model Using Deep Learning - (머신러닝을 통한 건축 도시 데이터 분석의 기초적 연구 - 딥러닝을 이용한 유동인구 모델 구축 -)

  • Shin, Dong-Youn
    • Journal of KIBIM
    • /
    • v.9 no.1
    • /
    • pp.22-31
    • /
    • 2019
  • In this paper, we construct a prototype model for predicting city data using time-series data of the floating population, and use machine learning to analyze urban data with complex structure. A prediction model was constructed using three of the ten available series (total floating population, male floating population, and Monday floating population), its output was compared with the actual data, and its accuracy was evaluated. The results show that the floating population model captures the correlation between the predicted floating population and the current state of commerce. This is expected to support efficient and objective design in the planning stages of architecture, landscape, and urban projects, such as tree environment design and the layout of trails. Dynamic population prediction using multivariate time-series data and collected location data is also expected to enable integrated simulation with time-series data from various fields.
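The time-series setup can be sketched by turning a daily floating-population series into lagged feature vectors and fitting a small neural network, standing in for the paper's deep learning model. The weekly-pattern series below is synthetic, not the study's pedestrian data.

```python
# Sketch: lagged features + a small neural network for floating-population prediction.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
days = np.arange(365)
series = 1000 + 200 * np.sin(2 * np.pi * days / 7) + rng.normal(0, 10, 365)

lag = 7                                            # one week of history per sample
X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
y = series[lag:]

mu, sd = X.mean(), X.std()                         # standardize inputs for the MLP
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000,
                     random_state=5).fit((X - mu) / sd, y)
next_day = model.predict(((series[-lag:] - mu) / sd).reshape(1, -1))
```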

Android Botnet Detection Using Hybrid Analysis

  • Mamoona Arhsad;Ahmad Karim
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.18 no.3
    • /
    • pp.704-719
    • /
    • 2024
  • Botnets are becoming more prevalent with the growing use of mobile phone technologies, which provide a wide range of applications including entertainment, commerce, education, and finance. A botnet is a collection of compromised devices managed by a botmaster that communicate via a command server to initiate attacks such as phishing email, ad-click fraud, blockchain, and much more. As the number of botnet attacks rises, detecting harmful activity on handheld devices becomes more challenging. It is therefore crucial to evaluate mobile botnet attacks to find the security vulnerabilities that arise through coordinated command servers and cause major financial and ethical harm. For this purpose, we propose a hybrid analysis approach that integrates permissions and API calls, and we experiment with machine-learning classifiers to detect mobile botnet applications. The experiment employed benign, botnet, and malware applications to validate the performance and accuracy of the classifiers. The results show that a classifier based on a simple decision tree obtained 99% accuracy with a low false-positive rate of 0.003, outperforming the other machine learning classifiers for botnet application detection. Overall, the hybrid approach improves the accuracy of mobile botnet detection compared with static or dynamic features taken separately.
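The hybrid approach can be sketched as: concatenate permission features (static) with API-call features (dynamic), train the simple decision tree the abstract reports as best, and measure the false-positive rate alongside accuracy. The data and the labeling rule below are synthetic placeholders.

```python
# Sketch: hybrid (permissions + API) features with a decision-tree classifier.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(6)
perm = rng.integers(0, 2, size=(800, 10))   # static features: permission flags
api  = rng.poisson(3.0, size=(800, 6))      # dynamic features: API-call counts
y = ((perm[:, 0] == 1) & (api[:, 0] > 3)).astype(int)   # toy botnet rule
X = np.hstack([perm, api])                  # hybrid feature vector per app

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=6)
clf = DecisionTreeClassifier(random_state=6).fit(X_tr, y_tr)
tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
fpr = fp / (fp + tn)                        # false-positive rate, as reported
```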

Selection of Machine Learning Techniques for Network Lifetime Parameters and Synchronization Issues in Wireless Networks

  • Srilakshmi, Nimmagadda;Sangaiah, Arun Kumar
    • Journal of Information Processing Systems
    • /
    • v.15 no.4
    • /
    • pp.833-852
    • /
    • 2019
  • In real-time applications, thanks to their low cost and small size, wireless networks play an important role in collecting data and transmitting it to a base station for analysis, a process that can be deployed easily. Due to various internal and external factors, networks can change dynamically, which impacts node localisation, delays, routing mechanisms, geographical coverage, cross-layer design, link quality, fault detection, and quality of service, among others. Conventional methods were programmed for static networks, which made it difficult for networks to respond dynamically. Here, machine learning strategies can be applied to dynamic networks, enabling self-learning and tools that react quickly and efficiently with less human intervention and reprogramming. In this paper, we present a survey of wireless networks based on different machine learning algorithms and network lifetime parameters, including the advantages and drawbacks of such systems. Furthermore, we present learning algorithms and techniques for congestion, synchronisation, energy harvesting, and scheduling mobile sinks. Finally, we present a statistical evaluation of the survey, the motivation for choosing specific techniques to address wireless network problems, and a brief discussion of the challenges inherent in this area of research.

A supervised-learning-based spatial performance prediction framework for heterogeneous communication networks

  • Mukherjee, Shubhabrata;Choi, Taesang;Islam, Md Tajul;Choi, Baek-Young;Beard, Cory;Won, Seuck Ho;Song, Sejun
    • ETRI Journal
    • /
    • v.42 no.5
    • /
    • pp.686-699
    • /
    • 2020
  • In this paper, we propose a supervised-learning-based spatial performance prediction (SLPP) framework for next-generation heterogeneous communication networks (HCNs). Adaptive asset placement, dynamic resource allocation, and load balancing are critical network functions in an HCN to ensure seamless network management and enhance service quality. Although many existing systems use measurement data to react to network performance changes, it is highly beneficial to perform accurate performance prediction for different systems to support various network functions. Recent advancements in complex statistical algorithms and computational efficiency have made machine-learning ubiquitous for accurate data-based prediction. A robust network performance prediction framework for optimizing performance and resource utilization through a linear discriminant analysis-based prediction approach has been proposed in this paper. Comparison results with different machine-learning techniques on real-world data demonstrate that SLPP provides superior accuracy and computational efficiency for both stationary and mobile user conditions.
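The linear-discriminant-analysis core of the framework can be sketched as classifying the performance class of a location from radio measurements. The measurement features (RSRP, SINR) and the two classes below are assumptions for illustration; the paper evaluated its framework on real-world data.

```python
# Sketch: LDA-based prediction of spatial network-performance class.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
good = rng.normal([-70, 20], [4, 3], size=(300, 2))   # RSRP (dBm), SINR (dB)
bad  = rng.normal([-100, 2], [4, 3], size=(300, 2))
X = np.vstack([good, bad])
y = np.array([1] * 300 + [0] * 300)                   # 1 = good coverage

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=7)
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
accuracy = lda.score(X_te, y_te)                      # per-location class accuracy
```

Predicted classes over a grid of locations could then drive the asset-placement and load-balancing functions the abstract mentions.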