• Title/Summary/Keyword: Machine Learning and Artificial Intelligence

Search Results: 747

Development of a Prediction Model and Correlation Analysis of Weather-induced Flight Delay at Jeju International Airport Using Machine Learning Techniques (머신러닝(Machine Learning) 기법을 활용한 제주국제공항의 운항 지연과의 상관관계 분석 및 지연 여부 예측모형 개발 - 기상을 중심으로 -)

  • Lee, Choongsub;Paing, Zin Min;Yeo, Hyemin;Kim, Dongsin;Baik, Hojong
    • Journal of the Korean Society for Aviation and Aeronautics, v.29 no.4, pp.1-20, 2021
  • Due to the recent rapid increase in passenger and cargo air transport demand, the capacity of Jeju International Airport has been approaching its limit. Even during the COVID-19 crisis, which began in November 2019, Jeju International Airport has continued to face strong demand for air passenger and cargo transportation. However, it is undeniable that delays at Jeju International Airport have also increased. In this study, we analyze the correlation between weather and delayed departure operations based on data collected from historical airline operation records and the aviation weather statistics of Jeju International Airport. Adopting machine learning techniques, we then analyze the weather conditions at Jeju International Airport and construct a delay prediction model. The model presented in this study is expected to be useful for predicting aircraft departure delays and to contribute to enhancing aircraft operation efficiency and punctuality at Jeju International Airport.
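The weather-delay correlation step described above can be illustrated with a plain Pearson correlation coefficient. This is a minimal pure-Python sketch; the wind-speed and delay-count figures are hypothetical stand-ins, not data from the study.

```python
from math import sqrt

def pearson_corr(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical daily records: peak wind speed (kt) and departure delay counts.
wind = [10, 14, 18, 25, 30, 35]
delay = [2, 3, 5, 9, 12, 15]

r = pearson_corr(wind, delay)
print(round(r, 3))
```

A strong positive coefficient would motivate including that weather variable as a feature in the delay prediction model.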

A Study on Total Production Time Prediction Using Machine Learning Techniques (머신러닝 기법을 이용한 총생산시간 예측 연구)

  • Eun-Jae Nam;Kwang-Soo Kim
    • Journal of the Korea Safety Management & Science, v.25 no.2, pp.159-165, 2023
  • Across industries, the use of big data analysis based on artificial intelligence technology has been increasing with the Fourth Industrial Revolution. The value of big data is growing, and the same is true in production technology. However, small and medium-sized manufacturers often find it difficult to put such data to work due to limited data-management capability, which makes it hard for them to adopt smart factories. Therefore, to help small and medium-sized manufacturing companies use big data, we predict total production time through machine learning. In previous studies, machine learning was applied to time and quantity factors for production, and the excellence of the ExtraTree algorithm was confirmed in predicting total production time. In this study, worker proficiency factors were added to the time and quantity factors required for production, and the LightGBM algorithm showed the highest prediction accuracy. The results of this study are expected to help enhance corporate competitiveness by identifying the potential for data utilization in MES systems and supporting systematic production schedule management.
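The study itself trains gradient-boosting models (ExtraTree, LightGBM) on time, quantity, and proficiency features. As a minimal illustration of how a worker-proficiency factor can enter a total-time estimate, here is a naive pure-Python baseline; the function name and all figures are hypothetical.

```python
def predict_total_time(quantity, unit_time_min, proficiency):
    """Naive baseline: total production time grows with quantity and
    per-unit processing time, and shrinks as worker proficiency
    (a factor in (0, 1]) grows."""
    return quantity * unit_time_min / proficiency

# Hypothetical work orders: (quantity, minutes per unit, proficiency).
orders = [(100, 2.0, 1.0), (100, 2.0, 0.8), (50, 3.0, 0.5)]
for q, t, p in orders:
    print(predict_total_time(q, t, p))
```

A learned model such as LightGBM would replace this fixed formula with a data-driven mapping over the same three kinds of features.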

A Novel Approach to COVID-19 Diagnosis Based on Mel Spectrogram Features and Artificial Intelligence Techniques

  • Alfaidi, Aseel;Alshahrani, Abdullah;Aljohani, Maha
    • International Journal of Computer Science & Network Security, v.22 no.9, pp.195-207, 2022
  • COVID-19 has remained one of the most serious health crises in recent history, resulting in the tragic loss of lives and significant economic impacts across the world. The difficulty of controlling COVID-19 poses a threat to the global health sector. Considering that Artificial Intelligence (AI) has contributed to improving research methods and solving problems in diverse fields of study, AI algorithms have also proven effective in disease detection and early diagnosis. In particular, acoustic features offer a promising prospect for the early detection of respiratory diseases. Motivated by these observations, this study conceptualized a speech-based diagnostic model to aid in COVID-19 diagnosis. The proposed methodology uses speech signals from confirmed positive and negative cases of COVID-19 to extract features through the pre-trained Visual Geometry Group (VGG-16) model applied to Mel spectrogram images. This is combined with the K-means algorithm, which determines effective features, followed by a Genetic Algorithm-Support Vector Machine (GA-SVM) classifier to classify cases. The experimental findings demonstrate the proposed methodology's capability to classify COVID-19 and non-COVID-19 cases across speakers of varying ages and different languages. The proposed methodology relies on deep features followed by a dimensionality reduction step, and as a result produces better and more consistent performance than the handcrafted features used in previous studies.
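One step of the pipeline above is using K-means to determine effective features. The sketch below implements a bare-bones 1-D Lloyd's algorithm on hypothetical per-feature importance scores; it illustrates only the clustering step, not the paper's VGG-16 feature extraction.

```python
def kmeans_1d(values, k, iters=20):
    """Minimal Lloyd's algorithm on scalars: returns the final centroids
    and the cluster index assigned to each value."""
    vs = sorted(values)
    step = max(1, len(vs) // k)
    centroids = vs[::step][:k]          # spread initial centroids out
    assign = [0] * len(values)
    for _ in range(iters):
        # Assignment step: nearest centroid for each value.
        assign = [min(range(k), key=lambda j: abs(v - centroids[j]))
                  for v in values]
        # Update step: centroid = mean of its members.
        for j in range(k):
            members = [v for v, a in zip(values, assign) if a == j]
            if members:
                centroids[j] = sum(members) / len(members)
    return centroids, assign

# Hypothetical per-feature importance scores, clustered into weak/strong.
scores = [0.1, 0.15, 0.12, 0.9, 0.85, 0.95]
centroids, assign = kmeans_1d(scores, 2)
print([round(c, 3) for c in centroids], assign)
```

In the paper's setting, the clustered objects would be high-dimensional deep features rather than scalars, but the assignment/update loop is the same.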

A Container Orchestration System for Process Workloads

  • Jong-Sub Lee;Seok-Jae Moon
    • International Journal of Internet, Broadcasting and Communication, v.15 no.4, pp.270-278, 2023
  • We propose a container orchestration system for process workloads that combines big data and machine learning technologies to integrate enterprise process-centric workloads. The proposed system analyzes big data generated from industrial automation to identify hidden patterns and build a machine learning prediction model. For each machine learning case, training data is loaded into a data store and preprocessed for model training. Next, the training data is used to select and apply an appropriate model, which is then evaluated on held-out test data; this step, called model construction, can be performed in a deployment framework. Additionally, a visual hierarchy is constructed to display prediction results and facilitate big data analysis. To implement parallel computation of PCA in the proposed system, multiple virtual machines were created in a big data cluster to build the compute cluster required for evaluation and analysis. The proposed system is modeled as layers of individual components that can be connected together. The advantage of this design is that components can be added, replaced, or reused without affecting the rest of the system.
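The serial core of the PCA computation that the system parallelizes can be sketched as power iteration on a covariance matrix. A pure-Python version for 2-D points, with hypothetical data:

```python
def first_pc(data, iters=200):
    """First principal component via power iteration on the covariance
    matrix of mean-centered 2-D points (pure-Python sketch)."""
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    pts = [(x - mx, y - my) for x, y in data]
    # Entries of the 2x2 covariance matrix.
    cxx = sum(x * x for x, _ in pts) / n
    cyy = sum(y * y for _, y in pts) / n
    cxy = sum(x * y for x, y in pts) / n
    v = (1.0, 0.0)
    for _ in range(iters):
        # Multiply by the covariance matrix, then renormalize.
        w = (cxx * v[0] + cxy * v[1], cxy * v[0] + cyy * v[1])
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = (w[0] / norm, w[1] / norm)
    return v

# Points spread along the line y = x: the first PC should be ~(0.71, 0.71).
pc = first_pc([(0, 0), (1, 1.1), (2, 1.9), (3, 3.05)])
print(round(pc[0], 2), round(pc[1], 2))
```

The parallel version in the paper distributes the covariance accumulation across cluster nodes; the per-iteration matrix-vector step is unchanged.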

Investigations on Dynamic Trading Strategy Utilizing Stochastic Optimal Control and Machine Learning (확률론적 최적제어와 기계학습을 이용한 동적 트레이딩 전략에 관한 고찰)

  • Park, Jooyoung;Yang, Dongsu;Park, Kyungwook
    • Journal of the Korean Institute of Intelligent Systems, v.23 no.4, pp.348-353, 2013
  • Recently, control theory including stochastic optimal control and various machine-learning-based artificial intelligence methods have become major tools in the field of financial engineering. In this paper, we briefly review some recent papers utilizing stochastic optimal control theory in the fields of the pair trading for mean-reverting markets and the trend-following strategy, and consider a couple of strategies utilizing both stochastic optimal control theory and machine learning methods to acquire more flexible and accessible tools. Illustrative simulations show that the considered strategies can yield encouraging results when applied to a set of real financial market data.
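A toy stand-in for threshold trading on a mean-reverting spread is a rolling z-score rule. The entry thresholds here are fixed by hand rather than derived from stochastic optimal control, and the spread series is hypothetical.

```python
def zscore_signals(spread, window=5, entry=1.0):
    """Rolling z-score of a pair spread; +1 = long the spread,
    -1 = short, 0 = flat. A toy stand-in for threshold rules on a
    mean-reverting (Ornstein-Uhlenbeck-like) spread."""
    signals = []
    for i in range(len(spread)):
        if i + 1 < window:
            signals.append(0)       # not enough history yet
            continue
        win = spread[i + 1 - window : i + 1]
        mean = sum(win) / window
        var = sum((s - mean) ** 2 for s in win) / window
        sd = var ** 0.5
        z = 0.0 if sd == 0 else (spread[i] - mean) / sd
        # Bet on reversion: short when far above the mean, long when far below.
        signals.append(-1 if z > entry else (1 if z < -entry else 0))
    return signals

# Hypothetical spread series that overshoots upward, then downward.
spread = [0.0, 0.1, -0.1, 0.0, 0.05, 2.0, 0.0, -2.0]
print(zscore_signals(spread))
```

In the reviewed papers, the entry/exit levels would come from solving the optimal control problem (or from a learned policy) instead of a fixed `entry` constant.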

Path Loss Prediction Using an Ensemble Learning Approach

  • Beom Kwon;Eonsu Noh
    • Journal of the Korea Society of Computer and Information, v.29 no.2, pp.1-12, 2024
  • Predicting path loss is one of the important factors in wireless network design, such as selecting the installation locations of base stations in cellular networks. In the past, path loss values were measured through numerous field tests to determine the optimal installation location of a base station, which has the disadvantage of taking a great deal of time. To solve this problem, in this study we propose a path loss prediction method based on machine learning (ML). In particular, an ensemble learning approach is applied to improve path loss prediction performance. Bootstrap datasets were used to obtain models with different hyperparameter configurations, and the final model was built by ensembling these models. We evaluated and compared the performance of the proposed ensemble-based path loss prediction method with various ML-based methods on publicly available path loss datasets. The experimental results show that the proposed method outperforms the existing methods and can predict path loss values accurately.
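The bootstrap-and-ensemble idea can be sketched with bagged linear fits: each model is trained on a resampled dataset and the predictions are averaged. The log-distance and path-loss figures below are hypothetical, and the linear model is a simplification of the paper's ML regressors.

```python
import random

def fit_line(xs, ys):
    """Least-squares slope and intercept for y ~ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

def bagged_predict(xs, ys, x_new, n_models=25, seed=0):
    """Bootstrap-aggregated prediction: fit one line per resampled
    dataset and average the individual predictions (a bagging sketch)."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_models):
        idx = [rng.randrange(len(xs)) for _ in xs]   # sample with replacement
        a, b = fit_line([xs[i] for i in idx], [ys[i] for i in idx])
        preds.append(a * x_new + b)
    return sum(preds) / n_models

# Hypothetical log-distance (x) vs. path loss in dB (y) measurements.
dist = [1.0, 1.3, 1.7, 2.0, 2.3, 2.7, 3.0]
loss = [60.2, 66.1, 73.9, 80.3, 85.8, 94.1, 99.7]
print(round(bagged_predict(dist, loss, 2.5), 1))
```

Averaging over resampled fits reduces the variance of any single model, which is the same rationale the paper applies with stronger base learners.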

Image Enhanced Machine Vision System for Smart Factory

  • Kim, ByungJoo
    • International Journal of Internet, Broadcasting and Communication, v.13 no.2, pp.7-13, 2021
  • Machine vision is a technology that enables a computer to recognize and evaluate objects as a person would. In recent years, as advanced technologies such as optical systems, artificial intelligence, and big data have been incorporated, conventional machine vision systems have achieved more accurate quality inspection and increased manufacturing efficiency. In machine vision systems using deep learning, the quality of the input image is very important. However, most images obtained in the industrial field for quality inspection typically contain noise, and this noise is a major factor limiting the performance of the machine vision system. Therefore, to improve performance, it is necessary to eliminate noise from the image, and much research has been done on image denoising. In this paper, we propose an autoencoder-based machine vision system to eliminate noise in images. In experiments, the proposed model showed better denoising and image reconstruction performance than the basic autoencoder model on the MNIST and Fashion-MNIST datasets.
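As a bare-bones, linear stand-in for the denoising-autoencoder idea, the sketch below trains a 2x2 linear map by gradient descent to reconstruct clean vectors from noisy ones. The data are synthetic and the model is far simpler than the paper's network.

```python
import random

def train_denoiser(clean, noisy, lr=0.1, epochs=300):
    """Train a 2x2 linear map W to reconstruct clean vectors from noisy
    ones by plain gradient descent on squared error: a linear stand-in
    for a denoising autoencoder."""
    W = [[1.0, 0.0], [0.0, 1.0]]              # start at the identity map
    for _ in range(epochs):
        for (cx, cy), (nx, ny) in zip(clean, noisy):
            rx = W[0][0] * nx + W[0][1] * ny  # reconstruction
            ry = W[1][0] * nx + W[1][1] * ny
            ex, ey = rx - cx, ry - cy         # reconstruction error
            W[0][0] -= lr * ex * nx; W[0][1] -= lr * ex * ny
            W[1][0] -= lr * ey * nx; W[1][1] -= lr * ey * ny
    return W

def mse(W, clean, noisy):
    """Mean squared reconstruction error of map W on (noisy -> clean)."""
    total = 0.0
    for (cx, cy), (nx, ny) in zip(clean, noisy):
        rx = W[0][0] * nx + W[0][1] * ny
        ry = W[1][0] * nx + W[1][1] * ny
        total += (rx - cx) ** 2 + (ry - cy) ** 2
    return total / len(clean)

rng = random.Random(42)
clean = [(x, x) for x in [0.1 * i for i in range(10)]]  # points on y = x
noisy = [(x + rng.gauss(0, 0.1), y + rng.gauss(0, 0.1)) for x, y in clean]

W = train_denoiser(clean, noisy)
print(mse(W, clean, noisy) < mse([[1.0, 0.0], [0.0, 1.0]], clean, noisy))
```

Because the clean data lie on a line, the learned map can project noise away, exactly the structure-exploiting behavior a real autoencoder learns for images.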

Fake News Detection for Korean News Using Text Mining and Machine Learning Techniques (텍스트 마이닝과 기계 학습을 이용한 국내 가짜뉴스 예측)

  • Yun, Tae-Uk;Ahn, Hyunchul
    • Journal of Information Technology Applications and Management, v.25 no.1, pp.19-32, 2018
  • Fake news is defined as news articles that are intentionally and verifiably false and could mislead readers. The spread of fake news may provoke anxiety, chaos, fear, or irrational decisions among the public. Thus, detecting fake news and preventing its spread has become a very important issue in our society. However, due to the huge amount of fake news produced every day, it is almost impossible to identify it manually. In this context, researchers have tried to develop automated fake news detection methods using artificial intelligence techniques over the past years. Unfortunately, no prior studies have proposed an automated fake news detection method for Korean news. In this study, we aim to detect Korean fake news using text mining and machine learning techniques. Our proposed method consists of two steps. In the first step, the news content to be analyzed is converted into quantified values using various text mining techniques (topic modeling, TF-IDF, and so on). In the second step, classifiers are trained on the values produced in step 1; machine learning techniques such as multiple discriminant analysis, case-based reasoning, artificial neural networks, and support vector machines can be applied. To validate the effectiveness of the proposed method, we collected 200 Korean news articles from Seoul National University's FactCheck (http://factcheck.snu.ac.kr), which provides detailed analysis reports from about 20 media outlets and links to source documents for each case. Using this dataset, we identify which text features are important and which classifiers are effective in detecting Korean fake news.
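Step 1's TF-IDF conversion can be sketched in a few lines of pure Python; the toy English tokens below merely stand in for tokenized Korean articles.

```python
from math import log

def tf_idf(docs):
    """TF-IDF weights for a tokenized corpus: tf = term count / doc
    length, idf = log(N / document frequency)."""
    n = len(docs)
    df = {}
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    vectors = []
    for doc in docs:
        weights = {}
        for term in doc:
            tf = doc.count(term) / len(doc)
            weights[term] = tf * log(n / df[term])
        vectors.append(weights)
    return vectors

# Toy tokenized headlines standing in for Korean news articles.
docs = [["election", "fraud", "claim"],
        ["election", "results", "official"],
        ["weather", "forecast", "claim"]]
vecs = tf_idf(docs)
print(round(vecs[0]["fraud"], 3))      # rare term: higher weight
print(round(vecs[0]["election"], 3))   # common term: lower weight
```

These weight vectors are the quantified values that step 2's classifiers (discriminant analysis, SVM, and so on) would be trained on.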

Human Factor & Artificial Intelligence: For future software security to be invincible, a confronting comprehensive survey

  • Al-Amri, Bayan O;Alsuwat, Hatim;Alsuwat, Emad
    • International Journal of Computer Science & Network Security, v.21 no.6, pp.245-251, 2021
  • This work focuses on the current features and characteristics of the human element and Artificial Intelligence (AI), and asks questions about future information security: can we avoid human errors by improving machine learning and AI, should we invest more in human knowledge, or should we combine both in the best way possible? This work reviews several research results on human behavior toward information security, covering elements and factors such as knowledge and attitude and how much they are invested in information security awareness (ISA). It then presents some of the latest studies on AI and its contributions to further improvement, making the field more securely advanced. We aim to open a new line of thinking in the cybersecurity field, and we hope that our suggestions for utilizing the strengths of both human contributions to software security and a well-built AI will lead to better future software security.

Some Observations for Portfolio Management Applications of Modern Machine Learning Methods

  • Park, Jooyoung;Heo, Seongman;Kim, Taehwan;Park, Jeongho;Kim, Jaein;Park, Kyungwook
    • International Journal of Fuzzy Logic and Intelligent Systems, v.16 no.1, pp.44-51, 2016
  • Recently, artificial intelligence has reached the level of top information technologies that will have significant influence over many aspects of our future lifestyles. In particular, in the fields of machine learning technologies for classification and decision-making, there have been a lot of research efforts for solving estimation and control problems that appear in the various kinds of portfolio management problems via data-driven approaches. Note that these modern data-driven approaches, which try to find solutions to the problems based on relevant empirical data rather than mathematical analyses, are useful particularly in practical application domains. In this paper, we consider some applications of modern data-driven machine learning methods for portfolio management problems. More precisely, we apply a simplified version of the sparse Gaussian process (GP) classification method for classifying users' sensitivity with respect to financial risk, and then present two portfolio management issues in which the GP application results can be useful. Experimental results show that the GP applications work well in handling simulated data sets.
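The Gaussian process posterior mean underlying such GP-based prediction steps can be sketched directly from f* = k*^T (K + noise*I)^{-1} y with an RBF kernel. The small solver below uses Gaussian elimination, and the "risk score" data are hypothetical; the paper's sparse GP classification is a more elaborate variant of this computation.

```python
from math import exp

def gp_mean(train_x, train_y, x_new, length=1.0, noise=1e-3):
    """Posterior mean of GP regression with an RBF kernel:
    f* = k*^T (K + noise*I)^{-1} y, solved by Gaussian elimination."""
    k = lambda a, b: exp(-((a - b) ** 2) / (2 * length ** 2))
    n = len(train_x)
    # Augmented system [K + noise*I | y].
    A = [[k(train_x[i], train_x[j]) + (noise if i == j else 0.0)
          for j in range(n)] + [train_y[i]] for i in range(n)]
    for col in range(n):                 # forward elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    alpha = [0.0] * n                    # back substitution
    for r in range(n - 1, -1, -1):
        s = A[r][n] - sum(A[r][c] * alpha[c] for c in range(r + 1, n))
        alpha[r] = s / A[r][r]
    return sum(k(x_new, xi) * a for xi, a in zip(train_x, alpha))

# Hypothetical risk-score observations at three survey answers.
xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0, 0.0]
print(round(gp_mean(xs, ys, 1.0), 2))  # near the observed value at x = 1
```

With small observation noise the posterior mean interpolates the training points, and predictions far from all training inputs revert toward the zero prior mean.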