• Title/Summary/Keyword: Artificial intelligence techniques

629 search results

Comparison of Prediction Accuracy Between Regression Analysis and Deep Learning, and Empirical Analysis of The Importance of Techniques for Optimizing Deep Learning Models (회귀분석과 딥러닝의 예측 정확성에 대한 비교 그리고 딥러닝 모델 최적화를 위한 기법들의 중요성에 대한 실증적 분석)

  • Min-Ho Cho
    • The Journal of the Korea Institute of Electronic Communication Sciences, v.18 no.2, pp.299-304, 2023
  • Among artificial intelligence techniques, deep learning has been applied in many domains and has proven its effectiveness. However, deep learning models are not effective everywhere. In this paper, we show the limitations of deep learning models through a comparison with regression analysis and present a guide for their effective use. In addition, among the various techniques used to optimize deep learning models, the widely used data normalization and data shuffling techniques are compared and evaluated on actual data to provide guidelines for increasing the accuracy and value of deep learning models.
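
The comparison the abstract describes (classical regression against a deep model, with and without normalization and shuffling of the training data) can be outlined with standard tooling. The sketch below is not the authors' code; the dataset, network size, and hyperparameters are assumptions chosen only to illustrate the experimental design.

```python
# Minimal sketch (not the paper's code): ordinary regression vs. a small neural
# network, with and without input normalization and shuffling of the split.
# Dataset, model sizes, and hyperparameters are illustrative assumptions.
from sklearn.datasets import fetch_california_housing
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

X, y = fetch_california_housing(return_X_y=True)

def test_mse(model, shuffle):
    # Hold out 20% of the data; `shuffle` controls whether the split is shuffled
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.2, shuffle=shuffle, random_state=0)
    model.fit(X_tr, y_tr)
    return mean_squared_error(y_te, model.predict(X_te))

candidates = {
    "linear regression": LinearRegression(),
    "MLP on raw inputs": MLPRegressor(hidden_layer_sizes=(64, 64),
                                      max_iter=500, random_state=0),
    "MLP on normalized inputs": make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)),
}

for name, model in candidates.items():
    for shuffle in (False, True):
        print(f"{name:26s} shuffle={shuffle!s:5s} test MSE = {test_mse(model, shuffle):.3f}")
```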

Artificial neural network for predicting nuclear power plant dynamic behaviors

  • El-Sefy, M.; Yosri, A.; El-Dakhakhni, W.; Nagasaki, S.; Wiebe, L.
    • Nuclear Engineering and Technology, v.53 no.10, pp.3275-3285, 2021
  • A Nuclear Power Plant (NPP) is a complex dynamic system-of-systems with highly nonlinear behaviors. In order to control the plant operation under both normal and abnormal conditions, the different systems in NPPs (e.g., the reactor core components, primary and secondary coolant systems) are usually monitored continuously, resulting in very large amounts of data. This situation makes it possible to integrate relevant qualitative and quantitative knowledge with artificial intelligence techniques to provide faster and more accurate behavior predictions, leading to more rapid decisions, based on actual NPP operation data. Data-driven models (DDM) rely on artificial intelligence to learn autonomously based on patterns in data, and they represent alternatives to physics-based models that typically require significant computational resources and might not fully represent the actual operation conditions of an NPP. In this study, a feed-forward backpropagation artificial neural network (ANN) model was trained to simulate the interaction between the reactor core and the primary and secondary coolant systems in a pressurized water reactor. The transients used for model training included perturbations in reactivity, steam valve coefficient, reactor core inlet temperature, and steam generator inlet temperature. Uncertainties of the plant physical parameters and operating conditions were also incorporated in these transients. Eight training functions were adopted during the training stage to develop the most efficient network. The developed ANN model predictions were subsequently tested successfully considering different new transients. Overall, through prompt prediction of NPP behavior under different transients, the study aims at demonstrating the potential of artificial intelligence to empower rapid emergency response planning and risk mitigation strategies.
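
As a rough illustration of the modelling approach described above, the sketch below trains a small feed-forward network by backpropagation to map perturbation inputs to plant responses. The architecture, the synthetic data, and the optimizer are assumptions, not the authors' configuration; real training would use the PWR transient records described in the abstract.

```python
# Minimal sketch (assumed architecture, not the authors' model): a feed-forward
# network trained by backpropagation to map perturbation inputs to plant
# responses. Synthetic placeholder data stands in for the PWR transient records.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

rng = np.random.default_rng(0)
# Hypothetical inputs: reactivity, steam valve coefficient, core inlet T, SG inlet T
X = rng.uniform(-1.0, 1.0, size=(5000, 4))
# Hypothetical outputs, e.g. core power and coolant temperature (placeholder physics)
y = np.stack([X @ np.array([0.8, -0.3, 0.4, 0.1]),
              X @ np.array([0.2, 0.5, -0.6, 0.3])], axis=1)
y += 0.01 * rng.normal(size=y.shape)

model = keras.Sequential([
    layers.Input(shape=(4,)),
    layers.Dense(32, activation="tanh"),
    layers.Dense(32, activation="tanh"),
    layers.Dense(2),            # predicted plant responses
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, validation_split=0.2, epochs=20, batch_size=64, verbose=0)

# Query the surrogate for a few (hypothetical) transient states
print(model.predict(X[:3], verbose=0))
```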

Film Production Using Artificial Intelligence with a Focus on Visual Effects (인공지능을 이용한 영화제작 : 시각효과를 중심으로)

  • Yoo, Tae-Kyung
    • Journal of Korea Entertainment Industry Association, v.15 no.1, pp.53-62, 2021
  • Ever since moving pictures were first projected for audiences, the film industry has been reshaped by technological advancement. Through the full-scale introduction of visual effects-oriented post-production and digital technologies into the film-making process, the film industry has not only undergone significant changes in production but has also broadly embraced cutting-edge technologies and expanded its scope. Not long after the transition to digital cinema, artificial intelligence, a concept first introduced at the Dartmouth summer research project in 1956, before the digitalization of film, is expected to bring about another major transformation in the film industry. The large volume of clean digital data generated by digital film-making makes it easy to apply recent artificial intelligence technologies, represented by machine learning and deep learning. The use of artificial intelligence techniques is prominent among major visual effects studios because it can automate many laborious, time-consuming tasks currently performed by artists. This study aims to predict how artificial intelligence technology will change the film industry through an analysis of visual effects production cases that use artificial intelligence as a production tool, and to discuss the industrial potential of artificial intelligence as a visual effects technology.

A Novel Approach to COVID-19 Diagnosis Based on Mel Spectrogram Features and Artificial Intelligence Techniques

  • Alfaidi, Aseel; Alshahrani, Abdullah; Aljohani, Maha
    • International Journal of Computer Science & Network Security, v.22 no.9, pp.195-207, 2022
  • COVID-19 has remained one of the most serious health crises in recent history, resulting in the tragic loss of lives and significant economic impacts on the entire world. The difficulty of controlling COVID-19 poses a threat to the global health sector. Considering that Artificial Intelligence (AI) has contributed to improving research methods and solving problems facing diverse fields of study, AI algorithms have also proven effective in disease detection and early diagnosis. Specifically, acoustic features offer a promising prospect for the early detection of respiratory diseases. Motivated by these observations, this study conceptualized a speech-based diagnostic model to aid in COVID-19 diagnosis. The proposed methodology uses speech signals from confirmed positive and negative cases of COVID-19 to extract features through the pre-trained Visual Geometry Group (VGG-16) model based on Mel spectrogram images. This is combined with the K-means algorithm to determine effective features, followed by a Genetic Algorithm-Support Vector Machine (GA-SVM) classifier to classify cases. The experimental findings indicate the proposed methodology's capability to classify COVID-19 and non-COVID-19 cases across varying ages and languages, as demonstrated in the simulations. The proposed methodology relies on deep features followed by a dimensionality reduction step to detect COVID-19. As a result, it produces better and more consistent performance than the handcrafted features used in previous studies.
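
A heavily simplified sketch of this kind of pipeline is shown below: Mel spectrograms are turned into three-channel images, embedded with a pre-trained VGG-16, and classified with an SVM. The K-means feature selection and the genetic-algorithm tuning of the SVM are omitted, and the synthetic waveforms and labels are placeholders, so this is an outline of the idea rather than the authors' method.

```python
# Sketch only: Mel spectrogram -> pre-trained VGG-16 embedding -> SVM.
# The K-means feature selection and GA tuning from the paper are omitted.
import numpy as np
import librosa
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
from sklearn.svm import SVC

def mel_image(wave, sr=16000, size=224):
    # Mel spectrogram in dB, rescaled and tiled to 3 channels so VGG-16 accepts it
    S = librosa.feature.melspectrogram(y=wave, sr=sr, n_mels=128)
    S_db = librosa.power_to_db(S, ref=np.max)
    img = np.resize(S_db, (size, size))
    img = 255.0 * (img - img.min()) / (img.max() - img.min() + 1e-9)
    return np.repeat(img[..., None], 3, axis=-1)

backbone = VGG16(weights="imagenet", include_top=False, pooling="avg")

# Synthetic placeholder waveforms and labels stand in for real speech recordings
rng = np.random.default_rng(0)
waves = [rng.normal(size=32000).astype(np.float32) for _ in range(6)]
labels = np.array([1, 0, 1, 0, 1, 0])    # 1 = COVID-19 positive, 0 = negative

X = preprocess_input(np.stack([mel_image(w) for w in waves]))
features = backbone.predict(X, verbose=0)          # (6, 512) deep features

clf = SVC(kernel="rbf").fit(features, labels)      # stands in for the GA-tuned SVM
print(clf.predict(features))
```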

A deep learning framework for wind pressure super-resolution reconstruction

  • Xiao Chen; Xinhui Dong; Pengfei Lin; Fei Ding; Bubryur Kim; Jie Song; Yiqing Xiao; Gang Hu
    • Wind and Structures, v.36 no.6, pp.405-421, 2023
  • Strong wind is the main cause of wind damage to high-rise buildings, often resulting in large economic losses and casualties. Wind pressure plays a critical role in wind effects on buildings, and obtaining a high-resolution wind pressure field often requires a massive number of pressure taps. In this study, two traditional methods, bilinear and bicubic interpolation, and two deep learning techniques, Residual Networks (ResNet) and Generative Adversarial Networks (GANs), are employed to reconstruct the wind pressure field from limited pressure taps on the surface of an idealized building from the TPU database. It was found that the GANs model exhibits the best performance in reconstructing the wind pressure field. Meanwhile, it was confirmed that selecting the retained pressure taps by k-means clustering as the model input can significantly improve the reconstruction ability of the GANs model. Finally, the generalization ability of the k-means clustering based GANs model in reconstructing the wind pressure field is verified on an actual engineering structure. Importantly, the k-means clustering based GANs model achieves satisfactory reconstruction of the wind pressure field even with only 20% of the pressure taps. Therefore, it is expected to save a huge number of pressure taps while achieving timely and accurate reconstruction of the wind pressure field.
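
The k-means based selection of retained pressure taps can be sketched as follows; the ResNet/GAN reconstruction stage itself is omitted, and the synthetic pressure histories are placeholders for the TPU database records, so this is only an illustration of the tap-selection idea.

```python
# Sketch of k-means based tap selection only (the GAN reconstruction is omitted):
# cluster the pressure histories and keep one representative tap per cluster
# as the sparse model input. Data below are synthetic placeholders.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_taps, n_samples = 500, 500
coords = rng.uniform(0, 1, size=(n_taps, 2))               # tap positions on the facade
pressures = np.sin(10 * coords[:, :1]) @ np.ones((1, n_samples)) \
            + 0.1 * rng.normal(size=(n_taps, n_samples))   # placeholder pressure histories

keep_ratio = 0.2                                           # retain ~20% of the taps
k = int(keep_ratio * n_taps)
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pressures)

# For each cluster, keep the member tap whose history is closest to the cluster centre
kept = np.array([
    np.argmin(np.linalg.norm(pressures - c, axis=1)
              + np.where(km.labels_ == i, 0.0, np.inf))
    for i, c in enumerate(km.cluster_centers_)
])
print(f"Retained {kept.size} of {n_taps} taps; these would feed the GANs model.")
```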

On the prediction of unconfined compressive strength of silty soil stabilized with bottom ash, jute and steel fibers via artificial intelligence

  • Gullu, Hamza; Fedakar, Halil Ibrahim
    • Geomechanics and Engineering, v.12 no.3, pp.441-464, 2017
  • The determination of mixture parameters for stabilization has become a great concern in geotechnical applications. This paper presents the application of artificial intelligence (AI) techniques, including the radial basis neural network (RBNN), multi-layer perceptron (MLP), generalized regression neural network (GRNN) and adaptive neuro-fuzzy inference system (ANFIS), to predict the unconfined compressive strength (UCS) of silty soil stabilized with bottom ash (BA), jute fiber (JF) and steel fiber (SF) under different freeze-thaw cycles (FTC). The dosages of the stabilizers and the number of freeze-thaw cycles were employed as input (predictor) variables and the UCS values as the output variable. To understand which predictor variable dominates the UCS of the stabilized soil, a sensitivity analysis was also performed. The performance measures of root mean square error (RMSE), mean absolute error (MAE) and coefficient of determination (R²) were used to evaluate the prediction accuracy and applicability of the employed models. The results indicate that the predictions of all the AI techniques employed are significantly correlated with the measured UCS (p ≤ 0.05). They also yield better predictions than nonlinear regression (NLR) in terms of the performance measures. Among the AI techniques, the RBNN approach yields the most satisfactory results (RMSE = 55.4 kPa, MAE = 45.1 kPa, and R² = 0.988). The sensitivity analysis demonstrates that the JF content is the most effective input parameter on the UCS responses, followed by FTC.
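
The evaluation protocol is easy to reproduce in outline: given measured and predicted UCS values, compute RMSE, MAE and R². The numbers in the sketch below are placeholders, not the paper's data.

```python
# Sketch of the evaluation metrics only; the UCS values below are placeholders.
import numpy as np
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

ucs_measured  = np.array([310.0, 455.0, 520.0, 610.0, 700.0])   # kPa, placeholder
ucs_predicted = np.array([298.0, 470.0, 505.0, 630.0, 690.0])   # kPa, placeholder

rmse = np.sqrt(mean_squared_error(ucs_measured, ucs_predicted))
mae  = mean_absolute_error(ucs_measured, ucs_predicted)
r2   = r2_score(ucs_measured, ucs_predicted)
print(f"RMSE = {rmse:.1f} kPa, MAE = {mae:.1f} kPa, R² = {r2:.3f}")
```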

Deep Learning-Based Dynamic Scheduling with Multi-Agents Supporting Scalability in Edge Computing Environments (멀티 에이전트 에지 컴퓨팅 환경에서 확장성을 지원하는 딥러닝 기반 동적 스케줄링)

  • JongBeom Lim
    • KIPS Transactions on Software and Data Engineering, v.12 no.9, pp.399-406, 2023
  • Cloud computing has evolved to support an edge computing architecture that combines a fog management layer with edge servers. The main reason it has received much attention is the low communication latency required by real-time IoT applications. At the same time, various cloud task scheduling techniques based on artificial intelligence have been proposed. Artificial intelligence-based cloud task scheduling techniques show better performance than existing methods, but they incur relatively long scheduling times. In this paper, we propose deep learning-based dynamic scheduling with multiple agents that supports scalability in edge computing environments. The proposed method achieves shorter scheduling times than previous artificial intelligence-based scheduling techniques. To show its effectiveness, we compare the performance of the previous and proposed methods in a scalable experimental environment. The results show that our method supports real-time IoT applications with short scheduling times and performs better in terms of the number of completed cloud tasks in a scalable experimental environment.
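
One way such learning-based dispatching is often realized, purely as an illustration and not the paper's algorithm, is to have each agent score candidate edge servers with a small neural network so that scheduling reduces to a single forward pass. Every feature, size, and the untrained policy network in the sketch below is an assumption.

```python
# Hedged sketch (not the paper's scheduler): an agent scores candidate edge
# servers with a small network and dispatches the task to the best score.
# Features, sizes, and the untrained network are illustrative assumptions.
import time
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_servers, n_features = 16, 4    # e.g. queue length, CPU load, RTT, task size

scorer = keras.Sequential([
    layers.Input(shape=(n_features,)),
    layers.Dense(32, activation="relu"),
    layers.Dense(1),             # predicted "goodness" of placing the task here
])
scorer.compile(optimizer="adam", loss="mse")   # would be trained on past scheduling outcomes

def schedule(server_states):
    """Return the index of the edge server chosen for the incoming task."""
    scores = scorer.predict(server_states, verbose=0)
    return int(np.argmax(scores))

state = np.random.rand(n_servers, n_features)
start = time.perf_counter()
chosen = schedule(state)
print(f"dispatch to server {chosen}, scheduling time {1e3*(time.perf_counter()-start):.1f} ms")
```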

Artificial Intelligence in Gastric Cancer Imaging With Emphasis on Diagnostic Imaging and Body Morphometry

  • Kyung Won Kim; Jimi Huh; Bushra Urooj; Jeongjin Lee; Jinseok Lee; In-Seob Lee; Hyesun Park; Seongwon Na; Yousun Ko
    • Journal of Gastric Cancer, v.23 no.3, pp.388-399, 2023
  • Gastric cancer remains a significant global health concern, underscoring the need for advancements in imaging techniques to ensure accurate diagnosis and effective treatment planning. Artificial intelligence (AI) has emerged as a potent tool for gastric cancer imaging, particularly for diagnostic imaging and body morphometry. This review article offers a comprehensive overview of the recent developments and applications of AI in gastric cancer imaging. We investigated the role of AI imaging in gastric cancer diagnosis and staging, showcasing its potential to enhance the accuracy and efficiency of these crucial aspects of patient management. Additionally, we explored the application of AI body morphometry, specifically for assessing the clinical impact of gastrectomy. This aspect of AI utilization holds significant promise for understanding postoperative changes and optimizing patient outcomes. Furthermore, we examine the current state of AI techniques for the prognosis of patients with gastric cancer. These prognostic models leverage AI algorithms to predict long-term survival outcomes and assist clinicians in making informed treatment decisions. However, the implementation of AI techniques for gastric cancer imaging still has several limitations. As AI continues to evolve, we hope to witness the translation of cutting-edge technologies into routine clinical practice, ultimately improving patient care and outcomes in the fight against gastric cancer.

Fault Location Technique of 154 kV Substation using Neural Network (신경회로망을 이용한 154kV 변전소의 고장 위치 판별 기법)

  • Ahn, Jong-Bok; Kang, Tae-Won; Park, Chul-Won
    • The Transactions of The Korean Institute of Electrical Engineers, v.67 no.9, pp.1146-1151, 2018
  • Recently, as computing platforms have improved, research on intelligent electric power facilities has increasingly sought to apply artificial intelligence techniques. In particular, when faults occur in a substation, it should be possible to quickly identify the fault location and minimize power restoration time. This paper presents a fault location technique for a 154 kV substation using a neural network. We constructed a training matrix based on the operating states of the circuit breakers and IEDs to identify the fault location for each component of the target 154 kV substation, such as lines, buses, and transformers. After training the neural network to identify the fault location using the Weka software, the fault location discrimination performance of the designed neural network was confirmed.
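
The idea of mapping breaker and IED operating states to a fault location can be sketched as follows. The paper built its neural network in Weka; this sketch uses scikit-learn instead, and the status patterns and labels are invented for illustration.

```python
# Sketch of the idea only (the paper used Weka): encode circuit breaker and IED
# operating states as a binary vector and classify the fault location.
# The pattern table below is invented for illustration, not the paper's data.
import numpy as np
from sklearn.neural_network import MLPClassifier

# Columns: [CB1, CB2, CB3, IED_line, IED_bus, IED_tr] status flags (1 = operated)
X = np.array([
    [1, 0, 0, 1, 0, 0],
    [1, 1, 0, 0, 1, 0],
    [0, 0, 1, 0, 0, 1],
    [1, 0, 1, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 1, 1, 0, 0, 1],
])
y = ["line", "bus", "transformer", "line", "bus", "transformer"]

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
print(clf.predict([[1, 0, 0, 1, 0, 0]]))   # ideally recovers 'line' for the first pattern
```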

A TabNet - Based System for Water Quality Prediction in Aquaculture

  • Nguyen, Trong-Nghia; Kim, Soo Hyung; Do, Nhu-Tai; Hong, Thai-Thi Ngoc; Yang, Hyung Jeong; Lee, Guee Sang
    • Smart Media Journal, v.11 no.2, pp.39-52, 2022
  • In the context of the evolution of automation and intelligence, deep learning and machine learning algorithms have been widely applied in aquaculture in recent years, providing new opportunities for the digitalization of aquaculture. In particular, water quality management deserves attention because of its importance to farmed organisms. In this study, we propose an end-to-end deep learning TabNet model for water quality prediction. From major indexes of water quality assessment, we apply novel deep learning techniques and machine learning algorithms in innovative fish aquaculture to predict water cell counts. Furthermore, the application of deep learning in aquaculture is outlined, and the obtained results are analyzed. The experiments on in-house data showed a promising impact of artificial intelligence in aquaculture, helping to reduce cost and time and increase efficiency in the farming process.
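
A minimal sketch of a TabNet regressor on tabular water quality indexes is shown below, assuming the pytorch-tabnet package; the synthetic predictors and the cell-count target are placeholders for the in-house data described in the abstract.

```python
# Minimal sketch assuming the pytorch-tabnet package; the predictors and the
# cell-count target are synthetic placeholders, not the in-house dataset.
import numpy as np
from pytorch_tabnet.tab_model import TabNetRegressor

rng = np.random.default_rng(0)
# Hypothetical predictors: temperature, pH, dissolved oxygen, salinity, turbidity
X = rng.uniform(size=(1000, 5)).astype(np.float32)
y = (50 * X[:, 2] - 20 * X[:, 4] + rng.normal(scale=2.0, size=1000)).reshape(-1, 1)

X_train, X_valid = X[:800], X[800:]
y_train, y_valid = y[:800], y[800:]

model = TabNetRegressor(seed=0)
model.fit(
    X_train, y_train,
    eval_set=[(X_valid, y_valid)],
    eval_metric=["rmse"],
    max_epochs=50, patience=10, batch_size=128, virtual_batch_size=64,
)
print(model.predict(X_valid[:5]))   # predicted counts for five held-out samples
```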