• Title/Summary/Keyword: Intelligence Optimization

Search Result 384

Deformation of the PDMS Membrane for a Liquid Lens Under Hydraulic Pressure

  • Gu, Haipeng;Gan, Zihao;Hong, Huajie;He, Keyan
    • Current Optics and Photonics
    • /
    • v.5 no.4
    • /
    • pp.391-401
    • /
    • 2021
  • In the present study, a hyperelastic constitutive model is built from a simplified hyperelastic strain energy function, yielding a numerical solution for a deformed polydimethylsiloxane (PDMS) membrane under axisymmetric hydraulic pressure. Moreover, a nonlinear equilibrium model is derived to accurately express the deformation of the membrane, laying a basis for precise analysis of the optical transfer function. Comparison with experimental and simulated data suggests that the model accurately characterizes the deformation behavior of the membrane. Furthermore, the stretch ratio derived from the model applies to the geometrical optimization of the deformed membrane.

Application of artificial intelligence for solving the engineering problems

  • Xiaofei Liu;Xiaoli Wang
    • Structural Engineering and Mechanics
    • /
    • v.85 no.1
    • /
    • pp.15-27
    • /
    • 2023
  • Using artificial intelligence and Internet of Things methods for engineering and industrial problems has become widespread in recent years. Low computational cost and high accuracy, without the need to engage human resources, are the main advantages of artificial intelligence relative to engineering demands. In the present paper, a deep neural network (DNN) with a specific optimization method is utilized to predict the fundamental natural frequency of a cylindrical structure. To provide data for training the DNN, a detailed numerical analysis is presented with the aid of functionally modified couple stress theory (FMCS) and first-order shear deformation theory (FSDT). The governing equations, obtained using Hamilton's principle, are solved using the generalized differential quadrature method. The results of the numerical solution are used to train and test the DNN model. The results are validated first, and comprehensive parametric results are presented thereafter. The results show the high accuracy of the DNN predictions and the effects of different geometrical, modeling, and material parameters on the natural frequencies of the structure.
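The surrogate-modeling idea in the abstract above (train a DNN on numerically generated parameter-frequency pairs, then use it as a fast predictor) can be sketched as follows. Everything here is an illustrative assumption: the data-generating formula stands in for the paper's FMCS/FSDT solutions, and the network size and learning rate are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the paper's numerical solutions: "frequency" as a
# smooth function of two normalized geometric parameters (illustrative only).
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = (1.0 + 0.5 * X[:, 0] - 0.3 * X[:, 1] ** 2).reshape(-1, 1)

# One-hidden-layer MLP regressor trained with plain batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros((1, 16))
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros((1, 1))
lr = 0.05

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    pred = h @ W2 + b2                # linear output layer
    err = pred - y
    # Backpropagate the mean-squared-error loss.
    gW2 = h.T @ err / len(X); gb2 = err.mean(0, keepdims=True)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0, keepdims=True)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training MSE: {mse:.4f}")
```

Once trained, evaluating the network is orders of magnitude cheaper than re-running the numerical solver, which is the advantage the paper exploits for its parametric study.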

An Ant-based Routing Method using Enhanced Path Maintenance for MANETs (MANET에서 향상된 경로 관리를 사용한 개미 기반 라우팅 방안)

  • Woo, Mi-Ae
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.35 no.9B
    • /
    • pp.1281-1286
    • /
    • 2010
  • Ant-based routing methods belong to a class of ant colony optimization algorithms that apply the behavior of ants in nature to routing mechanisms. Since the topology of a mobile ad-hoc network (MANET) changes dynamically, paths must be established from local information; routing in MANETs is therefore a known application of ant colony optimization. In this paper, we propose a routing method, EPMAR, which enhances SIR in terms of its route-selection method and its handling of link failures. The performance of the proposed method is compared with those of AntHocNet and SIR. The analysis shows that the proposed method provides a higher packet delivery ratio and fewer critical link failures than AntHocNet and SIR.
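The ant-colony principle behind such routing methods can be sketched on a toy topology: ants walk source to destination choosing edges with probability proportional to pheromone over cost, and the edges of completed paths are reinforced while all pheromone evaporates. The graph, costs, and update constants below are hypothetical; EPMAR's actual route-selection and path-maintenance rules are not reproduced here.

```python
import random

# Toy topology: adjacency map with hop costs (hypothetical, for illustration).
graph = {
    "S": {"A": 1, "B": 4},
    "A": {"B": 2, "C": 1},
    "B": {"D": 1},
    "C": {"D": 1},
    "D": {},
}
pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}

def walk(src, dst):
    """One ant builds a loop-free path, picking edges ~ pheromone / cost."""
    path, node = [src], src
    while node != dst:
        edges = [(v, pheromone[(node, v)] / graph[node][v])
                 for v in graph[node] if v not in path]
        if not edges:
            return None
        total = sum(w for _, w in edges)
        r, acc = random.random() * total, 0.0
        for v, w in edges:          # roulette-wheel selection
            acc += w
            if r <= acc:
                break
        path.append(v)
        node = v
    return path

random.seed(1)
best = None
for _ in range(200):
    p = walk("S", "D")
    if p is None:
        continue
    cost = sum(graph[u][v] for u, v in zip(p, p[1:]))
    # Evaporate everywhere, then reinforce the edges of this path.
    for e in pheromone:
        pheromone[e] *= 0.95
    for e in zip(p, p[1:]):
        pheromone[e] += 1.0 / cost
    if best is None or cost < sum(graph[u][v] for u, v in zip(best, best[1:])):
        best = p
print("best path:", best)
```

The pheromone table plays the role of the distributed, locally maintained routing state the abstract refers to: each node only needs the pheromone values on its own outgoing links.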

A Survey of Computational Offloading in Cloud/Edge-based Architectures: Strategies, Optimization Models and Challenges

  • Alqarni, Manal M.;Cherif, Asma;Alkayal, Entisar
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.15 no.3
    • /
    • pp.952-973
    • /
    • 2021
  • In recent years, mobile devices have become an essential part of daily life. More and more applications are being supported by mobile devices thanks to edge computing, which represents an emergent architecture that provides computing, storage, and networking capabilities for mobile devices. In edge computing, heavy tasks are offloaded to edge nodes to alleviate the computations on the mobile side. However, offloading computational tasks may incur extra energy consumption and delays due to network congestion and server queues. Therefore, it is necessary to optimize offloading decisions to minimize time, energy, and payment costs. In this article, different offloading models are examined to identify the offloading parameters that need to be optimized. The paper investigates and compares several optimization techniques used to optimize offloading decisions, specifically Swarm Intelligence (SI) models, since they are best suited to the distributed aspect of edge computing. Furthermore, based on the literature review, this study concludes that a Cuckoo Search Algorithm (CSA) in an edge-based architecture is a good solution for balancing energy consumption, time, and cost.
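The Cuckoo Search idea the survey favors can be sketched on a toy offloading objective: candidate solutions ("nests") take heavy-tailed Lévy-style steps, better solutions are kept greedily, and a fraction of the worst nests is abandoned and rebuilt at random. The cost model, weights, and all constants below are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical offloading cost: weighted device energy plus latency as a
# function of the fraction x of the task offloaded (0 = all local, 1 = all edge).
def cost(x):
    energy = 1.0 - 0.8 * x           # offloading saves device energy
    latency = 0.2 + (x - 0.4) ** 2   # congestion penalizes heavy offloading
    return 0.5 * energy + 0.5 * latency

n, iters, p_abandon = 15, 200, 0.25
nests = rng.uniform(0, 1, n)
fitness = np.array([cost(x) for x in nests])

for _ in range(iters):
    # Levy-style flights, approximated here by heavy-tailed Cauchy steps.
    step = 0.05 * rng.standard_cauchy(n)
    new = np.clip(nests + step, 0, 1)
    new_fit = np.array([cost(x) for x in new])
    better = new_fit < fitness
    nests[better], fitness[better] = new[better], new_fit[better]
    # Abandon the worst nests and rebuild them at random locations.
    k = int(p_abandon * n)
    worst = np.argsort(fitness)[-k:]
    nests[worst] = rng.uniform(0, 1, k)
    fitness[worst] = np.array([cost(x) for x in nests[worst]])

best = nests[np.argmin(fitness)]
print(f"best offload fraction: {best:.3f}")
```

The abandonment step is what distinguishes CSA from plain random-restart hill climbing: it keeps the population diverse, which suits the distributed, noisy objectives of edge offloading that the survey emphasizes.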

Artificial Intelligence Application using Nutcracker Optimization Algorithm to Enhance Efficiency & Reliability of Power Systems via Optimal Setting and Sizing of Renewable Energy Sources as Distributed Generations in Radial Distribution Systems

  • Nawaf A. AlZahrani;Mohammad Hamza Awedh;Ali M. Rushdi
    • International Journal of Computer Science & Network Security
    • /
    • v.24 no.1
    • /
    • pp.31-44
    • /
    • 2024
  • Energy consumption has risen steadily in recent years, and several research studies have been conducted to develop sustainable energy sources that can produce clean energy to meet these requirements. Using renewable energy sources helps to decrease the environmental harm caused by conventional power plants. Choosing the right location and capacity for DG-RESs can greatly impact the performance of radial distribution systems: a stable electrical power supply with low energy waste and high effectiveness improves the performance and reliability of the system. This research investigates the ideal location and size for solar and wind power systems, which are popular methods for producing clean electricity. A new artificial intelligence algorithm, the Nutcracker Optimization Algorithm (NOA), is used to find the best solution in two common test networks, the IEEE 33- and 69-bus systems, examining the improvement in the efficiency and reliability of the power network through reduced power losses, smaller voltage deviation, and improved voltage stability. Finally, NOA is compared with Particle Swarm Optimization (PSO) and a hybrid algorithm (NOA+PSO) to validate the proposed algorithm's effectiveness in enhancing both efficiency and reliability.
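The PSO baseline used for comparison above can be sketched on a toy placement-and-sizing objective. The quadratic loss function below is a hypothetical stand-in for a real load-flow calculation; the IEEE 33/69-bus systems are not modeled here.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative stand-in for a load-flow objective: total loss as a smooth
# function of a DG's (normalized bus position, normalized size).
def loss(p):
    bus, size = p
    return (bus - 0.6) ** 2 + 2.0 * (size - 0.3) ** 2 + 0.1

n, iters = 20, 150
pos = rng.uniform(0, 1, (n, 2))
vel = np.zeros((n, 2))
pbest = pos.copy()
pbest_val = np.array([loss(p) for p in pos])
g = pbest[np.argmin(pbest_val)].copy()   # global best

for _ in range(iters):
    r1, r2 = rng.uniform(size=(n, 2)), rng.uniform(size=(n, 2))
    # Inertia + cognitive (own best) + social (swarm best) velocity update.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
    pos = np.clip(pos + vel, 0, 1)
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    g = pbest[np.argmin(pbest_val)].copy()

print("optimal (bus, size):", g.round(3))
```

A real study would replace `loss` with a power-flow solver returning losses and voltage deviation for the candidate placement, and would handle the discrete bus index properly; NOA plugs into the same evaluate-and-update loop with a different movement rule.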

Analysis of Korea's Artificial Intelligence Competitiveness Based on Patent Data: Focusing on Patent Index and Topic Modeling (특허데이터 기반 한국의 인공지능 경쟁력 분석 : 특허지표 및 토픽모델링을 중심으로)

  • Lee, Hyun-Sang;Qiao, Xin;Shin, Sun-Young;Kim, Gyu-Ri;Oh, Se-Hwan
    • Informatization Policy
    • /
    • v.29 no.4
    • /
    • pp.43-66
    • /
    • 2022
  • With the development of artificial intelligence technology, competition for artificial intelligence patents around the world is intensifying. Over the period 2000-2021, artificial intelligence patent applications at the US Patent and Trademark Office increased steadily, with a steeper growth rate since the 2010s. Analysis of Korea's artificial intelligence competitiveness through patent indices shows that patent activity, impact, and marketability are superior in areas such as auditory intelligence and visual intelligence. Compared to other countries, however, Korea's artificial intelligence patents overall are strong in activity and marketability but somewhat inferior in technological impact. While noise canceling and voice recognition have recently declined as artificial intelligence topics, growth is expected in areas such as model learning optimization, smart sensors, and autonomous driving. In Korea's case, effort is required, as patent applications are slightly lacking in areas such as fraud detection/security and medical vision learning.

A Study on CFD Result Analysis of Mist-CVD using Artificial Intelligence Method (인공지능기법을 이용한 초음파분무화학기상증착의 유동해석 결과분석에 관한 연구)

  • Joohwan Ha;Seokyoon Shin;Junyoung Kim;Changwoo Byun
    • Journal of the Semiconductor & Display Technology
    • /
    • v.22 no.1
    • /
    • pp.134-138
    • /
    • 2023
  • This study focuses on analyzing computational fluid dynamics (CFD) simulation results of mist chemical vapor deposition for the growth of an epitaxial wafer in power semiconductor technology using artificial intelligence techniques. The conventional approach of predicting the uniformity of the deposited layer using CFD and design of experiments takes considerable time. To overcome this, artificial intelligence methods, which are widely used for optimization, automation, and prediction in various fields, were used to analyze the CFD simulation results. The CFD results were analyzed using a supervised deep neural network model for regression, and the predictions were evaluated quantitatively using Euclidean distance calculations. The predictions obtained through deep neural network training showed a discrepancy of approximately 4% compared with the CFD analysis results. Bayesian optimization was then used to derive the optimal condition, which yielded an improvement of 146.2% over the previous CFD simulation results. These results are expected to have practical applications in various fields.
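The Euclidean-distance evaluation mentioned in the abstract above reduces to comparing two vectors of deposition values. A minimal sketch, with made-up profile numbers standing in for the actual CFD and surrogate outputs:

```python
import numpy as np

# Hypothetical thickness profiles along the wafer radius: one from a CFD run,
# one predicted by a trained surrogate model (values are made up).
cfd_profile = np.array([0.98, 1.01, 1.00, 0.97, 0.95])
dnn_profile = np.array([0.99, 1.00, 0.98, 0.98, 0.94])

# Euclidean distance between the two profiles quantifies how closely the
# surrogate reproduces the CFD result.
distance = float(np.linalg.norm(cfd_profile - dnn_profile))

# Relative discrepancy against the magnitude of the CFD profile.
relative = distance / float(np.linalg.norm(cfd_profile))
print(f"Euclidean distance: {distance:.4f} (relative: {relative:.2%})")
```

A small relative distance of this kind is what justifies letting Bayesian optimization search over the cheap surrogate instead of re-running CFD for every candidate condition.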


Research on the Performance Optimization of HR-Net for Spinal Region Segmentation in Whole Spine X-ray Images (Whole Spine X-ray 영상에서 척추 영역 분할을 위한 HR-Net 성능 최적화에 관한 연구)

  • Han Beom Yu;Ho Seong Hwang;Dong Hyun Kim;Hee Jue Oh;Ho Chul Kim
    • Journal of Biomedical Engineering Research
    • /
    • v.45 no.4
    • /
    • pp.139-147
    • /
    • 2024
  • This study enhances AI algorithms for extracting spinal regions from Whole Spine X-rays, aiming for higher accuracy while minimizing learning and detection times. Whole Spine X-rays, critical for diagnosing conditions such as scoliosis and kyphosis, necessitate precise differentiation of spinal contours. The conventional manual methodology encounters challenges due to the overlap of anatomical structures, prompting the integration of AI to overcome these limitations and enhance diagnostic precision. In this study, 1204 AP and 500 LAT Whole Spine X-ray images were meticulously labeled, spanning the third cervical to the fifth lumbar vertebrae. We based our efforts on the HR-Net algorithm, which exhibited the highest accuracy, and simplified its network architecture and enhanced its block structure for optimization. The optimized HR-Net algorithm increases accuracy by 2.98% for the AP dataset and 1.59% for the LAT dataset compared to its original formulation. The modification also substantially reduced learning time, by 70.06% for AP images and 68.43% for LAT images, and decreased detection time by 47.18% for AP and 43.07% for LAT images; per-image detection time was reduced by 47.09% for AP and 43.07% for LAT images. We suggest that applying the proposed HR-Net can lead to more accurate and efficient extraction of spinal regions in Whole Spine X-ray images. This can become a crucial tool for medical professionals in the diagnosis and treatment of spinal conditions, and it will serve as a foundation for future research aimed at further improving the accuracy and speed of spinal region segmentation.

Study on Prediction of Similar Typhoons through Neural Network Optimization (뉴럴 네트워크의 최적화에 따른 유사태풍 예측에 관한 연구)

  • Kim, Yeon-Joong;Kim, Tae-Woo;Yoon, Jong-Sung;Kim, In-Ho
    • Journal of Ocean Engineering and Technology
    • /
    • v.33 no.5
    • /
    • pp.427-434
    • /
    • 2019
  • Artificial intelligence (AI)-aided research currently enjoys active use in a wide array of fields thanks to the rapid development of computing capability and the use of Big Data. Until now, forecasting methods have been primarily based on physics models and statistical studies. Today, AI is utilized in disaster-prevention forecasts by studying the relationships between physical factors and their characteristics. Current studies also combine AI and physics models to complement the strengths and weaknesses of each approach. Before such studies can proceed, however, an optimization algorithm for the AI model should be developed and its applicability studied. This study aimed to improve forecast performance by constructing a model for neural network optimization. An artificial neural network (ANN) followed the ever-changing path of a typhoon to produce similar-typhoon predictions, while the optimization of the neural network was examined by evaluating the activation function, hidden-layer composition, and dropout. A learning and test dataset was constructed from the available digital data of typhoons that affected Korea during the record period (1951-2018). As a result of the neural network optimization, assessments showed a higher degree of forecast accuracy.
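The optimization described above (evaluating activation function, hidden-layer composition, and dropout) amounts to a search over a configuration grid. A minimal sketch follows; the search space is a plausible guess, and the scoring stub is a toy stand-in for actually training and validating the typhoon-track ANN.

```python
from itertools import product

# Hypothetical search space mirroring the factors the study evaluates:
# activation function, hidden-layer composition, and dropout rate.
activations = ["relu", "tanh", "sigmoid"]
hidden_layers = [(32,), (64, 32), (64, 64, 32)]
dropouts = [0.0, 0.2, 0.5]

def validate(config):
    """Placeholder for training the ANN with this configuration and returning
    a validation score; a real study would train and evaluate here."""
    act, layers, drop = config
    # Toy scoring rule so the sketch runs end to end (not a real metric).
    return len(layers) - abs(drop - 0.2) - (0 if act == "relu" else 0.5)

# Exhaustively evaluate every combination and keep the best-scoring one.
best = max(product(activations, hidden_layers, dropouts), key=validate)
print("best configuration:", best)
```

With 3 x 3 x 3 = 27 combinations an exhaustive grid is feasible; for larger spaces the same loop would be replaced by random search or Bayesian optimization over the same `validate` function.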

Trends in Artificial Intelligence Applications in Clinical Trials: An analysis of ClinicalTrials.gov (임상시험에서 인공지능의 활용에 대한 분석 및 고찰: ClinicalTrials.gov 분석)

  • Jeong Min Go;Ji Yeon Lee;Yun-Kyoung Song;Jae Hyun Kim
    • Korean Journal of Clinical Pharmacy
    • /
    • v.34 no.2
    • /
    • pp.134-139
    • /
    • 2024
  • Background: A growing number of studies on artificial intelligence (AI) and machine learning (ML) have led to their application in clinical trials. The purpose of this study is to analyze computer-based new technologies (AI/ML) applied in clinical trials registered on ClinicalTrials.gov to elucidate the current usage of these technologies. Methods: As of March 1st, 2023, protocols listed on ClinicalTrials.gov that claimed to use AI/ML and included at least one of the following interventions were selected: Drug, Biological, Dietary Supplement, or Combination Product. The selected protocols were classified according to their context of use: 1) drug discovery; 2) toxicity prediction; 3) enrichment; 4) risk stratification/management; 5) dose selection/optimization; 6) adherence; 7) synthetic control; 8) endpoint assessment; 9) postmarketing surveillance; and 10) drug selection. Results: The applications of AI/ML were explored in 131 clinical trial protocols. The areas where AI/ML was most frequently utilized were endpoint assessment (n=80), followed by dose selection/optimization (n=15), risk stratification/management (n=13), drug discovery (n=4), adherence (n=4), drug selection (n=1), and enrichment (n=1). Conclusion: The most frequent application of AI/ML in clinical trials is endpoint assessment, where use primarily focuses on the diagnosis of disease through image or video analyses. The number of clinical trials using artificial intelligence will increase as the technology continues to develop rapidly, making it necessary for regulatory bodies to establish proper regulations for these trials.