• Title/Summary/Keyword: reinforcement algorithms

149 search results

A novel harmony search based optimization of reinforced concrete biaxially loaded columns

  • Nigdeli, Sinan Melih;Bekdas, Gebrail;Kim, Sanghun;Geem, Zong Woo
    • Structural Engineering and Mechanics / v.54 no.6 / pp.1097-1109 / 2015
  • A novel optimization approach for reinforced concrete (RC) biaxially loaded columns is proposed. Because the problem involves several design constraints and influences, a new computational methodology using iterative analyses over several stages is developed. In this methodology, random iterations are combined with the music-inspired metaheuristic harmony search algorithm, whose classical rules are modified for the problem. Unlike previous approaches, a detailed and practical optimum reinforcement design is produced in addition to the optimization of dimensions. The objective of the optimization is the total material cost, which is important for RC members since steel and concrete differ greatly in cost and properties. The methodology was applied to 12 cases of flexural moment combinations, and optimum results were found using 3 different axial forces for each case. According to the results, the proposed method effectively finds a detailed optimum design with different numbers and sizes of bars, which an engineer could obtain only after roughly 2000 trials. Cost economy is thus achieved by using optimum bars of different sizes.
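The abstract above combines random iteration with the classical harmony search rules (memory consideration, pitch adjustment, random improvisation). A minimal sketch of plain harmony search on a toy two-variable objective is shown below; the objective and all parameter values are illustrative placeholders, not the paper's RC column cost model or its modified rules.

```python
import random

def harmony_search(cost, bounds, hms=10, hmcr=0.9, par=0.3, iters=2000, seed=1):
    """Minimal harmony search: keep a memory of candidate solutions,
    compose new 'harmonies' from memory (rate hmcr) with occasional
    pitch adjustment (rate par), and replace the worst member whenever
    the new harmony improves on it."""
    rng = random.Random(seed)
    memory = sorted(([rng.uniform(lo, hi) for lo, hi in bounds]
                     for _ in range(hms)), key=cost)
    for _ in range(iters):
        h = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                 # recall from memory
                v = memory[rng.randrange(hms)][d]
                if rng.random() < par:              # pitch adjustment
                    v = min(hi, max(lo, v + rng.uniform(-1, 1) * 0.05 * (hi - lo)))
            else:                                   # random improvisation
                v = rng.uniform(lo, hi)
            h.append(v)
        if cost(h) < cost(memory[-1]):              # replace worst harmony
            memory[-1] = h
            memory.sort(key=cost)
    return memory[0]

# Toy stand-in objective (NOT the paper's RC column cost): minimum at (3, 5).
toy_cost = lambda x: (x[0] - 3) ** 2 + (x[1] - 5) ** 2
best = harmony_search(toy_cost, [(0, 10), (0, 10)])
```

The memory-vs-improvisation split is what lets the algorithm trade exploitation against exploration, which is the part the paper reports modifying for the column design problem.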

The Advanced Voltage Regulation Method for ULTC in Distribution Systems with DG

  • Kim, Mi-Young;Song, Yong-Un;Kim, Kyung-Hwa
    • Journal of Electrical Engineering and Technology / v.8 no.4 / pp.737-743 / 2013
  • Small-scale onsite generators such as photovoltaic power, wind power, biomass, and fuel cells are decarbonization technologies. These generators tend to be connected to utility systems and are called distributed generations (DGs), in contrast to conventional centralized power plants. However, DGs may affect the stability of utility systems, creating problems for utilities. To reduce utilities' burdens (e.g., investment in facility reinforcement) and accelerate DG introduction, advanced operation algorithms for the existing utility systems are urgently needed. This paper presents an advanced voltage regulation method for power systems, since the sending voltage of voltage regulators plays a decisive role in restricting the maximum installable DG capacity (MaxC_DG). The differences between the proposed and existing voltage regulation methods are explained, and the detailed concept is introduced. MaxC_DG estimation through case studies based on a Korean model network verifies the superiority of the proposed method.

Optimum cost design of frames using genetic algorithms

  • Chen, Chulin;Yousif, Salim Taib;Najem, Rabi' Muyad;Abavisani, Ali;Pham, Binh Thai;Wakil, Karzan;Mohamad, Edy Tonnizam;Khorami, Majid
    • Steel and Composite Structures / v.30 no.3 / pp.293-304 / 2019
  • The optimum cost of reinforced concrete plane and space frames has been found using the Genetic Algorithm (GA) method. The design procedure is subject to many constraints controlling the designed sections (beams and columns) based on the standard specifications of the American Concrete Institute ACI Code 2011. The design variables comprise the dimensions of the designed sections, the steel reinforcement, and its topology through the section, obtained from a predetermined database containing all singly reinforced design sections for beams and columns subjected to axial load and uniaxial or biaxial moments. The optimum beam sections designed by GAs were unified through MATLAB to satisfy axial, flexural, shear, and torsion requirements based on the design code. The frames' cost function comprises the cost of concrete and steel reinforcement in addition to the cost of the frames' formwork. The results show that limiting the dimensions of the frames' beams to those of the frames' columns increased the optimum cost of the structure by 2%, while avoiding re-analysis of the optimum designed structures through GA.
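The GA workflow the abstract describes (cost objective plus code-based design constraints) can be sketched with a bare-bones real-coded GA in which a constraint is enforced by a penalty term. The cost coefficients and the capacity constraint below are invented stand-ins, not the ACI-based section checks used in the paper.

```python
import random

def ga_minimize(cost, bounds, pop=30, gens=60, pc=0.8, pm=0.2, seed=7):
    """Bare-bones real-coded GA: tournament selection, arithmetic
    crossover, uniform mutation, elitist survival."""
    rng = random.Random(seed)
    popn = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(gens):
        def pick():                          # tournament of two
            a, b = rng.sample(popn, 2)
            return a if cost(a) < cost(b) else b
        nxt = []
        while len(nxt) < pop:
            p1, p2 = pick(), pick()
            if rng.random() < pc:            # arithmetic crossover
                w = rng.random()
                child = [w * x + (1 - w) * y for x, y in zip(p1, p2)]
            else:
                child = p1[:]
            for d, (lo, hi) in enumerate(bounds):
                if rng.random() < pm:        # uniform mutation
                    child[d] = rng.uniform(lo, hi)
            nxt.append(child)
        popn = sorted(popn + nxt, key=cost)[:pop]   # keep the best half
    return popn[0]

# Toy section cost with a penalty for violating a hypothetical
# capacity constraint x0 * x1 >= 1 (concrete/steel coefficients invented).
def toy_cost(x):
    base = 2.0 * x[0] + 5.0 * x[1]
    penalty = 1e3 * max(0.0, 1.0 - x[0] * x[1])
    return base + penalty

best = ga_minimize(toy_cost, [(0.1, 5.0), (0.1, 5.0)])
```

The penalty approach mirrors how code constraints are commonly folded into a GA fitness function; the paper instead draws sections from a precomputed database of code-compliant designs.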

Strategy to coordinate actions through a plant parameter prediction model during startup operation of a nuclear power plant

  • Jae Min Kim;Junyong Bae;Seung Jun Lee
    • Nuclear Engineering and Technology / v.55 no.3 / pp.839-849 / 2023
  • The development of automation technology to reduce human error by minimizing human intervention is accelerating with artificial intelligence and big data processing technology, even in the nuclear field. Among nuclear power plant operation modes, the startup and shutdown operations are still performed manually and thus have the potential for human error. As part of the development of an autonomous operation system for startup operation, this paper proposes an action coordination strategy to obtain the optimal actions. The lower level of the system consists of operating blocks, created by analyzing the operation tasks, that achieve local goals through soft actor-critic algorithms. However, when multiple agents try to perform conflicting actions, a method is needed to coordinate them; for this, an action coordination strategy was developed in this work as the upper level of the system. Three quantification methods were compared and evaluated based on the future plant state predicted by plant parameter prediction models using long short-term memory networks. The results confirmed that the optimal action satisfying the limiting conditions for operation can be selected by coordinating the action sets. It is expected that this methodology can be generalized through future research.
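The coordination idea in the abstract, predict the future plant state for each candidate action set and keep the one that best respects the limiting conditions for operation (LCO), can be sketched as follows. The linear one-parameter "predictor" and the limit band are illustrative placeholders standing in for the paper's LSTM prediction models and real plant limits.

```python
def predict_param(state, action, steps=10):
    """Placeholder forecast: linear drift per step (stand-in for an LSTM)."""
    traj, x = [], state
    for _ in range(steps):
        x = x + 0.8 * action            # toy plant dynamics
        traj.append(x)
    return traj

def lco_margin(traj, low=0.0, high=100.0):
    """Quantification: worst-case distance to the LCO band over the
    predicted trajectory (negative means a predicted violation)."""
    return min(min(x - low, high - x) for x in traj)

def coordinate(state, candidate_sets):
    """Keep the candidate action set with the largest predicted margin."""
    scored = [(lco_margin(predict_param(state, sum(a))), a)
              for a in candidate_sets]
    return max(scored)[1]

# Three conflicting proposals from lower-level agents (toy numbers).
candidates = [(+2.0,), (+0.5,), (-1.0,)]
best_set = coordinate(state=60.0, candidate_sets=candidates)
```

Scoring every candidate against a predicted trajectory, rather than the current state alone, is what lets the upper level veto an action that only violates a limit several steps later.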

Machine learning-based probabilistic predictions of shear resistance of welded studs in deck slab ribs transverse to beams

  • Vitaliy V. Degtyarev;Stephen J. Hicks
    • Steel and Composite Structures / v.49 no.1 / pp.109-123 / 2023
  • Headed studs welded to steel beams and embedded within the concrete of deck slabs are vital components of modern composite floor systems, where safety and economy depend on the accurate predictions of the stud shear resistance. The multitude of existing deck profiles and the complex behavior of studs in deck slab ribs make developing accurate and reliable mechanical or empirical design models challenging. The paper addresses this issue by presenting a machine learning (ML) model developed from the natural gradient boosting (NGBoost) algorithm capable of producing probabilistic predictions and a database of 464 push-out tests, which is considerably larger than the databases used for developing existing design models. The proposed model outperforms models based on other ML algorithms and existing descriptive equations, including those in EC4 and AISC 360, while offering probabilistic predictions unavailable from other models and producing higher shear resistances for many cases. The present study also showed that the stud shear resistance is insensitive to the concrete elastic modulus, stud welding type, location of slab reinforcement, and other parameters considered important by existing models. The NGBoost model was interpreted by evaluating the feature importance and dependence determined with the SHapley Additive exPlanations (SHAP) method. The model was calibrated via reliability analyses in accordance with the Eurocodes to ensure that its predictions meet the required reliability level and facilitate its use in design. An interactive open-source web application was created and deployed to the cloud to allow for convenient and rapid stud shear resistance predictions with the developed model.
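The "natural gradient" ingredient that distinguishes NGBoost can be illustrated in isolation: fit the parameters of a Normal distribution by descending the natural gradient of the negative log-likelihood, i.e. the ordinary gradient preconditioned by the inverse Fisher information. This stdlib sketch omits the boosting stages and base learners entirely; the data are synthetic.

```python
import math, random

def natural_gradient_fit(data, iters=400, lr=0.5):
    """Fit Normal(mu, sigma) by natural-gradient descent on the NLL:
    nat_grad = Fisher^{-1} @ grad, with parameters (mu, s = log sigma).
    For this parameterization the Fisher matrix is diag(1/sigma^2, 2)."""
    mu, s = 0.0, 0.0
    n = len(data)
    for _ in range(iters):
        sig2 = math.exp(2 * s)
        g_mu = sum(-(x - mu) / sig2 for x in data) / n   # dNLL/dmu
        g_s = sum(1 - (x - mu) ** 2 / sig2 for x in data) / n  # dNLL/ds
        mu -= lr * sig2 * g_mu        # precondition by sigma^2
        s -= lr * g_s / 2             # precondition by 1/2
    return mu, math.exp(s)

rng = random.Random(0)
data = [rng.gauss(10.0, 2.0) for _ in range(500)]
mu_hat, sigma_hat = natural_gradient_fit(data)
```

NGBoost applies the same preconditioned gradient per boosting stage, fitting base learners to it, which is what makes its distributional (probabilistic) output stable to train.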

A DQN-based Two-Stage Scheduling Method for Real-Time Large-Scale EVs Charging Service

  • Tianyang Li;Yingnan Han;Xiaolong Li
    • KSII Transactions on Internet and Information Systems (TIIS) / v.18 no.3 / pp.551-569 / 2024
  • With the rapid development of the electric vehicle (EV) industry, EV charging services are becoming more and more important, especially when a sudden drop in air temperature or a public holiday leads large numbers of EVs to seek charging devices (CDs) in a short time. In such scenarios, an inefficient EV charging scheduling algorithm can lead to poor service quality, for example long queueing times for EVs and unreasonable idle time for charging devices. To deal with this issue, this paper proposes a Deep Q-Network (DQN) based two-stage scheduling method for large-scale EV charging service. Fine-grained states with two dedicated neural networks are proposed to optimize the sequencing of EVs and the charging station (CS) arrangement. Two efficient algorithms are presented to obtain the optimal charging schedule for large-scale EV charging demand. Three case studies show the superiority of the proposal in terms of high service quality (minimized average queuing time for EVs and maximized charging performance at both the EV and CS sides) and greater scheduling efficiency. The code and data are publicly available.
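The scheduling problem in the abstract (dispatch arriving EVs to stations to minimize queueing) can be illustrated with tabular Q-learning as a small stand-in for the paper's DQN; a DQN replaces the table below with a neural network. The two-station environment, reward, and all numbers are invented for illustration.

```python
import random

rng = random.Random(3)
Q = {}
def q(s):
    return Q.setdefault(s, [0.0, 0.0])

alpha, gamma, eps, CAP = 0.3, 0.9, 0.1, 5

def step(state, a):
    """Arriving EV joins station a; each station then finishes one car.
    Reward is the negative number of cars already queued ahead of it."""
    qa, qb = state
    wait = (qa, qb)[a]
    if a == 0:
        qa += 1
    else:
        qb += 1
    nxt = (min(max(qa - 1, 0), CAP), min(max(qb - 1, 0), CAP))
    return -wait, nxt

# Sampled one-step Q-learning backups from random start states.
for _ in range(30000):
    s = (rng.randrange(CAP), rng.randrange(CAP))
    a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda i: q(s)[i])
    r, s2 = step(s, a)
    q(s)[a] += alpha * (r + gamma * max(q(s2)) - q(s)[a])

# Learned greedy policy: dispatch an arriving EV to the shorter queue.
policy = lambda s: max((0, 1), key=lambda i: q(s)[i])
```

The learned policy recovers the intuitive rule (join the shorter queue), which is the behavior a DQN scales up to large state spaces with many stations and fine-grained features.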

A Study on the Development Trend of Artificial Intelligence Using Text Mining Technique: Focused on Open Source Software Projects on Github (텍스트 마이닝 기법을 활용한 인공지능 기술개발 동향 분석 연구: 깃허브 상의 오픈 소스 소프트웨어 프로젝트를 대상으로)

  • Chong, JiSeon;Kim, Dongsung;Lee, Hong Joo;Kim, Jong Woo
    • Journal of Intelligence and Information Systems / v.25 no.1 / pp.1-19 / 2019
  • Artificial intelligence (AI) is one of the main driving forces leading the Fourth Industrial Revolution. The technologies associated with AI have already shown abilities equal to or better than human performance in many fields, including image and speech recognition. In particular, many efforts have been made to identify current technology trends and analyze their development directions, because AI technologies can be utilized in a wide range of fields including medicine, finance, manufacturing, service, and education. Major platforms for developing complex AI algorithms for learning, reasoning, and recognition have been opened to the public as open source projects, and technologies and services that utilize them have increased rapidly; this has been confirmed as one of the major reasons for the fast development of AI technologies. Additionally, the spread of the technology owes much to open source software developed by major global companies that supports natural language recognition, speech recognition, and image recognition. Therefore, this study aimed to identify practical trends in AI technology development by analyzing open source software (OSS) projects associated with AI, which have been developed through the online collaboration of many parties. A list of major AI-related projects created from 2000 to July 2018 on Github was searched and collected, and the development trends of major technologies were examined in detail by applying text mining techniques to topic information, which indicates the characteristics of the collected projects and their technical fields. The analysis showed that the number of software development projects per year was less than 100 until 2013, but increased to 229 projects in 2014 and 597 in 2015, and the number of open source projects related to AI then increased rapidly in 2016 (2,559 OSS projects).
The number of projects initiated in 2017 was 14,213, almost four times the total number of projects created from 2009 to 2016 (3,555 projects), and 8,737 projects were initiated from January to July 2018. The development trend of AI-related technologies was evaluated by dividing the study period into three phases, with the appearance frequency of topics indicating the technology trends of AI-related OSS projects. Natural language processing remained at the top in all years, implying that related OSS was developed continuously. Until 2015, the programming languages Python, C++, and Java were among the top ten most frequent topics; after 2016, programming languages other than Python disappeared from the top ten and were replaced by platforms supporting the development of AI algorithms, such as TensorFlow and Keras. Reinforcement learning algorithms and convolutional neural networks, which have been used in various fields, were also frequent topics. Topic network analysis showed that the most important topics by degree centrality were similar to those by appearance frequency. The main difference was that visualization and medical imaging topics appeared at the top of the list, although they had not been there from 2009 to 2012, indicating that OSS was being developed in the medical field to utilize AI technology. Moreover, although computer vision was in the top 10 by appearance frequency from 2013 to 2015, it was not in the top 10 by degree centrality. The topics at the top of the degree centrality list were otherwise similar to those at the top of the appearance frequency list, with the ranks of convolutional neural networks and reinforcement learning changing slightly.
The trend of technology development was examined using the appearance frequency of topics and degree centrality. Machine learning showed the highest frequency and the highest degree centrality in all years. Notably, although the deep learning topic showed a low frequency and a low degree centrality between 2009 and 2012, its rank rose abruptly between 2013 and 2015, and in recent years both measures have been high. TensorFlow first appeared in the 2013-2015 phase, and its appearance frequency and degree centrality soared between 2016 and 2018, placing it at the top of the lists after deep learning and Python. Computer vision and reinforcement learning showed no abrupt increase or decrease and had relatively low appearance frequency and degree centrality compared with the above-mentioned topics. Based on these results, it is possible to identify the fields in which AI technologies are actively developed, and the results of this study can be used as a baseline dataset for more empirical analysis of future technology trends.
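The two measures the study leans on, topic appearance frequency and degree centrality in a topic co-occurrence network, are simple to compute. The sketch below uses a handful of made-up project topic lists, not the Github data.

```python
from collections import Counter
from itertools import combinations

# Made-up project topic lists (stand-ins for Github project topics).
projects = [
    ["machine-learning", "deep-learning", "tensorflow"],
    ["machine-learning", "nlp", "python"],
    ["deep-learning", "computer-vision", "tensorflow"],
    ["machine-learning", "reinforcement-learning"],
]

# Appearance frequency: in how many projects each topic occurs.
freq = Counter(t for topics in projects for t in set(topics))

# Degree centrality: number of distinct topics each topic co-occurs with,
# using unordered topic pairs within a project as network edges.
edges = set()
for topics in projects:
    edges |= {frozenset(p) for p in combinations(set(topics), 2)}
degree = Counter(t for e in edges for t in e)

top_topic = freq.most_common(1)[0][0]
```

Frequency captures how often a topic appears; degree centrality captures how broadly it connects to other topics, which is why the two rankings can disagree (as the study found for computer vision).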

Active VM Consolidation for Cloud Data Centers under Energy Saving Approach

  • Saxena, Shailesh;Khan, Mohammad Zubair;Singh, Ravendra;Noorwali, Abdulfattah
    • International Journal of Computer Science & Network Security / v.21 no.11 / pp.345-353 / 2021
  • Cloud computing represents a new era of computing, formed through the combination of service-oriented architecture (SOA), the Internet, and grid computing with virtualization technology. Virtualization is the concept through which every cloud is able to provide on-demand services to users. Most IT service providers adopt cloud-based services to meet their users' high demand for computation, as the technology is flexible, reliable, and scalable. The energy-performance tradeoff has become the main challenge in cloud computing as its acceptance and popularity increase day by day. Cloud data centers require a huge power supply for the virtualization of servers to maintain on-demand high computing. High power demand increases service providers' energy costs and also harms the environment through CO2 emissions. An optimization of cloud computing based on the energy-performance tradeoff is required to balance energy saving with the cloud's quality-of-service (QoS) policies. A study of the power usage of resources in cloud data centers as a function of assigned workload reports that an idle server consumes about 50% of its peak-utilization power [1]. Therefore, a large number of underutilized servers in a cloud data center degrades the energy-performance tradeoff. To handle this issue, much research has proposed energy-efficient algorithms that minimize energy consumption while maintaining the service level agreement (SLA) at a satisfactory level. Virtual machine (VM) consolidation is one such technique for balancing energy use against the SLA. In this paper, we explore reinforcement learning with fuzzy logic (RFL) for VM consolidation to achieve energy-based SLA goals. In the proposed RFL-based active VM consolidation, the primary objective is to manage physical server (PS) nodes so as to avoid over-utilization and under-utilization, and to optimize the placement of VMs.
A dynamic threshold (based on RFL) is proposed for detecting over-utilized PSs. For an over-utilized PS, a fuzzy-logic-based VM selection policy is proposed, which selects VMs for migration to maintain the SLA balance. Additionally, a VM placement policy is incorporated through the categorization of non-over-utilized servers as balanced, under-utilized, or critical. The CloudSim toolkit is used to simulate the proposed work on real-world workload traces from the CoMon project defined by PlanetLab. Simulation results show that the proposed policies are the most energy efficient compared to others, in terms of reductions in both electricity usage and SLA violations.
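The fuzzy part of the detection idea can be sketched as mapping CPU utilization to fuzzy memberships and flagging a physical server as over-utilized when the "high" membership dominates. The triangular membership functions and their breakpoints below are illustrative, not the paper's RFL-learned dynamic thresholds.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def memberships(util):
    """Fuzzy utilization classes (breakpoints are illustrative)."""
    return {
        "low": tri(util, -0.4, 0.0, 0.5),
        "balanced": tri(util, 0.3, 0.55, 0.8),
        "high": tri(util, 0.6, 1.0, 1.4),
    }

def over_utilized(util):
    """Flag a server when the 'high' membership dominates the others."""
    m = memberships(util)
    return m["high"] > max(m["low"], m["balanced"])

status = {u: over_utilized(u) for u in (0.2, 0.55, 0.9)}
```

In the paper the breakpoints are not fixed: reinforcement learning adapts the threshold over time, which is what makes the detection "dynamic" rather than a static cutoff.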

Analysis on Filter Bubble reinforcement of SNS recommendation algorithm identified in the Russia-Ukraine war (러시아-우크라이나 전쟁에서 파악된 SNS 추천알고리즘의 필터버블 강화현상 분석)

  • CHUN, Sang-Hun;CHOI, Seo-Yeon;SHIN, Seong-Joong
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.22 no.3 / pp.25-30 / 2022
  • This study examines the filter bubble reinforcement phenomenon of SNS recommendation algorithms such as YouTube's, which is a characteristic of the Russia-Ukraine war (2022), and the factors determining victory or defeat in hybrid war. This war is identified as a hybrid war, and the use of new media based on SNS recommendation algorithms is emerging as a factor that determines the outcome of the war beyond political leverage. For this reason, the filter bubble phenomenon goes beyond the dictionary meaning of a confirmation bias that limits the information exposed to viewers. A YouTube video of Ukrainian President Zelensky encouraging protests in Kyiv garnered 7.02 million views, whereas Putin's speech garnered only 800,000, evidence that the latter was not surfaced by the recommendation algorithm. This war of SNS recommendation algorithms tends to develop into an algorithm war between the big tech companies of the US (YouTube, Twitter, Facebook) and China (TikTok). Under the influence of US companies, Ukraine is able to receive international support, while in Russia, under the influence of Chinese companies, Putin's approval rating is over 80%, a contrasting result. Since this algorithmic empowerment is based on the confirmation bias of public opinion through the filter bubble, the argument that new guidelines addressing this distortion should be presented soon is drawing attention through the Russia-Ukraine war.

Unlicensed Band Traffic and Fairness Maximization Approach Based on Rate-Splitting Multiple Access (전송률 분할 다중 접속 기술을 활용한 비면허 대역의 트래픽과 공정성 최대화 기법)

  • Jeon Zang Woo;Kim Sung Wook
    • KIPS Transactions on Computer and Communication Systems / v.12 no.10 / pp.299-308 / 2023
  • As the spectrum shortage problem has been accelerated by the emergence of various services, New Radio-Unlicensed (NR-U) has appeared, allowing users who formerly communicated in licensed bands to communicate in unlicensed bands. However, NR-U network users reduce the performance of Wi-Fi network users communicating in the same unlicensed band. In this paper, we aim to simultaneously maximize the fairness and throughput of an unlicensed band in which NR-U network users and Wi-Fi network users coexist. First, we propose an optimal power allocation scheme based on the Monte Carlo Policy Gradient method of reinforcement learning to maximize the sum rate of NR-U networks utilizing rate-splitting multiple access in unlicensed bands. Then, we propose a channel occupancy time division algorithm based on the sequential Raiffa bargaining solution of game theory that can simultaneously maximize system throughput and fairness for the coexistence of NR-U and Wi-Fi networks in the same unlicensed band. Simulation results show that rate-splitting multiple access outperforms conventional multiple access in terms of the converged sum rate under the same transmission power. In addition, we compare the data transfer amount and fairness of NR-U network users, Wi-Fi network users, and the total system, and show that the proposed channel occupancy time division algorithm satisfies throughput and fairness simultaneously better than other algorithms.
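Monte Carlo Policy Gradient (REINFORCE), the method named in the abstract, can be sketched on a toy one-step power-choice problem with a softmax policy over discrete power levels. The reward function below is a made-up proxy for sum rate with an interference penalty, not the paper's NR-U channel model.

```python
import math, random

rng = random.Random(5)
theta = [0.0, 0.0, 0.0]                 # preferences for 3 power levels
POWERS = (0.2, 0.5, 1.0)

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def sum_rate_proxy(p):
    """Toy tradeoff: rate grows with power, convex interference penalty."""
    return math.log(1 + 8 * p) - 1.5 * p * p

lr, baseline = 0.1, 0.0
for _ in range(5000):
    probs = softmax(theta)
    a = rng.choices(range(3), weights=probs)[0]   # sample from the policy
    r = sum_rate_proxy(POWERS[a])
    adv = r - baseline                   # variance-reducing baseline
    baseline += 0.05 * (r - baseline)
    for i in range(3):                   # grad log pi_a = 1{i=a} - pi_i
        theta[i] += lr * adv * ((1 if i == a else 0) - probs[i])

best_action = max(range(3), key=lambda i: theta[i])
```

The update ascends sampled returns weighted by the log-probability gradient, which is the Monte Carlo (sampled-return) flavor of policy gradient the paper applies to continuous power allocation.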