• Title/Summary/Keyword: objective function (목적 함수)


An Analysis of the Dynamics between Media Coverage and Stock Market on Digital New Deal Policy: Focusing on Companies Related to the Fourth Industrial Revolution (디지털 뉴딜 정책에 대한 언론 보도량과 주식 시장의 동태적 관계 분석: 4차산업혁명 관련 기업을 중심으로)

  • Sohn, Kwonsang;Kwon, Ohbyung
    • The Journal of Society for e-Business Studies
    • /
    • v.26 no.3
    • /
    • pp.33-53
    • /
    • 2021
  • At the crossroads of social change driven by the spread of the Fourth Industrial Revolution and the prolonged COVID-19 pandemic, the Korean government announced the Digital New Deal policy on July 14, 2020. The policy's primary goal is to create new businesses by accelerating digital transformation in the public sector and in industries built around data, networks, and artificial intelligence technologies. In a rapidly changing social environment, however, information asymmetry about the future benefits of a technology can produce differences in the public's ability to assess the direction and effectiveness of policies, resulting in uncertainty about their practical effects. The media, on the other hand, leads the formation of discourse by communicating government policies to the public and provides knowledge about specific issues through the news. In other words, as media coverage of a particular policy increases, issue concentration rises, which in turn affects public decision-making. The purpose of this study is therefore to verify the dynamic relationship between media coverage and the stock market with respect to the Korean government's Digital New Deal policy, using Granger causality tests, impulse response functions, and variance decomposition analysis. To this end, the daily stock turnover ratio, daily price-earnings ratio, and EWMA volatility of digital-technology-based companies related to the policy among KOSDAQ-listed companies were set as variables. As a result, keyword search volume, the daily stock turnover ratio, and EWMA volatility each have a bidirectional Granger-causal relationship with media coverage, and an increase in media coverage has a strong impact on keyword search volume for the Digital New Deal policy. The impulse response analysis also showed that a shock to media coverage produced a sharp drop in EWMA volatility; this influence gradually grew over time and helped mitigate stock market volatility. Based on these findings, the amount of media coverage of the Digital New Deal policy has a significant dynamic relationship with the stock market.
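The EWMA volatility used as a stock-market variable in this abstract can be sketched in a few lines. This is a generic illustration, not the authors' code: the decay factor lambda = 0.94 follows the common RiskMetrics convention and the choice of seeding the variance with the first squared return are both assumptions, since the abstract does not state them.

```python
# Sketch of EWMA (exponentially weighted moving average) volatility.
# Assumptions (not stated in the abstract): lambda = 0.94 (RiskMetrics
# convention) and variance seeded with the first squared return.
def ewma_volatility(returns, lam=0.94):
    """Return the EWMA volatility series for a list of daily returns."""
    var = returns[0] ** 2              # seed variance with first squared return
    vols = []
    for r in returns:
        var = lam * var + (1 - lam) * r ** 2   # recursive variance update
        vols.append(var ** 0.5)                # volatility = sqrt(variance)
    return vols
```

Because the update is recursive, a single large return raises the volatility estimate immediately, and its influence then decays geometrically at rate lambda.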

A Study on the Development of High Sensitivity Collision Simulation with Digital Twin (디지털 트윈을 적용한 고감도 충돌 시뮬레이션 개발을 위한 연구)

  • Ki, Jae-Sug;Hwang, Kyo-Chan;Choi, Ju-Ho
    • Journal of the Society of Disaster Information
    • /
    • v.16 no.4
    • /
    • pp.813-823
    • /
    • 2020
  • Purpose: To maximize the stability and productivity of high-risk, high-cost work, such as dismantling facilities inside a reactor, we intend to simulate the work in advance using digital twin technology, which can closely mirror the specifications of the actual control equipment. Motion control errors, which can arise from the time gap between the precision control equipment and the simulation when applying digital twin technology, can cause hazards such as collisions between hazardous facilities and the control equipment. Prior research is needed to eliminate and control these situations. Method: Unity 3D is currently the most popular engine for developing simulations, but control errors can be caused by time correction inside the Unity 3D engine. Such errors are expected in many environments and may vary with the development environment, such as system specifications. To demonstrate this, we developed a collision simulation with the Unity 3D engine, conducted collision experiments under various conditions, organized and analyzed the results, and derived tolerances for precision control equipment from them. Result: In the collision simulation experiments, the time correction applied at 1/1000-second engine-internal function calls produces a per-unit-time distance error in the motion control of the colliding objects, and this distance error is proportional to the collision velocity. Conclusion: Remote dismantling simulators using digital twin technology should therefore limit movement speed according to the precision required of the control devices, given the hardware and software environment and manual control. In addition, the system development environment, hardware specifications, the size of the modeling data for the simulated control equipment and facilities, the allowable error of the operational control equipment, and the required working speed must all be taken into account.
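The paper's key result, that the distance error from the engine's time correction is proportional to the collision velocity, follows from a simple kinematic relation: a timing gap dt_err at speed v displaces the object by v * dt_err. A minimal illustration with hypothetical function names, not the authors' Unity code:

```python
# Hypothetical sketch (not the authors' Unity code): with a fixed timing
# gap dt_err between control equipment and simulation, a body moving at
# speed v is displaced by v * dt_err, so the distance error grows
# linearly with velocity -- matching the paper's finding for the
# 1/1000 s time correction of engine-internal function calls.
def position_error(velocity, dt_err=0.001):
    """Distance error for a timing gap dt_err (seconds) at a given velocity."""
    return velocity * dt_err
```

Doubling the collision velocity doubles the positional error, which is why the conclusion recommends capping movement speed to stay within the control equipment's tolerance.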

Assessing the Sensitivity of Runoff Projections Under Precipitation and Temperature Variability Using IHACRES and GR4J Lumped Runoff-Rainfall Models (집중형 모형 IHACRES와 GR4J를 이용한 강수 및 기온 변동성에 대한 유출 해석 민감도 평가)

  • Woo, Dong Kook;Jo, Jihyeon;Kang, Boosik;Lee, Songhee;Lee, Garim;Noh, Seong Jin
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.43 no.1
    • /
    • pp.43-54
    • /
    • 2023
  • Due to climate change, drought and flood occurrences have been increasing. Accurate projections of watershed discharges are imperative to effectively manage the natural disasters caused by climate change. However, climate change and hydrological model uncertainty can lead to imprecise analysis. To address these issues, we used two lumped models, IHACRES and GR4J, to compare and analyze the changes in discharge under climate stress scenarios. The Hapcheon and Seomjingang dam basins were the study sites, and the Nash-Sutcliffe efficiency (NSE) and the Kling-Gupta efficiency (KGE) were used for parameter optimization. Twenty years of discharge, precipitation, and temperature data (1995-2014) were used and divided into training and testing sets with a 70/30 split. The accuracies of the modeled results were relatively high during both the training and testing periods (NSE > 0.74, KGE > 0.75), indicating that both models could reproduce the previously observed discharges. To explore the impacts of climate change on the modeled discharges, we developed climate stress scenarios by changing precipitation from -50 % to +50 % in 1 % steps and temperature from 0 ℃ to 8 ℃ in 0.1 ℃ steps based on the two decades of weather data, resulting in 8,181 climate stress scenarios. We analyzed the yearly maximum, abundant, and ordinary discharges projected by the two lumped models. We found that the trends of the maximum and abundant discharges modeled by IHACRES and GR4J became more pronounced as the changes in precipitation and temperature increased; the opposite was true for ordinary water levels. Our study demonstrates that quantitative evaluation of model uncertainty is important for reducing the impacts of climate change on water resources.
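The two calibration metrics named above have standard closed forms: NSE = 1 - Σ(O-S)² / Σ(O-Ō)², and KGE (in its 2009 form) = 1 - √((r-1)² + (α-1)² + (β-1)²), where r is the linear correlation between observed and simulated series, α the ratio of their standard deviations, and β the ratio of their means. A self-contained sketch assuming these standard definitions rather than any paper-specific variant:

```python
# Standard definitions of the two efficiency metrics used for model
# calibration; generic illustration, not the authors' code.
import statistics as st

def _pearson(x, y):
    """Linear correlation coefficient between two equal-length series."""
    mx, my = st.mean(x), st.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 is no better than mean(obs)."""
    m = st.mean(obs)
    return 1 - sum((o - s) ** 2 for o, s in zip(obs, sim)) / sum((o - m) ** 2 for o in obs)

def kge(obs, sim):
    """Kling-Gupta efficiency (2009 form): 1 is a perfect fit."""
    r = _pearson(obs, sim)
    alpha = st.stdev(sim) / st.stdev(obs)   # variability ratio
    beta = st.mean(sim) / st.mean(obs)      # bias ratio
    return 1 - ((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2) ** 0.5
```

NSE penalizes only squared residuals, while KGE decomposes the error into correlation, variability, and bias terms, which is why the two are often reported together as here.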

Effects of Benzo[a]pyrene on Growth and Photosynthesis of Phytoplankton (식물플랑크톤의 성장과 광합성에 대한 benzo[a]pyrene의 영향)

  • Kim, Sun-Ju;Shin, Kyung-Soon;Moon, Chang-Ho;Park, Dong-Won;Chang, Man
    • Korean Journal of Environmental Biology
    • /
    • v.22
    • /
    • pp.54-62
    • /
    • 2004
  • We examined the impacts of an anthropogenic pollutant, benzo[a]pyrene, on the growth and photosynthesis of five marine phytoplankton species (Skeletonema costatum, Heterosigma akashiwo, Prorocentrum dentatum, P. minimum, and Akashiwo sanguinea) that are dominant in Korean coastal waters. After 72 h of exposure to benzo[a]pyrene, a dramatic decrease in cell numbers was observed in the range of 1 to 10 μg L⁻¹ for S. costatum, P. minimum, and P. dentatum, and at the lower concentrations of 0.1 to 1 μg L⁻¹ for A. sanguinea and H. akashiwo. Among the five species, the highest half-maximal growth inhibition concentration (IC50) was 6.20 μg L⁻¹ for P. minimum, followed by 2.14 μg L⁻¹ for P. dentatum, 1.68 μg L⁻¹ for S. costatum, 0.74 μg L⁻¹ for H. akashiwo, and 0.10 μg L⁻¹ for A. sanguinea. All five species recovered after exposure to the low concentration of 1 μg L⁻¹ once transferred to fresh media, but species exposed to the high concentrations of 10 and 100 μg L⁻¹ did not recover, with the exception of P. minimum. These results indicate that the thecate dinoflagellate P. minimum is the most tolerant of the chemical, while the athecate dinoflagellate A. sanguinea is not. Generally, the cell-specific photosynthetic capacity of H. akashiwo exposed to the low concentrations of 0.1 and 1 μg L⁻¹ was higher than that of control cells, whereas cells exposed to the high concentrations of 5 and 10 μg L⁻¹ showed negligible photosynthesis within the first few days of the experiment. In the case of cells exposed to 5 μg L⁻¹, the photosynthetic capacity increased after day 12 toward the end of the experiment. This indicates that H. akashiwo may utilize benzo[a]pyrene as a carbon source for its growth when exposed to low concentrations. These results suggest that anthropogenic pollutants such as benzo[a]pyrene may significantly influence the succession of phytoplankton species composition and primary production in coastal marine environments.
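The IC50 values reported above are the concentrations producing 50 % growth inhibition. As a rough illustration of how such a value can be interpolated from a dilution series, here is a hypothetical log-linear sketch; the concentrations and inhibition percentages in the test are made up for illustration, they are not the study's raw data, and the authors may have used a different fitting method (e.g. probit or logistic regression):

```python
# Hypothetical sketch: estimate IC50 (concentration giving 50 % growth
# inhibition) by log-linear interpolation between the two tested
# concentrations that bracket 50 %. Not the study's method or data.
import math

def ic50(concs, inhibitions):
    """Log-linear interpolation of the concentration at 50 % inhibition.

    concs must be in increasing order with matching inhibition percentages.
    """
    for (c1, i1), (c2, i2) in zip(zip(concs, inhibitions),
                                  zip(concs[1:], inhibitions[1:])):
        if i1 <= 50 <= i2:
            frac = (50 - i1) / (i2 - i1)    # position of 50 % between the two points
            return 10 ** (math.log10(c1) + frac * (math.log10(c2) - math.log10(c1)))
    raise ValueError("50 % inhibition not bracketed by the tested concentrations")
```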

The Measurement of Sensitivity and Comparative Analysis of Simplified Quantitation Methods to Measure Dopamine Transporters Using [I-123]IPT Pharmacokinetic Computer Simulations ([I-123]IPT 약역학 컴퓨터시뮬레이션을 이용한 민감도 측정 및 간편화된 운반체 정량분석 방법들의 비교분석 연구)

  • Son, Hye-Kyung;Nha, Sang-Kyun;Lee, Hee-Kyung;Kim, Hee-Joung
    • The Korean Journal of Nuclear Medicine
    • /
    • v.31 no.1
    • /
    • pp.19-29
    • /
    • 1997
  • Recently, [I-123]IPT SPECT has been used for the early diagnosis of Parkinson's patients (PP) by imaging dopamine transporters. Dynamic time-activity curves in the basal ganglia (BG) and occipital cortex (OCC) were obtained for 2 hours without blood samples. These data were then used to measure dopamine transporters by the operationally defined ratio method (BG-OCC)/OCC at 2 hrs, the binding potential $R_v=k_3/k_4$ obtained with a graphic method, or $R_A$ = (ABBG-ABOCC)/ABOCC over 2 hrs, where ABBG represents the accumulated binding activity in the basal ganglia (${\int}^{120min}_0 BG(t)dt$) and ABOCC the accumulated binding activity in the occipital cortex (${\int}^{120min}_0 OCC(t)dt$). The purpose of this study was to examine IPT pharmacokinetics and investigate the usefulness of the simplified methods (BG-OCC)/OCC, $R_A$, and $R_v$, whose values are often assumed to reflect the true $k_3/k_4$. The rate constants $K_1$, $k_2$, $k_3$, and $k_4$ used for the simulations were derived from [I-123]IPT SPECT and arterialized blood data with a standard three-compartment model. The sensitivities and time-activity curves in BG and OCC were computed by changing $K_1$ and $k_3$ (BG only) every 5 min over 2 hours. The values of (BG-OCC)/OCC, $R_A$, and $R_v$ were then computed from the time-activity curves, and linear regression analysis was used to measure the accuracy of these methods. The rate constants $K_1$, $k_2$, $k_3$, $k_4$ at BG and OCC were $1.26{\pm}5.41\%$, $0.044{\pm}19.58\%$, $0.031{\pm}24.36\%$, $0.008{\pm}22.78\%$ and $1.36{\pm}4.76\%$, $0.170{\pm}6.89\%$, $0.007{\pm}23.89\%$, $0.007{\pm}45.09\%$, respectively. The sensitivities $({\Delta}S/S)/({\Delta}k_3/k_3)$ and $({\Delta}S/S)/({\Delta}K_1/K_1)$ at 30 min and 120 min were measured as (0.19, 0.50) and (0.61, 0.23), respectively. The correlation coefficients and slopes of (BG-OCC)/OCC, $R_A$, and $R_v$ against $k_3/k_4$ were (0.98, 1.00, 0.99) and (1.76, 0.47, 1.25), respectively. These simulation results indicate that a late [I-123]IPT SPECT image may represent the distribution of dopamine transporters. Good correlations were found between (BG-OCC)/OCC, $R_A$, or $R_v$ and the true $k_3/k_4$, although the slopes were not unity. Pharmacokinetic computer simulation may be a very useful technique for studying dopamine transporter systems.
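The simplified ratio measures defined in the abstract, (BG-OCC)/OCC at the end of the scan and R_A over the accumulated activities, can be computed directly from sampled time-activity curves. A minimal sketch assuming trapezoidal integration (the abstract does not state the numerical integration scheme, so that choice is an assumption):

```python
# Sketch of the simplified ratio measures, given already-sampled
# time-activity curves BG(t) and OCC(t). Illustration only; trapezoidal
# integration is an assumption, not stated in the abstract.
def accumulate(times, activity):
    """Trapezoidal integral of a sampled time-activity curve."""
    return sum((t2 - t1) * (a1 + a2) / 2
               for t1, t2, a1, a2 in zip(times, times[1:], activity, activity[1:]))

def ratio_at_end(bg, occ):
    """(BG-OCC)/OCC computed from the final frame of the scan."""
    return (bg[-1] - occ[-1]) / occ[-1]

def r_a(times, bg, occ):
    """R_A = (ABBG - ABOCC) / ABOCC over the full scan duration."""
    ab_bg, ab_occ = accumulate(times, bg), accumulate(times, occ)
    return (ab_bg - ab_occ) / ab_occ
```

Both measures avoid blood sampling entirely, which is the practical appeal the study evaluates against the compartmental ground truth $k_3/k_4$.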


Evaluating Reverse Logistics Networks with Centralized Centers : Hybrid Genetic Algorithm Approach (집중형센터를 가진 역물류네트워크 평가 : 혼합형 유전알고리즘 접근법)

  • Yun, YoungSu
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.4
    • /
    • pp.55-79
    • /
    • 2013
  • In this paper, we propose a hybrid genetic algorithm (HGA) approach to effectively solve the reverse logistics network with centralized centers (RLNCC). In the proposed HGA approach, a genetic algorithm (GA) is used as the main algorithm. For implementing the GA, a new bit-string representation scheme using 0 and 1 values is suggested, which makes it easy to generate the GA's initial population. As genetic operators, the elitist strategy in an enlarged sampling space developed by Gen and Chang (1997), a new two-point crossover operator, and a new random mutation operator are used for selection, crossover, and mutation, respectively. For the hybrid concept, an iterative hill climbing method (IHCM) developed by Michalewicz (1994) is inserted into the HGA search loop. The IHCM is a local search technique that precisely explores the space to which the GA search has converged. The RLNCC is composed of collection centers, remanufacturing centers, redistribution centers, and secondary markets in a reverse logistics network; of these, only one collection center, one remanufacturing center, one redistribution center, and one secondary market should be opened. Some assumptions are made to implement the RLNCC effectively. The RLNCC is represented by a mixed integer programming (MIP) model using indexes, parameters, and decision variables. The objective function of the MIP model minimizes the total cost, which consists of transportation cost, fixed cost, and handling cost. The transportation cost arises from transporting the returned products between the centers and secondary markets. The fixed cost is determined by the opening or closing decision at each center and secondary market. That is, if there are three collection centers (with opening costs of 10.5, 12.1, and 8.9 for collection centers 1, 2, and 3, respectively) and collection center 1 is opened while the others are closed, the fixed cost is 10.5. The handling cost is the cost of treating the products returned from customers at each center and secondary market opened at each RLNCC stage. The RLNCC is solved by the proposed HGA approach. In numerical experiments, the proposed HGA and a conventional competing approach are compared using various measures of performance. As the conventional competing approach, the GA approach of Yun (2013) is used; it has no local search technique such as the IHCM used in the proposed HGA approach. CPU time, optimal solution, and optimal setting are used as performance measures. Two types of the RLNCC with different numbers of customers, collection centers, remanufacturing centers, redistribution centers, and secondary markets are presented for comparing the HGA and GA approaches. The MIP models for the two RLNCC types are programmed in Visual Basic 6.0, and the computing environment is an IBM-compatible PC with a 3.06 GHz CPU and 1 GB RAM running Windows XP. The parameters used in the HGA and GA approaches are 10,000 generations in total, a population size of 20, a crossover rate of 0.5, a mutation rate of 0.1, and a search range of 2.0 for the IHCM. Twenty iterations in total are run to eliminate the randomness of the HGA and GA searches. With performance comparisons, network representations by opening/closing decision, and convergence processes for the two RLNCC types, the experimental results show that the HGA performs significantly better than the GA in terms of the optimal solution, though the GA is slightly quicker in CPU time. Finally, the proposed HGA approach proved more efficient than the conventional GA approach on both RLNCC types, since the former combines a GA search process with an additional local search process, while the latter has a GA search process alone. For future study, much larger RLNCCs will be tested for the robustness of our approach.
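The hybrid structure described above, a GA search loop with a local hill-climbing refinement inserted into it, can be sketched on a toy objective. This is a generic illustration of the HGA idea, not the paper's MIP model: the one-max objective, the simplified operators, and all parameter values below are stand-ins, whereas the paper uses the elitist strategy of Gen and Chang (1997), two-point crossover, and Michalewicz's IHCM.

```python
# Toy sketch of a hybrid GA: a standard GA loop over 0/1 strings with a
# hill-climbing pass applied to the best individual each generation.
# The one-max objective (maximize the count of 1 bits) is a stand-in
# for the reverse-logistics cost model; all parameters are illustrative.
import random

def fitness(bits):
    return sum(bits)                            # one-max: number of 1 bits

def hill_climb(bits):
    """Flip each bit once, keeping any flip that improves fitness."""
    best = list(bits)
    for i in range(len(best)):
        cand = list(best)
        cand[i] ^= 1
        if fitness(cand) > fitness(best):
            best = cand
    return best

def hga(n_bits=20, pop_size=20, generations=50, p_mut=0.1, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)     # elitist selection: keep better half
        parents = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)      # one-point crossover (simplified)
            child = a[:cut] + b[cut:]
            for i in range(n_bits):             # random bit-flip mutation
                if rng.random() < p_mut:
                    child[i] ^= 1
            children.append(child)
        pop = parents + children
        pop[0] = hill_climb(pop[0])             # local refinement of the best individual
    return max(pop, key=fitness)
```

The hill-climbing step is what distinguishes the HGA from the plain GA baseline in the comparison: it exploits the neighborhood of the GA's best solution, at the cost of extra CPU time per generation.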

Construction and Application of Intelligent Decision Support System through Defense Ontology - Application example of Air Force Logistics Situation Management System (국방 온톨로지를 통한 지능형 의사결정지원시스템 구축 및 활용 - 공군 군수상황관리체계 적용 사례)

  • Jo, Wongi;Kim, Hak-Jin
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.2
    • /
    • pp.77-97
    • /
    • 2019
  • The large amount of data that emerges from the hyperconnected environment of the Fourth Industrial Revolution is a major factor distinguishing it from existing production environments. This environment has the two-sided feature of producing data while using it, and the data thus produced generates further value. Because of this massive scale, future information systems need to process more data, in terms of quantity, than existing systems; in terms of quality, they also need the ability to extract the necessary information from large amounts of data. In a small information system a person can understand the system accurately and obtain the information needed, but in complex systems that are difficult to understand accurately, acquiring the desired information becomes increasingly hard. In other words, more accurate processing of large amounts of data has become a basic requirement for future information systems. This problem of efficient information-system performance can be addressed by building a semantic web, which enables varied information processing by expressing the collected data as an ontology understandable by computers as well as people. The military, like most other organizations, has introduced IT, and most of its work is now done through information systems. As existing systems come to contain ever larger amounts of data, efforts are needed to make them easier to use through better data utilization. An ontology-based system forms a large semantic network of data through connections with other systems, has a wide range of usable databases, and has the advantage of faster and more precise search through the relationships between predefined concepts.
In this paper, we propose a defense ontology as a method for effective data management and decision support. To judge its applicability and effectiveness in an actual system, we reconstructed the existing Air Force logistics situation management system as an ontology-based system. That system was built to strengthen commanders' and practitioners' management and control of the logistics situation by providing real-time information on maintenance and distribution, as the complicated logistics information system with its large amount of data had become difficult to use. However, it simply takes pre-specified information from the existing logistics system and displays it as web pages; little can be confirmed beyond the few items specified in advance, extending it with additional functions is time-consuming, and it is organized by category without a search function. It therefore has the disadvantage of being easy to use only for those who already know the system well, as with the existing system. The ontology-based logistics situation management system is designed to provide intuitive visualization of the complex information in the existing logistics information system through the ontology. To construct it, useful functions such as performance-based logistics contract management and a component dictionary were additionally identified and included in the ontology. To confirm that the constructed ontology can support decision-making, meaningful analysis functions were implemented, such as calculating aircraft utilization rates and querying performance-based logistics contracts.
In particular, in contrast to past ontology studies that built static ontology databases, this study models time-series data whose values change over time, such as the daily state of each aircraft, in the ontology; through the constructed ontology, it is confirmed that utilization rates can be calculated on various criteria. In addition, the data related to performance-based logistics contracts, introduced as a new maintenance method for aircraft and other munitions, can be queried in various ways, and the performance indexes used in such contracts are easy to calculate through the ontology's reasoning and functions. We also propose a new performance index that complements the limitations of the currently applied indicators and calculate it through the ontology, confirming the usability of the constructed ontology. Finally, the failure rate and reliability of each component can be calculated, including from the MTBF data of the selected items based on actual part consumption, and from these the reliability of the mission and of the system are computed. To confirm the usability of the constructed ontology-based logistics situation management system, the proposed system was evaluated with the Technology Acceptance Model (TAM), a representative model for measuring technology acceptance, and was found to be more useful and convenient than the existing system.
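The component failure-rate and reliability calculations mentioned at the end can be sketched under the standard exponential-failure assumption (constant failure rate λ = 1/MTBF, giving R(t) = e^(-t/MTBF)); the abstract does not state its distributional assumption, so this is an illustration, not the authors' method. For components in series, a mission succeeds only if every component survives, so system reliability is the product of the component reliabilities:

```python
# Sketch of reliability from MTBF under the exponential-failure
# assumption (constant failure rate), which the abstract does not state
# explicitly -- an illustrative assumption, not the authors' method.
import math

def reliability(mtbf, t):
    """Probability that a component with the given MTBF survives for time t."""
    return math.exp(-t / mtbf)               # R(t) = e^(-t/MTBF), lambda = 1/MTBF

def series_system_reliability(mtbfs, t):
    """Reliability of components in series: all must survive the mission."""
    return math.prod(reliability(m, t) for m in mtbfs)
```

With MTBF values derived from actual part consumption, as the abstract describes, the same two formulas yield both the per-component reliability and the mission/system reliability.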