• Title/Summary/Keyword: Time series predictive analysis (시계열 예측분석)


Multidimensional data generation of water distribution systems using adversarially trained autoencoder (적대적 학습 기반 오토인코더(ATAE)를 이용한 다차원 상수도관망 데이터 생성)

  • Kim, Sehyeong; Jun, Sanghoon; Jung, Donghwi
    • Journal of Korea Water Resources Association / v.56 no.7 / pp.439-449 / 2023
  • Recent advancements in data measuring technology have facilitated the installation of various sensors, such as pressure meters and flow meters, to effectively assess the real-time conditions of water distribution systems (WDSs). However, as cities expand extensively, the factors that impact the reliability of measurements have become increasingly diverse. In particular, demand data, one of the most significant hydraulic variables in WDSs, is challenging to measure directly and is prone to missing values, making the development of accurate data generation models more important. Therefore, this paper proposes an adversarially trained autoencoder (ATAE) model based on generative deep learning techniques to accurately estimate demand data in WDSs. The proposed model utilizes two neural networks: a generative network and a discriminative network. The generative network generates demand data using the information provided by the measured pressure data, while the discriminative network evaluates the generated demand outputs and provides feedback to the generator so that it learns the distinctive features of the data. To validate its performance, the ATAE model is applied to a real distribution system in Austin, Texas, USA. The study analyzes the impact of data uncertainty by calculating the accuracy of the ATAE's prediction results for varying levels of uncertainty in the demand and pressure time series data. Additionally, the model's performance is evaluated by comparing the results for different data collection periods (low, average, and high demand hours) to assess its ability to generate demand data based on water consumption levels.
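The abstract describes a generator that maps measured pressures to nodal demands and a discriminator that scores the generated demands, but it does not give the architecture. The following is only a minimal PyTorch sketch of that adversarial training idea, with hypothetical layer sizes (N_PRESSURE, N_DEMAND) and random stand-in data.

```python
# Minimal sketch of an adversarially trained autoencoder-style generator:
# pressures -> generated demands, judged by a discriminator.
# Layer sizes and the random data are placeholders, not values from the paper.
import torch
import torch.nn as nn

N_PRESSURE, N_DEMAND = 30, 100   # hypothetical sensor/node counts

generator = nn.Sequential(            # maps pressure vector -> demand vector
    nn.Linear(N_PRESSURE, 128), nn.ReLU(),
    nn.Linear(128, N_DEMAND))
discriminator = nn.Sequential(        # scores a demand vector as real/generated
    nn.Linear(N_DEMAND, 128), nn.ReLU(),
    nn.Linear(128, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCELoss()

pressure = torch.randn(256, N_PRESSURE)    # stand-in measured pressures
demand_true = torch.randn(256, N_DEMAND)   # stand-in measured demands

for epoch in range(100):
    # 1) discriminator: push real demands toward 1, generated demands toward 0
    fake = generator(pressure).detach()
    d_loss = bce(discriminator(demand_true), torch.ones(256, 1)) + \
             bce(discriminator(fake), torch.zeros(256, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) generator: reconstruct demands and try to fool the discriminator
    gen = generator(pressure)
    g_loss = nn.functional.mse_loss(gen, demand_true) + \
             bce(discriminator(gen), torch.ones(256, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```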

Analysis of National Stream Drying Phenomena using DrySAT-WFT Model: Focusing on Inflow of Dam and Weir Watersheds in 5 River Basins (DrySAT-WFT 모형을 활용한 전국 하천건천화 분석: 전국 5대강 댐·보 유역의 유입량을 중심으로)

  • LEE, Yong-Gwan; JUNG, Chung-Gil; KIM, Won-Jin; KIM, Seong-Joon
    • Journal of the Korean Association of Geographic Information Studies / v.23 no.2 / pp.53-69 / 2020
  • The increase of the impermeable area due to industrialization and urban development distorts the hydrological circulation system and causes serious stream drying phenomena. To manage this, it is necessary to develop a technology for impact assessment of stream drying phenomena that enables quantitative evaluation and prediction. In this study, the causes of streamflow reduction were assessed for dam and weir watersheds in the five major river basins of South Korea using the distributed hydrological model DrySAT-WFT (Drying Stream Assessment Tool and Water Flow Tracking) and GIS time series data. For the modeling, 5 influencing factors of stream drying phenomena (soil erosion, forest growth, road-river disconnection, groundwater use, and urban development) were selected and prepared as GIS-based time series spatial data from 1976 to 2015. The DrySAT-WFT was calibrated and validated from 2005 to 2015 at 8 multipurpose dam watersheds (Chungju, Soyang, Andong, Imha, Hapcheon, Seomjin river, Juam, and Yongdam) and 4 gauging stations (Osucheon, Mihocheon, Maruek, and Chogang), respectively. The calibration results showed that the coefficient of determination (R²) was 0.76 on average (0.66 to 0.84) and the Nash-Sutcliffe model efficiency was 0.62 on average (0.52 to 0.72). Based on the 2010s (2006~2015) weather conditions for the whole period, the streamflow impact was estimated by applying GIS data for each decade (1980s: 1976~1985, 1990s: 1986~1995, 2000s: 1996~2005, 2010s: 2006~2015). The results showed that, compared to the 1980s, the 2010s average wet streamflow (Q95) decreased by 4.1~6.3%, the average normal streamflow (Q185) decreased by 6.7~9.1%, and the average drought streamflow (Q355) decreased by 8.4~10.4% on the whole. During 1975~2015, among the 5 influencing factors, the increase in groundwater use made the largest contribution (40.5%), followed by forest growth (29.0%).
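Calibration quality in this entry is reported with the coefficient of determination and the Nash-Sutcliffe efficiency. For reference, a small NumPy sketch of those two standard goodness-of-fit metrics (not code from the paper, and with made-up example flows) is shown below.

```python
# Goodness-of-fit metrics used to report DrySAT-WFT calibration quality.
# Generic textbook formulas; the example inflow values are made up.
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r2(obs, sim):
    """Coefficient of determination: squared Pearson correlation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.corrcoef(obs, sim)[0, 1] ** 2

obs = np.array([12.0, 15.0, 30.0, 22.0, 18.0])   # example daily inflows (m^3/s)
sim = np.array([11.0, 16.5, 27.0, 24.0, 17.0])
print(f"NSE = {nse(obs, sim):.2f}, R2 = {r2(obs, sim):.2f}")
```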

A study on the derivation and evaluation of flow duration curve (FDC) using deep learning with a long short-term memory (LSTM) networks and soil water assessment tool (SWAT) (LSTM Networks 딥러닝 기법과 SWAT을 이용한 유량지속곡선 도출 및 평가)

  • Choi, Jung-Ryel; An, Sung-Wook; Choi, Jin-Young; Kim, Byung-Sik
    • Journal of Korea Water Resources Association / v.54 no.spc1 / pp.1107-1118 / 2021
  • Climate change brought on by global warming has increased the frequency of floods and droughts on the Korean Peninsula, along with the casualties and physical damage resulting from them. Preparation for and response to these water disasters require national-level planning for water resource management. In addition, watershed-level management of water resources requires flow duration curves (FDC) derived from continuous data based on long-term observations. Traditionally, in water resource studies, physical rainfall-runoff models are widely used to generate duration curves. However, a number of recent studies have explored the use of data-based deep learning techniques for runoff prediction. Physical models produce hydraulically and hydrologically reliable results, but they require a high level of understanding and may also take longer to operate. On the other hand, data-based deep learning techniques offer the benefits of requiring less input data and shorter operation times, but the relationship between input and output data is processed in a black box, making it impossible to consider hydraulic and hydrological characteristics. This study chose one model from each category. For the physical model, this study calculated long-term data without missing values using parameter calibration of the Soil Water Assessment Tool (SWAT), a physical model tested for its applicability in Korea and other countries. The data were used as training data for the Long Short-Term Memory (LSTM) data-based deep learning technique. An analysis of the time-series data found that, during the calibration period (2017-18), the Nash-Sutcliffe Efficiency (NSE) and the determination coefficient used for fit comparison were higher by 0.04 and 0.03, respectively, indicating that the SWAT results are superior to the LSTM results. In addition, the annual time-series data from the models were sorted in descending order, and the resulting flow duration curves were compared with the duration curve based on the observed flow; the NSE values for the SWAT and LSTM models were 0.95 and 0.91, and the determination coefficients were 0.96 and 0.92, respectively. The findings indicate that both models yield good performance. Even though the LSTM requires improved simulation accuracy in the low-flow sections, it appears to be widely applicable to calculating flow duration curves for large basins, where model development and operation take longer due to vast data input, and for ungauged basins with insufficient input data.
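The flow duration curve in this entry is obtained by sorting a simulated (or observed) flow series in descending order and pairing each value with its exceedance probability. The NumPy sketch below illustrates that generic construction; it is not the authors' code, and the daily flow values are synthetic.

```python
# Deriving a flow duration curve (FDC): sort flows in descending order and
# pair each with its exceedance probability. Illustrative only; the daily
# flow values are synthetic, not data from the study.
import numpy as np

daily_flow = np.random.default_rng(0).lognormal(mean=2.0, sigma=0.8, size=365)

sorted_flow = np.sort(daily_flow)[::-1]               # descending order
rank = np.arange(1, sorted_flow.size + 1)
exceedance = rank / (sorted_flow.size + 1) * 100      # % of time flow is exceeded

# Exceedance flows such as Q95 (flow equalled or exceeded 95% of the time)
q95 = np.interp(95, exceedance, sorted_flow)
print(f"Q95 = {q95:.2f}")
```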

Comparison of physics-based and data-driven models for streamflow simulation of the Mekong river (메콩강 유출모의를 위한 물리적 및 데이터 기반 모형의 비교·분석)

  • Lee, Giha; Jung, Sungho; Lee, Daeeop
    • Journal of Korea Water Resources Association / v.51 no.6 / pp.503-514 / 2018
  • Recently, the hydrological regime of the Mekong river has been changing drastically due to climate change and haphazard watershed development, including dam construction. Information on hydrological features such as the streamflow of the Mekong river is required for water disaster prevention and sustainable water resources development in the river-sharing countries. In this study, runoff simulations at the Kratie station on the lower Mekong river are performed using SWAT (Soil and Water Assessment Tool), a physics-based hydrologic model, and LSTM (Long Short-Term Memory), a data-driven deep learning algorithm. The SWAT model was set up based on globally available databases (topography: HydroSHED, land use: GLCF-MODIS, soil: FAO soil map, rainfall: APHRODITE, etc.) and then simulated daily discharge from 2003 to 2007. The LSTM was built using the open-source deep learning library TensorFlow, and its deep-layer neural networks were trained solely on daily water level data from 10 stations upstream of Kratie during two periods: 2000~2002 and 2008~2014. Then, the LSTM simulated daily discharge for 2003~2007, as in the SWAT model. The simulation results show that the Nash-Sutcliffe Efficiency (NSE) of each model was 0.9 (SWAT) and 0.99 (LSTM), respectively. For simple simulation of the hydrological time series of large ungauged watersheds, a data-driven model such as the LSTM is more applicable than a physics-based hydrological model, whose complexity stems from the burden of assembling various databases, because the LSTM is able to memorize the preceding time-series sequences and reflect them in its predictions.
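As a rough illustration of the data-driven side of this comparison, the sketch below sets up a many-to-one LSTM in TensorFlow/Keras (the library named in the abstract) that maps a window of daily water levels from 10 upstream stations to one discharge value. The window length, layer sizes, and random training data are assumptions, not the configuration reported in the paper.

```python
# Rough sketch of a many-to-one LSTM: a window of daily water levels from
# 10 upstream stations -> one daily discharge value.
# Window length, layer size, and the random data are assumptions.
import numpy as np
import tensorflow as tf

WINDOW, N_STATIONS = 30, 10           # assumed 30-day input window

model = tf.keras.Sequential([
    tf.keras.Input(shape=(WINDOW, N_STATIONS)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1),         # discharge (m^3/s)
])
model.compile(optimizer="adam", loss="mse")

# Stand-in training data shaped (samples, window, stations)
x = np.random.rand(500, WINDOW, N_STATIONS).astype("float32")
y = np.random.rand(500, 1).astype("float32")
model.fit(x, y, epochs=5, batch_size=32, verbose=0)

pred = model.predict(x[:1])           # simulated discharge for one window
```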

The Prediction of Currency Crises through Artificial Neural Network (인공신경망을 이용한 경제 위기 예측)

  • Lee, Hyoung Yong; Park, Jung Min
    • Journal of Intelligence and Information Systems / v.22 no.4 / pp.19-43 / 2016
  • This study examines the causes of the Asian exchange rate crisis and compares it to the European Monetary System crisis. In 1997, emerging countries in Asia experienced financial crises; previously, in 1992, currencies in the European Monetary System had undergone the same experience, followed by Mexico in 1994. The objective of this paper lies in generating useful insights from these crises. This research presents a comparison of South Korea, the United Kingdom, and Mexico, and then compares three different models for prediction. Previous studies of economic crises focused largely on the manual construction of causal models using linear techniques. However, the weakness of such models stems from the prevalence of nonlinear factors in reality. This paper uses a structural equation model to analyze the causes, followed by a neural network model to circumvent the linear model's weaknesses. The models are examined in the context of predicting exchange rates. The data were quarterly, and the Consumer Price Index, Gross Domestic Product, Interest Rate, Stock Index, Current Account, and Foreign Reserves were the independent variables for prediction. However, the time periods of each country's data differ. LISREL is an emerging method and, as such, requires a fresh approach to financial crisis prediction model design, along with the flexibility to accommodate unexpected change. This paper indicates that the neural network model has the greater prediction performance across Korea, Mexico, and the United Kingdom, although in Korea the multiple regression shows better performance, and in Mexico the multiple regression is nearly indistinguishable from the LISREL model. Although LISREL does not show significant performance, a refined model is expected to produce better results. In future work, the structural model should incorporate psychological factors and other unobservable areas. The reason for the low hit ratio is that the alternative model in this paper uses only financial market data, so other important factors cannot be considered. Korea's hit ratio is lower than that of the United Kingdom, which suggests that other constructs affect its financial market; the same holds for Mexico. The United Kingdom's financial market is more strongly influenced and explained by the financial factors than those of Korea and Mexico.
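To make the model comparison concrete, the sketch below contrasts a multiple regression with a small neural network on quarterly indicators using scikit-learn. Only the variable list mirrors the abstract; the generated data, network size, and evaluation split are placeholders, and the authors' structural equation modelling step in LISREL is not reproduced here.

```python
# Comparing multiple regression and a small neural network for prediction
# from quarterly indicators. Data and hyperparameters are placeholders;
# only the variable list follows the abstract.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
features = ["CPI", "GDP", "interest_rate", "stock_index",
            "current_account", "foreign_reserves"]
X = rng.normal(size=(120, len(features)))          # 30 years of quarterly data
y = X @ rng.normal(size=len(features)) + 0.1 * rng.normal(size=120)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

linear = LinearRegression().fit(X_tr, y_tr)
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                   random_state=0).fit(X_tr, y_tr)

print("multiple regression R2:", round(linear.score(X_te, y_te), 3))
print("neural network R2:     ", round(mlp.score(X_te, y_te), 3))
```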

An Anomalous Event Detection System based on Information Theory (엔트로피 기반의 이상징후 탐지 시스템)

  • Han, Chan-Kyu; Choi, Hyoung-Kee
    • Journal of KIISE: Information Networking / v.36 no.3 / pp.173-183 / 2009
  • We present a real-time monitoring system for detecting anomalous network events using entropy. The entropy accounts for the effects of disorder in the system: when an abnormal factor arises and agitates the current system, the entropy shows an abrupt change. In this paper we deliberately model the Internet as two networks to measure the entropy; packets flowing between these two networks sustain the current value. In the proposed system we keep track of the value of the entropy over time to pinpoint sudden changes in the value. The time-series data of the entropy are transformed into a two-dimensional domain to help visually inspect the activities on the network. We examine the system using network traffic traces containing notorious worms and DoS attacks on a testbed. Furthermore, we compare our proposed system with time-series forecasting methods, such as EWMA, Holt-Winters, and PCA, in terms of sensitivity. The results suggest that our approach is able to detect anomalies with fairly high accuracy. Our contributions are twofold: (1) highly sensitive detection of anomalies and (2) visualization of network activities to alert on anomalies.
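As a generic illustration of the entropy signal this entry relies on (not the authors' implementation), the sketch below computes the Shannon entropy of a per-window traffic-feature distribution, here assumed to be packet counts per destination port, and flags windows where the entropy jumps abruptly.

```python
# Shannon entropy of a per-window traffic feature distribution, with a simple
# abrupt-change flag. Generic illustration; the synthetic "packets per
# destination port" counts and the naive threshold are assumptions.
import numpy as np

def shannon_entropy(counts):
    p = np.asarray(counts, float)
    p = p[p > 0] / p.sum()                  # normalize to a probability vector
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
windows = [rng.integers(1, 50, size=100) for _ in range(60)]   # normal traffic
windows.append(np.r_[rng.integers(1, 50, size=99), 5000])      # one port floods

entropies = np.array([shannon_entropy(w) for w in windows])
jumps = np.abs(np.diff(entropies))
threshold = jumps[:-1].mean() + 3 * jumps[:-1].std()           # naive threshold
alerts = np.where(jumps > threshold)[0] + 1
print("suspicious windows:", alerts)
```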

Development of selection method for Hydrological Reference Station (수문학적 참조관측소 선정방법 개발)

  • Chi Young Kim; Young Hun Jung; Hee Joo Lim; Hyeok Jin Im
    • Proceedings of the Korea Water Resources Association Conference / 2023.05a / pp.271-271 / 2023
  • A Hydrological Reference Station (HRS) is a station that produces high-quality data for identifying long-term trends in streamflow variability. Advanced countries operate stations based on a similar concept, although the operational purposes and definitions of the hydrological reference station differ slightly. Australia defines them as monitoring points for predicting long-term changes in water resource availability under climate change, while the United States operates the Hydrological Benchmark Network (HBN) to provide reference values for studying natural changes in hydrological characteristics over time and changes in the hydrological environment caused by human activities. The United Kingdom operates a Reference Hydrological Network (RHN), installed mainly in natural watersheds, to investigate the hydrological response of watersheds to climate change. In 2006, the WMO asked its member countries to designate 'streamflow stations suitable for climate research' and has been compiling the related data in a database at the Global Runoff Data Centre (GRDC) in Germany. Abroad, hydrological reference stations are selected by considering stations that hold high-quality data within streamflow monitoring networks whose watersheds have near-natural characteristics. In Korea, however, stations with long-term streamflow records are relatively scarce, and even those stations produce data mainly along the main stems of large rivers for direct use in water management tasks such as flood forecasting and dam operation. It is therefore difficult at present to select reference stations that satisfy internationally accepted criteria, and it is necessary to select stations that could become hydrological reference stations in the future and to expand the reference network through long-term monitoring. In this study, the selection criteria for hydrological reference stations used abroad were compared and reviewed, and selection criteria suited to Korean conditions were developed. The criteria consider (1) the degree of watershed development, (2) the degree of artificial regulation by dams and reservoirs, (3) inter-basin water transfers such as withdrawals or releases, and (4) the length and accuracy of the streamflow record. The selection procedure is presented in four steps: (1) preparation of a preliminary list of water-level stations, (2) analysis of station information (watershed characteristics, time-series data, etc.), (3) selection of candidate hydrological reference stations, and (4) prioritization through review by related agencies and experts.
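The four selection criteria listed above lend themselves to a simple screening step. The sketch below is only a hypothetical illustration of such a screen over a candidate station list; the attribute names and thresholds are invented and are not part of the study.

```python
# Hypothetical screening of candidate water-level stations against the four
# criteria named in the abstract. Attribute names and thresholds are invented
# for illustration; the study's actual scoring is not specified here.
candidates = [
    {"name": "Station A", "impervious_ratio": 0.03, "upstream_dam": False,
     "interbasin_transfer": False, "record_years": 25, "rating_grade": "good"},
    {"name": "Station B", "impervious_ratio": 0.20, "upstream_dam": True,
     "interbasin_transfer": False, "record_years": 30, "rating_grade": "good"},
]

def is_candidate_reference_station(s):
    return (s["impervious_ratio"] < 0.05        # (1) little watershed development
            and not s["upstream_dam"]           # (2) little artificial regulation
            and not s["interbasin_transfer"]    # (3) no inter-basin water transfer
            and s["record_years"] >= 20         # (4) long, accurate flow record
            and s["rating_grade"] == "good")

shortlist = [s["name"] for s in candidates if is_candidate_reference_station(s)]
print(shortlist)   # ['Station A']
```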


Development of a complex failure prediction system using Hierarchical Attention Network (Hierarchical Attention Network를 이용한 복합 장애 발생 예측 시스템 개발)

  • Park, Youngchan; An, Sangjun; Kim, Mintae; Kim, Wooju
    • Journal of Intelligence and Information Systems / v.26 no.4 / pp.127-148 / 2020
  • The data center is a physical facility for accommodating computer systems and related components, and it is an essential foundation technology for next-generation core industries such as big data, smart factories, wearables, and smart homes. In particular, with the growth of cloud computing, the proportional expansion of data center infrastructure is inevitable. Monitoring the health of these data center facilities is a way to maintain and manage the system and prevent failure. If a failure occurs in some elements of the facility, it may affect not only the relevant equipment but also other connected equipment, and may cause enormous damage. In particular, failures in IT facilities are irregular because of interdependence, and it is difficult to identify their causes. Previous studies on predicting failure in data centers predicted failure by treating a single server as a single state, without assuming that devices interact. Therefore, in this study, data center failures were classified into failures occurring inside the server (Outage A) and failures occurring outside the server (Outage B), and the analysis focused on complex failures occurring within servers. Server-external failures include power, cooling, user errors, etc.; since such failures can be prevented in the early stages of data center facility construction, various solutions are being developed. On the other hand, the causes of failures occurring in servers are difficult to determine, and adequate prevention has not yet been achieved. In particular, this is because server failures do not occur in isolation: a failure in one server may cause failures in other servers or be triggered by failures originating from other servers. In other words, while existing studies analyzed failures under the assumption of a single server that does not affect other servers, this study analyzes failures under the assumption that servers affect one another. To define the complex failure situation in the data center, failure history data for each piece of equipment in the data center were used. Four major failure types are considered in this study: Network Node Down, Server Down, Windows Activation Services Down, and Database Management System Service Down. The failures of each device are sorted in chronological order, and when a failure occurs in one piece of equipment, any failure occurring in another piece of equipment within 5 minutes of that time is defined as a simultaneous failure. After constructing sequences of devices that failed at the same time, 5 devices that frequently appear together within the sequences were selected, and cases in which the selected devices failed at the same time were confirmed through visualization. Since the server resource information collected for failure analysis is time-series data with temporal flow, we used Long Short-Term Memory (LSTM), a deep learning algorithm that can predict the next state from previous states. In addition, unlike the single-server case, the Hierarchical Attention Network deep learning model structure was used to reflect the fact that the degree of involvement in multiple failures differs by server. This algorithm increases prediction accuracy by giving greater weight to servers with greater impact on the failure. The study began with defining the types of failure and selecting the analysis target. In the first experiment, the same collected data were treated both as a single-server state and as a multiple-server state and then compared and analyzed. The second experiment improved prediction accuracy in the complex-failure case by optimizing the threshold for each server. In the first experiment, under the single-server assumption, three of the five servers were predicted not to have failed even though failures actually occurred, whereas under the multiple-server assumption all five servers were predicted to have failed. These results support the hypothesis that servers affect one another. This study confirmed that prediction performance was superior when multiple servers were assumed rather than a single server. In particular, applying the Hierarchical Attention Network algorithm, which assumes that each server's effect differs, helped improve the analysis, and applying a different threshold for each server further improved prediction accuracy. This study showed that failures whose causes are difficult to determine can be predicted from historical data, and it presents a model that predicts failures occurring in data center servers. It is expected that failures can be prevented in advance using the results of this study.
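To make the modelling idea concrete, the sketch below is a minimal PyTorch rendering of the described structure: a per-server LSTM encodes each server's resource time series, an attention layer weights the server representations, and a final layer predicts the probability of a complex failure. All dimensions and the random input are placeholders, not the paper's configuration.

```python
# Minimal sketch of the described structure: per-server LSTM encoders, an
# attention layer over server representations, and a failure-probability head.
# Dimensions and inputs are placeholders, not the paper's configuration.
import torch
import torch.nn as nn

N_SERVERS, N_METRICS, WINDOW, HIDDEN = 5, 8, 60, 32

class ServerAttentionFailureModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.LSTM(N_METRICS, HIDDEN, batch_first=True)
        self.attn = nn.Linear(HIDDEN, 1)      # scores each server's influence
        self.head = nn.Linear(HIDDEN, 1)      # complex-failure probability

    def forward(self, x):                     # x: (batch, servers, time, metrics)
        b = x.size(0)
        _, (h, _) = self.encoder(x.reshape(b * N_SERVERS, WINDOW, N_METRICS))
        h = h[-1].reshape(b, N_SERVERS, HIDDEN)          # one vector per server
        weights = torch.softmax(self.attn(h), dim=1)     # attention over servers
        context = (weights * h).sum(dim=1)               # weighted combination
        return torch.sigmoid(self.head(context)).squeeze(-1), weights

model = ServerAttentionFailureModel()
x = torch.randn(4, N_SERVERS, WINDOW, N_METRICS)         # stand-in resource data
prob, server_weights = model(x)
print(prob.shape, server_weights.shape)                  # (4,) and (4, 5, 1)
```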

A Study on the quantitative measurement methods of MRTD and prediction of detection distance for Infrared surveillance equipments in military (군용 열영상장비 최소분해가능온도차의 정량적 측정 방법 및 탐지거리 예측에 관한 연구)

  • Jung, Yeong-Tak; Lim, Jae-Seong; Lee, Ji-Hyeok
    • Journal of the Korea Academia-Industrial cooperation Society / v.18 no.5 / pp.557-564 / 2017
  • The purpose of the thermal imaging observation device mounted on the K-series tanks of the Republic of Korea military is to convert infrared radiation into visual information, providing information about the environment under conditions of restricted visibility. Among the various performance indicators of thermal observation devices, such as the field of view, magnification, resolution, MTF, NETD, and Minimum Resolvable Temperature Difference (MRTD), the MRTD is the most important because it indicates both the spatial frequency and the resolvable temperature difference. However, the NATO standard method of measuring the MRTD contains many subjective factors. As the measurement result can vary depending on subjective factors such as the human eye, the observer's mental condition, and the measurement conditions, the MRTD obtained is not stable. In this study, this qualitative MRTD measurement scheme is converted into a quantitative indicator based on gray scale values using image processing. By converting the average of the gray-scale differences between the black and white images into the MRTD, the mean values can be used to determine whether the performance requirements of the defense specification are met. The mean value can also be used to discriminate between detection, recognition, and identification, and the detection distance of the thermal equipment can be analyzed under various environmental conditions, such as altostratus clouds, heavy rain, and fog.
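The quantitative indicator described here amounts to averaging gray-level differences between the dark and bright regions of the captured target image. The NumPy sketch below illustrates that kind of measurement on a synthetic image; the region layout and any conversion of the gray-scale difference to a temperature difference are placeholders, not the study's calibration.

```python
# Illustration of the gray-scale-difference idea: average gray levels of the
# dark (bar) and bright (background) regions of a target image and take their
# difference. The synthetic image and region masks are placeholders; the
# study's conversion to an MRTD value is not reproduced.
import numpy as np

# Synthetic 8-bit image: dark vertical bars on a brighter background
img = np.full((100, 140), 150, dtype=np.uint8)
bar_mask = np.zeros_like(img, dtype=bool)
for left in (10, 45, 80, 115):
    img[:, left:left + 15] = 120                  # dark bars
    bar_mask[:, left:left + 15] = True

mean_bar = img[bar_mask].astype(float).mean()
mean_background = img[~bar_mask].astype(float).mean()
gray_difference = mean_background - mean_bar      # quantitative contrast measure
print(f"mean gray-scale difference: {gray_difference:.1f}")
```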

Overseas Construction Order Forecasting Using Time Series Model (시계열 모형을 이용한 해외건설 수주 전망)

  • Kim, Woon Joong
    • Korean Journal of Construction Engineering and Management / v.19 no.2 / pp.107-116 / 2018
  • Since 2010, Korea's overseas construction orders have seen dramatic fluctuations. I propose causes and remedies for the industry as a whole. Orders recorded an annual average of $63.8 billion from 2011 to 2014, having reached their highest level, $71.6 billion, in 2010, which marked the peak of Korea's overseas construction. However, due to a decline in international oil prices starting in the second half of 2014, Korea's overseas construction orders have followed suit, recording $46.1 billion in 2014, $28.2 billion in 2016, and $29.0 billion in 2017. Facing uncertainty in Korea's overseas construction market, caused by the continued slow growth of the global economy, Korean EPC contractors are at a critical point with regard to their award-winning capabilities. Together with declining oil prices, the challenges have never been bigger. To mitigate them, I suggest a policy direction for growing and developing the overseas construction industry. Proper counterplans are needed to foster Korea's overseas construction industry, and forecasting the total order amount for overseas construction projects is essential. Analyzing the contract award and tender structure and its changing trends in both the overseas and world construction markets should also be included. Korea has great potential and global competitiveness, and these measures will serve to enhance Korea's overall export strategy in uncertain overseas markets and the global economy.
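The abstract does not state which time series model is used for the order forecast, so the sketch below only illustrates the general idea named in the title with a simple ARIMA fit in statsmodels. The annual series values and the ARIMA order are invented for the example and are not the paper's data or specification.

```python
# Generic illustration of forecasting annual overseas construction orders with
# a simple ARIMA model. The series values and the (p, d, q) order are invented
# placeholders; the paper's actual model specification is not given here.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Made-up annual order amounts in billions of USD (illustrative only)
orders = pd.Series(
    [70.0, 64.0, 65.0, 66.0, 60.0, 45.0, 30.0, 29.0],
    index=pd.period_range("2010", periods=8, freq="Y"),
)

model = ARIMA(orders, order=(1, 1, 0)).fit()
forecast = model.forecast(steps=3)      # projected orders for the next 3 years
print(forecast.round(1))
```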