• Title/Summary/Keyword: Anomaly detection


A Study on the Air Pollution Monitoring Network Algorithm Using Deep Learning (심층신경망 모델을 이용한 대기오염망 자료확정 알고리즘 연구)

  • Lee, Seon-Woo; Yang, Ho-Jun; Lee, Mun-Hyung; Choi, Jung-Moo; Yun, Se-Hwan; Kwon, Jang-Woo; Park, Ji-Hoon; Jung, Dong-Hee; Shin, Hye-Jung
    • Journal of Convergence for Information Technology / v.11 no.11 / pp.57-65 / 2021
  • We propose a novel method for detecting abnormal data with specific symptoms in an air pollution measurement system using deep learning. Existing methods generally detect abnormal data by classifying data whose patterns differ from the existing time-series data, but such approaches have limitations in detecting specific symptoms. In this paper, we use the DeepLab V3+ model, which is mainly used for foreground segmentation of images, with its structure modified to handle one-dimensional data. Instead of images, the model receives time-series data from multiple sensors and can detect data showing specific symptoms. In addition, we improve the model's performance by reducing the complexity of the noisy time-series data using 'piecewise aggregation approximation'. The experimental results confirm that anomalous data can be detected successfully.
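  • The following is a minimal, illustrative sketch (not the authors' code) of piecewise aggregate approximation (PAA), the 'piecewise aggregation approximation' step used above to reduce the complexity of noisy time series before feeding them to the 1D-adapted DeepLab V3+ model; the segment count and the input signal are assumptions.

      import numpy as np

      def paa(series: np.ndarray, n_segments: int) -> np.ndarray:
          """Piecewise aggregate approximation: average a 1D series over equal-width segments."""
          n = len(series)
          edges = np.linspace(0, n, n_segments + 1).astype(int)  # segment boundaries
          return np.array([series[edges[i]:edges[i + 1]].mean() for i in range(n_segments)])

      # Illustrative use: reduce a noisy 1000-point sensor trace to 50 segment means.
      rng = np.random.default_rng(0)
      signal = np.sin(np.linspace(0, 20, 1000)) + 0.3 * rng.standard_normal(1000)
      reduced = paa(signal, n_segments=50)
      print(reduced.shape)  # (50,)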

Study on Anomaly Detection Method of Improper Foods using Import Food Big data (수입식품 빅데이터를 이용한 부적합식품 탐지 시스템에 관한 연구)

  • Cho, Sanggoo; Choi, Gyunghyun
    • The Journal of Bigdata / v.3 no.2 / pp.19-33 / 2018
  • Owing to the increase in FTAs, food trade, and the diverse preferences of consumers, food imports have increased at a tremendous rate every year. While inspection checks cover only about 20% of total food imports, the budget and manpower available for the government's import inspection control are reaching their limit. Sudden imported-food incidents can cause enormous social and economic losses, so a predictive system that forecasts the compliance of imported food, together with preemptive measures, would greatly improve the efficiency and effectiveness of import safety control management. A huge amount of data has already been accumulated, and processed foods account for 75% of total food imports. Big data analysis techniques are used to extract meaningful information from such large volumes of data; unfortunately, few studies have analyzed imported food and its implications using this accumulated import data. In this context, this study applied a variety of classification algorithms from the field of machine learning and suggested a data preprocessing method, based on the generation of new derived variables, to improve model accuracy. The study also compared the performance of the predictive classification algorithms with general base classifiers. Among the various base classifiers, the Gaussian Naïve Bayes prediction model showed the best performance in detecting and predicting the non-conformity of imported food. The resulting anomaly detection model based on Gaussian Naïve Bayes is expected to reduce the burden of import food inspection and to raise the detection rate of non-conforming food, which will greatly improve the efficiency of import food safety control and the speed of import customs clearance.
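  • As a minimal sketch of the kind of Gaussian Naïve Bayes classifier the study found most effective, the snippet below trains scikit-learn's GaussianNB on synthetic placeholder features; the feature matrix and labels are illustrative assumptions, not the paper's derived variables.

      import numpy as np
      from sklearn.naive_bayes import GaussianNB
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import classification_report

      # Placeholder features standing in for derived variables about each import declaration.
      rng = np.random.default_rng(42)
      X = rng.normal(size=(5000, 4))
      # 1 = non-conforming item; this labeling rule is synthetic, for illustration only.
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000) > 1.5).astype(int)

      X_train, X_test, y_train, y_test = train_test_split(
          X, y, test_size=0.3, stratify=y, random_state=0)
      model = GaussianNB().fit(X_train, y_train)
      print(classification_report(y_test, model.predict(X_test)))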

Detection of Signs of Hostile Cyber Activity against External Networks based on Autoencoder (오토인코더 기반의 외부망 적대적 사이버 활동 징후 감지)

  • Park, Hansol; Kim, Kookjin; Jeong, Jaeyeong; Jang, Jisu; Youn, Jaepil; Shin, Dongkyoo
    • Journal of Internet Computing and Services / v.23 no.6 / pp.39-48 / 2022
  • Cyberattacks around the world continue to increase, and their damage extends beyond government facilities to civilians. These issues emphasize the importance of developing a system that can identify and detect cyber anomalies early. To this end, several studies have trained machine learning models on BGP (Border Gateway Protocol) data to identify anomalies. However, BGP data is imbalanced: abnormal records are far fewer than normal ones. This biases the model's learning and reduces the reliability of the results. In addition, the typical numeric output of a machine learning model is of limited use to security personnel trying to recognize the cyber situation during an actual incident. Therefore, in this paper, we investigate BGP data, which records network routing behavior around the world, and solve the class imbalance problem using SMOTE. Then, assuming a cyber range situation, an autoencoder classifies cyber anomalies and the classified data are visualized. By learning the patterns of normal data, the model classified abnormal data with 92.4% accuracy, and the auxiliary metrics also reached about 90%, ensuring the reliability of the results. In addition, because visualizing the congested cyber space makes the situation easy to recognize, the approach is expected to help defend effectively against cyberattacks.
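  • A minimal sketch of the SMOTE-plus-autoencoder pipeline described above, using imbalanced-learn and Keras; the feature matrix, network sizes, and detection threshold are illustrative assumptions rather than the paper's configuration.

      import numpy as np
      from imblearn.over_sampling import SMOTE
      from tensorflow import keras

      # Placeholder BGP-derived features and labels (0 = normal, 1 = anomalous), illustrative only.
      rng = np.random.default_rng(1)
      X = np.vstack([rng.normal(0.0, 1.0, size=(2000, 16)), rng.normal(3.0, 1.0, size=(60, 16))])
      y = np.array([0] * 2000 + [1] * 60)

      # Rebalance the rare anomaly class with SMOTE, as the paper does for BGP data.
      X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)

      # Autoencoder trained on normal records only; anomalies are flagged by reconstruction error.
      ae = keras.Sequential([
          keras.layers.Input(shape=(16,)),
          keras.layers.Dense(8, activation="relu"),
          keras.layers.Dense(4, activation="relu"),
          keras.layers.Dense(8, activation="relu"),
          keras.layers.Dense(16),
      ])
      ae.compile(optimizer="adam", loss="mse")
      ae.fit(X_bal[y_bal == 0], X_bal[y_bal == 0], epochs=20, batch_size=64, verbose=0)

      errors = np.mean((ae.predict(X, verbose=0) - X) ** 2, axis=1)
      threshold = np.percentile(errors[y == 0], 99)  # illustrative threshold choice
      print("flagged anomalies:", int(np.sum(errors > threshold)))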

The Fault Diagnosis Model of Ship Fuel System Equipment Reflecting Time Dependency in Conv1D Algorithm Based on the Convolution Network (합성곱 네트워크 기반의 Conv1D 알고리즘에서 시간 종속성을 반영한 선박 연료계통 장비의 고장 진단 모델)

  • Kim, Hyung-Jin; Kim, Kwang-Sik; Hwang, Se-Yun; Lee, Jang Hyun
    • Journal of Navigation and Port Research / v.46 no.4 / pp.367-374 / 2022
  • The purpose of this study is to propose a deep learning algorithm for the fault diagnosis of fuel pumps and purifiers in autonomous ships. A deep learning algorithm reflecting the time dependence of the measured signal was configured, and failure patterns were trained using vibration signals measured during normal operation and in failure states of the equipment. Considering the sequential time dependence of deterioration implied in the vibration signal, this study adopts Conv1D with sliding-window computation for fault detection. The time dependence is also reflected by transforming the measured signal from two-dimensional to three-dimensional form. Additionally, the optimal hyper-parameter values of the Conv1D model were determined using grid search. Finally, the results show that the proposed data preprocessing method and the Conv1D model can reflect the sequential dependency between a fault and its effect on the measured signal, and can appropriately perform anomaly and failure detection for the equipment chosen for application.
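  • A minimal sketch of the sliding-window Conv1D idea described above: vibration samples are cut into (window, channel) slices so the network input becomes three-dimensional, and a small Conv1D classifier is trained on them. The data, window size, and architecture are placeholder assumptions; the paper additionally tunes hyper-parameters by grid search.

      import numpy as np
      from tensorflow import keras

      def sliding_windows(signal: np.ndarray, window: int, step: int) -> np.ndarray:
          """Reshape a (time, channels) signal into (windows, window, channels) for Conv1D."""
          idx = range(0, len(signal) - window + 1, step)
          return np.stack([signal[i:i + window] for i in idx])

      # Placeholder two-channel vibration data for normal vs. faulty operation (illustrative only).
      rng = np.random.default_rng(7)
      normal = rng.normal(0, 1, size=(20000, 2))
      faulty = rng.normal(0, 1, size=(20000, 2)) + np.sin(np.arange(20000) / 5.0)[:, None]
      X = np.concatenate([sliding_windows(normal, 256, 128), sliding_windows(faulty, 256, 128)])
      y = np.array([0] * (len(X) // 2) + [1] * (len(X) // 2))

      model = keras.Sequential([
          keras.layers.Input(shape=(256, 2)),
          keras.layers.Conv1D(32, kernel_size=9, activation="relu"),
          keras.layers.MaxPooling1D(4),
          keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
          keras.layers.GlobalAveragePooling1D(),
          keras.layers.Dense(1, activation="sigmoid"),
      ])
      model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
      model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)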

Implementation of Security Information and Event Management for Realtime Anomaly Detection and Visualization (실시간 이상 행위 탐지 및 시각화 작업을 위한 보안 정보 관리 시스템 구현)

  • Kim, Nam Gyun; Park, Sang Seon
    • Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology / v.8 no.5 / pp.303-314 / 2018
  • In the past few years, government agencies and corporations have succumbed to stealthy, tailored cyberattacks designed to exploit vulnerabilities, disrupt operations, and steal valuable information. Security Information and Event Management (SIEM) is a useful tool against such cyberattacks. Commercial SIEM solutions are available, but they are expensive and difficult to use, so we implemented basic SIEM functions as research and development toward future security solutions. We focus on the collection, aggregation, and analysis of real-time logs from hosts. The tool allows parsing and searching of log data for forensics. Beyond log management, it performs intrusion detection, prioritizes security events, and supports alerting to users. We selected the Elastic Stack to process and visualize this security information. The Elastic Stack is a very useful tool for finding information in large data sets, identifying correlations, and creating rich visualizations for monitoring. We also suggest using vulnerability check results in our SIEM. We attacked the host and obtained real-time user activity for monitoring, alerting, and security auditing based on this security information management.
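  • The toy snippet below illustrates the collect-aggregate-alert loop that such a SIEM performs on host logs; the log format, regular expression, and threshold are assumptions for illustration and do not reflect the paper's Elastic Stack configuration.

      import re
      from collections import Counter

      FAILED_LOGIN = re.compile(r"Failed password for (?P<user>\S+) from (?P<ip>\S+)")

      def aggregate_failures(log_lines):
          """Count failed-login events per source IP."""
          counts = Counter()
          for line in log_lines:
              m = FAILED_LOGIN.search(line)
              if m:
                  counts[m.group("ip")] += 1
          return counts

      def alerts(counts, threshold=5):
          """Prioritize sources that exceed the failure threshold."""
          return {ip: n for ip, n in counts.items() if n >= threshold}

      sample = ["Jan 10 10:00:01 host sshd[1]: Failed password for root from 10.0.0.5 port 22 ssh2"] * 6
      print(alerts(aggregate_failures(sample)))  # {'10.0.0.5': 6}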

Sensitivity Experiment of Surface Reflectance to Error-inducing Variables Based on the GEMS Satellite Observations (GEMS 위성관측에 기반한 지면반사도 산출 시에 오차 유발 변수에 대한 민감도 실험)

  • Shin, Hee-Woo; Yoo, Jung-Moon
    • Journal of the Korean earth science society / v.39 no.1 / pp.53-66 / 2018
  • Information on surface reflectance (R_sfc) is important for the heat balance and for environmental/climate monitoring. The sensitivity of R_sfc to error-inducing variables in the Geostationary Environment Monitoring Spectrometer (GEMS) retrieval from geostationary-orbit satellite observations at 300-500 nm was investigated, utilizing polar-orbit satellite data from the MODerate resolution Imaging Spectroradiometer (MODIS) and the Ozone Monitoring Instrument (OMI), together with radiative transfer model (RTM) experiments. The variables considered in this study are cloud, Rayleigh scattering, aerosol, ozone, and surface type. Cloud detection in high-resolution MODIS pixels (1 km × 1 km) was compared with that in GEMS-scale pixels (8 km × 7 km). The GEMS detection was consistent (~79%) with the MODIS result, but the detection probability in partially cloudy (≤40%) GEMS pixels decreased due to other effects (i.e., aerosol and surface type). The Rayleigh-scattering effect in RGB images was noticeable over the ocean, based on the RTM calculation. The reflectance at the top of the atmosphere (R_toa) increased with aerosol amount for R_sfc < 0.2, but decreased for R_sfc ≥ 0.2. The R_sfc errors due to aerosol increased with wavelength in the UV, but were constant or slightly decreased in the visible. Ozone absorption was most sensitive at 328 nm in the UV region (328-354 nm). The R_sfc error was +0.1 for a negative total ozone anomaly (-100 DU) under the condition R_sfc = 0.15. This study is useful for estimating R_sfc uncertainties in the GEMS retrieval.
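  • As a rough illustration of this kind of sensitivity experiment, the sketch below perturbs the atmospheric path reflectance in a simplified Lambertian-surface model, R_toa = R_path + T·R_sfc / (1 − s·R_sfc), and reports the resulting R_sfc retrieval error; the coefficients are toy assumptions, not the paper's RTM settings.

      import numpy as np

      def retrieve_rsfc(r_toa, r_path, transmittance, s_alb):
          """Invert the simplified TOA model r_toa = r_path + T*r_sfc/(1 - s*r_sfc)."""
          d = r_toa - r_path
          return d / (transmittance + s_alb * d)

      # Toy sensitivity experiment: an error in path reflectance (e.g., from aerosol)
      # propagates into the retrieved surface reflectance. All values are placeholders.
      true_rsfc = np.array([0.05, 0.15, 0.30])
      T, s, r_path = 0.8, 0.1, 0.08
      r_toa = r_path + T * true_rsfc / (1 - s * true_rsfc)

      for dpath in (-0.01, 0.0, 0.01):
          retrieved = retrieve_rsfc(r_toa, r_path + dpath, T, s)
          print(f"path error {dpath:+.2f} -> R_sfc error {np.round(retrieved - true_rsfc, 4)}")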

Seismic AVO Analysis, AVO Modeling, AVO Inversion for understanding the gas-hydrate structure (가스 하이드레이트 부존층의 구조파악을 위한 탄성파 AVO 분석 AVO모델링, AVO역산)

  • Kim Gun-Duk; Chung Bu-Heung
    • 한국신재생에너지학회:학술대회논문집 (Proceedings of the Korean Society for New and Renewable Energy Conference) / 2005.06a / pp.643-646 / 2005
  • In gas hydrate exploration using seismic reflection data, detection of the BSR (Bottom Simulating Reflector) on the seismic section is the most important step, because the BSR is interpreted as being formed at the base of the gas hydrate zone. The BSR usually shows several dominant qualitative characteristics on a seismic section: wavelet phase reversal compared with the sea-bottom signal, parallelism with the sea bottom, strong amplitude, a masking phenomenon above the BSR, and cross-cutting of other geological layers. Even though a candidate BSR can be picked on a seismic section with this guidance, that alone is not enough to confirm it as a true BSR. Other methods are available to verify the BSR quantitatively with reliable analysis, such as interval velocity analysis and AVO (Amplitude Variation with Offset) analysis. AVO work can usually be divided into three main parts: AVO analysis, AVO modeling, and AVO inversion. AVO analysis is a unique method for directly detecting free gas zones on a seismic section. It can therefore be a useful method for discriminating a true BSR, which may arise from a Poisson's ratio contrast between a high-velocity layer (partially hydrated sediment) and a low-velocity layer (water-saturated gassy sediment). During AVO interpretation, because the AVO response changes with the water saturation ratio, it can be difficult to distinguish the AVO response of a gas layer from that of a dry layer. In that case, AVO modeling is needed to generate a synthetic seismogram for comparison with the real data; conclusions can then be drawn from the correspondence, or lack of correspondence, between the two seismograms. The AVO inversion process derives a geological model by iterative operations such that the resulting synthetic seismogram matches the real seismogram within some tolerance level. AVO inversion is a topic of current research, and for now there is no general consensus on how the process should be done, or even whether it is valid for standard seismic data. Unfortunately, no well log data have been acquired from the gas hydrate exploration area in Korea. Instead, well log and seismic data acquired from a gas sand area located near the gas hydrate exploration area were used for the AVO analysis. As a result of the AVO modeling, a type III AVO anomaly was confirmed on the gas sand layer. The constants of Castagna's equation for estimating the S-wave velocity were evaluated as A = 0.86190 and B = -3845.14431, with a water saturation ratio of 50%. The Zoeppritz equations were used to calculate the reflection coefficients of the synthetic seismogram, and the dataset provided by Hampson-Russell Co. was used for the AVO inversion process.
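  • For orientation, the sketch below estimates S-wave velocity with a Castagna-type linear relation and computes the angle-dependent reflection coefficient with the two-term Shuey approximation rather than the full Zoeppritz equations used in the paper; the layer properties are placeholders, not the survey's well logs (the abstract's constants A = 0.86190, B = -3845.14431 appear to express the same linear relation with velocities in ft/s).

      import numpy as np

      def castagna_vs(vp_ms, a=0.8621, b=-1172.4):
          """Castagna mudrock-line form Vs = a*Vp + b, velocities in m/s (brine/shale trend)."""
          return a * vp_ms + b

      def shuey_rc(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
          """Two-term Shuey approximation R(theta) ~ R0 + G*sin^2(theta)."""
          theta = np.radians(theta_deg)
          vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
          dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
          r0 = 0.5 * (dvp / vp + drho / rho)
          g = 0.5 * dvp / vp - 2 * (vs / vp) ** 2 * (drho / rho + 2 * dvs / vs)
          return r0 + g * np.sin(theta) ** 2

      # Placeholder shale-over-gas-sand interface (illustrative class III style response).
      vp1, rho1 = 2400.0, 2300.0                 # shale overburden
      vs1 = castagna_vs(vp1)                     # Vs of the brine-saturated overburden
      vp2, vs2, rho2 = 2000.0, 1200.0, 2050.0    # gas sand (Vs set directly)
      angles = np.arange(0, 41, 5)
      print(np.round(shuey_rc(vp1, vs1, rho1, vp2, vs2, rho2, angles), 3))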


An Analysis of Geophysical and Temperature Monitoring Data for Leakage Detection of Earth Dam (흙댐의 누수구역 판별을 위한 물리탐사와 온도 모니터링 자료의 해석)

  • Oh, Seok-Hoon; Suh, Baek-Soo; Kim, Joong-Ryul
    • Journal of the Korean earth science society / v.31 no.6 / pp.563-572 / 2010
  • Multi-channel temperature monitoring and a geophysical electrical survey were performed together on an embankment dam to assess the leakage zone. Temperature variation in space and time within engineering structures (e.g., dams and slopes) can provide basic information for diagnosing safety problems. In general, as structures age, structural deformation (e.g., cracks and defects) can be generated by various factors, and seepage or leakage of water through cracks or defects in old dams directly causes temperature anomalies. This study shows that the position of seepage or leakage in a dam body can be detected by multi-channel temperature monitoring using a thermal line sensor. To that end, various temperature monitoring experiments on a physical leakage model were performed in the laboratory. In a field application at an old earth-fill dam, temperature variations with water depth and inside boreholes located on the downstream slope were measured. Long-term temperature monitoring at the bottom of the downstream slope showed that temperature monitoring can provide comprehensive information about the flow path and the quantity of seepage or leakage in the dam body. Geophysical data from the electrical survey were also used to help interpret the results.
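  • As a toy illustration of how a multi-channel temperature profile can point to a leakage zone, the sketch below flags line-sensor channels that deviate strongly from a robust baseline (median ± k·MAD); the profile, channel count, and threshold are illustrative assumptions, not the dam data.

      import numpy as np

      def leakage_candidates(temps, k=3.0):
          """Flag channels whose temperature deviates from the line's robust baseline."""
          temps = np.asarray(temps, dtype=float)
          med = np.median(temps)
          mad = np.median(np.abs(temps - med)) or 1e-9
          return np.where(np.abs(temps - med) > k * 1.4826 * mad)[0]

      # Illustrative thermal-line profile: channels 18-20 run a few degrees colder,
      # as cool seepage water would make them (placeholder values).
      profile = np.full(40, 14.0) + np.random.default_rng(3).normal(0, 0.1, 40)
      profile[18:21] -= 2.5
      print(leakage_candidates(profile))  # expected to include channels 18, 19, 20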

Unstructured Data Analysis using Equipment Check Ledger: A Case Study in Telecom Domain (장비점검 일지의 비정형 데이터분석을 통한 고장 대응 효율화 사례 연구)

  • Ju, Yeonjin; Kim, Yoosin; Jeong, Seung Ryul
    • Journal of Internet Computing and Services / v.21 no.1 / pp.127-135 / 2020
  • As the importance of using and analyzing big data emerges, there is growing interest in natural language processing techniques for unstructured data such as news articles and comments. In particular, as the collection of big data becomes possible, data mining techniques capable of pre-processing and analyzing such data are emerging. In this case study with a telecom company, we propose a methodology for structuring unstructured data using text mining. The domain is equipment failure, and the data consist of about 2.2 million equipment check ledger records, with roughly 800,000 equipment failure records accumulating in the ledger each year. The equipment check ledger contains both structured and unstructured data. Although structured data can easily be used for analysis, unstructured data is difficult to use immediately; nevertheless, the unstructured portion is highly likely to contain important information that is not captured in the structured fields. Therefore, in this study, we develop a digital transformation method for the unstructured data in the equipment check ledger.
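  • A minimal sketch of one way such free-text ledger entries can be turned into structured features with TF-IDF; the sample entries are English placeholders (the real ledger text is Korean and would first need a morphological analyzer), and the extracted terms are illustrative.

      from sklearn.feature_extraction.text import TfidfVectorizer

      # Toy free-text check-ledger entries (placeholders for the Korean originals).
      entries = [
          "power supply fan noise, replaced fan unit",
          "intermittent link down on port 3, cable reseated",
          "power supply overheating alarm, unit swapped",
          "link flapping after storm, optical module replaced",
      ]
      vec = TfidfVectorizer(ngram_range=(1, 2), min_df=1)
      X = vec.fit_transform(entries)            # (documents x terms) sparse matrix

      # Top-weighted terms of the first entry: candidate structured fields for that record.
      row = X[0].toarray().ravel()
      terms = vec.get_feature_names_out()
      top = sorted(zip(row, terms), reverse=True)[:5]
      print([t for w, t in top if w > 0])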

Clinical Applications of Neuroimaging with Susceptibility Weighted Imaging: Review Article (SWI의 신경영상분야의 임상적 이용)

  • Roh, Keuntak; Kang, Hyunkoo; Kim, Injoong
    • Investigative Magnetic Resonance Imaging / v.18 no.4 / pp.290-302 / 2014
  • Purpose: The susceptibility-weighted magnetic resonance (MR) sequence is a three-dimensional (3D), spoiled gradient-echo pulse sequence that provides high sensitivity for the detection of blood degradation products, calcifications, and iron deposits. This pictorial review illustrates and discusses its main clinical applications. Materials and Methods: SWI is based on high-resolution, 3D, fully velocity-compensated gradient-echo sequences using both magnitude and phase images. To enhance the visibility of venous structures, the magnitude images are multiplied with a phase mask generated from the filtered phase data, and the result is best displayed after post-processing of the 3D dataset with the minimum intensity projection algorithm. A total of 200 patients who underwent MR examinations including SWI on a 3-tesla MR imager were enrolled. Results: SWI is very useful in detecting multiple brain disorders. Among the 200 patients, 80 showed developmental venous anomaly, 22 showed cavernous malformation, 12 showed calcifications in various conditions, 21 showed cerebrovascular accident with susceptibility vessel sign or microbleeds, 52 showed brain tumors, 2 showed diffuse axonal injury, 3 showed arteriovenous malformation, 5 showed dural arteriovenous fistula, 1 showed moyamoya disease, and 2 showed Parkinson's disease. Conclusion: SWI is useful for detecting occult low-flow vascular lesions, calcifications, and microbleeds, and for characterizing diverse brain disorders.
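  • The sketch below mimics the magnitude-phase combination and minimum intensity projection described above, following the common SWI recipe of a negative-phase mask raised to the fourth power; the mask exponent, slab size, and the random test volume are assumptions for illustration, not a clinical reconstruction.

      import numpy as np

      def swi_combine(magnitude, filtered_phase, n=4):
          """Multiply the magnitude by a negative-phase mask (1 for phase >= 0,
          scaled linearly to 0 at -pi) raised to the n-th power."""
          mask = np.where(filtered_phase < 0, (np.pi + filtered_phase) / np.pi, 1.0)
          return magnitude * mask ** n

      def min_intensity_projection(volume, axis=0):
          """Minimum intensity projection over a slab, used to display venous structures."""
          return volume.min(axis=axis)

      # Illustrative 3D volume (slices x rows x cols) with random magnitude and phase in [-pi, pi].
      rng = np.random.default_rng(0)
      mag = rng.uniform(0.5, 1.0, size=(8, 64, 64))
      phase = rng.uniform(-np.pi, np.pi, size=(8, 64, 64))
      mip = min_intensity_projection(swi_combine(mag, phase))  # one 64x64 projection of the slab
      print(mip.shape)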