• Title/Summary/Keyword: preprocessing


Rapid Detection of Radioactive Strontium in Water Samples Using Laser-Induced Breakdown Spectroscopy (LIBS) (Laser-Induced Breakdown Spectroscopy (LIBS)를 이용한 방사성 스트론튬 오염물질에 대한 신속한 모니터링 기술)

  • Park, Jin-young;Kim, Hyun-a;Park, Kihong;Kim, Kyoung-woong
    • Economic and Environmental Geology
    • /
    • v.50 no.5
    • /
    • pp.341-352
    • /
    • 2017
  • Along with Cs-137 (half-life: 30.17 years), Sr-90 (half-life: 28.8 years) is one of the most important radioactive elements for environmental monitoring. A rapid and easy monitoring method for Sr-90 using Laser-Induced Breakdown Spectroscopy (LIBS) has been studied. Strontium is a bivalent alkaline earth metal like calcium, with a similar electron configuration and ionic size. Because of these similar chemical properties, it can easily enter the human body through the food chain via water, soil, and crops when released into the environment. Once taken into the body, it is deposited in bone and remains toxic for a long time (biological half-life: about 50 years). It is also strongly reducing and undergoes reactions that make wet chemical analysis difficult. In particular, radioactive strontium must be monitored at nuclear power plants, but its analysis is costly and of limited accuracy because of complicated procedures, expensive equipment, and a pretreatment process that consumes large amounts of chemicals. We therefore introduce a LIBS analysis method that identifies the elements in a sample from their characteristic emission spectra by generating a plasma on the sample with pulsed laser energy; the analysis takes only a few seconds and requires no preprocessing. A variety of sample plates were developed to improve analytical sensitivity by optimizing the laser, wavelength, and time resolution. The method can be effectively applied to real-time monitoring of radioactive wastewater discharged from nuclear power plants and, furthermore, can serve as an emergency monitoring tool for possible future accidents at nuclear power plants.
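A minimal sketch of the intensity-based quantification a LIBS workflow like this relies on: pick an Sr emission line, subtract the local continuum, and fit a calibration curve against known concentrations. The line position, window widths, and the linear calibration below are illustrative assumptions, not the paper's actual plate design or optimization.

```python
import numpy as np

SR_LINE_NM = 407.8  # approximate Sr II emission line, used here for illustration only

def line_intensity(wavelength, spectrum, center=SR_LINE_NM, window=0.3):
    """Background-subtracted peak intensity around the target emission line."""
    in_win = (wavelength > center - window) & (wavelength < center + window)
    nearby = (np.abs(wavelength - center) < 5 * window) & ~in_win
    background = np.median(spectrum[nearby])          # local continuum estimate
    return spectrum[in_win].max() - background

def calibration_curve(spectra, wavelength, concentrations):
    """Fit peak intensity vs. known Sr concentration to a straight line."""
    intensities = [line_intensity(wavelength, s) for s in spectra]
    slope, intercept = np.polyfit(concentrations, intensities, 1)
    return slope, intercept

def predict_concentration(spectrum, wavelength, slope, intercept):
    """Invert the calibration curve for an unknown sample."""
    return (line_intensity(wavelength, spectrum) - intercept) / slope
```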

Robust Eye Localization using Multi-Scale Gabor Feature Vectors (다중 해상도 가버 특징 벡터를 이용한 강인한 눈 검출)

  • Kim, Sang-Hoon;Jung, Sou-Hwan;Cho, Seong-Won;Chung, Sun-Tae
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.45 no.1
    • /
    • pp.25-36
    • /
    • 2008
  • Eye localization means locating the centers of the pupils and is necessary for face recognition and related applications. Most eye localization methods reported so far still need improvement in both robustness and precision for successful application. In this paper, we propose a robust eye localization method using multi-scale Gabor feature vectors without a large computational burden. Eye localization using Gabor feature vectors is already employed in methods such as EBGM, but the approach used in EBGM is known not to be robust with respect to initial values, illumination, and pose, and may need an extensive search range to achieve the required performance, which can cause a large computational burden. The proposed method takes a multi-scale approach. It first localizes the eyes in a low-resolution face image by using the Gabor jet similarity between the Gabor feature vector at estimated initial eye coordinates and the Gabor feature vectors in the eye model of the corresponding scale. It then localizes the eyes in the next-scale face image in the same way, but with initial eye points estimated from the eye coordinates found at the lower resolution. After repeating this process recursively, the proposed method finally localizes the eyes in the original-resolution face image. The method also applies an effective illumination normalization in the preprocessing stage of the multi-scale approach, which makes it more robust to illumination and enhances the eye detection success rate. Experimental results verify that the proposed eye localization method improves the precision rate without a large computational overhead compared to previously reported methods and is robust to variations in pose and illumination.
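To make the Gabor-jet idea concrete, the sketch below computes a jet of Gabor filter magnitudes at a point, compares it to a model jet by normalized correlation, and refines an initial eye estimate by a small local search; this is the step the multi-scale scheme repeats from coarse to fine resolution. The filter-bank parameters, kernel size, and search radius are illustrative assumptions, and the eye-model jets are taken as given.

```python
import cv2
import numpy as np

def gabor_jet(gray, x, y, n_thetas=8, lambdas=(4, 8, 16)):
    """Magnitudes of a bank of Gabor filter responses sampled at pixel (x, y)."""
    gray = gray.astype(np.float32)
    jet = []
    for lam in lambdas:
        for k in range(n_thetas):
            theta = np.pi * k / n_thetas
            kern = cv2.getGaborKernel((21, 21), sigma=0.5 * lam, theta=theta,
                                      lambd=lam, gamma=0.5, psi=0)
            resp = cv2.filter2D(gray, cv2.CV_32F, kern)  # full response for clarity, not speed
            jet.append(abs(resp[y, x]))
    return np.array(jet)

def jet_similarity(a, b):
    """Normalized correlation between two Gabor jets."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def refine_eye(gray, init_xy, model_jet, radius=3):
    """Search a small neighborhood around the initial guess for the best jet match."""
    h, w = gray.shape
    (x0, y0), best_xy, best_sim = init_xy, init_xy, -1.0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            x, y = min(max(x0 + dx, 0), w - 1), min(max(y0 + dy, 0), h - 1)
            sim = jet_similarity(gabor_jet(gray, x, y), model_jet)
            if sim > best_sim:
                best_xy, best_sim = (x, y), sim
    return best_xy, best_sim
```

In the multi-scale scheme described above, refine_eye would first run on a downsampled face image, and its result, scaled up, would seed the search at the next finer resolution.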

Enhancement of Inter-Image Statistical Correlation for Accurate Multi-Sensor Image Registration (정밀한 다중센서 영상정합을 위한 통계적 상관성의 증대기법)

  • Kim, Kyoung-Soo;Lee, Jin-Hak;Ra, Jong-Beom
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.42 no.4 s.304
    • /
    • pp.1-12
    • /
    • 2005
  • Image registration is a process to establish the spatial correspondence between images of the same scene that are acquired at different viewpoints, at different times, or by different sensors. This paper presents a new algorithm for robust registration of images acquired by multiple sensors with different modalities, here EO (electro-optic) and IR (infrared). Two approaches, feature-based and intensity-based, are generally possible for image registration. In the former, selection of accurate common features is crucial for high performance, but features in the EO image are often not the same as those in the IR image; hence this approach is inadequate for registering EO/IR images. In the latter, normalized mutual information (NMI) has been widely used as a similarity measure due to its high accuracy and robustness, and NMI-based registration methods assume that the statistical correlation between two images is global. Unfortunately, since EO and IR images often do not satisfy this assumption, registration accuracy is not high enough for some applications. In this paper, we propose a two-stage NMI-based registration method based on an analysis of the statistical correlation between EO/IR images. In the first stage, for robust registration, we propose two preprocessing schemes: extraction of statistically correlated regions (ESCR) and enhancement of statistical correlation by filtering (ESCF). For each image, ESCR automatically extracts the regions that are highly correlated with the corresponding regions in the other image, and ESCF adaptively filters each image to enhance the statistical correlation between them. In the second stage, the two output images are registered using an NMI-based algorithm. The proposed method provides promising results for various EO/IR sensor image pairs in terms of accuracy, robustness, and speed.
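The similarity measure at the core of this method, normalized mutual information, is easy to sketch from a joint intensity histogram; the exhaustive translation search below only illustrates how NMI drives registration and does not reproduce the paper's ESCR/ESCF preprocessing or its full transformation model.

```python
import numpy as np

def normalized_mutual_information(img_a, img_b, bins=64):
    """NMI = (H(A) + H(B)) / H(A, B), estimated from a joint intensity histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    return (entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0))) / entropy(pxy.ravel())

def register_translation(ref, mov, search=10):
    """Brute-force integer-shift search that maximizes NMI between two images."""
    crop = (slice(search, -search), slice(search, -search))  # ignore wrap-around borders
    best, best_nmi = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(mov, dy, axis=0), dx, axis=1)
            nmi = normalized_mutual_information(ref[crop], shifted[crop])
            if nmi > best_nmi:
                best, best_nmi = (dy, dx), nmi
    return best, best_nmi
```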

A Numerical Study for Effective Operation of MSW Incinerator for Waste of High Heating Value by the Addition of Moisture Air (함습공기를 이용한 고발열량 도시폐기물 소각로의 효율적 운전을 위한 수치 해석적 연구)

  • Shin, Mi-Soo;Shin, Na-Ra;Jang, Dong-Soon
    • Journal of Korean Society of Environmental Engineers
    • /
    • v.35 no.2
    • /
    • pp.115-123
    • /
    • 2013
  • The stoker-type incinerator is one of the most popular types used for municipal solid waste (MSW) incineration because, in general, it is well suited to large capacities and requires no preprocessing facility. Nowadays, however, since the combustible portion of incoming MSW has increased while its moisture content has decreased due to the prohibition of directly landfilling food waste, the heating value of the waste has risen remarkably compared with the early stage of incinerator installation. Consequently, the increased heating value causes a number of serious operational problems, such as a reduction of the waste throughput imposed by the boiler heat capacity, together with significant NO generation in the high-temperature environment. Therefore, in this study, a series of numerical simulations was performed with the waste feed rate and the moisture fraction of the air stream as parameters in order to find optimal operating conditions that resolve the problems associated with the high heating value of the waste. Specifically, a detailed turbulent reacting flow field calculation with an NO model was made for a full-scale incinerator in D city. To this end, injection of moisturized air as the oxidizer was examined by adding 10% and 20% moisture. The calculation results showed that the maximum flame temperature is consistently reduced due to the combined effects of the increased specific heat of the combustion air and the heat of vaporization absorbed by the added water. As a consequence, the NOx concentration generated was substantially reduced. Furthermore, for the 20% moisture case, the afterburner region falls within the temperature range appropriate for SNCR operation. This suggests that the SNCR facility, which is currently out of service because of the increased heating value of the MSW, could be returned to operation.
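The qualitative effect the simulations exploit, a lower flame temperature when the oxidizer carries more moisture, can be illustrated with a one-line energy balance. All property values below are rough assumed constants for demonstration and are unrelated to the paper's CFD model or the D-city incinerator geometry.

```python
# Rough energy balance: added water raises the mixture heat capacity and, if injected
# as liquid, absorbs latent heat, so the same heat release yields a lower temperature.
CP_AIR = 1.1e3    # J/(kg K), assumed mean specific heat of air over the temperature range
CP_H2O = 2.1e3    # J/(kg K), assumed mean specific heat of steam
H_VAP = 2.26e6    # J/kg, latent heat of vaporization of water
T_IN = 300.0      # K, assumed inlet temperature

def product_temperature(q_release, m_air, moisture_frac, water_is_liquid=True):
    """Approximate product-gas temperature for heat release q_release [W] and air flow m_air [kg/s]."""
    m_water = m_air * moisture_frac
    q_eff = q_release - (m_water * H_VAP if water_is_liquid else 0.0)
    return T_IN + q_eff / (m_air * CP_AIR + m_water * CP_H2O)

if __name__ == "__main__":
    for frac in (0.0, 0.1, 0.2):   # dry air, 10% and 20% added moisture, as in the study
        print(f"moisture {frac:.0%}: {product_temperature(5.0e7, 20.0, frac):.0f} K")
```

The trend, not the absolute numbers, is the point: each increment of moisture lowers the peak temperature, which is what suppresses thermal NOx and moves the afterburner region toward the SNCR temperature window.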

Oceanic Application of Satellite Synthetic Aperture Radar - Focused on Sea Surface Wind Retrieval - (인공위성 합성개구레이더 영상 자료의 해양 활용 - 해상풍 산출을 중심으로 -)

  • Jang, Jae-Cheol;Park, Kyung-Ae
    • Journal of the Korean earth science society
    • /
    • v.40 no.5
    • /
    • pp.447-463
    • /
    • 2019
  • Sea surface wind is a fundamental element for understanding oceanic phenomena and for analyzing changes in the Earth's environment caused by global warming. Global research institutes have developed and operated scatterometers to observe sea surface wind accurately and continuously, with an accuracy of approximately ±20° for wind direction and ±2 m s⁻¹ for wind speed. Given that the spatial resolution of a scatterometer is 12.5-25.0 km, the applicability of the data to coastal areas is limited by the complicated coastlines and many islands around the Korean Peninsula. In contrast, Synthetic Aperture Radar (SAR), a microwave sensor, is an all-weather instrument that enables retrieval of sea surface wind at high resolution (<1 km) and compensates for the coarse resolution of the scatterometer. In this study, we investigated the Geophysical Model Functions (GMF), the algorithms for retrieving sea surface wind speed from SAR data for each band (C-, L-, or X-band radar). We reviewed the simulation of backscattering coefficients as a function of relative wind direction, incidence angle, and wind speed by applying the LMOD, CMOD, and XMOD model functions, and analyzed the characteristics of each GMF. We also reviewed previous studies on the validation of wind speeds retrieved from SAR data using these GMFs. The accuracy of sea surface wind from SAR data varied with observation mode, GMF type, reference data for validation, preprocessing method, and the method used to calculate the relative wind direction. This study is expected to help potential users of SAR imagery retrieve wind speeds from SAR data in the coastal region around the Korean Peninsula.
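The common structure of these GMFs can be sketched as a forward model plus a brute-force inversion over candidate wind speeds. The coefficients below are placeholders, not the published CMOD/LMOD/XMOD coefficients; only the cos(phi)/cos(2*phi) harmonic form and the inversion loop reflect how such retrievals are typically organized.

```python
import numpy as np

def gmf_sigma0(wind_speed, rel_wind_dir_deg, incidence_deg):
    """Placeholder GMF: sigma0 = B0(U, theta) * (1 + B1*cos(phi) + B2*cos(2*phi)).
    B0, B1, B2 here are invented for illustration, not real model coefficients."""
    phi = np.deg2rad(rel_wind_dir_deg)
    b0 = 0.01 * wind_speed ** 1.5 * np.cos(np.deg2rad(incidence_deg)) ** 2
    b1, b2 = 0.1, 0.2
    return b0 * (1.0 + b1 * np.cos(phi) + b2 * np.cos(2.0 * phi))

def retrieve_wind_speed(sigma0_obs, rel_wind_dir_deg, incidence_deg,
                        speeds=np.arange(0.5, 35.0, 0.1)):
    """Invert the GMF by scanning candidate wind speeds for the closest simulated sigma0."""
    simulated = gmf_sigma0(speeds, rel_wind_dir_deg, incidence_deg)
    return float(speeds[np.argmin(np.abs(simulated - sigma0_obs))])
```

In practice the relative wind direction comes from external data or image features, which is one of the accuracy factors the abstract lists.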

Analysis of Research Trends of 'Word of Mouth (WoM)' through Main Path and Word Co-occurrence Network (주경로 분석과 연관어 네트워크 분석을 통한 '구전(WoM)' 관련 연구동향 분석)

  • Shin, Hyunbo;Kim, Hea-Jin
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.3
    • /
    • pp.179-200
    • /
    • 2019
  • Word-of-mouth (WoM) refers to consumer activities that share information about consumption. WoM activities have long been recognized as important in corporate marketing processes and have received much attention, especially in the marketing field. Recently, with the development of the Internet, the ways in which people exchange information in online news and online communities have expanded, and WoM has diversified into word of mouth, scores, ratings, and likes. Social media gives online users easy access to information, and online WoM is considered a key source of information. Although various studies on WoM have addressed this phenomenon, there is no meta-analysis that comprehensively analyzes them. This study proposed a method to extract major research works by applying text mining techniques and to grasp their main issues in order to find the trend of WoM research using scholarly big data. To this end, a total of 4,389 documents were collected with the keyword 'Word-of-mouth' for 1941 to 2018 from Scopus (www.scopus.com), a citation database, and the data were refined through preprocessing such as English morphological analysis, stopword removal, and noun extraction. To carry out this study, we adopted main path analysis (MPA) and word co-occurrence network analysis. MPA detects key research works, is used to track the development trajectory of an academic field, and presents the research trend from a macro perspective. For this, we constructed a citation network based on the collected data, in which a node is a document and a link is a citation relation. We then detected the key-route main path by applying SPC (Search Path Count) weights. As a result, a main path composed of 30 documents was extracted from the citation network. The main path confirmed how the academic area developed over time, reflecting industrial changes across various industry groups. The results of MPA revealed that WoM research can be divided into five periods: (1) establishment of the aspects and critical elements of WoM, (2) relationship analysis between WoM variables, (3) beginning of research on online WoM, (4) relationship analysis between WoM and purchase, and (5) broadening of topics. Changes within the industry, such as the growth of the online environment and social media, were reflected in the results. Very recent studies showed that the topics and approaches related to WoM are diversifying in response to circumstantial changes. Nevertheless, even though WoM is used in diverse fields, the main stream of WoM research from start to end has been related to marketing and to identifying the influential factors that proliferate WoM. By applying word co-occurrence network analysis, the research trend is presented from a microscopic point of view. A word co-occurrence network was constructed to analyze the relationships between keywords, and social network analysis (SNA) was utilized. We divided the data into three periods to investigate the periodic changes and trends in the discussion of WoM. SNA showed that Period 1 (1941-2008) consisted of clusters regarding relationships, sources, and consumers; Period 2 (2009-2013) contained clusters of satisfaction, community, social networks, reviews, and the Internet; and Period 3 (2014-2018) involved clusters of satisfaction, medium, reviews, and interviews. The periodic changes of the clusters showed a transition from offline to online WoM. The media of WoM have become an important factor in spreading the word. This study conducted a quantitative meta-analysis of WoM based on scholarly big data. Its main contribution is that it provides a micro perspective on the research trend of WoM as well as a macro perspective. A limitation is that the citation network constructed for MPA is based only on the direct citation relations of the collected documents.
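The SPC weighting used for the main path analysis is straightforward to compute on a citation DAG; the sketch below (using networkx) counts, for every citation link, the number of source-to-sink paths passing through it, and then follows the heaviest links as a simplified local main path. The paper uses the key-route variant, which this greedy traversal only approximates.

```python
import networkx as nx

def spc_weights(g: nx.DiGraph):
    """Search Path Count: for edge (u, v), the number of source-to-sink paths through it."""
    order = list(nx.topological_sort(g))
    from_source = {n: 1 if g.in_degree(n) == 0 else 0 for n in g}
    for n in order:                                 # forward pass: paths reaching each node
        for _, v in g.out_edges(n):
            from_source[v] += from_source[n]
    to_sink = {n: 1 if g.out_degree(n) == 0 else 0 for n in g}
    for n in reversed(order):                       # backward pass: paths leaving each node
        for u, _ in g.in_edges(n):
            to_sink[u] += to_sink[n]
    return {(u, v): from_source[u] * to_sink[v] for u, v in g.edges}

def greedy_main_path(g: nx.DiGraph):
    """Simplified main path: start at the tail of the heaviest edge, follow max-SPC links."""
    spc = spc_weights(g)
    node = max(spc, key=spc.get)[0]
    path = [node]
    while g.out_degree(node) > 0:
        node = max(g.successors(node), key=lambda v: spc[(node, v)])
        path.append(node)
    return path
```

Edges are assumed to point in the direction of knowledge flow; the SPC computation is just two linear passes over the citation links, so the costly part of such a study is the data collection and cleaning, not the path extraction.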

A Study on Training Dataset Configuration for Deep Learning Based Image Matching of Multi-sensor VHR Satellite Images (다중센서 고해상도 위성영상의 딥러닝 기반 영상매칭을 위한 학습자료 구성에 관한 연구)

  • Kang, Wonbin;Jung, Minyoung;Kim, Yongil
    • Korean Journal of Remote Sensing
    • /
    • v.38 no.6_1
    • /
    • pp.1505-1514
    • /
    • 2022
  • Image matching is a crucial preprocessing step for effective utilization of multi-temporal and multi-sensor very high resolution (VHR) satellite images. Deep learning (DL), which is attracting widespread interest, has proven to be an efficient approach to measuring the similarity between image pairs quickly and accurately by extracting complex and detailed features from satellite images. However, image matching of VHR satellite images remains challenging because DL results depend on the quantity and quality of the training dataset, and creating a training dataset from VHR satellite images is difficult. Therefore, this study examines the feasibility of a DL-based method for matching pair extraction, which is the most time-consuming process during image registration. This paper also analyzes the factors that affect accuracy depending on how the training dataset is configured when it is built, with possible bias, from an existing multi-sensor VHR image database. For this purpose, the training dataset was composed of correct and incorrect matching pairs by assigning true and false labels to image pairs extracted with a grid-based Scale Invariant Feature Transform (SIFT) algorithm from a total of 12 multi-temporal and multi-sensor VHR images. The Siamese convolutional neural network (SCNN) proposed for matching pair extraction is trained on the constructed dataset and measures similarity by passing the two images in parallel through two identical convolutional neural network branches. The results confirm that data acquired from a VHR satellite image database can be used as a DL training dataset and indicate the potential to improve the efficiency of the matching process through appropriate configuration of multi-sensor images. Given its stable performance, DL-based image matching using multi-sensor VHR satellite images is expected to replace existing manual feature extraction methods and to develop further into an integrated DL-based image registration framework.
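A minimal PyTorch sketch of the weight-sharing idea: two identical branches embed the two patches, and a small head turns the embedding difference into a match probability trained with true/false labels, as in the constructed dataset. The layer sizes and patch size are illustrative assumptions, not the paper's exact SCNN.

```python
import torch
import torch.nn as nn

class SiameseCNN(nn.Module):
    """Two weight-sharing CNN branches; the head scores whether a patch pair matches."""
    def __init__(self):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(64 * 4 * 4, 128),
        )
        self.head = nn.Sequential(nn.Linear(128, 1), nn.Sigmoid())

    def forward(self, patch_a, patch_b):
        fa, fb = self.branch(patch_a), self.branch(patch_b)   # same weights for both inputs
        return self.head(torch.abs(fa - fb))                  # 1 = correct, 0 = incorrect pair

# One training step on dummy patch pairs with binary match labels.
model = SiameseCNN()
loss_fn, optimizer = nn.BCELoss(), torch.optim.Adam(model.parameters(), lr=1e-3)
a, b = torch.randn(8, 1, 64, 64), torch.randn(8, 1, 64, 64)
labels = torch.randint(0, 2, (8, 1)).float()
loss = loss_fn(model(a, b), labels)
loss.backward()
optimizer.step()
```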

Landslide Susceptibility Mapping Using Deep Neural Network and Convolutional Neural Network (Deep Neural Network와 Convolutional Neural Network 모델을 이용한 산사태 취약성 매핑)

  • Gong, Sung-Hyun;Baek, Won-Kyung;Jung, Hyung-Sup
    • Korean Journal of Remote Sensing
    • /
    • v.38 no.6_2
    • /
    • pp.1723-1735
    • /
    • 2022
  • Landslides are one of the most prevalent natural disasters, threatening both people and property, and they can cause damage at the national level, so effective prediction and prevention are essential. Research on producing landslide susceptibility maps with high accuracy is steadily being conducted, and various models have been applied to landslide susceptibility analysis. Pixel-based machine learning models such as frequency ratio models, logistic regression models, ensemble models, and artificial neural networks have mainly been applied. Recent studies have shown that the kernel-based convolutional neural network (CNN) technique is effective and that the spatial characteristics of the input data have a significant effect on the accuracy of landslide susceptibility mapping. For this reason, the purpose of this study is to analyze landslide susceptibility using a pixel-based deep neural network model and a patch-based convolutional neural network model. The study area was set in Gangwon-do, including Inje, Gangneung, and Pyeongchang, where landslides have occurred frequently and caused damage. The landslide-related factors used were slope, curvature, stream power index (SPI), topographic wetness index (TWI), topographic position index (TPI), timber diameter, timber age, lithology, land use, soil depth, soil parent material, lineament density, fault density, normalized difference vegetation index (NDVI), and normalized difference water index (NDWI). These factors were built into a spatial database through data preprocessing, and landslide susceptibility maps were predicted using deep neural network (DNN) and CNN models. The models and susceptibility maps were verified using average precision (AP) and root mean square error (RMSE); as a result, the patch-based CNN model showed 3.4% better performance than the pixel-based DNN model. The results of this study can be used to predict landslides and are expected to serve as a scientific basis for establishing land use and landslide management policies.
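The pixel-based versus patch-based contrast can be made concrete with two tiny PyTorch models: one maps a vector of factor values for a single pixel to a susceptibility score, the other sees a window of neighboring pixels so spatial context enters the prediction. Layer widths and the patch size are illustrative assumptions; the abstract does not specify the architectures.

```python
import torch.nn as nn

N_FACTORS = 15   # slope, curvature, SPI, TWI, TPI, ..., NDVI, NDWI (as listed above)

# Pixel-based DNN: one factor vector per pixel, no spatial context.
pixel_dnn = nn.Sequential(
    nn.Linear(N_FACTORS, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1), nn.Sigmoid(),          # susceptibility score in [0, 1]
)

# Patch-based CNN: each sample is a (N_FACTORS, patch, patch) window around the target
# pixel, so the convolution layers can exploit the spatial characteristics of the factors.
patch_cnn = nn.Sequential(
    nn.Conv2d(N_FACTORS, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 1), nn.Sigmoid(),
)
```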

Understanding Public Opinion by Analyzing Twitter Posts Related to Real Estate Policy (부동산 정책 관련 트위터 게시물 분석을 통한 대중 여론 이해)

  • Kim, Kyuli;Oh, Chanhee;Zhu, Yongjun
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.56 no.3
    • /
    • pp.47-72
    • /
    • 2022
  • This study aims to understand the trends of subjects related to real estate policies and the public's emotional opinions on those policies. Two keywords related to real estate policies, "real estate policy" and "real estate measure", were used to collect tweets created from February 25, 2008 to August 31, 2021. A total of 91,740 tweets were collected; after preprocessing and categorizing the data into supply, real estate tax, interest rate, and population variance, sentiment analysis and dynamic topic modeling were applied to the final 18,925 tweets. The keywords of each category are as follows: supply (rental housing, greenbelt, newlyweds, homeless, supply, reconstruction, sale), real estate tax (comprehensive real estate tax, acquisition tax, holding tax, multiple homeowners, speculation), interest rate (interest rate), and population variance (Sejong, new city). The sentiment analysis showed that a person posted on average one or two positive tweets, whereas for negative and neutral tweets a person posted two or three. In addition, we found that some people hold positive as well as negative and neutral opinions toward real estate policies. The dynamic topic modeling analysis identified negative reactions to real estate speculation and unearned income as major negative topics, while expectations of increased housing supply and benefits for homeless people purchasing houses were identified as positive topics. Unlike previous studies, which focused on changes in and evaluations of specific real estate policies, this study has academic significance in that it collected posts from Twitter, one of the social media platforms, applied sentiment analysis and dynamic topic modeling, and identified potential topics and trends in real estate policy over time. The results can help create new policies that take public opinion on real estate policies into consideration.
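One possible toolchain for the categorization and dynamic topic modeling steps is sketched below with gensim; the category keyword lists follow the English translations given in the abstract (a real implementation would match the Korean terms), and the tokenization, number of topics, and time slicing are assumptions.

```python
from gensim import corpora
from gensim.models import LdaSeqModel

# Keyword-based categorization mirroring the four categories in the abstract.
CATEGORIES = {
    "supply": ["rental housing", "greenbelt", "newlyweds", "homeless",
               "supply", "reconstruction", "sale"],
    "real estate tax": ["comprehensive real estate tax", "acquisition tax",
                        "holding tax", "multiple homeowners", "speculation"],
    "interest rate": ["interest rate"],
    "population variance": ["Sejong", "new city"],
}

def categorize(text):
    """Return every category whose keyword list is matched by the tweet text."""
    return [cat for cat, kws in CATEGORIES.items() if any(k in text for k in kws)]

def dynamic_topics(tokenized_docs, docs_per_period, num_topics=5):
    """Dynamic topic model over chronologically ordered, already-tokenized tweets.
    docs_per_period lists how many documents fall into each time slice."""
    dictionary = corpora.Dictionary(tokenized_docs)
    corpus = [dictionary.doc2bow(doc) for doc in tokenized_docs]
    return LdaSeqModel(corpus=corpus, id2word=dictionary,
                       time_slice=docs_per_period, num_topics=num_topics)
```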

Preliminary Inspection Prediction Model to select the on-Site Inspected Foreign Food Facility using Multiple Correspondence Analysis (차원축소를 활용한 해외제조업체 대상 사전점검 예측 모형에 관한 연구)

  • Hae Jin Park;Jae Suk Choi;Sang Goo Cho
    • Journal of Intelligence and Information Systems
    • /
    • v.29 no.1
    • /
    • pp.121-142
    • /
    • 2023
  • As the number and weight of imported food items steadily increase, safety management of imported food to prevent food safety accidents is becoming more important. The Ministry of Food and Drug Safety conducts on-site inspections of foreign food facilities before customs clearance as well as import inspections at the customs clearance stage. However, a data-based safety management plan for imported food is needed because of limited time, cost, and resources. In this study, we tried to increase the efficiency of on-site inspections by building a machine learning prediction model that pre-selects the facilities expected to fail before the on-site inspection. Basic information on 303,272 foreign food facilities and processing businesses from the Integrated Food Safety Information Network and 1,689 cases of on-site inspection data collected from 2019 to April 2022 were used. After preprocessing the foreign food facility data, only the records subject to on-site inspection were extracted using the foreign food facility_code, yielding a dataset of 1,689 records and 103 variables. Among the 103 variables, those with a Theil-U index of '0' were removed, and after dimensionality reduction by Multiple Correspondence Analysis, 49 characteristic variables were finally derived. We built eight different models, performed hyperparameter tuning through 5-fold cross-validation, and evaluated the performance of the generated models. Since the purpose of selecting facilities for on-site inspection is to maximize recall, the probability of judging nonconforming facilities as nonconforming, the Random Forest model, which achieved the highest Recall_macro, AUROC, Average PR, F1-score, and Balanced Accuracy among the various machine learning algorithms, was selected as the best model. Finally, we applied Kernel SHAP (SHapley Additive exPlanations) to present the reasons individual facilities were selected as nonconforming, and we discuss the applicability of the model to the on-site inspection facility selection system. Based on these results, establishing an imported food management system using a data-based scientific risk management model is expected to contribute to the efficient operation of limited resources such as manpower and budget.
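The modeling pipeline the abstract describes, recall-oriented tuning with 5-fold cross-validation followed by SHAP explanations, might look roughly like the scikit-learn sketch below. The parameter grid, background sample size, and class weighting are illustrative assumptions; only the scoring metric, fold count, Random Forest choice, and Kernel SHAP come from the abstract.

```python
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

def select_inspection_targets(X_train, y_train, X_candidates):
    """Fit a recall-tuned Random Forest, then explain why each candidate facility is flagged."""
    grid = GridSearchCV(
        RandomForestClassifier(class_weight="balanced", random_state=0),
        param_grid={"n_estimators": [200, 500], "max_depth": [None, 10, 20]},
        scoring="recall_macro",   # maximize the chance of catching nonconforming facilities
        cv=5,
    )
    grid.fit(X_train, y_train)
    model = grid.best_estimator_

    # Kernel SHAP explanation per facility; a small background sample keeps it tractable.
    explainer = shap.KernelExplainer(model.predict_proba, X_train[:100])
    shap_values = explainer.shap_values(X_candidates)
    return model.predict_proba(X_candidates)[:, 1], shap_values
```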