• Title/Summary/Keyword: Automatic model selection


Analysis of the Optimal Window Size of Hampel Filter for Calibration of Real-time Water Level in Agricultural Reservoirs (농업용저수지의 실시간 수위 보정을 위한 Hampel Filter의 최적 Window Size 분석)

  • Joo, Dong-Hyuk;Na, Ra;Kim, Ha-Young;Choi, Gyu-Hoon;Kwon, Jae-Hwan;Yoo, Seung-Hwan
    • Journal of The Korean Society of Agricultural Engineers / v.64 no.3 / pp.9-24 / 2022
  • Currently, a vast amount of hydrologic data is accumulated in real time through automatic water level gauges in agricultural reservoirs; at the same time, false and missing data points are also increasing. The applicability and reliability of quality control of hydrological data must be secured for efficient agricultural water management through calculation of water supply and disaster management. Considering the irregularities in hydrological data caused by irrigation water usage and rainfall patterns, the Korea Rural Community Corporation currently applies the Hampel filter as its water level data quality management method. This method uses window size as a key parameter: if the window size is large, the data may be distorted, and if it is small, many outliers are not removed, which reduces the reliability of the corrected data. Thus, the optimal window size must be selected for each individual reservoir. To ensure reliability, we compared and analyzed the RMSE (Root Mean Square Error) and NSE (Nash-Sutcliffe model efficiency coefficient) of the corrected data against the daily water level from the RIMS (Rural Infrastructure Management System) data and against the automatic outlier detection standards used by the Ministry of Environment. To select the optimal window size, we used the classification performance indices of the error matrix and the rainfall data of the irrigation period, which showed an optimal value of 3 h. This efficient automatic reservoir calibration technique can reduce the manpower and time required for manual calibration and is expected to improve the reliability of water level data and the value of water resources.
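As a minimal sketch of the filtering step the abstract describes (not the authors' exact implementation), a Hampel filter replaces a point with its window median whenever the point deviates from that median by more than a multiple of the scaled MAD; `window_size` here is the half-width of the window, and all names and parameter defaults are illustrative:

```python
import numpy as np

def hampel_filter(series, window_size=3, n_sigmas=3.0):
    """Replace outliers with the rolling median (illustrative sketch)."""
    x = np.asarray(series, dtype=float)
    cleaned = x.copy()
    k = 1.4826  # scale factor relating MAD to std for Gaussian data
    n = len(x)
    for i in range(n):
        lo, hi = max(0, i - window_size), min(n, i + window_size + 1)
        window = x[lo:hi]
        med = np.median(window)
        mad = k * np.median(np.abs(window - med))
        # Flag the point as an outlier if it sits too far from the median.
        if mad > 0 and abs(x[i] - med) > n_sigmas * mad:
            cleaned[i] = med
    return cleaned
```

Choosing `window_size` reproduces the trade-off described above: a large window risks smoothing away genuine rainfall-driven rises, while a small one leaves outliers in place.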

3D Object Recognition for Localization of Outdoor Robotic Vehicles (실외 주행 로봇의 위치 추정을 위한 3 차원 물체 인식)

  • Baek, Seung-Min;Kim, Jae-Woong;Lee, Jang-Won;Zhaojin, Lu;Lee, Suk-Han
    • Proceedings of the Korean HCI Society Conference / 2008.02a / pp.200-204 / 2008
  • In this paper, to solve the localization problem for outdoor navigation of robotic vehicles, a particle filter based 3D object recognition framework that can estimate the pose of a building or its entrance is presented. A particle filter framework of multiple-evidence fusion and model matching over a sequence of images is presented for robust recognition and pose estimation of 3D objects. The proposed approach features 1) the automatic selection and collection of an optimal set of evidence, 2) the derivation of multiple interpretations, as particles representing possible object poses in 3D space, and the assignment of their probabilities based on matching the object model with the evidence, and 3) the particle filtering of interpretations over time with the additional evidence obtained from the image sequence. The proposed approach has been validated by stereo-camera based experiments on 3D object recognition and pose estimation, where a combination of photometric and geometric features is used as evidence.
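The particle-filter loop of weighting hypotheses by evidence and resampling can be sketched for a single scalar pose (e.g. a heading angle); this toy version assumes a simple Gaussian evidence likelihood and is not the paper's multi-evidence fusion model:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_pose(observations, n_particles=500, obs_noise=0.2):
    """Toy particle filter: estimate a scalar pose (e.g. heading)
    from noisy observations over an image sequence."""
    # Each particle is one interpretation: a candidate pose hypothesis.
    particles = rng.uniform(-np.pi, np.pi, n_particles)
    weights = np.full(n_particles, 1.0 / n_particles)
    for z in observations:
        # Weight each hypothesis by how well it matches the new evidence.
        likelihood = np.exp(-0.5 * ((z - particles) / obs_noise) ** 2)
        weights *= likelihood
        weights /= weights.sum()
        # Resample to concentrate particles on the likely poses,
        # with a small diffusion so hypotheses stay diverse.
        idx = rng.choice(n_particles, n_particles, p=weights)
        particles = particles[idx] + rng.normal(0, 0.02, n_particles)
        weights = np.full(n_particles, 1.0 / n_particles)
    return particles.mean()
```

Over a sequence of images, additional evidence sharpens the particle set around the true pose, which is the filtering behavior point 3) above relies on.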


Face Detection for Automatic Avatar Creation by using Deformable Template and GA (Deformable Template과 GA를 이용한 얼굴 인식 및 아바타 자동 생성)

  • Park Tae-Young;Kwon Min-Su;Kang Hoon
    • Journal of the Korean Institute of Intelligent Systems / v.15 no.1 / pp.110-115 / 2005
  • This paper proposes a method to detect the contours of a face, eyes, and a mouth in a color image for making an avatar automatically. First, we use the HSI color model to exclude the effect of varying lighting conditions, and we find skin regions in an input image by using the skin color defined on the HS plane. We then use deformable templates and a Genetic Algorithm (GA) to detect the contours of the face, eyes, and mouth. Deformable templates consist of B-spline curves and control point vectors, which can represent various shapes of faces, eyes, and mouths. GA is a useful search procedure based on the mechanics of natural selection and natural genetics. Second, an avatar is created automatically by using the contours and Fuzzy C-means clustering (FCM); FCM is used to reduce the number of face colors. As a result, we could create avatars like handmade caricatures which represent the user's identity, differing from those generated by existing methods.
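A minimal sketch of the GA search over a control-point vector, assuming a toy fitness function (negative squared distance to a hypothetical target contour `TARGET`) in place of the paper's template-to-image matching score:

```python
import random

random.seed(42)

TARGET = [0.3, 0.7, 0.5, 0.9]  # hypothetical "true" control-point vector

def fitness(ind):
    # Higher is better: negative squared distance to the target contour.
    return -sum((a - b) ** 2 for a, b in zip(ind, TARGET))

def ga(pop_size=40, generations=60, mut_rate=0.3):
    pop = [[random.random() for _ in range(4)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # selection: keep the fittest half
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, 4)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            if random.random() < mut_rate:        # mutation: perturb one gene
                i = random.randrange(4)
                child[i] = min(1.0, max(0.0, child[i] + random.gauss(0, 0.1)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

In the paper the individuals would encode B-spline control points, and the fitness would measure how well the deformed template fits image edges in the detected skin region.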

A Study on Optimal Time Distribution of Extreme Rainfall Using Minutely Rainfall Data: A Case Study of Seoul (분단위 강우자료를 이용한 극치강우의 최적 시간분포 연구: 서울지점을 중심으로)

  • Yoon, Sun-Kwon;Kim, Jong-Suk;Moon, Young-Il
    • Journal of Korea Water Resources Association / v.45 no.3 / pp.275-290 / 2012
  • In this study, we developed an optimal time distribution model through extraction of peaks-over-threshold (POT) series. The median values of the annual maximum rainfall dataset, obtained from the magnetic recording (MMR) and automatic weather system (AWS) data at the Seoul meteorological observatory, were used as the POT criteria. We also suggest an improved methodology for the time distribution of extreme rainfall compared to the Huff method, which is widely used for time distributions of design rainfall. The Huff method does not consider changes in the shape of the time distribution across rainfall durations, nor rainfall criteria such as the total rainfall amount of each event. This study suggests a methodology for extracting rainfall events in each quartile based on an interquartile range (IQR) matrix and for selecting the mode quartile storm, determining the ranking by considering weighting factors on minutely observation data. Finally, the optimal time distribution model for each rainfall duration was derived considering both data size and distribution characteristics, using a kernel density function on the extracted dimensionless unit rainfall hyetograph.
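The POT criterion described above can be sketched in a few lines: take the median of the annual-maximum series as the threshold and keep every event that exceeds it (event definition and units are illustrative assumptions here):

```python
import numpy as np

def extract_pot_events(event_depths, annual_maxima):
    """Peaks-over-threshold extraction: keep rainfall events whose depth
    exceeds the median of the annual-maximum series (the POT criterion
    the study uses). Inputs are event totals, e.g. in mm."""
    threshold = float(np.median(annual_maxima))
    events = [d for d in event_depths if d > threshold]
    return events, threshold
```

The retained events would then be classified by quartile of peak timing and reduced to dimensionless unit hyetographs for the kernel-density step.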

Exploring indicators of genetic selection using the sniffer method to reduce methane emissions from Holstein cows

  • Yoshinobu Uemoto;Tomohisa Tomaru;Masahiro Masuda;Kota Uchisawa;Kenji Hashiba;Yuki Nishikawa;Kohei Suzuki;Takatoshi Kojima;Tomoyuki Suzuki;Fuminori Terada
    • Animal Bioscience / v.37 no.2 / pp.173-183 / 2024
  • Objective: This study aimed to evaluate whether the methane (CH4) to carbon dioxide (CO2) ratio (CH4/CO2) and methane-related traits obtained by the sniffer method can be used as indicators for genetic selection of Holstein cows with lower CH4 emissions. Methods: The sniffer method was used to simultaneously measure the concentrations of CH4 and CO2 during milking in each milking box of the automatic milking system to obtain CH4/CO2. Methane-related traits, which included CH4 emissions, CH4 per energy-corrected milk, methane conversion factor (MCF), and residual CH4, were calculated. First, we investigated the impact of the model with and without body weight (BW) on the lactation stage and parity for predicting methane-related traits using a first on-farm dataset (Farm 1; 400 records for 74 Holstein cows). Second, we estimated the genetic parameters for CH4/CO2 and methane-related traits using a second on-farm dataset (Farm 2; 520 records for 182 Holstein cows). Third, we compared the repeatability and environmental effects on these traits in both farm datasets. Results: The data from Farm 1 revealed that MCF can be reliably evaluated during the lactation stage and parity, even when BW is excluded from the model. Farm 2 data revealed low heritability and moderate repeatability for CH4/CO2 (0.12 and 0.46, respectively) and MCF (0.13 and 0.38, respectively). In addition, the estimated genetic correlation of milk yield with CH4/CO2 was low (0.07) and that with MCF was moderate (-0.53). The on-farm data indicated that CH4/CO2 and MCF could be evaluated consistently during the lactation stage and parity with moderate repeatability on both farms. Conclusion: This study demonstrated the on-farm applicability of the sniffer method for selecting cows with low CH4 emissions.

Selection Model of System Trading Strategies using SVM (SVM을 이용한 시스템트레이딩전략의 선택모형)

  • Park, Sungcheol;Kim, Sun Woong;Choi, Heung Sik
    • Journal of Intelligence and Information Systems / v.20 no.2 / pp.59-71 / 2014
  • System trading has recently become more popular among Korean traders. System traders use automatic order systems based on system-generated buy and sell signals. These signals are generated from predetermined entry and exit rules coded by system traders. Most research on system trading has focused on designing profitable entry and exit rules using technical indicators. However, market conditions, strategy characteristics, and money management also influence the profitability of system trading. Unexpected price deviations from the predetermined trading rules can incur large losses for system traders. Therefore, most professional traders use strategy portfolios rather than a single strategy. Building a good strategy portfolio is important because trading performance depends on it. Despite the importance of designing strategy portfolios, rule-of-thumb methods have been used to select trading strategies. In this study, we propose an SVM-based strategy portfolio management system. SVM was introduced by Vapnik and is known to be effective in the data mining area; it can build good portfolios within a very short period of time. Since SVM minimizes structural risk, it is well suited to the futures trading market, in which prices do not move exactly as they did in the past. Our system trading strategies include a moving-average cross system, MACD cross system, trend-following system, buy-dips-and-sell-rallies system, DMI system, Keltner channel system, Bollinger Bands system, and Fibonacci system. These strategies are well known and frequently used by many professional traders. We program these strategies to generate automated entry and exit signals. We propose an SVM-based strategy selection system along with portfolio construction and order routing systems. The strategy selection system is a portfolio training system: it generates training data and builds the SVM model from the optimal portfolio.
We construct an m×n data matrix by dividing the KOSPI 200 index futures data into equal periods. The optimal strategy portfolio is derived by analyzing each strategy's performance, and the SVM model is generated from this data and the optimal strategy portfolio. We use 80% of the data for training and the remaining 20% for testing. For training, we select the two strategies that show the highest profit on the next day. Selection method 1 selects two strategies, and method 2 selects at most two strategies that show a profit of more than 0.1 point. We use the one-against-all method, which has a fast processing time. We analyze the daily data of KOSPI 200 index futures contracts from January 1990 to November 2011. Price change rates over 50 days are used as SVM input data. The training period is from January 1990 to March 2007 and the test period is from March 2007 to November 2011. We suggest three benchmark strategy portfolios. BM1 holds two contracts of KOSPI 200 index futures for the testing period. BM2 consists of the two strategies with the largest cumulative profit during the 30 days before testing starts. BM3 consists of the two strategies with the best profits during the testing period. Trading costs include brokerage commissions and slippage. The proposed strategy portfolio management system shows profit more than double that of the benchmark portfolios: after deducting trading costs, BM1 shows 103.44 points of profit, BM2 shows 488.61 points, and BM3 (the portfolio of the two most profitable strategies during the test period, and thus the best benchmark) shows 502.41 points, while proposed system 1 shows 706.22 points and proposed system 2 shows 768.95 points. The equity curves for the entire period show a stable pattern, which, together with the higher profit, suggests a good trading direction for system traders. We could make even more stable and more profitable portfolios by adding a money management module to the system.
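The benchmark BM2 described above (pick the two strategies with the largest cumulative profit over the trailing 30 days) can be sketched directly; the SVM selection system itself learns this choice from price-change features, but the benchmark construction is a plain ranking. Data shapes and names here are illustrative:

```python
import numpy as np

def select_top_strategies(daily_profits, lookback=30, n_select=2):
    """Pick the strategies with the largest cumulative profit over the
    trailing `lookback` days (the construction of benchmark BM2).
    `daily_profits` has shape (days, strategies)."""
    profits = np.asarray(daily_profits, dtype=float)
    cum = profits[-lookback:].sum(axis=0)          # trailing cumulative P&L
    ranked = np.argsort(cum)[::-1]                 # best strategy first
    return [int(i) for i in ranked[:n_select]]
```

The proposed systems replace this fixed trailing-window rule with an SVM that maps 50-day price change rates to the pair of strategies expected to profit next.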

A study on Algorithm Automatically Generating Ray Codes for Ray-tracing (파선코드 자동생성 알고리즘에 관한 연구)

  • Lee, Hee-Il;Cho, Chang-Soo
    • Geophysics and Geophysical Exploration / v.11 no.4 / pp.361-367 / 2008
  • When constructing a synthetic seismogram in earthquake studies or in seismic data interpretation using a ray-tracing technique, the most troublesome and error-prone task is defining in advance a suite of ray codes for the rays to be traced. An infinite number of rays exist for any arbitrarily located source and receiver in a medium. Missing certain important rays, or an inappropriate selection of ray codes, may result in wrong interpretation of the earthquake record or seismogram. Automatic ray code generation can eliminate these problems. In this study we have developed an efficient algorithm that systematically generates all the ray codes for sources and receivers arbitrarily located in a model. The result of this work can be used not only in analysing multiples in seismic data processing and interpretation, but also in coda-wave studies, studies of amplification effects in basins, and phase identification of multiply reflected/refracted waves in earthquake studies.
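The systematic enumeration can be sketched as a bounded depth-first search over a layered model: a ray code is recorded as the sequence of layers visited, each segment moves the ray one layer up or down, and the search is cut off at a maximum number of segments (this encoding is a toy assumption, not the authors' exact scheme):

```python
def generate_ray_codes(n_layers, src_layer, rcv_layer, max_segments):
    """Enumerate ray codes as layer-visit sequences from source to
    receiver; each step moves one layer up or down, so a reversal in the
    sequence corresponds to a reflection at an interface."""
    codes = []

    def dfs(path):
        if len(path) > max_segments:
            return                      # bound the enumeration depth
        if path[-1] == rcv_layer and len(path) > 1:
            codes.append(tuple(path))   # a complete source-to-receiver code
        for nxt in (path[-1] - 1, path[-1] + 1):
            if 0 <= nxt < n_layers:
                dfs(path + [nxt])

    dfs([src_layer])
    return codes
```

Raising `max_segments` adds the longer codes that correspond to multiples, which is exactly the class of rays that is easy to miss when codes are written by hand.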

A SOC Design Methodology using SystemC (SystemC를 이용한 SOC 설계 방법)

  • 홍진석;김주선;배점한
    • Proceedings of the IEEK Conference / 2000.06b / pp.153-156 / 2000
  • This paper presents an SOC design methodology using the newly emerging SystemC. The suggested methodology first uses SystemC to define blocks from the previously developed system-level algorithm, with internal behavior and interfaces separated, and to validate the functionality of the described blocks when integrated. Next, the partitioning between software and hardware is considered. For software, the interface to hardware is described cycle-accurately and the remaining internal behavior in conventional ways. For hardware, I/O transactions are refined gradually over several abstraction levels and internal behavior is described on a function basis. Once hardware and software have been functionally completed, system performance analysis is performed on the built model with assumed performance factors, and the results feed back into decisions such as optimal algorithm selection and partitioning. The analysis also provides constraint information when the hardware description undergoes scheduling and fixed-point transformation, whether with the help of automatic translation tools or manually. The methodology enables C/C++ program developers and VHDL/Verilog users to migrate quickly to a co-design and co-verification environment and is suitable for SoC development at low cost.


Developing a Framework of Semantic Web Services for Integrated Management Center of U-City (U-City 도시통합운영센터를 위한 시맨틱 웹 서비스 프레임워크의 개발)

  • Lee, Myung-Jin;Kim, Kyung-Min;Jeon, Dong-Kyu;Eom, Tea-Young;Kim, Woo-Ju;Hong, June-S.
    • The Journal of Society for e-Business Studies / v.15 no.2 / pp.167-189 / 2010
  • As ubiquitous technology is adopted into civil engineering, a new city model called U-City has been suggested. This paper proposes a framework for the U-City integrated management center to support effective service operation. The aim of the framework is to provide a development and operation environment for U-City services. These objectives are achieved by adopting semantic web service technology in the framework; in this paper, OWL-S is mainly used to represent the descriptions of U-City services. In addition, this paper argues that fine-grained unit services are required to guarantee the reusability, compatibility, and scalability of services in the U-City management center. Service descriptions written in OWL-S are provided as examples. The last section also presents the architecture of a U-City management center that enables automatic service discovery, selection, composition, and interoperation.
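The automatic discovery and composition idea can be sketched as search over a registry of fine-grained services described by their input and output concepts; the service names and concepts below are hypothetical, and real OWL-S matching would use ontology subsumption rather than exact set containment:

```python
from collections import deque

# Hypothetical fine-grained unit services: name -> (input concepts, output concepts)
SERVICES = {
    "DetectIncident": ({"SensorEvent"}, {"IncidentReport"}),
    "LocateIncident": ({"IncidentReport"}, {"Location"}),
    "DispatchPatrol": ({"Location"}, {"PatrolOrder"}),
}

def compose(available, goal):
    """Breadth-first automatic composition: chain services whose inputs
    are satisfied until the goal concept becomes derivable."""
    start = frozenset(available)
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        facts, plan = queue.popleft()
        if goal in facts:
            return plan
        for name, (ins, outs) in SERVICES.items():
            if ins <= facts:                       # service is invocable
                nxt = frozenset(facts | outs)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, plan + [name]))
    return None  # no composition reaches the goal
```

Because the services are fine-grained, the same registry can be recombined into many U-City services, which is the reusability argument the paper makes.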

STK Feature Tracking Using BMA for Fast Feature Displacement Convergence (빠른 피쳐변위수렴을 위한 BMA을 이용한 STK 피쳐 추적)

  • Jin, Kyung-Chan;Cho, Jin-Ho
    • Journal of the Korean Institute of Telematics and Electronics S / v.36S no.8 / pp.81-87 / 1999
  • In general, feature detection and tracking algorithms are classified into EBGM using Gabor jets, NCC-R, and the STK algorithm using pixel eigenvalues. Among these, EBGM and NCC-R detect features with a feature model, whereas the STK algorithm has the characteristic of automatic feature selection. In this paper, to solve the initialization problem of Newton-Raphson (NR) tracking in the STK algorithm, we detect features using the STK algorithm in a modeled feature region and track them with the NR method. To improve the tracking accuracy of the NR method, we propose the BMA-NR method. We found BMA-NR superior to NBMA-NR in feature tracking accuracy, since BMA-NR can solve the local-minimum problem caused by the search window size of NR.
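The block-matching step that initializes the Newton-Raphson refinement can be sketched as a sum-of-absolute-differences (SAD) search over a displacement window; block and window sizes here are illustrative, and the NR refinement stage is omitted:

```python
import numpy as np

def block_match(prev, curr, feat, block=3, search=5):
    """Find a feature's displacement between two frames by minimizing
    the sum of absolute differences (SAD) over a search window."""
    r, c = feat
    h = block // 2
    template = prev[r - h:r + h + 1, c - h:c + h + 1].astype(int)
    best, best_sad = (0, 0), np.inf
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            # Skip candidates whose block would fall outside the frame.
            if (rr - h < 0 or cc - h < 0 or
                    rr + h >= curr.shape[0] or cc + h >= curr.shape[1]):
                continue
            cand = curr[rr - h:rr + h + 1, cc - h:cc + h + 1].astype(int)
            sad = np.abs(cand - template).sum()
            if sad < best_sad:
                best_sad, best = sad, (dr, dc)
    return best
```

The coarse displacement found this way gives NR a starting point near the true minimum, which is how BMA-NR avoids the local minima that a small NR search window falls into.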
