• Title/Summary/Keyword: prior model


Daily Stock Price Prediction Using Fuzzy Model (퍼지 모델을 이용한 일별 주가 예측)

  • Hwang, Hee-Soo
    • The KIPS Transactions:PartB
    • /
    • v.15B no.6
    • /
    • pp.603-608
    • /
    • 2008
  • In this paper, an approach to building a fuzzy model to predict daily open, close, high, and low stock prices is presented. One of the primary problems in building a stock prediction model is selecting the most effective indicators for the prediction. This problem is overcome by selecting the information used in candlestick-chart analysis as the input variables of our fuzzy model. The fuzzy rules have a premise and a consequent, composed of trapezoidal membership functions and nonlinear equations, respectively. DE (Differential Evolution) searches for optimal fuzzy rules through an evolutionary process. To evaluate the effectiveness of the proposed approach, a numerical example is considered. Fuzzy models to predict the open, high, low, and close prices of the KOSPI (KOrea composite Stock Price Index) on a daily basis are built, and their performances are demonstrated and compared with those of a neural network.
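The DE search mentioned in the abstract can be sketched as a standard DE/rand/1/bin loop. The objective below is a stand-in for the fuzzy model's prediction error, and the control parameters (population size, F, CR, bounds) are illustrative, not the authors' settings:

```python
import numpy as np

def differential_evolution(objective, bounds, pop_size=20, F=0.5, CR=0.9,
                           generations=100, seed=0):
    """Minimal DE/rand/1/bin loop minimizing `objective` over box `bounds`."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fitness = np.array([objective(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # pick three distinct members other than i for the mutant vector
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # binomial crossover with at least one mutated coordinate
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            f = objective(trial)
            if f < fitness[i]:  # greedy selection
                pop[i], fitness[i] = trial, f
    best = np.argmin(fitness)
    return pop[best], fitness[best]

# Toy objective standing in for the fuzzy model's prediction error (sphere function).
x, fx = differential_evolution(lambda v: float(np.sum(v**2)), bounds=[(-5, 5)] * 4)
```

In the paper's setting, the decision vector would instead encode the trapezoidal membership-function breakpoints and the coefficients of the consequent equations.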

Evaluation and Promotion Policy for Promising Business Models Based on TV White Space (TV 유휴 대역을 활용한 유망 비즈니스 모델의 평가 및 활성화 정책 연구)

  • Kim, Tae-Han;Song, Hee-Seok
    • The Journal of Korean Institute of Electromagnetic Engineering and Science
    • /
    • v.23 no.8
    • /
    • pp.909-922
    • /
    • 2012
  • To fully utilize scarce spectrum resources, it is necessary to develop and evaluate promising business models prior to establishing technology R&D plans and industrial promotion policies. The purpose of this paper is to design potential business models, evaluate the propriety of commercializing them, and discuss promotion policies after exploring promising sectors that consume spectrum resources. The research is based on TV white space, i.e., vacant TV channels in the spatial or temporal domain, considered a core spectrum resource accompanying the digital terrestrial television switchover. As a result, four kinds of business models were derived, including broadcasting and telecommunication types. Each model was discussed from four standpoints (customer value proposition, profit formula, key resources, and key processes), and its propriety for commercialization was evaluated along three dimensions: technological, business-oriented, and user-oriented. Promotion policies for government and market participants to activate TV white space-based business models are discussed as well.

Strategies for Activating BIM-data Sharing in Construction - Based on cases of defining practical data and a survey of practitioners - (건설분야 BIM 데이터 공유 활성화 전략 - 건설 실무분야의 데이터 연계방법과 실무자 설문을 기반으로-)

  • Kim, Do-Young;Lee, Sung-Woo;Nam, Ju-Hyun;Kim, Bum-Soo;Kim, Sung-Jin
    • Journal of KIBIM
    • /
    • v.12 no.1
    • /
    • pp.72-80
    • /
    • 2022
  • It has become mandatory to design with BIM in construction, and it is urgent to support accurate decisions through linkage among the complex and various types of data in a project. In particular, a blockchain-based data sharing process (using BIM files and generally submitted construction documents) is essential to support reliable decision making amid such complex data flows. Prior to developing a blockchain-based data sharing system, this paper proposes a data linkage method that lets practitioners utilize existing construction information and BIM data simultaneously. Examples are shown based on the construction classification system and file expression, and incentive strategies are explored through a survey so that heterogeneous information can be used together across entire projects.

Agent's Activities based Intention Recognition Computing (에이전트 행동에 기반한 의도 인식 컴퓨팅)

  • Kim, Jin-Ok
    • Journal of Internet Computing and Services
    • /
    • v.13 no.2
    • /
    • pp.87-98
    • /
    • 2012
  • Understanding an agent's intent is an essential component of human-computer interaction in ubiquitous computing, because correct inference of a subject's intention helps a ubiquitous computing system understand situations that involve collaboration among multiple agents or detect situations that may lead to a particular activity. Inspired by the fact that people have a mechanism for interpreting one another's actions and for inferring the intentions and goals that underlie action, this paper proposes an approach that allows a computing system to quickly recognize the intent of agents based on experience data acquired through prior activity-recognition capabilities. To perform intention recognition, the proposed method uses Hidden Markov Models (HMMs) to model the system's prior experience and the agents' action changes, and then lets the system infer intents in advance, before the agents' actions are finalized, while taking the perspective of the agent whose intent should be recognized. Quantitative validation of the experimental results, reporting an accuracy rate, an early detection rate, and a correct duration rate when detecting the intents of several people performing various activities, shows that the proposed research contributes to an effective intent recognition system.
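The "infer intents before actions are finalized" idea maps naturally onto HMM forward filtering: the belief over hidden intents is available after every observed activity. The sketch below uses a toy two-intent, three-activity HMM whose parameters are purely illustrative, not the paper's learned models:

```python
import numpy as np

# Toy HMM: two hypothetical intents emitting three observable activity symbols.
A = np.array([[0.9, 0.1],        # P(next intent | current intent)
              [0.2, 0.8]])
B = np.array([[0.6, 0.3, 0.1],   # P(activity | intent)
              [0.1, 0.3, 0.6]])
pi = np.array([0.5, 0.5])        # initial intent distribution

def forward_filter(obs):
    """Return P(intent_t | activities up to t) at each step (forward algorithm)."""
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()
    beliefs = [alpha]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # predict, then weight by the emission
        alpha /= alpha.sum()
        beliefs.append(alpha)
    return np.array(beliefs)

# A partial activity stream: the intent belief exists before the sequence ends.
beliefs = forward_filter([0, 0, 1])
```

Early detection then amounts to thresholding these running beliefs rather than waiting for the full action sequence.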

Development and Validation of Exposure Models for Construction Industry: Tier 2 Model (건설업 유해화학물질 노출 모델의 개발 및 검증: Tier-2 노출 모델)

  • Kim, Seung Won;Jang, Jiyoung;Kim, Gab Bae
    • Journal of Korean Society of Occupational and Environmental Hygiene
    • /
    • v.24 no.2
    • /
    • pp.219-228
    • /
    • 2014
  • Objectives: The major objective of this study was to develop a tier 2 exposure model that combines tier 1 exposure model estimates with worker monitoring data and suggests narrower exposure ranges than the tier 1 results. Methods: Bayesian statistics were used to develop the tier 2 exposure model, as was done for the European Union (EU) tier 2 exposure models, for example the Advanced REACH Tool (ART) and Stoffenmanager. Bayesian statistics require a prior and data to calculate the posterior results. In this model, the tier 1 estimates served as the prior, and worker exposure monitoring data at the worksite of interest were entered as data. The Bayesian calculation requires integration over a range, which was performed using a Riemann sum algorithm. From the calculated exposure estimates, the 95% range was extracted. The algorithm was implemented on an Excel spreadsheet for convenience and easy access. Some fail-proof features, such as locking the spreadsheet, were added in order to prevent errors or miscalculations caused by careless use of the file. Results: The tier 2 exposure model was successfully built on a separate Excel spreadsheet in the same file containing the tier 1 exposure model. To utilize the model, an exposure range estimated from the tier 1 model and at least one worker monitoring measurement are required. Conclusions: The developed tier 2 exposure model can help industrial hygienists obtain a narrow range of worker exposure levels to a chemical by reflecting a certain set of job characteristics.
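The prior-times-likelihood update with Riemann-sum integration described in the Methods can be sketched on a grid. The lognormal forms, the gsd values, and the example measurements below are assumptions for illustration; the paper's Excel implementation may parameterize the distributions differently:

```python
import numpy as np

def posterior_on_grid(prior_median, prior_gsd, data, data_gsd, grid):
    """Grid posterior: lognormal tier-1 prior times lognormal measurement likelihood."""
    log_g = np.log(grid)
    prior = np.exp(-0.5 * ((log_g - np.log(prior_median)) / np.log(prior_gsd))**2)
    like = np.ones_like(grid)
    for x in data:
        like *= np.exp(-0.5 * ((np.log(x) - log_g) / np.log(data_gsd))**2)
    post = prior * like
    dx = grid[1] - grid[0]
    post /= post.sum() * dx          # Riemann-sum normalisation
    return post

grid = np.linspace(0.01, 10.0, 2000)             # exposure grid (e.g. mg/m^3)
post = posterior_on_grid(prior_median=1.0, prior_gsd=2.5,
                         data=[0.4, 0.6], data_gsd=2.0, grid=grid)
dx = grid[1] - grid[0]
cdf = np.cumsum(post) * dx                        # Riemann-sum CDF
lo, hi = grid[np.searchsorted(cdf, 0.025)], grid[np.searchsorted(cdf, 0.975)]
# (lo, hi) is the narrowed 95% exposure range extracted from the posterior
```

With measurements below the tier 1 median, the posterior band shifts downward and narrows, which is exactly the tier 2 behavior the abstract describes.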

A Framework for Object Detection by Haze Removal (안개 제거에 의한 객체 검출 성능 향상 방법)

  • Kim, Sang-Kyoon;Choi, Kyoung-Ho;Park, Soon-Young
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.51 no.5
    • /
    • pp.168-176
    • /
    • 2014
  • Detecting moving objects in a video sequence is a fundamental and critical task in video surveillance, traffic monitoring and analysis, and human detection and tracking. It is very difficult to detect moving objects in a video sequence degraded by an environmental factor such as fog. In particular, the color of an object becomes similar to its surroundings and its saturation is reduced, making it very difficult to distinguish the object from the background. For this reason, the performance and reliability of object detection and tracking are poor in foggy weather. In this paper, we propose a novel method to improve the performance of object detection by combining a haze removal algorithm with a local histogram-based object tracking method. For quantitative evaluation of the proposed system, the information retrieval measures recall and precision are used to quantify how much the performance improves before and after haze removal. As a result, the visibility of the image is enhanced and the performance of object detection is improved.
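The recall/precision bookkeeping used for the before/after comparison can be written pixel-wise over binary detection masks. The masks below are toy examples, not the paper's data:

```python
import numpy as np

def recall_precision(detected, ground_truth):
    """Pixel-wise recall and precision for binary detection masks."""
    tp = np.logical_and(detected, ground_truth).sum()   # true positives
    fp = np.logical_and(detected, ~ground_truth).sum()  # false positives
    fn = np.logical_and(~detected, ground_truth).sum()  # missed pixels
    recall = tp / (tp + fn) if tp + fn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    return recall, precision

gt = np.zeros((8, 8), dtype=bool); gt[2:6, 2:6] = True    # true object region
det = np.zeros((8, 8), dtype=bool); det[3:6, 2:7] = True  # detector output
r, p = recall_precision(det, gt)   # r = 0.75, p = 0.8 for these toy masks
```

Running the same measure on detections from the hazy and the dehazed frames quantifies the improvement the abstract reports.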

An Efficient Background Modeling and Correction Method for EDXRF Spectra (EDXRF 스펙트럼을 위한 효율적인 배경 모델링과 보정 방법)

  • Park, Dong Sun;Jagadeesan, Sukanya;Jin, Moonyong;Yoon, Sook
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.50 no.8
    • /
    • pp.238-244
    • /
    • 2013
  • In energy dispersive X-ray fluorescence analysis, removal of the continuum on which the X-ray spectrum is superimposed is one of the most important processes, since it strongly influences the analysis result. The existing methods usually require tight constraints or prior information on the continuum. In this paper, an efficient background correction method is proposed for Energy Dispersive X-ray Fluorescence (EDXRF) spectra. The proposed method has two steps: background modeling and background correction. It is based on the basic concept of differentiating background areas from peak areas in a spectrum, and the SNIP algorithm, one of the popular methods for background removal, is used to enhance the performance. After detecting points that belong to the background of a spectrum, the background is modeled by fitting a curve to them. The obtained background model is then subtracted from the raw spectrum. The method has been shown to give better results than some traditional methods, while working under relatively weak constraints or prior information.
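The SNIP algorithm the paper builds on is a short routine: an LLS (log-log-square-root) transform followed by iterative symmetric clipping. The sketch below is a generic textbook SNIP, with an illustrative window count and a synthetic spectrum, not the authors' full two-step method:

```python
import numpy as np

def snip_background(spectrum, iterations=24):
    """Estimate the slowly varying continuum under peaks via SNIP clipping."""
    # LLS transform compresses the dynamic range before clipping
    v = np.log(np.log(np.sqrt(spectrum + 1.0) + 1.0) + 1.0)
    for m in range(1, iterations + 1):
        clipped = v.copy()
        # replace each point by the smaller of itself and its symmetric average
        clipped[m:-m] = np.minimum(v[m:-m], 0.5 * (v[:-2*m] + v[2*m:]))
        v = clipped
    # invert the LLS transform
    return (np.exp(np.exp(v) - 1.0) - 1.0) ** 2 - 1.0

x = np.arange(200, dtype=float)
background = 50.0 + 0.1 * x                          # slowly varying continuum
peak = 400.0 * np.exp(-0.5 * ((x - 100) / 3.0)**2)   # superimposed peak
estimate = snip_background(background + peak)
corrected = background + peak - estimate             # background-corrected spectrum
```

Peaks narrower than the final clipping window are suppressed in the estimate, while the smooth continuum passes through nearly unchanged.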

$H_{\infty}$ Filter Based Robust Simultaneous Localization and Mapping for Mobile Robots (이동로봇을 위한 $H_{\infty}$ 필터 기반의 강인한 동시 위치인식 및 지도작성 구현 기술)

  • Jeon, Seo-Hyun;Lee, Keon-Yong;Doh, Nakju Lett
    • Journal of the Institute of Electronics Engineers of Korea SC
    • /
    • v.48 no.1
    • /
    • pp.55-60
    • /
    • 2011
  • The most basic algorithm in the SLAM (Simultaneous Localization And Mapping) technique for mobile robots is EKF (Extended Kalman Filter) SLAM. However, it requires prior information about the characteristics of the system and the noise model, which cannot be estimated accurately. Because of this limitation, the Kalman filter shows the following behaviors in a highly uncertain environment: it becomes too sensitive to internal parameters, mathematical consistency is not kept, or it yields a wrong estimation result. In contrast, the $H_{\infty}$ filter does not require detailed prior information. Thus, based on the idea that an $H_{\infty}$ filter based SLAM will be more robust than EKF-SLAM, we propose a framework for $H_{\infty}$ filter based SLAM and show that the suggested algorithm gives a slightly better result than EKF-SLAM in a highly uncertain environment.
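For reference, one common discrete-time $H_{\infty}$ filter recursion (a generic textbook form for $x_{k+1} = F x_k + w_k$, $y_k = H x_k + v_k$; the weighting $\bar{S}$ and performance bound $\gamma$ are design choices, and this is not necessarily the authors' exact formulation) replaces the Kalman gain as follows:

```latex
K_k = P_k \left( I - \gamma^{-2} \bar{S} P_k + H^T R^{-1} H P_k \right)^{-1} H^T R^{-1} \\
\hat{x}_{k+1} = F \hat{x}_k + F K_k \left( y_k - H \hat{x}_k \right) \\
P_{k+1} = F P_k \left( I - \gamma^{-2} \bar{S} P_k + H^T R^{-1} H P_k \right)^{-1} F^T + Q
```

As $\gamma \to \infty$ the $\gamma^{-2}$ term vanishes and the recursion reduces to the Kalman filter, which is why the $H_{\infty}$ filter tolerates poorly known noise models: it bounds the worst-case estimation error rather than assuming the noise statistics are correct.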

A Long-term Durability Prediction for RC Structures Exposed to Carbonation Using Probabilistic Approach (확률론적 기법을 이용한 탄산화 RC 구조물의 내구성 예측)

  • Jung, Hyun-Jun;Kim, Gyu-Seon
    • Journal of the Korea institute for structural maintenance and inspection
    • /
    • v.14 no.5
    • /
    • pp.119-127
    • /
    • 2010
  • This paper provides a new approach for durability prediction of reinforced concrete structures exposed to carbonation. In this method, the prediction can be updated successively by Bayes' theorem when additional data become available. The stochastic properties of the model parameters are explicitly taken into account. To simplify the procedure, the probability of reaching the durability limit is determined from samples obtained with the Latin Hypercube Sampling (LHS) technique. The new method may be very useful in the design of important concrete structures and can help predict the remaining service life of existing concrete structures that have been monitored. In the new method, the prior distribution is developed to represent the uncertainties of the carbonation velocity using data from concrete structures in Korea (3,700 specimens), and the likelihood function is constructed from monitored in-situ data. The posterior distribution is obtained by combining the prior distribution and the likelihood function. The efficiency of the LHS technique for simulation was confirmed through a comparison with the Monte Carlo Simulation (MCS) technique.
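The LHS technique that replaces plain Monte Carlo here can be sketched in a few lines: each dimension is split into equal-probability strata, one sample is drawn per stratum, and the pairing across dimensions is shuffled. Sample counts and the unit-cube output are illustrative; the paper maps samples through the parameter distributions of the carbonation model:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    """One stratified sample per equal-probability stratum in each dimension."""
    rng = rng or np.random.default_rng(0)
    # row i lands in stratum [i/n, (i+1)/n) before shuffling
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for d in range(n_dims):
        rng.shuffle(u[:, d])   # decouple the stratum pairing across dimensions
    return u                   # uniform in [0, 1); map through inverse CDFs as needed

samples = latin_hypercube(100, 2)
# each column places exactly one sample in each of the 100 strata
```

Because every stratum of every marginal is covered exactly once, far fewer samples are needed than with plain Monte Carlo to cover the parameter space, which is the efficiency gain the comparison with MCS confirms.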

Online anomaly detection algorithm based on deep support vector data description using incremental centroid update (점진적 중심 갱신을 이용한 deep support vector data description 기반의 온라인 비정상 탐지 알고리즘)

  • Lee, Kibae;Ko, Guhn Hyeok;Lee, Chong Hyun
    • The Journal of the Acoustical Society of Korea
    • /
    • v.41 no.2
    • /
    • pp.199-209
    • /
    • 2022
  • Typical anomaly detection algorithms are trained using prior data; thus such batch-learning-based algorithms suffer inevitable performance degradation when the characteristics of newly incoming normal data change over time. We propose an online anomaly detection algorithm that can account for gradual changes in the characteristics of incoming normal data. The proposed algorithm, based on a one-class classification model, includes both offline and online learning procedures. In the offline learning procedure, the algorithm learns to map the prior data close to the centroid of the latent space, and then updates the centroid of the latent space incrementally with newly incoming data. In the online learning procedure, the algorithm continues learning using the updated centroid. In experiments using public underwater acoustic data, the proposed online anomaly detection algorithm takes only approximately 2 % additional learning time for the incremental centroid update and learning. Nevertheless, the proposed algorithm shows a 19.10 % improvement in Area Under the receiver operating characteristic Curve (AUC) performance compared to the offline learning model when new incoming normal data arrive.
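The incremental centroid update is essentially a running mean over latent embeddings, so new normal data shift the center without revisiting prior data. This count-weighted form is one natural reading of the abstract; the paper may instead use a learning-rate or windowed variant:

```python
import numpy as np

def update_centroid(centroid, new_embeddings, count):
    """Fold a batch of new latent vectors into a running-mean centroid."""
    batch = np.asarray(new_embeddings, dtype=float)
    total = count + len(batch)
    centroid = (centroid * count + batch.sum(axis=0)) / total
    return centroid, total

c = np.zeros(3)                                   # 3-d latent space, no data yet
c, n = update_centroid(c, [[1.0, 1.0, 1.0],
                           [3.0, 3.0, 3.0]], count=0)   # mean so far: [2, 2, 2]
c, n = update_centroid(c, [[8.0, 8.0, 8.0]], count=n)   # running mean: [4, 4, 4]
```

The anomaly score at inference time is then the distance of a new embedding from the current centroid, so keeping the centroid current is what preserves AUC as the normal data drift.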