• Title/Summary/Keyword: VE 방법론 (VE methodology)

Search Results: 32

Rough Set Analysis for Stock Market Timing (러프집합분석을 이용한 매매시점 결정)

  • Huh, Jin-Nyung;Kim, Kyoung-Jae;Han, In-Goo
    • Journal of Intelligence and Information Systems
    • /
    • v.16 no.3
    • /
    • pp.77-97
    • /
    • 2010
  • Market timing is an investment strategy used to obtain excess returns from financial markets. In general, detecting market timing means determining when to buy and sell in order to earn an excess return from trading. In many market timing systems, trading rules have been used as an engine to generate trade signals. On the other hand, some researchers have proposed rough set analysis as a suitable tool for market timing because, through its control function, it does not generate a trade signal when the market pattern is uncertain. Numeric data must be discretized for rough set analysis because rough sets only accept categorical data. Discretization searches for proper "cuts" in numeric data that determine intervals; all values that lie within an interval are transformed into the same value. In general, there are four data discretization methods in rough set analysis: equal frequency scaling, expert's knowledge-based discretization, minimum entropy scaling, and naïve and Boolean reasoning-based discretization. Equal frequency scaling fixes the number of intervals, examines the histogram of each variable, and then determines cuts so that approximately the same number of samples fall into each interval. Expert's knowledge-based discretization determines cuts according to the knowledge of domain experts, gathered through literature review or interviews with experts. Minimum entropy scaling recursively partitions the value set of each variable so that a local measure of entropy is optimized. Naïve and Boolean reasoning-based discretization finds categorical values by naïve scaling of the data and then finds optimized discretization thresholds through Boolean reasoning. Although rough set analysis is promising for market timing, there is little research on how the various data discretization methods affect trading performance under rough set analysis. In this study, we compare stock market timing models using rough set analysis with various data discretization methods. The research data are the KOSPI 200 from May 1996 to October 1998. The KOSPI 200 is the underlying index of the KOSPI 200 futures, the first derivative instrument in the Korean stock market. It is a market-value-weighted index consisting of 200 stocks selected by criteria on liquidity and their status in the corresponding industries, including manufacturing, construction, communication, electricity and gas, distribution and services, and financing. The total number of samples is 660 trading days. In addition, this study uses popular technical indicators as independent variables. The experimental results show that the most profitable method on the training sample is naïve and Boolean reasoning-based discretization, but expert's knowledge-based discretization is the most profitable on the validation sample. Moreover, expert's knowledge-based discretization produced robust performance on both the training and validation samples. We also compared rough set analysis with a decision tree, using C4.5 for the comparison. The results show that rough set analysis with expert's knowledge-based discretization produced more profitable rules than C4.5.
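
Of the four discretization schemes listed in the abstract, equal frequency scaling is the easiest to make concrete. The sketch below is a minimal, hypothetical implementation (the paper publishes no code); the indicator series and the interval count are assumptions for illustration.

```python
import numpy as np

def equal_frequency_cuts(values, n_intervals):
    """Choose cuts so roughly the same number of samples falls in each interval."""
    interior = np.linspace(0, 1, n_intervals + 1)[1:-1]  # interior quantiles
    return np.quantile(values, interior)

def discretize(values, cuts):
    """Map each numeric value to the categorical index of its interval."""
    return np.searchsorted(cuts, values)

# Hypothetical example: a technical indicator over 660 trading days is cut
# into 4 categories before being fed to the rough set analysis.
indicator = np.random.default_rng(0).uniform(10, 90, size=660)
cuts = equal_frequency_cuts(indicator, n_intervals=4)
categories = discretize(indicator, cuts)
print(cuts, np.bincount(categories))  # interval counts come out nearly equal
```

Minimum entropy scaling differs only in how the cuts are chosen: intervals are split recursively wherever the split most reduces class-label entropy, rather than by sample count.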

Assessing the Utility of Rainfall Forecasts for Weekly Groundwater Level Forecast in Tampa Bay Region, Florida (주단위 지하수위 예측 모의를 위한 강우 예측 자료의 적용성 평가: 플로리다 템파 지역 사례를 중심으로)

  • Hwang, Syewoon;Asefa, Tirusew;Chang, Seungwoo
    • Journal of The Korean Society of Agricultural Engineers
    • /
    • v.55 no.6
    • /
    • pp.1-9
    • /
    • 2013
  • Short-term prediction of hydrologic conditions using future climate information is an essential task for a stable water supply. In the Tampa Bay region of west-central Florida, USA, a groundwater level artificial neural network model (GWANN) has been developed for the effective use of groundwater, one of the region's major water resources: it produces monthly forecasts of weekly average groundwater levels for the confined and unconfined aquifers, and the results are reflected in water supply decision making. This paper introduces the GWANN-based groundwater level forecasting system for the Tampa area and, by analyzing the model's sensitivity to its climate inputs, examines how much the current system could benefit from high-quality climate information. For the study years 2006 and 2007, applying observed data as the best-case forecast scenario (the "best forecast") gave results that differed greatly among groundwater monitoring sites but generally improved forecast performance relative to the current system (which applies the current week's real-time average rainfall unchanged over the following four weeks). Using percentile values of observed rainfall (percentile forecasts: 20th, 50th, and 80th percentiles) as the rainfall forecast input also gave somewhat better results than the current system. However, a naïve model that assumes the current groundwater level simply persists, without using any forecast model, was more accurate than the best forecast over the following two weeks; the utility of high-quality rainfall forecast information for GWANN's short-term predictions is therefore low, and the best forecast began to outperform the naïve model only at lead times of three weeks or more. In addition, GWANN's forecast performance varied widely with the application period, region, and characteristics of the underlying aquifer, suggesting that the model needs to be improved before rainfall forecast products are adopted. This study is expected to serve as a methodology for assessing the utility of climate forecast information for regional modeling systems used in short-term water supply planning.
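
The rainfall-input scenarios compared above can be sketched concretely. The code below is a hypothetical illustration, not the paper's GWANN pipeline: it builds the current-system input, the three percentile forecasts, and the best forecast for a 4-week horizon from a weekly rainfall series.

```python
import numpy as np

def rainfall_scenarios(weekly_rain, t, horizon=4):
    """Rainfall inputs for weeks t+1..t+horizon under each forecast scenario.

    weekly_rain : 1-D array of observed weekly average rainfall
    t           : index of the current week
    """
    history = weekly_rain[: t + 1]
    return {
        # Current system: this week's observed rainfall, held for 4 weeks.
        "current_system": np.full(horizon, weekly_rain[t]),
        # Percentile forecasts drawn from the historical record.
        "p20": np.full(horizon, np.percentile(history, 20)),
        "p50": np.full(horizon, np.percentile(history, 50)),
        "p80": np.full(horizon, np.percentile(history, 80)),
        # Best forecast: the rainfall actually observed afterwards.
        "best_forecast": weekly_rain[t + 1 : t + 1 + horizon],
    }

# Hypothetical usage with two years of synthetic weekly rainfall (mm):
rain = np.random.default_rng(1).gamma(2.0, 10.0, size=104)
for name, series in rainfall_scenarios(rain, t=52).items():
    print(f"{name:15s}", np.round(series, 1))
```

The naïve baseline in the abstract skips the model entirely and carries the last observed groundwater level forward, which is why it needs no rainfall input at all.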

A Study on User-Centered Sub-system and Interface Design for Virtual Factory System Implementation (Virtual Factory 시스템 구현을 위한 사용자 중심의 Sub-system 및 Interface 설계 연구)

  • 이동길;양선모;여동한;박종성;이순요
    • Proceedings of the ESK Conference
    • /
    • 1996.04a
    • /
    • pp.151-151
    • /
    • 1996
  • Rapid advances in computer technology now make it possible to test all kinds of real-world sites inside a computer-generated virtual space. Virtual reality (VR) technology, which has been developing rapidly in recent years, and ergonomic approaches that emphasize human sensibility (Kansei engineering) are being applied to an ever wider range of fields; combined with the computer integrated manufacturing (CIM) systems used by manufacturing companies, they can be used to implement a virtual factory system. The virtual factory system classifies a manufacturing company's business into three processes: Design & Engineering, Production Planning & Control, and Manufacturing. For each process, the sub-systems of the current CIM system are reviewed and the points that must be supplemented to implement a human-centered virtual enterprise system are identified. As solutions, a Virtual Engineering (VE) sub-system is proposed for the Design & Engineering process and a Virtual Manufacturing (VM) sub-system for the Manufacturing process; by designing the constituent modules of each sub-system and the interfaces between them, this study presents the basic requirements and conditions for implementing a virtual factory system.
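
The decomposition just described — three business processes, with VE and VM sub-systems and interfaces between them — can be mocked up as a small data model. The sketch below is purely illustrative: the module and interface names are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SubSystem:
    name: str
    modules: List[str] = field(default_factory=list)

@dataclass
class Process:
    name: str
    subsystem: Optional[SubSystem] = None

# Hypothetical module names, for illustration only.
ve = SubSystem("Virtual Engineering (VE)", ["product modeling", "design review"])
vm = SubSystem("Virtual Manufacturing (VM)", ["process simulation", "line layout"])

processes = [
    Process("Design & Engineering", ve),
    Process("Production Planning & Control"),
    Process("Manufacturing", vm),
]

# Interface between sub-systems: VE design outputs feed VM simulation inputs.
interfaces = {(ve.name, vm.name): "design data exchange"}
```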

Typological System of Nature-based Solutions and Its Similar Concepts on Water Management (물관리를 위한 자연기반해법과 유사개념들의 유형분류 및 체계)

  • Woo, Hyoseop;Han, Seung-wan
    • Ecology and Resilient Infrastructure
    • /
    • v.7 no.1
    • /
    • pp.15-25
    • /
    • 2020
  • We compared and conceptually evaluated the newly emerging concept of nature-based solutions (NbS) for water management against existing, similarly purposed solutions with different names, all of which are based on ecosystem functions. We find that NbS is significant and meaningful, both educationally and for comprehension, in that it can comprehensively cover the existing methodologies and solutions that apply the functions of natural ecosystems to socio-environmental challenges. In terms of water management approaches, however, it does not differ much from green infrastructure in the broad sense, including Eco-DRR. The conceptual and spatial hierarchy of the practices considered in this study can be expressed in the narrowing order NbS-(EE)-BGI-(CRT)-GI-LID. Finally, the term LID, the best management practice for stormwater management in development projects, can be replaced with the term GI for clarity and less confusion in both academia and practice.

Dosimetric evaluation of using in-house BoS Frame Fixation Tool for the Head and Neck Cancer Patient (두경부암 환자의 양성자 치료 시 사용하는 자체 제작한 BoS Frame 고정장치의 선량학적 유용성 평가)

  • Kim, Kwang Suk;Jo, Kwang Hyun;Choi, Byeon Ki
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.28 no.1
    • /
    • pp.35-46
    • /
    • 2016
  • Purpose: The BoS (Base of Skull) frame, a fixation tool used in proton therapy for brain cancer, increases the lateral penumbra because beams from posterior oblique directions collide with it, forcing a larger airgap (the distance between the patient and the beam snout). We therefore manufactured an in-house fixation tool to overcome this limitation of the BoS frame, and in this study we evaluate the dosimetric utility of the manufactured tool. Materials and Methods: We selected three brain cancer patients who had received proton therapy at our hospital and six beam angles in posterior oblique directions. For each beam planned with the BoS frame, we measured the planned snout distance. We then repeated the set-up for the same patients and beam angles using our in-house BoS frame fixation tool, positioned above the location recommended by the BoS frame manufacturer: 21 cm toward the superior direction compared with the BoS frame mounted on the basic couch alone. We moved the snout as close to the BoS frame as possible and measured the snout distance, from which the airgap was determined. After normalizing to each dose (100%), we comparatively analyzed the lateral penumbra. We also generated treatment plans reflecting the changed airgap in the Raystation 5.0 proton therapy planning system and compared dose-volume histograms (DVH). Results: Compared with set-ups without the in-house BoS frame fixation tool, using the tool reduced the airgap by 5.4 cm to 15.4 cm depending on the beam angle (the reduction in snout distance corresponds to the airgap reduction). With the reduced airgap, the lateral penumbra decreased by 0.1 cm to 0.4 cm on the left and right, depending on the beam angle. Due to the reduced lateral penumbra, the doses to the left eyeball, left lens, left hippocampus, left cochlea, right eyeball, right lens, right cochlea, right hippocampus, and brain stem decreased by 0 CGE to 4.4 CGE. Conclusion: Using our in-house BoS frame fixation tool for proton therapy reduced the airgap and, consequently, the lateral penumbra. The comparative treatment-plan analysis also confirmed that reducing the lateral penumbra reduces unnecessary irradiation of normal tissues. Therefore, brain cancer proton therapy with posterior oblique beams should be preceded by reducing the airgap using the in-house BoS frame fixation tool, and continued efforts to reduce the airgap as much as possible will also be necessary for proton therapy of other sites.

Research of Patent Technology Trends in Textile Materials: Text Mining Methodology Using DETM & STM (섬유소재 분야 특허 기술 동향 분석: DETM & STM 텍스트마이닝 방법론 활용)

  • Lee, Hyun Sang;Jo, Bo Geun;Oh, Se Hwan;Ha, Sung Ho
    • The Journal of Information Systems
    • /
    • v.30 no.3
    • /
    • pp.201-216
    • /
    • 2021
  • Purpose: The purpose of this study is to analyze patent technology trends in textile materials using a text mining methodology based on the Dynamic Embedded Topic Model (DETM) and the Structural Topic Model (STM). Identifying these technology trends is expected to have a positive impact on revitalizing and developing the textile materials industry. Design/methodology/approach: The data used in this study are 866 domestic patent text documents in textile materials from 1974 to 2020. To analyze technology trends from various aspects, the Dynamic Embedded Topic Model and Structural Topic Model mechanisms were used. The word embedding technique used in DETM is GloVe. For stable topic-model training, amortized variational inference was performed based on a recurrent neural network. Findings: The analysis found that 'manufacture' topics had the largest share among the six topics. Keyword trend analysis showed that natural materials and nanotechnology have recently been attracting attention. The metadata analysis showed that manufacturing technologies had a high probability of patent registration over the entire time series, while the results for recent years showed an increasing trend for elasticity and safety technologies.
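
DETM and STM live in dedicated packages, but the core idea of tracking topic shares over time can be sketched with an ordinary LDA fitted per time window. The code below is a simplified, hypothetical stand-in, not the authors' pipeline (which used GloVe embeddings and RNN-based amortized variational inference); the corpus variables and window size are assumptions.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def topic_trends(docs, years, n_topics=6, window=10):
    """Fit one LDA per time window and report each topic's average share."""
    vectorizer = CountVectorizer(max_features=2000, stop_words="english")
    X = vectorizer.fit_transform(docs)
    trends = {}
    for start in range(min(years), max(years) + 1, window):
        idx = [i for i, y in enumerate(years) if start <= y < start + window]
        if len(idx) < n_topics:
            continue  # too few patents in this window to fit a model
        lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
        theta = lda.fit_transform(X[idx])   # document-topic proportions
        trends[start] = theta.mean(axis=0)  # average share per topic
    return trends

# Hypothetical usage: `docs` are patent abstracts, `years` the filing years.
# for window_start, shares in topic_trends(docs, years).items():
#     print(window_start, shares.round(3))
```

Note that topics fitted in separate windows are not aligned with each other; DETM avoids exactly this problem by sharing word embeddings across time, which is one reason the authors use it.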

Selective Word Embedding for Sentence Classification by Considering Information Gain and Word Similarity (문장 분류를 위한 정보 이득 및 유사도에 따른 단어 제거와 선택적 단어 임베딩 방안)

  • Lee, Min Seok;Yang, Seok Woo;Lee, Hong Joo
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.4
    • /
    • pp.105-122
    • /
    • 2019
  • Dimensionality reduction is one of the methods used to handle big data in text mining. For dimensionality reduction, we should consider the density of the data, which has a significant influence on the performance of sentence classification. Data of higher dimensions require heavy computation, which can eventually cause high computational cost and overfitting in the model. Thus, a dimension reduction process is necessary to improve the performance of the model. Diverse methods have been proposed, from merely reducing noise in the data, such as misspellings or informal text, to incorporating semantic and syntactic information. In addition, the representation and selection of text features affect the performance of the classifier for sentence classification, which is one of the fields of natural language processing. The common goal of dimension reduction is to find a latent space that is representative of the raw data in the observation space. Existing methods utilize various algorithms for dimensionality reduction, such as feature extraction and feature selection. In addition to these algorithms, word embeddings, which learn low-dimensional vector space representations of words that capture semantic and syntactic information, are also utilized. To improve performance, recent studies have suggested methods in which the word dictionary is modified according to the positive and negative scores of pre-defined words. The basic idea of this study is that similar words have similar vector representations: once a feature selection algorithm marks certain words as unimportant, we assume that words similar to them also have no impact on sentence classification. This study proposes two ways to achieve more accurate classification: conducting selective word elimination under specific rules and constructing word embeddings based on Word2Vec. To select words of low importance from the text, we use the information gain algorithm to measure importance and cosine similarity to search for similar words. First, we eliminate words with comparatively low information gain values from the raw text and form word embeddings. Second, we additionally select words that are similar to the words with low information gain values and build word embeddings. In the end, the filtered text and word embeddings are fed to the deep learning models: a Convolutional Neural Network and an attention-based bidirectional LSTM. This study uses customer reviews of Kindle products on Amazon.com, IMDB, and Yelp as datasets and classifies each dataset using the deep learning models. Reviews that received more than five helpful votes, with a helpful-vote ratio over 70%, were classified as helpful reviews; since Yelp only shows the number of helpful votes, we randomly sampled 100,000 reviews with more than five helpful votes from among 750,000 reviews. Minimal preprocessing, such as removing numbers and special characters from the text data, was applied to each dataset. To evaluate the proposed methods, we compared their performance with Word2Vec and GloVe embeddings built from all the words, and showed that one of the proposed methods outperforms the embeddings that use all the words: removing unimportant words yields better performance, although removing too many words lowers it. For future research, diverse preprocessing approaches and in-depth analysis of word co-occurrence for measuring similarity between words should be considered. Also, we applied the proposed method only with Word2Vec; other embedding methods such as GloVe, fastText, and ELMo can be combined with the proposed elimination methods, and the possible combinations of word embedding methods and elimination methods can be explored.
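
The elimination-plus-embedding procedure described above can be sketched compactly. The code below is a hypothetical illustration of the second proposed method, not the authors' code: it drops words with low information gain together with their Word2Vec nearest neighbors before the text is filtered. The quantile and similarity thresholds are assumptions.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import mutual_info_classif
from gensim.models import Word2Vec

def words_to_eliminate(texts, labels, ig_quantile=0.2, sim_threshold=0.7):
    """Find low-information-gain words plus their Word2Vec near-synonyms."""
    vectorizer = CountVectorizer(binary=True)
    X = vectorizer.fit_transform(texts)
    vocab = np.array(vectorizer.get_feature_names_out())

    # Information gain of each word with respect to the class labels.
    ig = mutual_info_classif(X, labels, discrete_features=True)
    low_ig = set(vocab[ig <= np.quantile(ig, ig_quantile)])

    # Train Word2Vec, then expand the set with words whose cosine
    # similarity to a low-IG word exceeds the threshold.
    w2v = Word2Vec([t.split() for t in texts], vector_size=100, min_count=2)
    expanded = set(low_ig)
    for word in low_ig:
        if word in w2v.wv:
            for similar, score in w2v.wv.most_similar(word, topn=10):
                if score >= sim_threshold:
                    expanded.add(similar)
    return expanded

# Usage sketch: filter the corpus, then train the embeddings and classifier
# (CNN or attention-based BiLSTM) on the remaining words.
# drop = words_to_eliminate(train_texts, train_labels)
# filtered = [" ".join(w for w in t.split() if w not in drop) for t in train_texts]
```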

Electro-optical Properties of ${Mg_{1-x}}{Zn_x}$O Thin Films Grown by a RF Magnetron Sputtering Method as a Protective Layer for AC PDPs (고주파 마그네트론 스퍼터링 방법으로 증착한 PDP용 ${Mg_{1-x}}{Zn_x}$O 보호막의 전기광학적 특성연구)

  • Jeong, Eun-Yeong;Lee, Sang-Geol;Lee, Do-Gyeong;Lee, Gyo-Jung;Son, Sang-Ho
    • Korean Journal of Materials Research
    • /
    • v.11 no.3
    • /
    • pp.197-202
    • /
    • 2001
  • ${Mg_{1-x}}{Zn_x}$O thin films with various ZnO compositions x were fabricated by an RF magnetron sputtering method; such films are expected to improve the electro-optical properties of the conventional MgO protective layer for AC-PDPs. Test panels with the ${Mg_{1-x}}{Zn_x}$O protective layer were fabricated in order to investigate the effects of ZnO doping on the electrical characteristics of the devices, such as the discharge voltages and the memory gain. Experimental results revealed that test panels with the ${Mg_{1-x}}{Zn_x}$O (x=0.5 at%) protective layer show firing and sustain voltages 20 V lower than those of panels with an MgO protective layer, resulting in an increase of the memory coefficient. In addition, it was found that test panels with the ${Mg_{1-x}}{Zn_x}$O protective layer show higher discharge intensity, i.e., higher plasma density, compared with panels with an MgO protective layer.

Calculation of the Peak-hour Ratio for Road Traffic Volumes using a Hybrid Clustering Technique (혼합군집분석 기법을 이용한 도로 교통량의 첨두율 산정)

  • Kim, Hyung-Joo;Chang, Justin S.
    • Journal of Korean Society of Transportation
    • /
    • v.30 no.1
    • /
    • pp.19-30
    • /
    • 2012
  • The majority of daily travel demand concentrates in particular time periods, which causes difficulties in travel demand analysis and the corresponding benefit estimation. Thus, it is necessary to consider time-specific traffic characteristics to yield more reliable results. Traditionally, naïve, heuristic, and statistical approaches have been applied to address the peak-hour ratio. In this study, a hybrid clustering model, one of the statistical methods, is applied to calculate the peak-hour ratio and its duration. The 2009 national 24-hour traffic data provided by the Korea Institute of Construction Technology are used. The analysis is conducted by dividing vehicle types into passenger cars and trucks. To verify the usefulness of the methodology, toll collection system data from the Korea Express Corporation were collected. The results show lower errors during off-peak hours and night times, and increasing error ratios as travel distance increases. Since the proposed method can reduce the arbitrariness of analysts and can accommodate statistical significance tests, it can be considered a more robust and stable methodology. It is hoped that the result of this paper contributes to enhancing the reliability of travel demand analysis.
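
The abstract does not spell out the peak-hour ratio formula or the clustering details, so the sketch below is one hypothetical reading: 24-hour traffic profiles are clustered (plain k-means here as a stand-in for the paper's hybrid clustering), and the peak-hour ratio is computed per cluster as the busiest hour's share of daily volume.

```python
import numpy as np
from sklearn.cluster import KMeans

def peak_hour_ratio(profile):
    """Share of daily volume occurring in the single busiest hour."""
    return profile.max() / profile.sum()

def cluster_peak_ratios(hourly_counts, n_clusters=4):
    """Cluster normalized 24-hour profiles; report each cluster's mean ratio.

    hourly_counts : array of shape (n_sites, 24), vehicles per hour per site
    """
    shapes = hourly_counts / hourly_counts.sum(axis=1, keepdims=True)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(shapes)
    return {
        k: np.mean([peak_hour_ratio(p) for p in hourly_counts[labels == k]])
        for k in range(n_clusters)
    }

# Hypothetical usage with synthetic count stations:
rng = np.random.default_rng(2)
counts = rng.poisson(lam=rng.uniform(50, 500, size=(30, 24)))
print(cluster_peak_ratios(counts))
```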

Comparison of Gas Exchange Parameters between Same Volume of $N_2-O_2$ and Heliox Inhalation (동일한 상시 호흡량의 $N_2-O_2$ 및 Heliox 투여 시 가스교환지표의 비교)

  • Sohn, Jang-Won;Lim, Chae-Man;Koh, Youn-Suck;Lee, Jong-Deog;Lee, Sang-Do;Kim, Woo-Sung;Kim, Dong-Soon;Kim, Won-Dong
    • Tuberculosis and Respiratory Diseases
    • /
    • v.45 no.1
    • /
    • pp.169-175
    • /
    • 1998
  • Background: Heliox is known to decrease $PaCO_2$ in patients with increased airway resistance by increasing minute ventilation and reducing the work of breathing (WOB). Besides these effects, heliox is expected to decrease functional anatomic dead space owing to improvement of the peak expiratory flow rate (PEFR) and enhancement of gas distribution. We investigated whether heliox can decrease $PaCO_2$ even at the same minute ventilation (VE) and WOB as $N_2-O_2$, in order to speculate on the effect of heliox on anatomic dead space. Materials and Methods: The subjects were 8 mechanically ventilated patients with asthma or upper airway obstruction (M:F = 5:3, $68{\pm}10$ years) who were under neuromuscular paralysis. The study consisted of three 15-minute phases: basal $N_2-O_2$, heliox, and washout. Heliox was administered via the low-pressure inlet of a Servo 900C, and respiratory parameters were measured by a pulmonary monitor (CP-100 pulmonary monitor, Bicore, Irvine, CA, USA). To obtain the same tidal volume (Vt) in the heliox phase, the Vt on the monitor was adjusted by the factor of the relative flow rate of heliox to $N_2-O_2$. Dead space was calculated by the Bohr equation. Results: 1) Vt, VE, peak inspiratory pressure (PIP), and peak inspiratory flow rate (PIFR) were not different between $N_2-O_2$ and heliox. 2) PEFR was higher on heliox ($0.52{\pm}0.19$ L/sec) than on $N_2-O_2$ ($0.44{\pm}0.13$ L/sec) (p=0.024). 3) $PaCO_2$ (mmHg) was decreased with heliox ($56.1{\pm}14.1$) compared to $N_2-O_2$ ($60.5{\pm}15.9$) (p=0.027). 4) Dead space ventilation (%) was decreased with heliox ($73{\pm}9$ with $N_2-O_2$ and $71{\pm}10$ with heliox) (p=0.026). Conclusion: Heliox decreased $PaCO_2$ even at the same VE and WOB as $N_2-O_2$, and the effect was considered to be related to the reduction of anatomic dead space.
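
The abstract applies the Bohr equation without writing it out. In the Enghoff modification commonly used at the bedside (an assumption here, since the abstract does not state which variant was used), the dead-space fraction is

$$\frac{V_D}{V_T} \;=\; \frac{PaCO_2 - P_{\bar{E}}CO_2}{PaCO_2},$$

where $P_{\bar{E}}CO_2$ is the mixed expired $CO_2$ partial pressure. At a fixed Vt and VE, a smaller anatomic dead space lowers $V_D/V_T$, increases alveolar ventilation, and therefore lowers $PaCO_2$, consistent with the results above.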
