• Title/Summary/Keyword: Kernel Level

Search Results: 294

The Price of Risk in the Korean Stock Distribution Market after the Global Financial Crisis (글로벌 금융위기 이후 한국 주식유통시장의 위험가격에 관한 연구)

  • Sohn, Kyoung-Woo; Liu, Won-Suk
    • Journal of Distribution Science / v.13 no.5 / pp.71-82 / 2015
  • Purpose - The purpose of this study is to investigate the risk price implied by the pricing kernel of the Korean stock distribution market. Recently, the quantitative easing programs of major developed countries have been considered to be contributing to a reduction in the global uncertainty caused by the 2007~2009 financial crisis. If true, the risk premium as compensation for global systemic risk or economic uncertainty should show a decrease. We examine whether the risk price in the Korean stock distribution market has declined in recent years, and attempt to provide practical implications for investors to manage their portfolios more efficiently, as well as academic implications. Research design, data and methodology - To estimate the risk price, we adopt a non-parametric method: the minimum-norm pricing kernel method under the LOP (Law of One Price) constraint. For the estimation, we use 17 industry-sorted portfolios provided by the KRX (Korea Exchange). The monthly returns of the 17 industry-sorted portfolios, from July 2000 to June 2014, are utilized as data samples. We set 120 months (10 years) as the estimation window, and estimate the risk price month by month from July 2010 to June 2014. Moreover, we analyze the correlation between each pair of the 17 industry portfolios to suggest further economic implications of the risk price we estimate. Results - According to our results, the risk price in the Korean stock distribution market shows a statistically significant decline over the period from July 2010 to June 2014. During the period of declining risk price, the average pairwise correlation between industry portfolios also decreases, whereas the standard deviation of the average correlation increases. The results imply that the amount of systematic risk in the Korean stock distribution market has decreased, whereas the amount of industry-specific risk has increased. It is a well-known empirical result that correlation and uncertainty are positively related; the declining correlation may therefore be the result of decreased global economic uncertainty. Meanwhile, lower asset correlation enables investors to build portfolios with less systematic risk, so investors require lower risk premiums for the efficient portfolio, resulting in the declining risk price. Conclusions - Our results may provide evidence of a reduction in global systemic risk or economic uncertainty in the Korean stock distribution market. However, further analysis is needed to defend this argument. For instance, the change in global uncertainty could be measured with funding costs in the global money market; subsequently, the relation between global uncertainty and the price of risk might be directly observable. In addition, as time goes by, observations of the risk price could be extended, enabling us to confirm the relation between global uncertainty and the effect of quantitative easing. These topics are beyond our scope here, so we reserve them for future research.
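
As an illustration of the estimation step described above, here is a minimal Python sketch of the minimum second-moment pricing kernel under the LOP constraint E[mR_i] = 1, computed from a panel of gross returns. The data is synthetic (standing in for the KRX industry portfolios), and taking kernel volatility per unit mean as the "risk price" is an assumption about the paper's exact definition, motivated by the Hansen-Jagannathan bound.

```python
import numpy as np

def min_norm_pricing_kernel(gross_returns):
    """Minimum second-moment pricing kernel in the span of returns,
    under the LOP constraint E[m * R_i] = 1 for every portfolio i.
    gross_returns: (T, N) array of monthly gross returns (1 + r).
    Returns the (T,) series m_t = w' R_t with w = E[R R']^{-1} 1."""
    R = np.asarray(gross_returns)
    T, N = R.shape
    second_moment = R.T @ R / T               # sample E[R R'], (N, N)
    w = np.linalg.solve(second_moment, np.ones(N))
    return R @ w                              # one kernel value per month

# Synthetic stand-in for the paper's data: 120 monthly observations
# (a 10-year window) of 17 industry-sorted portfolio gross returns.
rng = np.random.default_rng(0)
R = 1.01 + 0.05 * rng.standard_normal((120, 17))
m = min_norm_pricing_kernel(R)

# One common risk-price measure: kernel volatility per unit mean,
# the Hansen-Jagannathan bound on the attainable Sharpe ratio.
print(f"E[m] = {m.mean():.4f}, risk price = {m.std() / m.mean():.4f}")
```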

The Impact of the PCA Dimensionality Reduction for CNN based Hyperspectral Image Classification (CNN 기반 초분광 영상 분류를 위한 PCA 차원축소의 영향 분석)

  • Kwak, Taehong; Song, Ahram; Kim, Yongil
    • Korean Journal of Remote Sensing / v.35 no.6_1 / pp.959-971 / 2019
  • CNN (Convolutional Neural Network) is a representative deep learning algorithm that can extract high-level spatial and spectral features, and it has been applied to hyperspectral image classification. However, one significant drawback to applying CNNs to hyperspectral images is the high dimensionality of the data, which increases training time and processing complexity. To address this problem, several CNN-based hyperspectral image classification studies have exploited PCA (Principal Component Analysis) for dimensionality reduction. One limitation is that spectral information of the original image can be lost through PCA. Although it is clear that the use of PCA affects accuracy and CNN training time, the impact of PCA on CNN-based hyperspectral image classification has been understudied. The purpose of this study is to analyze the quantitative effect of PCA in CNNs for hyperspectral image classification. The hyperspectral images were first transformed through PCA and fed into the CNN model while varying the size of the reduced dimensionality. In addition, 2D-CNN and 3D-CNN frameworks were applied to analyze the sensitivity of PCA with respect to the convolution kernel in the model. Experimental results were evaluated based on classification accuracy, learning time, variance ratio, and training process. The reduced dimensionality was most efficient when the explained variance ratio reached 99.7%~99.8%. Since the 3D kernel achieved higher classification accuracy with the original CNN than with the PCA-CNN, in contrast to the 2D kernel, the results revealed that dimensionality reduction was relatively less effective for the 3D kernel.
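
A minimal sketch of the PCA step described above, using scikit-learn on a synthetic cube in place of real hyperspectral data; passing a float to n_components keeps the smallest number of components whose cumulative explained variance ratio reaches the threshold (the paper's most efficient setting was 99.7%~99.8%).

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for a hyperspectral cube: 64x64 pixels, 200 bands.
H, W, B = 64, 64, 200
cube = np.random.rand(H, W, B).astype(np.float32)
pixels = cube.reshape(-1, B)        # one spectrum per row, (H*W, B)

# A float n_components keeps the smallest number of components whose
# cumulative explained variance ratio reaches the given threshold.
pca = PCA(n_components=0.997)
reduced = pca.fit_transform(pixels)
print("components kept:", pca.n_components_)
print("explained variance ratio:", pca.explained_variance_ratio_.sum())

# Reshaped back to a cube; patches from this reduced cube would feed
# the 2D-CNN or 3D-CNN instead of the full 200-band input.
reduced_cube = reduced.reshape(H, W, -1)
```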

Multiple Cause Model-based Topic Extraction and Semantic Kernel Construction from Text Documents (다중요인모델에 기반한 텍스트 문서에서의 토픽 추출 및 의미 커널 구축)

  • 장정호; 장병탁
    • Journal of KIISE:Software and Applications / v.31 no.5 / pp.595-604 / 2004
  • Automatic analysis of concepts or semantic relations from text documents enables not only efficient acquisition of relevant information, but also comparison of documents at the concept level. We present a multiple cause model-based approach to text analysis, where latent topics are automatically extracted from document sets and similarity between documents is measured by semantic kernels constructed from the extracted topics. In our approach, a document is assumed to be generated by various combinations of underlying topics. A topic is defined by a set of words that are related to the same theme or co-occur frequently within a document. In a network representing a multiple cause model, each topic is identified by a group of words having high connection weights from a latent node. In order to facilitate learning and inference in multiple cause models, some approximation methods are required, and we utilize an approximation by Helmholtz machines. In experiments on the TDT-2 data set, we extract sets of meaningful words, where each set contains theme-specific terms. Using semantic kernels constructed from latent topics extracted by multiple cause models, we also achieve significant improvements over the basic vector space model in terms of retrieval effectiveness.
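
The semantic-kernel idea can be illustrated with a toy sketch. The term-by-topic weight matrix below is hand-set, standing in for the connection weights a multiple cause model (trained via a Helmholtz machine) would learn; the kernel is an inner product taken after projecting bag-of-words counts onto the topic space.

```python
import numpy as np

# Toy term-by-topic weight matrix W (V terms x K latent topics); the
# values are invented for illustration, not learned from data.
vocab = ["stock", "market", "risk", "image", "pixel", "band"]
W = np.array([
    [0.9, 0.0],   # stock  -> topic 0 (finance)
    [0.8, 0.1],   # market
    [0.7, 0.0],   # risk
    [0.0, 0.9],   # image  -> topic 1 (imaging)
    [0.1, 0.8],   # pixel
    [0.0, 0.7],   # band
])

def semantic_kernel(d1, d2, W):
    """Document similarity through the latent-topic space: documents
    are bag-of-words count vectors, and the kernel is <W'd1, W'd2>,
    i.e. the inner product after projecting counts onto topic weights."""
    t1, t2 = W.T @ d1, W.T @ d2
    return float(t1 @ t2)

d_a = np.array([2, 1, 1, 0, 0, 0])  # finance-flavoured document
d_b = np.array([1, 2, 0, 0, 0, 0])
d_c = np.array([0, 0, 0, 2, 1, 1])  # imaging-flavoured document
print(semantic_kernel(d_a, d_b, W))  # high: shared latent topic
print(semantic_kernel(d_a, d_c, W))  # near zero: different topics
```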

Groundwater level behavior analysis using kernel density estimation (비모수 핵밀도 함수를 이용한 지하수위 거동분석)

  • Jeong, Ji Hye; Kim, Jong Wook; Lee, Jeong Ju; Chun, Gun Il
    • Proceedings of the Korea Water Resources Association Conference / 2017.05a / pp.381-381 / 2017
  • The impact of climate change on water resources is known to increase the frequency and variability of extreme hydrological events such as floods and droughts. Accordingly, to monitor and mitigate droughts, whose frequency and severity have increased compared with previous years, the government, jointly with related agencies including the Ministry of Public Safety and Security, provides drought information by sector, covering domestic, industrial, and agricultural water. To provide drought information for domestic and industrial water, the Ministry of Land, Infrastructure and Transport and the Ministry of Environment analyze water supply and demand for areas served by regional and local waterworks as well as for unserved areas relying on village waterworks and small-scale water supply facilities. For unserved areas, however, drought forecasts and warnings are produced from the meteorological drought index SPI6, because no reference information on their water sources is available. Meteorological drought conditions differ from the drought actually perceived through water shortages, and since most unserved areas rely on groundwater as their main water source, drought information based on SPI6 is insufficient to reflect the actual water supply situation. This study therefore aimed to develop a drought monitoring technique that reflects groundwater levels, the main water source of unserved areas; considering that estimating the available groundwater volume is practically difficult, drought was monitored through statistical analysis of water level behavior. After selecting national groundwater monitoring stations with more than 10 years of records and high correlation with rainfall, the daily water level observations were separated by month, and kernel density estimation was applied to each month from January to December to derive the monthly distribution characteristics of groundwater levels. Using percentiles of the observed water level distribution at each station, drought stages were classified as normal (25%~100%), caution (10%~25%), severe drought (5%~10%), and very severe (below 5%). The water level corresponding to each percentile was computed from the estimated kernel density and quantile function, and the average water level over the most recent 10 days was taken as the current level to classify the degree of drought. The results were spatially distributed from the stations by inverse distance weighting, and, to reflect hydrological and geological homogeneity, a groundwater level grade distribution map representing nationwide groundwater drought conditions was produced through spatial operations overlaying watershed and hydrogeological maps. To analyze the correlation with actual drought conditions, drought periods identified in news articles were compared and verified against groundwater drought periods (percentile below 25%) using ROC (receiver operating characteristic) analysis.
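
A minimal sketch of the percentile-based classification described above, with synthetic data in place of national groundwater monitoring records: a kernel density is fitted to one month's daily levels, the quantile function is built numerically, and the mean of the most recent 10 daily levels is classified against the paper's percentile bands.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
march_levels = rng.normal(12.0, 0.8, 300)  # synthetic daily levels (m)

kde = gaussian_kde(march_levels)

# Build the quantile function numerically: accumulate the density on a
# grid to approximate the CDF, then interpolate percentiles to levels.
grid = np.linspace(march_levels.min() - 2, march_levels.max() + 2, 2000)
cdf = np.cumsum(kde(grid))
cdf /= cdf[-1]

def level_at_percentile(p):
    return np.interp(p, cdf, grid)

thresholds = {p: level_at_percentile(p) for p in (0.05, 0.10, 0.25)}

def drought_stage(recent_levels):
    """Classify drought using the paper's percentile bands, taking the
    mean of the most recent 10 daily levels as the current level."""
    current = np.mean(recent_levels[-10:])
    if current < thresholds[0.05]:
        return "very severe"
    if current < thresholds[0.10]:
        return "severe drought"
    if current < thresholds[0.25]:
        return "caution"
    return "normal"

print(drought_stage(rng.normal(10.9, 0.1, 10)))
```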

Performance Comparison of Virtualization Domain in User Level Virtualization (사용자 레벨 가상화에서 가상화 영역 성능 비교)

  • Jeong, Chan-Joo; Kang, Tae-Geun
    • The Journal of the Korea institute of electronic communication sciences / v.8 no.11 / pp.1741-1748 / 2013
  • In this paper, we propose a new virtualization technology that is more convenient and stable in a local computing environment, and we identify the technical elements required for client-based desktop virtualization among the various virtualization technologies. Running the Process Explorer utility under user-level virtualization and under VMware and comparing the private bytes of each process, we found that VMware used 30.1 MB of memory while user-level virtualization used 16.6 MB. We found no significant difference in CPU utilization between an application program executed in the local computing environment and one executed in a user domain under user-level virtualization. These results show that the proposed virtualization technology can minimize performance degradation of the local computing environment.
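
For readers who want to reproduce this kind of comparison, a hedged sketch using psutil: the process names are placeholders (the real names depend on the products compared), and on non-Windows platforms "private bytes" is approximated by RSS, since the Process Explorer counter is Windows-specific.

```python
import psutil

def private_bytes(pid):
    """Best-effort 'private bytes': psutil exposes a 'private' field on
    Windows (matching the Process Explorer counter); on other platforms
    fall back to RSS as a rough stand-in."""
    mem = psutil.Process(pid).memory_info()
    return getattr(mem, "private", mem.rss)

# Placeholder process names for the two virtualization approaches.
targets = {"vmware-vmx.exe", "user-level-virt.exe"}
for proc in psutil.process_iter(["pid", "name"]):
    if proc.info["name"] in targets:
        print(proc.info["name"],
              f"{private_bytes(proc.info['pid']) / 2**20:.1f} MiB")
```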

Development of Kernel based High Speed Packet Filtering Imbedded Gateway and Firewall Using Cloud Database (클라우드 데이터베이스를 이용한 커널 기반 고속 패킷필터링 임베디드 게이트웨이 및 방화벽 개발)

  • Park, Daeseung; Kim, Soomin; Yoo, Hanseob; Moon, Songchul
    • Journal of Service Research and Studies / v.5 no.1 / pp.57-70 / 2015
  • This paper develops a kernel-based high-speed packet-filtering embedded gateway and firewall using a cloud database. The equipment includes a prediction function based on big-data analysis using a cloud system, intrusion prevention against network attacks, and content-based L7 switch security functions. This study can improve the security level of small companies and ordinary households, pioneer a new market, and yield a high-performance switch that can replace existing security equipment. A new next-generation algorithm is proposed for constructing a high-performance system from low-specification hardware.
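
The decision logic of packet filtering can be illustrated with a toy user-space sketch (the system described above runs in the kernel): first-match rule evaluation over source network and destination port, with all rules and addresses invented for the example.

```python
from dataclasses import dataclass
from ipaddress import ip_address, ip_network
from typing import Optional

@dataclass(frozen=True)
class Rule:
    network: str          # source network, e.g. "10.0.0.0/8"
    port: Optional[int]   # destination port; None matches any port
    action: str           # "ACCEPT" or "DROP"

RULES = [
    Rule("192.168.0.0/16", None, "ACCEPT"),  # trust the LAN
    Rule("0.0.0.0/0", 23, "DROP"),           # block telnet from anywhere
    Rule("0.0.0.0/0", None, "ACCEPT"),       # default accept
]

def filter_packet(src_ip: str, dst_port: int) -> str:
    """First-match rule evaluation, the per-packet decision a kernel
    packet filter makes (run here in user space for illustration)."""
    for rule in RULES:
        if (ip_address(src_ip) in ip_network(rule.network)
                and rule.port in (None, dst_port)):
            return rule.action
    return "DROP"

print(filter_packet("192.168.1.5", 23))   # ACCEPT: LAN rule matches first
print(filter_packet("203.0.113.7", 23))   # DROP: telnet blocked
```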

Current Issues for ROK Defense Modeling & Simulation Scheme under the Transition of New HLA Simulation Architecture (HLA 모의구조전환에 따른 한국군 DM&S 발전방안)

  • 이상헌
    • Journal of the military operations research society of Korea / v.26 no.2 / pp.101-119 / 2000
  • The US DoD designated the High Level Architecture (HLA) as the standard technical architecture for all military simulation in 1996. HLA will supersede the current Distributed Interactive Simulation (DIS) and Aggregate Level Simulation Protocol (ALSP) methods, as no funds will be provided for developing or modifying non-HLA-compliant simulations. The new architecture specifies Rules, which define relationships among federation components; an Object Model Template, which specifies the form in which simulation elements are described; and an Interface Specification, which describes the way simulations interact during operations. HLA has been adopted as the standard architecture by NATO, Australia, and many other militaries, and it will become an IEEE standard in the near future. It goes without saying that the ROK military, whose simulation models come almost entirely from the US, must be prepared in areas such as ROK-US combined exercises, training, weapon system acquisition, interface models with C4I systems, OPLAN analysis, operations, and so on. In this paper, we propose several effective alternatives and issues for ROK Defense Modeling and Simulation under the transition to the new HLA architecture. These include securing the kernel of the new simulation technology and developing our own conceptual model, RTI software, and a prototype federation for each service as well as an aggregated one. In order to meet the new simulation architecture effectively, we should innovate our current defense modeling and simulation infrastructure, such as manpower, organization, budget, research environment, relationships with academia and industry, and many others.

Eager Data Transfer Mechanism for Reducing Communication Latency in User-Level Network Protocols

  • Won, Chul-Ho; Lee, Ben; Park, Kyoung; Kim, Myung-Joon
    • Journal of Information Processing Systems / v.4 no.4 / pp.133-144 / 2008
  • Clusters have become a popular alternative for building high-performance parallel computing systems. Today's high-performance system area network (SAN) protocols, such as VIA and IBA, significantly reduce user-to-user communication latency by implementing protocol stacks outside of the operating system kernel. However, emerging parallel applications require a significant further improvement in communication latency. Since the time required for transferring data between host memory and the network interface (NI) makes up a large portion of overall communication latency, reducing data transfer time is crucial for achieving low-latency communication. In this paper, an Eager Data Transfer (EDT) mechanism is proposed to reduce the time for data transfers between the host and the network interface. EDT employs cache coherence interface hardware to directly transfer data between the host and the NI. An EDT-based network interface was modeled and simulated on the Linux-based, complete system simulation environment, Linux/SimOS. Our simulation results show that the EDT approach significantly reduces data transfer time compared to DMA-based approaches. The EDT-based NI attains a 17% to 38% reduction in user-to-user message time compared to cache-coherent DMA-based NIs for a range of message sizes (64 bytes to 4 Kbytes) in a SAN environment.
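
A back-of-envelope sketch of why host-NI transfer time dominates short-message latency and what eliminating per-transfer DMA setup buys; every constant below is an illustrative assumption, not a measurement from the paper.

```python
# Toy latency model: a message pays a host->NI transfer at the sender,
# wire/switch time, and an NI->host transfer at the receiver.
DMA_SETUP_US = 1.5       # assumed doorbell + descriptor cost per transfer
EDT_SETUP_US = 0.2       # assumed near-zero cost for coherence-driven EDT
COPY_US_PER_KB = 0.8     # assumed host <-> NI transfer cost
WIRE_US_PER_KB = 0.5     # assumed link + switch time
FIXED_WIRE_US = 2.0

def message_time_us(size_kb, setup_us):
    host_to_ni = setup_us + COPY_US_PER_KB * size_kb
    ni_to_host = setup_us + COPY_US_PER_KB * size_kb
    wire = FIXED_WIRE_US + WIRE_US_PER_KB * size_kb
    return host_to_ni + wire + ni_to_host

for size_kb in (0.0625, 1, 4):   # 64 bytes to 4 KB, the paper's range
    dma = message_time_us(size_kb, DMA_SETUP_US)
    edt = message_time_us(size_kb, EDT_SETUP_US)
    print(f"{size_kb:>7} KB: DMA {dma:5.1f} us, EDT {edt:5.1f} us, "
          f"saving {100 * (1 - edt / dma):4.1f}%")
```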

Analysing and Neutralizing the Stuxnet's Stealthing Techniques (Stuxnet의 파일 은닉 기법 분석 및 무력화 방법 연구)

  • Lee, Kyung-Roul; Yim, Kang-Bin
    • Journal of Advanced Navigation Technology / v.14 no.6 / pp.838-844 / 2010
  • This paper introduces Stuxnet, a piece of malware that is presently raising the severity of cyber warfare worldwide, analyzes how it propagates and what it affects when a system is infected, and proposes a process to cure infected systems according to its organization. Malware such as Stuxnet hides itself within the system during propagation, so the file-hiding techniques it uses must be analyzed in order to detect and remove it. According to the analysis in this paper, Stuxnet hides its files using the library hooking technique at the user level and the file system filter driver technique at the kernel level. This paper therefore presents the results of analyzing Stuxnet's file-hiding approach and proposes an idea for a countermeasure to neutralize it. A pilot implementation of the idea shows that the stealthing techniques of Stuxnet are removed by the implementation.
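
The user-level half of the technique can be mimicked in a few lines of Python: a library hook filters a name out of directory listings, and a cross-view comparison against the unhooked API reveals the hidden file, which is the neutralization idea in miniature. The real malware hooks Windows libraries and adds a kernel file-system filter driver; the filename here is hypothetical.

```python
import os

HIDDEN = {"malicious.sys"}        # name the "rootkit" wants to hide

_real_listdir = os.listdir        # keep the unhooked entry point

def hooked_listdir(path="."):
    """User-level library hook in the spirit of the paper's analysis:
    the directory-listing API is wrapped so hidden names never appear."""
    return [n for n in _real_listdir(path) if n not in HIDDEN]

os.listdir = hooked_listdir       # install the hook (Python analogue
                                  # of library/IAT hooking on Windows)

def cross_view_diff(path="."):
    """Neutralization idea: compare the hooked view against a trusted
    lower-level view; names present only in the latter are hidden."""
    return set(_real_listdir(path)) - set(os.listdir(path))

open("malicious.sys", "w").close()   # plant a file to hide
print("hooked view hides it:", "malicious.sys" not in os.listdir("."))
print("cross-view reveals:", cross_view_diff("."))
os.remove("malicious.sys")
os.listdir = _real_listdir           # uninstall the hook
```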

Delay optimization algorithm on FPGAs (FPGA 에 대한 지연시간 최적화 알고리듬)

  • Hur Chang-Wu; Kim Nam-Woo
    • Journal of the Korea Institute of Information and Communication Engineering / v.10 no.7 / pp.1259-1265 / 2006
  • In this paper, we propose a combined logic-level synthesis algorithm for high-speed FPGA design. The algorithm divides the critical path to reduce delay time and generates a circuit in which the divided subcircuits execute simultaneously. This kernel selection algorithm is implemented in the C language on SUN UNIX. We compare it with the existing FlowMap algorithm; the proposed algorithm reduces delay time by 33.3% compared with the existing algorithm.
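
The quantity being optimized, critical-path delay, can be sketched in a few lines: a longest-path computation over a toy gate-level DAG (structure and delays invented for illustration), which is what the proposed algorithm reduces by splitting the critical path into concurrently evaluated pieces.

```python
from graphlib import TopologicalSorter

# Toy combinational netlist: gate -> (inputs, gate delay in ns).
NETLIST = {
    "a": ((), 0.0), "b": ((), 0.0),   # primary inputs
    "g1": (("a", "b"), 1.0),
    "g2": (("g1",), 1.0),
    "g3": (("g1",), 1.0),
    "out": (("g2", "g3"), 1.0),
}

def arrival_times(netlist):
    """Longest-path (critical path) arrival time at every node, the
    delay a speed-oriented mapper tries to shrink."""
    order = TopologicalSorter({n: ins for n, (ins, _) in netlist.items()})
    at = {}
    for node in order.static_order():
        ins, delay = netlist[node]
        at[node] = delay + max((at[i] for i in ins), default=0.0)
    return at

at = arrival_times(NETLIST)
print("critical path delay:", at["out"], "ns")
```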