• Title/Summary/Keyword: Computer Principal

A Study on Forest Land Classification Using Multivariate Statistical Methods : A Case Study at Mt. Kwanak (다변수통계방법을 이용한 산지분류에 관한 연구)

  • 정순오
    • Journal of the Korean Institute of Landscape Architecture
    • /
    • v.13 no.1
    • /
    • pp.43-66
    • /
    • 1985
  • Korea needs proper and rational public policies on the conservation and use of forest land and other natural resources because of the accelerating expansion of national land development in recent years. Unfortunately, there is no systematic planning system to support these needs. In general, forest land use planning requires suitability analysis based on an efficient land classification system. The goal of this study was to classify forest land using multivariate statistical methods. A case study was carried out in the winter of 1983 on a mountainous area higher than 100 m above sea level at Mt. Kwanak in Anyang City, Kyung-gi-do (province). The study area covered 19.80 km² and was divided into 1,383 Operational Taxonomic Units (OTUs) by a 120 m × 120 m grid. Fourteen descriptors were identified and quantified for each OTU from existing national land data: elevation, slope, aspect, terrain form, geologic material, surface soil permeability, topsoil type, depth of the solum, soil acidity, forest cover type, stand size class, stand age class, stand density class, and simple forest soil capability class. For this study, a FORTRAN IV program was written for map data input and output, and the statistics packages SPSS and BMD were used to perform the multivariate statistical analysis. The fourteen variables were analyzed to investigate the characteristics of their frequency distributions and to estimate the correlation coefficients among them. Principal component analysis was carried out to find the dimensions of forest land characteristics, and factor scores were used to select representative OTU samples throughout the study area. In order to develop the classes of the forest land classification based on 102 surrogates, cluster and discriminant analyses of the principal descriptor variable matrix were undertaken. The results obtained through this series of multivariate statistical analyses were as follows: 1) Principal component analysis proved to be a useful tool for data selection and for identifying the principal descriptor variables that represented the characteristics of the forest land and facilitated the selection of samples.

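For readers who want a concrete picture of the analysis chain in this entry (PCA of per-OTU descriptors, followed by cluster and discriminant analysis), the sketch below approximates it with scikit-learn. The study itself used FORTRAN IV, SPSS, and BMD; the descriptor matrix, component count, and cluster count here are placeholders, not the paper's values.

```python
# A rough modern re-creation of the analysis chain in the abstract above:
# PCA on per-OTU descriptors, clustering of the component scores, and a
# discriminant model for the resulting land classes.  Synthetic data stands
# in for the 1,383 OTUs x 14 descriptors used in the study.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(1383, 14))          # placeholder descriptor matrix

# 1) Principal component analysis of the standardized descriptors
Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=5)                # number of components is assumed
scores = pca.fit_transform(Z)

# 2) Cluster analysis on the component scores
clusters = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(scores)

# 3) Discriminant analysis to characterize and validate the classes
lda = LinearDiscriminantAnalysis().fit(scores, clusters)
print("explained variance ratio:", pca.explained_variance_ratio_.round(2))
print("re-classification accuracy:", lda.score(scores, clusters))
```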

Fault Diagnosis of Induction Motor Using Clustering and Principal Component Analysis (클러스터링과 주성분 분석기법을 이용한 유도전동기 고장진단)

  • Park Chan-Won;Lee Dae-Jong;Park Sung-Moo;Chun Myung-Geun
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2006.05a
    • /
    • pp.208-211
    • /
    • 2006
  • This paper proposes a pattern-recognition-based diagnosis algorithm for the fault diagnosis of three-phase induction motors. An experimental setup was built to acquire fault signals from an induction motor drive, and the diagnosis algorithm was constructed from the acquired data. From these data, reliable training data were selected for each fault type using a fuzzy clustering technique. The diagnosis algorithm applies principal component analysis to the data, and a Euclidean distance measure is used for the final classification. The validity of the proposed method was verified by applying it to various load conditions and fault signals.

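The diagnosis stage of this entry (PCA feature extraction followed by Euclidean-distance classification) can be sketched as follows. The fuzzy-clustering selection of training data is not reproduced; the class names, signal dimensions, and all data are assumptions.

```python
# Minimal sketch of the PCA + Euclidean-distance diagnosis stage described
# above.  The paper selects reliable training samples with fuzzy clustering;
# here the training sets are simply taken as given, and all signals are
# synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
classes = ["healthy", "broken_bar", "bearing_fault"]      # assumed fault types
train = {c: rng.normal(loc=i, size=(50, 64)) for i, c in enumerate(classes)}

# PCA trained on all training signals
X = np.vstack(list(train.values()))
pca = PCA(n_components=8).fit(X)
centroids = {c: pca.transform(v).mean(axis=0) for c, v in train.items()}

def diagnose(signal):
    """Assign the class whose PCA-space centroid is nearest (Euclidean)."""
    z = pca.transform(signal.reshape(1, -1))[0]
    return min(centroids, key=lambda c: np.linalg.norm(z - centroids[c]))

print(diagnose(rng.normal(loc=2, size=64)))   # expected: "bearing_fault"
```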

A Feature Analysis of the Power Quality Problem by PCA (PCA를 이용한 전력품질 특징분석)

  • Lee, Jin-Mok;Hong, Duc-Pyo;Kim, Soo-Cheol;Choi, Jae-Ho;Hong, Hyun-Mun
    • Proceedings of the KIPE Conference
    • /
    • 2005.07a
    • /
    • pp.192-194
    • /
    • 2005
  • The spread of nonlinear loads and compensation equipment has made power quality (PQ) an important issue. Several studies have applied signal processing and pattern classification techniques such as neural networks (NN), the wavelet transform, and fuzzy methods to feature extraction. However, a large number of input features does not always produce good results and makes real-time operation difficult, so dimensionality reduction is an indispensable step. PCA (Principal Component Analysis) effectively projects high-dimensional input features onto a lower-dimensional subspace, which makes it useful for real-time systems and NN-based classifiers.

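A minimal sketch of the pipeline this abstract argues for, PCA-based dimensionality reduction in front of a neural-network classifier, is given below; the feature dimension, component count, and disturbance labels are placeholders rather than the paper's setup.

```python
# Sketch of PCA-based dimensionality reduction feeding a neural-network
# classifier for power-quality disturbances.  Waveform features and class
# labels are synthetic.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)
X = rng.normal(size=(600, 128))            # placeholder disturbance features
y = rng.integers(0, 4, size=600)           # e.g. sag / swell / harmonic / normal

clf = make_pipeline(
    StandardScaler(),
    PCA(n_components=10),                  # reduced feature subspace
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0),
)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```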

Quality Inspection of Dented Capsule using Curve Fitting-based Image Segmentation

  • Kwon, Ki-Hyeon;Lee, Hyung-Bong
    • Journal of the Korea Society of Computer and Information
    • /
    • v.21 no.12
    • /
    • pp.125-130
    • /
    • 2016
  • Automatic quality inspection based on computer vision can provide a solution in the pharmaceutical industry. Pharmaceutical capsules are easily affected by flaws such as dents, cracks, and holes. Solving this inspection problem requires computationally efficient image processing techniques such as thresholding, boundary edge detection, and segmentation; some automated systems are available, but they are very expensive. In this paper, we develop a dented-capsule image processing technique using edge-based image segmentation and TLS (Total Least Squares) curve fitting, and adopt a low-cost camera module for capturing capsule images. We evaluate the accuracy, training time, and testing time of the classification algorithms PCA (Principal Component Analysis), ICA (Independent Component Analysis), and SVM (Support Vector Machine). The results show that PCA and ICA have low accuracy, whereas SVM is accurate enough to classify dented capsules.
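
The TLS curve fitting mentioned in this entry can be illustrated with the simplest case, a total least squares line fit to boundary points via SVD; the paper's actual curve model and segmentation pipeline are not reproduced here.

```python
# Total least squares (TLS) fit of a line to boundary points, the kind of
# curve fitting the abstract applies to capsule edges.  A straight-line fit
# stands in for the paper's (unspecified) curve model.
import numpy as np

def tls_line_fit(points):
    """Fit a line n . (p - centroid) = 0 minimizing orthogonal distances."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                      # direction of least variance
    return centroid, normal

def max_orthogonal_deviation(points, centroid, normal):
    """A dent shows up as a large orthogonal deviation from the fitted line."""
    return np.abs((points - centroid) @ normal).max()

edge = np.column_stack([np.linspace(0, 10, 50), 0.02 * np.ones(50)])
edge[25, 1] += 0.8                       # simulated dent on the boundary
c, n = tls_line_fit(edge)
print("max deviation:", round(max_orthogonal_deviation(edge, c, n), 3))
```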

A GUI-based Approach to Software Modularization

  • Park, Dongmin;Seo, Yeong-Seok
    • Journal of the Korea Society of Computer and Information
    • /
    • v.23 no.4
    • /
    • pp.97-106
    • /
    • 2018
  • Software maintenance has always been an important issue in many domains of the software industry. To help address it, software modularization approaches have been studied that build modules with high cohesion and low coupling; such modular structures aid the comprehension and maintenance of complex systems. In this paper, we propose a GUI-based automated approach to software modularization based on GUI structure analysis. The GUI is the principal means by which users access the overall functionality of a software system; in particular, it is closely related to software functionality, which makes it a promising artifact for identifying and understanding the entire system. We also implement a software tool supporting our approach and evaluate it in a case study on an open source software system.
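
This entry evaluates modularizations in terms of cohesion and coupling; the toy sketch below computes the usual intra-module and inter-module edge counts for a hypothetical module assignment over a small dependency graph, and is not the paper's GUI-based tool.

```python
# Cohesion/coupling check for a hypothetical module assignment: edges inside
# a module raise cohesion, edges crossing modules raise coupling.  Names and
# graph are invented for illustration only.
edges = [("LoginView", "AuthService"), ("AuthService", "UserRepo"),
         ("ReportView", "ReportService"), ("ReportService", "UserRepo")]
modules = {"LoginView": "auth", "AuthService": "auth", "UserRepo": "data",
           "ReportView": "report", "ReportService": "report"}

intra = sum(modules[a] == modules[b] for a, b in edges)   # cohesion edges
inter = len(edges) - intra                                # coupling edges
print(f"intra-module edges: {intra}, inter-module edges: {inter}")
```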

Content-based Image Indexing Using PCA

  • Yu, Young-Dal;Jun, Min-Gun;Kim, Daijin;Kang, Dae-Seong
    • Proceedings of the IEEK Conference
    • /
    • 2000.07b
    • /
    • pp.827-830
    • /
    • 2000
  • In this paper, we propose a multimedia information indexing method based on the PCA (principal component analysis) algorithm. We extract the DC coefficients of the DCT from an MPEG video stream, the international standard for moving picture compression coding, apply PCA to the images formed from these DC coefficients, and extract the features of each DC image. Using the extracted features, we generate a codebook and perform multimedia information indexing. The proposed algorithm indexes quickly and, because it exploits the statistical features of the data, can generate an optimized codebook.

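The indexing chain of this entry (DC images from MPEG frames, PCA features, codebook generation) can be approximated as below. Bitstream parsing is omitted: the DC image is imitated with 8x8 block means (the DCT DC term is proportional to the block mean), and the frame data, component count, and codebook size are assumptions.

```python
# Sketch of the indexing chain: a "DC image" per frame, PCA features, and a
# k-means codebook.  Frames are synthetic arrays rather than decoded MPEG.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
frames = rng.integers(0, 256, size=(200, 64, 64)).astype(float)

def dc_image(frame, block=8):
    """Approximate the DC image by averaging each block (DC ~ block mean)."""
    h, w = frame.shape
    return frame.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

dc = np.array([dc_image(f).ravel() for f in frames])   # 200 x 64 DC images
features = PCA(n_components=16).fit_transform(dc)      # per-frame features
codebook = KMeans(n_clusters=32, n_init=10, random_state=0).fit(features)
index = codebook.labels_                               # codeword id per frame
print(index[:10])
```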

Face Image Compression using Generalized Hebbian Algorithm of Non-Parsed Image

  • Lee, Kyung Hwa;Seo, Seok-Bae;Kim, Daijin;Kang, Dae-Seong
    • Proceedings of the IEEK Conference
    • /
    • 2000.07b
    • /
    • pp.847-850
    • /
    • 2000
  • This paper proposes an image compression and template matching algorithm for face images using the GHA (Generalized Hebbian Algorithm). GHA is a neural counterpart of PCA (Principal Component Analysis) that uses a single-layer perceptron with a self-organizing learning rule. We use this algorithm for feature extraction of the face shape, and our simulations verify the high performance of the proposed method. Because the eigenvectors of face images allow an image to be efficiently represented by a small set of basis coefficients, the image data can be compressed. From the simulation results, the mean PSNR is 24.08 dB at 0.047 bpp, and the reconstruction experiments show good reconstruction quality even for images not included in the training set.

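A compact version of the learning rule this entry relies on, the Generalized Hebbian Algorithm (Sanger's rule), is sketched below on random data; face images, the compression stage, and the reported bit rates are not reproduced.

```python
# Generalized Hebbian Algorithm (Sanger's rule): the rows of W converge to
# the leading principal directions of the training data.  Random vectors
# stand in for face images here.
import numpy as np

def gha_train(X, n_components, lr=1e-3, epochs=50, seed=0):
    """Return a weight matrix whose rows approximate leading eigenvectors."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.01, size=(n_components, X.shape[1]))
    for _ in range(epochs):
        for x in X:
            y = W @ x
            # Sanger's rule: Hebbian term minus lower-triangular decorrelation
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

X = np.random.default_rng(1).normal(size=(500, 20))
X = (X - X.mean(axis=0)) @ np.diag([3, 2] + [1] * 18)   # two dominant directions
W = gha_train(X, n_components=2)
print(np.round(W @ W.T, 2))    # rows become approximately orthonormal
```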

A Marker Detection and Recognition System based on Principal Component Analysis (주성분 분석을 이용한 마커 검출 및 인식 시스템)

  • Kang, Sun-Kyoung;So, In-Me;Kim, Young-Un;Jung, Sung-Tae
    • Annual Conference of KIPS
    • /
    • 2006.11a
    • /
    • pp.129-132
    • /
    • 2006
  • This paper proposes a method for detecting and recognizing rectangular markers in camera images. To detect rectangular markers, the input image is converted to a binary image, the contours of the objects are extracted, and the contours are approximated by line segments. Rectangles are then found from the approximated segments using geometric features. Once a rectangular marker region has been found, a feature vector is extracted from the warped marker image, and the marker type is recognized by the minimum-distance rule against the feature vectors of the standard markers. In recognition experiments with 50 marker types, a recognition rate of up to 98% was achieved.

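The detection pipeline described in this entry (binarization, contour extraction, polygon approximation, perspective warping, minimum-distance matching) can be sketched with OpenCV as below; the PCA feature extraction of the paper is omitted, and the thresholds, patch size, and template format are assumptions.

```python
# Sketch of the marker pipeline: binarize, find contours, approximate them
# with polygons, keep quadrilaterals, rectify each one with a perspective
# warp, and match it to reference markers by minimum Euclidean distance.
import cv2
import numpy as np

def find_marker_candidates(gray, size=64):
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    patches = []
    for c in contours:
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 4 and cv2.contourArea(approx) > 400:
            src = approx.reshape(4, 2).astype(np.float32)
            # corner ordering is not normalized in this sketch
            dst = np.float32([[0, 0], [size, 0], [size, size], [0, size]])
            M = cv2.getPerspectiveTransform(src, dst)
            patches.append(cv2.warpPerspective(gray, M, (size, size)))
    return patches

def recognize(patch, templates):
    """Return the id of the reference marker with minimum Euclidean distance."""
    v = patch.astype(np.float32).ravel()
    return min(templates, key=lambda k: np.linalg.norm(v - templates[k]))
```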

Joint PCA and Adaptive Threshold for Fault Detection in Wireless Sensor Networks (무선 센서 네트워크에서 장애 검출을 위한 결합 주성분분석과 적응형 임계값)

  • Dang, Thien-Binh;Vo, Vi Van;Le, Duc-Tai;Kim, Moonseong;Choo, Hyunseung
    • Annual Conference of KIPS
    • /
    • 2020.05a
    • /
    • pp.69-71
    • /
    • 2020
  • Principal Component Analysis (PCA) is an effective data analysis technique commonly used for fault detection on data collected in Wireless Sensor Networks (WSNs). However, applying PCA to the whole dataset lowers the detection performance. In this paper, we propose JPATAD, which joins PCA with an adaptive threshold for fault detection. Experimental results on a real dataset show that JPATAD performs remarkably better than a conventional PCA model in detecting noise, a common fault in collected sensor data.
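
The abstract does not spell out JPATAD's rule, so the sketch below is only a generic PCA-residual detector with a moving-window adaptive threshold, intended to illustrate the kind of combination the title refers to; all data and parameters are synthetic.

```python
# Generic PCA-residual (squared prediction error) fault detector whose
# threshold adapts as a running mean + k standard deviations of recent
# residuals.  This is an illustration, not the JPATAD algorithm itself.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
train = rng.normal(size=(500, 10))              # normal sensor readings
pca = PCA(n_components=3).fit(train)

def spe(x):
    """Squared prediction error of x against the PCA subspace."""
    r = x - pca.inverse_transform(pca.transform(x.reshape(1, -1)))[0]
    return float(r @ r)

window, k = [], 3.0
for t in range(200):
    x = rng.normal(size=10)
    if t == 150:
        x[0] += 8.0                             # injected noise fault
    e = spe(x)
    if len(window) >= 30:
        thresh = np.mean(window) + k * np.std(window)
        if e > thresh:
            print(f"fault suspected at t={t}, SPE={e:.1f}, threshold={thresh:.1f}")
    window = (window + [e])[-30:]               # sliding residual window
```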

Generalized Sub-optimum Decoding for Space-Time Trellis Codes in Quasistatic Flat Fading Channel (준정적 플랫 페이딩 채널에서 시공간 트렐리스 부호의 일반화된 부최적 복호법)

  • Kim Young Ju;Shin Sang Sup;Kang Hyun-Soo
    • Journal of the Institute of Electronics Engineers of Korea TC
    • /
    • v.43 no.1 s.343
    • /
    • pp.89-94
    • /
    • 2006
  • We present a generalized version of principal ratio combining (PRC) [1], a near-optimum decoding scheme for space-time trellis codes in quasi-static flat fading environments. In [1], the performance penalty grows as the number of receive antennas increases. In the proposed scheme, the receive antennas are divided into K groups and PRC decoding is applied to each group, which gives a flexible tradeoff between performance and decoding complexity through the choice of K. We also propose a performance index (PI) to easily predict the decoding performance of the possible (receive antenna) configurations.
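
Based only on the abstract, the per-group combining step of the generalized scheme might look like the sketch below: the receive antennas are split into K groups and each group's signals are combined into a single stream. The use of the principal singular vector of the group's channel sub-matrix as the combining weights, the trellis (Viterbi) decoding stage, and all channel parameters are assumptions here.

```python
# Sketch of per-group combining for a grouped PRC-style receiver.  Each
# group of receive antennas is collapsed into one stream using the principal
# left singular vector of its channel sub-matrix; trellis decoding of the
# combined streams is omitted, and the channel/codeword are synthetic.
import numpy as np

rng = np.random.default_rng(5)
n_tx, n_rx, K, T = 2, 4, 2, 100
H = (rng.normal(size=(n_rx, n_tx)) + 1j * rng.normal(size=(n_rx, n_tx))) / np.sqrt(2)
codeword = (2 * rng.integers(0, 2, size=(n_tx, T)) - 1).astype(complex)   # BPSK stand-in
noise = 0.1 * (rng.normal(size=(n_rx, T)) + 1j * rng.normal(size=(n_rx, T)))
R = H @ codeword + noise                      # quasi-static flat fading reception

groups = np.array_split(np.arange(n_rx), K)   # K receive-antenna groups
combined = []
for g in groups:
    Hg = H[g, :]
    u, _, _ = np.linalg.svd(Hg)
    w = u[:, 0]                               # principal combining vector
    combined.append(w.conj() @ R[g, :])       # one combined stream per group
    # the decoder would then see the effective single channel w^H Hg
combined = np.array(combined)                 # K x T streams for trellis decoding
print(combined.shape)
```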