• Title/Summary/Keyword: extraction techniques


The Object Image Detection Method using statistical properties (통계적 특성에 의한 객체 영상 검출방안)

  • Kim, Ji-hong
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.22 no.7
    • /
    • pp.956-962
    • /
    • 2018
  • As part of a study on object feature detection in images, we describe methods for identifying tree species in a forest using pictures taken from a drone. Commonly used object feature extraction methods include the GLCM (Gray Level Co-occurrence Matrix) and Gabor filters. Because the leaves of trees of the same species are similar, we propose an object extraction method based on the statistical properties of trees. After extracting sample images from the original images, we detect objects using cross-correlation between the original image and the sample images. Through experiments, we found that the mean value and standard deviation of the sample images are very important factors for identifying an object. Analysis of the color components of the RGB and HSV models is also used to identify the object.
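The cross-correlation step the abstract describes can be sketched roughly as follows: a minimal normalized cross-correlation in numpy, assuming grayscale arrays and a toy image (the paper's actual data, window sizes, and pipeline are not given here).

```python
import numpy as np

def normalized_cross_correlation(image, template):
    """Slide `template` over `image` and score each position by
    normalized cross-correlation. The template's and patch's mean and
    standard deviation normalize the score, which is why those
    statistics are decisive for identifying the object."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    scores = np.zeros((ih - th + 1, iw - tw + 1))
    for y in range(scores.shape[0]):
        for x in range(scores.shape[1]):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            scores[y, x] = (p * t).sum() / denom if denom > 0 else 0.0
    return scores

# Toy example: a small patterned sample image embedded in a flat background.
img = np.zeros((6, 6))
tpl = np.array([[0.0, 1.0],
                [1.0, 0.0]])
img[2:4, 3:5] = tpl
scores = normalized_cross_correlation(img, tpl)
peak = np.unravel_index(np.argmax(scores), scores.shape)
```

Because each score is divided by the patch and template norms after mean subtraction, a perfect match scores 1 regardless of overall brightness, matching the abstract's emphasis on sample-image statistics.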

Concept Extraction Technique from Documents Using Domain Ontology (지식 문서에서 도메인 온톨로지를 이용한 개념 추출 기법)

  • Mun Hyeon-Jeong;Woo Yong-Tae
    • The KIPS Transactions:PartD
    • /
    • v.13D no.3 s.106
    • /
    • pp.309-316
    • /
    • 2006
  • We propose a novel technique for categorizing XML documents and extracting concepts efficiently using a domain ontology. First, we build the domain ontology using text mining and statistical techniques. We then propose a DScore technique that classifies XML documents by exploiting their structural characteristics, and a TScore technique that extracts a concept by comparing the association term sets of the domain ontology with the terms in an XML document. To verify the efficiency of the proposed techniques, we conducted experiments on 295 papers in the computer science area. The results show that the proposed technique, which uses the structural information of XML documents, is more efficient than existing techniques. In particular, the TScore technique extracts document concepts effectively even when term frequencies are low. Hence, the proposed concept-based retrieval techniques can be expected to contribute to the development of efficient ontology-based knowledge management systems.
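The TScore idea of matching a document's terms against each concept's association term set can be sketched as below. The ontology, its term sets, and the scoring formula (a plain overlap ratio) are illustrative assumptions, not the paper's exact definitions.

```python
# Hypothetical mini-ontology: concept -> association term set.
ontology = {
    "machine_learning": {"training", "model", "classifier", "feature"},
    "databases": {"query", "index", "transaction", "schema"},
}

def tscore(doc_terms, concept_terms):
    """Overlap between document terms and a concept's association term set,
    normalized by the concept set's size (assumed form of the score)."""
    return len(doc_terms & concept_terms) / len(concept_terms)

def extract_concept(doc_terms):
    """Return the best-matching concept and all concept scores."""
    scores = {c: tscore(doc_terms, terms) for c, terms in ontology.items()}
    return max(scores, key=scores.get), scores

doc = {"model", "classifier", "evaluation", "feature"}
best, scores = extract_concept(doc)
```

Note that a set-based overlap like this ignores how often a term occurs, which is consistent with the abstract's claim that the approach works even when term frequencies are low.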

A Novel Channel Compensation and Equalization scheme for an OFDM Based Modem (OFDM 전송시스템의 새로운 채널 보상 및 등화 기법)

  • Seo, Jung-Hyun;Lee, Hyun;Cheong, Cha-Keon;Cho, Kyoung-Rok
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.28 no.12A
    • /
    • pp.1009-1018
    • /
    • 2003
  • A new fading-channel estimation technique is proposed for an OFDM-based modem in an ITS system. The algorithm extracts the transfer function of the channel using pilot signals and compensates the channel before equalization. The newly derived algorithm uses division-free arithmetic operations, allowing faster circuit operation and a smaller circuit size. The proposed technique first compensates the distortion introduced by the fading channel and then eliminates inter-symbol interference. All algorithms were evaluated and refined for system implementation in digital circuits. As a result, the circuit size was reduced by 20% compared with the conventional design, and about a 10% performance improvement was achieved at low SNR (under 10 dB) for an ITS system using the 16-QAM mode.
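A minimal sketch of pilot-based channel estimation followed by a division-free compensation step, assuming one complex flat-fading coefficient per subcarrier, unit-magnitude pilots, and no noise. Equalizing as Y / H needs a complex divider; multiplying by conj(H) instead gives X · |H|², and since |H|² is a positive real number, a sign/angle-based decision is unchanged. This is one common division-free scheme, not necessarily the paper's exact algorithm (QPSK is used here for brevity, though the paper reports 16-QAM results).

```python
import numpy as np

n_sub = 8
rng = np.random.default_rng(0)
channel = rng.normal(size=n_sub) + 1j * rng.normal(size=n_sub)  # per-subcarrier fading

pilots = np.ones(n_sub, dtype=complex)        # known unit-magnitude pilots
received_pilots = channel * pilots            # noiseless for clarity
h_est = received_pilots * np.conj(pilots)     # channel estimate (= H here)

data = np.exp(1j * np.pi / 4 * np.array([1, 3, 5, 7, 1, 3, 5, 7]))  # QPSK symbols
received = channel * data
compensated = received * np.conj(h_est)       # = data * |H|^2, no division used
recovered = (np.sign(compensated.real)
             + 1j * np.sign(compensated.imag)) / np.sqrt(2)  # QPSK decision
```

In hardware, replacing the complex divider with a conjugate multiply is what enables the smaller and faster circuit the abstract describes.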

Technical Trend Analysis of Fingerprint Classification (지문분류 기술 동향 분석)

  • Jung, Hye-Wuk;Lee, Seung
    • The Journal of the Korea Contents Association
    • /
    • v.17 no.9
    • /
    • pp.132-144
    • /
    • 2017
  • Fingerprint classification, which categorizes fingerprints into classes, is used to improve the processing speed and accuracy of fingerprint recognition systems with large databases. Fingerprint classification methods extract features from the ridges of a fingerprint and classify it using learning and reasoning techniques, based on classes defined according to the flow and shape of the ridges. Early research was mostly conducted on the NIST databases, acquired by pressing or rolling a finger against paper. However, as automated fingerprint recognition systems using live-scan scanners have become popular, research using fingerprint images obtained from live-scan scanners, such as the fingerprint data provided by FVC, is increasing. Recently, fingerprint classification methods using deep learning have also been proposed. In this paper, we investigate trends in fingerprint classification technology and compare the classification performance of the surveyed methods. By highlighting the need for classification research that considers live-scan fingerprint images and by analyzing deep-learning-based classification, we aim to help fingerprint classification research improve performance on ever-growing fingerprint databases.

Changing Proteins in Granulosa Cells during Follicular Development in Pig (돼지 난포 발달 시 과립막 세포에서 발현되는 단백질의 변화)

  • Chae, In-Soon;Jang, Dong-Min;Cheong, Hee-Tae;Yang, Boo-Keun;Park, Choon-Keun
    • Reproductive and Developmental Biology
    • /
    • v.33 no.3
    • /
    • pp.183-187
    • /
    • 2009
  • This study analyzed changes in the proteins of granulosa cells during porcine follicular development using proteomics techniques. Granulosa cells were collected from follicles of 2~4 mm and 6~10 mm in diameter in the ovaries of slaughtered pigs. Granulosa cell proteins were extracted with M-PER Mammalian Protein Extraction Reagent, purified with a clean-up kit, and quantified by the Bradford method to a total protein volume of 200 μl. An 18 cm, pH 3~10 NL immobilized pH gradient (IPG) strip was used, and SDS-PAGE was run on a 10% acrylamide gel. After silver staining, spots were analyzed with Melanie 7 and by visual inspection. Seven protein spots increased in the granulosa cells of 6~10 mm follicles. These spots were analyzed by MALDI-TOF MS and searched against NCBInr. The seven spots were similar to zinc/RING finger protein 3 precursor (RING finger protein 203), angiomotin, heat shock 60 kDa protein 1 (chaperonin) isoform 1 (HSP60), transducin-like enhancer protein 1 (TLE 1), and SH3 and PX domains 2A (SH3PXD2A). These proteins are related to transfer between cells, and their increase may have an effect on follicular development.

Extraction Method of Significant Clinical Tests Based on Data Discretization and Rough Set Approximation Techniques: Application to Differential Diagnosis of Cholecystitis and Cholelithiasis Diseases (데이터 이산화와 러프 근사화 기술에 기반한 중요 임상검사항목의 추출방법: 담낭 및 담석증 질환의 감별진단에의 응용)

  • Son, Chang-Sik;Kim, Min-Soo;Seo, Suk-Tae;Cho, Yun-Kyeong;Kim, Yoon-Nyun
    • Journal of Biomedical Engineering Research
    • /
    • v.32 no.2
    • /
    • pp.134-143
    • /
    • 2011
  • The selection of meaningful clinical tests and their reference values from high-dimensional clinical data with an imbalanced class distribution (one class represented by many examples, the other by only a few) is important for the differential diagnosis of similar diseases, but difficult. For this purpose, this study introduces methods based on the discernibility matrix and discernibility function of rough set theory (RST), combined with two discretization approaches: equal-width and equal-frequency discretization. The discretization approaches define the reference values for the clinical tests, and the discernibility matrix and function extract a subset of significant clinical tests from the resulting nominal attribute values. To show its applicability to the differential diagnosis problem, we applied the method to extract significant clinical tests and their reference values for a normal group (N = 351) and an abnormal group (N = 101) with either cholecystitis or cholelithiasis. In addition, we investigated the selected clinical tests and the variation of their reference values, as well as the average predictive accuracy on four evaluation criteria (accuracy, sensitivity, specificity, and geometric mean) under 10-fold cross-validation. The experimental results confirmed that the rough set approximation methods based on the two discretization approaches give better results with relative frequency than with absolute frequency in terms of average geometric mean. This shows that the prediction model using relative frequency can be used effectively for classification and prediction on clinical data with imbalanced class distributions.
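The two building blocks the abstract names, equal-width discretization and discernibility-based test selection, can be sketched as follows. The toy "clinical" data, the bin count, and the brute-force reduct search are all illustrative assumptions; the paper's actual data and search procedure are not reproduced here.

```python
import numpy as np
from itertools import combinations

def equal_width_bins(values, n_bins=3):
    """Discretize continuous values into equal-width bins over [min, max];
    the bin edges play the role of reference values for a clinical test."""
    edges = np.linspace(values.min(), values.max(), n_bins + 1)
    return np.digitize(values, edges[1:-1])

def minimal_reduct(table, labels):
    """Smallest attribute subset that hits every discernibility-matrix cell
    between objects of different classes (brute force; small data only)."""
    n_obj, n_attr = table.shape
    cells = [{j for j in range(n_attr) if table[i, j] != table[k, j]}
             for i in range(n_obj) for k in range(i + 1, n_obj)
             if labels[i] != labels[k]]
    for size in range(1, n_attr + 1):
        for subset in combinations(range(n_attr), size):
            if all(set(subset) & cell for cell in cells):
                return set(subset)
    return set(range(n_attr))

raw = np.array([[1.0, 10.0, 5.0],      # three "clinical tests" per row
                [1.2, 11.0, 5.1],
                [1.1, 30.0, 5.0],
                [0.9, 31.0, 5.2]])
labels = np.array([0, 0, 1, 1])        # normal vs. abnormal
table = np.column_stack([equal_width_bins(raw[:, j]) for j in range(3)])
reduct = minimal_reduct(table, labels)  # tests that still separate the classes
```

Here the second test alone discerns every normal/abnormal pair after discretization, so the reduct keeps only that attribute; on real data the discernibility function is typically simplified rather than searched exhaustively.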

Design of Regression Model and Pattern Classifier by Using Principal Component Analysis (주성분 분석법을 이용한 회귀다항식 기반 모델 및 패턴 분류기 설계)

  • Roh, Seok-Beom;Lee, Dong-Yoon
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.10 no.6
    • /
    • pp.594-600
    • /
    • 2017
  • A new design methodology for prediction models and pattern classifiers, based on the dimensionality reduction algorithm called principal component analysis (PCA), is introduced in this paper. PCA is a dimensionality reduction technique used to reduce the dimension of the input space and extract good features from the original input variables. The extracted variables are then applied to the prediction model and pattern classifier as input variables. The introduced prediction model and pattern classifier are based on very simple regression, which is the key point of the paper. The structural simplicity of the prediction model and pattern classifier reduces the over-fitting problem. Several machine learning data sets are used to validate the proposed prediction model and pattern classifier.
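The PCA-then-simple-regression pipeline can be sketched as follows, using synthetic data in place of the paper's benchmark sets; the component count and data shapes are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 5))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=200)   # a nearly redundant input
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * rng.normal(size=200)

# PCA via SVD on the centered inputs.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 4                                    # keep the top-k principal components
Z = Xc @ Vt[:k].T                        # reduced-dimension features

# Very simple (linear) regression on the PCA features, with intercept.
A = np.column_stack([Z, np.ones(len(Z))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

Dropping the smallest-variance direction removes the redundant input while keeping the fit accurate, and the low-dimensional linear model has few parameters, which is the structural simplicity the abstract credits with reducing over-fitting.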

Development of Statistical/Probabilistic-Based Adaptive Thresholding Algorithm for Monitoring the Safety of the Structure (구조물의 안전성 모니터링을 위한 통계/확률기반 적응형 임계치 설정 알고리즘 개발)

  • Kim, Tae-Heon;Park, Ki-Tae
    • Journal of the Korea institute for structural maintenance and inspection
    • /
    • v.20 no.4
    • /
    • pp.1-8
    • /
    • 2016
  • Recently, buildings have tended to become larger, more complex in shape, and more functional. As buildings become massive, the need for structural health monitoring (SHM) techniques is ever-increasing. Various SHM techniques have been studied for buildings, which have different dynamic characteristics and are influenced by various external loads. Generally, visual inspection and non-destructive testing at accessible points of a structure are performed by experts. Nowadays, however, a system is required that measures online and detects risk elements automatically, without blind spots on the structure. In this study, to account for the response of non-linear structures, we propose a signal feature extraction and adaptive threshold setting algorithm that determines abnormal behavior using statistical methods such as control charts, root-mean-square deviation, and the generalized extreme value distribution. Its performance was validated using the acceleration responses of structures measured during earthquakes, forced vibration tests, and actual operation.
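One ingredient from the abstract, a control-chart style threshold that adapts to the measured baseline instead of a fixed design value, can be sketched as below. The monitored feature is the RMS of each signal window and the alarm limit is mean + 3·std of the baseline features; the generalized extreme value fit the paper also uses is omitted, and the signal is synthetic.

```python
import numpy as np

def window_rms(signal, win):
    """Root-mean-square feature of consecutive, non-overlapping windows."""
    n = len(signal) // win
    return np.array([np.sqrt(np.mean(signal[i * win:(i + 1) * win] ** 2))
                     for i in range(n)])

rng = np.random.default_rng(1)
sig = rng.normal(0.0, 1.0, size=2000)   # ambient structural response
sig[1500:] *= 4.0                       # simulated abnormal behavior

feats = window_rms(sig, 100)            # 20 windows; change begins at window 15
baseline = feats[:10]                   # assume the first windows are healthy
threshold = baseline.mean() + 3 * baseline.std()   # control-chart limit
alarms = feats[10:] > threshold         # flag windows exceeding the limit
```

Deriving the limit from the structure's own measured response, rather than a hand-set constant, is what makes the threshold "adaptive" in the sense the abstract describes.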

Evidences of in Situ Remediation from Long Term Monitoring Data at a TCE-contaminated Site, Wonju, Korea

  • Lee, Seong-Sun;Kim, Hun-Mi;Lee, Seung Hyun;Yang, Jae-Ha;Koh, Youn Eun;Lee, Kang-Kun
    • Journal of Soil and Groundwater Environment
    • /
    • v.18 no.6
    • /
    • pp.8-17
    • /
    • 2013
  • The contamination of chlorinated ethenes at an industrial complex in Wonju, Korea, was examined based on sixteen rounds of groundwater quality data collected from 2009 to 2013. Remediation technologies such as soil vapor extraction, soil flushing, biostimulation, and pump-and-treat were applied to eliminate the contaminant sources of trichloroethylene (TCE) and to prevent migration of the TCE plume from the remediation target zones. At each target zone, temporal monitoring data before and after remediation showed that the aqueous TCE concentrations at and around the main source areas decreased significantly as a result of the remediation technologies. However, the TCE concentrations of the plumes in the downstream area remained largely unchanged in response to the remedial action, though they fluctuated greatly with seasonal recharge variation during the monitoring period. Therefore, variations in the contaminant flux across three transects were analyzed. Prior to remediation, the TCE concentrations and mass discharges at the transects were affected by seasonal recharge variation and residual DNAPL sources; after remediation, the effect of the remedial action appeared clearly at the transects. By tracing a time series of plume evolution, greater variation in TCE concentrations was detected in the plumes near the source zones than in the relatively stable downstream plumes. This difference in the temporal profiles of TCE concentrations between the source-zone and downstream plumes could have resulted from the remedial actions taken at the source zones. This study demonstrates that long-term monitoring data are useful in assessing the effectiveness of remediation practices.

Enhanced Si based negative electrodes using RF/DC magnetron sputtering for bulk lithium ion batteries

  • Hwang, Chang-Muk;Park, Jong-Wan
    • Proceedings of the Korean Vacuum Society Conference
    • /
    • 2010.02a
    • /
    • pp.277-277
    • /
    • 2010
  • The capacity of carbonaceous materials has reached ca. 350 mAh g⁻¹, which is close to the theoretical value for the carbon intercalation compound LiC₆, resulting in a relatively low volumetric Li capacity. Notwithstanding the capacities of carbon, it will not meet the needs of future devices. Silicon shows the highest gravimetric capacity (up to 4000 mAh g⁻¹ for Li₂₁Si₅). Although Si is the most promising next-generation anode, it undergoes a large volume change during lithium insertion and extraction. This results in pulverization of the Si and loss of electrical contact between the Si and the current collector during lithiation and delithiation; thus, its capacity fades rapidly during cycling. In our material design, we focused on multiphase electrode materials composed of two metal compounds to reduce the volume change. A combination of an electrochemically amorphous active material in an inert matrix (Si-M) has been investigated for use as a negative electrode material in lithium-ion batteries. The matrix is based on a Si-M alloy system consisting of a material (Si) that is active with Li and a material (M) that is inactive with Li, where M is a transition metal that does not alloy with Li, such as Ti, V, or Mo. We fabricated and tested a broad range of Si-M compositions. The electrodes were sputter-deposited on rough Cu foil. Electrochemical, structural, and compositional characterization was performed using various techniques. The structure of the Si-M alloys was investigated using X-ray diffraction (XRD) and transmission electron microscopy (TEM). Surface morphologies of the electrodes were observed using field emission scanning electron microscopy (FESEM). The electrochemical properties of the electrodes were studied using cycling tests and electrochemical impedance spectroscopy (EIS). It was found that the capacity depends strongly on the Si content, and that cycle retention also changes with the M content. It may be beneficial to find materials with high capacity and low irreversible capacity that do not pulverize, combining Si and M to improve capacity retention.
