• Title/Summary/Keyword: component retrieval

Search Results: 169

3D Model Retrieval using Distribution of Interpolated Normal Vectors on Simplified Mesh (간략화된 메쉬에서 보간된 법선 벡터의 분포를 이용한 3차원 모델 검색)

  • Kim, A-Mi;Song, Ju-Whan;Gwun, Ou-Bong
    • Journal of Korea Multimedia Society
    • /
    • v.12 no.11
    • /
    • pp.1692-1700
    • /
    • 2009
  • This paper proposes the direction distribution of surface normal vectors as a feature descriptor of three-dimensional models. The proposed feature descriptor achieves rotation invariance using principal component analysis (PCA) and applies mesh simplification to make the descriptor robust and insensitive to added noise. Our method draws samples for the distribution of normal vectors in proportion to the area of each polygon, so that surfaces with smaller area contribute less to the feature descriptor, and applies weighting and interpolation to the normal vectors to enhance discrimination. Similarity between models is measured with the L1 norm on probability density histograms in which the feature-descriptor distances are normalized (an illustrative sketch of this comparison follows this entry). Experimental results show that, compared to the existing method, the proposed method improves retrieval performance by about 17.2% in terms of the average normalized modified retrieval rank (ANMRR) and by 9.6%~17.5% on a quantitative discrimination scale.

  • PDF
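A hedged illustration of the similarity measure described in the abstract above: two direction histograms are normalized to probability densities and compared with the L1 norm. The histogram construction itself (area-proportional sampling, weighting, interpolation on the simplified mesh) is not reproduced, and the function name is hypothetical.

```python
import numpy as np

def l1_histogram_distance(hist_a, hist_b):
    """L1 distance between two normal-vector direction histograms,
    each normalized to a probability density before comparison."""
    a = np.asarray(hist_a, dtype=float)
    b = np.asarray(hist_b, dtype=float)
    a = a / a.sum()
    b = b / b.sum()
    return float(np.abs(a - b).sum())

# Example: a smaller distance means more similar normal-vector distributions.
d = l1_histogram_distance([3, 1, 0, 2], [2, 2, 1, 1])
```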

- A Case Study on OOP Component Build-up for Reliability of MRP System - (MRP 시스템의 신뢰성을 위한 객체재향 컴포넌트 개발 사례)

  • Seo Jang Hoon
    • Journal of the Korea Safety Management & Science
    • /
    • v.6 no.3
    • /
    • pp.211-235
    • /
    • 2004
  • Component-based design is perceived as a key technology for developing advanced real-time systems in both a cost- and time-effective manner. Already today, component-based design is seen to increase software productivity by reducing the effort needed to update and maintain systems, by packaging solutions for reuse, and by easing distribution. Nowadays, many companies in the IT (Information Technology) industry, such as SI (System Integration) and software development companies, regardless of the scale of their projects, have spent considerable time and effort on developing reusable business logic. Component software is the outcome of developers' efforts to overcome this problem; it is proposed as a way to implement software quickly and easily. In addition, there has been substantial investment in researching and developing software development methodologies, and leading IT companies have released new standard technologies to support component development, for instance COM (Component Object Model) and DCOM (Distributed COM) from Microsoft and EJB (Enterprise JavaBeans) from Sun Microsystems. Component-Based Development (CBD), however, has not redeemed its promises of reuse and flexibility. Reuse is inhibited by problems such as component retrieval, architectural mismatch, and application specificity. Component-based systems are flexible in the sense that components can be replaced and fine-tuned, but only under the assumption that the software architecture remains stable during the system's lifetime. This paper suggests that systems composed of components should be generated from functional and non-functional requirements rather than being composed out of existing or newly developed components, and it carries out the modeling for Product Control component development by applying CCD (Contract-Collaboration Diagram), a component development methodology, to an MRP (Material Requirement Planning) system.

An Automatic Document Summarization Method based on Principal Component Analysis

  • Kim, Min-Soo;Lee, Chang-Beom;Baek, Jang-Sun;Lee, Guee-Sang;Park, Hyuk-Ro
    • Communications for Statistical Applications and Methods
    • /
    • v.9 no.2
    • /
    • pp.491-503
    • /
    • 2002
  • In this paper, we propose an automatic document summarization method based on Principal Component Analysis (PCA), one of the multivariate statistical methods. After extracting thematic words using PCA, we select the sentences containing the extracted thematic words and build the document summary from them. Experimental results using newspaper articles show that the proposed method is superior to methods using either word frequency or an information retrieval thesaurus.
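The following minimal sketch shows one plausible form of such a PCA-based selection: a sentence-by-term frequency matrix is decomposed with PCA, the words with the largest loadings on the first principal component are taken as thematic words, and sentences containing them form the summary. The preprocessing, number of components, and selection rule are assumptions rather than the paper's exact procedure.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_extraction.text import CountVectorizer

def summarize(sentences, n_thematic_words=5):
    # Term-frequency matrix: one row per sentence, one column per word.
    vec = CountVectorizer()
    X = vec.fit_transform(sentences).toarray().astype(float)
    terms = np.array(vec.get_feature_names_out())

    # Words with the largest absolute loading on the first principal
    # component are treated as thematic words.
    pca = PCA(n_components=1).fit(X)
    loadings = np.abs(pca.components_[0])
    thematic = set(terms[np.argsort(loadings)[::-1][:n_thematic_words]])

    # Keep every sentence that contains at least one thematic word.
    return [s for s in sentences if thematic & set(s.lower().split())]
```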

Retrieving Land surface Component Temperature Using Multi-Angle Thermal Infrared Data

  • Wenjie, Fan;Xiru, Xu
    • Proceedings of the KSRS Conference
    • /
    • 2003.11a
    • /
    • pp.1362-1364
    • /
    • 2003
  • As non-isothermal mixed pixels widely exist, the pixel-mean temperature cannot adequately represent the actual thermal state of the land surface. Row crops were chosen as the target for discussing the problem of component temperature retrieval. First, a matrix model was established to express the thermal radiant directionality of the target, and the correlation among multi-angle infrared radiances was analyzed. To increase the retrieval accuracy, we chose the retrievable parameters and established an iterative method, combined with matrix inversion, to retrieve the component temperatures (an illustrative least-squares sketch follows this entry). A field experiment showed that the method improves retrieval accuracy and stability remarkably.

  • PDF
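A hedged illustration of the kind of inversion involved: each viewing angle observes a weighted sum of component radiances, and the small linear system is solved for those radiances. The fractions, observed values, and the plain least-squares solver below are hypothetical placeholders for the paper's matrix model and iterative scheme.

```python
import numpy as np

# Each row gives the (hypothetical) fraction of each surface component
# (e.g. sunlit vegetation, sunlit soil, shaded soil) seen at one view angle.
A = np.array([[0.6, 0.3, 0.1],
              [0.4, 0.4, 0.2],
              [0.2, 0.5, 0.3],
              [0.1, 0.5, 0.4]])
L_obs = np.array([30.1, 31.4, 32.8, 33.5])  # multi-angle radiances (made up)

# Least-squares inversion for the component radiances, which can then be
# converted to component temperatures through Planck's law.
B, *_ = np.linalg.lstsq(A, L_obs, rcond=None)
print(B)
```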

Development of FROG Hardware and Software System for the Measurement of Femtosecond Ultrashort Laser Pulses (지속시간 펨토초 수준의 빛펄스틀 재는 이차조화파발생 프로그(SHG FROG) 장치 개발)

  • 양병관;김진승
    • Korean Journal of Optics and Photonics
    • /
    • v.15 no.3
    • /
    • pp.278-284
    • /
    • 2004
  • A Second Harmonic Generation Frequency Resolved Optical Gating (SHG FROG) system was developed. Performance tests show that it can accurately measure the temporal evolution of the electric field, both amplitude and phase, of femtosecond light pulses. To retrieve the temporal evolution of the light pulses from the spectrograms obtained with the FROG system, the Principal Components Generalized Projection (PCGP) algorithm is used, supplemented by additional constraints from the second-harmonic spectrum and from the frequency and time-delay marginals of the spectrogram. These software modifications bring a significant improvement in the speed and stability of the pulse retrieval process.
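For orientation, the sketch below outlines one schematic PCGP iteration for SHG FROG: form the pulse-gate outer product, rotate its rows into the time/delay arrangement, enforce the measured trace magnitude in the frequency domain, transform back, and take a power-method step toward the principal component. FFT and shift conventions follow one common choice, and the extra marginal and spectrum constraints used in the paper are omitted, so this is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

def pcgp_iteration(pulse, trace_sqrt):
    """One schematic PCGP step for SHG FROG.

    pulse      : complex field guess, length N
    trace_sqrt : measured FROG trace amplitude, shape (N, N), axes (frequency, delay)
    """
    N = pulse.size
    gate = pulse.copy()            # for SHG FROG the gate is the pulse itself

    # Outer product, then rotate rows so columns correspond to delay.
    O = np.outer(pulse, gate)
    for i in range(N):
        O[i] = np.roll(O[i], -i)

    # Fourier transform along time gives the complex FROG signal; apply the
    # data constraint by replacing its magnitude with the measured one.
    S = np.fft.fft(O, axis=0)
    S = trace_sqrt * np.exp(1j * np.angle(S))

    # Transform back and undo the row rotation.
    O = np.fft.ifft(S, axis=0)
    for i in range(N):
        O[i] = np.roll(O[i], i)

    # One power-method step toward the principal component of the matrix.
    new_pulse = O @ (O.conj().T @ pulse)
    return new_pulse / np.linalg.norm(new_pulse)
```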

A study of optimal MPEG-7 descriptor composite in database searching using PCA (PCA를 이용한 데이터베이스 검색에 있어서의 최적 MPEG-7 디스크립터 조합에 관한 연구)

  • 김현민;최윤식
    • Proceedings of the IEEK Conference
    • /
    • 2003.11a
    • /
    • pp.437-440
    • /
    • 2003
  • When we search a database with a query image, the retrieval efficiency varies with the kind of descriptor used. Even the most representative descriptor returns some irrelevant images that do not match the query image. This type of error can be reduced by adopting another descriptor that extracts features in a different way. At present, the choice of descriptors is based on intuition and experiment. By approaching the problem of descriptor choice theoretically, we can solve it in an objective and rational way. In this study, we build a composite of descriptors that reduces retrieval error by adopting principal component analysis (a brief sketch of such a composite follows this entry).

  • PDF
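One plausible way to composite two descriptors with PCA for retrieval is sketched below: concatenate the per-image descriptor vectors, standardize them, project onto a reduced set of principal components, and rank database images by distance in that space. The descriptor names, dimensions, and random placeholder data are hypothetical and do not reproduce the MPEG-7 descriptors or the combination studied in the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical placeholder descriptors: one row per database image, columns
# from two different descriptors (say, a color and a texture descriptor).
rng = np.random.default_rng(0)
color_desc = rng.random((1000, 64))
texture_desc = rng.random((1000, 62))

# Composite the descriptors: concatenate, standardize, and project onto a
# reduced number of principal components so redundant dimensions are merged.
scaler = StandardScaler()
pca = PCA(n_components=32)
X = np.hstack([color_desc, texture_desc])
features = pca.fit_transform(scaler.fit_transform(X))

def search(query_color, query_texture, k=10):
    """Rank database images by Euclidean distance to the projected query."""
    q = np.hstack([query_color, query_texture]).reshape(1, -1)
    q = pca.transform(scaler.transform(q))
    dists = np.linalg.norm(features - q, axis=1)
    return np.argsort(dists)[:k]
```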

A study on Metadata Modeling using Structure Information of Video Document (비디오 문서의 구조 정보를 이용한 메타데이터 모델링에 관한 연구)

  • 권재길
    • Journal of the Korea Society of Computer and Information
    • /
    • v.3 no.4
    • /
    • pp.10-18
    • /
    • 1998
  • Video information is an important component of multimedia systems such as digital libraries, the World-Wide Web (WWW), and Video-On-Demand (VOD) service systems. It can support various types of information because it includes audio-visual, spatio-temporal, and semantic information. In addition, it requires the ability to retrieve a specific scene of a video instead of retrieving the entire video document. Therefore, to support such varied retrieval, this paper models metadata using the hierarchical structure information of video documents and designs a database schema that can manipulate video documents (a minimal structural sketch follows this entry).

  • PDF
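As a hedged illustration of hierarchical video-document metadata, the sketch below models a video as scenes containing shots and retrieves individual shots by keyword. The class names and fields are hypothetical and far simpler than the metadata model and database schema designed in the paper.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Shot:
    start_frame: int
    end_frame: int
    keywords: List[str] = field(default_factory=list)

@dataclass
class Scene:
    title: str
    shots: List[Shot] = field(default_factory=list)

@dataclass
class VideoDocument:
    title: str
    scenes: List[Scene] = field(default_factory=list)

    def find_shots(self, keyword: str) -> List[Shot]:
        """Retrieve specific shots by keyword instead of the whole document."""
        return [shot for scene in self.scenes for shot in scene.shots
                if keyword in shot.keywords]
```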

Improving Cover Song Search Accuracy by Extracting Salient Chromagram Components (강인한 크로마그램 성분 추출을 통한 커버곡 검색 성능 개선)

  • Seo, Jin Soo
    • Journal of Korea Multimedia Society
    • /
    • v.22 no.6
    • /
    • pp.639-645
    • /
    • 2019
  • This paper proposes a salient chromagram component extraction method, based on the temporal discrete cosine transform of a chromagram block, to improve cover song retrieval accuracy. The proposed salient chromagram emphasizes the tonal content of music, which is well preserved between an original song and its cover version, while reducing the effect of timbre differences. We apply the salient chromagram extraction as a preprocessing step for Fourier-transform-based cover song matching. Experiments on two cover song datasets confirm that the proposed salient chromagram improves cover song search accuracy.
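The sketch below illustrates one plausible reading of the temporal-DCT step: transform a chromagram block along the time axis and keep only the lowest-order coefficients, which retain the slowly varying tonal content. The block size, the number of retained coefficients, and the selection rule are assumptions and may differ from the paper.

```python
import numpy as np
from scipy.fft import dct, idct

def salient_chromagram(chroma_block, n_keep=8):
    """Keep only the lowest temporal DCT components of a chromagram block.

    chroma_block : array of shape (12, T), 12 pitch classes over T frames.
    n_keep       : number of temporal DCT coefficients retained per pitch class.
    """
    coeffs = dct(chroma_block, axis=1, norm='ortho')   # DCT along time
    coeffs[:, n_keep:] = 0.0                           # drop fast-varying detail
    return idct(coeffs, axis=1, norm='ortho')
```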

Development of Suspended Particulate Matter Algorithms for Ocean Color Remote Sensing

  • Ahn, Yu-Hwan;Moon, Jeong-Eun;Gallegos, Sonia
    • Korean Journal of Remote Sensing
    • /
    • v.17 no.4
    • /
    • pp.285-295
    • /
    • 2001
  • We developed a Case-II water model that enables the simulation of remote sensing reflectance ($R_{rs}$) in coastal waters for the retrieval of suspended sediment (SS) concentrations from satellite imagery. The model has six components: water, chlorophyll, dissolved organic matter (DOM), non-chlorophyllous particles (NC), heterotrophic microorganisms, and an unknown component, possibly represented by bubbles or other particulates unrelated to the first five. We measured $R_{rs}$, SS and chlorophyll concentrations, and DOM absorption during field campaigns in Korea. In addition, we generated $R_{rs}$ from different concentrations of SS and chlorophyll and various DOM absorptions using random number functions, creating a large database to test the model. We assimilated both the computer-generated parameters and the in-situ measurements to reconstruct the reflectance spectra, and validated the model by comparing the reconstructed spectra with observed spectra. The estimated $R_{rs}$ spectra were used to (1) evaluate the performance of four wavelengths and wavelength ratios for accurate retrieval of SS, (2) identify the optimum band for SS retrieval, and (3) assess the influence of SS on the chlorophyll algorithm. The results indicate that single bands at longer visible wavelengths give better results than the commonly used channel ratios. The wavelength of 625 nm is suggested as a new, optimal wavelength for SS retrieval; because this wavelength is not available from SeaWiFS, 555 nm is offered as an alternative. The presence of SS in coastal areas can lead to overestimation of chlorophyll concentrations by 20-500%.
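As a hedged illustration of a single-band SS algorithm of the kind compared above, the sketch below fits a power-law relation between SS and $R_{rs}$ at 625 nm on synthetic data. The functional form, coefficients, and data are entirely hypothetical and are not the model or coefficients from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
ss_true = rng.uniform(1, 100, 500)                            # synthetic SS (g/m^3)
rrs_625 = 0.002 * ss_true**0.8 * rng.normal(1, 0.05, 500)     # synthetic Rrs(625)

# Fit SS = a * Rrs(625)^b in log-log space, then apply it as the retrieval.
b, log_a = np.polyfit(np.log(rrs_625), np.log(ss_true), 1)
ss_retrieved = np.exp(log_a) * rrs_625**b
```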

Investigation of the Effect of Calculation Method of Offset Correction Factor on the GEMS Sulfur Dioxide Retrieval Algorithm (GEMS 이산화황 산출 현업 알고리즘에서 오프셋 보정 계수 산정 방법에 대한 영향 조사)

  • Park, Jeonghyeon;Yang, Jiwon;Choi, Wonei;Kim, Serin;Lee, Hanlim
    • Korean Journal of Remote Sensing
    • /
    • v.38 no.2
    • /
    • pp.189-198
    • /
    • 2022
  • In the present study, we investigated the effect of the offset correction factor calculation method on the sulfur dioxide (SO2) column density in the SO2 retrieval algorithm of the Geostationary Environment Monitoring Spectrometer (GEMS), launched in February 2020. The GEMS operational SO2 retrieval algorithm is the Differential Optical Absorption Spectroscopy (DOAS) - Principal Component Analysis (PCA) hybrid algorithm. In the GEMS hybrid algorithm, the offset correction process is essential to correct the ozone absorption effect that appears in the SO2 slant column density (SCD) obtained after spectral fitting with DOAS. Since the SO2 column density can depend on the conditions under which the offset correction factor is calculated, an appropriate offset correction value must be applied. In this study, offset correction values were calculated for a day with many cloud pixels and a day with few cloud pixels, and the SO2 column densities retrieved by applying each offset correction factor to the GEMS operational SO2 retrieval algorithm were compared. When the offset correction value calculated from GEMS radiance data on the day with many cloud pixels was used, the standard deviation of the SO2 column density around India and the Korean Peninsula, at the edges of the GEMS observation area, was 1.27 DU and 0.58 DU, respectively, and around Hong Kong, where there were many cloud pixels, it was 0.77 DU. When the offset correction value calculated from the GEMS data on the day with few cloud pixels was used, the standard deviation of the SO2 column density decreased slightly around India (0.72 DU), the Korean Peninsula (0.38 DU), and Hong Kong (0.44 DU), and the retrieval was relatively stable compared with the case using the offset correction value from the day with many cloud pixels. Accordingly, to minimize the uncertainty of the GEMS SO2 retrieval algorithm and obtain a stable retrieval, the offset correction factor should be calculated under appropriate conditions.
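To make the offset correction step concrete, the following minimal sketch subtracts a background offset, estimated here as the median SCD over a presumed SO2-free reference scene, from the DOAS-fitted SO2 SCDs. The operational GEMS algorithm derives the factor differently (for example, per cross-track position and under specific scene conditions), so the estimator and function name below are hypothetical.

```python
import numpy as np

def apply_offset_correction(so2_scd, reference_scd):
    """Subtract a background offset from SO2 slant column densities (DU).

    so2_scd       : SO2 SCDs from the DOAS spectral fitting
    reference_scd : SCDs over a presumed SO2-free reference scene
    """
    offset = np.nanmedian(reference_scd)   # one possible offset estimate
    return so2_scd - offset
```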