• Title/Summary/Keyword: Valid Data (유효 데이터)

Search Results: 1,405

Data De-duplication and Recycling Technique in SSD-based Storage System for Increasing De-duplication Rate and I/O Performance (SSD 기반 스토리지 시스템에서 중복률과 입출력 성능 향상을 위한 데이터 중복제거 및 재활용 기법)

  • Kim, Ju-Kyeong; Lee, Seung-Kyu; Kim, Deok-Hwan
    • Journal of the Institute of Electronics and Information Engineers / v.49 no.12 / pp.149-155 / 2012
  • SSD is a storage device with a high-performance controller and a cache buffer, built from many NAND flash memories. Because NAND flash memory does not support in-place update, valid pages are invalidated when update and erase operations are issued by the file system, and the invalid pages are later removed by garbage collection. However, garbage collection performs many long-latency erase operations, which reduces I/O performance and increases wear in the SSD. In this paper, we propose a new method that de-duplicates valid data and recycles invalid data. Recycling invalid data improves the de-duplication ratio, and by reducing the number of writes and garbage collections the method increases I/O performance and decreases wear in the SSD. Experimental results show that it reduces the number of garbage collections by up to 20% and I/O latency by 9% compared with the conventional case.
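
The recycling idea lends itself to a short sketch: before programming a new page, the write path first checks fingerprints of valid pages (ordinary de-duplication) and then fingerprints of invalidated but not-yet-erased pages, revalidating a match instead of rewriting it. The following toy flash translation layer is an illustrative sketch under those assumptions, not the authors' implementation; the class name and its in-memory tables are hypothetical.

```python
import hashlib

class DedupFTL:
    """Toy flash translation layer: de-duplicate writes against valid
    pages and recycle invalidated-but-not-yet-erased pages."""

    def __init__(self):
        self.l2p = {}           # logical page -> physical page
        self.valid_fp = {}      # fingerprint -> physical page (valid data)
        self.invalid_fp = {}    # fingerprint -> physical page (awaiting GC)
        self.next_ppn = 0

    def write(self, lpn, data):
        fp = hashlib.sha1(data).digest()
        if fp in self.valid_fp:                  # duplicate of live data:
            self.l2p[lpn] = self.valid_fp[fp]    # remap, no flash write
        elif fp in self.invalid_fp:              # duplicate of an invalid page:
            ppn = self.invalid_fp.pop(fp)        # revalidate it, saving both a
            self.valid_fp[fp] = ppn              # write and future GC work
            self.l2p[lpn] = ppn
        else:                                    # new data: program a fresh page
            ppn, self.next_ppn = self.next_ppn, self.next_ppn + 1
            self.valid_fp[fp] = ppn
            self.l2p[lpn] = ppn

    def invalidate(self, data):
        fp = hashlib.sha1(data).digest()
        if fp in self.valid_fp:                  # keep the fingerprint until a
            self.invalid_fp[fp] = self.valid_fp.pop(fp)  # GC pass erases it

ftl = DedupFTL()
ftl.write(0, b"hello"); ftl.invalidate(b"hello"); ftl.write(1, b"hello")
print(ftl.l2p)   # both logical pages map to physical page 0: page recycled
```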

Relationship of Pupil's Size and Gaze Frequency for Neuro Sports Marketing: Focusing on Sigma Analysis (뉴로 스포츠 마케팅을 위한 동공 확장과 주시빈도 간의 관계: 시그마 분석법을 적용하여)

  • Ko, Eui-Suk; Song, Ki-Hyeon; Cho, Soo-Hyun; Kim, Jong-Ha
    • Science of Emotion and Sensibility / v.20 no.3 / pp.39-48 / 2017
  • To verify the effectiveness of marketing in a basketball stadium, this study used eye-tracking, one of several neuro-marketing techniques, to measure and analyze gaze frequency and interest during pupil dilation. To isolate the sections where pupil size expanded, we examined the intervals where pupil size fell in the top 2.275% (2-sigma data) and the top 0.135% (3-sigma data) of its distribution. The overall valid data were analyzed by inflection points according to gaze frequency, and we computed correlations between the overall valid data and the ranges where pupil size increased significantly. The overall valid data and the 2-sigma pupil data showed the highest correlation, 0.805; the 2-sigma and 3-sigma pupil data showed a correlation of 0.781; and the overall valid data and the 3-sigma pupil data showed a correlation of 0.683. We therefore conclude that the sections where pupil size expanded and the sections with high gaze frequency in the eye-tracking data were similar, although the correlation between the significantly expanded pupil data and the overall valid data decreased.
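
A minimal sketch of the sigma analysis, assuming the 2.275% and 0.135% figures are the upper-tail fractions of a normal distribution beyond 2 and 3 standard deviations: mark the samples whose pupil size crosses the k-sigma threshold and correlate the marks with gaze frequency. The demo data are synthetic, not the study's measurements.

```python
import numpy as np

def sigma_correlation(pupil, gaze, k):
    """Correlate gaze frequency with the samples where pupil size exceeds
    the mean by k standard deviations (k=2 keeps roughly the top 2.275%
    of a normal distribution, k=3 the top 0.135%)."""
    expanded = (pupil > pupil.mean() + k * pupil.std()).astype(float)
    return np.corrcoef(gaze, expanded)[0, 1]

# Synthetic stand-in data: pupil diameter (mm) and per-sample gaze counts.
rng = np.random.default_rng(0)
pupil = rng.normal(4.0, 0.5, 1000)
gaze = rng.poisson(3 + 2 * (pupil > 4.5), 1000)
print(sigma_correlation(pupil, gaze, 2), sigma_correlation(pupil, gaze, 3))
```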

Efficient Skyline Computation on Time-Interval Data Streams (유효시간 데이터 스트림에서의 스카이라인 질의 알고리즘)

  • Park, Nam-Hun; Chang, Joong-Hyuk
    • Journal of the Korea Academia-Industrial cooperation Society / v.13 no.1 / pp.370-381 / 2012
  • Multi-criteria result extraction is crucial in many scientific applications that require real-time stream processing, such as habitat research and disaster monitoring. Skyline evaluation is computationally intensive, especially over continuous time-interval data streams where each object has its own expiration time. In this work, we propose TI-Sky, a continuous skyline evaluation framework. To ensure correctness, the result space must be continuously maintained as new objects arrive and older objects expire. TI-Sky balances the cost of continuously maintaining the result space against the cost of computing the final skyline from that space whenever a pull-based user query is received. Our key principle is to incrementally maintain a partially precomputed skyline result space, but to do so efficiently by working at a higher level of abstraction. TI-Sky's algorithms for insertion, deletion, purging, and result retrieval exploit both layers of granularity. Our experimental study demonstrates the superiority of TI-Sky over existing techniques on a wide variety of data sets.
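
The purge-then-query cycle at the heart of continuous skyline maintenance can be sketched as below. This is a minimal baseline under the paper's data model (each object carries its own expiration time); TI-Sky's layered, partially precomputed result space is deliberately omitted, and all names are illustrative.

```python
import heapq

def dominates(a, b):
    """a dominates b if a is no worse in every dimension and strictly
    better in at least one (smaller is better here)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

class TimeIntervalSkyline:
    """Continuous skyline over objects with individual expiration times:
    a purge step drops expired objects, then a dominance scan answers
    the pull-based query."""

    def __init__(self):
        self.objects = []                    # min-heap of (expire_time, point)

    def insert(self, point, expire_time):
        heapq.heappush(self.objects, (expire_time, point))

    def skyline(self, now):
        while self.objects and self.objects[0][0] <= now:   # purge expired
            heapq.heappop(self.objects)
        live = [p for _, p in self.objects]
        return [p for p in live
                if not any(dominates(q, p) for q in live if q != p)]

sky = TimeIntervalSkyline()
sky.insert((1, 5), expire_time=10)
sky.insert((2, 2), expire_time=4)
sky.insert((3, 3), expire_time=12)
print(sky.skyline(now=5))    # (2, 2) has expired; (1, 5) and (3, 3) survive
```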

A Study on Temporal Data Models and Aggregate Functions (시간지원 데이터 모델 및 집계함수에 관한 연구)

  • Lee, In-Hong; Moon, Hong-Jin; Cho, Dong-Young; Lee, Wan-Kwon; Cho, Hyun-Joon
    • The Transactions of the Korea Information Processing Society / v.4 no.12 / pp.2947-2959 / 1997
  • A temporal data model handles time-varying information by adding temporal attributes to a conventional data model. Temporal data models are classified into three kinds according to the time dimension they support: the valid time model, which supports valid time; the transaction time model, which supports transaction time; and the bitemporal data model, which supports both. Most temporal data models are designed to process temporal data by extending the relational model, and they take one of two forms, tuple timestamping or attribute timestamping, depending on how time is attached. This paper discusses the concepts of the temporal data model, the time dimension, the types of data model, and considerations for data model design, and compares temporal data models in terms of the time dimension. We then propose an aggregate function model for the valid time model and give a logical analysis of its computing costs.
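
A valid-time aggregate differs from an ordinary one in that its result is itself time-varying: the count or sum holds over the maximal intervals where the same set of tuples is valid. As a hedged illustration of that idea (not the paper's proposed model), a valid-time COUNT over tuple-timestamped rows can be computed with an endpoint sweep:

```python
from collections import Counter

def valid_time_count(rows):
    """Valid-time COUNT: each row is (value, valid_from, valid_to) with a
    half-open interval [from, to); the result lists every maximal interval
    over which the count is constant."""
    events = Counter()
    for _, start, end in rows:
        events[start] += 1
        events[end] -= 1
    count, result, times = 0, [], sorted(events)
    for t0, t1 in zip(times, times[1:]):
        count += events[t0]
        if count:
            result.append((t0, t1, count))
    return result

# Hypothetical employee rows with valid-time intervals.
rows = [("Ann", 1, 10), ("Bob", 3, 7), ("Joe", 5, 12)]
print(valid_time_count(rows))
# [(1, 3, 1), (3, 5, 2), (5, 7, 3), (7, 10, 2), (10, 12, 1)]
```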

Evaluation of Effective Temperature for Estimate Design Thermal Loads in Steel Deck of Steel Box Girder Bridges (강상자형교의 강바닥판에서 설계온도하중을 위한 유효온도 산정)

  • Shin, Dong-Wook; Kim, Kyoung-Nam; Choi, Chul-Ho; Lee, Seong-Haeng
    • Journal of the Korea institute for structural maintenance and inspection / v.17 no.6 / pp.77-87 / 2013
  • The current limit state design (LSD) code for temperature loads in domestic bridge design applies a uniform standard to various bridge types. In this study, to calculate the effective temperature, a full-scale specimen of a steel box girder bridge section was manufactured, and temperature data were measured for a year at 18 points in the steel deck of the specimen. The effective temperature within the cross section was calculated from these measurements as a function of atmospheric temperature. The results showed a correlation very similar to the effective temperature of the Euro Code. Therefore, the effective temperature calculated from the present data can serve as basic data for establishing appropriate design criteria for thermal loads in domestic bridge design.
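
A common way to define the effective temperature of a cross section, and presumably the kind of computation behind these results, is the area-weighted mean of the measured point temperatures; relating it to atmospheric temperature is then a linear fit. The sketch below assumes that definition, with made-up sensor readings and tributary areas:

```python
import numpy as np

def effective_temperature(temps, areas):
    """Area-weighted mean temperature of a cross section; temps and areas
    carry one entry per sensor's tributary zone."""
    temps, areas = np.asarray(temps), np.asarray(areas)
    return float((temps * areas).sum() / areas.sum())

# Hypothetical readings from a box-girder section (degrees C, m^2).
print(round(effective_temperature([21.3, 24.8, 26.1, 22.0],
                                  [0.40, 0.25, 0.20, 0.15]), 2))

# Relating effective to atmospheric temperature with a linear fit, as one
# would when comparing against the Euro Code curve (synthetic data):
t_air = np.array([5.0, 12.0, 20.0, 28.0])
t_eff = np.array([7.1, 14.9, 23.8, 32.2])
slope, intercept = np.polyfit(t_air, t_eff, 1)
print(slope, intercept)
```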

A Data Collection Model of Vehicle Parts for the Evaluation of Electric Vehicle Process Based on DID (DID 기반 전기차 전과정평가를 위한 차량부품 데이터수집 모델)

  • Kwon, Jun-Woo; Kim, Jane; Lee, Soojin; Seo, Seung-Hyun
    • Proceedings of the Korea Information Processing Society Conference / 2022.11a / pp.237-239 / 2022
  • Recently, several countries have been reviewing vehicle greenhouse-gas emission regulations based on life cycle assessment (LCA). To perform a vehicle LCA, data on each part must be collected, and the integrity and validity of those data must be verified. This paper proposes a DID-based data collection model for electric vehicle LCA that verifies the data provider as well as the validity and integrity of the collected data.
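
The verification pipeline the abstract describes, checking who provided the data (a DID resolved to a public key) plus the data's integrity (a digest and signature), can be sketched as follows. This is a simplified stand-in: the in-memory registry replaces resolution against a verifiable data registry, and all identifiers and fields are hypothetical.

```python
import hashlib, json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

registry = {}   # DID -> public key; stand-in for a verifiable data registry

def submit(did, private_key, part_record):
    """A supplier signs its part data (e.g., material and energy inputs
    used in the LCA) so the collector can verify provider and integrity."""
    payload = json.dumps(part_record, sort_keys=True).encode()
    return {"did": did, "payload": payload,
            "digest": hashlib.sha256(payload).hexdigest(),
            "signature": private_key.sign(payload)}

def verify(msg):
    public_key = registry.get(msg["did"])                 # resolve provider
    if public_key is None:
        return False
    if hashlib.sha256(msg["payload"]).hexdigest() != msg["digest"]:
        return False                                      # integrity failure
    try:
        public_key.verify(msg["signature"], msg["payload"])
        return True
    except InvalidSignature:
        return False                                      # provider failure

key = Ed25519PrivateKey.generate()
registry["did:example:battery-supplier"] = key.public_key()
msg = submit("did:example:battery-supplier", key,
             {"part": "battery pack", "co2_kg": 2450})
print(verify(msg))   # True
```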

An Optimistic Mechanism for Combining Concurrency Control and Validation in Valid XML (유효한 XML 환경에서 유효성과 병행수행의 결합을 위한 낙관적 기법)

  • Yun, Il-Kook; Ko, Han-Young; Park, Seog
    • Proceedings of the Korean Information Science Society Conference / 2008.06c / pp.15-20 / 2008
  • For a database to manage valid XML with a DTD, a mechanism is needed that can check the validity of changes made by transactions that update XML documents. The validation scope denotes the set of nodes containing the information needed for this check. For validation to be performed correctly, a concurrency control scheme must guarantee that the data items in the validation scope are not modified by other transactions, which calls for an optimistic technique that combines validation and concurrency control. This paper proposes a technique that improves the concurrency of update transactions through efficient conflict detection and validity checking within the validation scope, and compares and analyzes it against the validation and conflict detection techniques of previous work.
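
The interplay of validation scope and optimistic concurrency control can be illustrated with a toy version-check scheme: each transaction remembers the version of every node its DTD validation had to read, and at commit it aborts if any of those nodes has since been changed by a committed transaction. This is a generic optimistic sketch, not the paper's specific protocol; names and structures are illustrative.

```python
class OptimisticXMLTransaction:
    """Optimistic concurrency for valid XML updates: record the versions
    of nodes in the validation scope, then detect conflicts at commit."""

    versions = {}                # shared across transactions: node -> version

    def __init__(self):
        self.scope = {}          # node -> version observed during validation
        self.writes = set()

    def read_for_validation(self, node):
        self.scope[node] = self.versions.get(node, 0)

    def update(self, node):
        self.read_for_validation(node)
        self.writes.add(node)

    def commit(self):
        # Conflict detection: did any node in the validation scope change?
        for node, seen in self.scope.items():
            if self.versions.get(node, 0) != seen:
                return False                 # abort; caller restarts
        for node in self.writes:
            self.versions[node] = self.versions.get(node, 0) + 1
        return True

t1, t2 = OptimisticXMLTransaction(), OptimisticXMLTransaction()
t1.update("/order/item[1]"); t2.update("/order/item[1]")
print(t1.commit(), t2.commit())   # True False: t2's validation scope is stale
```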

A Study on the Validation Test for Open Set Face Recognition Method with a Dummy Class (더미 클래스를 가지는 열린 집합 얼굴 인식 방법의 유효성 검증에 대한 연구)

  • Ahn, Jung-Ho; Choi, KwonTaeg
    • Journal of Digital Contents Society / v.18 no.3 / pp.525-534 / 2017
  • An open set recognition method should be used when the classes of the test data are not completely known in the training phase, so it must include both a classification step and a validation test. This kind of research is necessary for commercializing face recognition modules, but few domestic research results on it have been published. In this paper, we propose an open set face recognition method with two sequential validation phases. In the first phase, we perform classification based on sparse representation with dummy classes; when the test data is classified into a dummy class, we conclude that the data is invalid. If the data is classified into one of the regular training classes, we extract four features for a second validation test and apply them to the proposed decision function. In experiments, we propose a simulation method for open set recognition and show that the proposed validation test outperforms SCI, a well-known validation measure.
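
For context, the SCI test that the paper compares against rejects a probe when its sparse coefficients are spread across many classes rather than concentrated in one. Below is a rough sketch of sparse-representation classification with SCI-based rejection, using Lasso as the l1 solver; the paper's dummy classes and second-stage decision function are omitted, and all parameters and data are illustrative.

```python
import numpy as np
from sklearn.linear_model import Lasso

def sci(coef, labels, n_classes):
    """Sparsity Concentration Index: near 1 when the coefficient energy
    sits in one class, near 0 when spread out; low SCI flags an impostor."""
    total = np.abs(coef).sum()
    if total == 0:
        return 0.0
    best = max(np.abs(coef[labels == c]).sum() for c in range(n_classes))
    return (n_classes * best / total - 1) / (n_classes - 1)

def classify(D, labels, y, n_classes, threshold=0.5):
    """D: training faces as columns; y: probe. Reject when SCI is low,
    otherwise return the class with the smallest reconstruction residual."""
    coef = Lasso(alpha=0.01, max_iter=10000).fit(D, y).coef_
    if sci(coef, labels, n_classes) < threshold:
        return None                                  # unknown face
    residuals = [np.linalg.norm(y - D[:, labels == c] @ coef[labels == c])
                 for c in range(n_classes)]
    return int(np.argmin(residuals))

rng = np.random.default_rng(1)
D = rng.normal(size=(64, 20))                # 4 classes x 5 samples each
labels = np.repeat(np.arange(4), 5)
y = D[:, 2] + 0.01 * rng.normal(size=64)     # probe close to a class-0 face
print(classify(D, labels, y, n_classes=4))   # 0
```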

Validation Method of ARINC 661 UA Definition File and CDS Configuration File for DO-330 Tool Qualification (DO-330 도구 자격인증을 고려한 ARINC 661 UA 정의 파일과 CDS 설정 파일의 유효성 확인 방법)

  • Younggon Kim
    • Journal of Platform Technology / v.10 no.4 / pp.11-24 / 2022
  • A tool for developing airborne software requires the same level of safety as the airborne software itself, because a tool whose output becomes part of the airborne software can insert an error into it. This paper describes how to ensure the reliability of tool output that becomes part of the airborne software by validating the input and output files of Hanwha Systems' A661UAGEN tool when it generates the ARINC 661 standard UA definition file and the CDS configuration file. We present a method to validate the XML data structure and contents of the tool's input against an XML schema definition, and a method to validate the tool's binary output using mask data for the corresponding data structures and valid values. Validating the tool's input and output in this way improves the reliability of the binary DFs and CDs integrated into the airborne software, allowing developers to use the tool while ensuring safety when developing the OFP.
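
Both checks are straightforward to express in code: the input side is plain XML Schema validation, and the output side unpacks the binary records and tests the constrained fields against a mask and a set of valid values. The sketch below assumes a hypothetical 8-byte record layout; it is not the real DF format or the A661UAGEN tool's logic.

```python
import struct
from lxml import etree

def validate_ua_input(xml_path, xsd_path):
    """Input-side check: validate the UA definition XML against its XSD."""
    schema = etree.XMLSchema(etree.parse(xsd_path))
    return schema.validate(etree.parse(xml_path))

# Output-side check on a hypothetical 8-byte binary record:
# widget id (u16), widget type (u16), flags (u32), little-endian.
RECORD = struct.Struct("<HHI")

def validate_df_record(raw, valid_types=frozenset({1, 2, 3}),
                       flags_mask=0x0000FFFF):
    widget_id, widget_type, flags = RECORD.unpack(raw)
    if widget_type not in valid_types:    # value outside the valid set
        return False
    if flags & ~flags_mask:               # a bit outside the mask is set
        return False
    return True

print(validate_df_record(struct.pack("<HHI", 7, 2, 0x00FF)))   # True
print(validate_df_record(struct.pack("<HHI", 7, 9, 0x00FF)))   # False
```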

Product Value Evaluation Models based on Itemset Association Chain (상품군 연관망 기반의 상품가치 평가모형)

  • Chang, Yong-Sik
    • Journal of Intelligence and Information Systems / v.16 no.2 / pp.1-17 / 2010
  • Association rules among product items obtained by association analysis suggest sales effects among products, and they are useful for marketing strategies such as cross-selling and product display. However, product value measures that reflect cross-selling effects more directly would be even more useful for company decisions such as selecting items for product assortment and maximizing profit. This study proposes product value evaluation models built on the concept of effective value, based on single-item association chains and itemset association chains. We performed experiments with clothing transaction data from a Korean online shopping mall to evaluate the models, and confirmed that some items gained effective value relative to their pure values while others lost it.
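
One simple way to realize "effective value" along an association chain, offered here as an illustrative stand-in rather than the paper's actual models, is to let each item earn a confidence-weighted share of the value of the items it leads to, iterated so that multi-hop chains such as shirt -> tie -> cufflinks propagate:

```python
def effective_values(pure, rules, rounds=3):
    """pure: item -> standalone value; rules: (antecedent, consequent) ->
    confidence. Each round adds cross-selling bonuses one hop further
    along the association chain."""
    value = dict(pure)
    for _ in range(rounds):
        value = {item: pure[item] + sum(conf * value[tgt]
                                        for (src, tgt), conf in rules.items()
                                        if src == item)
                 for item in pure}
    return value

# Hypothetical clothing items and rule confidences.
pure = {"shirt": 10.0, "tie": 4.0, "cufflinks": 2.0}
rules = {("shirt", "tie"): 0.3, ("tie", "cufflinks"): 0.5}
print(effective_values(pure, rules))
# the shirt's effective value includes a share of the tie's, which in turn
# includes a share of the cufflinks' value via the chain
```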