• Title/Summary/Keyword: Codebook methods


Data Cleaning and Integration of Multi-year Dietary Survey in the Korea National Health and Nutrition Examination Survey (KNHANES) using Database Normalization Theory (데이터베이스 정규화 이론을 이용한 국민건강영양조사 중 다년도 식이조사 자료 정제 및 통합)

  • Kwon, Namji;Suh, Jihye;Lee, Hunjoo
    • Journal of Environmental Health Sciences
    • /
    • v.43 no.4
    • /
    • pp.298-306
    • /
    • 2017
  • Objectives: Since 1998, the Korea National Health and Nutrition Examination Survey (KNHANES) has been conducted to investigate the health and nutritional status of Koreans. The food intake data of individuals in the KNHANES have also been utilized as a source dataset for risk assessment of chemicals via food. To improve the reliability of intake estimation and to prevent missing data for less-responded foods, the structure of the integrated long-standing dataset is significant. However, it is difficult to merge multi-year survey datasets due to ineffective cleaning processes for handling the extensive number of codes for each food item, along with changes in dietary habits over time. Therefore, this study aims at 1) cleaning abnormal data, 2) generating integrated long-standing raw data, and 3) contributing to the production of consistent dietary exposure factors. Methods: The codebooks, the guideline book, and raw intake data from KNHANES V and VI were used for analysis. Violations of the primary key constraint and of the first through third normal forms (1NF-3NF) of relational database theory were tested for the codebook and for the structure of the raw data, respectively. Afterwards, the cleaning process was executed on the raw data by using the integrated codes. Results: Duplication of key records and abnormalities in table structures were observed. After adjusting according to the method suggested above, the codes were corrected and new integrated codes were created. Finally, we were able to clean the raw data provided by respondents to the KNHANES survey. Conclusion: The results of this study will contribute to the integration of multi-year datasets and help improve the data production system by clarifying, testing, and verifying the primary key, the integrity of the codes, and the primitive data structure according to database normalization theory in national health data.
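The primary-key test described in the Methods can be illustrated with a short sketch; the field names (`food_code`, `food_name`) and the sample rows are hypothetical, not taken from the actual KNHANES codebooks:

```python
from collections import Counter

def find_duplicate_keys(codebook_rows, key_fields=("food_code",)):
    """Return key tuples that violate the primary-key constraint,
    i.e. appear more than once in the codebook."""
    keys = [tuple(row[f] for f in key_fields) for row in codebook_rows]
    return [k for k, n in Counter(keys).items() if n > 1]

# Hypothetical codebook rows: the same food code mapped to two names.
codebook = [
    {"food_code": "01001", "food_name": "rice, cooked"},
    {"food_code": "01002", "food_name": "barley, cooked"},
    {"food_code": "01001", "food_name": "rice, steamed"},  # duplicate key
]
print(find_duplicate_keys(codebook))  # [('01001',)]
```

Rows flagged this way are exactly the codebook entries that must be corrected or merged before multi-year raw data can be joined on the food code.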

Hole-Filling Method Using Extrapolated Spatio-temporal Background Information (추정된 시공간 배경 정보를 이용한 홀채움 방식)

  • Kim, Beomsu;Nguyen, Tien Dat;Hong, Min-Cheol
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.54 no.8
    • /
    • pp.67-80
    • /
    • 2017
  • This paper presents a hole-filling method using extrapolated spatio-temporal background information to obtain a synthesized view. A new temporal background model using a non-overlapped, patch-based background codebook is introduced to extrapolate temporal background information. In addition, a depth-map-driven spatial local background estimation is addressed to define spatial background constraints that represent the lower and upper bounds of a background candidate. Background holes are filled by comparing the similarities between the temporal background information and the spatial background constraints. Additionally, a depth-map-based ghost removal filter is described to solve the problem of the mismatch between a color image and the corresponding depth map of a virtual view after 3-D warping. Finally, inpainting is applied to fill in the remaining holes with a priority function that includes a new depth term. The experimental results demonstrate that the proposed method yields subjective and objective improvements over state-of-the-art methods.
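A heavily simplified, per-pixel sketch of the hole-filling decision described above (the paper operates on patches with a learned background codebook; here `temporal_bg`, `lower`, and `upper` are assumed to be given):

```python
import numpy as np

def fill_background_holes(warped, hole_mask, temporal_bg, lower, upper):
    """Fill hole pixels with the temporal background value when it lies
    within the spatial background bounds [lower, upper]; holes whose
    background candidate falls outside the bounds are left for inpainting."""
    out = warped.copy()
    plausible = (temporal_bg >= lower) & (temporal_bg <= upper)
    fill = hole_mask & plausible
    out[fill] = temporal_bg[fill]
    remaining = hole_mask & ~plausible  # to be handled by inpainting
    return out, remaining

warped = np.array([[10.0, 0.0], [30.0, 0.0]])
holes = np.array([[False, True], [False, True]])
bg = np.array([[10.0, 20.0], [30.0, 200.0]])  # 200 violates the bounds
out, remaining = fill_background_holes(warped, holes, bg, lower=0.0, upper=100.0)
```

The spatial bounds act as a sanity check on the temporal prediction, which is the core idea of combining the two information sources before falling back to inpainting.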

A Beamformer Construction Method Via Partial Feedback of Channel State Information of MIMO Systems (다중 입출력 시스템의 부분적 채널 정보 궤환을 통한 빔포머 형성 방안)

  • Kim, Yoonsoo;Sung, Wonjin
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.51 no.6
    • /
    • pp.26-33
    • /
    • 2014
  • For LTE-Advanced (and beyond) wireless communication systems, multiple-input multiple-output (MIMO) with an increased number of antennas will be utilized for system throughput improvement. When using such an increased number of antennas, an excessive amount of overhead in channel state information (CSI) feedback can be a serious problem. In this paper, we propose methods that reduce the CSI feedback overhead, particularly including application strategies for multi-rank transmission targeted at two or more reception antennas. To reduce the information that is instantaneously transmitted from the reception node to the transmission node, we present a beamforming method utilizing singular value decomposition (SVD) based on channel estimation of partitioned antenna arrays. Since the SVDs of partial matrices of the channel may lose the characteristics of the original unpartitioned matrix, we explain an appropriate scheme to cope with this problem.
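The partitioned-SVD idea can be sketched as follows; the partition count and the simple concatenate-and-renormalize combining rule are illustrative assumptions, not the paper's exact feedback scheme:

```python
import numpy as np

def partitioned_svd_beamformer(H, n_parts):
    """Form a transmit beamforming vector from SVDs of column-wise
    partitions of the channel matrix H (n_rx x n_tx).  Each partition
    contributes its dominant right singular vector, so only a short
    vector per partition would need to be fed back; the pieces are
    concatenated and renormalized into one beamformer."""
    parts = np.array_split(H, n_parts, axis=1)
    pieces = []
    for Hp in parts:
        _, _, Vh = np.linalg.svd(Hp, full_matrices=False)
        pieces.append(Vh[0].conj())  # dominant right singular vector
    w = np.concatenate(pieces)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(0)
H = rng.standard_normal((2, 8)) + 1j * rng.standard_normal((2, 8))
w = partitioned_svd_beamformer(H, n_parts=2)
```

As the abstract notes, the per-partition SVDs can lose properties of the full channel matrix, which is exactly the mismatch the paper's compensation scheme addresses.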

Image Data Compression Using Biorthogonal Wavelet Transform and Variable Block Size Edge Extraction (쌍직교 웨이브렛 변환과 가변 블럭 윤곽선 추출에 의한 영상 데이타 압축)

  • 김기옥;김재공
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.19 no.7
    • /
    • pp.1203-1212
    • /
    • 1994
  • This paper proposes a variable-block-size vector quantization based on a biorthogonal wavelet transform for image compression. An image is first decomposed with the biorthogonal wavelet transform into a multiresolution image, and the wavelet coefficients of the middle frequency bands are segmented using the quadtree structure to extract the perceptually important regions in those bands. As the edges of the middle frequency bands exist at the corresponding positions of the high frequency bands, the complicated quadtree structure of the middle frequency bands is equally applied to the high frequency bands. Therefore, the overhead information of the quadtree codes needed to segment the high frequency bands can be reduced. The segmented subblocks are encoded with a codebook designed for each scale and direction. The simulation results showed that the proposed method could reproduce higher-quality images at a bit rate reduced by about 20% compared to the preceding VQ method, and could sufficiently reduce the block effect and the edge degradation.
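The variance-driven quadtree split behind variable block sizes can be sketched as below; using block variance as the split criterion is an assumption standing in for the paper's perceptual-importance measure on wavelet coefficients:

```python
import numpy as np

def quadtree_segment(band, threshold, min_size=2):
    """Recursively split a square subband into blocks until the block
    variance falls below `threshold` or `min_size` is reached.
    Returns (y, x, size) tuples, mimicking variable-block-size
    segmentation for VQ."""
    blocks = []
    def split(y, x, size):
        blk = band[y:y + size, x:x + size]
        if size <= min_size or blk.var() <= threshold:
            blocks.append((y, x, size))
        else:
            h = size // 2
            for dy in (0, h):
                for dx in (0, h):
                    split(y + dy, x + dx, h)
    split(0, 0, band.shape[0])
    return blocks

band = np.zeros((4, 4))
band[:2, :2] = 10.0  # one "edge" quadrant forces a split
blocks = quadtree_segment(band, threshold=0.1)
# blocks == [(0, 0, 2), (0, 2, 2), (2, 0, 2), (2, 2, 2)]
```

Reusing the same split map for the co-located high-frequency bands is what saves the quadtree side information mentioned in the abstract.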


A Fast Encoding Algorithm for Image Vector Quantization Based on Prior Test of Multiple Features (복수 특징의 사전 검사에 의한 영상 벡터양자화의 고속 부호화 기법)

  • Ryu Chul-hyung;Ra Sung-woong
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.30 no.12C
    • /
    • pp.1231-1238
    • /
    • 2005
  • This paper presents a new fast encoding algorithm for image vector quantization that incorporates the partial distances of multiple features with a multidimensional look-up table (LUT). Although earlier methods also use multiple features, they handle them step by step in terms of search order and calculation. In contrast, the proposed algorithm utilizes these features simultaneously via the LUT. This paper fully describes how to build the LUT while considering the boundary effect for a feasible memory cost, and how to terminate the current search by utilizing the partial distances stored in the LUT. Simulation results confirm the effectiveness of the proposed algorithm. When the codebook size is 256, the computational complexity of the proposed algorithm can be reduced to as little as 70% of the operations required by recently proposed alternatives such as the ordered Hadamard transform partial distance search (OHTPDS), the modified $L_2$-norm pyramid (M-$L_2$NP), etc. With feasible preprocessing time and memory cost, the proposed algorithm reduces the computational complexity to below 2.2% of that required by the exhaustive full search (EFS) algorithm, while preserving the same encoding quality as the EFS algorithm.
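The plain partial distance search that the LUT-based termination builds on can be sketched in a few lines (without the multidimensional LUT or the multiple-feature prior tests of the paper):

```python
def pds_encode(x, codebook):
    """Encode vector x by partial distance search: accumulate the
    squared distance dimension by dimension and abandon a codeword as
    soon as the running sum exceeds the best distance found so far.
    Returns the index of the nearest codeword, identical to the result
    of an exhaustive full search."""
    best_idx, best_d = 0, float("inf")
    for i, c in enumerate(codebook):
        d = 0.0
        for xj, cj in zip(x, c):
            d += (xj - cj) ** 2
            if d >= best_d:
                break  # early termination: this codeword cannot win
        else:
            best_idx, best_d = i, d
    return best_idx

codebook = [[0, 0], [1, 1], [5, 5]]
print(pds_encode([0.9, 1.2], codebook))  # 1
```

Because the early exit only discards codewords whose distance is provably not the minimum, encoding quality is unchanged; the paper's contribution is making that rejection happen even earlier via feature-based LUT lookups.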