• Title/Summary/Keyword: Image Decoding


Fast Extraction of Edge Histogram in DCT Domain based on MPEG-7 (MPEG-7 기반 DCT영역에서의 에지히스토그램 고속 추출 기법)

  • Eom Min-Young;Choe Yoon-Sik;Won Chee-Sun;Nam Jae-Yeal
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.43 no.4 s.310
    • /
    • pp.19-26
    • /
    • 2006
  • Nowadays, multimedia data are transmitted and processed in compressed format. Due to the decoding procedure and the filtering required for edge detection, the feature extraction process of the MPEG-7 Edge Histogram Descriptor (EHD) is time consuming as well as computationally expensive. To improve the efficiency of compressed image retrieval, we propose a new edge histogram generation algorithm in the DCT domain. Using the edge information provided by only two AC coefficients among the DCT coefficients, we can obtain edge directions and strengths directly in the DCT domain. The experimental results demonstrate that our system has good performance in terms of retrieval efficiency and effectiveness.
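
The idea of reading edge direction and strength straight from the block transform can be illustrated with a short sketch. The Python fragment below is a minimal illustration, not the paper's calibrated extractor: it assumes only the two lowest AC coefficients, AC(0,1) and AC(1,0), of each 8x8 DCT block are used, and the strength threshold and angle bins are invented for the example (the MPEG-7 non-directional edge type is omitted).

```python
import numpy as np

EDGE_TYPES = ("vertical", "horizontal", "diag45", "diag135")

def classify_block(dct_block, strength_threshold=50.0):
    """Return an edge type for one 8x8 block of DCT coefficients, or None."""
    ac_v = dct_block[0, 1]   # horizontal frequency -> vertical edge content
    ac_h = dct_block[1, 0]   # vertical frequency -> horizontal edge content
    strength = np.hypot(ac_v, ac_h)
    if strength < strength_threshold:
        return None                              # treat as a no-edge block
    angle = np.degrees(np.arctan2(ac_h, ac_v)) % 180.0
    if angle < 22.5 or angle >= 157.5:
        return "vertical"
    if angle < 67.5:
        return "diag45"
    if angle < 112.5:
        return "horizontal"
    return "diag135"

def edge_histogram(dct_blocks):
    """Accumulate per-type edge counts over the blocks of one sub-image."""
    hist = dict.fromkeys(EDGE_TYPES, 0)
    for block in dct_blocks:
        edge_type = classify_block(np.asarray(block))
        if edge_type is not None:
            hist[edge_type] += 1
    return hist
```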

Improvement to Video Display Time Delay when TV Channel switching in Variable Bit Rate Mode of Terrestrial MMS (지상파 MMS 가변 비트율 모드 방송에서 TV 채널 전환 시 발생하는 영상 표출 시간 지연의 개선)

  • Park, Sung-hwan;Chang, Hae-rang;Jeon, Hyoung-joon;Kwon, Soon-chul;Lee, Seung-hyun
    • Journal of Digital Contents Society
    • /
    • v.16 no.5
    • /
    • pp.775-781
    • /
    • 2015
  • EBS started 2HD MMS experimental broadcasting for the first time in Korea on Feb. 11, 2015. It uses a picture compression technique based on the MPEG-2 codec and applies the results of experiments on variable bit rates and on changes according to the scanning types, 1080i and 720p. When changing channels, however, a delay in displaying the picture occurs because of the variable GOP operation in MMS broadcasting, which optimizes image quality by applying variable bit rates. In this study, we verified the relationship between the decoding time of I frames and the GOP set in the encoding step by experimenting on and analyzing the on-air TS. By using the verification data and adjusting the encoder GOP parameters, we improved the video display time delays that differ according to the scanning modes 1080i and 720p.
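
As a rough illustration of why a longer (variable) GOP stretches the channel-change delay, the back-of-the-envelope model below assumes the viewer tunes in at a uniformly random point inside a GOP and must wait for the next I frame plus its decoding time. The GOP lengths and the decode time are hypothetical numbers for the example, not the paper's measurements.

```python
def expected_tune_in_delay(gop_frames, frame_rate_hz, i_frame_decode_s=0.05):
    """Expected seconds from channel switch to the first displayed picture."""
    gop_duration = gop_frames / frame_rate_hz
    mean_wait_for_i = gop_duration / 2.0      # uniform tune-in point within a GOP
    return mean_wait_for_i + i_frame_decode_s

# e.g. a fixed 60-frame GOP vs a long variable GOP of 180 frames at 60 fps
print(expected_tune_in_delay(60, 60.0))    # ~0.55 s
print(expected_tune_in_delay(180, 60.0))   # ~1.55 s
```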

Digital Watermarking using the Channel Coding Technique (채널 코딩 기법을 이용한 디지털 워터마킹)

  • Bae, Chang-Seok;Choi, Jae-Hoon;Seo, Dong-Wan;Choe, Yoon-Sik
    • The Transactions of the Korea Information Processing Society
    • /
    • v.7 no.10
    • /
    • pp.3290-3299
    • /
    • 2000
  • Digital watermarking shares concepts with channel coding techniques for transferring data with minimal error in a noisy environment, since it should be robust to various kinds of data manipulation in order to protect the copyright of multimedia data. This paper proposes a digital watermarking technique which is robust to various kinds of data manipulation. Intellectual property rights information is encoded using a convolutional code, and a block-interleaving technique is applied to prevent successive loss of the encoded data. The encoded intellectual property rights information is embedded using a spread spectrum technique which is robust to data manipulation. In order to reconstruct the intellectual property rights information, the watermark signal is detected via the covariance between the watermarked image and the pseudo-random noise sequence used to embed the watermark. The embedded intellectual property rights information is then obtained by de-interleaving and decoding the detected watermark signal. Experimental results show that the block-interleaving watermarking technique detects the embedded intellectual property rights information more reliably under attacks such as Gaussian noise addition, filtering, and JPEG compression than the general spread spectrum technique at the same PSNR.
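
A compact sketch of the pipeline described above follows: convolutionally encode the rights information, block-interleave it, embed it as an additive spread-spectrum watermark, and detect each bit by correlating with the same pseudo-random sequence. The rate-1/2 generators (7, 5 octal), the chip length, and the embedding strength are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder, constraint length 3."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & 0b111
        out.append(bin(state & g1).count("1") % 2)
        out.append(bin(state & g2).count("1") % 2)
    return out

def block_interleave(bits, rows=8):
    """Write row-wise, read column-wise to break up burst losses."""
    cols = -(-len(bits) // rows)
    padded = bits + [0] * (rows * cols - len(bits))
    return np.array(padded).reshape(rows, cols).T.reshape(-1).tolist()

def embed(image, bits, chip_len=256, alpha=2.0, seed=1):
    """Add one PN chip sequence per coded bit to the flattened image."""
    rng = np.random.default_rng(seed)
    pn = rng.choice([-1.0, 1.0], size=(len(bits), chip_len))
    marked = image.astype(np.float64).reshape(-1).copy()
    assert len(bits) * chip_len <= marked.size, "image too small for payload"
    for i, b in enumerate(bits):
        sign = 1.0 if b else -1.0
        marked[i * chip_len:(i + 1) * chip_len] += alpha * sign * pn[i]
    return marked.reshape(image.shape), pn

def detect(marked, pn, chip_len=256):
    """Recover each bit from the sign of the covariance with its PN chips."""
    flat = marked.astype(np.float64).reshape(-1)
    bits = []
    for i in range(pn.shape[0]):
        seg = flat[i * chip_len:(i + 1) * chip_len]
        bits.append(1 if np.dot(seg - seg.mean(), pn[i]) > 0 else 0)
    return bits
```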


A Study on Error Resilience of Header Parameters considering the activity of macroblock (매크로블록의 활동성을 고려한 헤더정보의 오류 복원에 관한 연구)

  • Kim, Jong-Hoon
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.12 no.5
    • /
    • pp.837-842
    • /
    • 2008
  • Errors generated in channels and networks distort the video quality and propagate in both the spatial and temporal domains. In particular, header errors can result in serious visual degradation of the output video, so encoding/decoding schemes that produce an error-resilient compressed bit-stream are needed in error-prone environments such as mobile networks. In this paper, we propose a header error resilience method that considers the activity of macroblocks in the video bitstream syntax. The proposed method repeatedly embeds the header parameters into the least significant bits (LSBs) of the quantized DCT coefficients prior to VLC. Experimental results show that the proposed error resilience method restores good image quality despite detected errors in the header parameters.
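
The embedding step can be sketched as follows. Which coefficients carry the bits, and how many times the header repeats, are assumptions made for the example rather than the paper's configuration; the extractor simply majority-votes each header bit over its repetitions.

```python
import numpy as np

def embed_header_bits(quant_coeffs, header_bits, positions):
    """Overwrite the LSB of the chosen quantized coefficients with header bits."""
    coeffs = quant_coeffs.copy()
    for i, pos in enumerate(positions):
        bit = header_bits[i % len(header_bits)]      # repeat the header cyclically
        coeffs[pos] = (coeffs[pos] & ~1) | bit
    return coeffs

def extract_header_bits(quant_coeffs, n_bits, positions):
    """Majority-vote each header bit over its repeated embeddings."""
    votes = np.zeros(n_bits, dtype=int)
    counts = np.zeros(n_bits, dtype=int)
    for i, pos in enumerate(positions):
        votes[i % n_bits] += int(quant_coeffs[pos]) & 1
        counts[i % n_bits] += 1
    return (votes * 2 > counts).astype(int)
```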

Transform domain Wyner-Ziv Coding based on the frequency-adaptive channel noise modeling (주파수 적응 채널 잡음 모델링에 기반한 변환영역 Wyner-Ziv 부호화 방법)

  • Kim, Byung-Hee;Ko, Bong-Hyuck;Jeon, Byeung-Woo
    • Journal of Broadcast Engineering
    • /
    • v.14 no.2
    • /
    • pp.144-153
    • /
    • 2009
  • Recently, as the need for a lightweight video encoding technique has been rising for applications such as UCC (User Created Contents) and multiview video, Distributed Video Coding (DVC), in which the decoder rather than the encoder performs the motion estimation/compensation that takes up most of the computational complexity, has been vigorously investigated. Wyner-Ziv coding reconstructs an image by using a channel code to eliminate the noise in the side information, which is the decoder-side prediction of the original image. Generally, the side information in Wyner-Ziv coding is generated by frame interpolation between key frames, and a channel code such as a Turbo code or an LDPC code, whose performance is close to the Shannon limit, is employed. The noise model used for channel decoding in Wyner-Ziv coding is called the virtual channel noise and is generally modeled by a Laplacian or Gaussian distribution. In this paper, we propose a Wyner-Ziv coding method based on frequency-adaptive channel noise modeling in the transform domain. Experimental results with various sequences show that the proposed method makes the channel noise model more accurate than the conventional scheme, improving the rate-distortion performance by up to 0.52 dB.
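
A minimal sketch of frequency-adaptive noise modelling is shown below: a separate Laplacian scale parameter is fitted for every transform band of the residual between the side information and a reference, and the resulting conditional density is what a Turbo/LDPC decoder would use as soft input. The 4x4 transform size and the variance-based estimator are assumptions for illustration, not necessarily the paper's estimator.

```python
import numpy as np

def laplacian_alpha_per_band(residual_blocks):
    """residual_blocks: (N, 4, 4) transform-domain residual blocks.

    For a zero-mean Laplacian with variance s^2 the scale is alpha = sqrt(2/s^2);
    one alpha is returned for each of the 16 frequency bands.
    """
    band_var = np.var(np.asarray(residual_blocks, dtype=np.float64), axis=0)
    return np.sqrt(2.0 / np.maximum(band_var, 1e-6))

def virtual_channel_pdf(x, y, alpha):
    """Laplacian virtual-channel model p(x | side information y) for one band."""
    return 0.5 * alpha * np.exp(-alpha * np.abs(x - y))
```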

Detecting near-duplication Video Using Motion and Image Pattern Descriptor (움직임과 영상 패턴 서술자를 이용한 중복 동영상 검출)

  • Jin, Ju-Kyong;Na, Sang-Il;Jenong, Dong-Seok
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.48 no.4
    • /
    • pp.107-115
    • /
    • 2011
  • In this paper, we propose a fast and efficient algorithm for detecting near-duplicate videos based on content-based retrieval in a large-scale video database. To handle large amounts of video easily, we split each video into small segments using scene change detection. For video services and copyright-related business models, a technology is needed that detects near-duplicates, that is, longer matched videos, rather than merely searching for videos containing a short part or a single frame of the original. To detect near-duplicate videos, we propose a motion distribution descriptor and a frame descriptor for each video segment. The motion distribution descriptor is constructed from the motion vectors of macroblocks obtained during the video decoding process. When matching descriptors, the motion distribution descriptor is used as a filter to improve matching speed. However, the motion distribution descriptor alone has low discriminability, so identification is performed using frame descriptors extracted from representative frames selected within each scene segment. The proposed algorithm shows a high success rate and a low false alarm rate. In addition, matching with these descriptors is very fast, and we confirm that the algorithm is useful for practical applications.
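
The two-stage matching can be sketched as below: a cheap motion-distribution histogram built from decoder motion vectors filters candidate segments, and only the survivors are compared with a frame descriptor. The bin counts, thresholds, L1 distance, and the dictionary layout of a segment's descriptors are illustrative assumptions.

```python
import numpy as np

def motion_distribution_descriptor(motion_vectors, n_angle_bins=8, n_mag_bins=4):
    """motion_vectors: (N, 2) array of per-macroblock (dx, dy) from the decoder."""
    mv = np.asarray(motion_vectors, dtype=np.float64)
    angles = np.arctan2(mv[:, 1], mv[:, 0]) % (2 * np.pi)
    mags = np.minimum(np.hypot(mv[:, 0], mv[:, 1]), 31.0)
    hist, _, _ = np.histogram2d(
        angles, mags,
        bins=[n_angle_bins, n_mag_bins],
        range=[[0, 2 * np.pi], [0, 32.0]],
    )
    hist = hist.ravel()
    return hist / max(hist.sum(), 1.0)

def match_segment(query, candidates, motion_thresh=0.4):
    """query/candidates: dicts with 'motion' and 'frame' descriptor arrays."""
    survivors = []
    for idx, cand in enumerate(candidates):
        motion_cost = np.abs(query["motion"] - cand["motion"]).sum()
        if motion_cost < motion_thresh:                  # cheap first-stage filter
            frame_cost = np.abs(query["frame"] - cand["frame"]).sum()
            survivors.append((idx, frame_cost))          # precise second stage
    return sorted(survivors, key=lambda t: t[1])
```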

Efficient Coding of Motion Vector Predictor using Phased-in Code (Phased-in 코드를 이용한 움직임 벡터 예측기의 효율적인 부호화 방법)

  • Moon, Ji-Hee;Choi, Jung-Ah;Ho, Yo-Sung
    • Journal of Broadcast Engineering
    • /
    • v.15 no.3
    • /
    • pp.426-433
    • /
    • 2010
  • The H.264/AVC video coding standard performs inter prediction using variable block sizes to improve coding efficiency. Since we predict not only the motion of homogeneous regions but also the motion of non-homogeneous regions accurately using variable block sizes, we can reduce residual information effectively. However, each motion vector must be transmitted to the decoder, and in low bit rate environments motion vector information takes approximately 40% of the total bitstream. Thus, motion vector competition was proposed to reduce the amount of motion vector information. Since the size of the motion vector difference is reduced by motion vector competition, only a small number of bits is required for motion vector information. However, the index of the best motion vector predictor must also be sent for decoding. In this paper, we propose a new codeword table based on the phased-in code to encode the index of the motion vector predictor efficiently. Experimental results show that the proposed algorithm reduces the average bit rate by 7.24% for similar PSNR values, and it improves the average image quality by 0.36 dB at similar bit rates.
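
A phased-in (truncated binary) code assigns shorter codewords to some symbols when the alphabet size is not a power of two, which is why it suits signalling one of a small, variable number of predictor candidates. The sketch below shows the generic construction; the paper's actual codeword table may differ.

```python
def phased_in_codeword(index, n):
    """Return the codeword (as a bit string) for symbol `index` out of `n`."""
    assert 0 <= index < n
    k = n.bit_length() - 1            # floor(log2(n))
    u = (1 << (k + 1)) - n            # number of short (k-bit) codewords
    if index < u:
        return format(index, "0{}b".format(k)) if k > 0 else ""
    return format(index + u, "0{}b".format(k + 1))

# e.g. 3 predictor candidates: index 0 -> "0", index 1 -> "10", index 2 -> "11"
print([phased_in_codeword(i, 3) for i in range(3)])
```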

Applications of Regularized Dequantizers for Compressed Images (압축된 영상에서 정규화 된 역양자화기의 응용)

  • Lee, Gun-Ho;Sung, Ju-Seung;Song, Moon-Ho
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.39 no.5
    • /
    • pp.11-20
    • /
    • 2002
  • Based on regularization principles, we propose a new dequantization scheme for DCT-based transform coding that reduces blocking artifacts and minimizes the quantization error. Conventional image dequantization simply multiplies the received quantized DCT coefficients by the quantization matrix; therefore, for each DCT coefficient, the quantization noise can be as large as half the quantizer step size (in the DCT domain). Our approach is based on the basic constraints that the quantization error is bounded to ±(quantizer spacing)/2 and that, at the least, there should be no high-frequency components corresponding to discontinuities across the block boundaries of the image. Through regularization, the proposed dequantization scheme sharply reduces blocking artifacts in decoded images while guaranteeing that the dequantized DCT coefficients remain consistent with the quantization constraint. The proposed scheme is evaluated against the standard JPEG, MPEG-1, and H.263 (with Annex J deblocking filter) decoding processes. The experimental results show visual as well as numerical improvements in terms of the peak signal-to-noise ratio (PSNR) and a blockiness measure (BM).
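
The constraint-plus-smoothness idea can be sketched as an iterative projection: smooth the decoded image, then clip each block's DCT coefficients back into the quantization cell |c − q·level| ≤ q/2 implied by the received data. In the SciPy sketch below the Gaussian filter stands in for the paper's regularizing functional, and the iteration count and filter strength are assumptions for illustration.

```python
import numpy as np
from scipy.fft import dctn, idctn
from scipy.ndimage import gaussian_filter

def blocks_to_image(blocks):
    by, bx = blocks.shape[:2]
    return blocks.transpose(0, 2, 1, 3).reshape(by * 8, bx * 8)

def image_to_blocks(image):
    h, w = image.shape
    return image.reshape(h // 8, 8, w // 8, 8).transpose(0, 2, 1, 3)

def regularized_dequantize(levels, qmatrix, n_iters=5, sigma=0.8):
    """levels: (by, bx, 8, 8) quantized DCT levels; qmatrix: (8, 8) step sizes."""
    centers = levels.astype(np.float64) * qmatrix        # conventional dequantization
    coeffs = centers.copy()
    for _ in range(n_iters):
        pixels = blocks_to_image(
            np.stack([[idctn(b, norm="ortho") for b in row] for row in coeffs]))
        smoothed = gaussian_filter(pixels, sigma)         # regularization step
        blocks = image_to_blocks(smoothed)
        coeffs = np.stack([[dctn(b, norm="ortho") for b in row] for row in blocks])
        # projection: stay within half a quantizer step of the received level
        coeffs = np.clip(coeffs, centers - qmatrix / 2.0, centers + qmatrix / 2.0)
    return coeffs
```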

Playing with Rauschenberg: Re-reading Rebus (라우센버그와 게임하기-<리버스> 다시읽기)

  • Rhee, Ji-Eun
    • The Journal of Art Theory & Practice
    • /
    • no.2
    • /
    • pp.27-48
    • /
    • 2004
  • Robert Rauschenberg's artistic career has often been regarded as having reached its culmination when the artist won the first prize at the 1964 Venice Biennale. With this victory, Rauschenberg triumphantly entered the pantheon of all-American artists and firmly secured his position in the history of American art. On the other hand, despite the artist's ongoing new experiments in his art, the seemingly precocious ripeness in his career has led the critical discourses on Rauschenberg's art to the artist's early works, most of which were done in the mid-1950s and the 1960s. The crux of Rauschenberg criticism lies not only in focusing on the artist's 50's and 60's works, but also in its large dismissal of the significance of the imagery that the artist employed in his works. As art historians Roger Cranshaw and Adrian Lewis point out, the critical discourse of Rauschenberg either focuses on the formalist concerns on the picture plane, or relies on the "culturalist" interpretation of Rauschenberg's imagery which emphasizes the artist's "Americanness." Recently, a group of art historians centered around October has applied Charles Sanders Peirce's semiotics as art historical methodology and illuminated the indexical aspects of Rauschenberg's work. The semantic inquiry into Rauschenberg's imagery has also been launched by some art historians who seek clues in the artist's personal context. The first half of this essay will examine the previous criticism on Rauschenberg's art, and the other half will discuss the artist's 1955 work Rebus, which I think intersects various critical concerns of Rauschenberg's work and yet defies the closure of discourses in one direction. The categories of signs in the semiotics of Charles Sanders Peirce and the discourse of Jean-Francois Lyotard will be used in discussing the meanings of Rebus, not to search for the semantic readings of the work, but to make an analogy in terms of the paradoxical structures of both the work and the theory. The definition of rebus is as follows: Rebus 1. a representation of words or syllables by pictures of objects or by symbols whose names resemble the intended words or syllables in sound; also: a riddle made up wholly or in part of such pictures or symbols. 2. a badge that suggests the name of the person to whom it belongs. Webster's Third New International Dictionary of the English Language Unabridged. Since its creation in 1955, Robert Rauschenberg's Rebus has been one of the most intriguing works in the artist's oeuvre. This monumental 'combine' painting (6 feet × 10 feet 10.5 inches) consists of three panels covered with fabric, paper, newspaper, and printed reproductions. On top of these, oil paints, pencil and crayon drawings connect each section into a whole. The layout of the images is overall horizontal. Starting from a torn election poster, which partially reads "THAT REPRE," on the far left side of the painting, Rebus leads us to proceed from left to right, the typical direction of reading in a Western context. Along with its seemingly proper title, Rebus, the painting has triggered many art historians to seek semantic readings of it. These art historians painstakingly reconstruct the iconography based on the artist's interviews, (auto)biography, and the artistic context of his works. The interpretation of Rebus varies from an 'image-by-image' collation with a word to a more general commentary on Rauschenberg's work overall, such as a work that "bridges between art and life."
Despite the title's allusion to the legitimate purpose of the painting as a decoding of the imagery into sound, Rebus, I argue, actually hinders a reading of it. By reading through Peirce to Rauschenberg, I will delve into the subtle anxiety between words and images in their works. And on this basis, I suggest Rauschenberg's strategy in playing Rebus is to hide the meaning of the imagery rather than to disclose it.


An essay on appraisal method over official administration records ill-balanced. -For development of appraisal process and method over chosun government-general office records- (불균형 잔존 행정기록의 평가방법 시론 - 조선총독부 공문서의 평가절차론 수립을 위하여 -)

  • Kim, Ik-Han
    • The Korean Journal of Archival Studies
    • /
    • no.13
    • /
    • pp.179-203
    • /
    • 2006
  • This study develops a process and method for appraising official administrative records that have survived in an unbalanced way, such as the official documents of the Government-General of Chosun (the Japanese colonial government, 1910-1945). First, the existing appraisal theories are recomposed. Schellenberg's appraisal theory focuses on evaluating the value of the records themselves, whereas functional appraisal theory attaches importance to the operational activities that bring the records into being. However, given that a record is a re-presentation of operational activities, the two are the same at the philosophical level. Therefore, if the process and method are properly designed, a composite approach covering both operational activities and records is possible. Also, the curve method has its strong points in the macro and balanced aspects, while the absolute method has its strength in the micro aspect, so both methodologies can be applied alternately in this study. Hence, in terms of specific appraisal methodology, the existing appraisal theories are concluded to be mutually complementary and can be combined in various forms according to the characteristics of the object and its situation. In particular, for this article, which deals with unevenly surviving official documents, it is more appropriate to combine a properly designed process with the useful methods indicated above than to establish a method and process from only one theory. In order to appraise the official documents of the Government-General of Chosun (1910-1945), a macro appraisal of their value has to be carried out by understanding the system and functions and by using the historical-cultural evolution, after analysing the disposal authority. From this, the records are mapped so that organization-function maps are constructed according to the value rank of functions and detailed functions. After this, an appraisal strategy is established that considers the internal environment of archival agencies, is based on micro appraisal of the great quantity of records that remain, and supplies other meanings, for example oral resource production, for the small quantity of records that remain. The study has not yet addressed the following aspects: function analysis, historical decoding techniques, curve valuation of the records, the official gazette of the Government-General of Chosun (1910-1945), an analysis method and process for other historical materials, and presentation of the appraisal output image. As a result, this is simply a proposal, and the above-mentioned shortcomings of the study should be filled in through future studies.