• Title/Summary/Keyword: Hierarchical Compression


Hierarchical Compression Technique for Reflectivity Data of Weather Radar (기상레이더 반사도 자료의 계층적 압축 기법)

  • Jang, Bong-Joo;Lee, Keon-Haeng;Lim, Sanghun;Kwon, Ki-Ryong
    • Journal of Korea Multimedia Society
    • /
    • v.18 no.7
    • /
    • pp.793-805
    • /
    • 2015
  • Nowadays, the amount of data obtained from advanced weather radars is growing to provide higher spatio-temporal resolution. Accordingly, radar data compression is important for using limited network bandwidth and storage effectively. In this paper, we propose a hierarchical compression method for weather radar data with high spatio-temporal resolution. The method is applied to radar reflectivity and evaluated in terms of the accuracy of quantitative rainfall intensity. The technique provides three compression levels from a single compressed stream for three radar user groups: signal processors, quality controllers, and weather analysts. Experimental results show that the method achieves compression rates between 13% and 33% and performs about 25% better than a general-purpose compression technique such as gzip.
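The abstract above gives no implementation details, so the sketch below is only a generic illustration of the layered idea it describes, not the authors' codec: a coarse base layer plus refinement residuals are packed into one stream, and a decoder (say, a signal processor versus a weather analyst) may stop after any layer. The function names, the 2x down-sampling, and the use of zlib are assumptions.

```python
# Hypothetical sketch of a layered ("hierarchical") compression scheme:
# a coarse layer plus refinement residuals are packed into one stream,
# and a decoder may stop after any layer. Not the authors' actual codec.
import zlib
import numpy as np

def encode_layers(field, levels=3):
    """Encode a 2-D reflectivity field into `levels` zlib-compressed chunks."""
    layers, current = [], field.astype(np.int16)
    pyramid = [current]
    for _ in range(levels - 1):
        current = current[::2, ::2]          # naive 2x down-sampling
        pyramid.append(current)
    pyramid.reverse()                        # coarsest level first
    prev = None
    for level in pyramid:
        if prev is None:
            payload = level                  # base layer
        else:
            up = np.kron(prev, np.ones((2, 2), dtype=np.int16))
            up = up[:level.shape[0], :level.shape[1]]
            payload = level - up             # refinement residual
        layers.append(zlib.compress(payload.tobytes()))
        prev = level
    return layers

def decode_layers(layers, shapes):
    """Reconstruct up to len(layers) levels; fewer layers give a coarser field."""
    field = None
    for chunk, shape in zip(layers, shapes):
        data = np.frombuffer(zlib.decompress(chunk), dtype=np.int16).reshape(shape)
        if field is None:
            field = data
        else:
            up = np.kron(field, np.ones((2, 2), dtype=np.int16))
            field = up[:shape[0], :shape[1]] + data
    return field

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    refl = rng.integers(-32, 96, size=(64, 64))          # synthetic dBZ-like values
    layers = encode_layers(refl, levels=3)
    full = decode_layers(layers, [(16, 16), (32, 32), (64, 64)])
    assert np.array_equal(full, refl.astype(np.int16))   # lossless when all layers are used
```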

Multi-resolution Lossless Image Compression for Progressive Transmission and Multiple Decoding Using an Enhanced Edge Adaptive Hierarchical Interpolation

  • Biadgie, Yenewondim;Kim, Min-sung;Sohn, Kyung-Ah
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.11 no.12
    • /
    • pp.6017-6037
    • /
    • 2017
  • In a multi-resolution image encoding system, the image is encoded into a single file as layers of bit streams and then transmitted layer by layer progressively to reduce the transmission time across a low-bandwidth connection. This encoding scheme is also suitable for multiple decoders with different capabilities, ranging from a handheld device to a PC. In our previous work, we proposed an edge adaptive hierarchical interpolation algorithm for a multi-resolution image coding system. In this paper, we enhance its compression efficiency by adding three major components. First, its prediction accuracy is improved using context adaptive error modeling as feedback. Second, the conditional probability of prediction errors is sharpened by removing the sign redundancy among local prediction errors through sign flipping. Third, the conditional probability is sharpened further by reducing the number of distinct error symbols using an error remapping function. Experimental results on benchmark data sets reveal that the enhanced algorithm achieves a better compression bit rate than our previous algorithm and other algorithms. The compression bit rate is much better for images that are rich in directional edges and textures. The enhanced algorithm also shows better rate-distortion performance and visual quality at the intermediate stages of progressive image transmission.
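As a rough illustration of two of the components named in the abstract, the snippet below shows one possible form of sign flipping driven by causal neighbour errors and an error-remapping function that folds signed errors into a non-negative symbol alphabet. It is not the authors' code; the decision rule, the two-pixel neighbourhood, and the mapping are assumptions.

```python
# Illustrative sketch, not the paper's implementation.
import numpy as np

def sign_flip(errors):
    """Flip an error's sign when its causal neighbours' errors lean negative.

    Only already-known (causal) errors drive the decision, so a decoder
    processing pixels in the same raster order can undo the flip."""
    flipped = errors.copy()
    for y in range(errors.shape[0]):
        for x in range(errors.shape[1]):
            causal = []
            if x > 0:
                causal.append(errors[y, x - 1])   # west neighbour
            if y > 0:
                causal.append(errors[y - 1, x])   # north neighbour
            if causal and sum(np.sign(causal)) < 0:
                flipped[y, x] = -errors[y, x]
    return flipped

def remap(e):
    """Map a signed error to a non-negative symbol (0, -1, 1, -2, 2 -> 0, 1, 2, 3, 4)."""
    return 2 * e if e >= 0 else -2 * e - 1

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    errs = rng.integers(-4, 5, size=(4, 4))
    symbols = np.vectorize(remap)(sign_flip(errs))
    print(symbols)                                # entropy coder input symbols
```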

Edge Adaptive Hierarchical Interpolation for Lossless and Progressive Image Transmission

  • Biadgie, Yenewondim;Wee, Young-Chul;Choi, Jung-Ju
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.5 no.11
    • /
    • pp.2068-2086
    • /
    • 2011
  • Based on the quincunx sub-sampling grid, the New Interleaved Hierarchical INTerpolation (NIHINT) method is recognized as a superior pyramid data structure for the lossless and progressive coding of natural images. In this paper, we propose a new image interpolation algorithm, Edge Adaptive Hierarchical INTerpolation (EAHINT), for a further reduction in the entropy of interpolation errors. We compute the local variance of the causal context to model the strength of a local edge around a target pixel and then apply three statistical decision rules to classify the local edge into a strong edge, a weak edge, or a medium edge. According to these local edge types, we apply an interpolation method to the target pixel using a one-directional interpolator for a strong edge, a multi-directional adaptive weighting interpolator for a medium edge, or a non-directional static weighting linear interpolator for a weak edge. Experimental results show that the proposed algorithm achieves a better compression bit rate than the NIHINT method for lossless image coding. It is shown that the compression bit rate is much better for images that are rich in directional edges and textures. Our algorithm also shows better rate-distortion performance and visual quality for progressive image transmission.
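A minimal sketch of the classification step described in the abstract, assuming hypothetical variance thresholds t_weak and t_strong (the paper derives its decision rules statistically) and simple stand-in interpolators for the three edge classes:

```python
# Hedged sketch of EAHINT-style edge classification; thresholds and
# interpolator weights are assumptions, not the paper's values.
import numpy as np

def classify_edge(context, t_weak=25.0, t_strong=400.0):
    """Return 'weak', 'medium' or 'strong' from the variance of the causal context."""
    v = float(np.var(context))
    if v < t_weak:
        return "weak"
    if v > t_strong:
        return "strong"
    return "medium"

def interpolate(neighbours, kind):
    """Predict the target pixel from its four diagonal neighbours (quincunx grid)."""
    nw, ne, sw, se = (float(p) for p in neighbours)
    if kind == "strong":
        # one-directional: follow the diagonal with the smaller gradient
        return (nw + se) / 2 if abs(nw - se) < abs(ne - sw) else (ne + sw) / 2
    if kind == "medium":
        # multi-directional adaptive weighting: weight each diagonal by the
        # inverse of its gradient magnitude
        w1 = 1.0 / (1.0 + abs(nw - se))
        w2 = 1.0 / (1.0 + abs(ne - sw))
        return (w1 * (nw + se) / 2 + w2 * (ne + sw) / 2) / (w1 + w2)
    # weak edge: non-directional static weighting (plain average)
    return (nw + ne + sw + se) / 4

if __name__ == "__main__":
    ctx = [120, 122, 119, 200, 201, 198]   # causal context samples
    nbrs = (120, 200, 122, 198)            # NW, NE, SW, SE neighbours
    kind = classify_edge(ctx)
    print(kind, interpolate(nbrs, kind))
```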

Color Image Segmentation using Hierarchical Histogram (계층적 히스토그램을 이용한 컬러영상분할)

  • 김소정;정경훈
    • Proceedings of the IEEK Conference
    • /
    • 2003.07e
    • /
    • pp.1771-1774
    • /
    • 2003
  • Image segmentation is a very important preprocessing technique. It is used in various applications such as object recognition, computer vision, and object-based image compression. In this paper, a method that segments a multidimensional image using a hierarchical histogram approach is proposed. The hierarchical histogram approach decomposes a multi-dimensional problem into multiple levels of one-dimensional problems. It has the advantage of rapid and easy histogram calculation, and because the histogram is applied at each level rather than to the whole space at once, it allows a more detailed partitioning.
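As a loose illustration of the hierarchical histogram idea, the sketch below splits pixels level by level using one 1-D histogram per colour channel instead of clustering in 3-D colour space; the 32-bin histogram, the valley threshold, and the minimum group size are invented details, not the authors' procedure.

```python
# Loose sketch of level-by-level 1-D histogram splitting for colour segmentation.
import numpy as np

def split_by_channel(pixels, channel):
    """Split a pixel set in two using the 1-D histogram of one colour channel."""
    values = pixels[:, channel]
    hist, edges = np.histogram(values, bins=32)
    valley = 1 + int(np.argmin(hist[1:-1]))   # crude threshold: deepest interior valley
    threshold = edges[valley]
    return pixels[values <= threshold], pixels[values > threshold]

def hierarchical_histogram_segments(pixels, channels=(0, 1, 2)):
    """Recursively split pixels channel by channel; returns a list of clusters."""
    groups = [pixels]
    for ch in channels:                  # one 1-D histogram level per channel
        next_groups = []
        for g in groups:
            if len(g) < 64:              # too small to split further
                next_groups.append(g)
                continue
            low, high = split_by_channel(g, ch)
            next_groups.extend([grp for grp in (low, high) if len(grp) > 0])
        groups = next_groups
    return groups

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
    clusters = hierarchical_histogram_segments(img.reshape(-1, 3))
    print(len(clusters), "colour clusters")
```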


Data Sorting-based Adaptive Spatial Compression in Wireless Sensor Networks

  • Chen, Siguang;Liu, Jincheng;Wang, Kun;Sun, Zhixin;Zhao, Xuejian
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.10 no.8
    • /
    • pp.3641-3655
    • /
    • 2016
  • Wireless sensor networks (WSNs) provide a promising approach to monitoring physical environments, and prolonging the network lifetime by exploiting the mutual correlation among sensor readings has become a research focus. In this paper, we design a hierarchical network framework that guarantees layered compression. Meanwhile, a data sorting-based adaptive spatial compression scheme (DS-ASCS) is proposed to exploit the spatial correlation among signals. The proposed scheme reduces the amount of data transmitted and alleviates network congestion. It also achieves high compression performance by sorting the original sensor readings and selectively discarding the small coefficients in the transformed matrix. Moreover, the compression ratio of this scheme varies with the correlation among signals and the value of the adaptive threshold, so the proposed scheme is adaptive to various deployment environments. Finally, the simulation results show that the energy of the sorted data is more concentrated than that of the unsorted data, and the proposed scheme achieves higher reconstruction precision and compression ratio than other spatial compression schemes.
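A hedged sketch of the data-sorting idea follows; it assumes a DCT as the sparsifying transform and a fixed keep ratio in place of DS-ASCS's adaptive threshold, so it only illustrates why sorting concentrates energy into fewer coefficients.

```python
# Rough sketch: sort readings, transform, keep only the largest coefficients.
import numpy as np
from scipy.fft import dct, idct

def compress(readings, keep_ratio=0.25):
    """Sort readings, transform, and keep only the largest-magnitude coefficients."""
    order = np.argsort(readings)                 # sorting permutation (must be sent too)
    coeffs = dct(readings[order], norm="ortho")
    k = max(1, int(keep_ratio * len(coeffs)))
    idx = np.argsort(np.abs(coeffs))[-k:]        # indices of the k largest coefficients
    kept = np.zeros_like(coeffs)
    kept[idx] = coeffs[idx]
    return kept, order

def decompress(kept, order):
    sorted_rec = idct(kept, norm="ortho")
    rec = np.empty_like(sorted_rec)
    rec[order] = sorted_rec                      # undo the sorting permutation
    return rec

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    readings = rng.normal(20.0, 5.0, size=128)   # synthetic sensor readings
    kept, order = compress(readings)
    rec = decompress(kept, order)
    print("RMSE:", float(np.sqrt(np.mean((rec - readings) ** 2))))
```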

Compression of the Variables Classifying Domestic Marine Accident Data

  • Park, Deuk-Jin;Yang, Hyeong-Sun;Yim, Jeong-Bin
    • Journal of Navigation and Port Research
    • /
    • v.46 no.2
    • /
    • pp.92-98
    • /
    • 2022
  • Maritime accidents result in enormous economic loss and loss of life; thus, such accidents must be prevented and their risks managed. Risk management must be based on statistical evidence such as variables. Because statistical analysis becomes difficult as the number of variables grows, compressing the designated variables is necessary for using Korean maritime accident data. Thus, in this study, the variables of marine accident data are compressed using statistical methods. The date, ship type, and marine accident type included in all maritime accident records were extracted, the optimal number of variables was determined using hierarchical clustering analysis, and the data were compressed. For the compressed variables, the validity of using the data was statistically confirmed using analysis of variance, and the variables identified by the variable compression method were designated. Consequently, between the monthly and yearly data, statistical significance was confirmed only for the yearly data, which could therefore be compressed. Significance was confirmed for six ship types and eight accident types, and these were compressed accordingly. These results can be directly used for prevention or prediction based on past maritime accident data. Additionally, the range of data extracted from past maritime accidents and the number of applicable records will be studied in future work.
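One possible shape of the variable-compression step, sketched with SciPy's hierarchical clustering; the ship-type list, the accident-count profiles, and the choice of six clusters are fabricated for illustration and are not the study's data.

```python
# Hypothetical sketch: group categorical variables by hierarchical clustering
# of their yearly accident-count profiles (all data below is invented).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

ship_types = ["fishing", "cargo", "tanker", "passenger", "tug", "barge",
              "yacht", "other"]
# rows: ship types, columns: yearly accident counts (fabricated example data)
rng = np.random.default_rng(4)
profiles = rng.poisson(lam=30, size=(len(ship_types), 10)).astype(float)

Z = linkage(profiles, method="ward")             # agglomerative clustering tree
labels = fcluster(Z, t=6, criterion="maxclust")  # compress 8 categories into 6 groups

for cluster_id in sorted(set(labels)):
    members = [s for s, lab in zip(ship_types, labels) if lab == cluster_id]
    print(cluster_id, members)
```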

3D Model Compression For Collaborative Design

  • Liu, Jun;Wang, Qifu;Huang, Zhengdong;Chen, Liping;Liu, Yunhua
    • International Journal of CAD/CAM
    • /
    • v.7 no.1
    • /
    • pp.1-10
    • /
    • 2007
  • The compression of CAD models is a key technology for realizing Internet-based collaborative product development, because large model sizes often prevent rapid transmission of product information. Although some algorithms exist for compressing discrete CAD models, this paper focuses on original precise CAD models. The characteristics of hierarchical structures in CAD models and the distribution of their redundant data are exploited to develop a novel data encoding method in which different encoding rules are applied to different types of data. Geometric data is the major concern for reducing model sizes. For geometric data, the control points of B-spline curves and surfaces are compressed with second-order predictions in a local coordinate system. Based on an analysis of the distortion induced by quantization, an efficient method for computing this distortion is provided. The results indicate that the data size of CAD models can be decreased efficiently when compressed with the proposed method.
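A minimal sketch of second-order prediction for a sequence of control points, with a uniform quantiser as a stand-in and the local-coordinate transform omitted; it is not the paper's exact scheme.

```python
# Closed-loop second-order (linear extrapolation) prediction of control points.
import numpy as np

def encode(points, step=0.01):
    """Return the first two points plus quantised second-order residuals.

    Prediction uses the *reconstructed* points (closed loop), so the decoder
    sees the same predictor and quantisation error cannot drift."""
    rec = [points[0].astype(float), points[1].astype(float)]
    residuals = []
    for i in range(2, len(points)):
        pred = 2.0 * rec[-1] - rec[-2]                      # second-order prediction
        q = np.round((points[i] - pred) / step).astype(np.int32)
        residuals.append(q)
        rec.append(pred + q * step)
    return points[:2].astype(float), residuals

def decode(head, residuals, step=0.01):
    pts = [head[0], head[1]]
    for r in residuals:
        pred = 2.0 * pts[-1] - pts[-2]
        pts.append(pred + r * step)                         # bounded quantisation error
    return np.stack(pts)

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 20)
    ctrl = np.column_stack([t, np.sin(2 * np.pi * t), t ** 2])  # toy 3-D control points
    head, res = encode(ctrl)
    rec = decode(head, res)
    print("max error:", float(np.abs(rec - ctrl).max()))
```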

SPIHT-based Subband Division Compression Method for High-resolution Image Compression (고해상도 영상 압축을 위한 SPIHT 기반의 부대역 분할 압축 방법)

  • Kim, Woosuk;Park, Byung-Seo;Oh, Kwan-Jung;Seo, Young-Ho
    • Journal of Broadcast Engineering
    • /
    • v.27 no.2
    • /
    • pp.198-206
    • /
    • 2022
  • This paper proposes a method to solve problems that may occur when SPIHT (set partitioning in hierarchical trees) is used in a dedicated codec for compressing complex holograms with ultra-high resolution. The development of codecs for complex holograms can largely be divided into creating dedicated compression methods and using anchor codecs such as HEVC and JPEG2000 with added post-processing techniques. When creating a dedicated compression method, a separate transform tool is required to analyze the spatial characteristics of complex holograms. Zero-tree-based algorithms that operate on subband units, such as EZW and SPIHT, have the problem that, when coding high-resolution images, complete subband information is not properly transmitted during bitstream control. This paper proposes dividing the wavelet subbands to solve this problem. By compressing each divided subband separately, the information across the subbands is kept uniform. The proposed method showed better restoration results in terms of PSNR than the existing method.
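The following sketch illustrates only the subband-division idea, with stand-ins throughout: a one-level Haar transform instead of the paper's wavelet setup and zlib instead of a real SPIHT coder. Splitting each subband into tiles lets every part be coded independently, so rate control cannot starve a whole subband.

```python
# Sketch of subband division with placeholder coding (not SPIHT).
import zlib
import numpy as np

def haar2d(x):
    """One-level 2-D Haar transform returning (LL, LH, HL, HH) subbands."""
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    ll = (a + b + c + d) / 4
    lh = (a + b - c - d) / 4
    hl = (a - b + c - d) / 4
    hh = (a - b - c + d) / 4
    return ll, lh, hl, hh

def split_tiles(band, n=2):
    """Split a subband into n x n tiles."""
    h, w = band.shape
    return [band[i * h // n:(i + 1) * h // n, j * w // n:(j + 1) * w // n]
            for i in range(n) for j in range(n)]

def code_tile(tile, step=4.0):
    """Placeholder coder: uniform quantisation followed by zlib."""
    q = np.round(tile / step).astype(np.int16)
    return zlib.compress(q.tobytes())

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    image = rng.integers(0, 256, size=(512, 512)).astype(float)
    subbands = haar2d(image)
    streams = [code_tile(t) for band in subbands for t in split_tiles(band)]
    print(len(streams), "independent tile streams,",
          sum(len(s) for s in streams), "bytes total")
```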

Performance Evaluation of the JPEG DCT-based Progressive and Hierarchical Codings for Medical Image Communication

  • Ahn, C.B.;Lee, J.S.
    • Proceedings of the KOSOMBE Conference
    • /
    • v.1991 no.11
    • /
    • pp.48-53
    • /
    • 1991
  • The discrete cosine transform (DCT)-based progressive and hierarchical coding schemes developed by the International Organization for Standardization (ISO) Joint Photographic Experts Group (JPEG) are implemented and evaluated for the application of medical image communication. For a series of head sections of magnetic resonance images, a compression ratio of about 10 is obtained by the algorithm without noticeable image degradation.
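As a hedged illustration of JPEG's hierarchical mode, the sketch below codes a down-sampled image first and then codes up-sampled difference frames at each finer resolution; the whole-frame DCT quantiser is a placeholder and does not reproduce JPEG's 8x8 block DCT or entropy coding.

```python
# Illustration of hierarchical (pyramid) coding with a placeholder DCT coder.
import numpy as np
from scipy.fft import dctn, idctn

def code_frame(frame, step=16.0):
    """Stand-in for DCT-based coding: whole-frame DCT, uniform quantisation, inverse."""
    coeffs = np.round(dctn(frame, norm="ortho") / step) * step
    return idctn(coeffs, norm="ortho")

def hierarchical_encode_decode(image, levels=2):
    """Code a coarse level first, then refine with coded difference frames."""
    pyramid = [image]
    for _ in range(levels - 1):
        pyramid.append(pyramid[-1][::2, ::2])          # 2x down-sampling
    recon = None
    for frame in reversed(pyramid):                    # coarsest level first
        if recon is None:
            recon = code_frame(frame)
        else:
            up = np.kron(recon, np.ones((2, 2)))[:frame.shape[0], :frame.shape[1]]
            recon = up + code_frame(frame - up)        # coded difference frame
    return recon

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    img = rng.integers(0, 256, size=(128, 128)).astype(float)
    rec = hierarchical_encode_decode(img)
    print("PSNR:", 10 * np.log10(255.0 ** 2 / np.mean((rec - img) ** 2)))
```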


Developing JSequitur to Study the Hierarchical Structure of Biological Sequences in a Grammatical Inference Framework of String Compression Algorithms

  • Galbadrakh, Bulgan;Lee, Kyung-Eun;Park, Hyun-Seok
    • Genomics & Informatics
    • /
    • v.10 no.4
    • /
    • pp.266-270
    • /
    • 2012
  • Grammatical inference methods are expected to find grammatical structures hidden in biological sequences, and one hopes that studies of grammar serve as an appropriate tool for theory formation. We have therefore developed JSequitur for automatically generating the grammatical structure of biological sequences within an inference framework of string compression algorithms. Our original motivation was to find any grammatical traits of several cancer genes that can be detected by string compression algorithms. Through this research, we have not yet found any meaningful unique traits of the cancer genes, but we did observe some interesting traits regarding the relationship among gene length, sequence similarity, the patterns of the generated grammar, and compression rate.
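JSequitur itself is not reproduced here; the toy below uses a simplified Re-Pair-style digram replacement, which is related to but not identical to the Sequitur algorithm, to show how a grammar can be inferred from a sequence by string compression and how its size hints at compressibility.

```python
# Toy grammar inference by repeated replacement of the most frequent digram.
from collections import Counter

def infer_grammar(sequence):
    symbols = list(sequence)
    rules = {}
    next_id = 0
    while True:
        pairs = Counter(zip(symbols, symbols[1:]))
        if not pairs:
            break
        pair, count = pairs.most_common(1)[0]
        if count < 2:                      # no digram repeats: grammar is final
            break
        nonterminal = f"R{next_id}"
        next_id += 1
        rules[nonterminal] = pair          # new rule: nonterminal -> pair
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(nonterminal)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        symbols = out
    return symbols, rules

if __name__ == "__main__":
    start, rules = infer_grammar("ATGATGCGATGATGCG")
    print("start:", start)
    for name, (a, b) in rules.items():
        print(f"{name} -> {a} {b}")
```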