• Title/Summary/Keyword: Run-Length

Search Result 426, Processing Time 0.024 seconds

An Effective Structure of Hardware Compression for Potentially Visible Set of Indoor 3D Game Scenes (실내 3D 게임 장면의 잠재적 가시 집합을 위한 효과적인 하드웨어 압축 구조)

  • Kim, Youngsik
    • Journal of Korea Game Society
    • /
    • v.14 no.6
    • /
    • pp.29-38
    • /
    • 2014
  • In large-scale indoor 3D game scenes, the amount of potentially visible set (PVS) data, which pre-computes occlusion-culling information, can be huge. However, a large part of it can be represented as zeros. In this paper, an effective hardware structure is designed that compresses PVS data with zero run-length encoding (ZRLE) while building the scene trees of 3D games in mobile environments. The compression ratio of the proposed structure and the rendering speed (frames per second: FPS) under both PVS culling and frustum culling are analyzed in 3D game simulations.
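The abstract above describes compressing mostly-zero PVS bit vectors with zero run-length encoding. A minimal software sketch of the idea follows; the paper's actual hardware pipeline is not specified here, and the list-of-runs representation is an assumption for illustration:

```python
def zrle_encode(bits):
    """Encode a 0/1 PVS bit list as a list of zero-run lengths.

    Zero runs dominate PVS data, so only the number of zeros
    between (rare) one bits is stored; runs[-1] is the trailing
    zero run after the last one bit.
    """
    runs = []
    zeros = 0
    for b in bits:
        if b == 0:
            zeros += 1
        else:
            runs.append(zeros)  # zeros preceding this 1 bit
            zeros = 0
    runs.append(zeros)  # trailing zeros
    return runs


def zrle_decode(runs):
    """Rebuild the original bit list from the zero-run lengths."""
    bits = []
    for zeros in runs[:-1]:
        bits.extend([0] * zeros)
        bits.append(1)
    bits.extend([0] * runs[-1])
    return bits
```

For a sparse vector such as `[0, 0, 0, 1, 0, 1, 0, 0]` the encoding is the three run lengths `[3, 1, 2]`, and decoding round-trips exactly.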

AN INTEGRATED PROCESS CONTROL PROCEDURE WITH REPEATED ADJUSTMENTS AND EWMA MONITORING UNDER AN IMA(1,1) DISTURBANCE WITH A STEP SHIFT

  • Park, Chang-Soon
    • Journal of the Korean Statistical Society
    • /
    • v.33 no.4
    • /
    • pp.381-399
    • /
    • 2004
  • Statistical process control (SPC) and engineering process control (EPC) are based on different strategies for process quality improvement. SPC reduces process variability by detecting and eliminating special causes of process variation, while EPC reduces process variability by adjusting compensatory variables to keep the quality variable close to target. Recently there has been a need for an integrated process control (IPC) procedure which combines the two strategies. This paper considers a scheme that simultaneously applies SPC and EPC techniques to reduce the variation of a process. The process model under consideration is an IMA(1,1) model with a step shift. The EPC part of the scheme adjusts the process, while the SPC part detects the occurrence of a special cause. To adjust the process, repeated adjustment is applied according to the predicted deviation from target. To detect special causes, the exponentially weighted moving average (EWMA) control chart is applied to the observed deviations. It was assumed that adjustment under the presence of a special cause may increase the process variability or change the system gain. Reasonable choices of parameters for the IPC procedure are considered in the context of the mean squared deviation as well as the average run length.
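The EWMA monitoring component described above can be sketched as follows; the smoothing constant λ, the control-limit width L, and a known process σ are illustrative choices, and the paper's IMA(1,1) adjustment loop is omitted:

```python
def ewma_run_length(deviations, lam=0.2, L=3.0, sigma=1.0, target=0.0):
    """Apply an EWMA chart to observed deviations from target.

    Returns the 1-based index at which the chart first signals,
    or None if it never signals.  Uses the asymptotic control
    limits L * sigma * sqrt(lam / (2 - lam)) for the EWMA statistic.
    """
    z = target
    limit = L * sigma * (lam / (2.0 - lam)) ** 0.5
    for t, x in enumerate(deviations, start=1):
        z = lam * x + (1.0 - lam) * z  # exponentially weighted average
        if abs(z - target) > limit:
            return t
    return None
```

An in-control sequence of deviations stays inside the limits indefinitely, while a sustained step shift is flagged within a few observations.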

Statistical Process Control System for Continuous Flow Processes Using the Kalman Filter and Neural Network′s Modeling (칼만 필터와 뉴럴 네트워크 모델링을 이용한 연속생산공정의 통계적 공정관리 시스템)

  • 권상혁;김광섭;왕지남
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.15 no.3
    • /
    • pp.50-60
    • /
    • 1998
  • This paper is concerned with the design of two residual control charts for real-time monitoring of continuous flow processes. Two different control charts are designed for the situation in which observations are correlated with each other. Kalman-filter-based model estimation is employed when the process model is known. A black-box approach, based on a back-propagation neural network, is also applied to the design of the control chart when there is no prior information about the process model. The performance of the designed control charts and traditional control charts is evaluated, with average run length (ARL) adopted as the criterion for comparison. Experimental results show that the control chart designed using the neural network's modeling has a shorter ARL than the other control charts when the process mean is shifted; that is, it detects the out-of-control state of the process faster than the others. The control chart designed using Kalman-filter-based model estimation also performs better than the traditional control chart when the process is in an out-of-control state.
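Several abstracts in this listing compare charts by average run length (ARL). A Monte Carlo sketch of estimating the ARL of a simple Shewhart chart on independent standard-normal residuals follows; the Kalman-filter and neural-network residual models of the paper are not reproduced, so this only illustrates the evaluation criterion:

```python
import random

def average_run_length(shift=0.0, L=3.0, trials=1000, max_n=10000, seed=1):
    """Estimate the ARL of a Shewhart chart on i.i.d. N(shift, 1) residuals.

    For each trial, count observations until one falls outside +/- L;
    the ARL is the mean of those counts.  shift=0 gives the in-control
    ARL (about 370 for L=3); a mean shift shortens the run length.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        for n in range(1, max_n + 1):
            if abs(rng.gauss(shift, 1.0)) > L:
                total += n
                break
        else:
            total += max_n  # censored run, rare for these settings
    return total / trials
```

A shifted process (e.g. `shift=3.0`) yields an ARL near 2, far below the in-control value, which is exactly the sense in which a "shorter ARL under shift" means faster detection.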


SISO-RLL Decoding Algorithm of 17PP Modulation Code for High Density Optical Recording Channel (고밀도 광 기록 채널에서 17PP 변조 부호의 연판정 입력 연판정 출력 런-길이 제한 복호 알고리즘)

  • Lee, Bong-Il;Lee, Jae-Jin
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.34 no.2C
    • /
    • pp.175-180
    • /
    • 2009
  • When the LDPC code is applied to a high-density optical storage channel, the modulation-code decoder must feed the LDPC decoder soft-valued information, because the LDPC decoder exploits soft input values. Therefore, we propose a soft-input soft-output run-length limited (SISO-RLL) decoding algorithm for the 17PP code and compare the performance of the LDPC codes. We found that the proposed soft-input soft-output decoding algorithm using 17PP performs 0.8 dB better than the soft-input soft-output decoding algorithm using the (1,7) RLL code.

Cell-Based Wavelet Compression Method for Volume Data (볼륨 데이터를 위한 셀 기반 웨이브릿 압축 기법)

  • Kim, Tae-Yeong;Sin, Yeong-Gil
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.26 no.11
    • /
    • pp.1285-1295
    • /
    • 1999
  • This paper presents an efficient cell-based wavelet compression method for large volume data. The volume is divided into individual cells of {{}} voxels, and a wavelet transform is applied to each cell. The transformed cell is run-length encoded according to the reconstruction order, resulting in a fairly good compression ratio and fast reconstruction. A cache structure is used to speed up reconstruction, and a threshold-normalization scheme is presented to produce a higher-quality rendered image at the speed of non-normalized wavelet compression. We have combined our compression method with shear-warp factorization, an accelerated volume-rendering algorithm. Experimental results show a compression ratio of about 27:1 and a rendering time of about 3 seconds for {{}} data sets, while preserving image quality comparable to that obtained with the original data.

Binary and Halftone Image Data Hiding Technique using Run-Length (RLE를 이용한 이진 이미지 및 하프톤 영상에 데이터 은폐 기술)

  • Kim, Cheon-Shik;Hong, You-Sik;Han, Chang-Pyoung;Oh, Seon
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.9 no.3
    • /
    • pp.37-43
    • /
    • 2009
  • In this paper, we propose a novel technique for hiding data in binary and halftone images. A binary image is a bitmap image, and a halftone image is composed of two-tone values within limited regions of an image; for this reason, it is not easy to hide messages in binary images. PWLC is a recent method for hiding a message in binary images, but it yields images of unacceptable quality unless very few pixels are changed. To solve this problem, we apply a run-length method to binary images to find regions suitable for hiding messages. Experiments show that our algorithm performs better than PWLC.


Storage systems using RLE compression (RLE 압축 기법을 이용한 저장 시스템)

  • Kim, Kyeong-Og;Kim, Jong-Chan;Ban, Kyeong-Jin;Heo, Su-Yeon;Kim, Eung-Kon
    • Proceedings of the Korean Institute of Information and Commucation Sciences Conference
    • /
    • 2010.05a
    • /
    • pp.686-688
    • /
    • 2010
  • The supply of context information is increasing with the spread of ubiquitous computing environments. As context information is collected through electronic tags and sensors attached to the environment, methods are needed to efficiently store and search large volumes of data. This paper describes the application of the RLE (Run Length Encoding) compression method to sensors that continuously collect data in USN/RFID terminals. Time information is marked on the data, and one data block is generated and saved. This paper proposes a storage method that allows data for a desired time and place to be found quickly by recording time information in continuous data.
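A sketch of the idea described above: consecutive identical sensor readings collapse into (value, start_time, count) blocks, so continuous data is stored compactly while the time of each reading remains recoverable. The record layout is an assumption for illustration, not the paper's actual block format:

```python
def rle_with_time(samples):
    """Compress a time-ordered list of (time, value) sensor samples.

    Consecutive equal values merge into one block recorded as
    (value, start_time, count), so a long run of identical readings
    costs a single record regardless of its length.
    """
    blocks = []
    for t, v in samples:
        if blocks and blocks[-1][0] == v:
            blocks[-1][2] += 1  # extend the current run
        else:
            blocks.append([v, t, 1])  # start a new run at time t
    return [tuple(b) for b in blocks]
```

Because each block carries its start time and length, a query for "the value at time t" only needs to locate the block whose interval contains t, rather than scanning raw samples.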


Adjustment of Control Limits for Geometric Charts

  • Kim, Byung Jun;Lee, Jaeheon
    • Communications for Statistical Applications and Methods
    • /
    • v.22 no.5
    • /
    • pp.519-530
    • /
    • 2015
  • The geometric chart has proven more effective than Shewhart p or np charts for monitoring the proportion nonconforming in high-quality processes. Implementing a geometric chart commonly requires the assumption that the in-control proportion nonconforming is known or accurately estimated. However, accurate parameter estimation is very difficult and may require a larger sample size than is available in practice in high-quality processes, where the proportion of nonconforming items is very small. Thus, the error in parameter estimation increases and may degrade the performance of the control chart if the sample size is inadequate. We suggest adjusting the control limits in order to improve performance when the sample size is insufficient to estimate the parameter. We propose a linear function for the adjustment constant, which is a function of the sample size, the number of nonconforming items in a sample, and the false alarm rate. We also compare the performance of the geometric charts with and without adjustment using the expected value of the average run length (ARL) and the standard deviation of the ARL (SDARL).
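The geometric chart above monitors the conforming run length (CRL), i.e. the number of items inspected until a nonconforming one appears, which follows a geometric distribution. A minimal sketch of probability-based control limits for a known in-control proportion p follows; the paper's adjustment constant for the estimated-parameter case is not reproduced here:

```python
import math

def geometric_limits(p, alpha=0.0027):
    """Probability-based control limits for a geometric (CRL) chart.

    CRL ~ Geometric(p) with P(CRL <= x) = 1 - (1 - p) ** x.
    The false-alarm rate alpha is split equally between the tails:
    a very short run (CRL < LCL) suggests p has increased,
    a very long run (CRL > UCL) suggests p has decreased.
    """
    lcl = math.ceil(math.log(1.0 - alpha / 2.0) / math.log(1.0 - p))
    ucl = math.ceil(math.log(alpha / 2.0) / math.log(1.0 - p))
    return lcl, ucl
```

For a high-quality process with p = 0.001, the lower limit is only a couple of items while the upper limit runs into the thousands, reflecting how skewed the geometric distribution is; this asymmetry is why naive limits based on an estimated p are so sensitive to estimation error.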

Copula modelling for multivariate statistical process control: a review

  • Busababodhin, Piyapatr;Amphanthong, Pimpan
    • Communications for Statistical Applications and Methods
    • /
    • v.23 no.6
    • /
    • pp.497-515
    • /
    • 2016
  • Modern processes often monitor more than one quality characteristic; the procedures for doing so are referred to as multivariate statistical process control (MSPC). MSPC is the most rapidly developing sector of statistical process control, with increasing interest in the simultaneous inspection of several related quality characteristics. Most multivariate detection procedures assume multivariate normality and independence, but many processes exhibit non-normality and correlation, and many multivariate control charts lack an explicit joint distribution. Copulas are a tool for constructing multivariate models and formalizing the dependence structure between random variables, and have been applied in several fields. A review of the copula literature shows only a few copula applications to MSPC with multivariate control charts, where copulas represent a successful tool for identifying an out-of-control process. This paper presents various types of copula modelling for the multivariate control chart. The performance measures of the control chart are the average run length (ARL) and the average number of observations to signal (ANOS). Furthermore, a Monte Carlo simulation is presented for observations from an exponential distribution.
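The copula idea above can be illustrated with a bivariate Gaussian copula, one of several families the review covers; this sketch samples dependent uniforms without external libraries, and the choice of family and correlation is an assumption for illustration:

```python
import math
import random

def gaussian_copula_pair(rho, rng):
    """Draw one (u1, u2) pair from a bivariate Gaussian copula.

    Correlated standard normals are built directly, then each is
    pushed through the standard normal CDF (via math.erf) to give
    uniforms whose dependence is the Gaussian copula with parameter rho.
    """
    z1 = rng.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return phi(z1), phi(z2)
```

Exponential marginals, as in the abstract's simulation, would then be obtained from each uniform as x = -ln(1 - u) / λ; the copula fixes the dependence structure independently of that marginal choice.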

Reconstruction of Color-Volume Data for Three-Dimensional Human Anatomic Atlas (3차원 인체 해부도 작성을 위한 칼라 볼륨 데이터의 입체 영상 재구성)

  • 김보형;이철희
    • Journal of Biomedical Engineering Research
    • /
    • v.19 no.2
    • /
    • pp.199-210
    • /
    • 1998
  • In this paper, we present a 3D reconstruction method for color volume data for a computerized human atlas. Binary volume rendering, which takes advantage of object-order ray traversal and run-length encoding, visualizes 3D organs at an interactive speed on a general PC without the help of special hardware. This rendering method improves rendering speed by simplifying the determination of pixel values in the intermediate depth image and applying a newly developed normal-vector calculation method. Moreover, we describe a 3D boundary encoding that reduces the data considerably without a penalty in image quality. The interactive speed of the binary rendering and the storage efficiency of the 3D boundary encoding will accelerate the development of the PC-based human atlas.
