• Title/Summary/Keyword: Mean normalization

Search results: 147

Formant-broadened CMS Using the Log-spectrum Transformed from the Cepstrum (켑스트럼으로부터 변환된 로그 스펙트럼을 이용한 포먼트 평활화 켑스트럴 평균 차감법)

  • 김유진;정혜경;정재호
    • The Journal of the Acoustical Society of Korea
    • /
    • v.21 no.4
    • /
    • pp.361-373
    • /
    • 2002
  • In this paper, we propose a channel normalization method to improve the performance of CMS (cepstral mean subtraction), which is widely adopted to normalize channel variation for speech and speaker recognition. CMS, which estimates the channel effect by averaging the long-term cepstrum, has the weakness that the estimated channel is biased by the formants of voiced speech, which carry useful speech information. The proposed Formant-broadened Cepstral Mean Subtraction (FBCMS) is based on the facts that the formants can be found easily in the log spectrum, which is transformed from the cepstrum by the Fourier transform, and that the formants correspond to the dominant poles of the all-pole model commonly used to model the vocal tract. FBCMS selects the poles to be broadened directly from the log spectrum, without polynomial factorization, and builds a formant-broadened cepstrum by broadening the bandwidths of the formant poles. The channel cepstrum can then be estimated effectively by averaging the formant-broadened cepstral coefficients. We performed experiments comparing FBCMS with CMS and pole-filtered CMS (PFCMS) using four simulated telephone channels. In the channel estimation experiment, we evaluated the distance between the cepstrum of the real channel and the cepstrum of the estimated channel, and found that the mean cepstrum came closer to the channel cepstrum because the bias of the mean cepstrum toward speech was softened. In the text-independent speaker identification experiment, the proposed method was superior to conventional CMS and comparable to pole-filtered CMS. Consequently, the proposed method can efficiently normalize channel variation within the framework of conventional CMS.
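
As a point of reference for the method above, the sketch below shows plain CMS, the baseline the paper improves on: the long-term cepstral mean is taken as the channel estimate and subtracted frame by frame. FBCMS would first broaden the formant-pole bandwidths before this averaging; that step is not reproduced here, and the array shapes are illustrative assumptions.

```python
import numpy as np

def cepstral_mean_subtraction(cepstra):
    """Baseline CMS: subtract the long-term average cepstrum from each frame.

    cepstra: (num_frames, num_coeffs) array of per-frame cepstral vectors.
    A convolutive channel is additive in the cepstral domain, so the frame
    average approximates the channel and subtracting it removes it. The
    average is biased toward voiced-speech formants, which is exactly the
    weakness FBCMS addresses by broadening formant bandwidths first.
    """
    channel_estimate = cepstra.mean(axis=0)
    return cepstra - channel_estimate

# The log spectrum the paper inspects for formants is the Fourier transform
# of the cepstrum, e.g.: log_spectrum = np.fft.fft(cepstrum_frame).real
```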

Optimized Integer Cosine Transform (최적화 정수형 여현 변환)

  • 이종하;김혜숙;송인준;곽훈성
    • Journal of the Korean Institute of Telematics and Electronics B
    • /
    • v.32B no.9
    • /
    • pp.1207-1214
    • /
    • 1995
  • We present an optimized integer cosine transform (OICT) as an alternative to the conventional discrete cosine transform (DCT), together with a fast computational algorithm. In the actual implementation of the OICT, we use techniques similar to those of the orthogonal integer transform (OIT). The normalization factors are approximated by a single factor while keeping the reconstruction error at the best tolerable level. With a single normalization factor, both the forward and inverse transforms are performed using only integers. However, since many sets of integers can be selected in this manner, the best OICT matrix is obtained by minimizing the Hilbert-Schmidt norm while admitting a fast computational algorithm. Using matrix decomposition, a fast algorithm for efficient computation of the order-8 OICT is developed that requires only 20 integer multiplications. This enables a high-performance 2-D DCT processor in which floating-point operations are replaced by integer operations. We also ran simulations to test the performance of the order-8 OICT in terms of transform efficiency, maximum reducible bits, and mean square error for the Wiener filter. Compared with the DCT and OIT, the OICT outperformed both. Furthermore, when the conventional DCT coefficients were reduced to 7 bits like those of the OICT, the reconstructed images were critically impaired because the orthogonality of the original DCT was lost, whereas the 7-bit OICT maintained a zero mean-square reconstruction error.
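
To make the single-normalization-factor idea concrete, here is a toy sketch: the DCT basis is scaled and rounded to integers, so the forward and inverse transforms use integer arithmetic with one scalar normalization at the end. The rounding below is illustrative only; the paper selects its integer matrix by minimizing the Hilbert-Schmidt norm, which this sketch does not do.

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal order-n DCT-II basis (floating-point reference)."""
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0] *= 1 / np.sqrt(2)
    return C * np.sqrt(2 / n)

scale = 64                                   # illustrative scaling factor
T = np.round(dct_matrix() * scale).astype(np.int64)

x = np.arange(8, dtype=np.int64)             # sample signal
y = T @ x                                    # forward transform: integers only
x_hat = (T.T @ y) / scale**2                 # inverse, single normalization step
print(np.round(x_hat, 3))                    # near x; exact only if T stays orthogonal
```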

NORMALIZATION OF THE HAMILTONIAN AND THE ACTION SPECTRUM

  • OH YONG-GEUN
    • Journal of the Korean Mathematical Society
    • /
    • v.42 no.1
    • /
    • pp.65-83
    • /
    • 2005
  • In this paper, we prove that the two well-known natural normalizations of Hamiltonian functions on a symplectic manifold $(M, \omega)$ canonically relate the action spectra of the differently normalized Hamiltonians on an arbitrary symplectic manifold $(M, \omega)$. The natural classes of normalized Hamiltonians consist of those whose mean value is zero for a closed manifold, and those which are compactly supported in $\mathrm{Int}\,M$ for an open manifold. We also study the effect on the action spectrum under the $\pi_1$ of the Hamiltonian diffeomorphism group. This forms a foundational basis for our study of spectral invariants of Hamiltonian diffeomorphisms in [8].
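
For a closed manifold, the mean-zero normalization mentioned above can be written explicitly (a standard formulation, reconstructed here from the abstract; $\dim M = 2n$):

```latex
% Closed (M, \omega): normalize each H_t to have zero mean against \omega^n
\int_M H_t \, \omega^n = 0 \qquad \text{for all } t \in [0,1].
% Open (M, \omega): require instead that \mathrm{supp}\, H_t \subset \mathrm{Int}\, M be compact.
```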

Tone Mapping Method using Non-linear Dynamic Range Normalization for High Dynamic Range Images (HDR 영상을 위한 비선형 동적영역 정규화를 이용한 톤 매핑 기법)

  • Kim, Beom-Yong;Hwang, Bo-Hyun;Yun, Jong-Ho;Choi, Myung-Ryul
    • Proceedings of the IEEK Conference
    • /
    • 2008.06a
    • /
    • pp.851-852
    • /
    • 2008
  • In this paper, we propose a tone mapping method using Non-linear Dynamic Range Normalization (NDRN) for high dynamic range (HDR) images. HDR images are not suitable for commercial display devices because their dynamic range does not match that of low dynamic range (LDR) display devices. To reproduce the tones of HDR images on LDR displays, tone mapping methods, both local and global, have been proposed. We introduce NDRN to place the mean luminance of an HDR image at the center of the LDR range. NDRN preserves detail in highlights and shadows, and by suppressing abrupt luminance changes during tone mapping, it also preserves the naturalness of the original image. The experimental results show that the proposed method preserves the details and naturalness of the original images.
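
The abstract does not give the exact NDRN curve, so the sketch below only illustrates the stated behavior: a global, non-linear compression in the log domain that anchors the image's mean luminance at the center of the LDR range while squashing extreme deviations. The function and parameter names are assumptions.

```python
import numpy as np

def ndrn_like_tone_map(hdr_lum, ldr_max=255.0, eps=1e-6):
    """Global tone-mapping sketch in the spirit of NDRN.

    hdr_lum: NumPy array of HDR luminance values (any shape).
    The mean log-luminance maps to tanh(0) = 0, i.e. to the center of the
    output range, matching the abstract's 'mean at the center of LDR'.
    """
    log_lum = np.log(hdr_lum + eps)
    centered = log_lum - log_lum.mean()                       # mean -> mid-gray
    compressed = np.tanh(centered / (centered.std() + eps))   # suppress large jumps
    return (compressed + 1.0) * 0.5 * ldr_max                 # [-1, 1] -> [0, ldr_max]
```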

A Correction Approach to Bidirectional Effects of EO-1 Hyperion Data for Forest Classification

  • Park, Seung-Hwan;Kim, Choen
    • Proceedings of the KSRS Conference
    • /
    • 2003.11a
    • /
    • pp.1470-1472
    • /
    • 2003
  • Hyperion, a hyperspectral sensor carried on NASA's EO-1 satellite, can be used for more subtle discrimination of forest cover, with 224 bands in 360–2580 nm (10 nm interval). In this study, a Hyperion image is used to investigate the effects of topography on the classification of forest cover, and to assess whether topographic correction improves the discrimination of species units for practical forest mapping. A publicly available Digital Elevation Model (DEM), at a scale of 1:25,000, is used to model the radiance variation over forest, considering the MSR (Mean Spectral Ratio) on opposite aspects. The Hyperion data are corrected on a pixel-by-pixel basis to normalize the scene to a uniform solar illumination and viewing geometry. As a result, this approach to normalizing topographic effects in hyperspectral data can effectively reduce the variation in detected radiance due to changes in forest illumination and improve the classification of forest cover.
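
The paper's own correction uses the DEM together with mean spectral ratios on opposite aspects; as background, the sketch below shows the generic pixel-wise cosine correction that normalizes radiance to a uniform solar illumination. This is the textbook form of such a normalization, not necessarily the authors' exact model.

```python
import numpy as np

def cosine_topographic_correction(radiance, cos_i, solar_zenith_deg):
    """Pixel-wise illumination normalization (generic cosine correction).

    radiance : (rows, cols, bands) at-sensor radiance cube
    cos_i    : (rows, cols) cosine of the local solar incidence angle,
               computed from DEM slope/aspect and the sun position
    Rescales each pixel as if it lay on flat, fully illuminated terrain.
    """
    cos_sz = np.cos(np.deg2rad(solar_zenith_deg))
    factor = cos_sz / np.clip(cos_i, 1e-3, None)   # cap shadowed pixels
    return radiance * factor[..., None]            # apply to every band
```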

Scene Change Detection with 3-Step Process (3단계 과정의 장면 전환검출)

  • Yoon, Shin-Seong;Won, Rhee-Yang
    • Journal of the Korea Society of Computer and Information
    • /
    • v.13 no.6
    • /
    • pp.147-154
    • /
    • 2008
  • First, this paper computes the difference value between frames using a combination of the $\chi^2$ histogram and the color histogram, followed by normalization. Next, cluster representative frames are selected using distance-based clustering and k-means grouping. Finally, the representative frame of each group is determined using the likelihood ratio. Experiments show that the proposed method outperforms other methods in detection, owing to its difference-value computation, its clustering and grouping, and its representative-frame detection.
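
A minimal sketch of the first step follows; the abstract gives no combination weights, so the histogram handling and min-max normalization here are assumptions, and the clustering/k-means and likelihood-ratio steps are omitted.

```python
import numpy as np

def chi2_distance(h1, h2, eps=1e-9):
    """Chi-square distance between two frame histograms."""
    return np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

def frame_difference_values(histograms):
    """Step 1: per-pair difference values, min-max normalized to [0, 1].

    histograms: list of per-frame color histograms (1-D arrays).
    """
    d = np.array([chi2_distance(histograms[i], histograms[i + 1])
                  for i in range(len(histograms) - 1)])
    return (d - d.min()) / (d.max() - d.min() + 1e-9)
```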

The Predictive Factors of the Serum Creatine Kinase Level Normalization Time in Patients with Rhabdomyolysis due to Doxylamine Ingestion (독시라민 중독으로 발생한 횡문근융해증 환자에게서 혈중 크레아틴인산활성화효소 수치가 정상화되는 시기를 예측할 수 있는 인자)

  • Shin, Min-Chul;Kwon, Oh-Young;Lee, Jong-Suk;Choi, Han-Sung;Hong, Hoon-Pyo;Ko, Young-Gwan
    • Journal of The Korean Society of Clinical Toxicology
    • /
    • v.7 no.2
    • /
    • pp.156-163
    • /
    • 2009
  • Purpose: Doxylamine succinate (DS) is frequently used to treat insomnia, and it may induce rhabdomyolysis in overdose cases. The purpose of this study was to evaluate the factors that can predict the serum creatine kinase (CK) level normalization time in patients with rhabdomyolysis due to DS ingestion. Methods: This study was conducted on 71 patients who were admitted with rhabdomyolysis after DS ingestion during the period from January 2000 to July 2009. Rhabdomyolysis was defined as a serum CK level over 1,000 U/L. The collected data included the general characteristics, the anticholinergic symptoms, the ingested dose, the peak serum CK level, the time interval (TI) from the event to the peak CK level, and the TI from the event to a CK level below 1,000 U/L. We evaluated the correlation between the patients' variables and both the TI from the event to the peak CK level and the time to a CK level below 1,000 U/L. Results: The mean ingested dose per body weight (BW) was $30.86 \pm 18.63$ mg/kg, and the mean TI from the event to treatment was $4.04 \pm 3.67$ hours. The TI from the event to the peak CK level was longer for patients with a larger ingested dose per BW (r=0.587, p<0.05). The CK normalization time was longer for patients with a larger ingested dose per BW (r=0.446, p<0.05) and a higher peak CK level (r=0.634, p<0.05). Conclusion: The ingested dose per BW was correlated with the TI from the event to the peak CK level, and the ingested dose per BW and the peak CK level showed significant correlations with the CK normalization time. These factors may be used to determine the discharge timing of patients with rhabdomyolysis following a DS overdose.
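
The reported r values come from ordinary correlation analysis; purely as an illustration of that computation (the numbers below are made up, not the study's data):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical values for illustration only -- not the study's data.
dose_per_bw = np.array([12.0, 22.5, 31.0, 45.2, 60.8])     # mg/kg ingested
ck_norm_time = np.array([52.0, 70.0, 96.0, 121.0, 158.0])  # hours to CK < 1,000 U/L

r, p = pearsonr(dose_per_bw, ck_norm_time)
print(f"r = {r:.3f}, p = {p:.3f}")   # analogous to the r values quoted above
```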

Truncation Artifact Reduction Using Weighted Normalization Method in Prototype R/F Chest Digital Tomosynthesis (CDT) System (프로토타입 R/F 흉부 디지털 단층영상합성장치 시스템에서 잘림 아티팩트 감소를 위한 가중 정규화 접근법에 대한 연구)

  • Son, Junyoung;Choi, Sunghoon;Lee, Donghoon;Kim, Hee-Joung
    • Journal of the Korean Society of Radiology
    • /
    • v.13 no.1
    • /
    • pp.111-118
    • /
    • 2019
  • Chest digital tomosynthesis has become a practical imaging modality because it can solve the problem of overlapping anatomy in conventional chest radiography. However, because of both the limited scan angle and the finite-size detector, a portion of the chest cannot be represented in some or all of the projections. This produces a discontinuity in intensity across the field-of-view boundaries in the reconstructed slices, which we refer to as truncation artifacts. The purpose of this study was to reduce truncation artifacts using a weighted normalization approach and to investigate the performance of this approach for our prototype chest digital tomosynthesis system. The source-to-image distance was 1100 mm, and the center of rotation of the X-ray source was located 100 mm above the detector surface. After obtaining 41 projection views over $\pm 20^\circ$, tomosynthesis slices were reconstructed with the filtered back projection algorithm. For quantitative evaluation, peak signal-to-noise ratio and structural similarity index values were computed against a reference image reconstructed in simulation, and the mean value along a specific direction was evaluated using real data. The simulation results showed that both the peak signal-to-noise ratio and the structural similarity index improved. The experimental results showed that the artifacts' effect on the mean value along a specific direction of the reconstructed image was reduced. In conclusion, the weighted normalization method improves image quality by reducing truncation artifacts, suggesting that it can improve the image quality of chest digital tomosynthesis.
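
One common way to implement such a weighting, sketched under the assumption that the per-view backprojector can report which pixels each view actually covers (the paper's exact weights are not given in the abstract): divide the accumulated backprojection by the per-pixel view count, so partially covered pixels are not systematically darker.

```python
import numpy as np

def weighted_normalized_fbp(filtered_projs, backproject_one, shape):
    """Coverage-weighted backprojection for limited-angle FBP.

    filtered_projs  : sequence of filtered projection images
    backproject_one : callable(view_index, proj) -> (contribution, mask),
                      where mask is 1 over pixels covered by that view
    Dividing by per-pixel coverage removes the intensity discontinuity
    (truncation artifact) at field-of-view boundaries.
    """
    recon = np.zeros(shape)
    coverage = np.zeros(shape)
    for i, proj in enumerate(filtered_projs):
        contribution, mask = backproject_one(i, proj)
        recon += contribution
        coverage += mask
    return recon / np.clip(coverage, 1, None)   # avoid division by zero
```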

DEFORMING PINCHED HYPERSURFACES OF THE HYPERBOLIC SPACE BY POWERS OF THE MEAN CURVATURE INTO SPHERES

  • Guo, Shunzi;Li, Guanghan;Wu, Chuanxi
    • Journal of the Korean Mathematical Society
    • /
    • v.53 no.4
    • /
    • pp.737-767
    • /
    • 2016
  • This paper concerns closed hypersurfaces of dimension $n \geq 2$ in the hyperbolic space $\mathbb{H}_\kappa^{n+1}$ of constant sectional curvature, evolving in the direction of the normal vector with speed equal to a power $\beta \geq 1$ of the mean curvature. The main result is that if the initial closed, weakly h-convex hypersurface is such that the ratio of the largest to the smallest principal curvature is everywhere close enough to 1, depending only on $n$ and $\beta$, then this pinching is maintained under the flow; there exists a unique, smooth solution of the flow which converges to a single point in $\mathbb{H}_\kappa^{n+1}$ in a maximal finite time, and, when rescaled appropriately, the evolving hypersurfaces converge exponentially to a unit geodesic sphere of $\mathbb{H}_\kappa^{n+1}$.
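
The flow in question can be written as follows (standard notation, reconstructed from the abstract; $\nu$ is the outward unit normal and $H$ the mean curvature):

```latex
\frac{\partial F}{\partial t} = -H^{\beta}\, \nu, \qquad \beta \ge 1,
% with the pinching hypothesis that \lambda_{\max}/\lambda_{\min} is
% everywhere sufficiently close to 1, depending only on n and \beta.
```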

A Study on the Training Optimization Using Genetic Algorithm -In case of Statistical Classification considering Normal Distribution- (유전자 알고리즘을 이용한 트레이닝 최적화 기법 연구 - 정규분포를 고려한 통계적 영상분류의 경우 -)

  • 어양담;조봉환;이용웅;김용일
    • Korean Journal of Remote Sensing
    • /
    • v.15 no.3
    • /
    • pp.195-208
    • /
    • 1999
  • In the classification of satellite images, the representativeness of the class training data is a very important factor affecting classification accuracy. Hence, in order to improve classification accuracy, it is necessary to optimize the pre-classification stage, which determines the classification parameters, rather than to develop classifiers alone. In this study, the normality of the training data is evaluated at the pre-classification stage using SPOT XS and LANDSAT TM imagery. The correlation coefficient of a multivariate Q-Q plot at the 5% significance level and the variance of the initial training data are used as the objective function of the genetic algorithm in the training normalization process. As a result of normalizing the training data with the genetic algorithm, it was shown that, for the study area, the mean and variance of each class shifted toward those of the population, demonstrating the possibility of predicting the distribution of each class.
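
A toy sketch of the search loop described above, with a univariate Q-Q correlation standing in for the paper's multivariate criterion (the variance term and the GA operators used by the authors are not specified in the abstract, so the selection and mutation below are illustrative):

```python
import numpy as np
from scipy import stats

def normality_fitness(samples):
    """Correlation coefficient of a normal Q-Q plot of the selected samples
    (univariate stand-in for the paper's multivariate Q-Q criterion)."""
    _, (_, _, r) = stats.probplot(samples, dist="norm")
    return r

def ga_select_training(pixels, subset_size=50, pop_size=20, generations=30, seed=0):
    """Evolve index subsets of `pixels` whose values look most Gaussian."""
    rng = np.random.default_rng(seed)
    population = [rng.choice(len(pixels), subset_size, replace=False)
                  for _ in range(pop_size)]
    for _ in range(generations):
        scores = [normality_fitness(pixels[ind]) for ind in population]
        order = np.argsort(scores)[::-1]
        parents = [population[i] for i in order[:pop_size // 2]]  # keep the fittest
        children = []
        for parent in parents:            # mutation only, in this toy version
            child = parent.copy()
            child[rng.integers(subset_size)] = rng.integers(len(pixels))
            children.append(child)        # (duplicate indices possible; tolerated here)
        population = parents + children
    return max(population, key=lambda ind: normality_fitness(pixels[ind]))
```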