• Title/Summary/Keyword: K-means algorithm

Search Results: 1,363

Performance Evaluation of Hybrid-SE-MMA Adaptive Equalizer using Adaptive Modulus and Adaptive Step Size (적응 모듈러스와 적응 스텝 크기를 이용한 Hybrid-SE-MMA 적응 등화기의 성능 평가)

  • Lim, Seung-Gag
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.20 no.2
    • /
    • pp.97-102
    • /
    • 2020
  • This paper concerns the Hybrid-SE-MMA (Signed-Error MMA) equalizer, which improves equalization performance by applying an adaptive modulus and an adaptive step size to the SE-MMA adaptive equalizer used for minimizing intersymbol interference. In the MMA algorithm the equalizer tap coefficients are updated using the error signal, whereas the SE-MMA algorithm uses only the sign of the error signal in the coefficient update in order to simplify the arithmetic. This simplification gives fast convergence and a lower per-update processing cost, but no gain in equalization performance. In this paper, the equalization performance is improved, as demonstrated by computer simulation, by applying to SE-MMA an adaptive modulus that is proportional to the power of the equalizer output signal. To compare the improvement against the conventional SE-MMA, we used the recovered signal constellation at the equalizer output, the residual ISI, the maximum distortion (MD), the MSE, and the SER performance, which indicates robustness to external noise. The simulation results show that Hybrid-SE-MMA outperforms SE-MMA in residual ISI, MD, MSE, and SER.
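
The following is a minimal sketch of a sign-error multi-modulus tap update with an adaptive modulus that follows the equalizer output power. The constants, the running-power estimate, and the exact adaptive-modulus scaling are illustrative assumptions, not the formulas used by the authors.

    import numpy as np

    def se_mma_update(w, x, avg_power, mu=1e-3, R2=1.0, rho=0.99, adaptive=True):
        """One tap update of a (Hybrid-)SE-MMA adaptive equalizer (sketch).

        w : complex tap vector, x : complex vector of the most recent inputs,
        avg_power : running estimate of the equalizer output power.
        """
        y = np.vdot(w, x)                                    # equalizer output w^H x
        avg_power = rho * avg_power + (1 - rho) * abs(y)**2  # track output power
        if adaptive:
            R2 = R2 * abs(y)**2 / max(avg_power, 1e-12)      # modulus follows output power (assumed form)
        # MMA error, real and imaginary parts handled separately
        e = y.real * (y.real**2 - R2) + 1j * (y.imag * (y.imag**2 - R2))
        # SE-MMA: only the sign of each error component drives the update
        e_sign = np.sign(e.real) + 1j * np.sign(e.imag)
        return w - mu * e_sign * np.conj(x), avg_power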

A Statistical Approach for Improving the Embedding Capacity of Block Matching based Image Steganography (블록 매칭 기반 영상 스테가노그래피의 삽입 용량 개선을 위한 통계적 접근 방법)

  • Kim, Jaeyoung;Park, Hanhoon;Park, Jong-Il
    • Journal of Broadcast Engineering
    • /
    • v.22 no.5
    • /
    • pp.643-651
    • /
    • 2017
  • Steganography is an information-hiding technology and is distinguished from cryptography in that it focuses on preventing third parties from detecting the existence of the hidden information, rather than on protecting the information from being decoded. In this paper, as an image steganography method that uses images as cover media, we propose a new block matching method that embeds information into the discrete wavelet transform (DWT) domain. The proposed method, based on a statistical analysis, reduces the loss of embedding capacity caused by uneven use of the candidate blocks. It computes the variance of each candidate block, preserves the candidate blocks with high-frequency components, and reduces the candidate blocks with low-frequency components by compressing them with the k-means clustering algorithm. Compared with the previous block matching method, the proposed method can reconstruct secret images with similar PSNRs while embedding higher-capacity information.
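
A minimal sketch of the block-pool reduction step described above, assuming flattened DWT candidate blocks; the variance threshold and the number of clusters are illustrative values, not the ones derived from the paper's statistical analysis.

    import numpy as np
    from sklearn.cluster import KMeans

    def compress_candidate_blocks(blocks, var_thresh=10.0, k=64):
        """Reduce a pool of flattened DWT candidate blocks (illustrative sketch).

        High-variance (high-frequency) blocks are kept as-is; low-variance
        blocks are replaced by k-means centroids.
        """
        variances = blocks.var(axis=1)
        high = blocks[variances >= var_thresh]     # preserved blocks
        low = blocks[variances < var_thresh]       # blocks to be compressed
        if len(low) > k:
            centroids = KMeans(n_clusters=k, n_init=10).fit(low).cluster_centers_
        else:
            centroids = low
        return np.vstack([high, centroids])        # reduced candidate pool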

Semi Automatic Ontology Generation about XML Documents

  • Gu Mi Sug;Hwang Jeong Hee;Ryu Keun Ho;Jung Doo Yeong;Lee Keum Woo
    • Proceedings of the KSRS Conference
    • /
    • 2004.10a
    • /
    • pp.730-733
    • /
    • 2004
  • Recently, XML (eXtensible Markup Language) has become the standard for exchanging documents on the web. As the amount of information grows with the development of Internet technology, the semantic web has emerged to provide more precise information retrieval than the existing web. Ontology, the basis of the semantic web, provides the underlying knowledge system for expressing a particular body of knowledge and can therefore yield precise retrieval results. An ontology defines the concepts of a specific domain and the relationships between those concepts, and has a hierarchy similar to a taxonomy. In this paper, we propose semi-automatic ontology generation based on XML documents, which many researchers regard as a means of knowledge expression. To construct an ontology for a particular domain, we suggest an algorithm for determining the domain; here the chosen domain is movie information extracted from the web. We use generalized association rules, a data mining method, over the tags and contents of XML documents to generate the ontology, and XTM (XML Topic Maps), an ISO standard, as the ontology language. The advantage of this method is that, because the ontology is built from terms that appear frequently in documents of the domain, it is well suited to querying and retrieving that domain.
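
As a loose illustration of mining rules from (tag, content) items extracted from XML documents, here is a toy pairwise association-rule sketch; the item encoding and the support/confidence thresholds are assumptions, and the paper's generalized association rules over a taxonomy are not reproduced.

    from itertools import combinations
    from collections import Counter

    def pair_rules(transactions, min_support=0.3, min_conf=0.7):
        """Toy pairwise association rules over item sets such as {"genre:action", ...}."""
        n = len(transactions)
        item_count = Counter(i for t in transactions for i in set(t))
        pair_count = Counter(p for t in transactions
                             for p in combinations(sorted(set(t)), 2))
        rules = []
        for (a, b), c in pair_count.items():
            if c / n < min_support:
                continue
            for x, y in ((a, b), (b, a)):
                conf = c / item_count[x]
                if conf >= min_conf:
                    rules.append((x, y, c / n, conf))   # rule x -> y with support, confidence
        return rules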

Moving Object Detection using Clausius Entropy and Adaptive Gaussian Mixture Model (클라우지우스 엔트로피와 적응적 가우시안 혼합 모델을 이용한 움직임 객체 검출)

  • Park, Jong-Hyun;Lee, Gee-Sang;Toan, Nguyen Dinh;Cho, Wan-Hyun;Park, Soon-Young
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.47 no.1
    • /
    • pp.22-29
    • /
    • 2010
  • Real-time detection and tracking of moving objects in video sequences is very important for smart surveillance systems. In this paper, we propose a novel algorithm for detecting moving objects: an entropy-based adaptive Gaussian mixture model (AGMM). First, an increase in entropy generally means an increase in complexity, and objects in unstable conditions cause larger entropy variations. Applying these properties to motion segmentation, pixels with large momentary changes in entropy are more likely to belong to moving objects. We therefore apply the Clausius entropy theory to convert pixel values in the image domain into amounts of energy change in the entropy domain. Second, we use an adaptive background subtraction method, which models the entropy variations of the background as a mixture of Gaussians, to detect moving objects. Experimental results demonstrate that our method detects moving objects effectively and reliably.
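
For the background-subtraction stage, a standard adaptive Gaussian mixture model is available in OpenCV; the sketch below applies it to raw frames, with the paper's Clausius-entropy conversion indicated only by a placeholder comment. The file name and parameters are illustrative.

    import cv2

    # Adaptive-GMM background subtraction (OpenCV MOG2); parameters are illustrative.
    subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16,
                                                    detectShadows=False)
    cap = cv2.VideoCapture("surveillance.avi")    # hypothetical input sequence
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # (paper) pixel values would first be converted to entropy-domain energy changes
        fg_mask = subtractor.apply(frame)          # per-pixel mixture update + classification
        num_moving_pixels = cv2.countNonZero(fg_mask)
    cap.release()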

New separation technique of regional-residual gravity anomaly using geostatistical spatial filtering (공간필터링을 이용한 중력이상의 광역-잔여 이상 효과 분리)

  • Rim, Hyoung-Rae;Park, Yeong-Sue;Lim, Mu-Teak;Koo, Sung-Bon;Lee, Young-Chal
    • 한국지구물리탐사학회:학술대회논문집
    • /
    • 2006.06a
    • /
    • pp.155-160
    • /
    • 2006
  • In this paper, we propose a spatial filtering scheme based on factorial kriging, a geostatistical filtering method, to separate regional and residual gravity anomalies. The scheme rests on the assumption that regional anomalies are correlated over long distances while residual anomalies act over shorter ranges. We decomposed the gravity anomalies into two variogram models with long and short effective ranges by means of factorial kriging, and the decomposed variogram models produced the regional and residual anomalies. The algorithm was examined on synthetic gravity data and then applied to real microgravity data to locate an abandoned mineshaft.
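
The separation rests on fitting a nested variogram with one short-range and one long-range structure and then kriging each structure separately (factorial kriging). A minimal sketch of such a nested model is below; the sills and ranges are illustrative, and the kriging step itself is not shown.

    import numpy as np

    def spherical(h, sill, rng):
        """Spherical variogram model."""
        h = np.asarray(h, dtype=float)
        g = sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
        return np.where(h < rng, g, sill)

    def nested_variogram(h):
        # short-range structure -> residual (local) anomalies,
        # long-range structure -> regional trend; values are illustrative
        return spherical(h, sill=0.4, rng=50.0) + spherical(h, sill=1.0, rng=800.0)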

Domain Decomposition Strategy for Pin-wise Full-Core Monte Carlo Depletion Calculation with the Reactor Monte Carlo Code

  • Liang, Jingang;Wang, Kan;Qiu, Yishu;Chai, Xiaoming;Qiang, Shenglong
    • Nuclear Engineering and Technology
    • /
    • v.48 no.3
    • /
    • pp.635-641
    • /
    • 2016
  • Because of prohibitive data storage requirements in large-scale simulations, memory is an obstacle for Monte Carlo (MC) codes in accomplishing pin-wise three-dimensional (3D) full-core calculations, particularly for whole-core depletion analyses. Various kinds of data are evaluated and the total memory requirements are analyzed quantitatively based on the Reactor Monte Carlo (RMC) code, showing that tally data, material data, and isotope densities in depletion are the three major parts of memory storage. The domain decomposition method is investigated as a means of saving memory, by dividing the spatial geometry into domains that are simulated separately by parallel processors. To keep particle tracking valid during transport simulations, particles need to be communicated between domains. In consideration of efficiency, an asynchronous particle communication algorithm is designed and implemented. Furthermore, we couple the domain decomposition method with the MC burnup process, under a strategy of utilizing a consistent domain partition in both the transport and depletion modules. A numerical test of 3D full-core burnup calculations is carried out, indicating that the RMC code, with the domain decomposition method, is capable of pin-wise full-core burnup calculations with millions of depletion regions.
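
A purely conceptual, serial sketch of the particle hand-off that domain decomposition requires is shown below; the names and the dictionary-based "communication" are illustrative stand-ins for the asynchronous, parallel implementation in RMC.

    def run_domain_decomposed_cycle(domains, banks, track):
        """One transport cycle under spatial domain decomposition (conceptual).

        domains : list of domain ids; banks[d] : particle bank of domain d;
        track(p, d) : user-supplied routine returning None if the particle's
        history ends inside d, or the id of the neighbouring domain it enters.
        """
        outgoing = {d: [] for d in domains}
        for d in domains:
            for p in banks[d]:
                dest = track(p, d)
                if dest is not None:
                    outgoing[dest].append(p)   # particle must continue on another domain
        # "communication" step: hand crossed particles to their new owners
        return outgoing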

Research on Speed Estimation Method of Induction Motor based on Improved Fuzzy Kalman Filtering

  • Chen, Dezhi;Bai, Baodong;Du, Ning;Li, Baopeng;Wang, Jiayin
    • Journal of international Conference on Electrical Machines and Systems
    • /
    • v.3 no.3
    • /
    • pp.272-275
    • /
    • 2014
  • An improved fuzzy Kalman-filtering speed-estimation scheme is proposed that measures the stator-side voltage and current based on the vector-control state equations of the induction motor. The designed fuzzy adaptive controller performs recursive online correction of the measurement-noise covariance matrix by monitoring the ratio between the theoretical and actual residuals, so that the matrix gradually approaches the real noise level, allowing the filter to perform an optimal estimate and improving the estimation accuracy of the EKF. Meanwhile, a co-simulation scheme based on MATLAB and Ansoft is proposed to improve simulation accuracy; it solves the field-circuit coupling problem of the induction motor under vector control and dramatically improves the accuracy of parameter optimization. The simulation and experimental results show that this algorithm strongly suppresses random measurement noise, estimates the motor speed accurately, and has good static and dynamic characteristics.
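
A rough sketch of adapting the measurement-noise covariance inside a Kalman step is given below. The fuzzy controller of the paper is replaced by a simple ratio test between the sample residual covariance and the theoretical innovation covariance; the window length and clipping bounds are assumptions.

    import numpy as np

    def adaptive_kf_step(x, P, z, F, H, Q, R, residual_history):
        """One Kalman step with a crude adaptive measurement-noise update (sketch)."""
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # innovation and its theoretical covariance
        y = z - H @ x
        S = H @ P @ H.T + R
        residual_history.append(np.outer(y, y))
        C_actual = np.mean(residual_history[-20:], axis=0)   # actual residual covariance
        ratio = np.trace(C_actual) / np.trace(S)
        R = R * np.clip(ratio, 0.5, 2.0)                      # nudge R toward the real noise level
        # update
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P, R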

Implementation of the Blood Pressure and Blood Flow Variation Rate Detection System using Impedance Method (임피던스법을 이용한 혈압 및 혈류 변화량 검출 시스템 구현)

  • Ro, Jung-Hoon;Bae, Jin-Woo;Ye, Soo-Young;Shin, Bum-Joo;Jeon, Gye-Rok
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.10 no.8
    • /
    • pp.1926-1938
    • /
    • 2009
  • In this study, a system for detecting the blood-flow variation rate was implemented using the change in bioelectric impedance during blood pressure measurement by the impedance method. Blood pressure was measured by the oscillometric method, and the mean arterial pressure was calculated using the maximum-amplitude algorithm. The systolic and diastolic pressures were estimated by establishing characteristic ratios according to the mean-arterial-pressure range. An alternating constant-current source and a lock-in amplifier were introduced for the impedance measurement. The variation of blood volume was measured from the variation of bio-impedance according to the induced cuff pressure at the measurement site.
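
A small sketch of the maximum-amplitude algorithm with fixed characteristic ratios follows; the ratios used here are common textbook values, whereas the paper establishes ratios that vary with the mean-arterial-pressure range.

    import numpy as np

    def oscillometric_bp(cuff_pressure, oscillation_amp, r_sys=0.55, r_dia=0.75):
        """Maximum-amplitude algorithm for oscillometric blood pressure (sketch).

        Both arrays are sampled during cuff deflation (pressure decreasing).
        """
        i_max = int(np.argmax(oscillation_amp))
        mean_arterial = cuff_pressure[i_max]              # MAP at the amplitude peak
        a_max = oscillation_amp[i_max]
        # systolic: on the high-pressure side, where amplitude ~ r_sys * a_max
        i_sys = int(np.argmin(np.abs(oscillation_amp[:i_max + 1] - r_sys * a_max)))
        # diastolic: on the low-pressure side, where amplitude ~ r_dia * a_max
        i_dia = i_max + int(np.argmin(np.abs(oscillation_amp[i_max:] - r_dia * a_max)))
        return cuff_pressure[i_sys], mean_arterial, cuff_pressure[i_dia]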

Combined Image Retrieval System using Clustering and Condensation Method (클러스터링과 차원축약 기법을 통합한 영상 검색 시스템)

  • Lee Se-Han;Cho Jungwon;Choi Byung-Uk
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.43 no.1 s.307
    • /
    • pp.53-66
    • /
    • 2006
  • This paper proposes a combined image retrieval system that gives the same relevance as an exhaustive search while considerably improving performance. The system combines two different retrieval methods, each of which returns the same results as a full exhaustive search, and both are two-stage methods: one uses condensation of feature vectors, and the other uses binary-tree clustering. Each method extracts candidate images that always include the correct answers at the first stage and filters out the incorrect images at the second stage; because both methods use the same similarity algorithm, they obtain the same result as a full exhaustive search. The first method condenses the dimension of the feature vectors and uses the condensed vectors to compute the similarity between the query and the database images; there is an optimal condensation ratio that minimizes the overall retrieval time, and this ratio is applied at the first stage. The binary-tree clustering method searches with recursive 2-means clustering, classifying each cluster dynamically with the same radius; to preserve relevance, the query range must be compensated at the first stage. After the candidate clusters are selected, the final results are retrieved by recomputing similarities at the second stage. The proposed system combines the two methods, and because they are independent of each other, the combined retrieval system achieves a remarkable improvement in performance.
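
The binary-tree clustering stage can be pictured with the recursive 2-means sketch below (NumPy arrays are assumed for vectors and ids). The leaf-size stopping rule and the greedy descent are simplifications; the paper stops clusters at a common radius and compensates the query range so that no correct answer is lost.

    import numpy as np
    from sklearn.cluster import KMeans

    def build_tree(vectors, ids, leaf_size=32):
        """Recursive 2-means clustering of feature vectors into a binary tree."""
        if len(ids) <= leaf_size:
            return {"ids": ids}
        km = KMeans(n_clusters=2, n_init=10).fit(vectors)
        left = km.labels_ == 0
        return {"centers": km.cluster_centers_,
                "children": [build_tree(vectors[left], ids[left], leaf_size),
                             build_tree(vectors[~left], ids[~left], leaf_size)]}

    def candidate_ids(tree, query):
        """Greedy descent toward the nearer centroid (no range compensation)."""
        while "children" in tree:
            d = np.linalg.norm(tree["centers"] - query, axis=1)
            tree = tree["children"][int(np.argmin(d))]
        return tree["ids"]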

A Study on Detecting Black IPs for Using Destination Ports of Darknet Traffic (다크넷 트래픽의 목적지 포트를 활용한 블랙 IP 탐지에 관한 연구)

  • Park, Jinhak;Kwon, Taewoong;Lee, Younsu;Choi, Sangsoo;Song, Jungsuk
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.27 no.4
    • /
    • pp.821-830
    • /
    • 2017
  • The Internet is an important infrastructure resource that underpins our country's economy and society and provides convenience and efficiency in everyday life. However, various incidents occur through the exploitation of vulnerabilities in this infrastructure, and attacks unknown to users are on the rise. Current security monitoring systems focus on pattern-based attack detection, while Internet threats keep increasing through intelligent and advanced attacks. Recently, the darknet has received attention as a means of detecting unknown attacks: because a darknet is a set of unused IP addresses, no real systems are connected to it. In this paper, we propose an algorithm for finding black IPs from collected darknet traffic based on statistics of destination-port information. The proposed method prepared a darknet space of 8,192 addresses and collected darknet traffic for three months of 2016, 827,254,121 packets in total. Applying the proposed algorithm, 19 black IPs were detected in June, 21 in July, and 17 in August. The analysis results identify the detection frequency of black IPs and find new black IPs that pose potential cyber threats.
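
A minimal sketch of flagging candidate black IPs from darknet flow records by per-source packet and destination-port statistics; the record format and thresholds are assumptions, not the criteria used in the paper.

    from collections import defaultdict

    def find_black_ips(flows, min_packets=10000, min_ports=100):
        """Flag candidate black IPs from darknet (src_ip, dst_port) records."""
        packets = defaultdict(int)
        ports = defaultdict(set)
        for src_ip, dst_port in flows:
            packets[src_ip] += 1                 # traffic volume per source
            ports[src_ip].add(dst_port)          # distinct destination ports probed
        return sorted(ip for ip in packets
                      if packets[ip] >= min_packets and len(ports[ip]) >= min_ports)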