• Title/Summary/Keyword: feature vector calculation

Search Results: 31

Image Retrieval Using Spacial Color Correlation and Local Texture Characteristics (칼라의 공간적 상관관계 및 국부 질감 특성을 이용한 영상검색)

  • Sung, Joong-Ki;Chun, Young-Deok;Kim, Nam-Chul
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.42 no.5 s.305
    • /
    • pp.103-114
    • /
    • 2005
  • This paper presents a content-based image retrieval (CBIR) method using a combination of color and texture features. As the color feature, a color autocorrelogram is chosen, extracted from the hue and saturation components of a color image. As the texture features, BDIP (block difference of inverse probabilities) and BVLC (block variation of local correlation coefficients) are chosen, extracted from the value component. When the features are extracted, the color autocorrelogram and the BVLC are simplified in consideration of their calculation complexity. After feature extraction, the vector components of these features are efficiently quantized in consideration of their storage space. Experiments on the Corel and VisTex DBs show that the proposed retrieval method yields a 9.5% maximum precision gain over the method using only the color autocorrelogram and 4.0% over the method using only BDIP and BVLC. The proposed method also yields 12.6%, 14.6%, and 27.9% maximum precision gains over the methods using wavelet moments, CSD, and color histogram, respectively.
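
The BDIP texture feature named above has a compact standard definition: for each B×B block, BDIP = B² − (sum of intensities) / (maximum intensity in the block). A minimal NumPy sketch of that definition, computed on the value channel (the block size and the epsilon guard are assumptions, not the paper's exact settings):

```python
import numpy as np

def bdip_map(value: np.ndarray, b: int = 2) -> np.ndarray:
    """BDIP per b x b block: b*b - sum(I) / max(I), on the V channel of an HSV image."""
    h, w = (value.shape[0] // b) * b, (value.shape[1] // b) * b
    v = value[:h, :w].astype(np.float64) + 1e-6              # guard against division by zero
    blocks = v.reshape(h // b, b, w // b, b).swapaxes(1, 2)  # -> (rows, cols, b, b)
    return b * b - blocks.sum(axis=(2, 3)) / blocks.max(axis=(2, 3))
```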

An implementation of automated ECG interpretation algorithm and system(IV) - Typificator (심전도 자동 진단 알고리즘 및 장치 구현(IV) - 특성표시기)

  • Kweon, H.J.;Jeong, K.S.;Song, C.G.;Shin, K.S.;Lee, M.H.
    • Proceedings of the KOSOMBE Conference
    • /
    • v.1996 no.05
    • /
    • pp.293-297
    • /
    • 1996
  • For representative beat calculation and efficient rhythm analysis, a new method, QRS typification, is proposed. The problems that arise from pattern classification based on binary logic are resolved by fuzzy clustering, and the number of classification nodes is reduced by using the proposed new feature vector. An accurate representative beat is obtained by excluding ST-T segments identified as outliers through an ST-T segment typification procedure.
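
The fuzzy clustering step mentioned above can be illustrated with the standard fuzzy c-means membership update, u_ij = 1 / Σ_k (d_ij / d_ik)^(2/(m−1)). A minimal sketch over QRS feature vectors (the fuzziness exponent m and the Euclidean distance are assumptions, not the paper's exact typificator):

```python
import numpy as np

def fcm_memberships(beats: np.ndarray, centers: np.ndarray, m: float = 2.0) -> np.ndarray:
    """Fuzzy c-means membership update over beat feature vectors (N x D) and
    cluster centers (C x D); each row of the result sums to 1 over the beat types."""
    d = np.linalg.norm(beats[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratio.sum(axis=2)
```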


Enhancement of the k-Means Clustering Speed by Emulation of Birds' Motion in Flock (새떼 이동의 모방에 의한 k-평균 군집 속도의 향상)

  • Lee, Chang-Young
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.9 no.9
    • /
    • pp.965-970
    • /
    • 2014
  • In an effort to improve the convergence speed of k-means clustering, we introduce the notion of birds' movement in a flock. Their motion is characterized by the observation that each bird follows its nearest neighbor. We utilize this feature in the clustering procedure: once the class of a vector is determined, a number of vectors in its vicinity are assigned to the same class. Experiments have shown that the number of iterations required for termination is significantly lower in the proposed method than in the conventional one. Furthermore, the calculation time per iteration is more than 5% shorter in the proposed case. The quality of the clustering, as measured by the total accumulated distance between each vector and its centroid, was found to be practically the same. In short, practically the same clustering result is obtained in a shorter computation time.
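
A hedged sketch of the neighbor-propagation idea described above: once a vector's class is found by a full distance calculation, its nearest neighbors inherit that class without their own distance calculations. The neighbor count, the visiting order, and the use of scikit-learn are assumptions, not the author's exact procedure:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def flock_assignment(X, centroids, n_follow=3, seed=0):
    """One k-means assignment step in which each newly labelled vector propagates
    its label to its n_follow nearest still-unlabelled neighbours (the "flock")."""
    rng = np.random.default_rng(seed)
    _, nbr = NearestNeighbors(n_neighbors=n_follow + 1).fit(X).kneighbors(X)
    labels = np.full(len(X), -1)
    for i in rng.permutation(len(X)):
        if labels[i] == -1:
            labels[i] = int(np.argmin(((X[i] - centroids) ** 2).sum(axis=1)))
            for j in nbr[i, 1:]:              # neighbours follow, skipping their own search
                if labels[j] == -1:
                    labels[j] = labels[i]
    return labels
```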

Quantization Based Speaker Normalization for DHMM Speech Recognition System (DHMM 음성 인식 시스템을 위한 양자화 기반의 화자 정규화)

  • 신옥근
    • The Journal of the Acoustical Society of Korea
    • /
    • v.22 no.4
    • /
    • pp.299-307
    • /
    • 2003
  • There have been many studies on speaker normalization, which aims to minimize the effect of a speaker's vocal tract length on the recognition performance of speaker-independent speech recognition systems. In this paper, we propose a simple vector-quantizer-based linear warping speaker normalization method, motivated by the observation that vector quantizers can be used successfully for speaker verification. For this purpose, we first generate an optimal codebook to serve as the basis of speaker normalization, and the warping factor of an unknown speaker is then extracted by comparing the feature vectors with the codebook. Finally, the extracted warping factor is used to linearly warp the Mel-scale filter bank used in MFCC calculation. To test the performance of the proposed method, a series of recognition experiments was conducted on a discrete HMM system with thirteen monosyllabic Korean number utterances. The results show that the word error rate is reduced by about 29% and that the proposed warping factor extraction method is attractive for its simplicity compared with other line-search warping methods.
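
A hedged sketch of the two steps described above: selecting the warping factor by minimum VQ distortion against the reference codebook, and linearly warping the Mel filter-bank centre frequencies. The extract_mfcc callback, the grid of candidate factors, and the clipping rule are assumptions, not the paper's exact procedure:

```python
import numpy as np

def warp_centres(centre_hz: np.ndarray, alpha: float, nyquist: float) -> np.ndarray:
    """Linearly warp Mel filter-bank centre frequencies, clipped to the usable band."""
    return np.clip(centre_hz * alpha, 0.0, nyquist)

def pick_warping_factor(signal, codebook, extract_mfcc,
                        alphas=np.arange(0.88, 1.13, 0.02)):
    """Return the alpha whose warped MFCCs give the lowest total VQ distortion.
    extract_mfcc(signal, alpha) is a hypothetical helper that computes MFCCs
    with the alpha-warped filter bank."""
    def distortion(feats):
        d = ((feats[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        return d.min(axis=1).sum()        # nearest-codeword distortion summed over frames
    return min(alphas, key=lambda a: distortion(extract_mfcc(signal, a)))
```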

Improvement of Domain-specific Keyword Spotting Performance Using Hybrid Confidence Measure (하이브리드 신뢰도를 이용한 제한 영역 핵심어 검출 성능향상)

  • 이경록;서현철;최승호;김진영
    • The Journal of the Acoustical Society of Korea
    • /
    • v.21 no.7
    • /
    • pp.632-640
    • /
    • 2002
  • In this paper, we propose an anti-filler confidence measure (ACM) to compensate for the shortcomings of the conventional RLJ-CM and the normalized CM (NCM), and we integrate the proposed ACM with the conventional NCM into a hybrid CM (HCM). The proposed ACM starts from the analysis that false acceptances (FA) are caused by the way the anti-phone model is constructed; to compensate for this, the actual phoneme sequence is estimated with a phoneme recognizer, defined as the anti-phone model, and used in the confidence measure calculation. Analysis of the two confidence measures shows that the conventional NCM performs well against false rejections (FR) while the proposed ACM performs well against FA, so their characteristics are complementary. Exploiting this, the two confidence measures are combined using a weighting vector α and defined as the HCM. Around a missed detection rate (MDR) of 10%, the HCM achieves 0.219 FA/KW/HR (false alarms per keyword per hour), a 22% improvement over using the conventional NCM alone.
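
A minimal sketch of the combination step, assuming the HCM is a weighted sum of the two measures and that a detected keyword is accepted when the hybrid score exceeds a threshold; the exact combination rule and threshold in the paper may differ:

```python
def hybrid_cm(acm: float, ncm: float, alpha: float = 0.5) -> float:
    """Hybrid confidence measure as a weighted combination of ACM and NCM scores."""
    return alpha * acm + (1.0 - alpha) * ncm

def accept_keyword(acm: float, ncm: float, alpha: float = 0.5,
                   threshold: float = 0.0) -> bool:
    """Accept a detected keyword only if its hybrid confidence clears the threshold."""
    return hybrid_cm(acm, ncm, alpha) > threshold
```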

Methods for Swing Recognition and Shuttle Cock's Trajectory Calculation in a Tangible Badminton Game (체감형 배드민턴 게임을 위한 스윙 인식과 셔틀콕 궤적 계산 방법)

  • Kim, Sangchul
    • Journal of Korea Game Society
    • /
    • v.14 no.2
    • /
    • pp.67-76
    • /
    • 2014
  • Recently, there has been much interest in tangible sports games that can recognize players' motions. In this paper, we propose essential technologies for such games: a method for swing motion recognition and a method for calculating the shuttlecock's trajectory. When a user performs a badminton swing while holding a smartphone, the motion signal generated by the smartphone's embedded acceleration sensors is transformed into a feature vector through a Daubechies filter, and the swing type is then recognized with a k-NN-based method. The swing recognition method presented here has the advantage that a player can enjoy tangible games without purchasing a commercial motion controller. Since a badminton shuttlecock has a particular flight trajectory due to its shape, it is not easy to calculate the trajectory using simple physics rules about force and velocity. In this paper, we propose a method for calculating the flight trajectory of a badminton shuttlecock in which the wind effect is considered.
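
A hedged sketch of a planar shuttlecock trajectory under gravity and quadratic air drag, integrated with explicit Euler; the drag constant, time step, and the drag-only model are assumptions standing in for the paper's wind-aware calculation:

```python
import numpy as np

def shuttlecock_path(speed, angle_deg, k_drag=0.3, g=9.81, dt=1e-3, t_max=3.0):
    """2-D trajectory with acceleration a = (0, -g) - k_drag * |v| * v."""
    theta = np.deg2rad(angle_deg)
    pos = np.array([0.0, 0.0])
    vel = speed * np.array([np.cos(theta), np.sin(theta)])
    path = [pos.copy()]
    for _ in range(int(t_max / dt)):
        acc = np.array([0.0, -g]) - k_drag * np.linalg.norm(vel) * vel
        vel = vel + acc * dt
        pos = pos + vel * dt
        path.append(pos.copy())
        if pos[1] < 0.0:          # shuttlecock reaches the floor
            break
    return np.array(path)
```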

Fast Motion Estimation for Variable Motion Block Size in H.264 Standard (H.264 표준의 가변 움직임 블록을 위한 고속 움직임 탐색 기법)

  • 최웅일;전병우
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.41 no.6
    • /
    • pp.209-220
    • /
    • 2004
  • The main features of the H.264 standard, compared with conventional video standards, are its high coding efficiency and network friendliness. In spite of these outstanding features, it is not easy to implement an H.264 codec as a real-time system because of its high memory bandwidth requirement and intensive computation. Although variable-block-size motion compensation using multiple reference frames is one of the key coding tools behind the main performance gain, it demands substantial computational complexity due to the SAD (sum of absolute differences) calculations over all possible combinations of coding modes needed to find the best motion vector. To speed up the motion estimation process, this paper therefore proposes fast algorithms for both integer-pel and fractional-pel motion search. Since many conventional fast integer-pel motion estimation algorithms are not suited to the variable motion block sizes of H.264, we propose a motion-field-adaptive search using a hierarchical block structure based on the diamond search, applicable to variable motion block sizes. In addition, we propose a fast fractional-pel motion search using a small diamond search centered on the predictive motion vector, based on the statistical characteristics of motion vectors.
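
A hedged sketch of the SAD cost and a small-diamond refinement around a predictive motion vector, illustrating the search pattern named above at integer-pel precision; the block handling, bounds checks, and step limit are assumptions:

```python
import numpy as np

SMALL_DIAMOND = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]

def sad(a: np.ndarray, b: np.ndarray) -> int:
    """Sum of absolute differences between two equally sized blocks."""
    return int(np.abs(a.astype(np.int32) - b.astype(np.int32)).sum())

def small_diamond_search(cur_blk, ref, x, y, pred_mv=(0, 0), max_steps=16):
    """Refine a predictive motion vector (pred_mv) for the block at (x, y)."""
    h, w = cur_blk.shape
    mvx, mvy = pred_mv
    for _ in range(max_steps):
        best = None
        for dx, dy in SMALL_DIAMOND:
            rx, ry = x + mvx + dx, y + mvy + dy
            if 0 <= rx and 0 <= ry and ry + h <= ref.shape[0] and rx + w <= ref.shape[1]:
                cost = sad(cur_blk, ref[ry:ry + h, rx:rx + w])
                if best is None or cost < best[0]:
                    best = (cost, dx, dy)
        if best is None or (best[1], best[2]) == (0, 0):
            break                        # the centre is the minimum: search has converged
        mvx, mvy = mvx + best[1], mvy + best[2]
    return mvx, mvy
```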

Berg Balance Scale Score Classification Study Using Inertial Sensor (관성센서를 이용한 버그균형검사 점수 분류 연구)

  • Hong, Sangpyo;Kim, Yeon-wook;Cho, WooHyeong;Joa, Kyung-Lim;Jung, Han-Young;Kim, K.S.;Lee, S.M.
    • Journal of rehabilitation welfare engineering & assistive technology
    • /
    • v.11 no.1
    • /
    • pp.53-62
    • /
    • 2017
  • In this paper, we present the score classification accuracy, obtained with machine learning, of the BBS (Berg Balance Scale), the most commonly used balance evaluation tool. Data acquisition was performed with the Noraxon system, whose inertial sensors were attached to the body at eight locations (left and right ankles, left and right upper buttocks, left and right wrists, back, and forehead). From the 3-axis accelerometer of each inertial sensor, STFT (Short-Time Fourier Transform) and SAM (Signal Area Magnitude) feature vectors were extracted. The BBS items were then divided into static and dynamic movements according to their motion characteristics, and feature vectors were selected according to the sensor attachment positions that affect the score of each BBS item. The feature vectors selected for each BBS item were classified using a GMM (Gaussian Mixture Model). For 40 subjects, the classification accuracies of the 14 BBS items were 55.5%, 72.2%, 87.5%, 50%, 35.1%, 62.5%, 43.3%, 58.6%, 60.7%, 33.3%, 44.8%, 89.2%, 51.8%, and 85.1%, respectively.
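
A hedged sketch of the per-item GMM classification: fit one Gaussian mixture per score class from training feature vectors and assign a new trial to the score with the highest log-likelihood. The component count and the scikit-learn usage are assumptions, not the paper's exact configuration:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_score_models(features_by_score: dict, n_components: int = 2) -> dict:
    """Fit one GMM per BBS score class; features_by_score maps score -> (N x D) array."""
    return {score: GaussianMixture(n_components=n_components, random_state=0).fit(X)
            for score, X in features_by_score.items()}

def classify_trial(models: dict, feature_vector: np.ndarray):
    """Return the score whose GMM assigns the highest log-likelihood to the trial."""
    x = np.atleast_2d(feature_vector)
    return max(models, key=lambda score: models[score].score(x))
```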

Design of Dynamic Buffer Assignment and Message model for Large-scale Process Monitoring of Personalized Health Data (개인화된 건강 데이터의 대량 처리 모니터링을 위한 메시지 모델 및 동적 버퍼 할당 설계)

  • Jeon, Young-Jun;Hwang, Hee-Joung
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.15 no.6
    • /
    • pp.187-193
    • /
    • 2015
  • The ICT healing platform pursues goals such as preventing chronic diseases and issuing early disease warnings based on personal information such as bio-signals and life habits. The two-step open system (TOS) was designed as a relay between the healing platform and the storage of personal health data, and it adopted a publish/subscribe (pub/sub) service over large-scale connections to transmit (monitor) the data processing in real time. In the early TOS pub/sub design, however, the same buffer was allocated to every connection, regardless of connection idling and message type, when encoding connection messages with the deflate algorithm. The dynamic buffer allocation proposed in this study works as follows: the message transmission types of each connection are first queued; each queue is converted into a feature vector through tf-idf and fed into k-means clustering to form clusters; connections belonging to a given cluster re-allocate resources according to that cluster's resource table; the centroid of each cluster selects in advance a queuing pattern that represents the cluster and presents it as a resource reference table (encoding efficiency by buffer size); and the design trades off the computational resources for clustering and feature calculation against network bandwidth. In this way the encoding buffer resources of TOS are allocated efficiently to the network connections, increasing the tps (the number of real-time data processing and monitoring connections per unit time) of TOS.
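
A hedged sketch of the clustering step described above: each connection's queued message-type sequence is turned into a tf-idf vector and the connections are grouped with k-means, after which each cluster's label would index a buffer-size table. The cluster count, tokenisation, and scikit-learn usage are assumptions:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

def cluster_connections(queued_types, n_clusters=4):
    """queued_types: one list of message-type tokens per connection.
    Returns a cluster label per connection plus the fitted model."""
    docs = [" ".join(tokens) for tokens in queued_types]
    tfidf = TfidfVectorizer(token_pattern=r"\S+")
    X = tfidf.fit_transform(docs)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    return km.labels_, km        # labels index the per-cluster buffer-size (resource) table
```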

A Study on Management Method of Infectious Wastes Applying RFID (감염성 폐기물 관리를 위한 RFID 적용에 관한 연구)

  • Joung, Lyang-Jae;Sung, Nak-Chang;Kang, Hean-Chan;Kang, Dae-Seong
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.8 no.1
    • /
    • pp.63-72
    • /
    • 2007
  • Recently, as the risk of infection from infectious wastes has come to be recognized, the problems of managing and treating infectious wastes have become a social issue. In this paper, by making it possible to monitor the whole process from the origin of the infectious waste to the processing plant using RFID, a key next-generation technology, we try to solve the problem of secondary infection caused by inefficient treatment of infectious wastes. With the system suggested in this paper, the procedural business records are stored and monitored, so the management status can be checked immediately, and the miswriting and input errors found in existing practices, such as documentary record-keeping by the manager and computer entry through a web application, are avoided. The biometric information for personal authentication is handled by storing the feature vector calculated with the PCA algorithm on the tag. The result is a more systematic and safer management plan than the previous one, while keeping managers attentive to the wastes.
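
A hedged sketch of the PCA-based feature vector mentioned above: a projection is learned from enrolled biometric samples, and a new sample is projected onto it to obtain the compact vector written to the RFID tag. The component count and scikit-learn usage are assumptions, not the paper's exact pipeline:

```python
import numpy as np
from sklearn.decomposition import PCA

def enroll_pca(enrolled_samples: np.ndarray, n_components: int = 16) -> PCA:
    """Learn the PCA projection from flattened enrolled biometric samples (N x D)."""
    return PCA(n_components=n_components).fit(enrolled_samples)

def feature_vector_for_tag(pca: PCA, sample: np.ndarray) -> np.ndarray:
    """Compact PCA feature vector to be stored on the RFID tag."""
    return pca.transform(sample.reshape(1, -1))[0]
```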
