• Title/Summary/Keyword: Test Vectors


A Weighted Random Pattern Testing Technique for Path Delay Fault Detection in Combinational Logic Circuits (조합 논리 회로의 경로 지연 고장 검출을 위한 가중화 임의 패턴 테스트 기법)

  • 허용민;임인칠
    • Journal of the Korean Institute of Telematics and Electronics A
    • /
    • v.32A no.12
    • /
    • pp.229-240
    • /
    • 1995
  • This paper proposes a new weighted random pattern testing technique for detecting path delay faults in combinational logic circuits. When computing the probability of a signal transition at the primitive logic elements of the CUT (Circuit Under Test) from the primary inputs, the proposed technique uses information on the structure of the CUT for the initialization vectors and vectors produced by a pseudo-random pattern generator for the test vectors. Many paths can be sensitized by allocating a weight value to each signal line that accounts for the difference in the levels of the logic elements. Using the ISCAS '85 benchmark circuits, we show that the proposed technique outperforms existing testing methods in terms of test length and fault coverage, and that it generates more robust test vectors for the longest and near-longest paths.

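For illustration only, a minimal Python sketch of the weighted random pattern idea is given below: each primary input is assigned a signal probability (weight), and initialization/test vector pairs are drawn according to those weights. The weights and function names here are hypothetical; the paper derives its weights from the CUT structure and the level differences of the logic elements, which is not reproduced.

```python
# Minimal sketch of weighted random pattern generation for delay-fault testing.
# The per-input weights are placeholders; the paper computes them from the CUT
# structure and logic-element level differences.
import random

def weighted_random_vector(weights, rng):
    """Return one input vector; bit i is 1 with probability weights[i]."""
    return [1 if rng.random() < w else 0 for w in weights]

def generate_test_pairs(weights, n_pairs, seed=0):
    """Delay-fault tests are vector pairs: an initialization vector V1
    followed by a test vector V2 that launches the transition."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n_pairs):
        v1 = weighted_random_vector(weights, rng)   # initialization vector
        v2 = weighted_random_vector(weights, rng)   # test (transition-launching) vector
        pairs.append((v1, v2))
    return pairs

if __name__ == "__main__":
    example_weights = [0.5, 0.7, 0.3, 0.5]  # hypothetical per-input weights
    for v1, v2 in generate_test_pairs(example_weights, n_pairs=3):
        print(v1, "->", v2)
```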

A Test Vector Reordering for Switching Activity Reduction During Test Operation Considering Fanout (테스트시 스위칭 감소를 위해 팬 아웃을 고려한 테스트벡터 재 정렬)

  • Lee, Jae-Hoon;Baek, Chul-Ki;Kim, In-Soo;Min, Hyoung-Bok
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.60 no.5
    • /
    • pp.1043-1048
    • /
    • 2011
  • Test vector reordering is a very effective way to reduce power consumption during test application. However, it is a time-consuming and complicated process, and it does not consider the internal circuit structure, which may limit its effectiveness. In this paper, we first order the test vectors using the fanout counts of the primary inputs, which takes the internal circuit structure into account and may reduce the switching activity. We then reorder the test vectors again using the Hamming distance between vectors. We propose the FOVO algorithm, which combines these two ideas and is an effective way to reduce power consumption during test application. Applied to benchmark circuits, the algorithm reduces power consumption by an average of more than 3.5%.
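A hedged sketch of the reordering idea follows, assuming a greedy nearest-neighbour tour under a fanout-weighted Hamming distance; the actual FOVO ordering rules are defined in the paper, and the fanout counts and function names below are illustrative only.

```python
# Toy test-vector reordering to reduce switching activity: a greedy
# nearest-neighbour tour under a fanout-weighted Hamming distance, so
# transitions on high-fanout primary inputs are penalized more heavily.
def weighted_hamming(v1, v2, fanout):
    return sum(f for a, b, f in zip(v1, v2, fanout) if a != b)

def reorder(vectors, fanout):
    remaining = list(vectors)
    ordered = [remaining.pop(0)]          # start from the first vector
    while remaining:
        last = ordered[-1]
        nxt = min(remaining, key=lambda v: weighted_hamming(last, v, fanout))
        remaining.remove(nxt)
        ordered.append(nxt)
    return ordered

if __name__ == "__main__":
    fanout = [3, 1, 2, 1]                                  # hypothetical fanout counts
    tests = [[0, 0, 1, 1], [1, 1, 0, 0], [0, 1, 1, 0], [1, 0, 0, 1]]
    for v in reorder(tests, fanout):
        print(v)
```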

SMC: A Seed Merging Compression for Test Data (시드 병합을 통한 테스트 데이터의 압축방법)

  • Lee Min-joo;Jun Sung-hun;Kim Yong-joon;Kang Sung-ho
    • Journal of the Institute of Electronics Engineers of Korea SD
    • /
    • v.42 no.9 s.339
    • /
    • pp.41-50
    • /
    • 2005
  • As circuits become larger, testing requires a greater test data volume and a longer test application time. To reduce both, a new test data compression/decompression method is proposed. The proposed method is based on an XOR network and uses don't-care bits to improve the compression ratio during seed vector generation. After the seed vectors are produced, they can be merged using two prefix codes. Reusing a merged seed vector requires only one clock cycle, so test application time is reduced substantially. Experimental results on the larger ISCAS '89 benchmark circuits demonstrate the efficiency of the proposed method.
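As a rough illustration of the merging step only, the snippet below shows the usual compatibility check and bitwise merge for test cubes containing don't-care bits ('X'); the SMC scheme's XOR-network seed generation and its two prefix codes are not reproduced, and the helper names are hypothetical.

```python
# Hedged illustration of merging test cubes that contain don't-care bits ('X').
def compatible(a, b):
    """Two cubes are mergeable if no position has conflicting specified bits."""
    return all(x == y or x == 'X' or y == 'X' for x, y in zip(a, b))

def merge(a, b):
    """Take the specified bit wherever one cube has it; 'X' only if both do."""
    return ''.join(y if x == 'X' else x for x, y in zip(a, b))

if __name__ == "__main__":
    s1, s2, s3 = "1X0X", "110X", "0XX1"
    print(compatible(s1, s2), merge(s1, s2))   # True 110X
    print(compatible(s1, s3))                  # False: bit 0 conflicts
```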

CCQC modal combination rule using load-dependent Ritz vectors

  • Xiangxiu Li;Huating Chen
    • Structural Engineering and Mechanics
    • /
    • v.87 no.1
    • /
    • pp.57-68
    • /
    • 2023
  • The response spectrum method is still an effective approach for the design of buildings with supplemental dampers. In practice, the complex complete quadratic combination (CCQC) rule is used within the response spectrum method to account for the effect of non-classical damping. The conventional CCQC rule is based on exact complex mode vectors. Some of the calculated complex mode vectors may not be excited by the external loading, and errors in the structural responses arise from mode truncation. Load-dependent Ritz (LDR) vectors are associated with the external loading, so LDR vectors that are not excited are automatically excluded, and the contributions of higher modes are implicitly contained in the LDR vectors through their static responses. To improve calculation efficiency and accuracy, LDR vectors are introduced into the CCQC rule in the present study. First, a generation procedure for LDR vectors suitable for non-classically damped systems is presented. Compared with conventional LDR vectors, the LDR vectors here are complex-valued and are named complex LDR (CLDR) vectors. Based on the CLDR vectors, the CCQC rule is then rederived and an improved response spectrum method is developed. Finally, the effectiveness of the proposed method is verified on three typical non-classically damped buildings. Numerical results show that a CLDR vector is superior to the complex mode of the same number in the calculation. Since the generation of CLDR vectors requires less computational cost and storage space, the proposed method offers an attractive alternative, especially for structures with a large number of degrees of freedom.
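For background, the classical CQC-type combination of modal peak responses is sketched below in LaTeX; the paper's CCQC rule generalizes the correlation terms to complex, non-classically damped modes built from the CLDR vectors, and that full expression is not reproduced here.

```latex
% Classical CQC estimate of a peak response r from modal peaks r_i with
% cross-modal correlation coefficients rho_ij (rho_ii = 1); the paper's CCQC
% rule extends this form to complex modes.
r \;\approx\; \sqrt{\sum_{i=1}^{N}\sum_{j=1}^{N} \rho_{ij}\, r_i\, r_j}
```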

Test Vector Generator of timing simulation for 224-bit ECDSA hardware (224비트 ECDSA 하드웨어 시간 시뮬레이션을 위한 테스트벡터 생성기)

  • Kim, Tae Hun;Jung, Seok Won
    • Journal of Internet of Things and Convergence
    • /
    • v.1 no.1
    • /
    • pp.33-38
    • /
    • 2015
  • Hardware is developed in various architectures. For timing simulation, it is necessary to verify the values of the variables generated in each module at every clock cycle. In this paper, a software test vector generator produces test vectors for the timing simulation of 224-bit ECDSA hardware modules under development. It provides the test vectors both through a GUI and as text files.
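A minimal sketch of such a software generator is given below, assuming the NIST P-224 curve and the Python 'cryptography' package; the file layout, field names, and hash choice (SHA-224) are assumptions, since the paper's hardware module interfaces are not described in the abstract.

```python
# Sketch of a software test-vector generator for a 224-bit ECDSA design: it
# dumps key, public point, and (r, s) signature values to a text file that a
# testbench could read. Field names and file format are assumptions.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.asymmetric.utils import decode_dss_signature

def make_vector(message: bytes) -> dict:
    key = ec.generate_private_key(ec.SECP224R1())
    sig = key.sign(message, ec.ECDSA(hashes.SHA224()))
    r, s = decode_dss_signature(sig)
    pub = key.public_key().public_numbers()
    return {
        "d": key.private_numbers().private_value,  # private scalar
        "Qx": pub.x, "Qy": pub.y,                  # public key point
        "r": r, "s": s,                            # signature pair
    }

if __name__ == "__main__":
    vec = make_vector(b"timing-simulation test message")
    with open("ecdsa224_vectors.txt", "w") as f:
        for name, value in vec.items():
            f.write(f"{name} = {value:056x}\n")    # 224 bits = 56 hex digits
```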

A Study on the Classification of Ultrasonic Liver Images Using Multi Texture Vectors and a Statistical Classifier (다중 거칠기 벡터와 통계적 분류기를 이용한 초음파 간 영상 분류에 관한 연구)

  • 정정원;김동윤
    • Journal of Biomedical Engineering Research
    • /
    • v.17 no.4
    • /
    • pp.433-442
    • /
    • 1996
  • Since a single texture property (e.g., coarseness, orientation, regularity, granularity) of ultrasound liver images is not sufficient to classify the characteristics of livers, we used multiple texture vectors extracted from ultrasound liver images together with a statistical classifier. The multi texture vectors are selected from the feature vectors of normal, fatty, and cirrhotic liver images that show good separability in those images. The statistical classifier takes the multi texture vectors as inputs and classifies ultrasound liver images for each texture vector by the Bayes decision rule. The final decision on the liver disease is then made by choosing the class with the maximum average a posteriori probability over the texture vectors. In our simulation, we obtained a higher correct classification ratio than methods using a single feature vector: on the test set, the correct ratio is 94% for normal liver, 84% for fatty liver, and 86% for cirrhotic liver.

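Only the decision step is sketched below: assuming each texture feature vector already yields class posteriors from some statistical classifier, the final diagnosis takes the class with the largest posterior averaged over the feature vectors. The class names and numbers are made up for illustration.

```python
# Sketch of the combination step only: average the per-class posteriors over
# all texture feature vectors and pick the largest one.
import numpy as np

CLASSES = ["normal", "fatty", "cirrhosis"]

def combine_decisions(posteriors_per_vector):
    """posteriors_per_vector: array of shape (n_feature_vectors, n_classes)."""
    mean_posterior = np.mean(posteriors_per_vector, axis=0)
    return CLASSES[int(np.argmax(mean_posterior))], mean_posterior

if __name__ == "__main__":
    # hypothetical posteriors from three texture feature vectors
    p = np.array([[0.6, 0.3, 0.1],
                  [0.5, 0.2, 0.3],
                  [0.7, 0.2, 0.1]])
    label, avg = combine_decisions(p)
    print(label, avg)   # -> "normal" plus the averaged posteriors
```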

Robust Histogram Equalization Using Compensated Probability Distribution

  • Kim, Sung-Tak;Kim, Hoi-Rin
    • MALSORI
    • /
    • v.55
    • /
    • pp.131-142
    • /
    • 2005
  • A mismatch between the training and test conditions often causes a drastic decrease in the performance of speech recognition systems. In this paper, non-linear transformation techniques based on histogram equalization in the acoustic feature space are studied to reduce this mismatch. The purpose of histogram equalization (HEQ) is to convert the probability distribution of the test speech into the probability distribution of the training speech. Conventional histogram equalization methods consider only the probability distribution of the test speech, but for noise-corrupted test speech this distribution is itself distorted. A transformation function obtained from the distorted distribution may mis-transform the feature vectors, degrading the performance of histogram equalization. This paper therefore proposes a new method that computes a noise-removed probability distribution, under the assumption that the CDF of noisy speech feature vectors consists of a speech-feature component and a noise-feature component, and uses this compensated distribution in the HEQ process. In the AURORA-2 framework, the proposed method reduced the error rate by over 44% under the clean training condition compared with the baseline system, and it also outperforms the baseline under the multi-condition training setup.

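A sketch of plain feature-space histogram equalization (quantile mapping of test features onto the training CDF) is given below; the paper's contribution, compensating the test CDF for the noise component, is not reproduced, and the toy Gaussian data are illustrative only.

```python
# Plain HEQ: map each test feature value through the empirical test CDF and
# then the inverse training CDF, so transformed test data follow the training
# distribution. The noise-compensation step of the paper is omitted.
import numpy as np

def heq_transform(train_feats, test_feats):
    """Per-dimension quantile mapping of test features onto the training CDF."""
    train_sorted = np.sort(train_feats)
    # rank of each test value within the test data -> empirical CDF in (0, 1)
    ranks = np.argsort(np.argsort(test_feats))
    cdf = (ranks + 0.5) / len(test_feats)
    # inverse training CDF evaluated at those probabilities
    quantiles = (np.arange(len(train_sorted)) + 0.5) / len(train_sorted)
    return np.interp(cdf, quantiles, train_sorted)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    train = rng.normal(0.0, 1.0, 5000)          # clean training features
    test = rng.normal(2.0, 0.5, 1000)           # mismatched (shifted) test features
    mapped = heq_transform(train, test)
    print(mapped.mean(), mapped.std())          # roughly matches training stats
```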

BIST implementation with test point insertion (테스트 포인트 삽입에 의한 내장형 자체 테스트 구현)

  • 장윤석;이정한;김동욱
    • Proceedings of the IEEK Conference
    • /
    • 1998.10a
    • /
    • pp.1069-1072
    • /
    • 1998
  • Recent developments in design automation technology and manufacturing methods have reduced the cost of chips, but testing ICs has become more difficult because test techniques have not kept up. In IC testing, obtaining test vectors that can distinguish good chips from bad ones is very important, but with increasing complexity this is becoming very difficult. Another problem is that the chip may suffer physical and electrical damage during testing, and there is also difficulty in synchronizing the CUT (circuit under test) with the test equipment [1]. Because of these difficulties, built-in self-test has been proposed, and reducing test time as well as obtaining test vectors has become a pressing issue. This paper presents a new BIST (built-in self-test) method. The proposed BIST implementation reduces test time and obtains high fault coverage. By finding the internal nodes at which test point cells [2] are inserted and allocating TPG (test pattern generation) stages, the test length becomes much shorter.

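For context, the snippet below sketches the kind of pseudo-random test pattern generator used inside a BIST controller, here a 4-bit maximal-length LFSR; the paper's test-point-cell insertion and TPG stage allocation are not reproduced, and the seed and width are arbitrary.

```python
# A 4-bit Fibonacci LFSR: the feedback bit is the XOR of the two most
# significant state bits, which yields a maximal-length sequence of 15
# non-zero patterns before repeating.
def lfsr_patterns(seed=0b1001, n_bits=4, count=15):
    state = seed
    for _ in range(count):
        yield state
        feedback = ((state >> (n_bits - 1)) ^ (state >> (n_bits - 2))) & 1
        state = ((state << 1) | feedback) & ((1 << n_bits) - 1)

if __name__ == "__main__":
    for p in lfsr_patterns():
        print(f"{p:04b}")   # prints all 15 non-zero 4-bit patterns once
```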

An Improved K-means Document Clustering using Concept Vectors

  • Shin, Yang-Kyu
    • Journal of the Korean Data and Information Science Society
    • /
    • v.14 no.4
    • /
    • pp.853-861
    • /
    • 2003
  • An improved K-means document clustering method is presented in which a concept vector is maintained for each cluster on the basis of the cosine similarity of text documents. The concept vectors are unit vectors normalized on the n-dimensional sphere. Because the standard K-means method is sensitive to its initial starting condition, our improvement focuses on the starting condition used to estimate the modes of the distribution. The improved K-means clustering algorithm is applied to a set of text documents, called Classic3, to test the efficiency and correctness of the clustering result, and it shows a 7% improvement in the worst case.

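A minimal sketch of concept-vector (spherical) K-means is shown below: documents are unit vectors, each cluster keeps a concept vector equal to the normalized mean of its members, and assignment uses cosine similarity; the paper's improved initialization is not reproduced, and the toy data are random.

```python
# Spherical K-means sketch: unit-length documents, cosine-similarity
# assignment, and re-normalized cluster means as concept vectors.
import numpy as np

def spherical_kmeans(docs, k, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    docs = docs / np.linalg.norm(docs, axis=1, keepdims=True)   # unit vectors
    concepts = docs[rng.choice(len(docs), size=k, replace=False)]
    for _ in range(n_iter):
        sims = docs @ concepts.T                 # cosine similarities
        labels = np.argmax(sims, axis=1)
        for j in range(k):
            members = docs[labels == j]
            if len(members):
                c = members.sum(axis=0)
                concepts[j] = c / np.linalg.norm(c)   # re-normalized concept vector
    return labels, concepts

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.abs(rng.normal(size=(100, 20)))       # toy nonnegative "tf-idf" matrix
    labels, concepts = spherical_kmeans(X, k=3)
    print(np.bincount(labels))
```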

Efficient Test Data Compression and Low Power Scan Testing in SoCs

  • Jung, Jun-Mo;Chong, Jong-Wha
    • ETRI Journal
    • /
    • v.25 no.5
    • /
    • pp.321-327
    • /
    • 2003
  • Testing time and power consumption during the testing of SoCs are becoming increasingly important as the volume of test data for the intellectual property cores in SoCs grows. This paper presents a new algorithm that reduces both the scan-in power and the test data volume using a modified scan latch reordering algorithm. We apply scan latch reordering to minimize the column Hamming distance of the scan vectors, and during reordering the don't-care inputs in the scan vectors are assigned for low power and high compression. Experimental results for the ISCAS '89 benchmark circuits show that reduced test data and low-power scan testing are achieved in all cases.

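As a hedged sketch of two ingredients mentioned in the abstract, the snippet below greedily orders scan cells (columns of the test set) so adjacent columns have a small Hamming distance and fills don't-care bits with the previous specified value; the paper's actual reordering and don't-care assignment rules may differ, and the example vectors are made up.

```python
# (1) Greedy scan-cell (column) ordering by pairwise Hamming distance over
#     specified bits only, and (2) minimum-transition fill of 'X' bits.
def column_hamming(col_a, col_b):
    return sum(1 for x, y in zip(col_a, col_b) if 'X' not in (x, y) and x != y)

def greedy_column_order(columns):
    remaining = list(range(len(columns)))
    order = [remaining.pop(0)]
    while remaining:
        last = columns[order[-1]]
        nxt = min(remaining, key=lambda i: column_hamming(last, columns[i]))
        remaining.remove(nxt)
        order.append(nxt)
    return order

def mt_fill(vector):
    """Replace each 'X' with the last specified bit (minimum-transition fill)."""
    out, last = [], '0'
    for b in vector:
        last = last if b == 'X' else b
        out.append(last)
    return ''.join(out)

if __name__ == "__main__":
    vectors = ["1XX0", "10X1", "0X11"]
    columns = list(zip(*vectors))                 # one tuple per scan cell
    print(greedy_column_order(columns))           # a low-transition cell order
    print([mt_fill(v) for v in vectors])          # X's filled for low power
```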