• Title/Summary/Keyword: Tolerance Verification


On the Simple Speaker Verification System Using Tolerance Interval Analysis Without Background Speaker Models (Tolerance Interval Analysis를 이용한 배경화자 없는 간단한 화자인증시스템에 관한 연구)

  • Choi, Hong-Sub
    • MALSORI
    • /
    • no.56
    • /
    • pp.147-158
    • /
    • 2005
  • In this paper, we focus on developing a simplified speaker verification algorithm without background speaker models, to be adopted in portable speaker verification systems embedded in terminals such as mobile phones and PMPs. According to tolerance interval analysis, the population of a speaker's model can be represented by a suitable number of independently selected samples of that model, so we can build a representative speaker model and threshold under a specified confidence level and coverage. With the proposed algorithm and 40 samples, experiments show a false rejection rate of 3.0% and a false acceptance rate of 4.3%, which compare favorably with the conventional method's 5.4% and 5.5%, respectively. The next step of this research will be suitable adaptation methods to overcome speech variation due to aging and operating environments.
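
The thresholding step described in the abstract can be sketched with a one-sided normal tolerance bound. This is an illustrative reconstruction, not the paper's code: `one_sided_k` uses the Natrella closed-form approximation to the exact noncentral-t tolerance factor, and the enrollment match scores are assumed approximately Gaussian.

```python
from statistics import NormalDist, mean, stdev

def one_sided_k(n, coverage=0.95, confidence=0.95):
    # Natrella approximation to the one-sided normal tolerance factor
    # (exact values come from the noncentral-t distribution).
    z_p = NormalDist().inv_cdf(coverage)
    z_g = NormalDist().inv_cdf(confidence)
    a = 1 - z_g ** 2 / (2 * (n - 1))
    b = z_p ** 2 - z_g ** 2 / n
    return (z_p + (z_p ** 2 - a * b) ** 0.5) / a

def verification_threshold(scores, coverage=0.95, confidence=0.95):
    # Lower tolerance bound on the enrolled speaker's own match scores:
    # with the given confidence, at least `coverage` of the speaker's
    # future scores should fall above this threshold, so no background
    # speaker model is needed to set it.
    k = one_sided_k(len(scores), coverage, confidence)
    return mean(scores) - k * stdev(scores)
```

With n = 40 enrollment scores, as in the experiments above, the 95%/95% factor is about 2.1, so the accept threshold sits roughly 2.1 sample standard deviations below the enrollment mean.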


Probabilistic Soft Error Detection Based on Anomaly Speculation

  • Yoo, Joon-Hyuk
    • Journal of Information Processing Systems
    • /
    • v.7 no.3
    • /
    • pp.435-446
    • /
    • 2011
  • Microprocessors are becoming increasingly vulnerable to soft errors due to current semiconductor technology scaling. Traditional redundant multi-threading architectures provide perfect fault tolerance by re-executing all computations. However, such full re-execution significantly increases the verification workload on processor resources, resulting in severe performance degradation. This paper presents a pro-active verification management approach that mitigates the verification workload to increase performance with minimal effect on overall reliability. An anomaly-speculation-based filter checker is proposed to assign verification priority before the re-execution process starts. This technique exploits a value similarity property, defined by the frequent occurrence of partially identical values. Based on the biased distribution of the similarity distance measure, this paper further investigates exploiting similar values for soft error tolerance with anomaly speculation. Extensive measurements show that the majority of instructions produce values that differ from the previous result value in only a few bits. Experimental results show that the proposed scheme makes the processor 180% faster than a traditional fully-fault-tolerant processor with minimal impact on the overall soft error rate.
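
The value-similarity measure the abstract relies on can be illustrated as a bit-level Hamming distance between consecutive result values. The function names and the `anomaly_bits` cutoff below are assumptions for illustration, not the paper's implementation:

```python
def similarity_distance(prev, curr, width=64):
    # Similarity distance = number of differing bits (Hamming distance)
    # between the previous and current result value of an instruction.
    return bin((prev ^ curr) & ((1 << width) - 1)).count("1")

def verification_priority(prev, curr, anomaly_bits=8):
    # Filter checker: results nearly identical to the previous value are
    # speculated correct and verified lazily; results far away in bit
    # distance look anomalous and are re-executed first.
    return "high" if similarity_distance(prev, curr) > anomaly_bits else "low"
```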

Torusity Tolerance Verification using Swarm Intelligence

  • Prakasvudhisarn, Chakguy;Kunnapapdeelert, Siwaporn
    • Industrial Engineering and Management Systems
    • /
    • v.6 no.2
    • /
    • pp.94-105
    • /
    • 2007
  • Measurement technology plays an important role in the discrete manufacturing industry. Probe-type coordinate measuring machines (CMMs) are normally used to capture the geometry of part features. The measured points are then fit to verify a specified geometry, typically using the least squares method (LSQ). However, LSQ occasionally overestimates the tolerance zone, which leads to the rejection of some good parts. To overcome this drawback, minimum zone approaches defined by the ANSI Y14.5M-1994 standard have been extensively pursued in the coordinate metrology literature for basic features such as the plane, circle, cylinder, and sphere. Meanwhile, complex features such as the torus have been left to be dealt with by the profile tolerance definition, which may be impractical when accuracy of the whole profile is desired. Hence, a true deviation model of the torus is developed and formulated as a minimax problem. Next, a relatively new and simple population-based evolutionary approach, particle swarm optimization (PSO), which imitates the social behavior of animals, is applied to find the minimum tolerance zone torusity. Simulated data with specified torusity zones are used to validate the deviation model. The torusity results are in close agreement with the actual torusity zones and confirm the effectiveness of the proposed PSO compared to the LSQ.
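
As a rough sketch of the approach described above (not the authors' code), the torus deviation model and a bare-bones global-best PSO over the major and tube radii might look like this. The center and axis of the torus are assumed known, and the inertia and acceleration constants are conventional defaults:

```python
import math
import random

def torus_deviation(p, R, r):
    # Normal deviation of point p = (x, y, z) from an ideal torus centered
    # at the origin with axis z, major radius R, and tube radius r.
    x, y, z = p
    return math.hypot(math.hypot(x, y) - R, z) - r

def torusity_pso(points, bounds, n_particles=30, iters=200, seed=0):
    # Minimize the maximum absolute deviation (minimax zone) over (R, r)
    # with global-best PSO; bounds = [(R_lo, R_hi), (r_lo, r_hi)].
    rng = random.Random(seed)
    def cost(q):
        return max(abs(torus_deviation(p, *q)) for p in points)
    swarm = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [s[:] for s in swarm]
    pcost = [cost(s) for s in swarm]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i, s in enumerate(swarm):
            for d in range(2):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - s[d])
                             + 1.5 * rng.random() * (gbest[d] - s[d]))
                s[d] += vel[i][d]
            c = cost(s)
            if c < pcost[i]:
                pbest[i], pcost[i] = s[:], c
                if c < gcost:
                    gbest, gcost = s[:], c
    return gbest, gcost  # fitted (R, r) and half-width of the torusity zone
```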

A Study of Alignment Tolerance's Definition and Test Method for Airborne Camera (항공기 탑재용 카메라 정렬오차 정의 및 시험방안 연구)

  • Song, Dae-Buem;Yoon, Yong-Eun;Lee, Hang-Bok
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.16 no.2
    • /
    • pp.154-159
    • /
    • 2013
  • Alignment tolerance for an EO/IR airborne camera using common optics is an important factor in stabilization accuracy and geo-pointing accuracy. Before an airborne camera is mounted on the aircraft, defining its alignment tolerance and verifying it is essential in production as well as in research and development. In this paper we establish the basic concept, definition, and elements of alignment tolerance for an airborne camera and propose how to measure each of those elements. The components and measurement sequence of alignment tolerance are as follows: 1) tolerance of alignment between the EO and IR LOS; 2) tolerance of sensor alignment; 3) tolerance of position-reporting accuracy; 4) tolerance of mount alignment.

Design of Plasma Cutting Torch by Tolerance Propagation Analysis (공차누적해석을 이용한 플라즈마 절단토치의 설계에 관한 연구)

  • 방용우;장희석;양진승
    • Journal of Welding and Joining
    • /
    • v.18 no.3
    • /
    • pp.122-130
    • /
    • 2000
  • Due to inherent dimensional uncertainty, tolerances accumulate in the assembly of a plasma cutting torch, and this accumulation has a serious effect on the torch's performance. This study proposes a statistical tolerance propagation model based on matrix transforms. The model can predict the final tolerance distributions of the completed plasma torch assembly from the prescribed statistical tolerance distribution of each part to be assembled. Verification of the proposed model was performed with Monte Carlo simulation, which generates a large number of discrete plasma torch assembly instances by randomly selecting a point within each tolerance region according to the prescribed statistical distribution. The Monte Carlo results show good agreement with those of the proposed model. These results are promising in that the final tolerance distributions can be predicted before the assembly process, providing great benefit at the assembly design stage of the plasma torch.
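
The Monte Carlo verification step can be illustrated with a deliberately simplified one-dimensional stack-up (the paper's model composes full matrix transforms of part frames instead); the part specifications below are made up for illustration:

```python
import random
import statistics

def monte_carlo_stackup(part_specs, n_trials=100_000, seed=1):
    # Each part is (nominal, sigma). The assembly dimension is modeled
    # here as a simple linear stack: the sum of independently sampled
    # part dimensions, each drawn from its prescribed normal tolerance
    # distribution.
    rng = random.Random(seed)
    totals = [sum(rng.gauss(nom, sig) for nom, sig in part_specs)
              for _ in range(n_trials)]
    return statistics.mean(totals), statistics.stdev(totals)
```

For a linear stack the simulated standard deviation should approach the root-sum-square of the part sigmas, which gives a quick sanity check on the simulation.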


The Effect of Annular Projection Collapse on Tolerance of ECV Assembly (링 프로젝션 돌기의 용입정도가 ECV 조립공차에 미치는 영향)

  • Chang, Hee-Seok;Won, Woong-Yeon;Choi, Duk-Jun;Kim, Jong-Ho;Kim, Jin-Sang;Nahm, Tak-Hyun;Kang, Hee-Jong
    • Journal of Welding and Joining
    • /
    • v.30 no.1
    • /
    • pp.78-84
    • /
    • 2012
  • Due to inherent dimensional uncertainty, tolerances accumulate in the final assembly, and this accumulation has a serious effect on the performance of the ECV assembly. This paper proposes a method of tolerance accumulation analysis using Monte Carlo simulation that includes the welding step in the assembly process. The method can predict the final tolerance distributions of the completed assembly from the prescribed statistical tolerance distribution of each part to be assembled. With the inclusion of welding, additional dimensional uncertainty due to partial melting must be accounted for as well; partial melting of the projection height was therefore included in the tolerance propagation analysis. Verification of the proposed method was performed with Monte Carlo simulation, whose results are promising in that the final tolerance distributions of precision machinery can be predicted before the actual assembly process.

A Privacy-preserving Data Aggregation Scheme with Efficient Batch Verification in Smart Grid

  • Zhang, Yueyu;Chen, Jie;Zhou, Hua;Dang, Lanjun
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.15 no.2
    • /
    • pp.617-636
    • /
    • 2021
  • This paper presents a privacy-preserving data aggregation scheme that deals with multidimensional data, which is rarely addressed in research on smart grids. We use the Paillier cryptosystem and a blinding-factor technique to encrypt the multidimensional data as a whole, and take advantage of the homomorphic property of the Paillier cryptosystem to achieve data aggregation. Signatures and efficient batch verification are also applied in our scheme for data integrity and quick verification; the batch verification requires only 2 pairing operations. Our scheme also supports fault tolerance, meaning it still works even if some smart meters fail. In addition, we give two extensions of our scheme: one computes a fixed user's time-of-use electricity bill, and the other effectively and quickly handles dynamic user situations. In the security analysis, we prove the unforgeability and the security of batch verification in detail, and briefly introduce other security features. Performance analysis shows that our scheme has lower computational complexity and communication overhead than existing schemes.
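
The aggregation core of such a scheme can be sketched with a toy Paillier implementation. This is illustrative only: it omits the blinding factors, signatures, and batch verification of the actual scheme, and uses key sizes far too small for real use.

```python
import math
import secrets

def _is_prime(n, rounds=20):
    # Miller-Rabin probabilistic primality test.
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = secrets.randbelow(n - 3) + 2
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def _gen_prime(bits):
    while True:
        p = secrets.randbits(bits) | (1 << (bits - 1)) | 1
        if _is_prime(p):
            return p

def keygen(bits=256):
    # Toy Paillier keys with g = n + 1; real deployments need >= 2048-bit n.
    p, q = _gen_prime(bits // 2), _gen_prime(bits // 2)
    n = p * q
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    mu = pow(lam, -1, n)
    return n, (lam, mu)

def encrypt(n, m):
    # Enc(m) = (1 + n)^m * r^n mod n^2, and (1 + n)^m = 1 + m*n (mod n^2).
    r = secrets.randbelow(n - 1) + 1
    return (1 + m * n) * pow(r, n, n * n) % (n * n)

def decrypt(n, lam, mu, c):
    # Dec(c) = L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) // n.
    return (pow(c, lam, n * n) - 1) // n * mu % n

def aggregate(n, ciphertexts):
    # Homomorphic addition: multiplying ciphertexts adds the underlying
    # meter readings, so the aggregator never sees any plaintext.
    agg = 1
    for c in ciphertexts:
        agg = agg * c % (n * n)
    return agg
```

In a smart-grid setting each meter encrypts its reading under the utility's public key `n`, the aggregator multiplies the ciphertexts, and only the utility (holding `lam`, `mu`) decrypts the total consumption.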

Distributed OCSP Certificate Verification Model for Reducing Response Time (응답시간 단축을 위한 분산 OCSP 인증서 검증 모델)

  • Choi Seung kwon;Jang Yoon sik;Ji Hong il;Shin Seung soo;Cho Yong hwan
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.30 no.4A
    • /
    • pp.304-311
    • /
    • 2005
  • OCSP can suspend, revoke, and correct certificate status in real time. However, as more clients use OCSP server verification, more updated information is needed, which can lead to congestion at the OCSP server. This paper applies a distributed OCSP server architecture to keep certificate verification from congesting a single OCSP server and to solve the problems of an intensively centralized structure. Simulation results show that the average reply time for certificate verification requests and the server load are reduced when using distributed OCSP. In addition, resource distribution and fault tolerance are gained from the multiple OCSP servers.

A Profile Tolerance Usage in GD&T for Precision Manufacturing (정밀제조를 위한 기하공차에서의 윤곽공차 사용)

  • Kim, Kyung-Wook;Chang, Sung-Ho
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.40 no.2
    • /
    • pp.145-149
    • /
    • 2017
  • One of the challenges facing precision manufacturers is the increasing feature complexity of tight-tolerance parts. All engineering drawings must account for the size, form, orientation, and location of all features to ensure manufacturability, measurability, and design intent. Geometric controls per ASME Y14.5 are typically applied to specify dimensional tolerances on engineering drawings and define the size, form, orientation, and location of features. Many engineering drawings lack the geometric dimensioning and tolerancing necessary for timely and accurate inspection and verification. Plus-minus tolerancing is typically ambiguous and requires extra time by engineering, programming, machining, and inspection functions to debate and agree on a single conclusion. Complex geometry can result in long inspection and verification times and put even the most sophisticated measurement equipment and processes to the test. In addition, design, manufacturing, and quality engineers are often frustrated by communication errors over these features. However, an approach called profile tolerancing offers optimal definition of design intent by explicitly defining uniform boundaries around the physical geometry, and it is an efficient and effective method for measurement and quality control. There are several advantages for product designers who use position and profile tolerancing instead of linear dimensioning. When design intent is conveyed unambiguously, manufacturers don't have to field multiple questions from suppliers as they design and build a process for manufacturing and inspection. Profile tolerancing, applied correctly, provides manufacturing and inspection functions with unambiguously defined, manufacturable, and measurable tolerances. Customers can see cost and lead-time reductions with parts that consistently meet the design intent, and components can function properly, eliminating costly rework, redesign, and missed market opportunities. However, a supplier poised to embrace profile tolerancing will no doubt run into resistance from those who would prefer the way things have always been done; it is not just internal naysayers but also suppliers that might fight the change. In addition, the investment for suppliers can be steep in terms of training, equipment, and software.

Tolerance Computation for Process Parameter Considering Loss Cost : In Case of the Larger is better Characteristics (손실 비용을 고려한 공정 파라미터 허용차 산출 : 망대 특성치의 경우)

  • Kim, Yong-Jun;Kim, Geun-Sik;Park, Hyung-Geun
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.40 no.2
    • /
    • pp.129-136
    • /
    • 2017
  • With the information technology and automation that have rapidly developed in the manufacturing industries, tens of thousands of quality variables are estimated and stored in databases every day. Existing statistical methods, and variable selection and interpretation by experts, place limits on proper judgment, so various data mining methods, including decision tree analysis, have been developed in recent years. CART and C5.0 are representative decision tree algorithms, but they have limits in defining the tolerance of continuous explanatory variables, and their target variables are restricted to information that indicates only product quality, such as the defect rate. It is therefore essential to develop an algorithm that improves upon CART and C5.0 and gives access to new quality information such as loss cost. In this study, a new algorithm was developed not only to find the major variables that minimize the target variable, loss cost, but also to overcome the limits of CART and C5.0. The new algorithm defines the tolerance of variables systematically by splitting each continuous explanatory variable into 3 categories. Larger-the-better characteristics were assumed in an R programming environment to compare the performance of the new algorithm with the existing ones, and 10 simulations were performed with 1,000 data sets for each variable. The performance of the new algorithm was verified through a mean test of loss cost. The verification showed that the tolerances the new algorithm finds for continuous explanatory variables lower loss cost more than the existing ones for larger-the-better characteristics. In conclusion, the new algorithm can be used to find tolerances of continuous explanatory variables that minimize process loss, taking the loss cost of the products into account.
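
The three-category tolerance search described above can be sketched as an exhaustive scan over pairs of cut points. This is an illustrative reconstruction in Python rather than the authors' R code; the grid resolution and function name are assumptions:

```python
import statistics
from itertools import combinations

def three_way_tolerance(x, loss, n_grid=20):
    # Scan pairs of cut points from an evenly spaced grid, split x into
    # three categories, and return the category (as an interval on x)
    # whose observations have the lowest mean loss cost.
    lo, hi = min(x), max(x)
    grid = [lo + (hi - lo) * i / n_grid for i in range(1, n_grid)]
    best = None
    for c1, c2 in combinations(grid, 2):
        bins = ([l for xi, l in zip(x, loss) if xi <= c1],
                [l for xi, l in zip(x, loss) if c1 < xi <= c2],
                [l for xi, l in zip(x, loss) if xi > c2])
        if any(not b for b in bins):
            continue  # every category must contain data
        means = [statistics.mean(b) for b in bins]
        edges = [(lo, c1), (c1, c2), (c2, hi)]
        k = min(range(3), key=lambda i: means[i])
        if best is None or means[k] < best[0]:
            best = (means[k], edges[k])
    return best  # (mean loss in best category, tolerance interval for x)
```

Run on synthetic data whose loss is minimized near a known operating point, the returned interval should bracket that point, which is the sense in which the tolerance "minimizes the loss in the process."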