• Title/Summary/Keyword: threshold approach

Search Results: 511

A Tolerant Rough Set Approach for Handwritten Numeral Character Classification

  • Kim, Daijin;Kim, Chul-Hyun
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 1998.06a
    • /
    • pp.288-295
    • /
    • 1998
  • This paper proposes a new data classification method based on the tolerant rough set, which extends the conventional equivalence-based rough set. The similarity between two data points is measured by a distance function over all constituent attributes, and two points are defined to be tolerant when their similarity exceeds a similarity threshold value. Determining the optimal similarity threshold value is crucial for accurate classification, so we determine it with a genetic algorithm (GA), whose goal of evolution is to balance two requirements, such that (1) tolerant objects are included in the same class as much as possible. After finding the optimal similarity threshold value, a tolerant set of each object is obtained, and the data set is grouped into the lower and upper approximation sets depending on the coincidence of their classes. We propose a two-stage classification method: all data are first classified using the lower approximation, and the data left unclassified at the first stage are then classified using the rough membership functions obtained from the upper approximation set. We apply the proposed classification method to the handwritten numeral character classification problem and compare its classification performance and learning time with those of the feedforward neural network's backpropagation algorithm.
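The tolerance relation and the lower/upper approximations described above can be sketched as follows. This is an illustrative plain-Python version, not the authors' code; the similarity function (normalized L1) and the toy data are assumptions.

```python
def similarity(x, y):
    """Similarity derived from an attribute-wise distance (normalized L1)."""
    d = sum(abs(a - b) for a, b in zip(x, y)) / len(x)
    return 1.0 - d

def tolerant_set(i, data, threshold):
    """Indices of all objects tolerant to object i (similarity >= threshold)."""
    return {j for j, y in enumerate(data) if similarity(data[i], y) >= threshold}

def lower_upper(data, labels, threshold, cls):
    """Lower/upper approximation sets of class `cls` under the tolerance relation."""
    lower, upper = set(), set()
    for i in range(len(data)):
        ts_labels = {labels[j] for j in tolerant_set(i, data, threshold)}
        if ts_labels == {cls}:
            lower.add(i)   # every tolerant neighbour agrees on the class
        if cls in ts_labels:
            upper.add(i)   # at least one tolerant neighbour has the class
    return lower, upper

data = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.8), (0.85, 0.75)]
labels = [0, 0, 1, 1]
low, up = lower_upper(data, labels, threshold=0.8, cls=0)
```

In the paper's two-stage scheme, objects in the lower approximation are classified directly, and the rest fall back to rough membership functions from the upper approximation; the GA search for the threshold itself is omitted here.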


A Correlative Approach for Identifying Complex Phases by Electron Backscatter Diffraction and Transmission Electron Microscopy

  • Na, Seon-Hyeong;Seol, Jae-Bok;Jafari, Majid;Park, Chan-Gyung
    • Applied Microscopy
    • /
    • v.47 no.1
    • /
    • pp.43-49
    • /
    • 2017
  • A new method was introduced to distinguish ferrite, bainite and martensite in transformation-induced plasticity (TRIP) steel by using electron backscatter diffraction (EBSD) and transmission electron microscopy (TEM). EBSD is a very powerful microstructure analysis technique at length scales ranging from tens of nanometers to millimeters. However, iron BCC phases such as ferrite, bainite and martensite cannot be easily distinguished by EBSD because of their similar surface morphology and crystallographic structure. Among the various EBSD-based methodologies, image quality (IQ) values, which represent the perfection of a crystal lattice, were used to distinguish the iron BCC phases. IQ values are a very useful tool for discerning the iron BCC phases because the phases differ in crystal-defect density and lattice distortion. However, problems remain that make the separation of bainite and martensite difficult: these phases often have very similar IQ values, especially in deformed regions, so even with IQ values it has been hard to tell them apart. For a more precise separation of bainite and martensite, IQ threshold values were determined by correlative TEM analysis. By determining these threshold values, the iron BCC phases were successfully separated.
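Once the IQ thresholds are calibrated (in the paper, against TEM), phase labeling reduces to a simple cutoff rule. The sketch below is illustrative only; the cutoff values are made up, not taken from the paper.

```python
def classify_phase(iq, t_martensite=0.35, t_bainite=0.6):
    """Map a normalized image-quality (IQ) value to a BCC phase label.
    Lower IQ implies more lattice distortion (martensite); high IQ, ferrite.
    The two thresholds are hypothetical placeholders for TEM-calibrated values."""
    if iq < t_martensite:
        return "martensite"
    if iq < t_bainite:
        return "bainite"
    return "ferrite"

phases = [classify_phase(v) for v in (0.2, 0.5, 0.9)]
```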

A Study on Classification and Localization of Structural Damage through Wavelet Analysis

  • Koh, Bong-Hwan;Jung, Uk
    • Proceedings of the Korean Society for Noise and Vibration Engineering Conference
    • /
    • 2007.11a
    • /
    • pp.754-759
    • /
    • 2007
  • This study exploits the data-discriminating capability of silhouette statistics, combined with a wavelet-based vertical energy threshold technique, for extracting damage-sensitive features and clustering signals of the same class. The threshold technique first selects a suitable subset of the extracted or modified features: a good predictor set should contain features strongly correlated with the characteristics of the data, regardless of the classification method used, while the features themselves remain as uncorrelated with each other as possible. Silhouette statistics have been used to assess the quality of clustering by measuring how well an object is assigned to its corresponding cluster; we adopt this concept for the discriminant power function used in this paper. Simulation results for damage detection in a truss structure show that the proposed approach can successfully locate both open- and breathing-type damage even in the presence of considerable process and measurement noise.
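The silhouette statistic mentioned above can be computed as below. This is a minimal plain-Python sketch of the standard definition; the real method operates on wavelet-domain energy features, which are not reproduced here.

```python
def silhouette(i, cluster, other_clusters, dist):
    """Silhouette value of cluster[i]: (b - a) / max(a, b), where a is the mean
    intra-cluster distance and b is the mean distance to the nearest other cluster."""
    p = cluster[i]
    a = sum(dist(p, q) for j, q in enumerate(cluster) if j != i) / max(len(cluster) - 1, 1)
    b = min(sum(dist(p, q) for q in c) / len(c) for c in other_clusters)
    return (b - a) / max(a, b)

dist = lambda p, q: abs(p - q)          # 1-D toy distance
c1, c2 = [1.0, 1.2, 0.8], [5.0, 5.5]    # two well-separated toy clusters
s = silhouette(0, c1, [c2], dist)       # close to 1: point is well assigned
```

Values near 1 indicate a well-assigned object, which is what the paper's discriminant power function builds on.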


Assessment of the Risk of Exposure to Chemical Carcinogens

  • Purchase, Iain F.H.
    • Toxicological Research
    • /
    • v.17
    • /
    • pp.41-45
    • /
    • 2001
  • The methods used for risk assessment from exposure to chemicals are well established. In most cases where toxicity other than carcinogenesis is being considered, the standard method relies on establishing the No Observed Adverse Effect Level (NOAEL) in the most sensitive animal toxicity study and applying an appropriate safety factor (SF) to determine the exposure associated with an acceptable risk. For carcinogens a different approach is used, because it has been argued that there is no threshold of effect; thus mathematical equations are used to extrapolate from the high doses used in animal experiments. These methods have been strongly criticised in recent years on several grounds. The most cogent criticisms are (a) that the equations are not based on a thorough understanding of the mechanisms of carcinogenesis, and (b) that the outcome of a risk assessment based on such models varies more with changes to the assumptions and equations used than with the data derived from carcinogenicity experiments. Other criticisms include the absence of any measure of the variance of the risk assessment and the selection of very conservative default values. Recent advances in the application of risk assessment emphasise that measures of both exposure and hazard should be treated as distributions of values; the outcome of such a risk assessment is an estimate of the distribution of risks.


Equity in urban households' out-of-pocket payments for health care

  • Lee Weon Young
    • Health Policy and Management
    • /
    • v.15 no.1
    • /
    • pp.30-56
    • /
    • 2005
  • This paper used two threshold approaches, developed by Wagstaff and van Doorslaer, to measure the equity in urban households' out-of-pocket payments for health care from 1997 to 2002. One approach considers catastrophic health expenditure, meaning payments that exceed a pre-specified proportion of total consumption expenditure or ability to pay; the other considers impoverishment, i.e., whether payments drive households into poverty. The indices for catastrophic expenditure capture its intensity as well as its incidence, and also the degree to which catastrophic payments occur disproportionately among poor households; the measures of poverty impact likewise capture both intensity and incidence. The methods were applied to data on out-of-pocket payments from the Urban Household Expenditure Survey. Both the incidence and the intensity of catastrophic payments, in terms of total household consumption as well as ability to pay, increased between 1997 and 2002, and both became less concentrated among the poor, though more concentrated in 2001 than in 1997. The incidence and intensity of the poverty impact of out-of-pocket payments also increased between 1997 and 2002. The health security system may not have provided financial protection against catastrophic health expenditure to low-income households because of a high user-fee policy that does not consider income level. Policies alleviating catastrophic health payments among the poor need further development, and the two threshold approaches should be further evaluated in our policy context.
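The incidence (headcount) and intensity (overshoot) measures for catastrophic payments can be sketched as below, following the Wagstaff/van Doorslaer definitions named above. The 10% threshold and the toy budget shares are illustrative assumptions.

```python
def catastrophic_stats(oop_shares, threshold=0.10):
    """oop_shares: out-of-pocket payments as a share of ability to pay,
    one value per household. Returns (headcount, mean overshoot)."""
    hits = [s for s in oop_shares if s > threshold]
    headcount = len(hits) / len(oop_shares)                         # incidence
    overshoot = sum(s - threshold for s in hits) / len(oop_shares)  # intensity
    return headcount, overshoot

shares = [0.02, 0.05, 0.12, 0.30]   # toy data: two households exceed 10%
hc, ov = catastrophic_stats(shares)
```

The concentration indices used in the paper (how far these measures skew toward poor households) would weight each household by its income rank; that step is omitted here.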

Intrusion Detection for Black Hole and Gray Hole in MANETs

  • She, Chundong;Yi, Ping;Wang, Junfeng;Yang, Hongshen
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.7 no.7
    • /
    • pp.1721-1736
    • /
    • 2013
  • Black hole and gray hole attacks are a kind of routing-disruption attack and can severely damage the network, so an efficient algorithm to detect them is important. This paper presents an adaptive approach to detecting black hole and gray hole attacks in ad hoc networks based on a cross-layer design. In the network layer, we propose a path-based method that overhears the next hop's action; this scheme sends no extra control packets and saves the system resources of the detecting node. In the MAC layer, a collision-rate reporting system is established to estimate a dynamic detection threshold so as to lower the false positive rate under high network load. We choose the DSR protocol to test our algorithm and ns-2 as our simulation tool. The experimental results verify our theory: the average detection rate is above 90% and the false positive rate is below 10%. Moreover, the adaptive threshold strategy helps decrease the false positive rate.
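The adaptive-threshold idea can be sketched as follows: the tolerated packet-drop ratio rises with the reported MAC-layer collision rate, so congestion losses are not mistaken for malicious dropping. The linear scaling rule and all constants are assumptions for illustration, not the paper's formula.

```python
def detection_threshold(collision_rate, base=0.2, k=0.5):
    """Allowed packet-drop ratio before a next hop is flagged as malicious.
    Higher collision rate -> more drops tolerated (capped at 1.0)."""
    return min(1.0, base + k * collision_rate)

def is_suspicious(drop_ratio, collision_rate):
    """Path-based overhearing yields drop_ratio for the next hop."""
    return drop_ratio > detection_threshold(collision_rate)

flag_idle = is_suspicious(0.4, collision_rate=0.0)  # low load: flagged
flag_busy = is_suspicious(0.4, collision_rate=0.6)  # heavy load: tolerated
```

Raising the threshold under load is exactly what drives down the false positives the abstract reports.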

Network Neutrality in the Digital Convergence Era : a System Dynamics Model with Two-Sided Market Framework (디지털 컨버전스 환경에서 양면시장 플랫폼으로서의 인터넷망 중립성에 관한 동태적 분석)

  • Kim, Do-Hoon
    • Journal of Information Technology Services
    • /
    • v.10 no.2
    • /
    • pp.75-94
    • /
    • 2011
  • The industrial ecosystem around Internet services has been evolving since the Internet was first introduced. The net neutrality issue best represents this process of evolution and presents an inevitable challenge that the industry must overcome. This paper addresses the structural change with the two-sided market framework and provides a System Dynamics (SD) model to evaluate the economic implications of the net neutrality policy. In particular, our approach analyzes the policy impacts when two competing network providers play the role of the platform in a typical two-sided market, connecting content providers (CPs) with users. Previous studies show that the indirect network externality between these two markets makes the entire system tip toward one platform. When multi-homing in the CP market is allowed, as in our model, however, that argument may lose its validity. To examine the system behavior, we conduct SD simulations of our model. The simulation results show that coexistence of the competing platforms persists when the network effects are above a certain threshold, and our experimental outcomes suggest that the net neutrality policy lowers that threshold.
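A toy discrete-time rendition of the tipping-versus-coexistence dynamic is sketched below: the indirect network externality pulls CPs toward the leading platform, multi-homing damps that pull, and a differentiation force pulls shares back toward parity. All parameters and the functional form are invented for illustration; they are not the paper's SD model.

```python
def simulate(network_effect, multi_homing, differentiation=1.0, steps=400):
    """Evolve platform A's market share from 0.6. Tipping occurs when the
    multi-homing-damped externality outweighs the differentiation force."""
    a = 0.6
    for _ in range(steps):
        ext = network_effect * (1 - multi_homing) * (2 * a - 1)
        da = 0.1 * (ext - differentiation * (a - 0.5))
        a = min(1.0, max(0.0, a + da))
    return a

tipped = simulate(network_effect=2.0, multi_homing=0.0)   # no multi-homing: tips
shared = simulate(network_effect=2.0, multi_homing=0.9)   # strong multi-homing: coexists
```

The qualitative point matches the abstract's claim that allowing CP multi-homing can invalidate the winner-take-all prediction.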

NOGSEC: A NOnparametric method for Genome SEquence Clustering

  • 이영복;김판규;조환규
    • Korean Journal of Microbiology
    • /
    • v.39 no.2
    • /
    • pp.67-75
    • /
    • 2003
  • One large topic in comparative genomics is predicting functional annotation by classifying protein sequences. Computational approaches to function prediction include protein structure prediction, sequence alignment, and domain or binding-site prediction. This paper takes another computational approach: searching for sets of homologous sequences in a sequence-similarity graph. Methods based on a similarity graph need no prior knowledge about the sequences, but they depend heavily on the researcher's subjective threshold settings. In this paper, we propose a genome sequence clustering method of iterative testing and graph decomposition, together with a simple method to calculate a strict threshold that has biochemical meaning. The proposed method was applied to known bacterial genome sequences and the results were compared with those of the BAG algorithm. The resulting clusters lack some completeness, but their confidence level is very high, and the method requires no user-defined thresholds.
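The core step of clustering a similarity graph can be sketched as below: keep only edges at or above the threshold, then take connected components as candidate clusters. NOGSEC additionally iterates statistical tests and graph decomposition, which this sketch omits; the toy edge weights are invented.

```python
def components(n, edges, threshold):
    """Connected components of the graph on nodes 0..n-1, keeping only
    edges (i, j, sim) whose similarity meets the threshold."""
    adj = {i: set() for i in range(n)}
    for i, j, sim in edges:
        if sim >= threshold:
            adj[i].add(j)
            adj[j].add(i)
    seen, clusters = set(), []
    for start in range(n):
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:                      # iterative depth-first search
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adj[v] - comp)
        seen |= comp
        clusters.append(comp)
    return clusters

edges = [(0, 1, 0.9), (1, 2, 0.85), (2, 3, 0.3), (3, 4, 0.95)]
clusters = components(5, edges, threshold=0.8)  # weak 2-3 edge is dropped
```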

Gradient field based method for segmenting 3D point cloud

  • Vu, Hoang;Chu, Phuong;Cho, Seoungjae;Zhang, Weiqiang;Wen, Mingyun;Sim, Sungdae;Kwak, Kiho;Cho, Kyungeun
    • Annual Conference of KIPS
    • /
    • 2016.10a
    • /
    • pp.733-734
    • /
    • 2016
  • This study proposes a novel approach for ground segmentation of a 3D point cloud. We combine two techniques: gradient threshold segmentation and mean height evaluation. The acquired 3D point cloud is represented as a graph data structure by exploiting the structure of a 2D reference image. The ground parts near the sensor position are segmented with the gradient threshold technique; for sparse regions, we separate ground from non-ground using a technique called mean height evaluation. The main contribution of this study is a new ground segmentation algorithm that works well with 3D point clouds from various environments. The processing time is acceptable, allowing the algorithm to run in real time.
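The gradient-threshold part can be sketched for a single scan line: walking outward from the sensor, a point remains ground while the height gradient to its inner neighbour stays below a slope threshold. The 1-D simplification, the threshold value, and the toy scan are assumptions; the paper works on a full graph built from the 2D reference image.

```python
def segment_ground(points, max_slope=0.15):
    """points: (range, height) pairs ordered by distance from the sensor.
    Labels each point ground/nonground by the height gradient to its
    inner neighbour; the nearest point is assumed to be ground."""
    labels = ["ground"]
    for (r0, h0), (r1, h1) in zip(points, points[1:]):
        grad = abs(h1 - h0) / max(r1 - r0, 1e-6)
        labels.append("ground" if grad < max_slope else "nonground")
    return labels

scan = [(1.0, 0.00), (2.0, 0.05), (3.0, 0.08), (3.5, 0.60)]  # last: obstacle
labels = segment_ground(scan)
```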

Residual Echo Suppression Based on Tracking Echo-Presence Uncertainty (Tracking Echo-Presence Uncertainty 기반의 잔여 반향 억제)

  • Park, Yun-Sik;Chang, Joon-Hyuk
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.34 no.10C
    • /
    • pp.955-960
    • /
    • 2009
  • In this paper, we propose a novel residual echo suppression (RES) algorithm based on tracking echo-presence uncertainty (TEPU) to improve the performance of acoustic echo suppression (AES) in the frequency domain. In the proposed method, the ratio of the microphone input power to the echo-suppressed output signal power is employed as the threshold value in the decision rule that estimates the echo-presence uncertainty applied to the RES filter. The proposed RES scheme estimates the echo-presence uncertainty in each frequency bin and effectively reduces the residual echo signal in a simple fashion. The performance of the proposed algorithm is evaluated by objective tests and yields better results than conventional schemes.
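A per-bin sketch of the decision idea: when the echo-suppressed output retains little of the microphone input's power in a bin, the AES has already removed substantial echo energy there, so echo is likely present and extra suppression is applied. The binary gain, the threshold, and the toy spectra are illustrative assumptions, not the paper's TEPU estimator.

```python
def res_gains(mic_power, out_power, threshold=0.5, res_gain=0.2):
    """Per-frequency-bin residual-suppression gains from the power ratio of
    the echo-suppressed output to the microphone input."""
    gains = []
    for p_mic, p_out in zip(mic_power, out_power):
        ratio = p_out / max(p_mic, 1e-12)
        # Low ratio -> AES removed much energy -> echo likely; suppress more.
        gains.append(res_gain if ratio < threshold else 1.0)
    return gains

mic = [1.0, 1.0, 1.0]
out = [0.9, 0.3, 0.05]   # heavily suppressed bins suggest residual echo
gains = res_gains(mic, out)
```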