• Title/Summary/Keyword: Vector Machines

Search Results: 534

Recognition of rolling bearing fault patterns and sizes based on two-layer support vector regression machines

  • Shen, Changqing;Wang, Dong;Liu, Yongbin;Kong, Fanrang;Tse, Peter W.
    • Smart Structures and Systems
    • /
    • v.13 no.3
    • /
    • pp.453-471
    • /
    • 2014
  • The fault diagnosis of rolling element bearings has drawn considerable research attention in recent years because these fundamental elements frequently suffer failures that can result in unexpected machine breakdowns. Artificial intelligence algorithms such as artificial neural networks (ANNs) and support vector machines (SVMs) have been widely investigated for identifying various faults. However, as a bearing's useful life deteriorates, identifying early bearing faults and evaluating the extent of their development are necessary for timely maintenance actions that prevent accidents. This study proposes a new two-layer structure consisting of support vector regression machines (SVRMs) to recognize bearing fault patterns and track fault sizes. Statistical parameters used to track the fault evolution are first extracted to condense the original vibration signals into a few compact features. The extracted features are then used to train the proposed two-layer SVRM structure. Once the parameters of the proposed two-layer SVRM structure are determined, features extracted from other vibration signals can be used to predict unknown bearing health conditions. The effectiveness of the proposed method is validated on experimental datasets collected from a test rig. The results demonstrate that the proposed method is highly accurate in differentiating between fault patterns and determining their severities, and comparisons show that it outperforms several existing methods.
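
A minimal sketch of the two-layer idea, assuming scikit-learn and hypothetical feature and label arrays: one SVR recognizes the fault pattern and a per-pattern SVR tracks the fault size. The function names and parameter values are illustrative, not the authors' architecture or settings.

```python
# Rough sketch of a two-layer SVR arrangement (illustrative only): the first layer
# maps statistical features to a numeric fault-pattern code, and one second-layer
# regressor per pattern estimates the fault size.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def train_two_layer_svrm(X, pattern_codes, fault_sizes):
    """X: (n_samples, n_features) NumPy array of statistical features from vibration signals.
    pattern_codes: integer label per sample (e.g. 0 = normal, 1 = inner race, ...).
    fault_sizes: fault size/severity per sample."""
    layer1 = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
    layer1.fit(X, pattern_codes)                       # layer 1: fault pattern recognition
    layer2 = {}
    for code in np.unique(pattern_codes):
        mask = pattern_codes == code
        reg = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
        reg.fit(X[mask], fault_sizes[mask])            # layer 2: size tracking per pattern
        layer2[code] = reg
    return layer1, layer2

def predict_two_layer_svrm(layer1, layer2, X_new):
    known = np.array(sorted(layer2.keys()))
    raw = layer1.predict(X_new)                        # continuous layer-1 output
    codes = known[np.abs(raw[:, None] - known[None, :]).argmin(axis=1)]  # snap to nearest code
    sizes = np.array([layer2[c].predict(x.reshape(1, -1))[0]
                      for c, x in zip(codes, X_new)])
    return codes, sizes
```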

Multiclass Support Vector Machines with SCAD

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods
    • /
    • v.19 no.5
    • /
    • pp.655-662
    • /
    • 2012
  • Classification is an important research field in pattern recognition with high-dimensional predictors. The support vector machine (SVM) is a penalized feature selector and classifier; here it is based on the hinge loss function and a non-convex penalty function, the smoothly clipped absolute deviation (SCAD) suggested by Fan and Li (2001). We developed an algorithm for the multiclass SVM with the SCAD penalty function using the local quadratic approximation. For multiclass problems, we compared the performance of the developed method with that of SVMs using the $L_1$ and $L_2$ penalty functions.
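
For reference, the SCAD penalty and the local quadratic approximation (LQA) that the abstract relies on are the standard ones from Fan and Li (2001); the symbols below ($\lambda$, $a$, $\beta_j$) are generic notation, not taken from the paper.

```latex
% SCAD penalty, defined through its derivative (a > 2, commonly a = 3.7):
p'_\lambda(t) = \lambda \left\{ I(t \le \lambda)
              + \frac{(a\lambda - t)_+}{(a - 1)\lambda}\, I(t > \lambda) \right\}, \qquad t \ge 0.

% Local quadratic approximation around an initial estimate \beta_j^{(0)} \neq 0:
p_\lambda(|\beta_j|) \approx p_\lambda(|\beta_j^{(0)}|)
  + \frac{1}{2}\,\frac{p'_\lambda(|\beta_j^{(0)}|)}{|\beta_j^{(0)}|}
    \left( \beta_j^2 - \beta_j^{(0)\,2} \right).
```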

Training of Support Vector Machines Using the Modified Kernel-adatron Algorithm (수정된 kernel-adatron 알고리즘에 의한 Support Vector Machines의 학습)

  • 조용현
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2000.04b
    • /
    • pp.469-471
    • /
    • 2000
  • This paper proposes a modified kernel-adatron algorithm with an added momentum term and uses it as a training method for support vector machines. The aim is to retain both the advantage of momentum, which suppresses the oscillation that occurs in gradient ascent while converging to the optimum and thereby improves the convergence speed, and the ease of implementation of the kernel-adatron algorithm. In simulations, an SVM trained with the proposed method was applied to classifying 200 real cancer patients into two classes (early-stage and malignant); compared with an SVM trained by the kernel-adatron algorithm of Campbell et al., it showed better performance in training time and in the classification rate on test data.
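
A rough sketch of what a kernel-adatron update with an added momentum term can look like, assuming an RBF kernel, labels in {-1, +1}, and a bias-free formulation; the learning rate, momentum coefficient, and function names are illustrative, and this is not the paper's exact algorithm.

```python
# Kernel-adatron style update with a momentum term (generic illustration).
import numpy as np

def rbf_kernel(X, gamma=1.0):
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

def kernel_adatron_momentum(X, y, eta=0.1, mu=0.5, gamma=1.0, epochs=100):
    """y must be in {-1, +1}. Returns the multipliers alpha."""
    K = rbf_kernel(X, gamma)
    n = len(y)
    alpha = np.zeros(n)
    delta = np.zeros(n)            # previous update, reused as the momentum term
    for _ in range(epochs):
        for i in range(n):
            margin = y[i] * np.sum(alpha * y * K[:, i])
            delta[i] = eta * (1.0 - margin) + mu * delta[i]   # gradient step + momentum
            alpha[i] = max(0.0, alpha[i] + delta[i])          # keep multipliers non-negative
    return alpha

def decision_function(alpha, X, y, X_new, gamma=1.0):
    sq_tr = np.sum(X**2, axis=1); sq_te = np.sum(X_new**2, axis=1)
    K = np.exp(-gamma * (sq_te[:, None] + sq_tr[None, :] - 2 * X_new @ X.T))
    return K @ (alpha * y)
```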


Estimating Basin of Attraction for Multi-Basin Processes Using Support Vector Machine

  • Lee, Dae-Won;Lee, Jae-Wook
    • Management Science and Financial Engineering
    • /
    • v.18 no.1
    • /
    • pp.49-53
    • /
    • 2012
  • A novel method of transient stability analysis is presented in this paper. The proposed method extracts data points near the basin-of-attraction boundary and then trains a support vector machine (SVM) classifier on the generated data. The constructed SVM classifier has been shown to dramatically reduce the conservativeness of the estimated basin of attraction.
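
A generic sketch of the overall pattern, not the paper's procedure: sampled initial states are labeled by the attractor their simulated trajectory reaches, and an SVM classifier is fitted so that its decision boundary approximates the basin-of-attraction boundary. The toy dynamics, tolerances, and function names below are made up for illustration.

```python
# Label sampled initial states by the attractor they converge to, then fit an SVM
# whose decision boundary approximates the basin-of-attraction boundary.
import numpy as np
from sklearn.svm import SVC

def converges_to_target(x0, step, target, horizon=2000, tol=1e-3):
    """Placeholder integrator: `step` advances the state by one time step."""
    x = np.array(x0, dtype=float)
    for _ in range(horizon):
        x = step(x)
    return np.linalg.norm(x - target) < tol

def fit_basin_classifier(samples, step, target):
    labels = np.array([1 if converges_to_target(x0, step, target) else 0
                       for x0 in samples])
    clf = SVC(kernel="rbf", C=100.0, gamma="scale")
    clf.fit(samples, labels)
    return clf      # clf.decision_function(x) acts as a signed distance to the learned boundary

# Toy 1-D system dx/dt = x - x**3 (Euler step, h = 0.1): stable points at -1 and +1,
# basins separated at 0, so the learned boundary should sit near x = 0.
step = lambda x: x - 0.1 * x * (x - 1.0) * (x + 1.0)
samples = np.linspace(-2.0, 2.0, 200).reshape(-1, 1)
clf = fit_basin_classifier(samples, step, target=np.array([1.0]))
```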

REGRESSION WITH CENSORED DATA BY LEAST SQUARES SUPPORT VECTOR MACHINE

  • Kim, Dae-Hak;Shim, Joo-Yong;Oh, Kwang-Sik
    • Journal of the Korean Statistical Society
    • /
    • v.33 no.1
    • /
    • pp.25-34
    • /
    • 2004
  • In this paper we propose a prediction method for regression models with randomly censored observations in the training data set. Least squares support vector machine regression is applied to predict the regression function, incorporating weights assigned to each observation into the optimization problem. Numerical examples are given to show the performance of the proposed prediction method.
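
As a reference point for how per-observation weights enter an LS-SVM, the standard weighted LS-SVM primal problem and its dual linear system are shown below in generic notation; how the weights $v_i$ are assessed for censored observations is specific to the paper and not reproduced here.

```latex
% Weighted LS-SVM regression: primal problem with per-observation weights v_i >= 0
\min_{w,\,b,\,e}\ \frac{1}{2}\|w\|^2 + \frac{\gamma}{2}\sum_{i=1}^{n} v_i\, e_i^2
\quad \text{s.t.}\quad y_i = w^\top \phi(x_i) + b + e_i, \qquad i = 1,\dots,n.

% Dual: a single linear system, with \Omega_{ij} = K(x_i, x_j) and
% V_\gamma = \mathrm{diag}\bigl(1/(\gamma v_1), \dots, 1/(\gamma v_n)\bigr):
\begin{bmatrix} 0 & \mathbf{1}^\top \\ \mathbf{1} & \Omega + V_\gamma \end{bmatrix}
\begin{bmatrix} b \\ \alpha \end{bmatrix}
= \begin{bmatrix} 0 \\ y \end{bmatrix},
\qquad \hat{f}(x) = \sum_{i=1}^{n} \alpha_i K(x_i, x) + b.
```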

Development of an Intelligent Trading System Using Support Vector Machines and Genetic Algorithms (Support Vector Machines와 유전자 알고리즘을 이용한 지능형 트레이딩 시스템 개발)

  • Kim, Sun-Woong;Ahn, Hyun-Chul
    • Journal of Intelligence and Information Systems
    • /
    • v.16 no.1
    • /
    • pp.71-92
    • /
    • 2010
  • As the use of trading systems has increased recently, many researchers have become interested in developing intelligent trading systems using artificial intelligence techniques. However, most prior studies on trading systems share common limitations. First, they adopted only a few technical indicators based on stock indices as independent variables, although a variety of other variables can be used to predict the market. In addition, most of them focused on developing a model that predicts the direction of stock market indices rather than one that generates trading signals for maximizing returns. In this study, we therefore propose a novel intelligent trading system that mitigates these limitations. It is designed to use both technical indicators and other non-price variables of the market. It also adopts a 'two-threshold mechanism' so that it can transform the output of the stock market prediction model based on support vector machines into trading decision signals such as buy, sell, or hold. To validate the usefulness of the proposed system, we applied it to real-world data, the KOSPI200 index from May 2004 to December 2009. As a result, we found that the proposed system outperformed the comparative models in terms of rate of return.
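
A minimal sketch of a two-threshold rule of the kind described above, assuming NumPy: a continuous prediction score is mapped to buy/hold/sell, and the two thresholds are the sort of parameters a genetic algorithm could tune against a return-based fitness. All names, values, and the simplified backtest are illustrative, not the paper's system.

```python
# Two-threshold trading rule and a very simplified return-based fitness evaluation.
import numpy as np

def two_threshold_signals(scores, lower, upper):
    """scores: model outputs, e.g. SVM decision values; requires lower < upper."""
    return np.where(scores >= upper, 1, np.where(scores <= lower, -1, 0))  # 1=buy, 0=hold, -1=sell

def cumulative_return(signals, daily_returns):
    """Hold the previous day's signal as the position (toy backtest)."""
    positions = np.roll(signals, 1)
    positions[0] = 0
    return float(np.prod(1.0 + positions * daily_returns) - 1.0)

# A GA would search over (lower, upper) to maximise cumulative_return; this is the
# fitness evaluation it would call for one candidate threshold pair.
rng = np.random.default_rng(0)
scores = rng.normal(size=250)                   # stand-in for model predictions
daily_returns = rng.normal(0.0003, 0.01, 250)   # stand-in for index daily returns
fitness = cumulative_return(two_threshold_signals(scores, -0.5, 0.5), daily_returns)
```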

Quasi-3D analysis of Axial Flux Permanent Magnet Rotating Machines using Space Harmonic Methods (공간고조파법을 이용한 축 자속 영구자석 회전기기의 준(準)-3D 특성 해석)

  • Choi, Jang-Young
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.60 no.5
    • /
    • pp.942-948
    • /
    • 2011
  • This paper deals with the characteristic analysis of axial flux permanent magnet (AFPM) machines with axially magnetized PM rotors using quasi-3-D analysis modeling. On the basis of the magnetic vector potential and a two-dimensional (2-D) polar-coordinate system, the magnetic field solutions due to various PM rotors are obtained. In particular, the 3-D problem, that is, the reduction of the magnetic field near the outer and inner radii of the PM, is solved by introducing a special function of radial position. The analytical solutions for back-EMF and torque are then derived from the magnetic field solutions. The predictions show good agreement with those obtained from 3-D finite element analyses (FEA). Finally, the analytical solutions for electromagnetic quantities presented in this paper are very useful for AFPM machines in terms of initial design, sensitivity analysis with respect to design parameters, and estimation of control parameters.
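
As general background only (standard magnetostatics, not the paper's radial-correction function), space-harmonic analyses of this kind solve a magnetic vector potential boundary-value problem of the form below, after which the back-EMF follows from the time derivative of the phase flux linkage.

```latex
% Governing equations for the magnetic vector potential A (Coulomb gauge, constant
% recoil permeability in the magnet region), with B = \nabla \times \mathbf{A}:
\nabla^2 \mathbf{A} = \mathbf{0} \quad \text{(air-gap region)}, \qquad
\nabla^2 \mathbf{A} = -\mu_0\, \nabla \times \mathbf{M} \quad \text{(magnet region)}.

% Back-EMF of a phase winding from its flux linkage \lambda(t):
e(t) = -\frac{d\lambda}{dt}.
```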

Electromagnetic Analysis of Slotless Brushless Permanent Magnet Machines According to Magnetization Patterns (슬롯리스 브러시리스 영구자석기기의 자화 패턴에 따른 전자기적 특성해석)

  • Jang Seok-Myeong;Choi Jang-Young;Cho Han-Wook;Park Ji-Hoon
    • The Transactions of the Korean Institute of Electrical Engineers B
    • /
    • v.54 no.12
    • /
    • pp.576-585
    • /
    • 2005
  • This paper deals with the electromagnetic field analysis of slotless brushless permanent magnet machines with three different magnetization patterns: Halbach, parallel, and radial magnetization. The magnetization modeling of Halbach, parallel, and radial magnetization is performed analytically. Analytical solutions for open-circuit field distributions, armature reaction field distributions, flux linkages due to PMs and stator windings, torque, back-EMF, and inductance are then derived in terms of the magnetic vector potential and a two-dimensional (2-D) polar coordinate system. The analytical results are validated extensively by finite element (FE) analyses. The magnet volume required to produce an identical flux density is compared for each magnetization pattern. Finally, the analytical solutions and derivation procedures presented in this paper can be applied to slotless and slotted brushless permanent magnet AC and DC machines.
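
For orientation, in a 2-D polar-coordinate analysis of this kind the field quantities are obtained from the z-component of the magnetic vector potential; the relations below are the standard generic ones, and the coefficients fixed by the interface conditions for each magnetization pattern are not reproduced here.

```latex
% Flux density components from A_z(r, \theta):
B_r = \frac{1}{r}\frac{\partial A_z}{\partial \theta}, \qquad
B_\theta = -\frac{\partial A_z}{\partial r}.

% General harmonic solution of Laplace's equation in the air-gap region,
% for space harmonics of order np (p pole pairs):
A_z(r,\theta) = \sum_{n} \Bigl[ \left( C_n r^{\,np} + D_n r^{-np} \right)\cos(np\,\theta)
              + \left( E_n r^{\,np} + F_n r^{-np} \right)\sin(np\,\theta) \Bigr].
```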

Rotor Loss Analysis in Permanent Magnet High-Speed Machine Using Coupled FEM and Analytical Method

  • Jang Seok-Myeong;Cho Han-Wook;Lee Sung-Ho;Yang Hyun-Sup
    • KIEE International Transaction on Electrical Machinery and Energy Conversion Systems
    • /
    • v.5B no.3
    • /
    • pp.272-276
    • /
    • 2005
  • This paper deals with a method to calculate the rotor eddy current losses of permanent magnet high-speed machines considering the effects of time/space flux harmonics. The flux harmonics caused by the slot geometry of the stator are calculated from the time variation of the magnetic field distribution obtained by magneto-static finite element analysis and a double fast Fourier transform. Then, using an analytical approach that considers the multiple flux harmonics and the Poynting vector, the rotor losses are evaluated in each rotor component. The method is simple and workable for any kind of stator slot shape in rotor loss analysis.
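
For reference, the Poynting-vector step mentioned above conventionally evaluates the time-averaged power entering the rotor surface for each flux harmonic; in generic notation (not the paper's symbols):

```latex
% Time-averaged power entering the rotor through its surface S for one harmonic
% (E_n, H_n are the harmonic field phasors; * denotes the complex conjugate):
P_n = -\frac{1}{2}\,\mathrm{Re} \oint_{S} \left( \mathbf{E}_n \times \mathbf{H}_n^{*} \right)\cdot d\mathbf{S},
\qquad
P_{\mathrm{rotor}} = \sum_{n} P_n .
```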

A New Support Vector Machines for Classifying Uncertain Data (불완전 데이터의 패턴 분석을 위한 $_{MI}$SVMs)

  • Lee, Kiyoung;Kim, Dae-Won;Lee, Doheon;Lee, Kwang H.
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2004.10b
    • /
    • pp.703-705
    • /
    • 2004
  • Conventional support vector machines (SVMs) find optimal hyperplanes that have maximal margins by treating all data equivalently. In the real world, however, the data within a data set may differ in degree of uncertainty or importance due to noise, inaccuracies, or missing values. Hence, if all data are treated as equivalent without considering such differences, the identified hyperplanes are likely to be suboptimal. In this paper, to more accurately identify the optimal hyperplane in a given uncertain data set, we propose a membership-induced distance from a hyperplane using membership values, and formulate three kinds of membership-induced SVMs.
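
The paper's membership-induced distance and its three formulations are not reproduced here; as a related reference point only, the widely used membership-weighted soft-margin SVM scales each slack variable by a membership value $m_i \in (0, 1]$, which in the dual caps each multiplier at $C m_i$ so that more uncertain points have less influence:

```latex
% Membership-weighted soft-margin SVM (generic weighted formulation):
\min_{w,\,b,\,\xi}\ \frac{1}{2}\|w\|^2 + C \sum_{i=1}^{n} m_i\, \xi_i
\quad \text{s.t.}\quad y_i \left( w^\top \phi(x_i) + b \right) \ge 1 - \xi_i, \qquad \xi_i \ge 0.

% Dual box constraints become 0 \le \alpha_i \le C m_i.
```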
