• Title/Abstract/Keyword: hyperplane function

Search results: 28 items

ENUMERATION OF GRAPHS AND THE CHARACTERISTIC POLYNOMIAL OF THE HYPERPLANE ARRANGEMENTS 𝒥n

  • Song, Joungmin
    • Journal of the Korean Mathematical Society / Vol. 54, No. 5 / pp. 1595-1604 / 2017
  • We give a complete formula for the characteristic polynomial of the hyperplane arrangements $\mathcal{J}_n$ consisting of the hyperplanes $x_i + x_j = 1$, $x_k = 0$, $x_l = 1$ for $1 \leq i, j, k, l \leq n$. The formula is obtained by associating hyperplane arrangements with graphs, and then enumerating central graphs via generating functions for the number of bipartite graphs of given order, size, and number of connected components.
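Such formulas can be sanity-checked numerically via the finite field method: for all but finitely many primes p, the value χ(𝒥n, p) equals the number of points of F_p^n lying on none of the hyperplanes. A brute-force sketch in Python (whether the pairs range over i < j or i ≤ j is an assumption here, since the abstract only lists the index ranges):

```python
from itertools import product

def count_complement_points(n: int, p: int) -> int:
    """Count points of F_p^n on none of the hyperplanes
    x_i + x_j = 1 (assumed i < j), x_k = 0, x_l = 1.
    By the finite field method, this equals chi(J_n, p) for
    all but finitely many primes p."""
    count = 0
    for x in product(range(p), repeat=n):
        if any(c in (0, 1) for c in x):            # x_k = 0 or x_l = 1
            continue
        if any((x[i] + x[j]) % p == 1              # x_i + x_j = 1
               for i in range(n) for j in range(i + 1, n)):
            continue
        count += 1
    return count

# chi(J_n, q) has degree n, so evaluating at a few primes determines it
# by polynomial interpolation.
for p in (5, 7, 11, 13):
    print(p, count_complement_points(2, p))
```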

On the Dynamics of Multi-Dimensional Lotka-Volterra Equations

  • Abe, Jun; Matsuoka, Taiju; Kunimatsu, Noboru
    • Institute of Control, Robotics and Systems (ICROS): conference proceedings / ICCAS 2004 / pp. 1623-1628 / 2004
  • For the 3-dimensional cyclic Lotka-Volterra equations, we give the solution on the invariant hyperplane. In addition, we show the existence of the invariant hyperplane by the center manifold theorem under some conditions; with this result, we can derive the invariant hyperplane of the n-dimensional cyclic Lotka-Volterra equations. We also study 3- and 4-dimensional Hamiltonian Lotka-Volterra equations that satisfy the Jacobi identity. We analyze their solutions with the split Lyapunov functions of [4], [5], which provide Lyapunov functions for each region separated by the invariant hyperplane. In the cyclic Lotka-Volterra equations, the Lyapunov functions play the same role in odd and even dimensions; in the Hamiltonian Lotka-Volterra equations, however, numerical calculation shows that their role differs between odd and even dimensions. In this paper, we regard the invariant hyperplane as the key object for analyzing the motion of Lotka-Volterra equations and the emergence of chaotic orbits. Furthermore, examples of asymptotically stable and stable solutions of the 3-dimensional cyclic Lotka-Volterra equations and of the 3- and 4-dimensional Hamiltonian equations are given.
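For a concrete instance of such an invariant hyperplane, the 3-dimensional May-Leonard form of the cyclic Lotka-Volterra equations, ẋᵢ = xᵢ(1 − xᵢ − αx₍ᵢ₊₁₎ − βx₍ᵢ₊₂₎), leaves the plane x₁ + x₂ + x₃ = 1 invariant whenever α + β = 2, since the sum S = x₁ + x₂ + x₃ then obeys Ṡ = S − S². A minimal numerical check (this normalization is our illustration, not necessarily the exact system of the paper):

```python
import numpy as np
from scipy.integrate import solve_ivp

ALPHA, BETA = 0.8, 1.2          # alpha + beta = 2 keeps the plane invariant

def may_leonard(t, x):
    """3D cyclic Lotka-Volterra (May-Leonard form):
    dx_i/dt = x_i * (1 - x_i - ALPHA*x_{i+1} - BETA*x_{i+2})."""
    x1, x2, x3 = x
    return [x1 * (1 - x1 - ALPHA * x2 - BETA * x3),
            x2 * (1 - x2 - ALPHA * x3 - BETA * x1),
            x3 * (1 - x3 - ALPHA * x1 - BETA * x2)]

x0 = [0.5, 0.3, 0.2]            # starts on the plane x1 + x2 + x3 = 1
sol = solve_ivp(may_leonard, (0, 50), x0, rtol=1e-10, atol=1e-12)
sums = sol.y.sum(axis=0)
print("max deviation of x1+x2+x3 from 1:", np.abs(sums - 1).max())
```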


A Performance Comparison of SVM and MLP for Multiple Defect Diagnosis of a Gas Turbine Engine

  • 박준철; 노태성; 최동환
    • Korean Society of Propulsion Engineers: conference proceedings / Proceedings of the 25th Fall Conference, 2005 / pp. 158-161 / 2005
  • In this study, defect diagnosis of a gas turbine engine was attempted using the Support Vector Machine (SVM). An SVM separates two sets by finding a hyperplane, an arbitrary nonlinear boundary in the vector space, and is known to find the mathematically optimal solution. We propose a method that combines such binary SVM classifiers in multiple layers to quantitatively diagnose gas turbine defects, and confirm that it gives faster and more reliable diagnosis results than a conventional Multi-Layer Perceptron (MLP).
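In outline, such a comparison can be reproduced with scikit-learn; the data below is synthetic and the layered combination of binary SVMs is approximated by a one-vs-rest wrapper, so this is a sketch of the experimental setup rather than the authors' diagnostic model:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Synthetic stand-in for gas turbine health parameters (4 defect classes).
X, y = make_classification(n_samples=600, n_features=8, n_informative=6,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Binary SVMs combined one-vs-rest, versus a single MLP.
svm = OneVsRestClassifier(SVC(kernel='rbf', C=10.0)).fit(X_tr, y_tr)
mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)

print("SVM accuracy:", svm.score(X_te, y_te))
print("MLP accuracy:", mlp.score(X_te, y_te))
```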


On the Support Vector Machine with the kernel of the q-normal distribution

  • Joguchi, Hirofumi; Tanaka, Masaru
    • Institute of Electronics Engineers of Korea: conference proceedings / ITC-CSCC 2002 (Vol. 2) / pp. 983-986 / 2002
  • The Support Vector Machine (SVM) is a pattern recognition method that separates input data using a hyperplane. It achieves a high recognition capability through the so-called kernel trick, in which the radial basis function (RBF) kernel is usually used as the kernel function. In this paper, we propose using the q-normal distribution as the kernel function instead of the conventional RBF, and compare the two types of kernel function.
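A custom kernel like this can be dropped into an off-the-shelf SVM as a Gram-matrix callable. The q-Gaussian parametrization below (which reduces to the ordinary RBF kernel as q → 1) is an assumption based on the usual Tsallis form, not necessarily the exact kernel of the paper:

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.svm import SVC

Q, BETA = 1.5, 1.0  # q -> 1 recovers the Gaussian (RBF) kernel

def q_gaussian_kernel(X, Y):
    """Gram matrix of the Tsallis q-Gaussian kernel
    k(x, y) = [1 + (q-1) * beta * ||x - y||^2]^(-1/(q-1))."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return (1.0 + (Q - 1.0) * BETA * sq) ** (-1.0 / (Q - 1.0))

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
clf = SVC(kernel=q_gaussian_kernel).fit(X, y)
print("training accuracy:", clf.score(X, y))
```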


Sparse Representation Learning of Kernel Space Using the Kernel Relaxation Procedure

  • 류재홍; 정종철
    • Journal of the Korean Institute of Intelligent Systems / Vol. 11, No. 9 / pp. 817-821 / 2001
  • This paper proposes a new learning methodology for kernel methods that yields a sparse representation of the kernel space formed from the training patterns of a classification problem. Among the existing learning rules for linear discriminant functions, the relaxation procedure can obtain the maximum-margin separating hyperplane of a linearly separable pattern classification problem, just as the SVM (Support Vector Machine) classifier does. The existing relaxation procedure satisfies only a necessary condition for support vectors; this paper presents a sufficient condition for identifying support vectors during learning. After extending the existing SVM for sequential learning and defining a kernel discriminant function, a systematic learning method is presented. Experimental results show that the new method has classification performance equal to or better than that of existing methods.
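A kernel discriminant function of the kind mentioned above generically takes the form f(x) = Σᵢ αᵢ yᵢ k(xᵢ, x) + b; a minimal evaluation sketch (the RBF kernel and the sparsity-as-zero-coefficients convention are our assumptions, not the paper's exact definition):

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

def kernel_discriminant(x, X_train, y_train, alpha, b, kernel=rbf):
    """Generic kernel discriminant f(x) = sum_i alpha_i y_i k(x_i, x) + b.
    A sparse representation means most alpha_i are exactly zero, so only
    the retained (support) patterns contribute to the sum."""
    return sum(a * y * kernel(xi, x)
               for a, y, xi in zip(alpha, y_train, X_train)
               if a != 0.0) + b

X = np.array([[0.0, 0.0], [1.0, 1.0]])
y = np.array([-1.0, 1.0])
alpha = np.array([0.5, 0.5])
print(kernel_discriminant(np.array([0.9, 0.9]), X, y, alpha, b=0.0))
```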


Imbalanced SVM-Based Anomaly Detection Algorithm for Imbalanced Training Datasets

  • Wang, GuiPing; Yang, JianXi; Li, Ren
    • ETRI Journal / Vol. 39, No. 5 / pp. 621-631 / 2017
  • Abnormal samples are usually difficult to obtain in production systems, resulting in imbalanced training sample sets: the number of positive samples is far less than the number of negative samples. Traditional Support Vector Machine (SVM)-based anomaly detection algorithms perform poorly on highly imbalanced datasets, because the learned classification hyperplane skews toward the positive samples, resulting in a high false-negative rate. This article proposes a new imbalanced SVM (termed ImSVM)-based anomaly detection algorithm, which assigns a different weight to each positive support vector in the decision function. ImSVM adjusts the learned classification hyperplane so that the decision function achieves a maximum GMean measure value on the dataset. The problem is converted into an unconstrained optimization problem to search for the optimal weight vector. Experiments are carried out on both Cloud datasets and Knowledge Discovery and Data Mining datasets to evaluate ImSVM, using constructed, highly imbalanced training sample sets. The experimental results show that ImSVM outperforms over-sampling techniques and several existing imbalanced-SVM-based techniques.
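The per-support-vector weighting can be illustrated on top of a trained SVM: the sketch below rescales each positive support vector's dual coefficient by a weight and scores the result by GMean. The weight search itself (the paper's unconstrained optimization) is omitted, so this is only the evaluation scaffolding, not ImSVM:

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

GAMMA = 0.5

def gmean(y_true, y_pred):
    """Geometric mean of sensitivity (TPR) and specificity (TNR)."""
    tpr = np.mean(y_pred[y_true == 1] == 1)
    tnr = np.mean(y_pred[y_true == -1] == -1)
    return np.sqrt(tpr * tnr)

def weighted_decision(clf, X, w):
    """f(x) = sum_i c_i (alpha_i y_i) k(x_i, x) + b, with c_i = w on
    positive support vectors and c_i = 1 on negative ones."""
    K = rbf_kernel(X, clf.support_vectors_, gamma=GAMMA)
    dual = clf.dual_coef_[0]             # alpha_i * y_i for each SV
    c = np.where(dual > 0, w, 1.0)       # reweight positive SVs only
    return K @ (c * dual) + clf.intercept_[0]

# Imbalanced toy data: few positives (+1), many negatives (-1).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2, 1, (20, 2)), rng.normal(-1, 1, (400, 2))])
y = np.hstack([np.ones(20), -np.ones(400)])
clf = SVC(kernel='rbf', gamma=GAMMA).fit(X, y)

w = 1.5                                  # one trial weight; the paper
pred = np.sign(weighted_decision(clf, X, w))  # searches a weight vector
print("GMean:", gmean(y, pred))
```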

Sparse Representation Learning of Kernel Space Using the Kernel Relaxation Procedure

  • 류재홍; 정종철
    • Korean Institute of Intelligent Systems: conference proceedings / Proceedings of the 2001 Fall Conference of the Korea Fuzzy Logic and Intelligent Systems Society / pp. 60-64 / 2001
  • In this paper, a new learning methodology for kernel methods is suggested that results in a sparse representation of the kernel space formed from the training patterns of a classification problem. Among the traditional learning algorithms for linear discriminant functions (perceptron, relaxation, LMS (least mean squares), pseudoinverse), this paper shows that the relaxation procedure can obtain the maximum-margin separating hyperplane of a linearly separable pattern classification problem, as the SVM (Support Vector Machine) classifier does. The original relaxation method gives only a necessary condition for SV patterns; we suggest a sufficient condition for identifying the SV patterns during the learning epochs. Experimental results show that the new method has performance equal to or higher than that of the conventional approach.
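For reference, the classical single-sample relaxation rule with margin (Duda and Hart) that the paper starts from: after sign-normalizing each augmented pattern, w is moved toward the half-space wᵀz > b whenever a pattern violates the margin. A minimal linear sketch (the paper's kernelized, sequential extension is not reproduced):

```python
import numpy as np

def relaxation_train(X, y, b=1.0, eta=1.5, epochs=100):
    """Single-sample relaxation with margin: with z_i = y_i * [x_i, 1],
    whenever w.z_i <= b, update
        w += eta * (b - w.z_i) / ||z_i||^2 * z_i,
    which converges for 0 < eta < 2 on linearly separable data."""
    Z = y[:, None] * np.hstack([X, np.ones((len(X), 1))])  # sign-normalized
    w = np.zeros(Z.shape[1])
    for _ in range(epochs):
        updated = False
        for z in Z:
            m = w @ z
            if m <= b:
                w += eta * (b - m) / (z @ z) * z
                updated = True
        if not updated:          # every pattern clears the margin b
            break
    return w

# Patterns whose margin ends up exactly at b play the role of SV patterns.
X = np.array([[2.0, 2.0], [1.5, 3.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
w = relaxation_train(X, y)
Z = y[:, None] * np.hstack([X, np.ones((4, 1))])
print("w =", w, " margins:", Z @ w)
```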


Heat Transfer Analysis of a Bi-Material Problem with an Interfacial Boundary Using the Moving Least Squares Finite Difference Method

  • 윤영철; 김도완
    • Journal of the Computational Structural Engineering Institute of Korea / Vol. 20, No. 6 / pp. 779-787 / 2007
  • This study presents a moving least squares (MLS) finite difference method that can efficiently analyze heat transfer problems in bi-materials with a singularity at the interfacial boundary. The method solves the differential equation directly by constructing Taylor polynomials and difference equations with the moving least squares technique, using only nodes and no grid. A wedge function based on the concept of the hyperplane function is embedded into the approximation in the MLS sense, so that interfacial features such as wedge behavior and derivative jumps are captured effectively while the method keeps its strength of fast derivative approximation. Analyses of heat conduction in bi-materials with different heat transfer coefficients show that the MLS finite difference method attains excellent computational efficiency and solution accuracy for interface problems as well.
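To give a flavor of the grid-free construction (without the wedge-function enrichment for interfaces), here is a 1D moving least squares derivative approximation from scattered nodes only; the Gaussian weight function and support radius are assumptions:

```python
import numpy as np

def mls_derivative(x_eval, nodes, values, radius=0.3):
    """Approximate u'(x_eval) from scattered (nodes, values): fit a local
    quadratic Taylor polynomial around x_eval by weighted least squares
    (moving least squares) and read off the linear coefficient."""
    d = nodes - x_eval
    w = np.exp(-(d / radius) ** 2)                     # Gaussian weight
    P = np.column_stack([np.ones_like(d), d, d ** 2])  # shifted monomials
    A = P.T @ (w[:, None] * P)                         # weighted normal eqs.
    b = P.T @ (w * values)
    coeffs = np.linalg.solve(A, b)                     # [u, u', u''/2]
    return coeffs[1]

nodes = np.sort(np.random.default_rng(1).uniform(0, 1, 40))
values = np.sin(2 * np.pi * nodes)
x0 = 0.5
print("MLS u'(0.5):", mls_derivative(x0, nodes, values),
      " exact:", 2 * np.pi * np.cos(2 * np.pi * x0))
```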

Performance Analysis of Kernel Functions for the Support Vector Machine

  • 심우성; 성세영; 정차근
    • Institute of Electronics Engineers of Korea: conference proceedings / Proceedings of the 2009 Information and Control Symposium / pp. 405-407 / 2009
  • The Support Vector Machine (SVM) is a classification method that has recently drawn attention in machine learning. Vapnik, Osuna, Platt and others proposed methodologies for solving the quadratic programming (QP) problem required to realize an SVM, which extended its field of application. An SVM finds the hyperplane that separates two classes by mapping input-space vectors to feature-space vectors with a kernel function; this is more systematic and theoretically grounded than a neural network, which is an empirical learning method. Although the SVM has superior generalization characteristics, its performance depends on the kernel function. Three categories of kernel function are the polynomial kernel, the RBF (radial basis function) kernel, and the sigmoid kernel. This paper analyzes the performance of the SVM for each kernel using synthetic data.
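A sketch of such a comparison with synthetic data (the paper's actual data and parameter settings are not specified here):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic two-class data standing in for the paper's "virtual data".
X, y = make_classification(n_samples=400, n_features=10, random_state=0)

# The three kernel families compared in the paper.
for kernel in ('poly', 'rbf', 'sigmoid'):
    scores = cross_val_score(SVC(kernel=kernel), X, y, cv=5)
    print(f"{kernel:>7}: mean CV accuracy = {scores.mean():.3f}")
```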


AN EXTENSION OF SCHNEIDER'S CHARACTERIZATION THEOREM FOR ELLIPSOIDS

  • Dong-Soo Kim; Young Ho Kim
    • Bulletin of the Korean Mathematical Society / Vol. 60, No. 4 / pp. 905-913 / 2023
  • Suppose that M is a strictly convex hypersurface in the (n + 1)-dimensional Euclidean space 𝔼n+1 with the origin o in its convex side and with the outward unit normal N. For a fixed point p ∈ M and a positive constant t, we write 𝚽t for the hyperplane parallel to the tangent hyperplane 𝚽 at p and passing through the point q = p - tN(p). We consider the region cut from M by the parallel hyperplane 𝚽t, and denote by Ip(t) the (n + 1)-dimensional volume of the convex hull of the region and the origin o. Schneider's characterization theorem for ellipsoids then states that among centrally symmetric, strictly convex and closed surfaces in the 3-dimensional Euclidean space 𝔼3, the ellipsoids are the only ones satisfying Ip(t) = 𝜙(p)t, where 𝜙 is a function defined on M. Recently, the characterization theorem was extended to centrally symmetric, strictly convex and closed hypersurfaces in 𝔼n+1 satisfying Ip(t) = 𝜙(p)t𝛽 for a constant 𝛽. In this paper, we study the volume Ip(t) of a strictly convex and complete hypersurface in 𝔼n+1 with the origin o in its convex side. First we extend the characterization theorem to strictly convex and closed (not necessarily centrally symmetric) hypersurfaces in 𝔼n+1 satisfying Ip(t) = 𝜙(p)t𝛽; we then generalize it to strictly convex and complete (not necessarily closed) hypersurfaces in 𝔼n+1 satisfying the same condition.
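As a quick sanity check of the linear growth condition, take M to be the unit sphere in 𝔼3 centered at the origin. For 0 < t ≤ 1, the convex hull of the cap of height t and the origin decomposes into the cap solid plus the cone over the cap's boundary circle (base radius √(2t − t²), height 1 − t), giving

```latex
I_p(t) \;=\; \underbrace{\tfrac{\pi}{3}\, t^2 (3 - t)}_{\text{spherical cap}}
      \;+\; \underbrace{\tfrac{\pi}{3}\,(2t - t^2)(1 - t)}_{\text{cone over the boundary circle}}
      \;=\; \frac{2\pi}{3}\, t ,
```

so Ip(t) = 𝜙(p)t holds with 𝜙 ≡ 2π/3 (that is, 𝛽 = 1), as the theorem predicts for this ellipsoid.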