• Title/Summary/Keyword: hyperplane function

ENUMERATION OF GRAPHS AND THE CHARACTERISTIC POLYNOMIAL OF THE HYPERPLANE ARRANGEMENTS 𝒥n

  • Song, Joungmin
    • Journal of the Korean Mathematical Society, v.54 no.5, pp.1595-1604, 2017
  • We give a complete formula for the characteristic polynomial of the hyperplane arrangements $\mathcal{J}_n$ consisting of the hyperplanes $x_i + x_j = 1$, $x_k = 0$, $x_l = 1$ for $1 \leq i, j, k, l \leq n$. The formula is obtained by associating hyperplane arrangements with graphs, and then enumerating central graphs via generating functions for the number of bipartite graphs of given order, size, and number of connected components.
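
As a small illustration of the objects involved (not the paper's enumeration method), here is a sketch that lists the defining hyperplanes of 𝒥_n symbolically; build_Jn is our own name, and we assume i < j in the pairs:

```python
# Minimal sketch: list the defining equations of the arrangement J_n.
# It only enumerates the hyperplanes; the characteristic polynomial is
# what the paper computes, via graph enumeration.
from itertools import combinations
import sympy as sp

def build_Jn(n):
    x = sp.symbols(f"x1:{n + 1}")  # x1, ..., xn
    planes = [sp.Eq(x[i] + x[j], 1) for i, j in combinations(range(n), 2)]
    for k in range(n):
        planes += [sp.Eq(x[k], 0), sp.Eq(x[k], 1)]
    return planes

for h in build_Jn(3):
    print(h)  # 3 + 3 + 3 = 9 hyperplanes for n = 3
```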

On the Dynamics of Multi-Dimensional Lotka-Volterra Equations

  • Abe, Jun;Matsuoka, Taiju;Kunimatsu, Noboru
    • Institute of Control, Robotics and Systems: Conference Proceedings, 2004.08a, pp.1623-1628, 2004
  • For the 3-dimensional cyclic Lotka-Volterra equations, we present the solution on the invariant hyperplane. In addition, we show the existence of the invariant hyperplane by the center manifold theorem under some conditions. With this result, we can derive the invariant hyperplane of the n-dimensional cyclic Lotka-Volterra equations. We also study the 3- and 4-dimensional Hamiltonian Lotka-Volterra equations that satisfy the Jacobi identity. We analyze the solutions of the Hamiltonian Lotka-Volterra equations with the split Liapunov functions of [4], [5], since they provide Liapunov functions for each region separated by the invariant hyperplane. In the cyclic Lotka-Volterra equations, the role of the Liapunov functions is the same in odd and even dimensions. However, in the Hamiltonian Lotka-Volterra equations, numerical calculation shows that the role of the Liapunov function differs between odd and even dimensions. In this paper, we regard the invariant hyperplane as a key object for analyzing the motion of Lotka-Volterra equations and the occurrence of chaotic orbits. Furthermore, examples of asymptotically stable and stable solutions of the 3-dimensional cyclic Lotka-Volterra equations and of the 3- and 4-dimensional Hamiltonian equations are shown.
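
For orientation, a minimal sketch (one standard form of the 3-dimensional cyclic Lotka-Volterra system; the paper's exact equations and parameters may differ) in which the sum x1 + x2 + x3 is conserved, so that hyperplane is invariant:

```python
# Minimal sketch, not the paper's exact system: a common 3-dimensional
# cyclic Lotka-Volterra form whose sum x1 + x2 + x3 has zero time
# derivative, so each plane x1 + x2 + x3 = const is invariant.
import numpy as np
from scipy.integrate import solve_ivp

def cyclic_lv(t, x):
    x1, x2, x3 = x
    return [x1 * (x2 - x3), x2 * (x3 - x1), x3 * (x1 - x2)]

sol = solve_ivp(cyclic_lv, (0.0, 50.0), [0.5, 0.3, 0.2], max_step=0.01)
s = sol.y.sum(axis=0)
print(s.min(), s.max())  # both stay near 1.0, up to integration error
```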

A Performance Comparison of SVM and MLP for Multiple Defect Diagnosis of Gas Turbine Engine (가스터빈 엔진의 복합 결함 진단을 위한 SVM과 MLP의 성능 비교)

  • Park Jun-Cheol;Roh Tae-Seong;Choi Dong-Whan
    • Proceedings of the Korean Society of Propulsion Engineers Conference, 2005.11a, pp.158-161, 2005
  • In this study, defect diagnosis of a gas turbine engine was attempted using the Support Vector Machine (SVM). SVM is known to find the optimal solution mathematically by classifying two groups and searching for the hyperplane of an arbitrary nonlinear boundary. A method for quantitatively deciding gas turbine defects was proposed using a multi-layer SVM for two-group classification, and it was verified that SVM yields quicker and more reliable diagnostic results than the existing Multi Layer Perceptron (MLP).
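
A minimal sketch of the kind of SVM classification pipeline described, using scikit-learn on synthetic stand-in data (the paper's engine measurements and its specific multi-layer SVM scheme are not reproduced):

```python
# Sketch only: synthetic data stands in for gas-turbine measurements, and
# scikit-learn's one-vs-one multi-class SVC stands in for the paper's
# multi-layer scheme of pairwise two-group classifiers.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=8, n_informative=6,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```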

On the Support Vector Machine with the kernel of the q-normal distribution

  • Joguchi, Hirofumi;Tanaka, Masaru
    • Proceedings of the IEEK Conference, 2002.07b, pp.983-986, 2002
  • The Support Vector Machine (SVM) is a pattern recognition method that separates input data using a hyperplane. It achieves high recognition capability through the so-called kernel trick, for which the Radial Basis Function (RBF) kernel is usually used. In this paper, we propose using the q-normal distribution as the kernel function instead of the conventional RBF and compare the two types of kernel function.
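
A minimal sketch of swapping a custom kernel into an SVM. The Tsallis q-Gaussian below is a stand-in for the paper's q-normal kernel (the exact parametrization in the paper may differ; q_gaussian_kernel is our own helper):

```python
# Sketch: SVC accepts a callable kernel that returns the Gram matrix.
# For q -> 1 the q-Gaussian reduces to the usual RBF kernel.
import numpy as np
from sklearn.svm import SVC

def q_gaussian_kernel(q=1.5, beta=1.0):
    def kernel(X, Y):
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)  # squared distances
        if abs(q - 1.0) < 1e-12:
            return np.exp(-beta * d2)
        base = np.maximum(1.0 - (1.0 - q) * beta * d2, 0.0)
        return base ** (1.0 / (1.0 - q))
    return kernel

rng = np.random.RandomState(0)
X = rng.randn(200, 2)
y = (X[:, 0] * X[:, 1] > 0).astype(int)  # a nonlinearly separable toy problem
clf = SVC(kernel=q_gaussian_kernel(q=1.5, beta=2.0)).fit(X, y)
print("train accuracy:", clf.score(X, y))
```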

Sparse Representation Learning of Kernel Space Using the Kernel Relaxation Procedure (커널 이완 절차에 의한 커널 공간의 저밀도 표현 학습)

  • 류재홍;정종철
    • Journal of the Korean Institute of Intelligent Systems, v.11 no.9, pp.817-821, 2001
  • In this paper, a new learning methodology for kernel methods is suggested that results in a sparse representation of the kernel space from the training patterns for classification problems. Among the traditional algorithms for linear discriminant functions, this paper shows that the relaxation procedure can obtain the maximum-margin separating hyperplane of a linearly separable pattern classification problem, as the SVM (Support Vector Machine) classifier does. The original relaxation method gives only a necessary condition for SV patterns; we suggest a sufficient condition to identify the SV patterns during the learning epochs. For sequential learning of kernel methods, an extended SVM and a kernel discriminant function are defined, and a systematic derivation of the learning algorithm is introduced. Experimental results show that the new methods have higher or equivalent performance compared to the conventional approach.
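
For reference, a minimal sketch of the classical single-sample relaxation procedure with margin on which the paper builds (the textbook linear version, not the paper's kernelized extension; relaxation_train is our own name):

```python
# Classical relaxation with margin b: patterns are augmented with a bias
# term and sign-normalized so a correct classification means w @ z > b.
# Each margin violation pulls w just far enough to satisfy the margin.
import numpy as np

def relaxation_train(X, y, b=1.0, eta=1.0, epochs=100):
    Z = np.hstack([X, np.ones((len(X), 1))]) * y[:, None]
    w = np.zeros(Z.shape[1])
    for _ in range(epochs):
        updated = False
        for z in Z:
            if w @ z <= b:  # margin violated
                w += eta * (b - w @ z) / (z @ z) * z
                updated = True
        if not updated:
            break  # every pattern satisfies the margin
    return w

X = np.array([[2.0, 2.0], [1.5, 2.5], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
print("weights:", relaxation_train(X, y))
```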

Imbalanced SVM-Based Anomaly Detection Algorithm for Imbalanced Training Datasets

  • Wang, GuiPing;Yang, JianXi;Li, Ren
    • ETRI Journal, v.39 no.5, pp.621-631, 2017
  • Abnormal samples are usually difficult to obtain in production systems, resulting in imbalanced training sample sets: the number of positive samples is far less than the number of negative samples. Traditional Support Vector Machine (SVM)-based anomaly detection algorithms perform poorly on highly imbalanced datasets: the learned classification hyperplane skews toward the positive samples, resulting in a high false-negative rate. This article proposes a new imbalanced SVM (termed ImSVM)-based anomaly detection algorithm, which assigns a different weight to each positive support vector in the decision function. ImSVM adjusts the learned classification hyperplane so that the decision function achieves a maximum GMean measure value on the dataset. This problem is converted into an unconstrained optimization problem to search for the optimal weight vector. Experiments are carried out on both Cloud datasets and Knowledge Discovery and Data Mining datasets to evaluate ImSVM. Highly imbalanced training sample sets are constructed. The experimental results show that ImSVM outperforms over-sampling techniques and several existing imbalanced SVM-based techniques.
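
ImSVM's per-support-vector weighting is specific to the paper, but the underlying idea of re-weighting to shift the hyperplane and scoring by GMean can be roughly illustrated with scikit-learn's class-level weighting (a stand-in, not ImSVM itself):

```python
# Rough stand-in: class-level re-weighting shifts the SVM hyperplane away
# from the minority (positive) class; GMean = sqrt(TPR * TNR) rewards
# balanced performance on both classes.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import confusion_matrix
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, weights=[0.95, 0.05], random_state=0)

def gmean(clf):
    tn, fp, fn, tp = confusion_matrix(y, clf.predict(X)).ravel()
    return np.sqrt(tp / (tp + fn) * tn / (tn + fp))

print("plain GMean:   ", gmean(SVC().fit(X, y)))
print("weighted GMean:", gmean(SVC(class_weight={1: 20.0}).fit(X, y)))
```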

Sparse Representation Learning of Kernel Space Using the Kernel Relaxation Procedure (커널 이완절차에 의한 커널 공간의 저밀도 표현 학습)

  • 류재홍;정종철
    • Proceedings of the Korean Institute of Intelligent Systems Conference, 2001.12a, pp.60-64, 2001
  • In this paper, a new learning methodology for kernel methods is suggested that results in a sparse representation of the kernel space from the training patterns for classification problems. Among the traditional algorithms for linear discriminant functions (perceptron, relaxation, LMS (least mean squares), pseudoinverse), this paper shows that the relaxation procedure can obtain the maximum-margin separating hyperplane of a linearly separable pattern classification problem, as the SVM (Support Vector Machine) classifier does. The original relaxation method gives only a necessary condition for SV patterns; we suggest a sufficient condition to identify the SV patterns during the learning epochs. Experimental results show that the new methods have higher or equivalent performance compared to the conventional approach.

Heat Transfer Analysis of Bi-Material Problem with Interfacial Boundary Using Moving Least Squares Finite Difference Method (이동최소제곱 유한차분법을 이용한 계면경계를 갖는 이종재료의 열전달문제 해석)

  • Yoon, Young-Cheol;Kim, Do-Wan
    • Journal of the Computational Structural Engineering Institute of Korea, v.20 no.6, pp.779-787, 2007
  • This paper presents a highly efficient moving least squares finite difference method (MLS FDM) for heat transfer problems in bi-materials with an interfacial boundary. The MLS FDM directly discretizes the governing differential equations based on a node set, without a grid structure. In the method, difference equations are constructed from the Taylor polynomial expanded by the moving least squares method. A wedge function is designed based on the concept of a hyperplane function and is embedded in the derivative approximation formula in the moving least squares sense. Thus, interfacial singular behavior such as a jump in the normal derivative is naturally modeled, and the merit of the MLS FDM in fast derivative computation is preserved. Numerical experiments on heat transfer problems in bi-materials with different heat conductivities show that the developed method achieves high efficiency as well as good accuracy for interface problems.
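
The wedge-enriched MLS FDM is the paper's contribution and is not reproduced here; as a minimal stand-in for the same physics, a 1-D finite-difference sketch shows the interfacial normal-derivative jump that any bi-material scheme must capture:

```python
# Stand-in sketch (ordinary finite differences, not the paper's MLS FDM):
# steady conduction -(k u')' = 0 on [0, 1] with k jumping at x = 0.5.
# The temperature u develops a kink at the interface, while the flux k u'
# stays continuous; harmonic-mean face conductivities capture this.
import numpy as np

n = 101
x = np.linspace(0.0, 1.0, n)
k = np.where(x < 0.5, 1.0, 10.0)                   # two materials
kf = 2.0 * k[:-1] * k[1:] / (k[:-1] + k[1:])       # harmonic mean at faces

A = np.zeros((n, n))
b = np.zeros(n)
A[0, 0] = A[-1, -1] = 1.0                          # u(0) = 0, u(1) = 1
b[-1] = 1.0
for i in range(1, n - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = -kf[i - 1], kf[i - 1] + kf[i], -kf[i]

u = np.linalg.solve(A, b)
print("flux variation:", np.ptp(kf * np.diff(u)))  # ~0: flux is continuous
```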

Performance Analysis of Kernel Function for Support Vector Machine (Support Vector Machine에 대한 커널 함수의 성능 분석)

  • Sim, Woo-Sung;Sung, Se-Young;Cheng, Cha-Keon
    • Proceedings of the IEEK Conference, 2009.05a, pp.405-407, 2009
  • SVM (Support Vector Machine) is a classification method that has recently drawn attention in machine learning. Vapnik, Osuna, Platt, and others suggested methodologies for solving the QP (Quadratic Programming) problem required to realize SVM, which has extended its field of application. SVM finds a hyperplane that separates two classes by mapping input-space vectors to feature-space vectors using a kernel function. This approach is more systematic and theoretical than a neural network, which is an empirical learning method. Although SVM has superior generalization characteristics, its performance depends on the kernel function. Kernel functions fall into three categories: the polynomial kernel, the RBF (Radial Basis Function) kernel, and the sigmoid kernel. This paper analyzes the performance of SVM for each kernel using synthetic data.
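
A minimal sketch of the comparison described, scoring the three kernel families on synthetic data with scikit-learn (results depend on the data and hyperparameters):

```python
# Compare the polynomial, RBF, and sigmoid kernels on a synthetic
# two-class problem via 5-fold cross-validation.
from sklearn.datasets import make_moons
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.25, random_state=0)
for kernel in ("poly", "rbf", "sigmoid"):
    scores = cross_val_score(SVC(kernel=kernel, gamma="scale"), X, y, cv=5)
    print(f"{kernel:8s} mean CV accuracy = {scores.mean():.3f}")
```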

AN EXTENSION OF SCHNEIDER'S CHARACTERIZATION THEOREM FOR ELLIPSOIDS

  • Dong-Soo Kim;Young Ho Kim
    • Bulletin of the Korean Mathematical Society, v.60 no.4, pp.905-913, 2023
  • Suppose that M is a strictly convex hypersurface in the (n+1)-dimensional Euclidean space 𝔼^{n+1} with the origin o in its convex side and with outward unit normal N. For a fixed point p ∈ M and a positive constant t, we denote by 𝚽_t the hyperplane parallel to the tangent hyperplane 𝚽 at p and passing through the point q = p - tN(p). We consider the region cut from M by the parallel hyperplane 𝚽_t and denote by I_p(t) the (n+1)-dimensional volume of the convex hull of the region and the origin o. Schneider's characterization theorem for ellipsoids states that among centrally symmetric, strictly convex and closed surfaces in the 3-dimensional Euclidean space 𝔼^3, the ellipsoids are the only ones satisfying I_p(t) = 𝜙(p)t, where 𝜙 is a function defined on M. Recently, the characterization theorem was extended to centrally symmetric, strictly convex and closed hypersurfaces in 𝔼^{n+1} satisfying, for a constant 𝛽, I_p(t) = 𝜙(p)t^𝛽. In this paper, we study the volume I_p(t) of a strictly convex and complete hypersurface in 𝔼^{n+1} with the origin o in its convex side. First, we extend the characterization theorem to strictly convex and closed (not necessarily centrally symmetric) hypersurfaces in 𝔼^{n+1} satisfying I_p(t) = 𝜙(p)t^𝛽. Then we generalize it to strictly convex and complete (not necessarily closed) hypersurfaces in 𝔼^{n+1} satisfying I_p(t) = 𝜙(p)t^𝛽.
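
As a sanity check of the linear case 𝛽 = 1 (our own computation, not taken from the paper): for the sphere of radius r centered at o in 𝔼^3, the convex hull of the cut-off region and o is a spherical cap plus the cone over its base circle, and I_p(t) is exactly linear in t:

```latex
% Sphere of radius r, depth t: cap of height t plus cone over the base
% circle (radius a with a^2 = 2rt - t^2, sitting at height r - t):
\[
  I_p(t)
  = \underbrace{\frac{\pi t^{2}(3r - t)}{3}}_{\text{cap}}
  + \underbrace{\frac{\pi (2rt - t^{2})(r - t)}{3}}_{\text{cone}}
  = \frac{2\pi r^{2}}{3}\, t,
\]
% so $I_p(t) = \phi(p)\,t$ with $\phi \equiv 2\pi r^{2}/3$, consistent with
% Schneider's characterization in the simplest case.
```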