Search keyword: parametric functions


Mesh distortion sensitivity of 8-node plane elasticity elements based on parametric, metric, parametric-metric, and metric-parametric formulations

  • Rajendran, S.;Subramanian, S.
    • Structural Engineering and Mechanics
    • /
    • v.17 no.6
    • /
    • pp.767-788
    • /
    • 2004
  • The classical 8-node isoparametric serendipity element uses parametric shape functions for both test and trial functions. Although this element performs well in general, it yields poor results under severe mesh distortions. The distortion sensitivity is caused by the lack of continuity and/or completeness of shape functions used for test and trial functions. A recent element using parametric and metric shape functions for constructing the test and trial functions exhibits distortion immunity. This paper discusses the choice of parametric or metric shape functions as the basis for test and/or trial functions, satisfaction of continuity and completeness requirements, and their connection to distortion sensitivity. Also, the performances of four types of elements, viz., parametric, metric, parametric-metric, and metric-parametric, are compared for distorted meshes, and their merits and demerits are discussed.
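For background on the parametric formulation mentioned above, the standard 8-node serendipity element uses the following textbook shape functions in the natural coordinates $(\xi,\eta)$; this sketch covers only the classical parametric basis, while the metric formulations compared in the paper construct analogous functions in terms of the physical coordinates instead.

```latex
% Standard parametric shape functions of the 8-node serendipity element,
% natural coordinates (xi, eta), nodal values (xi_i, eta_i) in {-1, 0, +1}.
\[
N_i(\xi,\eta) = \tfrac{1}{4}\,(1+\xi\xi_i)(1+\eta\eta_i)(\xi\xi_i+\eta\eta_i-1)
\qquad \text{(corner nodes, } \xi_i,\eta_i = \pm 1\text{)},
\]
\[
N_i(\xi,\eta) = \tfrac{1}{2}\,(1-\xi^2)(1+\eta\eta_i) \quad (\xi_i = 0),
\qquad
N_i(\xi,\eta) = \tfrac{1}{2}\,(1+\xi\xi_i)(1-\eta^2) \quad (\eta_i = 0).
\]
```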

SOME EVALUATIONS OF INFINITE SERIES INVOLVING DIRICHLET TYPE PARAMETRIC HARMONIC NUMBERS

  • Hongyuan Rui;Ce Xu;Xiaobin Yin
    • Bulletin of the Korean Mathematical Society
    • /
    • v.61 no.3
    • /
    • pp.671-697
    • /
    • 2024
  • In this paper, we formally introduce the notion of a general parametric digamma function Ψ(−s; A, a) and we find the Laurent expansion of Ψ(−s; A, a) at the integers and poles. Considering the contour integrations involving Ψ(−s; A, a), we present some new identities for infinite series involving Dirichlet type parametric harmonic numbers by using the method of residue computation. Then applying these formulas obtained, we establish some explicit relations of parametric linear Euler sums and some special functions (e.g. trigonometric functions, digamma functions, Hurwitz zeta functions etc.). Moreover, some illustrative special cases as well as immediate consequences of the main results are also considered.
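As background for the residue computations mentioned above (the general parametric digamma function Ψ(−s; A, a) itself is defined in the paper, not in the abstract), the method relies on the classical fact that the ordinary digamma function has simple poles at the non-positive integers, where its Laurent expansion produces harmonic numbers:

```latex
% Laurent expansion of the classical digamma function at s = -n, n >= 0.
\[
\psi(s) = -\frac{1}{s+n} + \psi(n+1) + O(s+n), \qquad s \to -n,
\qquad \psi(n+1) = H_n - \gamma .
\]
% Each pole contributes residue -1, and the constant term H_n - gamma is
% what injects harmonic numbers into contour-integral evaluations by residues.
```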

Integration of History-based Parametric CAD Model Translators Using Automation API (오토메이션 API를 사용한 설계 이력 기반 파라메트릭 CAD 모델 번역기의 통합)

  • Kim B.;Han S.
    • Korean Journal of Computational Design and Engineering
    • /
    • v.11 no.3
    • /
    • pp.164-171
    • /
    • 2006
  • As collaborative design and configuration design become increasingly important in product development, it is essential to exchange feature-based and parametric CAD models among participants. A history-based parametric exchange method has previously been proposed and implemented, but each translator that exchanges feature and parametric information tends to be heavy because it must implement duplicated functions, such as identifying the selected geometries and mapping between features that have different attributes. Furthermore, because the history-based parametric translator uses a procedural model, an XML macro file, as the neutral format, each translator needs a geometric modeling kernel to generate an internal explicit geometric model. To ease this problem, we implemented a shared integration platform, TransCAD. TransCAD separates the translators from the XML macro files, so the translators for the various CAD systems need to communicate only with TransCAD. To support this communication, we exposed the functions of TransCAD through Automation APIs, a technology developed by Microsoft. The Automation APIs of TransCAD consist of part modeling functions, data extraction functions, and utility functions. Each translator uses these functions to translate a parametric CAD model from the sending CAD system into the XML format, or from the XML format into the model of the receiving CAD system. This paper introduces what TransCAD is and how it works for the exchange of feature-based and parametric models.
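The abstract describes an architecture rather than an API, so the following Python sketch is purely illustrative: it assumes a hypothetical COM Automation server registered under the ProgID "TransCAD.Application" and hypothetical method names (OpenPart, ExportXmlMacro, ImportXmlMacro); none of these identifiers come from the paper. It only illustrates the architectural point that a translator talks to the shared integration platform through Automation instead of linking a geometric kernel itself.

```python
# Illustrative only: a translator process driving a hypothetical "TransCAD"
# Automation (COM) server rather than embedding a geometric modeling kernel.
# The ProgID and all method names below are assumptions, not the paper's API.
import win32com.client  # pywin32; the standard way to call COM Automation from Python


def export_part_to_xml_macro(part_path: str, xml_path: str) -> None:
    """Ask the integration platform to load a part and emit the neutral
    XML macro file used as the procedural (history-based) format."""
    app = win32com.client.Dispatch("TransCAD.Application")  # hypothetical ProgID
    part = app.OpenPart(part_path)        # hypothetical part modeling function
    app.ExportXmlMacro(part, xml_path)    # hypothetical data extraction function


def import_xml_macro(xml_path: str) -> None:
    """Replay the neutral XML macro so the receiving-system translator can
    rebuild the feature history through the platform's modeling functions."""
    app = win32com.client.Dispatch("TransCAD.Application")
    app.ImportXmlMacro(xml_path)          # hypothetical utility function


if __name__ == "__main__":
    export_part_to_xml_macro("bracket.prt", "bracket_macro.xml")
```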

SOME RELATIONS ON PARAMETRIC LINEAR EULER SUMS

  • Weiguo Lu;Ce Xu;Jianing Zhou
    • Bulletin of the Korean Mathematical Society
    • /
    • v.60 no.4
    • /
    • pp.985-1001
    • /
    • 2023
  • Recently, Alzer and Choi [2] introduced and studied a set of four linear Euler sums with parameters. These sums are parametric extensions of Flajolet and Salvy's four kinds of linear Euler sums [9]. In this paper, using the method of residue computations, we establish two explicit combined formulas involving the two parametric linear Euler sums $S^{++}_{p,q}(a,b)$ and $S^{+-}_{p,q}(a,b)$ defined by Alzer and Choi, which can be expressed in terms of linear combinations of products of trigonometric functions, digamma functions, and Hurwitz zeta functions.
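For orientation, the classical (non-parametric) linear Euler sum with all signs positive has the form below; the ± superscripts record whether the outer sum and the inner harmonic-type sum carry an alternating sign, and the parametric versions of Alzer and Choi insert the parameters a and b into these summands. The exact parametric definitions and sign conventions are those of the cited papers, not reproduced here.

```latex
% Classical linear Euler sum (all signs positive); H_n^{(p)} is the
% generalized harmonic number.
\[
H_n^{(p)} = \sum_{k=1}^{n} \frac{1}{k^{p}}, \qquad
S^{++}_{p,q} = \sum_{n=1}^{\infty} \frac{H_n^{(p)}}{n^{q}}, \qquad p \ge 1,\ q \ge 2 .
\]
% The remaining kinds attach an alternating factor (-1)^{n+1} to the outer
% sum and/or use the alternating analogue of H_n^{(p)}, giving S^{+-},
% S^{-+}, S^{--} (sign conventions as in Flajolet-Salvy and Alzer-Choi).
```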

Performance Improvement Method of Fully Connected Neural Network Using Combined Parametric Activation Functions (결합된 파라메트릭 활성함수를 이용한 완전연결신경망의 성능 향상)

  • Ko, Young Min;Li, Peng Hang;Ko, Sun Woo
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.11 no.1
    • /
    • pp.1-10
    • /
    • 2022
  • Deep neural networks are widely used to solve various problems. In a fully connected neural network, the nonlinear activation function nonlinearly transforms the input value and outputs the result. The nonlinear activation function plays an important role in solving nonlinear problems, and various nonlinear activation functions have been studied. In this study, we propose a combined parametric activation function that can improve the performance of a fully connected neural network. Combined parametric activation functions can be created simply by adding parametric activation functions. A parametric activation function applies parameters that change the scale and location of the activation function according to the input data, and these parameters can be optimized in the direction of minimizing the loss function. By combining parametric activation functions, more diverse nonlinear intervals can be created, and the parameters of the combined function can likewise be optimized to minimize the loss function. The performance of the combined parametric activation function was tested on the MNIST and Fashion MNIST classification problems, and the results confirm that it performs better than existing nonlinear activation functions and the uncombined parametric activation function.
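The abstract describes the construction only verbally, so the PyTorch sketch below rests on an assumed functional form: each parametric unit is taken as $a\,f(s\,x + t)$ with learnable scale $s$, location $t$, and output weight $a$, and the combined activation is a sum of such units. The class name and the choice of $f=\tanh$ are illustrative, not the authors' exact definition.

```python
# Minimal sketch of a "combined parametric activation": a sum of base
# activations whose scale/location/weight parameters are learned together
# with the network weights by minimizing the loss. The form a_i*f(s_i*x+t_i)
# is an assumption made for illustration.
import torch
import torch.nn as nn


class CombinedParametricActivation(nn.Module):
    def __init__(self, num_units: int = 3, base=torch.tanh):
        super().__init__()
        self.base = base
        self.scale = nn.Parameter(torch.ones(num_units))             # s_i: input scale
        self.loc = nn.Parameter(torch.zeros(num_units))              # t_i: input location
        self.amp = nn.Parameter(torch.ones(num_units) / num_units)   # a_i: output weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Broadcast each unit over a new trailing axis, then sum the units.
        u = self.scale * x.unsqueeze(-1) + self.loc                  # (..., features, units)
        return (self.amp * self.base(u)).sum(dim=-1)


# Fully connected network using the combined activation, e.g. MNIST-sized inputs.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 256), CombinedParametricActivation(),
    nn.Linear(256, 10),
)
loss = nn.CrossEntropyLoss()(model(torch.randn(8, 1, 28, 28)), torch.randint(0, 10, (8,)))
loss.backward()  # gradients flow into scale/loc/amp as well as the linear weights
```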

Experiments on Extraction of Non-Parametric Warping Functions for Speaker Normalization (화자 정규화를 위한 비정형 워핑함수 도출에 관한 실험)

  • Shin, Ok-Keun
    • The Journal of the Acoustical Society of Korea
    • /
    • v.24 no.5
    • /
    • pp.255-261
    • /
    • 2005
  • In this paper, experiments are conducted to extract a set of non-parametric warping functions in order to examine the characteristics of the warping among speakers' utterances. For this purpose, we make use of MFCC and LP spectra of vowels to choose a reference spectrum for each vowel as well as representative spectra for each speaker. These spectra are compared by DTW to give the warping function of each speaker. The set of warping functions is then defined by clustering the warping functions of all the speakers. Noting that the male and female warping functions have shapes similar to a piecewise linear function and a power function, respectively, a new hybrid set of warping functions is also defined. The effectiveness of the extracted warping functions is evaluated by conducting phone-level recognition experiments, and improvements in accuracy are observed with both sets of warping functions.
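As a sketch of the core step described above, the NumPy code below computes a dynamic time warping (DTW) alignment between a speaker's representative spectrum and a reference spectrum and reads off the resulting warping path. It is a generic DTW implementation with a standard step pattern and squared-difference cost, not the paper's exact configuration (the local constraints, distance measure, and clustering procedure are not given in the abstract).

```python
# Generic DTW between two spectra (e.g. MFCC- or LP-derived envelopes),
# returning the warping path that maps reference frequency bins to the
# speaker's frequency bins. Step pattern and cost are standard choices.
import numpy as np


def dtw_warping_path(ref: np.ndarray, spk: np.ndarray):
    n, m = len(ref), len(spk)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (ref[i - 1] - spk[j - 1]) ** 2
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    # Backtrack to recover the alignment (the non-parametric warping function).
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]


# Example: two toy spectral envelopes sampled on 64 frequency bins.
ref = np.abs(np.sin(np.linspace(0, 3, 64)))
spk = np.abs(np.sin(np.linspace(0, 3, 64) * 1.1))  # a "stretched" speaker spectrum
print(dtw_warping_path(ref, spk)[:5])
```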

Parametric model for the dielectric function of InGaAs alloy films (Parametric model을 이용한 InGaAs 박막의 유전함수 연구)

  • 인용섭;김태중;최재규;김영동
    • Journal of the Korean Vacuum Society
    • /
    • v.12 no.1
    • /
    • pp.20-24
    • /
    • 2003
  • We performed modeling of the dielectric functions of InGaAs by using the parametric semiconductor model. The parametric model describes the analytic dielectric function as the summation of several energy-bounded, Gaussian-broadened polynomials and provides a reasonably well parameterized function that can accurately reproduce the optical constants of InGaAs materials. We obtained the values of the fitting parameters for an arbitrary composition $\chi$ through the parametric model, and from these parameters we could obtain the previously unknown dielectric functions of InGaAs alloy films ($0\leq\chi\leq1$).
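Restating the abstract's description in symbols (a schematic sketch only; the actual parametric semiconductor model is more detailed than this and its specific parameterization is not given in the abstract), the imaginary part of the dielectric function is built from polynomial pieces confined to finite energy intervals and broadened by Gaussians, with the real part following from Kramers-Kronig consistency:

```latex
% Schematic form only: energy-bounded polynomial pieces W_j, each broadened
% by a Gaussian of width sigma_j, summed to give eps_2; eps_1 follows from
% Kramers-Kronig consistency (offset term included schematically).
\[
\varepsilon_2(E) \;=\; \sum_j \int_{E_j^{\min}}^{E_j^{\max}}
W_j(E')\, \frac{1}{\sigma_j\sqrt{2\pi}}
\exp\!\left(-\frac{(E-E')^2}{2\sigma_j^{2}}\right) dE',
\qquad
\varepsilon_1(E) \;=\; \varepsilon_{1,\infty} + \frac{2}{\pi}\,
\mathrm{P}\!\!\int_0^{\infty} \frac{E'\,\varepsilon_2(E')}{E'^{2} - E^{2}}\, dE'.
\]
```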

Performance Improvement Method of Convolutional Neural Network Using Combined Parametric Activation Functions (결합된 파라메트릭 활성함수를 이용한 합성곱 신경망의 성능 향상)

  • Ko, Young Min;Li, Peng Hang;Ko, Sun Woo
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.11 no.9
    • /
    • pp.371-380
    • /
    • 2022
  • Convolutional neural networks are widely used to process data arranged on a grid, such as images. A typical convolutional neural network consists of convolutional layers and fully connected layers, and each layer contains a nonlinear activation function. This paper proposes a combined parametric activation function to improve the performance of convolutional neural networks. The combined parametric activation function is created by adding parametric activation functions to which parameters that change the scale and location of the activation function are applied. Various nonlinear intervals can be created according to the parameters that change the multiple scales and locations, and the parameters can be learned in the direction of minimizing the loss function calculated from the given input data. Testing the convolutional neural network with the combined parametric activation function on the MNIST, Fashion MNIST, CIFAR10, and CIFAR100 classification problems confirmed that it performs better than other activation functions.
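The sketch below applies the same combined-activation idea inside a small convolutional network. Giving each feature-map channel its own scale, location, and output-weight parameters is an assumption made for illustration; the paper may share or structure the parameters differently.

```python
# Sketch (PyTorch) of a combined parametric activation applied channel-wise
# in a CNN. Per-channel parameters are an assumption, not the paper's design.
import torch
import torch.nn as nn


class ChannelwiseCombinedActivation(nn.Module):
    def __init__(self, channels: int, num_units: int = 2):
        super().__init__()
        shape = (1, channels, 1, 1, num_units)          # broadcast over N, H, W
        self.scale = nn.Parameter(torch.ones(shape))
        self.loc = nn.Parameter(torch.zeros(shape))
        self.amp = nn.Parameter(torch.ones(shape) / num_units)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        u = self.scale * x.unsqueeze(-1) + self.loc     # (N, C, H, W, units)
        return (self.amp * torch.tanh(u)).sum(dim=-1)   # back to (N, C, H, W)


cnn = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), ChannelwiseCombinedActivation(16),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), ChannelwiseCombinedActivation(32),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 10),
)
print(cnn(torch.randn(4, 1, 28, 28)).shape)  # torch.Size([4, 10])
```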

Performance Improvement Method of Deep Neural Network Using Parametric Activation Functions (파라메트릭 활성함수를 이용한 심층신경망의 성능향상 방법)

  • Kong, Nayoung;Ko, Sunwoo
    • The Journal of the Korea Contents Association
    • /
    • v.21 no.3
    • /
    • pp.616-625
    • /
    • 2021
  • Deep neural networks approximate an arbitrary function by composing linear models with nonlinear activation functions and repeating this approximation layer by layer. In this process, the quality of the approximation is evaluated with a loss function. Existing deep learning methods take the loss function into account when fitting the linear part, but the nonlinear stage that uses activation functions applies a fixed nonlinear transformation that is not directly related to reducing the loss. This study proposes parametric activation functions that introduce scale parameters, which change the scale of the activation function, and location parameters, which change its location. By introducing parametric activation functions based on scale and location parameters, the performance of the nonlinear approximation can be improved. The scale and location parameters in each hidden layer are determined so as to minimize the loss function, using the first derivative of the loss function with respect to these parameters during backpropagation, which improves the performance of the deep neural network. Experiments on the MNIST classification problem and the XOR problem show that the parametric activation functions outperform existing activation functions.
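As a worked sketch of the backpropagation step described above, assume a hidden unit of the form $y = f(s\,z + t)$, where $z$ is the unit's pre-activation, $s$ the scale parameter, and $t$ the location parameter; this functional form is an assumption for illustration, since the abstract does not state it. The first derivatives used in the gradient-descent update are then:

```latex
% Chain rule for the scale parameter s and location parameter t of a
% parametric activation y = f(s z + t); L is the loss, eta the learning rate.
\[
\frac{\partial L}{\partial s} = \frac{\partial L}{\partial y}\, f'(s z + t)\, z,
\qquad
\frac{\partial L}{\partial t} = \frac{\partial L}{\partial y}\, f'(s z + t),
\]
\[
s \leftarrow s - \eta\, \frac{\partial L}{\partial s},
\qquad
t \leftarrow t - \eta\, \frac{\partial L}{\partial t}.
\]
```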