• Title/Summary/Keyword: Linear constraint


Optimal Var Allocation in system planning by stochastic Linear Programming (확률 선형 계획법에 의한 최적 Var 배분 계획에 관한 연구)

  • Song, Kil-Yeong; Lee, Hee-Yeong
    • Proceedings of the KIEE Conference / 1988.07a / pp.863-865 / 1988
  • This paper presents an optimal Var allocation algorithm for minimizing transmission line losses and improving the voltage profile of a given system. Nodal input data are modeled as Gaussian distributions with their mean values and variances. A stochastic linear programming technique based on the chance-constrained method is applied to solve the Var allocation problem with probabilistic constraints. Test results on a 6-bus model system show that the voltage distribution of the load buses is improved and the power loss is reduced compared with the case before Var allocation.

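A chance constraint of the kind used above has a standard deterministic equivalent when the random quantity is Gaussian. The sketch below is not the paper's Var allocation model; the cost vector, constraint coefficients, and Gaussian parameters are hypothetical, and it only shows how the probabilistic constraint turns into an ordinary linear one that any LP solver accepts.

```python
# Minimal sketch (not the paper's model): a chance constraint
#   P( a^T x >= b ) >= alpha,   with b ~ N(mu, sigma^2),
# has the deterministic equivalent
#   a^T x >= mu + sigma * Phi^{-1}(alpha),
# which is still linear in x, so the problem remains an ordinary LP.
import numpy as np
from scipy.stats import norm
from scipy.optimize import linprog

alpha = 0.95                    # required probability level
mu, sigma = 10.0, 2.0           # mean / std of the random right-hand side b
a = np.array([1.0, 1.0])        # hypothetical constraint coefficients
c = np.array([3.0, 5.0])        # hypothetical Var installation costs

rhs = mu + sigma * norm.ppf(alpha)   # deterministic right-hand side

# minimize c^T x  subject to  a^T x >= rhs,  x >= 0
res = linprog(c, A_ub=[-a], b_ub=[-rhs], bounds=[(0, None)] * 2)
print(res.x, res.fun)
```

Raising alpha only shifts the right-hand side, so the problem stays the same size whatever probability level is demanded.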

A note on a method for estimating the linear expenditure system with one restriction

  • Lee, Seok-Koo
    • Journal of the Korean Statistical Society / v.4 no.1 / pp.67-78 / 1975
  • Over twenty-five years ago, Professors Klein and Rubin presented the linear expenditure system. That system was first estimated by Stone and has subsequently been estimated by many investigators. This paper refers to several points of the error structure described by Pollak and Wales. Barten presented an estimation theorem for a singular covariance matrix. To estimate the parameters, we place emphasis on the maximum likelihood method, which we believe to be most appropriate. Since there is one linear restriction on the parameters to be estimated, we maximize the associated likelihood function subject to that restriction by the well-known Lagrange multiplier method. The paper is organized as follows: (1) a brief description of classical consumer theory, (2) the linear expenditure system and its constraint, (3) dynamic and stochastic specification, (4) estimation method, and (5) conclusion.

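The estimation step can be sketched in isolation. The code below is not the linear expenditure system itself; it only shows that, for a Gaussian likelihood, maximizing subject to one linear restriction via a Lagrange multiplier reduces to restricted least squares. The design matrix, the restriction that the coefficients sum to one, and the data are invented for illustration.

```python
# Minimal sketch (not the linear expenditure system): restricted least squares,
# which is what the Lagrange-multiplier maximization of a Gaussian likelihood
# under a single linear restriction R b = r boils down to:
#   b_R = b_OLS - (X'X)^{-1} R' [R (X'X)^{-1} R']^{-1} (R b_OLS - r)
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([0.2, 0.3, 0.5]) + 0.1 * rng.normal(size=200)

R = np.ones((1, 3))            # hypothetical restriction: coefficients sum to one
r = np.array([1.0])

XtX_inv = np.linalg.inv(X.T @ X)
b_ols = XtX_inv @ X.T @ y
lam = np.linalg.solve(R @ XtX_inv @ R.T, R @ b_ols - r)   # Lagrange multiplier
b_restricted = b_ols - XtX_inv @ R.T @ lam

print(b_restricted, b_restricted.sum())   # the sum is exactly 1 by construction
```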

The Buffer Allocation with Linear Resource Constraints in a Continuous Flow Line (자원제약조건을 갖는 연속흐름라인에서 Buffer 의 할당에 관한 연구)

  • Seong, Deok-Hyun; Chang, Soo-Young; Hong, Yu-Shin
    • Journal of Korean Institute of Industrial Engineers / v.21 no.4 / pp.541-553 / 1995
  • An efficient algorithm is proposed for buffer allocation in a continuous flow line. The problem is formulated as a non-linear program with linear constraints, and the concepts of pseudo-gradient and gradient projection are employed in developing the algorithm. Numerical experiments show that the algorithm finds the actual optimal solutions to problems with a single linear constraint limiting the total buffer capacity. Even for longer production lines with general linear resource constraints, it produces good solutions within a few seconds.

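Gradient projection with a single linear constraint can be sketched briefly. The objective and weights below are invented (a concave stand-in for throughput, not the paper's flow-line model); the point is only that projecting the gradient onto the hyperplane where the buffer total is fixed keeps every iterate feasible.

```python
# Minimal sketch (not the paper's algorithm): projected-gradient ascent on a
# concave surrogate objective under the single linear constraint sum(b) = B.
# The gradient is projected onto {d : 1^T d = 0}, so each step preserves the
# total buffer capacity.
import numpy as np

w = np.array([1.0, 2.0, 3.0, 2.0, 1.0])   # hypothetical per-buffer weights

def throughput(b):
    return np.sum(w * np.log1p(b))        # concave stand-in for line throughput

def grad(b):
    return w / (1.0 + b)

B, n = 30.0, 5
b = np.full(n, B / n)                     # feasible starting point
for _ in range(500):
    step = grad(b) - grad(b).mean()       # projection onto 1^T d = 0
    b = np.clip(b + 0.2 * step, 0.0, None)
    b *= B / b.sum()                      # restore sum(b) = B after clipping
print(np.round(b, 3), round(throughput(b), 4))
```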

PAPR Reduction Method for the Nonlinear Distortion in the Multicode CDMA System (멀티코드 CDMA 시스템에서 비선형 왜곡에 대처하는 PAPR 저감 기법)

  • Kim Sang-Woo; Kim Namil; Kim Sun-Ae; Suh Jae-Won; Ryu Heung-Cyoon
    • The Journal of Korean Institute of Electromagnetic Engineering and Science / v.16 no.12 s.103 / pp.1171-1178 / 2005
  • Multi-code code division multiple access (MC-CDMA) has been proposed to provide various service rates with differing quality-of-service requirements by assigning multiple codes, thereby increasing capacity. However, it suffers from a serious problem of high peak-to-average power ratio (PAPR), and therefore requires a large input back-off, which degrades the power efficiency of the high-power amplifier (HPA). In this paper, we propose a new method that reduces PAPR efficiently by means of constraint codes whose correlation is opposite to that of the incoming information data in MC-CDMA. The PAPR reduction depends on the length and indices of the constraint codes, and there is a trade-off between PAPR reduction and constraint-code length. From the simulation results, we also investigate the BER improvement in an AWGN channel with an HPA. The simulations show that BER performance similar to that of a linear amplifier can be achieved in two cases: 1) using exact constraint codes without input back-off, and 2) using a few constraint codes with a small input back-off.
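The effect of spending a few extra code dimensions on PAPR can be shown with a toy computation. This is not the proposed scheme: the Walsh code allocation, the exhaustive search over constraint-code signs, and the real baseband signal model below are simplifying assumptions, chosen only to make the trade-off between constraint-code length and peak reduction visible.

```python
# Minimal sketch (not the paper's scheme): PAPR of a multicode sum of
# Walsh-spread BPSK symbols, and an exhaustive search over the signs carried by
# a few extra "constraint" codes to flatten the composite chip sequence.
import itertools
import numpy as np
from scipy.linalg import hadamard

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

H = hadamard(16).astype(float)          # 16 Walsh codes of length 16
data_codes, extra_codes = H[:12], H[12:]
data_bits = np.random.default_rng(1).choice([-1.0, 1.0], size=12)

base = data_bits @ data_codes           # composite chips without constraint codes
best_papr, best_signs = min(
    (papr_db(base + np.array(s) @ extra_codes), s)
    for s in itertools.product([-1.0, 1.0], repeat=len(extra_codes))
)
print(f"PAPR without constraint codes: {papr_db(base):.2f} dB")
print(f"PAPR with constraint-code signs {best_signs}: {best_papr:.2f} dB")
```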

Principal Component Analysis of Compositional Data using Box-Cox Contrast Transformation (Box-Cox 대비변환을 이용한 구성비율자료의 주성분분석)

  • 최병진; 김기영
    • The Korean Journal of Applied Statistics / v.14 no.1 / pp.137-148 / 2001
  • Compositional data, found in many practical applications, consist of non-negative vectors of proportions with the constraint that the elements of each vector sum to unity. It is well known that the statistical analysis of compositional data suffers from this unit-sum constraint. Moreover, the non-linear pattern frequently displayed by such data hinders the application of linear multivariate techniques such as principal component analysis. In this paper we develop a new type of principal component analysis for compositional data using the Box-Cox contrast transformation. Numerical illustrations are provided for comparative purposes.

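The abstract does not spell out the contrast, so the sketch below assumes a Box-Cox style generalization of the centred log-ratio, ((x_i/g(x))^lambda - 1)/lambda, which tends to the ordinary log-ratio as lambda -> 0; the Dirichlet compositions are synthetic and the subsequent PCA is the standard one.

```python
# Minimal sketch (the paper's exact contrast may differ): a Box-Cox style
# generalization of the centred log-ratio applied to compositional data,
# followed by ordinary principal component analysis via the SVD.
import numpy as np

def boxcox_contrast(X, lam=0.5):
    g = np.exp(np.mean(np.log(X), axis=1, keepdims=True))   # geometric mean per row
    ratio = X / g
    return np.log(ratio) if lam == 0 else (ratio ** lam - 1.0) / lam

rng = np.random.default_rng(0)
comps = rng.dirichlet(alpha=[2.0, 5.0, 3.0], size=100)       # rows sum to one

Z = boxcox_contrast(comps, lam=0.3)
Zc = Z - Z.mean(axis=0)
_, s, Vt = np.linalg.svd(Zc, full_matrices=False)
scores = Zc @ Vt.T                                           # principal component scores
print("explained variance ratio:", np.round(s ** 2 / (s ** 2).sum(), 3))
```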

Fatigue Life Evaluation of an Actual Structure under the Irregular Loading using an Acceleration Test (가속시험을 통한 불규칙하중을 받는 실구조물의 피로수명평가)

  • 김형익; 배봉국; 박재실; 석창성; 모진용
    • Proceedings of the Korean Society of Precision Engineering Conference / 2004.10a / pp.166-169 / 2004
  • A fatigue test was used to evaluate the fatigue life of an actual structure. For specimen test results to be applied to an actual structure, the loading state and constraint conditions of the structure must be the same as those of the specimen; for a complicated actual structure, however, this cannot be achieved. To reduce these differences, an actual-structure test covering many loading frequencies is needed to obtain a fatigue life curve. Therefore, ten sets of accelerated test units with attached unbalanced masses were assembled in this study, and acceleration histories of the vibrating structure were acquired. Rainflow counting was applied to the acceleration histories, and a life-curve return formula was assumed. Using Miner's rule, the linear cumulative damage theory, the return formula for which the damage reaches 1 was obtained through a feedback process. A conservative fatigue life curve was determined from the return formulas estimated for each set, and the fatigue life under a regular-rpm condition was calculated from these conservative fatigue life curves.

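Miner's linear cumulative-damage rule mentioned above is compact enough to write down. The S-N constants and cycle counts below are hypothetical stand-ins for the rainflow-counted histogram, not the paper's measurements; the sketch only shows how damage D = sum(n_i/N_i) accumulates and where D = 1 marks predicted failure.

```python
# Minimal sketch (hypothetical S-N data): Miner's rule, D = sum(n_i / N_i),
# applied to cycle counts as they would come out of a rainflow count, with an
# assumed Basquin-type S-N curve N(S) = C * S**(-m).
C, m = 1.0e12, 3.0                       # hypothetical S-N curve constants

def cycles_to_failure(stress_range):
    return C * stress_range ** (-m)

# (stress range [MPa], counted cycles): stand-in for a rainflow histogram
rainflow_bins = [(40.0, 1.2e5), (80.0, 3.0e4), (120.0, 5.0e3)]

damage = sum(n / cycles_to_failure(s) for s, n in rainflow_bins)
print(f"damage per measured block: {damage:.3e}")
print(f"blocks to failure (D = 1): {1.0 / damage:.1f}")
```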

Feature Extraction via Sparse Difference Embedding (SDE)

  • Wan, Minghua; Lai, Zhihui
    • KSII Transactions on Internet and Information Systems (TIIS) / v.11 no.7 / pp.3594-3607 / 2017
  • Traditional feature extraction methods such as principal component analysis (PCA) cannot capture the local structure of the samples, and locally linear embedding (LLE) cannot capture their global structure. A further common drawback of the existing PCA and LLE algorithms is that they do not deal well with sparsity in the samples. Therefore, by integrating the globality of PCA and the locality of LLE with a sparse constraint, we develop an improved unsupervised difference algorithm called Sparse Difference Embedding (SDE) for dimensionality reduction of high-dimensional data in small-sample-size problems. Differing significantly from the existing PCA and LLE algorithms, SDE seeks a set of projections that not only preserve the intraclass locality and maximize the interclass globality, but also use Lasso regression to obtain a sparse transformation matrix. This characteristic makes SDE more intuitive and more powerful than PCA and LLE. Finally, the proposed algorithm was evaluated through experiments on the Yale and AR face image databases and the USPS handwritten digit database. The experimental results show that SDE outperforms PCA, LLE, and UDP owing to its sparse discriminating characteristics, which also indicates that SDE is an effective method for face recognition.
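The SDE objective itself is not reproduced here; the sketch below only illustrates the last ingredient the abstract names, obtaining a sparse transformation matrix with Lasso regression. Regressing PCA scores on the original features is an illustrative stand-in for the paper's difference criterion, and the data and penalty weight are synthetic.

```python
# Minimal sketch (not the SDE objective): making projection directions sparse
# by re-fitting each leading PCA direction with an L1-penalized regression of
# its scores on the original features, so the loading matrix has exact zeros.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 20))
X -= X.mean(axis=0)

scores = PCA(n_components=3).fit_transform(X)          # dense directions
W_sparse = np.column_stack([
    Lasso(alpha=0.05, max_iter=5000).fit(X, scores[:, k]).coef_
    for k in range(scores.shape[1])
])
print("non-zero loadings per sparse direction:", (W_sparse != 0).sum(axis=0))
```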

Gaussian Weighted CFCM for Blind Equalization of Linear/Nonlinear Channel

  • Han, Soo-Whan
    • Journal of the Institute of Convergence Signal Processing / v.14 no.3 / pp.169-180 / 2013
  • In this paper, a modification of conditional Fuzzy C-Means with Gaussian weights (CFCM_GW) is developed for blind channel equalization. The proposed CFCM_GW can deal with both linear and nonlinear channels, because it searches for the optimal desired states of an unknown channel directly, without depending on the type of channel structure. The search procedure of CFCM_GW exploits a Bayesian likelihood fitness function, a Gaussian-weighted partition matrix, and a conditional constraint. In particular, in contrast to the usual Euclidean distance in conventional Fuzzy C-Means (FCM), the Gaussian-weighted partition matrix and the conditional constraint make the proposed CFCM_GW more robust in heavy-noise communication environments. The channel states selected by CFCM_GW remain close to the optimal set even when the signal is heavily corrupted by additive white Gaussian noise (AWGN). These channel states are then used as the input of a Bayesian equalizer to reconstruct the transmitted symbols. Simulation studies demonstrate that the performance of the proposed method is superior to that of existing conventional FCM-based approaches in terms of accuracy and speed.
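As a rough illustration only, and not the CFCM_GW algorithm, the sketch below runs a plain fuzzy C-means update in which the partition matrix is additionally weighted by a Gaussian kernel of the sample-to-centre distance before the centre update; the scalar channel states, noise level, and kernel width are invented.

```python
# Minimal sketch (not CFCM_GW): fuzzy C-means on scalar samples with the
# partition matrix multiplied by a Gaussian weight of the distance to each
# centre, loosely mimicking a "Gaussian weighted partition matrix".
import numpy as np

def fcm_gaussian_weighted(x, n_centers=4, m=2.0, sigma=0.5, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=n_centers, replace=False)
    for _ in range(iters):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12        # (N, C) distances
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
        w = np.exp(-d ** 2 / (2 * sigma ** 2))                    # Gaussian weights
        uw = (u ** m) * w
        centers = (uw * x[:, None]).sum(axis=0) / uw.sum(axis=0)  # weighted centre update
    return centers

# noisy scalar "channel outputs" clustered around four unknown desired states
rng = np.random.default_rng(1)
states = np.array([-3.0, -1.0, 1.0, 3.0])
samples = rng.choice(states, size=400) + 0.3 * rng.normal(size=400)
print(np.sort(fcm_gaussian_weighted(samples)))
```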

Dynamic Survivable Routing for Shared Segment Protection

  • Tapolcai, Janos; Ho, Pin-Han
    • Journal of Communications and Networks / v.9 no.2 / pp.198-209 / 2007
  • This paper provides a thorough study of shared segment protection (SSP) for mesh communication networks in the complete-routing-information scenario, where the integer linear program (ILP) in [1] is extended so that the following two constraints are addressed: (a) the restoration-time constraint for each connection request, and (b) the switching/merging capacity constraint at each node. A novel approach, called the SSP algorithm, is developed to reduce the extremely high computational complexity of solving the ILP formulation. Our approach is to derive a good approximation of the parameters in the ILP by referring to the result of solving the corresponding shared path protection (SPP) problem; the design space can then be significantly reduced by eliminating some edges in the graphs. We show in the simulations that with our approach optimality is achieved in most cases. To verify the proposed formulation and investigate the performance impairment, in terms of average cost and success rate, caused by the two additional constraints, extensive simulations were conducted on three network topologies, in which SPP and shared link protection (SLP) were implemented for comparison. We demonstrate that the proposed SSP algorithm can effectively and efficiently solve the survivable routing problem with constraints on restoration time and on the switching/merging capability of each node. The comparison among the three protection types further verifies that SSP yields significant advantages over SPP and SLP without requiring much additional computation time.
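How the restoration-time constraint enters an integer linear program can be shown with a toy model. This is not the paper's ILP: the candidate backup segments, their costs and restoration times, and the time budget are all hypothetical, and the sketch uses the PuLP modelling library simply because it keeps the formulation readable.

```python
# Minimal sketch (not the paper's ILP): pick one of several candidate backup
# segments for a single connection so that spare cost is minimized while the
# restoration-time budget is respected.
from pulp import LpProblem, LpVariable, LpMinimize, LpBinary, lpSum, value

# hypothetical candidate backup segments: (spare cost, restoration time in ms)
candidates = {
    "seg_A": (4, 60),
    "seg_B": (2, 110),
    "seg_C": (3, 45),
}
T_MAX = 80                         # restoration-time budget for this connection

prob = LpProblem("toy_segment_protection", LpMinimize)
pick = {k: LpVariable(f"pick_{k}", cat=LpBinary) for k in candidates}

prob += lpSum(pick[k] * candidates[k][0] for k in candidates)            # spare cost
prob += lpSum(pick[k] for k in candidates) == 1                          # exactly one segment
prob += lpSum(pick[k] * candidates[k][1] for k in candidates) <= T_MAX   # restoration time

prob.solve()
print({k: int(value(v)) for k, v in pick.items()})
```

Here the cheapest segment violates the time budget, so the solver settles on the cheapest segment that restores within T_MAX, which is the kind of trade-off the two added constraints introduce.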