• Title/Abstract/Keyword: Deterministic algorithms

Search results: 117 items

Optimization Algorithm for Economic Load Dispatch Problem Using Balance and Swap Method

  • 이상운
    • 한국인터넷방송통신학회논문지 / Vol. 15, No. 2 / pp.255-262 / 2015
  • Since no deterministic algorithm is known for the economic load dispatch optimization problem, only non-deterministic heuristic algorithms have been proposed so far. This paper proposes an algorithm that solves the economic load dispatch optimization problem by introducing balance and swap methods. Starting from an initial solution, the proposed algorithm reduces the generation outputs in adult-step and baby-step increments until the balance $\sum P_i = P_d$ is satisfied, and selects the schedule with the minimum generation cost. The selected schedule is then optimized with adult-step/baby-step swaps and giant-step swaps, and the minimum-cost result is kept. Finally, swaps of $P_i \pm \beta$ ($\beta = 0.1, 0.01, 0.001, 0.0001$) are performed on the selected schedule. Applying the proposed algorithm to three data sets frequently used as test cases for the economic load dispatch problem improved the results for two data sets and reproduced the known optimum for the third. Since the proposed algorithm always produces the same result and is applicable to all data sets, it can be used in practice as an economic load dispatch optimization algorithm.
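
A minimal Python sketch of the balance-and-swap idea summarized above, using made-up cost coefficients, limits, and step sizes rather than the paper's test systems: outputs are stepped down from their maxima until they sum to the demand, and then pairwise ±β exchanges are kept whenever they lower a quadratic fuel-cost function.

```python
import numpy as np

# Hypothetical 3-unit system: cost F_i(P) = a_i + b_i*P + c_i*P^2 (all values made up).
a = np.array([500.0, 400.0, 200.0])
b = np.array([5.3, 5.5, 5.8])
c = np.array([0.004, 0.006, 0.009])
p_min = np.array([200.0, 150.0, 100.0])
p_max = np.array([450.0, 350.0, 225.0])
demand = 975.0                                    # P_d

def cost(p):
    return float(np.sum(a + b * p + c * p * p))

# Balance phase: start at full output and repeatedly step down the unit with the
# highest marginal cost, first in "adult" steps, then in "baby" steps.
p = p_max.copy()
for step in (10.0, 1.0, 0.1):
    while p.sum() - demand >= step:
        marginal = b + 2.0 * c * p                # dF_i/dP_i
        movable = np.where(p - step >= p_min)[0]
        i = movable[np.argmax(marginal[movable])]
        p[i] -= step
p[int(np.argmax(b + 2.0 * c * p))] -= p.sum() - demand   # absorb the sub-step residual

# Swap phase: keep any beta-sized exchange between two units that lowers the cost.
for beta in (0.1, 0.01, 0.001, 0.0001):
    improved = True
    while improved:
        improved = False
        for i in range(len(p)):
            for j in range(len(p)):
                if i == j:
                    continue
                q = p.copy()
                q[i] += beta
                q[j] -= beta
                if (q >= p_min).all() and (q <= p_max).all() and cost(q) < cost(p):
                    p, improved = q, True

print("dispatch:", np.round(p, 4), "total cost:", round(cost(p), 2))
```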

Rebinning-Based Deterministic Image Reconstruction Methods for Compton Camera

  • 이미노;이수진;서희
    • 대한의용생체공학회:의공학회지 / Vol. 32, No. 1 / pp.15-24 / 2011
  • While Compton imaging is recognized as a valuable 3-D technique in nuclear medicine, reconstructing an image from Compton scattered data has been a difficult problem due to its computational complexity. The most complex and time-consuming computation in Compton camera reconstruction is performing the conical projection and backprojection operations. To alleviate the computational burden imposed by these operations, we investigate a rebinning method which can convert conical projections into parallel projections. The use of parallel projections allows the existing deterministic reconstruction methods, which have been useful for conventional emission tomography, to be applied directly to Compton camera reconstruction. To convert conical projections into parallel projections, a cone surface is sampled with a number of lines. Each line is projected onto an imaginary plane that is mostly perpendicular to the line. The projection data rebinned in each imaginary plane can then be treated as standard parallel projection data. To validate the rebinning method, we tested it with representative deterministic algorithms, such as the filtered backprojection method and the algebraic reconstruction technique. Our experimental results indicate that the rebinning method can be useful when the direct application of existing deterministic methods is needed for Compton camera reconstruction.
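
A small illustration of the cone-sampling step the abstract describes, with purely assumed geometry (apex, axis, opening angle, and sampling density are not taken from the paper): each Compton cone is replaced by a fan of lines on its surface, which could then be rebinned into parallel projection bins.

```python
import numpy as np

def sample_cone_lines(apex, axis, half_angle, n_lines=64):
    """Return unit direction vectors of n_lines lines lying on the surface of a
    cone with the given apex, axis direction, and half-opening angle (radians)."""
    d = np.asarray(axis, dtype=float)
    d /= np.linalg.norm(d)
    # build an orthonormal basis (u, v) perpendicular to the cone axis
    helper = np.array([1.0, 0.0, 0.0]) if abs(d[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(d, helper); u /= np.linalg.norm(u)
    v = np.cross(d, u)
    phi = np.linspace(0.0, 2.0 * np.pi, n_lines, endpoint=False)
    # rotate the axis by the half-angle and sweep it around the axis
    dirs = (np.cos(half_angle) * d[None, :]
            + np.sin(half_angle) * (np.cos(phi)[:, None] * u + np.sin(phi)[:, None] * v))
    return np.asarray(apex, dtype=float), dirs

# Example: cone from a scatter event at the origin, axis along +z, 30 degree opening.
apex, line_dirs = sample_cone_lines(apex=[0.0, 0.0, 0.0],
                                    axis=[0.0, 0.0, 1.0],
                                    half_angle=np.radians(30.0))
print(line_dirs.shape)   # (64, 3) unit vectors; each defines one line through the apex
```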

Error Analysis of the Exponential RLS Algorithms Applied to Speech Signal Processing

  • Yoo, Kyung-Yul
    • The Journal of the Acoustical Society of Korea / Vol. 15, No. 3E / pp.78-85 / 1996
  • The set of admissible time-variations in the input signal can be separated into two categories: slow parameter changes and large parameter changes which occur infrequently. A common approach to tracking slowly time-varying parameters is the exponential recursive least-squares (RLS) algorithm. There has been a variety of research on the error analysis of the exponential RLS algorithm for slowly time-varying parameters. In this paper, the focus is on the error analysis of exponential RLS algorithms for input data with abrupt property changes. The voiced speech signal is chosen as the principal application. In order to analyze the error performance of the exponential RLS algorithm, its deterministic properties are first analyzed for the case of abrupt parameter changes; the impulsive input (or error variance) synchronous with the abrupt change of the parameter vectors actually enhances the convergence of the exponential RLS algorithm. The analysis has also been verified through simulations on a synthetic speech signal.
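
For reference, a textbook implementation of the exponentially weighted RLS recursion discussed above; the forgetting factor, model order, and the synthetic signal with an abrupt coefficient change are illustrative choices, not the paper's experimental setup.

```python
import numpy as np

def exponential_rls(x, d, order=10, lam=0.98, delta=100.0):
    """Exponentially weighted RLS: track w so that d[n] ~ w . x_vec[n].
    lam is the forgetting factor, delta the initial inverse-correlation scale."""
    w = np.zeros(order)
    P = delta * np.eye(order)          # inverse of the weighted autocorrelation matrix
    err = np.zeros(len(d))
    for n in range(order, len(d)):
        x_vec = x[n - order:n][::-1]   # most recent samples first
        e = d[n] - w @ x_vec           # a-priori prediction error
        k = P @ x_vec / (lam + x_vec @ P @ x_vec)   # gain vector
        w = w + k * e
        P = (P - np.outer(k, x_vec @ P)) / lam
        err[n] = e
    return w, err

# Example: track an AR(2) process whose coefficients change abruptly halfway through.
rng = np.random.default_rng(0)
n_samples = 2000
sig = np.zeros(n_samples)
for n in range(2, n_samples):
    a1, a2 = (1.5, -0.7) if n < n_samples // 2 else (0.2, -0.5)   # abrupt change
    sig[n] = a1 * sig[n - 1] + a2 * sig[n - 2] + rng.standard_normal()
w, err = exponential_rls(sig, sig, order=2, lam=0.98)
print("estimated AR coefficients after convergence:", np.round(w, 3))
```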


A Study on the Detection of High Impedance Faults using Wavelet Transforms and Neural Network

  • 홍대승;배영철;전상영;임화영
    • 한국정보통신학회:학술대회논문집 / 한국해양정보통신학회 2000년도 추계종합학술대회 / pp.459-462 / 2000
  • The analysis of distribution line faults is essential to the proper protection of a power system. A high impedance fault (HIF) does not draw enough current to operate conventional protective devices, so it is well known that undesirable operating conditions and certain types of faults on electric distribution feeders cannot be detected by conventional protection systems. In this paper, we prove that the nature of high impedance faults is indeed deterministic chaos, not random motion. Algorithms for estimating the Lyapunov spectrum and the largest Lyapunov exponent are applied to various fault currents in order to evaluate the orbital instability peculiar to deterministic chaos, and the fractal dimensions of the fault currents, which represent geometrical self-similarity, are calculated. Wavelet transform analysis is applied to extract time-scale information from the fault signal. The time-scale representation of high impedance faults allows the fault waveform to be detected easily and localized correctly.
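
A rough sketch of the wavelet-based detection idea: a one-level Haar transform and a synthetic arcing-like signal are used here purely for illustration (the paper does not specify either), and the detail-coefficient energy in a sliding window rises sharply once the irregular fault component appears.

```python
import numpy as np

def haar_detail(signal):
    """One-level Haar wavelet transform; return the detail (high-pass) coefficients."""
    s = signal[: len(signal) // 2 * 2]
    return (s[0::2] - s[1::2]) / np.sqrt(2.0)

# Synthetic example: a 60 Hz current with an irregular, noisy component added halfway
# through to mimic the erratic high impedance fault current (assumed test signal).
fs = 3840                                  # samples per second (64 samples/cycle)
t = np.arange(0, 0.5, 1.0 / fs)
current = np.sin(2 * np.pi * 60 * t)
rng = np.random.default_rng(1)
fault = t >= 0.25
current[fault] += 0.15 * rng.standard_normal(fault.sum())   # erratic HIF-like component

detail = haar_detail(current)
win = 32                                   # half a cycle of detail coefficients
energy = np.array([np.sum(detail[i:i + win] ** 2) for i in range(0, len(detail) - win, win)])
print(np.round(energy, 3))                 # energy jumps after the fault inception
```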


A Study on Extracting Characteristics of High Impedance Fault-Current Based on Chaotic Analysis

  • 배영철;고재호;임화영
    • 한국정보통신학회논문지 / Vol. 4, No. 2 / pp.379-388 / 2000
  • Previous studies on high impedance faults assumed that the erratic behavior of the fault current would be random. In this paper, we prove that the nature of high impedance faults is indeed deterministic chaos, not random motion. Algorithms for estimating the Lyapunov spectrum and the largest Lyapunov exponent are applied to various fault currents in order to evaluate the orbital instability peculiar to deterministic chaos, and the fractal dimensions of the fault currents, which represent geometrical self-similarity, are calculated. In addition, qualitative analyses such as phase planes and Poincaré maps obtained from the fault currents indicate that the irregular behavior is described by a strange attractor.
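
A compact, Rosenstein-style estimate of the largest Lyapunov exponent of a time series, of the kind referred to above; the embedding parameters and the logistic-map sanity check are illustrative assumptions, not the authors' data or algorithm.

```python
import numpy as np

def largest_lyapunov(x, dim=5, tau=4, min_sep=50, n_steps=60):
    """Rosenstein-style estimate of the largest Lyapunov exponent: delay-embed the
    series, pair each point with its nearest neighbour (excluding temporally close
    points), and fit the slope of the mean log-divergence curve."""
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    usable = n - n_steps
    dists = np.linalg.norm(emb[:usable, None, :] - emb[None, :usable, :], axis=2)
    for i in range(usable):                             # mask temporal neighbours
        dists[i, max(0, i - min_sep):i + min_sep + 1] = np.inf
    nn = np.argmin(dists, axis=1)
    idx = np.arange(usable)
    div = [np.mean(np.log(np.linalg.norm(emb[idx + k] - emb[nn + k], axis=1) + 1e-12))
           for k in range(n_steps)]
    return np.polyfit(np.arange(n_steps), div, 1)[0]    # slope = exponent per sample

# Sanity check on a signal known to be chaotic (the logistic map at r = 3.9).
x = np.empty(2000); x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])
print(round(largest_lyapunov(x, dim=2, tau=1, min_sep=10, n_steps=10), 3))  # > 0 => chaotic
```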


Global Optimization of Clusters in Gene Expression Data of DNA Microarrays by Deterministic Annealing

  • Lee, Kwon Moo;Chung, Tae Su;Kim, Ju Han
    • Genomics & Informatics / Vol. 1, No. 1 / pp.20-24 / 2003
  • The analysis of DNA microarray data is one of the most important tasks in functional genomics research. The matrix representation of microarray data and its successive 'optimal' incisional hyperplanes form a useful platform for developing optimization algorithms that determine the optimal partitioning of a pairwise proximity matrix representing a completely connected, weighted graph. We developed a Deterministic Annealing (DA) approach to determine the successive optimal binary partitionings. The DA algorithm demonstrated good performance, with the ability to find 'globally optimal' binary partitions. In addition, the objects that have not been clustered at a small nonzero temperature are considered to be very sensitive to even small randomness and can be used to estimate the reliability of the clustering.
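
A minimal deterministic-annealing clustering sketch in the spirit of the approach described above (synthetic data, two clusters, and simple squared-distance memberships are assumed; this is not the authors' implementation): assignments stay soft at high temperature and harden only as the temperature is lowered.

```python
import numpy as np

def deterministic_annealing(X, k=2, t_start=10.0, t_stop=0.01, cooling=0.9, n_iter=30):
    """Deterministic-annealing clustering (Rose-style): soft assignments
    p(j|i) ~ exp(-||x_i - y_j||^2 / T) are annealed as T is lowered, so the
    partition hardens gradually instead of committing to early local optima."""
    rng = np.random.default_rng(0)
    centers = X.mean(axis=0) + 1e-3 * rng.standard_normal((k, X.shape[1]))
    T = t_start
    while T > t_stop:
        for _ in range(n_iter):
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            p = np.exp(-(d2 - d2.min(axis=1, keepdims=True)) / T)     # soft memberships
            p /= p.sum(axis=1, keepdims=True)
            centers = (p.T @ X) / (p.sum(axis=0)[:, None] + 1e-12)    # weighted centroids
        T *= cooling
    return p.argmax(axis=1), centers

# Toy example standing in for a gene-expression proximity structure (synthetic data).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, size=(50, 4)), rng.normal(1.0, 0.3, size=(50, 4))])
labels, centers = deterministic_annealing(X, k=2)
print(np.bincount(labels))      # roughly 50 / 50 split of the two synthetic groups
```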

The Limit of the March Test Method and Algorithms (On Detecting Coupling Faults of Semiconductor Memories)

  • 여정모;조상복
    • 전자공학회논문지A / Vol. 29A, No. 8 / pp.99-109 / 1992
  • First, the coupling faults of semiconductor memories are classified in detail. The chained coupling fault, which results from the sequential influence of coupling effects among memory cells, is introduced and defined, and its mapping relation is described. The linked coupling fault and its order are also defined. Second, the deterministic “Algorithm GA” is proposed, which detects stuck-at faults, transition faults, address decoder faults, unlinked 2-coupling faults, and unlinked chained coupling faults; both the time complexity and the fault coverage are improved by this algorithm. Third, it is proved that a march test of an address sequence can detect 97.796% of the linked 2-coupling faults of order 2. The proposed deterministic “Algorithm NA” can detect up to this limit, and its time complexity and fault coverage are likewise improved.
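
To illustrate how a march test sweeps the address space, the sketch below runs the classical March C- sequence against a simulated memory with one injected coupling fault; it is not the paper's Algorithm GA or NA, whose element sequences are not given in the abstract.

```python
class FaultyMemory:
    """1-bit-per-cell memory with an injected coupling fault: writing 1 into the
    aggressor cell forces the victim cell to 0 (an idempotent coupling fault)."""
    def __init__(self, size, aggressor=3, victim=7):
        self.cells = [0] * size
        self.aggressor, self.victim = aggressor, victim

    def write(self, addr, value):
        self.cells[addr] = value
        if addr == self.aggressor and value == 1:
            self.cells[self.victim] = 0          # coupling effect

    def read(self, addr):
        return self.cells[addr]

def march_c_minus(mem, size):
    """March C-: {up(w0); up(r0,w1); up(r1,w0); down(r0,w1); down(r1,w0); up(r0)}"""
    elements = [
        (range(size), [("w", 0)]),
        (range(size), [("r", 0), ("w", 1)]),
        (range(size), [("r", 1), ("w", 0)]),
        (range(size - 1, -1, -1), [("r", 0), ("w", 1)]),
        (range(size - 1, -1, -1), [("r", 1), ("w", 0)]),
        (range(size), [("r", 0)]),
    ]
    for order, ops in elements:
        for addr in order:
            for op, val in ops:
                if op == "w":
                    mem.write(addr, val)
                elif mem.read(addr) != val:
                    return f"fault detected at address {addr}"
    return "no fault detected"

print(march_c_minus(FaultyMemory(16), 16))   # reports the disturbed victim cell
```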


A Study on High Impedance Fault Detection using Wavelet Transform and Chaos Properties

  • 홍대승;임화영
    • 대한전기학회:학술대회논문집 / 대한전기학회 2000년도 하계학술대회 논문집 D / pp.2525-2527 / 2000
  • The analysis of distribution line faults is essential to the proper protection of a power system. A high impedance fault (HIF) does not draw enough current to operate conventional protective devices, so it is well known that undesirable operating conditions and certain types of faults on electric distribution feeders cannot be detected by conventional protection systems. In this paper, we prove that the nature of high impedance faults is indeed deterministic chaos, not random motion. Algorithms for estimating the Lyapunov spectrum and the largest Lyapunov exponent are applied to various fault currents in order to evaluate the orbital instability peculiar to deterministic chaos, and the fractal dimensions of the fault currents, which represent geometrical self-similarity, are calculated. Wavelet transform analysis is applied to extract time-scale information from the fault signal. The time-scale representation of high impedance faults allows the fault waveform to be detected easily and localized correctly.
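
As an illustration of the fractal-dimension analysis mentioned above, a Grassberger-Procaccia correlation-dimension estimate; the embedding parameters and the Henon-map sanity check are assumptions for illustration, not the paper's data.

```python
import numpy as np

def correlation_dimension(x, dim=4, tau=2):
    """Grassberger-Procaccia estimate of the correlation dimension: delay-embed the
    series, compute the correlation sum C(r) over a range of radii, and fit the slope
    of log C(r) versus log r, a standard way of quantifying geometrical self-similarity."""
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)[np.triu_indices(n, k=1)]
    radii = np.logspace(np.log10(np.percentile(d, 1)), np.log10(np.percentile(d, 50)), 10)
    c = np.array([np.mean(d < r) for r in radii])       # correlation sum C(r)
    return np.polyfit(np.log(radii), np.log(c), 1)[0]

# Sanity check on the Henon map, whose correlation dimension is known to be roughly 1.2.
x, y, series = 0.1, 0.3, []
for _ in range(1500):
    x, y = 1.0 - 1.4 * x * x + y, 0.3 * x
    series.append(x)
print(round(correlation_dimension(np.array(series), dim=2, tau=1), 2))
```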


A Study on Extracting Chaotic Properties from High Impedance Faults in Power Systems

  • 고재호
    • 한국지능시스템학회논문지 / Vol. 9, No. 5 / pp.545-549 / 1999
  • Previous studies on high impedance faults assumed that the erratic behavior of the fault current would be random. In this paper, we prove that the nature of high impedance faults is indeed deterministic chaos, not random motion. Algorithms for estimating the Lyapunov spectrum and the largest Lyapunov exponent are applied to various fault currents in order to evaluate the orbital instability peculiar to deterministic chaos, and the fractal dimensions of the fault currents, which represent geometrical self-similarity, are calculated. In addition, qualitative analyses such as phase planes and Poincaré maps obtained from the fault currents indicate that the irregular behavior is described by a strange attractor.
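
A small sketch of the qualitative tools mentioned above, built on a synthetic distorted current rather than measured fault data: a delay-coordinate phase plane and a Poincaré section obtained by sampling once per fundamental period.

```python
import numpy as np

fs, f0 = 3840, 60                         # assumed sampling rate and fundamental frequency
t = np.arange(0, 1.0, 1.0 / fs)
# Synthetic stand-in for an erratic fault current: fundamental plus a slowly
# amplitude-modulated third harmonic (measured HIF data is not available here).
i_fault = (np.sin(2 * np.pi * f0 * t)
           + 0.3 * (1 + 0.5 * np.sin(2 * np.pi * 3.7 * t)) * np.sin(2 * np.pi * 180 * t + 0.8))

tau = fs // (4 * f0)                      # quarter-cycle delay
phase_plane = np.column_stack([i_fault[:-tau], i_fault[tau:]])   # (x(t), x(t+tau)) pairs

samples_per_cycle = fs // f0
poincare = i_fault[::samples_per_cycle]   # one sample per fundamental period
print(phase_plane.shape, np.round(poincare[:5], 3))
# For a purely periodic 60 Hz current all Poincare points coincide; the modulation
# spreads them out, which is how irregular fault behaviour shows up in this view.
```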


Virtual Machine Placement Methods using Metaheuristic Algorithms in a Cloud Environment - A Comprehensive Review

  • Alsadie, Deafallah
    • International Journal of Computer Science & Network Security / Vol. 22, No. 4 / pp.147-158 / 2022
  • Cloud computing offers flexible, on-demand, ubiquitous resources to cloud users. Cloud users are provided computing resources in a virtualized environment. In order to meet the growing demand for computing resources, data centres contain a large number of physical machines accommodating multiple virtual machines. However, cloud data centres cannot utilize their computing resources to their full capacity. Several policies have been proposed for improving energy efficiency and computing resource utilization in cloud data centres. Virtual machine placement is an effective method involving the efficient mapping of virtual machines to physical machines. However, the availability of many physical machines accommodating multiple virtual machines in a data centre has made the virtual machine placement problem a non-deterministic polynomial-time hard (NP-hard) problem. Metaheuristic algorithms have been widely used to solve NP-hard problems with multiple and conflicting objectives, such as the virtual machine placement problem. In this context, we present essential concepts regarding virtual machine placement and objective functions for optimizing different parameters. This paper provides a taxonomy of metaheuristic algorithms for the virtual machine placement method, followed by a review and comparison of prominent research on virtual machine placement methods using metaheuristic algorithms. Finally, the paper provides conclusions and future research directions for virtual machine placement in cloud computing.
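
As a minimal illustration of how one metaheuristic from this family can be applied to virtual machine placement, the sketch below uses simulated annealing with a CPU-only resource model and an ad-hoc cost function; all parameters are assumptions for illustration and do not come from the survey.

```python
import math
import random

random.seed(0)
vm_cpu = [2, 4, 8, 1, 6, 3, 2, 5]          # CPU demand of each virtual machine (assumed)
pm_cap = [10, 10, 10, 10]                  # CPU capacity of each physical machine (assumed)

def cost(placement):
    """Number of active physical machines plus a heavy penalty for overloaded ones."""
    load = [0] * len(pm_cap)
    for vm, pm in enumerate(placement):
        load[pm] += vm_cpu[vm]
    active = sum(1 for l in load if l > 0)
    overload = sum(max(0, l - c) for l, c in zip(load, pm_cap))
    return active + 100 * overload

placement = [random.randrange(len(pm_cap)) for _ in vm_cpu]   # random initial mapping
temperature = 5.0
while temperature > 0.01:
    neighbour = placement[:]
    neighbour[random.randrange(len(vm_cpu))] = random.randrange(len(pm_cap))  # move one VM
    delta = cost(neighbour) - cost(placement)
    if delta <= 0 or random.random() < math.exp(-delta / temperature):
        placement = neighbour              # accept improving and occasional worsening moves
    temperature *= 0.995                   # geometric cooling

print("placement:", placement, "cost:", cost(placement))
```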