• Title/Summary/Keyword: Simulated Annealing

Bayesian Nonlinear Blind Channel Equalizer based on Gaussian Weighted MFCM

  • Han, Soo-Whan;Park, Sung-Dae;Lee, Jong-Keuk
    • Journal of Korea Multimedia Society
    • /
    • v.11 no.12
    • /
    • pp.1625-1634
    • /
    • 2008
  • In this study, a modified Fuzzy C-Means algorithm with Gaussian weights (MFCM_GW) is presented for the problem of nonlinear blind channel equalization. The proposed algorithm searches for the optimal channel output states of a nonlinear channel based on received symbols. In contrast to the conventional Euclidean distance used in Fuzzy C-Means (FCM), this method exploits a Bayesian likelihood fitness function and a Gaussian-weighted partition matrix. In the search procedure, all possible sets of desired channel states are constructed by considering the combinations of estimated channel output states. The set of desired states characterized by the maximal value of the Bayesian fitness is selected and updated by using the Gaussian weights. After this procedure, the Bayesian equalizer with the final desired states is implemented to reconstruct the transmitted symbols. The performance of the proposed method is compared with those of a simplex genetic algorithm (GA), a hybrid genetic algorithm (GA merged with simulated annealing (SA): GASA), and a previously developed version of MFCM. In particular, a relatively high accuracy and a fast search speed have been observed.
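
The Bayesian likelihood fitness described above can be illustrated with a minimal sketch: under an assumed Gaussian noise model, each candidate set of channel output states is scored by the log-likelihood of the received symbols, and the highest-scoring set is kept. The function name, noise variance, and toy data below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def bayesian_fitness(received, candidate_states, noise_var):
    """Log-likelihood of the received symbols given one candidate set of
    channel output states, assuming additive Gaussian noise."""
    r = np.asarray(received, dtype=float)[:, None]          # shape (N, 1)
    s = np.asarray(candidate_states, dtype=float)[None, :]  # shape (1, M)
    # Mixture of Gaussian kernels centred at the candidate states
    kernels = np.exp(-(r - s) ** 2 / (2.0 * noise_var))
    per_symbol = kernels.sum(axis=1)
    return float(np.log(per_symbol + 1e-300).sum())

# Illustrative use: keep the candidate state set with the highest fitness.
rng = np.random.default_rng(0)
true_states = np.array([-1.5, -0.5, 0.5, 1.5])            # assumed channel outputs
received = rng.choice(true_states, size=200) + rng.normal(0.0, 0.2, size=200)
candidates = [true_states, np.array([-2.0, -1.0, 1.0, 2.0])]
best = max(candidates, key=lambda c: bayesian_fitness(received, c, noise_var=0.04))
print("selected state set:", best)
```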

Multi-Objective Pareto Optimization of Parallel Synthesis of Embedded Computer Systems

  • Drabowski, Mieczyslaw
    • International Journal of Computer Science & Network Security
    • /
    • v.21 no.3
    • /
    • pp.304-310
    • /
    • 2021
  • The paper presents problems of optimization of the synthesis of embedded systems, in particular Pareto optimization. The model of such a system, used for its design at a high level of abstraction, is based on the classic approach known from the theory of task scheduling, but it is significantly extended, among others, by the characteristics of tasks and resources as well as additional criteria for an optimal system in terms of structure and operation. The metaheuristic algorithm operating on this model introduces a new approach to system synthesis, in which parallelism of task scheduling and resource partitioning is applied. An algorithm based on a genetic approach with simulated annealing and Boltzmann tournaments avoids local minima and generates optimized solutions. Such a synthesis is based on the implementation of task scheduling, resource identification and partitioning, allocation of tasks and resources, and ultimately on the optimization of the designed system in accordance with the optimization criteria regarding cost of implementation, execution speed of processes, and energy consumption by the system during operation. This paper presents examples and results for multi-criteria optimization, based on calculations that specify non-dominated solutions and indicate a subset of Pareto solutions in the space of all solutions.
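
As a rough illustration of combining a genetic-style population search with simulated annealing, the sketch below grafts a Metropolis/Boltzmann acceptance test onto parent-versus-offspring tournaments under geometric cooling. The bit-string encoding and toy cost function are placeholders for the paper's task-scheduling and resource-partitioning model.

```python
import math
import random

def boltzmann_accept(old_cost, new_cost, temperature):
    """Metropolis/Boltzmann rule: always keep improvements, keep worse
    candidates with probability exp(-delta / T)."""
    if new_cost <= old_cost:
        return True
    return random.random() < math.exp(-(new_cost - old_cost) / temperature)

def hybrid_search(cost, population, generations=200, t0=2.0, cooling=0.97):
    """Population search where each parent meets a mutated child in a
    Boltzmann tournament; the temperature cools geometrically."""
    temperature = t0
    for _ in range(generations):
        next_population = []
        for parent in population:
            child = parent[:]
            child[random.randrange(len(child))] ^= 1       # single bit-flip mutation
            winner = child if boltzmann_accept(cost(parent), cost(child), temperature) else parent
            next_population.append(winner)
        population = next_population
        temperature *= cooling
    return min(population, key=cost)

# Toy cost (minimise the number of set bits) stands in for the synthesis criteria.
random.seed(1)
pop = [[random.randint(0, 1) for _ in range(16)] for _ in range(12)]
print(hybrid_search(lambda bits: sum(bits), pop))
```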

PESA: Prioritized experience replay for parallel hybrid evolutionary and swarm algorithms - Application to nuclear fuel

  • Radaideh, Majdi I.;Shirvan, Koroush
    • Nuclear Engineering and Technology
    • /
    • v.54 no.10
    • /
    • pp.3864-3877
    • /
    • 2022
  • We propose a new approach called PESA (Prioritized replay Evolutionary and Swarm Algorithms) combining the prioritized replay of reinforcement learning with hybrid evolutionary algorithms. PESA hybridizes different evolutionary and swarm algorithms such as particle swarm optimization, evolution strategies, simulated annealing, and differential evolution, with a modular approach to accommodate other algorithms. PESA hybridizes three algorithms by storing their solutions in a shared replay memory, then applying prioritized replay to redistribute data between the constituent algorithms at frequent intervals based on their fitness and priority values, which significantly enhances sample diversity and algorithm exploration. Additionally, greedy replay is used implicitly to improve PESA exploitation close to the end of evolution. PESA's balance of exploration and exploitation during the search, together with parallel computing, results in excellent, problem-agnostic performance over the wide range of experiments and problems presented in this work. PESA also shows very good scalability with the number of processors in solving an expensive problem of optimizing nuclear fuel in nuclear power plants. PESA's competitive performance and modularity over all experiments allow it to join the family of evolutionary algorithms as a new hybrid algorithm, unleashing the power of parallel computing for expensive optimization.
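
A minimal sketch of the shared, fitness-prioritized replay memory idea follows: each constituent algorithm pushes its solutions into a common buffer and later samples seeds from it, with better (lower-fitness) solutions drawn more often. The class and method names, the rank-based priority rule, and the capacity are illustrative assumptions rather than PESA's actual implementation.

```python
import random

class PrioritizedReplay:
    """Memory of (solution, fitness) pairs shared by several optimizers;
    lower fitness (minimisation) means higher sampling priority."""

    def __init__(self, capacity=1000, alpha=1.0):
        self.capacity = capacity
        self.alpha = alpha          # >1 sharpens the preference for good solutions
        self.items = []

    def add(self, solution, fitness):
        self.items.append((solution, fitness))
        self.items.sort(key=lambda item: item[1])   # best first
        del self.items[self.capacity:]              # drop the worst overflow

    def sample(self, k):
        """Rank-based prioritized sampling: the best entries are drawn most often."""
        n = len(self.items)
        weights = [(n - rank) ** self.alpha for rank in range(n)]
        return random.choices(self.items, weights=weights, k=min(k, n))

# Usage: each algorithm pushes its generation, then reseeds itself from the memory.
random.seed(0)
memory = PrioritizedReplay(capacity=50)
for x in range(100):
    memory.add(solution=[x], fitness=(x - 37) ** 2)
seeds = [solution for solution, fitness in memory.sample(5)]
print(seeds)
```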

Refinement of protein NMR structures using atomistic force field and implicit solvent model: Comparison of the accuracies of NMR structures with Rosetta refinement

  • Jee, Jun-Goo
    • Journal of the Korean Magnetic Resonance Society
    • /
    • v.26 no.1
    • /
    • pp.1-9
    • /
    • 2022
  • There are two distinct approaches to improving the quality of protein NMR structures during refinement: all-atom force fields and accumulated knowledge-assisted methods that include Rosetta. Mao et al. reported that, for 40 proteins, Rosetta increased the accuracies of their NMR-determined structures with respect to the X-ray crystal structures (Mao et al., J. Am. Chem. Soc. 136, 1893 (2014)). In this study, we calculated 32 of the structures studied by Mao et al. using an all-atom force field and an implicit solvent model, and we compared the results with those obtained from Rosetta. For a single protein, using only the experimental NOE-derived distances and backbone torsion angle restraints, the 20 lowest-energy structures were extracted as an ensemble from 100 generated structures. Restrained simulated annealing by molecular dynamics simulation searched conformational space over a total simulation length of 1 ns. The use of GPU-accelerated AMBER code allowed the calculations to be completed in hours on a single GPU computer, even for proteins larger than 20 kDa. Remarkably, statistical analyses indicated that the structures determined in this way showed overall higher accuracies with respect to their X-ray structures than those refined by Rosetta (p-value < 0.01). Our data demonstrate the capability of sophisticated atomistic force fields in refining NMR structures, particularly when they are coupled with the latest GPU-based calculations. The straightforwardness of the protocol allows its use to be extended to all NMR structures.
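
As a hedged sketch of the protocol's two quantitative ingredients, the snippet below defines a piecewise-linear heat-then-cool temperature profile over a 1 ns restrained-annealing run and selects the 20 lowest-energy models from 100 generated structures. The 1 ns length and the 20-of-100 selection come from the abstract; the peak temperature, schedule shape, and dummy energies are assumptions, and no AMBER input is reproduced here.

```python
import random

def sa_temperature(t_ps, total_ps=1000.0, t_peak=1000.0, heat_frac=0.2):
    """Piecewise-linear heat-then-cool profile for a restrained annealing run:
    ramp to t_peak during the first heat_frac of the run, then cool to 0 K.
    The 1 ns (1000 ps) total matches the abstract; the shape is an assumption."""
    heat_end = heat_frac * total_ps
    if t_ps <= heat_end:
        return t_peak * t_ps / heat_end
    return t_peak * (1.0 - (t_ps - heat_end) / (total_ps - heat_end))

def lowest_energy_ensemble(structures, energies, n_keep=20):
    """Keep the n_keep lowest-energy models, mirroring the 20-of-100 selection."""
    ranked = sorted(zip(energies, structures), key=lambda pair: pair[0])
    return [structure for _, structure in ranked[:n_keep]]

# Illustrative use with dummy model names and energies.
random.seed(0)
models = [f"model_{i:03d}" for i in range(100)]
energies = [random.gauss(-5000.0, 50.0) for _ in models]
ensemble = lowest_energy_ensemble(models, energies, n_keep=20)
print(sa_temperature(500.0), len(ensemble), ensemble[0])
```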

A Study on Optimal Placement of Underwater Target Position Tracking System considering Marine Environment (해양환경을 고려한 수중기동표적 위치추적체계 최적배치에 관한 연구)

  • Taehyeong Kim;Seongyong Kim;Minsu Han;Kyungjun Song
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.26 no.5
    • /
    • pp.400-408
    • /
    • 2023
  • The tracking accuracy of buoy-based LBL (Long Base Line) systems can be significantly influenced by sea environmental conditions, particularly by the positions of buoys that have drifted due to sea currents. It is therefore necessary to predict and optimize the drifted buoy positions at the deployment step. This research introduces a free-drift simulation model using ocean data from the European CMEMS. The simulation model's predictions are validated by comparing them to actual sea-buoy drift tracks, showing a substantial match in averaged drift speed and direction. Using this drift model, we optimize the initial buoy layout and compare its tracking performance with that of a center hexagonal layout and a close-track layout. Our results verify that the optimized layout achieves lower tracking errors than the other two layouts.
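
A minimal sketch of the free-drift idea, assuming a forward-Euler update of buoy positions with interpolated surface-current velocities: the current_at callback stands in for the CMEMS data interface, and the time step, duration, and toy current field are illustrative only.

```python
import math

def drift_track(lat0, lon0, current_at, hours=48.0, dt_s=600.0):
    """Forward-Euler free-drift prediction: advance a buoy with surface-current
    velocities (u eastward, v northward, in m/s).  current_at(lat, lon, t_s)
    is a placeholder for interpolated CMEMS current data."""
    earth_radius = 6_371_000.0                     # metres
    lat, lon = lat0, lon0
    track = [(lat, lon)]
    for step in range(int(hours * 3600.0 / dt_s)):
        u, v = current_at(lat, lon, step * dt_s)
        lat += math.degrees(v * dt_s / earth_radius)
        lon += math.degrees(u * dt_s / (earth_radius * math.cos(math.radians(lat))))
        track.append((lat, lon))
    return track

# Toy current field: a uniform 0.3 m/s eastward, 0.1 m/s northward drift.
track = drift_track(34.0, 128.0, lambda lat, lon, t_s: (0.3, 0.1))
print("predicted position after 48 h:", track[-1])
```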

Limiting conditions prediction using machine learning for loss of condenser vacuum event

  • Dong-Hun Shin;Moon-Ghu Park;Hae-Yong Jeong;Jae-Yong Lee;Jung-Uk Sohn;Do-Yeon Kim
    • Nuclear Engineering and Technology
    • /
    • v.55 no.12
    • /
    • pp.4607-4616
    • /
    • 2023
  • We implement machine learning regression models to predict the peak pressures of the primary and secondary systems, a major safety concern in a Loss Of Condenser Vacuum (LOCV) accident. We selected the Multi-dimensional Analysis of Reactor Safety-KINS standard (MARS-KS) code to analyze the LOCV accident, and the reference plant is the Korean Optimized Power Reactor 1000 MWe (OPR1000). eXtreme Gradient Boosting (XGBoost) is selected as the machine learning tool. The MARS-KS code is used to generate LOCV accident data, and the data are applied to train the machine learning model. Hyperparameter optimization is performed using simulated annealing. Randomly generated combinations of initial conditions within the operating range are fed into the XGBoost model to predict the peak pressure, and the initial conditions that produce the peak pressures are then run through MARS-KS to generate reference results. The error between the predicted value and the code output is calculated, and the uncertainty of the machine learning model is also quantified to verify model accuracy. The machine learning model presented in this paper successfully identifies a combination of initial conditions that produces a more conservative peak pressure than the values calculated with existing methodologies.
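
A hedged sketch of simulated-annealing hyperparameter search around an XGBoost regressor is shown below: candidate hyperparameter sets are perturbed, scored by validation RMSE, and accepted with the Metropolis rule under geometric cooling. The tuned parameters, their ranges, the cooling constants, and the synthetic data standing in for the MARS-KS samples are all assumptions, not the paper's settings.

```python
import math
import random

import numpy as np
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

def validation_rmse(params, X_tr, y_tr, X_va, y_va):
    """Train an XGBoost regressor with the candidate hyperparameters and
    return its validation RMSE."""
    model = XGBRegressor(max_depth=params["max_depth"],
                         learning_rate=params["learning_rate"],
                         n_estimators=params["n_estimators"],
                         verbosity=0)
    model.fit(X_tr, y_tr)
    return mean_squared_error(y_va, model.predict(X_va)) ** 0.5

def anneal_hyperparameters(X, y, iters=20, t0=1.0, cooling=0.9, seed=0):
    """Simulated annealing over a small hyperparameter space: perturb the
    current setting, keep it if better, or keep it anyway with Metropolis
    probability exp(-delta / T)."""
    rng = random.Random(seed)
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.2, random_state=seed)
    current = {"max_depth": 4, "learning_rate": 0.1, "n_estimators": 200}
    current_err = validation_rmse(current, X_tr, y_tr, X_va, y_va)
    temperature = t0
    for _ in range(iters):
        candidate = dict(current)
        candidate["max_depth"] = min(10, max(2, candidate["max_depth"] + rng.choice([-1, 1])))
        candidate["learning_rate"] = min(0.5, max(0.01, candidate["learning_rate"] * rng.uniform(0.7, 1.3)))
        candidate["n_estimators"] = min(800, max(50, candidate["n_estimators"] + rng.choice([-50, 50])))
        err = validation_rmse(candidate, X_tr, y_tr, X_va, y_va)
        if err < current_err or rng.random() < math.exp((current_err - err) / temperature):
            current, current_err = candidate, err
        temperature *= cooling
    return current, current_err

# Synthetic data standing in for MARS-KS initial-condition / peak-pressure samples.
rng = np.random.default_rng(0)
X = rng.random((500, 6))
y = X @ np.array([3.0, -2.0, 1.5, 0.0, 0.5, -1.0]) + 0.1 * rng.standard_normal(500)
print(anneal_hyperparameters(X, y, iters=10))
```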

Preparation of Nanocrystalline ZrO2 Film by Using a Zirconium Naphthenate and Evaluation of Calcium Phosphate Forming Ability (지르코늄 나프테네이트를 이용한 나노결정질 ZrO2 박막의 제조와 칼슘 포스페이트 형성 능력의 평가)

  • Oh, Jeong-Sun;Ahn, Jun-Hyung;Yun, Yeon-Hum;Kang, Bo-An;Kim, Sang-Bok;Hwang, Kyu-Seog;Shim, Yeon-A
    • Journal of the Korean Ceramic Society
    • /
    • v.39 no.9
    • /
    • pp.884-889
    • /
    • 2002
  • In order to investigate the calcium phosphate forming ability of a nanocrystalline ZrO2 film, we prepared a ZrO2/Si structure by chemical solution deposition using a zirconium naphthenate as the starting material. The precursor sol was spin-coated onto a (100) Si substrate and prefired at 500 °C for 10 min in air, followed by final annealing at 800 °C for 30 min in air. The crystallinity of the annealed film was examined by X-ray diffraction analysis. The surface morphology and surface roughness of the film were characterized by field emission scanning electron microscopy and atomic force microscopy. After annealing, nanocrystalline ZrO2 grains were obtained on the surface of the film, with a homogeneous interface between the film and the substrate. After immersion for 1 or 5 days in a simulated body fluid, the formation of calcium phosphate was observed on the ZrO2 film annealed at 800 °C by energy dispersive X-ray spectrometry. Fourier transform infrared spectroscopy revealed that carbonate was substituted into the calcium phosphate.

Multicriteria shape design of a sheet contour in stamping

  • Oujebbour, Fatima-Zahra;Habbal, Abderrahmane;Ellaia, Rachid;Zhao, Ziheng
    • Journal of Computational Design and Engineering
    • /
    • v.1 no.3
    • /
    • pp.187-193
    • /
    • 2014
  • One of the hottest challenges in the automotive industry is weight reduction in sheet metal forming processes, in order to produce a high-quality metal part at minimal material cost. Stamping is the most widely used sheet metal forming process, but its implementation comes with several fabrication flaws such as springback and failure. A global and simple approach to circumvent these unwanted process drawbacks consists in optimizing the initial blank shape with innovative methods. The aim of this paper is to introduce an efficient methodology to deal with complex, computationally expensive multicriteria optimization problems. Our approach is based on the combination of methods to capture the Pareto front, approximate criteria (to save computational cost), and global optimizers. To illustrate the efficiency, we consider the stamping of an industrial workpiece as a test case. Our approach is applied to the springback and failure criteria. To optimize these two criteria, a global optimization algorithm was chosen: the Simulated Annealing algorithm hybridized with Simultaneous Perturbation Stochastic Approximation (SPSA) to gain both speed and precision. The multicriteria problem amounts to capturing the Pareto front associated with the two criteria. The Normal Boundary Intersection and Normalized Normal Constraint methods are considered for generating a set of Pareto-optimal solutions with a uniform distribution of front points. The computational results are compared to those obtained with the well-known Non-dominated Sorting Genetic Algorithm II. The results show that the proposed approach is efficient in dealing with the multicriteria shape optimization of highly nonlinear mechanical systems.
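
A minimal sketch of hybridizing simulated annealing with SPSA, assuming a standard two-evaluation SPSA gradient estimate with a Rademacher perturbation: candidate moves follow the negative estimated gradient and are accepted with the Metropolis rule. The step sizes, cooling schedule, and the Rosenbrock test function standing in for the springback/failure criteria are illustrative choices, not the paper's.

```python
import math

import numpy as np

def spsa_gradient(f, x, c=0.1, rng=None):
    """Two-evaluation SPSA gradient estimate with a Rademacher (+/-1) perturbation."""
    rng = rng or np.random.default_rng()
    delta = rng.choice([-1.0, 1.0], size=x.shape)
    return (f(x + c * delta) - f(x - c * delta)) / (2.0 * c * delta)

def sa_spsa(f, x0, iters=2000, step=0.002, t0=1.0, cooling=0.999, seed=0):
    """Simulated annealing whose trial moves follow the negative SPSA gradient
    estimate; uphill moves are still accepted with the Metropolis rule."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx, temperature = f(x), t0
    for _ in range(iters):
        candidate = x - step * spsa_gradient(f, x, rng=rng)
        fc = f(candidate)
        if fc < fx or rng.random() < math.exp((fx - fc) / temperature):
            x, fx = candidate, fc
        temperature *= cooling
    return x, fx

# Rosenbrock test function standing in for a springback or failure criterion.
rosenbrock = lambda v: (1.0 - v[0]) ** 2 + 100.0 * (v[1] - v[0] ** 2) ** 2
print(sa_spsa(rosenbrock, [-1.0, 1.0]))
```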

Optimum Design of Steel Frames Using Genetic Algorithms (유전자 알고리즘을 이용한 강 뼈대 구조물의 최적설계)

  • 정영식;정석진
    • Journal of the Computational Structural Engineering Institute of Korea
    • /
    • v.13 no.3
    • /
    • pp.337-349
    • /
    • 2000
  • Genetic Algorithms (GA), together with simulated annealing, are often called methods of last resort since they can be applied to any kind of problem, particularly those for which no sophisticated procedures are applicable or feasible. The design of structures is primarily the process of selecting a section for each member from those available in the market, resulting in a problem of combinatorial nature. The design space therefore usually includes an astronomical number of designs, often making an exhaustive search of the space impossible. In this work, Genetic Algorithms and some related techniques are introduced and applied to the design of steel frameworks. In problems with a small number of design variables, the GA found true global optima. The GA also found true optima for the continuous-variable test problems and proved its applicability to structural optimization. For problems of real size, however, it appears to be difficult to expect the GA to find optimum or even near-optimum designs. The use of G-bit improvement added to the ordinary GA has shown much better results and merits further research.
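
To illustrate the combinatorial nature of the problem, the sketch below encodes each member's section as a bit-string index into a section table, runs a plain GA with one-point crossover and bit-flip mutation, and finishes with a G-bit-style pass that flips single bits of the best design whenever that lowers the objective. The section table, the toy weight-plus-penalty objective, and all GA settings are placeholders rather than the paper's frame model.

```python
import random

SECTION_AREAS = [10.3, 13.2, 14.4, 17.6, 19.1, 22.8, 26.5, 29.4]  # placeholder section table
BITS = 3                                                           # 2**3 = 8 sections per member

def decode(chrom, n_members):
    """Split the bit string into per-member indices into the section table."""
    return [int("".join(map(str, chrom[i * BITS:(i + 1) * BITS])), 2) for i in range(n_members)]

def weight(chrom, n_members):
    """Toy objective: total section area plus a crude penalty, standing in for
    member forces and code checks in a real frame design."""
    idx = decode(chrom, n_members)
    return sum(SECTION_AREAS[i] for i in idx) + 5.0 * sum(abs(i - 3) for i in idx)

def gbit_improvement(chrom, cost):
    """Flip each bit once, keeping any flip that lowers the cost."""
    best = chrom[:]
    for i in range(len(best)):
        trial = best[:]
        trial[i] ^= 1
        if cost(trial) < cost(best):
            best = trial
    return best

def ga(n_members=6, pop_size=30, generations=80, pm=0.02):
    cost = lambda c: weight(c, n_members)
    pop = [[random.randint(0, 1) for _ in range(n_members * BITS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        elite = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:]                            # one-point crossover
            child = [g ^ (random.random() < pm) for g in child]  # bit-flip mutation
            children.append(child)
        pop = elite + children
    best = gbit_improvement(min(pop, key=cost), cost)            # G-bit style polish
    return decode(best, n_members), cost(best)

random.seed(0)
print(ga())
```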

Development of a Design System for Multi-Stage Gear Drives (2nd Report: Development of a Generalized New Design Algorithm)

  • Chong, Tae-Hyong;Inho Bae
    • International Journal of Precision Engineering and Manufacturing
    • /
    • v.2 no.2
    • /
    • pp.65-72
    • /
    • 2001
  • The design of multi-stage gear drives is a time-consuming process, since it includes more complicated problems that are not considered in the design of single-stage gear drives. The designer has to determine the number of reduction stages and the gear ratio of each reduction stage. In addition, the design problems include not only the dimensional design but also the configuration design of the gear drive elements. There is no definite rule or principle for these types of design problems. Thus the design practice largely depends on the sense and experience of the designer, and consequently tends to result in undesirable design solutions. We propose a new generalized design algorithm to support the designer in the preliminary design phase of multi-stage gear drives. The proposed design algorithm automates the design process by integrating the dimensional design and the configuration design process. The algorithm consists of four steps. In the first step, the designer determines the number of reduction stages. In the second step, gear ratios are chosen by using a random search method. In the third step, the values of the basic design parameters are chosen by using the generate-and-test method, and the values of other dimensions, such as pitch diameter, outer diameter, and face width, are calculated for the configuration design in the final step. The strength and durability of a gear are guaranteed by the bending strength and pitting resistance rating practices using the AGMA rating formulas. In the final step, the configuration design is carried out by using the simulated annealing algorithm. The positions of gears and shafts are determined to minimize the geometrical volume (size) of a gearbox while satisfying the spatial constraints between them. These steps are carried out iteratively until a desirable solution is acquired. The proposed design algorithm has been applied to the preliminary design of four-stage gear drives in order to validate its applicability. The design solutions have shown considerably good results in both the dimensional and the configuration design.
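
The final configuration-design step can be sketched as follows, assuming gears are approximated by circles on shaft centres: simulated annealing perturbs the centre positions to shrink a bounding-box area proxy for gearbox size while penalizing overlap between gear outlines. The radii, penalty weight, cooling schedule, and two-dimensional simplification are illustrative assumptions, not the paper's spatial model.

```python
import math
import random

GEAR_RADII = [40.0, 65.0, 50.0, 80.0]      # assumed pitch radii [mm], one gear per shaft

def layout_cost(centres):
    """Bounding-box area proxy for gearbox size plus a penalty for overlapping
    gear circles (the spatial constraint)."""
    xs = [x for x, _ in centres]
    ys = [y for _, y in centres]
    margin = max(GEAR_RADII)
    box = (max(xs) - min(xs) + 2 * margin) * (max(ys) - min(ys) + 2 * margin)
    penalty = 0.0
    for i in range(len(centres)):
        for j in range(i + 1, len(centres)):
            overlap = GEAR_RADII[i] + GEAR_RADII[j] - math.dist(centres[i], centres[j])
            if overlap > 0.0:
                penalty += 1.0e4 * overlap
    return box + penalty

def anneal_layout(iters=20000, t0=500.0, cooling=0.9995, seed=1):
    """Simulated annealing over shaft-centre positions."""
    rng = random.Random(seed)
    centres = [(rng.uniform(0.0, 400.0), rng.uniform(0.0, 400.0)) for _ in GEAR_RADII]
    current_cost, temperature = layout_cost(centres), t0
    for _ in range(iters):
        k = rng.randrange(len(centres))
        trial = list(centres)
        trial[k] = (trial[k][0] + rng.gauss(0.0, 10.0), trial[k][1] + rng.gauss(0.0, 10.0))
        c = layout_cost(trial)
        if c < current_cost or rng.random() < math.exp((current_cost - c) / temperature):
            centres, current_cost = trial, c
        temperature *= cooling
    return centres, current_cost

layout, objective = anneal_layout()
print(round(objective, 1), [(round(x, 1), round(y, 1)) for x, y in layout])
```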
