• Title/Summary/Keyword: random set theory

Development of an Optimization Technique for Robust Design of Mechanical Structures (기계 구조의 강건 설계를 위한 최적화 기법의 개발)

  • Jeong, Do-Hyeon; Lee, Byeong-Chae
    • Transactions of the Korean Society of Mechanical Engineers A / v.24 no.1 s.173 / pp.215-224 / 2000
  • In order to reduce the variation effects of uncertainties in engineering environments, a new robust optimization method, which considers the uncertainties in the design process, is proposed. Both design variables and system parameters are treated as random variables about their nominal values. To ensure the robustness of the performance function, a new objective is set to minimize the variance of that function. Constraint variations are handled by introducing probability constraints, which are solved by the advanced first-order second-moment (AFOSM) method based on reliability theory. The proposed robust optimization method has the advantage that the second derivatives of the constraints are not required. The suggested method is examined by solving three examples, and the results are compared with those for the deterministic case and those available in the literature.
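
As a rough illustration of the kind of formulation this abstract describes, the sketch below builds a first-order (mean-value) approximation of the performance-function variance and a probability constraint expressed through a target reliability level. It is a simplified stand-in rather than the paper's AFOSM implementation, and the functions `f`, `g`, the nominal design, and the standard deviations are hypothetical.

```python
# Minimal sketch, assuming hypothetical f, g and normally distributed variables;
# a mean-value first-order approximation is used instead of the full AFOSM search.
import numpy as np
from scipy.stats import norm

def grad(fun, x, h=1e-6):
    """Forward-difference gradient (only first derivatives are needed)."""
    f0, out = fun(x), np.zeros_like(x)
    for i in range(len(x)):
        xp = x.copy(); xp[i] += h
        out[i] = (fun(xp) - f0) / h
    return out

def robust_objective(f, x, sigma):
    """First-order approximation of Var[f(X)] about the nominal design x."""
    return np.sum((grad(f, x) * sigma) ** 2)

def prob_constraint(g, x, sigma, reliability=0.999):
    """First-order reformulation of P(g(X) <= 0) >= reliability:
    feasible when mean(g) + beta_t * std(g) <= 0."""
    beta_t = norm.ppf(reliability)
    std_g = np.sqrt(np.sum((grad(g, x) * sigma) ** 2))
    return g(x) + beta_t * std_g

# Toy performance function, constraint, nominal design, and standard deviations
f = lambda x: (x[0] - 2.0) ** 2 + x[1] ** 2
g = lambda x: 1.0 - x[0] - x[1]
x_nom, sigma = np.array([1.5, 0.5]), np.array([0.1, 0.1])
print(robust_objective(f, x_nom, sigma), prob_constraint(g, x_nom, sigma))
```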

A Multi-Class Task Scheduling Strategy for Heterogeneous Distributed Computing Systems

  • El-Zoghdy, S.F.; Ghoneim, Ahmed
    • KSII Transactions on Internet and Information Systems (TIIS) / v.10 no.1 / pp.117-135 / 2016
  • Performance enhancement is one of the most important issues in high-performance distributed computing systems. In such computing systems, online users submit their jobs anytime and anywhere to a set of dynamic resources. Job arrivals and process execution times are stochastic. The performance of a distributed computing system can be improved by using an effective load balancing strategy to redistribute user tasks among computing resources for efficient utilization. This paper presents a multi-class load balancing strategy that balances different classes of user tasks on multiple heterogeneous computing nodes to minimize the per-class mean response time. For a wide range of system parameters, the performance of the proposed multi-class load balancing strategy is compared, using simulation, with that of the random-distribution and uniform-distribution load balancing strategies. The results show that the proposed strategy outperforms the other two studied strategies in terms of average task response time and average computing node utilization.
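
As a toy illustration of why capacity-aware splitting can lower mean response time on heterogeneous nodes, the sketch below compares a uniform split with a capacity-proportional split for a single task class, assuming each node behaves as an M/M/1 queue. The rates are hypothetical and the model is far simpler than the multi-class strategy in the paper.

```python
# Minimal sketch: M/M/1 nodes with hypothetical rates, one task class.
def mm1_response_time(arrival_rate, service_rate):
    """Mean response time of an M/M/1 queue; requires arrival_rate < service_rate."""
    assert arrival_rate < service_rate, "node overloaded"
    return 1.0 / (service_rate - arrival_rate)

def mean_response(splits, service_rates, total_rate):
    """Arrival-weighted mean response time over all nodes for given split fractions."""
    times = [mm1_response_time(f * total_rate, mu) for f, mu in zip(splits, service_rates)]
    return sum(f * t for f, t in zip(splits, times))

service_rates = [10.0, 5.0, 2.0]   # heterogeneous node speeds (tasks/s)
total_rate = 5.0                   # arrival rate of one task class (tasks/s)

uniform = [1 / 3] * 3
proportional = [mu / sum(service_rates) for mu in service_rates]

print("uniform split      :", round(mean_response(uniform, service_rates, total_rate), 3))
print("proportional split :", round(mean_response(proportional, service_rates, total_rate), 3))
```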

Numerical measures of Indicating Placement of Posets on Scale from Chains to Antichains

  • Bae, Kyoung-Yul
    • The Journal of Information Technology and Database / v.3 no.1 / pp.97-108 / 1996
  • In this paper we obtain several functions defined on finite partially ordered sets (posets) which may indicate constraints of comparability on sets of teams (tasks, etc.) and for which evaluation is computationally simple, a relatively rare condition in graph-based algorithms. Using these functions, a set of numerical coefficients and associated distributions is determined from a computer simulation of certain families of random graphs. From this information, estimates may be made as to the actual linearity of complicated posets. These ideas apply to all areas where obtaining rankings from partial information in rational ways is relevant, e.g., team, scaling, and scheduling theory, as well as theoretical computer science. Theoretical consideration of special and desirable properties of the various functions is provided, permitting judgment concerning the sensitivity of these functions to changes in the parameters describing (finite) posets.

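As one concrete example of a computationally simple numerical indicator of where a poset sits on the chain-to-antichain scale, the sketch below computes the fraction of comparable pairs from the transitive closure of the covering relation. This is an illustrative measure, not necessarily one of the functions studied in the paper, and the example poset is hypothetical.

```python
# Minimal sketch: fraction of comparable pairs, 1.0 for a chain, 0.0 for an antichain.
from itertools import combinations

def comparability_coefficient(elements, covers):
    """covers: set of (a, b) pairs meaning a < b is a covering relation."""
    less = {e: set() for e in elements}
    for a, b in covers:
        less[a].add(b)
    # Warshall-style transitive closure of the covering relation
    for k in elements:
        for a in elements:
            if k in less[a]:
                less[a] |= less[k]
    n = len(elements)
    comparable = sum(1 for a, b in combinations(elements, 2)
                     if b in less[a] or a in less[b])
    return comparable / (n * (n - 1) / 2)

# A 4-element "diamond" poset: 0 < 1 < 3 and 0 < 2 < 3, with 1 and 2 incomparable
print(comparability_coefficient([0, 1, 2, 3], {(0, 1), (0, 2), (1, 3), (2, 3)}))  # 5/6
```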

Comparative study of Probabilistic Load Flow and Fuzzy Load Flow (확률적 전력조류계산과 퍼지 전력조류계산과의 비교 연구)

  • Jung, Young-Soo; Shim, Jae-Hong; Kim, Jin-O
    • Proceedings of the KIEE Conference / 1997.07c / pp.1100-1102 / 1997
  • This paper presents a generalized multi-parameter distribution method for the convolution of linear combinations of random variables to calculate system load flow in a conventional probabilistic approach, and also presents a conceptual possibilistic approach using fuzzy set theory to manage uncertainties. The probability distribution function is transformed into an appropriate possibilistic representation as a compromise between transformation consistency and human updating experience. The IEEE 25-bus system is used to demonstrate the capability of the proposed algorithm.

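For context on moving between probabilistic and possibilistic representations, the sketch below applies one standard probability-to-possibility transformation (Dubois-Prade style) to a hypothetical discrete load distribution. The paper's actual transformation is a compromise chosen by the authors and may differ from this one.

```python
# Minimal sketch: Dubois-Prade style probability-to-possibility transformation
# for a hypothetical discrete load probability mass function.
def to_possibility(probs):
    """The most probable value gets possibility 1; each value gets the total
    probability of values that are no more probable than itself."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    pi = [0.0] * len(probs)
    tail = 1.0
    for idx in order:
        pi[idx] = tail
        tail -= probs[idx]
    return pi

load_pmf = [0.1, 0.6, 0.3]        # probabilities of three discrete load levels
print(to_possibility(load_pmf))    # -> [0.1, 1.0, 0.4]
```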

Stochastic Model for Unification of Stereo Vision and Image Restoration (스테레오 비젼 및 영상복원 과정의 통합을 위한 확률 모형)

  • Woo, Woon-Tak; Jeong, Hong
    • Journal of the Korean Institute of Telematics and Electronics B / v.29B no.9 / pp.37-49 / 1992
  • The standard definition of computational vision is a set of inverse problems of recovering surfaces from images; thus most early vision problems share the characteristic of being ill-posed. The main idea for solving ill-posed problems is to restrict the class of admissible solutions by introducing suitable a priori knowledge. Standard regularization methods lead to satisfactory solutions of early vision problems but cannot deal effectively and directly with a few general problems, such as discontinuities and the fusion of information from multiple modules. In this paper, we discuss the limitations of standard regularization theory and present a new stochastic method. We outline a rigorous approach, based on Bayesian estimation and an MRF (Markov random field) model, that overcomes part of the ill-posedness of the image restoration, edge detection, and stereo vision problems and deals with them effectively. This result gives hope that the framework could be useful in the solution of other vision problems.

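To make the MAP-with-MRF idea concrete, the sketch below restores a noisy 1-D signal by iterated conditional modes (ICM), greedily minimizing a quadratic data term plus an MRF smoothness prior. It is a toy stand-in with made-up data that omits line processes for discontinuities and the stereo/fusion aspects of the paper.

```python
# Minimal sketch: MAP restoration of a 1-D signal under an MRF smoothness prior,
#   E(f) = sum_i (f_i - d_i)^2 + lam * sum_i (f_i - f_{i+1})^2,
# minimized greedily by iterated conditional modes (ICM).
import numpy as np

def icm_restore(data, labels, lam=2.0, sweeps=10):
    f = data.copy()
    for _ in range(sweeps):
        for i in range(len(f)):
            best, best_e = f[i], np.inf
            for v in labels:                       # try each candidate label
                e = (v - data[i]) ** 2             # data fidelity term
                if i > 0:
                    e += lam * (v - f[i - 1]) ** 2  # smoothness with left neighbor
                if i < len(f) - 1:
                    e += lam * (v - f[i + 1]) ** 2  # smoothness with right neighbor
                if e < best_e:
                    best, best_e = v, e
            f[i] = best
    return f

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0, 0.0], 10)
noisy = clean + 0.3 * rng.standard_normal(clean.size)
print(icm_restore(noisy, labels=np.linspace(0, 1, 11)))
```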

Breast Density and Risk of Breast Cancer in Asian Women: A Meta-analysis of Observational Studies

  • Bae, Jong-Myon; Kim, Eun Hee
    • Journal of Preventive Medicine and Public Health / v.49 no.6 / pp.367-375 / 2016
  • Objectives: The established theory that breast density is an independent predictor of breast cancer risk is based on studies targeting white women in the West. More Asian women than Western women have dense breasts, but the incidence of breast cancer is lower among Asian women. This meta-analysis investigated the association between breast density in mammography and breast cancer risk in Asian women. Methods: PubMed and Scopus were searched, and the final date of publication was set as December 31, 2015. The effect size in each article was calculated using the interval-collapse method. Summary effect sizes (sESs) and 95% confidence intervals (CIs) were calculated by conducting a meta-analysis applying a random effect model. To investigate the dose-response relationship, random effect dose-response meta-regression (RE-DRMR) was conducted. Results: Six analytical epidemiology studies in total were selected, including one cohort study and five case-control studies. A total of 17 datasets were constructed by type of breast density index and menopausal status. In analyzing the subgroups of premenopausal vs. postmenopausal women, the percent density (PD) index was confirmed to be associated with a significantly elevated risk for breast cancer (sES, 2.21; 95% CI, 1.52 to 3.21; I^2 = 50.0%). The RE-DRMR results showed that the risk of breast cancer increased 1.73 times for each 25% increase in PD in postmenopausal women (95% CI, 1.20 to 2.47). Conclusions: In Asian women, breast cancer risk increased with breast density measured using the PD index, regardless of menopausal status. We propose the further development of a breast cancer risk prediction model based on the application of PD in Asian women.
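
For readers unfamiliar with the random-effects pooling used here, the sketch below shows a standard DerSimonian-Laird computation of a summary relative risk, its 95% CI, and I^2 from hypothetical study-level inputs. It does not reproduce the paper's interval-collapse or dose-response (RE-DRMR) steps.

```python
# Minimal sketch: DerSimonian-Laird random-effects pooling of hypothetical
# study-level relative risks (pooled on the log scale).
import numpy as np

def random_effects_pool(rr, se_log_rr):
    """Return the summary RR, its 95% CI, and the I^2 heterogeneity statistic."""
    y = np.log(rr)
    w = 1.0 / se_log_rr ** 2                      # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = 1.0 / (se_log_rr ** 2 + tau2)          # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    ci = np.exp([y_re - 1.96 * se_re, y_re + 1.96 * se_re])
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return np.exp(y_re), ci, i2

# Hypothetical per-study relative risks and standard errors of log(RR)
print(random_effects_pool(np.array([1.8, 2.5, 2.0]), np.array([0.25, 0.30, 0.20])))
```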

Fusion of Local and Global Detectors for PHD Filter-Based Multi-Object Tracking (검출기 융합에 기반을 둔 확률가정밀도 (PHD) 필터를 적용한 다중 객체 추적 방법)

  • Yoon, Ju Hong; Hwang, Youngbae; Choi, Byeongho; Yoon, Kuk-Jin
    • Journal of Institute of Control, Robotics and Systems / v.22 no.9 / pp.773-777 / 2016
  • In this paper, a novel multi-object tracking method to track an unknown number of objects is proposed. To handle multiple object states and uncertain observations efficiently, a probability hypothesis density (PHD) filter is adopted and modified. The PHD filter is capable of reducing false positives, managing object appearances and disappearances, and estimating the multiple object trajectories in a unified framework. Although the PHD filter is robust in cluttered environments, it is vulnerable to false negatives. For this reason, we propose to exploit local observations in the random finite set (RFS) of the observation model. Each local observation is generated by an online-trained object detector, and its main purpose is to deal with false negatives in the PHD filtering procedure. The experimental results demonstrated that the proposed method robustly tracked multiple objects in practical situations.
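
The sketch below illustrates the core of a single Gaussian-mixture PHD measurement update in which detections from a global detector and an online-trained local detector are merged into one observation set. The state is taken to be the observed 2-D position, all parameters and detections are hypothetical, and pruning/merging and the prediction step are omitted; it is not the authors' tracker.

```python
# Minimal sketch: one GM-PHD measurement update with a fused detection set.
import numpy as np

def phd_update(weights, means, covs, measurements, p_d=0.9, clutter=1e-3, R=None):
    """State = observed 2-D position, so the observation matrix is the identity."""
    R = 4.0 * np.eye(2) if R is None else R
    # Missed-detection terms keep the predicted components with reduced weight
    out_w = [(1 - p_d) * w for w in weights]
    out_m, out_P = list(means), list(covs)
    for z in measurements:
        qs, ms, Ps = [], [], []
        for w, m, P in zip(weights, means, covs):
            S = P + R                              # innovation covariance
            K = P @ np.linalg.inv(S)               # Kalman gain
            d = z - m
            q = np.exp(-0.5 * d @ np.linalg.solve(S, d)) / np.sqrt(
                (2 * np.pi) ** 2 * np.linalg.det(S))
            qs.append(w * q)
            ms.append(m + K @ d)
            Ps.append((np.eye(2) - K) @ P)
        denom = clutter + p_d * sum(qs)
        out_w += [p_d * q / denom for q in qs]
        out_m += ms
        out_P += Ps
    return out_w, out_m, out_P

# Fuse global and local detections into one measurement set; the local
# detection recovers an object the global detector missed (a false negative).
global_dets = [np.array([10.3, 11.8])]
local_dets = [np.array([40.5, 8.2])]
w, m, P = phd_update([0.5, 0.5],
                     [np.array([10.0, 11.0]), np.array([40.0, 8.0])],
                     [2.0 * np.eye(2), 2.0 * np.eye(2)],
                     global_dets + local_dets)
print([round(x, 3) for x in w])
```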

Design of comprehensive mechanical properties by machine learning and high-throughput optimization algorithm in RAFM steels

  • Wang, Chenchong; Shen, Chunguang; Huo, Xiaojie; Zhang, Chi; Xu, Wei
    • Nuclear Engineering and Technology / v.52 no.5 / pp.1008-1012 / 2020
  • In order to make a reasonable design for the improvement of the comprehensive mechanical properties of RAFM steels, a design system combining machine learning and a high-throughput optimization algorithm was established. As the basis of the design system, a dataset of RAFM steels was compiled from the previous literature. Then, feature-engineering-guided random forest regressors were trained on the dataset, and the NSGA-II algorithm was used to select the optimal solutions from a large-scale solution set with nine composition features and two treatment processing features. The optimal solutions selected by this design system showed promising mechanical properties, which were also consistent with physical metallurgy theory. This efficient design approach could offer guidance for the design of other structural metal materials with multi-property requirements.
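
As a schematic of the surrogate-model-plus-optimization loop described above, the sketch below trains random forest regressors on synthetic composition/processing data for two properties and screens a large candidate set with a simple non-dominated filter. The data are fabricated for illustration, and the plain Pareto filter stands in for the paper's NSGA-II.

```python
# Minimal sketch: random forest surrogates plus non-dominated screening
# over hypothetical composition/processing candidates.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 11))            # 9 composition + 2 processing features
strength = 500 + 300 * X[:, 0] - 100 * X[:, 9] + 10 * rng.standard_normal(200)
ductility = 20 - 10 * X[:, 0] + 5 * X[:, 10] + rng.standard_normal(200)

rf_strength = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, strength)
rf_ductility = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, ductility)

candidates = rng.uniform(0, 1, size=(5000, 11))  # high-throughput candidate solutions
pred = np.column_stack([rf_strength.predict(candidates), rf_ductility.predict(candidates)])

# Keep candidates not dominated in (strength, ductility); both are maximized
pareto = [i for i, p in enumerate(pred)
          if not np.any(np.all(pred >= p, axis=1) & np.any(pred > p, axis=1))]
print(f"{len(pareto)} Pareto-optimal candidates out of {len(candidates)}")
```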

Theoretical background discussion on variable polarity arc welding of aluminum (가변 극성 알루미늄 아크 용접의 이론적 배경 고찰)

  • Cho, Jungho; Lee, Jungjae; Bae, Seunghwan; Lee, Yongki; Park, Kyungbae; Kim, Yongjun; Lee, Junkyung
    • Journal of Welding and Joining / v.33 no.2 / pp.14-17 / 2015
  • The cleaning effect is a well-known mechanism of oxide layer removal in DCEP polarity. It is also known that DCEN has higher heat input efficiency than DCEP in the GTAW process. Based on these two renowned arc theories, the conventional variable polarity arc for aluminum welding was set up with the minimum DCEP and maximum DCEN duty ratio to achieve the highest heat input efficiency and improved weldability. However, several recent variable polarity GTA research papers reported the unexpected result of a proportional relationship between DCEP duty ratio and heat input. The authors also observed the same result and then suggested a combination of the tunneling effect and the random walk of the cathode spot to bridge the gap between experiment and conventional arc theory. In this research, the suggested combination of the tunneling effect and rapid cathode spot movement is applied to another unexpected phenomenon in variable polarity aluminum arc welding. Previous research reported that a wider oxide removal range, a narrower bead width, and a shallower penetration depth are observed in aluminum with a thin oxide layer compared to the case of a thick oxide layer. This result was reported for the first time, and it was hard to explain its cause at that time; therefore, the inference offered by the authors was hardly acceptable. However, the suggested combinational theory successfully explains the result of the previous report in a logical way.

A Review on nuclear magnetic resonance logging: fundamental theory and measurements (자기공명검층: 기본 이론 및 자료 측정)

  • Jang, Jae Hwa; Nam, Myung Jin
    • Geophysics and Geophysical Exploration / v.15 no.4 / pp.235-244 / 2012
  • Nuclear magnetic resonance (NMR) logging has been considered one of the most complicated, yet one of the most powerful, logging methods for the characterization of both rocks and natural fluids in the formation. NMR measures magnetization signals (polarization and relaxation) arising from the interaction between the magnetic moment of the hydrogen nucleus and applied magnetic fields. The measured data set contains two important petrophysical properties: the density of hydrogen in the fluids inside the pore space and the distinct decay rate for each fluid type. Therefore, after proper data processing, key petrophysical information, covering not only the quantities and properties of the fluids but also the characterization of the rock as a porous medium, can be obtained. Based on this information, several ongoing research efforts aim to estimate aspects of reservoir productivity, permeability, and wettability, since this information is key to correct interpretation. This study first goes through the basic theory of NMR, and then reviews NMR logging tools as well as their technical characteristics. This paper also briefly discusses the basics of an NMR simulation algorithm using random walks.
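
To give a flavor of the random walk simulation approach mentioned at the end of the abstract, the sketch below lets walkers diffuse inside a spherical pore and relax ("die") with a small probability at each wall collision, so the surviving fraction mimics surface-relaxation-driven magnetization decay. All parameters are toy values, not those of a real logging tool or of the paper's simulator.

```python
# Minimal sketch: random-walk simulation of surface relaxation in a spherical pore.
import numpy as np

rng = np.random.default_rng(1)
n_walkers, n_steps = 5000, 2000
radius, step = 1.0, 0.02            # pore radius and walk step (arbitrary units)
kill_prob = 0.05                    # relaxation probability per wall hit

pos = np.zeros((n_walkers, 3))
alive = np.ones(n_walkers, dtype=bool)
decay = []
for _ in range(n_steps):
    move = rng.standard_normal((n_walkers, 3))
    move *= step / np.linalg.norm(move, axis=1, keepdims=True)
    trial = pos + move
    hit = np.linalg.norm(trial, axis=1) > radius
    # Walkers hitting the wall relax with probability kill_prob, otherwise stay put
    killed = hit & alive & (rng.random(n_walkers) < kill_prob)
    alive &= ~killed
    pos[~hit] = trial[~hit]
    decay.append(alive.mean())
print("surviving magnetization fraction after", n_steps, "steps:", decay[-1])
```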