• Title/Summary/Keyword: stochastic approach

Kalman filter modeling for the estimation of tropospheric and ionospheric delays from the GPS network (망기반 대류 및 전리층 지연 추출을 위한 칼만필터 모델링)

  • Hong, Chang-Ki
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.30 no.6_1 / pp.575-581 / 2012
  • In general, various modeling and estimation techniques have been proposed to extract the tropospheric and ionospheric delays from a GPS CORS network. In this study, a Kalman filter approach is adopted to estimate the tropospheric and ionospheric delays, and appropriate models for the state vector and the variance-covariance matrix of the process noise are formulated. The coordinates of the reference stations and the zenith wet delays are estimated under a random-walk stochastic process assumption, while a first-order Gauss-Markov stochastic process is applied to model the ionospheric effects. To evaluate the proposed modeling technique, the Kalman filter algorithm is implemented and a numerical test is performed with CORS data. The results show that the atmospheric effects can be estimated successfully and, consequently, can be used for the generation of VRS data.
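
A minimal sketch of the kind of filter described above, assuming hypothetical noise parameters and a toy one-dimensional observation: the zenith wet delay is modeled as a random walk and the ionospheric delay as a first-order Gauss-Markov process; the station coordinates are omitted for brevity.

```python
# Minimal sketch (not the paper's implementation): Kalman filter with a random-walk
# zenith wet delay (ZWD) state and a first-order Gauss-Markov (GM) ionospheric state.
# All numbers (spectral densities, correlation time, measurement noise) are hypothetical.
import numpy as np

dt = 30.0                          # epoch interval [s] (assumed)
q_zwd = 1e-8                       # random-walk spectral density for ZWD [m^2/s] (assumed)
sigma_ion, tau_ion = 0.5, 1800.0   # GM steady-state sigma [m] and correlation time [s] (assumed)

phi_ion = np.exp(-dt / tau_ion)

# State x = [ZWD, ionospheric delay]
F = np.diag([1.0, phi_ion])                       # transition matrix
Q = np.diag([q_zwd * dt,                          # random-walk process noise
             sigma_ion**2 * (1.0 - phi_ion**2)])  # Gauss-Markov process noise

H = np.array([[1.0, 1.0]])     # toy observation: sum of both delays
R = np.array([[0.01**2]])      # measurement noise [m^2] (assumed)

x = np.zeros((2, 1))
P = np.diag([0.1**2, 1.0**2])

def kalman_step(x, P, z):
    """One predict/update cycle."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    y = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    return x_pred + K @ y, (np.eye(2) - K @ H) @ P_pred

for z in [0.35, 0.37, 0.36]:                 # toy measurements [m]
    x, P = kalman_step(x, P, np.array([[z]]))
print(x.ravel())
```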

A Ship-Valuation Model Based on Monte Carlo Simulation (몬테카를로 시뮬레이션방법을 이용한 선박가치 평가)

  • Choi, Jung-Suk;Lee, Ki-Hwan;Nam, Jong-Sik
    • Journal of Korea Port Economic Association / v.31 no.3 / pp.1-14 / 2015
  • This study uses Monte Carlo simulation to forecast the time charter rate of vessels, the three-month LIBOR interest rate, and the ship demolition price, in order to mitigate the future uncertainty in these factors. The simulation was run 10,000 times to obtain stable results. For the empirical analysis, based on the case of ships ordered in 2010, the Monte Carlo simulation-based stochastic discounted cash flow (DCF) method was compared with the traditional DCF method. The analysis revealed that the net present value obtained through Monte Carlo simulation was lower than that obtained via the conventional DCF method, alerting owners to the risks and discouraging injudicious ship orders. This research has implications for reducing the uncertainties that future shipping markets face, through the use of a stochastic DCF approach with relevant variables and probability methods.
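
A minimal sketch of a Monte Carlo-based stochastic DCF valuation; the distributions, rates, and prices below are purely illustrative and not taken from the study.

```python
# Minimal sketch of a stochastic DCF valuation via Monte Carlo (not the paper's model):
# charter income and scrap value are drawn from assumed distributions and discounted
# over the holding period. All figures are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_sims, years = 10_000, 10
discount_rate = 0.08                       # assumed annual discount rate

# Assumed annual distributions (USD millions)
charter_income = rng.normal(loc=5.0, scale=1.5, size=(n_sims, years))
operating_cost = 2.0
scrap_value = rng.normal(loc=8.0, scale=2.0, size=n_sims)
ship_price = 30.0                          # assumed newbuilding price

t = np.arange(1, years + 1)
disc = (1.0 + discount_rate) ** -t
cash_flows = charter_income - operating_cost
npv = (cash_flows * disc).sum(axis=1) + scrap_value * disc[-1] - ship_price

print(f"mean NPV = {npv.mean():.2f}, P(NPV < 0) = {(npv < 0).mean():.2%}")
```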

Layered-earth Resistivity Inversion of Small-loop Electromagnetic Survey Data using Particle Swarm Optimization (입자 군집 최적화법을 이용한 소형루프 전자탐사 자료의 층서구조 전기비저항 역해석)

  • Jang, Hangilro
    • Geophysics and Geophysical Exploration / v.22 no.4 / pp.186-194 / 2019
  • Deterministic optimization, commonly used to obtain geophysical inverse solutions, has the limitation that it may fail to find a proper solution because it can converge to a local minimum. One remedy is global optimization based on a stochastic approach, among which particle swarm optimization (PSO) has seen a large number of applications. In this paper, I developed a geophysical inversion algorithm applying the PSO method to the layered-earth resistivity inversion of small-loop electromagnetic (EM) survey data and carried out numerical inversion experiments on synthetic datasets. The results confirm that the PSO inversion algorithm can increase the inversion success rate even for small-loop EM survey data for which the Gauss-Newton inversion algorithm has difficulty finding a good solution.
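
A minimal PSO sketch for a misfit-minimization inverse problem; the forward model is a toy placeholder rather than a layered-earth small-loop EM response.

```python
# Minimal particle swarm optimization (PSO) sketch for an inverse problem (not the
# paper's small-loop EM code). The forward operator is a toy placeholder; a real
# implementation would compute layered-earth EM responses.
import numpy as np

rng = np.random.default_rng(1)

def forward(model):
    # Placeholder forward operator: replace with a layered-earth EM response.
    return np.array([model.sum(), (model**2).sum()])

d_obs = forward(np.array([100.0, 10.0]))       # synthetic "observed" data

def misfit(model):
    return np.linalg.norm(forward(model) - d_obs)

n_particles, n_iter, dim = 30, 200, 2
lo, hi = 1.0, 1000.0                            # model bounds (assumed)
w, c1, c2 = 0.7, 1.5, 1.5                       # inertia and acceleration coefficients

x = rng.uniform(lo, hi, (n_particles, dim))     # particle positions (candidate models)
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([misfit(m) for m in x])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = np.array([misfit(m) for m in x])
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()

print("recovered model:", gbest)
```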

Opportunistic Scheduling Schemes for Elastic Services in OFDMA Systems (OFDMA 시스템에서 Elastic 서비스를 위한 Opportunistic 스케줄링 기법)

  • Kwon, Jeong-Ahn;Lee, Jang-Won
    • The Journal of Korean Institute of Communications and Information Sciences / v.34 no.1A / pp.76-83 / 2009
  • In this paper, we provide opportunistic scheduling schemes for elastic services in OFDMA systems with fairness constraints for each user. We adopt the network utility maximization framework, in which a utility function is defined for each user to represent its level of satisfaction with the service. Since we consider elastic services, whose degree of satisfaction depends on the average data rate, we define the utility function of each user as a function of its average data rate. In addition, for fair resource allocation among users, we define the fairness requirements of each user by using utility functions. We first formulate, for each fairness requirement, an optimization problem that aims at maximizing the network utility, defined as the sum of the users' utilities. We then develop an opportunistic scheduling scheme for each fairness requirement by solving the problem using a dual approach and a stochastic subgradient algorithm.
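
An illustrative sketch of the scheduling idea, not the paper's exact formulation: a minimum-average-rate fairness constraint is priced by a dual variable updated with a stochastic subgradient step, and each slot is given to the user with the largest weighted instantaneous rate.

```python
# Illustrative sketch (not the paper's formulation): opportunistic scheduling with a
# minimum-average-rate constraint handled by a dual variable updated via a stochastic
# subgradient step. Channel rates and all parameters are assumed.
import numpy as np

rng = np.random.default_rng(2)
n_users, n_slots = 4, 20_000
r_min = np.array([0.5, 0.5, 0.8, 0.3])      # per-user minimum average rates (assumed)
lam = np.zeros(n_users)                     # dual prices of the rate constraints
avg_rate = np.zeros(n_users)
step = 0.01                                 # subgradient step size (assumed)

for t in range(1, n_slots + 1):
    rates = rng.exponential(1.0, n_users)   # instantaneous achievable rates (toy fading)
    # Per-slot rule: serve the user with the largest weighted rate, where the weight
    # combines the utility gradient (log utility -> 1/avg_rate) and the dual price.
    weights = 1.0 / np.maximum(avg_rate, 1e-3) + lam
    k = np.argmax(weights * rates)
    served = np.zeros(n_users)
    served[k] = rates[k]
    avg_rate += (served - avg_rate) / t     # running average of service rates
    # Stochastic subgradient update of the dual variables, projected onto >= 0:
    # a user served below its minimum rate gets a larger price (higher priority).
    lam = np.maximum(0.0, lam - step * (served - r_min))

print("average rates:", np.round(avg_rate, 3))
```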

A Study on Regional Differences in Healthcare in Korea: Using Position Value for Relative Comparison Index (한국 지역 간 보건의료수준의 상대적 위치 비교 연구: Position Value for Relative Comparison Index를 활용하여)

  • Youn, Hin-Moi;Yun, Choa;Kang, Soo Hyun;Kwon, Junhyun;Lee, Hyeon Ji;Park, Eun-Cheol;Jang, Sung-In
    • Health Policy and Management / v.31 no.4 / pp.491-507 / 2021
  • Background: This study aims to measure regional healthcare differences in Korea and to identify relatively underserved areas. Methods: We employed the position value for relative comparison (PARC) index to measure the healthcare status of 250 areas using 137 indicators in the following five domains: healthcare demand, supply, accessibility, service utilization, and outcome. We performed a sensitivity analysis using t-SNE (t-distributed stochastic neighbor embedding). Results: Based on PARC values, 83 areas were identified as relatively underserved, of which 49 were categorized as moderate and 34 as severe. The provincial regions with the most underserved areas were Gyeongbuk (16 areas), Gangwon (13), Jeonnam (13), and Gyeongnam (12). Conclusion: This study suggests a relative comparison approach for identifying relatively underserved areas in healthcare. Further studies incorporating various perspectives and methods are required to draw policy implications.
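
A minimal sketch of a PARC-style index, assuming the common definition that places a value on a [-1, 1] scale relative to the group's minimum, median, and maximum (0 = median); consult the cited methodology for the exact formula used in the study.

```python
# Minimal sketch of a PARC-style index (assumed definition, not necessarily the
# study's exact formula): values above the median are scaled by (max - median),
# values below by (median - min), giving a score in [-1, 1] with 0 at the median.
import numpy as np

def parc(values):
    values = np.asarray(values, dtype=float)
    med, lo, hi = np.median(values), values.min(), values.max()
    upper = (values - med) / (hi - med) if hi > med else np.zeros_like(values)
    lower = (values - med) / (med - lo) if med > lo else np.zeros_like(values)
    return np.where(values >= med, upper, lower)

# Toy example: one indicator measured for five regions.
print(np.round(parc([10, 25, 40, 55, 90]), 3))   # -> [-1., -0.5, 0., 0.3, 1.]
```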

Numerical evaluation of gamma radiation monitoring

  • Rezaei, Mohsen;Ashoor, Mansour;Sarkhosh, Leila
    • Nuclear Engineering and Technology / v.51 no.3 / pp.807-817 / 2019
  • Airborne gamma-ray spectrometry (AGRS), with important applications such as gathering radiation information on the ground surface, geochemical measurement of the abundance of potassium, thorium, and uranium in the outer earth layer, and environmental and nuclear site surveillance, plays a key role in nuclear science and human life. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) method, an advanced numerical technique for unconstrained nonlinear optimization, used in combination with artificial neural networks (ANNs), provides a noteworthy opportunity for modern AGRS. In this study a new AGRS system empowered by ANN-BFGS is proposed and evaluated on available empirical AGRS data. To that end, different architectures of adaptive ANN-BFGS were implemented for a set of published experimental AGRS outputs. Among various training methods, the selected approach, with its low per-iteration cost and non-diagonal scaling allocation, is a powerful algorithm for AGRS data owing to their inherent stochastic properties. Experiments were performed with different architectures and training schemes; the selected scheme achieved the smallest number of epochs, the minimum mean square error (MSE), and the best performance compared with other optimization strategies and algorithms. The proposed method can be implemented on cost-effective, minimal electronic equipment for real-time processing, which will allow it to be used on board a light unmanned aerial vehicle (UAV). The advanced adaptation properties and models of the neural network, the training of the stochastic process, and its implementation on a DSP make for an affordable, reliable, and low-cost AGRS design. The main outcome of the study shows that this method improves the quality of the curvature information of AGRS data while reducing the per-iteration cost of the algorithm, so the proposed ANN-BFGS is a trustworthy and appropriate model for gamma-ray data reconstruction and analysis based on advanced artificial intelligence systems.
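
A minimal sketch of the ANN-BFGS pairing, assuming a tiny feed-forward network fitted with SciPy's BFGS optimizer on a toy one-dimensional regression; the architecture, data, and hyperparameters are illustrative and not the paper's system.

```python
# Minimal sketch (not the paper's system): a small feed-forward network whose weights
# are trained with the BFGS quasi-Newton optimizer from SciPy on a toy regression task,
# to illustrate the ANN-BFGS pairing.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
X = np.linspace(-1, 1, 60).reshape(-1, 1)
y = np.sin(3 * X).ravel() + 0.05 * rng.standard_normal(60)   # toy noisy signal

n_hidden = 8

def unpack(theta):
    i = 0
    W1 = theta[i:i + n_hidden].reshape(1, n_hidden); i += n_hidden
    b1 = theta[i:i + n_hidden];                      i += n_hidden
    W2 = theta[i:i + n_hidden].reshape(n_hidden, 1); i += n_hidden
    return W1, b1, W2, theta[i]

def predict(theta, X):
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1 + b1)                          # single hidden layer
    return (h @ W2).ravel() + b2

def mse(theta):
    return np.mean((predict(theta, X) - y) ** 2)

theta0 = 0.1 * rng.standard_normal(3 * n_hidden + 1)
res = minimize(mse, theta0, method="BFGS")            # quasi-Newton training
print("final MSE:", res.fun)
```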

High-velocity ballistics of twisted bilayer graphene under stochastic disorder

  • Gupta, K.K.;Mukhopadhyay, T.;Roy, L.;Dey, S.
    • Advances in nano research / v.12 no.5 / pp.529-547 / 2022
  • Graphene is one of the strongest, stiffest, and lightest nanoscale materials known to date, making it a potentially viable and attractive candidate for developing lightweight structural composites to resist high-velocity ballistic impact, as commonly encountered in the defense and space sectors. In-plane twist in bilayer graphene has recently revealed unprecedented electronic properties such as superconductivity, which has started to attract attention to other multi-physical properties of such twisted structures. For example, the latest studies show that twisting can enhance the strength and stiffness of graphene manyfold, which in turn creates a strong rationale for its prospective exploitation in high-velocity impact. The present article investigates the ballistic performance of twisted bilayer graphene (tBLG) nanostructures. We have employed molecular dynamics (MD) simulations, augmented further by coupling Gaussian process-based machine learning, for the nanoscale characterization of various tBLG structures with varying relative rotation angle (RRA). Spherical diamond impactors (with a diameter of 25 Å) are launched with high initial velocities (Vi) in the range of 1 km/s to 6.5 km/s to observe the ballistic performance of the tBLG nanostructures. The specific penetration energy (Ep*) of the impacted nanostructures and the residual velocity (Vr) of the impactor are considered as the quantities of interest, wherein the effect of stochastic system parameters is computationally captured based on an efficient Gaussian process regression (GPR) based Monte Carlo simulation approach. A data-driven sensitivity analysis is carried out to quantify the relative importance of different critical system parameters. As an integral part of this study, we have deterministically investigated the resonant behaviour of graphene nanostructures, wherein the high-velocity impact is used as the initial actuation mechanism. The comprehensive dynamic investigation of bilayer graphene under ballistic impact presented in this paper, including the effects of twisting and random disorder, would lead to the development of improved impact-resistant lightweight materials.
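
A minimal sketch of a GPR-assisted Monte Carlo workflow of the kind described above; the expensive simulation is replaced by a toy function of impact velocity and twist angle, so all numbers and ranges are illustrative.

```python
# Minimal sketch of a GPR-based Monte Carlo approach (not the paper's MD pipeline):
# a Gaussian process surrogate is fitted to a handful of expensive-simulation samples,
# then cheap Monte Carlo sampling on the surrogate propagates input randomness.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(4)

def expensive_simulation(v, theta):
    # Placeholder for an MD run: residual velocity as a toy function of
    # impact velocity v [km/s] and twist angle theta [deg].
    return np.maximum(0.0, v - 0.8 - 0.02 * theta) + 0.05 * rng.standard_normal(np.shape(v))

# Small design of experiments (the "expensive" samples).
V = rng.uniform(1.0, 6.5, 40)
T = rng.uniform(0.0, 30.0, 40)
X_train = np.column_stack([V, T])
y_train = expensive_simulation(V, T)

gpr = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([1.0, 5.0]),
                               normalize_y=True).fit(X_train, y_train)

# Monte Carlo on the cheap surrogate: random impact velocity and twist angle.
n_mc = 20_000
X_mc = np.column_stack([rng.uniform(1.0, 6.5, n_mc), rng.uniform(0.0, 30.0, n_mc)])
vr = gpr.predict(X_mc)
print(f"mean residual velocity = {vr.mean():.3f}, std = {vr.std():.3f}")
```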

A Hybrid Multi-Level Feature Selection Framework for prediction of Chronic Disease

  • G.S. Raghavendra;Shanthi Mahesh;M.V.P. Chandrasekhara Rao
    • International Journal of Computer Science & Network Security / v.23 no.12 / pp.101-106 / 2023
  • Chronic illnesses are among the most common serious problems affecting human health. Early diagnosis of chronic diseases can help avoid or mitigate their consequences, potentially decreasing mortality rates. Using machine learning algorithms to identify risk factors is a promising strategy. The issue with existing feature selection approaches is that each method provides a distinct set of properties that affect model correctness, and current methods cannot perform well on huge multidimensional datasets. We introduce a novel model containing a feature selection approach that selects optimal characteristics from big multidimensional datasets to provide reliable predictions of chronic illnesses without sacrificing data uniqueness. To ensure the success of our proposed model, we balanced the classes by applying hybrid class-sampling methods to the original dataset, along with data pre-processing and data transformation methods, to provide credible data for the training model. We ran and assessed our model on datasets with binary and multi-valued classifications, using multiple datasets (Parkinson's, arrhythmia, breast cancer, kidney, diabetes). Suitable features are selected using a hybrid feature model consisting of LassoCV, decision tree, random forest, gradient boosting, AdaBoost, and stochastic gradient descent, followed by voting on the attributes commonly selected by these methods. The accuracy on the original dataset, before applying the framework, is recorded and evaluated against the accuracy on the reduced attribute set; the results are shown separately to provide comparisons. Based on the result analysis, we can conclude that our proposed model produced higher accuracy on multi-valued class datasets than on binary-class datasets.
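
A minimal sketch of a voting-based hybrid feature selector in scikit-learn, assuming each base method nominates features via SelectFromModel and a feature is kept only when a majority agree; the dataset, estimators' defaults, and the majority threshold are illustrative, not the authors' exact pipeline.

```python
# Minimal sketch of a voting-based hybrid feature selector (not the authors' pipeline):
# each base method nominates features via SelectFromModel, and a feature is retained
# only if a majority of methods agree. A toy dataset is used for illustration.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LassoCV, SGDClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import (RandomForestClassifier, GradientBoostingClassifier,
                              AdaBoostClassifier)

X, y = load_breast_cancer(return_X_y=True)

selectors = [
    LassoCV(max_iter=5000),
    DecisionTreeClassifier(random_state=0),
    RandomForestClassifier(n_estimators=100, random_state=0),
    GradientBoostingClassifier(random_state=0),
    AdaBoostClassifier(random_state=0),
    SGDClassifier(random_state=0),
]

votes = np.zeros(X.shape[1], dtype=int)
for est in selectors:
    support = SelectFromModel(est).fit(X, y).get_support()   # features this method keeps
    votes += support.astype(int)

selected = votes >= (len(selectors) // 2 + 1)   # keep features picked by a majority
print(f"{selected.sum()} of {X.shape[1]} features kept")
```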

Optimal design of nonlinear damping system for seismically-excited adjacent structures using multi-objective genetic algorithm integrated with stochastic linearization method (추계학적 선형화 방법 및 다목적 유전자 알고리즘을 이용한 지진하중을 받는 인접 구조물에 대한 비선형 감쇠시스템의 최적 설계)

  • Ok, Seung-Yong;Song, Jun-Ho;Koh, Hyun-Moo;Park, Kwan-Soon
    • Journal of the Earthquake Engineering Society of Korea / v.11 no.6 / pp.1-14 / 2007
  • An optimal design method for a nonlinear damping system for the seismic response control of adjacent structures is studied in this paper. The objective functions of the optimal design are defined by the structural response and the total amount of dampers. In order to obtain a solution minimizing two mutually conflicting objective functions simultaneously, a multi-objective optimization technique based on a genetic algorithm is adopted. In addition, the stochastic linearization method is embedded into the multi-objective framework to efficiently estimate the seismic responses of the adjacent structures interconnected by nonlinear hysteretic dampers without performing nonlinear time-history analyses. As a numerical example demonstrating the effectiveness of the proposed technique, 20-story and 10-story buildings are considered, and MR dampers, whose hysteretic behavior varies with the magnitude of the input voltage, are adopted as the nonlinear hysteretic dampers interconnecting the two adjacent buildings. The proposed approach provides the optimal number and capacities of the MR dampers, which turned out to be more economical than a uniform distribution system while maintaining similar control performance. The proposed damper system also shows more stable performance in terms of the pounding probability between the two adjacent buildings. The applicability of the proposed method to the problem of optimally placing a semi-active control system is examined as well.
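
A conceptual sketch of the multi-objective genetic algorithm component only: it trades off total damper capacity against a toy stand-in for the RMS response and keeps non-dominated designs; a real implementation would evaluate the response by stochastic linearization of the hysteretic dampers rather than the placeholder used here.

```python
# Conceptual sketch only (not the paper's formulation): a tiny multi-objective GA that
# trades off total damper capacity against a toy surrogate for the RMS structural
# response. A non-dominated filter keeps the Pareto-optimal trade-offs; a real code
# would evaluate the response via stochastic linearization of the MR dampers.
import numpy as np

rng = np.random.default_rng(5)
n_floors, pop_size, n_gen = 10, 60, 100
c_max = 5.0                                     # per-floor damper capacity bound (assumed)

def objectives(c):
    total_capacity = c.sum()
    rms_response = 1.0 / (1.0 + c.sum()) + 0.05 * np.std(c)   # toy response surrogate
    return np.array([total_capacity, rms_response])            # both minimized

def nondominated(F):
    keep = np.ones(len(F), dtype=bool)
    for i in range(len(F)):
        for j in range(len(F)):
            if i != j and np.all(F[j] <= F[i]) and np.any(F[j] < F[i]):
                keep[i] = False
                break
    return keep

pop = rng.uniform(0, c_max, (pop_size, n_floors))
for _ in range(n_gen):
    # Variation: uniform crossover between random parents plus Gaussian mutation.
    parents = pop[rng.integers(0, pop_size, (pop_size, 2))]
    mask = rng.random((pop_size, n_floors)) < 0.5
    children = np.where(mask, parents[:, 0], parents[:, 1])
    children = np.clip(children + 0.1 * rng.standard_normal(children.shape), 0, c_max)
    combined = np.vstack([pop, children])
    F = np.array([objectives(c) for c in combined])
    # Environmental selection: prefer non-dominated designs, fill the rest randomly.
    front = np.flatnonzero(nondominated(F))
    rest = np.setdiff1d(np.arange(len(combined)), front)
    pop = combined[np.concatenate([front, rng.permutation(rest)])[:pop_size]]

F = np.array([objectives(c) for c in pop])
print("Pareto-front sample (capacity, response):", F[nondominated(F)][:5])
```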

Evaluating the Efficiency of Personal Information Protection Activities in a Private Company: Using Stochastic Frontier Analysis (개인정보처리자의 개인정보보호 활동 효율성 분석: 확률변경분석을 활용하여)

  • Jang, Chul-Ho;Cha, Yun-Ho;Yang, Hyo-Jin
    • Informatization Policy / v.28 no.4 / pp.76-92 / 2021
  • The value of personal information is increasing with the digital transformation of the Fourth Industrial Revolution. The purpose of this study is to analyze the efficiency of the personal information protection efforts of 2,000 private companies. It uses stochastic frontier analysis (SFA), a parametric estimation method that measures the absolute efficiency of protection activities. In particular, the personal information activity index is used as the output variable for the efficiency analysis, with the personal information protection budget and the number of personnel as input variables. As a result of the analysis, efficiency is found to range from a minimum of 0.466 to a maximum of 0.949, with an overall average of 0.818 (81.8%). The main causes of inefficiency include non-fulfillment of personal information management measures, the lack of a system for promoting personal information protection education, and non-fulfillment of obligations related to CCTV. Policy support is needed for implementing safety measures and performing personal information encryption, especially customized support for small and medium-sized enterprises.
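
A minimal half-normal stochastic frontier sketch fitted by maximum likelihood on synthetic data; the variables merely mimic the study's inputs (budget, personnel) and output (activity index) and are not its model, data, or results.

```python
# Minimal half-normal stochastic frontier sketch fitted by maximum likelihood
# (Aigner-Lovell-Schmidt composed-error form) on synthetic data; not the study's model.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(6)
n = 500
x = rng.uniform(1, 5, (n, 2))                         # log inputs (toy budget, personnel)
beta_true = np.array([1.0, 0.6, 0.3])
u = np.abs(rng.normal(0, 0.3, n))                     # inefficiency (half-normal)
v = rng.normal(0, 0.2, n)                             # symmetric noise
y = beta_true[0] + x @ beta_true[1:] + v - u          # log output (toy activity index)

X = np.column_stack([np.ones(n), x])

def negloglik(theta):
    beta, sv, su = theta[:3], np.exp(theta[3]), np.exp(theta[4])   # sigmas kept positive
    sigma = np.hypot(sv, su)
    lam = su / sv
    eps = y - X @ beta
    ll = (np.log(2) - np.log(sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

res = minimize(negloglik, x0=np.array([0.5, 0.5, 0.5, np.log(0.1), np.log(0.1)]),
               method="Nelder-Mead", options={"maxiter": 20000})
beta_hat = res.x[:3]
sv, su = np.exp(res.x[3]), np.exp(res.x[4])

# JLMS point estimate of inefficiency, then technical efficiency exp(-E[u|eps]).
eps = y - X @ beta_hat
s2 = sv**2 + su**2
mu_star = -eps * su**2 / s2
s_star = sv * su / np.sqrt(s2)
u_hat = mu_star + s_star * norm.pdf(mu_star / s_star) / norm.cdf(mu_star / s_star)
print("mean technical efficiency:", np.exp(-u_hat).mean().round(3))
```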