• Title/Summary/Keyword: Sequential decision-making process

Search Results: 20

Multiobjective R&D Investment Planning under Uncertainty (불확실한 상황하에서의 다목적 R&D 투자계획수립에 관한 연구 -최적화 기법과 계층화 분석과정의 통합적 접근방안을 중심으로-)

  • 이영찬;민재형
    • Journal of the Korean Operations Research and Management Science Society
    • /
    • v.20 no.2
    • /
    • pp.39-60
    • /
    • 1995
  • In this paper, an integration of stochastic dynamic programming (SDP), integer goal programming (IGP) and the analytic hierarchy process (AHP) is proposed to handle multiobjective-multicriteria sequential decision-making problems under the uncertainty inherent in R&D investment planning. SDP is capable of handling problems that are both sequential and stochastic. In the SDP model, the probabilities of the funding levels in any time period are generated using a subjective model which employs functional relationships among interrelated parameters, scenarios of future budget availability, and subjective inputs elicited from a group of decision makers. The SDP model primarily yields an optimal investment planning policy that considers the possibility that the actual funding received may be less than anticipated, so that projects selected under the anticipated budget would be interrupted. IGP is used to handle multiobjective issues such as the trade-off between economic benefit and technology accumulation level. Other managerial concerns related to determining the optimal project portfolio within each stage of the SDP model, including project selection, project scheduling and annual budget allocation, are also resolved by the IGP. AHP is proposed for generating scenario-based transition probabilities under budgetary uncertainty and for quantifying the environmental risk to be considered.
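The SDP backbone described in this abstract can be illustrated with a small backward-induction sketch. The funding levels, actions, rewards and transition probabilities below are invented for illustration (in the paper they would come from the AHP-based scenario model), not taken from the source.

```python
def sdp_policy(stages, levels, trans, reward):
    """Backward induction over planning stages.

    trans[a][s]:  dict mapping next funding level s2 -> P(s2 | s, action a)
    reward[s][a]: immediate payoff of action a at funding level s
    Returns the stage-0 value table and the best action per (stage, level).
    """
    V = {s: 0.0 for s in levels}          # terminal values
    policy = {}
    for t in reversed(range(stages)):
        V_new, policy_t = {}, {}
        for s in levels:
            best_a, best_v = None, float("-inf")
            for a in reward[s]:
                v = reward[s][a] + sum(p * V[s2] for s2, p in trans[a][s].items())
                if v > best_v:
                    best_a, best_v = a, v
            V_new[s], policy_t[s] = best_v, best_a
        V, policy[t] = V_new, policy_t
    return V, policy

# Toy example: two funding levels, two actions per level (all numbers invented).
levels = ["low", "high"]
reward = {"low": {"defer": 0.0, "invest": 1.0},
          "high": {"defer": 0.0, "invest": 3.0}}
trans = {
    "defer":  {"low": {"low": 0.5, "high": 0.5}, "high": {"low": 0.2, "high": 0.8}},
    "invest": {"low": {"low": 0.9, "high": 0.1}, "high": {"low": 0.4, "high": 0.6}},
}
V, pol = sdp_policy(2, levels, trans, reward)
```

The policy table answers the question the abstract raises: what to do now given that future funding is uncertain.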


The fashion consumer purchase patterns and influencing factors through big data - Based on sequential pattern analysis -

  • Ki Yong Kwon
    • The Research Journal of the Costume Culture
    • /
    • v.31 no.5
    • /
    • pp.607-626
    • /
    • 2023
  • This study analyzes consumer fashion purchase patterns from a big data perspective. Transaction data from 1 million transactions at two Korean fashion brands were collected. To analyze the data, R, Python, the SPADE algorithm, and network analysis were used. Various consumer purchase patterns, including overall purchase patterns, seasonal purchase patterns, and age-specific purchase patterns, were analyzed. Overall pattern analysis found that a continuous purchase pattern formed around the brands' popular items, such as t-shirts and blouses. Network analysis also showed that t-shirts and blouses were highly central items. This suggests that there are items that make consumers loyal to a brand rather than the cachet of the brand name itself. These results help us better understand the process of brand equity construction. Additionally, buying patterns varied by season, and more items were purchased in a single shopping trip during spring compared to other seasons. Consumer age also affected purchase patterns; findings showed an increase in purchasing the same item repeatedly as age increased. This likely reflects differences in purchasing power by age, and it suggests that the decision-making process for purchasing products simplifies as age increases. These findings offer insight for fashion companies' establishment of item-specific marketing strategies.
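The sequential-pattern step can be illustrated without the full SPADE machinery. The sketch below merely counts the support of ordered item pairs ("a purchased before b") across hypothetical purchase sequences; the item names and support threshold are invented.

```python
from collections import Counter

def sequential_pair_support(sequences, min_support):
    """Fraction of customer sequences containing item a strictly before item b,
    keeping only pairs at or above the support threshold."""
    counts = Counter()
    for seq in sequences:
        seen, pairs = set(), set()
        for item in seq:
            for prev in seen:
                pairs.add((prev, item))   # prev was bought before item
            seen.add(item)
        counts.update(pairs)              # count each pair once per sequence
    n = len(sequences)
    return {p: c / n for p, c in counts.items() if c / n >= min_support}

# Hypothetical per-customer purchase sequences, ordered by time.
baskets = [["t-shirt", "blouse", "skirt"],
           ["t-shirt", "blouse"],
           ["blouse", "t-shirt"],
           ["t-shirt", "skirt"]]
patterns = sequential_pair_support(baskets, 0.5)
```

With these toy sequences the t-shirt-then-blouse pattern clears the threshold, matching the abstract's observation that popular items anchor continuing purchase patterns.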

Study on the Effective Use of Thread in Agent Modeling (에이전트 모델링에서 효율적인 쓰레드 사용에 관한 연구)

  • Lim S.J.;Song J.Y.;Lee S.W.;Kim D.H.
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2005.10a
    • /
    • pp.980-983
    • /
    • 2005
  • An agent is an autonomous process that recognizes its external environment, exchanges knowledge with external machines, and performs an autonomous decision-making function in order to achieve common goals. Techniques for tackling complexity in software need to be introduced, namely decomposition, abstraction and organization. Agent-oriented modeling has the merits of decomposition. In decomposition, each autonomous unit may have its own control thread. A thread is a single sequential flow of control within a program. The use of threads in agent modeling has an important bearing on CPU performance and on the relations among autonomous units.
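The idea of one control thread per autonomous unit can be sketched in a few lines. The agent name, message content, and queue-based knowledge exchange below are illustrative, not taken from the paper's model.

```python
import queue
import threading

class Agent(threading.Thread):
    """Each autonomous unit runs in its own control thread and exchanges
    knowledge with the outside world through message queues."""

    def __init__(self, name, inbox, outbox):
        super().__init__(daemon=True)
        self.name, self.inbox, self.outbox = name, inbox, outbox

    def run(self):
        while True:
            msg = self.inbox.get()
            if msg is None:          # sentinel: shut the agent down
                break
            # Stand-in for the agent's autonomous decision making.
            self.outbox.put((self.name, msg.upper()))

inbox, outbox = queue.Queue(), queue.Queue()
agent = Agent("sensor-agent", inbox, outbox)
agent.start()
inbox.put("obstacle detected")                 # perceived environment
sender, decision = outbox.get(timeout=2)       # agent's response
inbox.put(None)
agent.join(timeout=2)
```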


Design and Implementation of On-Board Control System (위성 운용을 위한 On-Board Control System 설계 및 구현)

  • Shin, Hyun-Kyu
    • Aerospace Engineering and Technology
    • /
    • v.13 no.1
    • /
    • pp.86-95
    • /
    • 2014
  • To accomplish a satellite's missions, complicated control procedures and commands are required. Although Absolute-Time Commands and Command Sequences have historically been applied to control satellites and their operation, these command systems can only manage a sequential control process and cannot handle the decisions and branches needed to respond to conditions at the time of execution. To resolve this limitation, KARI has designed RTCSC, which is mainly based on the existing RTCS but adopts a one-byte Op Code to support decision making and conditional branching. This paper introduces the design and implementation of RTCSC as an On-Board Control System like OBCP, VML and IP.
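A toy interpreter conveys what a branching op code adds over a plain command sequence. The op codes, telemetry keys, and command names below are invented, since the actual RTCSC instruction set is not given in the abstract.

```python
def run_sequence(program, telemetry):
    """Toy on-board interpreter. Each step is (op, arg):
    'CMD' issues a command, 'BRZ' branches to a target address if a
    telemetry value is zero, 'END' stops. A plain command sequence
    would be 'CMD' steps only, with no way to react to conditions."""
    pc, issued = 0, []
    while pc < len(program):
        op, arg = program[pc]
        if op == "CMD":
            issued.append(arg)
            pc += 1
        elif op == "BRZ":                  # conditional branch on telemetry
            key, target = arg
            pc = target if telemetry.get(key, 0) == 0 else pc + 1
        elif op == "END":
            break
    return issued

prog = [("CMD", "open_valve"),
        ("BRZ", ("battery_ok", 4)),        # skip the heater if battery flag is 0
        ("CMD", "heater_on"),
        ("END", None),
        ("CMD", "safe_mode")]

nominal = run_sequence(prog, {"battery_ok": 1})
degraded = run_sequence(prog, {"battery_ok": 0})
```

The same stored sequence yields different command streams depending on on-board conditions, which is exactly the capability the abstract says timed command sequences lack.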

Managing Approximation Models in Multidisciplinary Optimization (다분야 최적화에서의 근사모델 관리기법의 활용)

  • 양영순;정현승;연윤석
    • Proceedings of the Computational Structural Engineering Institute Conference
    • /
    • 2000.10a
    • /
    • pp.141-148
    • /
    • 2000
  • In system design, it is not always possible for all decision makers to cooperate fully and thus avoid conflict. Each controls a specified subset of the design variables and seeks to minimize its own cost function subject to its individual constraints. A system management team, however, makes every effort to coordinate the multiple disciplines and overcome this noncooperative environment. Although full cooperation is difficult to achieve, noncooperation should also be avoided as much as possible. Our approach is to predict the results of cooperation and generate an approximate Pareto set for the multiple objectives. The Pareto set can be obtained according to the degree to which one discipline concedes coupling variables in the other's favor. We employ the approximation concept for modeling this coordination and a multiobjective genetic algorithm for exploring the coupling-variable space to obtain an approximate Pareto set. The approximation management concept is also used to improve the accuracy of the Pareto set. Exploration of the coupling-variable space is more efficient because its dimension is smaller than that of the design-variable space. Moreover, our approach does not force the disciplines to change their own way of running analysis and synthesis tools. Since the decision-making process is not sequential, the required time can be reduced compared to existing multidisciplinary optimization techniques. This approach is applied to some mathematical examples and structural optimization problems.
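The non-dominated filtering at the heart of building an approximate Pareto set can be sketched directly. The objective vectors below are invented, and minimization of all objectives is assumed.

```python
def pareto_front(points):
    """Return the non-dominated subset of objective vectors, minimizing
    every objective: q dominates p if q is no worse in all objectives
    and differs from p in at least one."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points)
        if not dominated:
            front.append(p)
    return front

# Invented (cost_1, cost_2) pairs for candidate designs.
points = [(1, 5), (2, 2), (5, 1), (4, 4), (3, 3)]
front = pareto_front(points)
```

A multiobjective GA would apply such a filter to the solutions found while exploring the coupling-variable space, keeping only the trade-off designs.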


A Poisson Regression Analysis of Physician Visits (외래이용빈도 분석의 모형과 기법)

  • 이영조;한달선;배상수
    • Health Policy and Management
    • /
    • v.3 no.2
    • /
    • pp.159-176
    • /
    • 1993
  • The utilization of outpatient care services involves two sequential decisions. The first is whether to initiate utilization; the second is how many more visits to make after initiation. Presumably, the initiation decision is largely made by the patient and his or her family, while the number of additional visits is decided under a strong influence of the physician. The implication is that analyzing outpatient care utilization requires specifying each of the two underlying decisions as a distinct stochastic process. This paper is concerned with the number of physician visits, which is by definition a discrete variable that can take only non-negative integer values. Since the initial visit is accounted for in the analysis of whether or not any physician visit was made, it suffices to focus on the number of visits made in addition to the initial one. The number of additional visits, being count data, could be assumed to follow a Poisson distribution. However, the distribution is likely to be overdispersed, since the number of physician visits tends to cluster around a few values yet still vary widely. A recently reported study of outpatient care utilization employed an analysis based on the assumption of a negative binomial distribution, a type of overdispersed Poisson distribution. There is an indication, however, that using a Poisson distribution with adjustments for over-dispersion results in less loss of efficiency in parameter estimation than committing to a particular distribution such as the negative binomial. An analysis of outpatient care utilization data was performed focusing on an assessment of the appropriateness of the available techniques. The data used in the analysis were collected by a community survey in Hwachon Gun, Kangwon Do in 1990. It was observed that a Poisson regression with adjustments for over-dispersion is superior to either an ordinary regression or a Poisson regression without such adjustments. In conclusion, it seems most appropriate to assume that the number of physician visits made in addition to the initial visit follows an overdispersed Poisson distribution when outpatient care utilization is studied with a model that embodies the two-part character of the decision process underlying the utilization.
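The over-dispersion check motivating the adjusted Poisson regression can be illustrated with an intercept-only sketch. The visit counts below are invented, and the Pearson-statistic estimator shown is one common choice of dispersion estimate, not necessarily the one used in the paper.

```python
def pearson_dispersion(counts):
    """Under a Poisson model, the Pearson chi-square statistic divided by
    its degrees of freedom should be near 1; values well above 1 signal
    over-dispersion, calling for scaled (quasi-Poisson) standard errors."""
    n = len(counts)
    mu = sum(counts) / n                       # Poisson MLE of the mean
    chi2 = sum((y - mu) ** 2 / mu for y in counts)
    return chi2 / (n - 1)                      # dispersion estimate phi

# Invented visit counts: clustered near zero but with a wide spread,
# the pattern the abstract describes for physician visits.
visits = [0, 0, 0, 1, 1, 2, 8, 12]
phi = pearson_dispersion(visits)
```

Here the estimate is far above 1, so naive Poisson standard errors would be too small; scaling them by sqrt(phi) is the kind of adjustment the abstract favors over switching to a negative binomial model.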


Changes in Statistical Knowledge and Experience of Data-driven Decision-making of Pre-service Teachers who Participated in Data Analysis Projects (데이터 분석 프로젝트 참여한 예비 교사의 통계적 지식에 대한 변화와 데이터 기반 의사 결정의 경험)

  • Suh, Heejoo;Han, Sunyoung
    • Communications of Mathematical Education
    • /
    • v.35 no.2
    • /
    • pp.153-172
    • /
    • 2021
  • Various competencies, such as critical thinking, systems thinking, problem-solving competence, communication skill, and data literacy, are likely to be required in the 4th industrial revolution. To nurture citizens who will live in that future, it is timely to conduct research on teacher education that supports teachers' development of statistical thinking as well as statistical knowledge. In this study, we therefore developed and implemented a data analysis project for pre-service teachers to understand changes in their statistical knowledge, in addition to their experiences of a data-driven decision-making process that required statistical thinking. We used a mixed-methods (sequential explanatory) design to analyze the quantitative and qualitative data collected. The findings indicated that the pre-service teachers had a low level of understanding of the relationship between population means and sample means, and of the estimation of the population mean and its interpretation. Regarding the data-driven decision-making process, we found that the pre-service teachers' experiences varied even when they worked as a small group on the project. We end this paper by presenting implications of the study for the fields of teacher education and statistics education.
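The estimation task the pre-service teachers found difficult (inferring a population mean from a sample mean) can be illustrated with a normal-approximation confidence interval; the scores below are invented.

```python
import math
import statistics

def mean_ci(sample, z=1.96):
    """Normal-approximation confidence interval for the population mean:
    sample mean plus/minus z times the standard error of the mean."""
    m = statistics.mean(sample)
    se = statistics.stdev(sample) / math.sqrt(len(sample))
    return m - z * se, m + z * se

scores = [72, 85, 78, 90, 66, 81, 74, 88]   # invented sample data
lo, hi = mean_ci(scores)
```

The interval, not the sample mean alone, is the estimate of the population mean; interpreting it correctly is exactly the knowledge gap the study reports.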

Tidy-up Task Planner based on Q-learning (정리정돈을 위한 Q-learning 기반의 작업계획기)

  • Yang, Min-Gyu;Ahn, Kuk-Hyun;Song, Jae-Bok
    • The Journal of Korea Robotics Society
    • /
    • v.16 no.1
    • /
    • pp.56-63
    • /
    • 2021
  • As the use of robots in the service sector increases, research has been conducted on replacing human tasks in daily life with robots. Among such tasks, this study focuses on tidying up a desk using a robot arm. The order in which the tidy-up motions are carried out has a great impact on the success rate of the task. Therefore, in this study, a neural-network-based method for determining the priority of the tidy-up motions from the input image is proposed. Reinforcement learning, which shows good performance in sequential decision-making processes, is used to train this task planner. Training is conducted in a virtual tidy-up environment configured the same as the actual tidy-up environment. To transfer the learning results from the virtual environment to the actual environment, the input image is preprocessed into a segmented image. In addition, using a neural network that excludes unnecessary tidy-up motions from the priority during the tidy-up operation increases the success rate of the task planner. Experiments were conducted in the real world to verify the proposed task planning method.
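The Q-learning update behind such a task planner can be sketched on a toy tidy-up problem. The objects, the precedence constraint, and the tabular state encoding below are invented stand-ins for the paper's image-based neural network.

```python
import random

def train_tidy_planner(objects, precedes, episodes=2000, alpha=0.5, gamma=0.9):
    """Tabular Q-learning: the state is the set of objects still on the
    desk, an action tidies one object, and the reward penalizes tidying
    an object whose prerequisite (per 'precedes') is still present."""
    random.seed(0)
    Q = {}
    for _ in range(episodes):
        state = frozenset(objects)
        while state:
            a = random.choice(list(state))          # exploratory policy
            bad = any(p in state for p in precedes.get(a, []))
            r = -1.0 if bad else 1.0
            nxt = state - {a}
            best_next = max((Q.get((nxt, b), 0.0) for b in nxt), default=0.0)
            Q[(state, a)] = Q.get((state, a), 0.0) + alpha * (
                r + gamma * best_next - Q.get((state, a), 0.0))
            state = nxt
    return Q

# Invented constraint: the cup should only be tidied after the tray it sits on.
Q = train_tidy_planner(["tray", "cup"], {"cup": ["tray"]})
start = frozenset(["tray", "cup"])
```

After training, the learned values rank "tidy the tray first" above "tidy the cup first", which is the motion-priority decision the paper's planner makes from images.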

Approximate Dynamic Programming Based Interceptor Fire Control and Effectiveness Analysis for M-To-M Engagement (근사적 동적계획을 활용한 요격통제 및 동시교전 효과분석)

  • Lee, Changseok;Kim, Ju-Hyun;Choi, Bong Wan;Kim, Kyeongtaek
    • Journal of the Korean Society for Aeronautical & Space Sciences
    • /
    • v.50 no.4
    • /
    • pp.287-295
    • /
    • 2022
  • As the low-altitude, long-range artillery threat has been strengthened, the development of an anti-artillery interception system to protect assets against such attacks has been launched. We view defense against long-range artillery attacks as a typical dynamic weapon-target assignment (DWTA) problem. DWTA is a sequential decision process in which decisions made under uncertain future attacks affect the subsequent decision processes and their results; these are typical characteristics of a Markov decision process (MDP). We formulate the problem as an MDP model to examine the assignment policy for the defender. The proximity of the capital of South Korea to the North Korean border limits the computation time for a solution to a few seconds, and within that interval it is impossible to compute the exact optimal solution. We therefore apply an approximate dynamic programming (ADP) approach and check whether it solves the MDP model within the processing time limit. We employ the Shoot-Shoot-Look policy as a baseline strategy and compare it with the ADP approach for three scenarios. Simulation results show that the ADP approach provides better solutions than the baseline strategy.
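The Shoot-Shoot-Look baseline can be contrasted with a shoot-look-shoot rule using elementary salvo arithmetic; the single-shot kill probability below is invented, and this is illustrative arithmetic, not the paper's ADP model.

```python
def salvo_kill_prob(p_kill, shots):
    """Probability a threat is destroyed by a salvo of independent,
    identical shots fired simultaneously."""
    return 1 - (1 - p_kill) ** shots

def shoot_look_shoot_usage(p_kill):
    """Expected interceptors used when firing one at a time with perfect
    kill assessment between shots, within the same two-interceptor budget
    as the Shoot-Shoot-Look baseline."""
    return 1 + (1 - p_kill)     # the second shot is fired only after a miss

p = 0.7                          # invented single-shot kill probability
p_ssl = salvo_kill_prob(p, 2)    # Shoot-Shoot-Look: both fired at once
expected_used = shoot_look_shoot_usage(p)
```

Both rules reach the same two-shot kill probability, but shoot-look-shoot conserves interceptors on average; it needs time for kill assessment, which is exactly the resource the scenario's few-second window constrains, motivating the ADP trade-off study.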

Uncertainty Analysis on the Simulations of Runoff and Sediment Using SWAT-CUP (SWAT-CUP을 이용한 유출 및 유사모의 불확실성 분석)

  • Kim, Minho;Heo, Tae-Young;Chung, Sewoong
    • Journal of Korean Society on Water Environment
    • /
    • v.29 no.5
    • /
    • pp.681-690
    • /
    • 2013
  • Watershed models have been increasingly used to support the integrated management of land, water, and non-point source pollutants, and to implement total maximum daily load (TMDL) policy. However, these models demand a great amount of input data and process parameters as well as a proper calibration, and they sometimes yield significant uncertainty in the simulation results. For this reason, uncertainty analysis is necessary to minimize the risk of using the models for important decision making. The objectives of this study were to evaluate three different uncertainty analysis algorithms (SUFI-2: Sequential Uncertainty Fitting Ver. 2; GLUE: Generalized Likelihood Uncertainty Estimation; ParaSol: Parameter Solution) used to analyze the sensitivity of the SWAT (Soil and Water Assessment Tool) parameters and auto-calibration in a watershed, to evaluate the uncertainties in the simulations of runoff and sediment load, and to suggest alternatives for reducing the uncertainty. The results confirmed that the parameters most sensitive to runoff and sediment simulations were consistent across the three algorithms, although the order of importance differed slightly. In addition, there was no significant difference in the performance of auto-calibration for runoff simulations. On the other hand, sediment calibration showed lower modeling efficiency than runoff simulation, probably due to the lack of measurement data. It is obvious that the parameter uncertainty in the sediment simulation is much greater than that in the runoff simulation. To decrease the uncertainty of SWAT simulations, it is recommended to estimate feasible ranges of the model parameters and to obtain sufficient and reliable measurement data for the study site.
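A GLUE-style screening loop can be sketched in a few lines. The toy "watershed" model, parameter range, and Nash-Sutcliffe threshold below are invented stand-ins for a SWAT run inside SWAT-CUP; Nash-Sutcliffe efficiency is used here as one common likelihood choice.

```python
import random

def glue_screening(model, observed, param_ranges, n_samples, threshold):
    """Sample parameter sets uniformly at random, run the model, and keep
    the 'behavioral' sets whose Nash-Sutcliffe efficiency (NSE) meets the
    threshold; the spread of the kept sets expresses parameter uncertainty."""
    random.seed(1)
    mean_obs = sum(observed) / len(observed)
    denom = sum((o - mean_obs) ** 2 for o in observed)
    behavioral = []
    for _ in range(n_samples):
        params = {k: random.uniform(a, b) for k, (a, b) in param_ranges.items()}
        sim = model(params)
        nse = 1 - sum((o - s) ** 2 for o, s in zip(observed, sim)) / denom
        if nse >= threshold:
            behavioral.append((nse, params))
    return behavioral

# Toy 'watershed': runoff is a linear response with one coefficient (invented).
rain = [10.0, 20.0, 5.0, 15.0]
obs = [4.0, 8.0, 2.0, 6.0]                     # generated with coefficient 0.4
model = lambda p: [p["cn"] * r for r in rain]
kept = glue_screening(model, obs, {"cn": (0.1, 0.9)}, 200, 0.9)
```

The behavioral sets cluster around the true coefficient; with scarce or noisy observations (as in the sediment case above) the cluster widens, which is how GLUE expresses the larger parameter uncertainty the study reports.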