• Title/Summary/Keyword: Decision-Making Models

Open BIM-based quantity take-off system for schematic estimation of building frame in early design stage

  • Choi, Jungsik;Kim, Hansaem;Kim, Inhan
    • Journal of Computational Design and Engineering
    • /
    • v.2 no.1
    • /
    • pp.16-25
    • /
    • 2015
  • Because construction projects are large and complex, it is especially important to connect BIM models to a concurrent construction process through construction automation. In particular, schematic Quantity Take-Off (QTO) estimation on BIM models is a strategy that can support decision making within minutes, because 70-80% of construction costs are determined by designers' decisions in the early design stage. This paper proposes a QTO process and a prototype QTO system for the building frame, based on Open BIM, to improve the low reliability of estimation in the early design stage. The research consists of four steps: (1) analyzing the Level of Detail (LOD) at the early design stage for application to the QTO process and system, (2) BIM modeling for Open BIM-based QTO, (3) checking the quality of the BIM model against a checklist for QTO applicability and improved constructability, and (4) developing and verifying a QTO prototype system. The proposed QTO system improves the reliability of schematic estimation by reducing risk factors and shortening the time required.
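
As a minimal illustration of the kind of schematic take-off the abstract describes, a QTO can be sketched as an aggregation of element quantities by frame element type. The element records below are hypothetical stand-ins for quantities extracted from an open IFC model, not the paper's actual data schema:

```python
from collections import defaultdict

def schematic_qto(elements):
    """Sum concrete volume (m3) per frame element type.

    'elements' is a list of dicts with 'type' and 'volume_m3' keys,
    standing in for quantities read from BIM model elements.
    """
    totals = defaultdict(float)
    for e in elements:
        totals[e["type"]] += e["volume_m3"]
    return dict(totals)
```

A real pipeline would populate the element list from an IFC parser and apply the paper's LOD and quality checks before aggregating.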

A Study on the Analysis of Comparison of Churn Prediction Models in Mobile Telecommunication Services (이동통신서비스 해지고객 예측모형의 비교 분석에 관한 연구)

  • Kim, Choong-Nyoung;Chang, Nam-Sik;Kim, Jun-Woo
    • Asia pacific journal of information systems
    • /
    • v.12 no.1
    • /
    • pp.139-158
    • /
    • 2002
  • As the telecommunication market in Korea matures, severe competition has already begun. While service providers struggled for the last couple of years to acquire as many new customers as possible, they now put more effort into retaining current customers. Churn management based on the analysis of customers' demographic and transactional data has become one of the key customer retention strategies that most companies pursue. However, customer data analysis in the industry has remained at a basic level, even though it has considerable potential as a tool for understanding customer behavior. This paper develops several churn prediction models using data-mining techniques such as logistic regression, decision trees, and neural networks. The models were built on real data collected from one of the major telecommunication companies in Korea. Whereas previous research focused mainly on the hit ratio, this paper explores various ways of comparing model performance; the comparison criteria include the gain ratio, the Kolmogorov-Smirnov statistic, the distribution of predicted values, and explanatory ability. The paper also suggests guidance for model selection when applying data-mining techniques.
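
One of the comparison criteria mentioned, the Kolmogorov-Smirnov statistic, measures how well a model's scores separate churners from non-churners: it is the largest vertical gap between the two groups' empirical score CDFs. A plain-Python sketch on synthetic scores (not the paper's data):

```python
import bisect

def ks_statistic(scores_pos, scores_neg):
    """Max vertical gap between the empirical CDFs of two score groups.

    scores_pos: model scores of actual churners.
    scores_neg: model scores of non-churners.
    """
    pos, neg = sorted(scores_pos), sorted(scores_neg)
    best = 0.0
    for t in pos + neg:  # the gap can only change at observed scores
        cdf_p = bisect.bisect_right(pos, t) / len(pos)
        cdf_n = bisect.bisect_right(neg, t) / len(neg)
        best = max(best, abs(cdf_p - cdf_n))
    return best
```

Perfectly separated score distributions yield KS = 1.0; identical distributions yield 0.0, so a higher KS indicates a more discriminating churn model.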

A Study of Implementation Methodology for Data Warehouse (데이터웨어하우스 구축 방법론에 대한 연구)

  • Lee, Byong-Soo;Lee, Sang-Rak;Chang, Keun;Yoon, Ju-Yong
    • Proceedings of the Korea Society for Industrial Systems Conference
    • /
    • 1999.05a
    • /
    • pp.3-9
    • /
    • 1999
  • By using information systems to process massive data quickly and accurately, organizations have the chance to enhance their performance. The limitations of the IS function in supporting decision-making, however, have been frequently noted. In this context, in addition to the traditional mathematical models that form the kernel of DSS, the need for a Data Warehouse, a system supporting business process analysis, is emerging. In this study, we first introduce issues in implementation methodology for D/W, especially the various models of the development process. Then we investigate the correlation between these models and the key success factors of D/W.

Bayesian demand model based seismic vulnerability assessment of a concrete girder bridge

  • Bayat, M.;Kia, M.;Soltangharaei, V.;Ahmadi, H.R.;Ziehl, P.
    • Advances in concrete construction
    • /
    • v.9 no.4
    • /
    • pp.337-343
    • /
    • 2020
  • In the present study, the seismic vulnerability of a concrete girder bridge, one of the most common existing bridge structural systems, is assessed through fragility analysis. To this end, a drift demand model, a fundamental ingredient of any probabilistic decision-making analysis, is first developed in terms of the two most common intensity measures, PGA and Sa(T1). Developing a probabilistic demand model requires a reliable database, which is established here by performing incremental dynamic analysis (IDA) under a set of 20 ground motion records. Next, drift demand models are developed by Bayesian statistical inference from the pre-collapse IDA data. The accuracy and reasonableness of the developed models are then investigated with diagnostic plots, which demonstrate that the demand model developed in terms of PGA is more reliable. Finally, fragility curves are developed from the PGA-based demand model.
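
Probabilistic seismic demand models of this kind are conventionally power laws, ln(drift) = b0 + b1·ln(IM) with lognormal dispersion σ. A least-squares sketch on hypothetical IDA pairs is shown below; the paper's Bayesian inference would instead place priors on b0 and b1 and sample their posterior, but the fitted quantities are the same:

```python
import math

def fit_demand_model(im, drift):
    """Fit ln(drift) = b0 + b1 * ln(IM) by ordinary least squares.

    Returns (b0, b1, sigma), where sigma is the lognormal dispersion
    estimated from the residuals (n - 2 degrees of freedom).
    """
    x = [math.log(v) for v in im]
    y = [math.log(v) for v in drift]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b1 = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
          / sum((xi - xbar) ** 2 for xi in x))
    b0 = ybar - b1 * xbar
    resid = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
    sigma = math.sqrt(sum(r * r for r in resid) / (n - 2))
    return b0, b1, sigma
```

With the fitted (b0, b1, σ), a fragility curve follows as the probability that drift demand exceeds a capacity threshold at a given IM level.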

Maximization in Reliability Design when Stress/Strength has Time Dependent Model of Deterministic Cycle Times

  • Oh, Chung-Hwan
    • Journal of Korean Society for Quality Management
    • /
    • v.18 no.1
    • /
    • pp.129-147
    • /
    • 1990
  • This study addresses optimization problems in which stress and strength follow a time-dependent model, as part of a decision-making process in design methodology from a reliability viewpoint. The reliability of a component can be expressed and computed if the probability distributions of stress and strength in the time-dependent case are known. The factors that determine the parameters of the stress and strength distributions can be controlled in design problems, which leads to the problem of finding optimal values of these parameters subject to resource and design constraints. This paper presents techniques for solving such optimization problems at the design stage, such as minimizing the total cost of controlling the stress and strength parameters subject to the constraint that the component must have a specified reliability, or, alternatively, maximizing component reliability subject to constraints on the resources available to control the parameters. Expressions and computations of reliability in the time-dependent case are derived, and optimization models for these cases are discussed. The special structure of these models is exploited to develop optimization techniques, which are illustrated by design examples.
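
For independent normal stress and strength, the static reliability has the closed form R = Φ((μ_S − μ_s)/√(σ_S² + σ_s²)); under deterministic cycle times with i.i.d. peak stresses per cycle, the n-cycle reliability can be estimated by simulation. A sketch under these assumed normal distributions (illustrative parameters, not the paper's models):

```python
import math
import random

def reliability_static(mu_strength, sd_strength, mu_stress, sd_stress):
    """P(strength > stress) for independent normal stress and strength."""
    z = (mu_strength - mu_stress) / math.sqrt(sd_strength**2 + sd_stress**2)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def reliability_n_cycles(mu_strength, sd_strength, mu_stress, sd_stress,
                         n, trials=100_000, seed=1):
    """Monte Carlo estimate of P(a random strength exceeds the peak
    stress over n i.i.d. loading cycles)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        strength = rng.gauss(mu_strength, sd_strength)
        if all(rng.gauss(mu_stress, sd_stress) < strength for _ in range(n)):
            hits += 1
    return hits / trials
```

Because each additional cycle gives another chance for stress to exceed strength, reliability decreases monotonically with n, which is what makes the time-dependent design trade-off nontrivial.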

The Status of Soil and Groundwater Contamination in Japan and Case Studies of their Remediation (일본의 토양지하수오염 및 복원사례)

  • Komai, Takeshi;Kawabe, Yoshishige
    • Proceedings of the Korean Society of Soil and Groundwater Environment Conference
    • /
    • 2003.04a
    • /
    • pp.25-39
    • /
    • 2003
  • Risk and exposure assessment for the subsurface environment is very important both for health and environmental protection and for setting remedial goals for engineering activities. Assessing exposure to hazardous chemicals in the subsurface environment is essential for estimating risk levels to individuals, especially via soil and groundwater media. In this paper, the status of soil and groundwater contamination is presented and the problems of environmental risk assessment are discussed. The methodologies of fate and exposure models are also discussed through case studies of exposure assessment for heavy metals, organic compounds, and dioxin compounds. In addition, the structure of the exposure models and the data available for model calculation are examined to clarify more realistic exposure scenarios and their application to practical environmental issues. Three advanced remediation techniques for soil and groundwater contamination are described. The most practical method for VOCs is bio-remediation, in which a biological process based on a consortium of microorganisms is applied. For more effective remediation of soil contaminated by heavy metals, we have adopted a soil-flushing technique and a clean-up system using an electro-kinetic method. We have also developed an advanced geo-melting technique for soil contaminated by dioxins and PCB compounds. These techniques are planned to be introduced and applied at many contaminated sites in Japan.

A Heuristic Approach to Budget-Mix Problems (여산믹스문제를 위한 발견적접근)

  • Lee Jae-Kwan
    • Journal of the military operations research society of Korea
    • /
    • v.6 no.1
    • /
    • pp.93-101
    • /
    • 1980
  • An effectively designed budget system in a resource-poor environment must meet three design criteria: (i) it should be both planning-oriented and control-oriented, (ii) both rationalistic and realistic, and (iii) sensitive to variations in the resource environment. The PPB system is one extreme (planning-oriented and rationalistic) and the conventional OEB/OUB system is the other (control-oriented and incrementalistic). In general, the merits of rationalism are limited because of the infeasibility of its application. Hence, mixtures of the two extremes, such as MBO, ZBB, and RZBB, have been examined and applied during the last decade. The classical mathematical models of capital budgeting are the starting point for the Budget-Mix Model introduced in this paper. They are modified by the following: (i) technological-resource constraints, (ii) bounded-variable constraints, and (iii) exchange rules. Special emphasis is laid on (iii), because more efficient inter-resource exchanges are needed in the budget-mix process. The Budget-Mix Model is not based on optimization but on a heuristic approach that assures a satisficing solution, and its applications range from incremental Nonzero-Base Budgeting to rational Zero-Base Budgeting. The author suggests the budget-mix concept and a budget-mix model: budget-mix is a decision process of making program-mix and resource-mix together. To keep this concept realistic in existing organizations, quantitative models describing budget-mix situations need to be developed.
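
The satisficing flavor of such a heuristic can be illustrated by a greedy allocation that funds each program's lower bound first, then spends the remainder in descending benefit-per-unit order up to the upper bounds. This is only an illustrative sketch with hypothetical program data, not the author's actual exchange rules, which the abstract does not specify:

```python
def greedy_budget_mix(programs, budget):
    """Greedy satisficing budget allocation.

    'programs' is a list of (name, lower_bound, upper_bound,
    benefit_per_unit) tuples; lower/upper bounds play the role of
    bounded-variable constraints.
    """
    alloc = {name: lo for name, lo, hi, _ in programs}
    remaining = budget - sum(alloc.values())
    if remaining < 0:
        raise ValueError("budget cannot cover the lower bounds")
    # Spend the remainder where each unit yields the most benefit.
    for name, lo, hi, _ in sorted(programs, key=lambda p: -p[3]):
        extra = min(hi - lo, remaining)
        alloc[name] += extra
        remaining -= extra
    return alloc
```

The result is a feasible, "good enough" program-mix rather than a provably optimal one, which matches the paper's deliberate move away from pure optimization.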

Management Evaluation on the Regional Fisheries Cooperatives using Data Envelopment Analysis Model (DEA모형에 의한 지역수협의 경영평가)

  • Lee, Kang-Woo
    • The Journal of Fisheries Business Administration
    • /
    • v.42 no.2
    • /
    • pp.15-30
    • /
    • 2011
  • This study measures the relative efficiency of regional fishery cooperatives using Data Envelopment Analysis (DEA). Selecting 40 regional fishery cooperatives in Busan as Decision Making Units (DMUs), the study uses their panel data from 2007 to 2008 to rank the relative efficiency of the DMUs. First, the efficiency scores of the DMUs are calculated using the CCR, SBM, and super-SBM models. Input variables are the number of employees and the area of each fishery cooperative; output variables are the amounts of deposits, loans, and profit. The efficiency ranking of the DMUs is determined from the super-SBM scores. Second, the differences in average efficiency across the three DEA models are tested using pair-wise mean comparison tests. The super-SBM results show that seven of the forty DMUs are efficient; among them, the DMUs that inefficient DMUs can benchmark are identified through frequency analysis of the reference sets. Third, the differences in average efficiency between 2007 and 2008 are tested for the three DEA models using pair-wise mean comparison tests, and the efficiency change of the DMUs between 2007 and 2008 is estimated using the Malmquist productivity index (MPI). Finally, the paper suggests improved composite DMUs superior to the inefficient DMUs evaluated by the super-SBM model.
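
In the special case of one input and one output, the CCR efficiency score reduces to each DMU's output/input ratio normalized by the best ratio in the sample. A minimal sketch with illustrative numbers (not the cooperatives' data; the multi-input/multi-output case used in the paper requires solving a linear program per DMU):

```python
def ccr_efficiency_single(inputs, outputs):
    """CCR efficiency for the one-input/one-output case.

    With a single input and output, each DMU's CCR score equals its
    output/input ratio divided by the best ratio among all DMUs, so
    efficient DMUs score exactly 1.0.
    """
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]
```

Inefficient units (score < 1) would then look to the score-1 units in their reference set as benchmarks, mirroring the frequency analysis described above.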

A Study of Implementation Methodology for Data Warehouse (데이터웨어 하우스 구축 방법론에 대한 연구)

  • Lee, Byong-Soo;Lee, Sang-Rak;Chang, Keun
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.4 no.2
    • /
    • pp.23-31
    • /
    • 1999
  • By using information systems to process massive data quickly and accurately, organizations have the chance to enhance their performance. The limitations of the IS function in supporting decision-making, however, have been frequently noted. In this context, in addition to the traditional mathematical models that form the kernel of DSS, the need for a Data Warehouse, a system supporting business process analysis, is emerging. In this study, we first introduce issues in implementation methodology for D/W, especially the various models of the development process. Then we investigate the correlation between these models and the key success factors of D/W.

Risk Assessment and Pharmacogenetics in Molecular and Genomic Epidemiology

  • Park, Sue-K.;Choi, Ji-Yeob
    • Journal of Preventive Medicine and Public Health
    • /
    • v.42 no.6
    • /
    • pp.371-376
    • /
    • 2009
  • In this article, we review the literature on risk assessment (RA) models with and without molecular genomic markers, and the current utility of such markers in pharmacogenetics. Epidemiological risk assessment applies statistical models and equations built from current scientific knowledge of risk and disease. Several papers have reported that traditional RA tools have significant limitations for decision-making in individual management strategies, because their predictions of disease and disease progression are inaccurate. Recently, RA models have added information on the genetic susceptibility factors expected to be most responsible for differences in individual risk. On the continuum of health care, from diagnosis to treatment, pharmacogenetics has developed from accumulated knowledge of human genomic variation in drug distribution, metabolism, and target of action, and it has the potential to enable personalized medicine that avoids therapeutic failure and serious side effects. Many challenges remain for the application of genomic information in clinical settings. Current uses of genetic markers for managing drug therapy and issues in developing valid pharmacogenetic biomarkers are discussed.