• Title/Summary/Keyword: Cost Normalization


Cost Normalization Procedure for Phase-Based Performance Measurement

  • Choi, Jiyong;Yun, Sungmin;Oliveira, Daniel;Mulva, Stephen;Kang, Youngcheol
    • International conference on construction engineering and project management / 2015.10a / pp.72-76 / 2015
  • Capital project benchmarking requires an effective cost normalization process to compare the cost performance of projects executed at different times and locations. Existing cost normalization approaches were established on the assumption that all information required for cost normalization is fully identified once a project is completed. Cost normalization, however, is sometimes required to evaluate phase-level outcomes of an ongoing project, where the required information is not yet fully available. This paper provides a cost normalization procedure for phase-based performance assessment. The procedure consists of three normalization steps: currency conversion, location adjustment, and time adjustment, and it considers various scenarios in which the required information is not fully identified. The paper also presents how the cost normalization procedure has been applied to the 10-10 Performance Assessment Program, a phase-based performance assessment system developed by the Construction Industry Institute (CII). Both researchers and industry professionals can apply the cost normalization procedure to studies and practices related to cost estimation, feasibility analysis, and performance assessment.
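
A minimal Python sketch of the three normalization steps this abstract names (currency conversion, location adjustment, time adjustment), assuming hypothetical exchange-rate, location-factor, and cost-index values; it is not the actual CII 10-10 procedure or its published factors.

```python
def normalize_cost(cost, currency_rate, location_factor,
                   index_at_execution, index_at_base):
    """Normalize a reported project cost to a common currency, location, and time basis.

    currency_rate      -- base-currency units per unit of the reported currency (assumed)
    location_factor    -- relative cost level of the project location vs. the base location (assumed)
    index_at_execution -- published cost index for the period the cost was incurred (assumed)
    index_at_base      -- cost index for the chosen base period (assumed)
    """
    cost_in_base_currency = cost * currency_rate                          # step 1: currency conversion
    cost_at_base_location = cost_in_base_currency / location_factor       # step 2: location adjustment
    cost_at_base_time = cost_at_base_location * (index_at_base / index_at_execution)  # step 3: time adjustment
    return cost_at_base_time

# Example with made-up numbers: a cost reported in another currency,
# normalized to a base currency, base location, and base year.
print(normalize_cost(1_200_000, currency_rate=1.10, location_factor=1.25,
                     index_at_execution=102.0, index_at_base=105.0))
```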

Cost Normalization Framework for a Benchmarking System: A Case for Downstream and Chemical Construction Projects

  • Yin, Zhe;DeGezelle, Deborah;Pappas, Mike;Caldas, Carlos
    • International conference on construction engineering and project management / 2022.06a / pp.590-598 / 2022
  • Benchmarking is an important tool to assess the performance of capital projects in the construction industry. Incorporating cost-related metrics into a benchmarking system requires an effective cost normalization process to enable meaningful comparisons among projects executed at different locations and times. Projects in the downstream and chemicals sector have unique characteristics compared to other types of construction projects, so a distinctive cost normalization framework is needed to benchmark their absolute cost performance. The purpose of this study is to develop such a framework for benchmarking downstream and chemical projects in performance assessment. The research team started with a review of the cost normalization methodologies adopted in existing benchmarking systems and conducted 7 interviews to identify the cost normalization practices currently used by industry professionals. A panel of 12 experts was then convened and held 6 review sessions to develop the framework. The resulting cost normalization framework for benchmarking downstream and chemical projects is a three-step procedure that adopts a 4-element cost breakdown structure to accommodate projects submitted by both owners and contractors. It also incorporates 5 published cost indexes compatible with downstream and chemical projects, embedded into 2 options for completing the normalization process. The framework was pilot-tested on 4 completed projects to validate its practicality and the downstream and chemical use case in the benchmarking system.
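
For the index-based adjustment described here, a short sketch of applying separate published-index ratios to each element of a cost breakdown structure and summing the result. The element names and index values are placeholders, not the 4-element breakdown or the 5 indexes used in the paper.

```python
# Index-based normalization applied per cost-breakdown element (all values are made up).
breakdown = {                 # reported costs by cost-breakdown element
    "equipment":      4_000_000,
    "bulk_materials": 2_500_000,
    "labor":          3_200_000,
    "indirects":      1_300_000,
}
index_ratio = {               # base-period index / execution-period index, per element (assumed)
    "equipment":      1.04,
    "bulk_materials": 1.07,
    "labor":          1.03,
    "indirects":      1.02,
}

normalized_total = sum(cost * index_ratio[element] for element, cost in breakdown.items())
print(f"Normalized total cost: {normalized_total:,.0f}")
```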

A Model of Quality Function Deployment with Cost-Quality Tradeoffs (품질과 비용의 득실관계를 고려한 품질기능전개 모형)

  • 우태희;박재현
    • Proceedings of the Safety Management and Science Conference / 2002.05a / pp.227-230 / 2002
  • This paper presents an analytic method of quality function deployment (QFD) that maximizes customer satisfaction subject to the technical and economic constraints of process design. Wasserman's normalization method and the analytic hierarchy process (AHP) are used to determine the intensity of the relationship between customer requirements and process design attributes. The paper also formulates a cost-quality model of the tradeoff between quality and cost as a linear programming (LP) model with new constraints requiring that designated special processes be allocated first. The cost-quality function deployment of a piston ring is presented to illustrate the feasibility of the technique.
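
A small Python sketch of the normalization step commonly attributed to Wasserman (1993): the relationship matrix between customer requirements and design attributes is adjusted by the attribute correlation matrix and each row is rescaled to sum to one. The matrices below are made up, and whether this exact variant matches the paper's formulation is an assumption.

```python
import numpy as np

# R[i, j]: relationship strength between customer requirement i and design attribute j.
# T[j, k]: correlation between design attributes j and k (diagonal = 1). Values are illustrative.
R = np.array([[9.0, 3.0, 1.0],
              [1.0, 9.0, 3.0]])
T = np.array([[1.0, 0.5, 0.0],
              [0.5, 1.0, 0.3],
              [0.0, 0.3, 1.0]])

adjusted = R @ T                                          # spread each relationship over correlated attributes
R_norm = adjusted / adjusted.sum(axis=1, keepdims=True)   # normalize each row so it sums to 1
print(R_norm)
```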

A Model of Quality Function Deployment with Cost-Quality Tradeoffs (품질과 비용의 득실관계를 고려한 품질기능전개 모형)

  • 우태희;박재현
    • Journal of the Korea Safety Management & Science / v.4 no.2 / pp.169-178 / 2002
  • This paper presents an analytic method of quality function deployment (QFD) that maximizes customer satisfaction subject to the technical and economic constraints of process design. Wasserman's normalization method and the analytic hierarchy process (AHP) are used to determine the intensity of the relationship between customer requirements and process design attributes. The paper also formulates a cost-quality model of the tradeoff between quality and cost as a linear programming (LP) model with new constraints requiring that designated special processes be allocated first. The cost-quality function deployment of a piston ring is presented to illustrate the feasibility of the technique.

Calculation Model for Function & Cost Score based on Normalization Method in Design VE (정규화 기법 기반의 설계VE 기능 및 비용 점수 산출 모델)

  • Lee, Jongsik
    • Korean Journal of Construction Engineering and Management / v.16 no.4 / pp.98-106 / 2015
  • VE aims at budget reduction, improvement of function, structural safety, and quality assurance for public construction projects. However, reviews of structural safety and quality assurance can be insufficient because the related regulations focus mostly on analyzing the economic efficiency of a design. In addition, owing to the misconception that VE is merely a cost-saving methodology, alternatives are often proposed that still focus mainly on cost saving, with no objective evaluation of function relative to cost. To improve this, the government adopted life cycle cost reduction and value improvement proposals, requiring that the cost and function of the original plan and the alternative, and the change in value between them, be specified. Because no specific method is prescribed, however, this is done mainly for practical convenience rather than on a theoretical basis. The current method also sets different starting points because of the differing attributes of function and cost, and an evaluation standard that relates the two is an important element of rational decision making when assessing and choosing an alternative. This paper analyzes the process and method of function and cost scoring in VE and suggests a mathematical normalization model to support rational decision making when selecting the optimum plan.
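
One common way to put function and cost scores on a comparable basis in VE is share-based normalization, with the value index taken as the ratio of function share to cost share. The sketch below illustrates that idea with made-up alternatives; it is not necessarily the normalization model proposed in the paper.

```python
def normalize_by_sum(scores):
    """Normalize raw scores so they sum to 1 (share-based normalization)."""
    total = sum(scores)
    return [s / total for s in scores]

# Hypothetical design alternatives with raw function scores and estimated costs.
alternatives = ["original", "alternative A", "alternative B"]
function_raw = [72.0, 80.0, 77.0]        # e.g. weighted function evaluation scores
cost_raw     = [5.0e9, 4.6e9, 4.2e9]     # e.g. estimated life cycle costs

f_share = normalize_by_sum(function_raw)
c_share = normalize_by_sum(cost_raw)

for name, f, c in zip(alternatives, f_share, c_share):
    print(f"{name}: value index = {f / c:.3f}")   # value index = function share / cost share
```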

A Study of cost data modeling for Megaproject (메가프로젝트 원가 자료 분석에 관한 연구)

  • Ji, Seong-Min;Cho, Jae-Kyung;Hyun, Chang-Taek
    • Proceedings of the Korean Institute of Building Construction Conference / 2009.11a / pp.253-256 / 2009
  • For a megaproject involving various and complex facilities to succeed, a database system needs to be established. Developments in data collection, storage, and extraction technology have enabled iPMIS to manage large amounts of complex cost and schedule information. In particular, when considering the go/no-go decision in a feasibility study, cost is an important and clear criterion for a megaproject, so cost data modeling is the basis of the system and a necessary process. This research focuses on the structure and definition of CBS data collected from sites. Four tools are used to identify cause-and-effect relationships in the CBS data: Function Analysis from VE, the Causal Loop Diagram from System Dynamics, the Decision Tree from data mining, and Normalization from SQL. The resulting cost data model provides a helpful guideline for iPMIS.
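
As an illustration of the "Normalization from SQL" tool mentioned above, a sketch of a normalized CBS cost schema; the table and column names are hypothetical, and the paper's actual iPMIS/CBS structure is not reproduced here.

```python
import sqlite3

# Hypothetical normalized schema for collected CBS cost data.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE project (
    project_id    INTEGER PRIMARY KEY,
    name          TEXT NOT NULL,
    facility_type TEXT
);
CREATE TABLE cbs_item (
    cbs_code    TEXT PRIMARY KEY,        -- hierarchical CBS code, e.g. '03.02.01'
    description TEXT NOT NULL
);
CREATE TABLE cost_record (               -- one row per project per CBS item
    project_id INTEGER REFERENCES project(project_id),
    cbs_code   TEXT    REFERENCES cbs_item(cbs_code),
    amount     REAL    NOT NULL,         -- cost collected from the site
    PRIMARY KEY (project_id, cbs_code)
);
""")
con.close()
```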

PMSM sensorless control by back emf normalization (역기전력 정규화에 의한 PMSM의 센서리스 제어)

  • Lee Jung-Jun;Park Sung-Jun;Kim Cheul-U
    • Proceedings of the KIPE Conference / 2002.07a / pp.300-303 / 2002
  • With the growing use of servo motors in industrial and home applications, many papers on PMSM control have been published. Among them, sensorless control schemes are of particular interest from the viewpoint of cost reduction. In the conventional approach, the rotor position is generally estimated by integrating the estimated rotor speed. Because of the tight relationship between the back-EMF amplitude and the rotor position, it is difficult to find the two parameters at the same time with this method. To solve this problem, a novel sensorless control scheme is proposed. It utilizes back-EMF normalization, so it does not require the variables related to the back-EMF amplitude. The validity of the proposed control scheme is verified through experimental results.
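
A sketch of the back-EMF normalization idea: dividing the estimated stationary-frame back-EMF components by their magnitude removes the dependence on amplitude (speed and flux), leaving only the angle information. The model and numbers below are illustrative assumptions, not the paper's estimator.

```python
import math

def rotor_angle_from_backemf(e_alpha, e_beta):
    """Estimate the electrical rotor angle from stationary-frame back-EMF estimates.

    Assumes the usual surface-PMSM form e_alpha = -E*sin(theta), e_beta = E*cos(theta);
    normalizing by the magnitude E removes the speed/flux dependence.
    """
    magnitude = math.hypot(e_alpha, e_beta)
    if magnitude < 1e-9:                      # back-EMF too small (near standstill): angle undefined
        return None
    e_alpha_n = e_alpha / magnitude           # normalized back-EMF components
    e_beta_n = e_beta / magnitude
    return math.atan2(-e_alpha_n, e_beta_n)   # electrical angle in radians

# Synthetic back-EMF values for theta = 30 degrees and an arbitrary amplitude of 48 V.
theta = math.radians(30)
print(math.degrees(rotor_angle_from_backemf(-48 * math.sin(theta), 48 * math.cos(theta))))
```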

A Study on 3D Data Model Development by Normalizing and Method of its Effective Use - Focused on Building Interior Construction - (정규화를 통한 3차원 데이터 모델 구축 및 활용성 향상 방안 연구 -건축 마감 공사 중심으로 -)

  • Lee, Myoung-Hoon;Ham, Nam-Hyuk;Kim, Ju-Hyung;Kim, Jae-Jun
    • Journal of The Korean Digital Architecture Interior Association / v.10 no.3 / pp.11-18 / 2010
  • Cost estimation through fast and accurate quantity takeoff is crucial in the construction project process. Existing cost estimation methods are mainly based on 2D drawings, and the results tend to differ according to the estimator's experience, the quality and quantity of the information used, and the time available for estimation. To solve these problems, the domestic construction industry has recently tried to use data extracted from 3D models based on BIM (Building Information Modeling) to achieve more accurate and objective cost estimation. However, this dramatically increases the quantity of information available to estimators. To obtain high-quality data from a 3D model, the characteristics of the project should be reflected in the model, and it is most important to extract only the information needed for cost estimation from the whole model quickly and accurately. This study therefore proposes a 3D modeling method based on data normalization that maximizes the usability of 3D model data in the cost estimation process.

A Database Design without Storage Constraint Considering Denormalization in Relational Database (관계형 데이터베이스에서 저장용량에 제약이 없는 경우 비 정규화를 고려한 데이터베이스 설계)

  • 장영관;강맹규
    • Journal of Korean Society of Industrial and Systems Engineering / v.19 no.37 / pp.251-261 / 1996
  • Databases are critical to business information systems, and the RDBMS is the most widely used database system. Normalization was designed to control various anomalies (insert, update, and delete anomalies), but a normalized database design does not account for the tradeoffs necessary for performance. In this research, we model a database design problem without a storage constraint. Given a normalized database design, the model denormalizes by duplicating columns in order to reduce frequent join processing. We consider insert, update, delete, and storage costs, and the anomalies are treated as the additional disk I/O cost incurred by each insert and update transaction. We propose a branch and bound method and show considerable cost reduction.
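
A sketch of the kind of disk-I/O cost comparison that drives a column-duplication (denormalization) decision, using made-up transaction frequencies and per-operation costs; the paper's exact cost model and its branch and bound search are not reproduced here.

```python
def total_cost(duplicate_column, freq, io):
    """Estimated disk-I/O cost per unit time for one candidate design."""
    if duplicate_column:
        # Duplication removes the join from read queries, but every update of the
        # source column must also touch the duplicated copy (extra anomaly-handling I/O).
        read_cost   = freq["query"]  * io["single_table_read"]
        update_cost = freq["update"] * (io["update"] + io["propagate_duplicate"])
    else:
        read_cost   = freq["query"]  * io["join_read"]
        update_cost = freq["update"] * io["update"]
    insert_cost = freq["insert"] * io["insert"]
    return read_cost + update_cost + insert_cost

freq = {"query": 500, "update": 40, "insert": 10}              # transactions per hour (assumed)
io   = {"join_read": 12, "single_table_read": 4, "update": 3,
        "propagate_duplicate": 5, "insert": 2}                 # disk accesses per transaction (assumed)

for choice in (False, True):
    print(f"duplicate column = {choice}: cost = {total_cost(choice, freq, io)}")
```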

Step-size Normalization of Information Theoretic Learning Methods based on Random Symbols (랜덤 심볼에 기반한 정보이론적 학습법의 스텝 사이즈 정규화)

  • Kim, Namyong
    • Journal of Internet Computing and Services / v.21 no.2 / pp.49-55 / 2020
  • Information theoretic learning (ITL) methods based on random symbols (RS) use a set of random symbols generated according to a target distribution and are designed nonparametrically to minimize a cost function based on the Euclidean distance between the target distribution and the input distribution. One drawback of this learning method is that, by employing a constant step size for the algorithm update, it cannot utilize the input power statistics. In this paper it is shown that, first, the information potential input (IPI) plays the role of the input in the part of the cost-function derivative related to the information potential output (IPO) and, second, the input itself plays that role in the part related to the information potential error (IPE). Based on these observations, it is proposed to normalize the step size by the statistically varying power of these two different inputs, the IPI and the input itself. In a communication environment with impulsive noise and multipath fading, the proposed algorithm achieves a mean squared error (MSE) about 4 dB lower and converges about twice as fast as the conventional methods without step-size normalization.
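
A sketch of step-size normalization by a running input-power estimate, in the spirit of what this abstract describes; the ITL cost function and the separate IPI/IPO/IPE terms are not implemented, and all names and values are illustrative.

```python
import numpy as np

def normalized_step_update(w, x, grad, power_state, mu=0.01, beta=0.9, eps=1e-8):
    """One weight update with a step size normalized by a running input-power estimate.

    power_state is the previous running estimate of E[|x|^2]; beta is the forgetting factor.
    """
    power_state = beta * power_state + (1.0 - beta) * float(np.dot(x, x))  # track input power
    w = w - (mu / (power_state + eps)) * grad                              # power-normalized step
    return w, power_state

# Toy usage with random data standing in for equalizer inputs and a gradient.
rng = np.random.default_rng(0)
w, power = np.zeros(4), 0.0
for _ in range(100):
    x = rng.normal(size=4)
    grad = x * (np.dot(w, x) - 1.0)    # placeholder gradient of a simple quadratic cost
    w, power = normalized_step_update(w, x, grad, power)
print(w)
```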