• Title/Summary/Keyword: Challenge Models


K-1 Tank Life Cycle Cost Estimate Using PRICE Model (PRICE 모델을 이용한 K1전차 수명주기 비용추정)

  • 강창호;강성진
    • Journal of the military operations research society of Korea
    • /
    • v.25 no.2
    • /
    • pp.44-61
    • /
    • 1999
  • Cost estimation has posed a significant challenge to estimators, planners, and managers in both government and the military. Considerable historical evidence shows that accurate cost estimation has been difficult to achieve across a wide range of projects, including weapon systems. This paper introduces a new cost estimating concept, CAIV (Cost As an Independent Variable), and presents a cost estimating case study for the K1 tank using the PRICE model, a computer-aided parametric estimating (CAPE) model. The CAIV concept is to set realistic but aggressive cost objectives early in each acquisition program and to achieve cost, schedule, and performance objectives while the project manager and industry teams manage the associated risks. The PRICE model is one of several computer-aided cost estimating models and is widely used in U.S. defense system analysis as a tool for CAIV. We analyze the theory, inputs, and outputs of the PRICE model and present a case study that estimates K1 tank costs in the requirements and concept phase, the programming and budgeting phase, and the life cycle phase. Finally, we find that the PRICE model can be used in various phases of PPBEES, depending upon the available data and time.

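The PRICE model itself is proprietary, so the sketch below only illustrates the general parametric idea behind such tools: cost is estimated from a handful of physical and programmatic inputs through a cost estimating relationship (CER). The power-law form, every coefficient, and the input names (weight_kg, complexity, quantity) are illustrative assumptions, not the PRICE equations or actual K1 data.

```python
import math

# Toy parametric cost estimating relationship (CER), loosely in the spirit of
# computer-aided parametric estimating tools. The functional form and every
# coefficient below are illustrative assumptions, not the proprietary PRICE equations.

def unit_production_cost(weight_kg: float, complexity: float,
                         a: float = 1500.0, b: float = 0.85, c: float = 1.3) -> float:
    """Cost of one unit as a power law in weight, scaled by a complexity factor."""
    return a * (weight_kg ** b) * (complexity ** c)

def program_cost(weight_kg: float, complexity: float, quantity: int,
                 learning_rate: float = 0.95) -> float:
    """Sum unit costs over a production run with a simple learning curve."""
    slope = math.log(learning_rate, 2)            # negative: later units cost less
    first_unit = unit_production_cost(weight_kg, complexity)
    return sum(first_unit * (n ** slope) for n in range(1, quantity + 1))

if __name__ == "__main__":
    # Hypothetical inputs for illustration only (not actual K1 tank data).
    acquisition = program_cost(weight_kg=51_000, complexity=1.2, quantity=100)
    operating_support = 1.8 * acquisition          # assumed O&S-to-acquisition ratio
    print(f"Notional life cycle cost estimate: {acquisition + operating_support:,.0f}")
```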

Dynamic Heterogeneity in Spin Facilitated Model of Supercooled Liquid: Crossover from Fragile to Strong Liquid Behavior

  • Choi, Seo Woo;Kim, Soree;Jung, YounJoon
    • Proceeding of EDISON Challenge
    • /
    • 2014.03a
    • /
    • pp.183-195
    • /
    • 2014
  • Kinetically constrained models (KCMs) have attracted interest as models that assign a dynamic origin to the interesting dynamical properties of supercooled liquids. Signatures of dynamic heterogeneity were investigated, using Monte Carlo techniques, in a crossover model that linearly interpolates between the FA-like symmetric constraint and the East model constraint through an asymmetry parameter b. When the asymmetry parameter was decreased sufficiently, a smooth fragile-to-strong dynamic transition was observed in the relaxation time, diffusion constant, Stokes-Einstein violation, and dynamic length scale. Competition between the energetically favored symmetric relaxation mechanism and the entropically favored asymmetric relaxation mechanism underlies this transition.

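A minimal Monte Carlo sketch of the kind of crossover model the abstract describes is given below, assuming a 1D spin-facilitated chain in which the left neighbour facilitates with weight 1 and the right neighbour with weight b, so that b = 1 recovers FA-like symmetric facilitation and b = 0 the East model. The constraint convention, system size, temperature, and the persistence-based observable are assumptions for illustration; the paper's exact model and measurements may differ.

```python
import numpy as np

def run_crossover_kcm(L=100, T=0.5, b=0.5, sweeps=2000, seed=0):
    """Monte Carlo dynamics of a 1D spin-facilitated model that interpolates between
    FA-like symmetric facilitation (b = 1) and the East model (b = 0)."""
    rng = np.random.default_rng(seed)
    c = 1.0 / (1.0 + np.exp(1.0 / T))        # equilibrium concentration of excitations
    n = (rng.random(L) < c).astype(int)      # 1 = excited (mobile), 0 = unexcited
    persistence = np.ones(L)                 # 1 until site i flips for the first time
    for _ in range(sweeps):
        for _ in range(L):
            i = rng.integers(L)
            left, right = n[(i - 1) % L], n[(i + 1) % L]
            # left neighbour facilitates with weight 1, right neighbour with weight b
            if rng.random() >= min(1.0, left + b * right):
                continue                      # move not facilitated: reject
            # detailed balance: excite with probability c, de-excite with 1 - c
            if n[i] == 0 and rng.random() < c:
                n[i], persistence[i] = 1, 0.0
            elif n[i] == 1 and rng.random() < 1.0 - c:
                n[i], persistence[i] = 0, 0.0
    return persistence.mean()                 # crude proxy for how slow relaxation is

if __name__ == "__main__":
    for b in (1.0, 0.5, 0.1):                 # relaxation slows toward the East limit
        print(f"b = {b}: fraction of never-flipped sites = {run_crossover_kcm(b=b):.3f}")
```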

A Study on the Reinforcement Effect of a Specific Part of a Bicycle Frame and the Stress and Deformation of the Frame (자전거 프레임 특정부분의 보강효과와 프레임에 미치는 응력과 변형 연구)

  • Kim, Tae-Hun;Yang, Dong-Min;Ha, Yun-Su
    • Proceeding of EDISON Challenge
    • /
    • 2015.03a
    • /
    • pp.207-211
    • /
    • 2015
  • In this paper, two bicycle frame models are simulated with static structural analysis. A diamond-type bike frame is compared with a model in which the down tube is removed from the original diamond frame. Both models have the same material properties and the same constraint and load conditions. This study shows the reinforcement effect of adding a down tube and the effects of applying a load to the frame, including weak points, high-stress regions, and the expected deformation. The structural results indicate that adding the down tube decreases the equivalent stress and total deformation by 57.1% and 36.4%, respectively. Stress concentrations occur at the leg-connecting parts and the regions where the front and rear wheels are fixed, and the maximum deformation occurs at the seat tube. In conclusion, the down tube is a highly effective reinforcement, and the frame with a down tube clearly outperforms the one without it. Furthermore, safety improves when the top tube thickness is reduced and reinforcement is added at the leg-connecting parts or stress concentration regions.


The Vertical Corporate Campus: Integrating Modern Workplace Models into the High-Rise Typology

  • Britton, John;Hargis, Steve
    • International Journal of High-Rise Buildings
    • /
    • v.5 no.2
    • /
    • pp.127-136
    • /
    • 2016
  • As the great urban migration continues to drive the growth of cities worldwide, global companies are seeking new approaches to the urban workplace and corporate campus. In light of environmental and economic imperatives to develop taller and denser central business districts, a key challenge is merging contemporary workplace concepts, which emphasize large, open floors and high levels of connectivity, with high-rise typologies whose smaller floor plates are set around central cores. This paper traces the evolution of the corporate campus and emerging design strategies for translating contemporary workplace models into a vertical campus typology that allows companies to realize the benefits of urban locations while contributing to a more sustainable future.

The Optimal Maintenance Strategy of a Rail Bridge by Using Life Cycle Cost (생애주기 비용을 이용한 철도교량의 최적유지관리)

  • Yang Seung-le
    • Journal of the Korean Society for Railway
    • /
    • v.8 no.6 s.31
    • /
    • pp.544-549
    • /
    • 2005
  • Nowadays, most bridge networks are complete or close to completion. The biggest challenge that railroad/highway agencies and departments of transportation face is maintaining these networks, keeping them safe and serviceable with limited funds. To maintain bridges effectively, there is an urgent need to predict their remaining life from a system reliability viewpoint, and it is necessary to develop maintenance models based on the system reliability concept. In this paper, maintenance models are developed for preventive maintenance and essential maintenance by using system reliability and lifetime distributions. The proposed model is applied to an existing railroad bridge, and the optimal maintenance strategy for this bridge is obtained in terms of service life extension and cumulative maintenance cost.
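As a rough illustration of maintenance modelling with lifetime distributions, the sketch below evaluates a classic age-replacement policy under an assumed Weibull lifetime: the component is renewed preventively at age tau, or correctively at failure, whichever comes first. The distribution, its parameters, and the cost figures are illustrative assumptions, not the paper's calibrated bridge model.

```python
import numpy as np

def weibull_reliability(t, shape=2.5, scale=60.0):
    """R(t) for a Weibull lifetime (scale = characteristic life in years)."""
    return np.exp(-(t / scale) ** shape)

def age_replacement_cost_rate(tau, c_preventive=1.0, c_failure=8.0,
                              shape=2.5, scale=60.0, n_grid=2000):
    """Long-run expected cost per year under a classic age-replacement policy:
    renew preventively at age tau, or correctively at failure, whichever comes first."""
    t = np.linspace(0.0, tau, n_grid)
    R = weibull_reliability(t, shape, scale)
    dt = t[1] - t[0]
    expected_cycle_length = float(np.sum(0.5 * (R[:-1] + R[1:])) * dt)  # E[min(T, tau)]
    expected_cycle_cost = c_preventive * R[-1] + c_failure * (1.0 - R[-1])
    return expected_cycle_cost / expected_cycle_length

if __name__ == "__main__":
    intervals = np.arange(10, 101, 5)                    # candidate intervals (years)
    rates = [age_replacement_cost_rate(tau) for tau in intervals]
    best = intervals[int(np.argmin(rates))]
    print(f"Lowest long-run cost rate at a preventive interval of about {best} years")
```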

Parallel Dense Merging Network with Dilated Convolutions for Semantic Segmentation of Sports Movement Scene

  • Huang, Dongya;Zhang, Li
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.16 no.11
    • /
    • pp.3493-3506
    • /
    • 2022
  • In the field of scene segmentation, precise segmentation of object boundaries in sports movement scene images is a great challenge. The geometric and spatial information of the image is very important, but in many models it is easily lost, which significantly degrades model performance. To alleviate this problem, a parallel dense dilated convolution merging network (termed PDDCM-Net) is proposed. The proposed PDDCM-Net consists of a feature extractor, parallel dilated convolutions, and dense dilated convolutions merged with different dilation rates. We utilize combinations of dilated convolutions that expand the receptive field of the model with fewer parameters than other advanced methods. Importantly, PDDCM-Net fuses both low-level and high-level information, alleviating the problems of accurately segmenting object edges and localizing objects. Experimental results validate that the proposed PDDCM-Net achieves a large improvement over several representative models on the COCO-Stuff dataset.
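The published PDDCM-Net configuration is not reproduced here, but its core ingredient, parallel dilated convolutions with different dilation rates whose outputs are merged, can be sketched as below using PyTorch. The channel counts, dilation rates, and fusion by concatenation plus a 1x1 convolution are assumptions for illustration.

```python
import torch
import torch.nn as nn

class ParallelDilatedBlock(nn.Module):
    """Parallel 3x3 convolutions with different dilation rates, merged by
    concatenation and a 1x1 convolution. Channel counts and rates are
    illustrative assumptions, not the published PDDCM-Net configuration."""
    def __init__(self, in_ch=256, branch_ch=64, rates=(1, 2, 4, 8)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, branch_ch, kernel_size=3,
                          padding=r, dilation=r, bias=False),
                nn.BatchNorm2d(branch_ch),
                nn.ReLU(inplace=True),
            )
            for r in rates
        ])
        self.merge = nn.Conv2d(branch_ch * len(rates), in_ch, kernel_size=1)

    def forward(self, x):
        # Each branch sees a different receptive field; merging keeps them all.
        return self.merge(torch.cat([b(x) for b in self.branches], dim=1))

if __name__ == "__main__":
    feats = torch.randn(1, 256, 64, 64)          # e.g. a backbone feature map
    print(ParallelDilatedBlock()(feats).shape)   # torch.Size([1, 256, 64, 64])
```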

Motion classification using distributional features of 3D skeleton data

  • Woohyun Kim;Daeun Kim;Kyoung Shin Park;Sungim Lee
    • Communications for Statistical Applications and Methods
    • /
    • v.30 no.6
    • /
    • pp.551-560
    • /
    • 2023
  • Recently, there has been significant research into the recognition of human activities using three-dimensional sequential skeleton data captured by the Kinect depth sensor, much of it employing deep learning models. This study introduces a novel feature selection method for such data and analyzes it using machine learning models. Due to the high-dimensional nature of the original Kinect data, effective feature extraction methods are required to address the classification challenge. We propose using the first four moments as predictors to represent the distribution of each joint sequence and evaluate their effectiveness on two datasets: the exergame dataset, consisting of three activities, and the MSR daily activity dataset, composed of ten activities. The results show that, on average across different classifiers, the accuracy of our approach exceeds that of existing methods.
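The feature construction described in the abstract, summarizing each joint coordinate sequence by its first four moments and feeding the result to standard classifiers, might look roughly like the sketch below. The array layout (frames x joints x 3), the moment definitions used, and the choice of a random forest classifier are assumptions for illustration.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.ensemble import RandomForestClassifier

def moment_features(skeleton_seq: np.ndarray) -> np.ndarray:
    """skeleton_seq: array of shape (n_frames, n_joints, 3).
    Returns the first four moments of every joint coordinate over time,
    i.e. a vector of length n_joints * 3 * 4."""
    flat = skeleton_seq.reshape(skeleton_seq.shape[0], -1)   # (frames, joints*3)
    return np.concatenate([
        flat.mean(axis=0),
        flat.var(axis=0),
        skew(flat, axis=0),
        kurtosis(flat, axis=0),
    ])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical toy data: 60 clips, 100 frames, 20 joints, 3 coordinates, 3 classes.
    X = np.stack([moment_features(rng.normal(size=(100, 20, 3))) for _ in range(60)])
    y = rng.integers(0, 3, size=60)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    print("training accuracy:", clf.score(X, y))
```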

P-Triple Barrier Labeling: Unifying Pair Trading Strategies and Triple Barrier Labeling Through Genetic Algorithm Optimization

  • Ning Fu;Suntae Kim
    • International journal of advanced smart convergence
    • /
    • v.12 no.4
    • /
    • pp.111-118
    • /
    • 2023
  • In the ever-changing landscape of finance, the fusion of artificial intelligence (AI) and pair trading strategies has captured the interest of investors and institutions alike. In the context of supervised machine learning, crafting precise and accurate labels is crucial if AI models are to surpass traditional pair trading methods. However, prevailing labeling techniques in the financial sector predominantly concentrate on individual assets, which makes them difficult to align with pair trading strategies. To address this issue, we propose an approach that melds the triple barrier labeling technique with pair trading and optimizes the resulting labels through genetic algorithms. Rigorous backtesting on cryptocurrency datasets illustrates that the proposed labeling method outperforms traditional pair trading methods and corresponding buy-and-hold strategies in both profitability and risk control. This method offers a novel perspective on trading strategies and risk management within the financial domain, laying a robust groundwork for further enhancing the precision and reliability of pair trading strategies that utilize AI models.
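Triple barrier labeling itself (an upper barrier, a lower barrier, and a time limit) is a standard technique; the sketch below applies it to a pair's spread rather than to a single asset, in the spirit of the abstract. The spread definition, barrier widths, and horizon are assumptions that, in the paper, would be tuned by a genetic algorithm; this is not the authors' exact formulation.

```python
import numpy as np
import pandas as pd

def pair_spread(price_a: pd.Series, price_b: pd.Series, hedge_ratio: float) -> pd.Series:
    """Log-price spread of a pair; the hedge ratio would normally be estimated."""
    return np.log(price_a) - hedge_ratio * np.log(price_b)

def triple_barrier_labels(spread: pd.Series, upper: float, lower: float,
                          horizon: int) -> pd.Series:
    """Label each time step by which barrier the spread change hits first:
    +1 for the upper barrier, -1 for the lower, 0 if the time barrier expires first."""
    labels = pd.Series(0, index=spread.index, dtype=int)
    values = spread.to_numpy()
    for i in range(len(values) - 1):
        window = values[i + 1 : i + 1 + horizon] - values[i]
        hit_up = np.argmax(window >= upper) if (window >= upper).any() else np.inf
        hit_dn = np.argmax(window <= -lower) if (window <= -lower).any() else np.inf
        if hit_up < hit_dn:
            labels.iloc[i] = 1
        elif hit_dn < hit_up:
            labels.iloc[i] = -1
    return labels

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    idx = pd.RangeIndex(500)
    a = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.01, 500))), index=idx)
    b = pd.Series(50 * np.exp(np.cumsum(rng.normal(0, 0.01, 500))), index=idx)
    s = pair_spread(a, b, hedge_ratio=1.0)
    # Barrier widths and horizon are the parameters a genetic algorithm would tune.
    print(triple_barrier_labels(s, upper=0.02, lower=0.02, horizon=20).value_counts())
```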

A Fuzzy Logic Based Software Development Cost Estimation Model with Improved Accuracy

  • Shrabani Mallick;Dharmender Singh Kushwaha
    • International Journal of Computer Science & Network Security
    • /
    • v.24 no.6
    • /
    • pp.17-22
    • /
    • 2024
  • Software cost and schedule estimation is usually based on the estimated size of the software. Advanced estimation techniques also make use of diverse factors such as the nature of the project, available staff skills, time constraints, performance constraints, the required technology, and so on. Usually, estimation is based on a model prepared with the help of experienced project managers. Software cost estimation is a crucial activity, as it involves large economic and strategic investments. However, accurate estimation remains a challenge because the algorithmic models used for software project planning and estimation do not address the truly dynamic nature of software development. This paper presents an efficient approach that augments the Constructive Cost Model (COCOMO) with fuzzy logic to address the uncertainty and flexibility associated with the cost drivers (effort multiplier factors). The approach has been validated and interpreted by project experts and shows convincing results compared with simple algorithmic models.
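The basic COCOMO relationship, Effort = a * KLOC^b * EAF, is public; the sketch below softens a single effort multiplier with a fuzzy (membership-weighted) rating instead of a crisp level. The multiplier table, membership values, and defuzzification by weighted average are illustrative assumptions, not the paper's calibrated model.

```python
# Basic COCOMO (organic mode) constants: Effort (person-months) = A * KLOC**B * EAF
A, B = 2.4, 1.05

# Illustrative effort-multiplier table for one cost driver (product complexity).
# Values follow the published COCOMO scale but are used here only as a demo.
CPLX = {"low": 0.85, "nominal": 1.00, "high": 1.15}

def fuzzy_effort_multiplier(memberships: dict) -> float:
    """Defuzzify a fuzzy rating (membership in each linguistic level) into a single
    effort multiplier using a membership-weighted average of the table values."""
    total = sum(memberships.values())
    return sum(CPLX[level] * m for level, m in memberships.items()) / total

def cocomo_effort(kloc: float, eaf: float) -> float:
    """Basic COCOMO effort in person-months for a project of `kloc` thousand lines."""
    return A * (kloc ** B) * eaf

if __name__ == "__main__":
    # A project judged "mostly nominal, somewhat high" complexity instead of a hard label.
    eaf = fuzzy_effort_multiplier({"nominal": 0.7, "high": 0.3})
    print(f"EAF = {eaf:.3f}, effort = {cocomo_effort(32, eaf):.1f} person-months")
```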

Fine-Tuning Strategies for Weather Condition Shifts: A Comparative Analysis of Models Trained on Synthetic and Real Datasets

  • Jungwoo Kim;Min Jung Lee;Suha Kwak
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2024.05a
    • /
    • pp.794-797
    • /
    • 2024
  • Despite advancements in deep learning, existing semantic segmentation models exhibit suboptimal performance under adverse weather conditions, such as fog or rain, whereas they perform well in clear weather conditions. To address this issue, much of the research has focused on making image or feature-level representations weather-independent. However, disentangling the style and content of images remains a challenge. In this work, we propose a novel fine-tuning method, 'freeze-n-update.' We identify a subset of model parameters that are weather-independent and demonstrate that by freezing these parameters and fine-tuning others, segmentation performance can be significantly improved. Experiments on a test dataset confirm both the effectiveness and practicality of our approach.
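Mechanically, a 'freeze-n-update' style of fine-tuning reduces to toggling requires_grad on a chosen subset of parameters and handing only the remaining parameters to the optimizer. The sketch below uses PyTorch and assumes, purely for illustration, that the frozen subset is the backbone; identifying which parameters are actually weather-independent is the paper's contribution and is not reproduced here.

```python
import torch
from torch import nn, optim

def freeze_and_finetune(model: nn.Module, frozen_prefixes=("backbone.",), lr=1e-4):
    """Freeze every parameter whose name starts with one of `frozen_prefixes`
    and return an optimizer over the remaining (trainable) parameters."""
    trainable = []
    for name, param in model.named_parameters():
        if name.startswith(frozen_prefixes):
            param.requires_grad = False       # the frozen subset stays fixed
        else:
            param.requires_grad = True        # the rest is updated during fine-tuning
            trainable.append(param)
    return optim.Adam(trainable, lr=lr)

if __name__ == "__main__":
    # Toy stand-in for a segmentation network: a "backbone" and a "head".
    model = nn.Sequential()
    model.add_module("backbone", nn.Conv2d(3, 8, 3, padding=1))
    model.add_module("head", nn.Conv2d(8, 2, 1))
    optimizer = freeze_and_finetune(model)
    x, target = torch.randn(2, 3, 32, 32), torch.randint(0, 2, (2, 32, 32))
    loss = nn.CrossEntropyLoss()(model(x), target)
    loss.backward()
    optimizer.step()                          # only head parameters are updated
    print(sum(p.requires_grad for p in model.parameters()), "trainable tensors")
```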