• Title/Abstract/Keywords: collection cost

Search results: 507 items (processing time: 0.029 sec)

메가프로젝트 원가 자료 분석에 관한 연구 (A Study of cost data modeling for Megaproject)

  • 지성민;조재경;현창택
    • 한국건축시공학회:학술대회논문집 / 한국건축시공학회 2009년도 추계 학술논문 발표대회 / pp.253-256 / 2009
  • For a megaproject comprising various complex facilities to succeed, a database system needs to be established. Developments in data collection, storage, and extraction technology have enabled iPMIS to manage diverse and complex information about cost and time. In particular, considering the go/no-go decision at the feasibility stage, cost is an important and clear criterion in a megaproject. Thus, cost data modeling is the basis of the system and a necessary process. This research focuses on the structure and definition of the CBS data collected from sites. We used four tools (Function Analysis in VE, Causal Loop Diagrams in System Dynamics, Decision Trees in data mining, and normalization in SQL) to identify cause-and-effect relationships in the CBS data. Cost data modeling provides iPMIS with a helpful guideline.
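
A minimal sketch of the SQL normalization step mentioned above, using Python's sqlite3. The CBS codes, table names, and cost figures are hypothetical placeholders; the paper does not publish its schema, so this only illustrates splitting repeated facility attributes out of the per-item cost rows.

```python
import sqlite3

# Hypothetical, simplified CBS schema: the paper's actual data model is not published.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized site records: facility attributes repeat on every cost row.
raw_rows = [
    ("Site-A", "Residential", "C1010", "Foundations", 1_200_000),
    ("Site-A", "Residential", "C2020", "Exterior walls", 850_000),
    ("Site-B", "Commercial",  "C1010", "Foundations", 2_400_000),
]

# Normalized tables: facility data stored once, cost rows reference it by key.
cur.executescript("""
    CREATE TABLE facility (facility_id TEXT PRIMARY KEY, facility_type TEXT);
    CREATE TABLE cbs_cost (
        facility_id TEXT REFERENCES facility(facility_id),
        cbs_code TEXT, cbs_name TEXT, cost INTEGER
    );
""")

for fac_id, fac_type, code, name, cost in raw_rows:
    cur.execute("INSERT OR IGNORE INTO facility VALUES (?, ?)", (fac_id, fac_type))
    cur.execute("INSERT INTO cbs_cost VALUES (?, ?, ?, ?)", (fac_id, code, name, cost))

# Aggregate cost per CBS code across facilities.
for row in cur.execute("SELECT cbs_code, SUM(cost) FROM cbs_cost GROUP BY cbs_code"):
    print(row)
```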


초소형위성 비용분석 사례연구를 통한 비용분석 업무발전 방향에 대한 고찰 (A Study on Work Development Direction of Cost Analysis through Cost Analysis of Micro Satellite)

  • 이태화
    • 품질경영학회지 / Vol. 51, No. 3 / pp.461-479 / 2023
  • Purpose: This study emphasizes the importance of cost analysis for weapon systems that require enormous development costs, analyzes the problems at each cost analysis step from a practical point of view, and presents directions for work development in terms of cost analysis reliability, timeliness, and efficiency. Methods: The R&D cost of a micro satellite with a complex cost structure and large scale is analyzed according to engineering estimation procedures, major problems at each analysis step are derived, and work development directions are presented. Results: Problems with standards and assumptions, data collection, the cost breakdown structure, and cost estimation methods were identified through the micro satellite cost analysis process, and work development directions such as expanding common standards, standardizing basic data, standardizing cost breakdown structures and cost items, and turning data into assets were presented. Conclusion: To develop the work in terms of cost analysis reliability, timeliness, and efficiency, it is important to prepare and standardize standards and rules for the detailed tasks at each analysis stage; through this, systematic cost data with high utilization value are expected to become an asset in the future.
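
As background for the engineering (bottom-up) estimation procedure named in Methods, here is a minimal sketch that rolls work-breakdown items up to a total. Every item, rate, and hour figure is invented for illustration; none of it comes from the paper.

```python
# Illustrative bottom-up (engineering) cost estimate: each work-breakdown item
# is estimated from labor hours, a labor rate, and material cost, then rolled
# up to a system total. All numbers are invented for illustration.
wbs_items = [
    {"item": "Bus structure",  "hours": 1200, "rate": 90.0,  "material": 250_000},
    {"item": "Payload camera", "hours": 3000, "rate": 110.0, "material": 900_000},
    {"item": "Ground S/W",     "hours": 2500, "rate": 95.0,  "material": 0},
]

def item_cost(item, overhead_rate=0.25):
    # Direct labor plus a flat overhead factor, plus material cost.
    labor = item["hours"] * item["rate"]
    return labor * (1 + overhead_rate) + item["material"]

total = sum(item_cost(i) for i in wbs_items)
for i in wbs_items:
    print(f'{i["item"]:<16} {item_cost(i):>12,.0f}')
print(f'{"Total":<16} {total:>12,.0f}')
```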

종합대학 도서관장서의 적정량기준 설정에 관한 고찰 -미국의 종합대학도서관기준을 중심으로- (Problems in Quantification of Adequacy of Academic Library Collections -Critical Analysis of Standards for Academic Libraries in the U.S.-)

  • 정용선
    • 한국문헌정보학회지 / Vol. 8 / pp.183-207 / 1981
  • Library standards have been the source of considerable controversy, and many problems are involved in developing standards for university library collections. For evaluation purposes, standards should be precise, quantifiable, and measurable. In the United States, however, standards for academic libraries are limited to qualitative statements and principles. Quantitative standards, when given, are usually related to the size of the population served by the institution, or the prescribed quantitative objectives are arbitrarily formulated by value judgments. This paper attempts to explain the problems involved in developing quantitative standards for academic library collections. Two problems facing the formulation of the optimal collection size are identified. One is the theoretically faulty concept of adequacy of collection given the diversity of university libraries, and the other is the difficulty of quantification and measurement, along with the lack of a concept of adequacy of collection. However, quantification of adequate collection size has proved useful on the practical level, even though it is not theoretically valid. ACRL, Clapp/Jordan, and Voigt developed formulas or models for setting the optimal size of a library collection for any particular university library. The main purpose of this study is the analysis of these formulas. The ACRL standard was drawn from observation and analysis of statistics from leading library collections. In the academic field, this judgment appears to have been based on the assumption that a high-grade institution is apt to have a good library collection. This study criticizes the ACRL standard for its failure to include some determinants of measurement and points out the standard's limitations. In contrast, Clapp/Jordan developed a formula rather more scientifically, based upon bibliographical sources. It is similarly empirical but has the advantage of bringing into play the elements that make universities diverse in nature. Both the ACRL and Clapp/Jordan formulas share two major defects: (1) the specific subject needs of the collection are not indicated directly, and (2) the percentage rate of growth is used as an indicator of the potential utility of a collection. Thus both formulas fail to provide a basis for meaningful evaluation. Voigt further developed a model for determining acquisition rates for currently published materials based on bibliographic technique. The Voigt model encourages experimentation with different programs and different allocations of input resources, designed to meet the needs of the library's particular population. Standards for university library collections can be formulated in terms of input (the traditional indicator) or, additionally, in terms of output (cost-effectiveness). Cost-effectiveness is expressed as user satisfaction, that is, the ability to provide wanted materials within a reasonable time period. Thus a simple quantitative method neither covers all the diverse situations of university library collections nor measures the effectiveness of collections. A valid standard cannot be established without further research.
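
The Clapp/Jordan approach discussed above is essentially a linear formula that adds volumes per faculty member, per student, and per graduate field to a fixed base collection. A minimal sketch follows; the coefficients are illustrative placeholders, not the published Clapp/Jordan values.

```python
def estimated_collection_size(faculty, undergraduates, graduate_fields,
                              base=50_000, per_faculty=100,
                              per_undergraduate=12, per_graduate_field=300):
    """Clapp/Jordan-style linear estimate of an adequate collection size.

    All coefficients are illustrative placeholders, not the published values:
    the model simply scales a fixed base collection by the institutional
    characteristics that make universities diverse.
    """
    return (base
            + per_faculty * faculty
            + per_undergraduate * undergraduates
            + per_graduate_field * graduate_fields)

# Example: a mid-sized university with 400 faculty, 10,000 undergraduates,
# and 25 graduate fields.
print(estimated_collection_size(400, 10_000, 25))  # -> 217500 volumes
```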


A Light-weighted Data Collection Method for DNS Simulation on the Cyber Range

  • Li, Shuang;Du, Shasha;Huang, Wenfeng;Liang, Siyu;Deng, Jinxi;Wang, Le;Huang, Huiwu;Liao, Xinhai;Su, Shen
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 14, No. 8 / pp.3501-3518 / 2020
  • The method of DNS data collection is one of the most important parts of DNS simulation. DNS data contain a lot of information, but when analyzing DNS security issues through simulation on a cyber range with customized features, we need only some of it, such as IP addresses and domain name information. Therefore, the data we need should be light-weight and easy to manipulate. Many researchers have designed schemes to obtain their datasets, such as LDplayer and the Thales system. However, existing solutions consume excessive computational resources, which is not necessary for DNS security simulation. In this paper, we propose a light-weight active data collection method to prepare datasets for DNS simulation on a cyber range. We evaluate the performance of the method and show that it can collect DNS data in a short time and store the collected data at a lower storage cost. In addition, we give two examples to illustrate how our method can be used in a variety of applications.
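
To illustrate the kind of light-weight active collection the paper argues for (this is not the authors' implementation), the following sketch resolves a small domain list with the Python standard library and keeps only the domain-to-IP mapping:

```python
import csv
import socket

# Hypothetical domain list; in practice this would come from a zone file or a
# crawl seed list. Only the fields needed for simulation are kept.
domains = ["example.com", "example.org", "example.net"]

with open("dns_dataset.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["domain", "ip_address"])
    for domain in domains:
        try:
            # Active collection: resolve the name and keep only the IPv4 addresses.
            infos = socket.getaddrinfo(domain, None, family=socket.AF_INET)
            for ip in sorted({info[4][0] for info in infos}):
                writer.writerow([domain, ip])
        except socket.gaierror:
            # Unresolvable names are skipped rather than stored.
            continue
```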

연산 특성을 고려한 향상된 적응적 가비지 컬렉션 정책 (An Advanced Adaptive Garbage Collection Policy by Considering the Operation Characteristics)

  • 박송화;이정훈;이원오;김현우
    • 대한임베디드공학회논문지 / Vol. 13, No. 5 / pp.269-277 / 2018
  • NAND flash memory has been widely used because of its non-volatility, low power consumption, and fast access time. However, it does not allow in-place updates, and the number of erase cycles per block is limited. The unit of read/write operations is the page, the unit of erase operations is the block, and the erase operation is slower than the other operations. We previously proposed the Adaptive Garbage Collection (AGC) policy, which focuses not only on reducing garbage collection time for real-time guarantees but also on wear-leveling for flash memory lifetime. AGC performs better than the cost-benefit and greedy policies, but it does not consider the operation characteristics. We therefore propose the Advanced Adaptive Garbage Collection (A-AGC) policy, which considers the page write count and block erase count. A-AGC reduces write operations by considering data update frequency and update data size, and reduces erase operations by considering file fragmentation. We implemented the A-AGC policy and measured its performance against the AGC policy. Simulation results show that A-AGC performs better than AGC, especially for append operations.
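
For context on the baseline policies mentioned above, the following sketch contrasts greedy and cost-benefit victim selection; it is a generic illustration with invented block data, not the paper's A-AGC algorithm.

```python
import time

# Each block tracks its invalid-page count and the time of its last modification.
# PAGES_PER_BLOCK and the block list are illustrative values.
PAGES_PER_BLOCK = 64

blocks = [
    {"id": 0, "invalid_pages": 10, "last_modified": time.time() - 500},
    {"id": 1, "invalid_pages": 48, "last_modified": time.time() - 5},
    {"id": 2, "invalid_pages": 30, "last_modified": time.time() - 3000},
]

def greedy_victim(blocks):
    # Greedy: reclaim the block with the most invalid pages (lowest copy cost now).
    return max(blocks, key=lambda b: b["invalid_pages"])

def cost_benefit_victim(blocks, now=None):
    # Cost-benefit: weigh reclaimable space against copy cost and block "age",
    # so cold blocks with moderate garbage can still be chosen.
    now = now or time.time()
    def score(b):
        utilization = 1 - b["invalid_pages"] / PAGES_PER_BLOCK  # valid fraction
        age = now - b["last_modified"]
        return age * (1 - utilization) / (2 * utilization)
    return max(blocks, key=score)

print(greedy_victim(blocks)["id"])        # block 1: most invalid pages
print(cost_benefit_victim(blocks)["id"])  # block 2: older, colder block wins
```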

A Survey on the Mobile Crowdsensing System life cycle: Task Allocation, Data Collection, and Data Aggregation

  • Xia Zhuoyue;Azween Abdullah;S.H. Kok
    • International Journal of Computer Science & Network Security / Vol. 23, No. 3 / pp.31-48 / 2023
  • The popularization of smart devices and the subsequent optimization of their sensing capacity have resulted in a novel mobile crowdsensing (MCS) pattern, which employs smart devices as sensing nodes by recruiting users to build a sensing network for multi-task performance. This technique has garnered much scholarly interest in terms of sensing range, cost, and integration, and MCS is prevalent in various fields, including environmental monitoring, noise monitoring, and road monitoring. A complete MCS life cycle entails task allocation, data collection, and data aggregation. Nevertheless, despite extensive research on this life cycle, specific drawbacks remain unresolved. This article summarizes single-task allocation, multi-task allocation, and spatio-temporal multi-task allocation at the task allocation stage. The quality, safety, and efficiency of data collection are discussed at the data collection stage, and edge computing, which offers a novel way to derive data from the MCS system, is also highlighted. Furthermore, data aggregation security and quality are summarized at the data aggregation stage, and the development of multi-modal data aggregation is outlined in light of the diversity of data obtained from MCS. Overall, this article summarizes the three stages of the MCS life cycle, analyzes the underlying issues, and offers development directions for future scholars' reference.

웹 크롤러를 이용한 자동 패치 정보 수집 시스템 (Automatic Patch Information Collection System Using Web Crawler)

  • 김용건;나사랑;김환국;원유재
    • 정보보호학회논문지 / Vol. 28, No. 6 / pp.1393-1399 / 2018
  • Enterprises that use a variety of software raise their security level by managing software vulnerabilities collectively with a patch management system provided by a security vendor. To keep software up to date, system administrators monitor vendor sites that publish new patch information, but because patches are released at irregular intervals and the web page structures differ, searching for and collecting patch information costs considerable effort and monitoring time. To reduce this, previous studies automated patch information collection based on keywords or web services, but because the way vendor sites present patch information is not standardized, those approaches were applicable only to specific vendor sites. In this paper, we analyze the structure and characteristics of vendor sites that provide patch information and propose a system that automates patch information collection with a web crawler to reduce the cost and monitoring time spent on collecting patch information.
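
A minimal sketch of the kind of crawler-based collector described above. The vendor URL, CSS selector, and column layout are hypothetical placeholders, since every vendor site structures its patch pages differently (which is exactly the problem the paper tackles):

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical vendor advisory page and selector; real vendor sites each need
# their own parsing rules because their page structures differ.
VENDOR_URL = "https://vendor.example.com/security/advisories"
PATCH_ROW_SELECTOR = "table.advisories tr"

def collect_patch_info(url=VENDOR_URL):
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")

    patches = []
    for row in soup.select(PATCH_ROW_SELECTOR):
        cells = [c.get_text(strip=True) for c in row.find_all("td")]
        if len(cells) >= 3:
            # Assumed column layout: advisory ID, affected product, release date.
            patches.append({"id": cells[0], "product": cells[1], "date": cells[2]})
    return patches

if __name__ == "__main__":
    for patch in collect_patch_info():
        print(patch)
```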

휴리스틱 기법을 적용한 촬영계획 최적화에 대한 연구 (Development of Image Collection Planning Optimization Using Heuristic Method)

  • 배희진;전정남;채태병
    • 대한원격탐사학회지 / Vol. 28, No. 4 / pp.459-466 / 2012
  • Satellite image operations consist of order reception, image collection planning, image processing, and image distribution. Image collection planning establishes the satellite's imaging plan so that users' imaging requests can be satisfied at the most appropriate time possible, making maximal use of limited satellite resources based on new orders delivered during order reception and on orders already in progress; it is therefore the process most closely tied to the goal of efficient use of satellite resources. Because collection planning must consider many variables simultaneously, it is computationally heavy, and the same procedure must be repeated every time scheduling is performed. In this paper, we study the optimization of collection planning to improve its efficiency. We first organize the collection planning procedure and its constraints, formulate a collection planning model that maximizes the number of images that can be collected, and propose a heuristic algorithm to solve the model.
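
As a simplified illustration of a heuristic for this kind of model (not the paper's algorithm), the sketch below greedily schedules as many imaging requests as possible whose acquisition windows do not overlap; the request data and the single-window assumption are invented:

```python
# A generic greedy heuristic for a simplified collection-planning model:
# take as many imaging requests as possible whose acquisition windows do not
# overlap. Times are arbitrary units.
requests = [
    {"order": "ORD-01", "start": 0, "end": 4},
    {"order": "ORD-02", "start": 3, "end": 6},
    {"order": "ORD-03", "start": 5, "end": 8},
    {"order": "ORD-04", "start": 8, "end": 11},
]

def plan_collections(requests):
    # Classic earliest-finish-first greedy rule: sort by window end and accept
    # every request compatible with what has been scheduled so far.
    schedule, last_end = [], float("-inf")
    for req in sorted(requests, key=lambda r: r["end"]):
        if req["start"] >= last_end:
            schedule.append(req["order"])
            last_end = req["end"]
    return schedule

print(plan_collections(requests))  # ['ORD-01', 'ORD-03', 'ORD-04']
```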

낸드 플래시 메모리를 위한 적응형 가비지 컬렉션 정책 (An Adaptive Garbage Collection Policy for NAND Flash Memory)

  • 한규태;김성조
    • 한국정보과학회논문지:컴퓨팅의 실제 및 레터 / Vol. 15, No. 5 / pp.322-330 / 2009
  • To use NAND flash memory, which does not allow in-place overwriting and limits the number of erase operations per block, as a storage medium, various garbage collection policies supporting erase-count wear-leveling have been studied. To support wear-leveling, existing policies must compute a cleaning index over all blocks to select the victim block every time garbage collection is performed, and these computations degrade system performance. The garbage collection policy proposed in this paper provides erase-count wear-leveling without computing a cleaning index over all blocks, by using the variance of the erase counts and a threshold that changes according to the blocks' maximum erase count. During garbage collection, when the variance is smaller than the threshold, the greedy policy is used to minimize erase cost; when the variance is larger than the threshold, the block with the maximum erase count is excluded from the victim candidates to level the erase counts. When garbage collection is performed with the proposed method, the standard deviation of the blocks' erase counts approaches zero as the erase counts approach the erase-count limit, and garbage collection is more than twice as fast as existing wear-leveling algorithms.
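
A simplified sketch of the victim-selection rule described above: when the erase-count variance is below the threshold the greedy rule applies, otherwise the most-worn block is excluded. The block records and the threshold value are illustrative; the paper derives its threshold from the blocks' maximum erase count.

```python
from statistics import pvariance

# Illustrative block records: invalid-page counts and per-block erase counts.
blocks = [
    {"id": 0, "invalid_pages": 12, "erase_count": 95},
    {"id": 1, "invalid_pages": 40, "erase_count": 120},
    {"id": 2, "invalid_pages": 35, "erase_count": 60},
]

def select_victim(blocks, threshold):
    variance = pvariance(b["erase_count"] for b in blocks)
    if variance < threshold:
        # Erase counts are already even: pure greedy, minimize copy cost.
        candidates = blocks
    else:
        # Erase counts have drifted apart: protect the most-worn block by
        # excluding it from the victim candidates, then pick greedily.
        most_worn = max(blocks, key=lambda b: b["erase_count"])
        candidates = [b for b in blocks if b is not most_worn]
    return max(candidates, key=lambda b: b["invalid_pages"])

# High variance here, so block 1 (most worn) is protected and block 2 is chosen.
print(select_victim(blocks, threshold=100)["id"])  # -> 2
```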

코로나19 팬데믹과 영업순환주기가 외식업체의 원가 비대칭적 행태에 미치는 영향 (The Effect of COVID-19 Pandemic and Operating Cycle on Asymmetric Cost Behavior in Food Service Industry)

  • 박원
    • 디지털융복합연구 / Vol. 20, No. 4 / pp.215-224 / 2022
  • This study examines asymmetric cost behavior in food service companies and tests whether the COVID-19 pandemic, associated with the recent economic downturn, and the operating cycle, which affects liquidity, influence cost behavior. The analysis covers food service companies in 2019 and 2020, with cost measured as the sum of cost of goods sold and selling, general, and administrative expenses. The results show that food service companies exhibited cost-elastic behavior as activity levels declined. In addition, the downward elasticity of costs became stronger after COVID-19 than before, and costs were more downward-elastic the shorter the operating cycle. Finally, the shorter both the inventory holding period and the accounts receivable collection period, the components of the operating cycle, the stronger the downward elasticity of costs. These results are meaningful in that they examine the cost structure of food service companies and the factors through which the pandemic and the operating cycle can affect that structure. The study approaches the situation food service companies faced due to COVID-19 from a cost perspective and suggests that the pandemic can lead firms to respond to declining sales with cost reductions.