• Title/Abstract/Keywords: large-scale data


An XPDL-Based Workflow Control-Structure and Data-Sequence Analyzer

  • Kim, Kwanghoon Pio
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 13, No. 3 / pp.1702-1721 / 2019
  • A workflow (business) process management system defines, executes, monitors, and manages the workflow models deployed across a workflow-supported enterprise, and it is generally compartmentalized into a modeling subsystem and an enacting subsystem. The modeling subsystem discovers and analyzes workflow models through a theoretical modeling methodology such as ICN, defines them graphically in a notation such as BPMN, and deploys them onto the enacting subsystem by transforming them into textual models written in a standardized workflow process definition language such as XPDL. Before deploying the defined workflow models, it is very important to inspect their syntactical correctness as well as their structural properness, so as to minimize losses of effectiveness and efficiency in managing them. This paper is particularly concerned with verifying very large-scale and massively parallel workflow models, which calls for a sophisticated analyzer that can handle these specialized and complex styles of workflow models automatically. The analyzer devised in this paper covers not only structural complexity but also data-sequence complexity. Structural complexity arises from the combinational use of control-structure constructs (subprocess, exclusive-OR, parallel-AND, and iterative-LOOP primitives) while preserving the matched-pairing and proper-nesting properties, whereas data-sequence complexity arises from the combinational use of the relevant data repositories, namely data-definition sequences and data-use sequences. Through the devised and implemented analyzer, we achieve systematic verification of syntactical correctness as well as effective validation of structural properness on such complicated, large-scale workflow models; a stack-based sketch of the nesting check appears below. As an experimental study, we apply the implemented analyzer to an exemplary large-scale and massively parallel workflow process model, the Large Bank Transaction Workflow Process Model, and show the structural-complexity analysis results through a series of operational screens captured from the analyzer.
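
The matched-pairing and proper-nesting property mentioned above can be checked with a simple stack discipline. The sketch below is a minimal illustration of that idea, not the analyzer implemented in the paper, and the construct names are hypothetical:

```python
# Minimal sketch of the matched-pairing / proper-nesting check, assuming
# hypothetical construct names; an illustration of the idea only.

# Each split primitive must be closed by its matching join, and blocks
# must nest properly (no interleaving), which a simple stack verifies.
MATCHING_JOIN = {
    "and_split": "and_join",          # parallel-AND block
    "xor_split": "xor_join",          # exclusive-OR block
    "loop_begin": "loop_end",         # iterative-LOOP block
    "subprocess_begin": "subprocess_end",
}
JOINS = set(MATCHING_JOIN.values())

def is_properly_nested(constructs):
    """True iff every split is closed by its matching join in
    last-opened, first-closed order (proper nesting)."""
    stack = []
    for c in constructs:
        if c in MATCHING_JOIN:
            stack.append(MATCHING_JOIN[c])    # expect this join later
        elif c in JOINS:
            if not stack or stack.pop() != c:
                return False                  # unmatched or interleaved
    return not stack                          # nothing left open

# An XOR block properly nested inside an AND block:
print(is_properly_nested(["and_split", "xor_split", "xor_join", "and_join"]))  # True
# Interleaved blocks violate proper nesting:
print(is_properly_nested(["and_split", "xor_split", "and_join", "xor_join"]))  # False
```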

Analysis on Economies of Scale in Offshore Fishery Using a Translog Cost Function

  • 신용민;심성현
    • Ocean and Polar Research / Vol. 39, No. 1 / pp.61-71 / 2017
  • This study estimates a cost function from offshore fishery cost data and analyzes the economies of scale of Korea's offshore fishery. A translog cost function was specified and estimated by panel analysis of the panel data; the functional form and the scale measure derived from it are given below. Using the estimated coefficients, the annual economies of scale of the offshore fishery as a whole and the economies of scale of 14 individual offshore fisheries in 2015 were analyzed. The results show that, with the exception of 2003, economies of scale existed in all periods. However, because the industry has almost reached its minimum efficient scale, further scale expansion would bring inefficiency. Korea's offshore fishery therefore requires a scale-reduction policy rather than a scale-expansion policy, which coincides with the current government's fishery restructuring policy and measures such as the fishing-vessel buyback program. For 2015, economies of scale were found in each offshore fishery except for five trawl fisheries, such as the large pair-trawl and large otter-trawl fisheries, and the large purse-seine fishery. This strongly suggests that these fisheries, which do not exhibit economies of scale, need urgent scale reduction and should be the first targets of the government's fishery restructuring policy.
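
For reference, the translog cost function and the economies-of-scale measure derived from it take the standard textbook form below; the notation (C for total cost, y for output, p_i for input prices) is assumed here rather than copied from the paper:

```latex
\begin{align*}
\ln C &= \alpha_0 + \alpha_y \ln y + \tfrac{1}{2}\gamma_{yy}(\ln y)^2
         + \sum_i \beta_i \ln p_i \\
      &\quad + \tfrac{1}{2}\sum_i \sum_j \beta_{ij} \ln p_i \ln p_j
         + \sum_i \delta_{iy} \ln p_i \ln y, \\
\mathrm{SCE} &= 1 - \frac{\partial \ln C}{\partial \ln y}
             = 1 - \Bigl(\alpha_y + \gamma_{yy}\ln y
                         + \sum_i \delta_{iy}\ln p_i\Bigr).
\end{align*}
```

In this convention, SCE > 0 indicates economies of scale, SCE = 0 the minimum efficient scale, and SCE < 0 diseconomies of scale.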

Android Phone App Development for Large-Scale Classes

  • 김일민
    • Journal of Digital Convergence / Vol. 9, No. 6 / pp.343-354 / 2011
  • In humanities courses, interaction between the instructor and students is very important. In large lecture classes attended by many students, however, it is difficult for the instructor to exchange opinions with so many students. This paper develops an Android phone app to overcome the limited interaction that can occur in large lecture classes. The app exploits the smartphone's information-processing and wireless-communication capabilities. The aim is to use it as an aid in large lecture classes and thereby improve their educational effectiveness.

EVALUATING CRITICAL SUCCESS FACTORS FOR ACCURATE FIRST COST ESTIMATES OF LARGE-SCALE CONSTRUCTION PROJECTS

  • Jin-Lee Kim;Ok-Kyue Kim
    • Proceedings of the 3rd International Conference on Construction Engineering and Project Management / pp.354-360 / 2009
  • The demand for large-scale construction projects such as mega-projects is increasing greatly, driven by rapid population growth as well as the need to replace existing buildings and infrastructure. Rising costs of materials, supplies, and labor require first cost estimates at the preliminary planning stage to be as accurate as possible. This paper presents the results of a survey evaluating nine critical success factors that influence accurate first cost estimates for large-scale projects, drawn from practical experience. It then examines the current cost structures of construction companies for large-scale projects, followed by the causes of cost and schedule overruns. Twenty completed surveys were collected, and the Analytic Hierarchy Process (sketched below) was applied to analyze the data. The results indicate that technology issues, contract type, and social and environmental impacts are the significant leading factors for accurate first cost estimates of large-scale construction projects.
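
As a rough illustration of the AHP step, the sketch below derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks the consistency ratio; the matrix is hypothetical, not the paper's survey data:

```python
# Illustrative AHP step: priority weights from a pairwise comparison
# matrix via the principal eigenvector, plus the consistency ratio.
import numpy as np

RANDOM_INDEX = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(A):
    """Return (priority weights, consistency ratio) for matrix A."""
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)               # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                              # normalized weights
    ci = (eigvals[k].real - n) / (n - 1)      # consistency index
    return w, ci / RANDOM_INDEX[n]            # CR < 0.1 is acceptable

# Hypothetical pairwise comparisons among three success factors.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
weights, cr = ahp_weights(A)
print(weights.round(3), round(cr, 3))
```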


Parameter Tuning in Support Vector Regression for Large Scale Problems

  • 류지열;곽민정;윤민
    • Journal of Korean Institute of Intelligent Systems / Vol. 25, No. 1 / pp.15-21 / 2015
  • Tuning the kernel parameters affects the generalization ability of support vector machines, and determining appropriate values for these parameters is often a difficult task. In support vector regression, the burden of choosing such parameter values can be reduced by using ensemble learning, but applying it directly to large-scale problems is generally time-consuming. This paper proposes a method that decomposes the original data set into a finite number of subsets in order to reduce the burden of parameter tuning in support vector regression; a sketch of the idea follows. The proposed method is shown to be efficient for large-scale problems and especially for imbalanced data sets.
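
A minimal sketch of the subset-decomposition idea, assuming scikit-learn's SVR and an illustrative parameter grid: hyperparameters are tuned on several random subsets and the most frequently chosen setting is reused for a single fit on the full data. This is one plausible reading of the approach, not the authors' implementation:

```python
# Tune SVR hyperparameters on small random subsets instead of the full
# data, then reuse the setting chosen most often for one full fit.
from collections import Counter

import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(5000, 4))                 # synthetic data
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(5000)

param_grid = {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}
chosen = []
for _ in range(5):                                      # 5 random subsets
    idx = rng.choice(len(X), 500, replace=False)
    search = GridSearchCV(SVR(), param_grid, cv=3).fit(X[idx], y[idx])
    chosen.append(tuple(sorted(search.best_params_.items())))

# Reuse the most frequently selected parameter pair across subsets.
best = dict(Counter(chosen).most_common(1)[0][0])
model = SVR(**best).fit(X, y)
print(best, model.score(X, y))
```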

A Study on a Light-weight Algorithm for Large-scale BIM Data for Visualization on a Web-based GIS Platform

  • 김지은;홍창희
    • Spatial Information Research / Vol. 23, No. 1 / pp.41-48 / 2015
  • BIM technology goes beyond conventional 2D drawing practice to encompass, through 3D modeling, the data generated over a facility's entire life cycle. Because of this, a single building produces an enormous file from the sheer volume of its data. The representative standard format, IFC, is a case in point: since it carries substantial object-based geometry and property information, large-volume processing issues arise frequently. Such data increases rendering time and consumes much of the graphics card's capacity, making it inefficient for on-screen visualization. The problem of lightening large data sets must be solved for the sake of both program performance and quality. This study surveyed various light-weighting attempts in domestic and international research. Building on these, it proposes and verifies a data light-weighting technique that exploits BIM's characteristics in order to control and visualize large-scale BIM data effectively; one common instance of the idea is sketched below. Operating large-scale facility data on a web-based GIS platform, the study analyzes the facility types best suited to the approach and, by making full use of IFC's object-based characteristics, confirms good screen-transition quality on the user side and effective memory management on the process side.
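
One common light-weighting idea consistent with the paper's object-based emphasis is geometry instancing: identical element geometries (e.g., repeated precast slabs) are stored once and referenced many times, shrinking what the web client must load. The sketch below illustrates such deduplication under an assumed, hypothetical data layout; it is not the paper's algorithm:

```python
# Deduplicate identical meshes by content hash so each distinct mesh is
# stored once and shared by all identical elements (layout hypothetical).
import hashlib
import json

def dedup_geometry(elements):
    """elements: [{'id', 'type', 'mesh'}, ...] -> (unique meshes, instances)."""
    meshes, instances = {}, []
    for e in elements:
        key = hashlib.sha1(
            json.dumps(e["mesh"], sort_keys=True).encode()).hexdigest()
        meshes.setdefault(key, e["mesh"])       # store each mesh once
        instances.append({"id": e["id"], "type": e["type"], "mesh_ref": key})
    return meshes, instances

slab = {"vertices": [[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]],
        "faces": [[0, 1, 2], [0, 2, 3]]}
elems = [{"id": f"slab-{i}", "type": "IfcSlab", "mesh": slab}
         for i in range(100)]
meshes, instances = dedup_geometry(elems)
print(len(meshes), len(instances))              # 1 shared mesh, 100 instances
```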

Sampled-Data Observer-Based Decentralized Fuzzy Control for Nonlinear Large-Scale Systems

  • Koo, Geun Bum;Park, Jin Bae;Joo, Young Hoon
    • Journal of Electrical Engineering and Technology / Vol. 11, No. 3 / pp.724-732 / 2016
  • In this paper, a sampled-data observer-based decentralized fuzzy control technique is proposed for a class of nonlinear large-scale systems that can be represented as a Takagi-Sugeno fuzzy system. The premise variable is assumed to be measurable for the design of the observer-based fuzzy controller, and the closed-loop system is obtained. Based on an exact discretized model of the closed-loop system, a stability condition is derived and converted into linear matrix inequality (LMI) format; a minimal LMI feasibility sketch follows. Finally, an example is provided to verify the effectiveness of the proposed techniques.
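
To give the flavor of the LMI step only, the sketch below checks a much simpler continuous-time Lyapunov LMI (P > 0, AᵀP + PA < 0) with cvxpy; the paper's sampled-data, observer-based conditions are considerably more involved, and the system matrix here is illustrative:

```python
# Check feasibility of a plain Lyapunov LMI as a stand-in for the
# paper's (more involved) sampled-data stability conditions.
import cvxpy as cp
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])                    # illustrative stable matrix
P = cp.Variable((2, 2), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(2),            # P positive definite
               A.T @ P + P @ A << -eps * np.eye(2)]
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve(solver=cp.SCS)
print(problem.status)                           # 'optimal' => LMI feasible
```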

Development of the design methodology for large-scale database based on MongoDB

  • Lee, Jun-Ho;Joo, Kyung-Soo
    • Journal of the Korea Society of Computer and Information / Vol. 22, No. 11 / pp.57-63 / 2017
  • The recent explosion of big data is characterized by continuous data generation, huge volume, and unstructured formats. Existing relational database technologies are inadequate for such big data because of their limited processing speed and the significant cost of storage expansion. Big data processing technologies, normally based on distributed file systems, distributed database management, and parallel processing, have therefore arisen as the core means of implementing big data repositories. In this paper, we propose a design methodology for large-scale databases based on MongoDB that extends the information engineering methodology based on the E-R data model; an illustrative embedding-versus-referencing mapping is sketched below.
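
As an illustration of mapping an E-R model onto MongoDB, the sketch below shows the embedding-versus-referencing choice for a 1:N relationship (Customer places Order); the collection and field names are hypothetical, not taken from the paper's methodology, and a local MongoDB instance is assumed:

```python
# Two standard ways to map a 1:N E-R relationship into MongoDB.
from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017")["shop"]

# Embed when the N-side is small and always read with its parent:
db.customers.insert_one({
    "_id": "c1",
    "name": "Alice",
    "orders": [{"order_no": 1001, "total": 59.0},
               {"order_no": 1002, "total": 12.5}],
})

# Reference when the N-side is large or queried independently:
db.orders.insert_one({"_id": 1003, "customer_id": "c1", "total": 99.0})

print(db.customers.find_one({"orders.order_no": 1001})["name"])  # Alice
```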

Computer Vision-based Continuous Large-scale Site Monitoring System through Edge Computing and Small-Object Detection

  • Kim, Yeonjoo;Kim, Siyeon;Hwang, Sungjoo;Hong, Seok Hwan
    • Proceedings of the 9th International Conference on Construction Engineering and Project Management / pp.1243-1244 / 2022
  • In recent years, the growing interest in off-site construction has led factories to scale up their manufacturing and production processes in the construction sector. Consequently, continuous large-scale site monitoring in low-variability environments, such as prefabricated-component production plants (precast concrete production), has gained increasing importance. Although many studies on computer vision-based site monitoring have been conducted, challenges remain in deploying the technology for large-scale field applications. One issue is collecting and transmitting vast amounts of video data: continuous site monitoring systems rest on real-time video collection and analysis, which demands excessive computational resources and network traffic. In addition, it is difficult to integrate objects of different sizes and scales into a single scene. Objects of various sizes and types (e.g., workers, heavy equipment, and materials) coexist in a plant production environment and must be detected simultaneously for effective site monitoring; with existing object detection algorithms, however, simultaneously detecting objects that differ greatly in size is difficult, because it requires collecting and training on massive amounts of object images at various scales. This study therefore developed a large-scale site monitoring system using edge computing and a small-object detection system to solve these problems. Edge computing is a distributed information technology architecture in which image or video data is processed near the originating source rather than on a centralized server or cloud. By running inference on AI computing modules attached to the CCTVs and communicating only the processed information to the server, excessive network traffic can be reduced. Small-object detection is a method for detecting objects of different sizes by cropping the raw image, setting an appropriate number of rows and columns for image splitting based on the target object size, detecting small objects in the cropped and magnified tiles, and mapping the detections back to the original image (a sketch of this tiling step follows the abstract). For inference, this study used the YOLOv5 algorithm, known for its fast processing speed and widely used for real-time object detection. The method effectively detected both large objects and small objects that are difficult to detect with existing algorithms. When the large-scale site monitoring system was tested, it performed well on small objects, such as workers seen in a large-scale view of a construction site, which existing algorithms detected inaccurately. Our next goal is to incorporate various safety monitoring and risk analysis algorithms into this system, such as collision risk estimation based on the time-to-collision concept, and to optimize safety routes by accumulating workers' paths and inferring risky areas from their trajectory patterns. Through such developments, this continuous large-scale site monitoring system can guide a construction plant's safety management more effectively.
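
A minimal sketch of the crop-and-detect (tiling) step described above: the frame is split into a grid, a detector runs per tile, and tile-local boxes are shifted back into frame coordinates. The `detect` function is a placeholder standing in for a YOLOv5 forward pass:

```python
# Tile the frame, detect per tile, and map boxes back to frame coords.
import numpy as np

def detect(tile):
    """Placeholder detector: returns [(x1, y1, x2, y2, score, cls), ...]
    in tile-local coordinates."""
    return []

def tiled_detect(frame, rows, cols):
    h, w = frame.shape[:2]
    th, tw = h // rows, w // cols
    boxes = []
    for r in range(rows):
        for c in range(cols):
            tile = frame[r * th:(r + 1) * th, c * tw:(c + 1) * tw]
            for x1, y1, x2, y2, score, cls in detect(tile):
                # Shift tile-local boxes into full-frame coordinates.
                boxes.append((x1 + c * tw, y1 + r * th,
                              x2 + c * tw, y2 + r * th, score, cls))
    return boxes  # a global non-maximum suppression pass would follow

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)   # dummy frame
print(len(tiled_detect(frame, rows=3, cols=4)))
```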


A High-rate GPS Data Processing for Large-scale Structure Monitoring

  • 배태석
    • Proceedings of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography 2010 Spring Conference / pp.181-182 / 2010
  • For real-time displacement monitoring of large-scale structures, high-rate (>1 Hz) GPS data processing is necessary, which even scientific GPS data processing software packages do not support. Since the baseline is generally very short in this setting, most of the atmospheric effects cancel, leaving the position and the integer ambiguities as the unknowns. The number of unknowns in real-time kinematic GPS positioning makes positioning impossible with the usual approach, so a two-step approach is tested in this study; the underlying observation model is sketched below.
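
For context, the short-baseline double-difference carrier-phase model that leaves only the baseline and integer ambiguities as unknowns has the standard form below; the notation is assumed, not taken from the paper:

```latex
% Standard short-baseline double-difference carrier-phase model:
\nabla\Delta\phi_{AB}^{jk}
  = \frac{1}{\lambda}\,\nabla\Delta\rho_{AB}^{jk}(\mathbf{b})
  + \nabla\Delta N_{AB}^{jk} + \varepsilon_{\phi}
% b : baseline vector between receivers A and B
% N : integer double-difference ambiguity for satellites j, k
% Two-step idea: (1) resolve the integers N over an initial batch of
% epochs; (2) hold N fixed and estimate b alone epoch by epoch, which
% makes high-rate (>1 Hz) positioning tractable.
```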
