• Title/Summary/Keyword: workflow model

216 search results

FinBERT Fine-Tuning for Sentiment Analysis: Exploring the Effectiveness of Datasets and Hyperparameters (감성 분석을 위한 FinBERT 미세 조정: 데이터 세트와 하이퍼파라미터의 효과성 탐구)

  • Jae Heon Kim;Hui Do Jung;Beakcheol Jang
    • Journal of Internet Computing and Services / v.24 no.4 / pp.127-135 / 2023
  • This research paper explores the application of FinBERT, a BERT-based model further pre-trained on the financial domain, for financial sentiment analysis, focusing on the process of identifying suitable training data and hyperparameters. Our goal is to offer a comprehensive guide to effectively utilizing the FinBERT model for accurate sentiment analysis by employing various datasets and fine-tuning hyperparameters. We outline the architecture and workflow of the proposed approach for fine-tuning the FinBERT model, emphasizing the performance of various datasets and hyperparameters on sentiment analysis tasks. Additionally, we verify the reliability of GPT-3 as a suitable annotator by using it for sentiment labeling tasks. Our results show that the fine-tuned FinBERT model excels across a range of datasets and that the optimal combination, a learning rate of 5e-5 and a batch size of 64, performs consistently well across all datasets. Furthermore, based on the significant performance improvement of the FinBERT model on our general-domain Twitter data compared with our general-domain news data, we question whether the model should be further pre-trained only on financial news data. We simplify the complex process of determining the optimal approach to the FinBERT model and provide guidelines for selecting additional training datasets and hyperparameters within the fine-tuning process of financial sentiment analysis models.

Basic System Architecture Design for Airport GIS Service Models (Airport GIS 구축을 위한 서비스모델 설계에 관한 연구)

  • Sim, Jae-Yong;Lee, Tong-Hoon;Park, Joo-Young
    • The Journal of The Korea Institute of Intelligent Transport Systems / v.7 no.3 / pp.82-94 / 2008
  • Airport GIS is a comprehensive information system for improving the security and efficiency of airports. At the initial stage of its realization, the current status of domestic and international regulations, along with the relevant standardization, has been reviewed. Gimpo Airport serves as a test-bed for bringing the airport GIS into the operational workflow, with service models and a basic design built on the current status and a demand analysis of the airport. The service models primarily brought into the project are as follows: (1) Local vehicle safety management in the airside, (2) Intelligent traffic control between flights and vehicles at main cross points, (3) Dynamic safety management against FOD in the airside and breakage on pavement, (4) Management of special support vehicles, such as remotely controlled deicing vehicles, (5) Response and support for fire vehicles and ambulances of signatory institutions in emergencies. Upcoming research aims at drawing up a detailed design and building the integrated system.

Deployment of Lean System for Effective Lean Construction Implementation (효과적인 린 건설 수행을 위한 린 시스템 운용 방안 제시)

  • Kim, Dae-Young
    • Korean Journal of Construction Engineering and Management / v.6 no.6 s.28 / pp.152-159 / 2005
  • Since Lean Construction was introduced as a new management approach to improve productivity in the construction industry, much research has been in progress to develop lean concepts and principles for better implementation and to adapt successful lean ideas from manufacturing for application in construction. The Lean Construction Institute has developed the Last Planner System to achieve better workflow and effective planning management. Domestic construction companies are also becoming interested in Lean Construction; however, there are no detailed guidelines on how to implement a lean system. Thus, this study provides a guideline for the deployment of the Last Planner System, a prototype lean implementation control checklist, and the roles of all project participants, based on case studies in the USA. This study should provide a framework for applying lean construction to domestic construction sites.

A Study on Designing a Next-Generation Records Management System (차세대 기록관리시스템 재설계 모형 연구)

  • Oh, Jin-Kwan;Yim, Jin-Hee
    • Journal of Korean Society of Archives and Records Management / v.18 no.2 / pp.163-188 / 2018
  • How do we create a next-generation records management system? Despite a rapidly changing system development environment, the records management system of public institutions has remained unchanged for the past 10 years. This appears to be the key cause of the structural problems of the records management system, which make it difficult to accommodate user requirements and apply new recording technologies. The purpose of this study is to present a redesigned model for a next-generation records management system by analyzing the status of electronic records management. This study analyzed "A Study on the Redesign of the Next-Generation Electronic Records Management Process," the records management technology of advanced records management systems, and a case of an overseas system. Based on the analysis results, directions for improving the records management system were examined from several aspects: functionality, software design, and software distribution. This study thus suggests that the creation of a microservice-architecture-based (MSA), open-source-software-oriented (OSS) records management system should be the focus of next-generation records management.

Runtime Prediction Based on Workload-Aware Clustering (병렬 프로그램 로그 군집화 기반 작업 실행 시간 예측모형 연구)

  • Kim, Eunhye;Park, Ju-Won
    • Journal of Korean Society of Industrial and Systems Engineering / v.38 no.3 / pp.56-63 / 2015
  • Several fields of science have come to demand large-scale workflow support, requiring thousands of CPU cores or more. To support such large-scale scientific workflows, high-capacity parallel systems such as supercomputers are widely used. To increase the utilization of these systems, most schedulers use a backfilling policy: small jobs are moved ahead to fill holes in the schedule, provided they do not delay large jobs. Since backfilling requires an estimate of each job's runtime, most parallel systems rely on the user's estimated runtime; however, these estimates are found to be extremely inaccurate because users overestimate their jobs' runtimes. Therefore, in this paper, we propose a novel system for runtime prediction based on workload-aware clustering, with the goal of improving prediction performance. The proposed method for runtime prediction of parallel applications consists of three main phases. First, feature selection based on factor analysis is performed to identify important input features. Then, a clustering analysis of the history data is performed using a self-organizing map, followed by hierarchical clustering to find the cluster boundaries from the weight vectors. Finally, prediction models are constructed using support vector regression on the clustered workload data. Multiple prediction models, one per clustered data pattern, can reduce the error rate compared with a single model for the whole data pattern. In the experiments, we use workload logs from parallel systems (i.e., iPSC, LANL-CM5, SDSC-Par95, SDSC-Par96, and CTC-SP2) to evaluate the effectiveness of our approach. Compared with other techniques, experimental results show that the proposed method improves accuracy by up to 69.08%.
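The core idea of the abstract above, that one predictor per workload cluster beats a single predictor for all jobs, can be sketched in a few lines. This is a toy stand-in, not the paper's method: the study clusters with a self-organizing map plus hierarchical clustering and fits support vector regression per cluster, while here a simple threshold split and per-group means illustrate the same effect on synthetic job logs.

```python
# Toy illustration: per-cluster runtime models vs. one global model.
import random

random.seed(0)

# Synthetic "job log": (requested_cores, actual_runtime_seconds).
# Small jobs run ~100 s, large jobs ~1000 s, with noise.
jobs = [(random.randint(1, 32), 100 + random.gauss(0, 10)) for _ in range(200)] + \
       [(random.randint(64, 512), 1000 + random.gauss(0, 50)) for _ in range(200)]

def mae(preds, actuals):
    return sum(abs(p - a) for p, a in zip(preds, actuals)) / len(actuals)

runtimes = [r for _, r in jobs]

# Baseline: a single model for the whole workload (here, the overall mean).
global_mean = sum(runtimes) / len(runtimes)
err_single = mae([global_mean] * len(jobs), runtimes)

# Clustered: split jobs by a workload feature, fit one model per cluster.
clusters = {}
for cores, r in jobs:
    clusters.setdefault(cores >= 64, []).append(r)
cluster_mean = {k: sum(v) / len(v) for k, v in clusters.items()}
err_clustered = mae([cluster_mean[cores >= 64] for cores, _ in jobs], runtimes)

print(f"single-model MAE:  {err_single:.1f} s")
print(f"per-cluster MAE:   {err_clustered:.1f} s")  # much lower on this data
```

Because the two workload patterns have very different typical runtimes, the global model is pulled toward a useless middle value, while each cluster model stays close to its own pattern, which is the error reduction the paper exploits.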

Causes and Countermeasures of School Records Misclassifications : Focusing on the 'General Disposition Authority for School Records' (학교 기록물 분류의 문제점과 개선방안 학교 기록관리기준표 분석을 중심으로)

  • Woo, Jee-won;Seol, Moon-won
    • The Korean Journal of Archival Studies / no.58 / pp.299-332 / 2018
  • The purpose of this study is to investigate the current status and causes of misclassification of school records and to suggest directions for improving the School Records Management Criteria Table (the general disposition authority for school records), which should reduce misclassification. This study begins by analysing the records created or received in four sampled schools over one year to investigate the status and causes of misclassification. An advisory group of four administrative officers and seven records managers was formed, and group meetings were held twice to identify the causes of misclassification and to suggest alternatives. In this study, 33 unit tasks (transactions) with frequent misclassification were identified, and the causes of misclassification were analyzed based on focus group interviews. The main causes of misclassification were categorized into two types. This study concludes by suggesting improvements to the School Records Management Criteria Table that address these causes, including reinforced commentary and the addition of workflows for complex tasks.

Deep Learning in Radiation Oncology

  • Cheon, Wonjoong;Kim, Haksoo;Kim, Jinsung
    • Progress in Medical Physics / v.31 no.3 / pp.111-123 / 2020
  • Deep learning (DL) is a subset of machine learning and artificial intelligence that uses deep neural networks, with structures loosely similar to the human neural system, trained on big data. DL narrows the gap between data acquisition and meaningful interpretation without explicit programming. It has so far outperformed most classification and regression methods and can automatically learn data representations for specific tasks. The application areas of DL in radiation oncology include classification, semantic segmentation, object detection, image translation and generation, and image captioning. This article examines the potential role of DL and what more can be achieved by utilizing it in radiation oncology. With the advances in DL, various studies contributing to the development of radiation oncology were investigated comprehensively. In this article, the radiation treatment workflow was divided into six consecutive stages: patient assessment, simulation, target and organs-at-risk segmentation, treatment planning, quality assurance, and beam delivery. Studies using DL were classified and organized according to each stage of this process. State-of-the-art studies were identified, and their clinical utility was examined. DL models can provide faster and more accurate solutions to problems faced by oncologists. While the positive effect of a data-driven approach on the quality of care for cancer patients is clear, implementing these methods will require cultural changes at both the professional and institutional levels. We believe this paper will serve as a guide for both clinicians and medical physicists on issues that need to be addressed in time.

Analysis of the trueness and precision of complete denture bases manufactured using digital and analog technologies

  • Leonardo Ciocca;Mattia Maltauro;Valerio Cimini;Lorenzo Breschi;Angela Montanari;Laura Anderlucci;Roberto Meneghello
    • The Journal of Advanced Prosthodontics / v.15 no.1 / pp.22-32 / 2023
  • PURPOSE. Digital technology has enabled improvements in the fitting accuracy of denture bases via milling techniques. The aim of this study was to evaluate the trueness and precision of digital and analog techniques for manufacturing complete dentures (CDs). MATERIALS AND METHODS. Sixty identical CDs were manufactured using different production protocols. Digital and analog technologies were compared using the reference geometric approach, and the Δ-error values of eight areas of interest (AOI) were calculated. For each AOI, a precise number of measurement points was selected according to sensitivity analyses to compare the Δ-error of trueness and precision between the original model and manufactured prosthesis. Three types of statistical analysis were performed: to calculate the intergroup cumulative difference among the three protocols, the intergroup among the AOIs, and the intragroup difference among AOIs. RESULTS. There was a statistically significant difference between the dentures made using the oversize process and injection molding process (P < .001), but no significant difference between the other two manufacturing methods (P = .1227). There was also a statistically significant difference between the dentures made using the monolithic process and the other two processes for all AOIs (P = .0061), but there was no significant difference between the other two processes (P = 1). Within each group, significant differences among the AOIs were observed. CONCLUSION. The monolithic process yielded better results, in terms of accuracy (trueness and precision), than the other groups, although all three processes led to dentures with Δ-error values well within the clinical tolerance limit.

Simulation-Based Material Property Analysis of 3D Woven Materials Using Artificial Neural Network (시뮬레이션 기반 3차원 엮임 재료의 물성치 분석 및 인공 신경망 해석)

  • Byungmo Kim;Seung-Hyun Ha
    • Journal of the Computational Structural Engineering Institute of Korea / v.36 no.4 / pp.259-264 / 2023
  • In this study, we devised a parametric analysis workflow for efficiently analyzing the material properties of 3D woven materials. The parametric model uses wire spacing in the woven materials as a design parameter; we generated 2,500 numerical models with various combinations of these design parameters. Using MATLAB and ANSYS software, we obtained various material properties, such as bulk modulus, thermal conductivity, and fluid permeability of the woven materials, through a parametric batch analysis. We then used this large dataset of material properties to perform a regression analysis to validate the relationship between design variables and material properties, as well as the accuracy of numerical analysis. Furthermore, we constructed an artificial neural network capable of predicting the material properties of 3D woven materials on the basis of the obtained material database. The trained network can accurately estimate the material properties of the woven materials with arbitrary design parameters, without the need for numerical analyses.
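The surrogate-modeling step described above, fit a cheap regression on a (design parameter → simulated property) dataset so new designs can be evaluated without rerunning the solver, can be sketched minimally. This is only an illustration of the workflow: the study uses MATLAB/ANSYS batch simulations and an artificial neural network, whereas here a one-variable least-squares line and a synthetic linear relation (wire spacing vs. a hypothetical bulk modulus) stand in for both.

```python
# Surrogate-model sketch: regression replaces repeated simulation runs.
# Hypothetical relation: bulk modulus falls linearly as wire spacing grows.
spacings = [0.2 + 0.01 * i for i in range(100)]   # design parameter (mm)
moduli   = [5.0 - 2.0 * s for s in spacings]      # "simulated" property (GPa)

n = len(spacings)
mean_x = sum(spacings) / n
mean_y = sum(moduli) / n

# Closed-form least-squares fit for a single input variable.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(spacings, moduli)) / \
        sum((x - mean_x) ** 2 for x in spacings)
intercept = mean_y - slope * mean_x

def predict(spacing):
    """Estimate the property for an unseen design without a new simulation."""
    return intercept + slope * spacing

print(predict(0.5))  # ≈ 4.0 GPa under the synthetic relation
```

A neural network, as in the paper, plays the same role as `predict` here but can capture the nonlinear, multi-parameter relations that a straight line cannot.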

A study on the Performance of Hybrid Normal Mapping Techniques for Real-time Rendering

  • ZhengRan Liu;KiHong Kim;YuanZi Sang
    • International journal of advanced smart convergence / v.12 no.4 / pp.361-369 / 2023
  • Achieving realistic visual quality while maintaining optimal real-time rendering performance is a major challenge in evolving computer graphics and interactive 3D applications. Normal mapping, a core 3D technology, has matured through continuous optimization and iteration. Hybrid normal mapping, a newer blending model, has also made significant progress and has been applied in the 3D asset production pipeline. This study comprehensively explores hybrid normal techniques, analyzing Linear Blending, Overlay Blending, Whiteout Blending, UDN Blending, and Reoriented Normal Mapping, and focuses on how each technique balances rendering performance and visual fidelity. Considering computational efficiency, visual coherence, and adaptability across different 3D production scenes, we design comparative experiments to identify the optimal hybrid normal technique by analyzing the code, measuring the performance of the different blending methods in the engine, and comparing the resulting data. The purpose of our research is to find the choice best suited to mainstream workflows based on objective reality. We summarize the hybrid normal mapping techniques and experimentally derive the advantages and disadvantages of each, so that practitioners can reasonably choose which technique to apply to their projects.
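Several of the blending operators named in the abstract above have compact, well-known formulations. The sketch below writes them in plain Python for unpacked tangent-space normals (unit vectors, +Z up) rather than shader code; the Whiteout and Reoriented Normal Mapping (RNM) forms follow their commonly published formulations, and the vectors used are arbitrary examples.

```python
# Normal-blending operators on unpacked tangent-space normals (+Z up).
import math

def normalize(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def linear_blend(n1, n2):
    # Naive sum-and-renormalize; cheap but flattens strong detail.
    return normalize(tuple(a + b for a, b in zip(n1, n2)))

def udn_blend(n1, n2):
    # Detail x/y perturb the base; base z is kept unchanged.
    return normalize((n1[0] + n2[0], n1[1] + n2[1], n1[2]))

def whiteout_blend(n1, n2):
    # Like UDN but multiplies z, preserving more detail on slopes.
    return normalize((n1[0] + n2[0], n1[1] + n2[1], n1[2] * n2[2]))

def rnm_blend(n1, n2):
    # Reorients the detail normal into the frame of the base normal.
    t = (n1[0], n1[1], n1[2] + 1.0)
    u = (-n2[0], -n2[1], n2[2])
    d = sum(a * b for a, b in zip(t, u))
    return normalize(tuple(a * d / t[2] - b for a, b in zip(t, u)))

base   = normalize((0.2, 0.1, 0.95))
detail = (0.0, 0.0, 1.0)        # flat detail normal
print(rnm_blend(base, detail))  # RNM leaves the base normal unchanged
```

A flat detail normal is a useful sanity check: RNM returns the base normal exactly, which is one reason it is often preferred for visual fidelity, while the cheaper operators trade some of that correctness for fewer shader instructions.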