• Title/Summary/Keyword: Workflow Process

Search Results: 315

Development of a Web-based User Experience Certification System based on User-centered System Design Approach (사용자 중심의 웹 기반 제품 사용경험 인증·평가 시스템 개발)

  • Na, Ju Yeoun;Kim, Jihee;Jung, Sungwook;Lee, Dong Hyun;Lee, Cheol;Bahn, Sangwoo
    • The Journal of Society for e-Business Studies, v.24 no.1, pp.29-48, 2019
  • Recently, product design innovation to improve user experience has been perceived as a core element of enterprise competitiveness due to fierce market competition and the decreasing technological gap between companies, but services to support the product experience evaluation of small and medium-sized companies (SMCs) remain insufficient. The aim of this study is to develop a web-based product user experience evaluation and certification system that supports product design practices for SMCs. For the system interface design, we conducted systematic functional requirement elicitation methods such as user surveys, workflow analysis, user task definition, and function definition. The main functions, information structure, navigation method, and detailed graphical user interfaces were then developed in consideration of user interactions and requirements. In particular, the system provides databases that support the evaluation process above a certain level of performance and efficiency, as well as knowledge databases to be utilized in evaluation and product design improvement. It is expected that the developed service platform will enhance SMCs' product development capability with regard to user experience evaluation by connecting consulting firms with SMCs.

Evaluation of functional suitable digital complete denture system based on 3D printing technology

  • Deng, Kehui;Chen, Hu;Wang, Yong;Zhou, Yongsheng;Sun, Yuchun
    • The Journal of Advanced Prosthodontics, v.13 no.6, pp.361-372, 2021
  • PURPOSE. To improve the clinical effects of complete denture use and simplify its clinical application, a digital complete denture restoration workflow (Functional Suitable Digital Complete Denture System, FSD) was proposed and a preliminary clinical evaluation was performed. MATERIALS AND METHODS. Forty edentulous patients were enrolled; half were treated by a prosthodontic chief physician and the others by a postgraduate student. Based on the primary impression and jaw relation obtained at the first visit, a diagnostic denture was designed and printed to obtain the definitive impression, jaw relation, and esthetic confirmation at the second visit. A redesigned complete denture was printed as a mold to fabricate the final denture, which was delivered at the third visit. To evaluate the accuracy of the impression made with the diagnostic denture, the final denture was used as a tray to make an impression, and 3D comparison was used to analyze their difference. To evaluate the clinical effect of FSD, visual analogue scale (VAS) scores were determined by both dentists and patients. RESULTS. Two visits were saved before denture delivery. The RMS values of the 3D comparison between the impressions made via diagnostic dentures and the final dentures were 0.165 ± 0.033 mm in the upper jaw and 0.139 ± 0.031 mm in the lower jaw. VAS ratings were between 8.5 and 9.6 in the chief physician group and between 7.7 and 9.5 in the student group; there was no statistical difference between the two groups. CONCLUSION. FSD can simplify the complete denture restoration process and reduce the number of visits. The accuracy of impressions made with diagnostic dentures was clinically acceptable, and both dentists and patients were satisfied according to the VAS ratings.
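The per-point differences from such a 3D surface comparison are typically summarized as an RMS value. A minimal sketch of that computation (the sample deviations are hypothetical, not the study's data):

```python
import math

def rms_deviation(deviations_mm):
    """Root-mean-square of signed point-wise deviations (mm) from a 3D comparison."""
    return math.sqrt(sum(d * d for d in deviations_mm) / len(deviations_mm))

# Hypothetical deviations (mm) sampled between the diagnostic-denture
# impression surface and the final denture surface
sample = [0.12, -0.18, 0.20, -0.15, 0.16]
print(round(rms_deviation(sample), 3))  # → 0.164
```

Because deviations are squared, positive and negative errors do not cancel, which is why RMS is the standard summary statistic for surface-deviation maps.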

Design of Standard Metadata Schema for Computing Resource Management (컴퓨팅 리소스 관리를 위한 표준 메타데이터 스키마 설계)

  • Lee, Mikyoung;Cho, Minhee;Song, Sa-Kwang;Yim, Hyung-Jun
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference, 2022.10a, pp.433-435, 2022
  • In this paper, we introduce a standard metadata schema design for registering, retrieving, and managing the computing resources used for research data analysis and utilization in the Korea Research Data Commons (KRDC). KRDC is a system for the joint utilization of research data and computing resources that aims to maximize their sharing and use. Computing resources refer to all resources in the computing environment, such as analysis infrastructure and analysis software, necessary to analyze and utilize research data across the entire research process. The standard metadata schema for KRDC computing resource management is designed by considering attributes common to computing resource management as well as attributes specific to each computing resource. It consists of a computing resource metadata schema and a computing resource provider metadata schema, and the metadata for both resources and providers is organized into a service schema group and a system schema group according to its characteristics. The standard metadata schema designed in this paper is used for computing resource registration, retrieval, management, and workflow services for computing resource providers and users through the KRDC web service, and is designed in an extensible form for linking various computing resources.
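The two-part structure described above (a resource schema and a provider schema, each split into a service group and a system group) can be sketched as data classes. All field names below are assumptions for illustration; the abstract does not enumerate the actual KRDC attributes:

```python
from dataclasses import dataclass, field

@dataclass
class ServiceSchema:
    """User-facing (service) attributes of a computing resource -- hypothetical fields."""
    name: str
    description: str
    resource_type: str  # e.g. "analysis software" or "analysis infrastructure"

@dataclass
class SystemSchema:
    """Operational (system) attributes of a computing resource -- hypothetical fields."""
    endpoint: str
    access_policy: str

@dataclass
class ComputingResource:
    service: ServiceSchema
    system: SystemSchema

@dataclass
class ResourceProvider:
    provider_id: str
    name: str
    resources: list = field(default_factory=list)  # resources offered by this provider

# Register one resource under a provider
provider = ResourceProvider("prov-001", "Example Provider")
provider.resources.append(
    ComputingResource(
        ServiceSchema("SeqTool", "Sequence analysis software", "analysis software"),
        SystemSchema("https://example.org/api", "restricted"),
    )
)
print(len(provider.resources))  # → 1
```

Separating common attributes (the shared dataclasses) from per-resource specifics mirrors the abstract's "common attributes plus feature-specific attributes" design and keeps the schema extensible for new resource types.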


Integrated Data Safe Zone Prototype for Efficient Processing and Utilization of Pseudonymous Information in the Transportation Sector (교통분야 가명정보의 효율적 처리 및 활용을 위한 통합데이터안심구역 프로토타입)

  • Hyoungkun Lee;Keedong Yoo
    • The Journal of The Korea Institute of Intelligent Transport Systems, v.23 no.3, pp.48-66, 2024
  • Under Korea's three amended data-economy laws and the Data Industry Act, systems for pseudonymous data integration and Data Safe Zones have been operated separately by designated agencies, which burdens SMEs, startups, and general users with complicated and ineffective procedures. An over-stringent pseudonymization policy intended to prevent data breaches has also compromised data quality. These practices should be improved to ensure both convenience of use and data quality. This paper proposes a prototype of an Integrated Data Safe Zone based on redesigned and optimized pseudonymization workflows. The conventional pseudonymization workflows were redesigned by applying the amended guidelines and selectively revising existing guidelines for business process redesign. The proposed prototype quantitatively outperformed the conventional one: a 6-fold increase in time efficiency, a 1.28-fold cost reduction, and a 1.3-fold improvement in data quality.
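A common building block of such pseudonymization workflows is replacing a direct identifier with a keyed hash so that records can still be linked without exposing the identifier. A minimal sketch; the record fields, salt handling, and truncation length are illustrative assumptions, not the paper's actual workflow:

```python
import hashlib
import hmac

# Hypothetical secret; real deployments manage this key separately from the data
SECRET_SALT = b"replace-with-managed-secret"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier (e.g. a transit card number) with a keyed hash."""
    digest = hmac.new(SECRET_SALT, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]  # truncated for readability; length is a design choice

record = {"card_no": "9410-1234-5678", "boarding_stop": "S101", "time": "08:12"}
safe = {**record, "card_no": pseudonymize(record["card_no"])}
print(safe["card_no"] != record["card_no"])  # → True
```

The same input always maps to the same pseudonym (enabling joins across datasets), while recovering the original requires the secret key, which is the property that distinguishes pseudonymization from anonymization.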

An Analysis of Big Video Data with Cloud Computing in Ubiquitous City (클라우드 컴퓨팅을 이용한 유시티 비디오 빅데이터 분석)

  • Lee, Hak Geon;Yun, Chang Ho;Park, Jong Won;Lee, Yong Woo
    • Journal of Internet Computing and Services, v.15 no.3, pp.45-52, 2014
  • The Ubiquitous-City (U-City) is a smart, intelligent city that satisfies the desire to enjoy IT services with any device, anytime, anywhere. It is a future city model based on the Internet of Everything or Things (IoE or IoT), and it includes many networked video cameras. These networked video cameras support many U-City services, serving together with sensors as one of the main input sources, and they constantly generate a huge amount of video information, real big data for the U-City. The U-City is usually required to manipulate this big data in real time, which is not easy at all. It is also often required that the accumulated video data be analyzed to detect an event or find a figure among them, which demands a lot of computational power and usually takes a long time. Current research tries to reduce the processing time of such big video data, and cloud computing can be a good solution to this problem. Among the many applicable cloud computing methodologies, MapReduce is an interesting and attractive one: it has many advantages and is gaining popularity in many areas. Video cameras evolve day by day, and their resolution improves sharply, leading to exponential growth of the data produced by networked video cameras; we are coping with real big data when we deal with video images produced by high-quality cameras. Video surveillance systems were of limited use before cloud computing, but they are now spreading widely in U-Cities thanks to these methodologies. Because video data are unstructured, it is not easy to find good research results on analyzing them with MapReduce. This paper presents an analysis system for video surveillance, a cloud-computing-based video data management system that is easy to deploy, flexible, and reliable.
It consists of the video manager, the video monitors, the storage for video images, the storage client, and the streaming-IN component. The "video monitor" consists of the "video translator" and the "protocol manager", and the "storage" contains the MapReduce analyzer. All components were designed according to the functional requirements of a video surveillance system. The "streaming IN" component receives video data from the networked video cameras and delivers them to the "storage client"; it also manages network bottlenecks to smooth the data stream. The "storage client" receives the video data from the "streaming IN" component, stores them in the storage, and helps other components access the storage. The "video monitor" component transfers the video data via smooth streaming and manages the protocols. The "video translator" sub-component enables users to manage the resolution, codec, and frame rate of the video image, while the "protocol" sub-component manages the Real Time Streaming Protocol (RTSP) and the Real Time Messaging Protocol (RTMP). We use the Hadoop Distributed File System (HDFS) as the cloud-computing storage: Hadoop stores the data in HDFS and provides a platform that can process the data with the simple MapReduce programming model. We suggest our own methodology to analyze the video images using MapReduce; the workflow of video analysis is presented and explained in detail in this paper. The performance evaluation was conducted experimentally, and we found that the proposed system worked well; the evaluation results are presented with analysis. With our cluster system, we used compressed 1920×1080 (FHD) resolution video data, the H.264 codec, and HDFS as video storage, and we measured the processing time according to the number of frames per mapper. Tracing the optimal splitting size of the input data and the processing time according to the number of nodes, we found the linearity of the system performance.
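The map/reduce split over video frames described above can be illustrated with a toy, single-process sketch. The detector, labels, and frames-per-mapper value are hypothetical stand-ins for the paper's actual H.264/HDFS pipeline:

```python
from collections import defaultdict

def detect(frame_index):
    """Placeholder per-frame detector; the real system decodes H.264 frames from HDFS."""
    return "event" if frame_index % 50 == 0 else "normal"

def map_frames(frames):
    """Map phase: each mapper processes one split of frames, emitting (label, 1) pairs."""
    for f in frames:
        yield (detect(f), 1)

def reduce_counts(pairs):
    """Reduce phase: sum the counts per label across all mappers."""
    totals = defaultdict(int)
    for label, n in pairs:
        totals[label] += n
    return dict(totals)

frames = list(range(200))        # frame indices of a short video
frames_per_mapper = 100          # the split size, the tunable the paper measures
chunks = [frames[i:i + frames_per_mapper]
          for i in range(0, len(frames), frames_per_mapper)]
pairs = [p for chunk in chunks for p in map_frames(chunk)]
print(reduce_counts(pairs))      # → {'event': 4, 'normal': 196}
```

Because each split is processed independently in the map phase, adding nodes (or tuning frames per mapper) scales the frame-processing work nearly linearly, which is consistent with the linearity the paper reports.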