• Title/Summary/Keyword: computer technologies

Reflecting Academic Symposia as a Trend at Animation Festivals, Media Art Festivals and Conferences on Computer Animation (학술회 반영 경향의 애니메이션 페스티벌과 미디어 아트 페스티벌 그리고 컴퓨터 애니메이션 학회)

  • Hagler, Juergen;Bruckner, Franziska
    • Cartoon and Animation Studies
    • /
    • s.49
    • /
    • pp.611-631
    • /
    • 2017
  • At first there was practice; festivals and theory followed. Compared to animation production, which is older than the medium of film itself, festivals and theory in this area started with a delay. While animation programs were shown at film festivals such as Cannes from the mid-1940s, the first animation festival was founded in Annecy, France in 1960, followed by several short-lived events in Romania, Italy and Tokyo and finally, in 1972, by the second-oldest festival still running to date, Animafest Zagreb. Animation theory evolved in the late 1980s in the Anglo-American area with associations such as the Society for Animation Studies, following its 'big sister,' film studies. Expanding ever since as a research area, European animation studies in, e.g., France, German-speaking countries, Poland and Croatia have been catching up in recent years by organizing theoretical conferences and publications. A vivid synergy between practice, festivals and theory has always been a key factor in establishing a platform for the art form and culture of animation. In the past few years, however, a trend could be observed towards a more intense interaction between animation festivals and theory: animation festivals are hosting theoretical and scientific symposia or conferences that are open to artists' positions and insights into the industry. At the beginning of the lecture, a short reflection on the concept of Animafest Scanner itself is followed by an introduction to the Symposium Expanded Animation at the media festival Ars Electronica Linz. The talk subsequently focuses on the multilayered academic symposia at the Festival of Animated Film ITFS and the International Conference on Animation, Effects, VR, Games and Transmedia in Stuttgart. These case studies reveal the blurring boundaries between art, science, theory and industry as well as the specificities of the interplay between artists, practitioners, scholars, curators and festival visitors in different formats.

Impact of Net-Based Customer Service on Firm Profits and Consumer Welfare (기업의 온라인 고객 서비스가 기업의 수익 및 고객의 후생에 미치는 영향에 관한 연구)

  • Kim, Eun-Jin;Lee, Byung-Tae
    • Asia pacific journal of information systems
    • /
    • v.17 no.2
    • /
    • pp.123-137
    • /
    • 2007
  • The advent of the Internet and related Web technologies has created an easily accessible link between a firm and its customers, and has provided opportunities for a firm to use information technology to support supplementary after-sale services associated with a product or service. It has been widely recognized that supplementary services are as important a source of customer value and of competitive advantage as the characteristics of the product itself. Many of these supplementary services are information-based and need not be co-located with the product, so more and more companies are delivering them electronically. Net-based customer service, defined as an Internet-based computerized information system that delivers services to a customer, is therefore the core infrastructure for supplementary service provision. The importance of net-based customer service in delivering supplementary after-sale services associated with a product has been well documented. The strategic advantages of well-implemented net-based customer service are enhanced customer loyalty and higher lock-in of customers, with a resulting reduction in competition and a consequent increase in profits. However, not all customers utilize such net-based customer service. The digital divide is the phenomenon in our society that captures the observation that not all customers have equal access to computers. Socioeconomic factors such as race, gender, and education level are strongly related to Internet accessibility and the ability to use it. This is due to differences in the ability to bear the cost of a computer and differences in self-efficacy in the use of technology, among other reasons. This concept, applied to e-commerce, has been called the "e-commerce divide." High Internet penetration is not eradicating the digital divide and the e-commerce divide as one would hope.
Besides, to accommodate personalized support, a customer must often provide personal information to the firm. This personal information includes not only name and address, but also preference information and perhaps valuation information. However, many recent studies show that consumers may not be willing to share information about themselves due to concerns about privacy online. Because of the e-commerce divide, and because of customers' privacy and security concerns about sharing personal information with firms, only a limited number of customers adopt net-based customer service. This limited level of adoption affects firm profits and customer welfare. We use a game-theoretic model in which the net-based customer service system acts as a mechanism to enhance customer loyalty. We model a market entry scenario where a firm (the incumbent) uses the net-based customer service system to induce loyalty in its customer base. The firm sells one product through traditional retailing channels at a price set for those channels. Another firm (the entrant) enters the market and, having observed the price of the incumbent firm (and after deducing the loyalty levels in the customer base), chooses its price. The profits of the firms and the surplus of the two customer segments (the segment that utilizes net-based customer service and the segment that does not) are analyzed in a Stackelberg leader-follower model of competition between the firms. We find that an increase in adoption of net-based customer service by the customer base is not always desirable for firms. With low effectiveness in enhancing customer loyalty, firms prefer a high level of customer adoption of net-based customer service, because an increase in the adoption rate decreases competition and increases profits. A firm in an industry where net-based customer service is a highly effective loyalty mechanism, on the other hand, prefers a low level of adoption by customers.
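The leader-follower structure described in this abstract can be sketched numerically. The following is a hypothetical toy model, not the paper's actual formulation: a unit mass of customers, of whom a fraction `alpha` adopt the net-based service and gain a loyalty premium `l`, zero marginal costs, and prices chosen from a grid; the entrant best-responds to each candidate incumbent price.

```python
import numpy as np

def demand(p_i, p_e, alpha, l):
    """Return the incumbent's share of a unit mass of customers.

    Loyal customers (fraction alpha) stay with the incumbent unless its
    price exceeds the entrant's by more than the loyalty premium l;
    the rest simply buy the cheaper product (ties go to the incumbent).
    """
    loyal = alpha if p_i <= p_e + l else 0.0
    switchers = (1.0 - alpha) if p_i <= p_e else 0.0
    return loyal + switchers

def stackelberg(alpha, l, grid=np.linspace(0.01, 2.0, 200)):
    """Incumbent (leader) picks a price anticipating the entrant's best response."""
    best = (-1.0, None, None)
    for p_i in grid:
        # follower: entrant best-responds to the observed incumbent price
        p_e = max(grid, key=lambda p: p * (1.0 - demand(p_i, p, alpha, l)))
        profit_i = p_i * demand(p_i, p_e, alpha, l)
        if profit_i > best[0]:
            best = (profit_i, p_i, p_e)
    return best  # (incumbent profit, incumbent price, entrant price)
```

Sweeping `alpha` in such a sketch lets one reproduce the qualitative comparative statics the authors describe: whether a larger adopting segment helps or hurts the incumbent depends on how strong the loyalty effect `l` is.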

Dosimetric Characteristics on Penumbra Regions of the Multileaf Collimator as Compared with the Lead Alloy Block (다엽 콜리메이터(Multileaf Collimator)와 합금납 차폐물(Lead Alloy Block)의 반 그림자영역의 선량 분포상의 특성 비교)

  • Lee Sang Wook;Oh Young Tack;Kim Woo Cheol;Keum Ki Chang;Yoon Seong Ick;Kim Hyun Soo;Park Won;Chu Seong Sil;Kim Gwi Eon
    • Radiation Oncology Journal
    • /
    • v.13 no.4
    • /
    • pp.391-396
    • /
    • 1995
  • Purpose : Conformal radiation therapy has been widely used thanks to the development of computer technologies. The delivery of a large number of static radiation fields is necessary for conformal irradiation. In this paper we investigate dosimetric characteristics of the penumbra regions of a multileaf collimator (MLC) and compare them to those of a lead alloy block, for the optimal use of the system in 3-D conformal radiotherapy. Materials and Methods : The measurement of the penumbra formed by the MLC or the lead alloy block was performed with 6 or 10 MV X-rays. The film was positioned at dmax depth and at 10 cm depth, and its optical density was determined using a scanning videodensitometer. The effective penumbra, the distance from the 80% to the 20% isodose line and from the 90% to the 10% isodose line, was analyzed as a function of the angle between the direction of leaf motion and the edge defined by the leaves. Results : Increasing the MLC angle (0-75°) increased the penumbra width and the scalloping effect. There was no definite difference in penumbra width from the 80% to the 20% isodose line, while there was a small increase in penumbra width from the 90% to the 10% isodose line when varying the depth and energy. The effective penumbra width of the lead alloy block agrees reasonably with that of the MLC, within 4.8 mm. Conclusion : This comparative qualitative study of the penumbra between the MLC and the lead alloy block demonstrates the clinical acceptability and suitability of the multileaf collimator for 3-D conformal radiotherapy.

Modern Paper Quality Control

  • Olavi Komppa
    • Proceedings of the Korea Technical Association of the Pulp and Paper Industry Conference
    • /
    • 2000.06a
    • /
    • pp.16-23
    • /
    • 2000
  • The increasing functional needs of top-quality printing papers and packaging paperboards, and especially the rapid developments in electronic printing processes and various computer printers during the past few years, set new targets and requirements for modern paper quality. Most of these paper grades today have relatively high filler content, are moderately or heavily calendered, and have many coating layers for the best appearance and performance. In practice, this means that many of the traditional quality assurance methods, mostly designed to measure papers made of pure, native pulp only, cannot reliably (or at all) be used to analyze or rank the quality of modern papers. Hence, the introduction of new measurement techniques is necessary to assure and further develop paper quality today and in the future. Paper formation, i.e. small-scale (millimeter-scale) variation of basis weight, is the most important quality parameter of papermaking due to its influence on practically all the other quality properties of paper. The ideal paper would be completely uniform, so that the basis weight of each small point (area) measured would be the same. In practice, of course, this is not possible because there always exist relatively large local variations in paper. However, these small-scale basis weight variations are the major reason for many other quality problems, including calender blackening, uneven coating results, uneven printing results, etc. The traditionally used visual inspection or optical measurement of the paper does not give a reliable understanding of the material variations in the paper, because in the modern papermaking process the optical behavior of paper is strongly affected by the use of, e.g., fillers, dyes or coating colors. Furthermore, the opacity (optical density) of the paper changes at different process stages such as wet pressing and calendering.
The greatest advantage of using the beta transmission method to measure paper formation is that it can be very reliably calibrated to measure true basis weight variation of all kinds of paper and board, independently of sample basis weight or paper grade. This gives us the possibility to measure, compare and judge papers made of different raw materials or of different colors, or even to measure heavily calendered, coated or printed papers. Scientific research in paper physics has shown that the orientation of the top-layer (paper surface) fibers of the sheet plays the key role in paper curling and cockling, causing the typical practical problems (paper jams) with modern fax and copy machines, electronic printing, etc. On the other hand, the fiber orientation at the surface and middle layers of the sheet controls the bending stiffness of paperboard. Therefore, a reliable measurement of paper surface fiber orientation gives us a magnificent tool to investigate and predict paper curling and cockling tendency, and provides the necessary information to fine-tune the manufacturing process for optimum quality. Many papers, especially heavily calendered and coated grades, resist liquid and gas penetration very strongly, being beyond the measurement range of the traditional instruments or resulting in inconveniently long measuring times per sample. The increased surface hardness and the use of filler minerals and mechanical pulp make a reliable, non-leaking sample contact with the measurement head a challenge of its own. Paper surface coating creates, as expected, a layer which has completely different permeability characteristics compared to the other layers of the sheet.
The latest developments in sensor technologies have made it possible to reliably measure gas flow under well-controlled conditions, allowing us to investigate the gas penetration of open structures, such as cigarette paper, tissue or sack paper, and, in the low permeability range, to analyze even fully greaseproof papers, silicone papers, heavily coated papers and boards, or even to detect defects in barrier coatings! Even nitrogen or helium may be used as the gas, giving us completely new possibilities to rank the products or to find correlations with critical process or converting parameters. All modern paper machines include many on-line measuring instruments which are used to give the necessary information to automatic process control systems. Hence, the reliability of the information obtained from these different sensors is vital for good optimization and process stability. If any of these on-line sensors does not operate exactly as planned (having even a small measurement error or malfunction), the process control will set the machine to operate away from the optimum, resulting in loss of profit or eventual problems in quality or runnability. To assure optimum operation of the paper machines, a novel quality assurance policy for the on-line measurements has been developed, including control procedures utilizing traceable, accredited standards for the best reliability and performance.

The Need for Paradigm Shift in Semantic Similarity and Semantic Relatedness : From Cognitive Semantics Perspective (의미간의 유사도 연구의 패러다임 변화의 필요성-인지 의미론적 관점에서의 고찰)

  • Choi, Youngseok;Park, Jinsoo
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.1
    • /
    • pp.111-123
    • /
    • 2013
  • Semantic similarity/relatedness measures between two concepts play an important role in research on system integration and database integration. Moreover, current research on keyword recommendation or tag clustering strongly depends on this kind of semantic measure. For this reason, many researchers in various fields, including computer science and computational linguistics, have tried to improve methods for calculating semantic similarity/relatedness. The study of similarity between concepts is meant to discover how a computational process can model the way a human determines the relationship between two concepts. Most research on calculating semantic similarity uses ready-made reference knowledge, such as a semantic network or a dictionary, to measure concept similarity. The topological method calculates relatedness or similarity between concepts based on various forms of a semantic network, including a hierarchical taxonomy. This approach assumes that the semantic network reflects human knowledge well. The nodes in a network represent concepts, and ways to measure the conceptual similarity between two nodes are also regarded as ways to determine the conceptual similarity of two words (i.e., two nodes in a network). Topological methods can be categorized as node-based or edge-based, also called the information content approach and the conceptual distance approach, respectively. The node-based approach calculates similarity between concepts based on how much information the two concepts share in terms of a semantic network or taxonomy, while the edge-based approach estimates the distance between the nodes that correspond to the concepts being compared. Both approaches assume that the semantic network is static; that is, the topological approach has not considered changes of semantic relations between concepts in the semantic network.
However, as information and communication technologies advance the sharing of knowledge among people, semantic relations between concepts in a semantic network may change. To explain this change in semantic relations, we adopt cognitive semantics. The basic assumption of cognitive semantics is that humans judge semantic relations based on their cognition and understanding of concepts. This cognition and understanding is called 'world knowledge.' World knowledge can be categorized as personal knowledge and cultural knowledge. Personal knowledge is the knowledge gained from personal experience; everyone can have different personal knowledge of the same concept. Cultural knowledge is the knowledge shared by people who live in the same culture or use the same language; people in the same culture have a common understanding of specific concepts. Cultural knowledge can be the starting point of a discussion about the change of semantic relations. If the culture shared by people changes for some reason, their cultural knowledge may also change. Today's society and culture are changing at a fast pace, and the change of cultural knowledge is not a negligible issue in research on semantic relationships between concepts. In this paper, we propose future directions for research on semantic similarity. In other words, we discuss how research on semantic similarity can reflect the change of semantic relations caused by the change of cultural knowledge. We suggest three directions for future research on semantic similarity. First, the research should include versioning and update methodology for the semantic network. Second, a semantic network that is dynamically generated can be used for the calculation of semantic similarity between concepts; if researchers can develop a methodology to extract the semantic network from a given knowledge base in real time, this approach can solve many problems related to the change of semantic relations.
Third, a statistical approach based on corpus analysis can be an alternative to methods using a semantic network. We believe that these proposed research directions can be a milestone for research on semantic relations.
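The two topological families the abstract contrasts can be illustrated on a toy taxonomy. The mini-taxonomy and corpus counts below are invented for illustration and do not come from the paper; the sketch implements a simple path-based measure (edge-based, conceptual distance) and a Resnik-style measure (node-based, information content).

```python
import math

# Hypothetical mini-taxonomy (child -> parent) and made-up corpus counts.
TAXONOMY = {
    "animal": None,
    "mammal": "animal", "bird": "animal",
    "dog": "mammal", "cat": "mammal", "sparrow": "bird",
}
CORPUS_FREQ = {"dog": 6, "cat": 6, "sparrow": 3, "mammal": 2, "bird": 1, "animal": 2}

def ancestors(node):
    """Chain from a node up to the root, the node itself first."""
    chain = []
    while node is not None:
        chain.append(node)
        node = TAXONOMY[node]
    return chain

def lcs(a, b):
    """Lowest common subsumer of two concepts."""
    anc_b = set(ancestors(b))
    return next(n for n in ancestors(a) if n in anc_b)

def path_similarity(a, b):
    """Edge-based (conceptual distance): 1 / (1 + shortest path via the LCS)."""
    common = lcs(a, b)
    dist = ancestors(a).index(common) + ancestors(b).index(common)
    return 1.0 / (1.0 + dist)

def subtree_freq(node):
    """Frequency mass of a node plus all of its descendants."""
    total = CORPUS_FREQ.get(node, 0)
    for child, parent in TAXONOMY.items():
        if parent == node:
            total += subtree_freq(child)
    return total

def resnik_similarity(a, b):
    """Node-based (information content): IC of the lowest common subsumer."""
    total = subtree_freq("animal")  # root mass equals the whole corpus
    return -math.log(subtree_freq(lcs(a, b)) / total)
```

The static-network limitation the authors point out is visible here: both measures are fully determined by `TAXONOMY` and `CORPUS_FREQ`, so any cultural shift in how concepts relate requires regenerating those structures.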

Real-time CRM Strategy of Big Data and Smart Offering System: KB Kookmin Card Case (KB국민카드의 빅데이터를 활용한 실시간 CRM 전략: 스마트 오퍼링 시스템)

  • Choi, Jaewon;Sohn, Bongjin;Lim, Hyuna
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.2
    • /
    • pp.1-23
    • /
    • 2019
  • Big data refers to data that is difficult to store, manage, and analyze with existing software. As consumers' changing lifestyles increase the size and variety of the needs they express, companies are investing a lot of time and money to understand those needs. Companies in various industries utilize big data to improve their products and services, analyze unstructured data, and respond to products and services in real time. The financial industry operates decision support systems that use financial data to develop financial products and manage customer risks. The use of big data by financial institutions can effectively create added value along the value chain and makes it possible to develop a more advanced customer relationship management strategy. Financial institutions can utilize the purchase data and unstructured data generated by credit cards, making it possible to identify and satisfy customers' desires. CRM has become a granular process that can be measured in real time as it has grown together with information and knowledge systems. With the development of information services and CRM, platforms have changed and it has become possible to meet consumer needs in various environments. Recently, as the needs of consumers have diversified, more companies are providing systematic marketing services using data mining and advanced CRM (Customer Relationship Management) techniques. KB Kookmin Card, which started as a credit card business in 1980, achieved early stabilization of its processes and computer systems, and actively participated in introducing new technologies and systems. In 2011, the bank and credit card businesses separated, and the company led the 'Hye-dam Card' and 'One Card' markets, which deviated from the existing concept. In 2017, the total use of domestic credit cards and check cards grew by 5.6% year-on-year to 886 trillion won. In 2018, we received a long-term rating of AA+ as a result of our credit card evaluation.
We confirmed that our credit rating was at the top of the list through effective marketing strategies and services. At present, Kookmin Card emphasizes strategies that meet the individual needs of customers and maximize the lifetime value of consumers by utilizing customers' payment data. KB Kookmin Card combines internal and external big data to conduct marketing in real time and to build a monitoring system. It has built a marketing system that detects real-time behavior using big data, such as homepage visits and purchase history, based on customer card information. It is designed to capture customer action events in real time and execute marketing by utilizing the stores, locations, amounts, and usage patterns of card transactions. The company has created more than 280 different scenarios based on the customer life cycle and conducts marketing plans to accommodate various customer groups in real time. It operates a smart offering system, a highly efficient marketing management system that detects customers' card usage, behavior, and location information in real time and provides further refined services by combining with various apps. This study aims to trace the shift from traditional CRM to the current CRM strategy through the process of CRM strategy change. Finally, we examine the current CRM strategy through KB Kookmin Card's big data utilization strategy and marketing activities, and propose a marketing plan for KB Kookmin Card's future CRM strategy. KB Kookmin Card should invest in securing the ICT technology and human resources that are becoming ever more sophisticated, for the success and continuous growth of the smart offering system. It is necessary to establish a strategy for securing profit from a long-term perspective and to proceed systematically.
Especially in the current situation, where privacy violation and personal information leakage issues are being raised, efforts should be made to gain customers' acceptance of marketing that uses their information and to build a corporate image that emphasizes security.

Performance Optimization of Numerical Ocean Modeling on Cloud Systems (클라우드 시스템에서 해양수치모델 성능 최적화)

  • JUNG, KWANGWOOG;CHO, YANG-KI;TAK, YONG-JIN
    • The Sea:JOURNAL OF THE KOREAN SOCIETY OF OCEANOGRAPHY
    • /
    • v.27 no.3
    • /
    • pp.127-143
    • /
    • 2022
  • Recently, many attempts have been made to run numerical ocean models in cloud computing environments. A cloud computing environment can be an effective means of implementing numerical ocean models that require large-scale resources, or of quickly preparing modeling environments for global or large-scale grids. Many commercial and private cloud computing systems provide technologies such as virtualization, high-performance CPUs and instances, Ethernet-based high-performance networking, and remote direct memory access for High Performance Computing (HPC). These new features facilitate ocean modeling experimentation on commercial cloud computing systems. Many scientists and engineers expect cloud computing to become mainstream in the near future. Analysis of the performance and features of commercial cloud services for numerical modeling is essential in order to select appropriate systems, as this can help to minimize execution time and the amount of resources utilized. The effect of cache memory is large in the processing structure of an ocean numerical model, which processes input/output of data in a multidimensional array structure, and network speed is important due to the communication characteristics through which a large amount of data moves. In this study, the performance of the Regional Ocean Modeling System (ROMS), the High Performance Linpack (HPL) benchmarking software package, and the STREAM memory benchmark was evaluated and compared on commercial cloud systems to provide information for the transition of other ocean models to cloud computing. Through analysis of actual performance data and configuration settings obtained from virtualization-based commercial clouds, we evaluated the efficiency of the computing resources for various model grid sizes in virtualization-based cloud systems. We found that cache hierarchy and capacity are crucial to the performance of ROMS, which uses huge amounts of memory.
Memory latency is also important to performance. Increasing the number of cores to reduce the running time of numerical modeling is more effective with large grid sizes than with small grid sizes. Our analysis results will be helpful as a reference for constructing the best computing system in the cloud to minimize time and cost for numerical ocean modeling.
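The STREAM-style memory measurement the study relies on can be approximated in a few lines. This is a minimal triad-style bandwidth probe in Python/NumPy, not the official STREAM benchmark; the array size, repeat count, and byte accounting are simplifications chosen for illustration.

```python
import time
import numpy as np

def triad_bandwidth(n=20_000_000, repeats=5, scalar=3.0):
    """Best observed GB/s for the STREAM-style triad a = b + scalar * c.

    Counts 3 * 8 bytes of traffic per element (read b, read c, write a),
    matching the usual STREAM accounting; returns the best of `repeats` runs.
    """
    rng = np.random.default_rng(0)
    b, c = rng.random(n), rng.random(n)
    a = np.empty_like(b)
    best = 0.0
    for _ in range(repeats):
        t0 = time.perf_counter()
        np.multiply(c, scalar, out=a)   # a = scalar * c
        a += b                          # a = b + scalar * c
        dt = time.perf_counter() - t0
        best = max(best, 3 * n * 8 / 1e9 / dt)
    return best
```

Running such a probe on candidate cloud instances gives a quick, portable proxy for the sustained-bandwidth differences that, per the study, dominate ROMS performance at large grid sizes.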

Deep Learning OCR based document processing platform and its application in financial domain (금융 특화 딥러닝 광학문자인식 기반 문서 처리 플랫폼 구축 및 금융권 내 활용)

  • Dongyoung Kim;Doohyung Kim;Myungsung Kwak;Hyunsoo Son;Dongwon Sohn;Mingi Lim;Yeji Shin;Hyeonjung Lee;Chandong Park;Mihyang Kim;Dongwon Choi
    • Journal of Intelligence and Information Systems
    • /
    • v.29 no.1
    • /
    • pp.143-174
    • /
    • 2023
  • With the development of deep learning technologies, Artificial Intelligence powered Optical Character Recognition (AI-OCR) has evolved to accurately read multiple languages from various forms of images. For the financial industry, where a large number of diverse documents are processed manually, the potential for using AI-OCR is great. In this study, we present the configuration and design of an AI-OCR modality for use in the financial industry and discuss the platform construction with application cases. Since the use of financial domain data is prohibited under the Personal Information Protection Act, we developed a deep learning-based data generation approach and used it to train the AI-OCR models. The AI-OCR models are trained for image preprocessing, text recognition, and language processing, and are configured as a microservice-architected platform to process a broad variety of documents. We have demonstrated the AI-OCR platform by applying it to the financial domain tasks of document sorting, document verification, and typing assistance. The demonstrations confirm increased work efficiency and convenience.

KANO-TOPSIS Model for AI Based New Product Development: Focusing on the Case of Developing Voice Assistant System for Vehicles (KANO-TOPSIS 모델을 이용한 지능형 신제품 개발: 차량용 음성비서 시스템 개발 사례)

  • Yang, Sungmin;Tak, Junhyuk;Kwon, Donghwan;Chung, Doohee
    • Journal of Intelligence and Information Systems
    • /
    • v.28 no.1
    • /
    • pp.287-310
    • /
    • 2022
  • Companies' interest in developing AI-based intelligent new products is increasing. Recently, the main concern of companies has been to innovate the customer experience and create new value by developing new products through the effective use of artificial intelligence technology. However, due to the nature of products based on radical technologies such as artificial intelligence, intelligent products differ from existing products in character and in development method, so there is a clear limit to applying existing development methodologies as they are. This study proposes a new research method based on KANO-TOPSIS for the successful development of AI-based intelligent new products, using car voice assistants as an example. The KANO model is used to select and evaluate the functions that customers think a new product needs, and the TOPSIS method is used to derive priorities by finding the importance of the functions customers need. For the analysis, major categories such as vehicle condition check and function control elements, driving-related elements, characteristics of the voice assistant itself, infotainment elements, and daily life support elements were selected, and customer demand attributes were subdivided. As a result of the analysis, high recognition accuracy should be considered the top priority in the development of car voice assistants. Infotainment elements that provide customized content based on the driver's biometric information and usage habits showed lower priorities than expected, while functions related to driver safety, such as vehicle condition notification, driving assistance, and security, were shown to be functions that should be developed preferentially. This study is meaningful in that it presents a new product development methodology suitable for the characteristics of AI-based intelligent new products with innovative characteristics, through an excellent model combining KANO and TOPSIS.
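The TOPSIS ranking step can be sketched generically. The decision matrix, weights, and criterion directions below are invented placeholders (the study derives its weights from the KANO survey); the function itself is the standard TOPSIS procedure of normalize, weight, and measure closeness to the ideal point.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives: rows = alternatives, columns = criteria.

    benefit[j] is True if criterion j is better when larger.
    Returns relative-closeness scores in [0, 1]; higher is better.
    """
    m = np.asarray(matrix, dtype=float)
    # 1. vector-normalize each criterion column, then apply weights
    v = m / np.linalg.norm(m, axis=0) * np.asarray(weights, dtype=float)
    # 2. ideal (best) and anti-ideal (worst) points per criterion
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    # 3. Euclidean distances and relative closeness to the ideal
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    return d_worst / (d_best + d_worst)

# e.g. 3 candidate functions x 2 criteria (importance: benefit, cost: not)
scores = topsis([[0.9, 3], [0.7, 1], [0.5, 2]],
                weights=[0.6, 0.4], benefit=[True, False])
```

In a KANO-TOPSIS pipeline the KANO categories (must-be, one-dimensional, attractive) would first filter and weight the candidate functions; TOPSIS then orders the survivors by closeness score.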

A Study on Public Interest-based Technology Valuation Models in Water Resources Field (수자원 분야 공익형 기술가치평가 시스템에 대한 연구)

  • Ryu, Seung-Mi;Sung, Tae-Eung
    • Journal of Intelligence and Information Systems
    • /
    • v.24 no.3
    • /
    • pp.177-198
    • /
    • 2018
  • Recently, as water resources have come to be regarded as "public property," it has become necessary to acquire and utilize a framework for measuring their economic value and managing performance. To date, the evaluation of water technology has been carried out by feasibility study analysis or technology assessment based on net present value (NPV) or benefit-to-cost (B/C) effects; however, it has not yet been systemized in terms of valuation models that objectively assess the economic value of a technology-based business so that research outcomes can be diffused and fed back. Therefore, K-water (a government-supported public company in Korea) felt the necessity to establish a technology valuation framework suitable for the technical characteristics of the water resources fields in its charge and to verify it on an exemplary case applied to a technology. The K-water valuation technology applied in this study can, as a public-interest good, be used as a tool to measure and manage the value and achievement contributed to society. By calculating the value the subject technology contributes to society as a whole as a public resource, we use it as basic information for publicizing the influence and benefits of the technology or the necessity of cost input, and thereby secure the legitimacy of large-scale R&D investment given the characteristics of public technology. Hence, K-water, a public corporation in Korea which deals with the public good of water resources, will be able to establish a commercialization strategy for business operation and prepare a basis for calculating the performance of invested R&D costs. In this study, K-water has developed a web-based technology valuation model for public-interest water resources, based on a technology evaluation system suited to the characteristics of technologies in water resources fields.
In particular, by utilizing the evaluation methodology of the Institute of Advanced Industrial Science and Technology (AIST) in Japan to match expense items to expense accounts based on the related benefit items, we proposed the so-called 'K-water proprietary model,' which combines the 'cost-benefit' approach with FCF (Free Cash Flow), and ultimately built a pipeline into the K-water research performance management system and verified it on a practical case of a technology related to desalination. We analyze the embedded design logic and evaluation process of the web-based valuation system that reflects the characteristics of water resources technology, along with the reference information and database (D/B) logic associated with each model, to calculate public-interest-based and profit-based technology values in the integrated technology management system. We review the hybrid evaluation module, which turns the qualitative evaluation indices reflecting the unique characteristics of water resources into quantitative indices, and the visualized user interface (UI) of the actual web-based evaluation; both are appended, for calculating business value based on financial data, to the existing web-based technology valuation systems in other fields. K-water's technology valuation model distinguishes between public-interest-type and profit-type water technologies. First, the evaluation modules in the profit-type technology valuation model are designed around the 'profitability of the technology'; for example, K-water's technology inventory holds a number of profit-oriented technologies such as water treatment membranes. The public-interest-type technology valuation, on the other hand, is designed to evaluate public-interest-oriented technologies such as dams, reflecting the characteristics of public benefits and costs. In order to examine the appropriateness of the cost-benefit-based public utility valuation model (i.e., the K-water-specific technology valuation model) presented in this study, we applied it to a practical case, calculating a benefit-to-cost analysis of a water resource technology with a 20-year lifetime. In the future, we will further verify the K-water public-utility-based valuation model for each business model, reflecting various business environment characteristics.
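The cost-benefit building blocks the abstract mentions (NPV and the B/C ratio over a 20-year lifetime) can be sketched as follows. The discount rate and cash-flow figures are illustrative assumptions, not K-water data.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[t] occurs at the end of year t+1."""
    return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cashflows))

def bc_ratio(rate, benefits, costs):
    """Benefit-to-cost ratio: present value of benefits over present value of costs."""
    return npv(rate, benefits) / npv(rate, costs)

# 20-year lifetime, 4.5% discount rate (assumed for illustration)
years = 20
benefits = [120.0] * years               # annual public benefit, illustrative units
costs = [500.0] + [30.0] * (years - 1)   # heavy initial R&D outlay, then O&M
ratio = bc_ratio(0.045, benefits, costs)
```

A ratio above 1 indicates that discounted benefits exceed discounted costs, which is the acceptance criterion a B/C-based public-interest valuation of this kind applies.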