• Title/Summary/Keyword: data value

A Study on the Data Value: In Public Data (데이터 가치에 대한 탐색적 연구: 공공데이터를 중심으로)

  • Lee, Sang Eun;Lee, Jung Hoon;Choi, Hyun Jin
    • Journal of Information Technology Services
    • /
    • v.21 no.1
    • /
    • pp.145-161
    • /
    • 2022
  • Data is a key catalyst for the development of the Fourth Industrial Revolution and has been viewed as an essential element of new industries built on converging technologies such as artificial intelligence, augmented/virtual reality, autonomous driving, and 5G. The price and value of data are increasingly determined by how users apply it in context, rather than by the data itself, as in the past supplier-centric view. This study therefore began with the question of which factors increase the value of data from a user's perspective rather than a supplier's. The scope was limited to public data, and the subjects were users who work with data, for example through analysis or development based on it. The study was designed to gauge the value of data from the user's perspective, which had not previously been studied, and offers guidance for raising the value of data to the institutions that supply and manage it.

Economic Valuation of Public Sector Data: A Case Study on Small Business Credit Guarantee Data (공공부문 데이터의 경제적 가치평가 연구: 소상공인 신용보증 데이터 사례)

  • Kim, Dong Sung;Kim, Jong Woo;Lee, Hong Joo;Kang, Man Su
    • Knowledge Management Research
    • /
    • v.18 no.1
    • /
    • pp.67-81
    • /
    • 2017
  • As important breakthroughs continue in machine learning and artificial intelligence, interest has grown in analyzing and utilizing the big data that underpins these fields. Against this background, while the economic value of the data held by corporations and public institutions is well recognized, research on evaluating that value remains insufficient. As part of the economic valuation of data, this study therefore measures the economic value of the data generated through the small business guarantee program of the Korean Federation of Credit Guarantee Foundations (KOREG). To this end, we reviewed previous domestic and international research on the economic valuation of data and intangible assets, established evaluation methods, and conducted an empirical analysis. For the data value measurements, we used a cost-based approach, a revenue-based approach, and a market-based approach. To secure the reliability of the values produced by each approach, we conducted expert verification with the employees concerned. We also derived the major considerations and issues in measuring the economic value of data, which can contribute to empirical methods for such measurement in the future.
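
The three approaches named in the abstract are standard asset-valuation techniques rather than anything specific to this paper. A minimal sketch of how they differ, with entirely hypothetical figures:

```python
# Hedged illustration only: the function names and all numbers are hypothetical,
# not taken from the KOREG study.

def cost_based_value(collection_cost, processing_cost, maintenance_cost):
    """Value as the total cost of producing and maintaining the data."""
    return collection_cost + processing_cost + maintenance_cost

def revenue_based_value(annual_cash_flows, discount_rate):
    """Value as the present value of future cash flows attributed to the data."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(annual_cash_flows, start=1))

def market_based_value(price_per_record, record_count):
    """Value inferred from prices of comparable data sets traded in the market."""
    return price_per_record * record_count

if __name__ == "__main__":
    print("Cost-based:   ", cost_based_value(120_000, 45_000, 30_000))
    print("Revenue-based:", round(revenue_based_value([80_000, 85_000, 90_000], 0.05), 2))
    print("Market-based: ", market_based_value(0.15, 1_200_000))
```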

A Conceptual Study on the Quantitative Measurement of Digital Data Value (디지털 데이터 가치의 정량적 측정에 대한 개념적 연구)

  • Choi, Sung Ho;Lee, Sang Kon
    • Journal of Information Technology Services
    • /
    • v.21 no.5
    • /
    • pp.1-13
    • /
    • 2022
  • With the rapid development of computer technology and communication networks, economic activity in almost every field of modern society depends on electronic devices. The huge amount of digital data generated in these circumstances is refined by technologies such as artificial intelligence and big data, and its value has grown larger and larger. Until now, however, digital data has not been clearly defined as an economic asset, and the institutional criteria for expressing its value remain unclear. This study therefore organizes the definition and characteristics of digital data and examines what must be considered when treating digital data as an accounting asset. In addition, it presents a quantitative calculation model that can objectively measure the value of digital data by taking into account the time value of profits and costs.
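
The quantitative model the abstract mentions discounts the profits and costs attributable to the data over time. A minimal sketch of that general idea as a net-present-value style calculation; the cash flows and discount rate below are illustrative assumptions, not figures from the paper:

```python
# Hedged sketch: value of a digital data asset as the discounted net of the
# profits it generates and the costs it incurs over each period.

def data_asset_value(profits, costs, discount_rate):
    """Net present value of (profit - cost) per period."""
    value = 0.0
    for t, (p, c) in enumerate(zip(profits, costs), start=1):
        value += (p - c) / (1 + discount_rate) ** t
    return value

if __name__ == "__main__":
    profits = [50_000, 60_000, 65_000]   # hypothetical annual profits from the data
    costs   = [20_000, 18_000, 15_000]   # hypothetical annual storage/processing costs
    print(round(data_asset_value(profits, costs, 0.07), 2))
```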

The Impact of Big Data Analytics Capabilities and Values on Business Performance (빅데이터 분석능력과 가치가 비즈니스 성과에 미치는 영향)

  • Noh, Mi Jin;Lee, Choong Kwon
    • Smart Media Journal
    • /
    • v.10 no.1
    • /
    • pp.108-115
    • /
    • 2021
  • This study investigated the relationships between big data analytics capability, the value of big data, and business performance, surveying big data analysts in business organizations. The values that big data can bring were categorized into transactional, strategic, transformational, and informational value, and we attempted to verify whether these values lead to business performance. Two hundred responses from employees with experience in big data analysis were collected and analyzed. The hypotheses were tested with a structural equation model: big data analytics capability was found to have a significant effect on the value of big data and on business performance. Among the big data values, transactional, strategic, and transformational value had a positive effect on business performance, but the effect of informational value was not supported. The results are expected to provide useful information to business organizations seeking to achieve business performance using big data.

A comparative study on the business value assessment of local government open data assets in China based on AHP technique (AHP기법을 활용한 중국 지방정부 공공데이터 자산의 상업적 가치평가 대한 비교연구)

  • Jiaming Yin;Jae-Yeon Sim
    • Industry Promotion Research
    • /
    • v.8 no.3
    • /
    • pp.201-210
    • /
    • 2023
  • This study is grounded in data ecology theory and takes Chinese local governments' open public data as its research object, comparing data asset valuation methods from the new perspective of data business operations. The results show that an assessment model constructed with the Analytic Hierarchy Process (AHP) reflects the commercial value of government open data assets more objectively than the traditional cost, revenue, and market methods, offers the advantage of a comprehensive index of data value, and better captures regional differences through that index. The data show that the local government data value assessment index is positively proportional to the region's digital economy development index, highlighting its driving effect on the digital economy. The findings support the identification of rights to the value of local government data, and provide a useful reference for building data innovation and data business operation models, improving social well-being, and promoting the rapid development of the digital economy toward data realisation.
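
AHP itself is well documented: criterion weights come from the principal eigenvector of a pairwise comparison matrix, and the judgments are checked with a consistency ratio. A minimal sketch with hypothetical criteria and judgments, not the paper's actual hierarchy:

```python
import numpy as np

# Hypothetical pairwise comparisons among three criteria for open-data value
# (e.g. data quality, market demand, licensing openness); Saaty's 1-9 scale.
A = np.array([
    [1,   3,   5],
    [1/3, 1,   2],
    [1/5, 1/2, 1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # index of the principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # normalized criterion weights

n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)              # consistency index
ri = 0.58                                    # Saaty's random index for n = 3
cr = ci / ri                                 # consistency ratio (< 0.1 is acceptable)

print("weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))
```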

A study for system design that guarantees the integrity of computer files based on blockchain and checksum

  • Kim, Minyoung
    • International Journal of Advanced Culture Technology
    • /
    • v.9 no.4
    • /
    • pp.392-401
    • /
    • 2021
  • When a data file is shared on the Internet through various channels, it may be damaged in various ways. To guard against this, some websites publish the checksum value of a downloadable file as text. The user then compares the checksum of the downloaded file with the published value; if the two match, the file is considered intact. However, a checksum provided in text form is easily tampered with by an attacker, and if the correct checksum cannot be verified, the reliability and integrity of the data file cannot be ensured. In this paper, a checksum value is generated to ensure the integrity and reliability of a data file, and this value and related file information are stored in a blockchain. We then present the design and implementation of a system that shares the checksum values stored in the blockchain and compares them against other users' files.
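
The checksum half of the design can be illustrated independently of the blockchain component. A minimal sketch assuming SHA-256 as the digest algorithm (the abstract does not name one), with the blockchain lookup omitted:

```python
import hashlib

def file_checksum(path, chunk_size=65536):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, published_digest):
    """True if the file's digest matches the published (trusted) digest."""
    return file_checksum(path) == published_digest.lower()

# Hypothetical usage: the path and digest are placeholders; in the paper's
# design the trusted digest would be read from the blockchain, not hard-coded.
# print(verify("download.bin", "9f86d081..."))
```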

A Study on Veracity of Raw Data based on Value Creation -Focused on YouTube Monetization

  • CHOI, Seoyeon;SHIN, Seung-Jung
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.13 no.2
    • /
    • pp.218-223
    • /
    • 2021
  • The five elements of big data are said to be Volume, Variety, Velocity, Veracity, and Value. Among them, data that lacks veracity, or fake data, not only leads to errors in decision making but also hinders the creation of value. This study analyzed YouTube's revenue structure to examine how data veracity affects data valuation among these five factors. YouTube is one of the OTT service platforms, and owing to COVID-19 in 2020, the YouTube creator emerged as a new profession. Among the revenue models YouTube provides, we analyzed how advertising revenue is generated from click-based playbacks, and how profits attributed to invalid activity, that is, clicks that do not reflect viewers' genuine interest, are deducted before the final revenue is paid. Invalid activity in YouTube's revenue structure is raw data rather than viewers' genuine viewing activity, and it was confirmed to have a direct impact on revenue generation. Through this analysis, a new Data Value Chain is proposed.
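
The revenue flow described, gross ad revenue reduced by the share attributed to invalid activity before the creator is paid, reduces to simple arithmetic. A minimal sketch in which the CPM, invalid-activity rate, and revenue-share figures are illustrative assumptions rather than values from the paper:

```python
def creator_payout(monetized_playbacks, cpm, invalid_rate, creator_share=0.55):
    """Estimated payout after removing revenue attributed to invalid activity."""
    gross = monetized_playbacks / 1000 * cpm     # ad revenue before any deduction
    valid = gross * (1 - invalid_rate)           # strip revenue from invalid clicks/views
    return valid * creator_share                 # creator's share of the remainder

if __name__ == "__main__":
    # 500,000 monetized playbacks, $4.00 CPM, 3% of activity judged invalid
    print(round(creator_payout(500_000, 4.00, 0.03), 2))
```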

A Data Quality Management Maturity Model

  • Ryu, Kyung-Seok;Park, Joo-Seok;Park, Jae-Hong
    • ETRI Journal
    • /
    • v.28 no.2
    • /
    • pp.191-204
    • /
    • 2006
  • Many previous studies of data quality have focused on the realization and evaluation of both data value quality and data service quality. These studies revealed that poor data value quality and poor data service quality were caused by poor data structure. In this study we focus on metadata management, that is, data structure quality, and introduce a data quality management maturity model as a preferred maturity model. We empirically show that data quality improves as data management matures.

Implementation of Real Time 3 channel Transmission System Using ECG Data Compression Algorithm by Max-Min Slope Update (최대 및 최소 기울기 갱신에 의한 ECG 압축 알고리듬을 이용한 실시간 3채널 전송시스템 구현)

  • 조진호;김명남
    • Journal of Biomedical Engineering Research
    • /
    • v.16 no.3
    • /
    • pp.271-278
    • /
    • 1995
  • An ECG data compression algorithm using max-min slope update is proposed, and a real-time 3-channel ECG transmission system is implemented using the proposed algorithm. To compress ECG data effectively, a threshold is compared with the max-min slope difference (MMSD), which is updated at every sample. If the MMSD is smaller than the threshold, the sample is compressed; when the MMSD exceeds the threshold, the data is transmitted after storing the sample value and the length since the last sample that exceeded the threshold. As a result, the algorithm can accurately compress both the fast-changing QRS, P, and T wave regions and the slowly changing baseline regions, improving both the compression rate and the percent rms difference. In addition, because of its simplicity, the algorithm is well suited to real-time implementation.
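
A simplified sketch of the compression idea as the abstract describes it, not the authors' exact implementation: slopes from the last transmitted sample are tracked, and a new (value, run-length) pair is emitted only when the max-min slope difference exceeds the threshold. The signal and threshold are hypothetical.

```python
def mmsd_compress(samples, threshold):
    """Return (sample_value, run_length) pairs using a max-min slope difference test."""
    out = []
    anchor_idx, anchor_val = 0, samples[0]
    max_slope, min_slope = float("-inf"), float("inf")
    for i in range(1, len(samples)):
        slope = (samples[i] - anchor_val) / (i - anchor_idx)
        max_slope = max(max_slope, slope)
        min_slope = min(min_slope, slope)
        if max_slope - min_slope > threshold:     # MMSD exceeds threshold: transmit
            out.append((samples[i], i - anchor_idx))
            anchor_idx, anchor_val = i, samples[i]
            max_slope, min_slope = float("-inf"), float("inf")
    out.append((samples[-1], len(samples) - 1 - anchor_idx))   # flush the tail
    return out

if __name__ == "__main__":
    signal = [0, 0, 1, 1, 2, 8, 20, 12, 4, 1, 0, 0]   # toy QRS-like segment
    print(mmsd_compress(signal, threshold=1.5))
```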

An Improved Interpolation Method using Pixel Difference Values for Effective Reversible Data Hiding (효과적인 가역 정보은닉을 위한 픽셀의 차이 값을 이용한 개선된 보간법)

  • Kim, Pyung Han;Jung, Ki Hyun;Yoon, Eun-Jun;Ryu, Kwan-Woo
    • Journal of Korea Multimedia Society
    • /
    • v.24 no.6
    • /
    • pp.768-788
    • /
    • 2021
  • Reversible data hiding techniques transmit secret data to the recipient safely from malicious third-party attacks, and can also completely restore the image used as the transmission medium. Reversible data hiding schemes have been proposed in various forms, and interpolation-based schemes in particular are being actively researched. An interpolation-based reversible data hiding scheme expands the original image into a cover image and embeds secret data into it. However, existing interpolation-based schemes do not embed secret data during the interpolation process itself. To address this, this paper proposes embedding a first set of secret data during image interpolation and a second set into the interpolated cover image. In the embedding process, the original image is divided into non-overlapping blocks, and the maximum and minimum values are determined within each block. A three-way search based on the maximum value and a two-way search based on the minimum value are performed, and image interpolation is carried out while embedding the first secret data using the PVD scheme. A stego image is then created by embedding the second secret data into the interpolated cover image using the maximum difference value and a log function. As a result, the proposed scheme embeds secret data twice; in particular, it can embed secret data even during the interpolation step, where previous schemes embedded none. Experimental results show that the proposed scheme can transmit more secret data to the receiver while maintaining image quality similar to other interpolation-based reversible data hiding schemes.
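
The proposed scheme's block-wise searches, PVD embedding, and log-function step are not specified here in enough detail to reproduce. As background, a minimal sketch of the generic interpolation-based reversible-hiding idea it builds on: expand the image by neighbor-mean interpolation and embed bits only into interpolated pixels, so the original pixels, and therefore the original image, are recovered exactly.

```python
import numpy as np

def expand_and_embed(original, secret_bits):
    """Expand by neighbor-mean interpolation, embedding one bit per interpolated pixel."""
    h, w = original.shape
    H, W = 2 * h - 1, 2 * w - 1
    cover = np.zeros((H, W), dtype=int)
    cover[::2, ::2] = original                   # original pixels are never modified
    bits = iter(secret_bits)
    for r in range(H):
        for c in range(W):
            if r % 2 == 0 and c % 2 == 0:
                continue                         # original pixel: skip
            if r % 2 == 0:                       # between two horizontal originals
                base = (cover[r, c - 1] + cover[r, c + 1]) // 2
            elif c % 2 == 0:                     # between two vertical originals
                base = (cover[r - 1, c] + cover[r + 1, c]) // 2
            else:                                # center of four originals
                base = (cover[r - 1, c - 1] + cover[r - 1, c + 1]
                        + cover[r + 1, c - 1] + cover[r + 1, c + 1]) // 4
            cover[r, c] = base + next(bits, 0)   # embed one bit (overflow handling omitted)
    return cover

def extract(cover):
    """Recover the original image and embedded bits; bases depend only on original pixels."""
    original = cover[::2, ::2].copy()
    bits = []
    H, W = cover.shape
    for r in range(H):
        for c in range(W):
            if r % 2 == 0 and c % 2 == 0:
                continue
            if r % 2 == 0:
                base = (cover[r, c - 1] + cover[r, c + 1]) // 2
            elif c % 2 == 0:
                base = (cover[r - 1, c] + cover[r + 1, c]) // 2
            else:
                base = (cover[r - 1, c - 1] + cover[r - 1, c + 1]
                        + cover[r + 1, c - 1] + cover[r + 1, c + 1]) // 4
            bits.append(int(cover[r, c] - base))
    return original, bits

if __name__ == "__main__":
    img = np.array([[100, 120], [110, 130]])
    stego = expand_and_embed(img, [1, 0, 1, 1, 0])
    recovered, bits = extract(stego)
    print(np.array_equal(recovered, img), bits)   # True [1, 0, 1, 1, 0]
```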