• Title/Summary/Keyword: Data integrity


Verification Control Algorithm of Data Integrity Verification in Remote Data sharing

  • Xu, Guangwei;Li, Shan;Lai, Miaolin;Gan, Yanglan;Feng, Xiangyang;Huang, Qiubo;Li, Li;Li, Wei
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.16 no.2
    • /
    • pp.565-586
    • /
    • 2022
  • Cloud storage's elastic expansibility not only provides flexible services for data owners to store their data remotely, but also reduces the storage, operation, and management costs of data sharing. However, data outsourced to the storage space of a cloud service provider raises security concerns about its integrity. Data integrity verification has therefore become an important technology for detecting the integrity of remotely shared data. Allowing users without data access rights to verify data integrity causes unnecessary overhead for the data owner and the cloud service provider; in particular, malicious users who constantly launch integrity verification greatly waste service resources. Since the data owner is a consumer purchasing cloud services, he must bear both the cost of data storage and that of data verification. This paper proposes a verification control algorithm for the integrity verification of remotely outsourced data. It designs an attribute-based encryption verification control algorithm for multiple verifiers. Moreover, the data owner and the cloud service provider jointly construct a common access structure and generate a verification sentinel to check the authority of verifiers according to that structure. Finally, since the cloud service provider knows neither the access structure nor the sentinel generation operation, it can authenticate only verifiers satisfying the access policy to verify the integrity of the corresponding outsourced data. Theoretical analysis and experimental results show that the proposed algorithm achieves fine-grained access control over multiple verifiers for data integrity verification.
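
The gating idea described above can be illustrated with a minimal sketch: a verifier's attribute set is tested against an AND/OR access structure before an integrity check is permitted. This is a hypothetical simplification, not the paper's attribute-based encryption construction; the policy encoding and function names are illustrative.

```python
import hashlib

def satisfies(policy, attributes):
    """Evaluate an AND/OR policy tree, e.g. ("AND", "auditor", ("OR", "deptA", "deptB"))."""
    if isinstance(policy, str):
        return policy in attributes          # leaf: a single required attribute
    op, *children = policy
    results = (satisfies(c, attributes) for c in children)
    return all(results) if op == "AND" else any(results)

def verify_integrity(block: bytes, stored_digest: str, attributes: set, policy) -> bool:
    """Run the integrity check only for verifiers that satisfy the access structure."""
    if not satisfies(policy, attributes):    # verification control gate
        raise PermissionError("verifier does not satisfy the access structure")
    return hashlib.sha256(block).hexdigest() == stored_digest
```

A verifier holding `{"auditor", "deptA"}` passes the example policy above and may run the check; one holding only `{"deptB"}` is rejected before any hash computation is done.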

A Lightweight Integrity Authentication Scheme based on Reversible Watermark for Wireless Body Area Networks

  • Liu, Xiyao;Ge, Yu;Zhu, Yuesheng;Wu, Dajun
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.8 no.12
    • /
    • pp.4643-4660
    • /
    • 2014
  • Integrity authentication of biometric data in a Wireless Body Area Network (WBAN) is a critical issue because the sensitive data transmitted over broadcast wireless channels can be attacked easily. However, traditional cryptography-based integrity authentication schemes are not suitable for WBANs because they consume substantial computational resources on sensor nodes with limited memory, computational capability, and power. To address this problem, a novel lightweight integrity authentication scheme based on reversible watermarking is proposed for WBANs and implemented on a TinyOS-based WBAN test bed in this paper. In the proposed scheme, the data are divided into fixed-size groups to improve grouping efficiency; the histogram shifting technique is adopted to avoid possible underflow or overflow; local maps are generated to restore the shifted data; and the watermarks are generated and embedded in a chaining way for integrity authentication. Our analytic and experimental results demonstrate that the integrity of biometric data can be reliably authenticated at low cost, and that the data can be entirely recovered for healthcare applications using the proposed scheme.
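
The histogram-shifting step mentioned above can be sketched as the classic peak/zero-bin method: values between a peak bin and an empty zero bin are shifted by one, and watermark bits are embedded at the peak. This is a minimal illustration under the assumption that the zero bin is empty, not the paper's exact grouping, local-map, or chained-embedding scheme.

```python
def embed(samples, bits, peak, zero):
    """Shift values in (peak, zero) up by 1, then embed one bit per peak-valued sample."""
    assert peak < zero
    out, it = [], iter(bits)
    for v in samples:
        if peak < v < zero:
            out.append(v + 1)                 # histogram shift to free bin peak+1
        elif v == peak:
            out.append(v + next(it, 0))       # embed bit: peak -> peak (0) or peak+1 (1)
        else:
            out.append(v)
    return out

def extract(samples, peak, zero):
    """Recover the embedded bits and restore the original samples (reversible)."""
    bits, restored = [], []
    for v in samples:
        if v == peak:
            bits.append(0); restored.append(v)
        elif v == peak + 1:
            bits.append(1); restored.append(peak)
        elif peak + 1 < v <= zero:
            restored.append(v - 1)            # undo the histogram shift
        else:
            restored.append(v)
    return bits, restored
```

Reversibility is the key property: `extract` returns both the watermark bits and the exact original samples, which is what allows the biometric data to be fully recovered after authentication.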

Automatic Data Matching System for CAD Data's Integrity (CAD 데이터의 무결성을 위한 데이터 매칭 자동화 시스템)

  • Byeon, Hae-Gwon;Yoo, Woo-Sik
    • IE interfaces
    • /
    • v.24 no.1
    • /
    • pp.71-77
    • /
    • 2011
  • Design work consists of essential work and subsidiary work. Essential design work means developing creative and productive ideas, while subsidiary design work supports the essential work: making data tables and specification sheets, and checking the integrity of CAD data. Subsidiary design work forms the bulk of the whole design process and affects delivery deadlines. We therefore propose an automatic data matching system for CAD data integrity. The proposed system automates the subsidiary design work and consists of three parts: 1) automatic generation of data tables, 2) a supporting module for checking the integrity of CAD data between drawings, and 3) automatic generation of specification sheets. The developed system was tested at an LCD equipment manufacturing company and was found to be useful.

A Representation of Engineering Change Objects and Their Integrity Constraints Using an Active Object-Oriented Database Model (능동형 객체지향적 데이터베이스 모델을 이용한 설계변경 개체 및 제약조건의 표현)

  • 도남철
    • Journal of Information Technology Applications and Management
    • /
    • v.10 no.1
    • /
    • pp.111-125
    • /
    • 2003
  • This paper proposes a product data model that can express and enforce integrity constraints on product structure during engineering changes (ECs). The model adopts and extends an active object-oriented database model in order to integrate EC data and their integrity constraints. Tightly integrated with the product structure, it will enable designers to maintain and exchange consistent EC data throughout the product life cycle. In order to properly support EC operations, the model provides the data, operations, and Event-Condition-Action rules for nested ECs and for the simultaneous application of ECs to multiple options. In addition, the EC objects proposed in the model integrate the data and integrity constraints into a unified repository. This repository enables designers to access all EC data and integrity constraints through the product structure and the relationships between EC objects. This paper also describes a prototype product data management system based on the proposed model in order to demonstrate its effectiveness.
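
The Event-Condition-Action pattern used above can be sketched in a few lines: a rule fires on an event, tests a condition against the affected object, and runs an action that enforces a constraint. The class and field names below are illustrative, not the paper's EXPRESS-based model.

```python
class ECARule:
    """An Event-Condition-Action rule: fire `action` when `event` occurs and `condition` holds."""
    def __init__(self, event, condition, action):
        self.event, self.condition, self.action = event, condition, action

class ProductDB:
    """A toy active database: updates raise events that are matched against ECA rules."""
    def __init__(self):
        self.parts, self.rules = {}, []

    def fire(self, event, part):
        for r in self.rules:
            if r.event == event and r.condition(self, part):
                r.action(self, part)

    def update_revision(self, name, rev):
        part = self.parts.setdefault(name, {"rev": 0, "released": False})
        part["rev"] = rev
        self.fire("revision_changed", part)   # event raised by the data operation
```

For example, a rule with event `"revision_changed"`, condition "the part is released", and action "flag the part for EC approval" enforces the constraint that released parts cannot be revised silently.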


Development of an Internet based Virtual Reality Environment and Web Database for the Integrity Evaluation of the Nuclear Power Plant (원자력발전소 건전성평가를 위한 인터넷기반 가상현실환경과 웹데이터베이스의 개발)

  • 김종춘;정민중;최재붕;김영진;표창률
    • Korean Journal of Computational Design and Engineering
    • /
    • v.6 no.2
    • /
    • pp.140-146
    • /
    • 2001
  • A nuclear power plant is composed of a number of mechanical components, and maintaining the integrity of these components is one of the most critical issues in the nuclear industry. Evaluating the integrity of these mechanical components requires a large amount of data, including inspection data, geometrical data, and material properties. Therefore, an effective database system is essential for managing the integrity of a nuclear power plant. For this purpose, an internet-based virtual reality environment and web database system is proposed. The developed virtual reality environment provides realistic geometrical configurations of mechanical components using VRML (Virtual Reality Modeling Language). The virtual reality environment is linked with the web database, which manages the data required for the integrity evaluation. The proposed system is able to share information regarding the integrity evaluation over the internet and thus is suitable as an integrated system for the maintenance of mechanical components.


Building A PDM/CE Environment and Validating Integrity Using STEP (STEP을 이용한 PDM/CE환경의 구축과 데이타 무결성 확인)

  • 유상봉;서효원;고굉욱
    • The Journal of Society for e-Business Studies
    • /
    • v.1 no.1
    • /
    • pp.173-194
    • /
    • 1996
  • In order to adapt to today's short product life cycles and rapid technology changes, integrated systems should be extended to support PDM (Product Data Management) or CE (Concurrent Engineering). A PDM/CE environment has been developed, and a prototype is presented in this paper. Features of the PDM/CE environment are: 1) the integrated product information model (IPIM) includes both a data model and integrity constraints, 2) database systems are organized hierarchically so that working data cannot be referenced by other application systems until they are released into the global database, and 3) integrity constraints written in EXPRESS are validated both in the local databases and in the global database. By keeping the integrity of the product data, undesirable propagation of illegal data to other application systems can be prevented. For efficient validation, the constraints are distributed into the local and global schemata. Separate triggering mechanisms are devised using the dependency of constraints on three different data operations, i.e., insertion, deletion, and update.
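
The operation-dependent triggering idea above can be sketched as a registry that attaches each constraint only to the data operations it depends on, so an update never re-runs insert-only checks. This is a schematic in Python with illustrative constraint names; the paper validates EXPRESS constraints, not Python predicates.

```python
class ConstraintStore:
    """Register constraints per data operation and validate only the relevant ones."""
    def __init__(self):
        self.by_op = {"insert": [], "delete": [], "update": []}

    def register(self, ops, check):
        for op in ops:                        # attach the check to its triggering ops
            self.by_op[op].append(check)

    def validate(self, op, record):
        """Return the names of constraints violated by this operation."""
        return [c.__name__ for c in self.by_op[op] if not c(record)]

# Illustrative constraints
def positive_quantity(rec):
    return rec.get("qty", 0) > 0

def has_part_id(rec):
    return "part_id" in rec
```

Distributing constraints this way mirrors the paper's local/global split: each schema only evaluates the checks triggered by operations it actually performs, which keeps validation cheap.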


The Effect of Corporate Integrity on Stock Price Crash Risk

  • YIN, Hong;ZHANG, Ruonan
    • Asian Journal of Business Environment
    • /
    • v.10 no.1
    • /
    • pp.19-28
    • /
    • 2020
  • Purpose: This research aims to investigate the impact of corporate integrity on stock price crash risk. Research design, data, and methodology: Taking 1,419 firms listed on the Shenzhen Stock Exchange in China as a sample, this paper empirically analyzes the relationship between corporate integrity and stock price crash risk. The main integrity data were hand-collected from the Shenzhen Stock Exchange website; other financial data were collected from the CSMAR database. Results: The findings show that corporate integrity significantly decreases stock price crash risk. After changing the sample selection, the model estimation methods, and the proxy variable for stock price crash risk, the conclusion still holds. Further analysis shows that the relationship between corporate integrity and stock price crash risk is found only in firms with weak internal control and in firms in areas with poor legal systems. Conclusions: The results suggest that corporate integrity has a significant influence on the behavior of managers. Business ethics reduces the likelihood that managers overstate financial performance and hide bad news, which lowers the likelihood of future stock price crashes. Meanwhile, corporate integrity can complement internal control and the legal system in decreasing stock price crash risk.

RPIDA: Recoverable Privacy-preserving Integrity-assured Data Aggregation Scheme for Wireless Sensor Networks

  • Yang, Lijun;Ding, Chao;Wu, Meng
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.9 no.12
    • /
    • pp.5189-5208
    • /
    • 2015
  • To address the contradiction between data aggregation and data security in wireless sensor networks, a Recoverable Privacy-preserving Integrity-assured Data Aggregation (RPIDA) scheme is proposed based on privacy homomorphism and an aggregate message authentication code. The proposed scheme provides both end-to-end privacy and data integrity for data aggregation in WSNs. In our scheme, the base station can recover each sensing datum collected by the sensors even after these data have been aggregated by aggregators, and thus can verify the integrity of all sensing data. Besides, with these individual sensing data, the base station is able to perform any further operations on them, which means RPIDA is not limited in the types of aggregation functions it supports. The security analysis indicates that our proposal is resilient against typical security attacks and can detect and locate malicious nodes within a certain range. The performance analysis shows that the proposed scheme has a remarkable advantage over other asymmetric schemes in terms of computation and communication overhead. In order to evaluate the performance and feasibility of our proposal, a prototype implementation is presented based on the TinyOS platform. The experimental results demonstrate that RPIDA is feasible and efficient for resource-constrained sensor nodes.
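
The "recoverable" property above can be illustrated with a toy construction: each sensor packs its reading into a distinct bit slot so a plain in-network sum can be split back into individual readings, and per-node HMAC tags are XOR-combined into one aggregate tag that the base station recomputes from the recovered data. The slot width and keys are illustrative assumptions, not the paper's privacy-homomorphism or aggregate-MAC construction.

```python
import hmac, hashlib

SLOT = 16  # bits per sensor reading; each reading must fit in 16 bits

def encode(node_id, reading):
    return reading << (node_id * SLOT)       # place the reading in its own slot

def aggregate(encoded):
    return sum(encoded)                      # in-network aggregation is a plain sum

def recover(agg, n_nodes):
    """Base station splits the aggregate back into individual readings."""
    return [(agg >> (i * SLOT)) & ((1 << SLOT) - 1) for i in range(n_nodes)]

def tag(key, node_id, reading):
    msg = f"{node_id}:{reading}".encode()
    return hmac.new(key, msg, hashlib.sha256).digest()

def xor_tags(tags):
    """Combine per-node tags into a single aggregate tag."""
    out = bytes(len(tags[0]))
    for t in tags:
        out = bytes(a ^ b for a, b in zip(out, t))
    return out
```

To verify integrity, the base station recovers the readings from the aggregate, recomputes each node's tag with its shared key, XORs them, and compares against the aggregate tag it received; any tampered reading changes the expected tag.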

Meteorological Data Integrity for Environmental Impact Assessment in Yongdam Catchment (용담댐시험유역 환경영향평가의 신뢰수준 향상을 위한 기상자료의 품질검정)

  • Lee, Khil-Ha
    • Journal of Environmental Science International
    • /
    • v.29 no.10
    • /
    • pp.981-988
    • /
    • 2020
  • This study presents a meteorological data integrity analysis to improve environmental quality assessment in the Yongdam catchment. The study examines both the extreme ranges of meteorological measurements and data reliability, covering maximum and minimum temperature, relative humidity, dew point temperature, radiation, and heat flux. There were some outliers and missing data in the measurements. In addition, the latent heat flux and sensible heat flux data were not reasonable, and the evapotranspiration data did not match at some points. The accuracy and consistency of the data stored in the study database were secured through the integrity analysis. Users should take caution when using meteorological data from the Yongdam catchment in the preparation of water resources planning, environmental impact assessment, and natural hazard analysis.
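
The range and consistency screening described above can be sketched as a small quality-control pass over one observation record: flag missing values, values outside physical bounds, and internal inconsistencies such as a dew point above the air temperature. The variable names and bounds here are hypothetical, not the thresholds used in the study.

```python
def screen(record, bounds):
    """Return quality flags (variable, reason) for one observation record."""
    flags = []
    for var, value in record.items():
        lo, hi = bounds.get(var, (float("-inf"), float("inf")))
        if value is None:
            flags.append((var, "missing"))
        elif not lo <= value <= hi:
            flags.append((var, "out_of_range"))
    # internal consistency check: dew point cannot exceed air temperature
    t, td = record.get("t_max"), record.get("dew_point")
    if t is not None and td is not None and td > t:
        flags.append(("dew_point", "inconsistent"))
    return flags
```

Records with an empty flag list pass screening; flagged records would be corrected, gap-filled, or excluded before the data enter the assessment database.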

Video Integrity Checking Scheme by Using Merkle Tree (머클트리를 활용한 영상무결성 검사 기법 )

  • Yun-Hee Kang;Eun-Young CHANG;Taeun Kwonk
    • Journal of Platform Technology
    • /
    • v.10 no.4
    • /
    • pp.39-46
    • /
    • 2022
  • Recently, digital content including video and sound is created in various fields, transmitted to the cloud through the Internet, and then stored and used. In order to utilize digital content, verifying data integrity is essential, and the network bandwidth efficiency of the verification data must be ensured. This paper describes the design and implementation of a server that maintains, manages, and provides data for verifying the integrity of video data. The server receives and stores image data from Logger, a module that acquires image data, and provides the data necessary for verification to Verifier, a module that verifies image data. A lightweight Merkle tree is then constructed using the hash values. The lightweight Merkle tree can quickly detect integrity violations without comparing the individual hash values of the corresponding video frames at each frame index across the two versions. A lightweight Merkle tree is constructed by generating hash values of the digital content so as to achieve network bandwidth efficiency, and the results of the integrity verification are presented.
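
The Merkle-tree check described above can be sketched as follows: frame hashes form the leaves, a single root comparison detects any violation, and only the subtrees whose hashes differ are expanded to locate changed frames. This is a minimal illustration with SHA-256 and illustrative function names, not the paper's lightweight construction.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_levels(frames):
    """Return all tree levels, leaf hashes first, root level last."""
    level = [h(f) for f in frames]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:                    # duplicate the last node on odd levels
            level = level + [level[-1]]
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def root(frames) -> bytes:
    return build_levels(frames)[-1][0]

def locate_changes(levels_a, levels_b):
    """Descend from the root, expanding only subtrees whose hashes differ."""
    n = len(levels_a[0])
    suspects = [0] if levels_a[-1] != levels_b[-1] else []
    for depth in range(len(levels_a) - 2, -1, -1):
        nxt = []
        for node in suspects:
            for child in (2 * node, 2 * node + 1):
                la, lb = levels_a[depth], levels_b[depth]
                if child < len(la) and child < len(lb) and la[child] != lb[child]:
                    nxt.append(child)
        suspects = nxt
    return [i for i in suspects if i < n]     # leaf indexes of changed frames
```

Comparing the two roots answers "was anything modified?" with one hash comparison, and `locate_changes` touches only O(log n) nodes per changed frame rather than every leaf, which is the bandwidth-efficiency argument made above.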