• Title/Summary/Keyword: Data integrity

Provably-Secure Public Auditing with Deduplication

  • Kim, Dongmin;Jeong, Ik Rae
    • KSII Transactions on Internet and Information Systems (TIIS), v.11 no.4, pp.2219-2236, 2017
  • With cloud storage services, users can handle an enormous amount of data in an efficient manner. However, due to the widespread popularization of cloud storage, users have raised concerns about the integrity of outsourced data, since they no longer possess the data locally. To address these concerns, many auditing schemes have been proposed that allow users to check the integrity of their outsourced data without retrieving it in full. Yuan and Yu proposed a public auditing scheme with a deduplication property, in which the cloud server does not store data duplicated between users. In this paper, we analyze a weakness of Yuan and Yu's scheme and present modifications that could improve its security. We also define two types of adversaries and prove that our proposed scheme is secure against these adversaries under formal security models.
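
As a rough, hedged illustration of the general idea of auditing outsourced data without retrieving it in full (this is neither the paper's scheme nor Yuan and Yu's construction), the Python sketch below precomputes nonce-based digests: before outsourcing, the client stores a few (block index, nonce, expected digest) triples, and each later audit spends one of them. Block size, function names, and the challenge count are illustrative assumptions.

```python
import hashlib, os, random

BLOCK_SIZE = 4096  # illustrative block size (an assumption, not from the paper)

def split_blocks(data: bytes):
    """Split a file into fixed-size blocks."""
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]

def precompute_challenges(blocks, num_challenges=16):
    """Client side: store (index, nonce, expected digest) triples before
    outsourcing; each triple can be spent on one later audit."""
    challenges = []
    for _ in range(num_challenges):
        i = random.randrange(len(blocks))
        nonce = os.urandom(16)
        challenges.append((i, nonce, hashlib.sha256(nonce + blocks[i]).digest()))
    return challenges

def server_respond(blocks, i, nonce):
    """Server side: the digest can only be reproduced if block i is intact."""
    return hashlib.sha256(nonce + blocks[i]).digest()

def audit(challenges, respond):
    """Client side: spend one stored challenge and compare the response."""
    i, nonce, expected = challenges.pop()
    return respond(i, nonce) == expected
```

This toy approach supports only a fixed number of audits and offers no public verifiability, which is why practical public auditing schemes rely on more sophisticated cryptographic tags and formal security models.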

Verification Algorithm for the Duplicate Verification Data with Multiple Verifiers and Multiple Verification Challenges

  • Xu, Guangwei;Lai, Miaolin;Feng, Xiangyang;Huang, Qiubo;Luo, Xin;Li, Li;Li, Shan
    • KSII Transactions on Internet and Information Systems (TIIS), v.15 no.2, pp.558-579, 2021
  • Cloud storage provides flexible data storage services that allow data owners to outsource their data remotely and reduce their data storage and management costs. Outsourced data, however, raise security concerns for the data owner because of possible malicious deletion or corruption by the cloud service provider. Data integrity verification is an important way to check the integrity of outsourced data. However, existing data verification schemes only consider the case in which a single verifier launches multiple data verification challenges, and neglect the verification overhead when multiple verifiers launch challenges at around the same time. In this case, the duplicate data in multiple challenges are verified repeatedly, so verification resources are wasted. We propose a duplicate data verification algorithm based on multiple verifiers and multiple challenges to reduce the verification overhead. The algorithm dynamically schedules the verifiers' challenges based on verification time and on the frequent itemsets of duplicate verification data in the challenge sets, found by applying the FP-Growth algorithm, and computes batch proofs of the frequent itemsets. The challenges are then split into two parts, i.e., duplicate data and unique data, according to the results of data extraction. Finally, the proofs of duplicate data and unique data are computed and combined to generate a complete proof for every original challenge. Theoretical analysis and experimental evaluation show that the algorithm reduces the verification cost and ensures the correctness of data integrity verification through flexible batch data verification.
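
The Python below is a loose sketch of the split-and-batch idea only (not the paper's algorithm): it pools several verifiers' challenge sets, separates block indices that occur in more than one challenge from those that occur in only one, proves the duplicate indices once, and reuses those cached proofs when assembling each verifier's complete proof. A plain frequency count stands in for FP-Growth, and a hash over block contents stands in for the cryptographic batch proofs; every name here is an assumption.

```python
import hashlib
from collections import Counter

def split_challenges(challenges):
    """challenges: {verifier_id: set of block indices}.
    Returns (duplicate indices, unique indices per verifier)."""
    counts = Counter(i for ch in challenges.values() for i in ch)
    duplicate = {i for i, c in counts.items() if c > 1}
    unique = {v: ch - duplicate for v, ch in challenges.items()}
    return duplicate, unique

def prove(blocks, indices):
    """Stand-in proof: a hash over the challenged blocks in index order."""
    h = hashlib.sha256()
    for i in sorted(indices):
        h.update(blocks[i])
    return h.hexdigest()

def answer_all(blocks, challenges):
    """Prove duplicate indices once, then combine the cached duplicate proofs
    with each verifier's unique-data proof into one complete proof."""
    duplicate, unique = split_challenges(challenges)
    dup_cache = {i: prove(blocks, {i}) for i in duplicate}   # computed once
    proofs = {}
    for v, ch in challenges.items():
        dup_part = [dup_cache[i] for i in sorted(ch & duplicate)]
        proofs[v] = (dup_part, prove(blocks, unique[v]))
    return proofs
```

In the algorithm summarized above, the shared part would be batch proofs over frequent itemsets rather than per-index hashes, but the control flow is the same: duplicated verification work is performed once and reused across challenges.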

An Improved Privacy Preserving Construction for Data Integrity Verification in Cloud Storage

  • Xia, Yingjie;Xia, Fubiao;Liu, Xuejiao;Sun, Xin;Liu, Yuncai;Ge, Yi
    • KSII Transactions on Internet and Information Systems (TIIS), v.8 no.10, pp.3607-3623, 2014
  • The increasing demand for cloud computing in business and other areas requires stronger security for cloud storage systems. Traditional cloud storage systems fail to protect data integrity information (DII) when the interactive messages between the client and the data storage server are sniffed. To protect DII and support public verifiability, we propose a data integrity verification scheme that deploys a designated confirmer signature (DCS) as a building block. The DCS scheme strikes a balance between publicly verifiable signatures and zero-knowledge proofs, and can resolve disputes between the cloud storage server and any user, whichever acts as a malicious player during the two-round verification. In addition, our verification scheme remains blockless and stateless, which is important for building a secure and efficient cryptosystem. We perform security analysis and performance evaluation of our scheme; compared with existing schemes, the results show that our scheme is more secure and efficient.

How Social Intelligence, Integrity, and Self-efficacy Affect Job Satisfaction: Empirical Evidence from Indonesia

  • ALIFUDDIN, Moh.;WIDODO, Widodo
    • The Journal of Asian Finance, Economics and Business, v.8 no.7, pp.625-633, 2021
  • The study aims to explore the empirical effect of social intelligence, integrity, self-efficacy, and affective commitment on job satisfaction, and to test a theoretical model in which affective commitment mediates between social intelligence, integrity, self-efficacy, and job satisfaction. This research uses a quantitative survey approach with a Likert-scale questionnaire. The questionnaires for all research variables are reliable, with alpha coefficients > 0.7. The research participants comprise 386 teachers in Indonesia selected by accidental sampling. Data analysis uses path analysis supported by descriptive statistics and correlational matrices. The results indicate that social intelligence, integrity, self-efficacy, and affective commitment have a significant effect on job satisfaction. In addition, affective commitment indirectly mediates the effect of social intelligence, integrity, and self-efficacy on job satisfaction. Thus, a new model of the effect of social intelligence, integrity, and self-efficacy on job satisfaction, mediated by affective commitment, was confirmed. The research suggests that teachers' job satisfaction can be improved through social intelligence, integrity, self-efficacy, and affective commitment. Therefore, researchers and practitioners can adopt this new empirical model to enhance job satisfaction through social intelligence, integrity, self-efficacy, and affective commitment in the future.

Design and Implementation of e2eECC for Automotive On-Chip Bus Data Integrity (차량용 온칩 버스의 데이터 무결성을 위한 종단간 에러 정정 코드(e2eECC)의 설계 및 구현)

  • Eunbae Gil;Chan Park;Juho Kim;Joonho Chung;Joosock Lee;Seongsoo Lee
    • Journal of IKEEE, v.28 no.1, pp.116-122, 2024
  • The AMBA AHB-Lite bus is widely used as an on-chip bus protocol for low-power, cost-effective SoCs. However, it lacks built-in error detection and correction for end-to-end data integrity. This can lead to data corruption and system instability, particularly in harsh environments such as automotive applications. To mitigate this problem, this paper proposes applying SEC-DED (Single Error Correction-Double Error Detection) coding to the AMBA AHB-Lite bus. It aims not only to detect errors in real time but also to correct them, thereby enhancing end-to-end data integrity. Simulation results demonstrate real-time error detection and correction when errors occur, which strengthens the end-to-end data integrity of the automotive on-chip bus.
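
For readers unfamiliar with SEC-DED, the hedged Python sketch below demonstrates the principle on a single data byte using an extended Hamming code (Hamming parity bits at positions 1, 2, 4, 8 plus an overall parity bit). It illustrates only the coding scheme itself, not the authors' RTL design or the exact code width applied to the AHB-Lite signals.

```python
def secded_encode(data: int) -> int:
    """Encode 8 data bits into a 13-bit extended Hamming codeword:
    Hamming parity bits at positions 1, 2, 4, 8 and an overall parity
    bit at position 0 for double-error detection."""
    assert 0 <= data < 256
    code = [0] * 13
    data_positions = [3, 5, 6, 7, 9, 10, 11, 12]
    for k, pos in enumerate(data_positions):
        code[pos] = (data >> k) & 1
    for p in (1, 2, 4, 8):                       # even parity over covered bits
        parity = 0
        for pos in range(1, 13):
            if pos != p and (pos & p):
                parity ^= code[pos]
        code[p] = parity
    code[0] = 0
    for pos in range(1, 13):                     # overall parity bit
        code[0] ^= code[pos]
    return sum(bit << i for i, bit in enumerate(code))

def secded_decode(word: int):
    """Return (data, status): 'ok', 'corrected', or (None, 'double_error')."""
    code = [(word >> i) & 1 for i in range(13)]
    syndrome = 0
    for p in (1, 2, 4, 8):
        parity = 0
        for pos in range(1, 13):
            if pos & p:
                parity ^= code[pos]
        if parity:
            syndrome |= p
    overall = 0
    for bit in code:
        overall ^= bit
    if syndrome == 0 and overall == 0:
        status = 'ok'
    elif overall == 1:                            # odd parity: single error
        if syndrome == 0:
            code[0] ^= 1                          # the overall parity bit flipped
        elif syndrome <= 12:
            code[syndrome] ^= 1                   # flip the erroneous bit
        else:
            return None, 'double_error'           # invalid syndrome
        status = 'corrected'
    else:                                         # even parity, nonzero syndrome
        return None, 'double_error'
    data_positions = [3, 5, 6, 7, 9, 10, 11, 12]
    data = sum(code[pos] << k for k, pos in enumerate(data_positions))
    return data, status
```

For example, `secded_decode(secded_encode(0xA5) ^ (1 << 7))` returns `(0xA5, 'corrected')`, while flipping two codeword bits yields `(None, 'double_error')`.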

Implementation of Rule Management System for Validating Spatial Object Integrity (공간 객체 무결성 검증을 위한 규칙 관리 시스템의 구현)

  • Go, Goeng-Uk;Yu, Sang-Bong;Kim, Gi-Chang;Cha, Sang-Gyun
    • Journal of KIISE: Software and Applications, v.26 no.12, pp.1393-1403, 1999
  • It is necessary that the integrity of spatial data shared through a spatial database system be validated and appropriately maintained; otherwise, the behavior of the whole application system becomes unpredictable. In particular, the integrity of spatial data stored in a public GIS must be validated, because those data are used by various applications that support important regional and national policy decisions such as land-use evaluation, city planning, resource management, facility management, risk management and safety supervision, and national defense. In this paper, we present a rule management system that supports validating the integrity of spatial objects using the active rule technique of active DBMSs. Validating data integrity with active rules frees the database application programmer from the burden of integrity validation. The system is an independent, external system that is not tied to a specific DBMS and consists of three parts: the active rule manager, the rule base, and the triggered rule generator. When a user manipulates spatial objects through a spatial database application program, the system efficiently manages the integrity rules to be inserted into the application program so that the integrity constraints of all spatial objects manipulated by database transactions are validated.
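
The fragment below is only a Python stand-in for the active-rule idea described above (the actual system generates rules for an active DBMS rather than running Python): integrity rules are registered per spatial object type and evaluated for every object a transaction manipulates. The object layout, rule names, and the `check_transaction` helper are illustrative assumptions.

```python
RULES = {}   # spatial object type -> list of (rule name, predicate)

def rule(obj_type, name):
    """Register an integrity rule for a spatial object type."""
    def register(predicate):
        RULES.setdefault(obj_type, []).append((name, predicate))
        return predicate
    return register

@rule("polygon", "closed_ring")
def closed_ring(obj):
    # A polygon boundary must start and end at the same point.
    return obj["coords"][0] == obj["coords"][-1]

@rule("polygon", "min_vertices")
def min_vertices(obj):
    # A closed ring needs at least four coordinate pairs (first == last).
    return len(obj["coords"]) >= 4

def check_transaction(objects):
    """Evaluate every registered rule for every object touched by a transaction."""
    violations = []
    for obj in objects:
        for name, predicate in RULES.get(obj["type"], []):
            if not predicate(obj):
                violations.append((obj["id"], name))
    return violations
```

A transaction would be allowed to commit only when the returned violation list is empty.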

A Rule-Based Database Verification System Based on the Integrity Constraints (무결성 제약에 기초한 규칙 기반 데이타베이스 검증 시스템)

  • Ryu, Myeong-Chun;Park, Chang-Hyeon
    • The Transactions of the Korea Information Processing Society, v.3 no.1, pp.77-86, 1996
  • In managing a database, the integrity of data is very important. The integrity constraints should therefore be considered carefully when a database is designed, and after the database is created, the database manager must continuously check whether any data in the database violate the integrity constraints. It is, however, not easy to check for violations of integrity constraints as the size and complexity of the database increase. This paper suggests a rule-based database verification system to ease the difficulty of checking for integrity violations, in which a database is coupled with a rule-based system containing knowledge about the integrity constraints. The suggested rule-based database verification system accepts the model description of an application domain, generates a knowledge base consisting of rules and facts by analyzing the model description, and performs the verification process to check the integrity of the database.
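
As a minimal sketch of the rule-plus-fact idea (not the system described above, whose knowledge base is generated from a model description), the snippet below treats constraint rules as plain functions over the database state and lets verification collect every violating row. The tiny schema and the rule constructors are illustrative assumptions.

```python
def not_null(table, column):
    """Rule: every row of `table` must have a value in `column`."""
    def check(db):
        return [(table, row) for row in db[table] if row.get(column) is None]
    return check

def foreign_key(table, column, ref_table, ref_column):
    """Rule: every `table.column` value must exist in `ref_table.ref_column`."""
    def check(db):
        keys = {row[ref_column] for row in db[ref_table]}
        return [(table, row) for row in db[table] if row[column] not in keys]
    return check

def verify(db, rules):
    """Run every constraint rule; an empty result means no violations."""
    return [violation for rule in rules for violation in rule(db)]

# Illustrative knowledge base for a tiny employee/department schema.
rules = [
    not_null("employee", "name"),
    foreign_key("employee", "dept_id", "department", "id"),
]
db = {
    "department": [{"id": 10}, {"id": 20}],
    "employee": [{"name": "Kim", "dept_id": 10}, {"name": None, "dept_id": 30}],
}
print(verify(db, rules))   # reports the null name and the dangling dept_id 30
```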

Factors Influencing Death Anxiety in the Aged (노인의 죽음불안에 영향을 미치는 요인)

  • Lee, Jung-In;Kim, Soon-Yi
    • Journal of Korean Public Health Nursing, v.25 no.1, pp.28-37, 2011
  • Purpose: The study examined factors influencing death anxiety in the aged. Method: This was a descriptive survey study. Data were collected from March to June 2010 from 357 older home-dwelling adults. The questionnaires solicited information on death anxiety, family function, morale, health behavior, and ego-integrity. Data were analyzed using descriptive statistics, Pearson's correlation, and stepwise multiple regression. Results: Average scores were 2.50 for death anxiety, 3.80 for family function, 9.0 for morale, 3.12 for health behavior, and 2.84 for ego-integrity. There were statistically significant negative correlations between family function and death anxiety, morale and death anxiety, health behavior and death anxiety, and ego-integrity and death anxiety. Morale, ego-integrity, and economic status were significant predictors of death anxiety. Conclusion: Multilateral efforts are needed to assist the aged toward successful aging through continuous physical activity and active participation in society.

3-L Model: A Model for Checking the Integrity Constraints of Mobile Databases

  • Ibrahim, Hamidah;Dzolkhifli, Zarina;Affendey, Lilly Suriani;Madiraju, Praveen
    • Journal of Computing Science and Engineering, v.3 no.4, pp.260-277, 2009
  • In this paper, we propose a model for checking the integrity constraints of mobile databases called the Three-Level (3-L) model, in which the process of constraint checking to maintain a consistent mobile database state is realized at three different levels. Sufficient and complete tests proposed in previous works, together with the idea of caching relevant data items for checking the integrity constraints, are adopted. This improves the checking mechanism by preventing delays during the process of checking constraints and performing the update. The 3-L model also reduces the amount of data accessed, given that much of the work is performed at the mobile host, and hence speeds up the checking process.
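
A hedged sketch of the sufficient-test-with-caching idea, assuming a simple referential-integrity constraint (this is not the 3-L model's exact procedure): an insert is first checked against keys cached at the mobile host, and only when the cache cannot decide is the complete test delegated to the server.

```python
def check_insert_fk(fk_value, cached_keys, query_server):
    """Referential-integrity check for an insert at the mobile host."""
    if fk_value in cached_keys:
        return True                      # sufficient test decided locally
    # The cache holds only a fragment, so absence is inconclusive:
    return query_server(fk_value)        # complete test at the server

# Illustrative usage with a stubbed server lookup.
cached_dept_ids = {10, 20, 30}
server_lookup = lambda dept_id: dept_id in {10, 20, 30, 40, 50}
print(check_insert_fk(20, cached_dept_ids, server_lookup))   # True, no round trip
print(check_insert_fk(40, cached_dept_ids, server_lookup))   # True, via the server
```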

An Enhanced Remote Data Checking Scheme for Dynamic Updates

  • Dong, Lin;Park, Jinwoo;Hur, Junbeom;Park, Ho-Hyun
    • KSII Transactions on Internet and Information Systems (TIIS), v.8 no.5, pp.1744-1765, 2014
  • A client stores data in the cloud and uses remote data checking (RDC) schemes to check the integrity of the data, enabling the client to detect data corruption. Recently, robust RDC schemes have integrated forward error-correcting codes (FECs) to ensure the integrity of data while enabling dynamic update operations. Thus, minor data corruption can be recovered by FECs, whereas major data corruption can be detected by spot-checking techniques. However, this incurs high communication overhead for dynamic updates, because a small update may require the client to download an entire file. The Variable Length Constraint Group (VLCG) scheme overcomes this disadvantage by downloading the RS-encoded parity data for an update instead of the entire file. Despite this, it still needs to download all the parity data for any minor update. In this paper, we propose an improved RDC scheme in which the communication overhead is reduced by downloading only a part of the parity data for an update while still ensuring the integrity of the data. Efficiency and security analysis show that the proposed scheme enhances efficiency without any security degradation.
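
To illustrate why grouping parity helps (this is neither the VLCG scheme nor the scheme proposed here, and XOR parity stands in for the Reed-Solomon code), the sketch below groups blocks into fixed-size constraint groups with one parity block each, so patching a single data block requires downloading only that group's parity rather than all parity data. The group size, names, and the `download_parity` callback are assumptions.

```python
GROUP_SIZE = 4   # blocks per constraint group (an illustrative choice)

def group_of(block_index):
    return block_index // GROUP_SIZE

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def build_parity(blocks):
    """One parity block per constraint group (all blocks equal length here)."""
    parity = {}
    for i, blk in enumerate(blocks):
        g = group_of(i)
        parity[g] = xor_bytes(parity[g], blk) if g in parity else blk
    return parity

def update_block(blocks, parity, i, new_block, download_parity):
    """Patch block i: fetch only its group's parity, not all parity data."""
    g = group_of(i)
    old_parity = download_parity(g)                  # partial download
    parity[g] = xor_bytes(xor_bytes(old_parity, blocks[i]), new_block)
    blocks[i] = new_block
```

With per-group parity, the parity traffic for a single-block update is independent of the total file size.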