• Title/Summary/Keyword: Data Integrity Verification

Design of CCTV Enclosure Record Management System based on Blockchain

  • Yu, Kwan Woo; Lee, Byung Mun; Kang, Un Gu
    • Journal of the Korea Society of Computer and Information / v.27 no.12 / pp.141-149 / 2022
  • In this paper, we propose a design for a CCTV enclosure record management system based on blockchain. Since CCTV video records are transferred to the control center through the enclosure, it is very important to manage the enclosure to prevent tampering with and damage to the video records. Recently, smart enclosure monitoring systems with real-time remote monitoring and opening/closing state management functions have been used to manage CCTV enclosures, but they have limitations in securing the safety of CCTV video records. The proposed system detects tampered records and recovers them through hash-value comparison against records distributed and stored in the blockchain. In addition, an integrity verification API is provided to ensure the integrity of enclosure records received by the management server. To verify the effectiveness of the system, the integrity verification accuracy and elapsed time were measured through experiments. As a result, the integrity of the enclosure records was confirmed (accuracy: 100%), and the elapsed time for verification (average: 73 ms) was confirmed not to affect monitoring.
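
As a rough illustration of the hash-comparison step described in the abstract (not the authors' implementation), the sketch below assumes the blockchain already holds a trusted SHA-256 hash per record ID and that a distributed replica store is available; `trusted_hashes`, `replica_store`, and the record ID format are hypothetical stand-ins.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """SHA-256 digest of a record as a hex string."""
    return hashlib.sha256(data).hexdigest()

def verify_and_recover(record_id: str,
                       local_record: bytes,
                       trusted_hashes: dict,   # stand-in for hashes stored on the blockchain
                       replica_store: dict) -> bytes:
    """Compare the local record's hash with the blockchain copy; if they differ,
    recover the record from a distributed replica (raises if no intact copy exists)."""
    expected = trusted_hashes[record_id]
    if sha256_hex(local_record) == expected:
        return local_record                    # record is intact
    replica = replica_store[record_id]         # stand-in for distributed storage lookup
    if sha256_hex(replica) != expected:
        raise ValueError(f"record {record_id}: no intact copy available")
    return replica                             # tampered record replaced by the verified replica

# toy usage: plain dictionaries stand in for the blockchain and the replica storage
trusted = {"cam-07/rec-001": sha256_hex(b"open,2022-11-01T09:00:00")}
replicas = {"cam-07/rec-001": b"open,2022-11-01T09:00:00"}
print(verify_and_recover("cam-07/rec-001", b"tampered bytes", trusted, replicas))
```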

A Security-Enhanced Identity-Based Batch Provable Data Possession Scheme for Big Data Storage

  • Zhao, Jining; Xu, Chunxiang; Chen, Kefei
    • KSII Transactions on Internet and Information Systems (TIIS) / v.12 no.9 / pp.4576-4598 / 2018
  • In the big data age, flexible and affordable cloud storage services greatly enhance productivity for enterprises and individuals, but at the same time leave their outsourced data susceptible to integrity breaches. Provable Data Possession (PDP), as a critical technology, enables data owners to efficiently verify cloud data integrity without downloading an entire copy. To address the challenging integrity problem on multiple clouds for multiple owners, an identity-based batch PDP scheme was presented at ProvSec 2016, which attempted to eliminate the public key certificate management issue and reduce computation overheads in a secure and batched manner. In this paper, we first demonstrate that this scheme is insecure: any cloud that has deleted or modified the outsourced data can still pass integrity verification, simply by utilizing two arbitrary block-tag pairs of one data owner. Specifically, malicious clouds are able to fabricate integrity proofs by 1) universally forging valid tags and 2) recovering data owners' private keys. Secondly, to enhance the security, we propose an improved scheme that withstands these attacks and prove its security under the CDH assumption in the random oracle model. Finally, based on simulations and overhead analysis, our batch scheme demonstrates better efficiency compared to an identity-based multi-cloud PDP with single-owner effort.
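
The scheme attacked and repaired above is pairing-based and identity-based, which is too involved to reproduce here. The sketch below is only a simplified keyed-tag spot check in the general spirit of PDP (challenge random blocks, verify per-block tags) and illustrates why a tag must bind the block index; all names and the tagging construction are illustrative, not the paper's scheme.

```python
import hmac, hashlib, secrets, random

def tag_block(key: bytes, index: int, block: bytes) -> bytes:
    # binding the block index into the tag prevents block-swapping forgeries
    return hmac.new(key, index.to_bytes(8, "big") + block, hashlib.sha256).digest()

def make_tags(key: bytes, blocks: list) -> list:
    """Data owner tags every block before outsourcing blocks and tags to the cloud."""
    return [tag_block(key, i, b) for i, b in enumerate(blocks)]

def prove(blocks: list, tags: list, challenge: list) -> list:
    """Cloud side: return the challenged blocks together with their stored tags."""
    return [(i, blocks[i], tags[i]) for i in challenge]

def verify(key: bytes, proof: list) -> bool:
    """Owner/auditor side: recompute each tag and compare in constant time."""
    return all(hmac.compare_digest(t, tag_block(key, i, b)) for i, b, t in proof)

key = secrets.token_bytes(32)
blocks = [f"block-{i}".encode() for i in range(100)]
tags = make_tags(key, blocks)
challenge = random.sample(range(len(blocks)), 10)   # random spot check
print(verify(key, prove(blocks, tags, challenge)))  # True while the data is intact
```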

Verification of safety integrity for vital data processing device through quantitative safety analysis (정량적 안전성 분석을 통한 Vital 데이터 처리장치의 안전무결성 요구사항 검증)

  • Choi, Jin-Woo; Park, Jae-Young
    • Journal of the Korea Academia-Industrial cooperation Society / v.16 no.7 / pp.4863-4870 / 2015
  • Currently, to secure the safety of railway signalling systems, verification that the safety integrity requirements (SIR) are satisfied has become an essential element. SIR verification is performed on the basis of system safety analysis, but it is difficult to secure basic data for such analysis because there is as yet little domestic experience, so qualitative analysis has had to be relied on. Qualitative methods such as risk analysis matrices and risk graphs can cover a wide range of accidents, but have the disadvantage that the reliability of their results is significantly lower. Therefore, quantitative safety analysis of the system or product should be performed in parallel to compensate for the weaknesses of qualitative analysis. This paper presents a quantitative safety analysis method to overcome these weaknesses and, based on its results, proposes a highly reliable SIR verification measure. As a result of the verification, the dangerous failure incidence for the vital data processing device was calculated to be 1.172279 × 10^-9, which satisfies, with margin, the required safety integrity target.
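
For orientation, a computed dangerous-failure rate can be checked against the per-hour safety-integrity-level bands used for continuous/high-demand operation (as in IEC 61508). The sketch below is a generic illustration, assuming the quoted figure is a per-hour rate; it is not the paper's fault-tree analysis.

```python
# Dangerous failures per hour for continuous/high-demand mode: [lower, upper) per SIL.
SIL_BANDS = {
    4: (1e-9, 1e-8),
    3: (1e-8, 1e-7),
    2: (1e-7, 1e-6),
    1: (1e-6, 1e-5),
}

def achieved_sil(rate_per_hour: float) -> int:
    """Return the highest SIL whose dangerous-failure-rate ceiling the rate stays under."""
    for level in (4, 3, 2, 1):
        _, upper = SIL_BANDS[level]
        if rate_per_hour < upper:
            return level
    return 0   # does not even meet SIL 1

rate = 1.172279e-9            # the paper's computed dangerous-failure incidence
print(achieved_sil(rate))     # 4 -> consistent with a SIL 4 target
```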

Study on Development of HDD Integrity Verification System using FirmOS (FirmOS를 이용한 HDD 무결성 검사 시스템 개발에 관한 연구)

  • Yeom, Jae-Hwan; Oh, Se-Jin; Roh, Duk-Gyoo; Jung, Dong-Kyu; Hwang, Ju-Yeon; Oh, Chungsik; Kim, Hyo-Ryoung; Shin, Jae-Sik
    • Journal of the Institute of Convergence Signal Processing / v.18 no.2 / pp.55-61 / 2017
  • In radio astronomy, high-capacity HDDs are used in large numbers to record the huge volumes of observational data. As VLBI observations expand to wider bandwidths, recording speeds increase and ever larger amounts of observational data must be stored. Because the HDDs are used heavily, failures occur frequently and recovering from them takes a lot of time; moreover, if a failed HDD continues to be used, observational data are lost, and replacing it with a new HDD is costly. In this study, we developed an integrity verification system for Serial ATA HDDs using FirmOS. FirmOS is an operating system developed to run exclusively for specific purposes on a system with a general server board and CPU. The developed system, based on FirmOS, writes specific data patterns to a physical area of the SATA HDD and reads them back, and we introduce a method to verify HDD integrity by comparing the data returned by the HDD controller with the stored pattern data. Using the developed system, it was easy to determine whether a disk pack used in VLBI observations had errors, which is very useful for improving observation efficiency. This paper describes in detail the design, configuration, and testing of the developed SATA HDD integrity verification system.
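
A minimal Python analogue of the write/read pattern check described above (the actual system runs on FirmOS against raw SATA sectors): write a fixed pattern block by block, read it back, and report any mismatching blocks. The file path and block count are illustrative; pointing it at a raw device such as /dev/sdX would require appropriate privileges.

```python
import os

PATTERN = bytes([0x55, 0xAA] * 2048)      # 4 KiB test pattern of alternating bit patterns

def write_verify(path: str, blocks: int = 1024) -> list:
    """Write the pattern block-by-block, read it back, and return indices of bad blocks.
    `path` may be a scratch file for testing or a raw block device (needs privileges)."""
    bad = []
    with open(path, "wb") as f:
        for _ in range(blocks):
            f.write(PATTERN)
        f.flush()
        os.fsync(f.fileno())              # force the data out of the page cache
    with open(path, "rb") as f:
        for i in range(blocks):
            if f.read(len(PATTERN)) != PATTERN:
                bad.append(i)
    return bad

print(write_verify("scratch.bin"))        # [] means every block read back intact
```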

Accurate Signal Integrity Verification of Transmission Lines Based on High-Frequency Measurement (고주파 전송선 회로의 실험적 고찰을 통한 정확한 시그널 인테그러티 검증)

  • Shin, Seung-Hoon; Eo, Yung-Seon
    • Journal of the Institute of Electronics Engineers of Korea SD / v.48 no.7 / pp.82-90 / 2011
  • An accurate signal integrity verification method based on high-frequency measurements is proposed. For practical transmission lines that require a package process, process variations, metal roughness, skin effects, and boundary conditions may have deteriorating effects on circuit performance. These effects are represented in terms of parameters that can be readily utilized in a field solver, so that more accurate signal integrity verification using the field solver can be achieved. It is shown that, for both single and coupled lines, the signal transients obtained with the proposed method are in excellent agreement with the measurement data.
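
As background on the skin effect mentioned above (not the paper's extraction flow), here is a quick sketch of the standard skin-depth formula, delta = sqrt(rho / (pi * f * mu)), together with a crude effective-resistance estimate; the material constants and line dimensions are assumed values for illustration.

```python
import math

MU0 = 4 * math.pi * 1e-7          # vacuum permeability, H/m
RHO_CU = 1.68e-8                  # copper resistivity, ohm*m

def skin_depth(freq_hz: float, rho: float = RHO_CU, mu_r: float = 1.0) -> float:
    """Skin depth delta = sqrt(rho / (pi * f * mu)) in metres."""
    return math.sqrt(rho / (math.pi * freq_hz * mu_r * MU0))

def ac_resistance(length, width, thickness, freq_hz, rho=RHO_CU):
    """Crude line resistance: current confined to one skin depth once the
    skin depth becomes thinner than the conductor thickness."""
    eff_t = min(thickness, skin_depth(freq_hz, rho))
    return rho * length / (width * eff_t)

# a 1 cm line, 10 um wide, 2 um thick, evaluated at 1 GHz
print(skin_depth(1e9))                        # ~2.1e-6 m for copper
print(ac_resistance(0.01, 10e-6, 2e-6, 1e9))  # ohms
```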

A Privacy-preserving Data Aggregation Scheme with Efficient Batch Verification in Smart Grid

  • Zhang, Yueyu; Chen, Jie; Zhou, Hua; Dang, Lanjun
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.2 / pp.617-636 / 2021
  • This paper presents a privacy-preserving data aggregation scheme that deals with multidimensional data, which is rarely addressed in existing research on smart grids. We use the Paillier cryptosystem and a blinding-factor technique to encrypt the multidimensional data as a whole, and take advantage of the homomorphic property of the Paillier cryptosystem to achieve data aggregation. Signatures and efficient batch verification are also applied in our scheme for data integrity and quick verification; the batch verification requires only 2 pairing operations. Our scheme also supports fault tolerance, meaning that even if some smart meters do not work, the scheme still works well. In addition, we give two extensions of our scheme: one computes a fixed user's time-of-use electricity bill, and the other deals effectively and quickly with dynamic user situations. In the security analysis, we prove unforgeability and the security of batch verification in detail, and briefly introduce other security features. Performance analysis shows that our scheme has lower computational complexity and communication overhead than existing schemes.
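
To illustrate the Paillier homomorphism that such aggregation relies on, here is a toy implementation with small, insecure parameters. The blinding factors, signatures, and pairing-based batch verification of the actual scheme are omitted, and the primes and readings are demo values only.

```python
import math, random

# Toy Paillier cryptosystem (insecure demo primes) illustrating the additive
# homomorphism used for aggregation: E(m1) * E(m2) decrypts to m1 + m2.
p, q = 1000000007, 1000000009
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                      # valid because g = n + 1

def encrypt(m: int) -> int:
    r = random.randrange(2, n)            # fresh randomness per ciphertext
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

# each smart meter encrypts its reading; the aggregator multiplies ciphertexts
readings = [120, 87, 310, 45]
aggregate_ct = 1
for m in readings:
    aggregate_ct = (aggregate_ct * encrypt(m)) % n2

print(decrypt(aggregate_ct), sum(readings))   # both print 562
```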

A Query Result Integrity Assurance Scheme Using an Order-preserving Encryption Scheme in the Database Outsourcing Environment (데이터베이스 아웃소싱 환경에서 순서 보존 암호화 기법을 이용한 질의 결과 무결성 검증 기법)

  • Jang, Miyoung; Chang, Jae Woo
    • Journal of KIISE / v.42 no.1 / pp.97-106 / 2015
  • Recently, research on database encryption for data protection and on query result authentication methods has become more active in the database outsourcing environment. Existing database encryption schemes are vulnerable to order-matching and counting attacks by intruders who have background knowledge of the original database domain, and existing query result integrity auditing methods suffer from the transmission overhead of verification objects. To resolve these problems, we propose a group-order-preserving encryption index and a query result authentication method based on it. Our group-order-preserving encryption index groups the original data for encryption and supports query processing without data decryption. We generate group IDs by using the Hilbert curve so that the group information is protected while a query is processed. Finally, our periodic-function-based data grouping and query result authentication scheme reduces the size of the query result verification data. Through performance evaluation, we show that our method outperforms an existing bucket-based verification scheme: it is 1.6 times faster in terms of query processing time and produces verification data that is 20 times smaller.
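
A much-simplified sketch of the group-order-preserving idea: values are assigned to ordered groups, stored encrypted, and a range query is translated into the group IDs it overlaps, so the server never decrypts. The Hilbert-curve group-ID generation and the periodic-function-based verification objects of the paper are not reproduced, and the "encryption" below is a keyed hash used purely as a placeholder.

```python
import bisect, hashlib

BOUNDARIES = [100, 200, 300, 400]          # ascending group boundaries chosen by the data owner

def group_id(value: int) -> int:
    """Map a plaintext value to its ordered group (0 .. len(BOUNDARIES))."""
    return bisect.bisect_right(BOUNDARIES, value)

def pseudo_encrypt(value: int, key: bytes = b"demo-key") -> str:
    # stand-in for a real cipher: a deterministic keyed hash, NOT secure encryption
    return hashlib.sha256(key + str(value).encode()).hexdigest()[:16]

# outsourcing: the server stores only (group_id, ciphertext) pairs
data = [42, 150, 151, 260, 399]
server_rows = [(group_id(v), pseudo_encrypt(v)) for v in data]

# a range query [120, 280] is rewritten as a group-ID range; the client then
# decrypts the candidate rows and filters exact matches locally
lo_g, hi_g = group_id(120), group_id(280)
candidates = [row for row in server_rows if lo_g <= row[0] <= hi_g]
print(candidates)
```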

Dilution of Precision (DOP) Based Landmark Exclusion Method for Evaluating Integrity Risk of LiDAR-based Navigation Systems

  • Choi, Pil Hun; Lee, Jinsil; Lee, Jiyun
    • Journal of Positioning, Navigation, and Timing / v.9 no.3 / pp.285-292 / 2020
  • This paper introduces a new, computationally efficient Dilution of Precision (DOP)-based landmark exclusion method that preserves the safety of a LiDAR-based navigation system using an innovation-based Nearest-Neighbor (NN) Data Association (DA) process. The NN DA process finds the correct landmark association hypothesis among all potential landmark permutations using Kalman filter innovation vectors, which makes the computational load increase exponentially with the number of landmarks. In this paper, we therefore exclude landmarks using DOP, which quantifies the geometric distribution of landmarks, so as to minimize the loss of integrity performance that can occur when landmarks are removed. The number of landmarks to be excluded is set as the maximum number that still satisfies the integrity risk requirement. To verify the method, we developed a simulator that analyzes integrity risk according to the number of landmarks and their geometric distribution. Based on the simulation, we analyzed the relationship between DOP and the integrity risk of the DA process when excluding each landmark. The results showed that the loss of integrity performance is minimized when landmarks with poor DOP are excluded. By reducing a substantial amount of computational load, the developed method opens the possibility of assuring the safety of LiDAR-based navigation systems in real-time applications.
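
A small numpy sketch of the DOP idea for a 2-D pose (illustrative only; it does not reproduce the paper's integrity-risk evaluation): build the geometry matrix from unit line-of-sight vectors to each landmark, compute DOP = sqrt(trace((H^T H)^-1)), and greedily drop the landmark whose removal degrades DOP the least. The landmark coordinates are made-up values.

```python
import numpy as np

def dop(landmarks: np.ndarray, pose: np.ndarray) -> float:
    """Horizontal DOP for a 2-D pose from unit line-of-sight vectors to each landmark."""
    diff = landmarks - pose
    H = diff / np.linalg.norm(diff, axis=1, keepdims=True)   # one unit row per landmark
    return float(np.sqrt(np.trace(np.linalg.inv(H.T @ H))))

def exclude_one(landmarks: np.ndarray, pose: np.ndarray) -> int:
    """Index of the landmark whose removal increases DOP the least (greedy exclusion)."""
    scores = [dop(np.delete(landmarks, i, axis=0), pose) for i in range(len(landmarks))]
    return int(np.argmin(scores))

pose = np.array([0.0, 0.0])
landmarks = np.array([[10, 0], [0, 10], [-10, 0], [0, -10], [10.5, 0.5]])  # last one nearly redundant
print(dop(landmarks, pose), exclude_one(landmarks, pose))   # excludes index 4
```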

Experimental Characterization-Based Signal Integrity Verification of Sub-Micron VLSI Interconnects

  • Eo, Yung-Seon; Park, Young-Jun; Kim, Yong-Ju; Jeong, Ju-Young; Kwon, Oh-Kyong
    • Journal of Electrical Engineering and information Science / v.2 no.5 / pp.17-26 / 1997
  • Interconnect characterization on the wafer level was performed. Test patterns for single, two-coupled, and triple-coupled lines were designed using a 0.5 μm CMOS process. Interconnect capacitances and resistances were then experimentally extracted using two-port network measurements; in particular, to eliminate parasitic effects, Y-parameter de-embedding was performed with specially designed de-embedding patterns. For comparison, capacitance matrices were also calculated using an existing CAD model and the field-solver-based commercial simulators METAL and MEDICI. This work experimentally verifies that existing CAD models or parameter extraction may deviate significantly from real values. Signal transient simulations were performed with the experimental data and with the other methodologies, namely field-solver-based simulation and the existing model. As expected, the interconnect parameters significantly affect signal delay and crosstalk: the signal delay due to interconnects dominates the sub-micron gate delay (e.g., of an inverter). In particular, the deviation in coupling capacitance is so large (more than about 45% in the worst case) that signal integrity cannot be guaranteed with the existing methodologies. The characterization methodologies of this paper can be usefully employed in industry for signal integrity verification or for establishing electrical design rules for IC interconnects.
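
A back-of-envelope lumped-RC estimate of why coupling-capacitance error matters for delay (not the paper's measurement-based flow): the coupling capacitance is scaled by a Miller switching factor and folded into an Elmore-style delay of 0.69·R·C_eff. The resistance and capacitance values are assumed for illustration.

```python
# Assumed lumped values for a driver plus a short line segment.
R = 500.0            # ohm, driver plus line resistance
C_GND = 50e-15       # F, line-to-ground capacitance
C_CPL = 40e-15       # F, coupling capacitance to the neighbouring line

def elmore_delay(c_coupling, switching_factor=2.0):
    """0.69*R*C_eff, with the coupling cap scaled by the Miller switching factor
    (0 = neighbour switches the same way, 1 = neighbour quiet, 2 = opposite switching)."""
    c_eff = C_GND + switching_factor * c_coupling
    return 0.69 * R * c_eff

nominal = elmore_delay(C_CPL)
with_45pct_error = elmore_delay(C_CPL * 1.45)
print(nominal, with_45pct_error, with_45pct_error / nominal - 1)   # ~28% delay error here
```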

Identity-based Provable Data Possession for Multicloud Storage with Parallel Key-Insulation

  • Nithya, S. Mary V.; Rhymend Uthariaraj, V.
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.9 / pp.3322-3347 / 2021
  • Cloud storage is a primary component of many businesses on the cloud. The majority of enterprises today adopt a multicloud strategy to avoid vendor lock-in and to optimize cost. Auditing schemes are used to ascertain the integrity of cloud data, and of these schemes only Provable Data Possession (PDP) schemes are resilient to key exposure. These PDP schemes are devised using Public Key Infrastructure (PKI)-based cryptography, identity-based cryptography, etc. PKI-based systems suffer from certificate-related communication and computational complexities. Identity-based schemes deal with the exposure of only the auditing secret key (audit key); but if both the audit key and the secret key used to update it are exposed, the auditing process itself becomes a complete failure. Therefore, an identity-based PDP scheme with parallel key-insulation is proposed for multicloud storage. It reduces the risk of exposing both the audit key and the secret key used to update the audit key, preserves data privacy against the Third Party Auditor, is secure against malicious Cloud Service Providers, and facilitates batch auditing. The resilience to key exposure is proved under the CDH assumption. Compared to existing identity-based multicloud schemes, it is efficient in integrity verification.
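
The parallel key-insulation used above is pairing-based and not reproduced here; the sketch below only illustrates the underlying idea with hypothetical names: two helper secrets are used in alternating periods to derive the period audit key, so exposure of one period's key (or even one helper) does not expose the keys of periods served by the other helper.

```python
import hmac, hashlib, secrets

# Two helper secrets, used in alternating periods (the "parallel" part).
helper_secrets = [secrets.token_bytes(32), secrets.token_bytes(32)]

def audit_key(period: int) -> bytes:
    """Derive the audit key for a period from the helper assigned to that period."""
    helper = helper_secrets[period % 2]           # alternate helpers between periods
    return hmac.new(helper, period.to_bytes(4, "big"), hashlib.sha256).digest()

def tag_block(period: int, index: int, block: bytes) -> bytes:
    """Blocks uploaded in a given period are tagged under that period's audit key."""
    return hmac.new(audit_key(period), index.to_bytes(8, "big") + block,
                    hashlib.sha256).digest()

tag = tag_block(period=3, index=42, block=b"outsourced data block")
print(tag.hex()[:32])
```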