• Title/Summary/Keyword: Hash Data

Clustering Algorithm Using Hashing in Classification of Multispectral Satellite Images

  • Park, Sung-Hee; Kim, Hwang-Soo; Kim, Young-Sup
    • Korean Journal of Remote Sensing / v.16 no.2 / pp.145-156 / 2000
  • Clustering is the process of partitioning a data set into meaningful clusters. As the amount of data to process grows, ever faster algorithms are required. In this paper, we propose a clustering algorithm that partitions a multispectral remotely sensed image data set into several clusters using a hash search algorithm. The processing time of our algorithm is compared with that of clustering algorithms using other speed-up concepts, and the experimental results are compared with respect to the number of bands, the number of clusters, and the size of the data. The results show that the processing time of our algorithm is shorter than that of clustering algorithms using other speed-up concepts when the data set is relatively large.
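
The abstract does not spell out the hash search algorithm itself, so the following is only a minimal sketch of the general idea, assuming a coarse quantization step as the hash key; the function names and bucket granularity are illustrative, not the paper's method.

```python
import numpy as np

def hash_cluster(pixels, centers, step=8):
    """Assign each pixel vector to its nearest center, caching results
    in a hash table keyed by the quantized pixel so repeated (or very
    similar) pixels skip the distance computation entirely."""
    cache = {}                      # quantized pixel -> cluster index
    labels = np.empty(len(pixels), dtype=np.int32)
    for i, p in enumerate(pixels):
        key = tuple(p // step)      # coarse quantization as hash key
        if key not in cache:
            dists = np.linalg.norm(centers - p, axis=1)
            cache[key] = int(np.argmin(dists))
        labels[i] = cache[key]
    return labels

# Toy example: 3-band pixels, 2 cluster centers.
pixels = np.array([[10, 20, 30], [11, 21, 29], [200, 180, 160]])
centers = np.array([[12, 22, 28], [198, 182, 158]])
print(hash_cluster(pixels, centers))   # -> [0 0 1]
```

The cache turns repeated pixel values, which are common in multispectral imagery, into O(1) lookups instead of full distance scans.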

A study for system design that guarantees the integrity of computer files based on blockchain and checksum

  • Kim, Minyoung
    • International Journal of Advanced Culture Technology / v.9 no.4 / pp.392-401 / 2021
  • When a data file is shared on the Internet, it can be damaged in various ways. To guard against this, some websites publish the checksum of a downloadable file as plain text. The checksum of the downloaded file is then compared with the published value; if the two match, the file is regarded as identical to the original. However, a checksum published as plain text is easily tampered with by an attacker, and if the correct checksum cannot be verified, the reliability and integrity of the data file cannot be ensured. In this paper, a checksum value is generated to ensure the integrity and reliability of a data file, and this value and the related file information are stored in a blockchain. We then present the design and implementation of a system that shares the checksum values stored in the blockchain and compares them against other people's files.
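
As a concrete illustration of the checksum step the paper builds on, here is a minimal sketch using Python's `hashlib`; fetching the published value from a blockchain is outside the sketch, and the function names are assumptions.

```python
import hashlib

def file_checksum(path, algo="sha256", chunk_size=1 << 16):
    """Stream the file through the hash so large downloads need not
    fit in memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_download(path, published_hex):
    """Compare the local checksum with the published value. The paper's
    point is to fetch published_hex from a blockchain record instead of
    from easily tampered web-page text."""
    return file_checksum(path) == published_hex.strip().lower()
```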

File Deduplication using Logical Partition of Storage System (저장 시스템의 논리 파티션을 이용한 파일 중복 제거)

  • Kong, Jin-San; Yoo, Chuck; Ko, Young-Woong
    • IEMEK Journal of Embedded Systems and Applications / v.7 no.6 / pp.345-351 / 2012
  • In a traditional target-based data deduplication system, every file must be chunked and compared to reduce duplicated data blocks. A critical problem arises as the number of files increases: the system suffers computational delay from calculating hash values and processing per-file metadata. To overcome this problem, we propose a novel data deduplication system that uses the logical partitions of the storage system, applying the deduplication scheme to each logical partition rather than to each file. Experimental results show that, when a logical partition is 50% full of files, the proposed system outperforms the traditional deduplication scheme in both deduplication capacity and processing time.
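
The following is a hedged sketch of the target-based, per-chunk deduplication baseline the abstract describes, assuming fixed-size chunks and SHA-256 digests; the paper's contribution is to run this loop once per logical partition rather than per file, and all names here are illustrative.

```python
import hashlib

def dedup_store(path, store, chunk_size=4096):
    """Split a file into fixed-size chunks and keep each unique chunk
    once, indexed by its SHA-256 digest. Returns the file's 'recipe':
    the ordered digests needed to reconstruct it."""
    recipe = []
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest = hashlib.sha256(chunk).hexdigest()
            store.setdefault(digest, chunk)   # no-op if chunk already stored
            recipe.append(digest)
    return recipe

store = {}   # digest -> chunk bytes (stands in for the chunk store on disk)
# recipe = dedup_store("partition.img", store)   # hypothetical input file
```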

Improved Tree-Based μTESLA Broadcast Authentication Protocol Based on XOR Chain for Data-Loss Tolerance and High Efficiency (데이터 손실에 강하고 효율적 연산을 지원하는 XOR 체인을 이용한 트리기반 μTESLA 프로토콜 개선)

  • Yeo, Don-Gu; Jang, Jae-Hoon; Choi, Hyun-Woo; Youm, Heung-Youl
    • Journal of the Korea Institute of Information Security & Cryptology / v.20 no.2 / pp.43-55 / 2010
  • μTESLA broadcast authentication protocols have been developed by many researchers to provide authenticated broadcast messages between sender and receivers in sensor networks, but they incur authentication delay. Tree-based μTESLA [3] solves the authentication-delay problem, but its Merkle hash tree certificate structure introduces new problems, such as increases in the amount of data transmission and computation with the number of senders or the μTESLA chain parameters. μTPCT-based μTESLA [4] achieves a fixed computation cost by replacing the low-level Merkle hash tree with a hash chain; however, because it authenticates the μTESLA parameters only through the sequential values of the hash chain, it cannot guarantee successful authentication in a lossy sensor network. This paper proposes an improvement of Tree-based μTESLA using an XOR-based chain. The proposed scheme provides both the fixed computation cost of μTPCT-based μTESLA and the message-loss tolerance of Tree-based μTESLA.
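
The XOR chain construction itself is not given in the abstract; the sketch below shows only the standard μTESLA one-way key chain that these variants build on, and why disclosed keys tolerate packet loss: a receiver that missed some intervals can hash a later key several times to reach its last authenticated commitment. All parameters are illustrative.

```python
import hashlib

def H(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def make_chain(seed: bytes, n: int):
    """Generate the chain K_n ... K_0 with K_i = H(K_{i+1}); keys are
    disclosed later in the order K_1, K_2, ... (reverse of generation)."""
    chain = [seed]
    for _ in range(n):
        chain.append(H(chain[-1]))
    return chain[::-1]          # chain[0] = K_0 is the public commitment

def verify_key(disclosed: bytes, commitment: bytes, max_gap: int) -> bool:
    """Accept a disclosed key if hashing it at most max_gap times
    (covering lost intervals) reaches the authenticated commitment."""
    k = disclosed
    for _ in range(max_gap):
        k = H(k)
        if k == commitment:
            return True
    return False

chain = make_chain(b"secret-seed", 10)
print(verify_key(chain[3], chain[0], max_gap=5))   # True even if K_1, K_2 were lost
```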

Design and Implementation of Multiple Filter Distributed Deduplication System Applying Cuckoo Filter Similarity (쿠쿠 필터 유사도를 적용한 다중 필터 분산 중복 제거 시스템 설계 및 구현)

  • Kim, Yeong-A; Kim, Gea-Hee; Kim, Hyun-Ju; Kim, Chang-Geun
    • Journal of Convergence for Information Technology / v.10 no.10 / pp.1-8 / 2020
  • As data generated by business activities has become a key to business success in recent years, techniques for storing, managing, and retrieving such data have become essential. A big data platform must load large amounts of unstructured data generated in real time without delay, and it must manage storage space efficiently by deduplicating redundant data across different storages. In this paper, we propose a multi-layer distributed data deduplication system that uses the similarity of the Cuckoo-filter hashing technique, taking the characteristics of big data into account. Similarity between virtual machines is captured with Cuckoo hashing, individual storage nodes improve performance through deduplication efficiency, and a multi-layer Cuckoo filter is applied to reduce processing time. Experimental results show that the proposed method shortens processing time by 8.9% and increases the deduplication rate by 10.3%.
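
The multi-layer similarity design is not detailed in the abstract; below is a minimal single cuckoo filter in Python, showing the compact membership test ("have we seen this chunk hash?") that such a deduplication node relies on. Bucket count, fingerprint size, and hash choices are all illustrative.

```python
import hashlib, random

class CuckooFilter:
    """Minimal cuckoo filter: each item stores a short fingerprint in one
    of two candidate buckets, giving fast, compact membership tests.
    num_buckets must be a power of two so the XOR trick below is its own
    inverse (the alternate bucket of the alternate bucket is the original)."""
    def __init__(self, num_buckets=1024, bucket_size=4, max_kicks=500):
        self.buckets = [[] for _ in range(num_buckets)]
        self.n, self.b, self.kicks = num_buckets, bucket_size, max_kicks

    def _fp(self, item: bytes) -> bytes:
        return hashlib.sha256(item).digest()[:1]      # 1-byte fingerprint

    def _i1(self, item: bytes) -> int:
        return int.from_bytes(hashlib.md5(item).digest()[:4], "big") % self.n

    def _i2(self, i1: int, fp: bytes) -> int:
        return (i1 ^ int.from_bytes(hashlib.md5(fp).digest()[:4], "big")) % self.n

    def insert(self, item: bytes) -> bool:
        fp, i1 = self._fp(item), self._i1(item)
        i2 = self._i2(i1, fp)
        for i in (i1, i2):
            if len(self.buckets[i]) < self.b:
                self.buckets[i].append(fp)
                return True
        i = random.choice((i1, i2))                   # evict and relocate
        for _ in range(self.kicks):
            j = random.randrange(len(self.buckets[i]))
            fp, self.buckets[i][j] = self.buckets[i][j], fp
            i = self._i2(i, fp)
            if len(self.buckets[i]) < self.b:
                self.buckets[i].append(fp)
                return True
        return False                                  # filter too full

    def contains(self, item: bytes) -> bool:
        fp, i1 = self._fp(item), self._i1(item)
        return fp in self.buckets[i1] or fp in self.buckets[self._i2(i1, fp)]
```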

Operation Technique of Spatial Data Change Recognition Data per File (파일 단위 공간데이터 변경 인식 데이터 운영 기법)

  • LEE, Bong-Jun
    • Journal of the Korean Association of Geographic Information Studies / v.24 no.4 / pp.184-193 / 2021
  • A system that manages spatial data updates its stored information by extracting, from each newly obtained spatial information file, only the objects that differ from the existing information. To extract only the changed objects, every object in the newly obtained file must be compared against the existing information. This study aims to improve on this full-inspection approach, given that frequently updated spatial information keeps growing and updates are required at the national level. Before inspecting the individual objects in a newly acquired spatial information file, we consider a method of determining whether individual spatial objects have changed using only the information in the file. Because spatial data files have a structured format, unlike general image or text document files, whether a file has changed can be determined more simply than with the existing approach of creating and managing file hashes. By reducing the number of files that require full inspection, the method is expected to save overall data quality inspection time and data extraction time, improving the system's use of resources.
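
As an illustration of the cheap per-file change test the paper argues for, here is a sketch that uses file size, modification time, and a header hash as stand-ins for the structured fields a spatial data format would expose; the actual method reads format-specific information inside the file.

```python
import os, hashlib

def file_signature(path, header_bytes=256):
    """Cheap per-file signature: size, mtime, and a hash of the fixed-size
    header, standing in for the structured fields of a spatial format."""
    st = os.stat(path)
    with open(path, "rb") as f:
        head = f.read(header_bytes)
    return (st.st_size, int(st.st_mtime), hashlib.sha256(head).hexdigest())

def needs_full_inspection(path, known_signatures):
    """Only files whose cheap signature changed go through the expensive
    object-by-object comparison against the stored data."""
    return file_signature(path) != known_signatures.get(path)
```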

A Study on Extraction of Mobile Forensic Data and Integrity Proof (모바일 포렌식 자료의 추출과 무결성 입증 연구)

  • Kim, Ki-Hwan; Park, Dea-Woo
    • Journal of the Korea Society of Computer and Information / v.12 no.6 / pp.177-185 / 2007
  • With the development of IT technology, mobile information appliances that perform various functions have become widespread. Mobile phones make exchanging information and doing business more convenient and efficient, but they are also misused: leaking sensitive engineering data, infringing on individuals' privacy, making threats, and so on. Meanwhile, the relevant statutes are unprepared, and research is lacking on how to prove the deletion, copying, and integrity of the easily altered digital evidence involved in crimes committed with mobile phones, so that it can serve as objective evidence from an investigative standpoint. This field of digital forensics is known as mobile forensics. In this paper, we examine ways of acquiring digital evidence from a mobile phone, the representative subject of mobile forensics, and present a way to prove the integrity of the digital evidence using a hash function.
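
A minimal sketch of hash-based integrity proof for acquired evidence, assuming SHA-256 and a simple JSON custody log; the paper does not prescribe these specifics, so the names and file formats here are illustrative.

```python
import hashlib, json, time

def acquire(image_path, log_path="custody_log.jsonl"):
    """Record the SHA-256 of an acquired evidence image with a timestamp,
    so the digest can be recomputed and compared later in court."""
    with open(image_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    record = {"file": image_path, "sha256": digest, "acquired_at": time.time()}
    with open(log_path, "a") as log:
        log.write(json.dumps(record) + "\n")
    return digest

def still_intact(image_path, recorded_digest):
    with open(image_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest() == recorded_digest
```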

Security Elevation of XML Document Using DTD Digital Signature (DTD 전자서명을 이용한 XML문서의 보안성 향상)

  • Park, Dou-Joon; Min, Hye-Lan; Lee, Joon
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / v.9 no.1 / pp.1080-1083 / 2005
  • A DTD can be regarded as metadata that defines the meaning of the data expressed in an XML document; therefore, if the DTD information is damaged, the security of any XML document based on it is endangered. Rather than attaching a digital signature to the XML document itself during the send-receive process, this research proposes attaching the digital signature to the DTD. The DTD file is first read to the end and parsed, and the extracted elements and attribute entities are stored in a hash table. When parsing is finished, the hash table is read and a message digest is computed, after which a digital signature is composed and created with a private key. A problem arises during signing: an entirely different digest can be produced because the order of entries, which can change during the message digest process, is not examined. This is solved by creating the DTD's digital signature using DOM, which can embody the tree structure as a standard structure for the document.
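
The ordering problem the abstract describes can be reproduced in a few lines: digesting hash-table entries in raw iteration order ties the digest to the order the parser happened to insert them, while fixing a canonical order first makes it reproducible. Sorting below is only a stand-in for the paper's DOM-tree traversal.

```python
import hashlib

# Entities extracted from a parsed DTD (toy stand-in for the hash table).
entities = {"title": "CDATA", "author": "CDATA", "year": "CDATA"}

# Naive: the digest depends on insertion order, so two parsers (or two
# runs inserting in different orders) can disagree on the same DTD.
naive = hashlib.sha256(
    "".join(f"{k}={v}" for k, v in entities.items()).encode()
).hexdigest()

# Canonical: fix a deterministic order before digesting. Sorting the keys
# stands in for the paper's DOM-tree traversal order.
canonical = hashlib.sha256(
    "".join(f"{k}={v}" for k, v in sorted(entities.items())).encode()
).hexdigest()
print(canonical)
```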

An Automatically Extracting Formal Information from Unstructured Security Intelligence Report (비정형 Security Intelligence Report의 정형 정보 자동 추출)

  • Hur, Yuna; Lee, Chanhee; Kim, Gyeongmin; Jo, Jaechoon; Lim, Heuiseok
    • Journal of Digital Convergence / v.17 no.11 / pp.233-240 / 2019
  • To predict and respond to cyber attacks, many security companies quickly identify the methods, types, and characteristics of attack techniques and publish Security Intelligence Reports (SIRs) on them. However, the SIRs distributed by each company are huge and unstructured. In this paper, we propose a framework that uses five analysis techniques to structure a report and extract its key information, reducing the time needed to extract information from large unstructured SIRs. Since SIR data have no ground-truth labels, four of the techniques are unsupervised: Keyword Extraction, Topic Modeling, Summarization, and Document Similarity. Finally, we build data for extracting threat information from SIRs and apply Named Entity Recognition (NER) to recognize words belonging to the IP, Domain/URL, Hash, and Malware categories and to determine each word's type, for a total of five analysis techniques in the framework.
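
As a rough stand-in for the NER step, the sketch below tags IP, URL, and hash strings with regular expressions; the patterns are illustrative and far weaker than the trained NER model the paper uses.

```python
import re

# Illustrative patterns only; a trained NER model is far more robust.
PATTERNS = {
    "IP":   re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "URL":  re.compile(r"\bhttps?://[^\s\"')]+"),
    "Hash": re.compile(r"\b(?:[a-fA-F0-9]{64}|[a-fA-F0-9]{40}|[a-fA-F0-9]{32})\b"),
}

def extract_iocs(text):
    """Return every candidate indicator found in the report text,
    grouped by type."""
    return {label: pat.findall(text) for label, pat in PATTERNS.items()}

sample = ("The dropper at http://malware.example/x.bin (C2 at 10.0.0.5) "
          "has MD5 d41d8cd98f00b204e9800998ecf8427e.")
print(extract_iocs(sample))
```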

An Effective Control Method for Improving Integrity of Mobile Phone Forensics (모바일 포렌식의 무결성 보장을 위한 효과적인 통제방법)

  • Kim, Dong-Guk; Jang, Seong-Yong; Lee, Won-Young; Kim, Yong-Ho; Park, Chang-Hyun
    • Journal of the Korea Institute of Information Security & Cryptology / v.19 no.5 / pp.151-166 / 2009
  • To prove the integrity of digital evidence in an investigation, evidence handled with the MD5 (Message Digest 5) hash-function algorithm must be discarded if its integrity is found to have been damaged during the investigation. Although restoring deleted areas is essential for securing evidence about the main phase of a case, it has been difficult to secure decisive evidence, because the overall hash value differs from the initial value and the evidence data are then treated as damaged. From this viewpoint, this paper proposes a novel model for the mobile forensic procedure, named "E-Finder (Evidence Finder)", to solve the existing problem. E-Finder has 5 main phases and 15 procedures. We compared E-Finder with the methodologies of NIST (National Institute of Standards and Technology) and the Tata Elxsi Security Group. This paper thus contributes to the development and standardization of an investigation methodology for mobile forensics.
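
To make the integrity dilemma concrete, here is a hedged sketch contrasting a whole-image digest with per-block digests: restoring a deleted area changes only the affected block's digest, so untouched blocks remain provably intact. This illustrates the problem the paper addresses, not the E-Finder procedure itself.

```python
import hashlib

def region_hashes(image: bytes, block=4096):
    """Hash an evidence image per fixed-size block. If a deleted area is
    later restored, only that block's digest changes, so every untouched
    block can still be shown intact; a single whole-image digest would
    invalidate everything at once."""
    return [hashlib.md5(image[i:i + block]).hexdigest()
            for i in range(0, len(image), block)]

original = bytes(8192)
restored = bytes(4096) + b"\x01" + bytes(4095)   # one block altered
diff = [i for i, (a, b) in enumerate(zip(region_hashes(original),
                                         region_hashes(restored))) if a != b]
print(diff)    # -> [1]: only the restored block fails verification
```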