• Title/Summary/Keyword: data duplication


A Design of Hop-by-Hop based Reliable Congestion Control Protocol for WSNs (무선 센서 네트워크를 위한 Hop-by-Hop 기반의 신뢰성 있는 혼잡제어 기법 설계)

  • Heo Kwan;Kim Hyun-Tae;Yang Hae-Kwon;Ra In-Ho
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2006.05a / pp.442-445 / 2006
  • In Wireless Sensor Networks (WSNs), a sensor node broadcasts acquired data to its neighboring nodes, which causes a serious data duplication problem that increases network traffic load and data loss. The problem reflects a conflict between supporting reliable data transfer and avoiding network congestion. Solving it requires a reliable congestion control protocol that considers the critical factors affecting data transfer reliability, such as reliable data transmission, wireless loss, and congestion loss, to support effective congestion control in WSNs. In this paper, we propose a reliable congestion control protocol named HRCCP, based on hop-by-hop sequence numbers and DSbACK, that saves energy by minimizing useless data transfers.

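The hop-by-hop duplicate suppression the abstract describes can be sketched with per-neighbor sequence numbers. The class and field names below are illustrative, and HRCCP's DSbACK and energy accounting are omitted:

```python
# Minimal sketch of hop-by-hop duplicate suppression in a WSN node,
# assuming each upstream neighbor stamps packets with a monotonically
# increasing sequence number. Not the paper's actual HRCCP code.

class SensorNode:
    def __init__(self, node_id):
        self.node_id = node_id
        self.last_seq = {}  # upstream neighbor id -> highest sequence seen

    def receive(self, neighbor_id, seq, payload):
        """Return the payload to forward, or None for a duplicate."""
        if seq <= self.last_seq.get(neighbor_id, -1):
            return None  # duplicate or stale packet: drop, saving traffic
        self.last_seq[neighbor_id] = seq
        return payload   # fresh data: forward toward the sink

node = SensorNode("n1")
assert node.receive("n0", 0, "temp=21") == "temp=21"
assert node.receive("n0", 0, "temp=21") is None  # rebroadcast suppressed
```

Dropping duplicates at each hop, rather than only end-to-end, keeps redundant broadcasts from compounding as they spread across the network.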

A Study on Duplication Verification of Public Library Catalog Data: Focusing on the Case of G Library in Busan (공공도서관 목록데이터의 중복검증에 관한 연구 - 부산 지역 G도서관 사례를 중심으로 -)

  • Min-geon Song;Soo-Sang Lee
    • Journal of Korean Library and Information Science Society / v.55 no.1 / pp.1-26 / 2024
  • The purpose of this study is to derive an integration plan for bibliographic records by applying a duplicate verification algorithm to item-based catalogs in public libraries. To this end, G Library, recently opened in Busan, was selected. After collecting OPAC data from G Library through web crawling, multipart monographs of Korean Literature (KDC 800) were selected and the KERIS duplicate verification algorithm was applied. After two rounds of data correction based on the verification results, the duplicate verification rate increased by a total of 2.74 percentage points, from 95.53% to 98.27%. Even after correction, 24 books judged similar or inconsistent were identified as belonging to other published editions, such as revised versions or hardcover copies, that had received separate ISBNs. This confirmed that the duplicate verification rate can be improved through catalog data correction, and that the KERIS duplicate verification algorithm can serve as a tool for converting duplicate item-based records from public libraries into manifestation-based records.
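The KERIS algorithm itself is not reproduced in the abstract; a minimal sketch of bibliographic duplicate grouping in the same spirit, keyed on normalized ISBN with a title/author/year fallback, might look like this. The field names are assumptions:

```python
import re

def match_key(rec):
    """Normalized duplicate-detection key: ISBN digits if present,
    otherwise a flattened title|author|year string (illustrative)."""
    if rec.get("isbn"):
        return re.sub(r"[^0-9X]", "", rec["isbn"].upper())
    title = re.sub(r"\s+", "", rec["title"].lower())
    return f'{title}|{rec["author"].lower()}|{rec["year"]}'

def group_duplicates(records):
    """Bucket records sharing a match key; each bucket is one candidate
    manifestation-level record."""
    groups = {}
    for r in records:
        groups.setdefault(match_key(r), []).append(r)
    return groups

recs = [
    {"isbn": "978-89-954321-0-1", "title": "한국문학전집 1", "author": "Kim", "year": 2020},
    {"isbn": "9788995432101",     "title": "한국문학전집 1", "author": "Kim", "year": 2020},
    {"isbn": "978-89-954321-1-8", "title": "한국문학전집 1", "author": "Kim", "year": 2021},
]
assert len(group_duplicates(recs)) == 2  # revised edition keeps its own ISBN
```

Keying on ISBN first matches the abstract's finding that revised versions and other editions with separate ISBNs should remain distinct rather than merge.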

A Secure and Practical Encrypted Data De-duplication with Proof of Ownership in Cloud Storage (클라우드 스토리지 상에서 안전하고 실용적인 암호데이터 중복제거와 소유권 증명 기술)

  • Park, Cheolhee;Hong, Dowon;Seo, Changho
    • Journal of KIISE / v.43 no.10 / pp.1165-1172 / 2016
  • In a cloud storage environment, deduplication enables efficient use of storage. To save network bandwidth as well, cloud storage service providers have introduced client-side deduplication. Users want to upload encrypted data to ensure confidentiality, but conventional encryption cannot be combined with deduplication because each user encrypts under a different private key. Client-side deduplication is also vulnerable to security threats because a short file tag stands in for the entire file. Recently, proof-of-ownership schemes have been suggested to remedy these vulnerabilities, yet client-side deduplication over encrypted data still raises efficiency and security problems. In this paper, we propose a secure and practical client-side encrypted data deduplication scheme that is resilient to brute-force attacks and performs proof of ownership over encrypted data.
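The standard way to reconcile encryption with deduplication is convergent (message-locked) encryption, where the key is derived from the content itself so identical plaintexts yield identical ciphertexts. The paper adds brute-force resistance and proof of ownership on top, which this toy sketch omits; the stream construction is for illustration only and is not secure:

```python
import hashlib

def convergent_encrypt(data: bytes):
    """Toy convergent encryption: the key is a hash of the plaintext,
    so two users with the same file produce the same ciphertext and
    dedup tag. Illustrative only -- not a secure cipher."""
    key = hashlib.sha256(data).digest()
    # Deterministic keystream from the key (toy construction)
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    ct = bytes(a ^ b for a, b in zip(data, stream))
    tag = hashlib.sha256(ct).hexdigest()  # server-side dedup tag
    return key, ct, tag

k1, c1, t1 = convergent_encrypt(b"same file")
k2, c2, t2 = convergent_encrypt(b"same file")
assert c1 == c2 and t1 == t2  # identical plaintexts dedup to one ciphertext
```

The known weakness of this baseline, predictable files can be brute-forced by hashing guesses, is exactly the gap the paper's scheme targets.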

Performance Analysis of Cellular IP Using a Combined Cache and Alternative Handoff Method for Realtime Data Transmission (실시간 데이터를 지원하는 통합 캐시 및 차별화된 핸드오프를 이용한 셀룰러 IP의 성능분석)

  • Seo, Jeong-Hwa;Han, Tae-Young;Kim, Nam
    • The Journal of the Korea Contents Association / v.3 no.2 / pp.65-72 / 2003
  • In this paper, a new scheme is proposed that uses a Combined Cache (CC), which merges the Paging and Routing Cache (PRC), together with an alternative handoff method chosen according to the type of data being transmitted, to achieve efficient realtime communication. The PRC and the quasi-soft handoff method reduce path duplication, but they increase network traffic load because of the handoff state packets of the Mobile Host (MH), and they apply the same handoff method regardless of the type of transmission data. These problems are solved by operating the proposed scheme with a semi-soft handoff method for realtime data transmission and a hard handoff method for non-realtime data transmission. Simulation results show better performance owing to the reduced number of control packets when the number of cells is below 20, and the packet arrival time and packet loss decrease significantly for realtime data transmission.

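The differentiated handoff policy the abstract describes reduces to a routing decision per packet. A minimal sketch, with illustrative names and the semi-soft behavior modeled as brief path duplication to both base stations:

```python
def forward_targets(handoff_in_progress: bool, realtime: bool,
                    old_bs: str, new_bs: str) -> list:
    """Illustrative policy from the abstract: during a handoff, realtime
    traffic uses a semi-soft handoff (briefly duplicated to both base
    stations to avoid loss and delay), while non-realtime traffic uses a
    hard handoff and follows only the new route."""
    if handoff_in_progress and realtime:
        return [old_bs, new_bs]  # temporary path duplication
    return [new_bs]

assert forward_targets(True, True, "bs_old", "bs_new") == ["bs_old", "bs_new"]
assert forward_targets(True, False, "bs_old", "bs_new") == ["bs_new"]
```

Restricting the duplicated path to realtime flows is what lets the scheme cut control and data overhead without sacrificing realtime packet delivery.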

A Study on Ring Buffer for Efficiency of Mass Data Transmission in Unstable Network Environment (불안정한 네트워크 환경에서 대용량 데이터의 전송 효율화를 위한 링 버퍼에 관한 연구)

  • Song, Min-Gyu;Kim, Hyo-Ryoung
    • The Journal of the Korea institute of electronic communication sciences / v.15 no.6 / pp.1045-1054 / 2020
  • In this paper, we designed a TCP/IP-based ring buffer system that can stably transfer bulk data streams in unstable network environments. In the proposed scheme, the observation data stream generated by each radio observatory's backend system and output as UDP frames is stored as UDP packets in a large-capacity ring buffer, via a socket buffer, in the client system. Thereafter, for stable transmission to the remote destination, the packets are processed over TCP and delivered to the socket buffer of the server system in the correlation center, where they are stored in a large-capacity ring buffer if no problems are found. In case of errors such as loss, duplication, or out-of-order delivery, the packets are retransmitted through TCP flow control, guaranteeing the reliability of the data arriving at the correlation center. We also show that when congestion avoidance is triggered by network instability, performance degradation can be minimized by applying parallel streams.
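The core data structure, a fixed-capacity ring buffer that absorbs bursts and exerts back-pressure when full, can be sketched as a byte-level toy (not the authors' implementation):

```python
class RingBuffer:
    """Fixed-capacity byte ring buffer. push() refuses data when full
    (back-pressure) instead of overwriting unread packets."""

    def __init__(self, capacity: int):
        self.buf = bytearray(capacity)
        self.capacity = capacity
        self.head = 0   # next write position
        self.tail = 0   # next read position
        self.size = 0   # bytes currently buffered

    def push(self, data: bytes) -> bool:
        if len(data) > self.capacity - self.size:
            return False  # full: caller retries, producing back-pressure
        for b in data:
            self.buf[self.head] = b
            self.head = (self.head + 1) % self.capacity
        self.size += len(data)
        return True

    def pop(self, n: int) -> bytes:
        n = min(n, self.size)
        out = bytearray()
        for _ in range(n):
            out.append(self.buf[self.tail])
            self.tail = (self.tail + 1) % self.capacity
        self.size -= n
        return bytes(out)

rb = RingBuffer(8)
assert rb.push(b"abcde")
assert rb.pop(3) == b"abc"
assert rb.push(b"fgh")        # write wraps around the buffer end
assert rb.pop(5) == b"defgh"
```

In the paper's setting the producer side is the UDP receive path and the consumer side the TCP sender, so the buffer decouples bursty frame arrival from the retransmission-paced TCP stream.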

Development of a Reproducibility Index for cDNA Microarray Experiments

  • Kim, Byung-Soo;Rha, Sun-Young
    • Proceedings of the Korean Statistical Society Conference / 2002.05a / pp.79-83 / 2002
  • Since its introduction in 1995 by Schena et al., cDNA microarrays have been established as a potential tool for high-throughput analysis that allows global monitoring of expression levels for thousands of genes simultaneously. One characteristic of cDNA microarray data is that inherent noise remains even after systematic effects are removed from the experiment; replication is therefore crucial. The assessment of reproducibility among replicates, however, has drawn little attention. Reproducibility may be assessed at several different endpoints along the data-reduction process. We define reproducibility as the degree to which replicate arrays duplicate each other. The aim of this note is to develop a novel measure of reproducibility among replicates in a cDNA microarray experiment based on the unprocessed data. Suppose we have p genes and n replicates. We first develop a measure of reproducibility between two replicates and then generalize it to measure the reproducibility of one replicate against the remaining n-1 replicates. We use the rank of the outcome variable and employ the concept of a measure of tracking from the blood pressure literature. We applied the measure to two sets of microarray experiments, one of which was performed in a more homogeneous environment, thereby validating the method. The operational interpretation of this measure is clearer than that of Pearson's correlation coefficient, which might otherwise be used as a crude measure of the reproducibility of two replicates.

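The abstract does not give the tracking-based formula, but since the proposed index works on ranks of the raw outcomes, a rank-based agreement measure in the same spirit, Spearman's rank correlation computed from scratch, illustrates the idea (the paper's actual index differs):

```python
def ranks(xs):
    """Ranks 1..n of xs, averaging over ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                      # extend over a run of tied values
        avg = (i + j) / 2 + 1           # average rank for the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation applied to ranks: agreement between two
    replicate arrays based on gene-expression orderings, not raw values."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

assert abs(spearman([1, 2, 3, 4], [1, 2, 3, 4]) - 1.0) < 1e-9
```

Working on ranks makes the measure insensitive to the monotone intensity distortions that plague unprocessed microarray data, which is the motivation shared by the paper's tracking-based construction.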

Application and Evaluation of Vector Map Watermarking Algorithm for Robustness Enhancement (강인성 향상을 위한 벡터 맵 워터마킹 알고리즘의 적용과 평가)

  • Won, Sung Min;Park, Soo Hong
    • Spatial Information Research / v.21 no.3 / pp.31-43 / 2013
  • Although vector map data possesses much higher value than other types of multimedia, data copyright and protection against illegal duplication have received little attention. This paper proposes a novel watermarking technique that is both robust to diverse attacks and optimized for the vector map structure. Six approaches shape the design of the watermarking algorithm: a point-based approach, building a minimum-perimeter triangle, embedding the watermark in a length ratio, referencing the pixel position of the watermark image, grouping, and using a one-way function. Our method preserves the desired watermarking characteristics of embedding effectiveness, fidelity, and a low false positive rate, while maintaining robustness to all types of attack except noise attacks. Furthermore, it is a blind scheme whose robustness is independent of the map data, and it provides a solution to the challenging issue of degraded robustness under severe simplification attacks.
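Embedding "in the length ratio" suggests a quantization-style scheme on a scale-invariant scalar. A generic quantization index modulation (QIM) sketch on such a ratio is shown below; this is a textbook construction, not the paper's exact algorithm, and the step size is an assumption:

```python
def embed_bit(ratio: float, bit: int, step: float = 0.01) -> float:
    """QIM embedding sketch: snap a length ratio to an even or odd
    multiple of `step` depending on the bit. Because a ratio of segment
    lengths is scale-invariant, the bit survives uniform map scaling."""
    q = round(ratio / step)
    if q % 2 != bit:
        q += 1  # move to the nearest quantizer of the right parity
    return q * step

def extract_bit(ratio: float, step: float = 0.01) -> int:
    """Blind extraction: only the step size is needed, not the map."""
    return round(ratio / step) % 2

r = embed_bit(0.5731, 1)
assert extract_bit(r) == 1
```

The even/odd quantizer pair is what makes the scheme blind, matching the abstract's claim that robustness does not depend on access to the original map data.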

A Case Study on Energy focused Smart City, London of the UK: Based on the Framework of 'Business Model Innovation'

  • Song, Minzheong
    • International journal of advanced smart convergence / v.9 no.2 / pp.8-19 / 2020
  • We examine the energy-focused smart city evolution of the UK alongside the "Smart London Plan (SLP)" project. A theoretical logic of business model innovation is discussed and a research framework for the evolving energy-focused smart city is formulated. The starting point is the silo system. In the second stage, private investment in smart meters establishes a foundation for the subsequent stages. As a result, the UK's smart energy sector has evolved from smart meter installation through the smart grid to new business models such as the water-energy nexus and the microgrid. Before the government's smart meter rollout, the electricity system was centralized. However, after a consumer engagement plan was established to help consumers understand the benefits they can secure through smart meters, customer behavior changed. A data analytics firm enables greater understanding of consumer behavior, helping the energy industry become smart by controlling, securing, and using that data to improve the energy system. In the third stage, distribution network operators (DNOs) are granted access to smart meter data and segmentation begins. In the fourth stage, through the collaboration of Ofwat and Ofgem, unnecessary duplication of work is eliminated and conflicts of interest between the water and electricity sectors are reduced. In the fifth stage, smart meter and grid are integrated into an "adaptive" system, and a transition from DNO to DSO is accomplished for integrated operation; the microgrid is a prototype of an "adaptive" smart grid. These steps enable London to achieve a platform leadership that supports the increasing electrification of the heating and transport sectors and the smart home.

Design and Implementation of Flying-object Tracking Management System by using Radar Data (레이더 자료를 이용한 항적추적관리시스템 설계 및 구현)

  • Lee Moo-Eun;Ryu Keun-Ho
    • The KIPS Transactions:PartD / v.13D no.2 s.105 / pp.175-182 / 2006
  • In the military, radars are used to detect the motion of low-flying enemy planes. Radar-detected raw data are first processed and inserted into the ground tactical C4I system; these data are then analyzed and broadcast to the Shooter system in real time. However, the accuracy of the information and the time spent on display and graphical computation depend on the operator's skill. In this paper, we propose a Flying-object Tracking Management System that displays objects' trails in real time using data received from the radars. We apply a coordinate system translation algorithm, improvements to the existing communication protocol for the communication equipment, and a signal and information computation process. In particular, a radar signal duplication computation and synchronization algorithm is developed to display the objects' coordinates, thereby improving the Tactical Air Control system's reliability, efficiency, and ease of use.
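The duplication computation, collapsing plots of one object reported by overlapping radars, can be sketched with distance/time gating once all plots are translated into a common coordinate system. The thresholds and field names below are assumptions:

```python
import math

def dedupe_plots(plots, dist_m=500.0, dt_s=1.0):
    """Collapse duplicate radar plots of the same flying object seen by
    overlapping radars: two plots within dist_m metres and dt_s seconds
    of an already-kept plot are treated as the same track point.
    Thresholds are illustrative, not from the paper."""
    kept = []
    for p in sorted(plots, key=lambda p: p["t"]):
        is_dup = any(
            abs(p["t"] - q["t"]) <= dt_s
            and math.hypot(p["x"] - q["x"], p["y"] - q["y"]) <= dist_m
            for q in kept
        )
        if not is_dup:
            kept.append(p)
    return kept

plots = [
    {"t": 0.0, "x": 0,    "y": 0},   # radar A
    {"t": 0.3, "x": 120,  "y": 80},  # radar B, same object
    {"t": 5.0, "x": 4000, "y": 0},   # a distinct object
]
assert len(dedupe_plots(plots)) == 2
```

Gating on both position and time is what keeps two radars' near-simultaneous reports from drawing a forked trail on the display.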

A Design and Implementation of a Query Interpreter for SQL/MM Part5 (SQL/MM Part5를 지원하는 쿼리변환기의 설계 및 구현)

  • Kang Gi-Jun;Lee Bu-Kwon;Seo Yeong-Geon
    • Journal of Digital Contents Society / v.6 no.2 / pp.107-112 / 2005
  • With the development of internet technology, the importance and use of multimedia data have increased, so research on representing and processing such data in databases is needed. An RDBMS supports only the storage structures for multimedia; its support for multimedia data types, representation, and queries is insufficient. To address this, ISO/IEC standardized SQL multimedia (SQL/MM) for multimedia data. ORDBMSs support SQL/MM, but RDBMSs do not. Therefore, this thesis proposes a query interpreter that supports SQL/MM on MS-SQL 2000, a representative RDBMS, and introduces an image retrieval application that uses it. The query interpreter converts SQL/MM into SQL and additionally provides an image duplication check. An image processing application using the query interpreter can easily be integrated and operated with a traditional RDBMS-based system.

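A query interpreter of this kind typically rewrites SQL/MM method calls into plain SQL over precomputed feature columns. The sketch below handles one SQL/MM Part 5 (Still Image) predicate, SI_ScoreByAvgClr over SI_AverageColor; the rewritten column names (img_avg_r etc.) are assumptions, not the paper's schema:

```python
import re

# One illustrative rewrite rule: an SI_ScoreByAvgClr similarity score
# becomes a Euclidean distance over stored average-colour columns,
# which a plain RDBMS like MS-SQL 2000 can evaluate directly.
RULES = [
    (re.compile(r"SI_ScoreByAvgClr\(\s*(\w+)\s*,\s*"
                r"SI_AverageColor\((\d+),(\d+),(\d+)\)\s*\)"),
     lambda m: (f"SQRT(POWER({m.group(1)}_avg_r-{m.group(2)},2)+"
                f"POWER({m.group(1)}_avg_g-{m.group(3)},2)+"
                f"POWER({m.group(1)}_avg_b-{m.group(4)},2))")),
]

def translate(sqlmm: str) -> str:
    """Apply each rewrite rule in turn; unmatched text passes through."""
    sql = sqlmm
    for pattern, repl in RULES:
        sql = pattern.sub(repl, sql)
    return sql

q = ("SELECT id FROM photos "
     "WHERE SI_ScoreByAvgClr(img, SI_AverageColor(200,10,10)) < 50")
sql = translate(q)
assert "SI_" not in sql and "SQRT" in sql
```

A real interpreter would need one rule per supported SI_ method plus the image duplication check the abstract mentions, but the rule-table shape stays the same.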