• Title/Summary/Keyword: hashing

Search Results: 214

Enhancing RCC(Recyclable Counter With Confinement) with Cuckoo Hashing (Cuckoo Hashing을 이용한 RCC에 대한 성능향상)

  • Jang, Rhong-ho; Jung, Chang-hun; Kim, Keun-young; Nyang, Dae-hun; Lee, Kyung-Hee
    • The Journal of Korean Institute of Communications and Information Sciences / v.41 no.6 / pp.663-671 / 2016
  • With network traffic increasing rapidly, the need for high-speed routers has also grown. For purposes such as traffic statistics and security, a router should perform traffic measurement, but by the nature of a high-speed router its memory resources are limited. RCC was proposed as a way to measure traffic with high speed and accuracy; it uses an additional quadratic-probing hash table to accumulate elephant flows. In our experiments, however, quadratic probing incurred large overheads when the allocated memory was small or the load factor was high, requiring many calculations for both update and lookup. To address this problem, we enhance RCC with cuckoo hashing, which performs well for both update and lookup. As a result, RCC with cuckoo hashing achieved high accuracy and speed even when the memory load factor was high.
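As a rough illustration of the lookup/update behavior exploited here, the sketch below implements a minimal two-choice cuckoo hash table in Python; the table size, hash functions, and eviction limit are arbitrary example choices, not the parameters used in the paper.

```python
# Minimal two-choice cuckoo hash table (illustrative sketch, not the paper's implementation).
class CuckooHash:
    def __init__(self, size=1024, max_kicks=32):
        self.size = size
        self.max_kicks = max_kicks
        self.table = [None] * size          # each slot holds (key, value) or None

    def _h1(self, key):
        return hash(("h1", key)) % self.size

    def _h2(self, key):
        return hash(("h2", key)) % self.size

    def lookup(self, key):
        # At most two slots are probed, so lookup cost is constant.
        for idx in (self._h1(key), self._h2(key)):
            slot = self.table[idx]
            if slot is not None and slot[0] == key:
                return slot[1]
        return None

    def insert(self, key, value):
        entry = (key, value)
        idx = self._h1(key)
        for _ in range(self.max_kicks):
            if self.table[idx] is None:
                self.table[idx] = entry
                return True
            # Evict the resident entry and push it to its alternate slot.
            self.table[idx], entry = entry, self.table[idx]
            k = entry[0]
            idx = self._h2(k) if idx == self._h1(k) else self._h1(k)
        return False  # too many evictions: the table needs resizing or rehashing
```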

Binary Hashing CNN Features for Action Recognition

  • Li, Weisheng; Feng, Chen; Xiao, Bin; Chen, Yanquan
    • KSII Transactions on Internet and Information Systems (TIIS) / v.12 no.9 / pp.4412-4428 / 2018
  • The purpose of this work is to solve the problem of representing an entire video with Convolutional Neural Network (CNN) features for human action recognition. Because GPU memory is insufficient, it has been difficult to feed a whole video into a CNN for end-to-end learning, so a typical method uses sampled video frames as inputs and the corresponding labels as supervision. One major issue with this popular approach is that the local samples may contain neither the information indicated by the global labels nor sufficient motion information. To address this issue, we propose a binary hashing method that enhances the local feature extractors. First, we extract the local features and aggregate them into global features using maximum/minimum pooling. Second, we use the binary hashing method to capture the motion features. Finally, we concatenate the hashing features with the global features, using different normalization methods, to train the classifier. Experimental results on the JHMDB and MPII-Cooking datasets show that, with these new local features, binary hashing of the sparsely sampled features leads to significant performance improvements.
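As a loose illustration of the pooling-plus-binarization idea (not the authors' exact formulation), the sketch below max/min-pools a stack of per-frame CNN feature vectors and derives binary bits by thresholding; the feature dimensions and the thresholding rule are assumptions made for the example.

```python
import numpy as np

def aggregate_and_hash(frame_features):
    """frame_features: (num_frames, feature_dim) array of per-frame CNN features."""
    global_max = frame_features.max(axis=0)      # max pooling over frames
    global_min = frame_features.min(axis=0)      # min pooling over frames
    # Binary bits: 1 where the max/min spread exceeds its own mean, else 0.
    # (Thresholding at the mean is an assumption for illustration only.)
    spread = global_max - global_min
    binary_hash = (spread > spread.mean()).astype(np.uint8)
    # Concatenate pooled features with the hash bits to form the video descriptor.
    return np.concatenate([global_max, global_min, binary_hash.astype(np.float32)])

# Example: 30 sampled frames, 512-dimensional features per frame.
descriptor = aggregate_and_hash(np.random.rand(30, 512))
```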

Security Analysis based on Differential Entropy in 3D Model Hashing (3D 모델 해싱의 미분 엔트로피 기반 보안성 분석)

  • Lee, Suk-Hwan; Kwon, Ki-Ryong
    • The Journal of Korean Institute of Communications and Information Sciences / v.35 no.12C / pp.995-1003 / 2010
  • Content-based hashing for the authentication and copy protection of images, video, and 3D models has to satisfy both robustness and security. A modelling method based on differential entropy had been presented for analyzing the security of the hash value, but it can only be applied to image hashing. This paper presents a differential-entropy-based model for analyzing the security of the hash feature values in 3D model hashing. The proposed security analysis designs two types of feature-extraction methods and then analyzes the security of the two feature values using differential-entropy modelling. In our experiments, we evaluate the security of the two feature-extraction methods and discuss the trade-off between the security and the robustness of the hash value.
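For reference, the differential entropy underlying this security measure is the continuous counterpart of Shannon entropy; for a hash feature modelled as a random variable X with probability density f(x) it is the standard quantity

```latex
h(X) = -\int_{-\infty}^{\infty} f(x)\,\log f(x)\,dx
```

Intuitively, a higher differential entropy means the feature value is harder for an attacker to estimate, while making the feature more robust tends to concentrate its density and lower the entropy, which is one way to read the trade-off discussed above.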

Design and Implementation of the dynamic hashing structure for indexing the current positions of moving objects (이동체의 현재 위치 색인을 위한 동적 해슁 구조의 설계 및 구현)

  • 전봉기
    • Journal of the Korea Institute of Information and Communication Engineering / v.8 no.6 / pp.1266-1272 / 2004
  • Location-Based Services (LBS) give rise to location-dependent queries whose results depend on the positions of moving objects. Because these positions change continuously, a moving-object index must perform update operations frequently to keep the position information current. Existing spatial indexes (Grid File, R-Tree, KDB-tree, etc.) were proposed to search static data efficiently and are not suitable for indexing a moving-object database whose position data changes continuously. In this paper, I propose a dynamic hashing index with low insertion/deletion costs. The dynamic hashing structure applies dynamic hashing techniques that combine a hash and a tree into a spatial index. The results of my extensive experiments show that the dynamic hashing index outperforms the R*-tree and the fixed grid.

Performance Analysis of FA Allocation Schemes of CDMA Radio Networks (CDMA 무선망의 FA 할당 방식 성능 분석)

  • 김장욱; 유병철; 오창헌; 조성준
    • Proceedings of the Korea Electromagnetic Engineering Society Conference / 2000.11a / pp.59-62 / 2000
  • In current CDMA systems, a hashing algorithm is used to assign traffic channels to frequencies, and the algorithm maintains load balance across FAs (frequency assignments). The hashing algorithm distributes all subscriber numbers uniformly over the FAs, but the subscriber distribution at an individual base station is not necessarily uniform per FA. Such an imbalance across FAs is perceived as overload at that base station, prompting premature FA expansion and degrading call quality on the overloaded FA. We therefore propose OFD (optional forced distribution), a forced allocation step applied after hashing that adjusts traffic-channel FA assignment according to the load balance, and apply it to a CDMA system, obtaining improved load balancing.
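A minimal sketch of the general idea follows (the FA count, threshold, and redistribution rule are invented for illustration; this is not the paper's OFD algorithm): subscribers are first hashed to an FA, and a call arriving at an overloaded FA is redirected to the least-loaded one.

```python
# Illustrative sketch: hash-based FA assignment with an optional forced redistribution step.
NUM_FA = 3
OVERLOAD_THRESHOLD = 40          # active calls per FA (arbitrary example value)
load = [0] * NUM_FA              # current active calls on each FA

def assign_fa(subscriber_id):
    fa = hash(subscriber_id) % NUM_FA        # baseline hashing assignment
    if load[fa] >= OVERLOAD_THRESHOLD:       # forced redistribution when overloaded
        fa = min(range(NUM_FA), key=lambda i: load[i])
    load[fa] += 1
    return fa
```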


Deep Hashing for Semi-supervised Content Based Image Retrieval

  • Bashir, Muhammad Khawar; Saleem, Yasir
    • KSII Transactions on Internet and Information Systems (TIIS) / v.12 no.8 / pp.3790-3803 / 2018
  • Content-based image retrieval is an approach used to query images based on their semantics, with applications in many fields, including medicine, space, and computing. Semantically generated binary hash codes can improve content-based image retrieval. These semantic labels / binary hash codes can be generated from unlabeled data using convolutional autoencoders. The proposed approach uses semi-supervised deep hashing with semantic learning and binary code generation by minimizing an objective function. Convolutional autoencoders are the basis for extracting semantic features because of their ability to reconstruct images from low-level semantic representations. These representations of images are more effective than simple feature extraction and preserve semantic information better. The proposed activation and loss functions help minimize classification error and produce better hash codes. The most widely used datasets have been used to verify this approach, which outperforms existing methods.
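As a rough sketch of how binary hash codes can be derived from an autoencoder's latent representation (a generic thresholding step, not the authors' specific activation and loss design):

```python
import numpy as np

def latent_to_hash(latent):
    """Map a real-valued latent vector from a convolutional autoencoder to binary hash bits."""
    # Generic mean-threshold binarization; the paper's own activation and loss are not reproduced here.
    return (latent > latent.mean()).astype(np.uint8)

def hamming_distance(a, b):
    return int(np.count_nonzero(a != b))

# Retrieval then ranks database images by Hamming distance between hash codes.
query_code = latent_to_hash(np.random.randn(64))
db_code = latent_to_hash(np.random.randn(64))
print(hamming_distance(query_code, db_code))
```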

Reversible Multipurpose Watermarking Algorithm Using ResNet and Perceptual Hashing

  • Mingfang Jiang; Hengfu Yang
    • Journal of Information Processing Systems / v.19 no.6 / pp.756-766 / 2023
  • To effectively track the illegal use of digital images and maintain the security of digital image communication on the Internet, this paper proposes a reversible multipurpose image watermarking algorithm based on a deep residual network (ResNet) and perceptual hashing (also called MWR). The algorithm first combines perceptual image hashing to generate a digital fingerprint that depends on the user's identity information and image characteristics. Then it embeds the removable visible watermark and digital fingerprint in two different regions of the orthogonal separation of the image. The embedding strength of the digital fingerprint is computed using ResNet. Because of the embedding of the removable visible watermark, the conflict between the copyright notice and the user's browsing is balanced. Moreover, image authentication and traitor tracking are realized through digital fingerprint insertion. The experiments show that the scheme has good visual transparency and watermark visibility. The use of chaotic mapping in the visible watermark insertion process enhances the security of the multipurpose watermark scheme, and unauthorized users without correct keys cannot effectively remove the visible watermark.
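For context, a perceptual hash produces a short fingerprint that stays stable under mild image processing; the sketch below is a generic average-hash, not the MWR scheme's fingerprint, and the 8x8 block size is an arbitrary choice for the example.

```python
import numpy as np

def average_hash(gray_image, hash_size=8):
    """Generic perceptual (average) hash of a grayscale image given as a 2-D array."""
    h, w = gray_image.shape
    # Crude downsampling by block averaging to hash_size x hash_size.
    small = gray_image[:h - h % hash_size, :w - w % hash_size]
    small = small.reshape(hash_size, h // hash_size, hash_size, w // hash_size).mean(axis=(1, 3))
    # One bit per block: 1 if the block is brighter than the overall mean.
    return (small > small.mean()).astype(np.uint8).flatten()

# Example: 64-bit fingerprint of a 480x640 image.
fingerprint = average_hash(np.random.rand(480, 640))
```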

An Optimization of Hashing Mechanism for the DHP Association Rules Mining Algorithm (DHP 연관 규칙 탐사 알고리즘을 위한 해싱 메커니즘 최적화)

  • Lee, Hyung-Bong; Kwon, Ki-Hyeon
    • Journal of the Korea Society of Computer and Information / v.15 no.8 / pp.13-21 / 2010
  • One of the most distinctive features of the DHP association rules mining algorithm is that it counts the support of hash-key combinations composed of k items at phase k-1 and uses the counted support to prune candidate large itemsets, improving performance. Ideally, each hash-key combination would have its own count variable, but memory shortage makes it impossible to allocate them all. The algorithm therefore uses a direct hashing mechanism in which several hash-key combinations collide and are counted in the same hash bucket. But direct hashing is not efficient, because the distribution of hash-key combinations is unbalanced by characteristics arising from the mining process. This paper proposes a mapped perfect hashing function that maps the space of hash-key combinations into a continuous integer range for phase 3 and maximizes the efficiency of the direct hashing mechanism. Performance tests on 42 data sets show that the average improvement of the proposed hashing mechanism over the existing method is 7.3%, with a maximum improvement of 16.9%. The proposed method is also more efficient when transactions or large itemsets are long or when the total number of items is large.
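One standard way to map item combinations onto a continuous integer range without collisions is the combinatorial number system; the sketch below shows such a perfect mapping for 3-item combinations (phase 3) as a generic illustration, not the paper's exact function.

```python
from math import comb

def combination_index(items):
    """Perfectly map a combination of distinct item numbers (0-based) to a unique integer.

    Distinct combinations of the same size never collide, so each one can own
    its own count variable in a contiguous array (no shared hash buckets).
    """
    return sum(comb(item, rank + 1) for rank, item in enumerate(sorted(items)))

# Example for phase 3: every 3-item combination over n items gets a distinct
# index in the continuous range [0, comb(n, 3)).
n = 10
indexes = {combination_index((a, b, c))
           for a in range(n) for b in range(a + 1, n) for c in range(b + 1, n)}
assert len(indexes) == comb(n, 3) and max(indexes) == comb(n, 3) - 1
```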

A Hashing Scheme using Round Robin in a Wireless Internet Proxy Server Cluster System (무선 인터넷 프록시 서버 클러스터 시스템에서 라운드 로빈을 이용한 해싱 기법)

  • Kwak, Huk-Eun; Chung, Kyu-Sik
    • The KIPS Transactions:PartA / v.13A no.7 s.104 / pp.615-622 / 2006
  • Caching in a wireless Internet proxy server cluster minimizes request and response times for Internet traffic and Web users. To increase the cache hit ratio, a hash function can be used so that identical request URLs are assigned to the same cache server. The disadvantage of this hashing scheme is that client requests may not be well distributed across all cache servers, so the performance of the whole system can depend on only a few busy servers. In this paper, we propose an improved load-balancing scheme that combines hashing with round robin and distributes client requests evenly to the cache servers. In the existing hashing scheme, once the hash value of a request URL is calculated, the server number is statically fixed at compile time; in the proposed scheme it is determined dynamically at run time using round robin. We implemented the proposed scheme in a wireless Internet proxy server cluster and performed experiments using 16 PCs. The results show an even distribution of client requests and a 52% to 112% performance improvement compared to the existing hashing method.
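A minimal sketch of the difference between the two policies (server names and counts are invented for the example): in the static scheme a URL's hash bucket always maps to the same precomputed server, whereas in a hashing-plus-round-robin scheme a bucket is bound to a server only the first time it is seen, taking the next server in round-robin order, so identical URLs still reach the same cache.

```python
from itertools import cycle

CACHE_SERVERS = ["cache0", "cache1", "cache2", "cache3"]   # example cluster
NUM_BUCKETS = 64

# Existing scheme: bucket -> server is fixed in advance (e.g., modulo), regardless of traffic.
def static_hash_assign(url):
    return CACHE_SERVERS[hash(url) % len(CACHE_SERVERS)]

# Hashing + round robin (sketch of the proposed style): a bucket is bound to a
# server at run time, the first time a request falls into it; later hits reuse it.
_rr = cycle(CACHE_SERVERS)
_bucket_to_server = {}

def hash_round_robin_assign(url):
    bucket = hash(url) % NUM_BUCKETS
    if bucket not in _bucket_to_server:
        _bucket_to_server[bucket] = next(_rr)
    return _bucket_to_server[bucket]
```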

Sorting Cuckoo: Enhancing Lookup Performance of Cuckoo Hashing Using Insertion Sort (Sorting Cuckoo: 삽입 정렬을 이용한 Cuckoo Hashing의 입력 연산의 성능 향상)

  • Min, Dae-hong; Jang, Rhong-ho; Nyang, Dae-hun; Lee, Kyung-hee
    • The Journal of Korean Institute of Communications and Information Sciences / v.42 no.3 / pp.566-576 / 2017
  • Key-value stores have proved their worth in NoSQL databases such as Redis and Memcached. Lookup performance is important because key-value store applications perform more lookups than inserts in most environments. In traditional implementations, however, lookups may be slow because the hash tables are built from linked lists. Cuckoo hashing has therefore attracted academic attention for its constant lookup time, and bucketized cuckoo hashing (BCH) has been proposed because it achieves a high load factor. In this paper we introduce Sorting Cuckoo, which inserts data using insertion sort within the BCH structure. Because the data in each bucket are kept sorted, Sorting Cuckoo can determine the existence of a key with relatively few memory accesses; the higher the memory load factor, the better its lookup performance compared with BCH. Experimental results at a 95% load factor show that Sorting Cuckoo performs about 19 million (25%) fewer memory accesses than BCH over 10 million negative lookups (key not in the table) and about 4 million (10%) fewer over 10 million positive lookups (key present).
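The lookup saving comes from keeping each bucket's slots sorted so a scan can stop early; below is a minimal sketch of that idea (the 4-slot buckets and early-exit rule are assumptions for the example, not the paper's implementation).

```python
import bisect

SLOTS_PER_BUCKET = 4   # typical bucket size in bucketized cuckoo hashing; an assumption here

def bucket_insert_sorted(bucket, key, value):
    """Insert into a bucket keeping (key, value) slots in ascending key order (insertion sort).

    Returns False if the bucket is full, in which case the usual cuckoo
    relocation to the key's alternate bucket would be needed.
    """
    if len(bucket) >= SLOTS_PER_BUCKET:
        return False
    bisect.insort(bucket, (key, value))
    return True

def bucket_lookup_sorted(bucket, key):
    """Lookup in one sorted bucket with an early exit.

    Because slots are in ascending key order, the scan stops as soon as a slot
    key exceeds the target, saving memory accesses, especially on negative lookups.
    """
    for slot_key, value in bucket:
        if slot_key == key:
            return value
        if slot_key > key:
            break          # key cannot appear later in a sorted bucket
    return None
```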