• Title/Summary/Keyword: deletion algorithm

Search results: 74

New Non-iterative Non-incremental Nonlinear Analysis (새로운 개념의 비반복적 비점증적 비선형해석)

  • Kim Chee-Kyeong; Hwang Young-Chul
    • Proceedings of the Computational Structural Engineering Institute Conference / 2006.04a / pp.514-519 / 2006
  • This paper presents a new nonlinear analysis algorithm that uses an equivalent nodal load to represent element stiffness changes. The equivalent nodal load captures the influence of stiffness changes such as the addition of elements, the deletion of elements, and/or partial changes in element stiffness. Nonlinear analysis using this equivalent load greatly improves efficiency because the inverse of the structural stiffness matrix, which is expensive to compute, is reused in each loading step. The paper describes the concept of nonlinear analysis using the equivalent load for the element stiffness and provides numerical examples to verify it. (A minimal sketch of the equivalent-load update follows this entry.)

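The equivalent-load idea above resembles the classical pseudo-force (initial-stiffness) approach: keep the original stiffness factorization and move the effect of stiffness changes to the right-hand side. The sketch below is a hypothetical illustration of that general idea in Python/NumPy, not the authors' implementation; the toy matrices, the `delta_K` perturbation, and the convergence tolerance are all assumptions.

```python
import numpy as np

def solve_with_equivalent_load(K0, delta_K, F, tol=1e-10, max_iter=100):
    """Solve (K0 + delta_K) u = F by reusing the factorization of K0.

    The stiffness change delta_K (element added/deleted/modified) is moved
    to the right-hand side as an equivalent nodal load -delta_K @ u, so the
    expensive factorization of K0 is computed once and reused.
    """
    K0_inv = np.linalg.inv(K0)          # in practice: a reusable LU/Cholesky factor
    u = K0_inv @ F                      # initial solution with the unchanged stiffness
    for _ in range(max_iter):
        F_eq = F - delta_K @ u          # equivalent nodal load from the stiffness change
        u_new = K0_inv @ F_eq           # reuse the same inverse/factorization
        if np.linalg.norm(u_new - u) < tol * max(np.linalg.norm(u_new), 1.0):
            return u_new
        u = u_new
    return u

# Toy 2-DOF example (assumed values, for illustration only).
K0 = np.array([[4.0, -1.0], [-1.0, 3.0]])
delta_K = np.array([[0.5, 0.0], [0.0, -0.2]])   # e.g., a partial stiffness change
F = np.array([1.0, 2.0])
u = solve_with_equivalent_load(K0, delta_K, F)
print(u, np.allclose((K0 + delta_K) @ u, F))
```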

A Dynamic Data Replica Deletion Strategy on HDFS using HMM (HMM을 이용한 HDFS 기반 동적 데이터 복제본 삭제 전략)

  • Seo, Young-Ho; Youn, Hee-Yong
    • Proceedings of the Korean Society of Computer Information Conference / 2014.07a / pp.241-244 / 2014
  • This paper proposes a dynamic data replica deletion strategy using a Hidden Markov Model (HMM) to improve the replication policy that has been an issue in HDFS (Hadoop Distributed File System). HDFS is a distributed file system that can process large volumes of data effectively; it provides high fault tolerance and high-throughput data access, making it well suited to applications with very large data sets. However, while the replication mechanism in HDFS improves system reliability and performance, the additional block replicas occupy a large amount of disk space and increase maintenance costs. This paper proposes a strategy that uses an HMM together with the Viterbi algorithm, which finds the most likely state sequence, to detect unnecessary data replicas and delete them, thereby saving HDFS disk space and maintenance costs. (A toy Viterbi sketch follows this entry.)

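The entry above relies on the standard Viterbi algorithm to recover the most likely hidden-state sequence. Below is a generic, hypothetical Viterbi sketch in Python; the two states ("needed", "redundant") and all probabilities are invented for illustration and are not taken from the paper.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely state sequence for an observation sequence."""
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]   # path probabilities
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            # Best previous state leading into s at time t.
            prev, p = max(((ps, V[t - 1][ps] * trans_p[ps][s]) for ps in states),
                          key=lambda x: x[1])
            V[t][s] = p * emit_p[s][obs[t]]
            back[t][s] = prev
    # Backtrack from the best final state.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

# Hypothetical model: is a replica "needed" or "redundant", given access observations?
states = ["needed", "redundant"]
start_p = {"needed": 0.6, "redundant": 0.4}
trans_p = {"needed": {"needed": 0.7, "redundant": 0.3},
           "redundant": {"needed": 0.2, "redundant": 0.8}}
emit_p = {"needed": {"hot": 0.8, "cold": 0.2},
          "redundant": {"hot": 0.1, "cold": 0.9}}
print(viterbi(["hot", "cold", "cold"], states, start_p, trans_p, emit_p))
```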

Improvement of Newton-Raphson Iteration Using ELS (강성등가하중을 이용한 Newton-Raphson Iteration 개선)

  • Kim, Chee-Kyeong; Hwang, Young-Chul
    • Proceeding of KASS Symposium / 2006.05a / pp.170-174 / 2006
  • This paper presents a new nonlinear analysis algorithm that uses an equivalent nodal load to represent element stiffness changes. The equivalent nodal load captures the influence of stiffness changes such as the addition of elements, the deletion of elements, and/or partial changes in element stiffness. Nonlinear analysis using this equivalent load greatly improves efficiency because the inverse of the structural stiffness matrix, which is expensive to compute, is reused in each loading step. The paper describes the concept of nonlinear analysis using the equivalent load for the element stiffness and provides numerical examples to verify it. (A sketch of the closely related modified Newton-Raphson iteration follows this entry.)

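Since this entry concerns Newton-Raphson iteration, the sketch below shows the closely related modified Newton-Raphson scheme, in which the tangent stiffness is factorized once and reused across iterations. It is a generic illustration under an assumed internal-force function and initial stiffness, not the equivalent-load formulation of the paper itself.

```python
import numpy as np

def modified_newton_raphson(f_int, K0, F_ext, u0, tol=1e-10, max_iter=200):
    """Solve f_int(u) = F_ext using a fixed (reused) stiffness matrix K0.

    Standard Newton-Raphson rebuilds and refactorizes the tangent stiffness
    every iteration; the modified scheme reuses one factorization, trading
    more (cheap) iterations for far less factorization work.
    """
    K0_inv = np.linalg.inv(K0)            # factorized once, reused below
    u = np.array(u0, dtype=float)
    for _ in range(max_iter):
        residual = F_ext - f_int(u)       # out-of-balance (residual) force
        if np.linalg.norm(residual) < tol:
            break
        u = u + K0_inv @ residual         # update with the reused factorization
    return u

# Toy 1-DOF hardening spring: f_int(u) = 10*u + 2*u**3 (assumed for illustration).
f_int = lambda u: 10.0 * u + 2.0 * u**3
K0 = np.array([[10.0]])                   # initial tangent stiffness at u = 0
u = modified_newton_raphson(f_int, K0, F_ext=np.array([12.0]), u0=[0.0])
print(u, f_int(u))                        # f_int(u) should be close to 12.0
```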

Feature Extraction of Letter Using Pattern Classifier Neural Network (패턴분류 신경회로망을 이용한 문자의 특징 추출)

  • Ryoo Young-Jae
    • The Transactions of the Korean Institute of Electrical Engineers D / v.52 no.2 / pp.102-106 / 2003
  • This paper describes a new pattern classifier neural network for extracting features from a letter. The proposed classifier is based on a relative distance, a measure between an input datum and the center of a cluster group, so the proposed network is called a relative neural network (RNN). The structure of the RNN is designed according to the definitions of the distance and the learning rule, and pseudo code for the algorithm is described. In feature extraction of letters, the RNN, despite eliminating the learning rate, achieved the same performance as the winner-take-all (WTA) and self-organizing map (SOM) neural networks. Thus, the RNN is shown to be suitable for extracting the features of a letter. (A minimal relative-distance classifier sketch follows this entry.)
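
The relative-distance idea above can be illustrated with a plain nearest-centroid classifier in which the distance to each cluster center is normalized by that cluster's spread. This is only one hypothetical reading of "relative distance"; the feature vectors, centers, and spreads below are invented for illustration and are not the paper's RNN.

```python
import numpy as np

def relative_distances(x, centers, spreads):
    """Distance from x to each center, scaled by that cluster's spread."""
    return np.array([np.linalg.norm(x - c) / s for c, s in zip(centers, spreads)])

def classify(x, centers, spreads):
    """Assign x to the cluster with the smallest relative distance (no learning rate)."""
    return int(np.argmin(relative_distances(x, centers, spreads)))

# Invented 2-D letter-feature clusters for illustration.
centers = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
spreads = [1.0, 2.0]                       # assumed per-cluster scale
x = np.array([3.2, 3.5])
print(classify(x, centers, spreads))       # -> index of the relatively closer cluster
```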

Automated 3D scoring of fluorescence in situ hybridization (FISH) using a confocal whole slide imaging scanner

  • Ziv Frankenstein; Naohiro Uraoka; Umut Aypar; Ruth Aryeequaye; Mamta Rao; Meera Hameed; Yanming Zhang; Yukako Yagi
    • Applied Microscopy / v.51 / pp.4.1-4.12 / 2021
  • Fluorescence in situ hybridization (FISH) is a technique to visualize specific DNA/RNA sequences within cell nuclei and reveal the presence, location, and structural integrity of genes on chromosomes. Confocal Whole Slide Imaging (WSI) scanner technology has superior depth resolution compared to wide-field fluorescence imaging, and it can perform serial optical sectioning during specimen imaging, which is critical for 3D tissue reconstruction and volumetric spatial analysis. Standard clinical manual scoring for FISH is labor-intensive, time-consuming, and subjective, and applying multi-gene FISH analysis alongside 3D imaging significantly increases the level of complexity required for an accurate 3D analysis. Therefore, the purpose of this study is to establish automated 3D FISH scoring for z-stack images from a confocal WSI scanner. The algorithm and application we developed, SHIMARIS PAFQ, successfully employ 3D calculations for clear segmentation of individual cell nuclei, detection of gene signals, and classification of break-apart probe signal patterns, including the standard break-apart pattern and variant patterns due to truncation, deletion, etc. The analysis was accurate and precise when compared with ground-truth clinical manual counting and scoring reported in ten lymphoma and solid tumor cases. SHIMARIS PAFQ is objective and more efficient than the conventional procedure: it enables automated counting of more nuclei, precisely detects additional abnormal signal variations in nuclear patterns, and analyzes gigabyte-scale multi-layer image stacks of patient tissue samples. Currently, we are developing a deep learning algorithm for automated tumor area detection to be integrated with SHIMARIS PAFQ.

An Incremental Web Document Clustering Based on the Transitive Closure Tree (이행적 폐쇄트리를 기반으로 한 점증적 웹 문서 클러스터링)

  • Youn Sung-Dae; Ko Suc-Bum
    • Journal of Korea Multimedia Society / v.9 no.1 / pp.1-10 / 2006
  • Among document clustering methods, the k-means algorithm and Hierarchical Agglomerative Clustering (HAC) are often used. The k-means algorithm has the advantage of a short processing time, while HAC has the advantage of high classification precision; each also suffers the other's drawback, namely low classification quality for k-means and slow processing for HAC. Both methods share a further serious problem: document similarities must be recomputed whenever a new document is inserted into a cluster. A key property of web resources is that information accumulates as new documents are added frequently. Therefore, we propose a transitive closure tree method based on HAC that improves the processing time of document clustering, together with an efficient incremental clustering method for the insertion of a new document and the deletion of a document contained in a cluster. The proposed method is compared with existing algorithms in terms of precision, recall, F-measure, and processing time, and the experimental results are presented. (A sketch of incremental cluster assignment follows this entry.)

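The transitive closure tree itself is specific to the paper, but the incremental insertion/deletion pattern it targets can be illustrated generically: assign a new document to the most similar existing cluster (or start a new one) without reclustering everything, and remove a document without a full rebuild. The sketch below is a hypothetical cosine-similarity version of that idea, not the authors' algorithm; the threshold and term-frequency vectors are assumptions.

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

class IncrementalClusters:
    """Maintain clusters of documents; insert/delete without full reclustering."""

    def __init__(self, threshold=0.8):
        self.threshold = threshold
        self.clusters = []            # list of lists of (doc_id, vector)

    def insert(self, doc_id, vec):
        best, best_sim = None, -1.0
        for idx, members in enumerate(self.clusters):
            centroid = np.mean([v for _, v in members], axis=0)
            sim = cosine(vec, centroid)
            if sim > best_sim:
                best, best_sim = idx, sim
        if best is not None and best_sim >= self.threshold:
            self.clusters[best].append((doc_id, vec))     # join the closest cluster
        else:
            self.clusters.append([(doc_id, vec)])         # start a new cluster

    def delete(self, doc_id):
        for members in self.clusters:
            members[:] = [(d, v) for d, v in members if d != doc_id]
        self.clusters = [m for m in self.clusters if m]   # drop empty clusters

# Tiny invented term-frequency vectors.
ic = IncrementalClusters(threshold=0.8)
ic.insert("d1", np.array([1.0, 0.0, 0.0]))
ic.insert("d2", np.array([0.9, 0.1, 0.0]))
ic.insert("d3", np.array([0.0, 0.0, 1.0]))
print(len(ic.clusters))   # expected: 2 clusters
```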

Real Time Classification System for Lead Pin Images (실시간 Lead Pin 영상 분류 시스템)

  • 장용훈
    • Journal of the Korea Computer Industry Society / v.3 no.9 / pp.1177-1188 / 2002
  • To classify lead pin images in real time, an image acquisition system was built from a CCD camera, an image frame grabber (DT3153), and a PC (Pentium III), and image processing algorithms were proposed. The algorithms consist of real-time monitoring, lead pin image acquisition, image noise deletion, object area detection, point detection, and pattern classification. Raw images were acquired from lead pins using this system, and result images were obtained from the raw images by the image processing algorithms. In the experiments, 97 of 100 acceptable products and 95 of 100 defective products were recognized correctly, a recognition rate of 96% over 200 lead pins in total. (A minimal pipeline sketch follows this entry.)

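The stages listed above (noise deletion, object-area detection, classification) map onto a conventional image-processing pipeline. The sketch below is a hypothetical OpenCV version of such a pipeline for illustration only; the thresholds, kernel size, and area-based accept/reject rule are invented and are not the paper's actual algorithm.

```python
import cv2
import numpy as np

def classify_lead_pin(gray, area_range=(500, 5000)):
    """Toy accept/reject decision for a grayscale lead-pin image.

    Pipeline: noise removal -> binarization -> object-area detection ->
    a simple area-based classification rule (all parameters assumed).
    """
    denoised = cv2.medianBlur(gray, 5)                       # noise deletion
    _, binary = cv2.threshold(denoised, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # object area detection
    if not contours:
        return "defective"
    area = max(cv2.contourArea(c) for c in contours)
    lo, hi = area_range
    return "acceptable" if lo <= area <= hi else "defective"

# Synthetic test image: a bright rectangle on a dark background.
img = np.zeros((100, 100), dtype=np.uint8)
cv2.rectangle(img, (30, 30), (70, 70), 255, -1)
print(classify_lead_pin(img))
```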

Performance Analysis of a Statistical CFB Encryption Algorithm for Cryptographic Synchronization Method in the Wireless Communication Networks (무선 통신망 암호동기에 적합한 Statistical CFB 방식의 암호 알고리즘 성능 분석)

  • Park Dae-seon; Kim Dong-soo; Kim Young-soo; Yoon Jang-hong
    • Journal of the Korea Institute of Information and Communication Engineering / v.9 no.7 / pp.1419-1424 / 2005
  • This paper suggests a new cipher mode of operation that can recover cryptographic synchronization. First, we review the typical cipher modes of operation, focusing on cryptographic synchronization problems. Then, we suggest a statistical cipher-feedback mode of operation. We define the error sources mathematically and simulate the propagation errors caused by a bit insertion or bit deletion. In the simulation, we compare the effects of changing the synchronization pattern length and the feedback key length. We then analyze the simulation results against the calculated propagation errors. Finally, we evaluate the performance of the statistical cipher-feedback mode of operation and recommend implementation considerations. (A toy self-synchronizing CFB sketch follows this entry.)
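
The self-synchronizing property that statistical CFB builds on can be illustrated with ordinary ciphertext feedback: each keystream unit is derived from recent ciphertext, so the receiver resynchronizes once a full feedback register of correct ciphertext has passed after a slip. The sketch below is a deliberately toy CFB-style construction using SHA-256 as a stand-in keystream function, purely to show recovery after a byte deletion; it is not the paper's statistical CFB and must not be used as real cryptography.

```python
import hashlib

def keystream_byte(key: bytes, feedback: bytes) -> int:
    """Toy keystream: one byte derived from the key and the ciphertext feedback."""
    return hashlib.sha256(key + feedback).digest()[0]

def cfb_process(key: bytes, iv: bytes, data: bytes, decrypt: bool) -> bytes:
    """Byte-wise CFB: keystream depends only on the last len(iv) ciphertext bytes."""
    register = bytearray(iv)
    out = bytearray()
    for b in data:
        ks = keystream_byte(key, bytes(register))
        o = b ^ ks
        ct = b if decrypt else o          # feedback is always the ciphertext byte
        register = register[1:] + bytes([ct])
        out.append(o)
    return bytes(out)

key, iv = b"demo-key", b"\x00" * 4
pt = b"synchronization recovers after a slip"
ct = cfb_process(key, iv, pt, decrypt=False)

# Simulate a one-byte deletion in transit: the receiver loses sync, then recovers
# once 4 (the register length) correct ciphertext bytes have been fed back.
damaged = ct[:5] + ct[6:]
recovered = cfb_process(key, iv, damaged, decrypt=True)
print(recovered)                          # the tail of the plaintext reappears intact
```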

Extended Pairing Heap Algorithms Considering Cache Effect (캐쉬 효과를 고려한 확장된 Pairing Heap 알고리즘)

  • 정균락; 김경훈
    • Journal of KIISE: Computer Systems and Theory / v.30 no.5_6 / pp.250-257 / 2003
  • As memory access time becomes slower relative to fast processor speeds, most systems use cache memory to reduce the gap, and cache performance has an increasingly large impact on the performance of algorithms. Blocking is a well-known method for exploiting the cache and has shown good results for matrix multiplication and for search trees such as the d-heap. But if blocking is used in data structures that require rotation during insertion or deletion, the execution time increases because data must be moved between blocks. In this paper, we propose extended pairing heap algorithms using block nodes and show by experiment that our structure is superior; with block nodes we also use less memory, since the number of pointers decreases. (A basic pairing heap sketch follows this entry.)
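
For context on the structure being extended, here is a minimal standard pairing heap (insert and two-pass delete-min) in Python. The block-node, cache-conscious extension described in the entry is not shown; this baseline is only an assumed reference implementation.

```python
class PairingHeap:
    """Minimal min pairing heap: O(1) insert/meld, amortized O(log n) delete-min."""

    def __init__(self, key=None, children=None):
        self.key = key
        self.children = children or []

    def is_empty(self):
        return self.key is None

    def meld(self, other):
        if self.is_empty():
            return other
        if other.is_empty():
            return self
        if self.key <= other.key:
            self.children.append(other)
            return self
        other.children.append(self)
        return other

    def insert(self, key):
        return self.meld(PairingHeap(key))

    def delete_min(self):
        """Remove the root; merge its children pairwise, then fold right to left."""
        pairs = []
        kids = self.children
        for i in range(0, len(kids) - 1, 2):          # first pass: pair up children
            pairs.append(kids[i].meld(kids[i + 1]))
        if len(kids) % 2:
            pairs.append(kids[-1])
        heap = PairingHeap()
        for h in reversed(pairs):                     # second pass: right-to-left meld
            heap = heap.meld(h)
        return self.key, heap

# Usage: push a few keys, pop them back in sorted order.
h = PairingHeap()
for k in [5, 1, 4, 2, 3]:
    h = h.insert(k)
out = []
while not h.is_empty():
    k, h = h.delete_min()
    out.append(k)
print(out)   # [1, 2, 3, 4, 5]
```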

Block Unit Mapping Technique of NAND Flash Memory Using Variable Offset

  • Lee, Seung-Woo; Ryu, Kwan-Woo
    • Journal of the Korea Society of Computer and Information / v.24 no.8 / pp.9-17 / 2019
  • In this paper, we propose a block mapping technique applicable to NAND flash memory. To use NAND flash memory with operating systems and file systems developed for the hard disks common in general PCs, system software known as the FTL (Flash Translation Layer) is required. The FTL overcomes the inability to overwrite data in place by using an address mapping table and handles the additional constraints caused by the physical structure of NAND flash memory. This paper proposes a new mapping method, based on block mapping, for efficient use of NAND flash memory. In the proposed technique, a data modification request is handled by writing to a blank page in the existing block rather than allocating an additional block, thereby minimizing block-unit erase (deletion) operations during merge operations. In addition, by adjusting the ratio of pages used for original data to pages reserved for modified data in a block according to the observed frequency of sequential and random write requests, the technique maximizes page utilization within a block and optimizes both sequential and random writes. (A minimal block-mapping FTL sketch follows this entry.)
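
The page-reservation idea described above can be illustrated with a toy block-mapping FTL model: each block keeps some pages for original data and a variable number of spare pages for updates, and a block is only merged and erased when its spare pages run out. Everything below (block size, the `spare_ratio` parameter, the data structures) is an assumed simplification for illustration, not the paper's scheme.

```python
class ToyBlockMappingFTL:
    """Toy FTL: logical block -> physical block, with in-block spare pages for updates."""

    PAGES_PER_BLOCK = 8

    def __init__(self, spare_ratio=0.25):
        # spare_ratio: fraction of each block reserved for modified (updated) pages.
        self.spare_pages = int(self.PAGES_PER_BLOCK * spare_ratio)
        self.data_pages = self.PAGES_PER_BLOCK - self.spare_pages
        self.blocks = {}          # logical block no -> {"data": {...}, "spares": [...]}
        self.erase_count = 0

    def write(self, logical_page, value):
        lbn, offset = divmod(logical_page, self.data_pages)
        blk = self.blocks.setdefault(lbn, {"data": {}, "spares": []})
        if offset not in blk["data"]:
            blk["data"][offset] = value                 # first write goes to a data page
        elif len(blk["spares"]) < self.spare_pages:
            blk["spares"].append((offset, value))       # update goes to an in-block spare
        else:
            self._merge(lbn, blk)                       # spares exhausted: merge + erase
            self.blocks[lbn]["data"][offset] = value

    def _merge(self, lbn, blk):
        # Fold the newest spare copies back into the data pages, then erase the block.
        for offset, value in blk["spares"]:
            blk["data"][offset] = value
        self.blocks[lbn] = {"data": dict(blk["data"]), "spares": []}
        self.erase_count += 1

    def read(self, logical_page):
        lbn, offset = divmod(logical_page, self.data_pages)
        blk = self.blocks.get(lbn, {"data": {}, "spares": []})
        for off, value in reversed(blk["spares"]):      # newest in-block update wins
            if off == offset:
                return value
        return blk["data"].get(offset)

ftl = ToyBlockMappingFTL(spare_ratio=0.25)
for i in range(4):
    ftl.write(0, f"v{i}")              # repeated updates to the same logical page
print(ftl.read(0), ftl.erase_count)    # latest value; erases happen only when spares run out
```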