• Title/Summary/Keyword: deletion algorithm


Object Contour Tracking Using Optimization of the Number of Snake Points in Stereoscopic Images (스테레오 동영상에서 스네이크 포인트 수의 최적화를 이용한 객체 윤곽 추적 알고리즘)

  • Kim Shin-Hyoung;Jang Jong-Whan
    • The KIPS Transactions:PartB
    • /
    • v.13B no.3 s.106
    • /
    • pp.239-244
    • /
    • 2006
  • In this paper, we present a snake-based scheme for contour tracking of objects in stereo image sequences. We address the problem by managing the insertion of new points and deletion of unnecessary points to better describe and track the object's boundary. In particular, our method uses more points in highly curved parts of the contour, and fewer points in less curved parts. The proposed algorithm can successfully define the contour of the object, and can track the contour in complex images. Furthermore, we tested our algorithm in the presence of partial object occlusion. Performance of the proposed algorithm has been verified by simulation.
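The point-management idea in this abstract (more points on highly curved parts, fewer on straight parts) can be sketched in a few lines: insert a point where the contour turns sharply and delete one where it is nearly straight. The turn-angle curvature proxy and the thresholds below are illustrative assumptions, not values from the paper.

```python
import math

def manage_snake_points(points, high_curv=0.6, low_curv=0.1, min_gap=2.0):
    """One insertion/deletion pass over a closed contour of (x, y) points.

    Inserts a midpoint after sharply curved vertices and drops vertices
    on nearly straight, densely sampled segments. Thresholds are
    illustrative, not taken from the paper.
    """
    n = len(points)
    out = []
    for i, p in enumerate(points):
        prev, nxt = points[i - 1], points[(i + 1) % n]
        # Exterior turn angle at p approximates local curvature.
        a1 = math.atan2(p[1] - prev[1], p[0] - prev[0])
        a2 = math.atan2(nxt[1] - p[1], nxt[0] - p[0])
        turn = abs((a2 - a1 + math.pi) % (2 * math.pi) - math.pi)
        if turn < low_curv and math.dist(prev, nxt) < min_gap:
            continue                       # delete: contour is straight here
        out.append(p)
        if turn > high_curv:               # insert: refine the sharp corner
            out.append(((p[0] + nxt[0]) / 2, (p[1] + nxt[1]) / 2))
    return out
```

In a tracker this pass would run after each snake-energy minimization step, so the point budget follows the object's shape from frame to frame.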

Applicability of K-path Algorithm for the Transit Transfer of the Mobility Handicapped (교통약자의 대중교통환승을 위한 K경로 알고리즘 적용성 연구)

  • Kim, Eung-Cheol;Kim, Tea-Ho;Choi, Eun-Jin
    • International Journal of Highway Engineering
    • /
    • v.13 no.1
    • /
    • pp.197-206
    • /
    • 2011
  • The Korean government has concentrated on supplying public transit facilities for the mobility handicapped. On the other hand, the growing need for transfer information when the mobility handicapped use transit facilities remains largely unmet. This study evaluates the applicability of a developed K-path algorithm for providing user-customized route information that encourages active use of public transit while considering the preferences of the mobility handicapped. The developed algorithm reflects requirements derived from the transfer attributes of the mobility handicapped; trip attributes such as transfer walking time, transfer ratio, and facility preferences are treated separately for each type of handicap. This study verifies and applies the proposed algorithm, which searches for the K least-time paths, by testing it on the actual subway network of the Seoul metropolitan area. The results show that the K-path algorithm provides paths that meet the needs of the mobility handicapped and is suitable for future expansion.
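The least-time K-path search described above can be illustrated with a generic best-first enumeration over a weighted graph. This is a sketch, not the authors' exact algorithm; the edge costs are assumed to be generalized times (e.g. in-vehicle time plus transfer-walking penalties weighted per handicap type).

```python
import heapq

def k_shortest_paths(graph, src, dst, k):
    """Return up to k lowest-cost loopless paths from src to dst.

    graph: {node: [(neighbor, cost), ...]} with generalized costs.
    Simple best-first enumeration with a per-node expansion cap;
    illustrative only, not the paper's algorithm.
    """
    paths, counts = [], {}
    heap = [(0, [src])]
    while heap and len(paths) < k:
        cost, path = heapq.heappop(heap)
        node = path[-1]
        counts[node] = counts.get(node, 0) + 1
        if node == dst:
            paths.append((cost, path))
            continue
        if counts[node] > k:          # node already expanded enough times
            continue
        for nbr, w in graph.get(node, []):
            if nbr not in path:       # keep paths loopless
                heapq.heappush(heap, (cost + w, path + [nbr]))
    return paths
```

Returning several near-optimal paths rather than a single shortest one is what lets the planner filter by user preferences (fewer transfers, shorter transfer walks) afterwards.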

A Selection-Deletion of Prime Implicants Algorithm Based on Frequency for Circuit Minimization (빈도수 기반 주 내포 항 선택과 삭제 알고리즘을 적용한 회로 최소화)

  • Lee, Sang-Un
    • Journal of the Korea Society of Computer and Information
    • /
    • v.20 no.4
    • /
    • pp.95-102
    • /
    • 2015
  • This paper proposes a simple algorithm for circuit minimization. There are currently two effective heuristics for circuit minimization: the manual Karnaugh map and the computable Quine-McCluskey algorithm. The latter, however, has a major defect: its runtime and memory requirements grow on the order of $3^n/n$ as the number of variables n increases. The proposed algorithm instead extracts the prime implicants (PIs) that cover the minterms of a given Boolean function by deriving an implicant table based on frequency. From the set of extracted PIs, the algorithm then eliminates redundant PIs, again based on frequency. The proposed algorithm is therefore capable of minimizing circuits in polynomial time as n increases. When applied to various 3-variable and 4-variable cases, it swiftly and accurately obtained the optimal solutions.
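The selection-deletion idea (keep PIs that cover many minterms, drop the redundant ones) can be sketched as a frequency-driven cover over a PI table. The exact frequency rules of the paper are not reproduced here; this is the generic essential-PI-then-greedy pattern.

```python
def select_prime_implicants(pis, minterms):
    """Frequency-based cover selection. pis: {PI label: set of minterms}.

    First selects essential PIs (the sole cover of some minterm), then
    repeatedly picks the PI covering the most uncovered minterms; PIs
    that never add coverage are effectively deleted. Illustrative only.
    """
    uncovered = set(minterms)
    chosen = []
    # Essential PIs: a minterm covered by exactly one PI forces that PI.
    for m in minterms:
        covers = [p for p, s in pis.items() if m in s]
        if len(covers) == 1 and covers[0] not in chosen:
            chosen.append(covers[0])
            uncovered -= pis[covers[0]]
    # Greedy selection among the rest; redundant PIs are skipped.
    while uncovered:
        best = max(pis, key=lambda p: len(pis[p] & uncovered))
        if not pis[best] & uncovered:
            break
        chosen.append(best)
        uncovered -= pis[best]
    return chosen
```

For the classic table with PIs A={0,1}, B={1,2}, C={2,3}, A and C are essential and B is deleted as redundant.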

Multi-Level Optimization of Framed Structures Using Automatic Differentiation (자동미분을 이용한 뼈대구조의 다단계 최적설계)

  • Cho, Hyo-Nam;Chung, Jee-Sung;Min, Dae-Hong;Lee, Kwang-Min
    • Journal of Korean Society of Steel Construction
    • /
    • v.12 no.5 s.48
    • /
    • pp.569-579
    • /
    • 2000
  • An improved multi-level (IML) optimization algorithm for framed structures using automatic differentiation (AD) is proposed in this paper. For efficiency, the algorithm incorporates multi-level optimization techniques based on a decomposition method that separates system-level and element-level optimizations and utilizes an artificial constraint deletion technique. Also, to reduce the numerical effort, an efficient reanalysis technique is proposed that approximates structural responses, such as moments and frequencies, with respect to intermediate variables. Sensitivity analysis of the dynamic structural response is carried out by AD, a powerful technique for computing complex or implicit derivatives accurately and efficiently with minimal human effort. The efficiency and robustness of the IML algorithm, compared with a plain multi-level (PML) algorithm, are successfully demonstrated in numerical examples.


Weak D Testing is not Required for D- Patients With C-E- Phenotype

  • Choi, Sooin;Chun, Sejong;Lee, Hwan Tae;Yu, HongBi;Seo, Ji Young;Cho, Duck
    • Annals of Laboratory Medicine
    • /
    • v.38 no.6
    • /
    • pp.585-590
    • /
    • 2018
  • Background: Although testing to detect weak D antigens with the antihuman globulin reagent is not required for D- patients in many countries, it is routinely performed in Korea. However, weak D testing can be omitted for D- patients with a C-E- phenotype, as this indicates complete deletion of the RHD gene, except in rare cases. We designed a new algorithm for weak D testing, consisting of RhCE phenotyping followed by weak D testing of C+ or E+ samples, and compared it with the current algorithm with respect to time and cost-effectiveness. Methods: In this retrospective study, 74,889 test results from January to July 2017 in a tertiary hospital in Korea were analyzed. Agreement between the current and proposed algorithms was evaluated, and the total number of tests, the time required for testing, and test costs were compared. With both algorithms, RHD genotyping was conducted for samples that were C+ or E+ and negative on weak D testing. Results: The algorithms showed perfect agreement (agreement=100%; ${\kappa}=1.00$). By applying the proposed algorithm, 29.56% (115/389 tests/yr) of tests could be omitted, the time required for testing could be reduced by 36% (8,672/24,084 min/yr), and the test cost could be reduced by 16.53% (536.11/3,241.08 USD/yr). Conclusions: Our algorithm, which omits weak D testing for D- patients with a C-E- phenotype, may be a cost-effective testing strategy in Korea.

Verification Algorithm for the Duplicate Verification Data with Multiple Verifiers and Multiple Verification Challenges

  • Xu, Guangwei;Lai, Miaolin;Feng, Xiangyang;Huang, Qiubo;Luo, Xin;Li, Li;Li, Shan
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.15 no.2
    • /
    • pp.558-579
    • /
    • 2021
  • Cloud storage provides flexible data storage services that let data owners remotely outsource their data, reducing their storage operation and management costs. Outsourced data, however, raise security concerns for the data owner because of possible malicious deletion or corruption by the cloud service provider. Data integrity verification is an important way to check the integrity of outsourced data. However, existing data verification schemes only consider the case in which a single verifier launches multiple data verification challenges, and neglect the verification overhead when multiple verifiers launch challenges at around the same time. In this case, the duplicate data in multiple challenges are verified repeatedly, so verification resources are wasted. We propose a duplicate data verification algorithm based on multiple verifiers and multiple challenges to reduce this verification overhead. The algorithm dynamically schedules the verifiers' challenges based on verification time and on the frequent itemsets of duplicate verification data in the challenge sets, found by applying the FP-Growth algorithm, and computes batch proofs for the frequent itemsets. Each challenge is then split into two parts, duplicate data and unique data, according to the results of data extraction. Finally, the proofs of the duplicate data and unique data are computed and combined to generate a complete proof for every original challenge. Theoretical analysis and experimental evaluation show that the algorithm reduces the verification cost and ensures the correctness of data integrity verification through flexible batch data verification.
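The duplicate/unique split can be illustrated with plain frequency counting over the verifiers' challenge sets. The paper mines frequent itemsets with FP-Growth; simple counting stands in for it here.

```python
from collections import Counter

def split_challenges(challenges):
    """Split each verifier's challenge into (duplicate, unique) blocks.

    challenges: {verifier: set of block ids}. Blocks requested by two
    or more verifiers are "duplicate" data whose proof can be computed
    once and reused; the rest stay per-verifier. Illustrative stand-in
    for the paper's FP-Growth-based extraction.
    """
    freq = Counter(b for blocks in challenges.values() for b in blocks)
    dup = {b for b, c in freq.items() if c >= 2}
    return {v: (blocks & dup, blocks - dup) for v, blocks in challenges.items()}
```

The prover then computes one batch proof for the duplicate part and small per-verifier proofs for the unique parts, recombining them per original challenge.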

A Study on the Algorithm Development for Speech Recognition of Korean and Japanese (한국어와 일본어의 음성 인식을 위한 알고리즘 개발에 관한 연구)

  • Lee, Sung-Hwa;Kim, Hyung-Lae
    • Journal of IKEEE
    • /
    • v.2 no.1 s.2
    • /
    • pp.61-67
    • /
    • 1998
  • In this paper, speaker recognition experiments were performed with a multilayer feedforward neural network (MFNN) model using Korean and Japanese digits. Five adult males and five adult females pronounced the digits 0 to 9 in Korean and in Japanese seven times each. Characteristic coefficients were then extracted through a pitch deletion algorithm, LPC analysis, and LPC cepstral analysis to generate the input patterns for the MFNN. Five of the seven repetitions were used to train the neural network, and two were used to measure its performance. For both Korean and Japanese, the pitch coefficients performed about 4% better than the LPC or LPC cepstral coefficients.


Malware Containment Using Weight based on Incremental PageRank in Dynamic Social Networks

  • Kong, Jong-Hwan;Han, Myung-Mook
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.9 no.1
    • /
    • pp.421-433
    • /
    • 2015
  • Social network services based on the Internet have grown rapidly with the development of web technology, the prevalence of smartphones, and so on. Social networks allow users to convey information and news, so they have a great influence both on public opinion, formed by social interaction among users, and on the spread of information. On the other hand, these social networks also serve as perfect environments for rampant malware: malware spreads rapidly because relationships among users are built on trust. In this paper, an effective patch strategy is proposed to deal with malicious worms on social networks. A graph is formed to analyze the structure of a social network, and subgroups are formed in the graph for the distributed patch strategy. The weighted directions and activities between the nodes are taken into account to select reliable key nodes from the generated subgroups, and the Incremental PageRank algorithm, which reflects dynamic social network features (addition/deletion of users and links), is used to derive highly influential key nodes. By patching the derived key nodes, the proposed method can prevent worms from spreading over social networks.
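Key-node selection per subgroup can be sketched with an ordinary power-iteration PageRank. The paper's incremental variant, which updates ranks as users and links are added or deleted, is replaced here by full recomputation for brevity, and the edge weights/activities are omitted.

```python
def pagerank(adj, d=0.85, iters=50):
    """Plain power-iteration PageRank. adj: {node: [out-neighbors]}.

    Stand-in for the paper's Incremental PageRank: a real deployment
    would update ranks on node/link insertion and deletion instead of
    recomputing from scratch.
    """
    nodes = list(adj)
    n = len(nodes)
    rank = {u: 1 / n for u in nodes}
    for _ in range(iters):
        nxt = {u: (1 - d) / n for u in nodes}
        for u in nodes:
            out = adj[u]
            share = rank[u] / (len(out) if out else n)
            for v in (out if out else nodes):   # dangling: spread evenly
                nxt[v] += d * share
        rank = nxt
    return rank

def key_nodes(adj, subgroups):
    """Pick the highest-ranked node in each subgroup as a patch target."""
    r = pagerank(adj)
    return [max(g, key=r.get) for g in subgroups]
```

Patching only the top-ranked node of each subgroup is the distributed strategy: influential users are immunized first, cutting the worm's main propagation routes.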

Channel Allocation Strategies for Interference-Free Multicast in Multi-Channel Multi-Radio Wireless Mesh Networks

  • Yang, Wen-Lin;Hong, Wan-Ting
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.6 no.2
    • /
    • pp.629-648
    • /
    • 2012
  • Given a video stream delivery system deployed on a multicast tree embedded in a multi-channel multi-radio wireless mesh network, our problem is how to allocate interference-free channels to the tree links while maximizing the number of serviced mesh clients. In this paper, we propose a channel allocation (CA) heuristic algorithm based on best-first search and backtracking (BFB) techniques. The experimental results show that our BFB-based CA algorithm outperforms previous methods such as DFS- and BFS-based CA methods. This superiority is due to the backtracking technique used in the BFB approach: it allows previously channel-allocated links to select other eligible channels when no conflict-free channel can be found for the current link during the CA process. In addition, we propose a tree refinement method that enhances the quality of channel-allocated trees by adding uncovered destinations at the cost of deleting some covered destinations. The aim of this refinement is to increase the number of serviced mesh clients. According to our simulation results, it is an effective method for improving the multicast trees produced by the BFB, BFS, and DFS CA algorithms.
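The backtracking step of the CA process can be illustrated as constraint-driven assignment over the tree links. The best-first link ordering and scoring of the BFB approach are omitted, so this is only a sketch of the backtracking idea under an assumed pairwise-interference model.

```python
def allocate_channels(links, conflicts, channels):
    """Backtracking channel allocation for multicast-tree links.

    links: ordered list of link ids; conflicts: set of frozensets of
    interfering link pairs; channels: available channel ids. Returns a
    conflict-free {link: channel} assignment, or None if impossible.
    Illustrative sketch, not the paper's BFB ordering.
    """
    assign = {}

    def ok(link, ch):
        return all(assign.get(other) != ch
                   for other in links
                   if frozenset((link, other)) in conflicts)

    def solve(i):
        if i == len(links):
            return True
        for ch in channels:
            if ok(links[i], ch):
                assign[links[i]] = ch
                if solve(i + 1):
                    return True
                del assign[links[i]]      # backtrack: retry earlier choice
        return False

    return assign if solve(0) else None
```

A DFS/BFS allocator without the `del`-and-retry step would give up on the first link that has no free channel; backtracking is what lets earlier links trade channels to unblock it.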

Flexible Video Authentication based on Aggregate Signature

  • Shin, Weon;Hong, Young-Jin;Lee, Won-Young;Rhee, Kyung-Hyune
    • Journal of Korea Multimedia Society
    • /
    • v.12 no.6
    • /
    • pp.833-841
    • /
    • 2009
  • In this paper, we propose a flexible video authentication scheme based on aggregate signatures, which provides authenticity for digital video by means of cryptographic signatures and thereby guarantees users' rights. In contrast to previous works, the proposed scheme allows flexible use in content distribution systems: new content can be added to signed content, and parts of signed content can be deleted. A modification can be made by the content owner or by others. Even when content is modified by one or more users, our scheme guarantees each user's rights by aggregating the users' signatures. Moreover, the proposed scheme produces signatures half the size of Digital Signature Algorithm (DSA) signatures with comparable security.
