• Title/Summary/Keyword: processing methods

Analysis of Implementing Mobile Heterogeneous Computing for Image Sequence Processing

  • BAEK, Aram;LEE, Kangwoon;KIM, Jae-Gon;CHOI, Haechul
    • KSII Transactions on Internet and Information Systems (TIIS) / v.11 no.10 / pp.4948-4967 / 2017
  • On mobile devices, image sequences are widely used for multimedia applications such as computer vision, video enhancement, and augmented reality. However, real-time processing on mobile devices remains a challenge because of their resource constraints and the demand for higher-resolution images. Recently, heterogeneous computing methods that utilize both a central processing unit (CPU) and a graphics processing unit (GPU) have been researched to accelerate image sequence processing. This paper deals with various optimization techniques, such as parallel processing by the CPU and GPU, distributed processing on the CPU, frame buffer objects, and double buffering for parallel and/or distributed tasks. Using these optimization techniques both individually and in combination, several heterogeneous computing structures were implemented and their effectiveness was analyzed. The experimental results show that heterogeneous computing enables execution up to 3.5 times faster than CPU-only processing.
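
The double-buffering idea the paper combines with CPU/GPU parallelism can be sketched generically: one stage produces frames while the other consumes them, with two buffers in flight so neither stage idles. This is an illustrative Python sketch, not the paper's mobile implementation; all names and the frame/kernel stand-ins are hypothetical:

```python
import threading
import queue

def produce_frames(n_frames, buffers):
    # CPU stage: simulate per-frame preprocessing, hand off to the next stage
    for i in range(n_frames):
        frame = [i] * 4            # stand-in for a decoded image
        buffers.put(frame)         # blocks while both buffers are in use
    buffers.put(None)              # sentinel: no more frames

def consume_frames(buffers, results):
    # "GPU" stage: runs concurrently with the producer
    while (frame := buffers.get()) is not None:
        results.append(sum(frame))  # stand-in for a GPU kernel

results = []
buffers = queue.Queue(maxsize=2)   # double buffering: at most 2 frames in flight
worker = threading.Thread(target=consume_frames, args=(buffers, results))
worker.start()
produce_frames(8, buffers)
worker.join()
```

Because the queue holds at most two frames, the producer can fill one buffer while the consumer drains the other, which is the overlap that hides per-stage latency.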

Fast Motion Artifact Correction Using the l$_1$-norm

  • Zho, Sang-Young;Kim, Eung-Yeop;Kim, Dong-Hyun
    • Investigative Magnetic Resonance Imaging / v.13 no.1 / pp.22-30 / 2009
  • Purpose: Patient motion during magnetic resonance (MR) imaging is a major problem owing to the long scan time. Entropy-based post-processing motion correction techniques have been shown to correct motion artifacts effectively. One of the main limitations of these techniques, however, is their long processing time. In this study, we propose several methods to reduce this processing time effectively. Materials and Methods: To reduce the processing time, we used the separability property of the two-dimensional Fourier transform (2-D FT). Also, a computationally light metric (the sum of all image pixel intensities) was used instead of the entropy criterion. Finally, partial Fourier reconstruction, in particular the projection onto convex sets (POCS) method, was incorporated, thereby reducing the size of the data to be processed and corrected. Results: The time savings of each proposed method are presented for brain images of different data sizes. In vivo data processed using the proposed methods showed similar image quality. The total processing time was reduced to 15% for two-dimensional images and 30% for three-dimensional images. Conclusion: The proposed methods can be useful in reducing image motion artifacts when only post-processing motion correction algorithms are available. They can also be combined with parallel imaging techniques to further reduce processing times.
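
The separability property the authors exploit can be shown in a few lines: a 2-D Fourier transform factors into 1-D transforms along each axis, so a correction that only perturbs one k-space axis only requires redoing that axis's 1-D pass. A minimal NumPy sketch (illustrative, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.standard_normal((8, 8))

# Direct 2-D FFT
direct = np.fft.fft2(img)

# Separable form: 1-D FFT along rows, then 1-D FFT along columns.
# In motion correction, a candidate phase correction typically affects
# only one axis, so only that 1-D pass needs recomputation per candidate.
separable = np.fft.fft(np.fft.fft(img, axis=1), axis=0)

assert np.allclose(direct, separable)
```

Reusing the unchanged 1-D pass across candidate corrections is where the claimed time saving comes from.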

Holographic femtosecond laser processing

  • Hayasaki, Yoshio
    • Proceedings of the Optical Society of Korea Conference / 2008.07a / pp.61-63 / 2008
  • Parallel femtosecond laser processing using a computer-generated hologram (CGH) displayed on a liquid crystal spatial light modulator (LCSLM) is demonstrated. The use of the LCSLM enables arbitrary and variable patterning. This holographic femtosecond laser processing has the advantages of high throughput and high light-use efficiency. A critical issue is precisely controlling the intensities of the diffraction peaks of the CGH. We demonstrate several methods for controlling the diffraction peaks, as well as laser processing with two-dimensional and three-dimensional parallelism.

FUZZY PROCESSING BASED ON ALPHA-CUT MAPPING

  • Stoica, Adrian
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 1993.06a / pp.1266-1269 / 1993
  • This paper introduces a new method for fuzzy processing. The method allows handling a piece of information that is lost in the classic fuzzification process and is thus neglected by other methods. Processing the result after fuzzification is supported by the interpretation that the input-output set mapping specified by the IF-THEN rules can be regarded as a direct mapping of their corresponding alpha-cuts. Processing involves only singletons as intermediate results, the final result being a combination of the singletons obtained from the fired rules.
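
As a rough illustration of the alpha-cut concept the abstract relies on: an alpha-cut turns a fuzzy set into the crisp set of elements whose membership reaches a threshold alpha. A minimal sketch with a hypothetical fuzzy set (the "warm" set and its membership values are invented for illustration):

```python
def alpha_cut(fuzzy_set, alpha):
    """Crisp set of elements whose membership is at least alpha."""
    return {x for x, mu in fuzzy_set.items() if mu >= alpha}

# hypothetical fuzzy set "warm" over temperatures in degrees Celsius
warm = {18: 0.2, 20: 0.5, 22: 0.8, 24: 1.0, 26: 0.8}

alpha_cut(warm, 0.8)  # yields the crisp set {22, 24, 26}
alpha_cut(warm, 1.0)  # yields the core, {24}
```

Mapping alpha-cuts of the input sets directly to alpha-cuts of the output sets is the interpretation of the IF-THEN rules that the paper builds on.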

Optimal Control of Large-Scale Dynamic Systems using Parallel Processing

  • Park, Ki-Hong
    • Journal of Institute of Control, Robotics and Systems / v.5 no.4 / pp.403-410 / 1999
  • In this study, a parallel algorithm has been developed that can quickly solve the optimal control problem of large-scale dynamic systems. The algorithm adopts the sequential quadratic programming method and achieves domain-decomposition-type parallelism in computing sensitivities for the search-direction computation. A silicon wafer thermal process problem was solved using the algorithm, and a parallel efficiency of 45% was achieved with 16 processors. Practical methods for further reducing the computation time were also investigated.
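
The parallelism pattern described, decomposing the sensitivity computation so each processor handles its own share of the decision variables, can be sketched generically. This toy Python sketch uses finite-difference sensitivities on an invented objective; the paper's actual algorithm applies SQP to a wafer thermal process model:

```python
from concurrent.futures import ThreadPoolExecutor

def objective(x):
    # stand-in for an expensive dynamic-system simulation
    return sum(xi ** 2 for xi in x)

def sensitivity(x, i, h=1e-6):
    # forward-difference sensitivity with respect to decision variable i
    xp = list(x)
    xp[i] += h
    return (objective(xp) - objective(x)) / h

def gradient_parallel(x, workers=4):
    # each worker evaluates sensitivities for its own block of variables,
    # mirroring the domain-decomposition split across processors
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda i: sensitivity(x, i), range(len(x))))

g = gradient_parallel([1.0, 2.0, 3.0])  # approximately [2.0, 4.0, 6.0]
```

Since each sensitivity evaluation is independent, the speedup is limited mainly by the cost of the shared objective evaluations and communication, which is consistent with the sub-linear (45%) efficiency reported.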

Flange Panel Forming using Roll Seaming Method

  • 박환서;유송민;이동규;이위로;노대호
    • Proceedings of the Korean Society of Machine Tool Engineers Conference / 2004.10a / pp.10-15 / 2004
  • A machining process for flange panels is introduced. In contrast to conventional methods such as forming and welding, a roll-seaming method is used for better quality at lower cost. Several measurement methods, including digital image processing, were used to confirm product quality.

Electron collision cross sections of molecules relevant to plasma processing

  • Jo, Hyeok
    • Proceedings of the Korean Vacuum Society Conference / 2010.08a / pp.34-34 / 2010
  • Absolute electron-impact cross sections for molecular targets, including their radicals, are important for developing plasma reactors and testing various plasma processing gases. However, low-energy electron collision data for these gases are sparse, and only limited cross-section data are available. In this presentation, the methods and status of measurements of mainly absolute elastic cross sections for electron-polyatomic-molecule collisions are discussed, together with recent results from Chungnam National University. Elastic cross sections are essential for the absolute-scale conversion of inelastic cross sections, as well as for testing computational methods.

A Study on Recommendation Methods in Web Services: Existing Solutions and Their Limitations

  • Nasridinov, Aziz;Byun, Jeong-Yong
    • Proceedings of the Korea Information Processing Society Conference / 2014.04a / pp.606-607 / 2014
  • Because of Web Services' platform- and language-independent nature, many business corporations have used them to integrate various applications. However, the growing number of Web Services available on the Web poses a new problem: how to select and recommend an appropriate Web Service that matches user requirements. In this paper, we investigate recommendation methods for Web Services and discuss their strengths and limitations.

Comparison between Word Embedding Techniques in Traditional Korean Medicine for Data Analysis: Implementation of a Natural Language Processing Method

  • Oh, Junho
    • Journal of Korean Medical classics / v.32 no.1 / pp.61-74 / 2019
  • Objectives: The purpose of this study is to help select an appropriate word embedding method when analyzing East Asian traditional medicine texts as data. Methods: Based on prescription data that embody traditional methods of East Asian medicine, we examined four count-based and two prediction-based word embedding methods. To compare these word embedding methods intuitively, we proposed a "prescription generating game" and compared its results with those from applying the six methods. Results: When adjacent vectors are extracted, the count-based word embedding methods derive the main herbs that are frequently used in conjunction with each other. The prediction-based word embedding methods, on the other hand, derive synonyms of the herbs. Conclusions: Count-based word embedding methods appear more effective than prediction-based methods for analyzing how herbs are used in combination. Among the count-based methods, the TF vector tends to exaggerate the frequency effect, so the TF-IDF vector or co-word vector may be a more reasonable choice. The t-score vector may be recommended when searching for unusual information that cannot be found through frequency alone. Prediction-based embedding, in turn, appears effective for deriving terms with similar meanings in context.
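
A co-word vector of the kind compared in this study can be built by counting, for each herb, how often it appears in prescriptions alongside every other herb; herbs with similar co-occurrence profiles then receive high cosine similarity. A minimal sketch with invented prescription data (the herb names and prescriptions are hypothetical, not the study's corpus):

```python
from collections import Counter
from itertools import combinations
from math import sqrt

# hypothetical prescriptions: each is a set of herb names
prescriptions = [
    {"ginseng", "licorice", "ginger"},
    {"ginseng", "licorice", "jujube"},
    {"ephedra", "cinnamon", "licorice"},
]

# co-word vector: herb -> counts of the herbs it appears with
cooc = {}
for p in prescriptions:
    for a, b in combinations(sorted(p), 2):
        cooc.setdefault(a, Counter())[b] += 1
        cooc.setdefault(b, Counter())[a] += 1

def cosine(u, v):
    # cosine similarity between two sparse count vectors
    keys = set(u) | set(v)
    dot = sum(u[k] * v[k] for k in keys)
    nu = sqrt(sum(c * c for c in u.values()))
    nv = sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv)

# herbs that co-occur with similar partners get similar vectors
sim = cosine(cooc["ginseng"], cooc["jujube"])
```

The prediction-based alternatives the study examines (e.g. skip-gram style models) would instead learn dense vectors by predicting neighboring herbs, which is why they surface synonyms rather than frequent partners.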

Image Processing-based Validation of Unrecognizable Numbers in Severely Distorted License Plate Images

  • Jang, Sangsik;Yoon, Inhye;Kim, Dongmin;Paik, Joonki
    • IEIE Transactions on Smart Processing and Computing / v.1 no.1 / pp.17-26 / 2012
  • This paper presents an image processing-based validation method for unrecognizable numbers in severely distorted license plate images which have been degraded by various factors including low-resolution, low light-level, geometric distortion, and periodic noise. Existing vehicle license plate recognition (LPR) methods assume that most of the image degradation factors have been removed before performing the recognition of printed numbers and letters. If this is not the case, conventional LPR becomes impossible. The proposed method adopts a novel approach where a set of reference number images are intentionally degraded using the same factors estimated from the input image. After a series of image processing steps, including geometric transformation, super-resolution, and filtering, a comparison using cross-correlation between the intentionally degraded reference and the input images can provide a successful identification of the visually unrecognizable numbers. The proposed method makes it possible to validate numbers in a license plate image taken under low light-level conditions. In the experiment, using an extended set of test images that are unrecognizable to human vision, the proposed method provides a successful recognition rate of over 95%, whereas most existing LPR methods fail due to the severe distortion.
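
The core matching step described, degrading reference templates with the same factors estimated from the input and then comparing by cross-correlation, can be sketched as follows. The degradation model, template data, and names here are illustrative stand-ins, not the authors' pipeline:

```python
import numpy as np

def ncc(a, b):
    # zero-mean normalized cross-correlation between equal-size patches
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)
templates = {d: rng.standard_normal((8, 8)) for d in range(10)}  # reference "digits"
blur = lambda im: (im + np.roll(im, 1, axis=1)) / 2              # stand-in degradation

# degrade the references with the same factors estimated from the input
observed = blur(templates[7]) + 0.05 * rng.standard_normal((8, 8))
degraded_refs = {d: blur(t) for d, t in templates.items()}

# pick the digit whose degraded reference correlates best with the input
best = max(degraded_refs, key=lambda d: ncc(degraded_refs[d], observed))
```

Degrading the references rather than restoring the input sidesteps the ill-posed deblurring problem, which is the key design choice behind the method's robustness to severe distortion.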
