• Title/Summary/Keyword: Processing Method


Comparison of Two Methods for Size-interpolation on CRT Display : Analog Stimulus-Digital Response Vs. Digital Stimulus-Analog Response (CRT 표시장치에서 두 형태의 크기-내삽 추정 방법의 비교 연구 : 상사자극-계수 반응과 계수 자극-상사반응)

  • Ro, Jae-ho
    • Journal of Industrial Technology
    • /
    • v.14
    • /
    • pp.127-140
    • /
    • 1994
  • This study examined the accuracy and the error patterns produced when different methods were used in a size-interpolation task. Although the three methods employed the same input modality (visual) and output modality (manual responding), they differed in central processing: method 1 relied relatively more on verbal processing, method 2 relied relatively more on spatial processing, and method 3 required repeated code switching (verbal/spatial) while performing the task. A split-plot design was adopted in which the whole plot consisted of methods (3), orientations (horizontal, vertical), and base-line sizes (300, 500, 700 pixels), and the split plot consisted of target locations (1-99). The results showed an anchor effect and a range effect, and accuracy decreased in the order method 2, method 3, method 1. ANOVA showed that accuracy was significantly influenced by method, target location, and their interactions (method × location, size × location). Analysis of the error data, response times, and frequencies of under-, exact, and over-estimation indicated that a systematic error pattern arose in the task and that the methods changed not only performance but also that pattern. The results support the importance of multiple resources theory in accounting for S-C-R compatibility and task performance; they are discussed in terms of multiple resources theory, and guidelines for system design based on S-C-R compatibility are suggested.
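
For readers who want to run a comparable analysis, a minimal sketch follows; it approximates the reported method × location and size × location tests with an ordinary ANOVA in Python. The data file and column names are hypothetical, and the split-plot error structure is simplified away, so this is an illustration rather than the authors' analysis.

```python
# Minimal sketch, not the authors' analysis: an ordinary ANOVA that
# approximates the reported interaction tests. The CSV file and the
# column names (method, size, location, error) are hypothetical, and
# the split-plot (whole-plot vs. split-plot) error terms are ignored.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("interpolation_trials.csv")  # one row per trial (hypothetical)

model = ols("error ~ C(method) * C(location) + C(size) * C(location)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))        # F-tests for main effects and interactions
```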

Direct fault-tree modeling of human failure event dependency in probabilistic safety assessment

  • Ji Suk Kim;Sang Hoon Han;Man Cheol Kim
    • Nuclear Engineering and Technology
    • /
    • v.55 no.1
    • /
    • pp.119-130
    • /
    • 2023
  • Among the various elements of probabilistic safety assessment (PSA), human failure events (HFEs) and their dependencies are major contributors to the quantification of the risk of a nuclear power plant. Currently, the dependency among HFEs is reflected using a post-processing method in PSA, which has several drawbacks, such as limited propagation of minimal cutsets through the fault tree and improper truncation of minimal cutsets. In this paper, we propose a method to model HFE dependency directly in a fault tree using if-then-else logic. The proposed method is proven to be equivalent to the conventional post-processing method while addressing the drawbacks of the latter. We also developed a software tool to facilitate the implementation of the proposed method, considering the need to model the dependency between multiple HFEs. We applied the proposed method to a specific case to demonstrate the drawbacks of the conventional post-processing method and the advantages of the proposed method. When applied appropriately under specific conditions, direct fault-tree modeling of HFE dependency enhances the accuracy of risk quantification and facilitates the analysis of minimal cutsets.
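
To make the if-then-else dependency idea concrete, here is a minimal sketch (not the authors' tool or fault-tree format) in which the human error probability of a second HFE switches between a conditional and a nominal value depending on whether the first HFE occurred; the gate structure and all numbers are illustrative assumptions.

```python
# Minimal sketch of the if-then-else dependency idea, not the authors' tool:
# HFE2's probability switches on the state of HFE1. All numbers are illustrative.
from itertools import product

P_HFE1 = 1e-2             # nominal HEP of the first human failure event
P_HFE2_GIVEN_HFE1 = 0.5   # conditional HEP when HFE1 has occurred
P_HFE2_NOMINAL = 1e-2     # HEP when HFE1 has not occurred

def top_event(hfe1: bool, hfe2: bool) -> bool:
    """Example top logic: an AND gate over the two HFEs."""
    return hfe1 and hfe2

def state_probability(hfe1: bool, hfe2: bool) -> float:
    """If-then-else: pick HFE2's probability based on HFE1's state."""
    p1 = P_HFE1 if hfe1 else 1.0 - P_HFE1
    p2_occurs = P_HFE2_GIVEN_HFE1 if hfe1 else P_HFE2_NOMINAL
    p2 = p2_occurs if hfe2 else 1.0 - p2_occurs
    return p1 * p2

p_top = sum(state_probability(a, b)
            for a, b in product([True, False], repeat=2) if top_event(a, b))
print(f"Top event probability with dependency: {p_top:.3e}")   # 5.000e-03
```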

Study on the Drug Processing of the Roots of Aconitum carmichaeli (바꽃(烏頭)의 포제(抱製)에 관한 연구)

  • Seong, Man-Jun;Lee, Kye-Suk;Cho, Sun-Hee;Lee, Go-Hoon;Kang, OK-Hwa;Kwon, Dong-Yeul
    • Herbal Formula Science
    • /
    • v.13 no.2
    • /
    • pp.141-151
    • /
    • 2005
  • Of the tuberous roots of Aconitum carmichaeli Debx. (Ranunculaceae), the main root is called the common monkshood mother root and the lateral root is called the prepared aconite root. Depending on the processing applied after the soil is removed, the prepared aconite root is divided into yeombuja (aconite root preserved in salt) and heuksoonpyeon (baekbupyeon); both are ways of processing the aconite root without damaging it. Freshly harvested raw aconite root spoils easily, so it is preserved in salt until crystals form on its surface and is stored and transported in the firmly solidified yeombuja state. Yeombuja therefore requires desalting and further processing before use, whereas heuksoonpyeon or baekbupyeon can be used immediately. Studies of these techniques are needed if the unique processing methods of our ancestors are to be handed down. Prepared aconite root is generally used as an essential medicine for yang depletion syndrome, kidney-yang deficiency syndrome, and obstruction of qi in the chest, but it is a toxic substance. The contents of processing should first be understood broadly through the literature on processing methods, and on that basis the records should be systematically organized and verified, with contemporary scientific theory and technology added, so that the traditional processing technology can be developed to maximize the therapeutic effect and safety of prepared aconite root. In this study, the historical data and records on the processing of the lateral root of Aconitum carmichaeli Debx. are reorganized to contribute to the standardization of medicinal herbs, the maximization of efficacy, and the minimization of side effects.

Ontology data processing method in distributed semantic web environment (분산 시맨틱웹 환경에서의 온톨로지 데이터 처리 기법 연구)

  • Kim, Byung-Gon;Oh, Sung-Kyun
    • Journal of Digital Contents Society
    • /
    • v.9 no.2
    • /
    • pp.277-284
    • /
    • 2008
  • As users' demands on internet web services increase, ontologies are becoming increasingly important for constructing the semantic web. Early internet data processing was studied in the form of data integration through centralized ontology construction. However, because the internet is a distributed environment, integrating the data of distributed sites calls for peer-to-peer data processing at each site so that the rapid changes of the internet can be accommodated. In this paper, we propose a data processing method for a distributed environment in which each site constructs its ontology with the ontology language OWL. Furthermore, through a relational representation of OWL, we propose a system that supports distributed query processing over data constructed at different sites with different methods.
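
As a rough illustration of the relational-representation and per-site query ideas (not the authors' system), the sketch below stores each site's triples in its own relational table and merges the partial results of the same selection across sites; the table layout, predicates, and data are assumptions.

```python
# Minimal sketch, assuming a simple (subject, predicate, object) table per site;
# an illustration of the idea only, not the authors' system.
import sqlite3

def make_site(triples):
    """Each distributed site keeps its OWL/RDF statements as (s, p, o) rows."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE triple (s TEXT, p TEXT, o TEXT)")
    db.executemany("INSERT INTO triple VALUES (?, ?, ?)", triples)
    return db

site_a = make_site([("ex:Seoul", "rdf:type", "ex:City")])
site_b = make_site([("ex:Busan", "rdf:type", "ex:City")])

def distributed_query(sites, predicate, obj):
    """Send the same selection to every site and merge the partial results."""
    result = set()
    for db in sites:
        rows = db.execute("SELECT s FROM triple WHERE p = ? AND o = ?", (predicate, obj))
        result.update(s for (s,) in rows)
    return result

print(distributed_query([site_a, site_b], "rdf:type", "ex:City"))
```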

Design and Implementation of Algorithms for the Motion Detection of Vehicles using Hierarchical Motion Estimation and Parallel Processing (계층화 모션 추정법과 병렬처리를 이용한 차량 움직임 측정 알고리즘 개발 및 구현)

  • 강경훈;정성태;이상설;남궁문
    • Journal of Korea Multimedia Society
    • /
    • v.6 no.7
    • /
    • pp.1189-1199
    • /
    • 2003
  • This paper presents a new method for detecting the motion of vehicles using hierarchical motion estimation and parallel processing. The road image is captured with a CMOS sensor and divided into small blocks, and the motion of each block is detected with a block-matching method based on hierarchical motion estimation and parallel processing for real-time operation. The parallelism is achieved with pipelining and a data-flow technique. The proposed method has been implemented on an embedded system: the block-matching algorithm was implemented on PLDs (Programmable Logic Devices) and the clustering algorithm on an ARM processor. Experimental results show that the proposed system detects the motion of vehicles in real time.
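
The coarse-to-fine block matching described above can be sketched in a few lines of Python/NumPy. This is an illustrative software model only; the block size, search ranges, and synthetic frames are assumptions, not the PLD/ARM implementation reported in the paper.

```python
# Minimal sketch of hierarchical (coarse-to-fine) block matching with SAD;
# block size, search ranges, and the synthetic frames are assumptions.
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two blocks."""
    return np.abs(a.astype(int) - b.astype(int)).sum()

def match_block(prev, curr, y, x, bs, rng, init=(0, 0)):
    """Search around an initial motion vector within +/- rng pixels."""
    block = curr[y:y + bs, x:x + bs]
    best_cost, best_mv = None, init
    for dy in range(init[0] - rng, init[0] + rng + 1):
        for dx in range(init[1] - rng, init[1] + rng + 1):
            yy, xx = y + dy, x + dx
            if 0 <= yy <= prev.shape[0] - bs and 0 <= xx <= prev.shape[1] - bs:
                cost = sad(block, prev[yy:yy + bs, xx:xx + bs])
                if best_cost is None or cost < best_cost:
                    best_cost, best_mv = cost, (dy, dx)
    return best_mv

def hierarchical_mv(prev, curr, y, x, bs=16):
    """Level 1: coarse search on 2:1 downsampled frames; level 0: refine at full resolution."""
    cy, cx = match_block(prev[::2, ::2], curr[::2, ::2], y // 2, x // 2, bs // 2, rng=4)
    return match_block(prev, curr, y, x, bs, rng=2, init=(cy * 2, cx * 2))

np.random.seed(0)
prev = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
curr = np.roll(prev, shift=(2, 4), axis=(0, 1))    # simulate a global shift of (2, 4)
print(hierarchical_mv(prev, curr, 16, 16))         # -> (-2, -4): block found 2 up, 4 left in prev
```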

Monochromatic Image Analysis of Elastohydrodynamic Lubrication Film Thickness by Fringe Intensity Computation

  • Jang, Siyoul
    • Journal of Mechanical Science and Technology
    • /
    • v.17 no.11
    • /
    • pp.1704-1713
    • /
    • 2003
  • Point-contact film thickness in elastohydrodynamic lubrication (EHL) is analyzed with an image processing method applied to images from an optical interferometer using monochromatic incident light. Interference between the light reflected from the semi-reflective Cr coating of the glass disk and the light reflected from the super-finished ball produces circular fringes that depend on the contact conditions, such as sliding velocity, applied load, pressure-viscosity characteristics, and the viscosity of the lubricant at ambient pressure. The film thickness corresponds to the optical path difference between these reflected beams, which produces the dark and bright fringes seen under monochromatic light. The film thickness is computed by numbering the dark and bright fringe orders, and the intensity (gray-scale value) within each fringe regime is mapped to the corresponding thickness. In this work we developed a measuring technique for EHL film thickness by dividing the image patterns into two typical types under monochromatic incident light. During the image processing, the captured image is converted into digital data over the contact area without any loss of the interferogram information, and it is interpreted consistently regardless of the observer's experimental experience. The developed image processing method is expected to provide a valuable basis for an image processing technique for color fringes, which is generally used to measure relatively thin films at higher resolution.
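
As a worked illustration of mapping fringe order and gray level to thickness (a simplified model, not the paper's calibration): for two-beam interference under monochromatic light the intensity varies roughly as the squared sine of the optical phase, so a normalized gray level within one fringe can be inverted to a fractional order. The wavelength, refractive index, and gray levels below are assumptions, and the phase change at the Cr layer is ignored.

```python
# Minimal sketch of fringe-order + intensity -> film thickness for monochromatic
# two-beam interference. Wavelength, refractive index, and gray levels are
# assumptions; the phase change at the Cr half-mirror is ignored.
import numpy as np

LAMBDA = 600e-9   # wavelength of the monochromatic light, m (assumed)
N_OIL = 1.5       # refractive index of the lubricant film (assumed)

def film_thickness(fringe_order, gray, gray_dark, gray_bright):
    """Map a gray level between the local dark/bright extrema of one fringe
    regime to a fractional fringe order, then to film thickness."""
    t = np.clip((gray - gray_dark) / (gray_bright - gray_dark), 0.0, 1.0)
    frac = np.arcsin(np.sqrt(t)) / np.pi      # I ~ sin^2(2*pi*n*h/lambda), inverted locally
    return (fringe_order + frac) * LAMBDA / (2.0 * N_OIL)

print(f"{film_thickness(2, gray=140, gray_dark=40, gray_bright=220) * 1e9:.1f} nm")  # -> 453.5 nm
```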

The Performance Evaluation of a Method to Process Nearest Neighbor Queries Using an Optimal Search Distance (최적탐색거리를 이용한 최소근접질의 처리 방법의 성능 평가)

  • Seon, Hwi-Jun;Kim, Hong-Gi
    • The Transactions of the Korea Information Processing Society
    • /
    • v.6 no.1
    • /
    • pp.32-41
    • /
    • 1999
  • In spatial database systems, the nearest neighbor query occurs frequently and incurs a higher processing cost than other spatial queries. The cost of processing a nearest neighbor query can be optimized by minimizing the number of index nodes to be searched. The optimal search distance is proposed as a search-distance measure for accurately selecting the nodes to be searched in a nearest neighbor query. In this paper, we prove properties of the optimal search distance in N dimensions and show through experiments that the query processing performance of our method is superior to that of a method using the maximum search distance.
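
The pruning idea can be illustrated with a small branch-and-bound search over an R-tree-like index: nodes whose minimum distance to the query point already exceeds the current search distance are never visited. The paper's optimal search distance itself is not reproduced here; the sketch uses the familiar MINDIST bound, and the index layout and data are assumptions.

```python
# Minimal sketch of branch-and-bound nearest neighbor search with MINDIST
# pruning over an R-tree-like index; the paper's optimal search distance is
# not reproduced, and the index layout and data are assumptions.
import math

def mindist(point, rect):
    """Minimum distance from a point to a rectangle ((xmin, ymin), (xmax, ymax))."""
    (xmin, ymin), (xmax, ymax) = rect
    dx = max(xmin - point[0], 0.0, point[0] - xmax)
    dy = max(ymin - point[1], 0.0, point[1] - ymax)
    return math.hypot(dx, dy)

def nearest(node, q, best=(math.inf, None)):
    """node is ('leaf', [points]) or ('inner', [(rect, child), ...])."""
    kind, entries = node
    if kind == "leaf":
        for p in entries:
            d = math.dist(q, p)
            if d < best[0]:
                best = (d, p)
        return best
    # visit children in increasing MINDIST order, pruning with the current best distance
    for d, _, child in sorted((mindist(q, r), i, c) for i, (r, c) in enumerate(entries)):
        if d >= best[0]:
            break                       # every remaining node is farther than the current best
        best = nearest(child, q, best)
    return best

leaf1 = ("leaf", [(1, 1), (2, 3)])
leaf2 = ("leaf", [(8, 8), (9, 5)])
root = ("inner", [(((0, 0), (3, 4)), leaf1), (((7, 4), (10, 9)), leaf2)])
print(nearest(root, (2, 2)))            # -> (1.0, (2, 3)); the second subtree is pruned
```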

The Processing Method of Nearest Neighbor Queries Considering a Circular Location Property of Object (객체의 순환적 위치속성을 고려한 최대근접질의의 처리방법)

  • Seon, Hwi-Joon
    • Journal of Korea Spatial Information System Society
    • /
    • v.11 no.4
    • /
    • pp.85-88
    • /
    • 2009
  • In multimedia database systems, the nearest neighbor query occurs frequently and incurs a higher processing cost than other spatial queries. Optimizing the cost of a nearest neighbor query requires a search-distance measure that minimizes the number of index nodes searched and the computation time. The circular location property of objects must be considered to accurately select the nodes to be searched in a nearest neighbor query. In this paper, we propose a method for processing nearest neighbor queries that takes into account the circular location property of objects, where the search space consists of a circular domain, and we describe its characteristics. The proposed method uses the circular minimum distance and the circular optimal distance as search measures for optimizing the processing cost of nearest neighbor queries.
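
A minimal sketch of the wrap-around distance underlying a circular domain is given below. It illustrates only the metric, not the circular minimum/optimal distances used for index pruning in the paper; the period and data are assumptions.

```python
# Minimal sketch of distance on a circular domain (e.g., angles in [0, 360)):
# the shorter way around the circle is taken. Period and data are assumptions,
# and no index pruning is shown here.
def circular_distance(a, b, period=360.0):
    """Minimum separation of two positions on a circular axis."""
    d = abs(a - b) % period
    return min(d, period - d)

def circular_nearest(query, objects, period=360.0):
    """Linear-scan nearest neighbor under the circular metric (no index)."""
    return min(objects, key=lambda o: circular_distance(query, o, period))

points = [10.0, 170.0, 355.0]
print(circular_nearest(0.0, points))   # -> 355.0 (circular distance 5, not the linear 355)
```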

On Post-Processing of Coded Images by Using the Narrow Quantization Constraint (협 양자화 제약 조건을 이용한 부호화된 영상의 후처리)

  • 박섭형;김동식;이상훈
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.22 no.4
    • /
    • pp.648-661
    • /
    • 1997
  • This paper presents a new method for post-processing coded images based on low-pass filtering followed by projection onto the NQCS (narrow quantization constraint set), and investigates how the proposed method works on JPEG-coded real images. The starting point of QCS-based post-processing techniques is the centroid of the QCS, to which the original image belongs. Low-pass filtering followed by projection onto the QCS places the image on the boundary of the QCS, whereas the original image is likely to lie inside the QCS; hence projection onto the NQCS gives a lower MSE (mean square error) than projection onto the QCS. Simulation results show that setting the narrowing coefficient of the NQCS to 0.2 yields the best performance in most cases. Even if the JPEG-coded image is low-pass filtered and projected onto the NQCS repeatedly, there is no guarantee that the resulting image has a lower MSE or is closer to the original image, so a single iteration is sufficient for post-processing the coded images. This is significant because the main drawback of iterative post-processing techniques is their heavy computational burden; the single-iteration method reduces this burden and offers an easy way to implement a real-time VLSI post-processor.
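
One plausible reading of the projection step is sketched below (not the authors' code): each low-pass-filtered DCT coefficient is clipped back into a quantization interval whose width has been narrowed by the coefficient 0.2 around the decoded value. The quantization step, the coefficient values, and the exact definition of the narrowing coefficient are assumptions.

```python
# Minimal sketch of projection onto a narrowed quantization constraint set
# (NQCS). One plausible reading of the narrowing coefficient is used; the
# quantization step and coefficient values are assumptions.
import numpy as np

def project_nqcs(filtered_coeffs, quant_indices, q_step, narrow=0.2):
    """Clip each filtered DCT coefficient into a narrowed interval
    [c - narrow*q/2, c + narrow*q/2] centered on the decoded value c."""
    center = quant_indices * q_step                 # dequantized coefficients
    half = narrow * q_step / 2.0
    return np.clip(filtered_coeffs, center - half, center + half)

q_step = 16.0
indices = np.array([5.0, -2.0, 0.0])                # quantization indices from the bitstream
filtered = np.array([86.0, -38.0, 1.5])             # coefficients after low-pass filtering
print(project_nqcs(filtered, indices, q_step))      # -> values 81.6, -33.6, 1.5
```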

A Study on the Development of Automatic Ship Berthing System (선박 자동접안시스템 구축을 위한 기초연구)

  • Kim, Y.B.;Choi, Y.W.;Chae, G.H.
    • Journal of Power System Engineering
    • /
    • v.10 no.4
    • /
    • pp.139-146
    • /
    • 2006
  • In this paper, a vector code correlation (VCC) method and an algorithm for improving image processing performance are described for building an effective camera-based measurement system that automatically berths and controls a ship equipped with side thrusters. To realize automatic ship berthing, the berthing assistance system on the ship must continuously track a target on the berth and measure the distance to the target and the ship's attitude so that the ship can be moved to the specified location. The system consists of four components: a CCD camera, a camera direction controller, an ordinary PC with a built-in image processing board, and a signal conversion unit connected to the parallel port of the PC. The objective of this paper is to reduce the image processing time so that the berthing system can maintain a safe schedule against risks while approaching the berth. This is achieved by composing a vector code image that uses the gradient of a plane approximating the brightness of the pixels in a region of the image, and its effectiveness is verified on a commonly used PC. Experimental results show that the proposed method can be applied to a measurement system for automatic ship berthing and improves the image processing time roughly fourfold compared with the typical template matching method.
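
In the spirit of the vector-code idea described above (quantizing local brightness gradients into a small code alphabet and correlating codes instead of raw gray levels), the sketch below is an illustration only: the per-pixel gradient coding, the block size, and the matching rule are assumptions rather than the paper's VCC definition.

```python
# Minimal sketch in the spirit of vector code correlation: local brightness
# gradients are quantized into 4 direction codes and a template is matched by
# the fraction of agreeing codes. Coding scheme and data are assumptions.
import numpy as np

def vector_code(image):
    """Quantize each pixel's brightness-gradient direction into one of 4 codes."""
    gy, gx = np.gradient(image.astype(float))
    angle = np.arctan2(gy, gx)                      # gradient direction per pixel
    return ((angle + np.pi) / (np.pi / 2)).astype(int) % 4

def code_correlation(code_img, code_tmpl, y, x):
    """Fraction of matching codes between the template and an image region."""
    h, w = code_tmpl.shape
    return float(np.mean(code_img[y:y + h, x:x + w] == code_tmpl))

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (48, 48), dtype=np.uint8)   # stand-in camera frame
template = frame[10:26, 20:36]                           # target patch to track
codes_f, codes_t = vector_code(frame), vector_code(template)

scores = [(code_correlation(codes_f, codes_t, y, x), (y, x))
          for y in range(0, 32) for x in range(0, 32)]
print(max(scores)[1])                                    # -> (10, 20), the target's location
```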
