• Title/Summary/Keyword: 거리행렬 (distance matrix)

Search Results: 207, processing time: 0.027 seconds

On Slimming down the Functions Room of Light Rail Transit Stations by Utilizing an Enhanced DSM Method (개선된 DSM 기법을 통한 경전철 정거장 기능실의 슬림화에 관한 연구)

  • Kim, Joo-Uk; Park, Kee-Jun; Kim, Young-Min; Lee, Jae-Chon
    • Journal of the Korea Academia-Industrial cooperation Society / v.16 no.2 / pp.927-939 / 2015
  • The rapid advance in technology has broadened the variety of rail systems technology, fostering new business opportunities in the rail industry. Rail systems operations run mainly in two directions. In one, long-distance operations between mega cities are pursued with the help of high-speed trains under development. In the other, relatively short-distance operations covering intra-city or suburban areas are becoming popular. A good example of the latter is light rail transit (LRT) systems. Owing to the short operating distance, both the development and operation costs of LRT systems are expected to be reduced to some extent. The desired cost reduction can be gained by scaling down the sizes of both the trains and the stations compared with those of normal rail systems. However, it is not well known how LRT stations can be scaled down. The objective of this paper is to study how to slim down the stations (particularly, the functions room) of LRT systems. To achieve this, an approach is studied based on a modified design structure matrix (DSM) method. Specifically, using the enhanced DSM method, an integrated architecture is developed for the functions room, which houses the equipment performing the electricity, signaling, and communication functions of LRT stations. The results indicate that the desired reduction can be obtained with the approach taken in the paper.
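
The abstract does not spell out the enhanced DSM procedure, but the underlying idea can be illustrated: a DSM records dependencies between items as a square binary matrix, and reordering it so that strongly coupled items sit together suggests which equipment could share an integrated, slimmer functions room. Below is a minimal sketch with a made-up five-item matrix and a crude greedy ordering; it is not the paper's algorithm.

```python
import numpy as np

# Hypothetical 5-item design structure matrix (DSM): entry [i, j] = 1
# means item i depends on item j. Item names are illustrative only.
items = ["power supply", "signal logic", "comms rack", "UPS", "antenna"]
dsm = np.array([
    [0, 0, 0, 1, 0],
    [0, 0, 1, 0, 0],
    [0, 1, 0, 0, 1],
    [1, 0, 0, 0, 0],
    [0, 1, 1, 0, 0],
])

# Symmetrize, then greedily order items so that strongly coupled ones
# end up adjacent -- a crude stand-in for proper DSM clustering.
coupling = dsm | dsm.T
order = [0]
remaining = set(range(1, len(items)))
while remaining:
    last = order[-1]
    # pick the remaining item most coupled to the one just placed
    nxt = max(remaining, key=lambda j: coupling[last, j])
    order.append(nxt)
    remaining.remove(nxt)

print("suggested grouping:", [items[i] for i in order])
print(coupling[np.ix_(order, order)])  # reordered coupling matrix
```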

A Study on the Improvement of UAV based 3D Point Cloud Spatial Object Location Accuracy using Road Information (도로정보를 활용한 UAV 기반 3D 포인트 클라우드 공간객체의 위치정확도 향상 방안)

  • Lee, Jaehee; Kang, Jihun; Lee, Sewon
    • Korean Journal of Remote Sensing / v.35 no.5_1 / pp.705-714 / 2019
  • Precision positioning is necessary for the various uses of high-resolution UAV images. Ground control points (GCPs) are normally used for this purpose, but in emergency situations, or when selecting GCPs is difficult, the data must be obtained without them. This study proposes a method of improving the positional accuracy of the x and y coordinates of UAV-based 3D point cloud data generated without GCPs. A road vector file from public data (the Open Data Portal) was used as reference data for improving location accuracy. Geometric correction of the 2D ortho-mosaic image was performed first, and the transformation matrix produced in this process was then applied to the 3D point cloud data. The straight-line distance difference of 34.54 m before the correction was reduced to 1.21 m after the correction. By confirming that the location accuracy of UAV images acquired without GCPs can be improved, the method is expected to expand the scope of use of 3D spatial objects generated from point clouds by enabling connection and compatibility with other spatial information data.
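
The correction reuses a planar transform, estimated during the 2D geometric correction, on the point cloud's x and y coordinates. A minimal numpy sketch of that idea follows, assuming an affine model and fabricated control point pairs; the paper's exact transform model is not stated in the abstract.

```python
import numpy as np

# Hypothetical correspondences: (x, y) in the uncorrected mosaic vs.
# reference (x, y) taken from the road vector layer. Values are made up.
src = np.array([[10.0, 12.0], [200.0, 15.0], [190.0, 310.0], [20.0, 300.0]])
dst = np.array([[45.0, 40.0], [235.0, 46.0], [228.0, 342.0], [57.0, 330.0]])

# Solve dst ≈ X @ A for a 3x2 affine matrix A by least squares,
# where each row of X is [x, y, 1].
X = np.hstack([src, np.ones((len(src), 1))])
A, *_ = np.linalg.lstsq(X, dst, rcond=None)

# Apply the same planar transform to the x, y of a 3D point cloud,
# leaving z untouched.
cloud = np.array([[100.0, 150.0, 32.1], [120.0, 160.0, 33.0]])
xy = np.hstack([cloud[:, :2], np.ones((len(cloud), 1))]) @ A
corrected = np.hstack([xy, cloud[:, 2:]])
print(corrected)
```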

Development of robot calibration method based on 3D laser scanning system for Off-Line Programming (오프라인 프로그래밍을 위한 3차원 레이저 스캐닝 시스템 기반의 로봇 캘리브레이션 방법 개발)

  • Kim, Hyun-Soo
    • Journal of the Korea Academia-Industrial cooperation Society / v.20 no.3 / pp.16-22 / 2019
  • Off-line programming (OLP) and robot calibration through simulation are essential when setting up a robot in a robot automation production line. In this study, we developed a new robot calibration method to match the CAD data of the production line with measurement data from the site using a 3D scanner. The proposed method calibrates the robot using 3D point cloud data through the Iterative Closest Point (ICP) algorithm. Registration is performed in three steps. First, vertices where three planes meet are extracted from the CAD data as feature points for registration. Next, three planes are reconstructed from the scan points located around the extracted feature points to generate corresponding feature points. Finally, the transformation matrix is calculated by minimizing the distance between the feature points through the ICP algorithm. When the software was applied to an automobile welding robot installation, the proposed method achieved the required accuracy to within 1.5 mm and shortened the set-up time, which had taken 5 hours per robot unit, to within 40 minutes. The developed system makes it possible to shorten the OLP working time of the car body assembly line, shorten the precision teaching time of the robots, improve product quality, and minimize the defect rate.
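
The transform that minimizes the distance between matched CAD and scan feature points, which ICP recomputes at every iteration, has a closed-form SVD solution (the Kabsch method). A self-contained sketch with illustrative points:

```python
import numpy as np

def rigid_transform(P, Q):
    """Best-fit rotation R and translation t with R @ p + t ≈ q (Kabsch)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Hypothetical matched feature points: CAD vertices vs. vertices
# reconstructed from scanned planes (values illustrative).
cad = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
Rz90 = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
scan = cad @ Rz90.T + [0.5, 0.2, 0.0]        # rotated and shifted copy

R, t = rigid_transform(cad, scan)
residual = np.linalg.norm(cad @ R.T + t - scan, axis=1).max()
print(R, t, residual)                        # residual ≈ 0
```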

Finite Element Method Modeling for Individual Malocclusions: Development and Application of the Basic Algorithm (유한요소법을 이용한 환자별 교정시스템 구축의 기초 알고리즘 개발과 적용)

  • Shin, Jung-Woog; Nahm, Dong-Seok; Kim, Tae-Woo; Lee, Sung Jae
    • The Korean Journal of Orthodontics / v.27 no.5 s.64 / pp.815-824 / 1997
  • The purpose of this study is to develop a basic algorithm for finite element method modeling of individual malocclusions. Usually, a great deal of time is spent in preprocessing. To reduce the time required, we developed a standardized procedure for measuring the position of each tooth and a program that automates preprocessing. The following procedures were carried out to complete this study. 1. Twenty-eight tooth morphologies were constructed three-dimensionally for the finite element analysis and saved as separate files. 2. Standard brackets were attached so that the FA points coincide with the centers of the brackets. 3. A study model of the patient was made. 4. Using the study model, the crown inclination, angulation, and vertical distance from the tip of each tooth were measured with specially designed tools. 5. The arch form was determined from a picture of the model using an image processing technique. 6. The measured data were input as a rotation matrix. 7. The program produces an output file containing the necessary information about the three-dimensional positions of the teeth, which is applicable to several commonly used finite element programs. The program for the basic algorithm was written in Turbo-C, and the resulting output file was applied to ANSYS. This standardized model measuring procedure and program reduce the time required, especially for preprocessing, and can easily be applied to other malocclusions.
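
Step 6 converts the measured angles into a rotation matrix for positioning each tooth model. Below is a minimal sketch of that step, assuming elementary rotations about fixed axes; the paper's exact angle conventions and axis assignments are not given in the abstract, so the mapping here is hypothetical.

```python
import numpy as np

def rot_x(a):
    """Rotation about the x axis (e.g., crown inclination), radians."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    """Rotation about the y axis (e.g., crown angulation), radians."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

# Hypothetical measurements for one tooth (degrees); the combined
# rotation is applied to its FE node coordinates before translating
# the tooth onto the measured arch form.
inclination, angulation = np.radians([12.0, 5.0])
R = rot_y(angulation) @ rot_x(inclination)

nodes = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 8.5]])  # made-up crown axis
positioned = nodes @ R.T
print(positioned)
```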

Development of Dose Planning System for Brachytherapy with High Dose Rate Using Ir-192 Source (고선량률 강내조사선원을 이용한 근접조사선량계획전산화 개발)

  • Choi Tae Jin; Yei Ji Won; Kim Jin Hee; Kim OK; Lee Ho Joon; Han Hyun Soo
    • Radiation Oncology Journal / v.20 no.3 / pp.283-293 / 2002
  • Purpose: A PC-based brachytherapy planning system was developed to display dose distributions on simulation images as 2D isodose curves, including dose profiles, a dose-volume histogram, and 3D dose distributions. Materials and Methods: The brachytherapy dose planning software was developed specifically for the Ir-192 source, which had been developed by KAERI as a substitute for the Co-60 source. The dose computation was achieved by looking up a pre-computed dose matrix tabulated as a function of radial and axial distance from the source. The computation included the effects of the tissue scattering correction factor and the anisotropic dose distribution. The computed dose distributions were displayed on 2D film images, including the dose profile, as 3D isodose curves in wire-frame form, and as a dose-volume histogram. Results: The brachytherapy dose plan was initiated by obtaining the source positions on the principal plane of the source axis. The dose distributions in tissue were computed on a 200×200 mm² plane with the source axis located at the center of the plane. The point doses along the longitudinal axis of the source were 4.5~9.0% smaller than those on the radial axis of the plane, due to the anisotropy created by the cylindrical shape of the source. Compared with manual calculation, the point doses showed 1~5% discrepancies from the benchmarking plan. The 2D dose distributions of different planes were matched to the same administered isodose level in order to analyze the shape of the optimized dose level. The accumulated dose-volume histogram, displayed as a function of the percentage volume of the administered minimum dose level, was used to guide the volume analysis. Conclusion: This study evaluated the developed computerized brachytherapy dose planning system. The dose distribution was displayed on the coronal, sagittal, and axial planes along with the dose histogram. The accumulated DVH and 3D dose distributions provided by the developed system may be useful tools for dose analysis in comparison with orthogonal dose planning.
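
The dose engine described here is essentially a table lookup: dose is pre-computed as a function of axial and radial distance from the source, then corrected for tissue scatter and anisotropy. A toy bilinear lookup in that spirit, with a fabricated dose table:

```python
import numpy as np

# Fabricated dose matrix: rows indexed by axial distance (along the
# source axis), columns by radial distance (cm); arbitrary dose units.
axial = np.array([0.0, 1.0, 2.0, 3.0])
radial = np.array([0.5, 1.0, 2.0, 4.0])
table = 100.0 / (1.0 + axial[:, None] ** 2 + radial[None, :] ** 2)

def dose_lookup(z, r):
    """Bilinear interpolation into the (axial, radial) dose table."""
    i = np.clip(np.searchsorted(axial, z) - 1, 0, len(axial) - 2)
    j = np.clip(np.searchsorted(radial, r) - 1, 0, len(radial) - 2)
    tz = (z - axial[i]) / (axial[i + 1] - axial[i])
    tr = (r - radial[j]) / (radial[j + 1] - radial[j])
    top = table[i, j] * (1 - tr) + table[i, j + 1] * tr
    bot = table[i + 1, j] * (1 - tr) + table[i + 1, j + 1] * tr
    return top * (1 - tz) + bot * tz

# Dose at 1.5 cm along the axis, 1.2 cm off-axis; in the real system
# this value would still be scaled by scatter and anisotropy factors.
print(dose_lookup(1.5, 1.2))
```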

A Passport Recognition and Face Verification Using Enhanced Fuzzy ART Based RBF Network and PCA Algorithm (개선된 퍼지 ART 기반 RBF 네트워크와 PCA 알고리즘을 이용한 여권 인식 및 얼굴 인증)

  • Kim Kwang-Baek
    • Journal of Intelligence and Information Systems / v.12 no.1 / pp.17-31 / 2006
  • In this paper, passport recognition and face verification methods are proposed that can automatically recognize passport codes and detect forged passports, improving the efficiency and systematic control of immigration management. Correcting the slant is very important for character recognition and face verification, since slanted passport images can introduce various unwanted effects into the recognition of individual codes and faces. Therefore, after smearing the passport image, the longest extracted string of characters is selected, and the angle is adjusted using the slope of the horizontal line connecting the centers of thickness of the left and right parts of the string. Passport codes are extracted using the Sobel operator, horizontal smearing, and an 8-neighborhood contour tracking algorithm. The code strings are binarized by applying an iterative binarization method to the extracted code-string areas. The strings are then restored by applying a CDM mask to the binarized area, and individual codes are extracted with the 8-neighborhood contour tracking algorithm. The proposed enhanced fuzzy ART algorithm, which dynamically controls the vigilance parameter, is applied to the middle layer of the RBF network using a fuzzy logic connection operator. The face is verified by measuring the similarity between the feature vector of the facial image from the passport and the feature vectors of facial images from a database constructed with the PCA algorithm. In tests using forged passports and passports with slanted images, the proposed method proved effective in recognizing passport codes and verifying facial images.
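
The verification step compares PCA feature vectors of the passport face and of database faces. A compact eigenface-style sketch follows, using random arrays as stand-ins for face images; the paper's preprocessing and feature dimensions are not specified in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
db_faces = rng.random((50, 32 * 32))                       # enrolled faces
passport_face = db_faces[7] + 0.05 * rng.random(32 * 32)   # noisy match

# PCA via SVD: project faces onto the top-k principal components.
mean = db_faces.mean(axis=0)
_, _, Vt = np.linalg.svd(db_faces - mean, full_matrices=False)
components = Vt[:20]                                       # top 20 directions

def features(x):
    return components @ (x - mean)

db_feats = np.array([features(f) for f in db_faces])
q = features(passport_face)

# Similarity as Euclidean distance in feature space; the best match
# should be the enrolled face the query was derived from.
dists = np.linalg.norm(db_feats - q, axis=1)
print("best match:", int(dists.argmin()))                  # expect 7
```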

An Intelligence Support System Research on KTX Rolling Stock Failure Using Case-based Reasoning and Text Mining (사례기반추론과 텍스트마이닝 기법을 활용한 KTX 차량고장 지능형 조치지원시스템 연구)

  • Lee, Hyung Il; Kim, Jong Woo
    • Journal of Intelligence and Information Systems / v.26 no.1 / pp.47-73 / 2020
  • A KTX rolling stock is a system consisting of several machines, electrical devices, and components, and its maintenance requires considerable expertise and experience. In the event of a failure, the maintainer's knowledge and experience make a difference in how quickly and how well the problem is solved, and therefore in the resulting availability of the vehicle. Although problem solving is generally based on fault manuals, experienced and skilled professionals can diagnose faults quickly and take action by applying personal know-how. Since this knowledge exists in tacit form, it is difficult to pass on completely to a successor, and earlier studies have developed case-based rolling stock expert systems to turn it into data-driven knowledge. Nonetheless, research on the KTX rolling stock most commonly used on the main line, and on systems that extract textual meaning to search for similar cases, is still lacking. This study therefore proposes an intelligent support system that provides an action guide for emerging failures by using the know-how of rolling stock maintenance experts as problem-solving examples. For this purpose, a case base was constructed by collecting rolling stock failure data generated from 2015 to 2017, and an integrated dictionary was built from the case base to cover the essential terminology and failure codes of the specialized railway rolling stock sector. Given a new failure, similar past cases are retrieved from the deployed case base, the three most similar failure cases are extracted, and the actual actions taken in those cases are proposed as a diagnostic guide. To compensate for the limitations of keyword-matching case retrieval in earlier case-based reasoning studies on rolling stock failure expert systems, this study applied various dimensionality reduction techniques that account for the semantic relationships among failure descriptions, and verified their usefulness through experiments. Similar cases were retrieved by applying three such algorithms, Non-negative Matrix Factorization (NMF), Latent Semantic Analysis (LSA), and Doc2Vec, to extract the characteristics of each failure and measure the cosine distance between the vectors. Precision, recall, and the F-measure were used to assess the performance of the proposed actions. These techniques were compared against an algorithm that randomly extracts failure cases with identical failure codes and an algorithm that applies cosine similarity directly to word vectors, and analysis of variance confirmed that the performance differences among the five algorithms were statistically significant. In addition, optimal settings for practical application were derived by verifying how performance varies with the number of dimensions used in the reduction. The analysis showed that direct cosine similarity outperformed the reductions using NMF and LSA, and that the Doc2Vec-based algorithm performed best; in terms of dimensionality, performance improved as the number of dimensions increased, up to an appropriate level. Through this study, we confirmed the usefulness of effective methods for extracting data characteristics and converting unstructured data when applying case-based reasoning in the specialized field of KTX rolling stock, where most attributes are text. Text mining is being studied for use in many areas, but studies using such text data are still lacking in environments with many specialized terms and limited access to data, such as the one addressed here. In this regard, it is significant that this study is the first to present an intelligent diagnostic system that suggests actions by retrieving cases with text mining techniques that extract failure characteristics, complementing keyword-based case search. It is expected to provide implications, as a basic study, for developing diagnostic systems that can be used immediately on site.
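
As a rough illustration of the LSA variant of the retrieval pipeline described above: TF-IDF vectors are reduced with truncated SVD, and the three most similar past cases are returned by cosine similarity. The failure texts below are toy English stand-ins for the actual Korean failure logs; the NMF and Doc2Vec variants would swap in a different reduction step (sklearn's NMF or gensim's Doc2Vec).

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Toy stand-ins for historical failure descriptions (the real case base
# holds 2015-2017 KTX failure records with domain terms and fault codes).
cases = [
    "traction motor overheating alarm during acceleration",
    "brake cylinder pressure drop detected at departure",
    "pantograph arcing under high-speed operation",
    "door sensor fault preventing automatic closing",
    "traction converter trip with overcurrent code",
]

tfidf = TfidfVectorizer().fit(cases)
X = tfidf.transform(cases)

# LSA: reduce sparse TF-IDF vectors to a low-dimensional dense space.
svd = TruncatedSVD(n_components=3, random_state=0)
Z = svd.fit_transform(X)

new_failure = "overcurrent trip in traction converter while accelerating"
z = svd.transform(tfidf.transform([new_failure]))

# Retrieve the top-3 most similar past cases as action guides.
sims = cosine_similarity(z, Z)[0]
for idx in sims.argsort()[::-1][:3]:
    print(f"{sims[idx]:.2f}  {cases[idx]}")
```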