• Title/Summary/Keyword: analysis of algorithms

Dimensional Quality Assessment for Assembly Part of Prefabricated Steel Structures Using a Stereo Vision Sensor (스테레오 비전 센서 기반 프리팹 강구조물 조립부 형상 품질 평가)

  • Jonghyeok Kim;Haemin Jeon
    • Journal of the Computational Structural Engineering Institute of Korea
    • /
    • v.37 no.3
    • /
    • pp.173-178
    • /
    • 2024
  • This study presents a technique for assessing the dimensional quality of assembly parts in Prefabricated Steel Structures (PSS) using a stereo vision sensor. The stereo vision system captures images and point cloud data of the assembly area, and image processing algorithms such as fuzzy-based edge detection and Hough transform-based circular bolt hole detection are then applied to identify bolt hole locations. The 3D center positions of each bolt hole are determined by correlating 3D real-world position information from depth images with the extracted bolt hole positions. Principal Component Analysis (PCA) is then employed to calculate coordinate axes for precise measurement of distances between bolt holes, even when the sensor and structure orientations differ. Bolt holes are sorted based on their 2D positions, and the distances between sorted bolt holes are calculated to assess the assembly part's dimensional quality. Comparison with actual drawing data confirms measurement accuracy with an absolute error of 1 mm and a relative error within 4% based on the median criterion.
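The PCA alignment step above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the bolt-hole coordinates are made-up values, and the sorting key is a simple lexicographic order on the aligned 2D positions.

```python
import numpy as np

# Given 3D bolt-hole centers, find principal axes so that hole-to-hole
# distances can be measured in a frame aligned with the part, not the sensor.
def pca_align(points):
    centered = points - points.mean(axis=0)
    # Right singular vectors of the centered data are the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt.T  # points expressed in the PCA-aligned frame

# Four bolt holes on a tilted plane (illustrative values, not from the paper).
holes = np.array([[0.0, 0.0, 0.00],
                  [0.1, 0.0, 0.05],
                  [0.0, 0.1, 0.00],
                  [0.1, 0.1, 0.05]])
aligned = pca_align(holes)
# Sort by 2D position in the aligned frame, then measure successive distances.
order = np.lexsort((aligned[:, 0], aligned[:, 1]))
dists = np.linalg.norm(np.diff(aligned[order], axis=0), axis=1)
```

Because PCA alignment is a rigid rotation of the centered points, inter-hole distances are preserved exactly; only the measurement axes change.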

Accuracy Evaluation of Supervised Classification by Using Morphological Attribute Profiles and Additional Band of Hyperspectral Imagery (초분광 영상의 Morphological Attribute Profiles와 추가 밴드를 이용한 감독분류의 정확도 평가)

  • Park, Hong Lyun;Choi, Jae Wan
    • Journal of Korean Society for Geospatial Information Science
    • /
    • v.25 no.1
    • /
    • pp.9-17
    • /
    • 2017
  • Hyperspectral imagery is used for land cover classification, with principal component analysis and minimum noise fraction applied to reduce data dimensionality and noise. Recently, studies on supervised classification using various features carrying spectral information and spatial characteristics have been carried out. In this study, principal component bands and the normalized difference vegetation index (NDVI) were utilized in supervised classification for land cover mapping. To exploit additional information not captured by the principal component bands of the hyperspectral imagery, we tried to increase classification accuracy by adding the NDVI. In addition, extended attribute profiles (EAP) generated using morphological filters were used as input data. The random forest algorithm, one of the representative supervised classification methods, was used, and the classification accuracy obtained with various EAP-based feature combinations was compared. Two areas were selected for the experiments, and quantitative evaluation was performed using reference data. The proposed approach showed the highest classification accuracies, 85.72% and 91.14%, compared with existing algorithms. Further research will need to develop supervised classification algorithms and additional input datasets to improve the accuracy of land cover classification using hyperspectral imagery.
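The NDVI band-stacking idea above can be sketched as follows. The band values and the placeholder principal-component bands are illustrative assumptions, not the study's actual hyperspectral data.

```python
import numpy as np

# Normalized Difference Vegetation Index, in [-1, 1]; the small epsilon
# guards against division by zero on dark pixels.
def ndvi(nir, red):
    return (nir - red) / (nir + red + 1e-12)

# Mock 2x2 reflectance images for the NIR and red bands (made-up values).
nir = np.array([[0.6, 0.5], [0.4, 0.3]])
red = np.array([[0.1, 0.2], [0.2, 0.3]])
feature = ndvi(nir, red)

# Stack NDVI onto (mock) principal-component bands as classifier input,
# giving a (bands, rows, cols) feature cube for the supervised classifier.
pc_bands = np.zeros((3, 2, 2))  # placeholder PC bands
stack = np.concatenate([pc_bands, feature[None]], axis=0)
```

In the study, a cube like `stack` (PC bands, NDVI, and EAP features) would be flattened per pixel and fed to the random forest classifier.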

Analysis of Network Dynamics from Annals of the Chosun Dynasty (조선왕조실록 네트워크의 동적 변화 분석)

  • Kim, Hak Yong;Kim, Hak Bong
    • The Journal of the Korea Contents Association
    • /
    • v.14 no.9
    • /
    • pp.529-537
    • /
    • 2014
  • To establish a foundation for objectively interpreting Chosun history, we constructed a people network of the Chosun dynasty. The network shows scale-free properties, as most social networks do. It is composed of 1,379 nodes and 3,874 links, and its diameter is 14. To analyze the network dynamics, a whole network composed of 27 king networks was constructed by adding the first king's (Taejo) network to the second king's (Jeongjong) network and then successively adding the networks of the following kings. Interestingly, betweenness and closeness centralities gradually decreased, whereas stress centrality drastically increased. These results indicate that information flow gradually slows and hub nodes become more centrally positioned as the network grows. To identify key persons in the network, the k-core and MCODE algorithms, which extract core or module structures from a whole network, were employed. It is possible to obtain new insights and hidden information by analyzing network dynamics. Due to the lack of dynamic interaction data, there is a limit to network dynamics research. In spite of using concise data, this research shows that the annals of the Chosun dynasty are very useful historical data for analyzing network dynamics.
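The k-core extraction mentioned above can be sketched as follows: repeatedly remove nodes of degree less than k until every remaining node has degree at least k. The toy graph is illustrative, not the actual Chosun people network.

```python
# Minimal k-core: prune low-degree nodes until the subgraph stabilizes.
def k_core(adj, k):
    adj = {n: set(nbrs) for n, nbrs in adj.items()}  # work on a copy
    changed = True
    while changed:
        changed = False
        for n in [n for n, nbrs in adj.items() if len(nbrs) < k]:
            if n not in adj:
                continue  # already removed earlier in this pass
            for m in adj.pop(n):
                if m in adj:
                    adj[m].discard(n)
            changed = True
    return set(adj)

# Toy network: a triangle A-B-C with a pendant node D attached to C.
g = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"}, "D": {"C"}}
core = k_core(g, 2)  # the 2-core drops the pendant: {"A", "B", "C"}
```

On the full people network, nodes surviving at high k are the densely interconnected "key persons" the study looks for.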

Analysis of Block FEC Symbol Size's Effect On Transmission Efficiency and Energy Consumption over Wireless Sensor Networks (무선 센서 네트워크에서 전송 효율과 에너지 소비에 대한 블록 FEC 심볼 크기 영향 분석)

  • Ahn, Jong-Suk;Yoon, Jong-Hyuk;Lee, Young-Su
    • The KIPS Transactions:PartC
    • /
    • v.13C no.7 s.110
    • /
    • pp.803-812
    • /
    • 2006
  • This paper analytically evaluates the effect of the FEC (Forward Error Correction) symbol size on the performance and energy consumption of the 802.11 protocol with a block FEC algorithm over a WSN (Wireless Sensor Network). Since the basic recovery unit of block FEC algorithms is symbols, not bits, the FEC symbol size affects the packet correction rate even with the same amount of FEC check bits over a given WSN channel. Precisely, when the same amount of FEC check bits is allocated, small symbols are effective over channels with frequent short bursts of propagation errors, while large ones are good at remedying long, rare bursts. To estimate the effect of the FEC symbol size, the paper first models the WSN channel with a Gilbert model based on real packet traces collected over TIP50CM sensor nodes and measures the energy consumed for encoding and decoding the RS (Reed-Solomon) code with various symbol sizes. Based on the WSN channel model and each RS code's energy expenditure, it analytically calculates the transmission efficiency and power consumption of 802.11 equipped with the RS code. The computational analysis combined with real experimental data shows that the RS symbol size makes a difference of up to 4.2% in transmission efficiency and 35% in energy consumption even with the same amount of FEC check bits.
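The symbol-size effect can be illustrated with a small simulation of a Gilbert two-state channel (Good/Bad), where bits are corrupted only in the Bad state. The transition and error probabilities below are illustrative assumptions, not the values fitted from the TIP50CM traces.

```python
import random

# Gilbert channel: stay Bad with prob 1 - p_bg, enter Bad with prob p_gb;
# a bit is flipped with prob p_err_bad while in the Bad state.
def gilbert_bit_errors(n_bits, p_gb=0.01, p_bg=0.3, p_err_bad=0.5, seed=1):
    rng = random.Random(seed)
    bad, errors = False, []
    for _ in range(n_bits):
        bad = rng.random() < ((1 - p_bg) if bad else p_gb)
        errors.append(bad and rng.random() < p_err_bad)
    return errors

# A symbol is unrecoverable-as-is if any of its bits is in error; small
# symbols confine short bursts, large symbols absorb long ones, at the
# same check-bit budget.
def corrupted_symbols(errors, sym_bits):
    return sum(any(errors[i:i + sym_bits])
               for i in range(0, len(errors), sym_bits))

errs = gilbert_bit_errors(10_000)
small, large = corrupted_symbols(errs, 4), corrupted_symbols(errs, 16)
```

Since every corrupted 16-bit symbol contains at least one corrupted 4-bit symbol, `large <= small` always holds; what changes with burstiness is how many symbols each burst spans, and hence how many RS check symbols are consumed.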

Time-series Change Analysis of Quarry using UAV and Aerial LiDAR (UAV와 LiDAR를 활용한 토석채취지의 시계열 변화 분석)

  • Dong-Hwan Park;Woo-Dam Sim
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.27 no.2
    • /
    • pp.34-44
    • /
    • 2024
  • Recently, due to abnormal climate caused by climate change, natural disasters such as floods, landslides, and soil outflows are rapidly increasing. In Korea, more than 63% of the land is vulnerable to slope disasters because of its mountainous terrain, and quarries in particular, where soil and rock are extracted, carry a high risk of landslides both inside and outside the work site. Accordingly, this study built DEMs using UAV photogrammetry and aerial LiDAR for monitoring a quarry, conducted a time-series change analysis, and proposed an optimal DEM construction method for quarry monitoring. For DEM construction, UAV- and LiDAR-based point clouds were built, and the ground was extracted using three algorithms: Aggressive Classification (AC), Conservative Classification (CC), and Standard Classification (SC). The accuracy of the UAV- and LiDAR-based DEMs constructed with each algorithm was evaluated by comparison with a digital map-based DEM.
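The accuracy evaluation against a reference DEM can be sketched as a grid-wise RMSE. The 3x3 elevation grids below are made-up values, not survey data, and the fixed 0.1 m bias stands in for an algorithm-derived DEM.

```python
import numpy as np

# Root-mean-square error between a candidate DEM and a reference DEM
# defined on the same grid (e.g. the digital map-based DEM in the study).
def dem_rmse(dem, reference):
    diff = dem - reference
    return float(np.sqrt(np.mean(diff ** 2)))

reference = np.array([[10.0, 10.2, 10.4],
                      [10.1, 10.3, 10.5],
                      [10.2, 10.4, 10.6]])
# Hypothetical output of one ground-extraction algorithm (e.g. AC),
# modeled here as the reference plus a uniform 0.1 m bias.
dem_ac = reference + 0.1
rmse = dem_rmse(dem_ac, reference)
```

Comparing `rmse` across the AC, CC, and SC outputs (and across epochs for time-series change) is the kind of quantitative check the study performs.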

Post-filtering in Low Bit Rate Moving Picture Coding, and Subjective and Objective Evaluation of Post-filtering (저 전송률 동화상 압축에서 후처리 방법 및 후처리 방법의 주관적 객관적 평가)

  • 이영렬;김윤수;박현욱
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.24 no.8B
    • /
    • pp.1518-1531
    • /
    • 1999
  • The reconstructed images from highly compressed MPEG or H.263 data have noticeable degradations, such as blocking artifacts near block boundaries, corner outliers at the cross points of blocks, and ringing noise near image edges, because MPEG and H.263 quantize the transformed coefficients of 8$\times$8 pixel blocks. A post-processing algorithm has been proposed by the authors to reduce quantization effects such as blocking artifacts, corner outliers, and ringing noise in MPEG-decompressed images. Our signal-adaptive post-processing algorithm reduces the quantization effects adaptively by using both spatial frequency and temporal information extracted from the compressed data. Blocking artifacts are reduced by one-dimensional (1-D) horizontal and vertical low-pass filtering (LPF), and ringing noise is reduced by two-dimensional (2-D) signal-adaptive filtering (SAF). A comparison study of subjective quality evaluation using the modified single stimulus method (MSSM), objective quality evaluation (PSNR), and computational complexity between the signal-adaptive post-processing algorithm and the MPEG-4 VM (Verification Model) post-processing algorithm was performed by computer simulation with several MPEG-4 image sequences. According to the comparison, the subjective image qualities of both algorithms are similar, whereas in PSNR and computational complexity the signal-adaptive post-processing algorithm shows better performance than the VM post-processing algorithm.
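The 1-D horizontal low-pass filtering at block boundaries can be sketched as below. This is a simple 3-tap average applied to the two columns adjacent to each vertical 8-pixel block boundary; the paper's actual filter taps and its signal-adaptive switching are not reproduced here.

```python
import numpy as np

# Smooth the columns on either side of each vertical block boundary to
# soften the step discontinuity that blocking artifacts produce.
def deblock_horizontal(img, block=8):
    out = img.astype(float).copy()
    for x in range(block, img.shape[1], block):  # interior boundaries only
        for c in (x - 1, x):
            out[:, c] = (img[:, c - 1] + img[:, c] + img[:, c + 1]) / 3.0
    return out

# Two 8-wide blocks with a hard intensity step at the boundary (column 8).
img = np.zeros((2, 16))
img[:, 8:] = 90.0
smoothed = deblock_horizontal(img)
```

After filtering, the hard 0-to-90 step at the boundary becomes a 0-30-60-90 ramp, which is exactly the visual softening a deblocking LPF aims for; vertical filtering is the same operation on rows.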

Formation Estimation of Shaly Sandstone Reservoir using Joint Inversion from Well Logging Data (복합역산을 이용한 물리검층자료로부터의 셰일성 사암 저류층의 지층 평가)

  • Choi, Yeonjin;Chung, Woo-Keen;Ha, Jiho;Shin, Sung-ryul
    • Geophysics and Geophysical Exploration
    • /
    • v.22 no.1
    • /
    • pp.1-11
    • /
    • 2019
  • Well logging technologies are used to measure the physical properties of reservoirs through boreholes. They have been utilized to understand reservoir characteristics, such as porosity and fluid saturation, using equations based on rock-physics models. Well-log analysis is performed by selecting a reliable rock-physics model adequate for the reservoir conditions or characteristics, comparing the results with the Archie equation or the Simandoux method, and determining the most feasible reservoir properties. In this study, we developed a joint inversion algorithm to estimate the physical properties of shaly sandstone reservoirs, based on a pre-existing algorithm for sandstone reservoirs. For this purpose, we proposed a rock-physics model accounting for shale volume, constructed the Jacobian matrix, and performed a sensitivity analysis to understand the relationship between well-logging data and rock properties. The joint inversion algorithm was implemented by adopting a least-squares method with a probabilistic approach. The developed algorithm was applied to well-logging data obtained from the Colony gas sandstone reservoir, and the results were compared with the Simandoux method and the joint inversion algorithm for sandstone reservoirs.
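The Jacobian-based least-squares update at the heart of such an inversion can be sketched as follows. The forward model here is a made-up linear stand-in, not the paper's rock-physics model, and the damping constant is an arbitrary choice.

```python
import numpy as np

# One damped least-squares (Levenberg-Marquardt style) update: the model
# vector m (e.g. porosity, water saturation, shale volume) is corrected
# from the log-data residual r through the Jacobian J.
def lsq_update(m, forward, jacobian, d_obs, damping=0.1):
    r = d_obs - forward(m)
    J = jacobian(m)
    # Solve (J^T J + damping * I) dm = J^T r for the model update dm.
    dm = np.linalg.solve(J.T @ J + damping * np.eye(len(m)), J.T @ r)
    return m + dm

# Toy linear forward model: three "logs" responding to two model parameters.
A = np.array([[2.0, 0.5],
              [0.3, 1.5],
              [1.0, 1.0]])
forward = lambda m: A @ m
jacobian = lambda m: A  # constant Jacobian for a linear model

m_true = np.array([0.25, 0.6])
d_obs = forward(m_true)        # noise-free synthetic "well-log data"
m = np.zeros(2)
for _ in range(20):            # iterate until the residual vanishes
    m = lsq_update(m, forward, jacobian, d_obs)
```

For a nonlinear rock-physics model the same loop applies, except `forward` and `jacobian` are re-evaluated at the current `m` each iteration.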

Impedance-based Long-term Structural Health Monitoring for Jacket-type Tidal Current Power Plant Structure in Temperature and Load Changes (온도 및 하중 영향을 고려한 임피던스 기반 조류발전용 재킷 구조물의 장기 건전성 모니터링)

  • Min, Jiyoung;Kim, Yucheong;Yun, Chung-Bang;Yi, Jin-Hak
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.31 no.5A
    • /
    • pp.351-360
    • /
    • 2011
  • Jacket-type offshore structures are constantly exposed to severe environmental conditions, such as salt, high current speeds, waves, and wind, compared with onshore structures. In spite of the importance of maintaining structural integrity for offshore structures, there are few cases in which a structural health monitoring (SHM) system has been applied in practice. Impedance-based SHM is a local SHM technique, and to date numerous techniques and algorithms have been proposed for local SHM of real-scale structures. However, compensating for unknown environmental effects and extracting only damage features from impedance signals remains a significant challenge for practical applications. In this study, impedance-based SHM was carried out on a 1/20-scale model of an Uldolmok current power plant structure under changes in temperature and transverse loading. Principal component analysis (PCA) was applied to a conventional damage index to eliminate principal components sensitive to environmental change. It was found that the proposed PCA-based approach is an effective tool for long-term SHM under significant environmental changes.
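The PCA-based compensation idea can be sketched as below: when repeated damage-index measurements share a dominant environmental (e.g. temperature) trend, that trend loads onto the first principal component, which can be removed before damage detection. The data are synthetic, not the Uldolmok model test data, and removing exactly one component is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# 50 measurement sessions x 8 damage-index features, all driven by one
# shared environmental drift plus small independent noise.
temp_trend = np.linspace(0.0, 1.0, 50)[:, None]
loadings = rng.normal(1.0, 0.1, (1, 8))
X = temp_trend @ loadings + 0.01 * rng.normal(size=(50, 8))

Xc = X - X.mean(axis=0)
_, s, vt = np.linalg.svd(Xc, full_matrices=False)
# Subtract the projection onto the first (environment-dominated) PC.
cleaned = Xc - (Xc @ vt[0])[:, None] * vt[0]
```

What survives in `cleaned` is the variation not explained by the environmental drift; in the study, genuine damage features would remain here while temperature effects are suppressed.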

Parameter Analysis to Predict Cervical Spine Injury on Motor Vehicle Accidents (탑승자 교통사고에서 경추손상 판단을 위한 중증도 요인 분석)

  • Lee, Hee Young;Youk, Hyun;Kong, Joon Seok;Kang, Chan Young;Sung, Sil;Lee, Jung Hun;Kim, Ho Jung;Kim, Sang Chul;Choo, Yeon Il;Jeon, Hyeok Jin;Park, Jong Chan;Choi, Ji Hun;Lee, Kang Hyun
    • Journal of Auto-vehicle Safety Association
    • /
    • v.10 no.3
    • /
    • pp.20-26
    • /
    • 2018
  • This was a pilot study for developing an algorithm to determine the presence or absence of cervical spine injury by analyzing severity factors of patients in motor vehicle occupant accidents. From August 2012 to October 2016, we used the Korean In-Depth Accident Study (KIDAS) database, collected from three regional emergency centers. We analyzed the general characteristics with respect to several factors. Cervical spine injury patients were divided into two groups: group 1 for Quebec Task Force (QTF) grades 0 to 1, and group 2 for QTF grades 2 to 4. A score was assigned to each factor according to the ratio of cervical spine injured patients to total injured patients, and a cut-off value was derived from the total score obtained by summing the assigned scores. 987 patients (53.0%) had no cervical spine injuries and 874 patients (47.0%) had cervical spine injuries. QTF grade 2 was found in 171 patients (9.2%) with musculoskeletal pain, QTF grade 3 in 38 patients (2.0%) with spinal cord injuries, and QTF grade 4 in 119 patients (6.4%) with dislocation or fracture. We selected the statistically significant factors that could affect cervical spine injury: collision direction, seating position, deformation extent, vehicle type, and frontal airbag deployment. A total score of 10, the summation of the assigned factor scores, was presented as the cut-off value to determine cervical spine injury. This study was meaningful as a pilot study developing an algorithm by selecting a limited set of influence factors and proposing a cut-off value. However, since the number of data samples was small, additional data collection and influencing-factor analysis should be performed to develop a more refined algorithm.
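The score-and-cut-off decision rule has the following shape. The per-factor scores below are illustrative placeholders (only the cut-off value of 10 comes from the study), and only three of the five selected factors are shown.

```python
# Hypothetical factor-score tables; in the study these would come from the
# injury-rate distribution of each factor in the KIDAS data.
SCORES = {
    "collision_direction": {"frontal": 2, "side": 4, "rear": 1},
    "seating_position": {"driver": 3, "front_passenger": 2, "rear": 1},
    "deformation_extent": {"minor": 1, "moderate": 3, "severe": 5},
}
CUT_OFF = 10  # total-score threshold reported by the study

def predict_cervical_injury(case):
    # Sum the assigned score of each factor and compare with the cut-off.
    total = sum(SCORES[factor][value] for factor, value in case.items())
    return total >= CUT_OFF, total

risk, total = predict_cervical_injury({
    "collision_direction": "side",
    "seating_position": "driver",
    "deformation_extent": "severe",
})  # 4 + 3 + 5 = 12, so this case is flagged
```

The clinical value of such a rule depends entirely on how the per-factor scores and cut-off are fitted, which is why the abstract calls for more data before a refined version.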

A New Similarity Measure for Categorical Attribute-Based Clustering (범주형 속성 기반 군집화를 위한 새로운 유사 측도)

  • Kim, Min;Jeon, Joo-Hyuk;Woo, Kyung-Gu;Kim, Myoung-Ho
    • Journal of KIISE:Databases
    • /
    • v.37 no.2
    • /
    • pp.71-81
    • /
    • 2010
  • The problem of finding clusters arises in numerous applications, such as pattern recognition, image analysis, and market analysis. The important factors that decide cluster quality are the similarity measure and the number of attributes. Similarity measures should be defined with respect to the data types. Existing similarity measures are well suited to numerical attribute values. However, those measures do not work well when the data are described by categorical attributes, that is, when there is no inherent similarity measure between values. In high-dimensional spaces, conventional clustering algorithms tend to break down because of the sparsity of data points. To overcome this difficulty, subspace clustering approaches have been proposed, based on the observation that different clusters may exist in different subspaces. In this paper, we propose a new similarity measure for clustering high-dimensional categorical data. The measure is based on the fact that in a good clustering, each cluster carries information that distinguishes it from other clusters. We also try to capture attribute dependencies. This study is meaningful because no previous method has used both of these ideas. Experimental results on real datasets show that the clusters obtained with our proposed similarity measure are good with respect to clustering accuracy.
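A measure in this spirit can be sketched as follows: matching on a rare categorical value is more informative than matching on a common one, so matches are weighted by the value's inverse frequency in the dataset. This weighting is an illustrative stand-in, not the paper's exact definition, and it omits the attribute-dependency term.

```python
from collections import Counter

# Similarity between two categorical tuples a and b: each matching
# attribute contributes 1 - freq(value), so rare shared values count more.
def weighted_similarity(a, b, data):
    sim = 0.0
    for i, (x, y) in enumerate(zip(a, b)):
        if x == y:
            freq = Counter(row[i] for row in data)[x] / len(data)
            sim += 1.0 - freq
    return sim / len(a)

data = [("red", "circle"), ("red", "square"),
        ("red", "square"), ("blue", "circle")]
# Sharing the common value "red" vs. sharing the rare value "blue":
s_common = weighted_similarity(("red", "circle"), ("red", "square"), data)
s_rare = weighted_similarity(("blue", "square"), ("blue", "circle"), data)
```

Here `s_rare > s_common` even though both pairs agree on exactly one attribute, which captures the intuition that a rare shared value is better evidence that two objects belong in the same cluster.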