• Title/Summary/Keyword: Processing Map

A Preliminary Study on the Lamination Characteristics of Inconel 718 Superalloy on S45C Structural Steel using LENS Process (LENS 공정을 이용한 Inconel 718 초합금의 S45C 구조용강 위 적층 특성 고찰에 관한 기초 연구)

  • Kim, Hyun-Sik; Lee, Hyub; Ahn, Dong-Gyu
    • Journal of the Korean Society of Manufacturing Process Engineers / v.20 no.1 / pp.16-24 / 2021
  • The laser-engineered net shaping (LENS) process is a representative directed energy deposition process. Deposition characteristics of the LENS process depend greatly on the process parameters. The present paper preliminarily investigates the deposition characteristics of Inconel 718 superalloy on S45C structural steel using the LENS process. The influence of process parameters, including laser power and powder feed rate, on bead formation and on dilution in the vicinity of the deposited region is examined through repeated experiments. A processing map and feasible deposition conditions are estimated from the viewpoints of the aspect ratio, defect formation, and dilution rate of the deposited bead. Finally, an appropriate deposition condition is predicted considering the side angle, deposition ratio, and buy-to-fly (BTF) ratio.
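
The processing-map criteria named in this abstract (aspect ratio, dilution rate, buy-to-fly) are simple geometric and mass ratios. The sketch below is not from the paper; it only illustrates commonly used textbook definitions of these metrics, and all function names and numbers are hypothetical.

# Hedged sketch of bead-geometry metrics often used to build a processing map
# for directed energy deposition. The area-based dilution definition and the
# buy-to-fly ratio below are common textbook forms, not taken from the paper.

def aspect_ratio(bead_width_mm: float, bead_height_mm: float) -> float:
    """Width-to-height ratio of the deposited bead cross-section."""
    return bead_width_mm / bead_height_mm

def dilution_rate(clad_area_mm2: float, melted_substrate_area_mm2: float) -> float:
    """Share of the melted substrate area in the whole fused cross-section."""
    return melted_substrate_area_mm2 / (clad_area_mm2 + melted_substrate_area_mm2)

def buy_to_fly(material_supplied_g: float, material_in_part_g: float) -> float:
    """Ratio of material supplied to material that ends up in the final part."""
    return material_supplied_g / material_in_part_g

if __name__ == "__main__":
    # Hypothetical single-bead measurement, for illustration only
    print(aspect_ratio(2.4, 0.6), dilution_rate(1.1, 0.3), buy_to_fly(10.0, 4.2))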

Accurate Detection of a Defective Area by Adopting a Divide and Conquer Strategy in Infrared Thermal Imaging Measurement

  • Jiangfei, Wang; Lihua, Yuan; Zhengguang, Zhu; Mingyuan, Yuan
    • Journal of the Korean Physical Society / v.73 no.11 / pp.1644-1649 / 2018
  • For infrared thermal images containing defects at different burial depths, we study a variety of threshold-based image segmentation algorithms with regard to their global search ability and their ability to locate the defect area accurately. First, the iterative thresholding method, the maximum entropy method, the minimum error method, the Otsu method, and the minimum skewness method are applied to segment the same infrared thermal image. The study shows that the maximum entropy method and the minimum error method have strong global search capability and can simultaneously extract defects at different depths. However, none of these five methods can accurately calculate the defect area at different depths. To solve this problem, we put forward a "divide and conquer" strategy: the infrared thermal image is divided into several local thermal maps, each containing only one defect, and the defect area is calculated after local image processing of the buried defects one by one. The results show that, under the "divide and conquer" strategy, the iterative threshold method and the Otsu method have the advantage of high precision and can accurately extract the areas of defects at different depths, with an error of less than 5%.
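
As a rough illustration of the "divide and conquer" strategy described above, the sketch below thresholds each local thermal map separately with Otsu's method and converts the segmented pixel count into an area. The window coordinates, the hotter-than-threshold assumption, and the pixel scale are assumptions, not details from the paper.

# Hedged sketch: one Otsu threshold per local thermal map, one defect per window.
import numpy as np
from skimage.filters import threshold_otsu

def defect_area(window: np.ndarray, mm_per_pixel: float) -> float:
    """Otsu-threshold one local thermal map and return the defect area in mm^2."""
    t = threshold_otsu(window)
    defect_mask = window > t            # hotter pixels assumed to be the defect
    return defect_mask.sum() * mm_per_pixel ** 2

def divide_and_conquer(thermal_image: np.ndarray, windows, mm_per_pixel: float):
    """windows: list of (row0, row1, col0, col1) boxes, one defect per box."""
    return [defect_area(thermal_image[r0:r1, c0:c1], mm_per_pixel)
            for (r0, r1, c0, c1) in windows]

if __name__ == "__main__":
    # Synthetic thermal image with two "hot" defects, purely for illustration
    img = np.zeros((100, 200))
    img[20:40, 30:60] = 1.0
    img[60:90, 120:150] = 0.6
    img += 0.05 * np.random.rand(*img.shape)
    boxes = [(0, 50, 0, 100), (50, 100, 100, 200)]   # one defect per window
    print(divide_and_conquer(img, boxes, mm_per_pixel=0.5))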

DP-LinkNet: A convolutional network for historical document image binarization

  • Xiong, Wei; Jia, Xiuhong; Yang, Dichun; Ai, Meihui; Li, Lirong; Wang, Song
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.5 / pp.1778-1797 / 2021
  • Document image binarization is an important pre-processing step in document analysis and archiving. The state-of-the-art models for document image binarization are variants of encoder-decoder architectures, such as FCN (fully convolutional network) and U-Net. Despite their success, they still suffer from three limitations: (1) reduced feature map resolution due to consecutive strided pooling or convolutions, (2) multiple scales of target objects, and (3) reduced localization accuracy due to the built-in invariance of deep convolutional neural networks (DCNNs). To overcome these three challenges, we propose an improved semantic segmentation model, referred to as DP-LinkNet, which adopts the D-LinkNet architecture as its backbone, with the proposed hybrid dilated convolution (HDC) and spatial pyramid pooling (SPP) modules between the encoder and the decoder. Extensive experiments are conducted on recent document image binarization competition (DIBCO) and handwritten document image binarization competition (H-DIBCO) benchmark datasets. Results show that our proposed DP-LinkNet outperforms other state-of-the-art techniques by a large margin. Our implementation and the pre-trained models are available at https://github.com/beargolden/DP-LinkNet.
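
The following is only an illustrative sketch, not the authors' implementation (their code is at the GitHub link above): a hybrid dilated convolution (HDC) block of the kind inserted between encoder and decoder. The dilation rates (1, 2, 5) and channel count are assumptions; the point of HDC is to chain dilated convolutions so the receptive field grows without further downsampling.

# Hedged sketch of an HDC-style block; layer sizes and rates are assumptions.
import torch
import torch.nn as nn

class HDCBlock(nn.Module):
    """Cascade of 3x3 convolutions with increasing dilation and a residual path."""
    def __init__(self, channels: int = 256, dilations=(1, 2, 5)):
        super().__init__()
        self.stages = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3,
                          padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = x
        for stage in self.stages:
            out = stage(out)        # padding == dilation keeps the spatial size
        return out + x              # residual connection preserves resolution

if __name__ == "__main__":
    feats = torch.randn(1, 256, 32, 32)   # stand-in for an encoder feature map
    print(HDCBlock()(feats).shape)        # torch.Size([1, 256, 32, 32])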

Considerations for Developing a SLAM System for Real-time Remote Scanning of Building Facilities (건축물 실시간 원격 스캔을 위한 SLAM 시스템 개발 시 고려사항)

  • Kang, Tae-Wook
    • Journal of KIBIM / v.10 no.1 / pp.1-8 / 2020
  • In managing building facilities, spatial information is the basic data for decision making. However, acquiring spatial information is not easy. The site and drawings often differ because of changes in facilities over time after construction. In such cases, the site must be scanned to obtain spatial information. The scan data contains actual spatial information, which is a great help in making space-related decisions. However, to obtain scan data, an expensive LiDAR (Light Detection and Ranging) device must be purchased, and special software for processing the data obtained from the device must be available. Recently, SLAM (Simultaneous Localization and Mapping), an advanced map generation technology, has been spreading in the field of robotics. Using SLAM, 3D spatial information can be obtained quickly in real time without a separate matching process. This study develops a prototype and tests whether SLAM technology can be used to obtain spatial information for facility management, and from this draws considerations for developing a SLAM device for real-time remote scanning for facility management. The study focuses on a system development method that acquires the spatial information necessary for facility management through SLAM technology. To this end, we develop a prototype, analyze its pros and cons, and then suggest considerations for developing a SLAM system.

Accuracy Analysis of Road Surveying and Construction Inspection of Underpass Section using Mobile Mapping System

  • Park, Joon Kyu; Um, Dae Yong
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.39 no.2 / pp.103-111 / 2021
  • MMS (Mobile Mapping System) enables fast and accurate data construction and is therefore receiving much attention for HD (High Definition) map construction. However, research on the use of MMS in the construction field is insufficient. In this study, road surveying and inspection of construction structures were performed using MMS. Through data acquisition and processing with MMS, point cloud data for the study site was created, and its accuracy was evaluated by comparison with traditional surveying methods. The accuracy analysis showed maximum deviations of 0.096 m, 0.091 m, and 0.093 m in the X, Y, and H directions, respectively, with RMSEs of 0.012 m, 0.015 m, and 0.006 m. These results satisfy the accuracy requirement for topographic surveying in the general survey work regulation, indicating that construction surveying using MMS is possible. In addition, a 3D model was created using the design data for the underpass road, and inspection was performed by comparing it with the MMS data. The inspection results show that construction deviations can be visually confirmed over the entire underpass. The traditional method takes 6 hours for the 4.5 km section of the target area, whereas MMS shortens the data acquisition time significantly, to 0.5 hours. Accurate 3D data is essential basic data for future smart construction; with MMS, the efficiency of construction sites can be increased through fast and accurate data collection.
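
A minimal sketch of the accuracy comparison reported above: per-axis maximum deviation and RMSE between MMS-derived coordinates and check points from traditional surveying. The coordinate arrays are randomly generated placeholders, not the study's data.

# Hedged sketch of a per-axis max deviation / RMSE check for surveyed points.
import numpy as np

def accuracy_report(mms_xyz: np.ndarray, survey_xyz: np.ndarray) -> None:
    """Both arrays are (n_points, 3): columns X, Y, H in metres."""
    diff = mms_xyz - survey_xyz
    max_dev = np.abs(diff).max(axis=0)
    rmse = np.sqrt((diff ** 2).mean(axis=0))
    for axis, m, r in zip("XYH", max_dev, rmse):
        print(f"{axis}: max deviation {m:.3f} m, RMSE {r:.3f} m")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = rng.uniform(0, 100, size=(20, 3))             # hypothetical check points
    accuracy_report(truth + rng.normal(0, 0.01, truth.shape), truth)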

A Study on the Optimization of Convolution Operation Speed through FFT Algorithm (FFT 적용을 통한 Convolution 연산속도 향상에 관한 연구)

  • Lim, Su-Chang; Kim, Jong-Chan
    • Journal of Korea Multimedia Society / v.24 no.11 / pp.1552-1559 / 2021
  • Convolutional neural networks (CNNs) show notable performance in image processing and are used as representative core models. CNNs extract and learn features from large training datasets. In general, a CNN has a structure in which convolution layers and fully connected layers are stacked, and the convolution layer is its core. The size of the kernels used for feature extraction and the number of kernels, which determines the depth of the feature maps, determine the number of learnable weight parameters of the CNN. These parameters are the main cause of the increased computational complexity and memory usage of the entire neural network. The most computationally expensive components in CNNs are the fully connected and spatial convolution computations. In this paper, we propose a Fourier convolution neural network that performs the operation of the convolution layer in the Fourier domain, reducing the amount of computation by applying the fast Fourier transform (FFT). On the MNIST dataset, the accuracy was similar to that of a general CNN. In terms of operation speed, a 7.2% improvement was achieved, and an average speedup of 19% was achieved in experiments using 1024x1024 images and kernels of various sizes.
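
The speed-up comes from the convolution theorem: a spatial convolution becomes an element-wise product of spectra. The sketch below shows the basic zero-pad / FFT / multiply / inverse-FFT pattern with NumPy; it illustrates the idea only and is not the paper's network code.

# Hedged sketch of 2D convolution via the FFT (convolution theorem).
import numpy as np

def fft_conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Full linear 2D convolution computed in the Fourier domain."""
    out_shape = (image.shape[0] + kernel.shape[0] - 1,
                 image.shape[1] + kernel.shape[1] - 1)
    F = np.fft.rfft2(image, out_shape)     # zero-pad, then transform
    G = np.fft.rfft2(kernel, out_shape)
    return np.fft.irfft2(F * G, out_shape) # element-wise product, inverse transform

if __name__ == "__main__":
    from scipy.signal import convolve2d    # reference implementation for checking
    img = np.random.rand(64, 64)
    ker = np.random.rand(5, 5)
    assert np.allclose(fft_conv2d(img, ker), convolve2d(img, ker, mode="full"))
    print("FFT convolution matches direct convolution")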

Research Trend Analysis Using Bibliographic Information and Citations of Cloud Computing Articles: Application of Social Network Analysis (클라우드 컴퓨팅 관련 논문의 서지정보 및 인용정보를 활용한 연구 동향 분석: 사회 네트워크 분석의 활용)

  • Kim, Dongsung; Kim, Jongwoo
    • Journal of Intelligence and Information Systems / v.20 no.1 / pp.195-211 / 2014
  • Cloud computing services provide IT resources as services on demand. This is considered a key concept that will lead a shift from an ownership-based paradigm to a new pay-for-use paradigm, which can reduce the fixed cost of IT resources and improve flexibility and scalability. As IT services, cloud services have evolved from earlier, similar computing concepts such as network computing, utility computing, server-based computing, and grid computing, so research into cloud computing is highly related to and combined with various relevant computing research areas. To identify promising research issues and topics in cloud computing, it is necessary to understand the research trends in cloud computing more comprehensively. In this study, we collect bibliographic and citation information for cloud computing related research papers published in major international journals from 1994 to 2012, and analyze macroscopic trends and network changes in citation relationships among papers and co-occurrence relationships of keywords using social network analysis measures. Through the analysis, we can identify the relationships and connections among research topics in cloud computing related areas and highlight new potential research topics. In addition, we visualize dynamic changes of research topics relating to cloud computing using a proposed cloud computing "research trend map." A research trend map visualizes the positions of research topics in a two-dimensional space, with keyword frequency (X-axis) and the rate of increase in the degree centrality of the keyword (Y-axis) as its two dimensions. Based on these two values, the space of the research map is divided into four areas: maturation, growth, promising, and decline. An area with high keyword frequency but a low rate of increase in degree centrality is defined as a mature technology area; the area where both keyword frequency and the increase rate of degree centrality are high is defined as a growth technology area; the area where keyword frequency is low but the rate of increase in degree centrality is high is defined as a promising technology area; and the area where both keyword frequency and the rate of increase in degree centrality are low is defined as a declining technology area. Based on this method, cloud computing research trend maps make it possible to easily grasp the main research trends in cloud computing and to explain the evolution of research topics. According to the analysis of citation relationships, research papers on security, distributed processing, and optical networking for cloud computing rank at the top based on the PageRank measure. From the analysis of keywords in research papers, cloud computing and grid computing showed high centrality in 2009, and keywords dealing with main elemental technologies such as data outsourcing, error detection methods, and infrastructure construction showed high centrality in 2010-2011. In 2012, security, virtualization, and resource management showed high centrality. Moreover, interest in the technical issues of cloud computing was found to increase gradually. From the annual cloud computing research trend maps, it was verified that security is located in the promising area, virtualization has moved from the promising area to the growth area, and grid computing and distributed systems have moved to the declining area. The study results indicate that distributed systems and grid computing received a lot of attention as similar computing paradigms in the early stage of cloud computing research. The early stage of cloud computing was a period focused on understanding and investigating cloud computing as an emergent technology, linked to relevant established computing concepts. After the early stage, security and virtualization technologies became the main issues in cloud computing, which is reflected in their movement from the promising area to the growth area in the cloud computing research trend maps. Moreover, this study revealed that current research in cloud computing has rapidly shifted from a focus on technical issues to a focus on application issues, such as SLAs (Service Level Agreements).
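
The quadrant rule of the research trend map lends itself to a tiny worked example. The sketch below classifies keywords by keyword frequency (X-axis) and the rate of increase in degree centrality (Y-axis) against two cut-off values; the keyword statistics and cut-offs are hypothetical, not the paper's measurements.

# Hedged sketch of the four-area classification of a research trend map.

def trend_area(freq: float, rate: float, freq_cut: float, rate_cut: float) -> str:
    """Return the research-trend-map area for one keyword."""
    if freq >= freq_cut:
        return "growth" if rate >= rate_cut else "maturation"
    return "promising" if rate >= rate_cut else "decline"

if __name__ == "__main__":
    keywords = {  # keyword: (frequency, increase rate of degree centrality) -- hypothetical
        "security": (6, 0.9),
        "virtualization": (30, 0.8),
        "grid computing": (8, 0.1),
    }
    for kw, (f, r) in keywords.items():
        print(kw, "->", trend_area(f, r, freq_cut=10, rate_cut=0.5))
    # security -> promising, virtualization -> growth, grid computing -> decline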

Preliminary Study on the MR Temperature Mapping using Center Array-Sequencing Phase Unwrapping Algorithm (Center Array-Sequencing 위상펼침 기법의 MR 온도영상 적용에 관한 기초연구)

  • Tan, Kee Chin; Kim, Tae-Hyung; Chun, Song-I; Han, Yong-Hee; Choi, Ki-Seung; Lee, Kwang-Sig; Jun, Jae-Ryang; Eun, Choong-Ki; Mun, Chi-Woong
    • Investigative Magnetic Resonance Imaging / v.12 no.2 / pp.131-141 / 2008
  • Purpose: To investigate the feasibility and accuracy of proton resonance frequency (PRF) shift based magnetic resonance (MR) temperature mapping utilizing a self-developed center array-sequencing phase unwrapping (PU) method for non-invasive temperature monitoring. Materials and Methods: A computer simulation of the PU algorithm was performed for performance evaluation before applying it to MR thermometry. The MR experiments were conducted in two parts, namely PU experiments and temperature mapping experiments based on the PU technique, with all image post-processing implemented in MATLAB. A 1.5 T MR scanner with a knee coil and a T2* GRE (Gradient Recalled Echo) pulse sequence was used throughout the experiments. Various subjects, such as a water phantom, an orange, and an agarose gel phantom, were used for the assessment of the self-developed PU algorithm. The MR temperature mapping experiment was initially attempted only on the agarose gel phantom, with a custom-made thermoregulating water pump as the heating source. Heat was delivered to the phantom via hot water circulation while the temperature variation was monitored with a T-type thermocouple. The PU program was applied to the reconstructed wrapped phase images prior to mapping the temperature distribution of the subjects. Since the temperature change is directly proportional to the phase difference, the absolute temperature could be estimated by adding the computed temperature difference to the measured ambient temperature of the subjects. Results: The PU technique successfully recovered and removed the phase wrapping artifacts in MR phase images of the various subjects, producing a smooth and continuous phase map and thus a more reliable temperature map. Conclusion: This work presented a rapid and robust self-developed center array-sequencing PU algorithm feasible for MR temperature mapping based on the PRF phase shift property.
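
The statement that the temperature change is directly proportional to the phase difference is the standard PRF-shift relation, delta_phi = 2*pi*gamma*alpha*B0*TE*delta_T. The sketch below inverts it to turn an unwrapped phase-difference map into an absolute temperature map; the constants and echo time are typical assumed values, not the paper's acquisition parameters.

# Hedged sketch of PRF-shift thermometry: phase difference -> temperature change.
import numpy as np

GAMMA = 42.576e6    # gyromagnetic ratio of 1H, Hz/T
ALPHA = -0.01e-6    # typical PRF thermal coefficient (-0.01 ppm/deg C), assumed

def temperature_map(delta_phi_rad: np.ndarray, b0_t: float = 1.5,
                    te_s: float = 0.015, ambient_c: float = 22.0) -> np.ndarray:
    """Convert an unwrapped phase-difference map (rad) to absolute temperature (deg C)."""
    delta_t = delta_phi_rad / (2 * np.pi * GAMMA * ALPHA * b0_t * te_s)
    return ambient_c + delta_t      # absolute T = ambient + computed difference

if __name__ == "__main__":
    dphi = np.array([[-0.4, -0.2], [0.0, -0.6]])   # hypothetical unwrapped phase map
    print(temperature_map(dphi))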

Role of p-38 MAP Kinase in apoptosis of hypoxia-induced osteoblasts (저산소 상태로 인한 조골세포 고사사기전에서 p-38 MAP kinase의 역할에 관한 연구)

  • Yoon, Jeong-Hyeon; Jeong, Ae-Jin; Kang, Kyung-Hwa; Kim, Sang-Cheol
    • The Korean Journal of Orthodontics / v.33 no.3 s.98 / pp.169-183 / 2003
  • Tooth movement by orthodontic force causes great tissue changes within the periodontium, especially by shifting the blood flow on the pressure side and resulting in a hypoxic state of low oxygen tension. The aim of this study is to elucidate the possible mechanism of apoptosis in response to hypoxia in MC3T3-E1 osteoblasts, the main cells in bone remodeling during orthodontic tooth movement. MC3T3-E1 osteoblasts under hypoxic conditions (2% oxygen) underwent apoptosis in a time-dependent manner, as assessed by DNA fragmentation assay and by nuclear morphology stained with the fluorescent dye Hoechst 33258. Pretreatment with Z-VAD-FMK, a pan-caspase inhibitor, or Z-DEVD-CHO, a specific caspase-3 inhibitor, completely suppressed the DNA ladder in response to hypoxia. An increase in caspase-3-like protease (DEVDase) activity was observed during apoptosis, but no caspase-1 activity (YVADase) was detected. To confirm which caspases are involved in apoptosis, Western blot analysis was performed using anti-caspase-3 or -6 antibodies. The 10-kDa protein corresponding to the active product of caspase-3 and the 10-kDa active protein of caspase-6 were generated in hypoxia-challenged cells, in which processing of the full-length forms of caspase-3 and -6 was evident. With a time course similar to that of caspase-3 and -6 activation, hypoxic stress caused the cleavage of lamin A, which is typical of caspase-6 activity. In addition, the stress elicited the release of cytochrome c into the cytosol during apoptosis. Furthermore, we observed that pretreatment with SB203580, a selective p38 mitogen-activated protein kinase inhibitor, attenuated the hypoxia-induced apoptosis. The addition of SB203580 suppressed hypoxia-induced caspase-3- and -6-like protease activity by up to 50%. In contrast, PD98059 had no effect on the hypoxia-induced apoptosis. To confirm the involvement of MAP kinases, JNK/SAPK, ERK, and p38 kinase assays were performed. Although p38 MAPK was activated in response to hypoxic treatment, the other MAPKs, JNK/SAPK and ERK, were either only modestly activated or not activated at all. These results suggest that p38 MAPK is involved in hypoxia-induced apoptosis in MC3T3-E1 osteoblasts.

Development of a gridded crop growth simulation system for the DSSAT model using script languages (스크립트 언어를 사용한 DSSAT 모델 기반 격자형 작물 생육 모의 시스템 개발)

  • Yoo, Byoung Hyun; Kim, Kwang Soo; Ban, Ho-Young
    • Korean Journal of Agricultural and Forest Meteorology / v.20 no.3 / pp.243-251 / 2018
  • The gridded simulation of crop growth, which would be useful for stakeholders and policy makers, often requires specialized computation tasks for the preparation of weather input data and the operation of a given crop model. Here we developed an automated system for crop growth simulation over a region using the DSSAT (Decision Support System for Agrotechnology Transfer) model. The system consists of modules implemented using R and shell script languages. One module creates weather input files in a plain text format for each grid cell. Another module, written in R, was developed for GIS data processing and parallel computing. The third module, which launches the crop model automatically, was implemented in shell script. As a case study, the automated system was used to determine the maximum soybean yield for a given set of management options in the state of Illinois in the US. The AgMERRA dataset, a reanalysis dataset for agricultural models, was used to prepare weather input files for 1981-2005. It took 7.38 hours to create 1,859 weather input files for one year of soybean growth simulation in Illinois using a single CPU core; in contrast, the processing time decreased considerably, to 35 minutes, when 16 CPU cores were used. The automated system produced a map, in raster data format, of the maturity group and planting date that resulted in the maximum yield. Our results indicate that the automated system for the DSSAT model would help spatial assessments of crop yield at a regional scale.
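
The paper's modules are written in R and shell script; purely as an illustration of the same pattern (one weather file per grid cell, model runs farmed out to a pool of workers), here is a language-neutral Python sketch. The executable name, command-line flags, and file layout are hypothetical, not the DSSAT interface.

# Hedged sketch of launching a crop model per grid cell on a worker pool.
import subprocess
from multiprocessing import Pool
from pathlib import Path

WORK_DIR = Path("runs")

def run_cell(cell_id: str) -> str:
    """Run the crop model for one grid cell in its own working directory."""
    cell_dir = WORK_DIR / cell_id
    cell_dir.mkdir(parents=True, exist_ok=True)
    # Hypothetical command line; a real DSSAT run needs its own input files here.
    subprocess.run(["dssat_model", "--weather", f"{cell_id}.WTH"],
                   cwd=cell_dir, check=False)
    return cell_id

if __name__ == "__main__":
    cells = [f"cell_{i:04d}" for i in range(1859)]   # 1,859 cells, as in the abstract
    with Pool(processes=16) as pool:                  # 16 CPU cores, as in the abstract
        for done in pool.imap_unordered(run_cell, cells):
            pass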