• Title/Summary/Keyword: 블록 기반 (block-based)

Search Results: 2,638

Graph Convolutional - Network Architecture Search : Network architecture search Using Graph Convolution Neural Networks (그래프 합성곱-신경망 구조 탐색 : 그래프 합성곱 신경망을 이용한 신경망 구조 탐색)

  • Su-Youn Choi;Jong-Youel Park
    • The Journal of the Convergence on Culture Technology
    • /
    • v.9 no.1
    • /
    • pp.649-654
    • /
    • 2023
  • This paper proposes the design of a neural architecture search model that uses graph convolutional neural networks. Because deep learning models learn as black boxes, it is difficult to verify whether a designed model has a structure with optimized performance. A neural architecture search model consists of a recurrent neural network that generates candidate models and the convolutional neural network that is generated. Conventional neural architecture search models use recurrent neural networks; in this paper, we propose GC-NAS, which instead uses graph convolutional neural networks to generate convolutional neural network models. The proposed GC-NAS uses a Layer Extraction Block to explore depth and a Hyper Parameter Prediction Block to explore spatial and temporal information (hyperparameters) in parallel, based on the depth information. Because depth information is reflected, the search space is wider, and because the search is conducted in parallel with the depth information, the purpose of each part of the search space is clear, so GC-NAS is judged to be theoretically superior in structure to existing models. GC-NAS is expected to solve the problems of the high-dimensional time axis and the limited spatial search range of the recurrent neural networks used in existing neural architecture search models, through its graph convolutional neural network block and graph generation algorithm. We also hope that the GC-NAS proposed in this paper will serve as an opportunity for active research on applying graph convolutional neural networks to neural architecture search.
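The abstract does not give the graph convolution operation itself; as a minimal sketch of the standard layer that such a model builds on (H' = ReLU(ÂHW) over an architecture graph), with all names, shapes, and values being illustrative assumptions rather than the paper's implementation:

```python
import numpy as np

def gcn_layer(adj, features, weights):
    """One graph convolution: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W).

    adj      : (n, n) adjacency matrix of the architecture graph
    features : (n, f_in) node features (e.g., candidate layer attributes)
    weights  : (f_in, f_out) learnable weight matrix
    """
    a_hat = adj + np.eye(adj.shape[0])             # add self-loops
    deg = a_hat.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))       # symmetric normalization
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt
    return np.maximum(a_norm @ features @ weights, 0.0)  # ReLU

# Toy architecture graph: four candidate layers chained sequentially (hypothetical).
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = np.random.rand(4, 8)            # per-node descriptors (hypothetical)
w = np.random.rand(8, 16)
print(gcn_layer(adj, feats, w).shape)   # (4, 16)
```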

Implementation of AI-based Object Recognition Model for Improving Driving Safety of Electric Mobility Aids (전동 이동 보조기기 주행 안전성 향상을 위한 AI기반 객체 인식 모델의 구현)

  • Je-Seung Woo;Sun-Gi Hong;Jun-Mo Park
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.23 no.3
    • /
    • pp.166-172
    • /
    • 2022
  • In this study, we photographed driving obstacles such as crosswalks, side spheres, manholes, braille blocks, partial ramps, temporary safety barriers, stairs, and inclined curbs that hinder or inconvenience the movement of the mobility-impaired using electric mobility aids. We developed an optimal AI model that classifies and automatically recognizes the photographed objects, and implemented an algorithm that can efficiently determine obstacles in front of an electric mobility aid. So that object detection could be learned reliably, objects were annotated in polygon form when building the dataset. The model was developed using Mask R-CNN in the Detectron2 framework, which can detect objects labeled as polygons. Images were acquired from two groups, the general public and the mobility-impaired, and image data were collected in two areas of the test bed. Regarding the training parameters, the model trained with IMAGES_PER_BATCH: 2, BASE_LEARNING_RATE: 0.001, and MAX_ITERATION: 10,000 showed the highest performance, at 68.532, so that users can quickly and accurately recognize driving risks and obstacles.
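As a hedged sketch of how such a Detectron2 training run is typically configured: the dataset name, file paths, backbone choice, and class count below are assumptions, and the abstract's IMAGES_PER_BATCH, BASE_LEARNING_RATE, and MAX_ITERATION are mapped to Detectron2's IMS_PER_BATCH, BASE_LR, and MAX_ITER config keys.

```python
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.data.datasets import register_coco_instances
from detectron2.engine import DefaultTrainer

# Hypothetical dataset of polygon-labeled driving obstacles in COCO format.
register_coco_instances("mobility_train", {}, "annotations/train.json", "images/train")

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
cfg.DATASETS.TRAIN = ("mobility_train",)
cfg.DATASETS.TEST = ()
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 8     # eight obstacle classes listed in the abstract
cfg.SOLVER.IMS_PER_BATCH = 2            # IMAGES_PER_BATCH: 2
cfg.SOLVER.BASE_LR = 0.001              # BASE_LEARNING_RATE: 0.001
cfg.SOLVER.MAX_ITER = 10000             # MAX_ITERATION: 10,000

trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()
```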

A Study on Jointed Rock Mass Properties and Analysis Model of Numerical Simulation on Collapsed Slope (붕괴절토사면의 수치해석시 암반물성치 및 해석모델에 대한 고찰)

  • Koo, Ho-Bon;Kim, Seung-Hee;Kim, Seung-Hyun;Lee, Jung-Yeup
    • Journal of the Korean Geotechnical Society
    • /
    • v.24 no.5
    • /
    • pp.65-78
    • /
    • 2008
  • In the case of cut slopes or shallow tunnels, sliding along discontinuities or rotation can play a critical role in judging stability. Although numerical analysis is widely used to check the stability of such cut slopes and shallow tunnels in the early design process, common analysis programs are based on continuum models. Continuum analysis can account for discontinuities either by reducing the overall strength of the jointed rock mass or by applying a ubiquitous joint model to the Mohr-Coulomb failure criterion. In the numerical analysis of a cut slope, the main geotechnical properties such as cohesion, friction angle, and elastic modulus can be evaluated by empirical equations. This study compared the two main classification systems, RMR and GSI, by applying them to hazardous in-situ cut slopes. In addition, the ubiquitous joint model was applied to the simulation model with inputs derived from the RMR and GSI systems, and the results were compared with displacements obtained by in-situ monitoring. In summary, numerical analysis combining GSI-derived inputs with the ubiquitous joint model provided the most reliable results, closely matching the measured displacements and their patterns.
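For context, one commonly cited empirical relation of the kind mentioned here is the Serafim–Pereira estimate of rock mass deformation modulus from RMR; the abstract does not state which equations the authors actually used, so the snippet below is only an illustrative assumption.

```python
def deformation_modulus_gpa(rmr: float) -> float:
    """Serafim & Pereira (1983) empirical estimate: E_m = 10^((RMR - 10) / 40) GPa."""
    return 10.0 ** ((rmr - 10.0) / 40.0)

for rmr in (30, 45, 60):
    print(f"RMR {rmr}: E_m ~ {deformation_modulus_gpa(rmr):.1f} GPa")
```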

Real-Time Terrain Visualization with Hierarchical Structure (실시간 시각화를 위한 계층 구조 구축 기법 개발)

  • Park, Chan Su;Suh, Yong Cheol
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.29 no.2D
    • /
    • pp.311-318
    • /
    • 2009
  • Interactive terrain visualization is an important research area with applications in GIS, games, virtual reality, scientific visualization, and flight simulators, as well as military uses. It is a complex and challenging problem, since some applications require precise visualization of huge data sets at real-time rates. In general, the size of the data sets makes real-time rendering difficult because the terrain data cannot fit entirely in memory. In this paper, we suggest an effective real-time LOD (level-of-detail) algorithm for displaying huge terrain data and processing mass geometry. We used a hierarchical structure with 4×4 and 2×2 tiles for real-time rendering of large-volume DEMs acquired from digital maps, LiDAR, DTM, and DSM. Moreover, texture mapping is performed for realistic visualization: gigabyte-scale normalized height data are organized, together with user-oriented terrain information, into a file-based hierarchical tile structure, and a hill-shade map is created from the height data. The large volume of terrain data was transformed into LOD data for real-time visualization. This paper presents a new LOD algorithm that provides seamless visualization and high quality while minimizing data loss and maximizing frame rate.
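The abstract does not give the tile-selection rule; as a minimal sketch of the general idea behind distance-based LOD selection in a quadtree tile hierarchy (the data layout, error metric, and threshold are assumptions, not the paper's algorithm):

```python
import math

def select_tiles(tile, camera, error_threshold=2.0, out=None):
    """Recursively pick quadtree tiles whose projected error is acceptable.

    tile   : dict with 'center' (x, y), 'geometric_error', 'children' (list of tiles)
    camera : (x, y) viewer position
    """
    if out is None:
        out = []
    dist = math.dist(tile["center"], camera) + 1e-6
    screen_error = tile["geometric_error"] / dist    # crude projected-error proxy
    if screen_error <= error_threshold or not tile["children"]:
        out.append(tile)                             # coarse tile is good enough: render it
    else:
        for child in tile["children"]:               # otherwise refine into finer tiles
            select_tiles(child, camera, error_threshold, out)
    return out

# Usage sketch: root of a tile hierarchy with one level of children.
root = {"center": (0.0, 0.0), "geometric_error": 100.0, "children": [
    {"center": (-25.0, -25.0), "geometric_error": 25.0, "children": []},
    {"center": (25.0, 25.0), "geometric_error": 25.0, "children": []},
]}
print(len(select_tiles(root, camera=(30.0, 30.0))))
```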

Assessment of Applicability of CNN Algorithm for Interpretation of Thermal Images Acquired in Superficial Defect Inspection Zones (포장층 이상구간에서 획득한 열화상 이미지 해석을 위한 CNN 알고리즘의 적용성 평가)

  • Jang, Byeong-Su;Kim, YoungSeok;Kim, Sewon;Choi, Hyun-Jun;Yoon, Hyung-Koo
    • Journal of the Korean Geotechnical Society
    • /
    • v.39 no.10
    • /
    • pp.41-48
    • /
    • 2023
  • The presence of abnormalities in the subgrade of roads poses safety risks to users and results in significant maintenance costs. In this study, we aimed to experimentally evaluate the temperature distributions in abnormal areas of subgrade materials using infrared cameras and analyze the data with machine learning techniques. The experimental site was configured as a cubic shape measuring 50 cm in width, length, and depth, with abnormal areas designated for water and air. Concrete blocks covered the upper part of the site to simulate the pavement layer. Temperature distribution was monitored over 23 h, from 4 PM to 3 PM the following day, resulting in image data and numerical temperature values extracted from the middle of the abnormal area. The temperature difference between the maximum and minimum values measured 34.8℃ for water, 34.2℃ for air, and 28.6℃ for the original subgrade. To classify conditions in the measured images, we employed the image analysis method of a convolutional neural network (CNN), utilizing ResNet-101 and SqueezeNet networks. The classification accuracies of ResNet-101 for water, air, and the original subgrade were 70%, 50%, and 80%, respectively. SqueezeNet achieved classification accuracies of 60% for water, 30% for air, and 70% for the original subgrade. This study highlights the effectiveness of CNN algorithms in analyzing subgrade properties and predicting subsurface conditions.
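A hedged sketch of how a pretrained ResNet-101 might be adapted to the three classes used here (water, air, original subgrade); the data pipeline, hyperparameters, and training loop are assumptions, not the authors' code.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained ResNet-101 (torchvision >= 0.13 string weights API)
# and replace the classifier head with a 3-class head: water / air / intact subgrade.
model = models.resnet101(weights="DEFAULT")
model.fc = nn.Linear(model.fc.in_features, 3)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images, labels):
    """One optimization step on a batch of thermal images (N, 3, 224, 224)."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```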

A Strategy of a Gap Block Design in the CFRP Double Roller to Minimize Defects during the Product Conveyance (제품 이송 시 결함 최소화를 위한 CFRP 이중 롤러의 Gap block 설계 전략)

  • Seung-Ji Yang;Young-june Park;Sung-Eun Kim;Jun-Geol Ahn;Hyun-Ik Yang
    • Composites Research
    • /
    • v.37 no.1
    • /
    • pp.7-14
    • /
    • 2024
  • Owing to its structural characteristics, a double roller can exhibit various deformation behaviors depending on the gap block design, even when the dimensions and loading conditions of the roller are the same. Based on this feature, we propose a strategy for designing the gap block of a carbon-fiber-reinforced plastic (CFRP) double roller to minimize defects (e.g., sagging and wrinkling) that can arise during the product conveying process, while pursuing a lightweight design. In the suggested strategy, analysis cases are first selected by considering the main design parameters and engineering tolerances of the gap block, and the deformation behaviors of these selected cases are then extracted using the finite element method (FEM). To obtain the optimal gap block parameters that satisfy the purpose of this study, deformation deviations in the contact area are calculated and compared using the extracted deformation behaviors. Note that the contact area in this work lies between the product and the roller. As a result, the gap block design method proposed in this work makes it possible to construct a CFRP double roller that significantly reduces defects without changing the overall dimensions of the roller. A detailed method is presented herein, and the results are evaluated numerically.
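The selection step described here amounts to evaluating candidate gap-block parameter sets and keeping the one with the smallest deformation deviation over the contact area. The sketch below only outlines that comparison; run_fem_case() is a hypothetical stand-in for the actual FEM solver, and the parameter values are illustrative.

```python
import itertools
import statistics

def run_fem_case(gap_width, gap_spacing):
    """Stand-in for the FEM solver: returns deflections (mm) at contact-area nodes.

    Replace with the actual FEM call; the numbers here are purely illustrative.
    """
    return [0.01 * gap_spacing / gap_width * (1 + 0.05 * i) for i in range(10)]

# Candidate design parameters within engineering tolerances (hypothetical values).
gap_widths = [4.0, 5.0, 6.0]
gap_spacings = [40.0, 50.0, 60.0]

best_case, best_deviation = None, float("inf")
for width, spacing in itertools.product(gap_widths, gap_spacings):
    deflections = run_fem_case(width, spacing)
    deviation = statistics.pstdev(deflections)   # spread of deformation over the contact area
    if deviation < best_deviation:
        best_case, best_deviation = (width, spacing), deviation

print("gap block with most uniform contact deformation:", best_case)
```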

Design and Implementation of MongoDB-based Unstructured Log Processing System over Cloud Computing Environment (클라우드 환경에서 MongoDB 기반의 비정형 로그 처리 시스템 설계 및 구현)

  • Kim, Myoungjin;Han, Seungho;Cui, Yun;Lee, Hanku
    • Journal of Internet Computing and Services
    • /
    • v.14 no.6
    • /
    • pp.71-84
    • /
    • 2013
  • Log data, which record the multitude of information created when operating computer systems, are utilized in many processes, from carrying out computer system inspection and process optimization to providing customized user optimization. In this paper, we propose a MongoDB-based unstructured log processing system in a cloud environment for processing the massive amounts of log data generated by banks. Most of the log data generated during banking operations come from handling clients' business. Therefore, in order to gather, store, categorize, and analyze the log data generated while processing a client's business, a separate log data processing system needs to be established. However, in existing computing environments it is difficult to realize flexible storage expansion for processing massive amounts of unstructured log data and to execute the considerable number of functions needed to categorize and analyze the stored unstructured log data. Thus, in this study, we use cloud computing technology to realize a cloud-based log data processing system for processing unstructured log data that are difficult to handle with the analysis tools and management systems of existing computing infrastructure. The proposed system uses an IaaS (Infrastructure as a Service) cloud environment to provide flexible expansion of computing resources, including the ability to flexibly expand resources such as storage space and memory under conditions such as extended storage or a rapid increase in log data. Moreover, to overcome the processing limits of existing analysis tools when real-time analysis of the aggregated unstructured log data is required, the proposed system includes a Hadoop-based analysis module for quick and reliable parallel-distributed processing of the massive amount of log data. Furthermore, because HDFS (Hadoop Distributed File System) stores data by generating copies of the block units of the aggregated log data, the proposed system offers automatic restore functions so that it can continue operating after recovering from a malfunction. Finally, by establishing a distributed database using the NoSQL-based MongoDB, the proposed system provides methods for effectively processing unstructured log data. Relational databases such as MySQL have complex schemas that are inappropriate for processing unstructured log data. Further, strict schemas like those of relational databases make it difficult to expand to additional nodes when the stored data must be distributed across nodes as the amount of data rapidly increases. NoSQL does not provide the complex computations that relational databases may provide, but it can easily expand the database through node dispersion when the amount of data increases rapidly; it is a non-relational database with a structure appropriate for processing unstructured data. NoSQL data models are usually classified as key-value, column-oriented, and document-oriented types. Of these, the representative document-oriented data model, MongoDB, which has a free schema structure, is used in the proposed system. MongoDB is adopted because it makes it easy to process unstructured log data through a flexible schema structure, facilitates flexible node expansion when the amount of data is rapidly increasing, and provides an Auto-Sharding function that automatically expands storage.
The proposed system is composed of a log collector module, a log graph generator module, a MongoDB module, a Hadoop-based analysis module, and a MySQL module. When the log data generated over the entire client business process of each bank are sent to the cloud server, the log collector module collects and classifies the data according to the type of log data and distributes them to the MongoDB module and the MySQL module. The log graph generator module generates the results of the log analysis of the MongoDB module, the Hadoop-based analysis module, and the MySQL module per analysis time and type of the aggregated log data, and provides them to the user through a web interface. Log data that require real-time analysis are stored in the MySQL module and provided in real time by the log graph generator module. The log data aggregated per unit time are stored in the MongoDB module and plotted as graphs according to the user's various analysis conditions. The aggregated log data in the MongoDB module are processed in a parallel-distributed manner by the Hadoop-based analysis module. A comparative evaluation against a log data processing system that uses only MySQL, measuring log insertion and query performance, demonstrates the superiority of the proposed system. Moreover, an optimal chunk size is confirmed through a MongoDB log insertion performance evaluation for various chunk sizes.
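A minimal sketch of the log collector's routing step using pymongo: real-time log records go to MySQL while aggregated records are bulk-inserted into MongoDB. The connection string, collection names, log categories, and routing rule are assumptions for illustration, not the paper's implementation.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")    # IaaS-hosted MongoDB (assumed address)
mongo_logs = client["bank_logs"]["aggregated"]

REALTIME_TYPES = {"transaction_error", "login_failure"}  # hypothetical real-time categories

def route_logs(records, mysql_insert):
    """Split incoming log records by type, as the log collector module does."""
    realtime, batch = [], []
    for rec in records:
        (realtime if rec["type"] in REALTIME_TYPES else batch).append(rec)
    if realtime:
        mysql_insert(realtime)          # real-time path: MySQL, read by the graph generator
    if batch:
        mongo_logs.insert_many(batch)   # batch path: MongoDB, later analyzed via Hadoop
```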

Tectonic Structures and Hydrocarbon Potential in the Central Bransfield Basin, Antarctica (남극 브랜스필드 해협 중앙분지의 지체구조 및 석유부존 가능성)

  • Huh Sik;Kim Yeadong;Cheong Dae-Kyo;Jin Young Keun;Nam Sang Heon
    • The Korean Journal of Petroleum Geology
    • /
    • v.5 no.1_2 s.6
    • /
    • pp.9-15
    • /
    • 1997
  • The study area is located in the Central Bransfield Basin, Antarctica. To analyze the seafloor morphology, basement structure, and seismic stratigraphy of the sedimentary layers, we acquired, processed, and interpreted multi-channel seismic data. The northwest-southeast back-arc extension dramatically changes the seafloor morphology, the distribution of volcanoes and faults, and the basin structure along the spreading ridges. The northern continental shelf shows a narrow, steep topography. In contrast, the continental shelf and slope in the south, which are connected to the Antarctic Peninsula, have a gentle gradient. Volcanic activity resulted in the formation of large volcanoes and basement highs near the spreading center, and small-scale volcanic diapirs on the shelf. A very long, continuous normal fault characterizes the northern shelf, whereas several basinward synthetic faults probably detach into the master fault in the south. Four transfer faults, deep structures parallel to the northwest-southeast direction, controlled the complex distribution of volcanoes, normal faults, depocenters, and possibly hydrocarbon provinces in the study area. They have also deformed the basement structure and depositional pattern. Even though the Bransfield Basin is believed to have formed in the Late Cenozoic (about 4 Ma), the hydrocarbon potential may be very high owing to thick sediment accumulation, high organic content, high heat flow resulting from the active tectonics, and adequate traps.

A Complexity Reduction Method of MPEG-4 Audio Lossless Coding Encoder by Using the Joint Coding Based on Cross Correlation of Residual (여기신호의 상관관계 기반 joint coding을 이용한 MPEG-4 audio lossless coding 인코더 복잡도 감소 방법)

  • Cho, Choong-Sang;Kim, Je-Woo;Choi, Byeong-Ho
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.47 no.3
    • /
    • pp.87-95
    • /
    • 2010
  • Portable multimedia products that can deliver the highest audio quality using lossless audio codecs have been released, and the international lossless codecs MPEG-4 Audio Lossless Coding (ALS) and MPEG-4 Scalable Lossless Coding (SLS) were standardized by MPEG in 2006. The simple profile of MPEG-4 ALS, which supports up to stereo, was defined by MPEG in 2009. A lossless audio codec should have low complexity in stereo coding to be widely used in portable multimedia products, but previous research on MPEG-4 ALS has focused on improving the compression ratio, reducing complexity in multi-channel coding, and selecting the linear prediction coefficient (LPC) order. In this paper, the complexity and compression ratio of the MPEG-4 ALS encoder are analyzed for the simple profile, and a method to reduce the complexity of the MPEG-4 ALS encoder is proposed. Based on this analysis, the complexity of the encoder's short-term prediction filter is reduced by using the low-complexity filter proposed in previous research to reduce the complexity of the MPEG-4 ALS decoder. We also propose a joint coding decision method that reduces the complexity while maintaining the compression ratio of the MPEG-4 ALS encoder. In the proposed method, whether to perform joint coding is decided based on the relation between the cross-correlation of the residual signals and the compression ratio of joint coding. The performance of the MPEG-4 ALS encoder with the proposed method and the low-complexity filter is evaluated using the MPEG-4 ALS conformance test files and typical music files. The encoder's complexity is reduced by about 24% compared with the MPEG-4 ALS reference encoder, while the compression ratio remains comparable to that of the reference encoder.
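A hedged sketch of the kind of decision rule described: compute the normalized cross-correlation between the left and right channel residuals and enable joint (difference) coding only when they are strongly correlated. The threshold and function names are assumptions; the paper derives its own criterion from the relation between cross-correlation and compression ratio.

```python
import numpy as np

def decide_joint_coding(res_left, res_right, threshold=0.6):
    """Return True if the channel residuals are correlated enough to justify joint coding."""
    res_left = res_left - res_left.mean()
    res_right = res_right - res_right.mean()
    denom = np.linalg.norm(res_left) * np.linalg.norm(res_right)
    if denom == 0.0:
        return False
    corr = float(np.dot(res_left, res_right)) / denom   # normalized cross-correlation at lag 0
    return abs(corr) >= threshold

def encode_stereo_block(res_left, res_right):
    """Encode either (L, R) or (L, R - L) depending on the joint-coding decision."""
    if decide_joint_coding(res_left, res_right):
        return res_left, res_right - res_left            # difference (joint) coding
    return res_left, res_right                           # independent coding
```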

Crime Incident Prediction Model based on Bayesian Probability (베이지안 확률 기반 범죄위험지역 예측 모델 개발)

  • HEO, Sun-Young;KIM, Ju-Young;MOON, Tae-Heon
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.20 no.4
    • /
    • pp.89-101
    • /
    • 2017
  • Crime occurs differently depending not only on place locations and building uses but also on the characteristics of the people who use the place and the spatial structures of the buildings and locations. Therefore, if spatial big data, which contain spatial and regional properties, can be utilized, proper crime prevention measures can be enacted. Recently, with the advent of big data and the intelligent information revolution, predictive policing has emerged as a new paradigm for police activities. Based on 7,420 actual crime incidents occurring over three years in a typical provincial city, "J city," this study identified the areas in which crimes occurred and predicted risky areas. Spatial regression analysis was performed using spatial big data covering only physical and environmental variables. Based on the results, this study established a Crime Incident Prediction Model (CIPM) grounded in Bayesian probability theory, using street width, average number of building floors, building coverage ratio, and the type of use of the first floor (Type II neighborhood living facility, commercial facility, entertainment use, or residential use). The model was found to be suitable for crime prediction: overlap analysis with the actual crime areas and the receiver operating characteristic (ROC) curve, which evaluated the accuracy of the model, showed an area under the curve (AUC) value of 0.8. It was also found that blocks where commercial and entertainment facilities are concentrated, blocks with a high average number of building floors, and blocks where commercial, entertainment, and residential facilities are mixed are high-risk areas. This study provides a meaningful step toward the development of a crime prediction model, unlike previous studies that explored only the spatial distribution of crime and the factors influencing crime occurrence.
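A minimal sketch of the Bayesian updating idea behind such a model: combine a prior crime rate with per-attribute likelihood ratios estimated from past incidents, under a naive independence assumption. The attribute values and numbers below are hypothetical illustrations, not the study's data or its exact model.

```python
def crime_posterior(prior, likelihood_ratios):
    """Bayes' rule in odds form: posterior odds = prior odds * product of likelihood ratios.

    prior             : P(crime) for an average block
    likelihood_ratios : P(attribute | crime) / P(attribute | no crime), one per block attribute
    """
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical block: concentrated commercial/entertainment use, many floors, mixed uses.
prior_rate = 0.05
ratios = [1.8,   # commercial/entertainment concentration
          1.4,   # high average number of building floors
          1.6]   # mixed commercial-entertainment-residential first-floor use
print(f"estimated crime risk: {crime_posterior(prior_rate, ratios):.2f}")
```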