• Title/Summary/Keyword: Feature Map Based (특징 맵 기반)


Analysis of Applicability of RPC Correction Using Deep Learning-Based Edge Information Algorithm (딥러닝 기반 윤곽정보 추출자를 활용한 RPC 보정 기술 적용성 분석)

  • Jaewon Hur;Changhui Lee;Doochun Seo;Jaehong Oh;Changno Lee;Youkyung Han
    • Korean Journal of Remote Sensing / v.40 no.4 / pp.387-396 / 2024
  • Most very high-resolution (VHR) satellite images provide rational polynomial coefficients (RPC) data to facilitate the transformation between ground coordinates and image coordinates. However, the initial RPC often contains geometric errors, necessitating correction through matching with ground control points (GCPs). A GCP chip is a small image patch extracted from an orthorectified image together with the height of its center point, which can be used directly for geometric correction. Many studies have focused on area-based matching methods to accurately align GCP chips with VHR satellite images. In cases with seasonal differences or changed areas, edge-based algorithms are often used for matching because relying solely on pixel values is difficult. However, traditional edge extraction algorithms, such as Canny edge detectors, require threshold settings tailored to the spectral characteristics of each satellite image. Therefore, this study utilizes deep learning-based edge information that is insensitive to the regional characteristics of satellite images for matching. Specifically, we use a pretrained pixel difference network (PiDiNet) to generate edge maps for both satellite images and GCP chips. These edge maps are then used as input for normalized cross-correlation (NCC) and relative edge cross-correlation (RECC) to identify the peak points with the highest correlation between the two edge maps. To remove mismatched pairs and thus obtain the bias-compensated RPC, we iteratively apply data snooping. Finally, we compare the results qualitatively and quantitatively with those obtained from traditional NCC and RECC methods. The PiDiNet-based approach achieved high matching accuracy, with root mean square error (RMSE) values ranging from 0.3 to 0.9 pixels. However, the PiDiNet-generated edges were thicker than those from the Canny method, leading to slightly lower registration accuracy in some images. Nevertheless, PiDiNet consistently produced characteristic edge information, allowing successful matching even in challenging regions. This study demonstrates that improving the robustness of edge-based registration methods can facilitate effective registration across diverse regions.
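As a hedged illustration of the matching step described above, the following Python sketch computes a plain normalized cross-correlation peak between two edge maps represented as NumPy arrays. It stands in for the NCC/RECC matching of PiDiNet edge maps; the function name, the toy edge maps, and the brute-force search are illustrative and are not taken from the paper.

```python
import numpy as np

def ncc_peak(image_edges, chip_edges):
    """Slide the GCP-chip edge map over the satellite edge map and return
    the (row, col) offset with the highest normalized cross-correlation."""
    H, W = image_edges.shape
    h, w = chip_edges.shape
    chip = chip_edges - chip_edges.mean()
    chip_norm = np.sqrt((chip ** 2).sum())

    best_score, best_offset = -2.0, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            win = image_edges[r:r + h, c:c + w]
            win = win - win.mean()
            denom = np.sqrt((win ** 2).sum()) * chip_norm + 1e-12
            score = float((win * chip).sum() / denom)
            if score > best_score:
                best_score, best_offset = score, (r, c)
    return best_offset, best_score

# Toy usage: an L-shaped "edge" pattern stands in for PiDiNet edge responses.
sat = np.zeros((64, 64))
sat[20:28, 30] = 1.0          # vertical edge segment
sat[27, 30:38] = 1.0          # horizontal edge segment
chip = sat[20:28, 30:38].copy()
print(ncc_peak(sat, chip))    # expected peak at offset (20, 30)
```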

A Study on Constructing Bottom-up Model for Electric Sector (전력부문 온실가스 감축정책 평가를 위한 상향식 모형화 방안)

  • Kim, Hugon;Paik, Chunhyun;Chung, Yongjoo;Ahn, Younghwan
    • Journal of Energy Engineering / v.25 no.3 / pp.114-129 / 2016
  • Since the mid-term domestic GHG reduction target for 2020 was announced in 2009, various GHG reduction policies have been proposed to cut emissions by about 30% compared to the business-as-usual (BAU) scenario. There are two types of modeling approaches for identifying the options required to meet greenhouse gas (GHG) abatement targets and assessing their economic impacts: top-down and bottom-up models. Examples of bottom-up optimization models include MARKAL, MESSAGE, LEAP, and AIM, all of which are based on linear programming (LP) and differ mainly in user interface and database utilization. A bottom-up model for the electric sector must represent demand management, the renewable energy mix, fuel conversion, and similar measures, so evaluating the various policies becomes quite complex. In this paper, we propose a bottom-up BAU model for the electric sector and show how to build it through step-by-step procedures that incorporate load regions, hydro dams, and pumped storage.
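To make the linear programming flavor of such bottom-up models concrete, here is a minimal, hypothetical dispatch sketch in Python using scipy.optimize.linprog; the technologies, costs, capacities, and the emission cap are all toy numbers and do not reflect the paper's model.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical technologies: coal, LNG, renewables (annual generation in GWh).
cost = np.array([40.0, 70.0, 55.0])         # toy variable cost per GWh
capacity = np.array([500.0, 400.0, 200.0])  # toy annual generation limits
emission = np.array([0.9, 0.4, 0.0])        # toy ktCO2 emitted per GWh
demand = 800.0                              # toy annual demand to be met
emission_cap = 450.0                        # toy GHG cap standing in for a reduction policy

res = linprog(
    c=cost,                                             # minimize total generation cost
    A_ub=emission.reshape(1, -1), b_ub=[emission_cap],  # GHG constraint
    A_eq=np.ones((1, 3)), b_eq=[demand],                # generation must meet demand
    bounds=[(0.0, cap) for cap in capacity],
    method="highs",
)
print("generation mix (GWh):", res.x, "total cost:", res.fun)
```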

Vector Data Hashing Using Line Curve Curvature (라인 곡선 곡률 기반의 벡터 데이터 해싱)

  • Lee, Suk-Hwan;Kwon, Ki-Ryong
    • The Journal of Korean Institute of Communications and Information Sciences / v.36 no.2C / pp.65-77 / 2011
  • With the rapid expansion of application fields for vector data models, such as CAD design drawings and GIS digital maps, security techniques for vector data models have become an important issue. This paper presents a vector data hashing method for the authentication and copy protection of vector data models. The proposed method groups the polylines in the main layers of a vector data model and generates group coefficients from the first- and second-type line curve curvatures of all polylines. We then calculate feature coefficients by projecting the group coefficients onto a random pattern and finally generate the binary hash by binarizing the feature coefficients. Experimental results on a number of CAD drawings and GIS digital maps verify that the proposed hashing is robust against various attacks and that the random key provides uniqueness and security.
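The following sketch illustrates the general pipeline described in the abstract (per-group curvature coefficients, projection onto a key-seeded random pattern, binarization) in Python with NumPy. The discrete curvature estimate and all parameter choices are assumptions for illustration and are not the paper's exact definitions.

```python
import numpy as np

def polyline_curvatures(points):
    """Discrete curvature at interior vertices: turning angle divided by
    the local arc length (an illustrative estimate, not the paper's)."""
    p = np.asarray(points, dtype=float)
    v1, v2 = p[1:-1] - p[:-2], p[2:] - p[1:-1]
    cross = v1[:, 0] * v2[:, 1] - v1[:, 1] * v2[:, 0]
    dot = (v1 * v2).sum(axis=1)
    ang = np.arctan2(cross, dot)
    arc = 0.5 * (np.linalg.norm(v1, axis=1) + np.linalg.norm(v2, axis=1))
    return ang / (arc + 1e-12)

def vector_hash(polyline_groups, key, n_bits=64):
    """Group coefficient = mean curvature per group; project onto a
    key-seeded random pattern and binarize against the median."""
    coeffs = np.array([np.mean([polyline_curvatures(pl).mean() for pl in group])
                       for group in polyline_groups])
    rng = np.random.default_rng(key)
    pattern = rng.standard_normal((n_bits, coeffs.size))
    features = pattern @ coeffs
    return (features > np.median(features)).astype(np.uint8)

# Toy usage with two hypothetical layers (groups) of polylines.
g1 = [[(0, 0), (1, 0.2), (2, 0.1), (3, 0.5)], [(0, 1), (1, 1.4), (2, 1.1)]]
g2 = [[(0, 0), (0.5, 1), (1, 0), (1.5, 1)]]
print(vector_hash([g1, g2], key=1234))
```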

Light Field Angular Super-Resolution Algorithm Using Dilated Convolutional Neural Network with Residual Network (잔차 신경망과 팽창 합성곱 신경망을 이용한 라이트 필드 각 초해상도 기법)

  • Kim, Dong-Myung;Suh, Jae-Won
    • Journal of the Korea Institute of Information and Communication Engineering / v.24 no.12 / pp.1604-1611 / 2020
  • Light field images captured by a microlens-array-based camera have many practical limitations because of their low spatial and angular resolution. High-spatial-resolution images can be obtained fairly easily with single-image super-resolution techniques, which have been studied extensively in recent years. However, high-angular-resolution images are distorted while exploiting the disparity information inherent among views, so it is difficult to obtain a high-quality angular super-resolution result. In this paper, we propose a light field angular super-resolution method that extracts an initial feature map with a dilated convolutional neural network, to effectively capture the view-difference information inherent among images, and generates the target view with a residual neural network. The proposed network showed superior performance in PSNR and subjective image quality compared to existing angular super-resolution networks.
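As a rough sketch of the two building blocks named in the abstract, the following PyTorch code combines dilated convolutions for the initial feature map with a stack of residual blocks that synthesize one target view. The layer counts, channel widths, and the four-view input are hypothetical; the paper's actual architecture may differ.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1),
        )
    def forward(self, x):
        return x + self.body(x)   # skip connection

class AngularSRNet(nn.Module):
    """Dilated convolutions widen the receptive field to capture inter-view
    disparity; residual blocks reconstruct the synthesized view."""
    def __init__(self, in_views=4, ch=32, n_res=4):
        super().__init__()
        self.feat = nn.Sequential(
            nn.Conv2d(in_views, ch, 3, padding=2, dilation=2), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=4, dilation=4), nn.ReLU(inplace=True),
        )
        self.res = nn.Sequential(*[ResidualBlock(ch) for _ in range(n_res)])
        self.out = nn.Conv2d(ch, 1, 3, padding=1)   # one synthesized view

    def forward(self, views):                        # views: (B, in_views, H, W)
        return self.out(self.res(self.feat(views)))

# Toy usage: four corner sub-aperture views -> one novel intermediate view.
net = AngularSRNet()
print(net(torch.randn(1, 4, 64, 64)).shape)          # torch.Size([1, 1, 64, 64])
```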

Improvement of Disparity Map using Loopy Belief Propagation based on Color and Edge (Disparity 보정을 위한 컬러와 윤곽선 기반 루피 신뢰도 전파 기법)

  • Kim, Eun Kyeong;Cho, Hyunhak;Lee, Hansoo;Wibowo, Suryo Adhi;Kim, Sungshin
    • Journal of the Korean Institute of Intelligent Systems / v.25 no.5 / pp.502-508 / 2015
  • Stereo images have the advantage of providing depth (distance) values, which cannot be obtained from a single 2-D image. However, depth information derived from stereo images can be inaccurate: it must be obtained through a computational matching process, and mismatches occur in occluded regions during stereo matching, which affects the accuracy of the estimated depth. In addition, global stereo matching methods require a large amount of computation. Therefore, this paper proposes a method for obtaining a disparity map that reduces computation time and achieves higher accuracy than established methods. Edge extraction, a feature-based segmentation, is used to improve accuracy and reduce computation time. The color K-means method, a color-based segmentation, estimates the correlation among objects in the image and extracts the region of interest to which loopy belief propagation (LBP) is applied. In this way, the disparity map is compensated by considering the correlation of objects in the image, and computation time is reduced because the calculation is performed only on the region of interest rather than on all pixels. As a result, the disparity map is more accurate and the proposed method reduces computation time.
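The preprocessing side of the abstract (edge extraction plus color K-means to restrict loopy belief propagation to a region of interest) can be sketched with OpenCV as follows; the cluster count, Canny thresholds, and the simple ROI rule are illustrative assumptions, and the LBP refinement itself is not shown.

```python
import cv2
import numpy as np

def roi_from_color_and_edges(bgr, k=4, target_cluster=0):
    """Segment the image by color with K-means, combine the chosen cluster
    with Canny edges, and return a binary region of interest on which the
    loopy belief propagation step could then be run."""
    pixels = bgr.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, _ = cv2.kmeans(pixels, k, None, criteria, 5,
                              cv2.KMEANS_RANDOM_CENTERS)
    color_mask = (labels.reshape(bgr.shape[:2]) == target_cluster).astype(np.uint8)

    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    edges = (cv2.Canny(gray, 50, 150) > 0).astype(np.uint8)

    # ROI = pixels of the selected color cluster plus a band around edges,
    # so the costly message passing is restricted to these pixels only.
    roi = cv2.dilate(color_mask | edges, np.ones((5, 5), np.uint8))
    return roi.astype(bool)

# Toy usage on a synthetic image; in practice 'bgr' would be the left view.
img = np.zeros((60, 80, 3), np.uint8)
img[20:40, 20:60] = (0, 0, 255)
print(roi_from_color_and_edges(img, k=2).sum(), "pixels selected for LBP")
```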

Update Protocols for Web-Based GIS Applications (웹 기반 GIS 응용을 위한 변경 프로토콜)

  • An, Seong-U;Seo, Yeong-Deok;Kim, Jin-Deok;Hong, Bong-Hui
    • Journal of KIISE:Databases / v.29 no.4 / pp.321-333 / 2002
  • As web-based services become more and more popular, concurrent updates of spatial data should be possible in web-based environments in order to support the various services. Web-based GIS applications are characterized by serving large quantities of data, and these data should be continuously updated according to various users' requirements. In such a large data-serving system, it is inefficient for the server to perform all of the spatial data updates requested by clients. Moreover, the HTTP protocol used in the web environment is built on the assumptions of being 'connectionless' and 'stateless'. Many problems can occur if a transaction processing scheme designed for the LAN environment is applied directly to the web environment. Especially for long transactions that update spatial data, it is very difficult to control concurrency among clients and to keep the server data consistent. This paper proposes a solution for keeping consistency while spatial data are updated directly on the client side, by resolving the dormancy region lock problem caused by the 'connectionless' and 'stateless' nature of the HTTP protocol. The RX (Region-eXclusive) lock and the periodic sending of ALIVE_CLIENTi messages solve this problem. The designed protocol is verified to be effective through an implementation in the main-memory spatial database system CyberMap.
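A minimal sketch of the keep-alive idea, assuming hypothetical class and method names: a region-exclusive lock is held per client and silently expires when that client's periodic ALIVE_CLIENT messages stop arriving. This illustrates the mechanism only and is not the paper's protocol specification.

```python
import time

class RegionLockManager:
    """Region-exclusive (RX) locks that expire when a client stops sending
    its periodic ALIVE_CLIENT messages (sketch of the idea only)."""
    def __init__(self, timeout=30.0):
        self.timeout = timeout
        self.locks = {}            # region_id -> (client_id, last_alive_time)

    def acquire(self, region_id, client_id):
        self._expire_stale()
        holder = self.locks.get(region_id)
        if holder is None or holder[0] == client_id:
            self.locks[region_id] = (client_id, time.monotonic())
            return True
        return False               # another client still holds the RX lock

    def alive(self, client_id):
        """Handle an ALIVE_CLIENT message: refresh all locks of that client."""
        now = time.monotonic()
        for region_id, (cid, _) in list(self.locks.items()):
            if cid == client_id:
                self.locks[region_id] = (cid, now)

    def release(self, region_id, client_id):
        if self.locks.get(region_id, (None,))[0] == client_id:
            del self.locks[region_id]

    def _expire_stale(self):
        now = time.monotonic()
        for region_id, (cid, t) in list(self.locks.items()):
            if now - t > self.timeout:    # client presumed disconnected
                del self.locks[region_id]

# Toy usage: client A locks a map region; client B must wait until A
# releases it or A's keep-alive messages time out.
mgr = RegionLockManager(timeout=5.0)
print(mgr.acquire("tile_42", "A"))   # True
print(mgr.acquire("tile_42", "B"))   # False while A is alive
```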

Performance Evaluation of a Novel Chaos Transceiver for the High Level Modulation (고레벨 변조를 위한 새로운 카오스 송수신기의 성능 평가)

  • Lee, Jun-Hyun;Ryu, Heung-Gyoon
    • Journal of the Institute of Electronics and Information Engineers / v.51 no.1 / pp.31-36 / 2014
  • A chaos communication system, which is characterized by sensitivity to initial conditions, offers better security than conventional digital communication systems, but its bit error rate (BER) performance is poor, so studies aimed at improving the BER performance are important. In an existing study, the BER performance of a proposed chaos transceiver was improved over the CDSK (Correlation Delay Shift Keying) system because very few additional noise-like elements are present other than the desired signal. A chaos communication system transmits many samples per symbol because the signal is spread according to the characteristics of the chaos map, so research that improves the data rate of chaos communication systems is also required. In existing chaos modulation systems, information bits are modulated as -1 and 1 on the basis of BPSK. If higher-level modulation schemes such as QPSK and 16QAM are applied to the chaos communication system instead of BPSK, a higher data rate is possible because more data are transmitted at a time. In this paper, we apply QPSK and 16QAM to the chaos transceiver proposed in the existing study, evaluate the symbol error rate (SER) performance, and compare the results. We also evaluate the anti-jamming performance of the proposed system when QPSK and 16QAM are applied.
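As a simplified illustration of chaotic spreading with a higher-level constellation, the following sketch spreads QPSK symbols with a logistic-map chip sequence and despreads them coherently with the known chips. It is a deliberately simplified stand-in for the paper's CDSK-style transceiver, and all parameters are toy values.

```python
import numpy as np

def logistic_chips(n, x0=0.7, r=3.99):
    """Generate a zero-mean chaotic chip sequence from the logistic map."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x - x.mean()

def qpsk_symbols(bits):
    """Map bit pairs to unit-energy QPSK symbols."""
    b = bits.reshape(-1, 2)
    return ((2 * b[:, 0] - 1) + 1j * (2 * b[:, 1] - 1)) / np.sqrt(2)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 200)
syms = qpsk_symbols(bits)

sf = 64                                   # chips per symbol (spreading factor)
chips = logistic_chips(sf * syms.size)
tx = np.repeat(syms, sf) * chips          # chaotic spreading
rx = tx + 0.1 * (rng.standard_normal(tx.size) +
                 1j * rng.standard_normal(tx.size))   # AWGN channel

# Coherent despreading with the known chip sequence is a simplification;
# CDSK instead correlates the signal with a delayed copy of the reference.
rx_blocks = rx.reshape(-1, sf)
chip_blocks = chips.reshape(-1, sf)
est = (rx_blocks * chip_blocks).sum(axis=1) / (chip_blocks ** 2).sum(axis=1)

detected = np.stack([(est.real > 0).astype(int), (est.imag > 0).astype(int)], axis=1)
print("bit errors:", int(np.sum(detected.ravel() != bits)))
```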

Face Detection Using Adaboost and Template Matching of Depth Map based Block Rank Patterns (Adaboost와 깊이 맵 기반의 블록 순위 패턴의 템플릿 매칭을 이용한 얼굴검출)

  • Kim, Young-Gon;Park, Rae-Hong;Mun, Seong-Su
    • Journal of Broadcast Engineering / v.17 no.3 / pp.437-446 / 2012
  • Face detection algorithms using two-dimensional (2-D) intensity or color images have been studied for decades. Recently, with the development of low-cost range sensors, three-dimensional (3-D) information (i.e., a depth image that represents the distance between the camera and objects) can easily be used to extract facial features reliably. Most people share a similar 3-D facial structure. This paper proposes a face detection method that uses both intensity and depth images. First, the AdaBoost algorithm applied to the intensity image classifies candidate regions as face or nonface. Each candidate region is divided into 5×5 blocks, and the depth values are averaged within each block. A 5×5 block rank pattern is then constructed by sorting the block averages of the depth values. Finally, candidate regions are classified as face or nonface by matching the constructed depth-map-based block rank patterns against a template pattern generated from the training data set. For template matching, the 5×5 template block rank pattern is constructed in advance by averaging the block ranks over the training data set. The proposed algorithm is tested on real images obtained by a Kinect range sensor. Experimental results show that the proposed algorithm effectively eliminates most false positives while preserving true positives.
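A minimal NumPy sketch of the depth-based block rank pattern and a rank-agreement score is given below; the AdaBoost stage, the classification threshold, and the synthetic depth patch are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def block_rank_pattern(depth_region, grid=5):
    """Divide a candidate depth region into grid x grid blocks, average the
    depth within each block, and rank the block averages."""
    h, w = depth_region.shape
    ys = np.linspace(0, h, grid + 1, dtype=int)
    xs = np.linspace(0, w, grid + 1, dtype=int)
    means = np.array([[depth_region[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].mean()
                       for j in range(grid)] for i in range(grid)])
    # rank 0 = nearest block (smallest average depth)
    return means.ravel().argsort().argsort().reshape(grid, grid)

def rank_similarity(pattern, template):
    """Spearman-style agreement between two rank patterns (1 = identical)."""
    diff = pattern.astype(float) - template.astype(float)
    n = pattern.size
    return 1.0 - 6.0 * (diff ** 2).sum() / (n * (n ** 2 - 1))

# Toy usage: a synthetic face-like depth patch (nose region closer than cheeks).
depth = np.full((50, 50), 100.0)
depth[20:30, 20:30] = 80.0                  # protruding nose region
template = block_rank_pattern(depth)        # stands in for the trained template
candidate = depth + np.random.default_rng(1).normal(0, 1.0, depth.shape)
print(rank_similarity(block_rank_pattern(candidate), template))
```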

Historical Observation and the Characteristics of the Records and Archives Management in Korea (한국 기록관리의 사적 고찰과 그 특징)

  • Lee, Young-Hak
    • The Korean Journal of Archival Studies / no.34 / pp.221-250 / 2012
  • This paper introduces the characteristics of records and archives management in Korea from the Joseon dynasty to the present and explains the historical background of records and archives management in the Joseon dynasty. It also introduces the process of establishing a modern records management system by adopting the records management and public administration practices of the USA after liberation in 1945. The Joseon bureaucrats established systematic methodologies for managing and arranging records; the Joseon dynasty managed its records systematically because it was a bureaucratic regime. It is also noticeable that the famous Joseonwangjosilrok (Annals of the Joseon Dynasty) came out of the power struggles for control of national affairs between the king and the nobility during the establishment of the dynasty. Another noticeable feature of the records tradition in the Joseon dynasty is that the nobility recorded their experience and allowed future generations to use and refer to these experiences and examples when performing similar business. The records of the Joseon period are historical records of contemporary incidents, and their compilers expected future historians to evaluate the incidents they recorded. In 1894, the reform policies of the Gaboh government moved society toward modernity. The Gaboh government prescribed the archive management process through 'Regulation (命令頒布式)' and entirely revised the form of official documents. It changed the era name from the Chinese system to a uniquely Korean one and replaced documents written solely in Chinese characters with Korean or mixed Korean-Chinese script. Also, instead of blank sheets of paper, forms pre-printed with the name of each office were used. Korea was liberated from Japanese imperialism in 1945, and the government of the Republic of Korea was established in 1948. In the 1950s, the Republic of Korea used the records management system of the Government-General of Joseon without alteration. In the late 1950s, the Republic of Korea built a new records management system by adopting the records management and public administration practices of the USA. However, understanding of records management was scarce, so proper records and archives management was not achieved; consequently, many important records, such as presidential archives, were abandoned or destroyed. The period that made the biggest difference to the national records management system began in 1999, when the public records management legislation was enacted. In particular, the decisive period was President Roh's five-year tenure, called the Participation Government (2003-2008). The first distinctive characteristic of the Participation Government's records management is that it actively implemented governance. Another remarkable feature was the appointment of records management specialists at public institutions. The Participation Government also completely revised this legislation, which marked the beginning of the development of records management in the Republic of Korea.

Design of an Integrated University Information Service Model Based on Block Chain (블록체인 기반의 대학 통합 정보서비스 실증 모델 설계)

  • Moon, Sang Guk;Kim, Min Sun;Kim, Hyun Joo
    • Journal of the Korea Academia-Industrial cooperation Society / v.20 no.2 / pp.43-50 / 2019
  • Block-chain offers technical advantages such as robust security, owing to a structure in which forgery is impossible, decentralization through sharing the ledger among participants, and hyper-connectivity linking the Internet of Things, robots, and artificial intelligence. As a result, public organizations have a highly positive attitude toward adopting block-chain technology, and the design of university information services is no exception. Universities are also considering applying block-chain technology to the foundations on which various information services within a university are implemented. Through case studies of block-chain applications across various industries, this study designs an empirical model of a platform that integrates the information systems of a university. A basic road map of university information services based on block-chain technology is constructed, from planning to the actual service design stage, and an empirical model of an integrated university information service is then designed on block-chain by applying this framework.
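As a small illustration of the hash-chaining that underlies the "forgery is impossible" property mentioned above, the following toy ledger links hypothetical university service records with SHA-256; it is not the platform designed in the paper.

```python
import hashlib
import json
import time

def make_block(prev_hash, payload):
    """Link a record to the previous block by hashing its contents together
    with the previous block's hash."""
    block = {"prev_hash": prev_hash, "timestamp": time.time(), "payload": payload}
    digest = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    block["hash"] = digest
    return block

def verify_chain(chain):
    """Any modification of an earlier block breaks every later link."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Toy ledger of hypothetical university service events.
chain = [make_block("0" * 64, {"event": "certificate_issued", "student": "2019-001"})]
chain.append(make_block(chain[-1]["hash"], {"event": "grade_recorded", "student": "2019-001"}))
print(verify_chain(chain))              # True
chain[0]["payload"]["event"] = "tampered"
print(verify_chain(chain))              # False: the forgery is detected
```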