• Title/Summary/Keyword: 무참조점 (without reference points)

Search Results: 14

Development of Surface Velocity Measurement Technique without Reference Points Using UAV Image (드론 정사영상을 이용한 무참조점 표면유속 산정 기법 개발)

  • Lee, Jun Hyeong;Yoon, Byung Man;Kim, Seo Jun
    • Ecology and Resilient Infrastructure
    • /
    • v.8 no.1
    • /
    • pp.22-31
    • /
    • 2021
  • Surface image velocimetry (SIV) is a noncontact, image-based velocimetry technique. Recently, studies have been conducted on measuring surface velocities with drones in order to cover a wide range of velocities and discharges. However, when measuring surface velocity with a drone, reference points must be included in the image for image correction and for calculating the ground sample distance, which limits the drone's flight altitude and shooting area. To maximize spatial freedom, the main advantage of velocity measurement from drone images, a surface velocity calculation technique that does not require reference points must be developed. In this study, such a technique was developed that uses only the drone's position and the specifications of the drone-mounted camera. To verify the developed technique, surface velocities were calculated at the Andong River Experiment Center and compared with velocities obtained by conventional SIV using reference points and with velocities measured by a FlowTracker. The results confirmed an average difference of approximately 4.70% from the conventional SIV velocities and approximately 4.60% from the FlowTracker measurements. The proposed technique can accurately measure surface velocity with a drone regardless of flight altitude, shooting area, and analysis area.
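The core of the reference-point-free idea described above is that the ground sample distance (GSD) can be derived from the drone's altitude and the camera's specifications alone, after which a tracked pixel displacement converts directly to a velocity. The sketch below illustrates that relationship; the camera specifications and numbers are illustrative assumptions, not values from the paper.

```python
# Sketch: GSD from flight altitude and camera geometry, then surface
# velocity from a pixel displacement between frames. All numeric values
# are illustrative assumptions, not the paper's data.

def ground_sample_distance(altitude_m, focal_length_mm, sensor_width_mm, image_width_px):
    """GSD in meters per pixel from altitude and camera specifications."""
    return (altitude_m * sensor_width_mm) / (focal_length_mm * image_width_px)

def surface_velocity(pixel_displacement, frame_interval_s, gsd_m_per_px):
    """Convert a tracked pixel displacement between two frames into m/s."""
    return pixel_displacement * gsd_m_per_px / frame_interval_s

# Hypothetical camera: 8.8 mm focal length, 13.2 mm sensor, 5472 px wide.
gsd = ground_sample_distance(altitude_m=50.0, focal_length_mm=8.8,
                             sensor_width_mm=13.2, image_width_px=5472)
v = surface_velocity(pixel_displacement=12.0, frame_interval_s=0.1,
                     gsd_m_per_px=gsd)
```

Because the GSD depends only on altitude and camera geometry, no surveyed reference points are needed; this is what removes the flight-altitude constraint mentioned in the abstract.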

An Algorithm for Translation from RDB Schema Model to XML Schema Model Considering Implicit Referential Integrity (묵시적 참조 무결성을 고려한 관계형 스키마 모델의 XML 스키마 모델 변환 알고리즘)

  • Kim, Jin-Hyung;Jeong, Dong-Won;Baik, Doo-Kwon
    • Journal of KIISE:Databases
    • /
    • v.33 no.5
    • /
    • pp.526-537
    • /
    • 2006
  • The most representative approach for efficiently storing XML data is to store it in relational databases. The merit of this approach is that it easily accommodates the realistic situation that most data are still stored in relational databases. It requires converting XML data into relational data, or relational data into XML data. The most important issue in the translation is to reflect the structural and semantic relations of the RDB exactly in the XML schema model. Many studies have addressed this issue, but existing methods have several problems: they either do not cover structural semantics or support only explicit referential integrity relations. In this paper, we propose an algorithm for automatically extracting implicit referential integrities. We also design and implement the suggested algorithm and perform comparative evaluations using the translated XML documents. The proposed algorithm improves semantic information extraction and conversion and secures sufficient referential integrity in the target databases. Using it, we can completely guarantee not only the explicit but also the implicit referential integrities of the initial relational schema model; that is, we can create a more exact XML schema model.
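An implicit referential integrity of the kind the abstract targets is a column that acts as a foreign key without being declared as one. A minimal sketch of one common detection heuristic, matching undeclared columns against other tables' primary keys by name and type, is shown below; the heuristic and the sample schema are illustrative assumptions, not the paper's algorithm.

```python
# Sketch: find candidate implicit references by matching a column's name
# and type against another table's primary key. The schema literal and
# the heuristic itself are illustrative assumptions.

def implicit_references(schema):
    """schema: {table: {"pk": (name, type), "cols": [(name, type), ...]}}.
    Returns (table, column, referenced_table) triples for matches."""
    found = []
    for table, info in schema.items():
        for col, col_type in info["cols"]:
            for other, other_info in schema.items():
                if other == table:
                    continue
                pk_name, pk_type = other_info["pk"]
                if col == pk_name and col_type == pk_type:
                    found.append((table, col, other))
    return found

schema = {
    "employee": {"pk": ("emp_id", "int"),
                 "cols": [("emp_id", "int"), ("dept_id", "int"),
                          ("name", "varchar")]},
    "department": {"pk": ("dept_id", "int"),
                   "cols": [("dept_id", "int"), ("dept_name", "varchar")]},
}
# employee.dept_id is detected as implicitly referencing department.dept_id
matches = implicit_references(schema)
```

A production extractor would also consult the data itself (inclusion dependencies) to weed out coincidental name matches, which pure name/type matching cannot distinguish.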

The Design and Implementation of HTML Document Integrity Management System (HTML 문서의 무결성 유지 시스템의 설계 및 구현)

  • 조이기;이영운;황인문;양수영;김원중
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2002.05a
    • /
    • pp.380-383
    • /
    • 2002
  • It is difficult to manage broken links with dangling or inaccurate references in a manual site consisting of a large number of HTML documents, such as the KLDP (Korean Linux Documentation Project, http://kldp.org) Web site. In this paper, we define the relationships and constraint conditions that exist between a Web site's HTML documents. We then design and implement HIMS (HTML Document Integrity Management System), which notifies the user when an integrity violation occurs, or launches a trigger operation to keep integrity between HTML documents on insert, delete, and update.


An Efficient Transformation Technique from Relational Schema to Redundancy Free XML Schema (관계형 스키마로부터 중복성이 없는 XML 스키마로의 효율적인 변환 기법)

  • Cho, Jung-Gil
    • Journal of Internet Computing and Services
    • /
    • v.11 no.6
    • /
    • pp.123-133
    • /
    • 2010
  • XML has become the new standard for publishing and exchanging data on the Web. However, most business data is still stored and maintained in relational database management systems, so there is an increasing need to efficiently publish relational data as XML data for Internet-based applications. The most important issue in the transformation is to reflect the structural and semantic relations of the RDB exactly in the XML schema. Most transformation approaches attempt to resolve this issue, but they have several problems. In this paper, we discuss an algorithm for transforming a relational database schema into a corresponding XML schema expressed in XML Schema. We aim to preserve not only explicit and implicit referential integrity information but also a high level of nesting, while introducing no data redundancy into the transformed XML schema. To achieve these goals, we propose a redundancy-free transformation model and then improve the XML Schema structure by exploring more deeply nested structures.
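The goal of a redundancy-free nested transformation can be seen in miniature by nesting child rows under their parent along a foreign-key relationship, so that no parent data is repeated per child row. The table contents and element names below are illustrative assumptions, not the paper's transformation rules.

```python
# Sketch: nest relational rows into XML along a foreign key so each row
# appears exactly once (no redundancy). Tables and tags are assumptions.
import xml.etree.ElementTree as ET

departments = [{"dept_id": 1, "dept_name": "Research"}]
employees = [{"emp_id": 10, "name": "Kim", "dept_id": 1},
             {"emp_id": 11, "name": "Lee", "dept_id": 1}]

root = ET.Element("company")
for dept in departments:
    d = ET.SubElement(root, "department", dept_id=str(dept["dept_id"]))
    ET.SubElement(d, "dept_name").text = dept["dept_name"]
    # Nest each employee under its department instead of repeating the
    # department data for every employee row, as a flat mapping would.
    for emp in employees:
        if emp["dept_id"] == dept["dept_id"]:
            e = ET.SubElement(d, "employee", emp_id=str(emp["emp_id"]))
            ET.SubElement(e, "name").text = emp["name"]

xml_text = ET.tostring(root, encoding="unicode")
```

The same idea, applied at the schema level rather than the instance level, is what lets a transformed XML Schema stay both deeply nested and redundancy free.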

Relative RPCs Bias-compensation for Satellite Stereo Images Processing (고해상도 입체 위성영상 처리를 위한 무기준점 기반 상호표정)

  • Oh, Jae Hong;Lee, Chang No
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.36 no.4
    • /
    • pp.287-293
    • /
    • 2018
  • Generating epipolar resampled images by reducing the y-parallax is a prerequisite for accurate and efficient processing of satellite stereo images. Minimizing y-parallax requires accurate sensor modeling, which is usually carried out with ground control points. However, this approach is not feasible over inaccessible areas where control points cannot easily be acquired. In that case, a relative orientation can be computed from conjugate points alone, but its accuracy for satellite sensors must be studied because their geometry differs from that of well-known frame-type cameras. Therefore, we carried out bias compensation of the RPCs (Rational Polynomial Coefficients) without any ground control points to study its precision and its effect on the y-parallax in the epipolar resampled images. The conjugate points were generated by stereo image matching with outlier removal. RPC compensation was performed with affine and polynomial models. We analyzed the reprojection error of the compensated RPCs and the y-parallax in the resampled images. The experimental results showed a one-pixel level of y-parallax for Kompsat-3 stereo data.
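The affine compensation model mentioned in the abstract corrects RPC-projected image coordinates with an affine transform fitted to conjugate points by least squares. The sketch below shows that fitting step in isolation; the point values are synthetic, and a real pipeline would first project ground coordinates through the RPCs to obtain the "projected" set.

```python
# Sketch: affine bias compensation in image space, fitted by least squares
# to conjugate points. The point values are synthetic assumptions with a
# known constant bias of (2.0, -1.5) pixels.
import numpy as np

def fit_affine(projected, observed):
    """Fit col' = a0 + a1*col + a2*row and row' = b0 + b1*col + b2*row."""
    A = np.column_stack([np.ones(len(projected)),
                         projected[:, 0], projected[:, 1]])
    coef_col, *_ = np.linalg.lstsq(A, observed[:, 0], rcond=None)
    coef_row, *_ = np.linalg.lstsq(A, observed[:, 1], rcond=None)
    return coef_col, coef_row

def apply_affine(points, coef_col, coef_row):
    A = np.column_stack([np.ones(len(points)), points[:, 0], points[:, 1]])
    return np.column_stack([A @ coef_col, A @ coef_row])

proj = np.array([[100.0, 200.0], [500.0, 250.0],
                 [300.0, 800.0], [700.0, 600.0]])
obs = proj + np.array([2.0, -1.5])          # biased observations
cc, cr = fit_affine(proj, obs)
corrected = apply_affine(proj, cc, cr)      # bias removed
```

With real conjugate points the residual after compensation is what drives the remaining y-parallax, which is why the paper evaluates both the reprojection error and the parallax in the resampled images.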

Security Verification of Korean Open Crypto Source Codes with Differential Fuzzing Analysis Method (차분 퍼징을 이용한 국내 공개 암호소스코드 안전성 검증)

  • Yoon, Hyung Joon;Seo, Seog Chung
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.30 no.6
    • /
    • pp.1225-1236
    • /
    • 2020
  • Fuzzing is an automated software testing methodology that dynamically tests the security of software by feeding it randomly generated input values outside the expected range. KISA releases open source implementations of standard cryptographic algorithms, and many crypto module developers build their modules on this source code. If the open source code contains a vulnerability, every cryptographic library that refers to it inherits a potential vulnerability, which may lead to security incidents causing enormous losses. Therefore, in this study, an appropriate security policy was established to verify the safety of block cipher source codes such as SEED, HIGHT, and ARIA, and their safety was verified using differential fuzzing. In total, 45 vulnerabilities were found in the memory bug and error handling categories, and a plan for fixing them is proposed.
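Differential fuzzing, as used above, feeds identical random inputs to two independent implementations of the same algorithm and flags any divergence, since either output differing or one side crashing signals a bug. The sketch below uses two hashlib code paths to SHA-256 as a stand-in for the independent block cipher implementations (SEED, HIGHT, ARIA) the paper tests; the harness itself is an assumption for illustration.

```python
# Sketch: differential fuzzing harness. Two routes to the same primitive
# stand in for two independent implementations; any disagreement on a
# random input is recorded as a finding.
import hashlib
import os

def impl_a(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def impl_b(data: bytes) -> bytes:
    # In real differential fuzzing this would be a genuinely independent
    # implementation, e.g. the KISA open source versus a local rewrite.
    h = hashlib.new("sha256")
    h.update(data)
    return h.digest()

def differential_fuzz(f, g, iterations=1000, max_len=64):
    """Return the random inputs on which the two implementations disagree."""
    mismatches = []
    for _ in range(iterations):
        length = int.from_bytes(os.urandom(1), "big") % (max_len + 1)
        data = os.urandom(length)
        if f(data) != g(data):
            mismatches.append(data)
    return mismatches

findings = differential_fuzz(impl_a, impl_b, iterations=200)
```

A real harness would also wrap each call in exception/crash handling, since the memory bug and error handling findings reported in the paper surface as crashes rather than as differing outputs.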

Organizational Analysis of Computer Textbooks for Elementary School in Korea (초등학교 1.2학년 컴퓨터 교과서 내용 선정 및 조직 분석)

  • Lee, Jae-Mu
    • Journal of The Korean Association of Information Education
    • /
    • v.9 no.2
    • /
    • pp.299-308
    • /
    • 2005
  • This study analyzes fifteen computer textbooks for the first and second grades in Korea in terms of content selection and organization, using evaluation criteria selected from previous works. The analysis of current computer textbooks shows that some textbooks contain too much material for first and second grade students and do not integrate sufficiently with other subjects. The textbooks provide various activities, practices, and teaching methods, but no examples of cooperative learning, and they offer no opportunities for individual, advanced, or remedial learning.


A Study on the Toxic Gases Released from Fire Retardant Finishing Materials (가스검지관법에 의한 방염시료의 연소가스 독성평가)

  • Lee, Hae Pyeong;Park, Young Ju;Lim, Suk Hwan;Kim, Jung In
    • Korean Society of Hazard Mitigation: Conference Proceedings
    • /
    • 2011.02a
    • /
    • pp.195-195
    • /
    • 2011
  • This study analyzed the toxicity of combustion gases generated during fires from film- and coating-treated specimens subjected to flame-retardant performance testing, using the NES 713 method, and indirectly assessed the effects on the human body by determining the toxicity index. The specimens were divided into three experimental groups: existing flame-retardant products with film treatment, coating treatment, and coating treatment after a period of aging. Combustion gases were analyzed by the NES 713 method and the toxicity index was evaluated; the results were additionally compared with existing domestic exposure limits, with reference to prior domestic studies using the NES 713 method. Acrylonitrile, a suspected carcinogen, was detected in all 18 specimens, and formaldehyde in 9. Hydrogen chloride and hydrogen bromide were each detected in 5 specimens, whereas only one coating-treated specimen showed each of these gases. Carbon monoxide, carbon dioxide, and nitrogen oxides were detected in all specimens. The toxicity index ranged from a minimum of 3.5 to a maximum of 9.4. The NES 713 analysis, toxicity index calculation, and comparison with existing domestic exposure limits are meaningful in two respects: first, current domestic regulations contain no toxicity-testing criteria for flame-retardant products, and second, prior studies evaluating the combustion-gas toxicity of flame-retardant products are limited. However, since the toxicity index is not applied in Korea, it is difficult to assess the degree of hazard represented by the calculated indices, and with only 18 specimens the results can suggest, but not represent, the potential combustion-gas toxicity of flame-retardant products in general. Future studies with a randomized selection of a larger number of flame-retardant products are expected to yield more objective and representative results.


A Comparative Analysis of Long-Term Preservation Policies in Foreign Electronic Records: NARA, LAC, TNA, NAA, and SFA (국외 전자기록물의 장기보존 정책 비교 분석 - 미국, 캐나다, 영국, 호주, 스위스를 중심으로 -)

  • So, Jeong-Eui;Han, Hui-Jeong;Yang, Dongmin
    • Journal of Korean Society of Archives and Records Management
    • /
    • v.18 no.4
    • /
    • pp.125-148
    • /
    • 2018
  • This study investigated long-term preservation policies for electronic records published abroad in order to derive and compare the policy elements necessary for policy establishment. The national archives of the U.S., Canada, the United Kingdom, Australia, and Switzerland, which have officially announced electronic record preservation policies, were selected. Analysis of the long-term preservation policies of the five selected countries yielded six main policy elements: preservation scope, long-term preservation strategy, risk management, integrity assurance methods, preservation infrastructure, and reference models. Comparing the five policies through these elements produced six implications, centered on establishing a long-term preservation strategy in line with the actual state of the institution and on using foreign long-term preservation policies as advanced cases for various types of electronic records.

Analysis of Objects, Contents and Formative Evaluation for Computer Textbooks for Elementary School Grades in Korea (초등학교 저학년 컴퓨터 교과서 목표.내용 및 평가의 분석)

  • Lee, Jae-Mu
    • Journal of The Korean Association of Information Education
    • /
    • v.10 no.1
    • /
    • pp.67-74
    • /
    • 2006
  • This study analyzes fifteen computer textbooks for the first and second grades with respect to goals, contents, and formative evaluation, using evaluation criteria selected from previous works to identify strengths and problems; advice for teachers using these books is also proposed. The analysis of current computer textbooks shows the following. First, in the area of goals, most textbooks clearly define learning goals for each unit. Second, in the area of content, some textbooks contain too much material for first and second grade students and do not integrate sufficiently with other subjects. Finally, they do not provide formative evaluation for assessing achievement of the goals.
