• Title/Summary/Keyword: Algorithmic


Development of a CAD Based Tool for the Analysis of Landscape Visibility and Sensitivity (수치지형 해석에 의한 가시성 및 시인성의 경관정보화 연구 - CAD 기반의 분석 도구 개발을 중심으로 -)

  • 조동범
    • Journal of the Korean Institute of Landscape Architecture / v.26 no.3 / pp.78-78 / 1998
  • The purpose of this research is to develop a CAD-based program for analyzing digital elevation model (DEM) data from the perspective of landscape assessment. When handling DEM data as a visual simulation of topographic landscape, the basic interest is to analyze the visible area and to visualize the distribution of visual sensitivity. For landscape assessment, more intuitive and interactive visualization tools are needed, especially for visual analysis. To suit landscape assessment, the algorithmic approaches to visibility analysis and the concepts for visual sensitivity calculation in this study were based on processing techniques using the entity data control functions of the AutoCAD drawing database. For quantitative analysis, grid-type 3DFACE entities were adopted as the mesh unit of the DEM structure. The developed programs consist of a main part named VSI, written in AutoLISP, and two interface modules written in dialog control language (DCL) for user-oriented interactive use. Camera points (viewpoints) and target points (or observed areas) can be defined alternatively in combined methods representing scenic landscape, scenery, and sequential landscape. In the case of a scenic landscape (single camera to a fixed target point), only visibility analysis is available; in the other cases, total visibility, cumulative visibility frequency, and visual sensitivity analysis are available. Visual sensitivity was treated as the view angle (three-dimensional observed visual area), and its strength was classified into user-defined levels referring to the statistical characteristics of the distribution. The visibility analysis routine of VSI proved more effective in accuracy and computation time than similar modules of existing AutoCAD third-party utilities.
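The core of such a tool is a line-of-sight test repeated between viewpoints and target cells of the DEM. The sketch below is a minimal illustration of that idea in Python on a regular-grid DEM, not the paper's AutoLISP/VSI code; the function names, eye height, and grid layout are assumptions for the example.

```python
# Minimal line-of-sight visibility sketch over a regular-grid DEM.
# This is NOT the paper's AutoLISP/VSI code; names and parameters are illustrative.
import numpy as np

def line_of_sight(dem, cell, obs_rc, tgt_rc, obs_h=1.6):
    """True if the target cell is visible from the observer cell.

    dem    : 2-D array of elevations (one value per grid cell)
    cell   : horizontal grid spacing
    obs_rc : (row, col) of the observer
    tgt_rc : (row, col) of the target
    obs_h  : observer eye height above the terrain
    """
    r0, c0 = obs_rc
    r1, c1 = tgt_rc
    z0 = dem[r0, c0] + obs_h
    n = int(max(abs(r1 - r0), abs(c1 - c0)))
    if n == 0:
        return True
    total_dist = np.hypot((r1 - r0) * cell, (c1 - c0) * cell)
    # Elevation angle (as a tangent) from the observer to the target cell.
    tgt_tan = (dem[r1, c1] - z0) / total_dist
    # Visible if no intermediate terrain sample rises above the sight line.
    for i in range(1, n):
        t = i / n
        r = int(round(r0 + t * (r1 - r0)))
        c = int(round(c0 + t * (c1 - c0)))
        if (dem[r, c] - z0) / (t * total_dist) > tgt_tan:
            return False
    return True

def cumulative_visibility(dem, cell, viewpoints, targets):
    """Count, for each target cell, how many viewpoints can see it."""
    counts = {t: 0 for t in targets}
    for vp in viewpoints:
        for t in targets:
            if line_of_sight(dem, cell, vp, t):
                counts[t] += 1
    return counts

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dem = rng.uniform(0.0, 30.0, size=(50, 50))
    print(cumulative_visibility(dem, 10.0, [(0, 0), (49, 49)], [(25, 25), (10, 40)]))
```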

Multi-Dimensional Traveling Salesman Problem Scheme Using Top-n Skyline Query (Top-n 스카이라인 질의를 이용한 다차원 외판원 순회문제 기법)

  • Jin, ChangGyun;Oh, Dukshin;Kim, Jongwan
    • KIPS Transactions on Software and Data Engineering / v.9 no.1 / pp.17-24 / 2020
  • The traveling salesman problem (TSP) is the algorithmic problem of finding the shortest route in which a salesman visits every city exactly once and returns to the starting city. Due to its exponential time complexity, TSP is hard to apply to cases such as amusement-park routing or delivery. Moreover, because it uses only one attribute, the distance between nodes, TSP cannot satisfy user demands associated with multi-dimensional attributes such as travel time, interest, and waiting time. This paper proposes the Top-n Skyline-Multi Dimension TSP (TS-MDT) to resolve the aforementioned problems. The proposed algorithm finds the shortest route faster than the existing method by selecting multi-dimensional nodes according to skyline dominance, thereby decreasing the number of operations. In the simulation, the computation time of a dynamic programming algorithm was compared with that of the proposed TS-MDT algorithm, and TS-MDT was shown to be faster.
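The pruning step rests on skyline dominance: a node dominates another if it is no worse in every attribute and strictly better in at least one, and only non-dominated nodes are kept before routing. The sketch below illustrates that filter in Python; the attribute names and data are illustrative, not the paper's TS-MDT implementation.

```python
# Minimal skyline-dominance filter for multi-attribute nodes (lower is better).
# Illustrative only; not the paper's TS-MDT code.
from typing import Dict, List

def dominates(a: Dict[str, float], b: Dict[str, float]) -> bool:
    """a dominates b if a is no worse in every attribute and strictly better in one."""
    keys = a.keys()
    return all(a[k] <= b[k] for k in keys) and any(a[k] < b[k] for k in keys)

def skyline(points: List[Dict[str, float]]) -> List[Dict[str, float]]:
    """Keep only points not dominated by any other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

if __name__ == "__main__":
    candidates = [
        {"travel_time": 10, "waiting_time": 5},
        {"travel_time": 12, "waiting_time": 4},
        {"travel_time": 15, "waiting_time": 9},   # dominated by the first node
    ]
    print(skyline(candidates))  # the dominated node is pruned before routing
```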

The Effect Analysis of Postural Stability on the Inter-Segmental Spine Motion according to Types of Trunk Models in Drop Landing (드롭착지 동작 시 체간모델에 따른 척추분절운동이 자세안정성 해석에 미치는 영향)

  • Yoo, Kyoung-Seok
    • Korean Journal of Applied Biomechanics / v.24 no.4 / pp.375-383 / 2014
  • The purpose of this study was to assess inter-segmental trunk motion, in which multi-segmental movement of the spinal column was modeled to interpret the effect of segmentation on the total measured spine motion. It also analyzed the relative motion of three types of spine models in drop landing. A secondary goal was to determine the intrinsic algorithmic errors of spine motion and the usefulness of such an approach as a tool for assessing spinal motion. Ten male college soccer players with no history of spine symptoms or injuries were selected. Each subject was given a fifteen-minute adaptation period of drop landing from a 30 cm high box. Inter-segmental spine motion was collected with a Vicon motion capture system (250 Hz) and synchronized with ground reaction force data (1000 Hz). The results show that Model III has a larger range of motion (ROM) than Model I and Model II, and the Lagrange energy shows significant differences at E3 and E4 (p<.05). It can be concluded that the three models differ algorithmically during the load-absorption phase. In particular, Model III, using seven segments, shows proper spine motion for inter-segmental joint motion with interaction effects, and more appropriate observed values for dynamic equilibrium than Model I and Model II. The findings show that the dynamic stability strategy of Model III toward multi-directional spinal motion supports better inter-segmental motor control than Model I and Model II.

Performance Factor Analysis of Sensing-Data Estimation Algorithm for Walking Robots (보행 로봇을 위한 센서 추정 알고리즘의 성능인자 분석)

  • Shon, Woong-Hee;Yu, Seung-Nam;Lee, Sang-Ho;Han, Chang-Soo
    • Journal of the Korea Academia-Industrial cooperation Society / v.11 no.11 / pp.4087-4094 / 2010
  • The sensor data measured by a quadruped robot is used to recognize the physical environment and to control the posture and walking of the robot system. Precise control of the robot requires highly accurate sensor data; however, most of these sensors are expensive and have low durability. Moreover, they are exposed to excessive loads under field conditions when applied to a field robot system, and this issue becomes more serious when the robot system is mass-produced. In this context, this study suggests a virtual sensor technology to replace or assist the main sensor system. The scheme is realized using the back-propagation algorithm of neural network theory, and the quality of the estimated sensor data can be improved through algorithmic and hardware-based treatments. This study performs various trials to identify the parameters that affect the quality and reliability of the estimated sensor data and shows the feasibility of the proposed methodology.
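As a rough illustration of the virtual-sensor idea, the sketch below trains a one-hidden-layer network by back-propagation to estimate one sensor channel from other channels. The network size, synthetic data, and hyper-parameters are assumptions for the example, not the paper's setup.

```python
# Minimal back-propagation "virtual sensor": a one-hidden-layer network that
# estimates a target sensor channel from other channels. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 3 input channels, 1 target channel (a nonlinear mix + noise).
X = rng.uniform(-1.0, 1.0, size=(500, 3))
y = (np.sin(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2]).reshape(-1, 1)
y += 0.01 * rng.standard_normal(y.shape)

# One hidden layer with tanh activation and a linear output.
W1 = rng.standard_normal((3, 16)) * 0.5
b1 = np.zeros(16)
W2 = rng.standard_normal((16, 1)) * 0.5
b2 = np.zeros(1)
lr = 0.05

for epoch in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    # Backward pass (gradient of the mean squared error)
    dW2 = h.T @ err / len(X)
    db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    dW1 = X.T @ dh / len(X)
    db1 = dh.mean(axis=0)
    # Gradient-descent update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("final MSE:", float((err ** 2).mean()))
```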

Fast Self-Similar Network Traffic Generation Based on FGN and Daubechies Wavelets (FGN과 Daubechies Wavelets을 이용한 빠른 Self-Similar 네트워크 Traffic의 생성)

  • Jeong, Hae-Duck;Lee, Jong-Suk
    • The KIPS Transactions:PartC / v.11C no.5 / pp.621-632 / 2004
  • Recent measurement studies of real teletraffic data in modern telecommunication networks have shown that self-similar (or fractal) processes may provide better models of teletraffic than Poisson processes. If this is not taken into account, it can lead to inaccurate conclusions about the performance of telecommunication networks. Thus, an important requirement for conducting simulation studies of telecommunication networks is the ability to generate long synthetic stochastic self-similar sequences. A new generator of pseudo-random self-similar sequences, based on fractional Gaussian noise (FGN) and a wavelet transform, is proposed and analysed in this paper. Specifically, the generator uses Daubechies wavelets. The motivation behind this selection is that Daubechies wavelets lead to more accurate results than other types of wavelets by better matching the self-similar structure of long-range dependent processes. The statistical accuracy and the time required to produce sequences of a given (long) length are studied experimentally. The generator shows a high level of accuracy of the output data (in the sense of the Hurst parameter) and is fast. Its theoretical algorithmic complexity is O(n).
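Accuracy here is judged by how well the Hurst parameter of the generated trace matches its target. The sketch below shows a standard aggregated-variance estimate of H in Python, a common check applied to a generated sequence rather than the paper's FGN/wavelet generator itself; the block sizes and test signal are illustrative.

```python
# Minimal aggregated-variance estimate of the Hurst parameter, the statistic
# used to judge a generated self-similar sequence. Not the paper's generator.
import numpy as np

def hurst_aggregated_variance(x, block_sizes):
    """For a self-similar series, Var(block means of size m) ~ m^(2H-2),
    so H is recovered from the slope of log(var) against log(m)."""
    log_m, log_v = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        if n_blocks < 2:
            continue
        blocks = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(blocks.var()))
    slope = np.polyfit(log_m, log_v, 1)[0]
    return 1.0 + slope / 2.0

if __name__ == "__main__":
    # White Gaussian noise has H = 0.5; a long-range dependent trace would
    # give H close to the value the generator was asked for (e.g. 0.8).
    x = np.random.default_rng(1).standard_normal(2 ** 16)
    print("estimated H:", hurst_aggregated_variance(x, [2 ** k for k in range(2, 12)]))
```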

The Effect of Software Education on Middle School Students' Computational Thinking (소프트웨어 교육이 중학생의 컴퓨팅 사고력에 미치는 효과)

  • Lee, Jeongmin;Ko, Eunji
    • The Journal of the Korea Contents Association / v.18 no.12 / pp.238-250 / 2018
  • The 2015 revised curriculum includes an 'informatics' course covering the process of building software, aimed at cultivating creative and convergent ability. This study analyzes the competencies pursued in the revised curriculum and defines computational thinking as the main competency. The subjects of the study were first-grade middle school students in the first semester of the 2018 school year. Of the 95 collected responses, 83 were used for analysis, and significance was confirmed by a paired t-test; computational concepts, computational practices, and computational perspectives were also examined through artifact-based interviews. As a result of the statistical analysis, critical thinking, creativity, algorithmic thinking, and problem solving increased significantly among the sub-variables of computational thinking. The statistical and interview results were analyzed to provide implications for the design and implementation of software education in the 'informatics' course.
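For readers unfamiliar with the test used here, the sketch below runs a paired t-test on illustrative pre/post scores (not the study's data) using scipy.

```python
# Minimal paired t-test of pre/post gains; the scores are illustrative only.
import numpy as np
from scipy import stats

pre  = np.array([3.1, 2.8, 3.5, 2.9, 3.0, 3.3, 2.7, 3.2])
post = np.array([3.6, 3.0, 3.9, 3.4, 3.2, 3.8, 3.1, 3.5])

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")  # p < .05 -> significant gain
```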

Coupling non-matching finite element discretizations in small-deformation inelasticity: Numerical integration of interface variables

  • Amaireh, Layla K.;Haikal, Ghadir
    • Coupled systems mechanics / v.8 no.1 / pp.71-93 / 2019
  • Finite element simulations of solid mechanics problems often involve the use of non-conforming meshes (NCM) to increase accuracy in capturing nonlinear behavior, including damage and plasticity, in part of a solid domain without an undue increase in computational costs. In the presence of material nonlinearity and plasticity, higher-order variables are often needed to capture nonlinear behavior and material history on non-conforming interfaces. The most popular formulations for coupling non-conforming meshes are dual methods that involve the interpolation of a traction field on the interface. These methods are subject to the Ladyzhenskaya-Babuska-Brezzi (LBB) stability condition, and are therefore limited in their implementation with the higher-order elements needed to capture nonlinear material behavior. Alternatively, the enriched discontinuous Galerkin approach (EDGA) (Haikal and Hjelmstad 2010) is a primal method that provides higher-order kinematic fields on the interface, and in which interface tractions are computed from local finite element estimates, thereby facilitating its implementation with nonlinear material models. The inclusion of higher-order interface variables, however, raises the issue of preserving material history at integration points when an increase in integration order is needed. In this study, the enriched discontinuous Galerkin approach (EDGA) is extended to the case of small-deformation plasticity. An interface-driven Gauss-Kronrod integration rule is proposed to enable adaptive enrichment on the interface while preserving history-dependent material data at existing integration points. The method is implemented using classical J2 plasticity theory as well as the pressure-dependent Drucker-Prager material model. We show that an efficient treatment of interface variables can improve algorithmic performance and provide a consistent approach for coupling non-conforming meshes in inelasticity.
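The history that must be preserved at each integration point is the state carried by the local material routine; for classical J2 plasticity that state is the plastic strain tensor and the accumulated plastic strain, updated by a radial-return step. The sketch below is a minimal, standalone radial-return update in Python with linear isotropic hardening; material parameters and the test strain are illustrative, and it is not the paper's implementation.

```python
# Minimal radial-return update for classical J2 plasticity with linear
# isotropic hardening, i.e. the point-level routine whose history variables
# must be preserved at integration points. Parameter values are illustrative.
import numpy as np

def j2_radial_return(eps, eps_p, alpha, E=200e3, nu=0.3, sigma_y=250.0, H=1000.0):
    """One stress update at a material (integration) point.

    eps   : total strain tensor (3x3)
    eps_p : plastic strain tensor from the previous step (3x3)  [history]
    alpha : accumulated plastic strain                           [history]
    Returns (stress, eps_p_new, alpha_new).
    """
    G = E / (2.0 * (1.0 + nu))          # shear modulus
    K = E / (3.0 * (1.0 - 2.0 * nu))    # bulk modulus
    I = np.eye(3)

    # Elastic trial state
    eps_e = eps - eps_p
    vol = np.trace(eps_e)
    dev = eps_e - vol / 3.0 * I
    s_tr = 2.0 * G * dev                 # trial deviatoric stress
    norm_s = np.linalg.norm(s_tr)
    f_tr = norm_s - np.sqrt(2.0 / 3.0) * (sigma_y + H * alpha)

    if f_tr <= 0.0:                      # elastic step: history unchanged
        return K * vol * I + s_tr, eps_p, alpha

    # Plastic correction (radial return along the trial direction)
    n = s_tr / norm_s
    dgamma = f_tr / (2.0 * G + 2.0 / 3.0 * H)
    s = s_tr - 2.0 * G * dgamma * n
    eps_p_new = eps_p + dgamma * n
    alpha_new = alpha + np.sqrt(2.0 / 3.0) * dgamma
    return K * vol * I + s, eps_p_new, alpha_new

if __name__ == "__main__":
    strain = np.diag([0.003, -0.001, -0.001])    # a uniaxial-like strain state
    stress, ep, a = j2_radial_return(strain, np.zeros((3, 3)), 0.0)
    print(np.round(stress, 2), a)
```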

A Study on the Effect of EPL on Programing, Computing Thinking and Problem Solving Ability of Programing Education (EPL이 프로그래밍 교육의 프로그래밍, 컴퓨팅사고력 및 문제해결력에 미치는 영향에 관한 연구)

  • Yoon, Sunhee
    • The Journal of the Convergence on Culture Technology / v.4 no.4 / pp.287-294 / 2018
  • This paper notes that it is practically difficult for students with relatively weak basic education to succeed in programming language education without algorithmic thinking, computational thinking, and problem-solving abilities. The results showed that students who took programming language education in parallel with the EPL (Educational Programming Language) Scratch, compared with those who did not use Scratch, improved their programming ability, computational thinking, and problem-solving ability, as well as their satisfaction. This not only gave students confidence in demanding programming practice, but also helped prevent them from dropping out midway.

A Cortex-M0 based Security System-on-Chip Embedded with Block Ciphers and Hash Function IP (블록암호와 해시 함수 IP가 내장된 Cortex-M0 기반의 보안 시스템 온 칩)

  • Choe, Jun-Yeong;Choi, Jun-Baek;Shin, Kyung-Wook
    • Journal of IKEEE / v.23 no.2 / pp.388-394 / 2019
  • This paper describes the design of a security system-on-chip (SoC) that integrates a Cortex-M0 CPU with an AAW (ARIA-AES-Whirlpool) crypto-core, which implements the two block cipher algorithms ARIA and AES and the hash function Whirlpool in a unified hardware architecture. The AAW crypto-core was implemented in a small area through hardware sharing based on the algorithmic characteristics of ARIA, AES, and Whirlpool, and it supports key sizes of 128 and 256 bits. The designed security SoC was implemented on an FPGA device and verified through hardware-software co-operation. The AAW crypto-core occupied 5,911 slices, and the AHB_Slave including the AAW crypto-core was implemented with 6,366 slices. The maximum clock frequency of the AHB_Slave was estimated at 36 MHz; the estimated throughputs of ARIA-128 and AES-128 were 83 Mbps and 78 Mbps, respectively, and the throughput of the Whirlpool hash function with a 512-bit block was 156 Mbps.
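The reported figures can be related by the simple identity throughput = block size × clock frequency / cycles per block. The Python sketch below inverts that relation to show the implied cycle counts; the cycle counts are an inference for illustration, not numbers reported in the paper.

```python
# Back-of-the-envelope check of the reported throughputs, assuming
# throughput = block_bits * f_clk / cycles_per_block. The implied cycle
# counts are an inference for illustration, not figures from the paper.
f_clk = 36e6  # Hz (reported maximum clock frequency of the AHB_Slave)

reported = {
    "ARIA-128":  (128, 83e6),   # (block bits, reported throughput in bps)
    "AES-128":   (128, 78e6),
    "Whirlpool": (512, 156e6),
}

for name, (bits, throughput) in reported.items():
    cycles = bits * f_clk / throughput
    print(f"{name}: ~{cycles:.0f} clock cycles per {bits}-bit block")
```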

Secure De-identification and Data Sovereignty Management of Decentralized SSI using Restructured ZKP (재구성된 영지식 증명을 활용한 탈중앙형 자기 주권 신원의 안전한 비식별화 및 데이터 주권 관리)

  • Cho, Kang-Woo;Jeon, Mi-Hyeon;Shin, Sang Uk
    • Journal of Digital Convergence / v.19 no.8 / pp.205-217 / 2021
  • Decentralized SSI (Self-Sovereign Identity) has emerged as an alternative digital identity solution, but an efficient de-identification technique has not been proposed due to the unique algorithmic characteristics of its data transactions. In this study, to ensure the decentralized operation of SSI, we propose a de-identification technique that does not remove identifiers, by restructuring the verification results of ZKP (Zero-Knowledge Proof) into a form that the verifier can provide to external parties. In addition, by proposing the concept of differential sovereignty management for each entity participating in verification, restructured de-identified data can be provided without the consent of the data subject. As a result, the proposed model satisfies the domestic personal information protection law in a decentralized SSI, and it provides secure and efficient de-identification processing and sovereignty management.