• Title/Summary/Keyword: Batch Processing (일괄처리)

Search Results: 309

Continuous Monitoring of k-Exclusive Closest Pairs in Road Network (도로네트워크 기반 이동 객체들 간의 배타적 최근접 쌍 모니터링 방법)

  • Li, Ki-Joune;Kwon, O-Je;Baek, Yun-Sun
    • Spatial Information Research
    • /
    • v.17 no.2
    • /
    • pp.213-222
    • /
    • 2009
  • Finding exclusive closest pairs in a road network is very useful in real applications, for example, matching a passenger with a nearby taxi. Few studies, however, have addressed this problem. To match two close moving objects exclusively, each object must belong to only one result pair. Because moving objects in a road network change their positions continuously, the closest-pair results must be monitored. In this paper, we propose a methodology for monitoring k exclusive closest pairs over a road network. The proposed method updates only the results that are influenced by the objects' movement. We evaluated its performance with various real road-network data. The results show that our method achieves better accuracy than conventional batch-processing methods.

  • PDF
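The exclusive matching described in the abstract can be sketched with a minimal greedy baseline: repeatedly take the globally closest remaining pair and retire both objects. This is an illustrative batch sketch, not the paper's incremental monitoring algorithm, and Euclidean distance stands in for road-network distance:

```python
import heapq

def exclusive_closest_pairs(passengers, taxis, dist, k):
    """Greedily match up to k exclusive closest pairs:
    each object belongs to at most one result pair."""
    # Build all candidate pairs, ordered by distance.
    heap = [(dist(p, t), i, j)
            for i, p in enumerate(passengers)
            for j, t in enumerate(taxis)]
    heapq.heapify(heap)
    used_p, used_t, pairs = set(), set(), []
    while heap and len(pairs) < k:
        d, i, j = heapq.heappop(heap)
        if i not in used_p and j not in used_t:
            used_p.add(i)
            used_t.add(j)
            pairs.append((i, j, d))
    return pairs
```

A continuous-monitoring version would re-examine only the pairs whose member objects moved, as the paper proposes, rather than rebuilding the whole heap.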

Hand-Held Mobile Phone Design for SAR Reduction (SAR 저감을 위한 휴대폰 설계)

  • 홍수원;오학태;박천석
    • The Journal of Korean Institute of Electromagnetic Engineering and Science
    • /
    • v.12 no.3
    • /
    • pp.352-359
    • /
    • 2001
  • We propose a new method that takes the SAR compliance test into account from the very first step of developing a mobile phone. The method is practical because it uses a validated FDTD code for reliable calculation, together with a 1 mm high-resolution model that represents the phantom and the mobile phone almost exactly as they are in reality. In this paper we describe how the proposed method is applied to reduce the SAR of a mobile phone that had difficulty satisfying the SAR compliance test; the result is a drop in the SAR for the positions in which the phone and its antenna are held during use. We therefore make the following claim: when developing a new mobile phone, computer simulation combining the CAD design and the radiation pattern should be used instead of building a prototype and relying on trial and error. The former approach also improves development efficiency and reduces cost.

  • PDF
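For reference, the quantity being minimized has a simple local definition: SAR = σ|E|²/ρ, with tissue conductivity σ (S/m), RMS electric field |E| (V/m), and mass density ρ (kg/m³); compliance limits apply to a mass-averaged value. A minimal sketch (the mass-averaging here is a simplistic stand-in for the standardized 1 g/10 g cube averaging, not the paper's FDTD pipeline):

```python
def point_sar(e_field_rms, sigma, rho):
    """Local SAR (W/kg) from RMS field magnitude |E| (V/m),
    conductivity sigma (S/m) and density rho (kg/m^3)."""
    return sigma * e_field_rms ** 2 / rho

def averaged_sar(voxels):
    """Mass-averaged SAR over a voxel region.
    voxels: list of (sar_w_per_kg, mass_kg) tuples."""
    total_mass = sum(m for _, m in voxels)
    return sum(s * m for s, m in voxels) / total_mass
```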

Fast Scalar Multiplication Algorithm on Elliptic Curve over Optimal Extension Fields (최적확장체 위에서 정의되는 타원곡선에서의 고속 상수배 알고리즘)

  • Chung Byungchun;Lee Soojin;Hong Seong-Min;Yoon Hyunsoo
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.15 no.3
    • /
    • pp.65-76
    • /
    • 2005
  • Speeding up scalar multiplication of an elliptic curve point has been a prime approach to efficient implementation of elliptic curve schemes such as EC-DSA and EC-ElGamal. Koblitz introduced a base-φ expansion method using the Frobenius map. Kobayashi et al. extended the base-φ scalar multiplication method to suit Optimal Extension Fields (OEF) by introducing a table-reference method. In this paper we propose an efficient scalar multiplication algorithm on elliptic curves over OEF. The proposed base-φ scalar multiplication method uses an optimized batch technique after rearranging the computation sequence of the base-φ expansion, usually called Horner's rule. Simulation results show that the new method accelerates scalar multiplication by about 20%~40% over the Kobayashi et al. method and is about three times as fast as some conventional scalar multiplication methods.
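The Horner rearrangement referred to above evaluates k·P for k = Σ cᵢφⁱ from the highest coefficient down. A minimal sketch, with the group operations passed in as functions and an integer stand-in for the curve (φ modeled as multiplication by a constant t; a real implementation would use curve point addition and the Frobenius endomorphism):

```python
def horner_base_phi(coeffs, phi, add, smul, P, zero):
    """Evaluate k*P for k = c_0 + c_1*phi + ... + c_{m-1}*phi^{m-1}
    by Horner's rule: Q <- phi(Q) + c_i * P, for i = m-1 down to 0."""
    Q = zero
    for c in reversed(coeffs):
        Q = add(phi(Q), smul(c, P))
    return Q

# Integer stand-in: the group is (Z, +) and phi acts as multiplication by t = 3,
# so the result should equal (2 + 1*3 + 4*9) * P.
k_times_p = horner_base_phi([2, 1, 4], lambda x: 3 * x,
                            lambda a, b: a + b, lambda c, p: c * p,
                            P=1, zero=0)
```

The paper's speedup comes from batching the per-step table-reference additions after this rearrangement; the sketch shows only the evaluation order.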

A comparison of on masse retraction of six anterior teeth with separate canine retraction (6전치 일괄(on masse) 견인과 견치 견인 후 4전치 견인 시 공간폐쇄 양상에 관한 연구)

  • Heo, Wook;Nahm, Dong-Seok
    • The korean journal of orthodontics
    • /
    • v.32 no.3 s.92
    • /
    • pp.165-174
    • /
    • 2002
  • The purpose of this study was to compare en masse retraction of the six anterior teeth with separate canine retraction in terms of the amount of anchorage loss and the retraction of the anterior teeth. The subjects were 30 adult female patients with Angle Class I malocclusions treated with a .022" straight-wire appliance and extraction of the four first premolars. They were divided into two groups: in Group 1 (15 subjects) the six anterior teeth were retracted en masse, and in Group 2 (15 subjects) the canines were retracted separately. Pre-treatment and post-treatment lateral cephalometric radiographs were analyzed, and all data were processed statistically with the independent-samples t-test. The conclusions were as follows. 1. There was no significant difference in the amount of anchorage loss between the two groups (p>0.05). 2. There was no significant difference in the amount of retraction of the anterior teeth between the two groups (p>0.05). 3. There was a significant difference in the inclinational change of the upper incisors between the two groups; it was greater in Group 2. 4. There was a significant difference in the vertical positional change of the upper incisal edges between the two groups; the upper incisal edges in Group 2 were extruded about 1 mm more than in Group 1. 5. There was no significant difference between the two groups in the vertical positional change of the root apices of the upper incisors (p>0.05) or of the upper molars (p>0.05).
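The independent-samples t-test cited above compares two group means against their pooled variability. A minimal stdlib sketch of the Student (equal-variance) form of the statistic, for illustration only (the study's actual measurements are not reproduced here):

```python
from statistics import mean, variance

def independent_t(sample1, sample2):
    """Student's independent-samples t statistic with pooled variance.
    Returns (t, degrees of freedom)."""
    n1, n2 = len(sample1), len(sample2)
    v1, v2 = variance(sample1), variance(sample2)  # unbiased (n-1) variances
    pooled = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    se = (pooled * (1 / n1 + 1 / n2)) ** 0.5
    return (mean(sample1) - mean(sample2)) / se, n1 + n2 - 2
```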

Directions for Developing Database Schema of Records in Archives Management Systems (영구기록물관리를 위한 기록물 데이터베이스 스키마 개발 방향)

  • Yim, Jin-Hee;Lee, Dae-Wook;Kim, Eun-Sil;Kim, Ik-Han
    • The Korean Journal of Archival Studies
    • /
    • no.34
    • /
    • pp.57-105
    • /
    • 2012
  • The CAMS (Central Archives Management System) of NAK (National Archives of Korea) is an important system that will receive and manage a large amount of electronic records annually from 2015. From the point of view of database design, this paper analyzes the database schema of CAMS and discusses directions for its overall improvement. First, this research analyzes the tables for records and folders in the CAMS database, which are the core tables for electronic records management. The analysis shows that it is difficult to trust the quality of the records in CAMS, because the two core tables are not normalized at all and contain many columns whose roles are unknown. Second, this study suggests directions for normalizing the records and folders tables: redistributing columns into the proper tables to reduce duplication; separating the columns describing the classification scheme into their own tables; separating the columns describing record types and sorts into their own tables; and separating the metadata related to acquisition, takeover, and preservation into their own tables. Third, this paper suggests considerations for designing and managing the database schema in each phase of archival management. In the ingest phase, the system should be able to process a large amount of records as batch jobs in time each year. In the preservation phase, the system should keep management histories in CAMS as audit trails, including the reclassification, revaluation, and preservation activities related to the records. In the access phase, the descriptive metadata sets for access should be selected and confirmed in various ways. Finally, this research presents a prototype conceptual database schema for CAMS that fulfills the metadata standards for records.
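The normalization direction described above (pulling classification and record-type columns out of the wide records table) can be sketched with a hypothetical, much-simplified schema — the table and column names here are illustrative assumptions, not the actual CAMS schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Classification scheme and record type live in their own tables;
# the record table references them instead of duplicating their columns.
cur.executescript("""
CREATE TABLE classification (
    class_id INTEGER PRIMARY KEY,
    scheme_name TEXT NOT NULL
);
CREATE TABLE record_type (
    type_id INTEGER PRIMARY KEY,
    type_name TEXT NOT NULL
);
CREATE TABLE record (
    record_id INTEGER PRIMARY KEY,
    title TEXT NOT NULL,
    class_id INTEGER REFERENCES classification(class_id),
    type_id INTEGER REFERENCES record_type(type_id)
);
""")
cur.execute("INSERT INTO classification VALUES (1, 'org-function-activity')")
cur.execute("INSERT INTO record_type VALUES (1, 'electronic document')")
cur.execute("INSERT INTO record VALUES (1, 'annual report', 1, 1)")
row = cur.execute("""
    SELECT r.title, c.scheme_name, t.type_name
    FROM record r
    JOIN classification c ON r.class_id = c.class_id
    JOIN record_type t ON r.type_id = t.type_id
""").fetchone()
```

Reorganizing a scheme or record type then means updating one row in its own table rather than rewriting duplicated columns across every record.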

An Improved Skyline Query Scheme for Recommending Real-Time User Preference Data Based on Big Data Preprocessing (빅데이터 전처리 기반의 실시간 사용자 선호 데이터 추천을 위한 개선된 스카이라인 질의 기법)

  • Kim, JiHyun;Kim, Jongwan
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.11 no.5
    • /
    • pp.189-196
    • /
    • 2022
  • A skyline query explores the objects that suit user preferences based on multiple attributes of the objects. Existing skyline queries return search results as batch processing, but the need for real-time results has grown with the advent of interactive apps and mobile environments. The online algorithm for skyline queries improves the return speed of objects so that preferred objects can be explored in real time, but its navigation process wastes time on repeated comparison operations. This paper proposes a Preprocessing Online Algorithm for Skyline Query (POA) that eliminates the unnecessary search time of the online exploration technique and provides skyline query results in real time. The proposed technique applies the concept of range limiting to the existing online algorithm, performing preprocessing and then eliminating repeatedly re-explored regions first. POA showed improvements over the online algorithm for standard distributions, biased distributions, positive correlations, and negative correlations of discrete data sets. By minimizing the comparison targets of the online algorithm, POA improves navigation performance, which can serve as a new criterion for rapid service to users as the use of mobile devices continues to grow.
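The core dominance test behind any skyline query, plus one simple preprocessing idea in the same spirit as the range limiting described above, can be sketched as follows. Sorting by attribute sum guarantees that a later point can never dominate an earlier one, so each candidate is compared only against the skyline found so far (this presort is a standard technique, not the paper's POA itself; smaller attribute values are assumed better):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every attribute
    and strictly better in at least one (smaller is better)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def skyline(points):
    """Skyline with a presort preprocessing step: after sorting by
    attribute sum, no later point can dominate an earlier one,
    so candidates are checked only against the current skyline."""
    result = []
    for p in sorted(points, key=sum):
        if not any(dominates(q, p) for q in result):
            result.append(p)
    return result
```

Because skyline points are appended as soon as they survive the check, results can be streamed to the user incrementally, which is the "online" behaviour the abstract is concerned with.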

An Iterative, Interactive and Unified Seismic Velocity Analysis (반복적 대화식 통합 탄성파 속도분석)

  • Suh Sayng-Yong;Chung Bu-Heung;Jang Seong-Hyung
    • Geophysics and Geophysical Exploration
    • /
    • v.2 no.1
    • /
    • pp.26-32
    • /
    • 1999
  • Among the various seismic data processing sequences, velocity analysis is the most time-consuming and man-hour-intensive step. For production seismic data processing, a good velocity analysis tool as well as a high-performance computer is required; the tool must give fast and accurate velocity analysis. There are two different approaches to velocity analysis: batch and interactive. In batch processing, a velocity plot is made at every analysis point; generally, the plot consists of a semblance contour, a super gather, and a stack panel, and the interpreter chooses the velocity function by analyzing the plot. The technique is highly dependent on the interpreter's skill and requires considerable human effort. As high-speed graphic workstations have become more popular, various interactive velocity analysis programs have been developed. Although these programs enable faster picking of velocity nodes with the mouse, their main improvement is simply the replacement of the paper plot by the graphic screen. The velocity spectrum is highly sensitive to the presence of noise, especially the coherent noise often found in the shallow region of marine seismic data. For accurate velocity analysis, this noise must be removed before the spectrum is computed. The analysis must also be carried out by carefully choosing the location of the analysis point and computing the spectrum accurately. The analyzed velocity function must be verified by mute and stack, and the sequence must usually be repeated. Therefore an iterative, interactive, and unified velocity analysis tool is highly desirable. Such a program, xva (X-Window based Velocity Analysis), was developed. The program handles all processes required in velocity analysis, such as composing the super gather, computing the velocity spectrum, NMO correction, mute, and stack. Most parameter changes produce the final stack via a few mouse clicks, thereby enabling iterative and interactive processing. A simple trace indexing scheme is introduced, and a program to make the index of the Geobit seismic disk file was developed; the index is used to reference the original input, i.e., the CDP sort, directly. A transformation technique for the mute function between the T-X domain and the NMOC domain is introduced and adopted in the program. The result of the transform is similar to the remove-NMO technique in suppressing shallow noise such as the direct wave and refracted wave, but it has two improvements: no interpolation error and very fast computing time. With this technique, mute times can easily be designed in the NMOC domain and applied to the super gather in the T-X domain, producing a more accurate velocity spectrum interactively. The xva program consists of 28 files, 12,029 lines, 34,990 words, and 304,073 characters. It references the Geobit utility libraries, can be installed in a Geobit-preinstalled environment, and runs under X-Window/Motif, with its menu designed according to the Motif style guide. A brief usage of the program is discussed. The program allows fast and accurate seismic velocity analysis, which is necessary for computing the AVO (Amplitude Versus Offset) based DHI (Direct Hydrocarbon Indicator) and for producing high-quality seismic sections.

  • PDF
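The NMO correction at the heart of the workflow above maps the hyperbolic moveout t(x) = √(t₀² + x²/v²) back to zero-offset time. A minimal sketch of the standard correction (nearest-neighbour resampling for brevity; xva itself is not reproduced here, and a production code would interpolate and apply stretch muting):

```python
import math

def nmo_time(t0, offset, velocity):
    """Hyperbolic moveout: travel time at a given offset for
    zero-offset time t0 and stacking velocity."""
    return math.sqrt(t0 ** 2 + (offset / velocity) ** 2)

def nmo_correct(trace, dt, offset, velocity):
    """Shift the samples of one trace to zero-offset time
    (nearest-neighbour resampling for brevity)."""
    n = len(trace)
    out = [0.0] * n
    for i in range(n):
        t0 = i * dt
        src = round(nmo_time(t0, offset, velocity) / dt)
        if src < n:
            out[i] = trace[src]
    return out
```

Stacking the NMO-corrected traces of a gather for a range of trial velocities is what produces the semblance contours the interpreter picks from.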

On Processing Raw Data from Micrometeorological Field Experiments (미기상학 야외실험에서 얻어지는 자료 처리에 관하여)

  • Hong, Jin-kyu;Kim, Joon
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.4 no.2
    • /
    • pp.119-126
    • /
    • 2002
  • Recently, the flux community in Korea established a new regional flux network, KoFlux, which will provide an infrastructure for collecting, synthesizing, and analyzing long-term measurements of energy and mass exchange between the atmosphere and various vegetated surfaces. KoFlux requires the collection of long time series of raw data, and a large amount of data is expected to accumulate due to continuous flux observations at each KoFlux site. We therefore need a systematic and efficient tool to manage these raw data. As part of this effort, a computer program for processing raw data measured in micrometeorological field experiments was developed for the flux community in Korea. In this paper, we introduce this program, which estimates fluxes and other turbulence statistics, and explain the micrometeorological processes coded in it. We also show some examples of how to run the program and handle its outputs for particular research purposes.
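The basic flux estimate such a program computes is the eddy covariance: after Reynolds decomposition, the turbulent flux is the covariance of the vertical wind speed w and the scalar concentration c over an averaging period, F = mean(w′c′). A minimal sketch (block averaging only; the KoFlux program's corrections such as coordinate rotation and detrending are not shown):

```python
from statistics import mean

def turbulent_flux(w, c):
    """Eddy-covariance flux as the covariance of vertical wind speed w
    and scalar concentration c: F = mean(w'c') with w' = w - mean(w),
    c' = c - mean(c)."""
    wbar, cbar = mean(w), mean(c)
    return mean((wi - wbar) * (ci - cbar) for wi, ci in zip(w, c))
```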

Visual Programming Environment for Effective Teaching and Research in Image Processing (영상처리에서 효율적인 교육과 연구를 위한 비주얼 프로그래밍 환경 개발)

  • Lee Jeong Heon;Heo Hoon;Chae Oksam
    • Journal of KIISE:Software and Applications
    • /
    • v.32 no.1
    • /
    • pp.50-61
    • /
    • 2005
  • With the widespread use of multimedia devices, the demand for image processing engineers is increasing in various fields, yet there are few engineers who can develop practical applications in the image processing area. Teaching practical image processing techniques requires a visual programming environment that can efficiently present image processing theories and, at the same time, provide interactive experiments for the theory presented. In this paper, we propose such a visual programming environment as an integrated environment for image processing. It consists of theory presentation systems and experiment systems based on the visual programming environment; the theory presentation systems support multimedia data, web documents, and PowerPoint files. The proposed system provides an integrated environment for application development as well as education. It accumulates and manages teaching materials and exercise data, offering an ideal image processing education and research environment to students and instructors.

Dynamic Load Management Method for Spatial Data Stream Processing on MapReduce Online Frameworks (맵리듀스 온라인 프레임워크에서 공간 데이터 스트림 처리를 위한 동적 부하 관리 기법)

  • Jeong, Weonil
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.19 no.8
    • /
    • pp.535-544
    • /
    • 2018
  • As the spread of mobile devices equipped with various sensors and high-quality wireless network communication functions expands, the amount of spatio-temporal data generated by mobile devices in various service fields is rapidly increasing. In conventional research on processing large real-time spatio-temporal streams, it is very difficult to apply a Hadoop-based spatial big data system, designed as a batch processing platform, to a real-time service for spatio-temporal data streams. This paper extends the MapReduce online framework to support real-time query processing for continuously arriving spatio-temporal data streams and proposes a load management method that distributes overloads for efficient query processing. The proposed scheme dynamically balances the load across nodes based on the inflow rate and the load factor of the input data over a spatial partition. Experiments show that efficient query processing can be supported by distributing the spatial data stream of a given area to shared resources when load management is required in that area.
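The load-balancing idea above can be sketched as a greedy migration loop: while some node's total inflow rate exceeds a threshold, move its busiest spatial partition to the least-loaded node, but only when that strictly lowers the peak load. This is a simplified illustration under assumed data structures, not the paper's exact scheme:

```python
def rebalance(partitions, nodes, threshold):
    """Greedy dynamic load balancing over spatial partitions.
    partitions: {partition_id: inflow_rate}
    nodes: {node_id: [partition_ids]} (mutated in place and returned)."""
    load = lambda n: sum(partitions[p] for p in nodes[n])
    while True:
        hot = max(nodes, key=load)
        cold = min(nodes, key=load)
        if load(hot) <= threshold or hot == cold:
            break
        busiest = max(nodes[hot], key=lambda p: partitions[p])
        if load(cold) + partitions[busiest] >= load(hot):
            break  # no migration would lower the peak load
        nodes[hot].remove(busiest)
        nodes[cold].append(busiest)
    return nodes
```

Each migration strictly reduces the overloaded node's load while keeping the receiver below it, so the loop terminates; in a streaming setting it would be re-run whenever measured inflow rates change.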