• Title/Summary/Keyword: Discrete Support

165 search results

Updating Algorithms using a Galois-Lattice Structure for Building and Maintaining Object-Oriented Analysis Models (Galois-격자 구조를 이용한 객체지향 분석 모델 구축과 유지에 관한 갱신 알고리즘)

  • Ahn, Hi-Suck;Jun, Moon-Seog;Rhew, Sung-Yul
    • The Transactions of the Korea Information Processing Society, v.2 no.4, pp.477-486, 1995
  • This paper constructs object-oriented analysis models using Galois lattices, a structure studied in discrete mathematics, presents fundamental approaches to maintaining those models, and analyzes the construction of object-oriented analysis models through worked examples. We also define several properties of Galois lattices that capture binary relations between class objects, and propose incremental updating algorithms that update the lattice whenever new classes are added. Simulations show that adding a new class node can be handled in constant time, that the structure grows linearly in the worst case, and that the growth rate of the lattice is proportional to the number of class nodes in time complexity. These results yield high understandability of object-oriented analysis models and high traceability during maintenance. Furthermore, they improve class reusability, one of the advantages of object-oriented systems, and genuinely support hierarchical class maintenance.
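
The Galois-lattice construction the abstract describes can be illustrated with a small sketch. This is a naive batch enumeration of formal concepts from a binary class-feature relation, not the paper's incremental algorithm, and the class names and features below are hypothetical:

```python
# Naive construction of the Galois (concept) lattice nodes from a binary
# object-attribute context. Illustrative only; the paper proposes an
# incremental update instead of this exhaustive enumeration.
from itertools import combinations

def extent(context, attrs):
    # objects possessing every attribute in attrs
    return frozenset(o for o, a in context.items() if attrs <= a)

def intent(context, objs):
    # attributes common to all objects in objs
    sets = [frozenset(context[o]) for o in objs]
    if not sets:
        return frozenset().union(*(frozenset(a) for a in context.values()))
    return frozenset.intersection(*sets)

def concepts(context):
    # each lattice node is a closed (extent, intent) pair
    attrs = frozenset().union(*(frozenset(a) for a in context.values()))
    found = set()
    for r in range(len(attrs) + 1):
        for combo in combinations(sorted(attrs), r):
            e = extent(context, frozenset(combo))
            found.add((e, intent(context, e)))
    return found

# toy class/feature context (hypothetical)
ctx = {
    "Stack": {"push", "pop"},
    "Queue": {"enqueue", "pop"},
    "Deque": {"push", "pop", "enqueue"},
}
cs = concepts(ctx)
```

Each node pairs a maximal set of classes with the features they all share, which is what makes the lattice useful for spotting candidate superclasses.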

A probabilistic knowledge model for analyzing heart rate variability (심박수변이도 분석을 위한 확률적 지식기반 모형)

  • Son, Chang-Sik;Kang, Won-Seok;Choi, Rock-Hyun;Park, Hyoung-Seob;Han, Seongwook;Kim, Yoon-Nyun
    • Journal of Korea Society of Industrial Information Systems, v.20 no.3, pp.61-69, 2015
  • This study presents a probabilistic knowledge discovery method for interpreting heart rate variability (HRV) based on time- and frequency-domain indexes extracted using the discrete wavelet transform. The knowledge induction algorithm consists of two phases: rule generation and rule estimation. First, rule generation converts numerical attributes to intervals using ROC curve analysis and constructs a reduced rule set by comparing the consistency degree between attribute-value pairs with different decision values. Then we estimate three measures, rule support, confidence, and coverage, to give a probabilistic interpretation of each rule. To show the effectiveness of the proposed model, we evaluated the statistical discriminant power of five rules (3 for atrial fibrillation, 1 for normal sinus rhythm, and 1 for both) generated from data (n=58) collected with a 1-channel wireless Holter electrocardiogram (ECG), HeartCall® (U-Heart Inc.). The experimental results showed performance of approximately 0.93 (93%) in terms of accuracy, sensitivity, specificity, and AUC.
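
The three rule measures the abstract names (support, confidence, coverage) can be sketched directly. The records and the `sdnn` threshold below are hypothetical, not taken from the paper's dataset:

```python
# Estimating support, confidence, and coverage for an if-then rule over a
# labeled dataset. Illustrative; field names and thresholds are made up.
def rule_measures(records, antecedent, consequent):
    """records: list of dicts; antecedent/consequent: predicates on a record."""
    n = len(records)
    a = sum(1 for r in records if antecedent(r))                    # rule fires
    both = sum(1 for r in records if antecedent(r) and consequent(r))
    c = sum(1 for r in records if consequent(r))                    # class size
    support = both / n                  # P(antecedent and class)
    confidence = both / a if a else 0.0 # P(class | antecedent)
    coverage = both / c if c else 0.0   # fraction of the class captured
    return support, confidence, coverage

# toy HRV-like records with a rhythm label (hypothetical values)
data = [
    {"sdnn": 20, "label": "AF"}, {"sdnn": 25, "label": "AF"},
    {"sdnn": 80, "label": "NSR"}, {"sdnn": 90, "label": "NSR"},
    {"sdnn": 30, "label": "NSR"},
]
s, cf, cv = rule_measures(data, lambda r: r["sdnn"] < 40,
                          lambda r: r["label"] == "AF")
```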

ARtalet for Digilog Book Authoring Tool - Authoring 3D Objects Properties (디지로그 북 저작도구 ARtalet - 3 차원 객체 속성 저작)

  • Ha, Tae-Jin;Lee, Youg-Ho;Woo, Woon-Tack
    • Proceedings of the Korean HCI Society Conference, 2008.02a, pp.314-318, 2008
  • This paper describes an authoring interface for augmented/mixed reality books, specifically for authoring the 3D object properties of a Digilog Book. Our goal is that even ordinary users without programming knowledge can create a Digilog Book easily. The authoring interface for 3D object properties comprises a manipulator as an input device and 3D content authoring parts. As a design metaphor, it references existing GUI interfaces already familiar to computer users. The manipulator generates the continuous/discrete input signals the authoring interface needs, and the content authoring part uses them to perform selection, positioning, scaling, coloring, and copying of virtual objects. Users can also exploit familiar GUI techniques, including pointing, clicking, drag-and-drop, and copying, with the manipulator. We therefore believe our AR authoring system supports rapid and intuitive modification of virtual object properties.

Application of Model-Based Systems Engineering to Large-Scale Multi-Disciplinary Systems Development (모델기반 시스템공학을 응용한 대형복합기술 시스템 개발)

  • Park, Joong-Yong;Park, Young-Won
    • Journal of Institute of Control, Robotics and Systems, v.7 no.8, pp.689-696, 2001
  • Large-scale Multi-disciplinary Systems (LMS), such as transportation, aerospace, and defense systems, are complex systems with many subsystems, interfaces, functions, and demanding performance requirements. Because many contractors participate in development, methods are needed for sharing common objectives and communicating design status effectively among all stakeholders. The systems engineering processes and methods, which include system requirement analysis, functional analysis, architecting, system analysis, interface control, and system specification development, provide a success-oriented, disciplined approach to the project. This paper presents the methodology and results of applying model-based systems engineering to an Automated Guided Transit (AGT) system, one example of an LMS, and also proposes extending the model-based tool to help manage a project by linking the WBS (Work Breakdown Structure), work organization, and PBS (Product Breakdown Structure). The model-based functional analysis focused on the top-level operational concept of an example rail system and on the propulsion/braking function, a key function of modern automated rail systems. The model-based behavior analysis approach, which applies a discrete-event simulation method, facilitates system functional definition and test and verification activities. The first application of a computer-aided tool, RDD-100, in the railway industry demonstrates the capability to model product design knowledge and decisions on key issues such as the rationale for architecting the top-level system. This model-based product design knowledge will be essential in integrating the follow-on life-cycle activities, production through operation and support, over the life of the AGT system. Additionally, when a new-generation train system is required, reuse of the model-based database can significantly increase system design productivity and effectiveness.

An Improved Particle Swarm Optimization Algorithm for Care Worker Scheduling

  • Akjiratikarl, Chananes;Yenradee, Pisal;Drake, Paul R.
    • Industrial Engineering and Management Systems, v.7 no.2, pp.171-181, 2008
  • Home care, known also as domiciliary care, is part of the community care service that is a responsibility of the local government authorities in the UK as well as many other countries around the world. The aim is to provide the care and support needed to assist people, particularly older people, people with physical or learning disabilities and people who need assistance due to illness to live as independently as possible in their own homes. It is performed primarily by care workers visiting clients' homes where they provide help with daily activities. This paper is concerned with the dispatching of care workers to clients in an efficient manner. The optimized routine for each care worker determines a schedule to achieve the minimum total cost (in terms of distance traveled) without violating the capacity and time window constraints. A collaborative population-based meta-heuristic called Particle Swarm Optimization (PSO) is applied to solve the problem. A particle is defined as a multi-dimensional point in space which represents the corresponding schedule for care workers and their clients. Each dimension of a particle represents a care activity and the corresponding, allocated care worker. The continuous position value of each dimension determines the care worker to be assigned and also the assignment priority. A heuristic assignment scheme is specially designed to transform the continuous position value to the discrete job schedule. This job schedule represents the potential feasible solution to the problem. The Earliest Start Time Priority with Minimum Distance Assignment (ESTPMDA) technique is developed for generating an initial solution which guides the search direction of the particle. Local improvement procedures (LIP), insertion and swap, are embedded in the PSO algorithm in order to further improve the quality of the solution. The proposed methodology is implemented, tested, and compared with existing solutions for some 'real' problem instances.
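
The continuous-to-discrete decoding idea above, where each dimension's position value selects both a care worker and an assignment priority, can be sketched as follows. This illustrative decoding is not the paper's ESTPMDA heuristic:

```python
# Decoding a continuous PSO particle into a discrete job schedule.
# Convention (assumed for illustration): the integer part of each dimension
# picks the worker, the fractional part orders that worker's jobs.
def decode(position, n_workers):
    """position: one float per care activity. Returns per-worker job lists,
    each ordered by the fractional-part priority (smaller = earlier)."""
    schedule = {w: [] for w in range(n_workers)}
    for job, x in enumerate(position):
        worker = int(x) % n_workers   # which worker gets this activity
        priority = x - int(x)         # fractional part = assignment priority
        schedule[worker].append((priority, job))
    return {w: [j for _, j in sorted(jobs)] for w, jobs in schedule.items()}

sched = decode([0.7, 1.2, 0.3, 1.9, 2.5], 3)
```

Feasibility checks (time windows, capacity) would then be applied to the decoded schedule before evaluating its travel cost.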

Simulation Analysis for Verifying an Implementation Method of Higher-performed Packet Routing

  • Park, Jaewoo;Lim, Seong-Yong;Lee, Kyou-Ho
    • Proceedings of the Korea Society for Simulation Conference, 2001.10a, pp.440-443, 2001
  • As inter-network traffic grows rapidly, router systems must provide not only wire-speed packet processing but also rich programmability for quality-of-service features. Network processor technology is widely used to achieve these capabilities in high-end routers. Even so, a network processor cannot support deep packet processing at nominal wire speed, and considering QoS may further degrade packet-processing performance. To achieve faster processing, one network processor chipset is occasionally not enough, and using more than one raises problems such as out-of-order delivery of packets. This can be serious in applications such as voice over IP and video services, which assume that packets arrive in order. An effective packet-processing mechanism is therefore needed for using more than one network processor in parallel within one linecard unit of the router system, along with simulation analysis to verify it. We propose a packet-processing mechanism consisting of two or more NPs in parallel. In this mechanism, we use a load-balancing algorithm that distributes the packet traffic load evenly while preserving packet sequence, and we verify the algorithm with simulation analysis. As the simulation tool we use DEVSim++, a DEVS formalism-based hierarchical discrete-event simulation environment developed at KAIST. In this paper we show not only the applicability of the DEVS formalism to hardware modeling and simulation but also the predicted performance of the load balancer when implemented with an FPGA.
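
One common way to balance load across network processors while keeping per-flow packet order, in the spirit of the sequence-preserving algorithm described above, is flow hashing. This sketch is illustrative and not the paper's FPGA design:

```python
# Flow-hash load balancing: all packets of a flow hash to the same NP, so
# per-flow order is preserved without any reordering logic downstream.
import zlib

def assign_np(src, dst, n_processors):
    # hash the flow identifier; a real 5-tuple would also include
    # ports and protocol (omitted here for brevity)
    key = f"{src}->{dst}".encode()
    return zlib.crc32(key) % n_processors

flows = [("10.0.0.1", "10.0.0.9"), ("10.0.0.2", "10.0.0.9")]
# three packets per flow, interleaved: each must map to one consistent NP
a = [assign_np(s, d, 2) for s, d in flows * 3]
```

The trade-off is that evenness depends on flow diversity; a single heavy flow cannot be split without reintroducing the reordering problem.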

Study of the experimentation methodology for the counter fire operations by using discrete event simulation (이산사건 시뮬레이션을 활용한 대화력전 전투실험 방법론 연구)

  • Kim, Hyungkwon;Kim, Hyokyung;Kim, Youngho
    • Journal of the Korea Society for Simulation, v.25 no.2, pp.41-49, 2016
  • Counter fire operations can be characterized as a system of systems whose key features include situational awareness, command and control systems, and a highly responsive strike capability achieved with precision weapons. Current modeling methodology cannot adequately address a system of systems, and the modeling and simulation tools used to implement analytic options can be time consuming and expensive. We describe a methodology and tools developed for analyzing the effectiveness of counter fire operations in a network-centric warfare environment and suggest how they can support efficient decision making. The theater counter fire operations tool consists of an Enemy block, an ISR block, a C2 block, and a Shooter block. For ease of use by domain experts who are not simulation experts, it provides an environment in which each parameter and algorithm can easily be altered by the user.
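
The event-driven style of such a simulation can be sketched with a minimal discrete-event loop. The detect/decide/strike chain and its delays are hypothetical, not the paper's counter-fire model:

```python
# Minimal discrete-event simulation skeleton: a priority queue of timestamped
# events; each handler may schedule follow-on events (sensing -> C2 -> strike).
import heapq

class Simulator:
    def __init__(self):
        self.clock = 0.0
        self.queue = []   # (time, seq, event name); seq breaks ties stably
        self.seq = 0
        self.log = []

    def schedule(self, delay, name):
        heapq.heappush(self.queue, (self.clock + delay, self.seq, name))
        self.seq += 1

    def run(self, handlers):
        while self.queue:
            self.clock, _, name = heapq.heappop(self.queue)
            self.log.append((self.clock, name))
            handlers.get(name, lambda s: None)(self)

sim = Simulator()
handlers = {
    "detect": lambda s: s.schedule(2.0, "decide"),  # ISR detects a target
    "decide": lambda s: s.schedule(1.0, "strike"),  # C2 assigns a shooter
    "strike": lambda s: None,                       # shooter engages
}
sim.schedule(0.0, "detect")
sim.run(handlers)
```

Swapping the lambdas for parameterized blocks (Enemy, ISR, C2, Shooter) gives the block structure the abstract describes.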

Symmetric Shape Deformation Considering Facial Features and Attractiveness Improvement (얼굴 특징을 고려한 대칭적인 형상 변형과 호감도 향상)

  • Kim, Jeong-Sik;Shin, Il-Kyu;Choi, Soo-Mi
    • Journal of the Korea Computer Graphics Society, v.16 no.2, pp.29-37, 2010
  • In this paper, we present a novel deformation method for alleviating the asymmetry of a scanned 3D face while considering facial features. To handle detailed areas of the face, we developed a new local 3D shape descriptor based on facial features and surface curvatures. The descriptor improves accuracy when deforming a 3D face toward a symmetric configuration because it provides accurate point pairing with respect to the plane of symmetry. In addition, we use a point-based representation throughout all stages of symmetrization, which makes discrete processing much easier to support. Finally, we performed a statistical analysis to assess subjects' preference for faces symmetrized by our approach.
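
The point-pairing step with respect to the symmetry plane can be sketched geometrically. This toy version pairs each point with the nearest neighbor of its mirror image across the plane x = 0, and omits the feature- and curvature-based descriptor the paper adds to disambiguate the pairing:

```python
# Geometric core of symmetry point pairing: reflect across the plane x = 0,
# then match each mirrored point to its nearest neighbor (brute force).
def reflect(p):
    x, y, z = p
    return (-x, y, z)

def dist2(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def pair_points(points):
    pairs = []
    for p in points:
        m = reflect(p)
        q = min(points, key=lambda r: dist2(m, r))  # nearest to mirror image
        pairs.append((p, q))
    return pairs

pts = [(1.0, 0.0, 0.0), (-1.1, 0.0, 0.0), (0.0, 2.0, 0.0)]
pairs = pair_points(pts)
```

The asymmetry of each pair (here, 1.0 vs 1.1) is what a symmetrizing deformation would then reduce.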

A Novel Transmission Scheme for Compressed Health Data Using ISO/IEEE11073-20601

  • Kim, Sang-Kon;Kim, Tae-Kon;Lee, Hyungkeun
    • KSII Transactions on Internet and Information Systems (TIIS), v.11 no.12, pp.5855-5877, 2017
  • In view of personal health and disease management based on cost-effective healthcare services, there is a growing need for real-time monitoring. The electrocardiogram (ECG) signal is one of the most important kinds of health information, and real-time ECG monitoring provides an efficient way to cope with emergency situations as well as to assist everyday health care. Such a system must continuously collect and transmit large amounts of ECG data within a given time while providing maximum user convenience. Given limited wireless capacity and unstable channel conditions, appropriate signal processing and transmission techniques, such as compression, are required. However, the ISO/IEEE 11073 standards for interoperability between personal health devices cannot properly support compressed data transmission. In the present study, therefore, the problems in handling compressed data are specified, and extended agent and manager devices are proposed that address these problems while maintaining compatibility with existing devices. The extended devices have two PM-stores enabling compression and a novel transmission scheme. A variety of compression techniques can be applied; in this paper, the discrete cosine transform (DCT) is used, and prioritizing the information after DCT compression enables new transmission techniques for performance improvement. Using simulation results, the compressed signal and the original uncompressed signal transmitted over a noisy channel are compared in terms of percent root-mean-square difference (PRD). Our transmission scheme shows better performance and complies with the 11073 standards.
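
The DCT-based compression and the PRD metric mentioned above can be sketched with a short stdlib-only example; the signal, transform length, and number of retained coefficients are hypothetical:

```python
# DCT-II compression of a short signal by keeping the K largest coefficients,
# then measuring percent root-mean-square difference (PRD) after
# reconstruction. Illustrative; not the paper's transmission scheme.
import math

def dct2(x):
    N = len(x)
    return [sum(x[n] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for n in range(N)) for k in range(N)]

def idct2(X):
    N = len(X)
    return [(X[0] / 2 + sum(X[k] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                            for k in range(1, N))) * 2 / N for n in range(N)]

def prd(x, y):
    num = sum((a - b) ** 2 for a, b in zip(x, y))
    den = sum(a * a for a in x)
    return 100 * math.sqrt(num / den)

sig = [math.sin(2 * math.pi * n / 16) for n in range(16)]   # toy "ECG" segment
X = dct2(sig)
keep = sorted(range(16), key=lambda k: abs(X[k]), reverse=True)[:4]
Xc = [X[k] if k in keep else 0.0 for k in range(16)]        # drop small coeffs
rec = idct2(Xc)
err = prd(sig, rec)
```

Transmitting the kept coefficients first is the kind of prioritization the abstract exploits: losing low-priority data degrades PRD gracefully.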

Performance Comparison of DCT Algorithm Implementations Based on Hardware Architecture (프로세서 구조에 따른 DCT 알고리즘의 구현 성능 비교)

  • Lee Jae-Seong;Pack Young-Cheol;Youn Dae-Hee
    • The Journal of Korean Institute of Communications and Information Sciences, v.31 no.6C, pp.637-644, 2006
  • This paper compares the performance and implementation of the standard and fast DCT algorithms commonly used for the subband filter bank in MPEG audio coders. The comparison is made according to the architecture of the implementation hardware. Fast DCT algorithms are known to have much lower computational complexity than the standard method, which computes a vector dot product with the cosine coefficients. However, due to their structural irregularity, fast DCT algorithms require extra cycles to generate operand addresses and to realign interim data. When the algorithms are implemented on DSP processors that provide special operations such as single-cycle MAC (multiply-accumulate) and zero-overhead nested loops, the standard algorithm is more advantageous than the fast algorithms. Also, with finite-precision processing, the error performance of the standard method is far superior to that of the fast algorithms. In this paper, truncation errors and algorithmic suitability are analyzed, and implementation results are provided to support the analysis.
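
The "standard" DCT the abstract refers to computes each output as a dot product of the input with a row of a precomputed cosine table, which maps naturally onto a DSP's single-cycle MAC. A minimal sketch (transform length and input are hypothetical):

```python
# Standard DCT-II as a MAC-style dot product against a cosine coefficient
# table: perfectly regular memory access, one multiply-accumulate per sample.
import math

N = 8
COS = [[math.cos(math.pi * k * (2 * n + 1) / (2 * N)) for n in range(N)]
       for k in range(N)]  # coefficient ROM, computed once

def dct_row(x, k):
    acc = 0.0                  # accumulator register
    for n in range(N):         # one MAC per sample: acc += coeff * sample
        acc += COS[k][n] * x[n]
    return acc

x = [1.0] * 8                  # constant input: all energy lands in X[0]
X = [dct_row(x, k) for k in range(N)]
```

This regularity, a fixed table and a tight accumulate loop, is why the standard form beats the fast algorithms on MAC-equipped DSPs despite its higher operation count.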