• Title/Summary/Keyword: Real Time Framework


TCP-ROME: A Transport-Layer Parallel Streaming Protocol for Real-Time Online Multimedia Environments

  • Park, Ju-Won;Karrer, Roger P.;Kim, Jong-Won
    • Journal of Communications and Networks / v.13 no.3 / pp.277-285 / 2011
  • Real-time multimedia streaming over the Internet is rapidly increasing with the popularity of user-created content, Web 2.0 trends, and P2P (peer-to-peer) delivery support. While many homes today are broadband-enabled, the quality of experience (QoE) of a user is still limited by frequent interruptions of media playout. The vulnerability of TCP (transmission control protocol), the transport-layer protocol most commonly used for streaming in practice, to packet losses, retransmissions, and timeouts makes it hard to deliver a timely and persistent flow of packets for online multimedia content. This paper presents TCP-real-time online multimedia environment (ROME), a novel transport-layer framework that allows the establishment and coordination of multiple many-to-one TCP connections. Between one client with multiple home addresses and multiple co-located or distributed servers, TCP-ROME increases the total throughput by aggregating the resources of multiple TCP connections. It also overcomes bandwidth fluctuations at network bottlenecks by dynamically coordinating the content streams from multiple servers and by adapting the streaming rate of all connections to match the bandwidth requirement of the target video.
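
The abstract does not give TCP-ROME's wire-level details, so the sketch below only illustrates the underlying idea of many-to-one aggregation: fetching byte ranges of one media object over parallel TCP connections to several servers and reassembling them in playout order. The server names, resource path, and chunk size are hypothetical placeholders, and plain HTTP range requests stand in for the TCP-ROME coordination layer.

```python
# Minimal sketch: aggregate throughput by fetching byte ranges of one media
# object over parallel TCP connections to several (hypothetical) servers,
# then reassembling them in playout order. Illustrative only; not TCP-ROME.
import http.client
from concurrent.futures import ThreadPoolExecutor

SERVERS = ["media1.example.com", "media2.example.com"]  # hypothetical mirrors
RESOURCE = "/video.mp4"                                  # hypothetical path
CHUNK = 256 * 1024                                       # bytes per request

def fetch_range(server: str, start: int, end: int) -> bytes:
    """Fetch one byte range over its own TCP connection."""
    conn = http.client.HTTPConnection(server, timeout=10)
    conn.request("GET", RESOURCE, headers={"Range": f"bytes={start}-{end}"})
    body = conn.getresponse().read()
    conn.close()
    return body

def aggregate(total_size: int) -> bytes:
    """Round-robin byte ranges across servers and reassemble them in order."""
    ranges = [(i, min(i + CHUNK, total_size) - 1)
              for i in range(0, total_size, CHUNK)]
    with ThreadPoolExecutor(max_workers=len(SERVERS)) as pool:
        futures = [pool.submit(fetch_range, SERVERS[k % len(SERVERS)], s, e)
                   for k, (s, e) in enumerate(ranges)]
    return b"".join(f.result() for f in futures)
```

A real implementation would additionally adapt the per-connection request rate to the target video bitrate, which is the coordination role the abstract attributes to TCP-ROME.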

Faster-than-real-time Hybrid Autonomous Underwater Glider Simulation for Ocean Mapping

  • Choi, Woen-Sug;Bingham, Brian;Camilli, Richard
    • Journal of the Korean Society of Marine Environment & Safety / v.28 no.3 / pp.441-450 / 2022
  • The introduction of autonomous underwater gliders (AUGs) specifically addresses the reduction of operational costs that are prohibitive with conventional autonomous underwater vehicles (AUVs), using a "scaling-down" design philosophy that exploits the characteristics of autonomous drifters to greatly extend operation duration and coverage. Long-duration, wide-area missions raise the cost and complexity of in-water testing for novel approaches to autonomous mission planning. As a result, a simulator that supports the rapid design, development, and testing of autonomy solutions across a wide range of conditions, using software-in-the-loop simulation at faster-than-real-time speeds, becomes critical. This paper describes a faster-than-real-time AUG simulator that can support high-resolution bathymetry for a wide variety of ocean environments, including ocean currents, various sensors, and vehicle dynamics. On top of the de facto standard ROS-Gazebo framework and open-source underwater vehicle simulation packages, features specific to AUGs for ocean mapping are developed. For vehicle dynamics, the next-generation hybrid autonomous underwater glider (Hybrid-AUG), which operates with both a buoyancy engine and thrusters to improve navigation for bathymetry mapping (e.g., line trajectories), is implemented, since it can also describe conventional AUGs without thrusters. The simulation results are validated against experiments while operating 120 times faster than real time.
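
As a rough illustration of what "faster than real time" means here (not the paper's ROS-Gazebo stack), the following sketch integrates toy glider depth dynamics with a fixed step and reports the ratio of simulated time to wall-clock time; the dynamics, coefficients, and mission profile are placeholders.

```python
# Minimal sketch of "faster than real time": integrate simplified glider depth
# dynamics with a fixed simulation step and compare elapsed wall-clock time to
# simulated time. The toy dynamics and coefficients are illustrative only.
import time

DT = 0.01          # simulated seconds per step
DURATION = 600.0   # simulated mission time (s)

def step(depth: float, vel: float, buoyancy_cmd: float) -> tuple[float, float]:
    """One Euler step of toy vertical dynamics: buoyancy force minus drag."""
    accel = buoyancy_cmd - 0.5 * vel * abs(vel)   # placeholder coefficients
    vel += accel * DT
    depth += vel * DT
    return depth, vel

def run() -> float:
    depth, vel, sim_t = 0.0, 0.0, 0.0
    start = time.perf_counter()
    while sim_t < DURATION:
        cmd = -0.2 if sim_t < 300.0 else 0.2      # dive for 300 s, then climb
        depth, vel = step(depth, vel, cmd)
        sim_t += DT
    wall = time.perf_counter() - start
    return DURATION / wall   # real-time factor (the paper reports about 120x)

if __name__ == "__main__":
    print(f"real-time factor: {run():.1f}x")
```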

Deep Learning Framework with Convolutional Sequential Semantic Embedding for Mining High-Utility Itemsets and Top-N Recommendations

  • Siva S;Shilpa Chaudhari
    • Journal of information and communication convergence engineering / v.22 no.1 / pp.44-55 / 2024
  • High-utility itemset mining (HUIM) is a dominant technology that enables enterprises to make real-time decisions in areas such as supply chain management, customer segmentation, and business analytics. However, classical support-value-driven Apriori solutions are limited and unable to meet real-time enterprise demands, especially for large amounts of input data. This study introduces a model for top-N high-utility itemset mining in real-time enterprise applications. Unlike traditional Apriori-based solutions, the proposed convolutional sequential embedding metrics-driven, cosine-similarity-based multilayer perceptron learning model leverages global and contextual features, including semantic attributes, for enhanced top-N recommendations over sequential transactions. MATLAB-based simulations of the model on diverse datasets demonstrated a precision of 0.5632, a mean absolute error (MAE) of 0.7610, a hit rate (HR)@K of 0.5720, and a normalized discounted cumulative gain (NDCG)@K of 0.4268. The average MAE across different datasets and latent dimensions was 0.608. Additionally, the model achieved a cumulative accuracy of 97.94% and a precision of 97.04%, surpassing existing state-of-the-art models. This affirms the robustness and effectiveness of the proposed model in real-time enterprise scenarios.
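
The convolutional sequential embedding model itself is not specified in the abstract; the sketch below, with hypothetical NumPy embeddings, shows only the final cosine-similarity top-N ranking step over a user's transaction history.

```python
# Minimal sketch of the cosine-similarity top-N step only: given (hypothetical)
# item embeddings and a user's recent transaction sequence, rank unseen items.
# The convolutional sequential embedding model itself is not reproduced here.
import numpy as np

def top_n(item_emb: np.ndarray, history: list[int], n: int = 10) -> list[int]:
    """Rank items by cosine similarity to the mean embedding of the history."""
    profile = item_emb[history].mean(axis=0)
    norms = np.linalg.norm(item_emb, axis=1) * np.linalg.norm(profile) + 1e-12
    scores = item_emb @ profile / norms
    scores[history] = -np.inf            # do not recommend already-bought items
    return np.argsort(-scores)[:n].tolist()

# Toy usage with random embeddings standing in for learned ones.
rng = np.random.default_rng(0)
emb = rng.normal(size=(100, 16))         # 100 items, 16-dim embeddings
print(top_n(emb, history=[3, 17, 42], n=5))
```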

APPLICATIONS OF THE REPRODUCING KERNEL THEORY TO INVERSE PROBLEMS

  • Saitoh, Saburou
    • Communications of the Korean Mathematical Society / v.16 no.3 / pp.371-383 / 2001
  • In this survey article, we shall introduce the applications of the theory of reproducing kernels to inverse problems. At the same time, we shall present some operator versions of our fundamental general theory for linear transforms in the framework of Hilbert spaces.
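
For orientation, the reproducing property and the kernel induced by a linear transform, which are the standard starting points of this theory rather than results specific to the paper, can be written as follows.

```latex
% Reproducing property of the Hilbert space H_K with kernel K on a set E:
\[
  K(\cdot, p) \in H_K
  \quad\text{and}\quad
  f(p) = \langle f,\, K(\cdot, p) \rangle_{H_K}
  \qquad (p \in E,\ f \in H_K).
\]
% For a linear transform f(p) = <F, h(p)> of F in a Hilbert space \mathcal{H},
% the images f form the space H_K admitting the kernel
\[
  K(p, q) = \langle h(q),\, h(p) \rangle_{\mathcal{H}}.
\]
```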


Verification of Real-time Hybrid Test System using RC Pier Model (RC교각을 이용한 실시간 하이브리드 실험 시스템의 적용성 연구)

  • Lee, Jinhaeng;Park, Minseok;Chae, Yunbyeong;Kim, Chul-Young
    • Journal of the Earthquake Engineering Society of Korea / v.22 no.4 / pp.253-259 / 2018
  • Structural behavior under an earthquake is experimentally simulated mainly through shaking table tests. For large-scale structures, however, size effects in a scaled-down model can make it difficult to assess the actual behavior properly. To address this problem, research on hybrid simulation is being actively conducted. In this method, the substructuring technique is applied so that the members that dominantly affect the overall behavior of the structure are tested physically at full scale, while the remaining parts are simulated numerically. However, existing studies on hybrid simulation focus mainly on Slow experimental methods, which cannot capture behavior close to the actual response when material properties are rate-dependent or when inertial forces are significant. The present study aims to establish a Real-time hybrid simulation system capable of excitation based on the actual time history and to verify its performance and applicability. The hybrid simulation system built in this study utilizes an ATS compensator, a CR integrator, and related components on the basis of MATLAB/Simulink so that the measured displacement tracks the target displacement. The target structure was a two-span bridge, and the RC pier supporting it was fabricated as the experimental specimen for the shaking table test and for the Slow and Real-time hybrid simulations. Responses to the El Centro earthquake were examined and compared. Relative to the shaking table test, the Real-time hybrid simulation reproduced the maximum displacement and vibration behavior more closely than the Slow hybrid simulation. Hence, the Real-time hybrid simulation proposed in this study is expected to be useful for seismic capacity assessment of highly nonlinear and rate-dependent structural systems such as RC piers.
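
Under the substructuring idea described above, each real-time hybrid step couples a numerical integration of the analytical substructure with a restoring force measured from the physical specimen. The sketch below shows only that loop shape for a single degree of freedom; command_actuator and read_restoring_force are hypothetical stand-ins for the actuator/ATS-compensator interface, and a simple explicit step stands in for the CR integrator of the paper's MATLAB/Simulink implementation.

```python
# Minimal sketch of one real-time hybrid simulation loop for a SDOF numerical
# substructure coupled to a physical specimen (the RC pier). The interface
# functions below are hypothetical; a simplified explicit integration step
# stands in for the CR integrator used in the paper.
import numpy as np

M, C = 1.0e4, 2.0e3        # placeholder mass (kg) and damping (N*s/m)
DT = 1.0 / 1024            # integration step sized to the controller rate

def command_actuator(target_disp: float) -> None:
    """Hypothetical: send the target displacement to the actuator controller."""

def read_restoring_force(target_disp: float) -> float:
    """Hypothetical: measured pier restoring force; here a linear stand-in."""
    return 1.0e6 * target_disp

def run(ground_accel: np.ndarray) -> np.ndarray:
    d, v, a = 0.0, 0.0, 0.0
    history = np.zeros_like(ground_accel)
    for i, ag in enumerate(ground_accel):
        d = d + DT * v + 0.5 * DT**2 * a       # predict next target displacement
        command_actuator(d)                    # impose it on the physical pier
        r = read_restoring_force(d)            # measured restoring force
        a = (-M * ag - C * v - r) / M          # equation of motion
        v = v + DT * a
        history[i] = d
    return history

disp = run(np.zeros(1024))   # toy run with zero excitation
```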

Developing an Intrusion Detection Framework for High-Speed Big Data Networks: A Comprehensive Approach

  • Siddique, Kamran;Akhtar, Zahid;Khan, Muhammad Ashfaq;Jung, Yong-Hwan;Kim, Yangwoo
    • KSII Transactions on Internet and Information Systems (TIIS) / v.12 no.8 / pp.4021-4037 / 2018
  • In network intrusion detection research, two characteristics are generally considered vital to building efficient intrusion detection systems (IDSs): an optimal feature selection technique and robust classification schemes. However, the emergence of sophisticated network attacks and the advent of big data concepts in intrusion detection domains require two more significant aspects to be addressed: employing an appropriate big data computing framework and utilizing a contemporary dataset to deal with ongoing advancements. As such, we present a comprehensive approach to building an efficient IDS with the aim of strengthening academic anomaly detection research in real-world operational environments. The proposed system has the following four characteristics: (i) it performs optimal feature selection using information gain and branch-and-bound algorithms; (ii) it employs machine learning techniques for classification, namely, Logistic Regression, Naïve Bayes, and Random Forest; (iii) it introduces bulk synchronous parallel processing to handle the computational requirements of large-scale networks; and (iv) it utilizes a real-time contemporary dataset generated by the Information Security Centre of Excellence at the University of New Brunswick (ISCX-UNB) to validate its efficacy. Experimental analysis shows the effectiveness of the proposed framework, which is able to achieve high accuracy, low computational cost, and reduced false alarms.
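
As a small illustration of steps (i) and (ii) only, the following scikit-learn sketch ranks features by mutual information (an information-gain-style criterion) and classifies with a Random Forest; the data here are random placeholders for the ISCX-UNB flows, and the paper's branch-and-bound search and bulk synchronous parallel processing are not reproduced.

```python
# Minimal sketch: information-gain-style feature ranking followed by Random
# Forest classification with scikit-learn. The data are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 40))            # placeholder flow features
y = rng.integers(0, 2, size=1000)          # placeholder attack/benign labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = make_pipeline(
    SelectKBest(mutual_info_classif, k=15),              # feature ranking
    RandomForestClassifier(n_estimators=200, random_state=0),
)
model.fit(X_tr, y_tr)
print("accuracy:", model.score(X_te, y_te))
```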

Design and Implementation of a Lightweight On-Device AI-Based Real-time Fault Diagnosis System using Continual Learning (연속학습을 활용한 경량 온-디바이스 AI 기반 실시간 기계 결함 진단 시스템 설계 및 구현)

  • Youngjun Kim;Taewan Kim;Suhyun Kim;Seongjae Lee;Taehyoun Kim
    • IEMEK Journal of Embedded Systems and Applications / v.19 no.3 / pp.151-158 / 2024
  • Although on-device artificial intelligence (AI) has gained attention for diagnosing machine faults in real time, most previous studies did not consider the model retraining and redeployment processes that must be performed in real-world industrial environments. Our study addresses this challenge by proposing an on-device AI-based real-time machine fault diagnosis system that utilizes continual learning. The proposed system includes a lightweight convolutional neural network (CNN) model, a continual learning algorithm, and a real-time monitoring service. First, we developed a lightweight 1D CNN model to reduce the cost of model deployment and enable real-time inference on the target edge device with limited computing resources. We then compared the performance of five continual learning algorithms on three public bearing fault datasets and selected the most effective algorithm for our system. Finally, we implemented a real-time monitoring service using an open-source data visualization framework. In the comparison between continual learning algorithms, the replay-based algorithms outperformed the regularization-based algorithms, and the experience replay (ER) algorithm achieved the best diagnostic accuracy. We further tuned the number and length of data samples used for the memory buffer of the ER algorithm to maximize its performance, and confirmed that its performance improves when longer data samples are used. Consequently, the proposed system achieved an accuracy of 98.7% while storing only 16.5% of the previous data in the memory buffer. The lightweight CNN model was also able to diagnose the fault type of a single data sample within 3.76 ms on a Raspberry Pi 4B device.
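
The ER algorithm's central mechanism is a small memory buffer of past samples mixed into every new training batch. The sketch below shows that mechanism with reservoir sampling; the buffer capacity, batch shapes, and the train_step stub for the 1D CNN are placeholders, not the authors' implementation.

```python
# Minimal sketch of the experience replay (ER) idea: keep a small buffer of
# past vibration samples and mix them into every batch when adapting the model
# to a new operating condition.
import random
import numpy as np

class ReplayBuffer:
    """Reservoir-sampled memory of (signal, label) pairs from earlier data."""
    def __init__(self, capacity: int):
        self.capacity, self.seen, self.data = capacity, 0, []

    def add(self, x: np.ndarray, y: int) -> None:
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:                                   # reservoir sampling
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = (x, y)

    def sample(self, k: int) -> list:
        return random.sample(self.data, min(k, len(self.data)))

def train_step(batch: list) -> None:
    """Stub standing in for one gradient step of the lightweight 1D CNN."""

def continual_update(buffer: ReplayBuffer, new_batch: list, replay_k: int = 16):
    train_step(new_batch + buffer.sample(replay_k))   # mix old and new samples
    for x, y in new_batch:
        buffer.add(x, y)
```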

Dynamic Route Guidance via Road Network Matching and Public Transportation Data

  • Nguyen, Hoa-Hung;Jeong, Han-You
    • Journal of IKEEE / v.25 no.4 / pp.756-761 / 2021
  • Dynamic route guidance (DRG) finds the fastest path from a source to a destination location considering real-time congestion information. In Korea, traffic state information is available through the public transportation data (PTD), which is indexed on top of the node-link map (NLM). While the NLM is an authoritative, low-detail road network covering major roads only, the OpenStreetMap road network (ORN) provides not only a high-detail road network but also several open-source routing engines, such as OSRM and Valhalla. In this paper, we propose a DRG framework based on road network matching between the NLM and the ORN. This framework regularly retrieves the NLM-indexed PTD to construct a historical speed profile, which is then mapped to the ORN. Next, we extend the Valhalla routing engine to support dynamic routing based on the historical speed profile. Numerical results for Yeouido island with 11 months of collected PTD show that our DRG framework reduces travel time by up to 15.24% and improves the accuracy of travel-time estimation by more than a factor of five.
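
Of the pipeline above, the historical speed profile step lends itself to a short sketch: aggregating (hypothetical) PTD records into average link speeds per weekday and time bin, which a dynamic-costing routing engine could then consume. The record fields and link ID are illustrative; the NLM-to-ORN matching and the Valhalla extension are not reproduced.

```python
# Minimal sketch of the historical speed profile step: aggregate (hypothetical)
# PTD records into average link speeds per weekday time bin.
from collections import defaultdict
from datetime import datetime

BIN_MIN = 15   # profile resolution in minutes

def build_profile(records: list[dict]) -> dict:
    """records: [{'link_id': str, 'speed_kmh': float, 'ts': datetime}, ...]"""
    sums = defaultdict(lambda: [0.0, 0])
    for r in records:
        slot = (r["ts"].weekday(), (r["ts"].hour * 60 + r["ts"].minute) // BIN_MIN)
        key = (r["link_id"], slot)
        sums[key][0] += r["speed_kmh"]
        sums[key][1] += 1
    return {k: s / n for k, (s, n) in sums.items()}

profile = build_profile([
    {"link_id": "1020003101", "speed_kmh": 34.0, "ts": datetime(2021, 5, 3, 8, 5)},
    {"link_id": "1020003101", "speed_kmh": 28.0, "ts": datetime(2021, 5, 10, 8, 10)},
])
print(profile)   # {('1020003101', (0, 32)): 31.0}
```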

A Flow Analysis Framework for Traffic Video

  • Bai, Lu-Shuang;Xia, Ying;Lee, Sang-Chul
    • Journal of Korea Spatial Information System Society / v.11 no.2 / pp.45-53 / 2009
  • The fast progress of multimedia data acquisition technologies has enabled the collection of vast amounts of video in real time. Although the amount of information gathered from these videos can be high in both quantity and quality, the use of the collected data is typically limited to human-centric monitoring systems. In this paper, we propose a framework for analyzing long traffic videos using a series of content-based analysis tools. Our framework suggests a method to integrate these analysis tools to extract highly informative features specific to traffic video analysis. The analytical framework provides (1) re-sampling tools for efficient and precise analysis, (2) foreground extraction methods for unbiased traffic flow analysis, (3) frame property analysis tools using a variety of frame characteristics, including brightness, entropy, Harris corners, and variance of traffic flow, and (4) a visualization tool that summarizes the entire video sequence and automatically highlights a collection of frames based on metrics defined by semi-automated or fully automated techniques. Based on the proposed framework, we developed an automated traffic flow analysis system and, in our experiments, show results from two example traffic videos taken from different monitoring angles.
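
As one concrete example of component (2), the sketch below uses OpenCV's MOG2 background subtractor to extract the foreground and track a per-frame foreground ratio as a crude traffic flow proxy; the video path is a placeholder and the framework's remaining tools are not reproduced.

```python
# Minimal sketch of the foreground-extraction component using OpenCV's MOG2
# background subtractor, with a per-frame foreground ratio as a flow proxy.
import cv2

cap = cv2.VideoCapture("traffic.avi")          # placeholder input video
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)

flow_ratio = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)             # foreground (moving vehicles)
    flow_ratio.append(cv2.countNonZero(mask) / mask.size)
cap.release()
print("mean foreground ratio:", sum(flow_ratio) / max(len(flow_ratio), 1))
```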


A Dual Modeling Method for a Real-Time Palpation Simulator

  • Kim, Sang-Youn;Park, Se-Kil;Park, Jin-Ah
    • Journal of Information Processing Systems / v.8 no.1 / pp.55-66 / 2012
  • This paper presents a dual modeling method that simulates the graphic and haptic behavior of a volumetric deformable object and conveys the behavior to a human operator. Although conventional modeling methods (the mass-spring model and the finite element method) are suitable for real-time computation of an object's deformation, it is not easy to compute the haptic behavior of a volumetric deformable object with these methods in real time (at a 1 kHz update rate) due to the computational burden. Previously, we proposed a fast volume haptic rendering method based on the S-chain model that can compute the deformation of a volumetric non-rigid object and its haptic feedback in real time. When the S-chain model represents the object, the haptic feeling is realistic, whereas the graphical results of the deformed shape look linear. In order to improve the graphic and haptic behavior at the same time, we propose a dual modeling framework in which a volumetric haptic model and a surface graphical model coexist. To inspect the graphic and haptic behavior of objects represented by the proposed dual model, experiments are conducted with volumetric objects consisting of about 20,000 nodes at a haptic update rate of 1000 Hz and a graphic update rate of 30 Hz. We also conduct human factor studies to show that the haptic and graphic behavior produced by our model is realistic. Our experiments verify that our model provides realistic haptic and graphic feedback to users in real time.
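
The S-chain and surface models are not detailed in the abstract, so the sketch below shows only the dual-rate loop structure the experiments report: a roughly 1 kHz haptic update and a 30 Hz graphic update sharing one deformation state, with stub functions standing in for the two models.

```python
# Minimal sketch of the dual-rate structure only: a ~1000 Hz haptic update and
# a ~30 Hz graphic update sharing one deformation state. The stubs stand in
# for the volumetric (haptic) and surface (graphic) models.
import time

HAPTIC_DT = 1.0 / 1000
GRAPHIC_DT = 1.0 / 30

def haptic_update(state: dict) -> None:
    """Stub: compute the reaction force from the volumetric model."""

def graphic_update(state: dict) -> None:
    """Stub: deform and redraw the surface mesh."""

def run(duration_s: float = 1.0) -> None:
    state = {}
    start = last_draw = time.perf_counter()
    while (now := time.perf_counter()) - start < duration_s:
        haptic_update(state)                     # every loop iteration (~1 kHz)
        if now - last_draw >= GRAPHIC_DT:        # ~30 Hz
            graphic_update(state)
            last_draw = now
        time.sleep(max(0.0, HAPTIC_DT - (time.perf_counter() - now)))

run()
```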