• Title/Summary/Keyword: Cloud computing systems


Development of Soil Erosion Analysis Systems Based on Cloud and HyGIS (클라우드 및 HyGIS기반 토양유실분석 시스템 개발)

  • Kim, Joo-Hun;Kim, Kyung-Tak;Lee, Jin-Won
    • Journal of the Korean Association of Geographic Information Studies, v.14 no.4, pp.63-76, 2011
  • This study aims to develop a model for analyzing soil loss as part of prior disaster impact assessment. Two versions are developed: an Internet-based soil loss analysis system built on a cloud computing platform, and a standalone system integrated with HyGIS. The cloud-based system lets users produce a soil loss distribution map without an S/W license and without preparing basic data such as a DEM, soil map, or land cover map; it also allows users to apply various factors, such as direct rainfall factors, when generating the distribution map. The soil loss analysis tools integrated with HyGIS are developed as an add-on to GEOMania GMMap2009 and can also produce the Soil Loss Hazard Map proposed by the OECD. Use of both systems shows that they make soil loss analysis very convenient. These models can be improved further through follow-up research on analyzing sediment at the watershed outlet and on calculating the R factor from data collected at multiple rain gauge stations.
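
Soil loss distribution maps of this kind are commonly derived cell by cell from a USLE/RUSLE-style formulation, A = R·K·LS·C·P, where R is the rainfall erosivity factor mentioned in the abstract. The Python sketch below is a hypothetical illustration of that general approach, not the authors' implementation; the factor grids and values are assumptions.

```python
import numpy as np

def usle_soil_loss(R, K, LS, C, P):
    """Cell-wise USLE-style soil loss A = R * K * LS * C * P.

    All inputs are 2-D arrays (raster grids) of identical shape,
    e.g. derived from a DEM, soil map, and land cover map.
    """
    return R * K * LS * C * P

# Hypothetical 3x3 factor grids, for illustration only.
R  = np.full((3, 3), 4500.0)           # rainfall erosivity
K  = np.array([[0.2, 0.25, 0.3]] * 3)  # soil erodibility
LS = np.ones((3, 3)) * 1.5             # slope length/steepness
C  = np.full((3, 3), 0.1)              # cover management
P  = np.ones((3, 3))                   # support practice

A = usle_soil_loss(R, K, LS, C, P)
print(A)  # soil loss distribution grid
```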

A Study on the Metadata Schema for the Collection of Sensor Data in Weapon Systems (무기체계 CBM+ 적용 및 확대를 위한 무기체계 센서데이터 수집용 메타데이터 스키마 연구)

  • Jinyoung Kim;Hyoung-seop Shim;Jiseong Son;Yun-Young Hwang
    • Journal of Internet Computing and Services, v.24 no.6, pp.161-169, 2023
  • With the Fourth Industrial Revolution, innovation in technologies such as artificial intelligence (AI), big data, and cloud computing is accelerating, and data is regarded as an important asset. Building on these technologies, various efforts are under way to drive technological innovation in defense science and technology. In March 2023, the Korean government announced the "Defense Innovation 4.0 Plan," which consists of five key points and 16 tasks for fostering armed forces based on advanced science and technology. The plan includes the establishment of a Condition-Based Maintenance Plus (CBM+) system to improve the operability and availability of weapon systems and to reduce defense costs. Condition-Based Maintenance (CBM) aims to secure the reliability and availability of a weapon system by analyzing changes in the equipment's state information to identify signs of failures and defects, and CBM+ adds Remaining Useful Life prediction to the existing CBM concept [1]. To establish a CBM+ system for a weapon system, sensors must be installed and sensor data collected to obtain its condition information. In this paper, we propose a sensor data metadata schema for efficiently and effectively managing the sensor data collected from sensors installed in various weapon systems.
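
The schema elements are defined in the paper itself; purely as a hypothetical illustration of what a sensor-data metadata record of this kind might capture, the Python sketch below models a few plausible fields (platform and sensor identifiers, sensor type, unit, sampling rate, acquisition time). The field names and values are assumptions, not the schema proposed by the authors.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class SensorDataMetadata:
    """Hypothetical metadata record for a weapon-system sensor stream."""
    weapon_system_id: str    # platform the sensor is installed on
    sensor_id: str           # unique sensor identifier
    sensor_type: str         # e.g. "vibration", "temperature"
    unit: str                # measurement unit of the raw values
    sampling_rate_hz: float  # how often the sensor is read
    collected_at: str        # ISO-8601 acquisition timestamp

record = SensorDataMetadata(
    weapon_system_id="WS-0001",
    sensor_id="VIB-17",
    sensor_type="vibration",
    unit="m/s^2",
    sampling_rate_hz=1000.0,
    collected_at=datetime.now(timezone.utc).isoformat(),
)
print(json.dumps(asdict(record), indent=2))
```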

A Prototype Implementation of Component Modules for Web-based SAR Data Processing System (웹 기반 SAR 자료처리 시스템 구성모듈 시험구현)

  • Kang, Sang-Goo;Lee, Ki-Won
    • Korean Journal of Remote Sensing, v.28 no.1, pp.29-38, 2012
  • Most remote sensing image processing systems today are client-based. From an information technology perspective, however, web-based systems are becoming predominant, being closely tied to cloud computing and services. In remote sensing, web-based systems have so far been limited mostly to data sharing and dissemination, and they need to be extended. This study implements a web-based system and its component modules for SAR data processing. First, previous work dealing with both web computing and SAR information is reviewed. Among SAR research domains, InSAR information processing and the corresponding modules for a web-based system are the main focus of this work. This approach is expected to be a first step toward linking web computing technology such as HTML5 with satellite image processing.
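
As a hypothetical sketch of how one component module of such a system might expose an InSAR processing step over HTTP, the Python snippet below wraps a toy interferogram computation in a Flask endpoint. Flask is used here only for illustration; the paper's actual modules, endpoints, and processing chain are not described by this code.

```python
from flask import Flask, jsonify
import numpy as np

app = Flask(__name__)

def generate_interferogram(master: np.ndarray, slave: np.ndarray) -> np.ndarray:
    """Toy placeholder for an InSAR step: phase difference of two SLC arrays."""
    return np.angle(master * np.conj(slave))

@app.route("/insar/interferogram", methods=["POST"])
def interferogram():
    # In a real module the two single-look complex images would be uploaded
    # or referenced by ID; here random data stands in for them.
    shape = (64, 64)
    master = np.exp(1j * np.random.uniform(0, 2 * np.pi, shape))
    slave = np.exp(1j * np.random.uniform(0, 2 * np.pi, shape))
    phase = generate_interferogram(master, slave)
    return jsonify({"rows": shape[0], "cols": shape[1],
                    "mean_phase": float(phase.mean())})

if __name__ == "__main__":
    app.run(port=8080)
```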

Malware Behavior Analysis based on Mobile Virtualization (모바일 가상화기반의 악성코드 행위분석)

  • Kim, Jang-Il;Lee, Hee-Seok;Jung, Yong-Gyu
    • The Journal of the Institute of Internet, Broadcasting and Communication, v.15 no.2, pp.1-7, 2015
  • With smartphones now used worldwide, smartphone users account for roughly 47.7% of all mobile communication subscribers, about 24 million people. Smartphones are vulnerable to security threats, and the damage from smartphone security incidents is increasing. Conventional responses, however, are mostly post-incident countermeasures: except when experts are involved, most damage is analyzed only after an infection has already occurred rather than being prevented in advance. In this paper, we implement a mobile malware analysis system based on virtualization technology and design it to analyze malware behavior. Virtualization is a technique that provides logical resources to a guest by abstracting the physical characteristics of computing resources. Combined with cloud computing services, virtualization can improve resource efficiency by flexibly provisioning servers, networks, storage, and computing resources. The proposed system thus allows security to be prepared for in advance from the user's perspective.

Energy-Aware Data-Preprocessing Scheme for Efficient Audio Deep Learning in Solar-Powered IoT Edge Computing Environments (태양 에너지 수집형 IoT 엣지 컴퓨팅 환경에서 효율적인 오디오 딥러닝을 위한 에너지 적응형 데이터 전처리 기법)

  • Yeontae Yoo;Dong Kun Noh
    • IEMEK Journal of Embedded Systems and Applications, v.18 no.4, pp.159-164, 2023
  • Because solar energy is recharged periodically, solar-energy-harvesting IoT devices prioritize maximizing the utilization of the harvested energy rather than minimizing energy consumption. Meanwhile, edge AI, which performs machine learning near the data source instead of in the cloud, is being actively studied for reasons such as data confidentiality and privacy, response time, and cost. One such research area is running various audio AI applications on audio data collected from multiple IoT devices in an IoT edge computing environment. In most studies, however, the IoT devices only transmit sensed data to the edge server, and every step, including data preprocessing, is performed on the edge server. This not only overloads the edge server but also causes network congestion because data unnecessary for learning is transmitted. Conversely, if data preprocessing is delegated to each IoT device to address this issue, the devices suffer increased blackout time due to energy shortages. In this paper, we alleviate the blackout problem on the devices while mitigating the issues of server-centric edge AI by deciding where the data is preprocessed based on the energy state of each IoT device. In the proposed method, an IoT device performs the preprocessing, which includes sound discrimination and noise removal, before transmitting to the server only when its available energy exceeds the energy threshold required for the device's basic operation.
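
The core decision rule described in the abstract (preprocess on the device only when the energy budget allows, otherwise forward the raw audio) can be summarized in a few lines. The Python sketch below is an assumed illustration of that rule; the threshold value, the preprocessing stubs, and the function names are hypothetical, not the paper's implementation.

```python
import numpy as np

ENERGY_THRESHOLD_J = 5.0  # hypothetical energy needed for basic device operation

def discriminate_sound(audio: np.ndarray) -> bool:
    """Toy sound discrimination: keep clips whose mean energy exceeds a floor."""
    return float(np.mean(audio ** 2)) > 1e-4

def remove_noise(audio: np.ndarray) -> np.ndarray:
    """Toy noise removal: zero out low-amplitude samples."""
    return np.where(np.abs(audio) > 0.01, audio, 0.0)

def handle_audio(audio: np.ndarray, available_energy_j: float) -> dict:
    """Decide where preprocessing happens based on the device's energy state."""
    if available_energy_j > ENERGY_THRESHOLD_J:
        # Enough surplus energy: preprocess locally, send only useful data.
        if not discriminate_sound(audio):
            return {"send": False}  # nothing worth sending
        return {"send": True, "payload": remove_noise(audio), "preprocessed": True}
    # Energy-scarce: forward raw audio and let the edge server preprocess it.
    return {"send": True, "payload": audio, "preprocessed": False}

sample = np.random.uniform(-0.05, 0.05, 16000)  # one second of toy audio
print(handle_audio(sample, available_energy_j=6.2)["preprocessed"])  # True
```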

Interactive Visual Analytic Approach for Anomaly Detection in BGP Network Data (BGP 네트워크 데이터 내의 이상징후 감지를 위한 인터랙티브 시각화 분석 기법)

  • Choi, So-mi;Kim, Son-yong;Lee, Jae-yeon;Kauh, Jang-hyuk;Kwon, Koo-hyung;Choo, Jae-gul
    • Journal of Internet Computing and Services, v.23 no.5, pp.135-143, 2022
  • As social distancing and telecommuting have spread worldwide due to COVID-19, dependence on the Internet has grown with the rise of real-time streaming sessions over routing protocols, video- and voice-related content services, and cloud computing. BGP is the most widely used routing protocol, and although many studies continue to improve its security, visual analysis that supports real-time investigation and reveals algorithmic mis-detections is still lacking. In this paper, we analyze real-world BGP data labeled as normal or abnormal using an anomaly detection algorithm that combines statistical and post-processing techniques with rule-based techniques. In addition, we present an interactive spatio-temporal analysis approach that intuitively visualizes the data and the algorithm's results using map- and Sankey-chart-based visualization techniques.
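
The abstract does not specify the detection algorithm beyond combining statistical and rule-based techniques. As a hypothetical illustration of that general combination, the sketch below flags a time bin as anomalous if its BGP announcement count deviates strongly from the mean (a z-score test) or if a simple rule fires; the threshold, the rule, and the toy counts are all assumptions.

```python
import numpy as np

def detect_anomalies(announce_counts, withdraw_counts, z_thresh=2.0):
    """Flag time bins as anomalous via a z-score test OR a simple rule.

    announce_counts / withdraw_counts: per-bin BGP update counts (1-D arrays).
    Returns a boolean array, True where a bin looks anomalous.
    """
    counts = np.asarray(announce_counts, dtype=float)
    withdraws = np.asarray(withdraw_counts, dtype=float)

    # Statistical part: large deviation of announcement volume from its mean.
    z = (counts - counts.mean()) / (counts.std() + 1e-9)
    statistical_flag = np.abs(z) > z_thresh

    # Rule-based part (hypothetical rule): withdrawals exceed announcements.
    rule_flag = withdraws > counts

    return statistical_flag | rule_flag

announces = [120, 115, 130, 2400, 125, 118]  # toy per-minute counts
withdraws = [10, 12, 9, 50, 300, 11]
print(detect_anomalies(announces, withdraws))  # bins at indices 3 and 4 flagged
```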

Batch Resizing Policies and Techniques for Fine-Grain Grid Tasks: The Nuts and Bolts

  • Muthuvelu, Nithiapidary;Chai, Ian;Chikkannan, Eswaran;Buyya, Rajkumar
    • Journal of Information Processing Systems, v.7 no.2, pp.299-320, 2011
  • The overhead of processing fine-grain tasks on a grid induces the need for batch processing or task group deployment in order to minimise overall application turnaround time. When deciding the granularity of a batch, the processing requirements of each task should be considered as well as the utilisation constraints of the interconnecting network and the designated resources. However, the dynamic nature of a grid requires the batch size to be adaptable to the latest grid status. In this paper, we describe the policies and the specific techniques involved in the batch resizing process. We explain the nuts and bolts of these techniques in order to maximise the resulting benefits of batch processing. We conduct experiments to determine the nature of the policies and techniques in response to a real grid environment. The techniques are further investigated to highlight the important parameters for obtaining the appropriate task granularity for a grid resource.
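
The paper details the specific resizing policies; purely as an illustrative sketch of the general idea (grouping fine-grain tasks into a batch whose size respects the latest resource and network constraints), the Python snippet below derives a batch size from an assumed per-task CPU requirement, the resource's available CPU time, and a bandwidth-limited transfer budget. The parameters and the sizing rule are assumptions, not the paper's policies.

```python
def batch_size(task_cpu_s, task_bytes, cpu_budget_s, bw_bytes_per_s, deadline_s):
    """Pick how many fine-grain tasks to group into one batch.

    task_cpu_s      : estimated CPU time of a single task (seconds)
    task_bytes      : input size of a single task (bytes)
    cpu_budget_s    : CPU time the resource currently offers (seconds)
    bw_bytes_per_s  : current network bandwidth to the resource
    deadline_s      : time allowed for transferring the batch
    """
    max_by_cpu = int(cpu_budget_s // task_cpu_s)
    max_by_net = int((bw_bytes_per_s * deadline_s) // task_bytes)
    return max(1, min(max_by_cpu, max_by_net))

# Example: re-evaluate the batch size as the grid status changes.
print(batch_size(task_cpu_s=2.0, task_bytes=50_000,
                 cpu_budget_s=120.0, bw_bytes_per_s=1_000_000, deadline_s=5.0))
# -> min(60, 100) = 60 tasks per batch under these assumed conditions
```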

A Study on the Environment Characteristics and Continuous Usage Intention for Improvement of Fintech (핀테크 활성화를 위한 사용환경특성과 지속사용의도)

  • Jung, Dae-Hyun;Chang, Hwal-Sik;Park, Kwang-O
    • The Journal of Information Systems, v.26 no.2, pp.123-142, 2017
  • Purpose: The Fintech industry has developed on the basis of IT technologies such as big data, IoT, and cloud computing, and the financial industry can be seen as evolving into Fintech. Consumer awareness, however, is still very low. This study therefore derives suggestions for invigorating Fintech by empirically examining, from the consumer's point of view, the factors that influence the intention to continue using Fintech. Design/methodology/approach: The research model was designed by integrating factors derived from Expectation Confirmation Theory, and the study empirically analyzes the determinants of continuous usage intention for Fintech. A total of 302 survey responses were used to test the research hypotheses with a covariance-based structural equation model. Findings: The empirical results confirm that the ultimate purpose of Fintech services is to eliminate the social costs arising from issuing money by reducing or eliminating the use of cash. Since many Fintech users point to security as the top priority, a direction for the related institutions is proposed. In addition, this study should help broaden the perception of consumers who currently see Fintech as merely an NFC-based simple payment service.

UniPy: A Unified Programming Language for MGC-based IoT Systems

  • Kim, Gayoung;Choi, Kwanghoon;Chang, Byeong-Mo
    • Journal of the Korea Society of Computer and Information, v.24 no.3, pp.77-86, 2019
  • With the advent of the Internet of Things (IoT), computing environments in which one programs not a single computer but several heterogeneous distributed computers together have become common. Developing a separate program for each computer increases the programmer's burden, and testing all the programs becomes more complex. To address this challenge, this paper proposes an RPC-based unified programming language, UniPy, for developing MGC (eMbedded, Gateway, and Cloud) applications in IoT systems configured with popular computers such as an Arduino, a Raspberry Pi, and a web-based DB server. UniPy offers programmers a view of classes as locations and a very simple form of remote procedure call mechanism. Our UniPy compiler automatically splits a UniPy program into smaller programs for the different locations and supplies the necessary RPC mechanism. One advantage of UniPy is that programmers can write local code just as they would for a single computer, requiring no extra knowledge thanks to the unified programming model, which differs from existing work such as Fabryq and Ravel. The structure of UniPy programs also allows programmers to test them by executing them directly before splitting, a feature that has not been emphasized before.
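
UniPy's concrete syntax and compiler are defined in the paper. Purely to illustrate the general idea of treating classes as locations that communicate via remote procedure calls, the Python sketch below tags classes with a location name and routes cross-location calls through a stub; the decorator, the location names, and the in-process "RPC" are all hypothetical and do not reflect UniPy's actual design.

```python
# Hypothetical illustration of "classes as locations" with RPC-style calls.
# A compiler for a unified program would turn cross-location calls into real
# remote procedure calls; here the split is only simulated in one process.

def location(name):
    """Tag a class with the location (device) it is meant to run on."""
    def wrap(cls):
        cls.__location__ = name
        return cls
    return wrap

def remote_call(obj, method, *args):
    """Stand-in for the RPC layer a compiler would generate per location."""
    print(f"[rpc] -> {obj.__class__.__location__}.{method}{args}")
    return getattr(obj, method)(*args)

@location("embedded")      # e.g. an Arduino reading a sensor
class Sensor:
    def read(self):
        return 23.5        # toy temperature reading

@location("gateway")       # e.g. a Raspberry Pi aggregating readings
class Gateway:
    def __init__(self, sensor):
        self.sensor = sensor
    def collect(self):
        return remote_call(self.sensor, "read")

@location("cloud")         # e.g. a web-based DB server storing values
class CloudDB:
    def store(self, value):
        print(f"stored {value}")

gateway = Gateway(Sensor())
CloudDB().store(remote_call(gateway, "collect"))
```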

A Study on the Role and Security Enhancement of the Expert Data Processing Agency: Focusing on a Comparison of Data Brokers in Vermont (데이터처리전문기관의 역할 및 보안 강화방안 연구: 버몬트주 데이터브로커 비교를 중심으로)

  • Soo Han Kim;Hun Yeong Kwon
    • Journal of Information Technology Services, v.22 no.3, pp.29-47, 2023
  • With the recent advancement of information and communication technologies such as artificial intelligence, big data, cloud computing, and 5G, data is being produced and digitized in unprecedented amounts. Data has thus emerged as a critical resource for the future economy, and countries overseas have been revising their laws on data protection and utilization. In Korea, the "Data 3 Act" was revised in 2020 to introduce institutional measures that classify personal information, pseudonymized information, and anonymized information for research, statistics, and the preservation of public records. In particular, combining pseudonymized personal information is expected to increase the added value of data, and to this end the "Expert Data Combination Agency" and the "Expert Data Agency" (hereinafter referred to collectively as the Expert Data Processing Agency) were introduced. To compare these domestic systems with similar overseas systems, we examine the state of Vermont in the United States, which recently enacted the country's first "Data Broker Act" as a measure to protect personal information held by data brokers. In this study, we compare and analyze the roles and functions of the Expert Data Processing Agency and data brokers and identify differences in designation standards, security measures, and other requirements, in order to suggest ways to contribute to activating the data economy and enhancing information protection.