• Title/Summary/Keyword: cloud-storage

Development of a System for Field-data Collection Transmission and Monitoring based on Low Power Wide Area Network (저전력 광역통신망 기반 현장데이터 수집 전송 및 모니터링 시스템 개발)

  • Ju, Yeong-Tae;Kim, Jong-Sil;Kim, Eung-Kon
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.17 no.6
    • /
    • pp.1105-1112
    • /
    • 2022
  • Field data monitoring systems, such as those for renewable energy generation and smart farm integrated control, are shifting from PC- and server-based setups to a mobile-first approach, and various wireless communication and application services have emerged with the development of IoT technology. Low-power wide-area networks are services optimized for low-power, low-capacity, low-speed data transmission: data collected in the field is transmitted to designated storage servers or cloud-based data platforms, enabling data monitoring. In this paper, we implement an IoT repeater that collects field data with a single device and transmits it over a low-power wide-area network to a wireless carrier's cloud data platform, together with a monitoring app that uses the collected data. With this approach, the system configuration is simpler, deployment and operating costs are lower, and effective data accumulation is possible.
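
The abstract gives no implementation details, so the following is a purely illustrative sketch: packing a repeater's sensor readings into a compact binary frame suited to a low-rate LPWAN uplink. The field names, scaling factors, and 12-byte layout are assumptions, not taken from the paper.

```python
# Hypothetical sketch: encode field-sensor readings as a compact binary payload
# for a low-rate LPWAN uplink. All fields and scale factors are assumptions.
import struct
import time

def pack_reading(node_id: int, temperature_c: float, humidity_pct: float,
                 battery_v: float) -> bytes:
    """Encode one reading as a 12-byte frame: id, timestamp, scaled values."""
    return struct.pack(
        "<HIhHH",
        node_id,                    # 2 bytes: repeater/node identifier
        int(time.time()),           # 4 bytes: Unix timestamp, seconds
        int(temperature_c * 100),   # 2 bytes: temperature in 0.01 degC steps
        int(humidity_pct * 100),    # 2 bytes: relative humidity in 0.01 % steps
        int(battery_v * 1000),      # 2 bytes: battery voltage in mV
    )

frame = pack_reading(node_id=7, temperature_c=23.4, humidity_pct=61.2, battery_v=3.71)
print(len(frame), frame.hex())      # 12-byte frame handed to the LPWAN modem
```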

Improving Efficiency of Encrypted Data Deduplication with SGX (SGX를 활용한 암호화된 데이터 중복제거의 효율성 개선)

  • Koo, Dongyoung
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.11 no.8
    • /
    • pp.259-268
    • /
    • 2022
  • As cloud services are widely used to improve management efficiency amid the explosive increase in data volume, various cryptographic techniques are being applied to preserve data privacy. Despite the vast computing resources of cloud systems, the loss of storage efficiency caused by redundant data outsourced from multiple users significantly reduces service efficiency. Among the several approaches to privacy-preserving deduplication over encrypted data, this paper analyzes the results on improving the efficiency of encrypted data deduplication with a trusted execution environment (TEE), recently published at USENIX ATC, in terms of the security and efficiency of the participating entities. We present a way to improve the stability of the key-managing server by integrating it with individual clients, resulting in secure deduplication without independent key servers. The experimental results show that the communication efficiency of the proposed approach improves by about 30%, with the effect of a distributed key server, while providing security guarantees as robust as those of the previous research.
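
The paper's SGX-based protocol is not reproduced here. As a minimal, hedged sketch of the generic idea behind encrypted deduplication, the snippet below derives both a key and a deduplication tag from the file content itself, so identical plaintexts yield identical tags that the server can deduplicate without seeing the data; the names and tag construction are illustrative assumptions.

```python
# Simplified convergent (message-locked) keying sketch: equal plaintexts give
# equal keys and tags, so the server stores each unique ciphertext only once.
import hashlib

def convergent_key(data: bytes) -> bytes:
    """Key derived from the content itself; identical files yield identical keys."""
    return hashlib.sha256(b"key|" + data).digest()

def dedup_tag(data: bytes) -> str:
    """Tag the storage server uses to detect duplicates without seeing plaintext."""
    return hashlib.sha256(b"tag|" + data).hexdigest()

class DedupStore:
    def __init__(self):
        self.blobs = {}            # tag -> ciphertext, stored once per unique file

    def upload(self, tag: str, ciphertext: bytes) -> bool:
        if tag in self.blobs:      # duplicate: nothing new is stored
            return False
        self.blobs[tag] = ciphertext
        return True

store = DedupStore()
doc = b"quarterly report"
print(store.upload(dedup_tag(doc), b"<ciphertext under convergent key>"))    # True
print(store.upload(dedup_tag(doc), b"<same ciphertext from another user>"))  # False
```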

An Efficient Dual Queue Strategy for Improving Storage System Response Times (저장시스템의 응답 시간 개선을 위한 효율적인 이중 큐 전략)

  • Hyun-Seob Lee
    • Journal of Internet of Things and Convergence
    • /
    • v.10 no.3
    • /
    • pp.19-24
    • /
    • 2024
  • Recent advances in large-scale data processing technologies such as big data, cloud computing, and artificial intelligence have increased the demand for high-performance storage devices in data centers and enterprise environments. In particular, the fast data response speed of storage devices is a key factor that determines overall system performance. Solid state drives (SSDs) based on the Non-Volatile Memory Express (NVMe) interface are gaining traction, but new bottlenecks are emerging when large data input and output requests from multiple hosts are handled simultaneously. SSDs typically process host requests by queuing them sequentially in an internal queue. When requests with long transfer lengths are processed first, shorter requests wait longer, increasing the average response time. To solve this problem, data transfer timeouts and data partitioning methods have been proposed, but they do not provide a fundamental solution. In this paper, we propose a dual queue based scheduling scheme (DQBS) that manages requests in two queues, one ordered by arrival and the other by transfer length. The request arrival time and transfer length are then considered together to determine an efficient data transfer order. This enables balanced processing of long and short requests, reducing the overall average response time. The simulation results show that the proposed method outperforms the existing sequential processing method. This study presents a scheduling technique that maximizes data transfer efficiency in a high-performance SSD environment, which is expected to contribute to the development of next-generation high-performance storage systems.
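
The abstract does not spell out DQBS's exact dispatch rule, so the sketch below only illustrates the dual-queue idea: one queue preserves arrival order, the other orders requests by transfer length, and an assumed weighting between waiting time and length decides which queue head is issued next. The class, weights, and tie-breaking are hypothetical, not the paper's algorithm.

```python
# Illustrative dual-queue dispatcher: short requests may overtake long ones,
# but an aging term keeps old requests from starving. Weights are assumed.
import heapq
from collections import deque
from dataclasses import dataclass
from itertools import count

@dataclass
class Request:
    rid: int
    arrival: float          # submission time
    length: int             # transfer length, e.g. in sectors

class DualQueueScheduler:
    """Keeps two views of the pending set: arrival order and transfer length."""

    def __init__(self, wait_weight: float = 1.0, length_weight: float = 0.01):
        self.fifo = deque()          # queue ordered by arrival
        self.by_length = []          # min-heap ordered by transfer length
        self.dispatched = set()      # rids already issued (lazy deletion)
        self._seq = count()
        self.wait_weight = wait_weight
        self.length_weight = length_weight

    def submit(self, req: Request) -> None:
        self.fifo.append(req)
        heapq.heappush(self.by_length, (req.length, next(self._seq), req))

    def dispatch(self, now: float) -> Request:
        """Prefer the oldest request unless a much shorter one is waiting."""
        while self.fifo[0].rid in self.dispatched:
            self.fifo.popleft()
        while self.by_length[0][2].rid in self.dispatched:
            heapq.heappop(self.by_length)
        oldest = self.fifo[0]
        shortest = self.by_length[0][2]
        length_gain = self.length_weight * (oldest.length - shortest.length)
        age_penalty = self.wait_weight * (now - oldest.arrival)
        chosen = shortest if length_gain > age_penalty else oldest
        self.dispatched.add(chosen.rid)
        return chosen

# Example: a long request arrives first, then a short one.
sched = DualQueueScheduler()
sched.submit(Request(rid=1, arrival=0.0, length=2048))
sched.submit(Request(rid=2, arrival=0.5, length=8))
print(sched.dispatch(now=1.0).rid)   # 2: the short request overtakes the long one
print(sched.dispatch(now=1.2).rid)   # 1: the long request is served next
```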

Development of Information Technology Infrastructures through Construction of Big Data Platform for Road Driving Environment Analysis (도로 주행환경 분석을 위한 빅데이터 플랫폼 구축 정보기술 인프라 개발)

  • Jung, In-taek;Chong, Kyu-soo
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.19 no.3
    • /
    • pp.669-678
    • /
    • 2018
  • This study developed information technology infrastructures for building a driving environment analysis platform using various big data, such as vehicle sensing data and public data. First, on the H/W side, a small platform server with a parallel structure for distributed big data processing was developed. Next, on the S/W side, programs for big data collection/storage, processing/analysis, and information visualization were developed. The collection S/W was developed as a collection interface using Kafka, Flume, and Sqoop. The storage S/W was divided into the Hadoop distributed file system and Cassandra DB according to how the data is used. The processing S/W performs spatial unit matching and time-interval interpolation/aggregation of the collected data by applying the grid index method. The analysis S/W was developed as an analytical tool based on the Zeppelin notebook for applying and evaluating the developed algorithms. Finally, the information visualization S/W was developed as a Web GIS engine program for providing and visualizing various driving environment information. In the performance evaluation, the number of executors, the optimal memory capacity, and the number of cores for the developed server were derived, and its computation performance was superior to that of other cloud computing services.
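
As a rough illustration of the grid index method mentioned for spatial unit matching, the snippet below maps GPS points to fixed-size grid cells and aggregates probe speeds per cell; the cell size, key format, and example data are assumptions rather than the paper's implementation.

```python
# Grid-index sketch: records falling in the same cell can be joined or
# aggregated without pairwise distance checks. Cell size is assumed.
from collections import defaultdict

CELL_DEG = 0.001                            # roughly a 100 m grid in latitude

def grid_key(lat: float, lon: float) -> tuple:
    """Map a coordinate to the index of the grid cell that contains it."""
    return (int(lat // CELL_DEG), int(lon // CELL_DEG))

def aggregate_speed_by_cell(records):
    """records: iterable of (lat, lon, speed_kph) -> mean speed per grid cell."""
    sums = defaultdict(lambda: [0.0, 0])
    for lat, lon, speed in records:
        cell = sums[grid_key(lat, lon)]
        cell[0] += speed
        cell[1] += 1
    return {key: total / n for key, (total, n) in sums.items()}

probe_data = [(37.5041, 127.0486, 42.0),    # two nearby probes share a cell
              (37.5042, 127.0487, 38.0),
              (37.5100, 127.0550, 65.0)]    # a third probe lands in another cell
print(aggregate_speed_by_cell(probe_data))
```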

A study on damage prediction analysis for styrene monomer fire explosion accidents (스티렌 모노머 화재폭발사고 피해예측 분석에 관한 연구)

  • Hyung-Su Choi;Min-Je Choi;Guy-Sun Cho
    • Industry Promotion Research
    • /
    • v.9 no.2
    • /
    • pp.37-44
    • /
    • 2024
  • This study selected the worst-case scenarios for a fireball and a vapor cloud explosion (VCE) of a styrene monomer storage tank installed in a petrochemical production plant and performed damage prediction and accident impact analysis. The ranges of influence of radiant heat and overpressure from a fireball and VCE during an abnormal polymerization reaction of styrene monomer, the main component of the mixed residue oil storage tank, were quantitatively analyzed using the e-CA accident damage prediction program. The damage impact areas of radiant heat and explosion overpressure were found to have maximum radii of 1,150 m and 626 m, respectively. People within 1,150 m, where the radiant heat reaches 4 kW/m², may suffer skin blistering when exposed for 20 seconds. In buildings within 626 m, where an explosion overpressure of 21 kPa applies, steel structures may be damaged and separated from their foundations, and people may suffer physical injuries. In the event of a fire, explosion, or leak, the degree of risk to workers, nearby residents, and surrounding facilities from radiant heat or overpressure should be assessed against acceptability criteria, the hazards and risks of the materials handled should be identified, and an emergency response system should be established. These results are expected to help workplaces establish damage-minimization measures through improvement and investment activities.

Bioinformatics services for analyzing massive genomic datasets

  • Ko, Gunhwan;Kim, Pan-Gyu;Cho, Youngbum;Jeong, Seongmun;Kim, Jae-Yoon;Kim, Kyoung Hyoun;Lee, Ho-Yeon;Han, Jiyeon;Yu, Namhee;Ham, Seokjin;Jang, Insoon;Kang, Byunghee;Shin, Sunguk;Kim, Lian;Lee, Seung-Won;Nam, Dougu;Kim, Jihyun F.;Kim, Namshin;Kim, Seon-Young;Lee, Sanghyuk;Roh, Tae-Young;Lee, Byungwook
    • Genomics & Informatics
    • /
    • v.18 no.1
    • /
    • pp.8.1-8.10
    • /
    • 2020
  • The explosive growth of next-generation sequencing data has resulted in ultra-large-scale datasets and ensuing computational problems. In Korea, the amount of genomic data has been increasing rapidly in recent years. Leveraging these big data requires researchers to use large-scale computational resources and analysis pipelines. A promising solution to this computational challenge is cloud computing, where CPUs, memory, storage, and programs are accessible in the form of virtual machines. Here, we present a cloud computing-based system, Bio-Express, that provides user-friendly, cost-effective analysis of massive genomic datasets. Bio-Express is loaded with predefined multi-omics data analysis pipelines, which are divided into genome, transcriptome, epigenome, and metagenome pipelines. Users can employ the predefined pipelines or create a new pipeline for analyzing their own omics data. We also developed several web-based services for facilitating downstream analysis of genome data. The Bio-Express web service is freely available at https://www.bioexpress.re.kr/.

Probabilistic Assessment of the Effects of Vapor Cloud Explosion on a Human Body (증기운 폭발이 인체에 미치는 영향에 대한 확률론적 평가)

  • Yoon, Yong-Kyun;Ju, Eun-Hye
    • Tunnel and Underground Space
    • /
    • v.31 no.1
    • /
    • pp.52-65
    • /
    • 2021
  • In this study, the authors analyzed the vapor cloud explosion induced by a propane leak at the PEMEX terminal, a propane storage facility outside Mexico City. The TNT equivalent mass for the leaked 4,750 kg of propane was estimated to be 9,398 kg. Blast parameters such as peak overpressure, positive phase duration, and impulse at 40-400 m from the center of the explosion were calculated by applying the TNT Equivalency Method and the Multi-Energy Method. The probabilities of lung damage, eardrum rupture, head impact, and whole-body displacement injury were evaluated by applying probit functions to the calculated blast parameters. The peak overpressure obtained with the Multi-Energy Method was greater than that obtained with the TNT Equivalency Method at all distances considered, but the difference became insignificant beyond 200 m. The peak overpressure obtained by the Multi-Energy Method was used to assess the extent of damage to structures: structures within 100 m of the explosion center would collapse completely, and the window glass of structures 400 m away would mostly be broken. The probability of death due to lung damage was shown to vary with the position of a human body relative to the propagating shock wave, and the probability was greatest when a reflecting surface was in the immediate surroundings of the body. The effects of the shock wave on lung damage, eardrum rupture, head impact, and whole-body displacement were evaluated, and the severity was found to increase in the order of whole-body displacement impact < lung damage < eardrum rupture.
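
As a hedged worked example of the TNT Equivalency Method named in the abstract: with typical textbook heats of combustion and an assumed 20% explosion yield factor (the paper's exact inputs are not given here), the 4,750 kg propane release maps to roughly the 9,398 kg TNT equivalent the authors report.

```python
# TNT Equivalency Method, illustrative only. The heat-of-combustion values and
# the 20 % yield factor are assumed textbook figures, not confirmed by the paper.
m_propane = 4750.0        # leaked propane mass, kg (from the abstract)
dHc_propane = 46.3e6      # heat of combustion of propane, J/kg (assumed)
dHc_tnt = 4.68e6          # blast energy of TNT, J/kg (assumed)
eta = 0.20                # explosion yield (efficiency) factor (assumed)

w_tnt = eta * m_propane * dHc_propane / dHc_tnt
print(f"TNT equivalent mass: {w_tnt:,.0f} kg")   # about 9,399 kg, close to the reported 9,398 kg
```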

Explainable Artificial Intelligence Study based on Blockchain Using Point Cloud (포인트 클라우드를 이용한 블록체인 기반 설명 가능한 인공지능 연구)

  • Hong, Sunghyuck
    • Journal of Convergence for Information Technology
    • /
    • v.11 no.8
    • /
    • pp.36-41
    • /
    • 2021
  • Although technology for prediction and analysis using artificial intelligence is constantly developing, the black-box problem means the decision-making process cannot be interpreted. Because the decision process of an AI model cannot be interpreted from the user's point of view, its results are perceived as unreliable. We investigated these problems of artificial intelligence and studied explainable artificial intelligence using Blockchain to solve them. Data from the decision-making process of artificial intelligence models, which can then be explained, are stored in the Blockchain with timestamps, among other things. The Blockchain makes the stored data tamper-resistant and, by its nature, allows free access to data such as the decision processes stored in blocks. Much of the difficulty of creating explainable artificial intelligence models comes from the complexity of existing models. Therefore, using point clouds to increase the efficiency of 3D data processing and its processing procedures shortens the decision-making process and facilitates an explainable artificial intelligence model. To address the oracle problem, in which data may be falsified or corrupted when stored in the Blockchain, we propose a blockchain-based explainable artificial intelligence model that passes data through an intermediary during the storage process.
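
As a simplified illustration only, the sketch below appends timestamped decision records to a hash-linked, tamper-evident log; this plain chain is not the paper's blockchain and omits the intermediary it proposes for the oracle problem.

```python
# Append-only, hash-linked log of model decisions. Field names are assumptions.
import hashlib
import json
import time

class DecisionLog:
    def __init__(self):
        self.chain = [{"index": 0, "prev": "0" * 64, "payload": "genesis",
                       "timestamp": 0.0}]

    def append(self, payload: dict) -> dict:
        prev_block = self.chain[-1]
        block = {
            "index": prev_block["index"] + 1,
            # Linking each block to the hash of its predecessor makes any later
            # edit of an earlier record detectable.
            "prev": hashlib.sha256(
                json.dumps(prev_block, sort_keys=True).encode()).hexdigest(),
            "payload": payload,          # e.g. inputs, intermediate scores, decision
            "timestamp": time.time(),
        }
        self.chain.append(block)
        return block

log = DecisionLog()
log.append({"model": "pointcloud-cls", "decision": "class=vehicle", "confidence": 0.91})
print(len(log.chain), log.chain[-1]["prev"][:16])
```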

3D Explosion Analyses of Hydrogen Refueling Station Structure Using Portable LiDAR Scanner and AUTODYN (휴대형 라이다 스캐너와 AUTODYN를 이용한 수소 충전소 구조물의 3차원 폭발해석)

  • Baluch, Khaqan;Shin, Chanhwi;Cho, Yongdon;Cho, Sangho
    • Explosives and Blasting
    • /
    • v.40 no.3
    • /
    • pp.19-32
    • /
    • 2022
  • Hydrogen has the highest energy content per unit mass among common fuels, which makes it a promising clean energy source for the future. However, using hydrogen as a fuel raises carrier and storage issues, as hydrogen is a highly flammable, unstable gas susceptible to explosion. Explosions of hydrogen-air mixtures have already been encountered and are well documented in research experiments. However, large gaps remain in this field, as numerical tools and field experiments are required to fully understand the safety measures necessary to prevent hydrogen explosions. The purpose of the present study is to develop a 3D numerical model of an existing hydrogen refueling station in Jeonju using a handheld LiDAR scanner and Ansys AUTODYN; the point cloud scans are processed into a 3D FEM mesh for the numerical simulation used to predict peak overpressures. The results show that LiDAR scanning combined with ANSYS AUTODYN can help determine safety distances and construct, simulate, and predict peak overpressures for hydrogen refueling station explosions.
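
As a hypothetical illustration of one common preprocessing step for dense handheld LiDAR scans, the snippet below voxel-downsamples a point cloud with NumPy before any surface reconstruction or FEM meshing; the 5 cm voxel size and the stand-in scan are assumptions, and the paper's actual scan-to-mesh workflow may differ.

```python
# Voxel-grid downsampling: keep one representative point (the centroid) per
# occupied voxel to thin a dense scan before meshing. Parameters are assumed.
import numpy as np

def voxel_downsample(points: np.ndarray, voxel: float = 0.05) -> np.ndarray:
    """points: (N, 3) array in metres -> (M, 3) array of voxel centroids."""
    keys = np.floor(points / voxel).astype(np.int64)          # voxel index per point
    uniq, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)
    sums = np.zeros((len(uniq), 3))
    counts = np.zeros(len(uniq))
    np.add.at(sums, inverse, points)                          # accumulate per voxel
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]

scan = np.random.rand(100_000, 3) * np.array([20.0, 15.0, 6.0])   # stand-in for a station scan
thinned = voxel_downsample(scan, voxel=0.05)
print(scan.shape, "->", thinned.shape)
```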

A Study on the Improvement of Collection, Management and Sharing of Maritime Traffic Information (해상교통정보의 수집, 관리 및 공유 개선방안에 관한 연구)

  • Shin, Gil-Ho;Song, Chae-Uk
    • Journal of the Korean Society of Marine Environment & Safety
    • /
    • v.28 no.4
    • /
    • pp.515-524
    • /
    • 2022
  • To effectively collect, manage, and share maritime traffic information, it is necessary to identify the technology trends concerning this information and analyze its current status and problems. Therefore, this study surveys domestic and foreign technology trends in maritime traffic information and analyzes and summarizes the current status and problems of collecting, managing, and sharing it. According to the analysis, the problems in the collection stage are difficulties in collecting visual information from long-distance radars, CCTVs, and cameras in areas outside LTE network coverage; notably, this makes it hard to detect, at an early stage, smuggling ships entering territorial waters through the exclusive economic zone (EEZ). The problems in the management stage include difficulty in scaling the storage for maritime traffic information up or down because of the inflexible storage built into most maritime transportation systems, and difficulty in countering system failures with redundancy and backups. Furthermore, in the sharing stage, it is difficult to share information with external operating organizations because maritime transportation information is shared mainly over internal networks; even when sharing does occur through the government cloud via platforms such as LRIT and SASS, the cloud often fails to provide the software applications needed to exploit maritime big data. Therefore, it is suggested that collection equipment such as unmanned aerial vehicles and satellites be deployed to expand coverage in the collection stage. In the management and sharing stages, the introduction and construction of private clouds are suggested, taking into account the operational administration and information disclosure of each maritime transportation system. Through these efforts, the expertise and security of the clouds are expected to improve.