• Title/Summary/Keyword: Large Scale Data


Application of access control policy in ScienceDMZ-based network configuration (ScienceDMZ 기반의 네트워크 구성에서 접근제어정책 적용)

  • Kwon, Woo Chang;Lee, Jae Kwang;Kim, Ki Hyeon
    • Convergence Security Journal
    • /
    • v.21 no.2
    • /
    • pp.3-10
    • /
    • 2021
  • Nowadays, data-based scientific research is the norm, and the ability to transmit large amounts of data has a great influence on research productivity. A separate network structure is therefore required for transmitting large-scale scientific big data, and ScienceDMZ is a network structure designed for exactly this purpose. In such a configuration, it is essential to establish access control lists (ACLs) for users and resources. In this paper, we describe the R&E Together project and the network structure implemented on an actual ScienceDMZ deployment, and we define the users and services to which access control policies are applied for safe data transmission and service provision. We also present a method that lets the network administrator apply the access control policy to all network resources and users collectively, which made it possible to automate the application of the access control policy.
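
A minimal sketch of the batch-application idea described in the abstract, not the paper's actual tool: ACL entries are generated from user and service definitions so one policy can be applied to all resources at once. All names, fields, and the rule syntax below are hypothetical.

```python
# Hypothetical sketch: generate one permit rule per (user, service) pair so an
# administrator can push a single consolidated ACL to every network resource.
from dataclasses import dataclass
from itertools import product

@dataclass
class Service:
    name: str
    host: str      # e.g. a data transfer node (DTN) address
    port: int

@dataclass
class User:
    name: str
    cidr: str      # source network allowed for this user

def build_acl(users, services):
    """Return one permit rule per (user, service) pair; deny everything else."""
    rules = [
        f"permit tcp {u.cidr} host {s.host} eq {s.port}  ! {u.name} -> {s.name}"
        for u, s in product(users, services)
    ]
    rules.append("deny ip any any")
    return rules

users = [User("lab-a", "10.10.1.0/24"), User("lab-b", "10.10.2.0/24")]
services = [Service("gridftp", "192.0.2.10", 2811)]

for line in build_acl(users, services):
    print(line)
```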

A Service Model Development Plan for Countering Denial of Service Attacks based on Artificial Intelligence Technology (인공지능 기술기반의 서비스거부공격 대응 위한 서비스 모델 개발 방안)

  • Kim, Dong-Maeong;Jo, In-June
    • The Journal of the Korea Contents Association
    • /
    • v.21 no.2
    • /
    • pp.587-593
    • /
    • 2021
  • In this thesis, we move away from the classic DDoS response systems for large-scale denial-of-service attacks, which grow more sophisticated by the day, and propose a service model development plan that can effectively withstand intelligent denial-of-service attacks by using artificial intelligence, one of the core technologies of the 4th industrial revolution. Specifically, we propose a method to detect denial-of-service attacks and minimize damage through machine learning on large amounts of data collected from multiple security devices and web servers. The model exploits the fact that normal traffic repeats a regular pattern and data is transmitted in a stable flow, whereas a denial-of-service attack produces a different pattern of data flow. When a denial-of-service attack occurs, a deviation arises between the probability-based traffic prediction and the actual traffic, so the traffic can be judged to be attack data and a response can be made. In this paper, the denial-of-service attack detection model is explained by analyzing data based on logs generated from security equipment and servers.
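
A minimal sketch of the prediction-deviation idea described above, not the paper's model: the next traffic value is forecast with simple exponential smoothing, and a possible denial-of-service attack is flagged when the observed value deviates too far from the prediction. The smoothing factor, thresholds, and traffic values are illustrative assumptions.

```python
# Hypothetical sketch: flag samples whose deviation from the forecast greatly
# exceeds the typical deviation seen in recent history.
def detect_dos(traffic, alpha=0.3, k=3.0):
    """Yield (index, observed, predicted, is_attack) for each sample."""
    pred = traffic[0]
    errors = []
    for i, obs in enumerate(traffic):
        deviation = abs(obs - pred)
        # estimate a typical deviation from recent history
        scale = (sum(errors[-20:]) / len(errors[-20:])) if errors else 1.0
        is_attack = deviation > k * max(scale, 1.0)
        yield i, obs, pred, is_attack
        errors.append(deviation)
        pred = alpha * obs + (1 - alpha) * pred  # update the forecast

normal = [100, 103, 98, 105, 101, 99, 104, 102]
stream = normal + [950, 1200, 1100]              # sudden surge in requests/sec
for i, obs, pred, flag in detect_dos(stream):
    if flag:
        print(f"sample {i}: observed={obs}, predicted={pred:.1f} -> possible DoS")
```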

3D Modeling based on Digital Topographic Map for Risk Analysis of Crowd Concentration and Selection of High-risk Walking Routes (군중 밀집 위험도 분석과 고위험 보행로 선정을 위한 수치지형도 기반 3D 모델링)

  • Jae Min Lee;Imgyu Kim;Sang Yong Park;Hyuncheol Kim
    • Journal of the Korean Society of Safety
    • /
    • v.38 no.2
    • /
    • pp.87-95
    • /
    • 2023
  • On October 29, 2022, a very large number of people gathered in Itaewon-dong, Yongsan-gu, Seoul, Korea for a Halloween festival, and as crowds pushed through narrow alleys, 159 deaths and 195 injuries occurred, making it the deadliest crowd-crush incident in Korea. There have been a number of crowd-crush deaths where crowds gathered at large-scale festivals, event venues, and stadiums, both at home and abroad. When density increases, the physical contact between bodies becomes very strong, and crowd turbulence occurs when force is suddenly transmitted from one body to another; the force is amplified and causes the crowd to behave like a mass of fluid. When crowd turbulence occurs, people cannot control themselves and are pushed along with the crowd. To prevent such accidents, areas expected to be crowded and congested must be systematically investigated and managed, and the related ministries and local governments are planning to establish a crowd management system and safety management measures to prevent accidents involving large crowds. In this study, based on national data, a continuous digital topographic map is modeled in 3D to analyze the risk of crowding and to present a plan for selecting high-risk walking routes. Areas with a high risk of crowding are selected in advance based on various data (digital topographic maps, floating population, and regional data) in a realistic and feasible way, and the analysis is based on the visible results from 3D modeling of the risk area. The study demonstrates that it is possible to prepare measures to prevent crowd accidents that reflect the characteristics of the region.
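
A minimal sketch of the risk-screening idea, illustrative only and not the paper's model: walkway geometry (as might be taken from a digital topographic map) is combined with an expected crowd size to estimate density, and segments exceeding a commonly cited danger level of roughly 5 to 6 persons per square metre are flagged. All segment data and thresholds here are hypothetical.

```python
# Hypothetical sketch: classify walking-route segments by estimated crowd density.
from dataclasses import dataclass

@dataclass
class Segment:
    name: str
    length_m: float
    width_m: float
    expected_people: int

def risk_level(density):
    if density >= 6.0:
        return "HIGH"      # crowd turbulence becomes likely
    if density >= 4.0:
        return "CAUTION"
    return "LOW"

segments = [
    Segment("alley-A", 40.0, 3.2, 900),
    Segment("main-street", 120.0, 12.0, 2500),
]

for s in segments:
    density = s.expected_people / (s.length_m * s.width_m)
    print(f"{s.name}: {density:.1f} persons/m^2 -> {risk_level(density)}")
```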

A Method of Frequent Structure Detection Based on Active Sliding Window (능동적 슬라이딩 윈도우 기반 빈발구조 탐색 기법)

  • Hwang, Jeong-Hee
    • Journal of Digital Contents Society
    • /
    • v.13 no.1
    • /
    • pp.21-29
    • /
    • 2012
  • In the ubiquitous computing environment, with large-scale data exchanged through sensor networks and the rapid growth of the internet, processing of continuous stream data is required. Accordingly, there has been mining research on extracting frequent structures from XML stream data and on efficient query processing over such streams. In this paper, we propose a mining method that extracts the frequent structures of XML stream data within the recent window, based on active window sliding using trigger rules. The proposed method is basic research toward controlling the stream data flow for data mining and continuous queries by means of trigger rules.
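
A minimal sketch of sliding-window frequent-structure counting, illustrative and not the paper's algorithm: the most recent XML paths are kept in a window, paths whose support exceeds a minimum are reported, and a simple trigger rule resizes the window when the arrival rate changes. All parameters are hypothetical assumptions.

```python
# Hypothetical sketch: frequent XML-path mining over an actively resized window.
from collections import Counter, deque

class ActiveSlidingWindow:
    def __init__(self, size=100, min_support=0.2):
        self.window = deque()
        self.size = size
        self.min_support = min_support

    def on_trigger(self, arrival_rate):
        """Trigger rule: widen the window when the stream speeds up."""
        self.size = 200 if arrival_rate > 50 else 100
        while len(self.window) > self.size:
            self.window.popleft()

    def push(self, xml_path):
        self.window.append(xml_path)
        if len(self.window) > self.size:
            self.window.popleft()

    def frequent_structures(self):
        counts = Counter(self.window)
        n = len(self.window)
        return {p: c for p, c in counts.items() if c / n >= self.min_support}

w = ActiveSlidingWindow(size=5, min_support=0.4)
for path in ["/order/item", "/order/price", "/order/item",
             "/order/item", "/customer/name"]:
    w.push(path)
print(w.frequent_structures())   # e.g. {'/order/item': 3}
w.on_trigger(arrival_rate=80)    # a rate trigger fires and widens the window
```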

VRSMS: VR-based Sensor Management System (VRSMS: 가상현실 기반 센서 관리 시스템)

  • Kim, Han-Soo;Kim, Hyung-Seok
    • Journal of the HCI Society of Korea
    • /
    • v.3 no.2
    • /
    • pp.1-8
    • /
    • 2008
  • We introduce VRSMS (VR-based sensor management system), a visualization system for the micro-scale air quality monitoring system Airscope [3]. By adopting a VR-based visualization method, casual users can intuitively gain insight into air quality data. Users can also manipulate sensors in the VR space to obtain the specific data they need. For adaptive visualization, we separated the visualization and interaction methods from the air quality data. This separation provides a consistent way to access the data, so new visualization and interaction methods can be attached easily. As one adaptive visualization method, we constructed a large display system consisting of several small displays, which makes the air quality data accessible to people in public spaces.
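
A minimal sketch of the separation principle described above, hypothetical and not the VRSMS code: air-quality data access sits behind one interface so that different visualization front-ends (a VR view, a tiled large display, and so on) can be attached without touching the data layer.

```python
# Hypothetical sketch: pluggable visualizers over a single data-access layer.
from abc import ABC, abstractmethod

class AirQualityStore:
    """Single, consistent access point for sensor readings."""
    def __init__(self):
        self._readings = {"sensor-01": 42.0, "sensor-02": 55.5}  # PM10, ug/m^3

    def reading(self, sensor_id):
        return self._readings[sensor_id]

class Visualizer(ABC):
    @abstractmethod
    def render(self, store: AirQualityStore, sensor_id: str) -> str: ...

class VRView(Visualizer):
    def render(self, store, sensor_id):
        return f"[VR] draw 3D glyph for {sensor_id}: {store.reading(sensor_id)}"

class TiledDisplayView(Visualizer):
    def render(self, store, sensor_id):
        return f"[Wall] tile showing {sensor_id}: {store.reading(sensor_id)}"

store = AirQualityStore()
for view in (VRView(), TiledDisplayView()):
    print(view.render(store, "sensor-01"))
```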


Secure Multicast using Proxy Re-Encryption in an IoT Environment

  • Kim, SuHyun;Hwang, YongWoon;Seo, JungTaek
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.12 no.2
    • /
    • pp.946-959
    • /
    • 2018
  • Recently interest in Internet of Things(IoT) has attracted significant attention at national level. IoT can create new services as a technology to exchange data through connections among a huge number of objects around the user. Data communication between objects provides not only information collected in the surrounding environment but also various personalized information. IoT services which provide these various types of data are exposed to numerous security vulnerabilities. If data is maliciously collected and used by an attacker in an IoT environment that deals with various data, security threats are greater than those in existing network environments. Therefore, security of all data exchanged in the IoT environment is essential. However, lightweight terminal devices used in the IoT environment are not suitable for applying the existing encryption algorithm. In addition, IoT networks consisting of many sensors require group communication. Therefore, this paper proposes a secure multicast scheme using the proxy re-encryption method based on Vehicular ad-hoc networks(VANET) environment. The proposed method is suitable for a large-scale dynamic IoT network environment using unreliable servers.

The Evaluation of the Annual Time Series Data for the Mean Sea Level of the West Coast by Regression Model (회귀모형에 의한 서해안 평균해면의 연시계열자료의 평가)

  • 조기태;박영기;이장춘
    • Journal of Environmental Science International
    • /
    • v.9 no.1
    • /
    • pp.19-25
    • /
    • 2000
  • As tideland reclamation is carried out on a large scale these days, construction work is active in coastal areas. Facilities in coastal areas must be built with the tidal characteristics taken into consideration, and these characteristics therefore affect the overall reclamation plan. The analysis of tide data comes down to a harmonic analysis of the hourly changes in long-term tide records and the extraction of nonharmonic constants from the results. Since a considerable amount of tide data for the West Coast is available, the existing data can be collected and fitted to a tide prediction model to obtain the temporal changes of the tide. The goal of this study is to assess whether the mean sea level used in the field agrees with the results of analyzing long-term observation data whose homogeneity is guaranteed. To achieve this goal, the research was conducted as follows. First, the present conditions of the observation stations, the land level datum, and the sea level datum were analyzed, and a time series model was formulated to represent them. To secure the homogeneity of the time series, each component was separated. Lastly, the mean sea level used in the field was assessed based on the results obtained from the analysis of the time series.
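
A minimal sketch of the regression idea, illustrative only and not the paper's model: a linear trend is fit to an annual mean-sea-level series by ordinary least squares and the estimated rate of change is reported. The yearly values below are made up for demonstration.

```python
# Hypothetical sketch: ordinary least squares trend over annual mean sea level.
def linear_fit(years, levels):
    n = len(years)
    mx = sum(years) / n
    my = sum(levels) / n
    sxx = sum((x - mx) ** 2 for x in years)
    sxy = sum((x - mx) * (y - my) for x, y in zip(years, levels))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

years = list(range(1990, 2000))
levels = [512.0, 513.1, 512.6, 514.0, 514.3,
          515.1, 514.8, 515.9, 516.4, 517.0]   # annual means, cm (made up)

slope, intercept = linear_fit(years, levels)
print(f"estimated trend: {slope * 10:.1f} cm per decade")
print(f"fitted 1995 mean sea level: {slope * 1995 + intercept:.1f} cm")
```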


A Study of the extraction algorithm of the disaster sign data from web (재난 전조 정보 추출 알고리즘 연구)

  • Lee, Changyeol;Kim, Taehwan;Cha, Sangyeul
    • Journal of the Society of Disaster Information
    • /
    • v.7 no.2
    • /
    • pp.140-150
    • /
    • 2011
  • The living environment is changing rapidly, and large-scale disasters are increasing due to global warming. Although recovery resources are deployed to disaster sites, preventing disasters is the most effective countermeasure. Disaster sign data are based on Heinrich's law. This paper focuses on the automatic extraction of disaster sign data from the web. We define the automatic extraction processes and the information applied to them, such as accident nouns, disaster filtering nouns, disaster sign nouns, and rules. Using these processes, we implemented a disaster sign data management system. In the future, the applied information must be continuously updated, because it is only the result of extracting and analyzing a subset of disaster data.
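
A minimal sketch of the extraction step described above, illustrative and not the paper's system: web sentences are scored against small keyword lists of accident nouns, disaster-sign nouns, and filtering (exclusion) nouns, and sentences that look like disaster signs are kept. All word lists are hypothetical.

```python
# Hypothetical sketch: rule-based screening of web text for disaster signs.
ACCIDENT_NOUNS = {"collapse", "flood", "fire", "landslide"}
SIGN_NOUNS = {"crack", "leak", "smoke", "subsidence"}
FILTER_NOUNS = {"movie", "game", "novel"}          # discard fictional contexts

def is_disaster_sign(sentence):
    words = set(sentence.lower().split())
    if words & FILTER_NOUNS:
        return False
    return bool(words & ACCIDENT_NOUNS) and bool(words & SIGN_NOUNS)

docs = [
    "Residents reported a crack in the retaining wall before the collapse",
    "A new fire scene was filmed for the movie",
    "Smoke was seen near the storage tank fire exit",
]
for d in docs:
    print(is_disaster_sign(d), "-", d)
```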

Implementation of Search Method based on Sequence and Adjacency Relationship of User Query (사용자 검색 질의 단어의 순서 및 단어간의 인접 관계에 기반한 검색 기법의 구현)

  • So, Byung-Chul;Jung, Jin-Woo
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.21 no.6
    • /
    • pp.724-729
    • /
    • 2011
  • Information retrieval is a method by which users search for the data they need. Generally, when a user searches a large-scale data set such as the internet, ranking-based search is widely used because it is not easy to find exactly the needed data at once. In this paper, we propose a novel ranking-based search method based on the sequence of the words in the user query and the adjacency relationships between them, with the help of TF-IDF and n-grams. As a result, the needed data could be found more accurately, with 73% accuracy on a data set of more than 19,000 documents.
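
A minimal sketch of the ranking idea, illustrative and not the paper's implementation: documents are scored by TF-IDF over the query terms, and a bonus is added when the terms appear adjacently and in the same order as in the query. The weight and the smoothed IDF form are arbitrary assumptions.

```python
# Hypothetical sketch: TF-IDF ranking with an order-preserving adjacency bonus.
import math
from collections import Counter

docs = {
    "d1": "large scale data analysis on the web",
    "d2": "data analysis of scale models",
    "d3": "large collection of web data at scale",
}

def tfidf(term, tokens, all_docs):
    tf = Counter(tokens)[term] / len(tokens)
    df = sum(1 for d in all_docs.values() if term in d.split())
    return tf * math.log(1 + len(all_docs) / (1 + df))   # smoothed IDF

def adjacency_bonus(query_tokens, tokens, weight=0.5):
    bigrams = set(zip(tokens, tokens[1:]))
    hits = sum(1 for pair in zip(query_tokens, query_tokens[1:]) if pair in bigrams)
    return weight * hits

def score(query, doc_text):
    q = query.split()
    t = doc_text.split()
    return sum(tfidf(term, t, docs) for term in q) + adjacency_bonus(q, t)

query = "large scale data"
for name, text in sorted(docs.items(), key=lambda kv: -score(query, kv[1])):
    print(f"{name}: {score(query, text):.3f}")
```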

Bayesian analysis of longitudinal traits in the Korea Association Resource (KARE) cohort

  • Chung, Wonil;Hwang, Hyunji;Park, Taesung
    • Genomics & Informatics
    • /
    • v.20 no.2
    • /
    • pp.16.1-16.12
    • /
    • 2022
  • Various methodologies for the genetic analysis of longitudinal data have been proposed and applied to data from large-scale genome-wide association studies (GWAS) to identify single nucleotide polymorphisms (SNPs) associated with traits of interest and to detect SNP-time interactions. We recently proposed a grid-based Bayesian mixed model for longitudinal genetic data and showed that our Bayesian method increased statistical power compared with the corresponding univariate method and detected SNP-time interactions well. In this paper, we further analyze longitudinal obesity-related traits such as body mass index, hip circumference, waist circumference, and waist-hip ratio from Korea Association Resource data to evaluate the proposed Bayesian method. We first conducted GWAS analyses of cross-sectional traits and combined the results through a meta-analysis based on a trajectory model and a random-effects model. We then applied our Bayesian method to a subset of SNPs selected by the meta-analysis to further discover SNPs associated with the traits of interest and SNP-time interactions. The proposed Bayesian method identified several novel SNPs associated with longitudinal obesity-related traits, and almost 25% of the identified SNPs had significant p-values for SNP-time interactions.
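
A minimal sketch of testing a SNP-time interaction with a longitudinal mixed model, shown here as a standard frequentist stand-in for illustration rather than the authors' grid-based Bayesian model. The data are simulated, and the availability of numpy, pandas, and statsmodels is assumed.

```python
# Hypothetical sketch: random-intercept model with a SNP x time interaction term.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_visits = 200, 4

subj = np.repeat(np.arange(n_subj), n_visits)
time = np.tile(np.arange(n_visits, dtype=float), n_subj)
snp = np.repeat(rng.binomial(2, 0.3, n_subj), n_visits)      # 0/1/2 allele count

# simulate BMI with a per-subject random intercept and a SNP-time interaction
rand_int = np.repeat(rng.normal(0, 1.5, n_subj), n_visits)
bmi = (24 + 0.3 * snp + 0.2 * time + 0.15 * snp * time
       + rand_int + rng.normal(0, 0.8, n_subj * n_visits))

df = pd.DataFrame({"subject": subj, "time": time, "snp": snp, "bmi": bmi})
result = smf.mixedlm("bmi ~ snp * time", df, groups=df["subject"]).fit()
print(result.summary())      # inspect the 'snp:time' coefficient for the interaction
```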