• Title/Summary/Keyword: Log File Analysis


High Rate Denial-of-Service Attack Detection System for Cloud Environment Using Flume and Spark

  • Gutierrez, Janitza Punto; Lee, Kilhung
    • Journal of Information Processing Systems / v.17 no.4 / pp.675-689 / 2021
  • Nowadays, cloud computing is being adopted by more organizations. However, since cloud computing has a virtualized, volatile, scalable, multi-tenant and distributed nature, performing attack detection in the cloud with conventional processes is a challenging task. This work proposes a solution that collects web server logs using Flume and filters them through Spark Streaming so that only suspicious data, or data related to denial-of-service attacks, are considered, reducing the data stored in the Hadoop Distributed File System for subsequent analysis with the frequent pattern (FP)-Growth algorithm. With the proposed system, we can address some of the difficulties of securing a cloud environment, facilitating data collection, reducing detection time and consequently enabling near real-time attack detection. (A minimal illustrative sketch follows below.)
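The abstract does not include implementation details, so the following is only a minimal sketch of the mining step it describes: pre-filtered suspicious web-log records are turned into itemsets and mined with Spark's FP-Growth implementation. The field values, support/confidence thresholds, and application name are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: mine frequent patterns from pre-filtered web-log records
# with Spark's FP-Growth. Field values and thresholds are illustrative only.
from pyspark.sql import SparkSession
from pyspark.ml.fpm import FPGrowth

spark = SparkSession.builder.appName("dos-fpgrowth-sketch").getOrCreate()

# Suppose Spark Streaming has already filtered suspicious requests into tuples
# of (client_ip, http_method, status_code, uri); each request becomes an itemset.
suspicious = [
    (0, ["10.0.0.5", "GET", "503", "/login"]),
    (1, ["10.0.0.5", "GET", "503", "/login"]),
    (2, ["10.0.0.7", "POST", "200", "/search"]),
]
df = spark.createDataFrame(suspicious, ["id", "items"])

# Mine field combinations that appear in at least 30% of suspicious requests.
fp = FPGrowth(itemsCol="items", minSupport=0.3, minConfidence=0.6)
model = fp.fit(df)

model.freqItemsets.show(truncate=False)      # frequent field combinations
model.associationRules.show(truncate=False)  # rules among those combinations

spark.stop()
```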

Article Data Prefetching Policy using User Access Patterns in News-On-demand System (주문형 전자신문 시스템에서 사용자 접근패턴을 이용한 기사 프리패칭 기법)

  • Kim, Yeong-Ju; Choe, Tae-Uk
    • The Transactions of the Korea Information Processing Society / v.6 no.5 / pp.1189-1202 / 1999
  • Compared with VOD data, NOD article data has the following characteristics: it can be created at any time, has a short life cycle, is selected by a user as several articles rather than a single one, and has high access locality in time. Because of these intrinsic features, user access patterns for NOD article data differ from those for VOD. Thus, building an NOD system with the existing techniques of VOD systems leads to poor performance. In this paper, we analyze the log file of a currently running electronic newspaper, show that the popularity distribution of NOD articles differs from the Zipf distribution of VOD data, and propose a new popularity model for NOD article data, the MS-Zipf (Multi-Selection Zipf) distribution, together with its approximate solution. We also present a life cycle model of NOD article data, which shows how popularity changes over time. Using this life cycle model, we develop the LLBF (Largest Life-cycle Based Frequency) prefetching algorithm and analyze its performance by simulation. The developed LLBF algorithm achieves a hit ratio similar to that of other prefetching algorithms such as LRU (Least Recently Used), while decreasing the number of data replacements during article prefetching and reducing the prefetching overhead on system performance. Using accurate user access patterns of NOD article data, we could correctly analyze the performance of the NOD server system and develop efficient policies for its implementation. (A simple illustrative sketch follows below.)
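The MS-Zipf model and the LLBF algorithm themselves are not specified in the abstract, so the sketch below only illustrates the kind of popularity analysis involved: comparing observed article access frequencies from a log against a plain Zipf distribution, the baseline the paper argues NOD data deviates from. The access counts are invented.

```python
# Rough sketch: compare observed article access frequencies (by popularity rank)
# with a plain Zipf distribution. The access counts below are made up.
import numpy as np

access_counts = np.array([950, 900, 870, 300, 120, 60, 30, 15, 8, 4], dtype=float)
observed = access_counts / access_counts.sum()        # observed popularity by rank

ranks = np.arange(1, len(access_counts) + 1)
zipf = (1.0 / ranks) / np.sum(1.0 / ranks)            # classical Zipf (s = 1)

for r, obs, z in zip(ranks, observed, zipf):
    print(f"rank {r:2d}: observed {obs:.3f}  zipf {z:.3f}  diff {obs - z:+.3f}")
```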


Application Performance Evaluation in Main Memory Database System (메인메모리 데이터베이스시스템에서의 어플리케이션 성능 평가)

  • Kim, Hee-Wan; Ahn, Yeon S.
    • Journal of Digital Contents Society / v.15 no.5 / pp.631-642 / 2014
  • A main memory DBMS operates with the contents of tables that also reside on disk held in memory at the same time. However, because the main memory DBMS stores its data and transaction log files using the disk file system, there is a limit to how fully the speed of CPU memory access can be exploited. In this paper, I evaluate performance by analyzing application-level differences between the technology implemented in Altibase, a main memory DBMS, and Sybase, a disk-based DBMS. When the application performance of the main memory DBMS is compared with that of the disk-based DBMS, the main memory DBMS performed 1.24~3.36 times better for the single soccer game workload and 1.29~7.9 times better for the soccer game / special soccer workloads. For sale transactions, response times were 1.78~6.09 times faster. (A brief measurement sketch follows below.)
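The paper's benchmark harness is not described in the abstract; the sketch below merely shows how per-transaction response times might be measured from the application side, with sqlite3 standing in for the Altibase and Sybase connections actually compared.

```python
# Illustrative sketch: time a batch of transactions from the application side.
# sqlite3 stands in for the Altibase/Sybase connections compared in the paper.
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sale (id INTEGER PRIMARY KEY, amount REAL)")

def timed_transaction(i):
    start = time.perf_counter()
    with conn:  # one transaction (commit) per sale
        conn.execute("INSERT INTO sale (amount) VALUES (?)", (i * 1.5,))
    return time.perf_counter() - start

latencies = [timed_transaction(i) for i in range(1000)]
print(f"mean response time: {sum(latencies) / len(latencies) * 1000:.3f} ms")
```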

A Study of Acquisition and Analysis on the Bios Firmware Image File in the Digital Forensics (디지털 포렌식 관점에서 BIOS 펌웨어 이미지 파일 수집 및 분석에 관한 연구)

  • Jeong, Seung Hoon; Lee, Yun Ho; Lee, Sang Jin
    • KIPS Transactions on Computer and Communication Systems / v.5 no.12 / pp.491-498 / 2016
  • Recently, leaks of confidential information and internal data through booting a portable OS, such as Windows PE stored on portable storage devices (USB or CD/DVD, etc.), have been steadily increasing. This method bypasses security software, such as USB security or media control solutions installed on the target PC, allowing data to be extracted or malicious code to be inserted by mounting the PC's storage devices after booting the portable OS. In addition, this booting method leaves no log records, such as traces of removable storage devices, so it is difficult to identify whether data were leaked or to apply trace-back techniques. This paper proposes a method to facilitate the digital forensic investigation or audit of a company by collecting and analyzing BIOS firmware images, which record data relating to BIOS settings in flash memory, and by finding traces of portable storage devices that can be regarded as abnormal events. (A loose illustrative sketch follows below.)
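The acquisition and parsing procedure is not detailed in the abstract; as a loose illustration of one analysis step, the sketch below scans an already-dumped firmware image for printable strings mentioning removable boot devices, the kind of trace the authors look for. The file name and keyword list are hypothetical.

```python
# Loose sketch: scan a dumped BIOS/UEFI firmware image for printable strings
# that mention removable boot devices. File name and keywords are hypothetical.
import re

KEYWORDS = (b"USB", b"CDROM", b"DVD", b"Removable")

def extract_strings(data: bytes, min_len: int = 6):
    # Printable ASCII runs, similar to the Unix `strings` utility.
    return re.findall(rb"[\x20-\x7e]{%d,}" % min_len, data)

with open("bios_dump.bin", "rb") as f:   # image acquired beforehand (flash dump)
    image = f.read()

for s in extract_strings(image):
    if any(k.lower() in s.lower() for k in KEYWORDS):
        print(s.decode("ascii", errors="replace"))
```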

Exploring Navigation Pattern and Site Evaluation Variation in a Community Website by Mixture Model at Segment Level (커뮤니티 사이트 특성과 navigation pattern 연관성의 세분시장별 이질성분석 - 믹스처모델의 구조방정식 적용을 중심으로 -)

  • Kim, So-Young; Kwak, Young-Sik; Nam, Yong-Sik
    • Journal of Global Scholars of Marketing Science / v.13 / pp.209-229 / 2004
  • Although the site evaluation factors that affect navigation patterns are well documented, attempts to explore differences in the relationship between navigation patterns and site evaluation factors through a post hoc segmentation approach have been relatively rare. For this purpose, this study constructs a structural equation model using web-evaluation data and the log file of a community site with 300,000 members, and then applies the structural equation model to each segment. Each segment is identified by a mixture model, which unmixes the sample, identifies the segments, and estimates the parameters of the density function underlying the observed data within each segment. The study examines the opportunity to increase GFI by using a mixture model that assumes heterogeneous groups of users, rather than through a specification search based on modification indices of the structural equation model. This study finds that AGFI increases from 0.819 for the total sample to 0.927, 0.930, 0.928, and 0.929 for the four segments of the community site. The results confirm that the segment-level approach is more effective than model modification when the model is robust in terms of theoretical background. Furthermore, we can identify heterogeneous navigation patterns and site evaluation variation in the community website at the segment level. (A simplified illustrative sketch follows below.)
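The paper's mixture structural equation model is not reproduced here; the sketch below only conveys the general idea of segment-level analysis: cluster users with a Gaussian mixture model, then fit a separate, much simpler model per segment. A plain linear regression stands in for the structural equation model, and the data and feature meanings are fabricated placeholders.

```python
# Sketch of segment-level analysis: split users into latent segments with a
# Gaussian mixture model, then fit a simple per-segment model. Data are fabricated.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 400
# Columns: two site-evaluation scores per user (e.g. content, design).
X = rng.normal(size=(n, 2))
# Navigation-pattern outcome (e.g. pages per visit).
y = 2.0 * X[:, 0] + rng.normal(size=n)

segments = GaussianMixture(n_components=4, random_state=0).fit_predict(X)

for seg in range(4):
    mask = segments == seg
    model = LinearRegression().fit(X[mask], y[mask])
    print(f"segment {seg}: n={mask.sum():3d}, coef={model.coef_.round(2)}")
```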


Non-Surgical Resolution of Inflow Cannula Obstruction of a Left Ventricular Assist Device: A Case Report

  • Lee, Yoonseo; Sung, Kiick; Kim, Wook Sung; Jeong, Dong Seop; Shinn, Sung Ho; Cho, Yang Hyun
    • Journal of Chest Surgery / v.54 no.6 / pp.543-546 / 2021
  • A 55-year-old woman who had received an implantable left ventricular assist device 3 months earlier presented with dyspnea and a low-flow alarm of the device. Computed tomography and log-file analysis of the device system suggested inflow cannula obstruction. Since the patient had cardiogenic shock due to pump failure, venoarterial extracorporeal membrane oxygenation (ECMO) was initiated. With the patient on ECMO, surgical exchange of the pump was considered; however, the obstruction resolved spontaneously without surgical intervention. It turned out that an obstructive thrombus had been washed out by rebooting the pump and had embolized to the patient's left subclavian artery. The patient underwent heart transplantation 4 months after the pump obstruction event and continued to do well.

Application of Social Network Analysis on Learner Interaction in a GBS Learning Environment (GBS 학습 환경 하에서 상호작용 연구를 위한 사회 연결망 분석 기법의 적용)

  • Jo, Il-Hyun
    • The Journal of Korean Association of Computer Education / v.6 no.2 / pp.81-93 / 2003
  • The purpose of this study was to explore the potential of Social Network Analysis as an analytical tool for the scientific investigation of learner-learner and learner-tutor interaction within an e-Learning environment. Theoretical and methodological implications of Social Network Analysis are discussed. Following the theoretical analysis, an exploratory empirical study was conducted to test the statistical correlation between traditional performance measures, such as achievement and a team contribution index, and the centrality measure, one of the many quantitative measures Social Network Analysis provides. Results indicate that the centrality measure was correlated with higher-order learning performance and peer-evaluated contribution indices. An interpretation of the results and their implications for instructional design theory and practice is provided, along with suggestions for future research. (A small illustrative sketch follows below.)
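The study's dataset and exact measures are not included in the abstract; with made-up interaction data, the sketch below shows how a degree-centrality measure could be computed from learner-to-learner messages and correlated with an achievement score, which is the style of analysis described.

```python
# Sketch: compute a centrality measure from learner interactions and correlate
# it with achievement. Interaction counts and scores are made up.
import networkx as nx
from scipy.stats import pearsonr

# Directed weighted edges: (sender, receiver, message count) among learners.
interactions = [("a", "b", 5), ("b", "a", 2), ("a", "c", 3),
                ("c", "d", 1), ("d", "a", 4), ("b", "d", 2)]
G = nx.DiGraph()
G.add_weighted_edges_from(interactions)

centrality = nx.degree_centrality(G)          # one of many SNA centrality measures
achievement = {"a": 92, "b": 78, "c": 70, "d": 84}

learners = sorted(G.nodes())
r, p = pearsonr([centrality[l] for l in learners],
                [achievement[l] for l in learners])
print(f"Pearson r = {r:.2f} (p = {p:.2f})")
```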


Metaverse Artifact Analysis through the Roblox Platform Forensics (메타버스 플랫폼 Roblox 포렌식을 통한 아티팩트 분석)

  • Yiseul Choi; Jeongeun Cho; Eunbeen Lee; Hakkyong Kim; Seongmin Kim
    • Convergence Security Journal / v.23 no.3 / pp.37-47 / 2023
  • The growth of the metaverse has been accelerated by the increased demand for non-face-to-face interactions due to COVID-19 and by advancements in technologies such as blockchain and NFTs. However, with the emergence of various metaverse platforms and the corresponding rise in users, criminal cases such as ransomware attacks, copyright infringements, and sexual offenses have occurred within the metaverse. Consequently, the need for artifacts that can be utilized as digital evidence within metaverse systems has increased, yet information about such artifacts is lacking. Furthermore, metaverse security evaluation and forensic analysis are also insufficient, and the absence of attack scenarios and related guidelines makes forensics challenging. To address these issues, this paper presents artifacts that can be used for user behavior analysis and timeline analysis through dynamic analysis of Roblox, a representative metaverse gaming solution. Based on an analysis of the interrelationships between the identified artifacts through memory forensics and log file analysis, this paper suggests the potential usability of these artifacts in metaverse crime scenarios. Moreover, it proposes improvements by analyzing current legal and regulatory aspects to address institutional deficiencies. (A simplified illustrative sketch follows below.)
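The specific Roblox artifact paths and log formats are not given in the abstract; the sketch below only illustrates the timeline-analysis step of log file forensics, parsing hypothetical timestamped log lines and ordering them into a single timeline.

```python
# Sketch of the timeline-analysis step: parse timestamped lines from client log
# files and merge them into one ordered timeline. The log format is hypothetical,
# not the actual Roblox artifact format.
import re
from datetime import datetime

LINE_RE = re.compile(r"^(\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})\s+(\w+)\s+(.*)$")

sample_logs = [
    "2023-05-01T10:02:11 JOIN  user=alice place_id=123456",
    "2023-05-01T10:00:03 START client session begun",
    "2023-05-01T10:05:47 CHAT  user=alice msg_len=42",
]

events = []
for line in sample_logs:
    m = LINE_RE.match(line)
    if m:
        ts, kind, detail = m.groups()
        events.append((datetime.fromisoformat(ts), kind, detail))

for ts, kind, detail in sorted(events):       # chronological timeline
    print(ts.isoformat(), kind, detail)
```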

A Study on the Data Collection Methods based Hadoop Distributed Environment (하둡 분산 환경 기반의 데이터 수집 기법 연구)

  • Jin, Go-Whan
    • Journal of the Korea Convergence Society / v.7 no.5 / pp.1-6 / 2016
  • Many studies have recently been carried out on technologies for utilizing and analyzing big data, and government agencies and companies are increasingly introducing Hadoop as a processing platform for analyzing big data. With this growing interest in processing and analysis, data collection technology has become a major issue in parallel. However, compared with research on data analysis techniques, research on collection techniques remains insignificant. Therefore, in this paper, a big data analysis platform is built on a Hadoop cluster, and structured data are collected from relational databases through Apache Sqoop. In addition, a system is provided that collects unstructured data, such as sensor data and the log files of web applications, as streams through Apache Flume. Data collected through this convergence can be utilized as basic material for big data analysis. (A rough illustrative sketch follows below.)
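Sqoop and Flume are driven by their own commands and configuration files, which the abstract does not quote; purely as an illustration of the streaming-collection idea, the sketch below tails a growing web-server log and writes fixed-size batches to date-partitioned files, roughly the role a Flume source-channel-sink pipeline plays before data land in HDFS. All paths and sizes are hypothetical.

```python
# Illustration only: tail a growing web-server log and write fixed-size batches
# to date-partitioned files, mimicking a Flume-style collection flow. The paths
# and batch size are hypothetical.
import os
import time
from datetime import date

LOG_PATH = "access.log"
OUT_DIR = "collected"
BATCH_SIZE = 100

def follow(path):
    # Yield new lines as they are appended to the log file.
    with open(path, "r") as f:
        f.seek(0, os.SEEK_END)
        while True:
            line = f.readline()
            if line:
                yield line
            else:
                time.sleep(0.5)

batch = []
for line in follow(LOG_PATH):
    batch.append(line)
    if len(batch) >= BATCH_SIZE:
        out = os.path.join(OUT_DIR, f"dt={date.today().isoformat()}")
        os.makedirs(out, exist_ok=True)
        with open(os.path.join(out, f"{int(time.time())}.log"), "a") as f:
            f.writelines(batch)
        batch.clear()
```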

The Study on the Effect of Target Volume in DQA based on MLC log file (MLC 로그 파일 기반 DQA에서 타깃 용적에 따른 영향 연구)

  • Shin, Dong Jin; Jung, Dong Min; Cho, Kang Chul; Kim, Ji Hoon; Yoon, Jong Won; Cho, Jeong Hee
    • The Journal of Korean Society for Radiation Therapy / v.32 / pp.53-59 / 2020
  • Purpose: The purpose of this study is to compare and analyze the differences between MLC log file-based software (Mobius) and the conventional phantom-ionization chamber (ArcCheck) dose verification method according to changes in target volume. Material and method: Twelve plans were created with sphere-shaped targets of radius 0.25 cm, 0.5 cm, 1 cm, 2 cm, 3 cm, 4 cm, 5 cm, 6 cm, 7 cm, 8 cm, 9 cm, and 10 cm, and dose verification using Mobius and ArcCheck was conducted three times for each. The irradiated data were compared and analyzed using the point dose error value and the gamma passing rate (3%/3 mm) as evaluation indicators. Result: The Mobius point dose error values were -9.87% at a radius of 0.25 cm and -4.39% at 0.5 cm, while the error value was within 3% for the remaining target volumes. The gamma passing rate was 95% at a radius of 9 cm and 93.9% at 10 cm, with a passing rate of more than 95% for the remaining target volumes. In ArcCheck, the average point dose error value was about 2% for all target volumes, and the gamma passing rate was 98% or more for all target volumes. Conclusion: For small targets with a radius of 0.5 cm or less, or large targets with a radius of 9 cm or more, considering the uncertainty of DQA based on MLC log files, it is desirable to use phantom-ionization chamber DQA in a complementary manner and to verify dose delivery through a comprehensive analysis including point dose, gamma index, DVH, and target coverage. (A simplified gamma-index sketch follows below.)
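The clinical gamma calculation is three-dimensional and performed by the verification software itself; the sketch below implements only a simplified one-dimensional global gamma index with the 3%/3 mm criterion mentioned in the abstract, to show how a passing rate is derived from calculated and measured dose profiles. The profiles are invented.

```python
# Simplified 1D global gamma index (3%/3 mm) showing how a passing rate is
# derived from calculated vs. measured dose profiles. Profiles are invented;
# clinical systems evaluate the same criterion in 3D.
import numpy as np

def gamma_passing_rate(x, dose_ref, dose_eval, dd=0.03, dta_mm=3.0):
    """Fraction of evaluation points with gamma <= 1 (global normalization)."""
    d_norm = dd * dose_ref.max()
    gammas = []
    for xe, de in zip(x, dose_eval):
        dist2 = ((x - xe) / dta_mm) ** 2          # distance term vs. all ref points
        dose2 = ((dose_ref - de) / d_norm) ** 2   # dose-difference term
        gammas.append(np.sqrt(np.min(dist2 + dose2)))
    gammas = np.asarray(gammas)
    return np.mean(gammas <= 1.0), gammas

x = np.linspace(-50, 50, 201)                     # positions in mm
reference = np.exp(-(x / 20.0) ** 2) * 100.0      # calculated dose (%)
measured = reference * 1.02 + np.random.default_rng(1).normal(0, 0.5, x.size)

rate, _ = gamma_passing_rate(x, reference, measured)
print(f"gamma passing rate: {rate * 100:.1f}%")
```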