• Title/Summary/Keyword: Log analysis

Search Results: 2,159 (processing time: 0.035 seconds)

Bayesian Analysis in Generalized Log-Gamma Censored Regression Model

  • Younshik Chung;Yoomi Kang
    • Communications for Statistical Applications and Methods
    • /
    • v.5 no.3
    • /
    • pp.733-742
    • /
    • 1998
  • For industrial and medical lifetime data, the generalized log-gamma regression model is considered. Bayesian analysis of the generalized log-gamma regression model with censored data is then explained: following the data augmentation approach of Tanner and Wong (1987), the censored observations are replaced by simulated data. To overcome the complicated Bayesian computation, the Markov Chain Monte Carlo (MCMC) method is employed, and some modified algorithms are proposed to implement it. Finally, one example is presented.

  • PDF
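The data-augmentation idea above can be sketched in a few lines. The following is a minimal illustration (not the paper's generalized log-gamma model): a Gibbs sampler for the mean of a right-censored normal sample, where each sweep first imputes the censored values from their truncated conditional distribution and then updates the mean. All function and variable names are hypothetical.

```python
import math
import random

def gibbs_censored_normal(obs, cens_at, n_cens, iters=2000, seed=1):
    """Data-augmentation Gibbs sampler in the Tanner-and-Wong (1987) style:
    right-censored observations are replaced each sweep by draws from the
    truncated normal implied by the current mean, then the mean is updated."""
    rng = random.Random(seed)
    mu = sum(obs) / len(obs)            # crude start from uncensored data only
    n = len(obs) + n_cens
    trace = []
    for _ in range(iters):
        # 1) impute each censored value from N(mu, 1) truncated to (cens_at, inf)
        latent = []
        for _ in range(n_cens):
            z = rng.gauss(mu, 1.0)
            while z <= cens_at:         # simple rejection sampling
                z = rng.gauss(mu, 1.0)
            latent.append(z)
        # 2) update mu from its conditional N(mean(all data), 1/n) (flat prior)
        full = obs + latent
        mu = rng.gauss(sum(full) / n, 1.0 / math.sqrt(n))
        trace.append(mu)
    return trace
```

Averaging the trace after a burn-in period gives a posterior-mean estimate that accounts for the censored observations rather than discarding them.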

Design of an Intrusion Response System for Enterprise Security Management (통합보안 관리를 위한 침입대응 시스템 설계)

  • Lee, Chang-Woo;Sohn, Woo-Yong;Song, Jung-Gil
    • Convergence Security Journal
    • /
    • v.5 no.2
    • /
    • pp.51-56
    • /
    • 2005
  • As the number of users grows, the Internet's network environment becomes gradually more complex, and the requirements of services and users diversify, stable and effective service operation becomes increasingly constrained. To address this problem, an intrusion response system based on log analysis is proposed. Storing log files in XML form exploits XML's advantages: searching the log files is easy and fast, and the structured data allows the system's log files to be analyzed and managed in a unified way. Furthermore, the generated log files can be sorted in various forms, such as by IP address, by port number, by intrusion type, and by detection time, enabling comparative analysis with log files produced by other intrusion detection systems.

  • PDF
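The sorting-by-attribute idea described above is straightforward with a standard XML parser. A minimal sketch, with an assumed (hypothetical) log schema — the paper's actual XML format is not given:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML log format; the paper's actual schema is not specified.
SAMPLE = """<logs>
  <entry ip="10.0.0.5" port="22" type="brute-force"/>
  <entry ip="10.0.0.2" port="80" type="sql-injection"/>
  <entry ip="10.0.0.2" port="22" type="brute-force"/>
</logs>"""

def entries_by(xml_text, key):
    """Return all <entry> attribute dicts sorted by the given attribute,
    e.g. key="ip", key="port", or key="type"."""
    root = ET.fromstring(xml_text)
    return sorted((e.attrib for e in root.iter("entry")),
                  key=lambda a: a[key])
```

Sorting the same structured entries by `ip`, `port`, or `type` is then a one-argument change, which is the unification benefit the abstract attributes to XML storage.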

Hazard Analysis of Staphylococcus aureus in Ready-to-Eat Sandwiches (즉석섭취 샌드위치류의 황색포도상구균에 대한 위해분석)

  • Park, Hae-Jung;Bae, Hyun-Joo
    • Journal of the Korean Society of Food Science and Nutrition
    • /
    • v.36 no.7
    • /
    • pp.938-943
    • /
    • 2007
  • This study investigated the hazards of ready-to-eat sandwiches sold in various establishments. Sandwich samples were collected from convenience stores, discount stores, sandwich chain stores, bakery shops, fast-food chain stores, and food service operations located in Daegu and Gyeongbuk. Out of 174 samples, 18 (10.3%) contained coagulase-positive staphylococci, with counts ranging from 0.30 to 4.08 log CFU/g. There was a significant seasonal difference in Staphylococcus aureus isolation; the average count in summer (3.24 log CFU/g) was about three times higher than that in winter (1.10 log CFU/g) (P<0.001). According to the PHLS microbiological guidelines for ready-to-eat foods, 95.4% of the samples were acceptable. In enterotoxin-production experiments ($35^{\circ}C$, pH 5.8, NaCl 0.5%), enterotoxin was not produced in a sandwich until Staphylococcus aureus increased to a level greater than 4.95 log CFU/g. These microbiological hazard analysis data could be applied to future studies on quantitative risk assessment of ready-to-eat foods.
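The counts above are base-10 logarithms of colony-forming units per gram. A small sketch of that conversion and of the study's 4.95 log CFU/g enterotoxin threshold (function names are my own):

```python
import math

def log_cfu_per_g(cfu, grams):
    """Convert a raw colony count to log10 CFU/g, the unit used in the study."""
    return math.log10(cfu / grams)

def may_produce_enterotoxin(log_count, threshold=4.95):
    """Threshold reported above: enterotoxin appeared only once S. aureus
    exceeded about 4.95 log CFU/g (roughly 89,000 CFU/g)."""
    return log_count > threshold
```

Note that a one-unit difference on this scale is a tenfold difference in counts, so the summer average (3.24) exceeds the winter average (1.10) by more than two orders of magnitude in raw CFU/g.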

AN EFFICIENT PRAM ALGORITHM FOR MAXIMUM-WEIGHT INDEPENDENT SET ON PERMUTATION GRAPHS

  • SAHA ANITA;PAL MADHUMANGAL;PAL TAPAN K.
    • Journal of applied mathematics & informatics
    • /
    • v.19 no.1_2
    • /
    • pp.77-92
    • /
    • 2005
  • An efficient parallel algorithm is presented to find a maximum-weight independent set of a permutation graph; it takes $O(\log n)$ time using $O(n^2/\log n)$ processors on an EREW PRAM, provided the graph has at most $O(n)$ maximal independent sets. The best previously known parallel algorithm takes $O(\log^2 n)$ time and $O(n^3/\log n)$ processors on a CREW PRAM.
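For intuition, here is a sequential analogue of the problem (a hedged sketch, not the paper's PRAM algorithm): in a permutation graph, an independent set corresponds to an increasing subsequence of the defining permutation, so maximum-weight independent set reduces to maximum-weight increasing subsequence, solvable by a simple $O(n^2)$ dynamic program.

```python
def max_weight_independent_set(perm, weight):
    """Maximum-weight increasing subsequence of `perm`, which equals the
    maximum-weight independent set of the corresponding permutation graph.
    weight[i] is the (positive) weight of the vertex at position i."""
    n = len(perm)
    best = list(weight)              # best[i]: max weight of a set ending at i
    for i in range(n):
        for j in range(i):
            if perm[j] < perm[i]:    # positions i, j are non-adjacent (no inversion)
                best[i] = max(best[i], best[j] + weight[i])
    return max(best)
```

The parallel algorithm in the paper achieves the same result far faster by distributing this dependency structure across PRAM processors.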

A Recovery Technique Using Client-based Logging in Client/Server Environment

  • Park, Yong-Mun;Lee, Chan-Seob;Kim, Dong-Hyuk;Park, Eui-In
    • Proceedings of the IEEK Conference
    • /
    • 2002.07a
    • /
    • pp.429-432
    • /
    • 2002
  • The existing recovery technique using logging in a client/server database system administers the log only as a whole on the server. This potentially imposes a log-record transmission cost on every transaction executed at a client and increases network traffic. In this paper, a logging technique for a redo-only log is suggested, which removes the redundant before-images and supports client-based logging to eliminate the transmission cost of log records. In the case of a client crash, redo recovery through a backward analysis of the client log is performed in a self-recovering way. In the case of a server crash, only the after-images of the pages that need recovery, identified through a simultaneous backward analysis of the logs, are transmitted, and redo recovery is done using the received after-images and the backward analysis log. We also select a comparison model to estimate the performance of the proposed technique, and analyze the redo and recovery times as the number of clients and the rate of update operations change.

  • PDF
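The backward-analysis redo pass described above can be sketched simply. This is an illustrative reconstruction with a hypothetical record format of `(page_id, after_image)`, not the paper's actual log layout: scanning the log backward, the first after-image seen for each page is its latest, so only that image needs to be reinstalled, and no before-images (undo information) are required.

```python
def recover_via_backward_analysis(pages, redo_log):
    """Redo-only recovery sketch: reinstall the latest after-image per page.
    pages: dict page_id -> current (possibly stale) contents.
    redo_log: list of (page_id, after_image) in log order."""
    latest = {}
    # Backward scan: the first after-image encountered per page is the newest.
    for page_id, after_image in reversed(redo_log):
        latest.setdefault(page_id, after_image)
    pages.update(latest)             # idempotent: replaying again is harmless
    return pages
```

Because installing an after-image is idempotent, a crash during recovery itself is also safe, which is the usual appeal of redo-only schemes.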

Analysis of Web Log Using Clementine Data Mining Solution (클레멘타인 데이터마이닝 솔루션을 이용한 웹 로그 분석)

  • Kim, Jae-Kyeong;Lee, Kun-Chang;Chung, Nam-Ho;Kwon, Soon-Jae;Cho, Yoon-Ho
    • Information Systems Review
    • /
    • v.4 no.1
    • /
    • pp.47-67
    • /
    • 2002
  • Since the mid-1990s, most firms utilizing the web as a communication vehicle with customers have been keenly interested in web log files, which contain many of the trails customers leave on the web, such as IP addresses, referrer addresses, cookie files, duration times, etc. An appropriate analysis of the web log file therefore leads to an understanding of customers' behavior on the web, and its results can be used as effective marketing information for locating potential target customers. In this study, we introduce a web mining technique using Clementine from SPSS and analyze a set of real web log data from an Internet hub site. We also suggest a process for building various strategies based on the web mining results.
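The first step of any such analysis is parsing the raw log. A minimal sketch, assuming logs in the common (NCSA) log format — Clementine itself is a GUI workbench and is not reproduced here:

```python
import re
from collections import Counter

# One line of the common log format: ip ident user [time] "method path proto" status size
LOG_LINE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\d+|-)')

def summarize(lines):
    """Count requests per (client IP, page) pair from web-server log lines."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m:
            ip, _time, _method, path, _status, _size = m.groups()
            hits[(ip, path)] += 1
    return hits
```

Counts like these (pages per visitor, repeat visits) are the raw material that the mining tool then clusters and models to locate target customers.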

Methodology of Log Analysis for Intrusion Prevention based on LINUX (리눅스 기반 침입 방지를 위한 로그 분석 방법 연구)

  • Lim, Sung-Hwa;Lee, Do Hyeon;Kim, Jeom Goo
    • Convergence Security Journal
    • /
    • v.15 no.2
    • /
    • pp.33-41
    • /
    • 2015
  • A safe Linux system with enhanced security should have an audit capability that prohibits illegal access to and alteration of data, as well as the ability to trace illegal activities. In addition, construction of a log management and monitoring system is necessary to clearly separate the responsibilities of the system manager or administrator from the users' activities. In this paper, the Linux system's security log is analyzed and converted into a database so that it can be used for the prevention and detection of illegal intrusions. The proposed analysis allows safe management of the security log. This system will contribute to enhancing system reliability by allowing quick responses to system malfunctions.
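A sketch of the security-log-to-database idea, assuming lines resembling a typical `/var/log/auth.log` (the paper's exact log sources and schema are not specified):

```python
import re
import sqlite3

# Matches sshd failure lines such as:
#   "... sshd[123]: Failed password for invalid user admin from 10.0.0.9 ..."
FAILED = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")

def load_auth_log(lines):
    """Parse security-log lines into an in-memory SQLite database so that
    intrusion attempts can be queried instead of grepped."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE failures (user TEXT, src_ip TEXT)")
    for line in lines:
        m = FAILED.search(line)
        if m:
            db.execute("INSERT INTO failures VALUES (?, ?)", m.groups())
    return db
```

Once in a database, questions like "which source IPs have repeated failures?" become simple `GROUP BY` queries, which is the kind of quick response the abstract has in mind.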

A Study on Process Management Method of Offshore Plant Piping Material using Process Mining Technique (프로세스 마이닝 기법을 이용한 해양플랜트 배관재 제작 공정 관리 방법에 관한 연구)

  • Park, JungGoo;Kim, MinGyu;Woo, JongHun
    • Journal of the Society of Naval Architects of Korea
    • /
    • v.56 no.2
    • /
    • pp.143-151
    • /
    • 2019
  • This study describes a method for analyzing log data generated in a process using process mining techniques. A system was constructed to collect and analyze the large amount of log data generated while manufacturing offshore plant piping materials, and the analyzed data were visualized in various ways. Through analysis of the process model, it was evaluated whether process performance had been input correctly. Through pattern analysis of the log data, problem processes can be identified in advance. In addition, we analyzed the process performance data of partner companies and identified the load on their processes; these data can be used as reference data for allocating pipe production. Real-time decision-making is required to cope with the various variances that arise in offshore plant production. To this end, we built a system that can analyze log data in real time and support decision-making.
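A core process-mining primitive behind such analyses is the directly-follows graph: counting, per case, how often one activity immediately follows another. A minimal sketch with an assumed event-log shape of `(case_id, activity)` pairs in timestamp order (the paper's actual log schema is not given):

```python
from collections import Counter

def directly_follows(event_log):
    """Build a directly-follows graph from an event log.
    event_log: list of (case_id, activity), assumed sorted by time per case.
    Returns Counter mapping (activity_a, activity_b) -> occurrence count."""
    by_case = {}
    for case_id, activity in event_log:
        by_case.setdefault(case_id, []).append(activity)
    dfg = Counter()
    for trace in by_case.values():
        for a, b in zip(trace, trace[1:]):   # consecutive activity pairs
            dfg[(a, b)] += 1
    return dfg
```

Unexpected or rare edges in this graph are exactly the "problem process occurred" patterns the abstract says can be checked in advance.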

An Efficient Design and Implementation of an MdbULPS in a Cloud-Computing Environment

  • Kim, Myoungjin;Cui, Yun;Lee, Hanku
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.9 no.8
    • /
    • pp.3182-3202
    • /
    • 2015
  • Flexibly expanding the storage capacity required to process a large amount of rapidly increasing unstructured log data is difficult in a conventional computing environment. In addition, implementing a log processing system that can categorize and analyze unstructured log data is extremely difficult. To overcome such limitations, we propose and design a MongoDB-based unstructured log processing system (MdbULPS) for collecting, categorizing, and analyzing log data generated by banks. The proposed system includes a Hadoop-based analysis module for reliable parallel-distributed processing of massive log data. Furthermore, because the Hadoop distributed file system (HDFS) stores data by generating replicas of collected log data in block units, the proposed system offers automatic recovery against system failures and data loss. Finally, by establishing a distributed database using the NoSQL-based MongoDB, the proposed system provides methods for effectively processing unstructured log data. To evaluate the proposed system, we conducted three different performance tests on a local test bed of twelve nodes: comparing our system with a MySQL-based approach, comparing it with an HBase-based approach, and varying the chunk-size option. The experiments showed that our system performs better at processing unstructured log data.
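The categorization step can be illustrated independently of the storage layer. A minimal sketch with hypothetical category rules — the real system stores results in MongoDB and analyzes them with Hadoop, both omitted here:

```python
import re

# Hypothetical rules for categorizing unstructured bank-log lines;
# the paper does not publish its actual category definitions.
RULES = [
    ("auth",     re.compile(r"login|logout|password", re.I)),
    ("transfer", re.compile(r"transfer|withdraw|deposit", re.I)),
]

def categorize(line):
    """Assign a free-form log line to the first matching category."""
    for name, pattern in RULES:
        if pattern.search(line):
            return name
    return "other"
```

In the full system, each categorized record would then be inserted as a document into a MongoDB collection, giving the schema flexibility that fixed-schema stores like MySQL lack for unstructured logs.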

Messaging System Analysis for Effective Embedded Tester Log Processing (효과적인 Embedded Tester Log 처리를 위한 Messaging System 분석)

  • Nam, Ki-ahn;Kwon, Oh-young
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2017.05a
    • /
    • pp.645-648
    • /
    • 2017
  • The existing embedded tester used TCP and a shared file system for log processing, handling logs in a 1-N structure. This method wastes the tester's resources on exception handling. We implemented a log-processing message layer that can be distributed by a messaging system, and we compared transmission through the message layer with transmission using TCP and the shared file system. In the comparison, transmission through the message layer showed higher bandwidth than TCP. In CPU usage, the message layer was less efficient than TCP, but the difference was not significant. Overall, log processing through the message layer shows higher efficiency.

  • PDF
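The producer/consumer shape of such a message layer can be sketched with a plain in-process queue (a stand-in for the real broker, which the abstract does not name): N testers publish log records and a single collector consumes them, replacing the 1-N TCP/shared-file scheme described above.

```python
import queue
import threading

def run_testers(n_testers, logs_per_tester):
    """Sketch: N tester threads publish log records through one thread-safe
    queue; the collector drains it until every tester has signalled completion."""
    q = queue.Queue()

    def tester(tid):
        for i in range(logs_per_tester):
            q.put(f"tester{tid}:log{i}")
        q.put(None)                      # per-tester end-of-stream marker

    threads = [threading.Thread(target=tester, args=(t,))
               for t in range(n_testers)]
    for th in threads:
        th.start()
    collected, done = [], 0
    while done < n_testers:
        item = q.get()
        if item is None:
            done += 1
        else:
            collected.append(item)
    for th in threads:
        th.join()
    return collected
```

Because the queue decouples producers from the collector, a slow or failed tester no longer forces exception handling onto every peer, which is the resource saving the abstract reports.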