• Title/Summary/Keyword: network forensics

A Deep Learning Approach for Identifying User Interest from Targeted Advertising

  • Kim, Wonkyung;Lee, Kukheon;Lee, Sangjin;Jeong, Doowon
    • Journal of Information Processing Systems
    • /
    • v.18 no.2
    • /
    • pp.245-257
    • /
    • 2022
  • In the Internet of Things (IoT) era, the types of devices used by a single user are becoming more diverse and the number of devices is also increasing. However, a forensic investigator is restricted from exploiting or collecting all of the user's devices; there are legal issues (e.g., privacy, jurisdiction) and technical issues (e.g., computing resources, the increase in storage capacity). Therefore, it has been a challenge in the digital forensics field to acquire, by analyzing the seized devices, information that remains on devices that could not be collected. In this study, we focus on the fact that multiple devices share data through account synchronization on online platforms. We propose a novel way of identifying the user's interest by analyzing the remnants of targeted advertising, which is served based on the visited websites or search terms of logged-in users. We introduce a detailed methodology to pick out the targeted advertising from cache data and infer the user's interest using deep learning. In this process, an improved learning model considering the unique characteristics of advertisements is implemented. The experimental results demonstrate that the proposed method can effectively identify the user's interest even though only one device is examined.
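
The final step of the methodology, inferring an interest category from recovered ad text, can be pictured with a small text classifier. The sketch below is only illustrative: the interest categories, the toy training samples, and the Keras architecture are assumptions, not the model described in the paper.

```python
# Minimal sketch (not the paper's model): classify ad text recovered from
# browser cache into interest categories with a small Keras text classifier.
# Categories and training samples below are hypothetical placeholders.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

CATEGORIES = ["shopping", "travel", "finance", "health", "games"]  # assumed labels

ads = ["running shoes sale 50% off", "cheap flights to Jeju", "low-interest loan offer"]
labels = np.array([0, 1, 2])  # indices into CATEGORIES

vectorizer = layers.TextVectorization(max_tokens=20000, output_sequence_length=64)
vectorizer.adapt(ads)

model = tf.keras.Sequential([
    vectorizer,                                   # raw ad string -> token ids
    layers.Embedding(20000, 64),
    layers.GlobalAveragePooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(len(CATEGORIES), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(np.array(ads), labels, epochs=5, verbose=0)

# Infer the interest behind a newly recovered ad snippet.
pred = model.predict(np.array(["hotel deals this weekend"]), verbose=0)
print(CATEGORIES[int(pred.argmax())])
```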

An Optimized Model for the Local Compression Deformation of Soft Tissue

  • Zhang, Xiaorui;Yu, Xuefeng;Sun, Wei;Song, Aiguo
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.14 no.2
    • /
    • pp.671-686
    • /
    • 2020
  • Due to the long training time and high cost of traditional surgical training methods, the emerging virtual surgical training method has gradually replaced them as the mainstream. However, virtual surgical systems suffer from poor authenticity and high computational cost. To overcome these problems, we propose an optimized model for the local compression deformation of soft tissue. This model uses a simulated annealing algorithm to optimize the parameters of the soft tissue model and thereby improve the authenticity of the simulation. Meanwhile, the soft tissue deformation is divided into a local deformation region and a non-deformation region, and our proposed model only needs to calculate and update the deformation region, which improves the real-time performance of the simulation. Besides, we define a compensation strategy for the "superelastic" effect that often occurs with the mass-spring model. To verify the validity of the model, we carry out compression simulation experiments on the abdomen and the human foot and compare the model with other models. The experimental results indicate that the proposed model is realistic and effective in soft tissue compression simulation, and it outperforms other models in accuracy and real-time performance.
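
The parameter-optimization step can be sketched as standard simulated annealing over mass-spring parameters. In the sketch below, the objective function `deformation_error` is a hypothetical placeholder (it would normally compare the simulated compression against reference data), and the cooling-schedule values are illustrative.

```python
# Minimal sketch of the simulated-annealing idea: search for mass-spring
# parameters (stiffness k, damping c) that minimize a deformation error.
import math
import random

def deformation_error(k: float, c: float) -> float:
    # Placeholder objective: distance from an assumed "true" parameter pair.
    return (k - 120.0) ** 2 + (c - 0.8) ** 2

def simulated_annealing(t0=100.0, t_min=1e-3, alpha=0.95, steps=50):
    k, c = 50.0, 0.1                          # initial guess
    best = (k, c, deformation_error(k, c))
    t = t0
    while t > t_min:
        for _ in range(steps):
            nk = k + random.uniform(-5, 5)
            nc = c + random.uniform(-0.05, 0.05)
            delta = deformation_error(nk, nc) - deformation_error(k, c)
            # Accept better moves always, worse moves with Boltzmann probability.
            if delta < 0 or random.random() < math.exp(-delta / t):
                k, c = nk, nc
                if deformation_error(k, c) < best[2]:
                    best = (k, c, deformation_error(k, c))
        t *= alpha                            # cool down
    return best

print(simulated_annealing())
```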

CAB: Classifying Arrhythmias based on Imbalanced Sensor Data

  • Wang, Yilin;Sun, Le;Subramani, Sudha
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.15 no.7
    • /
    • pp.2304-2320
    • /
    • 2021
  • Intelligently detecting anomalies in health sensor data streams (e.g., electrocardiogram, ECG) can advance the development of the E-health industry. The physiological signals of patients are collected through sensors. Timely diagnosis and treatment save medical resources, promote physical health, and reduce complications. However, it is difficult to classify ECG data automatically, as the features of ECGs are hard to extract, and the volume of labeled ECG data is limited, which affects classification performance. In this paper, we propose a Generative Adversarial Network (GAN)-based deep learning framework (called CAB) for heart arrhythmia classification. CAB focuses on improving detection accuracy based on a small number of labeled samples and is trained on class-imbalanced ECG data. Augmenting the ECG data with a GAN model eliminates the impact of data scarcity. After data augmentation, CAB classifies the ECG data using a Bidirectional Long Short-Term Memory Recurrent Neural Network (Bi-LSTM). Experimental results show a better performance of CAB compared with state-of-the-art methods. The overall classification accuracy of CAB is 99.71%. The F1-scores for classifying Normal beats (N), Supraventricular ectopic beats (S), Ventricular ectopic beats (V), Fusion beats (F), and Unclassifiable beats (Q) are 99.86%, 97.66%, 99.05%, 98.57%, and 99.88%, respectively.
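
The classification stage described above can be pictured with a small Bi-LSTM model. The sketch below omits the GAN augmentation entirely, and the per-beat segment length and layer sizes are assumptions rather than the paper's settings.

```python
# Minimal sketch of the Bi-LSTM classification stage (GAN augmentation omitted).
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 5          # N, S, V, F, Q
SEGMENT_LEN = 187        # assumed per-beat segment length

model = tf.keras.Sequential([
    layers.Input(shape=(SEGMENT_LEN, 1)),
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
    layers.Bidirectional(layers.LSTM(32)),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(x_train, y_train, validation_split=0.1, epochs=30)  # after augmentation
```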

A Study on the remote acquisition of HejHome Air Cloud artifacts (스마트 홈 헤이 홈 Air의 클라우드 아티팩트 원격 수집 방안 연구)

  • Kim, Ju-eun;Seo, Seung-hee;Cha, Hae-seong;Kim, Yeok;Lee, Chang-hoon
    • Journal of Internet Computing and Services
    • /
    • v.23 no.5
    • /
    • pp.69-78
    • /
    • 2022
  • As the use of Internet of Things (IoT) devices has expanded, the digital forensics coverage of the National Police Agency has extended into the smart home area. Accordingly, most existing studies on acquiring smart home platform data have focused on analyzing local data of mobile devices or analyzing the network perspective. However, the data meaningful for evidence analysis is mainly stored in the cloud storage of smart home platforms. Therefore, in this paper, we study how to acquire data stored in the cloud in a Hey Home Air environment by extracting the accessToken of the user account from the cookie databases of browsers such as Microsoft Edge, Google Chrome, Mozilla Firefox, and Opera, which are recorded on a PC when the user uses the Hey Home app-based "Hey Home Square" service. The experimental environment was configured with smart temperature/humidity sensors, smart door sensors, and smart motion sensors, and artifacts such as temperature and humidity data by date and place, the list of devices used, and motion detection records were collected. Information such as the temperature and humidity at the time of an incident can be obtained from the artifact analysis results and used in the forensic investigation process. In addition, the cloud data acquisition method using OpenAPI proposed in this paper excludes the possibility of tampering during the data collection process, and because it uses the API method, it follows the principles of integrity and reproducibility, which are core principles of digital forensics.
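
As a rough illustration of the flow (read an access token from a browser cookie database, then query the cloud read-only over a REST API), here is a minimal sketch. The cookie name `accessToken`, the endpoint URL, and the cookie-DB path are assumptions, and real Chromium-based browsers store cookie values encrypted in the `encrypted_value` column, so an extra decryption step would be needed in practice.

```python
# Minimal sketch of the acquisition flow, under the assumptions stated above.
import sqlite3
import requests

COOKIE_DB = r"C:\Users\user\AppData\Local\Google\Chrome\User Data\Default\Network\Cookies"
API_BASE = "https://openapi.example-heyhome.com"   # placeholder endpoint

def read_token(db_path: str, name: str = "accessToken") -> str:
    # Chromium stores cookies in a SQLite table named 'cookies'; on disk the
    # value is usually in 'encrypted_value' and must be decrypted first.
    con = sqlite3.connect(db_path)
    row = con.execute("SELECT value FROM cookies WHERE name = ?", (name,)).fetchone()
    con.close()
    return row[0] if row else ""

def fetch_devices(token: str) -> dict:
    # Read-only REST call: collecting via the API leaves the cloud data unchanged.
    resp = requests.get(f"{API_BASE}/devices",
                        headers={"Authorization": f"Bearer {token}"},
                        timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    token = read_token(COOKIE_DB)
    print(fetch_devices(token))
```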

Development of the SysLog-based Integrated Log Management system for Firewalls in Distributed Network Environments (분산 환경에서 SysLog기반의 방화벽 통합로그관리시스템 개발)

  • Lee, Dong Young;Seo, Hee Suk;Lee, Eul Suk
    • Journal of Korea Society of Digital Industry and Information Management
    • /
    • v.7 no.4
    • /
    • pp.39-45
    • /
    • 2011
  • Application log files contain error messages, operational data, and usage information that can help manage applications and servers. A log analysis system is software that reads and parses log files, extracting and aggregating information in order to generate reports on the application. Currently, the importance of firewall log files is growing for the forensics of cyber crimes and the establishment of security policy. In this paper, we design and implement SILAS (SysLog-based Integrated Log mAnagement System) for distributed network environments. It helps generate reports on firewall log files, such as IPs and users, and statistics on application usage.
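
A centralized collector of this kind boils down to receiving syslog datagrams and aggregating fields for reporting. The sketch below is a minimal, generic illustration rather than SILAS itself; the `SRC=` field format is an assumption, since firewall log formats vary.

```python
# Minimal sketch: listen for syslog messages over UDP/514 and count
# firewall events per source IP for a simple report.
import re
import socket
from collections import Counter

SRC_RE = re.compile(r"SRC=(\d{1,3}(?:\.\d{1,3}){3})")  # assumed message format
stats = Counter()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 514))              # syslog default port (needs privileges)

try:
    while True:
        data, _addr = sock.recvfrom(4096)
        msg = data.decode("utf-8", errors="replace")
        m = SRC_RE.search(msg)
        if m:
            stats[m.group(1)] += 1       # per-source-IP event count
except KeyboardInterrupt:
    for ip, count in stats.most_common(10):
        print(ip, count)
```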

A Study regarding IP Traceback designs and security audit data generation. (IP 역추적 설계 및 보안감사 자료생성에 관한 연구)

  • Lee, In-Hee;Park, Dea-Woo
    • KSCI Review
    • /
    • v.15 no.1
    • /
    • pp.53-64
    • /
    • 2007
  • In recent hacking incidents, an intruder does not attack a target system directly; to avoid exposing their own IP address, they use stepping stones and carry out a roundabout attack. This paper uses a network audit policy together with the CIS and AIAA techniques and algorithms, the Sleep Watermark Tracking technique, the Thumbprint algorithm, the Timing-based algorithm, and TCP sequence numbers at the network level, and presents a log-based TCP traceback system. To cope with intrusion technology that develops day by day, and to compensate for the large physical and logical complexity of the present Internet's configuration and the fast pace of technological change, the existing algorithms are combined as modules within one system rather than relied on individually, and an effective traceback system is presented.
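
One of the techniques named above, timing-based correlation, can be illustrated with a toy sketch: if the packet timing of an inbound connection and an outbound connection on the same host line up closely, the host is likely acting as a stepping stone. The timestamps and the 0.5-second tolerance below are illustrative values only, not taken from the paper.

```python
# Illustrative sketch of timing-based stepping-stone correlation.
def correlated(in_times, out_times, tol=0.5):
    # Fraction of inbound packet times that have a matching outbound time.
    matches = sum(
        any(abs(t_in - t_out) <= tol for t_out in out_times)
        for t_in in in_times
    )
    return matches / len(in_times) if in_times else 0.0

inbound  = [0.0, 1.2, 2.5, 4.1, 6.0]      # packet times on inbound connection
outbound = [0.1, 1.3, 2.6, 4.3, 6.1]      # packet times on outbound connection
print(f"correlation score: {correlated(inbound, outbound):.2f}")
```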

Evidence acquisition of P2P network for digital forensics (디지털 포렌식 관점의 P2P 네트워크 정보 수집 방안 연구)

  • Lee, Jin-Won;Baek, Eun-Ju;Byun, Keun-Duck;Lee, Sang-Jin;Lee, Jong-In
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2007.02a
    • /
    • pp.173-176
    • /
    • 2007
  • In computer forensic investigations, analysis of Internet usage is an important part of evidence acquisition. Internet usage analysis covers web browser analysis, messenger analysis, and Peer-to-Peer (P2P) analysis. Among these, this paper analyzes P2P, whose importance has recently been growing, identifies the information in P2P that is useful for computer forensic investigations, and presents an analysis method.

A Stable Evidence Collection Procedure of a Volatile Data in Research (휘발성 증거자료의 무결한 증거확보 절차에 관한 연구)

  • Kim, Yong-Ho;Lee, Dong-Hwi;J. Kim, Kui-Nam
    • Convergence Security Journal
    • /
    • v.6 no.3
    • /
    • pp.13-19
    • /
    • 2006
  • This paper explains a method for securely acquiring important volatile data when the network of a computer system cannot be used because of an incident. The main idea is that the first investigator at the crime scene collects the volatile data by running scripts stored on USB media. For the collected volatile data, the investigator generates a hash value and obtains a witness signature. After that, the investigator analyzes the authenticated volatile data in a forensics system.
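
The integrity step (hashing the collected output so it can later be verified against the witnessed value) can be sketched as follows; the output directory and file pattern are assumptions, not the paper's actual collection scripts.

```python
# Minimal sketch: hash each file produced by the USB collection scripts
# so its integrity can be verified later against the witnessed value.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

for output in Path("E:/volatile_collection").glob("*.txt"):   # assumed USB output dir
    print(output.name, sha256_of(output))
```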

A Study on Network Forensics Information in Automated Computer Emergency Response System (자동화된 침해사고대응시스템에서의 네트웍 포렌식 정보에 대한 연구)

  • 박종성;문종섭;최운호
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2004.04a
    • /
    • pp.253-255
    • /
    • 2004
  • Research on forensics has so far been focused on system forensics, which collects, processes, and preserves the traces left on a system. Recently, research on network forensics, which obtains and analyzes intrusion-related information from the entire network to which a system belongs rather than simply analyzing the traces left on the system, has been active. In particular, an automated computer emergency response system must handle intrusion traces across the entire network, so network forensics is especially important there. This paper defines the information that should be collected for network forensics in an automated computer emergency response system. Among the various devices and data of such a system, we present the items that should be collected as forensic data to serve as evidence when a computer crime occurs, and discuss why they are needed.

Real-time Lossless Compression of Traffic for Network Forensics (네트워크 포렌직을 위한 트래픽의 실시간 비손실 압축에 관한 연구)

  • 유상현;김기창
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2004.10a
    • /
    • pp.382-384
    • /
    • 2004
  • Interest in network forensics is growing in order to handle the aftermath of attacks, which are increasing sharply and becoming more diverse in technique, and to prevent repeated intrusions. Forensics requires collecting and preserving network traffic without loss of information, but its volume is enormous and demands ever larger disks. Compression is needed to solve this; furthermore, if a few frequently needed fields such as the IP can be accessed without decompression, the computing power and time spent decompressing data for simple tasks can be reduced. Therefore, this paper proposes a model in which each captured packet is divided into a part to be compressed and a part to be left uncompressed; after compression, the latter information is attached as a 4-byte header, so that the existing traffic is compressed while simple per-packet information remains accessible without decompression.
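
The idea of keeping a small field readable without decompression can be illustrated as below. This sketch stores the source IP in an uncompressed 4-byte prefix and zlib-compresses the rest of the packet; the actual 4-byte header layout used in the paper is not specified here, so this content is an assumption, and the extra length prefix is added only for framing.

```python
# Minimal sketch: uncompressed 4-byte field + length prefix + compressed packet.
import socket
import struct
import zlib

def store(src_ip: str, raw_packet: bytes) -> bytes:
    header = socket.inet_aton(src_ip)            # 4 uncompressed bytes (source IP)
    body = zlib.compress(raw_packet)             # lossless compression of the packet
    return header + struct.pack("!I", len(body)) + body

def read_src_ip(record: bytes) -> str:
    # Simple lookups never touch the compressed body.
    return socket.inet_ntoa(record[:4])

rec = store("192.168.0.10", b"\x45\x00" + b"payload" * 20)
print(read_src_ip(rec), "record bytes:", len(rec))
```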
