• Title/Summary/Keyword: 증거정보 (evidence information)

A Forensic Methodology for Detecting Image Manipulations (이미지 조작 탐지를 위한 포렌식 방법론)

  • Jiwon Lee;Seungjae Jeon;Yunji Park;Jaehyun Chung;Doowon Jeong
    • Journal of the Korea Institute of Information Security & Cryptology / v.33 no.4 / pp.671-685 / 2023
  • By applying artificial intelligence to image editing technology, it has become possible to generate high-quality images with minimal traces of manipulation. However, since these technologies can be misused for criminal activities such as the dissemination of false information, destruction of evidence, and denial of facts, strong countermeasures are crucial. In this study, image file analysis and mobile forensic artifact analysis were conducted to detect image manipulation. The image file analysis parses the metadata of manipulated images and compares it with a Reference DB to detect manipulation. The Reference DB is a database of manipulation-related traces left in image metadata, which serves as the criterion for detecting image manipulation. In the mobile forensic artifact analysis, packages related to image editing tools were extracted and analyzed to aid the detection of image manipulation. The proposed methodology overcomes the limitations of existing graphic-feature-based analysis and, combined with image processing techniques, offers the advantage of reducing false positives. The research results demonstrate the significant role of this methodology in digital forensic investigation and analysis. Additionally, we provide the code for parsing image metadata and the Reference DB, along with a dataset of manipulated images, aiming to contribute to related research.
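
The Reference-DB comparison the abstract describes can be sketched as follows. The field names and the sample reference entries here are hypothetical illustrations of the idea, not the authors' actual database or parser.

```python
# Sketch: compare parsed image metadata against a reference DB of traces that
# editing tools are known to leave behind. Both the DB entries and the field
# names below are invented for illustration.
REFERENCE_DB = {
    "Software": {"Adobe Photoshop", "GIMP", "Snapseed"},
    "HistorySoftwareAgent": {"Adobe Photoshop"},
}

def detect_manipulation(metadata: dict) -> list:
    """Return the metadata fields whose values match known editing traces."""
    hits = []
    for field, known_traces in REFERENCE_DB.items():
        value = metadata.get(field)
        if value and any(trace in value for trace in known_traces):
            hits.append(field)
    return hits

sample = {"Software": "Adobe Photoshop 24.0", "Make": "Canon"}
print(detect_manipulation(sample))
```

A hit on a field like `Software` does not prove manipulation by itself; as the abstract notes, the method combines this metadata evidence with mobile artifact analysis to reduce false positives.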

A Study on the Decryption Method for Volume Encryption and Backup Applications (볼륨 암호화 및 백업 응용프로그램에 대한 복호화 방안 연구)

  • Gwui-eun Park;Min-jeong Lee;Soo-jin Kang;Gi-yoon Kim;Jong-sung Kim
    • Journal of the Korea Institute of Information Security & Cryptology / v.33 no.3 / pp.511-525 / 2023
  • As awareness of personal data protection increases, various Full Disk Encryption (FDE)-based applications are being developed that perform real-time encryption or use virtual drive volumes to protect data on users' PCs. FDE-based applications encrypt and protect the volume containing the user's data. However, as disk encryption technology advances, some users abuse FDE-based applications to encrypt evidence associated with criminal activities, which complicates digital forensic investigations. It is therefore necessary to analyze the encryption process used in FDE-based applications and decrypt the encrypted data. In this paper, we analyze Cryptomator and Norton Ghost, which provide volume encryption and backup functions. We analyze the encrypted data structure and encryption process to classify the main data of each application and identify the encryption algorithm used for data decryption. The encryption algorithms of these applications are recently emerging or customized algorithms, which we analyze in order to decrypt the data. The user password is essential for generating the data encryption key used in decryption, and a password acquisition method is suggested using the functions of each application. This supplements the limitations of password investigation and identifies user data by decrypting the encrypted data with the acquired password.
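
The abstract's central point, that the data encryption key is derived from the user password, can be sketched with a generic password-based key derivation function. This is an illustrative PBKDF2 sketch, not the actual KDFs of Cryptomator or Norton Ghost (those are what the paper reverse-engineers); the salt and iteration count are placeholders.

```python
import hashlib

# Generic sketch of password-based key derivation as used by volume-encryption
# tools: password + salt + iteration count -> symmetric data-encryption key.
# The parameters below are illustrative, not those of the analyzed tools.
def derive_data_key(password: str, salt: bytes, iterations: int = 100_000) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)

key = derive_data_key("user-password", b"example-salt")
print(len(key))  # 32 bytes, suitable as an AES-256 key
```

This is why the paper's password acquisition step matters: without the password, the derived key, and hence the plaintext, is out of reach.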

A Contention-aware Ordered Sequential Collaborative Spectrum Sensing Scheme for CRAHN (무선인지 애드 혹 네트워크를 위한 순차적 협력 스펙트럼 센싱 기법)

  • Nguyen-Thanh, Nhan;Koo, In-Soo
    • Journal of Internet Computing and Services / v.12 no.4 / pp.35-43 / 2011
  • The Cognitive Radio (CR) ad hoc network is widely regarded as one of the promising future ad hoc networks, as it enables opportunistic access to under-utilized licensed spectrum. As in other CR networks, spectrum sensing is a prerequisite in a CR ad hoc network, and collaborative spectrum sensing can help increase sensing performance. For such an infrastructureless network, however, coordinating the sensing collaboration is complicated by the lack of a central controller. In this paper, we propose a novel collaborative spectrum sensing scheme in which the final decision is made by the node with the highest data reliability, based on sequential Dempster-Shafer theory. The collection of sensing data is carried out by the proposed contention-aware reporting mechanism, which uses the sensing-data reliability order for broadcasting spectrum sensing results. The proposed method reduces the collection time and the control-channel overhead thanks to the efficiency of the ordered sequential combination, while keeping the same sensing performance as the conventional centralized cooperative spectrum sensing scheme.
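
The sequential Dempster-Shafer combination at the heart of the scheme can be sketched with Dempster's rule over the two sensing hypotheses. The basic probability assignments (BPAs) below are made-up numbers; in the paper they are derived from each node's sensing data.

```python
# Dempster's rule combining two nodes' basic probability assignments (BPAs)
# over hypotheses H0 (channel idle) and H1 (channel occupied), with mass on
# {H0, H1} representing uncertainty. Illustrative values only.

def combine(m1: dict, m2: dict) -> dict:
    """Dempster's rule for BPAs keyed by frozensets of hypotheses."""
    combined = {}
    conflict = 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + va * vb
            else:
                conflict += va * vb  # mass assigned to disjoint hypotheses
    k = 1.0 - conflict  # normalization factor
    return {s: v / k for s, v in combined.items()}

H0, H1 = frozenset({"H0"}), frozenset({"H1"})
TH = frozenset({"H0", "H1"})  # total ignorance
m1 = {H0: 0.6, H1: 0.3, TH: 0.1}  # node 1's BPA
m2 = {H0: 0.5, H1: 0.2, TH: 0.3}  # node 2's BPA
m12 = combine(m1, m2)
print(m12)
```

In the ordered sequential scheme, this combination is applied node by node in decreasing reliability order, so the most reliable evidence dominates early and reporting can stop once the decision is firm.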

A Study on the Citation Behavior by Academic Background of Researchers (전문연구자의 학문배경에 따른 인용행태에 관한 연구)

  • Oh, Yu-Jin;Oh, Hyo-Jung;Kim, Chong-Hyuck;Kim, Yong
    • Journal of the Korean Society for Information Management / v.33 no.1 / pp.247-268 / 2016
  • Although why researchers prefer some cited documents to others has long been a subject of study, existing related research has taken a variety of perspectives on the nature and complexity of citation behavior and has not provided a complete answer to this question. In particular, Korean researchers have mainly used statistical analysis of bibliographic information, which is limited in revealing the dynamic and complex cognitive aspects of the citation process. In this study, we investigate citers' perceptions of citing motives and bibliographic factors through a survey and compare the responses according to researcher characteristics. After extracting 22 motives and 21 factors through a literature analysis and constructing 5-point Likert-scale questions, we conducted the survey by e-mail attachment. Using SPSS 22.0, frequency analysis, t-tests, and one-way ANOVA were performed on the 354 valid samples. As a result, "supporting" is considered the most important citing motive, while social connection and self-citation have little influence. Among bibliographic factors, the journal's reputation was recognized as the most influential, and the numbers of pages and authors as the least. Significant differences by field of study and research career were found in some areas. These results substantiate earlier studies, determine whether the factors assumed influential in selecting references are intentional, and suggest search improvements for specialty libraries and academic databases.
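
The one-way ANOVA the study runs (via SPSS 22.0) to compare Likert responses across researcher groups reduces to a single F statistic, sketched here in plain Python. The two sample groups are invented illustrations, not the study's data.

```python
# Minimal one-way ANOVA F statistic: ratio of between-group to within-group
# mean squares. Sample Likert-scale responses below are made up.

def one_way_anova_f(groups: list) -> float:
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    # between-group sum of squares: how far group means sit from the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # within-group sum of squares: spread of responses inside each group
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

humanities = [4, 5, 4, 5, 4]  # hypothetical ratings of one citing motive
sciences = [3, 2, 3, 3, 2]
print(one_way_anova_f([humanities, sciences]))
```

A large F relative to the F-distribution's critical value is what lets the study call a difference between fields "significant"; the paper reports exactly such differences in some motive and factor ratings.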

An Empirical Study of Profiling Model for the SMEs with High Demand for Standards Using Data Mining (데이터마이닝을 이용한 표준정책 수요 중소기업의 프로파일링 연구: R&D 동기와 사업화 지원 정책을 중심으로)

  • Jun, Seung-pyo;Jung, JaeOong;Choi, San
    • Journal of Korea Technology Innovation Society / v.19 no.3 / pp.511-544 / 2016
  • Standards boost technological innovation by promoting information sharing, compatibility, stability, and quality. Identifying groups of companies that particularly benefit from these functions of standards in their technological innovation and commercialization helps customize the planning and implementation of standards-related policies for demand groups. For this purpose, this study profiles SMEs whose R&D objective is to respond to standards, as well as those that need to implement a standards system for technological commercialization, and suggests a prediction model that can distinguish such companies from others. To this end, decision tree analysis is conducted to profile the characteristics of the subject SMEs through data mining. The subject SMEs include (1) those that engage in R&D to respond to standards (Group 1) and (2) those in need of product standard or technological certification policies for commercialization purposes (Group 2). The study then proposes a prediction model, built with discriminant analysis, that can distinguish Groups 1 and 2 from others based on several variables; the practicality of the discriminant formula is statistically verified. The study suggests that Group 1 companies are distinguished by variables such as time spent on R&D planning, Korean Standard Industry Classification (KSIC) category, number of employees, and novelty of technologies. The profiling of Group 2 companies suggests that they are differentiated by variables such as KSIC category, major clients, time spent on R&D, and the ability to test and verify their technologies. The prediction model proposed herein is designed based on the outcomes of the profiling and discriminant analysis. Its purpose is to serve the planning and implementation of standards-related policies by providing objective information on companies in need of relevant support, thereby enhancing the overall success rate of standards-related projects.
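
The decision-tree profiling step can be sketched at its smallest scale: a one-level tree (stump) that finds the threshold on a single variable best separating the demand group from other firms by Gini impurity. The variable and firm data below are hypothetical, not the study's survey data.

```python
# Sketch of decision-tree profiling: pick the threshold on one variable that
# minimizes weighted Gini impurity of the resulting split. Data are invented.

def gini(labels: list) -> float:
    """Gini impurity of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 1.0 - p * p - (1 - p) * (1 - p)

def best_split(xs: list, ys: list) -> tuple:
    """Return (threshold, weighted Gini) of the best split x <= threshold."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

# Hypothetical variable: months each firm spends on R&D planning,
# labeled 1 if the firm belongs to the standards-demand group.
planning_months = [1, 2, 2, 6, 7, 8]
is_demand_group = [0, 0, 0, 1, 1, 1]
print(best_split(planning_months, is_demand_group))
```

A full decision tree repeats this split selection recursively over all candidate variables, which is how the study arrives at profiles like "distinguished by time spent on R&D planning and KSIC category".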

The Recovery Method for MySQL InnoDB Using Feature of IBD Structure (IBD 구조적 특징을 이용한 MySQL InnoDB의 레코드 복구 기법)

  • Jang, Jeewon;Jeoung, Doowon;Lee, Sang Jin
    • KIPS Transactions on Computer and Communication Systems / v.6 no.2 / pp.59-66 / 2017
  • MySQL currently holds the second-largest share of the database market, and the InnoDB storage engine has been its default storage engine since MySQL 5.5; many companies use MySQL with InnoDB. Research on the structural features and logs of the InnoDB storage engine has been steadily underway in the digital forensics field, but record-by-record recovery of deleted data has not been studied. In digital forensic investigations, database administrators may damage evidence in order to destroy it, so recovering deleted records from the database is an important part of the forensic process. In this paper, we propose a method for recovering deleted data record by record by analyzing the structure of the MySQL InnoDB storage engine, and we validate the method with tools. This method can counter database anti-forensics and can be used to recover deleted data when an incident involving a MySQL InnoDB database occurs.
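
The recovery principle, that deleted records are only flagged, not erased, until their space is reused, can be shown with a toy parser. The 4-byte record layout and flag bit below are invented for illustration; the real InnoDB on-disk format (16 KB pages, compact record headers with a delete-mark bit) is what the paper actually analyzes.

```python
# Toy illustration of record-level recovery: scan a "page" for records whose
# delete-mark bit is set but whose payload bytes are still present.
# The record layout (1 info byte + 3 payload bytes) is invented; it is NOT
# the real InnoDB record format.

DELETE_MARK = 0x20  # assumed flag bit in the record's info byte

def recover_deleted(page: bytes, record_size: int = 4) -> list:
    """Return payloads of records flagged deleted but not yet overwritten."""
    recovered = []
    for off in range(0, len(page) - record_size + 1, record_size):
        info = page[off]
        payload = page[off + 1 : off + record_size]
        if info & DELETE_MARK and any(payload):
            recovered.append(payload)
    return recovered

# One fake page of three records: the middle one is delete-marked,
# but its payload bytes (9, 9, 9) have not been overwritten.
page = bytes([0x00, 1, 2, 3, 0x20, 9, 9, 9, 0x00, 4, 5, 6])
print(recover_deleted(page))
```

The real method additionally walks the page's record linked list and the garbage (free) list, since InnoDB chains deleted records there rather than leaving them at fixed offsets.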

Managing Mobility - Enterprise Secure Wireless Control (이동성 관리 - 기업의 안전한 무선 네트워크 제어)

  • Lee Daniel H.
    • 한국정보통신설비학회:학술대회논문집 / 2004.08a / pp.282-290 / 2004
  • Distributed computing, from the personal computers that appeared in the early 1980s to the client/server environments that spread rapidly in the 1990s, has proven difficult to manage. Evidence of this seemingly insurmountable rule is the enormous enterprise management systems industry, including IBM's Tivoli and HP's OpenView. Then came wireless. According to a 2001 survey conducted by NOP World Technology for Cisco, end users improved their productivity by up to 22% by using wireless LANs, and 63% of respondents reported improved accuracy in their daily work, all of which amounts to $550 per user in return-on-investment (ROI) terms. Several affordable solutions are now being introduced that address the dilemma of mobility and IT manageability at the same time. This paper introduces an innovative, holistic approach to the problems of wireless networking, the next evolutionary step in distributed computing. It describes the nature of wireless computing and security, and the operational and management difficulties arising from the new computing paradigm of the wireless LAN. With this environment defined, the paper explains an innovative wireless LAN management method based on an easy-to-understand 5x5 layer matrix that considers the distinctive nature of each layer. Finally, it introduces a solution based on Red-M's back-office applications that can solve problems unique to wireless networking, convergence, and ultimately distributed computing. The goal of this paper is to describe two key elements of Red-M's success: delivering the full benefits promised by a wireless environment through secure wireless network control, and providing a stable, scalable, and intuitive system that not only blocks malicious users but also comprehensively manages legitimate users and all other users.


A Classification Method of Delirium Patients Using Local Covering-Based Rule Acquisition Approach with Rough Lower Approximation (러프 하한 근사를 갖는 로컬 커버링 기반 규칙 획득 기법을 이용한 섬망 환자의 분류 방법)

  • Son, Chang Sik;Kang, Won Seok;Lee, Jong Ha;Moon, Kyoung Ja
    • KIPS Transactions on Software and Data Engineering / v.9 no.4 / pp.137-144 / 2020
  • Delirium is among the most common mental disorders encountered in patients with temporary cognitive impairment such as consciousness disorder, attention disorder, and poor speech, particularly among older patients. Delirium is distressing for patients and families, can interfere with the management of symptoms such as pain, and is associated with increased elderly mortality. The purpose of this paper is to generate useful clinical knowledge for distinguishing the outcomes of patients with delirium in long-term care facilities. To that end, we extracted clinical classification knowledge associated with delirium using a local covering rule acquisition approach with the rough lower approximation region. The clinical applicability of the proposed method was verified using data collected from a prospective cohort study. From the results, we found six useful pieces of clinical evidence that the duration of delirium could exceed 12 days, and confirmed that eight factors (BMI, Charlson Comorbidity Index, hospitalization path, nutrition deficiency, infection, sleep disturbance, bed scores, and diaper use) are important in distinguishing the outcomes of delirium patients. The classification performance of the proposed method was verified by comparison with three benchmark models (ANN, SVM with RBF kernel, and Random Forest) using five-fold cross-validation. The proposed method improved average performance by 0.6% in accuracy and 2.7% in AUC over the SVM model, which had the highest classification performance of the three.
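
The rough lower approximation underlying the rule acquisition can be sketched directly from its definition: the union of indiscernibility classes (objects identical on the chosen attributes) that lie entirely inside the target outcome set. The toy patient records below are illustrative, not the cohort data.

```python
# Rough set lower approximation: keep only the objects whose entire
# indiscernibility class is contained in the target set, i.e. objects that
# can be classified into the target with certainty. Toy data only.

def lower_approximation(objects: dict, attrs: tuple, target: set) -> set:
    """objects: id -> attribute dict; target: ids with the outcome of interest."""
    # Group objects by their value tuple on attrs (indiscernibility classes).
    classes = {}
    for oid, row in objects.items():
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, set()).add(oid)
    # Union of classes fully contained in the target set.
    result = set()
    for c in classes.values():
        if c <= target:
            result |= c
    return result

patients = {
    1: {"infection": "yes", "sleep_disturbance": "yes"},
    2: {"infection": "yes", "sleep_disturbance": "yes"},
    3: {"infection": "no", "sleep_disturbance": "yes"},
}
long_delirium = {1, 2}  # hypothetical "duration > 12 days" outcome
print(lower_approximation(patients, ("infection", "sleep_disturbance"), long_delirium))
```

Rules induced from the lower approximation are certain rather than merely possible, which is why the paper uses it to extract clinically trustworthy classification knowledge.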

Fuzzy Expert System for Detecting Anti-Forensic Activities (안티 포렌식 행위 탐지를 위한 퍼지 전문가 시스템)

  • Kim, Se-Ryoung;Kim, Huy-Kang
    • Journal of Internet Computing and Services / v.12 no.5 / pp.47-61 / 2011
  • Recently, the importance of digital forensics has grown because of the dramatic increase in cyber crime and the increasing complexity of investigating target systems such as PCs, servers, and database systems. Moreover, some systems have to be investigated with live forensic techniques. However, even though live forensic techniques have improved, they remain vulnerable to anti-forensic activities when the target systems are remotely accessible to criminals or their accomplices. To solve this problem, we first suggest a layer-based model and the anti-forensic scenarios that can actually be applied at each layer. Our model, the Anti-Forensic Activities layer-based model, has five layers: the physical layer, network layer, OS layer, database application layer, and data layer. Each layer has possible anti-forensic scenarios with detailed commands. Second, we propose a fuzzy expert system for effectively detecting anti-forensic activities. Some anti-forensic activities are hard to distinguish from normal activities, so we use fuzzy logic to handle ambiguous data. We build rule sets from the commands and arguments extracted from the pre-defined scenarios, and the fuzzy expert system learns these rule sets. With this system, anti-forensic activities can be detected in real time during live forensics.
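
A single fuzzy rule of the kind the expert system evaluates can be sketched as follows. The membership functions, input variables, and the rule itself are invented illustrations of the mechanism, not the paper's actual rule sets.

```python
# Minimal fuzzy-rule sketch for scoring how "anti-forensic" an activity looks.
# Inputs, membership shapes, and the rule are hypothetical.

def trapezoid(x: float, a: float, b: float, c: float, d: float) -> float:
    """Trapezoidal membership function: 0 outside [a, d], 1 on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def suspicion(deleted_files_per_min: float, wiping_tool_hits: float) -> float:
    # Rule: IF deletion rate is HIGH AND wiping-tool evidence is PRESENT
    # THEN anti-forensic suspicion is high. AND = min, as in Mamdani inference.
    high_deletion = trapezoid(deleted_files_per_min, 5, 20, 100, 101)
    tool_present = trapezoid(wiping_tool_hits, 0, 1, 10, 11)
    return min(high_deletion, tool_present)

print(suspicion(50, 3))  # clear evidence on both inputs
print(suspicion(2, 0))   # normal-looking activity
```

Because membership is graded rather than binary, borderline behavior (for instance, a moderately high deletion rate) produces an intermediate suspicion score instead of a hard false positive, which is exactly why the paper chooses fuzzy logic for ambiguous live-forensic data.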

Comparison among Methods of Modeling Epistemic Uncertainty in Reliability Estimation (신뢰성 해석을 위한 인식론적 불확실성 모델링 방법 비교)

  • Yoo, Min Young;Kim, Nam Ho;Choi, Joo Ho
    • Journal of the Computational Structural Engineering Institute of Korea / v.27 no.6 / pp.605-613 / 2014
  • Epistemic uncertainty (lack of knowledge) is often more important than aleatory uncertainty (variability) in estimating the reliability of a system. While probability theory is widely used for modeling aleatory uncertainty, there is no dominant approach to modeling epistemic uncertainty. Different approaches have been developed to handle epistemic uncertainties using various theories, such as probability theory, fuzzy sets, evidence theory, and possibility theory. However, since these methods are built on different statistical foundations, it is difficult to compare results across methods. The goal of this paper is to compare the different methods of handling epistemic uncertainty from the viewpoint of calculating the probability of failure. In particular, four methods are compared: the probability method, the combined distribution method, the interval analysis method, and evidence theory. The characteristics of the individual methods are compared from the viewpoint of reliability analysis.
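
The evidence-theory view of the failure probability can be sketched as follows: when an uncertain load is described by a basic probability assignment (BPA) over intervals rather than a single distribution, the probability of failure is bounded by a belief/plausibility pair instead of a point value. The limit state, focal intervals, and masses below are illustrative, not the paper's examples.

```python
# Evidence-theory bounds on a probability of failure. Failure is load >
# capacity; the load is known only through a BPA over intervals. Belief counts
# focal intervals that surely fail, plausibility those that possibly fail.
# All numbers below are illustrative.

def failure_bounds(capacity: float, bpa: dict) -> tuple:
    """bpa maps (lo, hi) load intervals to masses that sum to 1."""
    belief = sum(m for (lo, hi), m in bpa.items() if lo > capacity)  # certain failure
    plaus = sum(m for (lo, hi), m in bpa.items() if hi > capacity)   # possible failure
    return belief, plaus

bpa = {(80, 95): 0.5, (90, 110): 0.3, (105, 120): 0.2}
print(failure_bounds(100.0, bpa))
```

The gap between belief and plausibility is the signature of epistemic uncertainty: more knowledge shrinks the focal intervals and narrows the bounds, whereas a purely probabilistic method would collapse them to a single, possibly overconfident, failure probability.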