• Title/Summary/Keyword: Binary Code Analysis


Program Slicing for Binary Code Deobfuscation (역난독화를 위한 바이너리 프로그램 슬라이싱)

  • Mok, Seong-Kyun; Jeon, Hyeon-gu; Cho, Eun-Sun
    • Journal of the Korea Institute of Information Security & Cryptology, v.27 no.1, pp.59-66, 2017
  • Hackers obfuscate their malware to hinder analysis. Recent obfuscation tools apply virtualization-based obfuscation, translating the original code into bytecode that is executed by a virtual machine; in such cases, analysts cannot learn anything about the malware before the code is executed. We found program slicing to be one of the most promising program analysis techniques for this problem. The main concepts of program slicing are the slicing criteria given by the analyst and the statements sliced according to those criteria. This paper proposes a deobfuscation method based on the program slicing technique.
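
A minimal backward-slicing sketch (in Python, over a toy def/use listing rather than real binary code or the paper's bytecode representation) illustrates the two concepts the abstract names: a slicing criterion and the statements sliced with respect to it.

```python
# Minimal backward-slicing sketch on a toy three-address-code listing.
# The instruction format and variable names are illustrative, not the
# paper's actual intermediate representation.

def backward_slice(program, criterion_line, criterion_vars):
    """Return the set of line numbers whose statements may affect
    the values of criterion_vars at criterion_line."""
    relevant = set(criterion_vars)
    sliced = set()
    # Walk the listing backwards from the slicing criterion.
    for line_no in range(criterion_line, -1, -1):
        defs, uses = program[line_no]
        if defs & relevant:
            sliced.add(line_no)
            relevant = (relevant - defs) | uses
    return sliced

# Each statement is (defined-variables, used-variables).
toy_program = [
    ({"a"}, set()),        # 0: a = input()
    ({"b"}, set()),        # 1: b = 10
    ({"c"}, {"a"}),        # 2: c = a * 2
    ({"b"}, {"b"}),        # 3: b = b + 1   (irrelevant to c)
    ({"d"}, {"c"}),        # 4: d = c - 5
]

print(sorted(backward_slice(toy_program, 4, {"d"})))  # -> [0, 2, 4]
```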

Computing Method of Cross-Correlation of Non-Linear Sequences Using Subfield (부분체를 이용한 비선형 수열의 상호상관관계의 효율적인 계산방법)

  • Choi, Un-Sook; Cho, Sung-Jin; Kim, Seok-Tae
    • Journal of the Korea Institute of Information and Communication Engineering, v.16 no.8, pp.1686-1692, 2012
  • Spreading sequences play an important role in wireless communications, such as CDMA (code division multiple access) and multi-carrier spread spectrum systems. In a direct-sequence spread spectrum system, spreading sequences with low cross-correlation help to minimize multiple access interference and increase the security of the system. Analyzing the cross-correlations between sequences is a necessary step in sequence design, but it requires a great deal of computing time. In this paper we propose a method that computes the cross-correlation between nonlinear binary sequences efficiently by using a subfield.
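
For reference, the quantity being computed is the periodic cross-correlation of two ±1-valued sequences. The sketch below shows only this textbook definition, not the paper's subfield-based acceleration.

```python
# Brute-force periodic cross-correlation of two binary (+1/-1) sequences:
# C_{x,y}(tau) = sum_i x[i] * y[(i + tau) mod N].
# The paper's subfield-based speed-up over GF(2^n) is not reproduced here.

def cross_correlation(x, y, tau):
    n = len(x)
    return sum(x[i] * y[(i + tau) % n] for i in range(n))

# Example: a period-7 sequence with ideal autocorrelation and a cyclic
# shift of itself; the correlation peaks only at the matching shift.
x = [1, 1, 1, -1, 1, -1, -1]
y = x[2:] + x[:2]
print([cross_correlation(x, y, t) for t in range(7)])
```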

A Real-time Vehicle Localization Algorithm for Autonomous Parking System (자율 주차 시스템을 위한 실시간 차량 추출 알고리즘)

  • Hahn, Jong-Woo; Choi, Young-Kyu
    • Journal of the Semiconductor & Display Technology, v.10 no.2, pp.31-38, 2011
  • This paper introduces a video-based traffic monitoring system for detecting vehicles and obstacles on the road. To segment moving objects from the image sequence, we adopt a background subtraction algorithm based on local binary patterns (LBP). LBP-based texture analysis techniques have recently become popular tools for various machine vision applications such as face recognition and object classification. In this paper, we adopt an extension of LBP, called the Diagonal LBP (DLBP), to handle the background subtraction problem arising in vision-based autonomous parking systems; it halves the LBP code length and drastically reduces the computational complexity. Edge-based shadow removal and a blob merging procedure are also applied to the foreground blobs, and a pose estimation technique is used to calculate the position and heading angle of the moving object precisely. Experimental results show that our system works well for real-time vehicle localization and tracking applications.
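
A short sketch of the plain 3x3 LBP operator that DLBP extends; the diagonal variant and its halved code length are specific to the paper and are not reproduced here.

```python
# Plain 3x3 local binary pattern (LBP) code for one pixel: each of the
# eight neighbors contributes one bit depending on whether it is >= the
# center pixel.

def lbp_code(img, r, c):
    center = img[r][c]
    # Clockwise neighbor offsets starting at the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= center:
            code |= 1 << bit
    return code

patch = [
    [52, 60, 55],
    [58, 57, 61],
    [49, 63, 50],
]
print(format(lbp_code(patch, 1, 1), "08b"))
```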

WACFI: Code Instrumentation Technique for Protection of Indirect Call in WebAssembly (WACFI: 웹 어셈블리에서의 간접호출 명령어 보호를 위한 코드 계측 기술)

  • Chang, Yoonsoo; Kim, Youngju; Kwon, Donghyun
    • Journal of the Korea Institute of Information Security & Cryptology, v.31 no.4, pp.753-762, 2021
  • WebAssembly (WASM) is a low-level instruction format that can be run in a web environment. Because WASM offers excellent performance, various web applications use it. However, according to our security analysis, WASM has a security pitfall related to control-flow integrity (CFI) for indirect calls. To address this problem, in this paper we propose a new code instrumentation scheme that protects indirect calls, named WACFI. Specifically, WACFI enhances a CFI technique for indirect calls in WASM based on source code analysis and binary instrumentation. To test the feasibility of WACFI, we applied it to a sound-encoding application. According to our experimental results, WACFI adds only 2.75% execution-time overhead while protecting indirect calls safely.
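
The following conceptual sketch (plain Python with made-up table entries and signature strings, not WACFI's actual WebAssembly instrumentation) shows the kind of check a CFI scheme for indirect calls performs: the call site's expected signature is validated against the entry actually stored in the function table before dispatch.

```python
# Hypothetical function table: index -> (function name, signature string).
FUNC_TABLE = {0: ("encode", "i32->i32"), 1: ("free_buf", "i32->none")}

def checked_indirect_call(index, expected_sig, args):
    """Dispatch through the table only if the target's signature matches."""
    name, sig = FUNC_TABLE.get(index, (None, None))
    if sig != expected_sig:
        raise RuntimeError(f"CFI violation: table[{index}] has signature {sig}, "
                           f"call site expects {expected_sig}")
    print(f"dispatching {name}{args}")

checked_indirect_call(0, "i32->i32", (42,))      # allowed
try:
    checked_indirect_call(1, "i32->i32", (42,))  # signature mismatch -> blocked
except RuntimeError as e:
    print(e)
```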

PHOTOMETRIC SOLUTIONS OF W UMA TYPE STARS: GSC2576-0319 AND GSC2584-1731 (W UMa형 식쌍성 GSC2576-0319와 GSC2584-1731의 측광해)

  • Lee, Chung-Uk; Lee, Jae-Woo; Jin, Ho; Kim, Chun-Hwey
    • Journal of Astronomy and Space Sciences, v.23 no.4, pp.311-318, 2006
  • High-precision photometric observations were performed in the BVI bandpasses with a 1 m robotic telescope at Mt. Lemmon Observatory for two binary stars that were reclassified as W UMa-type systems from ROTSE (Robotic Optical Transient Search Experiment) follow-up observations and show peculiar light variations. In order to analyze W UMa-type eclipsing binaries systematically, we constructed a light-curve analysis script using the 2005 version of the Wilson-Devinney binary code. The orbital inclinations of GSC2584-1731 and GSC2576-0319 obtained from the light-curve analysis are 43.5° and 57.6°, respectively. A spot model is applied to explain the asymmetric light curve of GSC2584-1731, and the spot parameters are derived.

Authorship Attribution Framework Using Survival Network Concept : Semantic Features and Tolerances (서바이벌 네트워크 개념을 이용한 저자 식별 프레임워크: 의미론적 특징과 특징 허용 범위)

  • Hwang, Cheol-Hun; Shin, Gun-Yoon; Kim, Dong-Wook; Han, Myung-Mook
    • Journal of the Korea Institute of Information Security & Cryptology, v.30 no.6, pp.1013-1021, 2020
  • Malware authorship attribution is a research field that identifies malware by comparing the author characteristics of unknown malware with those of known malware authors. Binary-based authorship attribution has the advantage that target malware is easy to collect and analyze, but the range of usable features is limited compared to source-code-based methods, and this limitation causes accuracy to drop as the number of authors grows. To overcome these limitations, this study proposes defining semantic features from binaries and defining allowable ranges for redundant features using the concept of a survival network. The proposed method defines opcode-based graph features from the binary information and, using the survival-network concept, defines an allowable range for selecting features unique to each author. This unifies per-author feature definition and feature selection into a single technique, and the experiments confirmed an accuracy improvement of 5.0% over the previous study, reaching the same level of accuracy as source-code-based analysis.
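
A toy illustration of opcode-derived features: counting consecutive opcode pairs from a disassembled function. The paper's opcode-based graph features and survival-network tolerance selection are considerably richer; this only conveys the general idea, and the opcode sample is made up.

```python
# Toy opcode-bigram feature extraction from a disassembled function.
from collections import Counter

def opcode_bigrams(opcodes):
    """Count consecutive opcode pairs as simple authorship features."""
    return Counter(zip(opcodes, opcodes[1:]))

sample = ["push", "mov", "xor", "mov", "call", "mov", "pop", "ret"]
for pair, count in opcode_bigrams(sample).most_common(3):
    print(pair, count)
```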

Analysis of Turbo Coding and Decoding Algorithm for DVB-RCS Next Generation (DVB-RCS Next Generation을 위한 터보 부복호화 방식 분석)

  • Kim, Min-Hyuk; Park, Tae-Doo; Lim, Byeong-Su; Lee, In-Ki; Oh, Deock-Gil; Jung, Ji-Won
    • The Journal of Korean Institute of Communications and Information Sciences, v.36 no.9C, pp.537-545, 2011
  • This paper analyzes the performance of the three-dimensional turbo codes and turbo Φ codes proposed for next-generation DVB-RCS systems. For the turbo Φ codes, we propose optimal permutation and puncturing patterns for triple-binary input data; for the three-dimensional turbo codes, we propose optimal post-encoder types and an interleaving algorithm. Based on these optimal parameters, we simulated both codes and confirmed that the turbo Φ codes outperform the three-dimensional turbo codes, although the turbo Φ codes are about 18% more complex.
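
A small illustration of what puncturing does in general: dropping parity bits according to a fixed pattern to raise the code rate. The pattern below is arbitrary; the optimal triple-binary patterns derived in the paper are not reproduced.

```python
# Illustrative puncturing of a parity bit stream to raise the code rate.

def puncture(bits, pattern):
    """Keep bits[i] only where pattern[i % len(pattern)] == 1."""
    return [b for i, b in enumerate(bits) if pattern[i % len(pattern)] == 1]

parity = [1, 0, 1, 1, 0, 0, 1, 0]
pattern = [1, 0]                    # drop every second parity bit
print(puncture(parity, pattern))    # -> [1, 1, 0, 1]
```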

A Study on Development of a GIS based Post-processing System of the EFDC Model for Supporting Water Quality Management (수질관리 지원을 위한 GIS기반의 EFDC 모델 후처리 시스템 개발 연구)

  • Lee, Geon Hwi; Kim, Kye Hyun; Park, Yong Gil; Lee, Sung Joo
    • Spatial Information Research, v.22 no.4, pp.39-47, 2014
  • The Yeongsan river estuary has a serious water quality problem due to water stagnation, and it is imperative to predict changes in water quality in order to mitigate water pollution. The EFDC (Environmental Fluid Dynamics Code) model has mainly been used to predict such changes for the estuary, and an EFDC run normally produces a large volume of output. To check the spatial distribution of the modeling results, post-processing of the output is a prerequisite, and the main post-processing program is EFDC_Explorer. However, EFDC_Explorer only shows the spatial distribution of the time series and does not support overlaying other thematic maps, so the results cannot be combined with various GIS data or used in higher-dimensional analyses. Therefore, this study aims to develop a post-processing system that turns EFDC output into GIS layers. To achieve this, we developed a module for editing the main input files, a module for converting the binary output into ASCII format, a module for converting it into a layer format usable in a GIS environment, and a module for efficiently visualizing the reconfigured model results. With the developed system, the result files can be converted automatically into GIS-based layers and used for water quality management.
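
A generic sketch of the binary-to-ASCII conversion step: reading a flat array of 32-bit floats and writing row/column/value records that a GIS can ingest. The real EFDC output layout differs, and the file names and grid size here are made up.

```python
import struct

def binary_to_csv(bin_path, csv_path, n_cols):
    """Read a flat little-endian float32 array and write it as CSV rows."""
    with open(bin_path, "rb") as f:
        raw = f.read()
    values = struct.unpack(f"<{len(raw) // 4}f", raw)
    with open(csv_path, "w") as out:
        out.write("row,col,value\n")
        for i, v in enumerate(values):
            out.write(f"{i // n_cols},{i % n_cols},{v}\n")

# Hypothetical usage:
# binary_to_csv("efdc_wq_output.bin", "wq_layer.csv", n_cols=100)
```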

An automated memory error detection technique using source code analysis in C programs (C언어 기반 프로그램의 소스코드 분석을 이용한 메모리 접근오류 자동검출 기법)

  • Cho, Dae-Wan; Oh, Seung-Uk; Kim, Hyeon-Soo
    • The KIPS Transactions: Part D, v.14D no.6, pp.675-688, 2007
  • Memory access errors occur frequently in C programs. A number of tools and research efforts have tried to detect such errors automatically, but they suffer from one or more of the following problems: inability to detect all memory errors, changes to the memory allocation mechanism, incompatibility with libraries, and excessive performance overhead. In this paper we suggest a new method that addresses these problems, and we present a comparison with previous work through experiments. Our approach consists of two phases: first, the source code is transformed at compile time by inserting instrumentation; second, memory errors are detected at run time using a bitmap that maintains information about memory allocation. By using source code analysis, our approach improves error detection over binary-code-analysis-based approaches and enhances performance in terms of both space and time. In addition, it remains compatible with shared libraries and does not need to modify the memory allocation mechanism.
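
A minimal sketch of the bitmap idea: one shadow bit per byte of a toy address space, set on allocation and checked on every instrumented access. This is not the paper's actual instrumentation, only the underlying bookkeeping.

```python
# Shadow bitmap for tracking allocated memory in a toy 1 KiB address space.
ADDRESS_SPACE = 1024
shadow = bytearray(ADDRESS_SPACE // 8)   # one bit per byte of memory

def mark(addr, size, allocated):
    """Set or clear the allocation bits for [addr, addr + size)."""
    for a in range(addr, addr + size):
        if allocated:
            shadow[a // 8] |= 1 << (a % 8)
        else:
            shadow[a // 8] &= ~(1 << (a % 8))

def check_access(addr, size):
    """Instrumented load/store check: every touched byte must be allocated."""
    for a in range(addr, addr + size):
        if not shadow[a // 8] & (1 << (a % 8)):
            raise RuntimeError(f"invalid access at address {a}")

mark(100, 16, allocated=True)   # simulate malloc(16) at address 100
check_access(104, 4)            # in-bounds access: ok
try:
    check_access(114, 4)        # runs past the 16-byte block -> error
except RuntimeError as e:
    print(e)
```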

Technology Analysis on Automatic Detection and Defense of SW Vulnerabilities (SW 보안 취약점 자동 탐색 및 대응 기술 분석)

  • Oh, Sang-Hwan; Kim, Tae-Eun; Kim, HwanKuk
    • Journal of the Korea Academia-Industrial cooperation Society, v.18 no.11, pp.94-103, 2017
  • As automated hacking tools and techniques have improved, the number of new vulnerabilities has increased. About 80,000 CVEs were registered from 2010 to 2015, and even more vulnerabilities are expected to be reported. In most cases, patching a vulnerability depends on the developers' capability, and most patching is based on manual analysis, which takes nine months on average: finding the vulnerability, analyzing it based on the source code, and writing new code for the patch. Zero-day vulnerabilities are critical because this gap between first discovery and action is so long. To solve the problem, techniques for automatically detecting and analyzing software (SW) vulnerabilities have recently been proposed. The Cyber Grand Challenge (CGC) held in 2016 was the first competition to create automatic defensive systems capable of reasoning over flaws in binaries and formulating patches without direct analysis by experts. Darktrace and Cylance are similar projects that manage SW automatically with artificial intelligence and machine learning. Although many foreign commercial institutions and academic groups run projects for automatic binary analysis, the domestic level of technology is much lower. This paper studies the development of automatic detection of SW vulnerabilities and defenses against them; we analyze and compare related work and tools, and suggest optimal techniques for automatic analysis.