• Title/Summary/Keyword: Separate file

A Study on the Improvement Method of Deleted Record Recovery in MySQL InnoDB (MySQL InnoDB의 삭제된 레코드 복구 기법 개선방안에 관한 연구)

  • Jung, Sung Kyun;Jang, Jee Won;Jeoung, Doo Won;Lee, Sang Jin
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.6 no.12
    • /
    • pp.487-496
    • /
    • 2017
  • In MySQL InnoDB, data can be stored in two ways: each table can be given its own tablespace file, or all table and index information can be stored in a single system tablespace. This information can be used to recover deleted records. However, most current database forensic studies actively research and analyze the structure of the former, while the latter has not been analyzed enough to be used for forensics. Because their storage structures differ, both approaches must be analyzed from a database forensics perspective. In this paper, we propose a method for recovering deleted records stored in the IBDATA file, the single system tablespace. First, we analyze the IBDATA file and reveal its structure. We then introduce a deleted-record recovery algorithm extended to the unallocated page area, which previous work did not consider. In addition, by implementing the algorithm as a tool and verifying it with real data, we show that the recovery rate improves by up to 68% compared with the existing method.
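A rough sketch of the page-walking step (our illustration, not the paper's tool): InnoDB stores data in fixed-size 16 KB pages, and the page-type field of the FIL header (2 bytes at offset 24, big-endian) identifies B-tree index pages (type 0x45BF), so a recovery pass over an IBDATA file can enumerate every page, including unallocated ones that may still hold deleted records.

```python
import struct

PAGE_SIZE = 16 * 1024          # default InnoDB page size
FIL_PAGE_INDEX = 17855         # 0x45BF, B-tree index page type

def scan_index_pages(path):
    """Yield (page_no, page_type) for every page in a tablespace file.

    A simplified sketch of the first analysis step: walking an IBDATA
    file page by page and reading the FIL header's page-type field
    (2 bytes at offset 24) to locate B-tree index pages, including
    unallocated ones that may still hold deleted records.
    """
    with open(path, "rb") as f:
        page_no = 0
        while True:
            page = f.read(PAGE_SIZE)
            if len(page) < PAGE_SIZE:
                break
            (page_type,) = struct.unpack(">H", page[24:26])
            yield page_no, page_type
            page_no += 1

def count_index_pages(path):
    return sum(1 for _, t in scan_index_pages(path) if t == FIL_PAGE_INDEX)
```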

Reverse Engineering of 3D Compound Shapes using Delaunay Triangulation (Delaunay 삼각형분할법을 이용한 3차원복합형상의 역공학)

  • 조승현;조명우;김재도
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.17 no.7
    • /
    • pp.181-188
    • /
    • 2000
  • The objective of this research is to develop an efficient reverse engineering method for 3-dimensional compound surfaces for the rapid prototyping process. As a first step, several image processing techniques were applied to the measured data obtained using a laser scanner, and the boundary information of the compound surface was extracted to divide the surface into several separate regions. Next, the Delaunay triangulation method was applied to reconstruct the surface based on the measured data and the boundary information. Finally, an STL file was created for the rapid prototyping process. The required simulations and experiments were performed, and the results were analyzed to show the effectiveness of the proposed methods.
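The triangulation-to-STL step can be sketched as follows (a minimal illustration, not the authors' implementation, assuming the scanned surface can be projected onto the xy-plane and using SciPy's Delaunay routine as a stand-in):

```python
import numpy as np
from scipy.spatial import Delaunay

def points_to_stl(points, path, name="surface"):
    """Triangulate (N, 3) points projected on the xy-plane; write ASCII STL."""
    pts = np.asarray(points, dtype=float)
    tri = Delaunay(pts[:, :2])          # 2D Delaunay on projected points
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for a, b, c in tri.simplices:
            # Facet normal from the triangle's edge vectors
            n = np.cross(pts[b] - pts[a], pts[c] - pts[a])
            n /= np.linalg.norm(n) or 1.0
            f.write(f"  facet normal {n[0]} {n[1]} {n[2]}\n    outer loop\n")
            for v in (pts[a], pts[b], pts[c]):
                f.write(f"      vertex {v[0]} {v[1]} {v[2]}\n")
            f.write("    endloop\n  endfacet\n")
        f.write(f"endsolid {name}\n")
```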


Android Application for Connecting Cycling Routes on Strava Segments

  • Mulasastra, Intiraporn;Kao-ian, Wichpong
    • Journal of information and communication convergence engineering
    • /
    • v.17 no.2
    • /
    • pp.142-148
    • /
    • 2019
  • Relatively few countries provide separate bicycle lanes for cyclists. Hence, tools for suggesting cycling routes are essential for a safe and pleasant cycling experience. This study aims to develop a mobile application to build cycling routes based on user preferences, specifically location, search radius, ride distance, and number of optimal routes. Our application calls the Strava API to retrieve Strava cycling segments crowdsourced from the cycling community. Then, it creates a graph consisting of the start and end points of these segments. Beginning from a user-specified location, the depth-first search algorithm (DFS) is applied to find routes that conform to the user's preferences. Next, a set of optimal routes is obtained by computing a trade-off ratio for every discovered route. This ratio is calculated from the lengths of all segments and the lengths of all connecting paths. The connected routes can be displayed on a map on an Android device or exported as a GPX file to a bike computer. Future work must be performed to improve the design of the user interface and user experience.
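The search described above can be sketched as follows (our illustration, not the authors' app; the "trade-off ratio" here is assumed to be the ratio of total segment length to total connecting-path length):

```python
from collections import defaultdict

def build_graph(segments):
    """segments: iterable of (start, end, length, is_strava_segment)."""
    graph = defaultdict(list)
    for start, end, length, is_seg in segments:
        graph[start].append((end, length, is_seg))
        graph[end].append((start, length, is_seg))
    return graph

def find_routes(graph, origin, max_dist):
    """DFS from origin, returning (path, distance, ratio) within max_dist."""
    routes = []
    def dfs(node, path, dist, seg_len, conn_len):
        if len(path) > 1:
            ratio = seg_len / conn_len if conn_len else float("inf")
            routes.append((path[:], dist, ratio))
        for nxt, length, is_seg in graph[node]:
            if nxt in path or dist + length > max_dist:
                continue                       # no revisits, respect the budget
            path.append(nxt)
            dfs(nxt, path, dist + length,
                seg_len + (length if is_seg else 0),
                conn_len + (0 if is_seg else length))
            path.pop()
    dfs(origin, [origin], 0.0, 0.0, 0.0)
    return sorted(routes, key=lambda r: -r[2])  # best trade-off first
```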

Design an Indexing Structure System Based on Apache Hadoop in Wireless Sensor Network

  • Keo, Kongkea;Chung, Yeongjee
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2013.05a
    • /
    • pp.45-48
    • /
    • 2013
  • In this paper, we propose an Indexing Structure System (ISS) based on Apache Hadoop in a Wireless Sensor Network (WSN). Sensor data keep growing continuously and need to be managed; they are constantly updated in order to provide the newest information to users, and as they grow, data retrieval and storage face several challenges. By using the ISS, we can maximize processing quality and minimize data retrieval time. To design the ISS, indexing types first have to be defined for each sensor type. After this identification, each sensor's data go through Indexing Structure Processing (ISP) in order to be indexed. After ISP, the indexed data are streamed and stored in the Hadoop Distributed File System (HDFS) across a number of separate machines. The indexed data are split and processed by MapReduce tasks, which sort and group them by sensor data object category. Thus, when users send requests, the queries are filtered by sensor data object and the tasks are managed by the MapReduce processing framework.
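The sort-and-group step can be illustrated with a minimal, self-contained sketch (not the paper's Hadoop code): map each reading to a (sensor type, value) pair, then sort and group by key as a MapReduce shuffle/reduce phase would.

```python
from itertools import groupby
from operator import itemgetter

def map_phase(readings):
    """readings: iterable of dicts with 'sensor_type' and 'value' keys."""
    return [(r["sensor_type"], r["value"]) for r in readings]

def reduce_phase(pairs):
    """Sort by key and group values, as a MapReduce shuffle/reduce would."""
    pairs = sorted(pairs, key=itemgetter(0))
    return {k: [v for _, v in grp] for k, grp in groupby(pairs, key=itemgetter(0))}
```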

Development of Android App for Supporting Smooth Multimedia Streaming Service Using Frame Buffer (프레임 버퍼를 이용한 매끄러운 멀티미디어 스트리밍 서비스를 지원하는 안드로이드 앱 개발)

  • Seo, Sang-min;Kwon, Jonnho;Choi, Yoon-Ho
    • Journal of Internet Computing and Services
    • /
    • v.17 no.1
    • /
    • pp.55-64
    • /
    • 2016
  • Existing Android applications for streaming video in real time depend on the codec that performs the encoding and on the version of the Android operating system. Also, for streaming video in real time, most applications must be connected to a separate desktop PC. To overcome these disadvantages, we propose a new application that records and streams video in real time. Specifically, the proposed application uses the Flash Video file format, a common media file format supported by various versions of the Android operating system. Through experiments, we show that the proposed application can record video screens at more than 20 frames per second and stream them in real time while using existing video encoding methods.
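As a small illustration of the container format the application relies on (our sketch, not the authors' code), a Flash Video stream begins with a fixed 9-byte header followed by a zero PreviousTagSize field:

```python
import struct

def flv_header(has_video=True, has_audio=True):
    """Return the 9-byte FLV header plus the first PreviousTagSize field.

    Per the FLV spec: signature "FLV", version 1, a flags byte
    (bit 0 = video present, bit 2 = audio present), a 4-byte data
    offset (9 for version 1), then PreviousTagSize0 = 0.
    """
    flags = (0x01 if has_video else 0) | (0x04 if has_audio else 0)
    header = b"FLV" + bytes([0x01, flags]) + struct.pack(">I", 9)
    return header + struct.pack(">I", 0)   # PreviousTagSize0 = 0
```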

Development and Validation of Exposure Models for Construction Industry: Tier 2 Model (건설업 유해화학물질 노출 모델의 개발 및 검증: Tier-2 노출 모델)

  • Kim, Seung Won;Jang, Jiyoung;Kim, Gab Bae
    • Journal of Korean Society of Occupational and Environmental Hygiene
    • /
    • v.24 no.2
    • /
    • pp.219-228
    • /
    • 2014
  • Objectives: The major objective of this study was to develop a tier 2 exposure model that combines tier 1 exposure model estimates with worker monitoring data and suggests narrower exposure ranges than the tier 1 results. Methods: Bayesian statistics were used to develop the tier 2 exposure model, as was done for the European Union (EU) tier 2 exposure models, for example the Advanced REACH Tool (ART) and Stoffenmanager. Bayesian statistics require a prior and data to calculate the posterior results. In this model, the tier 1 estimates served as the prior, and worker exposure monitoring data at the worksite of interest were entered as the data. The Bayesian calculation requires integration over a range, which was performed using a Riemann sum algorithm. From the calculated exposure estimates, the 95% range was extracted. The algorithm was implemented in an Excel spreadsheet for convenience and easy access, with some fail-proof features, such as locking the spreadsheet, added in order to prevent errors or miscalculations caused by careless use of the file. Results: The tier 2 exposure model was successfully built on a separate Excel spreadsheet in the same file containing the tier 1 exposure model. To utilize the model, an exposure range estimated by the tier 1 model and at least one worker monitoring measurement are required. Conclusions: The developed tier 2 exposure model can help industrial hygienists obtain a narrow range of worker exposure levels to a chemical by reflecting a certain set of job characteristics.
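The Bayesian update can be sketched numerically as follows (our illustration, not the spreadsheet itself; the normal prior on log-transformed exposure and the fixed measurement sigma are simplifying assumptions):

```python
import numpy as np

def tier2_posterior(prior_lo, prior_hi, measurements, sigma=0.5, n=2001):
    """prior_lo/prior_hi: tier 1 95% range (same units as measurements).

    The tier 1 range defines a normal prior on the log mean exposure;
    worker measurements supply a lognormal likelihood; the posterior is
    normalized with a Riemann sum and its central 95% range returned.
    """
    logs = np.log(np.asarray(measurements, dtype=float))
    mu0 = (np.log(prior_lo) + np.log(prior_hi)) / 2       # prior mean (log)
    sd0 = (np.log(prior_hi) - np.log(prior_lo)) / (2 * 1.96)
    grid = np.linspace(mu0 - 5 * sd0, mu0 + 5 * sd0, n)
    dx = grid[1] - grid[0]
    prior = np.exp(-0.5 * ((grid - mu0) / sd0) ** 2)
    like = np.exp(-0.5 * ((logs[:, None] - grid) / sigma) ** 2).prod(axis=0)
    post = prior * like
    post /= post.sum() * dx                               # Riemann-sum normalization
    cdf = np.cumsum(post) * dx
    lo = np.exp(grid[np.searchsorted(cdf, 0.025)])
    hi = np.exp(grid[np.searchsorted(cdf, 0.975)])
    return lo, hi
```

With data near the middle of the prior range, the posterior interval sits inside and is narrower than the tier 1 range, which is the narrowing effect the model is built for.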

DETECTING VARIABILITY IN ASTRONOMICAL TIME SERIES DATA: APPLICATIONS OF CLUSTERING METHODS IN CLOUD COMPUTING ENVIRONMENTS

  • Shin, Min-Su;Byun, Yong-Ik;Chang, Seo-Won;Kim, Dae-Won;Kim, Myung-Jin;Lee, Dong-Wook;Ham, Jae-Gyoon;Jung, Yong-Hwan;Yoon, Jun-Weon;Kwak, Jae-Hyuck;Kim, Joo-Hyun
    • The Bulletin of The Korean Astronomical Society
    • /
    • v.36 no.2
    • /
    • pp.131.1-131.1
    • /
    • 2011
  • We present applications of clustering methods to detect variability in massive astronomical time series data. Focusing on the variability of bright stars, we use clustering methods to separate possible variable sources from other time series data, which include intrinsically non-variable sources and data with common systematic patterns. We have already finished the analysis of the Northern Sky Variability Survey data, which include about 16 million light curves, and present candidate variable sources with their associations to other data at different wavelengths. We also apply our clustering method to the light curves of bright objects in the SuperWASP Data Release 1. For the analysis of the SuperWASP data, we exploit an elastically configurable cloud computing environment that the KISTI Supercomputing Center is deploying. Two quite different configurations are incorporated in our cloud computing test bed: one system uses Hadoop distributed processing with its distributed file system, exploiting the data locality condition, while the other adopts Condor and the Lustre network file system. We present test results, considering the performance of processing a large number of light curves and of finding clusters of variable and non-variable objects.
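The separation step can be illustrated with a minimal sketch (our illustration, not the authors' pipeline): reduce each light curve to simple variability features and split the sample into two clusters with a small k-means.

```python
import numpy as np

def features(light_curves):
    """light_curves: (N, T) magnitudes -> (N, 2) [std, robust amplitude]."""
    lc = np.asarray(light_curves, dtype=float)
    std = lc.std(axis=1)
    amp = np.percentile(lc, 95, axis=1) - np.percentile(lc, 5, axis=1)
    return np.column_stack([std, amp])

def kmeans2(X, iters=20):
    """Two-cluster k-means seeded at the least and most variable points."""
    centers = np.stack([X[X[:, 0].argmin()], X[X[:, 0].argmax()]]).copy()
    for _ in range(iters):
        # Assign each point to the nearest center, then recompute means
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for k in (0, 1):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(axis=0)
    return labels          # cluster 1 = candidate variable sources
```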


Implementation and Design of WISD(Web Interface System based DICOM) for Efficient Sharing of Medical Information between Clinics (의료기관간 효과적인 의료정보 공유를 위한 WISD의 설계 및 구현)

  • Cho, Ik-Sung;Kwon, Hyeog-Soong
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.12 no.3
    • /
    • pp.500-508
    • /
    • 2008
  • For efficient interoperability between medical clinics, medical information has to be built on standardized protocols such as HL7 for text data and DICOM for image data. However, it is difficult to exchange information between medical clinics because their systems and software differ, as do their data structures and code types. Therefore, we analyze the structure of the DICOM file and design an integrated database for effective information sharing and exchange. The WISD system suggested in this paper separates the DICOM files transmitted by medical clinics into text data and image data and stores each in the integrated database (DB) according to the standardized protocol. With the suggested system, each medical clinic can efficiently search for and exchange information through a web browser. The WISD system can not only search and manage image data and patient information through the integrated database and the Internet, but also share medical information without the extra cost of constructing a new system.
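As a small illustration of the first step of handling incoming files (our sketch, not the WISD code), a DICOM file can be recognized by its 128-byte preamble followed by the "DICM" magic bytes before its elements are separated into text and image data:

```python
def is_dicom(path):
    """True when the file has the DICOM 128-byte preamble + "DICM" magic."""
    with open(path, "rb") as f:
        f.seek(128)              # skip the 128-byte preamble
        return f.read(4) == b"DICM"
```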

Androfilter: Android Malware Filter using Valid Market Data (Androfilter: 유효마켓데이터를 이용한 안드로이드 악성코드 필터)

  • Yang, Wonwoo;Kim, Jihye
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.25 no.6
    • /
    • pp.1341-1351
    • /
    • 2015
  • As the popularization of smartphones increases the number of applications, the number of malicious applications also grows rapidly through third-party app markets and black markets. This paper suggests an investigation filter, Androfilter, that effectively detects the fabrication of APK files. Whereas most antivirus software uses a separate server to collect, analyze, and update malicious applications, Androfilter assumes Google Play to be the trusted party and verifies the integrity of an application through a simple query to Google Play. Experiment results show that Androfilter blocks brand-new malicious applications that have not yet been reported, as well as known malicious applications.
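The integrity check can be sketched as follows (our illustration; `market_digest`, the trusted hash of the market copy, is a hypothetical input, not a real Google Play API response):

```python
import hashlib

def apk_digest(path, chunk=1 << 16):
    """SHA-256 of the local APK file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def is_genuine(apk_path, market_digest):
    """True when the local APK matches the trusted market copy's digest."""
    return apk_digest(apk_path) == market_digest
```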

The Relocation Effect of Observation Station on the Homogeneity of Seasonal Mean of Diurnal Temperature Range (기상관측소의 이전이 계절평균 일교차의 균질성에 미치는 영향)

  • Kim, Ji-Hyun;Suh, Myoung-Seok;Hong, Soon-Hee
    • Atmosphere
    • /
    • v.20 no.4
    • /
    • pp.437-449
    • /
    • 2010
  • The relocation effect of observation station (REOS) on the homogeneity of the seasonal means of maximum and minimum temperature, diurnal temperature range (DTR), and relative humidity (RH) was investigated using surface observation data and document files. Twelve stations were selected from among the 60 stations that have been operated for more than 30 years and relocated at least once. The data from the Chunpungryeong station were used as a reference to separate the impacts of station relocation from the effects caused by increased greenhouse gases, urbanization, and other factors. The REOS was calculated as the difference between the values at the relocated station and those at the reference station. Although the REOS clearly depends on the season, the meteorological element, and the observing station, statistically significant impacts are found at many stations, especially where the environment of the observing station changed greatly after relocation. As a result, the homogeneity of the seasonal means of the meteorological elements, especially DTR and RH, is greatly reduced. The results show that the REOS, along with the effect of urbanization, should be eliminated for a proper estimation of climate change from the analysis of long-term observation data.
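The comparison against the reference station can be sketched as follows (our illustration, not the authors' analysis code): form the difference series between the relocated and reference stations, then compare its mean before and after the relocation date to estimate the relocation effect.

```python
def relocation_effect(station, reference, relocation_index):
    """station, reference: equal-length value series; returns the mean shift
    of the station-minus-reference difference series across the relocation."""
    diff = [s - r for s, r in zip(station, reference)]
    before = diff[:relocation_index]
    after = diff[relocation_index:]
    return sum(after) / len(after) - sum(before) / len(before)
```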