• Title/Summary/Keyword: File Database

HBase based Business Process Event Log Schema Design of Hadoop Framework

  • Ham, Seonghun;Ahn, Hyun;Kim, Kwanghoon Pio
    • Journal of Internet Computing and Services
    • /
    • v.20 no.5
    • /
    • pp.49-55
    • /
    • 2019
  • Organizations design and operate business process models to achieve their goals efficiently and systematically. With the advancement of IT technology, the number of items in which computer systems can participate has grown, and processes have become huge and complicated. This phenomenon has created more complex and subdivided business process flows. Process instances, which contain workcases and events, are larger and carry more data. The event log is an essential resource for process mining and is used directly in model discovery, analysis, and process improvement. As event logs grow bigger and broader, problems such as capacity management and I/O load arise when they are managed as conventional flat files or through a relational database. In this paper, we identify the management limits of plain files and relational databases as the event log becomes big data, and we design and apply schemas to archive and analyze large event logs through Hadoop, an open-source distributed file system framework, and HBase, a NoSQL database system.
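
  An HBase archiving scheme like the one described hinges on row-key design: HBase sorts rows lexicographically by key, so a composite key can keep all events of one process instance contiguous. A minimal sketch in Python; the key layout and every name here are illustrative assumptions, not the paper's actual schema:

```python
# Sketch of a composite HBase row key for business process event logs.
# Assumption: events are keyed by (model id, instance id, reversed timestamp)
# so one instance's events are contiguous and the newest sorts first.

MAX_TS = 10**13 - 1  # upper bound on epoch milliseconds, used to reverse keys

def event_row_key(model_id: str, instance_id: str, ts_millis: int) -> bytes:
    """Compose a row key of the form model#instance#reversed-timestamp."""
    reversed_ts = MAX_TS - ts_millis          # larger ts -> smaller key
    return f"{model_id}#{instance_id}#{reversed_ts:013d}".encode()

# Column families could separate hot attributes (activity, resource) from
# the full serialized event payload, so analysis scans read less data.
key = event_row_key("loan-approval", "case-0042", 1_560_000_000_000)
```

  With such a key, a prefix scan on `"loan-approval#case-0042#"` would retrieve one process instance's events in newest-first order without touching the rest of the log.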

A Comparative Study on Authority Records for Korean Writers Among Countries (한국인 저자 전거에 관한 국가간 비교 연구)

  • Kim, Song-Ie;Chung, Yeon Kyoung
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.49 no.1
    • /
    • pp.379-403
    • /
    • 2015
  • Name authority control is useful not only for managing author information but also for gathering authors' other names in order to provide access points in libraries and other institutions. The purpose of this study is to find problems by comparing and analyzing the Literature Translation Institute of Korea Author Database and the Korean name authority records of the national libraries of the U.S., Japan, and Korea. The results of the study are as follows. First, the Literature Translation Institute of Korea Author Database missed some useful information about Korean writers whose books were translated in other countries. Second, the name authority files of the Library of Congress and the National Diet Library omitted some variant names and authors' birth and death dates, and recorded some variant names and dates incorrectly. Third, English and Chinese-character variants of Korean authors' names were not found in the National Library of Korea. To solve these problems, revision of the Korean author database, open access to the National Library of Korea name authority file, and active participation in VIAF are suggested.

A Study on the Improvement Method of Deleted Record Recovery in MySQL InnoDB (MySQL InnoDB의 삭제된 레코드 복구 기법 개선방안에 관한 연구)

  • Jung, Sung Kyun;Jang, Jee Won;Jeoung, Doo Won;Lee, Sang Jin
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.6 no.12
    • /
    • pp.487-496
    • /
    • 2017
  • In MySQL InnoDB, there are two ways of storing data. One is to create a separate tablespace for each table and store it separately. The other is to store all table and index information in a single system tablespace. This information can be used to recover deleted records. However, most current database forensic studies actively research and analyze the structure of the former, whereas the latter has not been analyzed enough to be used for forensics. Both approaches must be analyzed in terms of database forensics because their storage structures differ from each other. In this paper, we propose a method for recovering deleted records when records are stored in the IBDATA file, the single system tablespace. First, we analyze the IBDATA file to reveal its structure. We then introduce a deleted-record recovery algorithm extended to unallocated page areas, which were not considered in the past. In addition, by implementing the algorithm as a tool and verifying it with real data, we show that the recovery rate is improved by up to 68% compared with the existing method.
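
  The page-level scan such a recovery tool starts from can be illustrated with the fixed-size FIL header that begins every InnoDB page (page number at byte offset 4, page type at offset 24, both big-endian). A sketch that only enumerates pages, with the record-carving logic itself omitted:

```python
import struct

PAGE_SIZE = 16 * 1024      # default InnoDB page size
FIL_PAGE_INDEX = 0x45BF    # FIL page type for B-tree index pages

def scan_index_pages(ibdata: bytes):
    """Yield (page_no, page_type) for every page of a raw tablespace image.
    Deleted records may survive on pages no longer referenced by the B-tree,
    so a recovery tool walks all pages, allocated or not."""
    for off in range(0, len(ibdata) - PAGE_SIZE + 1, PAGE_SIZE):
        page = ibdata[off:off + PAGE_SIZE]
        page_no = struct.unpack_from(">I", page, 4)[0]     # FIL_PAGE_OFFSET
        page_type = struct.unpack_from(">H", page, 24)[0]  # FIL_PAGE_TYPE
        yield page_no, page_type
```

  A full recovery pass would then parse the record directory of each `FIL_PAGE_INDEX` page and carve rows from its garbage/free space; that part depends on the table's column layout.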

Tailoring Operations based on Relational Algebra for XES-based Workflow Event Logs

  • Yun, Jaeyoung;Ahn, Hyun;Kim, Kwanghoon Pio
    • Journal of Internet Computing and Services
    • /
    • v.20 no.6
    • /
    • pp.21-28
    • /
    • 2019
  • Process mining is a state-of-the-art technology in the workflow field. Recently, process mining has become more important because it reveals the actual behavior of the workflow model. However, as process mining gains attention and develops, its raw material, the workflow event log, also grows fast, and process mining algorithms cannot handle some logs because they are too large. To solve this problem, either a lightweight process mining algorithm is needed, or the event log must be divided and processed in parts. In this paper, we suggest a set of operations that control and edit XES-based event logs for process mining. They are designed based on relational algebra, which is used in database management systems. We designed three operations for tailoring XES event logs. The select operation retrieves specific attributes and excludes the others: the output file has the same structure and contents as the original file, but each element keeps only the attributes the user selected. The union operation merges two input XES files into one XES file; the two input files must come from the same process, and their contents are integrated into one file. The final operation is slice, which divides an XES file into several files by the number of traces. We present the design methods and details below.
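
  The three operations can be sketched on a simplified in-memory representation, where a log is a list of traces and a trace is a list of event dictionaries. The real operations work on XES XML documents; the structures and names below are illustrative only:

```python
# Toy versions of the three XES tailoring operations: select, union, slice.

def select(log, attrs):
    """Keep only the chosen attributes of every event (projection)."""
    return [[{k: e[k] for k in attrs if k in e} for e in trace]
            for trace in log]

def union(log_a, log_b):
    """Merge two logs of the same process into one log."""
    return log_a + log_b

def slice_log(log, traces_per_part):
    """Split a log into parts of at most `traces_per_part` traces each."""
    return [log[i:i + traces_per_part]
            for i in range(0, len(log), traces_per_part)]
```

  Select mirrors relational projection, union mirrors relational union over traces, and slice is the partitioning step that lets a heavy mining algorithm process the log piecewise.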

Design and Implementation of 3D Web Service based on ASE File and Model Database (ASE 파일 파싱과 모델 데이터베이스 연동을 통한 3D 웹 서비스 설계 및 구현)

  • Yeo, Yun-Seok;Park, Jong-Koo
    • The KIPS Transactions:PartD
    • /
    • v.11D no.6
    • /
    • pp.1327-1334
    • /
    • 2004
  • The purpose of this paper is to implement a Web 3D environment that is client-oriented rather than provider-oriented, in order to provide dynamic information and to analyze knowledge by executing programs on Web pages. To this end, a 3D viewer program is built that parses and renders ASE files, the most common 3D model data format and the text format exported from 3D Studio Max, and it is then converted into an ActiveX 3D viewer component that can be used on the Web. To manage ASE and texture files efficiently and to support interaction between clients and the server, the ActiveX component links ASP and a database with a Web service. The 3D viewer Web service makes dynamic information and cooperative work easier in networked virtual reality.
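
  Since ASE is a plain-text format, the parsing step is mostly line-oriented keyword matching. A tiny sketch that extracts only the `*MESH_VERTEX` entries (a real viewer also parses faces, normals, and materials; the sample layout follows the common ASE export convention):

```python
# Minimal ASE (ASCII Scene Export) vertex parser sketch.
# Vertex lines have the form:  *MESH_VERTEX  index  x  y  z

def parse_vertices(ase_text: str):
    verts = []
    for line in ase_text.splitlines():
        parts = line.split()
        if parts and parts[0] == "*MESH_VERTEX":
            verts.append(tuple(float(v) for v in parts[2:5]))
    return verts
```

  The returned list of (x, y, z) tuples is what a renderer would upload as a vertex buffer.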

Database based Global Positioning System Correction (데이터베이스 기반 GPS 위치 보정 시스템)

  • Moon, Jun-Ho;Choi, Hyuk-Doo;Park, Nam-Hun;Kim, Chong-Hui;Park, Yong-Woon;Kim, Eun-Tai
    • The Journal of Korea Robotics Society
    • /
    • v.7 no.3
    • /
    • pp.205-215
    • /
    • 2012
  • A GPS sensor is widely used in many areas such as navigation and air traffic control. In particular, car navigation systems are equipped with GPS sensors for location information. However, when a car goes through a tunnel, forest, or built-up area, the GPS receiver cannot acquire enough satellite signals, and in these situations it does not work reliably. GPS error can be formulated as the sum of a bias error and sensor noise. The bias error is generated by the geometric arrangement of the satellites, and the sensor noise is generated by corrupted signals at the receiver. To enhance GPS accuracy, both kinds of error have to be removed. In this research, we build a road database that includes a Road Database File (RDF). The RDF contains road information such as road connections, road conditions, and the coordinates of roads, lanes, and stop lines. Among this information, we use the stop-line coordinates as feature points to correct the GPS bias error. If the relative distance and angle of a stop line from the car are detected, and the detected stop line can be associated with one of the stop lines in the database, we can measure the bias error and correct the car's location. To remove the other GPS error, the sensor noise, the Kalman filter algorithm is used. Additionally, using the RDF, we can identify the road the car is on, which can support the GPS correction algorithm or provide useful information to users.
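
  The two-part correction can be sketched in one dimension: the stop-line observation yields the bias directly, and a scalar Kalman filter then smooths the remaining sensor noise. All values and names below are illustrative, not taken from the paper:

```python
# 1-D sketch of GPS bias removal plus Kalman smoothing of sensor noise.

def correct_bias(gps_pos, measured_stopline, db_stopline):
    """Bias = observed stop-line position minus its database coordinate;
    subtracting it from the raw GPS position removes the bias error."""
    return gps_pos - (measured_stopline - db_stopline)

def kalman_step(x_est, p_est, z, q=0.01, r=4.0):
    """One predict/update cycle for a simple scalar state model.
    x_est, p_est: previous estimate and its variance
    z: new bias-corrected measurement; q, r: process/measurement noise."""
    p_pred = p_est + q                # predict: variance grows by process noise
    k = p_pred / (p_pred + r)         # Kalman gain
    x_new = x_est + k * (z - x_est)   # update: blend prediction and measurement
    p_new = (1 - k) * p_pred
    return x_new, p_new
```

  In practice the filter state would be the 2-D vehicle position (and possibly heading), but the predict/update structure is the same.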

Development of an Object-Relational IFC Server

  • Hoon-sig Kang;Ghang Lee
    • International conference on construction engineering and project management
    • /
    • 2009.05a
    • /
    • pp.1346-1351
    • /
    • 2009
  • In this paper we propose a framework for an Object-Relational IFC Server (OR-IFC Server). Enormous amounts of information are generated in each construction project, and today many BIM systems are developed by various CAD software vendors. Industry Foundation Classes (IFC), developed by the International Alliance for Interoperability (IAI), is an open standard data model for exchanging data between the various BIM tools. The IFC provides a foundation for exchanging and sharing information directly between software applications and defines a shared building project model. An IFC model server is a database management system that keeps track of transactions, modifications, and deletions. It plays the role of an information hub for storing and sharing information among the various parties involved in construction projects. Users can communicate with each other via the Internet and use functions implemented in the model server such as partial data import/export, file merge, and version control. IFC model servers using relational database systems have been developed, but they suffered from slow performance and long transaction times due to the complex mapping between the IFC structure and a relational database structure, because the IFC model schema is defined in the EXPRESS language, which is object-flavored. To simplify the mapping process, we developed a set of rules to map the IFC model to an object-relational database (ORDB). Once the database has been configured, only those pieces of information required for a specific information-exchange scenario are extracted using a pre-defined information delivery manual (IDM). File sizes are therefore reduced when exchanging data, so files can be exchanged and shared effectively. In this study, the framework of the IFC server using ORDB and IDM and the method used to develop it are examined.
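
  One representative mapping rule is flattening an IFC entity's EXPRESS inheritance chain into a single table, so that queries avoid the joins a naive one-table-per-entity relational mapping requires. A toy sketch; the rule and the miniature schema are illustrative, not the paper's actual rule set, though the entity names are real IFC entities:

```python
# Toy sketch: flatten an IFC entity's inheritance chain into one table DDL.

SCHEMA = {
    "IfcRoot":    {"parent": None,         "attrs": ["GlobalId", "Name"]},
    "IfcProduct": {"parent": "IfcRoot",    "attrs": ["ObjectPlacement"]},
    "IfcWall":    {"parent": "IfcProduct", "attrs": ["PredefinedType"]},
}

def flatten_attrs(entity):
    """Collect inherited attributes root-first, then the entity's own."""
    node = SCHEMA[entity]
    inherited = flatten_attrs(node["parent"]) if node["parent"] else []
    return inherited + node["attrs"]

def create_table_ddl(entity):
    cols = ", ".join(f"{a} TEXT" for a in flatten_attrs(entity))
    return f"CREATE TABLE {entity} ({cols});"
```

  An object-relational database can go further than this sketch (row types, collection-valued columns), which is what makes the mapping from EXPRESS simpler than a purely relational one.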

The method of recovery for deleted record in Oracle Database (Oracle 데이터베이스의 삭제된 레코드 복구 기법)

  • Choi, Jong-Hyun;Jeong, Doo Won;Lee, Sangjin
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.23 no.5
    • /
    • pp.947-955
    • /
    • 2013
  • Most enterprise information is stored in databases. Therefore, in order to investigate a company's criminal behavior, forensic analysis of the database is important, and recovery techniques for deleted records need to be developed. This paper explains the structure of the Oracle database tablespace file and analyzes the system tables that store table information. Further, we suggest a method for recovering deleted records in an Oracle tablespace.

Building of Land Ledger Database Using Land Information System (토지정보 시스템에 있어서 토지대장 데이타베이스 구축)

  • 강인준;장용구;박기배
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.12 no.2
    • /
    • pp.141-146
    • /
    • 1994
  • At present, cadastral offices keep paper documents for constructing a database of the land register and record the assessed cost of land in the field. Kumjung-Ku, Pusan serves as the model area in this study. With the authors' program, the current land records can be examined by connecting graphic data with attribute data. AutoCAD makes it possible to connect graphic data with attribute data; because of AutoCAD's limitations in constructing a database, the authors built an independent database in the Clipper environment. The AutoCAD and Clipper databases are connected through a menu file in the AutoCAD environment.

Development of the design methodology for large-scale database based on MongoDB

  • Lee, Jun-Ho;Joo, Kyung-Soo
    • Journal of the Korea Society of Computer and Information
    • /
    • v.22 no.11
    • /
    • pp.57-63
    • /
    • 2017
  • The recent rapid increase in big data is characterized by continuous generation, large volume, and unstructured formats. Existing relational database technologies are inadequate for handling such big data because of limited processing speed and the significant cost of storage expansion. Thus, big data processing technologies, normally based on distributed file systems, distributed database management, and parallel processing, have arisen as core technologies for implementing big data repositories. In this paper, we propose a design methodology for large-scale databases based on MongoDB, extending the information engineering methodology based on the E-R data model.
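
  A typical design rule in such a methodology is deciding when a 1:N relationship from the E-R model becomes an embedded array inside the parent document rather than a separate collection. A minimal sketch of the embedding step; the field names and the rule's framing are illustrative assumptions, not the paper's rules:

```python
# Toy sketch: embed a 1:N relationship as an array of sub-documents,
# a common E-R-to-MongoDB mapping when child records are always read
# together with their parent.

def embed_one_to_many(parent, children, key="order_id", field="items"):
    """Attach each matching child record to its parent as an embedded array,
    dropping the now-redundant foreign-key field."""
    doc = dict(parent)
    doc[field] = [{k: v for k, v in c.items() if k != key}
                  for c in children if c[key] == parent[key]]
    return doc
```

  The opposite choice, referencing with a separate collection, would be preferred when the children are large, unbounded in number, or queried independently.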