• Title/Summary/Keyword: Database Design for Records


Directions for Developing Database Schema of Records in Archives Management Systems (영구기록물관리를 위한 기록물 데이터베이스 스키마 개발 방향)

  • Yim, Jin-Hee;Lee, Dae-Wook;Kim, Eun-Sil;Kim, Ik-Han
    • The Korean Journal of Archival Studies / no.34 / pp.57-105 / 2012
  • The CAMS (Central Archives Management System) of the NAK (National Archives of Korea) is an important system that will receive and manage large amounts of electronic records annually beginning in 2015. From the point of view of database design, this paper analyzes the database schema of the CAMS and discusses directions for its overall improvement. First, this research analyzes the tables for records and folders in the CAMS database, which are the core tables for electronic records management. As a result, the researchers find it difficult to trust the quality of the records in the CAMS, because the two core tables are not normalized at all and contain many columns whose roles are unknown. Second, this study suggests the following directions for normalizing the tables for records and folders in the CAMS database: first, redistributing columns into proper tables to reduce duplication; second, separating the columns concerning the classification scheme into separate tables; third, separating the columns concerning record types and sorts into separate tables; and lastly, separating the metadata related to acquisition, takeover, and preservation into separate tables. Third, this paper suggests considerations for designing and managing the database schema in each phase of archival management. In the ingest phase, the system should be able to process large amounts of records as batch jobs in time each year. In the preservation phase, the system should keep management histories in the CAMS as audit trails, including the reclassification, revaluation, and preservation activities related to the records. In the access phase, the descriptive metadata sets for access should be selected and confirmed in various ways. Lastly, this research also presents a prototype conceptual database schema for the CAMS that fulfills the metadata standards for records.
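
The normalization direction the abstract describes, moving duplicated classification-scheme columns out of the record table into a table of their own, can be sketched as follows. Table and column names here are hypothetical illustrations, not the actual CAMS schema:

```python
import sqlite3

# Hypothetical sketch of the suggested normalization: classification-scheme
# columns duplicated on every record row are moved into their own table.
con = sqlite3.connect(":memory:")
cur = con.cursor()

# Before: classification columns repeated on every record (denormalized).
cur.execute("""CREATE TABLE record_flat (
    record_id INTEGER PRIMARY KEY,
    title TEXT,
    class_code TEXT,   -- duplicated for every record
    class_name TEXT)""")

# After: the classification scheme lives in one table; records reference it.
cur.execute("""CREATE TABLE classification (
    class_code TEXT PRIMARY KEY,
    class_name TEXT)""")
cur.execute("""CREATE TABLE record (
    record_id INTEGER PRIMARY KEY,
    title TEXT,
    class_code TEXT REFERENCES classification(class_code))""")

cur.execute("INSERT INTO classification VALUES ('A-01', 'General Affairs')")
cur.executemany("INSERT INTO record VALUES (?, ?, ?)",
                [(1, 'Budget report', 'A-01'), (2, 'Meeting minutes', 'A-01')])

# The class name is now stored once and joined on demand.
rows = cur.execute("""SELECT r.record_id, c.class_name
                      FROM record r
                      JOIN classification c USING (class_code)""").fetchall()
print(rows)
```

Renaming a classification now touches one row instead of every record that carries it, which is the quality problem the denormalized tables create.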

Generation of Artificial Time History Earthquake Record Family using the Least Squares Fitting Method (최소오차 최적합화 방법에 의한 인공 시간이력 지진기록군의 생성)

  • Kim, Yong-Seok
    • Journal of the Earthquake Engineering Society of Korea / v.12 no.5 / pp.31-38 / 2008
  • Recently, the need for time history analyses has been increasing for the seismic analysis of structures, and the seismic design provisions of IBC2003, ASCE and KBC2005 require the use of a minimum of seven earthquake records for time history analyses. Earthquake records for time history analyses can be selected from a database of field-measured earthquake records with site conditions similar to those of the designed site, or simulated to satisfy the design spectrum. In this study, however, seven earthquake records were generated using 50 earthquake records, classified as records measured at rock sites, from the database of the Pacific Earthquake Engineering Research Center (PEER). Seven earthquake records were first selected by the least squares fitting method, comparing the scale-factored response spectra with the specified design spectrum, and a family of seven artificial time history earthquake records was ultimately generated by multiplying the corresponding selected earthquake records by scaling factors calculated using the least squares fitting method and the SRSS averaging method.
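
The least squares fitting step can be sketched in a few lines: the scale factor s minimizing the squared error between the scaled record spectrum and the design spectrum has the closed form s = Σ(Sa_target·Sa_record) / Σ(Sa_record²). The spectral ordinates below are made-up values, not data from the paper:

```python
# Minimal sketch of the least squares scaling factor: choose s minimizing
# sum_i (s*Sa_rec[i] - Sa_target[i])^2, which gives the closed-form solution
# s = sum(Sa_target*Sa_rec) / sum(Sa_rec^2).

def lsq_scale_factor(sa_record, sa_target):
    num = sum(r * t for r, t in zip(sa_record, sa_target))
    den = sum(r * r for r in sa_record)
    return num / den

# Hypothetical spectral ordinates at a few periods.
sa_record = [0.8, 1.2, 0.9, 0.5]
sa_target = [1.6, 2.4, 1.8, 1.0]
s = lsq_scale_factor(sa_record, sa_target)
print(s)  # exactly 2.0 here, since the target is 2x the record spectrum
```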

A Study on Designing the Metadata for Integrated Management of Individually Managed Presidential Records (개별관리 대통령기록물의 연계관리를 위한 통합 메타데이터 설계 방안 연구)

  • Cho, Hyun-Yang;Jang, Bo-Seong
    • Journal of the Korean Society for Library and Information Science / v.47 no.1 / pp.105-124 / 2013
  • Metadata standardization of resources, which have a heterogeneous metadata structure in each presidential archive and presidential library and museum, is a prerequisite for utilizing and sharing presidential records. An integrated operation model of metadata to manage various types of presidential records is then needed. The purpose of this study is to establish design principles for integrated metadata, and to suggest the relationships and attributes of the metadata needed to develop an integrated metadata operation system for presidential records. The design principles include "creating relationships among presidential records", "designing each entity on a multiple-entity data model", "designing descriptions for various types of presidential records", "reflecting lifelong management of records at the holding institutes", and "designing hybrid metadata for long-term preservation". The metadata element set consists of elements for attributes common to all types of presidential records, elements for attributes unique to a specific type of presidential record, and elements for reference information among different records related to the production of presidential records.
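
The three-part element set described, common elements, type-specific elements, and reference links, can be illustrated with a toy validator. All field and type names here are invented for illustration; they are not the study's actual element set:

```python
# Toy illustration of the described element-set structure: common elements
# shared by all presidential record types, plus type-specific elements and
# reference links to related records. Field names are hypothetical.

COMMON = {"identifier", "title", "creator", "date"}
TYPE_SPECIFIC = {
    "audiovisual": {"duration", "format"},
    "gift": {"donor", "country_of_origin"},
}

def make_record(rtype, **fields):
    allowed = COMMON | TYPE_SPECIFIC[rtype] | {"related_records"}
    unknown = set(fields) - allowed
    if unknown:
        raise ValueError(f"unexpected elements: {unknown}")
    return {"type": rtype, **fields}

rec = make_record("gift", identifier="G-001", title="State gift",
                  donor="Foreign delegation", related_records=["A-042"])
print(rec["related_records"])  # the reference-information element
```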

Performance of tuned mass dampers against near-field earthquakes

  • Matta, E.
    • Structural Engineering and Mechanics / v.39 no.5 / pp.621-642 / 2011
  • Passive tuned mass dampers (TMDs) efficiently suppress vibrations induced by quasi-stationary dynamic inputs, such as winds, sea waves or traffic loads, but may prove of little use against pulse-like excitations, such as near-field (NF) ground motions. The extent of this impairment is, however, controversial, partly due to the different evaluation criteria adopted within the literature and partly to the limited number of seismic records used in most investigations. In this study, three classical techniques and two new variants for designing a TMD on an SDOF structure are tested under 338 NF records from the PEER NGA database, including 156 records with forward-directivity features. Percentile response reduction spectra are introduced to statistically assess TMD performance, and TMD robustness is verified through Monte Carlo simulations. The methodology is extended to a variety of MDOF bending-type and shear-type frames, and simulated on a case study building structure recently constructed in Central Italy. Results offer an interesting insight into the performance of TMDs against NF earthquakes, ultimately showing that, if properly designed and sufficiently massive, TMDs are effective and robust even in the face of pulse-like ground motions. The two newly proposed design techniques are shown to generally outperform the classical ones.
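
One widely cited classical TMD design technique is Den Hartog's tuning for an undamped main structure; it is shown here as general background on what "classical techniques" look like, without claiming it is among the specific variants the paper tests:

```python
import math

# Den Hartog's classical TMD tuning for a mass ratio mu (TMD mass / modal
# mass): optimal frequency ratio and TMD damping ratio for an undamped
# SDOF main structure. Shown as a generic illustration.

def den_hartog(mu):
    """Return (optimal frequency ratio, optimal TMD damping ratio)."""
    f_opt = 1.0 / (1.0 + mu)
    zeta_opt = math.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))
    return f_opt, zeta_opt

f, z = den_hartog(0.05)  # a 5% mass ratio TMD
print(round(f, 4), round(z, 4))
```

The abstract's conclusion that "sufficiently massive" TMDs stay effective corresponds to increasing mu here, which raises the optimal damping and broadens the effective bandwidth.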

DESIGN OF DATA REDUCTION SYSTEM AND CONSTRUCTION OF PHOTOMETRIC DATABASE FOR KMTNet (KMTNet 자료처리 시스템 설계와 측광데이터베이스 구축)

  • Kim, D.J.;Lee, C.U.;Kim, S.L.;Park, B.G.;Lee, J.W.
    • Publications of The Korean Astronomical Society / v.24 no.1 / pp.83-91 / 2009
  • We have designed a data processing server system that includes data archiving, photometric processing and light curve analysis for KMTNet (Korea Microlensing Telescope Network). Outputs of each process are reported to the main photometric database, which manages the whole processing pipeline and archives the photometric results. The database is developed using the ORACLE 11g Release 2 engine. It allows users to select objects by applying any set of criteria, such as RA/Dec coordinates and star ID. We tested the performance of the database using OGLE photometric data. The search time for querying 70,000,000 records was under 1 second. The database is fully accessible through query forms via a web page.
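
The kind of criteria query described (RA/Dec box plus star ID) can be sketched with an in-memory database; the table and column names are hypothetical stand-ins, not the KMTNet schema:

```python
import sqlite3

# Toy version of a coordinate/ID criteria query against a photometric table;
# table and column names are hypothetical. A composite index on (ra, dec)
# is what makes box queries cheap at scale.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE photometry (star_id TEXT, ra REAL, dec REAL, mag REAL)")
cur.execute("CREATE INDEX idx_radec ON photometry (ra, dec)")
cur.executemany("INSERT INTO photometry VALUES (?, ?, ?, ?)", [
    ("OGLE-0001", 268.1, -29.9, 17.2),
    ("OGLE-0002", 268.4, -29.5, 18.0),
    ("OGLE-0003", 150.0, 20.0, 16.5),
])

box = (268.0, 268.5, -30.0, -29.0)  # ra_min, ra_max, dec_min, dec_max
hits = cur.execute(
    "SELECT star_id FROM photometry "
    "WHERE ra BETWEEN ? AND ? AND dec BETWEEN ? AND ?", box).fetchall()
print(hits)
```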

Selecting and scaling ground motion time histories according to Eurocode 8 and ASCE 7-05

  • Ergun, Mustafa;Ates, Sevket
    • Earthquakes and Structures / v.5 no.2 / pp.129-142 / 2013
  • Linear and nonlinear time history analyses have become more common in the seismic analysis and design of structures with advances in computer technology and earthquake engineering. One of the most important issues for such analyses is the selection of appropriate acceleration time histories and the matching of these histories to a code design acceleration spectrum. In the literature, there are three sources of acceleration time histories: artificial records, synthetic records obtained from seismological models, and accelerograms recorded in real earthquakes. Because of the growth of strong ground motion databases, using and scaling real earthquake records for seismic analysis has become one of the most popular research issues in earthquake engineering. In general, two methods are used for scaling actual earthquake records: scaling in the time domain and in the frequency domain. The objective of this study is twofold: the first is to discuss and summarize basic methodologies and criteria for selecting and scaling ground motion time histories; the second is to analyze the scaling results of the time domain method according to the ASCE 7-05 and Eurocode 8 (1998-1:2004) criteria. Differences between the time domain method and the frequency domain method are mentioned briefly, and the reasons the time domain method is preferred in this study are stated. The time domain scaling procedure is used to scale the available real records, obtained from near-fault and far-fault motions, to match the elastic design acceleration spectrum given in Eurocode 8. The best-fitted ground motion time histories are selected, and these histories are analyzed according to the Eurocode 8 (1998-1:2004) and ASCE 7-05 criteria. The characteristics of both near-fault and far-fault ground motions are also presented with the help of figures, so that the effects of near-fault ground motions on structures can be compared with those of far-fault ground motions.
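
A Eurocode 8-style acceptance check for a scaled record set can be sketched as follows. Eurocode 8 requires, among other conditions, that the mean spectrum of the scaled set not fall below 90% of the elastic design spectrum in the relevant period range (0.2·T1 to 2·T1 for the fundamental period T1); the ordinates below are invented values:

```python
# Toy check of the Eurocode 8-style acceptance criterion: the mean scaled
# spectrum must not fall below 90% of the target spectrum at any ordinate
# in the period range of interest. Spectral values are hypothetical.

def passes_ec8(mean_scaled, target, floor=0.9):
    """True if mean_scaled >= floor * target at every period ordinate."""
    return all(m >= floor * t for m, t in zip(mean_scaled, target))

target      = [1.0, 1.5, 1.2, 0.8]    # design spectrum ordinates
mean_scaled = [0.95, 1.40, 1.15, 0.79]  # mean spectrum of the scaled set
print(passes_ec8(mean_scaled, target))
```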

An investigation on the maximum earthquake input energy for elastic SDOF systems

  • Merter, Onur
    • Earthquakes and Structures / v.16 no.4 / pp.487-499 / 2019
  • Energy-based seismic design of structures has gradually become prominent in today's structural engineering investigations because it is more rational and reliable than traditional force-based and displacement-based methods. Energy-based approaches have been widely used in many previous studies and, undoubtedly, they are going to play a more important role in future seismic design codes, too. This paper aims to compute the maximum earthquake energy input to elastic single-degree-of-freedom (SDOF) systems for selected real ground motion records. A data set containing 100 real ground motion records with the same site soil profiles was selected from the Pacific Earthquake Engineering Research (PEER) database. Response time history (RTH) analyses have been conducted for elastic SDOF systems with a constant damping ratio and natural periods from 0.1 s to 3.0 s. In total, 3000 RTH analyses have been performed, and the maximum mass-normalized earthquake input energy values for all records have been computed. Previous researchers' approaches have been compared with the results of the RTH analyses, and an approach that considers pseudo-spectral velocity together with Arias intensity has been proposed. Graphs of the maximum earthquake input energy versus the maximum pseudo-spectral velocity have been obtained. The results show good agreement between the maximum input energy demands of the RTH analyses and the other approaches, and that the maximum earthquake input energy is a relatively stable response parameter for further seismic design and evaluation.
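
The quantity computed in such RTH analyses is the mass-normalized input energy E_I/m = -∫ a_g(t)·v(t) dt, accumulated while time-stepping the SDOF response. The sketch below uses Newmark average-acceleration integration and a made-up half-sine pulse in place of a real record; it illustrates the computation, not the paper's code:

```python
import math

# Sketch of the mass-normalized earthquake input energy for an elastic SDOF
# system: E_I/m = -integral(a_g(t) * v(t) dt), with the response v(t)
# obtained by Newmark average-acceleration time stepping (m = 1 throughout).

def input_energy(ag, dt, period, zeta=0.05):
    wn = 2.0 * math.pi / period
    k, c = wn * wn, 2.0 * zeta * wn        # mass-normalized stiffness, damping
    u = v = 0.0
    a = -ag[0] - c * v - k * u             # initial acceleration from EOM
    energy, e_max = 0.0, 0.0
    keff = k + 2.0 * c / dt + 4.0 / dt**2  # effective stiffness (beta=1/4)
    for i in range(1, len(ag)):
        peff = (-ag[i] + (4.0 / dt**2) * u + (4.0 / dt) * v + a
                + c * ((2.0 / dt) * u + v))
        u_new = peff / keff
        v_new = (2.0 / dt) * (u_new - u) - v
        a_new = (4.0 / dt**2) * (u_new - u) - (4.0 / dt) * v - a
        # trapezoidal increment of the input-energy integral
        energy += -ag[i] * 0.5 * (v + v_new) * dt
        e_max = max(e_max, energy)
        u, v, a = u_new, v_new, a_new
    return e_max

# Hypothetical half-sine acceleration pulse standing in for a real record.
dt = 0.01
ag = [2.0 * math.sin(math.pi * t / 0.5) if t <= 0.5 else 0.0
      for t in (i * dt for i in range(300))]
print(input_energy(ag, dt, period=1.0))
```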

A study on configuring deployment of digital repositories for the archives management systems (대량기록물 처리를 위한 영구기록물관리시스템의 디지털저장소 배치형상 연구)

  • Yim, Jin-Hee;Lee, Dae-Wook
    • The Korean Journal of Archival Studies / no.32 / pp.177-217 / 2012
  • The National Archives of Korea (NAK) has a mission to ingest large-scale digital records and information from a number of different government agencies annually, beginning in 2015. There are important issues related to the transfer of digital records and information between the NAK and the agencies, and one of them is how to configure the deployment of digital repositories for the archives management systems. The purpose of this paper is to offer a way to design this configuration by examining checkpoints through the whole life cycle of digital records and information in the archives management systems, calculating the amount of digital records and information ingested into the systems in 2015, and deploying digital repositories configured according to that amount. First, this paper suggests that the archives management systems in the NAK should be considered as at least three different parts, called the ingest tier, the preservation tier and the access tier, according to the characteristics of the flow and processing of digital records and information. Second, as a result of the calculation, the amount of digital records and information ingested into the archives management systems in 2015 sums to around 2.5 terabytes. This research draws out several requirements related to large-scale data and bulk operations which should be satisfied by the database or database management system implemented for the archives management systems. Third, this paper configures the deployment of the digital repositories according to the characteristics of each of the three tiers. This research triggers in-depth discussion and gives specific clues about how to design the digital repositories in the archives management systems in preparation for 2015.
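
A back-of-envelope capacity calculation across the three tiers can be sketched from the ~2.5 TB annual ingest figure. The per-tier copy counts below are assumptions for illustration, not policies stated in the paper:

```python
# Back-of-envelope repository sizing across the three tiers. Only the
# ~2.5 TB annual ingest figure comes from the paper; the number of copies
# kept per tier is an assumed policy for illustration.

ANNUAL_INGEST_TB = 2.5
copies = {
    "ingest": 1,        # staging copy, purged after transfer (assumed)
    "preservation": 2,  # e.g. primary + replica (assumed)
    "access": 1,        # derivative/service copies (assumed)
}

tier_tb = {tier: ANNUAL_INGEST_TB * n for tier, n in copies.items()}
total_tb = sum(tier_tb.values())
print(tier_tb, total_tb)  # 2.5 + 5.0 + 2.5 = 10.0 TB per year
```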

NVST DATA ARCHIVING SYSTEM BASED ON FASTBIT NOSQL DATABASE

  • Liu, Ying-Bo;Wang, Feng;Ji, Kai-Fan;Deng, Hui;Dai, Wei;Liang, Bo
    • Journal of The Korean Astronomical Society / v.47 no.3 / pp.115-122 / 2014
  • The New Vacuum Solar Telescope (NVST) is a 1-meter vacuum solar telescope that aims to observe the fine structures of active regions on the Sun. The main tasks of the NVST are high resolution imaging and spectral observations, including measurements of the solar magnetic field. The NVST has collected more than 20 million FITS files since it began routine observations in 2012, and produces a maximum of 120 thousand observational files in a day. Given the large number of files, their effective archiving and retrieval has become a critical and urgent problem. In this study, we implement a new data archiving system for the NVST based on the Fastbit Not Only Structured Query Language (NoSQL) database. Compared to a relational database (i.e., MySQL; My Structured Query Language), the Fastbit database shows distinctive advantages in indexing and querying performance. In a large-scale database of 40 million records, the multi-field combined query response time of the Fastbit database is about 15 times faster and fully meets the requirements of the NVST. Our study brings a new idea for massive astronomical data archiving and will contribute to the design of data management systems for other astronomical telescopes.
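
Fastbit's querying speed comes from bitmap indexes: one bitmap per distinct value, combined with bitwise AND/OR instead of row scans. The toy below illustrates the idea using Python integers as bitmaps; it is not the Fastbit API, and the field names are invented:

```python
# Toy illustration of bitmap indexing, the technique behind Fastbit's
# multi-field combined query speed. Python ints serve as bitmaps: bit i
# set means row i has that value. Field names are hypothetical.

def build_bitmap_index(rows, field):
    index = {}
    for i, row in enumerate(rows):
        index[row[field]] = index.get(row[field], 0) | (1 << i)
    return index

rows = [
    {"instrument": "TiO", "quality": "good"},
    {"instrument": "Ha",  "quality": "good"},
    {"instrument": "TiO", "quality": "bad"},
]
by_instrument = build_bitmap_index(rows, "instrument")
by_quality = build_bitmap_index(rows, "quality")

# Multi-field combined query: instrument == 'TiO' AND quality == 'good'
# becomes a single bitwise AND over precomputed bitmaps.
hits = by_instrument["TiO"] & by_quality["good"]
matching = [i for i in range(len(rows)) if hits >> i & 1]
print(matching)  # only row 0 satisfies both predicates
```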

A Study in the Data Modeling for Archive System Applying RiC (RiC을 적용한 아카이브 시스템 데이터 모델링 연구)

  • Shin, Mira;Kim, Ikhan
    • Journal of Korean Society of Archives and Records Management / v.19 no.1 / pp.23-67 / 2019
  • Records in Contexts (RiC) is an international archival description standard developed by integrating and normalizing the four archival description standards of the International Council on Archives (ICA). RiC has the advantages of diversifying archival description, exposing the context of records, and ensuring data interoperability between disparate systems. In this study, RiC is used as a key tool in the design of archive systems, and logical data modeling is performed to implement the database. RiC's conceptual model, RiC-CM, can be used as a data reference model, which makes it possible to develop a data model that meets user requirements. Therefore, this study implements two data models: a relational data model, which is widely used for databases in legacy systems, and a graph data model, which can flexibly extend objects around the relationships between information entities.
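
The graph-style modeling RiC-CM encourages can be sketched as entities (nodes) connected by typed relations (edges). The entity types loosely follow RiC-CM terminology, but the specific identifiers and relation names below are illustrative assumptions, not the study's actual model:

```python
# Minimal sketch of a graph data model in the spirit of RiC-CM: description
# entities as nodes, contexts as typed edges. Identifiers and relation
# names are hypothetical illustrations.

nodes = {
    "rec1": {"type": "Record", "title": "Annual report"},
    "set1": {"type": "RecordSet", "title": "Reports series"},
    "agt1": {"type": "Agent", "name": "Planning Office"},
}
edges = [
    ("rec1", "isIncludedIn", "set1"),
    ("agt1", "isCreatorOf", "rec1"),
]

def related(node_id, relation):
    """Follow outgoing edges of a given relation type from a node."""
    return [dst for src, rel, dst in edges if src == node_id and rel == relation]

print(related("agt1", "isCreatorOf"))
```

Because context lives on the edges rather than in fixed foreign-key columns, new relation types can be added without altering the node structures, which is the flexibility the abstract attributes to the graph model.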