Title/Summary/Keyword: Denormalization


Investigation on the Side Effects of Denormalizing Corporate Databases

  • Lee, Sang-Won; Kim, Nam-Gyu; Moon, Song-Chun
    • Journal of Information Technology Applications and Management, v.16 no.2, pp.135-150, 2009
  • Corporate databases are usually denormalized due to data modelers' impetuous belief that denormalization improves system performance. By providing logical insight into denormalization, this paper attempts to prevent database modelers from falling into the denormalization pit. We point out loopholes in denormalization advocates' assertions and then present four criteria for analyzing the usefulness and validity of denormalization: 1) the level of concurrency among transactions, 2) the database independence of the application program, 3) the independence between the logical design and the physical one, and 4) the overhead cost of maintaining database integrity under various query patterns. The paper also includes experimental results evaluating the performance of denormalized and fully normalized structures under various workloads.
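
The paper's fourth criterion, the overhead of maintaining integrity under denormalization, is easy to demonstrate concretely. Below is a minimal sketch using Python's built-in sqlite3 with a hypothetical customer/orders schema (not taken from the paper): once an attribute is duplicated, every logical update fans out to every copy.

```python
# Minimal sketch of the integrity-maintenance overhead of denormalization.
# The schema and data are hypothetical; only the effect is illustrative.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    -- Normalized: customer name stored exactly once.
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER
                         REFERENCES customer(id), amount REAL);
    -- Denormalized: customer name duplicated into every order row.
    CREATE TABLE orders_flat (id INTEGER PRIMARY KEY, customer_name TEXT,
                              amount REAL);
""")
db.execute("INSERT INTO customer VALUES (1, 'ACME')")
db.executemany("INSERT INTO orders VALUES (?, 1, ?)",
               [(i, 10.0 * i) for i in range(1, 1001)])
db.executemany("INSERT INTO orders_flat VALUES (?, 'ACME', ?)",
               [(i, 10.0 * i) for i in range(1, 1001)])

# Renaming the customer touches one row in the normalized design...
cur = db.execute("UPDATE customer SET name = 'ACME Corp' WHERE id = 1")
print("normalized rows updated:  ", cur.rowcount)    # 1
# ...but every duplicated copy in the denormalized design.
cur = db.execute("UPDATE orders_flat SET customer_name = 'ACME Corp' "
                 "WHERE customer_name = 'ACME'")
print("denormalized rows updated:", cur.rowcount)    # 1000
```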

Harmfulness of Denormalization Adopted for Database Performance Enhancement (데이터베이스 성능향상용 역정규화의 무용성)

  • Rhee, Hae-Kyung
    • Journal of the Institute of Electronics Engineers of Korea CI, v.42 no.3 s.303, pp.9-16, 2005
  • For designing a database more efficiently, normalization can be enforced to minimize the degree of unnecessary data redundancy and to help enhance data integrity. However, deep normalization tends to provoke multi-way schema joins, which can in turn degrade response time. To mitigate this side effect of normalization, a number of field studies we observed have adopted the idea of denormalization. To measure whether denormalization actually improves response time, we developed two data models for a customer service system, one fully normalized and the other denormalized, and evaluated their query response time behaviors. Performance results show that the normalized case consistently outperforms the denormalized case in terms of response time. This study shows that denormalization quite rarely contributes to such improvement, due ironically to the unnecessary data redundancy it introduces.
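
This kind of comparison can be roughly reproduced with a sketch like the one below, which times an aggregate that needs a join in the normalized design against the same aggregate on a denormalized wide table. The schema, data volume, and query are invented for illustration, and the outcome depends heavily on indexes and workload, which is the paper's point.

```python
# Rough timing sketch: join over normalized tables vs. a denormalized scan.
# Schema, data volume, and query are hypothetical placeholders.
import sqlite3
import time

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    CREATE TABLE orders_flat (id INTEGER PRIMARY KEY, region TEXT, amount REAL);
""")
db.executemany("INSERT INTO customer VALUES (?, ?)",
               [(i, "R%d" % (i % 10)) for i in range(10_000)])
db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(i, i % 10_000, float(i)) for i in range(100_000)])
db.executemany("INSERT INTO orders_flat VALUES (?, ?, ?)",
               [(i, "R%d" % (i % 10_000 % 10), float(i)) for i in range(100_000)])

def timed(sql):
    """Return (elapsed seconds, number of result rows) for one query."""
    t0 = time.perf_counter()
    rows = db.execute(sql).fetchall()
    return time.perf_counter() - t0, len(rows)

print("join:", timed("""SELECT c.region, SUM(o.amount)
                        FROM orders o JOIN customer c ON c.id = o.customer_id
                        GROUP BY c.region"""))
print("flat:", timed("""SELECT region, SUM(amount)
                        FROM orders_flat GROUP BY region"""))
```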

An Optimal Denormalization Method in Relational Database with Response Time Constraint (관계형 데이터베이스에서 응답시간에 제약이 있는 경우 최적 역정규화 방법)

  • 장영관
    • Journal of Korean Society of Industrial and Systems Engineering, v.26 no.3, pp.1-9, 2003
  • Databases are central to business information systems, and the RDBMS is the most widely used database system. Normalization was designed to control various anomalies (insert, update, and delete anomalies). However, a normalized database design does not account for the tradeoffs necessary for performance. In this research, I model a database design method that considers denormalization by duplicating attributes in order to reduce frequent join processing. The model accounts for the response time of each select, insert, update, and delete transaction, and for the cost of treating anomalies. A branch and bound method is proposed for this model.
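
The abstract does not reproduce the model's cost terms, but the branch-and-bound idea can be illustrated on a toy version of the problem: choose, for each candidate attribute, whether to duplicate it, meeting a response-time limit at minimum added maintenance cost. All numbers below are invented; this is a sketch of the search strategy, not the paper's formulation.

```python
# Toy branch-and-bound for choosing which attributes to duplicate.
# Each candidate duplication saves some join response time but adds
# update-maintenance cost; the response time must end up under a limit.

# Hypothetical candidates: (extra update cost, response time saved)
CANDIDATES = [(40, 11), (25, 6), (18, 5), (30, 9), (12, 2)]
BASE_RESPONSE = 50       # response time with no duplication (ms)
RESPONSE_LIMIT = 30      # constraint: response time must not exceed this

def branch_and_bound():
    need = BASE_RESPONSE - RESPONSE_LIMIT     # total saving required
    best = {"cost": float("inf"), "choice": None}

    def visit(j, cost, saved, chosen):
        if cost >= best["cost"]:              # bound: cannot improve
            return
        if saved >= need:                     # feasible: constraint met
            best["cost"], best["choice"] = cost, chosen
            return
        if j == len(CANDIDATES):              # infeasible leaf
            return
        # Prune if even duplicating everything left cannot meet the limit.
        if saved + sum(s for _, s in CANDIDATES[j:]) < need:
            return
        c, s = CANDIDATES[j]
        visit(j + 1, cost + c, saved + s, chosen + [j])   # duplicate j
        visit(j + 1, cost, saved, chosen)                 # skip j

    visit(0, 0, 0, [])
    return best

print(branch_and_bound())   # {'cost': 70, 'choice': [0, 3]}
```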

A database design using denormalization in relational database (관계형 데이터베이스에서 비정규화를 사용한 데이터베이스 설계)

  • 장영관; 강맹규
    • Proceedings of the Korean Operations and Management Science Society Conference, 1996.04a, pp.172-178, 1996
  • Databases are critical to business information systems, and the RDBMS is the most widely used database system. Normalization has been designed to control various anomalies (insert, update, and delete anomalies). However, a normalized database design does not account for the tradeoffs necessary for performance. In this research, we develop a model for database design by denormalization, duplicating attributes in order to reduce frequent join processing. In this model, we consider insert, update, and delete costs. The anomalies are treated by the additional disk I/O required for each insert and update transaction. We propose a branch and bound method for this model and show considerable cost reduction.
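
A back-of-the-envelope sketch of the disk-I/O trade-off this abstract describes, with invented coefficients: duplication pays off only when the join I/O saved on reads outweighs the extra I/O spent keeping the duplicated copies consistent on writes.

```python
# Illustrative I/O trade-off for one candidate duplication.
# All coefficients are invented; the paper's actual cost model is richer.

def net_io_benefit(query_freq, join_io_saved,
                   write_freq, extra_io_per_write):
    """Net disk-I/O benefit per unit time of duplicating one attribute."""
    saved = query_freq * join_io_saved        # joins no longer needed
    spent = write_freq * extra_io_per_write   # anomaly treatment on writes
    return saved - spent

# A read-heavy attribute: duplication pays off.
print(net_io_benefit(query_freq=500, join_io_saved=4,
                     write_freq=20, extra_io_per_write=6))   # 1880
# A write-heavy attribute: duplication costs more than it saves.
print(net_io_benefit(query_freq=50, join_io_saved=4,
                     write_freq=400, extra_io_per_write=6))  # -2200
```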

A Database Design without Storage Constraint Considering Denormalization in Relational Database (관계형 데이터베이스에서 저장용량에 제약이 없는 경우 비 정규화를 고려한 데이터베이스 설계)

  • 장영관; 강맹규
    • Journal of Korean Society of Industrial and Systems Engineering, v.19 no.37, pp.251-261, 1996
  • Databases are critical to business information systems, and the RDBMS is the most widely used database system. Normalization was designed to control various anomalies (insert, update, and delete anomalies). However, a normalized database design does not account for the tradeoffs necessary for performance reasons. In this research, we model a database design problem without a storage constraint. Given a normalized database design, the model applies denormalization by duplicating columns in order to reduce frequent join processing. We consider insert, update, delete, and storage costs, and the anomalies are treated by the additional disk I/O required for each insert and update transaction. We propose a branch and bound method and show considerable cost reduction.

An Optimal Database Design Considering Denormalization in Relational Database (관계형 데이터베이스에서 비정규화를 고려한 최적 데이터베이스 설계)

  • 장영관; 강맹규
    • The Journal of Information Technology and Database, v.3 no.1, pp.3-24, 1996
  • Databases are critical to business information systems, and the RDBMS is the most widely used database system. Normalization has been designed to control various anomalies (insert, update, and delete anomalies). However, a normalized database design does not account for the tradeoffs necessary for performance. In this research, we develop a model for database design by denormalization, duplicating attributes in order to reduce frequent join processing. In this model, we consider insert, update, delete, and query costs. The anomalies and data inconsistency are removed by the additional disk I/O required for each update and insert transaction. We propose a branch and bound method for this model and show considerable cost reduction.

Negative Side Effects of Denormalization-Oriented Data Modeling in Enterprise-Wide Database Design (기업 전사 자료 설계에서 역정규화 중심 데이터 모델링의 부작용)

  • Rhee, Hae-Kyung
    • Journal of the Institute of Electronics Engineers of Korea CI, v.43 no.6 s.312, pp.17-25, 2006
  • As the information systems to be computerized are scaled up significantly, data modeling issues are once again considered crucial, as they were in the early 1980s, under the terms data governance, data architecture, and data quality. Unfortunately, merely resorting to heuristics-based field approaches, with little firm theoretical foundation regarding the criteria of data design, quite often leads to major failures in the efficacy of data modeling. In this paper, we compare a normalization-centered data modeling approach, well known in the literature as the Non-Stop (NS) Data Modeling methodology, with Information Engineering (IE), in which the notion of denormalization is in many cases supported and even recommended as a mandatory part of modeling. Quantitative analyses reveal that the NS methodology outperforms the IE methodology in terms of efficiency indices such as the adequacy of entity judgment, the degree to which data circulation paths exist (which confirms the balancedness of a data design), and the ratio of unnecessary data attribute replication.

Implementation of Exchange Rate Forecasting Neural Network Using Heterogeneous Computing (이기종 컴퓨팅을 활용한 환율 예측 뉴럴 네트워크 구현)

  • Han, Seong Hyeon; Lee, Kwang Yeob
    • Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology, v.7 no.11, pp.71-79, 2017
  • In this paper, we implemented an exchange rate forecasting neural network using heterogeneous computing. Exchange rate forecasting requires a large amount of data, and we used a neural network suited to exploiting it. The workload splits into two processes: learning and verification. Learning ran on the CPU; for verification, RTL written in Verilog HDL ran on an FPGA. The network has four input neurons, four hidden neurons, and one output neuron. The inputs were the rates for 1 US dollar, 100 Japanese yen, 1 euro, and 1 British pound; the output neuron predicted the value of 1 Canadian dollar. The prediction pipeline is: input, normalization, fixed-point conversion, neural network forward pass, floating-point conversion, denormalization, and output. Forecasting the exchange rate for November 2016 produced errors between 0.9 won and 9.13 won. Adding data beyond exchange rates and increasing the number of neurons should enable more precise exchange rate prediction.
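
A software sketch of that pipeline is shown below, with an assumed Q16.16 fixed-point format and toy random weights; the paper's Verilog implementation, number format, and trained weights are not given in the abstract.

```python
# Sketch of the pipeline: normalize -> fixed-point -> forward pass
# (4 input, 4 hidden, 1 output) -> float -> denormalize.
# Q16.16 format and weights are assumptions, not the paper's values.
import math
import random

FRAC_BITS = 16                          # assumed Q16.16 fixed point
to_fix   = lambda x: int(round(x * (1 << FRAC_BITS)))
to_float = lambda q: q / (1 << FRAC_BITS)
fix_mul  = lambda a, b: (a * b) >> FRAC_BITS

def normalize(x, lo, hi):   return (x - lo) / (hi - lo)
def denormalize(y, lo, hi): return y * (hi - lo) + lo

random.seed(0)                          # toy weights in fixed point
W1 = [[to_fix(random.uniform(-1, 1)) for _ in range(4)] for _ in range(4)]
W2 = [to_fix(random.uniform(-1, 1)) for _ in range(4)]

def forward(rates, in_lo, in_hi, out_lo, out_hi):
    # rates: won per USD 1, JPY 100, EUR 1, GBP 1 (order from the abstract)
    x = [to_fix(normalize(r, lo, hi))
         for r, lo, hi in zip(rates, in_lo, in_hi)]
    h = []
    for row in W1:                      # hidden layer, sigmoid activation
        s = to_float(sum(fix_mul(w, xi) for w, xi in zip(row, x)))
        h.append(to_fix(1.0 / (1.0 + math.exp(-s))))
    y = to_float(sum(fix_mul(w, hj) for w, hj in zip(W2, h)))
    return denormalize(y, out_lo, out_hi)   # predicted won per CAD 1

print(forward([1160.0, 1040.0, 1270.0, 1450.0],
              in_lo=[1000] * 4, in_hi=[1600] * 4, out_lo=800, out_hi=1000))
```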

Urban Change Detection for High-resolution Satellite Images Using U-Net Based on SPADE (SPADE 기반 U-Net을 이용한 고해상도 위성영상에서의 도시 변화탐지)

  • Song, Changwoo; Wahyu, Wiratama; Jung, Jihun; Hong, Seongjae; Kim, Daehee; Kang, Joohyung
    • Korean Journal of Remote Sensing, v.36 no.6_2, pp.1579-1590, 2020
  • In this paper, a spatially-adaptive denormalization (SPADE) based U-Net is proposed to detect changes in high-resolution satellite images. The proposed network uses SPADE to preserve spatial information. Change detection on high-resolution satellite imagery can support various urban applications such as city planning and forecasting. With conventional pixel-based change detection methods such as Iteratively Reweighted Multivariate Alteration Detection (IR-MAD), unchanged areas can be reported as changed because pixel values are sensitive to environmental conditions such as seasonal differences between images. Therefore, to precisely detect changes in the objects that make up a city across time-series satellite images, we define the semantic spatial objects that constitute the city, extract them through deep-learning-based image segmentation, and then analyze the changes between the extracted areas to carry out change detection. The semantic objects for analyzing changes were defined as six classes: building, road, farmland, vinyl house, forest area, and waterside area. Each network model, trained on KOMPSAT-3A satellite images, performs change detection on time-series KOMPSAT-3 satellite images. For objective assessment, we use the F1-score and the kappa coefficient. The proposed method outperforms U-Net and UNet++, achieving an average F1-score of 0.77 and a kappa of 77.29.
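
For reference, a minimal SPADE block in PyTorch, following the general technique (Park et al.) rather than this paper's exact configuration; the channel and hidden sizes are illustrative.

```python
# Minimal SPADE block: parameter-free normalization whose scale and shift
# are predicted per pixel from a segmentation map, preserving spatial
# layout. Sizes are illustrative, not the paper's configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SPADE(nn.Module):
    def __init__(self, feat_channels, seg_classes, hidden=128):
        super().__init__()
        # Parameter-free normalization: SPADE supplies the affine terms.
        self.norm = nn.BatchNorm2d(feat_channels, affine=False)
        self.shared = nn.Sequential(
            nn.Conv2d(seg_classes, hidden, 3, padding=1), nn.ReLU())
        self.gamma = nn.Conv2d(hidden, feat_channels, 3, padding=1)
        self.beta  = nn.Conv2d(hidden, feat_channels, 3, padding=1)

    def forward(self, feat, segmap):
        # Resize the segmentation map to the feature resolution, then
        # predict the per-pixel scale and shift: the spatially adaptive
        # "denormalization" that keeps spatial information intact.
        seg = F.interpolate(segmap, size=feat.shape[2:], mode="nearest")
        actv = self.shared(seg)
        return self.norm(feat) * (1 + self.gamma(actv)) + self.beta(actv)

# Toy usage: 6 semantic classes, matching the abstract's class count.
feat = torch.randn(2, 64, 32, 32)
segmap = torch.randn(2, 6, 128, 128)
print(SPADE(64, 6)(feat, segmap).shape)   # torch.Size([2, 64, 32, 32])
```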