• Title/Summary/Keyword: Generate Data


Land Use Classification in the Seoul Metropolitan Region - An Application of Remote Sensing - (인공위성 영상자료를 이용한 수도권 토지이용 실태분석)

  • 김영표;김순희
    • Spatial Information Research
    • /
    • v.2 no.2
    • /
    • pp.135-145
    • /
    • 1994
  • The primary purpose of this study is to generate real data and image photographs on the physical land use of the Seoul metropolitan region, using Landsat remote sensing data and the image processing software ERDAS. The remote sensing data used in this study are Landsat MSS data (August 28, 1979) and TM data (May 31, 1991) covering the Seoul metropolitan region of Korea; the spatial resolutions of the MSS and TM data are 57 m X 79 m and 30 m X 30 m respectively. In addition, this study aims to contrast the urbanization phases of the Seoul metropolitan region in 1979 with those in 1991 by producing image photographs and statistics on physical land use. Summing up the major results, the built-up area ratio within the Seoul city limits expanded from 41.9% in 1979 to 64.5% in 1991, and that within a 40 km radius of Seoul city hall expanded from 10.5% in 1979 to 19.8% in 1991. The data and techniques developed in this study could serve as useful tools for various kinds of spatial planning, such as urban and regional planning, selection of optimal new town locations, and evaluation of public facility location alternatives.


OLAP System and Performance Evaluation for Analyzing Web Log Data (웹 로그 분석을 위한 OLAP 시스템 및 성능 평가)

  • 김지현;용환승
    • Journal of Korea Multimedia Society
    • /
    • v.6 no.5
    • /
    • pp.909-920
    • /
    • 2003
  • Nowadays, IT for CRM has been growing and developing rapidly. Typical techniques include statistical analysis tools, on-line analytical processing (OLAP) tools, and data mining algorithms (such as neural networks, decision trees, and association rules). Among customer data, web log data is particularly important, and to use these data efficiently, OLAP technology is applied for multidimensional analysis. To build an OLAP cube, multidimensional summary results must be precalculated in order to obtain fast responses. But as the number of dimensions and sparse cells increases, serious data explosion occurs and OLAP performance decreases. In this paper, we present why web log data sparsity occurs and what kinds of sparsity patterns are generated in two and three dimensions for OLAP. Based on this research, we set up multidimensional data models and query models for benchmarking each sparsity pattern. Finally, we evaluated the performance of three OLAP systems (MS SQL 2000 Analysis Service, Oracle Express, and C-MOLAP).

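The data-explosion effect the abstract describes can be illustrated with a toy cube: the number of potential cells grows multiplicatively with each added dimension, while the number of filled cells stays small. A minimal sketch (the fact table and dimensions below are hypothetical, not from the paper):

```python
# Toy web-log fact table: (page, browser, hour) -> hit count.
# Most dimension combinations never occur in real logs.
facts = {
    ("home", "chrome", 9): 120,
    ("home", "firefox", 9): 40,
    ("cart", "chrome", 14): 7,
}

dims = {
    "page": ["home", "cart", "help"],
    "browser": ["chrome", "firefox", "safari"],
    "hour": list(range(24)),
}

# A fully precalculated cube must materialize every cell combination,
# so its size is the product of the dimension cardinalities.
total_cells = 1
for values in dims.values():
    total_cells *= len(values)

sparsity = 1 - len(facts) / total_cells
print(f"{total_cells} cells, {sparsity:.1%} empty")  # 216 cells, 98.6% empty
```

Adding a fourth dimension with even 10 members would multiply the cube to 2,160 cells while the fact count stays at 3, which is the sparsity growth the paper benchmarks.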

Assessment of Insolation Data in Korea for Building Energy Performance Assessment (건물에너지 성능 평가를 위한 효과적 기상자료 선정에 관한 연구)

  • Kim, K.S.;Kim, C.B.;Park, J.U.;Yoon, J.H.;Lee, E.J.;Song, I.C.
    • Solar Energy
    • /
    • v.18 no.3
    • /
    • pp.31-39
    • /
    • 1998
  • Selection of the right weather data set has been considered one of the most important factors for a successful building energy audit process. A 30-year raw weather database for six major cities has been developed to provide weather data files for building energy audit and retrofit analysis in Korea. The program named KWDP (KIER Weather Data Processor) uses the database to produce the right data set for a specific building energy performance simulation program such as DOE2.1E. A program called WMAKE has been developed to generate the right set of input parameters for the DOE2.1E weather utility program. Together, these programs can provide the right weather data for specific building energy audit and retrofit analyses.


Vehicle Location Data Generator Based on a User-Defined Scenario (사용자 지정 시나리오에 기반한 차량 위치 데이터 생성기)

  • Jung Young-Jin;Cho Eun-Sun;Ryu Keun-Ho
    • Journal of the Korea Society of Computer and Information
    • /
    • v.11 no.2 s.40
    • /
    • pp.101-110
    • /
    • 2006
  • The development of various geographic observation, GPS, and wireless communication technologies makes it easy to track many moving objects and to build intelligent transport systems and transport vehicle management systems. However, it is difficult to evaluate such systems fairly with a variety of tests, because real vehicle data are insufficient for evaluating and testing transport plans. Therefore, moving object data generators are used in most research. Existing data generators, however, cannot generate vehicle trajectories according to a user-defined scenario reflecting a transport plan, because they consider only a Gaussian distribution over the road network. In this paper, we design and implement a vehicle data generator that creates vehicle trajectory data based on a user-defined scenario. The designed data generator can place vehicle locations according to the user's transport plan. In addition, we store scenarios as patterns and reuse previously defined scenarios.

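Scenario-driven trajectory generation of the kind this abstract describes can be sketched in a few lines; the scenario format (timed waypoints) and the positional-jitter model below are illustrative assumptions, not the paper's actual design:

```python
import random

# Hypothetical user-defined scenario: timed waypoints of a transport plan.
scenario = [
    {"at": (0.0, 0.0), "time": 0},
    {"at": (1.0, 0.5), "time": 60},
    {"at": (2.0, 2.0), "time": 180},
]

def generate_trajectory(plan, step=30, jitter=0.01, seed=7):
    """Emit (time, x, y) samples along the plan's waypoints,
    linearly interpolated with a little positional noise."""
    rng = random.Random(seed)
    points = []
    for a, b in zip(plan, plan[1:]):
        (x0, y0), (x1, y1) = a["at"], b["at"]
        t0, t1 = a["time"], b["time"]
        t = t0
        while t < t1:
            f = (t - t0) / (t1 - t0)  # fraction of the segment covered
            points.append((t,
                           x0 + f * (x1 - x0) + rng.uniform(-jitter, jitter),
                           y0 + f * (y1 - y0) + rng.uniform(-jitter, jitter)))
            t += step
    points.append((plan[-1]["time"], *plan[-1]["at"]))  # final waypoint
    return points

track = generate_trajectory(scenario)
```

Because the scenario is plain data, it can be stored and reused across runs, mirroring the pattern-storage idea in the abstract.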

A Multimedia Data Compression Scheme for Disaster Prevention in Wireless Multimedia Sensor Networks

  • Park, Jun-Ho;Lim, Jong-Tae;Yoo, Jae-Soo;Oh, Yong-Sun;Oh, Sang-Hoon;Min, Byung-Won;Park, Sun-Gyu;Noh, Hwang-Woo;Hayashida, Yukuo
    • International Journal of Contents
    • /
    • v.11 no.2
    • /
    • pp.31-36
    • /
    • 2015
  • Recent years have seen a significant increase in demand for multimedia data over wireless sensor networks for monitoring applications that utilize sensor nodes to collect multimedia data, including sound and video. However, multimedia streams generate a very large amount of data. When data transmission schemes for traditional wireless sensor networks are applied to wireless multimedia sensor networks, the network lifetime decreases significantly due to the excessive energy consumption of specific nodes. In this paper, we propose a data compression scheme that applies the Chinese Remainder Theorem (CRT) to a wireless multimedia sensor network. The proposed scheme uses the CRT to compress and split multimedia data, and then transmits the bit-pattern packets of the remainders to the base station. As a result, the amount of multimedia data transmitted is reduced. The superiority of the proposed scheme is demonstrated by comparing its performance to that of an existing scheme. The results of our experiment indicate that the proposed scheme significantly increases the compression ratio and reduces the cost of the compression operation compared to existing compression schemes.
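The CRT split-and-merge step underlying such a scheme can be sketched as follows; the moduli and the sample value are illustrative choices, not the paper's parameters:

```python
from functools import reduce

# Illustrative pairwise-coprime moduli; each remainder fits in 8 bits,
# so a 16-bit sample is split into two small packets.
MODULI = (251, 253)  # 251 * 253 = 63503 > 2**16 - 1? No: covers values < 63503

def crt_split(value: int) -> tuple:
    """Split a sample into small remainders, one per modulus."""
    return tuple(value % m for m in MODULI)

def crt_merge(remainders: tuple) -> int:
    """Reconstruct the original value from its remainders via the CRT."""
    M = reduce(lambda a, b: a * b, MODULI)
    total = 0
    for r, m in zip(remainders, MODULI):
        Mi = M // m
        total += r * Mi * pow(Mi, -1, m)  # pow(.., -1, m): modular inverse
    return total % M

sample = 40000  # any value below 63503 round-trips exactly
assert crt_merge(crt_split(sample)) == sample
```

The benefit in the sensor-network setting is that each remainder is much smaller than the original sample, so the remainder streams can be packetized and routed separately toward the base station.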

A Study on the Synthetic ECG Generation for User Recognition (사용자 인식을 위한 가상 심전도 신호 생성 기술에 관한 연구)

  • Kim, Min Gu;Kim, Jin Su;Pan, Sung Bum
    • Smart Media Journal
    • /
    • v.8 no.4
    • /
    • pp.33-37
    • /
    • 2019
  • Because ECG signals are time-series data acquired over time, it is important to obtain comparative data of the same size as the enrolled data every time. This paper proposes a GAN (Generative Adversarial Network) model based on an auxiliary classifier to generate synthetic ECG signals, which may address the data-size mismatch. Cosine similarity and cross-correlation are used to examine the similarity of the synthetic ECG signals. The analysis shows that the average cosine similarity was 0.991 and the average Euclidean-distance similarity based on cross-correlation was 0.25. These results indicate that the data-size issue can be resolved, and that the generated synthetic ECG signals, being similar to real ECG signals, can serve as synthetic data even when the enrolled data and the comparative data differ in size.
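The cosine-similarity check used above compares the shape of two equal-length signals independently of their amplitude. A minimal sketch with toy beat vectors (the numbers are illustrative, not real ECG data):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length signal vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

real = [0.0, 0.8, 1.0, 0.3, -0.2, 0.0]        # toy "real" ECG beat
synth = [0.1, 0.75, 0.95, 0.35, -0.15, 0.05]  # toy "synthetic" beat
sim = cosine_similarity(real, synth)
print(round(sim, 3))  # close to 1.0: the beats have nearly the same shape
```

A value near 1.0, like the paper's reported 0.991 average, means the synthetic signal closely tracks the real one's morphology.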

Qualification Test of ROCSAT -2 Image Processing System

  • Liu, Cynthia;Lin, Po-Ting;Chen, Hong-Yu;Lee, Yong-Yao;Kao, Ricky;Wu, An-Ming
    • Proceedings of the KSRS Conference
    • /
    • 2003.11a
    • /
    • pp.1197-1199
    • /
    • 2003
  • The ROCSAT-2 mission is to image Taiwan and the surrounding area daily for disaster monitoring, land use, and ocean surveillance during its 5-year mission lifetime. The satellite will be launched in December 2003 into its mission orbit, a 14 rev/day repetitive Sun-synchronous orbit with minimum eccentricity, descending over (120 deg E, 24 deg N) and crossing the equator at 9:45 a.m. The National Space Program Office (NSPO) is developing the ROCSAT-2 Image Processing System (IPS), which aims to provide real-time, high-quality image data for the ROCSAT-2 mission. A simulated ROCSAT-2 image, based on Level 1B QuickBird data, was generated for IPS verification. The test image comprises one panchromatic band and four multispectral bands. The qualification process consists of four procedures: (a) QuickBird image processing, (b) generation of a simulated ROCSAT-2 image in Generic Raw Level Data (GERALD) format, (c) ROCSAT-2 image processing, and (d) geometric error analysis. Standard QuickBird photogrammetric parameters of a camera model of the imaging and optical system are used to calculate the latitude and longitude of each line and sample. The backward (inverse-model) approach is applied to find the relationship between the geodetic coordinate system (latitude, longitude) and the image coordinate system (line, sample). The bilinear resampling method is used to generate the test image. Ground control points are used to evaluate the data-processing error. The data processing involves various coordinate-system transformations using attitude quaternions and orbital elements. Through the qualification test process, it was verified that the IPS is capable of handling high-resolution image data with Level 2 processing accuracy within 500 m.

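The bilinear resampling step mentioned in the abstract reduces, per output pixel, to a distance-weighted average of the four surrounding input pixels. A minimal sketch (the tiny grid is illustrative):

```python
def bilinear_sample(img, x, y):
    """Sample a 2-D grid at fractional (x, y) via bilinear interpolation."""
    h, w = len(img), len(img[0])
    x0 = min(int(x), w - 2)  # clamp so the 2x2 neighbourhood stays in-grid
    y0 = min(int(y), h - 2)
    dx, dy = x - x0, y - y0
    return (img[y0][x0]         * (1 - dx) * (1 - dy)
          + img[y0][x0 + 1]     * dx       * (1 - dy)
          + img[y0 + 1][x0]     * (1 - dx) * dy
          + img[y0 + 1][x0 + 1] * dx       * dy)

grid = [[0.0, 10.0],
        [20.0, 30.0]]
print(bilinear_sample(grid, 0.5, 0.5))  # 15.0: centre of the 2x2 patch
```

In the IPS pipeline, the fractional (x, y) for each output pixel would come from the inverse camera model mapping geodetic coordinates back into line/sample space.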

Analysis of Success Factors of OTT Original Contents Through BigData, Netflix's 'Squid Game Season 2' Proposal (빅데이터를 통한 OTT 오리지널 콘텐츠의 성공요인 분석, 넷플릭스의 '오징어게임 시즌2' 제언)

  • Ahn, Sunghun;Jung, JaeWoo;Oh, Sejong
    • Journal of Korea Society of Digital Industry and Information Management
    • /
    • v.18 no.1
    • /
    • pp.55-64
    • /
    • 2022
  • This study analyzes the success factors of OTT original content through big data and suggests scenario, casting, fun, and emotional elements for producing subsequent works, along with suggestions for the success of 'Squid Game Season 2'. The success factors of 'Squid Game' identified through big data are: first, it is a simple psychological experimental game; second, a retro strategy; third, modern visual beauty and color; fourth, simple aesthetics; fifth, the OTT platform Netflix; sixth, Netflix's video recommendation algorithm; seventh, it induced binge-watching. Lastly, consensus was high because it coincided with a pandemic situation in which people were led to think about 'death' and 'money'. The suggestions for 'Squid Game Season 2' are as follows: first, a fusion of famous traditional games from each country; second, an AI-based strategy for planning, producing, and selling MD products; third, casting based on artificial-intelligence big data; fourth, a strategy for secondary copyright and copyright sales. A limitation of this study is that it relied only on external data; data inside the Netflix platform was not utilized. If AI big data is used not only in the OTT field but also by entertainment and film companies, it will be possible to discover better business models and generate stable profits.

Utilization of Skewness for Statistical Quality Control (통계적 품질관리를 위한 왜도의 활용)

  • Kim, Hoontae;Lim, Sunguk
    • Journal of Korean Society for Quality Management
    • /
    • v.51 no.4
    • /
    • pp.663-675
    • /
    • 2023
  • Purpose: Skewness is an indicator used to measure the asymmetry of a data distribution. In the past, product quality was judged only by mean and variance, but in modern management and manufacturing environments, various factors and sources of volatility must be considered. Skewness helps to accurately understand the shape of a data distribution and to identify outliers or problems, and it can be utilized from this new perspective. We therefore propose a statistical quality control method using skewness. Methods: To generate data with the same mean and variance but different skewness, data were generated from the normal distribution and the gamma distribution. Using Minitab 18, we created 20 sets of 1,000 random data points from each distribution. With these data, it was shown that the process state can be identified sensitively by using skewness. Results: If the skewness is within ±0.2, the judgment does not differ from management based on the error probabilities conventionally accepted for the in-control state in quality control. However, if the skewness exceeds ±0.2, a control chart considering only the standard deviation judges the process to be in control even though the data show it is out of control. Conclusion: Using skewness in process management improves the ability to evaluate data quality and to detect abnormal signals. With this, process problems can be quickly identified and process improvements made.
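The Methods paragraph can be sketched in a few lines: draw samples from a normal and a gamma distribution matched in mean and variance, and check that only skewness separates them. The ±0.2 band is the paper's threshold; the specific distribution parameters and sample size below are illustrative assumptions:

```python
import random
import statistics

def skewness(data):
    """Sample skewness: the third standardized moment."""
    m = statistics.fmean(data)
    s = statistics.pstdev(data)
    n = len(data)
    return sum((x - m) ** 3 for x in data) / (n * s ** 3)

random.seed(42)
# Both samples have mean 4 and variance 4, but different shapes:
normal = [random.gauss(4, 2) for _ in range(10_000)]
gamma = [random.gammavariate(4, 1) for _ in range(10_000)]  # skew = 2/sqrt(4) = 1

print(round(skewness(normal), 2), round(skewness(gamma), 2))
```

A mean/standard-deviation control chart sees these two processes as identical; the skewness statistic is what flags the gamma process as outside the ±0.2 band.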

Automated Terrain Data Generation for Urban Flood Risk Mapping Using c-GAN and BBDM

  • Jonghyuk Lee;Sangik Lee;Byung-hun Seo;Dongsu Kim;Yejin Seo;Dongwoo Kim;Yerim Cho;Won Choi
    • International conference on construction engineering and project management
    • /
    • 2024.07a
    • /
    • pp.1294-1294
    • /
    • 2024
  • Flood risk maps are used in urban flooding to understand the spatial extent and depth of inundation damage. Constructing these maps requires hydrodynamic modeling capable of simulating flood waves. Flood waves are typically fast, and inundation patterns can vary significantly depending on the terrain, so accurately representing the terrain of the flood source is essential in flood wave analysis. Recently, UAV-based methods for terrain data construction, using Structure-from-Motion or LiDAR, have been utilized. These methods depend on UAV operations, however, and thus still require a great deal of time and manpower, and they are unavailable when UAV operations are not possible. Therefore, for efficient nationwide monitoring, this study developed a model that can automatically generate terrain data by estimating depth information from a single image using a c-GAN (Conditional Generative Adversarial Network) and a BBDM (Brownian Bridge Diffusion Model). The training, utilization, and validation datasets employed images from ISPRS (2018) and aerial image sets photographed directly at five locations in the Republic of Korea. Compared with the ground truth of the test dataset, the model generates highly accurate and precise terrain data with high reproducibility, and the results are considered sufficiently usable as terrain data for flood wave analysis.