• Title/Summary/Keyword: Generate Data

Development of Homogeneous Road Section Determination and Outlier Filter Algorithm (국도의 동질구간 선정과 이상치 제거 방법에 관한 연구)

  • Do, Myung-Sik;Kim, Sung-Hyun;Bae, Hyun-Sook;Kim, Jong-Sik
    • Journal of Korean Society of Transportation / v.22 no.7 s.78 / pp.7-16 / 2004
  • A homogeneous road section is defined as one with similar traffic characteristics in terms of both demand and supply. The demand-side criteria are the diverging rate, the ratio of green time to cycle time at signalized intersections, and the distance between signalized intersections; the supply-side criteria are traffic patterns such as traffic volume and speed. In this study, pointing out the problems of existing methods for removing obscure data, an effective method to generate valuable data is proposed using data collected between Gonjiam IC and Jangji IC on national highway No. 3. Travel times were collected by licence-plate matching, and traffic volumes and speeds were collected from detectors. Furthermore, a method of selecting homogeneous road sections is proposed that considers the demand and supply aspects simultaneously. This method, using an outlier filtering algorithm, can be applied to build travel time forecasting models and to revise obscured or missing data transmitted from detectors. The point and link data collected at the same time on the national highway can serve as a basis for predicting travel time and revising obscured data in the future.
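The abstract does not spell out the filtering rule itself. As a rough, hypothetical illustration of outlier removal for licence-plate travel-time samples, the sketch below applies a rolling median-absolute-deviation test; the window size and threshold `k` are arbitrary assumptions, not values from the paper.

```python
# Minimal sketch of travel-time outlier filtering (hypothetical parameters,
# not the paper's actual algorithm): a sample is dropped when it deviates
# from the rolling median of past samples by more than k * MAD.
from statistics import median

def filter_outliers(travel_times, window=20, k=3.0, min_obs=5):
    kept = []
    for i, t in enumerate(travel_times):
        neighbours = travel_times[max(0, i - window):i]
        if len(neighbours) < min_obs:          # too little history: keep the sample
            kept.append(t)
            continue
        med = median(neighbours)
        mad = median(abs(x - med) for x in neighbours) or 1e-9
        if abs(t - med) <= k * mad:
            kept.append(t)
    return kept

# Example: the 95 s observation is removed from a stream of ~60 s travel times.
print(filter_outliers([58, 61, 60, 62, 59, 95, 60, 61]))
```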

Development and Application of a Scenario Analysis System for CBRN Hazard Prediction (화생방 오염확산 시나리오 분석 시스템 구축 및 활용)

  • Byungheon Lee;Jiyun Seo;Hyunwoo Nam
    • Journal of the Korea Society for Simulation / v.33 no.3 / pp.13-26 / 2024
  • The CBRN (Chemical, Biological, Radiological, and Nuclear) hazard prediction model is a system that supports commanders in making better decisions by generating contamination distributions and damage prediction areas from the weapon used, terrain, and weather information in the event of chemical, biological, or radiological incidents. NBC_RAMS (Nuclear, Biological and Chemical Reporting And Modeling S/W System), developed by ADD (Agency for Defense Development), is used not only to support decision making for various military operations and exercises but also to analyze CBRN-related events after the fact. Based on the core engine of NBC_RAMS, we built a CBR hazard assessment scenario analysis system that can generate contamination distribution predictions reflecting various CBR scenarios, and we describe how to configure it for specific purposes in terms of input information, meteorological data, land data with land coverage and DEM, and building data in polygon form. As practical use cases, we address a technology that tracks the origin of a contaminant source with artificial intelligence and a technology that selects the optimal locations of CBR detection sensors using score data, both by analyzing the large amounts of data generated with the scenario analysis system. Through this system, it is possible to generate CBRN training and analysis data specialized for AI and to support the planning of operations and exercises through battlefield prediction.
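The abstract lists the input categories (release parameters, weather, land cover/DEM, building polygons) and notes that large scenario sets are generated for AI training. As a loose illustration of how such a scenario set might be enumerated, here is a minimal sketch; the field names and the `run_dispersion_model` placeholder are assumptions and do not reflect the real NBC_RAMS interface.

```python
# Hypothetical sketch of enumerating CBR dispersion scenarios for batch analysis.
# Field names and run_dispersion_model() are illustrative assumptions only.
from dataclasses import dataclass
from itertools import product

@dataclass
class Scenario:
    agent: str            # chemical/biological/radiological agent
    release_kg: float     # released amount
    wind_dir_deg: float   # meteorological inputs
    wind_speed_ms: float
    stability: str        # atmospheric stability class

def run_dispersion_model(s: Scenario) -> float:
    """Placeholder for the dispersion engine; returns a dummy footprint size."""
    return s.release_kg * s.wind_speed_ms * 0.1   # not a physical model

scenarios = [
    Scenario(agent, kg, wd, ws, st)
    for agent, kg, wd, ws, st in product(
        ["GB", "HD"], [10.0, 100.0], [0, 90, 180, 270], [2.0, 5.0], ["D", "F"])
]
results = [(s, run_dispersion_model(s)) for s in scenarios]
print(len(results), "scenario/result pairs generated")   # 64 combinations
```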

Verification Algorithm for the Duplicate Verification Data with Multiple Verifiers and Multiple Verification Challenges

  • Xu, Guangwei;Lai, Miaolin;Feng, Xiangyang;Huang, Qiubo;Luo, Xin;Li, Li;Li, Shan
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.2 / pp.558-579 / 2021
  • Cloud storage provides flexible data storage services that let data owners outsource their data remotely, reducing their data storage operation and management costs. These outsourced data raise security concerns for the data owner because of possible malicious deletion or corruption by the cloud service provider. Data integrity verification is an important way to check the integrity of outsourced data. However, existing data verification schemes only consider the case where a single verifier launches multiple data verification challenges, and they neglect the verification overhead when multiple verifiers launch multiple challenges at around the same time. In that case, the duplicate data in the challenges are verified repeatedly, so verification resources are consumed in vain. We propose a duplicate data verification algorithm based on multiple verifiers and multiple challenges to reduce the verification overhead. The algorithm dynamically schedules the verifiers' challenges based on verification time and on the frequent itemsets of duplicate verification data in the challenge sets, found by applying the FP-Growth algorithm, and computes batch proofs for the frequent itemsets. Each challenge is then split into two parts, duplicate data and unique data, according to the results of the extraction. Finally, the proofs of the duplicate data and the unique data are computed and combined to generate a complete proof for every original challenge. Theoretical analysis and experimental evaluation show that the algorithm reduces the verification cost and ensures the correctness of the data integrity verification through flexible batch data verification.
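As a simplified illustration of the duplicate/unique split described above, the sketch below counts which blocks are challenged by more than one verifier at the same time; the FP-Growth mining and the cryptographic batch proofs of the actual algorithm are omitted.

```python
# Minimal sketch of splitting concurrent verification challenges into
# duplicate blocks (requested by two or more verifiers, proved once as a
# batch) and unique blocks (proved per challenge). Proof computation itself
# is omitted; block ids and verifiers are made-up examples.
from collections import Counter

challenges = {                       # verifier -> set of challenged block ids
    "verifier_A": {1, 2, 3, 7},
    "verifier_B": {2, 3, 8},
    "verifier_C": {3, 7, 9},
}

count = Counter(b for blocks in challenges.values() for b in blocks)
duplicate = {b for b, c in count.items() if c > 1}        # proved once, reused
for verifier, blocks in challenges.items():
    unique = blocks - duplicate                           # proved individually
    print(verifier, "duplicate part:", sorted(blocks & duplicate),
          "unique part:", sorted(unique))
```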

Verification Control Algorithm of Data Integrity Verification in Remote Data sharing

  • Xu, Guangwei;Li, Shan;Lai, Miaolin;Gan, Yanglan;Feng, Xiangyang;Huang, Qiubo;Li, Li;Li, Wei
    • KSII Transactions on Internet and Information Systems (TIIS) / v.16 no.2 / pp.565-586 / 2022
  • The elastic scalability of cloud storage not only provides flexible services for data owners to store their data remotely, but also reduces the storage operation and management costs of sharing that data. Data outsourced to the storage space of a cloud service provider, however, raises security concerns about data integrity, and data integrity verification has become an important technique for detecting the integrity of remotely shared data. Yet integrity verification performed by users without access rights to the data causes unnecessary overhead for the data owner and the cloud service provider; in particular, malicious users who constantly launch verification requests greatly waste service resources. Since the data owner is a consumer purchasing cloud services, it has to bear both the cost of data storage and the cost of data verification. This paper proposes a verification control algorithm for data integrity verification of remotely outsourced data. It designs an attribute-based encryption verification control algorithm for multiple verifiers. The data owner and the cloud service provider jointly construct a common access structure and generate a verification sentinel used to check the authority of verifiers against that structure. Finally, since the cloud service provider cannot know the access structure or the sentinel generation operation, it can only allow verifiers who satisfy the access policy to verify the integrity of the corresponding outsourced data. Theoretical analysis and experimental results show that the proposed algorithm achieves fine-grained access control over multiple verifiers for data integrity verification.
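The cryptographic details (attribute-based encryption, sentinel generation) are beyond a short example; as a simplified sketch of the access-control idea only, the following checks a verifier's attribute set against an AND/OR access structure before a verification request would be accepted. The policy tree and attribute names are made-up examples.

```python
# Simplified illustration of gating integrity verification by an access
# structure. Real ABE and the verification sentinel are replaced here by a
# plain policy-tree evaluation over the verifier's attributes.

def satisfies(policy, attributes):
    """policy: an attribute string, or ("AND"|"OR", [sub-policies])."""
    if isinstance(policy, str):
        return policy in attributes
    op, children = policy
    results = (satisfies(c, attributes) for c in children)
    return all(results) if op == "AND" else any(results)

access_structure = ("AND", ["auditor", ("OR", ["project_X", "project_Y"])])

print(satisfies(access_structure, {"auditor", "project_X"}))  # True: may verify
print(satisfies(access_structure, {"project_X"}))             # False: rejected
```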

HIPIMS Arc-Free Reactive Deposition of Non-conductive Films Using the Applied Materials ENDURA 200 mm Cluster Tool

  • Chistyakov, Roman
    • Proceedings of the Korean Vacuum Society Conference / 2012.02a / pp.96-97 / 2012
  • In nitride and oxide film deposition, sputtered metals react with nitrogen or oxygen gas in a vacuum chamber to form metal nitride or oxide films on a substrate. The physical properties of sputtered films (metals, oxides, and nitrides) are strongly influenced by the magnetron plasma density during the deposition process. Typical target power densities on the magnetron during deposition are ~(5-30) W/cm2, which gives a relatively low plasma density. The main challenge in reactive sputtering is the ability to generate a stable, arc-free discharge at high plasma densities. Arcs occur due to the formation of an insulating layer on the target surface caused by the re-deposition effect. One current method of generating an arc-free discharge is to use the commercially available Pinnacle Plus+ pulsed DC plasma generator manufactured by Advanced Energy Inc. This plasma generator applies a positive voltage pulse between negative pulses to attract electrons and discharge the target surface, thus preventing arc formation. However, this method can only generate low-density plasma and therefore does not allow full control of film properties. Also, after long runs of ~(1-3) hours, depending on the duty cycle, the stability of the reactive process is reduced due to an increased probability of arc formation. Between 1995 and 1999, a new way of magnetron sputtering called HIPIMS (high power impulse magnetron sputtering) was developed. The main idea of this approach is to apply short ~(50-100) μs high power pulses with target power densities during the pulse of ~(1-3) kW/cm2. These high power pulses generate a high-density magnetron plasma that can significantly improve and control film properties. From the beginning, the HIPIMS method has been applied to reactive sputtering processes for the deposition of conductive and non-conductive films. However, commercially available HIPIMS plasma generators have not been able to create a stable, arc-free discharge in most reactive magnetron sputtering processes. HIPIMS plasma generators have been successfully used in reactive sputtering of nitrides for hard coating applications and for Al2O3 films, but until now no HIPIMS data has been presented on reactive sputtering in cluster tools for semiconductor and MEMS applications. In this presentation, a new method of generating an arc-free discharge for reactive HIPIMS using the new Cyprium plasma generator from Zpulser LLC will be introduced. Data will be presented showing that arc formation in reactive HIPIMS can be controlled without applying a positive voltage pulse between high power pulses. Arc-free reactive HIPIMS processes for sputtering AlN, TiO2, TiN and Si3N4 on the Applied Materials ENDURA 200 mm cluster tool will be presented, together with a direct comparison of the properties of films sputtered with the Advanced Energy Pinnacle Plus+ plasma generator and the Zpulser Cyprium plasma generator.

XED: Model-based XML Editor Generator for Data-Centric XML Documents (XED: 데이타 중심 XML문서를 위한 모델 기반의 XML 편집기 생성 도구)

  • 최종명;유재우
    • Journal of KIISE:Software and Applications / v.30 no.10 / pp.894-903 / 2003
  • Although XML is widely used, it is still hard for end users to write XML documents. Many XML documents are data-centric documents with a well-defined data format, and even novices can easily write such documents if they use form-based GUIs. In this paper, we introduce a new method for automatically generating form-based XML editors for data-centric XML documents, together with an XML editor generator called XED. A DTD is composed of sequence, choice, and repetition structures, and this structure can be represented as a Document Decomposition Graph (DDG). XED allows users to generate an XML editor by applying presentation rules to the DDG, and it also lets users modify the generated editor by changing the editor's GUI properties through direct manipulation.
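XED's actual DDG representation and presentation rules are not detailed in the abstract; the following is a hypothetical sketch of the general idea only, walking a tiny sequence/choice/repetition content model and emitting flat form-field descriptors.

```python
# Hypothetical sketch of mapping a DTD-like content model (sequence, choice,
# repetition) to flat form fields, in the spirit of a model-driven editor
# generator. This is an illustration, not XED's actual DDG or rule set.

def to_form_fields(name, model, path=""):
    kind = model[0]
    here = f"{path}/{name}"
    if kind == "text":                       # leaf element -> one input field
        return [here]
    if kind == "seq":                        # sequence -> fields of all children
        return [f for cn, cm in model[1] for f in to_form_fields(cn, cm, here)]
    if kind == "choice":                     # choice -> a selector plus branches
        fields = [f"{here}[choice]"]
        for cn, cm in model[1]:
            fields += to_form_fields(cn, cm, here)
        return fields
    if kind == "rep":                        # repetition -> repeatable group
        return [f + "[*]" for f in to_form_fields(model[1], model[2], here)]

address_book = ("rep", "entry", ("seq", [
    ("name", ("text",)),
    ("contact", ("choice", [("email", ("text",)), ("phone", ("text",))])),
]))

for field in to_form_fields("addressBook", address_book):
    print(field)
```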

Association rule mining for intertransactions with considering fairly data semantics (데이터의 의미적 정보를 공정하게 반영한 인터트랜잭션들에 대한 연관규칙 탐사)

  • Ceong, Hyi-Thaek
    • The Journal of the Korea institute of electronic communication sciences / v.9 no.3 / pp.359-368 / 2014
  • Recently, intertransaction association rule mining has been studied as a way to reflect the context between transactions. In this study, we identify two problems with existing intertransaction association rule mining methods and suggest methods to solve them. First, we suggest an algorithm that reflects changes in data between transactions. Second, we propose a method to correct the unfairly counted frequency of data that arises when intertransactions are generated from transactions. The resulting rules are more meaningful than those of previous studies. We present experimental results with data measured from the marine environment.
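The paper's fairness correction is not reproduced here; as a minimal sketch of the usual setup it builds on, the following forms intertransactions with a sliding window over time-ordered transactions and counts item supports. The window size and items are made-up examples.

```python
# Minimal sketch of forming intertransactions with a sliding window over
# time-ordered transactions. Items are tagged with their relative offset so
# cross-transaction ("inter") patterns can be counted; the window size is an
# arbitrary assumption, and the paper's fairness correction is not shown.
from collections import Counter

transactions = [                      # time-ordered transactions (e.g. sensor readings)
    {"temp_up", "salinity_low"},
    {"temp_up"},
    {"algae_bloom"},
    {"temp_up", "algae_bloom"},
]

window = 2                            # how many consecutive transactions to merge
intertransactions = []
for start in range(len(transactions) - window + 1):
    merged = {f"{item}@+{offset}"
              for offset in range(window)
              for item in transactions[start + offset]}
    intertransactions.append(merged)

support = Counter(item for it in intertransactions for item in it)
print(intertransactions)
print(support.most_common(3))
```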

Rapid Manufacturing of 3D Prototype from 3D scan data using VLM-ST (단속형 가변적층쾌속조형공정을 이용한 3차원 스캔데이터로부터 3차원 시작품의 쾌속 제작)

  • 이상호;안동규;김효찬;양동열;박두섭;채희창
    • Proceedings of the Korean Society of Precision Engineering Conference / 2002.05a / pp.536-539 / 2002
  • Reverse engineering (RE) technology can quickly generate 3D point cloud data of an object by capturing the surface of a model with a 3D scanner. In rapid prototyping (RP) technology, prototypes are rapidly produced from 3D CAD models on a layer-by-layer additive basis. In this paper, a physical human head shape is duplicated using a new RP process, the transfer-type variable lamination manufacturing process using expandable polystyrene foam sheet (VLM-ST), after the point cloud data of a human head measured with a 3D SNX scanner are converted to an STL file. The duplicated head shape shows that the VLM-ST process, in connection with the 3D scanner, is a fast and efficient process in which shapes with free surfaces, such as the human head, can be duplicated with ease. Considering the measurement time and the shape duplication time, the use of the 3D SNX scanner and the VLM-ST process is expected to reduce the lead time for the development of new products compared with other existing RE-RP manufacturing systems.
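Surface reconstruction from the raw point cloud is outside the scope of a short example, but the STL conversion step mentioned above can be illustrated: the sketch below writes already-triangulated facets to an ASCII STL file. The single triangle is a made-up placeholder, not scan data.

```python
# Minimal sketch of writing triangulated scan data to an ASCII STL file.
# Surface reconstruction from the point cloud is assumed to have been done
# already; the single triangle below is a made-up example.
def write_ascii_stl(path, triangles, name="scan"):
    """triangles: list of ((x,y,z), (x,y,z), (x,y,z)) facets."""
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for v0, v1, v2 in triangles:
            f.write("  facet normal 0 0 0\n")       # normals left to the slicer
            f.write("    outer loop\n")
            for x, y, z in (v0, v1, v2):
                f.write(f"      vertex {x:.6f} {y:.6f} {z:.6f}\n")
            f.write("    endloop\n")
            f.write("  endfacet\n")
        f.write(f"endsolid {name}\n")

write_ascii_stl("head.stl", [((0, 0, 0), (1, 0, 0), (0, 1, 0))])
```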

Simulation of Stage-Storage Curve Function in Irrigation Reservoirs (저수지 내용적 곡선의 모의발생)

  • 김현영;윤인택;최용선;오수훈
    • Magazine of the Korean Society of Agricultural Engineers / v.37 no.5 / pp.73-80 / 1995
  • The stage-storage curve function has diverse uses in irrigation reservoirs. It can be used to determine the optimal spillway length and the inundation area above the full water level based on flood routing in the reservoir. In addition, it is used to transform stage into storage for reservoir water management, in which the storage is the supply water. The curve is also necessary for planning dredging, estimating the effective and dead storage, managing drought with the reservoir, and so on. For these purposes, however, curve function data are largely unavailable: according to the statistics, about 74% of the 2,900 reservoirs maintained by the Farm Land Improvement Associations have no usable data. Simulating the curve function can therefore be a better alternative. The curve functions were simulated by deriving regression equations based on the basin relief ratio and the effective depth. The verification results show that the method is reliable enough to generate the curve function for reservoirs that have no surveyed stage-storage data. Moreover, even when the averaged curve function is applied without basin relief ratio data, the simulated curve is closer to the real one than the linear function obtained from only the existing effective storage data.
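The paper's regression equations (based on basin relief ratio and effective depth) are not given in the abstract; as a generic illustration of fitting a stage-storage relation, the sketch below fits a power law V = a·h^b to surveyed points by log-log least squares, with made-up data.

```python
# Generic illustration of fitting a stage-storage curve V = a * h**b by
# log-log least squares. The surveyed points are made up; the paper's actual
# regression on basin relief ratio and effective depth is not shown here.
import numpy as np

stage = np.array([1.0, 2.0, 3.0, 4.0, 5.0])            # water level h (m)
storage = np.array([12.0, 55.0, 140.0, 270.0, 460.0])  # surveyed storage V (10^3 m^3)

b, log_a = np.polyfit(np.log(stage), np.log(storage), 1)
a = np.exp(log_a)
print(f"V ≈ {a:.1f} * h^{b:.2f}")

h = 3.5
print("estimated storage at h = 3.5 m:", round(a * h**b, 1))
```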

Huffman Code Design and PSIP Structure of Hangul Data for Digital Broadcasting (디지털 방송용 한글 허프만 부호 설계 및 PSIP 구조)

  • 황재정;진경식;한학수;최준영;이진환
    • Journal of Broadcast Engineering / v.6 no.1 / pp.98-107 / 2001
  • In this paper we derive an optimal Huffman code set with escape coding that maximizes the coding efficiency for Hangul text data. Hangul can be represented in the standard Wansung format or in Unicode, and we generate a set of Huffman codes for both. The current Korean digital broadcasting standard does not define a Hangul compression algorithm, which may lead to a serious data rate problem for the digital data broadcasting system; generating the optimal Huffman code set solves this data transmission problem. A PSIP structure appropriate for the broadcasting standard is also proposed. As a result, characters with a probability of less than 0.0043 are escape-coded, showing an optimum compression efficiency of 46%.
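The standardized Hangul code table and the exact probability model are not reproduced here; as a generic illustration of Huffman coding with an escape symbol, the sketch below folds characters below a probability threshold into a single ESC codeword (such characters would be transmitted uncoded after the escape). The probabilities are made up; only the 0.0043 threshold is taken from the abstract.

```python
# Generic sketch of Huffman coding with escape coding: characters whose
# probability falls below the threshold share one ESC codeword and would be
# sent uncoded after it. Not the paper's actual Hangul code table.
import heapq
from itertools import count

def huffman_with_escape(probs, threshold=0.0043):
    rare = [c for c, p in probs.items() if p < threshold]
    kept = {c: p for c, p in probs.items() if p >= threshold}
    kept["ESC"] = sum(probs[c] for c in rare)       # escape symbol carries rare mass
    tie = count()                                   # tie-breaker so the heap never compares lists
    heap = [[p, next(tie), [c, ""]] for c, p in kept.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for pair in lo[2:]:
            pair[1] = "0" + pair[1]                 # left branch
        for pair in hi[2:]:
            pair[1] = "1" + pair[1]                 # right branch
        heapq.heappush(heap, [lo[0] + hi[0], next(tie), *lo[2:], *hi[2:]])
    return dict(heap[0][2:]), rare

probs = {"가": 0.30, "이": 0.25, "다": 0.20, "는": 0.15, "를": 0.07,
         "켱": 0.027, "뷁": 0.002, "쓩": 0.001}
codes, rare = huffman_with_escape(probs)
for sym, code in sorted(codes.items(), key=lambda kv: len(kv[1])):
    print(sym, code)
print("escape-coded (rare) characters:", rare)
```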
