• Title/Summary/Keyword: Generate Data


FEASIBILITY OF AN RFID-BASED MONITORING SYSTEM FOR THE CONCRETE POUR PROCESS

  • S. W. Moon;S. M. Hong
    • International conference on construction engineering and project management
    • /
    • 2007.03a
    • /
    • pp.433-439
    • /
    • 2007
  • A ubiquitous environment in construction requires integrating hardware and software systems. The Construction System Integration Lab (CSIL) at Pusan National University is currently studying the application of ubiquitous technology for better communication in the construction process. In this paper, a pilot Ubiquitous Concrete Pour System (u-CPS) is presented to demonstrate the effectiveness of data exchange in the concrete pour process. The u-CPS environment takes advantage of RFID technology for collecting construction data. The pilot can automatically generate data for concrete pour work, such as departure time, arrival time, and concrete pour time, and construction managers can keep track of the progress of the pour work using this information. A case study was conducted on a building construction project using the pilot system, and the results demonstrated that the RFID-based system can help improve the effectiveness of data communication during the concrete pour process.
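
The abstract does not publish the pilot's data schema; the following is a minimal sketch of the kind of event records an RFID-based pour-monitoring system could generate, with all field names and timestamps hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical RFID read events for one ready-mixed concrete truck.
# Field names are illustrative; the u-CPS paper does not publish its schema.
@dataclass
class PourCycle:
    truck_tag: str              # RFID tag ID attached to the truck
    departure: datetime         # tag read at the batch plant gate
    arrival: datetime           # tag read at the site gate
    pour_complete: datetime     # tag read at the placement point

    def travel_time(self) -> timedelta:
        return self.arrival - self.departure

    def pour_time(self) -> timedelta:
        return self.pour_complete - self.arrival

cycle = PourCycle(
    truck_tag="TRUCK-017",
    departure=datetime(2007, 3, 12, 9, 0),
    arrival=datetime(2007, 3, 12, 9, 40),
    pour_complete=datetime(2007, 3, 12, 10, 5),
)
print(cycle.travel_time(), cycle.pour_time())  # 0:40:00 0:25:00
```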

Reversible Data Hiding Based on Block Median Preservation and Image Local Characteristic

  • Qu, Xiao-Chao;Kim, Hyoung-Joong
    • Annual Conference of KIPS
    • /
    • 2011.04a
    • /
    • pp.986-989
    • /
    • 2011
  • Reversible data hiding is a technique that can embed information into cover media (image, video, voice signal) and can recover the original cover media after extracting the embedded information. In this paper, we propose a new reversible data hiding method based on block median preservation and local image characteristics. By using the median value of a block, a high payload can be achieved, and by considering local image characteristics, much distortion can be avoided and a high PSNR obtained. In our experiments, the method generates better results than previous reversible data hiding methods.
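
The paper's exact embedding and extraction rules are not given in the abstract; the sketch below only illustrates the block-median idea, computing each block's median and a naive capacity proxy (the number of pixels equal to the median) over a random 8-bit test image.

```python
import numpy as np

def block_medians(image: np.ndarray, block: int = 4):
    """Split an 8-bit grayscale image into block x block tiles and report
    each tile's median plus a naive capacity proxy. This is only a sketch
    of the block-median idea, not the paper's embedding algorithm."""
    h, w = image.shape
    stats = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            tile = image[y:y + block, x:x + block]
            med = int(np.median(tile))
            capacity = int(np.sum(tile == med))  # pixels at the median value
            stats.append(((y, x), med, capacity))
    return stats

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)
for pos, med, cap in block_medians(img):
    print(pos, med, cap)
```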

Determination of Optimal Adhesion Conditions for FDM Type 3D Printer Using Machine Learning

  • Woo Young Lee;Jong-Hyeok Yu;Kug Weon Kim
    • Journal of Practical Engineering Education
    • /
    • v.15 no.2
    • /
    • pp.419-427
    • /
    • 2023
  • In this study, optimal adhesion conditions for alleviating defects caused by heat shrinkage in FDM-type 3D printers are investigated using machine learning. Machine learning is a family of statistical methods for extracting rules from data, and can be classified into supervised learning, unsupervised learning, and reinforcement learning. Here, a function model for adhesion between the bed and the output is built using supervised learning, which can be expected to reduce output defects in FDM-type 3D printers by deriving the conditions for optimal adhesion between the bed and the output. Machine learning code written in Python generates a function model that predicts the effect of the operating variables on adhesion, using data obtained through adhesion testing. The adhesion predictions were shown to be very consistent with the verification data, and the conclusions discuss the potential of the method.
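
The abstract does not name the operating variables, the data, or the model; the sketch below shows the general supervised-learning workflow it describes, with entirely hypothetical variables (bed temperature, nozzle temperature, first-layer height), synthetic adhesion data, and a scikit-learn regressor standing in for whatever model the authors used.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical adhesion-test records: (bed temp [C], nozzle temp [C],
# first-layer height [mm]) -> measured adhesion force [N].
rng = np.random.default_rng(42)
X = rng.uniform([50, 190, 0.1], [110, 230, 0.3], size=(200, 3))
y = 5.0 + 0.05 * X[:, 0] - 0.02 * np.abs(X[:, 1] - 210) - 8.0 * X[:, 2]
y += rng.normal(0, 0.2, size=200)  # measurement noise

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X_train, y_train)
print("R^2 on held-out tests:", r2_score(y_test, model.predict(X_test)))

# Scan the fitted model over the operating range to find the conditions
# with the highest predicted adhesion.
grid = rng.uniform([50, 190, 0.1], [110, 230, 0.3], size=(5000, 3))
best = grid[np.argmax(model.predict(grid))]
print("Predicted-best (bed, nozzle, layer height):", best)
```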

A Novel Approach for Accessing Semantic Data by Translating RESTful/JSON Commands into SPARQL Messages

  • Nguyen, Khiem Minh;Nguyen, Hai Thanh;Huynh, Hiep Xuan
    • IEIE Transactions on Smart Processing and Computing
    • /
    • v.5 no.3
    • /
    • pp.222-229
    • /
    • 2016
  • Linked Data is a powerful technology for storing and publishing structured data. It is helpful for web applications because it enables semantic queries over data. However, using Linked Data is not easy for ordinary users who lack knowledge about the structure of the data or the query syntax of Linked Data. To address this problem, we propose a translator component that translates RESTful/JSON request messages into SPARQL commands based on an ontology, i.e., metadata that describes the structure of the data. Clients do not need to know the structure of the stored data or SPARQL, the query language used for Linked Data that few people know, when they insert a new instance or query all instances of a specific class, however complex its structure. In addition, the translator component has a search function that can find a set of data spanning multiple classes by finding the shortest paths between the source classes (the original set that the user provides) and the target classes (those the user wants to retrieve). The translator component can be applied to any dynamic ontological structure and automatically generates a SPARQL command from the user's request message.
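
As a rough illustration of the translation idea (not the authors' implementation), the sketch below maps a RESTful/JSON filter query on a class to a SPARQL SELECT using a hypothetical ontology mapping from JSON keys to RDF predicates; all IRIs are placeholders.

```python
import json

# Hypothetical ontology mapping: JSON keys -> RDF predicates, resources -> classes.
PREDICATE_MAP = {
    "name": "http://example.org/onto#name",
    "year": "http://example.org/onto#year",
}
CLASS_MAP = {"papers": "http://example.org/onto#Paper"}

def json_to_sparql(resource: str, body: str) -> str:
    """Translate a RESTful/JSON filter request into a SPARQL SELECT query."""
    filters = json.loads(body)
    lines = ["SELECT ?s WHERE {",
             f"  ?s a <{CLASS_MAP[resource]}> ."]
    for i, (key, value) in enumerate(filters.items()):
        pred = PREDICATE_MAP[key]
        lines.append(f'  ?s <{pred}> ?v{i} . FILTER(?v{i} = "{value}")')
    lines.append("}")
    return "\n".join(lines)

# e.g. a GET /papers request carrying these filter parameters:
print(json_to_sparql("papers", '{"name": "Linked Data", "year": "2016"}'))
```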

IMAGE DATA CHAIN ANALYSIS FOR SATELLITE CAMERA ELECTRONIC SYSTEM

  • Park, Jong-Euk;Kong, Jong-Pil;Heo, Haeng-Pal;Kim, Young-Sun;Chang, Young-Jun
    • Proceedings of the KSRS Conference
    • /
    • v.2
    • /
    • pp.791-793
    • /
    • 2006
  • In a satellite camera, the incoming light is converted to analog electrical signals by electronic components such as CCD (Charge-Coupled Device) detectors. The analog signals are amplified, biased, and converted into digital signals (a pixel data stream) in the video processor (A/Ds). The outputs of the A/Ds are digitally multiplexed and driven out over differential line drivers (two pairs of wires) to meet the cross-strap requirement. The MSC (Multi-Spectral Camera) on KOMPSAT-2, a LEO spacecraft, will be used to generate observation imagery data in two main channels. The MSC obtains data for high-resolution images by converting incoming light from the earth into a digital stream of pixel data. The video data outputs are then multiplexed, converted to 8-bit bytes, serialized, and transmitted to the NUC (Non-Uniformity Correction) module by the Hotlink data transmitter. In this paper, the video data streams, the video data format, and the image data processing routine for the satellite camera are described in terms of the camera control hardware. Advanced satellites with very high resolution require a faster and more complex image data chain than this one, so effective changes to the image data chain and fast video data transmission methods are also discussed.
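
The actual MSC data format is not given in the abstract; the following sketch only illustrates the multiplex-and-serialize step in the abstract's own terms, with a hypothetical sync word and frame layout that are not the real KOMPSAT-2 format.

```python
# Two channels of 8-bit pixel samples are interleaved pixel-by-pixel
# (a simple 2:1 time-division mux) and framed for serial transmission.
SYNC = b"\xFA\xF3"  # hypothetical frame-sync pattern

def mux_and_frame(ch1: bytes, ch2: bytes) -> bytes:
    assert len(ch1) == len(ch2)
    interleaved = bytearray()
    for a, b in zip(ch1, ch2):
        interleaved += bytes([a, b])  # alternate samples from each channel
    # frame = sync word + 16-bit payload length + interleaved pixel bytes
    return SYNC + len(interleaved).to_bytes(2, "big") + bytes(interleaved)

frame = mux_and_frame(bytes(range(4)), bytes(range(100, 104)))
print(frame.hex(" "))
```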

A Study on Implementation and Direction of the Metropolitan Cities' GIS

  • O, Jong-U
    • Korea Digital Policy Society: Conference Proceedings
    • /
    • 2005.06a
    • /
    • pp.215-229
    • /
    • 2005
  • The purpose of this study is to analyze the urban facility data (water, sewer, and roads) of 7 metropolitan cities, rather than their application systems, and to present the best way to reconfigure geospatial data in Korea. Data were mainly compiled from the 2nd NGIS (National Geographic Information Systems: 95-00), because the GIS data in this study could only generate reliable results if they had been precisely examined against the evaluation results of the 1st NGIS. The results show that the GIS data and invested budget of the Seoul metropolitan government not only covered all 25 of its districts but also overwhelmed the budgets invested by the other six metropolitan governments. The Daejeon metropolitan government had the least invested budget and, correspondingly, less geospatial data than the others. In particular, serious geospatial data errors occurred: although geospatial data for urban infrastructure such as water, sewer, and roads, and for national infrastructure such as electricity, communications, heat and oil pipes, and gas, sit close to human habitation, the gas and electricity data had errors nearly two times larger than the permitted error of 30 cm. Therefore, this paper presents LiDAR technology for regenerating high-accuracy geospatial data for the metropolitan governments, since the technology achieves an accuracy of 20 cm in the horizontal scale in the field.

Toward Generic, Immersive, and Collaborative Solutions to the Data Interoperability Problem which Target End-Users

  • Sanchez-Ruiz, Arturo;Umapathy, Karthikeyan;Hayes, Pat
    • Journal of Computing Science and Engineering
    • /
    • v.3 no.2
    • /
    • pp.127-141
    • /
    • 2009
  • In this paper, we describe our vision of a "just-in-time" initiative to solve the Data Interoperability Problem (a.k.a. INTEROP). We provide an architectural overview of our initiative, which draws upon existing technologies to develop an immersive and collaborative approach aimed at empowering data stakeholders (e.g., data producers and data consumers) with integrated tools to interact and collaborate with each other while directly manipulating visual representations of their data in an immersive environment (e.g., implemented via Second Life). The semantics of these visual representations and the operations associated with the data are supported by ontologies defined using the Common Logic Framework (CL). Data operations gestured by the stakeholders, through their avatars, are translated into a variety of generated resources such as multi-language source code, visualizations, web pages, and web services. The generality of the approach is supported by a plug-in architecture that allows expert users to customize tasks such as data admission, data manipulation in the immersive world, and automatic generation of resources. This approach is designed with a mindset aimed at enabling stakeholders from diverse domains to exchange data and generate new knowledge.
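
As a minimal sketch of the plug-in idea described in the abstract (all stage names and handlers hypothetical, not the authors' API), the snippet below registers user-supplied handlers for pipeline stages such as data admission and resource generation.

```python
from typing import Callable, Dict

REGISTRY: Dict[str, Callable[[dict], dict]] = {}

def plugin(stage: str):
    """Decorator that registers a handler for a named pipeline stage."""
    def register(fn: Callable[[dict], dict]):
        REGISTRY[stage] = fn
        return fn
    return register

@plugin("admission")
def admit_record(payload: dict) -> dict:
    # validate / normalize data before it enters the immersive world
    payload["admitted"] = True
    return payload

@plugin("generation")
def generate_webpage(payload: dict) -> dict:
    # turn a gestured data operation into a generated resource
    payload["resource"] = f"<h1>{payload.get('title', 'dataset')}</h1>"
    return payload

record = {"title": "sensor readings"}
for stage in ("admission", "generation"):
    record = REGISTRY[stage](record)
print(record)
```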

EVALUATION OF AN ENHANCED WEATHER GENERATION TOOL FOR SAN ANTONIO CLIMATE STATION IN TEXAS

  • Lee, Ju-Young
    • Water Engineering Research
    • /
    • v.5 no.1
    • /
    • pp.47-54
    • /
    • 2004
  • Several computer programs have been developed to generate stochastic weather data from observed daily data, but they require a complete dataset to run. Meteorological records frequently have sporadic missing data as well as totally missing data. The modified WGEN presented here has a data-filling algorithm for incomplete meteorological datasets; other WGEN models have no data-filling function. The modified WGEN fills data using the equation of Matalas for a first-order autoregressive process on a multidimensional state with known cross- and auto-correlations among the state variables; the parameters of the Matalas equation are derived from the existing dataset and then used to fill the gaps. WGEN (Richardson and Wright, 1984) is one of the most widely used weather generators, but it needs modification and extension: it uses an exponential distribution to generate precipitation amounts, which is easy to fit but does not describe observed precipitation amounts well. In this paper, precipitation data generated by WGEN and by the modified WGEN are compared with the corresponding measured data in terms of statistical parameters. The modified WGEN adopts a formula from CLIGEN for WEPP (Water Erosion Prediction Project), developed by the USDA in 1985. Results for parameters other than precipitation are not introduced here; they will be presented in a forthcoming verification study.
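
The Matalas first-order multivariate autoregressive model referred to above has the standard form x_t = A x_{t-1} + B e_t, with A = M1 M0^{-1} and B B^T = M0 - A M1^T derived from the lag-0 and lag-1 covariance matrices (M0, M1) of the observed series. The sketch below implements this standard form with toy covariances; it is not the paper's code.

```python
import numpy as np

def matalas_params(M0: np.ndarray, M1: np.ndarray):
    """Derive the Matalas AR(1) matrices from lag-0/lag-1 covariances."""
    A = M1 @ np.linalg.inv(M0)
    BBt = M0 - A @ M1.T            # B B^T; recover B via Cholesky
    B = np.linalg.cholesky(BBt)
    return A, B

# Toy covariances for two correlated weather variables (e.g., Tmax, Tmin);
# real values would be estimated from the existing station records.
M0 = np.array([[1.0, 0.6], [0.6, 1.0]])
M1 = np.array([[0.5, 0.3], [0.3, 0.4]])
A, B = matalas_params(M0, M1)

rng = np.random.default_rng(1)
x = np.zeros(2)
for _ in range(5):                 # generate a short synthetic sequence
    x = A @ x + B @ rng.standard_normal(2)
    print(x)
```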

A Study of Data Quality Management Maturity Model

  • Kim, Chan-Soo;Park, Joo-Seok
    • Journal of the Korean Society for information Management
    • /
    • v.20 no.4 s.50
    • /
    • pp.249-275
    • /
    • 2003
  • For companies competing in today's information society, deteriorating data quality generates new costs and erodes corporate competitiveness. Many preceding studies on data quality have been conducted to solve this problem. Among the aspects of data quality, research has mainly addressed the quality of data values and the quality of data services, which are result-side quality concepts. This study instead examines the structural quality of data, a cause-side quality concept, from the viewpoint of metadata management, and presents a data quality management maturity model. The study also verifies empirically that data quality improves as the management level matures.

Hadoop Based Wavelet Histogram for Big Data in Cloud

  • Kim, Jeong-Joon
    • Journal of Information Processing Systems
    • /
    • v.13 no.4
    • /
    • pp.668-676
    • /
    • 2017
  • Recently, the importance of big data has been emphasized with the development of smartphones and the web/SNS. As a result, MapReduce, which can efficiently process big data, is receiving worldwide attention because of its excellent scalability and stability. Since big data has large volume, high generation speed, and diverse properties, it is more efficient to process big data summary information than the big data itself. The wavelet histogram, a typical data-summary technique, can generate optimal summary information while keeping the loss of information from the original data small. Systems that apply wavelet histogram generation on top of MapReduce have therefore been actively studied. However, existing work generates the wavelet histogram through one or more MapReduce jobs, so generation is slow, and the error of the data restored from the wavelet histogram can become large. The wavelet histogram generation system developed in this paper generates the wavelet histogram in a single MapReduce job, so the generation speed can be greatly increased. In addition, since the wavelet histogram is generated subject to an error bound specified by the user, the error of the data restored from the histogram can be controlled. Finally, we verified the efficiency of the developed system through performance evaluation.
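
The paper's single-job Hadoop implementation is not reproduced here; the sketch below shows the underlying summary technique in plain Python: Haar-transform a frequency histogram and keep only the k largest-magnitude coefficients, which is the classic wavelet-histogram summary.

```python
import heapq
import numpy as np

def haar_transform(hist: np.ndarray) -> np.ndarray:
    """Unnormalized Haar decomposition of a histogram whose length is a
    power of two: pairwise averages carry up, pairwise half-differences
    become detail coefficients."""
    coeffs = hist.astype(float).copy()
    n = len(coeffs)
    details: list = []
    while n > 1:
        evens, odds = coeffs[0:n:2], coeffs[1:n:2]
        details = list((evens - odds) / 2) + details  # detail coefficients
        coeffs = (evens + odds) / 2                   # averages, next level
        n //= 2
    return np.array([coeffs[0]] + details)            # overall mean first

def top_k(coeffs: np.ndarray, k: int) -> dict:
    """Keep the k largest-magnitude coefficients (index -> value)."""
    idx = heapq.nlargest(k, range(len(coeffs)), key=lambda i: abs(coeffs[i]))
    return {i: coeffs[i] for i in idx}

hist = np.array([2, 2, 0, 2, 3, 5, 4, 4])
print(top_k(haar_transform(hist), k=3))
```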