• Title/Summary/Keyword: Sensor Data Process


Multi-stage Image Restoration for High Resolution Panchromatic Imagery (고해상도 범색 영상을 위한 다중 단계 영상 복원)

  • Lee, Sanghoon
    • Korean Journal of Remote Sensing / v.32 no.6 / pp.551-566 / 2016
  • In satellite remote sensing, the operational environment of the satellite sensor degrades the image during acquisition. The degradation introduces noise and blurring, which hinder the identification and extraction of useful information from the image data. The effect is especially damaging for the analysis of images collected over scenes with complicated surface structure, such as urban areas. This study proposes a multi-stage image restoration method to improve the accuracy of detailed analysis for images collected over such complicated scenes. The proposed method assumes Gaussian additive noise, a Markov random field for spatial continuity, and blurring proportional to the distance between pixels. Point-Jacobian Iteration Maximum A Posteriori (PJI-MAP) estimation is employed to restore a degraded image. The multi-stage process includes image segmentation that performs region merging after pixel linking, and a dissimilarity coefficient combining homogeneity and contrast is proposed for the segmentation. The proposed method was quantitatively evaluated using simulation data and was also applied to two super-high-resolution panchromatic images: Dubaisat-2 data of 1 m resolution over LA, USA, and KOMPSAT-3 data of 0.7 m resolution over Daejeon on the Korean peninsula. The experimental results imply that the method can improve analytical accuracy in applications of high-resolution panchromatic remote sensing imagery.
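
The abstract does not spell out the PJI-MAP update, so the following is only a rough sketch of the idea: a Point-Jacobian (Jacobi-style) iteration for a simplified MAP objective with Gaussian additive noise and a quadratic Gaussian-MRF smoothness prior, with the blur term omitted. The noise variance `sigma2`, smoothness weight `beta`, and iteration count are illustrative values, not the paper's settings.

```python
import numpy as np

def pji_map_denoise(y, sigma2=25.0, beta=0.1, n_iter=50):
    """Point-Jacobian iteration for a simplified MAP restoration:
    minimize ||y - x||^2 / (2*sigma2) + (beta/2) * sum of squared
    differences between 4-connected neighbours (Gaussian MRF prior)."""
    x = y.astype(np.float64).copy()
    for _ in range(n_iter):
        # Sum of the four neighbours (periodic borders via np.roll).
        nb_sum = (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
                  np.roll(x, 1, 1) + np.roll(x, -1, 1))
        # Jacobi update: every pixel is refreshed from the previous iterate.
        x = (y / sigma2 + beta * nb_sum) / (1.0 / sigma2 + 4.0 * beta)
    return x

# Toy usage: restore a synthetic noisy step image.
clean = np.zeros((64, 64)); clean[:, 32:] = 100.0
noisy = clean + np.random.normal(0, 5, clean.shape)
restored = pji_map_denoise(noisy)
```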

Flood Disaster Prediction and Prevention through Hybrid BigData Analysis (하이브리드 빅데이터 분석을 통한 홍수 재해 예측 및 예방)

  • Ki-Yeol Eom;Jai-Hyun Lee
    • The Journal of Bigdata / v.8 no.1 / pp.99-109 / 2023
  • Recently, not only Korea but also the rest of the world has been experiencing constant disasters such as typhoons, wildfires, and heavy rains. The property damage caused by typhoons and heavy rain in South Korea alone has exceeded 1 trillion won. These disasters have resulted in significant loss of life and property, and recovery takes a considerable amount of time. In addition, the government's contingency funds are insufficient for the current situation. To prevent and effectively respond to these issues, accurate data must be collected and analyzed in real time. However, delays and data loss can occur depending on the environment where the sensors are located, the status of the communication network, and the receiving servers. In this paper, we propose a two-stage hybrid situation analysis and prediction algorithm that can analyze accurately even under such communication network conditions. In the first stage, river and stream water-level data are collected, filtered, and refined from diverse sensors of different types and stored in a big data store. An AI rule-based inference algorithm is then applied to determine the crisis alert level. If the rainfall exceeds a certain threshold but the alert remains below the level of interest, the second stage of deep learning image analysis is performed to determine the final crisis alert level.
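
A minimal sketch of the two-stage decision flow described above, assuming made-up thresholds; `classify_flood_image` stands in for the deep-learning image analysis stage and is purely hypothetical.

```python
from dataclasses import dataclass

ALERT_LEVELS = ["normal", "attention", "caution", "alert", "severe"]

@dataclass
class StationReading:
    water_level_m: float      # refined river/stream level
    rainfall_mm_per_h: float  # recent rainfall intensity

def rule_based_alert(r: StationReading) -> str:
    """Stage 1: rule-based inference on the water level (thresholds are made up)."""
    if r.water_level_m >= 7.0:
        return "severe"
    if r.water_level_m >= 5.0:
        return "alert"
    if r.water_level_m >= 3.0:
        return "caution"
    return "attention" if r.water_level_m >= 1.5 else "normal"

def classify_flood_image(frame) -> str:
    """Placeholder for the stage-2 deep-learning image analysis."""
    return "caution"

def final_alert(r: StationReading, cctv_frame) -> str:
    """Stage 2 is triggered only when rainfall is high but the
    rule-based level stays below the level of interest."""
    level = rule_based_alert(r)
    if r.rainfall_mm_per_h > 30.0 and ALERT_LEVELS.index(level) < ALERT_LEVELS.index("alert"):
        level = classify_flood_image(cctv_frame)
    return level

print(final_alert(StationReading(water_level_m=2.0, rainfall_mm_per_h=45.0), cctv_frame=None))
```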

The IEEE 802.15.4e based Distributed Scheduling Mechanism for the Energy Efficiency of Industrial Wireless Sensor Networks (IEEE 802.15.4e DSME 기반 산업용 무선 센서 네트워크에서의 전력소모 절감을 위한 분산 스케줄링 기법 연구)

  • Lee, Yun-Sung;Chung, Sang-Hwa
    • Journal of KIISE / v.44 no.2 / pp.213-222 / 2017
  • Internet of Things (IoT) technology has been developing rapidly in recent years and is applicable to various fields. A smart factory is one in which all components are organically connected to each other via a WSN, using an intelligent operating system and the IoT. Smart factory technology is used for flexible process automation and custom manufacturing, and hence needs adaptive network management for frequent network fluctuations. Moreover, ensuring the timeliness of the data collected through sensor nodes is crucial, and guaranteeing that timeliness increases the power consumed for information exchange. In this paper, we propose an IEEE 802.15.4e DSME-based distributed scheduling algorithm with mobility support and evaluate various performance metrics. The proposed algorithm adaptively assigns communication slots by analyzing the network traffic of each node, improving network reliability and timeliness. The experimental results indicate that the throughput of the DSME MAC protocol is better than that of IEEE 802.15.4e TSCH and legacy slotted CSMA/CA in large networks with more than 30 nodes. The proposed algorithm also improves throughput by 15% over the other MACs, including the original DSME. Experimentally, we confirm that the algorithm reduces power consumption by improving the availability of communication slots, lowering it by 40% compared with the other MACs.
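
The abstract only states that communication slots are assigned adaptively according to each node's traffic; the sketch below illustrates one such proportional allocation over the GTS slots of a DSME multi-superframe. The slot count and allocation policy are assumptions, not the authors' scheme.

```python
def assign_dsme_slots(traffic_bytes: dict[str, int], total_slots: int = 112) -> dict[str, int]:
    """Distribute the GTS slots of a DSME multi-superframe in proportion to
    each node's recent traffic, guaranteeing at least one slot per active node.
    Any slots lost to integer rounding simply stay unassigned."""
    active = {n: b for n, b in traffic_bytes.items() if b > 0}
    if not active:
        return {}
    total = sum(active.values())
    # Give every active node one slot, then share the rest by traffic ratio.
    slots = {n: 1 for n in active}
    remaining = total_slots - len(active)
    for n, b in sorted(active.items(), key=lambda kv: -kv[1]):
        slots[n] += int(remaining * b / total)
    return slots

print(assign_dsme_slots({"node1": 4000, "node2": 1000, "node3": 0}))
# -> {'node1': 89, 'node2': 23}
```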

Joint Demosaicking and Arbitrary-ratio Down Sampling Algorithm for Color Filter Array Image (컬러 필터 어레이 영상에 대한 공동의 컬러보간과 임의 배율 다운샘플링 알고리즘)

  • Lee, Min Seok;Kang, Moon Gi
    • Journal of the Institute of Electronics and Information Engineers / v.54 no.4 / pp.68-74 / 2017
  • This paper presents a joint demosaicking and arbitrary-ratio down-sampling algorithm for color filter array (CFA) images. Color demosaicking is a necessary part of the image signal processing pipeline in many digital image recording systems that use a single sensor. Moreover, in devices such as smartphones, the high-resolution image obtained from the image sensor has to be down-sampled to be displayed on the screen. The conventional solution is to demosaick first and down-sample later. However, this scheme requires a significant amount of memory and computation, and artifacts can be introduced or details damaged during the demosaicking and down-sampling process. In this paper, we propose a method in which demosaicking and down-sampling are performed simultaneously. We use an inverse mapping of the Bayer CFA and then perform joint demosaicking and arbitrary-ratio down-sampling based on a decomposition of the input data into high- and low-frequency components. Experimental results show that the proposed algorithm achieves better image quality and much lower computational cost than the conventional solution.
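
For reference, here is a minimal sketch of the conventional "demosaick first, down-sample later" baseline that the paper improves on, assuming an RGGB Bayer pattern and simple normalized-convolution interpolation. The point of the joint algorithm is to avoid ever materializing the full-resolution RGB image that this baseline produces.

```python
import numpy as np
from scipy.ndimage import uniform_filter, zoom

def bayer_masks(h, w):
    """RGGB Bayer sampling masks (an assumed pattern; the paper does not fix one)."""
    r = np.zeros((h, w)); g = np.zeros((h, w)); b = np.zeros((h, w))
    r[0::2, 0::2] = 1
    g[0::2, 1::2] = 1; g[1::2, 0::2] = 1
    b[1::2, 1::2] = 1
    return r, g, b

def demosaick_then_downsample(cfa, ratio):
    """Conventional baseline: interpolate every channel at full resolution,
    then resample each plane by an arbitrary ratio."""
    h, w = cfa.shape
    planes = []
    for mask in bayer_masks(h, w):
        num = uniform_filter(cfa * mask, size=3)    # local sum of known samples
        den = uniform_filter(mask, size=3)          # local count of known samples
        planes.append(num / np.maximum(den, 1e-8))  # normalized convolution
    full_rgb = np.stack(planes, axis=-1)            # full-resolution RGB (memory heavy)
    return zoom(full_rgb, (ratio, ratio, 1), order=1)

# Example: down-sample a random 512x512 mosaic to 40% of its size.
cfa = np.random.rand(512, 512)
small = demosaick_then_downsample(cfa, 0.4)   # shape (205, 205, 3)
```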

A Study on Integrated Control and Safety Management Systems for LNG Membrane Storage Tank (멤브레인식 LNG 저장탱크용 통합제어안전관리시스템에 대한 연구)

  • Kim, Chung-Kyun
    • Journal of the Korean Institute of Gas / v.14 no.2 / pp.40-46 / 2010
  • In this study, an integrated control and safety management system for a super-large LNG membrane storage tank is presented, based on an investigation and analysis of the measuring equipment and safety analysis system of a conventional LNG membrane storage tank. The integrated control and safety management system, which may increase the safety and efficiency of a super-large LNG membrane storage tank, adds pressure gauges and new displacement/force sensors at the steel anchor between the inner tank and the prestressed concrete structure. The displacement and force sensors may provide early clues of a membrane panel failure and an LNG leakage from the inner tank. A conventional leak sensor may not provide proper information on a membrane panel fracture: even when LNG has leaked, no warning is given until the leak detector, which is placed in the insulation area behind the inner tank, sends a signal. Thus, the new integrated control and safety management system collects and analyzes the temperature, pressure, displacement, force, and LNG density data related to tank system safety and leakage control for the inner tank. Digital data are also gathered from measurement systems covering membrane panel safety (displacement and force), LNG level and density, the cool-down process, leakage, and pressure control.
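
A hypothetical illustration of the integrated check described above: the new displacement/force readings at the steel anchor are evaluated alongside conventional readings, so a membrane panel problem can be flagged before the leak detector in the insulation space responds. All sensor names and limits below are made-up examples, not values from the paper.

```python
def evaluate_tank_status(sample: dict) -> list[str]:
    """Return warnings for every monitored quantity that exceeds its limit."""
    limits = {
        "anchor_displacement_mm": 2.0,   # new sensor at the steel anchor
        "anchor_force_kN": 150.0,        # new sensor at the steel anchor
        "vapor_pressure_kPa": 25.0,      # conventional pressure gauge
        "leak_detector_ppm": 50.0,       # conventional sensor behind the inner tank
    }
    warnings = []
    for key, limit in limits.items():
        if sample.get(key, 0.0) > limit:
            warnings.append(f"{key} above limit ({sample[key]} > {limit})")
    return warnings

# Displacement alone can flag a membrane panel problem before any leak is detected.
print(evaluate_tank_status({"anchor_displacement_mm": 3.1, "leak_detector_ppm": 5.0}))
```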

A Study on Smart Factory System Design for Screw Machining Management (나사 가공 관리를 위한 스마트팩토리 시스템 설계에 관한 연구)

  • Lee, Eun-Kyu;Kim, Dong-Wan;Lee, Sang-Wan;Kim, Jae-joong
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2018.10a / pp.329-331 / 2018
  • In this paper, we propose a monitoring system in which, using Smart Factory technology, the process starts with the supply of raw material for threading, the material is machined on a lathe, and defect checks on the product are performed automatically by a robot through assembly and disassembly. Completion against the production order quantity is checked by monitoring production status, i.e., whether raw material has been consumed as detected by a displacement sensor, and by inspecting the pitch and contour of the machined female and male threads to judge OK or NG. The robot system handles loading and unloading of raw material, pallet transfer, and relaying of the overall process, acting as an intermediary that drives the process organically. Location information of the threaded products is collected using non-contact wireless tags, and production efficiency and utilization rate are checked together with the energy-saving system. Environmental sensors collect air-conditioning data (temperature and humidity), measure them accurately, and help verify product machining quality, while hazardous operating conditions of the equipment (overheating, humidity) are monitored. The CNC and the robot module PLC are controlled as a heterogeneous system.
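
A minimal sketch of the OK/NG decision described above, comparing the measured pitch and contour deviation of a machined thread against tolerances; the nominal pitch and tolerance values are illustrative only.

```python
def inspect_thread(pitch_mm: float, contour_dev_mm: float,
                   nominal_pitch_mm: float = 1.5,
                   pitch_tol_mm: float = 0.02,
                   contour_tol_mm: float = 0.05) -> str:
    """Judge a machined thread OK or NG from its measured pitch and
    maximum contour deviation (all numbers are made-up examples)."""
    pitch_ok = abs(pitch_mm - nominal_pitch_mm) <= pitch_tol_mm
    contour_ok = contour_dev_mm <= contour_tol_mm
    return "OK" if pitch_ok and contour_ok else "NG"

print(inspect_thread(1.51, 0.03))   # -> OK
print(inspect_thread(1.55, 0.03))   # -> NG (pitch out of tolerance)
```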


Information Privacy Concern in Context-Aware Personalized Services: Results of a Delphi Study

  • Lee, Yon-Nim;Kwon, Oh-Byung
    • Asia Pacific Journal of Information Systems / v.20 no.2 / pp.63-86 / 2010
  • Personalized services directly and indirectly acquire personal data, in part, to provide customers with higher-value services that are specifically context-relevant (such as place and time). Information technologies continue to mature and develop, providing greatly improved performance. Sensor networks and intelligent software can now obtain context data, and that is the cornerstone for providing personalized, context-specific services. Yet the danger of personal information overflow is increasing, because the data retrieved by the sensors usually contain private information. Various technical characteristics of context-aware applications have troubling implications for information privacy. In parallel with the increasing use of context for service personalization, information privacy concerns, such as the unrestricted availability of context information, have also increased. Those privacy concerns are consistently regarded as a critical issue facing the success of context-aware personalized services. The entire field of information privacy is growing as an important area of research, with many new definitions and terminologies, because of the need for a better understanding of information privacy concepts. In particular, the factors of information privacy need to be revised according to the characteristics of new technologies. However, previous information privacy factors for context-aware applications have at least two shortcomings. First, there has been little overview of the technological characteristics of context-aware computing: existing studies have only focused on a small subset of those characteristics, so there has not been a mutually exclusive set of factors that uniquely and completely describes information privacy in context-aware applications. Second, user surveys have been widely used to identify factors of information privacy in most studies, despite the limits of users' knowledge of and experience with context-aware computing technology. Since context-aware services have not yet been widely deployed on a commercial scale, very few people have prior experience with context-aware personalized services, and it is difficult to build users' knowledge about context-aware technology even by aiding their understanding in various ways: scenarios, pictures, flash animations, etc. Conducting a survey on the assumption that the participants have sufficient experience with or understanding of the technologies shown in the survey may therefore not be valid. Moreover, some surveys rest on simplifying and hence unrealistic assumptions (e.g., they only consider location information as context data). A better understanding of information privacy concern in context-aware personalized services is thus needed. Hence, the purpose of this paper is to identify a generic set of factors for elemental information privacy concern in context-aware personalized services and to develop a rank-ordered list of those factors. We consider the overall technology characteristics to establish a mutually exclusive set of factors. A Delphi survey, a rigorous data collection method, was deployed to obtain reliable opinions from experts and to produce a rank-ordered list; it therefore lends itself well to obtaining a set of universal factors of information privacy concern and their priority.
An international panel of researchers and practitioners with expertise in privacy and context-aware systems was involved in our research. The Delphi rounds faithfully followed the procedure for the Delphi study proposed by Okoli and Pawlowski, which involves three general rounds: (1) brainstorming for important factors; (2) narrowing the original list down to the most important ones; and (3) ranking the list of important factors. For this round only, experts were treated as individuals rather than as panels. Adapting Okoli and Pawlowski, we outlined the process of administering the study and performed three rounds. In the first and second rounds of the Delphi questionnaire, we gathered a set of mutually exclusive factors for information privacy concern in context-aware personalized services. In the first round, respondents were asked to provide at least five main factors for the most appropriate understanding of information privacy concern; to support this, some of the main factors found in the literature were presented to the participants. The second round of the questionnaire discussed the main factors provided in the first round, fleshed out with relevant sub-factors drawn from the literature survey. Respondents were then requested to evaluate each sub-factor's suitability against the corresponding main factor to determine the final sub-factors from the candidates; the final factors were those selected by more than 50% of the experts. In the third round, a list of factors with corresponding questions was provided, and the respondents were requested to assess the importance of each main factor and its corresponding sub-factors. Finally, we calculated the mean rank of each item to produce the final result. While analyzing the data, we focused on group consensus rather than individual insistence; to do so, a concordance analysis, which measures the consistency of the experts' responses over successive rounds of the Delphi, was adopted during the survey process. As a result, the experts reported that context data collection and a highly identifiable level of identical data are the most important main factor and sub-factor, respectively. Additional important sub-factors included the diverse types of context data collected, tracking and recording functionalities, and embedded and disappearing sensor devices. The average score of each factor is very useful for future context-aware personalized service development from the viewpoint of information privacy. The final factors differ from those proposed in other studies in the following ways. First, the concern factors differ from those of existing studies, which are based on privacy issues that may occur during the lifecycle of acquired user information; our study helped to clarify these sometimes vague issues by determining which privacy concern issues are viable based on the specific technical characteristics of context-aware personalized services. Since a context-aware service differs in its technical characteristics from other services, we selected the specific characteristics with a higher potential to increase users' privacy concerns. Second, this study considered privacy issues in terms of service delivery and display, which were almost overlooked in existing studies, by introducing IPOS as the factor division. Lastly, for each factor, it correlated the level of importance with professionals' opinions as to what extent users have privacy concerns.
The traditional questionnaire method was not selected because users were assumed to lack understanding of and experience with the new technologies underlying context-aware personalized services. To understand users' privacy concerns, the professionals in the Delphi process selected context data collection, tracking and recording, and sensor networks as the most important technological characteristics of context-aware personalized services. In creating context-aware personalized services, this study demonstrates the importance of determining an optimal methodology: which technologies are needed, in what sequence, and to acquire what types of users' context information. Most studies, following the development of context-aware technology, focus on which services and systems should be provided and developed by utilizing context information. However, the results of this study show that, in terms of users' privacy, greater attention must be paid to the activities that acquire context information. Following the evaluation of the sub-factors, additional studies will be necessary on approaches to reducing users' privacy concerns about technological characteristics such as a highly identifiable level of identical data, diverse types of context data collected, tracking and recording functionality, and embedded and disappearing sensor devices. The factor ranked next in importance after input is context-aware service delivery, which is related to output. The results show that the delivery and display of services to users in context-aware personalized services, moving toward the anywhere-anytime-any-device concept, are regarded as even more important than in previous computing environments. Considering these concern factors when developing context-aware personalized services will help to increase the service success rate and, hopefully, user acceptance. Our future work will be to adopt these factors for qualifying context-aware service development projects, such as u-city projects, in terms of service quality and hence user acceptance.
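
The concordance analysis mentioned above is typically computed as Kendall's coefficient of concordance W; the sketch below shows that computation for complete, tie-free rankings, with a hypothetical expert-by-factor rank matrix.

```python
import numpy as np

def kendalls_w(ranks: np.ndarray) -> float:
    """Kendall's coefficient of concordance W for an (m experts x n items)
    matrix of ranks 1..n per row, assuming no ties. W close to 1 means the
    experts rank the privacy-concern factors consistently."""
    m, n = ranks.shape
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Three experts ranking four hypothetical factors (1 = most concerning).
ranks = np.array([[1, 2, 3, 4],
                  [1, 3, 2, 4],
                  [2, 1, 3, 4]])
print(round(kendalls_w(ranks), 3))   # -> 0.778
```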

Application and Analysis of Ocean Remote-Sensing Reflectance Quality Assurance Algorithm for GOCI-II (천리안해양위성 2호(GOCI-II) 원격반사도 품질 검증 시스템 적용 및 결과)

  • Sujung Bae;Eunkyung Lee;Jianwei Wei;Kyeong-sang Lee;Minsang Kim;Jong-kuk Choi;Jae Hyun Ahn
    • Korean Journal of Remote Sensing / v.39 no.6_2 / pp.1565-1576 / 2023
  • An atmospheric correction algorithm based on a radiative transfer model is required to obtain remote-sensing reflectance (Rrs) from the top-of-atmosphere radiance observed by the Geostationary Ocean Color Imager-II (GOCI-II). The Rrs derived from the atmospheric correction is used to estimate various marine environmental parameters such as chlorophyll-a concentration, total suspended material concentration, and absorption of dissolved organic matter. Atmospheric correction is therefore a fundamental algorithm, as it significantly impacts the reliability of all other ocean color products. In clear waters, however, the atmospheric path radiance in the blue wavelengths can be more than ten times higher than the water-leaving radiance. This makes atmospheric correction a highly error-sensitive process: a 1% error in estimating the atmospheric radiance can cause more than a 10% error in Rrs. Therefore, quality assessment of Rrs after the atmospheric correction is essential for ensuring reliable ocean environment analysis using ocean color satellite data. In this study, a Quality Assurance (QA) algorithm based on in-situ Rrs data archived in the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Bio-optical Archive and Storage System (SeaBASS) database was applied and modified to account for the different spectral characteristics of GOCI-II. This method is officially employed in the National Oceanic and Atmospheric Administration (NOAA) ocean color satellite data processing system. It provides quality scores for Rrs ranging from 0 to 1 and classifies the water into 23 types. When the QA algorithm was applied to initial-phase GOCI-II data with less mature calibration, the most frequent score was a relatively low 0.625. When the algorithm was applied to the improved GOCI-II atmospheric correction results with updated calibrations, the most frequent score rose to 0.875. The water-type analysis using the QA algorithm indicated that parts of the East Sea, the South Sea, and the Northwest Pacific Ocean are primarily characterized as relatively clear case-I waters, while the coastal areas of the Yellow Sea and the East China Sea are mainly classified as highly turbid case-II waters. We expect that the QA algorithm will support GOCI-II users not only in statistically identifying Rrs results with significant errors but also in achieving more reliable calibration with quality-assured data. The algorithm will be included in the level-2 flag data provided with the GOCI-II atmospheric correction.
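
The QA method scores each Rrs spectrum against reference water-class spectra; the sketch below illustrates that style of scoring with a tiny made-up reference table (two classes and three bands instead of the 23 classes and GOCI-II bands used operationally), so the numbers are placeholders only.

```python
import numpy as np

# Made-up reference table: mean normalized spectra and per-band tolerance
# envelopes for two water classes at three bands.
REF_MEAN = np.array([[0.55, 0.45, 0.10],    # "clear" class
                     [0.20, 0.45, 0.60]])   # "turbid" class
REF_MEAN = REF_MEAN / np.linalg.norm(REF_MEAN, axis=1, keepdims=True)
REF_LOWER, REF_UPPER = REF_MEAN * 0.8, REF_MEAN * 1.2

def qa_score(rrs: np.ndarray) -> tuple[int, float]:
    """Assign the nearest water class by cosine similarity of the normalized
    spectrum, then score Rrs as the fraction of bands that stay inside that
    class's envelope (0 = all bands out, 1 = all bands in)."""
    norm = rrs / np.linalg.norm(rrs)
    cls = int(np.argmax(REF_MEAN @ norm))
    inside = (norm >= REF_LOWER[cls]) & (norm <= REF_UPPER[cls])
    return cls, float(inside.mean())

print(qa_score(np.array([0.012, 0.010, 0.002])))   # -> (0, 1.0) for this toy spectrum
```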

Stand-alone Real-time Healthcare Monitoring Driven by Integration of Both Triboelectric and Electro-magnetic Effects (실시간 헬스케어 모니터링의 독립 구동을 위한 접촉대전 발전과 전자기 발전 원리의 융합)

  • Cho, Sumin;Joung, Yoonsu;Kim, Hyeonsu;Park, Minseok;Lee, Donghan;Kam, Dongik;Jang, Sunmin;Ra, Yoonsang;Cha, Kyoung Je;Kim, Hyung Woo;Seo, Kyoung Duck;Choi, Dongwhi
    • Korean Chemical Engineering Research / v.60 no.1 / pp.86-92 / 2022
  • Recently, the bio-healthcare market has been expanding worldwide for various reasons, including the COVID-19 pandemic. Within it, biometric measurement and analysis technology is expected to bring about future technological innovation and a socio-economic ripple effect. Existing systems require a large-capacity battery to drive the signal processing, wireless transmission, and operating components, but limited battery capacity imposes spatio-temporal constraints on the use of the device. This limitation can interrupt the data stream required for the user's healthcare monitoring and is therefore one of the major obstacles for healthcare devices. In this study, we report the concept of a stand-alone healthcare monitoring module, based on both the triboelectric effect and the electromagnetic effect, that converts biomechanical energy into usable electric energy. The proposed system can operate independently without an external power source. In particular, the wireless foot-pressure monitoring system, built around a rationally designed triboelectric sensor (TES), can recognize the user's walking habits through foot-pressure measurement. By exploiting the triboelectric effect in the contact-separation behavior that occurs during walking, an effective foot-pressure sensor was fabricated; its performance was verified through the electrical output signal as a function of pressure, and its dynamic behavior was measured through a signal processing circuit using a capacitor. In addition, the biomechanical energy dissipated during walking is harvested as electrical energy via electromagnetic induction and used as the power source for wireless transmission and signal processing. The proposed system therefore has great potential to reduce the inconvenience of charging caused by limited battery capacity and to overcome the problem of data disconnection.
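
As a hypothetical illustration of how the capacitor-conditioned TES output could be analyzed, the sketch below counts contact-separation (step) events by simple peak detection on a voltage trace; the threshold and step spacing are assumed values, not the authors' circuit parameters.

```python
import numpy as np
from scipy.signal import find_peaks

def detect_steps(voltage: np.ndarray, fs: float = 100.0, threshold: float = 0.5):
    """Count contact-separation events in a TES output trace: each walking
    step produces a voltage peak whose height grows with foot pressure.
    The threshold and minimum step spacing (0.3 s) are illustrative values."""
    peaks, props = find_peaks(voltage, height=threshold, distance=int(0.3 * fs))
    return len(peaks), props["peak_heights"]

# Synthetic trace: three pulses of increasing amplitude on a noisy baseline.
t = np.arange(0, 3, 0.01)
v = 0.05 * np.random.randn(t.size)
for center, amp in [(0.5, 1.0), (1.5, 1.5), (2.5, 2.0)]:
    v += amp * np.exp(-((t - center) ** 2) / (2 * 0.02 ** 2))
print(detect_steps(v))   # -> 3 peaks with heights near 1.0, 1.5, 2.0
```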

The Analysis of Robot Education Unit in the Practical Arts Textbooks According to 2015 Revised Curriculum (2015 개정 실과교과서의 로봇교육 체제 분석)

  • Park, SunJu
    • Journal of The Korean Association of Information Education / v.24 no.1 / pp.99-106 / 2020
  • In this paper, we analyzed the units related to robot education in the Practical Arts textbooks according to the 2015 revised curriculum. As a result, all textbooks had a common structure of introduction, development, and organization, and all followed a similar flow. Learning objectives were presented in all textbooks, but only cognitive and functional goals were given; no affective goals were presented. The robot learning content covers the meaning and types of robots, the structure and sensors of robots, and robot-building activities, but robot ethics, the production of various robot works, and the use of robots in the problem-solving process are not presented. An assembly robot and an infrared sensor are used in common, and each unit consists of robot production and control training materials presented as hands-on activities and a wrap-up through evaluation; the A, C, and F textbooks also provide supplementary unit materials. In the future, it will be necessary to include robot ethics education covering both designer/manufacturer-oriented and user-oriented robot ethics, such as recognizing the limits of robots, principles for using robots correctly, safety education, and the protection of personal information and privacy.