• Title/Summary/Keyword: Smart Monitoring System (스마트 모니터링 시스템)


Crop Monitoring Technique Using Spectral Reflectance Sensor Data and Standard Growth Information (지상 고정형 작물 원격탐사 센서 자료와 표준 생육정보를 융합한 작물 모니터링 기법)

  • Kim, Hyunki;Moon, Hyun-Dong;Ryu, Jae-Hyun;Kwon, Dong-Won;Baek, Jae-Kyeong;Seo, Myung-Chul;Cho, Jaeil
    • Korean Journal of Remote Sensing / v.37 no.5_1 / pp.1199-1206 / 2021
  • Accordingly, attention is being paid to the agricultural use of remote sensing techniques that non-destructively and continuously detect the growth and physiological status of crops. When remote sensing techniques are used for crop monitoring, crop abnormalities can be monitored continuously and in real time. This requires standard growth information for the crop, and relative growth that accounts for the cultivation environment must be identified. Using the relationship between Growing Degree Days (GDD), the cumulative temperature related to crop growth obtained under ideal cultivation management, and the vegetation index as standard growth information, the vegetation indices observed with spectral reflectance sensors (SRS NDVI and SRS PRI) in rice paddies under standard cultivation management and under non-fertilized treatment were compared and quantitatively identified as a time series. In the future, a database covering various climatic conditions and varieties in standard cultivation management areas needs to be accumulated to establish more reliable standard growth information.
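The GDD mentioned in the abstract is a standard agronomic quantity; a minimal sketch of its accumulation, assuming the common base-temperature form (the base temperature of 10 °C and the daily values below are illustrative, not the study's data):

```python
# Sketch: accumulating Growing Degree Days (GDD) from daily temperatures.
# Assumed form: daily GDD = max(0, (Tmax + Tmin)/2 - Tbase), Tbase = 10 °C.

def daily_gdd(t_max, t_min, t_base=10.0):
    """Degree days contributed by one day: max(0, mean temp - base)."""
    return max(0.0, (t_max + t_min) / 2.0 - t_base)

def cumulative_gdd(daily_temps, t_base=10.0):
    """Running GDD total over a season; daily_temps is [(t_max, t_min), ...]."""
    total, series = 0.0, []
    for t_max, t_min in daily_temps:
        total += daily_gdd(t_max, t_min, t_base)
        series.append(total)
    return series

# Toy three-day example
print(cumulative_gdd([(30, 20), (28, 18), (25, 15)]))  # [15.0, 28.0, 38.0]
```

The cumulative GDD series would then be paired with vegetation-index observations to form the standard growth curve described above.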

Analysis of Health Care Service Trends for The Older Adults Based on ICT (국내외 ICT기반 노인 건강관리 서비스 동향분석)

  • Lee, Sung-Hyun;Hong, Sung Jung;Kim, Kyung Mi
    • Journal of the Korea Convergence Society / v.12 no.5 / pp.373-383 / 2021
  • Our society is aging rapidly. In this super-aged society, the increase in healthcare costs is considered a national problem that undermines the sustainability of social security. Various healthcare services for the elderly have been promoted to address this. However, most of them have focused on care after the onset of chronic diseases and lack preventive healthcare; most preventive healthcare projects are only pilots. In this paper, the current status of healthcare services for senior citizens at home and abroad was analyzed, and based on this, limitations and improvements were identified to propose the establishment of an IoT-based Total Silver Care Center. Such a center can conveniently monitor the health status of the elderly through various sensors, medical devices, and smart bands, and on that basis improve the quality of nursing services through time savings and greater work efficiency for nursing providers. In addition, healthcare interventions can be provided in a timely manner when a user's health status changes, and a real-time imaging system can help users overcome mental difficulties.

A Study on People Counting in Public Metro Service using Hybrid CNN-LSTM Algorithm (Hybrid CNN-LSTM 알고리즘을 활용한 도시철도 내 피플 카운팅 연구)

  • Choi, Ji-Hye;Kim, Min-Seung;Lee, Chan-Ho;Choi, Jung-Hwan;Lee, Jeong-Hee;Sung, Tae-Eung
    • Journal of Intelligence and Information Systems / v.26 no.2 / pp.131-145 / 2020
  • In line with the trend of industrial innovation, IoT technology is emerging as a key element in the creation of new business models and the provision of user-friendly services in a variety of fields through its combination with big data. Data accumulated from Internet-of-Things (IoT) devices are widely used to build convenience-oriented smart systems, as they enable customized intelligent services through analysis of user environments and patterns. Recently, these technologies have been applied to innovation in the public domain, for example in smart cities and smart transportation, such as solving traffic and crime problems using CCTV. In particular, when planning underground services or building passenger-flow control systems to improve convenience for citizens and commuters in congested public transportation settings such as subways and urban railways, both the ease of securing real-time service data and the stability of security must be considered comprehensively. However, previous studies that use image data suffer from privacy issues and degraded object-detection performance under abnormal conditions. The IoT device-based sensor data used in this study are free from privacy issues, because individuals need not be identified, and can therefore be used effectively to build intelligent public services for unspecified users. We employ IoT-based infrared sensor devices for an intelligent pedestrian tracking system in a metro service used daily by many people, with the temperature data measured by the sensors transmitted in real time.
The experimental environment for real-time data collection was established at the equally spaced midpoints of a 4×4 grid on the ceiling of subway entrances with high passenger traffic, measuring temperature changes for objects entering and leaving the detection spots. The measured data went through a preprocessing step in which reference values for the 16 areas were set and the differences between the temperatures in the 16 areas and their reference values per unit time were calculated; this maximizes the signal from movement within the detection area. In addition, the values were scaled by a factor of 10 to reflect temperature differences by area more sensitively: for example, a sensor reading of 28.5℃ was analyzed as 285. The data collected from the sensors thus have the characteristics of both time-series data and image data with 4×4 resolution. Reflecting these characteristics, we propose a hybrid algorithm that combines a CNN, which excels at image classification, with an LSTM, which is especially suitable for time-series analysis, referred to as CNN-LSTM (Convolutional Neural Network-Long Short-Term Memory). The CNN-LSTM algorithm is used to predict the number of people passing through one of the 4×4 detection areas. We verified the validity of the proposed model through performance comparisons with other algorithms such as the Multi-Layer Perceptron (MLP), Long Short-Term Memory (LSTM), and RNN-LSTM (Recurrent Neural Network-Long Short-Term Memory). In the experiments, the proposed CNN-LSTM hybrid model showed the best predictive performance.
By utilizing the proposed devices and models, various metro services, such as real-time monitoring of public transport facilities and congestion-based emergency response services, are expected to be provided without legal issues concerning personal information. However, the data were collected from only one side of the entrances, and data collected over a short period were used for prediction, so validation in other environments remains a limitation. In the future, the reliability of the proposed model is expected to improve if experimental data are collected in sufficiently diverse environments or if training data are augmented with measurements from other sensors.
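The preprocessing step described in the abstract (per-area reference differencing over a 4×4 frame, then scaling by 10 so that 28.5 °C becomes 285) can be sketched as follows; the reference frame and readings are illustrative, not the study's data:

```python
# Sketch of the described preprocessing: subtract per-cell reference values
# from one 4x4 temperature frame and scale the result by 10, so a 0.5 °C
# rise over the baseline (28.5 vs 28.0) becomes 5.

def preprocess_frame(frame, reference, scale=10):
    """Return scaled temperature differences for one 4x4 time step."""
    return [
        [round((frame[i][j] - reference[i][j]) * scale) for j in range(4)]
        for i in range(4)
    ]

reference = [[28.0] * 4 for _ in range(4)]   # per-area baseline temperatures
frame = [[28.0] * 4 for _ in range(4)]
frame[1][2] = 28.5                           # a passenger warms one detection spot

diff = preprocess_frame(frame, reference)
print(diff[1][2])  # 5
```

A sequence of such 4×4 difference frames is exactly the time-series-of-images input that motivates the CNN-LSTM combination above.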

Present Status of the Quality Assurance and Control (QA/QC) for Korean Macrozoobenthic Biological Data and Suggestions for its Improvement (해양저서동물의 정량적 자료에 대한 정도관리 현실과 개선안)

  • Choi, Jin-Woo;Khim, Jong Seong;Song, Sung Joon;Ryu, Jongseong;Kwon, Bong-Oh
    • The Sea: Journal of the Korean Society of Oceanography / v.26 no.3 / pp.263-276 / 2021
  • Marine benthic organisms have been used as indicators for environmental assessment and have recently come to be considered a very important component of biodiversity and ecosystem restoration. In Korean waters, quantitative data on marine benthos have been used as a major component of marine pollution assessment for 50 years, since the 1970s. Species identification, a critical factor for quantitative biological data, has been performed mainly by marine benthic ecologists. This has led to deterioration in the quality of marine benthos data through misidentification of major taxonomic groups, owing to the lack of taxonomic expertise in Korea. This problem remains unsolved and persists in most data from national research projects on marine ecosystems in Korean waters. Here we introduce the quality assurance and control (QA/QC) system for marine biological data in the UK, the NMBAQC (Northeast Atlantic Marine Biological Analytical and Quality Control) Scheme, operated by private companies to address similar species-identification problems. The scheme requires all marine laboratories wishing to participate in any national monitoring program in the UK to maintain a high level of identification competence through internal quality assurance, and it provides a series of taxonomic workshops and literature to build capability. It also performs external quality control of marine laboratories through Ring Tests using standard specimens of various faunal groups. In Korea, the two existing national institutions have few taxonomic experts and therefore cannot solve the taxonomic problems in marine benthic fauna data.
We offer a few suggestions for solving the taxonomic problems in Korean marine biological data over the short and long term: (1) the identification of all dominant species in marine biological data should be confirmed by taxonomic experts; (2) all national research programs should include taxonomic experts; and (3) a private body, such as a Korea Marine Organism Identification Association (KMOIA), should be established to operate the QA/QC system for marine organisms and support Korean marine laboratories by providing taxonomic literature and species-identification workshops. The last suggestion will require further effort and time, gathering detailed proposals and opinions from diverse stakeholders in Korea.

A Checklist to Improve the Fairness in AI Financial Service: Focused on the AI-based Credit Scoring Service (인공지능 기반 금융서비스의 공정성 확보를 위한 체크리스트 제안: 인공지능 기반 개인신용평가를 중심으로)

  • Kim, HaYeong;Heo, JeongYun;Kwon, Hochang
    • Journal of Intelligence and Information Systems / v.28 no.3 / pp.259-278 / 2022
  • With the spread of Artificial Intelligence (AI), various AI-based services such as service recommendation, automated customer response, fraud detection systems (FDS), and credit scoring are expanding in the financial sector. At the same time, problems related to reliability and unexpected social controversy are arising from the nature of data-based machine learning. Against this background, this study aims to contribute to improving trust in AI-based financial services by proposing a checklist for securing fairness in AI-based credit scoring services, which directly affect consumers' financial lives. Among the key elements of trustworthy AI (transparency, safety, accountability, and fairness), fairness was selected as the subject of the study so that everyone can enjoy the benefits of automated algorithms from the perspective of inclusive finance, without social discrimination. Through a literature review, we divided the fairness-related operation process into three areas: data, algorithms, and users. For each area we constructed four detailed considerations, resulting in a 12-item checklist. The relative importance and priority of the categories were evaluated through the Analytic Hierarchy Process (AHP) using three groups representing financial stakeholders: financial-sector workers, AI practitioners, and general users. Analysis by stakeholder group identified, from a practical perspective, specific checks such as verifying the feasibility of using training data and non-financial information and monitoring newly inflowing data, while general financial consumers placed high importance on the accuracy of result analysis and bias checks. We expect these results to contribute to the design and operation of fair AI-based financial services.
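The AHP weighting referred to above derives priorities from a reciprocal pairwise-comparison matrix via its principal eigenvector; a minimal sketch, where the 3×3 matrix (data vs. algorithm vs. user areas) is illustrative and not the study's survey result:

```python
# Sketch of deriving checklist-area weights with the Analytic Hierarchy
# Process (AHP): the priority vector is the normalized principal eigenvector
# of the pairwise-comparison matrix.
import numpy as np

def ahp_weights(pairwise):
    """Normalized principal eigenvector of a reciprocal comparison matrix."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    principal = vecs[:, np.argmax(vals.real)].real
    return principal / principal.sum()

# Illustrative judgments: data matters 3x more than algorithms, 5x more than users
pairwise = [
    [1,     3,     5],
    [1 / 3, 1,     3],
    [1 / 5, 1 / 3, 1],
]
print(ahp_weights(pairwise))  # weights sum to 1, ordered data > algorithm > user
```

In practice, a consistency ratio check would accompany this before accepting the weights.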

Comparative study of flood detection methodologies using Sentinel-1 satellite imagery (Sentinel-1 위성 영상을 활용한 침수 탐지 기법 방법론 비교 연구)

  • Lee, Sungwoo;Kim, Wanyub;Lee, Seulchan;Jeong, Hagyu;Park, Jongsoo;Choi, Minha
    • Journal of Korea Water Resources Association / v.57 no.3 / pp.181-193 / 2024
  • The increasing atmospheric imbalance caused by climate change leads to heavier precipitation and hence a higher frequency of flooding, creating a growing need for technology to detect and monitor such events. To minimize flood damage, continuous monitoring is essential, and flooded areas can be detected with Synthetic Aperture Radar (SAR) imagery, which is unaffected by weather conditions. The observed data undergo a preprocessing step in which a median filter reduces noise. Classification techniques were then applied to separate water bodies from non-water bodies in order to evaluate each method's effectiveness for flood detection. In this study, the Otsu method and the Support Vector Machine (SVM) were used for this classification, and overall model performance was assessed with a confusion matrix. Flood-detection suitability was evaluated by comparing the Otsu method, an optimal threshold-based classifier, with the SVM, a machine-learning technique that minimizes misclassification through training. The Otsu method delineated boundaries between water and non-water bodies well but produced more misclassifications due to the influence of mixed materials; the SVM yielded a lower false-positive rate, was less sensitive to mixed materials, and consequently showed higher accuracy under non-flood conditions. While the Otsu method showed slightly higher accuracy than the SVM under flood conditions, the difference was less than 5% (Otsu: 0.93, SVM: 0.90). In pre- and post-flood conditions, however, the accuracy difference exceeded 15%, indicating that the SVM is more suitable for water-body and flood detection (Otsu: 0.77, SVM: 0.92). Based on these findings, more accurate detection of water bodies and floods is expected to contribute to minimizing flood-related damage and losses.
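The Otsu method mentioned above picks the threshold that maximizes between-class variance over a histogram; a self-contained sketch on a 1-D stand-in for SAR backscatter values (the two synthetic clusters below are illustrative, not the study's imagery):

```python
# Minimal sketch of Otsu's threshold: scan candidate thresholds over the
# histogram and keep the one maximizing between-class variance, splitting
# low-backscatter "water" pixels from high-backscatter "land" pixels.
import numpy as np

def otsu_threshold(values, bins=256):
    """Threshold maximizing between-class variance over a histogram."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = centers[0], -1.0
    for k in range(1, bins):
        w0, w1 = p[:k].sum(), p[k:].sum()          # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        m0 = (p[:k] * centers[:k]).sum() / w0      # class means
        m1 = (p[k:] * centers[k:]).sum() / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[k]
    return best_t

# Two well-separated clusters standing in for water (low, ~-20 dB) and
# land (high, ~-8 dB) backscatter
rng = np.random.default_rng(0)
values = np.concatenate([rng.normal(-20, 1, 500), rng.normal(-8, 1, 500)])
t = otsu_threshold(values)
print(t)  # falls between the two clusters
```

The study's observation that Otsu struggles with mixed materials corresponds to the histogram's two modes overlapping, which is where a trained classifier such as an SVM gains its advantage.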

Three-Dimensional Positional Accuracy Analysis of UAV Imagery Using Ground Control Points Acquired from Multisource Geospatial Data (다종 공간정보로부터 취득한 지상기준점을 활용한 UAV 영상의 3차원 위치 정확도 비교 분석)

  • Park, Soyeon;Choi, Yoonjo;Bae, Junsu;Hong, Seunghwan;Sohn, Hong-Gyoo
    • Korean Journal of Remote Sensing / v.36 no.5_3 / pp.1013-1025 / 2020
  • The Unmanned Aerial Vehicle (UAV) platform is widely used in disaster monitoring and smart cities, with the advantage of quickly acquiring images of small areas at low cost. Ground Control Points (GCPs) for positioning UAV images are essential to achieve cm-level accuracy when producing UAV-based orthoimages and Digital Surface Models (DSMs), but on-site acquisition of GCPs takes considerable manpower and time. This research aims to provide an efficient and accurate way to replace on-site GNSS surveying with three different sources of geospatial data: 1) 25 cm aerial orthoimages and a Digital Elevation Model (DEM) based on a 1:1000 digital topographic map; 2) point-cloud data acquired by a Mobile Mapping System (MMS); and 3) hybrid point-cloud data created by merging the MMS data with UAV data. For each dataset, the three-dimensional positional accuracy of the UAV-based orthoimage and DSM was analyzed by comparing the three-dimensional coordinates of independent check points with those from an RTK-GNSS survey. The third case, combining MMS and UAV data, was the most accurate, with RMSEs of 8.9 cm horizontally and 24.5 cm vertically. In addition, the distribution of the geospatial GCPs was shown to affect vertical accuracy more than horizontal accuracy.
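The horizontal/vertical RMSE comparison against RTK-GNSS check points can be sketched as follows; the coordinate pairs are invented for illustration (units: metres), not the study's survey data:

```python
# Sketch of computing horizontal and vertical RMSE between photogrammetric
# check-point coordinates and RTK-GNSS reference coordinates.
import math

def rmse_3d(estimated, surveyed):
    """Return (horizontal RMSE, vertical RMSE) over (x, y, z) check-point pairs."""
    n = len(estimated)
    sq_h = sum((ex - sx) ** 2 + (ey - sy) ** 2
               for (ex, ey, _), (sx, sy, _) in zip(estimated, surveyed))
    sq_v = sum((ez - sz) ** 2
               for (_, _, ez), (_, _, sz) in zip(estimated, surveyed))
    return math.sqrt(sq_h / n), math.sqrt(sq_v / n)

# Two illustrative check points: (x, y, z) from the orthoimage/DSM vs. RTK-GNSS
est = [(10.05, 20.00, 5.10), (30.00, 40.08, 6.00)]
ref = [(10.00, 20.00, 5.00), (30.00, 40.00, 5.80)]
h, v = rmse_3d(est, ref)
print(round(h, 3), round(v, 3))
```

Vertical errors dominating horizontal ones, as in this toy example, mirrors the 8.9 cm vs. 24.5 cm pattern reported above.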

The Results of the Application of a Real-time Chemical Exposure Monitoring System in a Workplace (스마트 센서 세트를 활용한 화학물질 상시모니터링 시스템의 작업현장 적용 결과)

  • Wook Kim;Jangjin Ryoo;Jongdeok Jung;Gwihyun Park;Giyeong Kim;Jinju Kang;Kihyo Jung;Seunghon Ham
    • Journal of Korean Society of Occupational and Environmental Hygiene / v.33 no.2 / pp.215-229 / 2023
  • Objectives: To validate the effectiveness of a real-time chemical exposure monitoring system developed by KOSHA (Korea Occupational Safety and Health Agency), we applied the system to a workplace in the electronics industry for 153 days. Methods: The monitoring system consisted of a PID chemical sensor, LTE communication equipment, and a web-based platform. To monitor chemical exposure, four sets of sensors were placed at two manufacturing tasks, inspection and jig cleaning, which used TCE as a degreasing agent. We reviewed previous work-environment measurement reports and conducted a new work-environment measurement on one day during the period. The PID sensors detected chemical exposure levels in the workplace every second and transmitted them to the platform; daily average and maximum exposure levels were also recorded. Results: We compared the results from the real-time monitoring system with work-environment measurements obtained by traditional methods. The real-time data were generally higher because the sensors were closer to the chemical source. We found that 28% of the jig-cleaning task data exceeded the STEL (short-term exposure limit). Peak exposure levels from the sensor data were useful for understanding the characteristics of the task's chemical use. Limitations and implications for adopting the system to prevent poisoning by chemical substances were reviewed. Conclusions: The real-time chemical exposure monitoring system was an efficient tool for preventing occupational diseases caused by chemical exposure, such as acute poisoning. Further research is needed to improve the reliability and applicability of the system, and forming a social consensus around the system is essential.
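Checking per-second readings against a STEL, as described above, amounts to flagging 15-minute rolling averages that exceed the limit; a minimal sketch, where the 25 ppm limit and the readings are assumptions for illustration (a real check would use the applicable regulatory limit and calibrated sensor values):

```python
# Sketch of flagging short-term exposure limit (STEL) exceedances in a
# per-second concentration stream: the STEL applies to a 15-minute (900 s)
# average, so we test a trailing 900-sample rolling mean against the limit.

def stel_exceedances(readings_ppm, stel_ppm=25.0, window=900):
    """Return indices where the trailing 15-min mean exceeds the STEL."""
    flagged, running = [], 0.0
    for i, value in enumerate(readings_ppm):
        running += value
        if i >= window:
            running -= readings_ppm[i - window]  # drop sample leaving the window
        if i >= window - 1 and running / window > stel_ppm:
            flagged.append(i)
    return flagged

# 900 s at a steady 30 ppm: the first full 15-min window already exceeds 25 ppm
readings = [30.0] * 900
print(stel_exceedances(readings))  # [899]
```

The same rolling structure supports the daily average and peak statistics the platform records.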

An Exploratory Study on Smart Wearable and Game Service Design for U-Silver Generation: U-Hospital Solution for the Induction of Interest to Carry Out Personalized Exercise Prescription (U-실버세대를 위한 스마트 웨어러블 및 연동 게임의 서비스 디자인 방안 탐색: 개인 맞춤형 운동처방 실행을 위한 흥미 유도 목적의 U-Hospital 솔루션)

  • Park, Su Youn;Lee, Joo Hyeon
    • Science of Emotion and Sensibility / v.22 no.1 / pp.23-34 / 2019
  • The U-Healthcare era has evolved with the development of the Internet of Things (IoT) in the early stages of the connected society, and many changes, such as increased well-being and the extension of human life, are already evident across cultures. Korea became an aged society in 2017, and its silver industry is expected to grow rapidly by adopting the IoT of a hyper-connected society. In particular, the senior-shift phenomenon has increased interest in promoting the health and well-being of the emergent silver generation, which, unlike the existing silver generation, is highly active and wields great economic power. This study conducted in-depth interviews to investigate the characteristics of this new silver generation and to design a wearable serious game intended to boost the interest of the elderly in exercise and fitness activities according to personalized physical training regimes prescribed through the U-Hospital service. The usage scenario of the wearable serious game for the 'U-silver generation' is derived from social necessity: medical professionals can use it for health examinations and to monitor the rehabilitation of senior patients, while the elderly can use it to request checkups or communicate with their healthcare providers. The game also aims to mitigate concerns about the deterioration of the silver generation's physical functions by applying personalized exercise prescriptions. The investigation revealed that online and offline community activities should be merged to meet the silver generation's daily needs for connection and friendship, and that the sustainability of the serious game should be enhanced by instilling a sense of accomplishment as players rise through its levels. The game is designed specifically for a silver generation inexperienced with digital devices: simple rules are applied to a familiar interface themed around the gourmet travel preferred by the target players to increase usability.

Development of a complex failure prediction system using Hierarchical Attention Network (Hierarchical Attention Network를 이용한 복합 장애 발생 예측 시스템 개발)

  • Park, Youngchan;An, Sangjun;Kim, Mintae;Kim, Wooju
    • Journal of Intelligence and Information Systems / v.26 no.4 / pp.127-148 / 2020
  • A data center is a physical facility for accommodating computer systems and related components and is an essential foundation for next-generation core industries such as big data, smart factories, wearables, and smart homes. In particular, with the growth of cloud computing, proportional expansion of data center infrastructure is inevitable. Monitoring the health of data center facilities is a way to maintain and manage the system and prevent failure: a failure in one element of the facility may affect not only the relevant equipment but also other connected equipment, causing enormous damage. IT facilities in particular fail irregularly because of their interdependence, and the cause is hard to determine. Previous studies on failure prediction in data centers treated each server as a single, independent state, without assuming interaction between devices. In this study, therefore, data center failures were classified into failures occurring inside the server (Outage A) and failures occurring outside the server (Outage B), with the analysis focused on complex failures occurring within servers. Server-external failures include power, cooling, and user errors; since such failures can be prevented in the early stages of data center construction, various solutions already exist. The cause of a failure occurring inside a server, by contrast, is difficult to determine, and adequate prevention has not yet been achieved, in particular because server failures do not occur singly: one failure can cause, or be caused by, failures on other servers. In other words, while existing studies analyzed failures on the assumption that servers do not affect one another, this study assumes that failures propagate between servers.
To define complex failure situations in the data center, failure history data for each piece of equipment in the data center were used. Four major failure types were considered: Network Node Down, Server Down, Windows Activation Services Down, and Database Management System Service Down. The failures occurring on each device were sorted chronologically, and when a failure on one device was followed by a failure on another device within 5 minutes, the failures were defined as occurring simultaneously. After constructing sequences of devices that failed simultaneously, the 5 devices that most frequently co-occurred within these sequences were selected, and their simultaneous failures were confirmed through visualization. Since the server resource information collected for failure analysis is time-series data with temporal flow, we used Long Short-Term Memory (LSTM), a deep learning algorithm that can predict the next state from previous states. In addition, unlike the single-server case, a Hierarchical Attention Network structure was used to account for the fact that each server's contribution to a complex failure differs; this method improves prediction accuracy by weighting servers according to their impact on the failure. The study began by defining the failure types and selecting the analysis targets. In the first experiment, the same data were treated both as single-server states and as multi-server states, and the results were compared. The second experiment improved prediction accuracy for complex failures by optimizing a threshold for each server.
In the first experiment, under the single-server assumption, three of the five servers were predicted not to have failed even though failures actually occurred; under the multi-server assumption, all five servers were correctly predicted to have failed. This result supports the hypothesis that servers affect one another and confirms that prediction performance is superior when multiple servers are assumed rather than a single server. In particular, applying the Hierarchical Attention Network algorithm, which assumes that each server's influence differs, improved the analysis, and applying a different threshold for each server further improved prediction accuracy. This study shows that failures whose causes are difficult to determine can be predicted from historical data and presents a model that can predict failures occurring on servers in data centers. The results are expected to help prevent failures in advance.
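The 5-minute co-occurrence rule described above, by which chronologically sorted failure events are grouped into simultaneous-failure sequences, can be sketched as follows; the event data (device name, epoch seconds) are invented for illustration:

```python
# Sketch of the 5-minute co-occurrence rule for defining a "complex failure":
# sort failure events by time, then chain any event within 300 s of the
# previous event into the same simultaneous-failure group.

def group_cooccurring(events, window=300):
    """Group time-sorted (device, timestamp) failures within `window` seconds."""
    events = sorted(events, key=lambda e: e[1])
    groups = []
    for device, ts in events:
        if groups and ts - groups[-1][-1][1] <= window:
            groups[-1].append((device, ts))   # within 5 min of the last event
        else:
            groups.append([(device, ts)])     # start a new group
    return groups

events = [
    ("server-3", 1000), ("network-node-1", 1120),  # within 5 min of each other
    ("db-2", 1400),                                # chains onto the same group
    ("server-7", 5000),                            # isolated failure
]
groups = group_cooccurring(events)
print([len(g) for g in groups])  # [3, 1]
```

Groups with more than one member are the complex-failure sequences from which the most frequently co-occurring devices would then be selected.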