• Title/Summary/Keyword: cloud observation data


Fusion of Aerosol Optical Depth from the GOCI and the AHI Observations (GOCI와 AHI 자료를 활용한 에어로졸 광학두께 합성장 산출 연구)

  • Kang, Hyeongwoo;Choi, Wonei;Park, Jeonghyun;Kim, Serin;Lee, Hanlim
    • Korean Journal of Remote Sensing / v.37 no.5_1 / pp.861-870 / 2021
  • In this study, fused Aerosol Optical Depth (AOD) data were produced using AOD products from the Geostationary Ocean Color Imager (GOCI) onboard the Communication, Ocean and Meteorological Satellite (COMS) and the Advanced Himawari Imager (AHI) onboard Himawari-8. Since the spatial resolution and the coordinate system of the two satellite sensors differ, preprocessing was performed first. Then, using the level 1.5 AOD dataset of the ground-based AErosol RObotic NETwork (AERONET), the correlations and trends between each satellite AOD and the AERONET AOD were used to produce fused AOD data that are more accurate than the original satellite AODs. The fused AODs were indeed found to be more accurate: their Root Mean Square Error (RMSE) and mean bias were calculated to be 0.13 and 0.05, respectively, compared with the original GOCI AOD (RMSE: 0.15, mean bias: 0.11) and the original AHI AOD (RMSE: 0.15, mean bias: 0.05). It was also confirmed that the fused AODs have better spatial coverage than the original AODs in areas where a single satellite has no observations due to the presence of cloud.
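The fusion workflow described in this entry can be illustrated with a minimal sketch: each satellite AOD is first mapped onto the AERONET scale through a linear fit on collocated pairs, the corrected values are then combined, and RMSE and mean bias are computed for validation. The simple averaging used below is an illustrative placeholder rather than the paper's exact weighting, and all function names are hypothetical.

```python
import numpy as np

def bias_correction(sat_aod, aeronet_aod):
    """Fit sat_aod ~ a * aeronet_aod + b on collocated pairs and return a
    function that maps a satellite AOD back onto the AERONET scale."""
    a, b = np.polyfit(aeronet_aod, sat_aod, 1)
    return lambda aod: (aod - b) / a

def fuse(goci_aod, ahi_aod, correct_goci, correct_ahi):
    """Combine the two bias-corrected AODs: average where both sensors report
    a value, otherwise keep whichever one is available (e.g. under cloud gaps)."""
    g = correct_goci(np.asarray(goci_aod, dtype=float))
    h = correct_ahi(np.asarray(ahi_aod, dtype=float))
    return np.where(np.isnan(g), h, np.where(np.isnan(h), g, 0.5 * (g + h)))

def rmse_and_mean_bias(estimate, reference):
    """Validation metrics reported in the entry: RMSE and mean bias vs. AERONET."""
    d = np.asarray(estimate, dtype=float) - np.asarray(reference, dtype=float)
    return np.sqrt(np.nanmean(d ** 2)), np.nanmean(d)
```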

Design and Evaluation of an Efficient Flushing Scheme for key-value Store (키-값 저장소를 위한 효율적인 로그 처리 기법 설계 및 평가)

  • Han, Hyuck
    • The Journal of the Korea Contents Association / v.19 no.5 / pp.187-193 / 2019
  • Key-value storage engines are an essential component of many computing environments with growing demand, including social networks, online e-commerce, and cloud services. Recent key-value storage engines offer many features such as transactions, versioning, and replication. In a key-value storage engine, transaction processing provides atomicity through Write-Ahead Logging (WAL), and a synchronous commit method flushes log data before the transaction completes. According to our observation, flushing log data to persistent storage is a performance bottleneck for key-value storage engines due to the significant overhead of fsync() calls, despite the various optimizations of existing systems. In this article, we propose a group synchronization method to improve the performance of the key-value storage engine. We also design and implement a transaction scheduling method to perform other transactions while the system processes fsync() calls. The proposed method is an efficient way to reduce the number of frequent fsync() calls in the synchronous commit while supporting the same transaction guarantees provided by the existing system. We implement our scheme on the WiredTiger storage engine, and our experimental results show that the proposed system improves the performance of key-value workloads over existing systems.
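As a rough illustration of the group synchronization idea described in this entry (not the authors' WiredTiger implementation), the sketch below batches concurrent commits so that a single fsync() makes a whole group of log records durable; the class and method names are hypothetical.

```python
import os
import queue
import threading

class GroupCommitLog:
    """Committing threads enqueue their log records and block until one
    background fsync() covers them, so N concurrent commits share one flush."""

    def __init__(self, path):
        self.fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_APPEND, 0o644)
        self.pending = queue.Queue()
        threading.Thread(target=self._flusher, daemon=True).start()

    def commit(self, record: bytes):
        done = threading.Event()
        self.pending.put((record, done))
        done.wait()                          # returns once the record is durable

    def _flusher(self):
        while True:
            batch = [self.pending.get()]     # wait for the first commit...
            while not self.pending.empty():  # ...then drain whatever piled up
                batch.append(self.pending.get_nowait())
            os.write(self.fd, b"".join(rec for rec, _ in batch))
            os.fsync(self.fd)                # one fsync covers the whole group
            for _, done in batch:
                done.set()
```

A caller would simply create `log = GroupCommitLog('wal.log')` and call `log.commit(record)`; the trade-off is a small added commit latency in exchange for far fewer fsync() calls under concurrency.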

Analysis of Empirical Multiple Linear Regression Models for the Production of PM2.5 Concentrations (PM2.5농도 산출을 위한 경험적 다중선형 모델 분석)

  • Choo, Gyo-Hwang;Lee, Kyu-Tae;Jeong, Myeong-Jae
    • Journal of the Korean earth science society / v.38 no.4 / pp.283-292 / 2017
  • In this study, empirical models were established to estimate surface-level PM2.5 concentrations over Seoul, Korea from 1 January 2012 to 31 December 2013. We used six different multiple linear regression models with aerosol optical thickness (AOT) and Ångström exponent (AE) data from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Terra and Aqua satellites, meteorological data, and planetary boundary layer depth (PBLD) data. The results showed that M6, which uses AOT, AE, relative humidity (RH), wind speed, wind direction, PBLD, and air temperature as input data, was the best empirical model. Statistical analysis between the observed and estimated PM2.5 concentrations using the M6 model showed a correlation of R = 0.62 and a root mean square error of RMSE = 10.70 μg m⁻³. In addition, the relation strongly depends on the season due to the seasonal observation characteristics of AOT, with better correlations in spring (R = 0.66) and autumn (R = 0.75) than in summer and winter (R of about 0.38 and 0.56, respectively). These differences are attributed to cloud contamination in summer and the influence of snow/ice surfaces in winter. Therefore, the empirical multiple linear regression models used in this study showed that the AOT data retrieved from the satellites is a dominant variable, and additional meteorological variables will be needed to improve the PM2.5 estimates. The PM2.5 concentrations calculated with the empirical multiple linear regression model will be useful for monitoring the atmospheric environment from satellite and ground meteorological data.
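A minimal sketch of the kind of empirical multiple linear regression described in this entry, assuming the predictors are supplied as an (N, k) array in the order listed for model M6; this illustrates the general technique, not the authors' exact implementation.

```python
import numpy as np

def fit_pm25_model(predictors, pm25_observed):
    """Fit PM2.5 = b0 + b1*x1 + ... + bk*xk by least squares.
    `predictors` is an (N, k) array whose columns would be, e.g., AOT,
    Angstrom exponent, RH, wind speed, wind direction, PBL depth, and
    air temperature; `pm25_observed` is the matching (N,) vector."""
    X = np.column_stack([np.ones(len(predictors)), predictors])   # add intercept
    coef, *_ = np.linalg.lstsq(X, pm25_observed, rcond=None)
    pm25_est = X @ coef
    r = np.corrcoef(pm25_observed, pm25_est)[0, 1]                # correlation R
    rmse = np.sqrt(np.mean((pm25_observed - pm25_est) ** 2))      # RMSE, ug m^-3
    return coef, r, rmse
```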

Study on the Possibility of Estimating Surface Soil Moisture Using Sentinel-1 SAR Satellite Imagery Based on Google Earth Engine (Google Earth Engine 기반 Sentinel-1 SAR 위성영상을 이용한 지표 토양수분량 산정 가능성에 관한 연구)

  • Younghyun Cho
    • Korean Journal of Remote Sensing / v.40 no.2 / pp.229-241 / 2024
  • With the advancement of big data processing technology using cloud platforms, access to, processing of, and analysis of large-volume data such as satellite imagery have recently improved significantly. In this study, the Change Detection Method, a relatively simple technique for retrieving soil moisture, was applied to the backscattering coefficient values of pre-processed Sentinel-1 synthetic aperture radar (SAR) imagery products on Google Earth Engine (GEE), one such platform, to estimate the surface soil moisture at six observatories within the Yongdam Dam watershed in South Korea for the period 2015 to 2023, as well as the watershed average. Subsequently, a correlation analysis was conducted between the estimated values and actual measurements, along with an examination of the applicability of GEE. The results revealed that the surface soil moisture estimated for small areas around the soil moisture observatories of the watershed exhibited low correlations, ranging from 0.1 to 0.3 for both VH and VV polarizations, likely due to the inherent measurement accuracy of the SAR imagery and variations in data characteristics. However, the watershed-average surface soil moisture, derived by extracting the average SAR backscattering coefficient over the entire watershed and applying moving averages to mitigate data uncertainty and variability, exhibited a significantly improved correlation of about 0.5. These results demonstrate the utility of GEE despite the limitation that only preprocessed SAR data are available, which restricts some desired analyses. The efficient processing of extensive satellite imagery nevertheless allows soil moisture to be estimated and evaluated over broad scopes, such as long-term watershed averages, highlighting the effectiveness of GEE in handling vast satellite imagery datasets for soil moisture assessment. Based on this, it is anticipated that GEE can be effectively utilized to assess long-term variations of average soil moisture in major dam watersheds, in conjunction with soil moisture observations from various locations across the country.
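Below is a minimal sketch of how such a Change Detection retrieval could look with the GEE Python API; the point coordinates, buffer size, date range, and the use of the pixel-wise minimum/maximum backscatter as dry and wet references are illustrative assumptions, not the paper's exact configuration.

```python
import ee

ee.Initialize()

# Illustrative region of interest: a 5 km buffer around a point assumed to lie
# inside the Yongdam Dam watershed (placeholder coordinates).
roi = ee.Geometry.Point([127.5, 35.9]).buffer(5000)

# Pre-processed Sentinel-1 GRD backscatter (dB), VV polarization, IW mode.
s1 = (ee.ImageCollection('COPERNICUS/S1_GRD')
      .filterBounds(roi)
      .filterDate('2015-01-01', '2023-12-31')
      .filter(ee.Filter.eq('instrumentMode', 'IW'))
      .select('VV'))

# Dry and wet references: minimum and maximum backscatter over the period.
sigma_min = s1.min()
sigma_max = s1.max()

def to_relative_soil_moisture(img):
    # Change Detection Method: relative soil moisture scales linearly between
    # the driest (0) and wettest (1) backscatter observed at each pixel.
    sm = img.subtract(sigma_min).divide(sigma_max.subtract(sigma_min))
    return ee.Image(sm.rename('sm').copyProperties(img, ['system:time_start']))

sm_series = s1.map(to_relative_soil_moisture)

def mean_over_roi(img):
    # One area-averaged value per acquisition for the watershed-average series.
    stat = ee.Image(img).reduceRegion(ee.Reducer.mean(), roi, 10)
    return ee.Feature(None, {'sm': stat.get('sm'),
                             'time': ee.Image(img).get('system:time_start')})

watershed_series = ee.FeatureCollection(sm_series.map(mean_over_roi))
```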

Intercomparing the Aerosol Optical Depth Using the Geostationary Satellite Sensors (AHI, GOCI and MI) from Yonsei AErosol Retrieval (YAER) Algorithm (연세에어로졸 알고리즘을 이용하여 정지궤도위성 센서(AHI, GOCI, MI)로부터 산출된 에어로졸 광학두께 비교 연구)

  • Lim, Hyunkwang;Choi, Myungje;Kim, Mijin;Kim, Jhoon;Go, Sujung;Lee, Seoyoung
    • Journal of the Korean earth science society / v.39 no.2 / pp.119-130 / 2018
  • Aerosol Optical Properties (AOPs) are retrieved from geostationary satellite instruments, namely the Geostationary Ocean Color Imager (GOCI), the Meteorological Imager (MI), and the Advanced Himawari Imager (AHI), using the Yonsei AErosol Retrieval (YAER) algorithm. In this study, the aerosol optical depths (AODs) retrieved from each instrument were intercompared and validated with ground-based sunphotometer data from the AErosol RObotic NETwork (AERONET). As a result, the four AOD products derived from the different instruments showed consistent results over land and ocean. However, the AODs from MI and GOCI tend to be overestimated due to cloud contamination. According to the comparison with AERONET, the percentages of retrievals within the expected error (EE) are 36.3, 48.4, 56.6, and 68.2% for MI, GOCI, the AHI minimum reflectivity method (MRM), and the AHI estimated surface reflectance from shortwave infrared (ESR) product, respectively. Since the MI AOD is retrieved from a single visible channel and adopts only one aerosol type per season, its fraction within the EE is lower than that of the other products. On the other hand, the AHI ESR method is more accurate than the minimum reflectivity approach used by GOCI, MI, and AHI MRM in May and June, when vegetation is relatively abundant. These results are also reflected in the RMSE and EE for each AERONET site: the ESR method is better than the other satellite products in terms of EE at 15 of the 22 validation sites, and better in terms of RMSE at 13 sites. In addition, the error at each observation time of each product was examined by exploiting the characteristics of geostationary satellites. The absolute median biases at 00 to 06 Coordinated Universal Time (UTC) are 0.05, 0.09, 0.18, 0.18, 0.14, 0.09, and 0.10. A dependence of the absolute median bias on observation time appeared for MI, whereas for GOCI it appeared only at 00 UTC.
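The "percentage within expected error" statistic quoted in this entry is commonly computed against an envelope of the form ±(a + f · AOD_AERONET); a minimal sketch follows, in which the coefficients 0.05 and 0.15 are a frequently used convention assumed here for illustration, not necessarily those applied to the YAER products.

```python
import numpy as np

def percent_within_expected_error(sat_aod, aeronet_aod, a=0.05, f=0.15):
    """Percentage of retrievals whose absolute error lies inside the
    expected-error envelope +/-(a + f * AOD_AERONET)."""
    sat = np.asarray(sat_aod, dtype=float)
    ref = np.asarray(aeronet_aod, dtype=float)
    envelope = a + f * ref
    within = np.abs(sat - ref) <= envelope
    return 100.0 * np.count_nonzero(within) / within.size
```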

Interannual Variation of the TOMS Total Ozone and Reflectivity over the Globe (전지구에 대한 TOMS 오존전량과 반사율의 경년 변화)

  • Yoo, Jung-Moon;Jeon, Won-Sun
    • Journal of the Korean earth science society / v.21 no.6 / pp.703-718 / 2000
  • In order to investigate the interannual variation of total ozone and reflectivity over the globe, monthly mean Nimbus-7/TOMS data and their anomalies for the period 1979-92 were used. This study also examined MSU channel 4 (Ch4; lower stratosphere) brightness temperature data and two model reanalyses, NCEP and GEOS, to compare the ozone variation with the atmospheric thermal condition. In addition, the MSU channel 1 (Ch1; lower troposphere) brightness temperature was compared with the reflectivity. The ozone showed a strong annual cycle with a downward trend (-6.3±0.6 DU/decade) over the globe, and a more distinct response to volcanic eruptions than to El Niño. The relationships between total ozone and the MSU Ch4 observations, and between the ozone and the reanalyzed lower-stratosphere temperatures, showed positive correlations (0.2-0.7) during 1980-92. Reflectivity increased interannually by 0.2±0.06%/decade over the globe during the same period and reflected El Niño (1982-83, 1991-92) well. Its annual-cycle variability was remarkably smaller in the tropics than at higher latitudes, which is inferred to be due to cloud suppression and tropical upwelling regions. Reflectivity correlated negatively (-0.9) with the Ch1 temperature over the globe, but positively (0.2) over the tropical ocean. The positive value over the ocean results from the effect of microwave emissivity, which increases the Ch1 temperature with enhanced hydrometeor activity. The significant correlations between total ozone and the Ch4 temperature, and between reflectivity and the Ch1 temperature, suggest that the TOMS data can be used to better understand the feedback mechanisms of climate change.
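Interannual trend figures like the DU/decade value quoted in this entry are typically obtained by removing the mean annual cycle from the monthly series and fitting a line to the anomalies; a minimal sketch follows, assuming a series whose length is a multiple of 12 months. It illustrates the general procedure, not the authors' exact processing.

```python
import numpy as np

def trend_per_decade(monthly_values):
    """Remove the mean annual cycle from a monthly series (length assumed to be
    a multiple of 12), fit a linear trend to the anomalies, and return the
    slope expressed per decade (e.g., DU/decade or %/decade)."""
    x = np.asarray(monthly_values, dtype=float)
    climatology = x.reshape(-1, 12).mean(axis=0)          # mean annual cycle
    anomalies = x - np.tile(climatology, len(x) // 12)    # deseasonalized series
    t_years = np.arange(len(x)) / 12.0
    slope, _ = np.polyfit(t_years, anomalies, 1)          # units per year
    return 10.0 * slope                                   # units per decade
```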


Development Process for User Needs-based Chatbot: Focusing on Design Thinking Methodology (사용자 니즈 기반의 챗봇 개발 프로세스: 디자인 사고방법론을 중심으로)

  • Kim, Museong;Seo, Bong-Goon;Park, Do-Hyung
    • Journal of Intelligence and Information Systems / v.25 no.3 / pp.221-238 / 2019
  • Recently, companies and public institutions have been actively introducing chatbot services in the field of customer counseling and response. The introduction of chatbot services not only brings labor cost savings to companies and organizations, but also enables rapid communication with customers. Advances in data analytics and artificial intelligence are driving the growth of these chatbot services. Current chatbots can understand users' questions and offer the most appropriate answers through machine learning and deep learning. The advancement of chatbot core technologies such as NLP, NLU, and NLG has made it possible to understand words, paragraphs, meanings, and emotions. For this reason, the value of chatbots continues to rise. However, technology-oriented chatbots can be inconsistent with what users inherently want, so chatbots need to be addressed in terms of user experience, not just technology. The Fourth Industrial Revolution emphasizes the importance of user experience as well as the advancement of artificial intelligence, big data, cloud, and IoT technologies. The development of IT technology and the importance of user experience have provided people with a variety of environments and changed lifestyles. This means that experiences in interactions with people, services (products), and the environment have become very important. Therefore, it is time to develop user needs-based services (products) that can provide new experiences and value to people. This study proposes a chatbot development process based on user needs by applying the design thinking approach, a representative methodology in the field of user experience, to chatbot development. The process proposed in this study consists of four steps. The first step, 'setting up the knowledge domain', establishes the chatbot's expertise. The second step, 'knowledge accumulation and insight identification', accumulates information corresponding to the configured domain and derives insights. The third step is 'opportunity development and prototyping', at which full-scale development begins. Finally, the 'user feedback' step receives feedback from users on the developed prototype. This creates a 'user needs-based service (product)' that meets the objectives of the process. The process begins with fact gathering through user observation, performs abstraction to derive insights and explore opportunities, and then develops a chatbot that meets the user's needs by structuring the desired information and providing functions that fit the user's mental model. In this study, we present an actual construction example for the domestic cosmetics market to confirm the effectiveness of the proposed process. The domestic cosmetics market was chosen because it shows strong user-experience characteristics, so responses from users can be understood quickly. This study has a theoretical implication in that it proposes a new chatbot development process by incorporating the design thinking methodology into chatbot development. This research differs from existing chatbot development research in that it focuses on user experience, not technology. It also has practical implications in that it proposes realistic methods that companies or institutions can apply immediately. In particular, the process proposed in this study can be accessed and used by anyone, since 'user needs-based chatbots' can be developed even by non-experts. This study suggests that further research is needed because only one field was examined. In addition to the cosmetics market, additional research should be conducted in various fields in which the user experience is prominent, such as the smartphone and automotive markets. Through this, the process can develop into a general process for building chatbots centered on user experience rather than technology.

Performance Evaluation of Monitoring System for Sargassum horneri Using GOCI-II: Focusing on the Results of Removing False Detection in the Yellow Sea and East China Sea (GOCI-II 기반 괭생이모자반 모니터링 시스템 성능 평가: 황해 및 동중국해 해역 오탐지 제거 결과를 중심으로)

  • Han-bit Lee;Ju-Eun Kim;Moon-Seon Kim;Dong-Su Kim;Seung-Hwan Min;Tae-Ho Kim
    • Korean Journal of Remote Sensing / v.39 no.6_2 / pp.1615-1633 / 2023
  • Sargassum horneri is a floating alga that breeds in large quantities in the Yellow Sea and East China Sea and then drifts to the coast of the Republic of Korea, causing various problems such as damaging the environment and fish farms. In order to effectively prevent damage and preserve the coastal environment, the development of Sargassum horneri detection algorithms using satellite-based remote sensing technology has been pursued actively. However, incorrect detection information increases the travel distance of ships collecting Sargassum horneri and causes confusion in the response of related local governments or institutions, so it is very important to minimize false detections when producing Sargassum horneri spatial information. This study applied technology to automatically remove false detection results from the GOCI-II-based Sargassum horneri detection algorithm of the National Ocean Satellite Center (NOSC) of the Korea Hydrographic and Oceanographic Agency (KHOA). Based on an analysis of the causes of the major false detections, the method removes linear and sporadic false detections and treats the green algae that occurs in large quantities along the coast of China in spring and summer as false detections. The automatic false-detection removal was applied to the dates on which Sargassum horneri occurred between February 24 and June 25, 2022. Visual assessment results were generated using mid-resolution satellite images, and qualitative and quantitative evaluations were performed. Linear false detections were completely removed, and most of the sporadic and green-algae false detections that affected the distribution were removed. Even after the automatic false-detection removal process, the distribution area of Sargassum horneri could be confirmed against the visual assessment results, and the accuracy and precision calculated using the binary classification model averaged 97.73% and 95.4%, respectively. The recall was very low at 29.03%, presumably due to the movement of Sargassum horneri caused by the observation time discrepancy between GOCI-II and the mid-resolution satellite images, differences in spatial resolution, location deviation from orthocorrection, and cloud masking. The false-detection removal results of this study make it possible to determine the spatial distribution of Sargassum horneri in near real time, but there are limitations in accurately estimating biomass. Therefore, continuous research on upgrading the Sargassum horneri monitoring system must be conducted so that it can serve as data for establishing future Sargassum horneri response plans.
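The accuracy, precision, and recall values quoted in this entry follow from a standard pixel-wise binary confusion matrix; a minimal sketch, assuming boolean detection and reference masks, is shown below (the function name is hypothetical).

```python
import numpy as np

def detection_scores(detected, reference):
    """Accuracy, precision, and recall for a pixel-wise binary comparison
    between a detection map and a reference map (both boolean arrays)."""
    detected = np.asarray(detected, dtype=bool)
    reference = np.asarray(reference, dtype=bool)
    tp = np.count_nonzero(detected & reference)    # true positives
    fp = np.count_nonzero(detected & ~reference)   # false positives
    fn = np.count_nonzero(~detected & reference)   # false negatives
    tn = np.count_nonzero(~detected & ~reference)  # true negatives
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return accuracy, precision, recall
```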