• Title/Abstract/Keyword: Mean Absolute Error (평균절대오차)

Search results: 274 items (processing time 0.028 s)

Prelaunch Study of Validation for the Geostationary Ocean Color Imager (GOCI) (정지궤도 해색탑재체(GOCI) 자료 검정을 위한 사전연구)

  • Ryu, Joo-Hyung;Moon, Jeong-Eon;Son, Young-Baek;Cho, Seong-Ick;Min, Jee-Eun;Yang, Chan-Su;Ahn, Yu-Hwan;Shim, Jae-Seol
    • Korean Journal of Remote Sensing / Vol. 26, No. 2 / pp. 251-262 / 2010
  • To provide quantitative control of the standard products of the Geostationary Ocean Color Imager (GOCI), on-board radiometric correction, atmospheric correction, and the bio-optical algorithm must be maintained continuously through comprehensive and consistent calibration and validation procedures. The calibration/validation of GOCI radiometric, atmospheric, and bio-optical data uses temperature, salinity, ocean optics, fluorescence, and turbidity data sets from buoy and platform systems, together with periodic oceanic environmental data. For GOCI calibration and validation, we compared radiometric data between in-situ measurements and the HyperSAS instrument installed at the Ieodo ocean research station, and between HyperSAS and SeaWiFS radiance. HyperSAS radiance and irradiance differed slightly from the in-situ measurements but showed no spectral shift in the absorption bands. Although the radiance bands compared between HyperSAS and SeaWiFS had an average error of 25%, the absolute error fell to 11% when the atmospheric correction bands were omitted. This error is related to the SeaWiFS standard atmospheric correction process and must be considered and reduced in GOCI calibration and validation. A reference target site around Dokdo Island was used for the calibration and validation study, and in-situ ocean- and bio-optical data were collected in August and October 2009. Reflectance spectra around Dokdo Island showed the optical characteristics of Case-1 water, and the absorption spectra of chlorophyll, suspended matter, and dissolved organic matter showed their expected spectral features. MODIS Aqua-derived chlorophyll-a concentration correlated well with the in-situ fluorometer installed on the Dokdo buoy. As the problems of radiometric, atmospheric, and bio-optical correction are resolved, the quality of GOCI calibration and validation can continue to improve.
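The band-by-band comparison described in this abstract reduces to an average absolute percentage error over matched radiance bands, recomputed with the atmospheric-correction bands excluded. Below is a minimal sketch of that computation; the band values and the choice of near-infrared bands treated as atmospheric-correction bands are illustrative assumptions, not data from the paper.

```python
import numpy as np

def mean_abs_percent_error(reference, measured):
    """Average absolute percentage error between two matched band sets."""
    reference = np.asarray(reference, dtype=float)
    measured = np.asarray(measured, dtype=float)
    return float(np.mean(np.abs(measured - reference) / np.abs(reference)) * 100.0)

# Hypothetical per-band radiances; values are illustrative only.
bands_nm = np.array([412, 443, 490, 555, 660, 680, 745, 865])
seawifs  = np.array([1.80, 1.65, 1.30, 0.90, 0.45, 0.40, 0.20, 0.10])
hypersas = np.array([1.60, 1.50, 1.25, 0.95, 0.50, 0.42, 0.30, 0.17])

# Assumed near-infrared bands used by the standard atmospheric correction.
atm_corr_bands = {745, 865}
keep = np.array([b not in atm_corr_bands for b in bands_nm])

print("all bands  :", round(mean_abs_percent_error(seawifs, hypersas), 1), "%")
print("without NIR:", round(mean_abs_percent_error(seawifs[keep], hypersas[keep]), 1), "%")
```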

Derivation of Stem Taper Equations and a Stem Volume Table for Quercus acuta in a Warm Temperate Region (난대지역 붉가시나무의 수간곡선식 도출 및 수간재적표 작성)

  • Suyoung Jung;Kwangsoo Lee;Hyunsoo Kim;Joonhyung Park;Jaeyeop Kim;Chunhee Park;Yeongmo Son
    • Journal of Korean Society of Forest Science / Vol. 112, No. 4 / pp. 417-425 / 2023
  • The aim of this study was to derive stem taper equations for Quercus acuta, one of the main evergreen broad-leaved tree species of warm temperate regions, and to prepare a stem volume table based on those equations. A total of 688 trees collected from Jeonnam-do, Gyeongnam-do, and Jeju-do were used in the analysis. The stem taper models applied to derive the stem curve pattern were the Max and Burkhart, Kozak, and Lee models. Among the three, the Kozak model explained the stem curve shape of Q. acuta best, with a fitness index of 0.9583, bias of 0.0352, percentage of estimated standard error of 1.1439, and mean absolute deviation of 0.6751; the stem taper of Q. acuta was therefore estimated with the Kozak model. Stem volume was then calculated by applying the Smalian formula to the diameter and height of each stem interval. In addition, an analysis of variance (ANOVA) was conducted to compare the two existing Q. acuta stem volume tables (2007 and 2010) with the newly created table (2023). This analysis revealed that the stem volume table constructed in the Wando region in 2007 gave volumes about twice as large as those of the tables constructed in 2010 and 2023. The stem volume table (2023) developed in this study rests not only on a wide regional collection range and a large number of sample trees but also on a sound scientific basis, so it can be used at the national level as the official stem volume table for Q. acuta.
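Smalian's formula, mentioned above, estimates the volume of a stem section as the mean of the cross-sectional areas at its two ends multiplied by the section length. The following sketch accumulates section volumes along a stem; the diameters, heights, and section lengths are illustrative values, not measurements from the study.

```python
import math

def smalian_section_volume(d_lower_cm: float, d_upper_cm: float, length_m: float) -> float:
    """Section volume (m^3) from end diameters (cm) via Smalian's formula."""
    a_lower = math.pi * (d_lower_cm / 100.0) ** 2 / 4.0  # cross-sectional area, m^2
    a_upper = math.pi * (d_upper_cm / 100.0) ** 2 / 4.0
    return (a_lower + a_upper) / 2.0 * length_m

# Illustrative stem-interval measurements: diameters (cm) at successive heights (m).
heights_m    = [0.2, 1.2, 2.2, 3.2, 4.2]
diameters_cm = [24.0, 21.5, 19.0, 16.0, 12.5]

stem_volume = sum(
    smalian_section_volume(d1, d2, h2 - h1)
    for (h1, d1), (h2, d2) in zip(zip(heights_m, diameters_cm),
                                  zip(heights_m[1:], diameters_cm[1:]))
)
print(f"stem volume ≈ {stem_volume:.4f} m^3")
```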

Evaluation of Error Factors in Quantitative Analysis of Lymphoscintigraphy (Lymphoscintigraphy의 정량분석 시 오류 요인에 관한 평가)

  • Yeon, Joon-Ho;Kim, Soo-Yung;Choi, Sung-Ook;Seok, Jae-Dong
    • The Korean Journal of Nuclear Medicine Technology / Vol. 15, No. 2 / pp. 76-82 / 2011
  • Purpose: Lymphoscintigraphy is a standard examination for diagnosing lymphatic disorders and evaluating response to treatment, and it is useful for planning therapy for lymphedema. In lymphoscintigraphy of lower-extremity lymphedema, the results are affected if the patient does not assume the same position at the 1-minute, 1-hour, and 2-hour examinations after injection. We therefore studied how to improve reliability by minimizing the quantitative analysis errors caused by these factors. Materials and Methods: Using a GE Infinia camera, we injected $^{99m}Tc$-phytate 37 MBq (1.0 mCi), in four syringes, subcutaneously into the feet of 40 patients at Samsung Medical Center from June to August 2010. After acquiring images under fixed and unfixed conditions, we confirmed how the count values change with attenuation by soft tissue and bone according to foot position. To check the count difference caused by the change in distance that accompanies different foot positions, we measured five times, increasing the distance between a $^{99m}Tc$ point source and the detector by 2 cm each time. Finally, we compared 1-minute and 6-minute lymphoscintigraphy images acquired in the same position to check how movement of $^{99m}Tc$-phytate along the lymphatic duct affects the quantitative results. Results: The percentage difference between fixed and unfixed foot positions at 1 minute after injection ranged from 2.7% to 25.8%. As the distance was increased in 2 cm steps, the count values were 173,661 (2 cm), 172,095 (4 cm), 170,996 (6 cm), 167,677 (8 cm), and 169,208 counts (10 cm) against a mean baseline of 176,587 counts, and the percentage differences did not exceed 2.5% (1.27, 1.79, 2.04, 2.42, and 2.35%). Movement in the lymphatic duct during the 6 minutes between injection and scanning produced differences of 0.15% to 2.3%. Excluding the contributions of distance (2.42%) and lymphatic-duct movement (2.3%), attenuation by soft tissue and bone alone therefore accounted for errors of over 20%. Conclusion: When patients with lymphedema assumed different foot positions at the 1-minute, 1-hour, and 2-hour examinations in lymphoscintigraphy, which evaluates lymphatic flow and quantifies uptake by the lymphatic system, the maximum error caused by attenuation of soft tissue and bone was 25.8%, and PASW (Predictive Analytics Software) analysis showed that the fixed and unfixed foot positions differed significantly. The distance between the detector and the feet and the change in counts due to different examination start times after injection influence the quantitative results only partially. Therefore, foot position should be fixed in lymphoscintigraphy with quantitative analysis, making full use of a fixing board.
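The error figures above are percentage differences of ROI counts between conditions. The paper's exact formula is not stated in the abstract; the sketch below assumes a simple difference relative to the reference (fixed-position) count, and the unfixed count shown is an illustrative number, not the study's raw data.

```python
def percent_difference(reference_counts: float, measured_counts: float) -> float:
    """Percentage difference of a measured ROI count relative to a reference count.

    Assumed definition: |reference - measured| / reference * 100.
    """
    return abs(reference_counts - measured_counts) / reference_counts * 100.0

# Illustrative ROI counts: fixed vs. unfixed foot position at 1 minute after injection.
fixed_counts   = 176_587
unfixed_counts = 131_000

print(f"{percent_difference(fixed_counts, unfixed_counts):.1f} %")
```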


Scalable Collaborative Filtering Technique based on Adaptive Clustering (적응형 군집화 기반 확장 용이한 협업 필터링 기법)

  • Lee, O-Joun;Hong, Min-Sung;Lee, Won-Jin;Lee, Jae-Dong
    • Journal of Intelligence and Information Systems / Vol. 20, No. 2 / pp. 73-92 / 2014
  • An adaptive clustering-based collaborative filtering technique is proposed to address the fundamental problems of collaborative filtering, such as the cold-start, scalability, and data sparsity problems. Previous collaborative filtering techniques produce recommendations from the predicted preference of a user for a particular item, computed from a similar-item subset and a similar-user subset that are composed from the users' preferences for items. For this reason, if the density of the user preference matrix is low, the reliability of the recommendation system drops rapidly and composing the similar-item and similar-user subsets becomes difficult. In addition, as the scale of the service grows, the time needed to create these subsets increases geometrically, which increases the response time of the recommendation system. To solve these problems, this paper suggests a collaborative filtering technique that actively adapts conditions to the model and adopts concepts from context-based filtering. The technique consists of four major methodologies. First, items and users are clustered according to their feature vectors, and an inter-cluster preference between each item cluster and user cluster is then estimated. This economizes the run-time needed to create a similar-item or similar-user subset, yields higher reliability than using user preference information alone, and partially solves the cold-start problem. Second, recommendations are made using the previously composed item and user clusters and the inter-cluster preferences. In this phase, an item list is built for each user by examining the item clusters in decreasing order of the inter-cluster preference of the cluster to which the user belongs, and selecting and ranking items according to the predicted or recorded user preference information. With this method the model-building phase bears the highest load of the recommendation system, which minimizes the run-time load; highly reliable collaborative filtering can therefore be applied to large-scale recommendation systems without scalability problems. Third, missing user preference information is predicted using the item and user clusters, which mitigates the problem caused by the low density of the user preference matrix. Existing studies used either item-based or user-based prediction; here, Hao Ji's idea of using both was improved by combining the predictive values of the two techniques according to the conditions of the recommendation model, which improves the reliability of the recommendation service. Predicting user preferences from the item or user clusters also reduces prediction time, so missing preferences can be predicted at run-time. Fourth, the item and user feature vectors are made to learn from subsequent user feedback by applying normalized feedback to the vectors. This mitigates the problems that follow from adopting context-based filtering concepts, namely item and user feature vectors built from user profiles and item properties, whose weakness is the difficulty of quantifying the qualitative features of items and users. The elements of the user and item feature vectors are therefore matched one to one, and when feedback for a particular item is obtained, it is applied to the opposite feature vector. The method was verified by comparing its performance with existing hybrid filtering techniques using two measures: MAE (Mean Absolute Error) and response time. The MAE results confirmed that the technique improves the reliability of the recommendation system, and the response-time results showed that it is suitable for large-scale recommendation systems. This paper thus suggests an adaptive clustering-based collaborative filtering technique with high reliability and low time complexity, but it has limitations: because it focuses on reducing time complexity, a large improvement in reliability is not expected. Future work will improve the technique with rule-based filtering.
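The evaluation described above reduces to predicting ratings from the inter-cluster preferences and scoring the predictions with MAE over held-out ratings. The sketch below illustrates that flow; the cluster assignments, inter-cluster preference matrix, and test ratings are hypothetical and do not come from the paper.

```python
import numpy as np

# Hypothetical cluster assignments and inter-cluster mean preferences (1-5 scale).
user_cluster = {"u1": 0, "u2": 1, "u3": 0}
item_cluster = {"i1": 0, "i2": 1, "i3": 1}
inter_cluster_pref = np.array([[4.2, 2.8],   # user cluster 0 vs. item clusters 0, 1
                               [3.1, 4.5]])  # user cluster 1 vs. item clusters 0, 1

def predict(user_id: str, item_id: str) -> float:
    """Predict a user's preference for an item from the inter-cluster preference."""
    return float(inter_cluster_pref[user_cluster[user_id], item_cluster[item_id]])

# Held-out (user, item, true rating) triples for evaluation; values are illustrative.
test_set = [("u1", "i1", 4.0), ("u1", "i2", 3.0), ("u2", "i3", 5.0), ("u3", "i2", 2.5)]

errors = [abs(predict(u, i) - r) for u, i, r in test_set]
mae = sum(errors) / len(errors)
print(f"MAE = {mae:.3f}")
```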