• Title/Summary/Keyword: Error Analysis


Quantitative Analysis of Carbohydrate, Protein, and Oil Contents of Korean Foods Using Near-Infrared Reflectance Spectroscopy (근적외 분광분석법을 이용한 국내 유통 식품 함유 탄수화물, 단백질 및 지방의 정량 분석)

  • Song, Lee-Seul;Kim, Young-Hak;Kim, Gi-Ppeum;Ahn, Kyung-Geun;Hwang, Young-Sun;Kang, In-Kyu;Yoon, Sung-Won;Lee, Junsoo;Shin, Ki-Yong;Lee, Woo-Young;Cho, Young Sook;Choung, Myoung-Gun
    • Journal of the Korean Society of Food Science and Nutrition
    • /
    • v.43 no.3
    • /
    • pp.425-430
    • /
    • 2014
  • Foods contain various nutrients such as carbohydrates, protein, oil, vitamins, and minerals. Among them, carbohydrates, protein, and oil are the main constituents of foods. Usually, these constituents are analyzed by the Kjeldahl method, the Soxhlet method, and so on. However, these analytical methods are complex, costly, and time-consuming. Thus, this study aimed to rapidly and effectively analyze carbohydrate, protein, and oil contents with near-infrared reflectance spectroscopy (NIRS). A total of 517 food samples were measured within the wavelength range of 400 to 2,500 nm. In total, 412 food calibration samples and 162 validation samples were used for NIRS equation development and validation, respectively. For carbohydrates, the most accurate NIRS equation was obtained under 1, 4, 5, 1 (1st derivative, 4 nm gap, 5-point smoothing, and 1-point second smoothing) math treatment conditions using the weighted MSC (multiplicative scatter correction) scatter correction method with MPLS (modified partial least squares) regression. For protein and oil, the best equations were obtained under 2, 5, 5, 3 and 1, 1, 1, 1 conditions, respectively, using the standard MSC and standard normal variate (SNV)-only scatter correction methods with MPLS regression. These NIRS equations showed very high coefficients of determination in calibration ($R^2$: carbohydrates, 0.971; protein, 0.974; oil, 0.937) and low standard errors of calibration (carbohydrates, 4.066; protein, 1.080; oil, 1.890). The optimal equation conditions were then applied to a validation set of 162 samples. These NIRS equations likewise showed very high coefficients of determination in prediction ($r^2$: carbohydrates, 0.987; protein, 0.970; oil, 0.947) and low standard errors of prediction (carbohydrates, 2.515; protein, 1.144; oil, 1.370). Therefore, these NIRS equations are applicable for the determination of carbohydrate, protein, and oil contents in various foods.
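The calibration and validation statistics quoted above ($R^2$ and standard errors) can be reproduced from paired reference and NIRS-predicted values. A minimal Python sketch (the function name and the $n-1$ denominator are our assumptions; conventions for SEC/SEP denominators vary):

```python
def calibration_stats(measured, predicted):
    """R^2 and standard error for a calibration or validation set.

    measured  : reference values (e.g., Kjeldahl protein contents)
    predicted : values predicted by the NIRS equation
    """
    n = len(measured)
    mean_m = sum(measured) / n
    ss_tot = sum((m - mean_m) ** 2 for m in measured)                # total variation
    ss_res = sum((m - p) ** 2 for m, p in zip(measured, predicted))  # residual variation
    r2 = 1 - ss_res / ss_tot
    se = (ss_res / (n - 1)) ** 0.5   # standard error of calibration/prediction
    return r2, se
```

A validation set whose predictions track the reference values closely yields $r^2$ near 1 and a small standard error, as with the figures reported above.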

The Pattern Analysis of Financial Distress for Non-audited Firms using Data Mining (데이터마이닝 기법을 활용한 비외감기업의 부실화 유형 분석)

  • Lee, Su Hyun;Park, Jung Min;Lee, Hyoung Yong
    • Journal of Intelligence and Information Systems
    • /
    • v.21 no.4
    • /
    • pp.111-131
    • /
    • 2015
  • Only a handful of studies have been conducted on pattern analysis of corporate distress, compared with research on bankruptcy prediction. The few that exist mainly focus on audited firms, because financial data are easier to collect for them. In reality, however, corporate financial distress is a far more common and critical phenomenon for non-audited firms, which mainly comprise small and medium-sized firms. The purpose of this paper is to classify non-audited firms under distress according to their financial ratios using a data mining technique, the Self-Organizing Map (SOM). SOM is a type of artificial neural network that is trained using unsupervised learning to produce a lower-dimensional discretized representation of the input space of the training samples, called a map. SOM differs from other artificial neural networks in that it applies competitive learning, as opposed to error-correction learning such as backpropagation with gradient descent, and in that it uses a neighborhood function to preserve the topological properties of the input space. It is a popular and successful clustering algorithm. In this study, we classify types of financially distressed firms, specifically non-audited firms. In the empirical test, we collected 10 financial ratios of 100 non-audited firms under distress in 2004 for the previous two years (2002 and 2003). Using these financial ratios and the SOM algorithm, five distinct patterns were distinguished. In pattern 1, financial distress was very serious in almost all financial ratios; 12% of the firms fell into this pattern. In pattern 2, financial distress was weak in almost all financial ratios; 14% of the firms fell into this pattern. In pattern 3, the growth ratio was the worst among all patterns. It is speculated that the firms of this pattern may be under distress due to severe competition in their industries. Approximately 30% of the firms fell into this group.
In pattern 4, the growth ratio was higher than in any other pattern, but the cash ratio and profitability ratio were not at the level of the growth ratio. It is concluded that the firms of this pattern were under distress in pursuit of expanding their business. About 25% of the firms were in this pattern. Last, pattern 5 encompassed very solvent firms. Perhaps the firms of this pattern were distressed due to a bad short-term strategic decision or due to problems with the entrepreneurs running them. Approximately 18% of the firms were under this pattern. This study makes both academic and empirical contributions. From the academic perspective, non-audited companies, which tend to go bankrupt easily and whose financial data are unstructured or easily manipulated, are classified with a data mining technique (the Self-Organizing Map), rather than large audited firms whose financial data are well prepared and reliable. From the empirical perspective, even though only the financial data of non-audited firms were analyzed, the results are useful for detecting the first symptoms of financial distress, which supports bankruptcy prediction and early-warning and alert signals. A limitation of this research is that only 100 corporations were analyzed, owing to the difficulty of collecting financial data for non-audited firms; this made it hard to proceed to analysis by category or size. Also, non-financial qualitative data are crucial for the analysis of bankruptcy, so such factors should be taken into account in a subsequent study. This study sheds some light on distress prediction for non-audited small and medium-sized firms.
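The SOM training described above, competitive learning with a shrinking neighborhood, can be sketched in a few lines. This is a toy one-dimensional map in Python, not the authors' implementation; the node count, decay schedules, and seed are illustrative assumptions:

```python
import math
import random

def train_som(data, n_nodes=5, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Tiny 1-D self-organizing map over scalar values (e.g., one
    financial ratio). Competitive learning: the best-matching unit
    (BMU) and its neighbours move toward each sample, while the
    learning rate and neighbourhood width shrink over the epochs."""
    rng = random.Random(seed)
    weights = [rng.uniform(min(data), max(data)) for _ in range(n_nodes)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)             # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5  # decaying neighbourhood width
        for x in data:
            bmu = min(range(n_nodes), key=lambda i: abs(weights[i] - x))
            for i in range(n_nodes):
                # Gaussian neighbourhood function over the map topology
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                weights[i] += lr * h * (x - weights[i])
    return sorted(weights)
```

With two well-separated clusters in the input, the trained node weights spread out so that different nodes (and hence different "patterns") capture each cluster, which is the mechanism behind the five distress patterns above.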

A Prospective Randomized Comparative Clinical Trial Comparing the Efficacy between Ondansetron and Metoclopramide for Prevention of Nausea and Vomiting in Patients Undergoing Fractionated Radiotherapy to the Abdominal Region (복부 방사선치료를 받는 환자에서 발생하는 오심 및 구토에 대한 온단세트론과 메토클로프라미드의 효과 : 제 3상 전향적 무작위 비교임상시험)

  • Park Hee Chul;Suh Chang Ok;Seong Jinsil;Cho Jae Ho;Lim John Jihoon;Park Won;Song Jae Seok;Kim Gwi Eon
    • Radiation Oncology Journal
    • /
    • v.19 no.2
    • /
    • pp.127-135
    • /
    • 2001
  • Purpose : This study is a prospective randomized clinical trial comparing the efficacy and complications of anti-emetic drugs for prevention of nausea and vomiting after radiotherapy, which has moderate emetogenic potential. The aim of this study was to investigate whether the anti-emetic efficacy of ondansetron $(Zofran^{\circledR})$ 8 mg bid (Group O) is better than that of metoclopramide 5 mg tid (Group M) in patients undergoing fractionated radiotherapy to the abdominal region. Materials and Methods : Study entry was restricted to patients who met the following eligibility criteria: histologically confirmed malignant disease; no distant metastasis; performance status of not more than ECOG grade 2; and no previous chemotherapy or radiotherapy. Between March 1997 and February 1998, 60 patients were enrolled in this study. All patients signed a written statement of informed consent prior to enrollment. Blinding was maintained by dosing an identical number of tablets, including one dose of matching placebo for Group O. The extent of nausea, appetite loss, and the number of emetic episodes were recorded every day using a diary card. The mean scores of nausea and appetite loss and the mean number of emetic episodes were obtained at weekly intervals. Results : A prescription error occurred in one patient, and diary cards were not returned by 3 patients owing to premature refusal of treatment. The card of one patient was excluded from the analysis because she had a history of treatment for neurosis. As a result, the analysis covered 55 patients. Patient and radiotherapy characteristics were similar between groups, except that the mean age was $52.9{\pm}11.2$ in group M and $46.5{\pm}9.5$ in group O; this difference in age was statistically significant. The mean weekly scores of nausea, appetite loss, and emetic episodes were higher in group M than in group O. In group M, the symptoms were most significant at the 5th week.
In a panel data analysis using a mixed procedure, treatment group was the only significant factor explaining the difference in weekly scores for all three symptoms. Ondansetron $(Zofran^{\circledR})$ 8 mg bid and metoclopramide 5 mg tid were well tolerated without significant side effects. There were no clinically important changes in vital signs or clinical laboratory parameters with either drug. Conclusion : Given that younger patients have higher emetogenic potential, the age difference between the two treatment groups may have lowered the statistical power of the analysis. There were significant differences favoring the ondansetron group with respect to the severity of nausea, vomiting, and loss of appetite. We conclude that ondansetron is a more effective anti-emetic agent for the control of radiotherapy-induced nausea, vomiting, and loss of appetite, without significant toxicity, compared with the commonly used drug metoclopramide. However, some patients suffered emesis despite the administration of ondansetron. Possible strategies to improve the prevention and treatment of radiotherapy-induced emesis must be studied further.


DEVELOPMENT OF STATEWIDE TRUCK TRAFFIC FORECASTING METHOD BY USING LIMITED O-D SURVEY DATA (한정된 O-D조사자료를 이용한 주 전체의 트럭교통예측방법 개발)

  • 박만배
    • Proceedings of the KOR-KST Conference
    • /
    • 1995.02a
    • /
    • pp.101-113
    • /
    • 1995
  • The objective of this research is to test the feasibility of developing a statewide truck traffic forecasting methodology for Wisconsin by using Origin-Destination (O-D) surveys, traffic counts, classification counts, and other data that are routinely collected by the Wisconsin Department of Transportation (WisDOT). Development of a feasible model will permit estimation of future truck traffic for every major link in the network, providing the basis for improved estimation of future pavement deterioration. Pavement damage rises exponentially as axle weight increases, and trucks are responsible for most of the traffic-induced damage to pavement. Consequently, forecasts of truck traffic are critical to pavement management systems. The Pavement Management Decision Supporting System (PMDSS) prepared by WisDOT in May 1990 combines pavement inventory and performance data with a knowledge base consisting of rules for evaluation, problem identification, and rehabilitation recommendation. Without a reasonable truck traffic forecasting methodology, PMDSS cannot project pavement performance trends in order to make assessments and recommendations for future years. However, none of WisDOT's existing forecasting methodologies has been designed specifically for predicting truck movements on a statewide highway network. For this research, the O-D survey data available from WisDOT, covering two stateline areas, one county, and five cities, are analyzed and zone-to-zone truck trip tables are developed. The resulting Origin-Destination Trip Length Frequency (OD TLF) distributions by trip type are applied to the Gravity Model (GM) for comparison with comparable TLFs from the GM. The gravity model is calibrated to obtain friction factor curves for the three trip types: Internal-Internal (I-I), Internal-External (I-E), and External-External (E-E). Both "macro-scale" and "micro-scale" calibration are performed.
The comparison of the statewide GM TLF with the OD TLF for the macro-scale calibration does not provide suitable results because the available OD survey data do not represent an unbiased sample of statewide truck trips. For the "micro-scale" calibration, "partial" GM trip tables that correspond to the OD survey trip tables are extracted from the full statewide GM trip table. These "partial" GM trip tables are then merged and a partial GM TLF is created. The GM friction factor curves are adjusted until the partial GM TLF matches the OD TLF. The three friction factor curves, one for each trip type, resulting from the micro-scale calibration produce a reasonable GM truck trip model. A key methodological issue for GM calibration involves the use of multiple friction factor curves versus a single friction factor curve for each trip type in order to estimate truck trips with reasonable accuracy. A single friction factor curve for each of the three trip types was found to reproduce the OD TLFs from the calibration database. Given the very limited trip generation data available for this research, additional refinement of the gravity model using multiple friction factor curves for each trip type was not warranted. In traditional urban transportation planning studies, the zonal trip productions and attractions and region-wide OD TLFs are available. For this research, however, the information available for development of the GM model is limited to Ground Counts (GC) and a limited set of OD TLFs. The GM is calibrated using the limited OD data, but these data are not adequate to obtain good estimates of truck trip productions and attractions. Consequently, zonal productions and attractions are estimated using zonal population as a first approximation. Then, Selected Link based (SELINK) analyses are used to adjust the productions and attractions and possibly recalibrate the GM.
The SELINK adjustment process involves identifying the origins and destinations of all truck trips that are assigned to a specified "selected link" as the result of a standard traffic assignment. A link adjustment factor is computed as the ratio of the actual volume for the link (ground count) to the total assigned volume. This link adjustment factor is then applied to all of the origin and destination zones of the trips using that "selected link". Selected link based analyses are conducted using both 16 selected links and 32 selected links. The SELINK analysis using 32 selected links provides the lowest %RMSE in the screenline volume analysis. In addition, the stability of the GM truck estimating model is preserved by using 32 selected links with three SELINK adjustments; that is, the GM remains calibrated despite substantial changes in the input productions and attractions. The coverage of zones provided by 32 selected links is satisfactory. Increasing the number of repetitions beyond four is not reasonable because the stability of the GM model in reproducing the OD TLF reaches its limits. The total volume of truck traffic captured by 32 selected links is 107% of total trip productions. More importantly, SELINK adjustment factors for all of the zones can be computed. Evaluation of the travel demand model resulting from the SELINK adjustments is conducted using screenline volume analysis, functional class and route specific volume analysis, area specific volume analysis, production and attraction analysis, and Vehicle Miles of Travel (VMT) analysis. Screenline volume analysis using four screenlines with 28 check points is used to evaluate the adequacy of the overall model. The total trucks crossing the screenlines are compared to the ground count totals. Link volume to ground count (LV/GC) ratios of 0.958 using 32 selected links and 1.001 using 16 selected links are obtained.
The %RMSE for the four screenlines is inversely proportional to the average ground count totals by screenline. The magnitude of the %RMSE for the four screenlines resulting from the fourth and last GM run using 32 and 16 selected links is 22% and 31% respectively. These results are similar to the overall %RMSE achieved for the 32 and 16 selected links themselves, 19% and 33% respectively. This implies that the SELINK analysis results are reasonable for all sections of the state. Functional class and route specific volume analysis is possible using the available 154 classification count check points. The truck traffic crossing the Interstate highways (ISH) with 37 check points, the US highways (USH) with 50 check points, and the State highways (STH) with 67 check points is compared to the actual ground count totals. The magnitude of the overall link volume to ground count ratio by route does not reveal any specific pattern of over- or underestimation. However, the %RMSE for the ISH shows the smallest value while that for the STH shows the largest value. This pattern is consistent with the screenline analysis and the overall relationship between %RMSE and ground count volume groups. Area specific volume analysis provides another broad statewide measure of the performance of the overall model. The truck traffic in the North area with 26 check points, the West area with 36 check points, the East area with 29 check points, and the South area with 64 check points is compared to the actual ground count totals. The four areas show similar results. No specific patterns in the LV/GC ratio by area are found. In addition, the %RMSE is computed for each of the four areas. The %RMSEs for the North, West, East, and South areas are 92%, 49%, 27%, and 35% respectively, whereas the average ground counts are 481, 1383, 1532, and 3154 respectively. As in the screenline and volume range analyses, the %RMSE is inversely related to average link volume.
The SELINK adjustments of productions and attractions resulted in a very substantial reduction in the total in-state zonal productions and attractions. The initial in-state zonal trip generation model can now be revised with a new trip production rate (total adjusted productions/total population) and a new trip attraction rate. Revised zonal production and attraction adjustment factors can then be developed that only reflect the impact of the SELINK adjustments that cause increases or decreases from the revised zonal estimates of productions and attractions. Analysis of the revised production adjustment factors is conducted by plotting the factors on the state map. The east area of the state, including the counties of Brown, Outagamie, Shawano, Winnebago, Fond du Lac, and Marathon, shows comparatively large values of the revised adjustment factors. Overall, both small and large values of the revised adjustment factors are scattered around Wisconsin. This suggests that more independent variables beyond just population are needed for the development of the heavy truck trip generation model. More independent variables, including zonal employment data (office employees and manufacturing employees) by industry type, zonal private trucks owned, and zonal income data, which are not available currently, should be considered. A plot of the frequency distribution of the in-state zones as a function of the revised production and attraction adjustment factors shows the overall adjustment resulting from the SELINK analysis process. Overall, the revised SELINK adjustments show that the productions for many zones are reduced by a factor of 0.5 to 0.8, while the productions for a relatively few zones are increased by factors from 1.1 to 4, with most of those factors in the 3.0 range. No obvious explanation for the frequency distribution could be found. The revised SELINK adjustments overall appear to be reasonable.
The heavy truck VMT analysis is conducted by comparing the 1990 heavy truck VMT forecasted by the GM truck forecasting model, 2.975 billion, with the WisDOT computed data. This estimate is 18.3% less than the WisDOT computation of 3.642 billion VMT. The WisDOT estimates are based on sampling the link volumes for USH, STH, and CTH, which implies potential error in sampling the average link volume. The WisDOT estimate of heavy truck VMT cannot be tabulated by the three trip types, I-I, I-E (E-I), and E-E. In contrast, the GM forecasting model shows that the proportion of E-E VMT out of total VMT is 21.24%. In addition, tabulation of heavy truck VMT by route functional class shows that the proportion of truck traffic traversing the freeways and expressways is 76.5%. Only 14.1% of total freeway truck traffic is I-I trips, while 80% of total collector truck traffic is I-I trips. This implies that freeways are traversed mainly by I-E and E-E truck traffic while collectors are used mainly by I-I truck traffic. Other tabulations, such as average heavy truck speed by trip type, average travel distance by trip type, and the VMT distribution by trip type, route functional class, and travel speed, are useful information for highway planners seeking to understand the characteristics of statewide heavy truck trip patterns. Heavy truck volumes for the target year 2010 are forecasted using the GM truck forecasting model under four scenarios. For better forecasting, ground-count-based segment adjustment factors are developed and applied. ISH 90 & 94 and USH 41 are used as example routes. The forecasting results using the ground-count-based segment adjustment factors are satisfactory for long-range planning purposes, but additional ground counts would be useful for USH 41. Sensitivity analysis provides estimates of the impacts of the alternative growth rates, including information about changes in the trip types using key routes.
The network-based GM can easily model scenarios with different rates of growth in rural versus urban areas, small versus large cities, and in-state zones versus external stations.
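The gravity model underlying the methodology distributes productions $P_i$ to attractions $A_j$ in proportion to friction factors $F_{ij}$, with balancing factors iterated until the row and column sums of the trip table match. A compact Python sketch with a toy two-zone example (our own code; the thesis calibrates the friction factor curves from the OD TLFs rather than assuming them):

```python
def gravity_model(productions, attractions, friction, max_iter=50):
    """Doubly-constrained gravity model: T_ij is proportional to
    P_i * A_j * F_ij, with row/column balancing factors iterated so
    that row sums match productions and column sums match attractions.
    friction[i][j] comes from the calibrated friction-factor curve
    for the relevant trip type (I-I, I-E, or E-E)."""
    n = len(productions)
    a = [1.0] * n  # row (production) balancing factors
    b = [1.0] * n  # column (attraction) balancing factors
    for _ in range(max_iter):
        for i in range(n):
            s = sum(b[j] * attractions[j] * friction[i][j] for j in range(n))
            a[i] = 1.0 / s
        for j in range(n):
            s = sum(a[i] * productions[i] * friction[i][j] for i in range(n))
            b[j] = 1.0 / s
    return [[a[i] * productions[i] * b[j] * attractions[j] * friction[i][j]
             for j in range(n)] for i in range(n)]
```

A SELINK-style link adjustment factor is then simply the ground count for a selected link divided by the volume assigned to it.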


A Study on the Estimation of Monthly Average River Basin Evaporation (월(月) 평균유역증발산량(平均流域蒸發散量) 추정(推定)에 관(關)한 연구(硏究))

  • Kim, Tai Cheol;Ahn, Byoung Gi
    • Korean Journal of Agricultural Science
    • /
    • v.8 no.2
    • /
    • pp.195-202
    • /
    • 1981
  • The return of water to the atmosphere from water, soil, and vegetation surfaces is one of the most important aspects of the hydrological cycle, and the seasonal trend of variation of river basin evaporation is also meaningful in long-term runoff analysis for irrigation and water resources planning. This paper provides some information for estimating monthly river basin evaporation from pan evaporation, potential evaporation, regional evaporation, and temperature through comparison with river basin evaporation derived from the water budget method. The analysis has been carried out with five years of observation data from the Yongdam station in the Geum river basin. The results are summarized as follows and can be applied to the estimation of river basin evaporation and long-term runoff at ungaged stations. 1. The ratio of pan evaporation to river basin evaporation ($E_w/E_{pan}$) shows the most significant relation from the viewpoint of the seasonal trend of variation. River basin evaporation can be estimated from pan evaporation through either Fig. 9 or Table-7. 2. Local coefficients of the cloudiness effect and wind function have been determined in order to apply Penman's mass and energy transfer equation to the estimation of river basin evaporation: $R_c=R_a(0.13+0.52n/D)$, $E=0.35(e_s-e)(1.8+1.0U)$. 3. The regional evaporation concept $E_R=(1-a)R_C-E_p$ appears to contain functional errors due to inapplicable assumptions. Nevertheless, it is desirable that this kind of function, which encapsulates the results of the complex physical, chemical, and biological processes of river basin evaporation, should be developed. 4. Monthly river basin evaporation can be approximately estimated from the monthly average temperature through either the equation $E_w=1.44{\times}1.08^T$ or Fig. 12 at stations with poor climatological observation data.
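The two fitted relations quoted above translate directly into code. A small Python sketch (function names are ours; units are assumed to follow the paper's figures, e.g., evaporation per month and vapor pressures in consistent units):

```python
def basin_evaporation_from_temp(t_mean):
    """Monthly river-basin evaporation from mean monthly temperature T
    via the paper's regression E_w = 1.44 * 1.08**T."""
    return 1.44 * 1.08 ** t_mean

def penman_mass_transfer(e_s, e, u):
    """Mass-transfer evaporation with the locally fitted wind function:
    E = 0.35 * (e_s - e) * (1.8 + 1.0 * U), where e_s - e is the
    saturation vapor-pressure deficit and U the wind speed."""
    return 0.35 * (e_s - e) * (1.8 + 1.0 * u)
```

For instance, at a mean monthly temperature of 0 degrees the regression gives exactly 1.44, and each additional degree multiplies the estimate by 1.08.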


Problems and Improvement Measures of Private Consulting Firms Working on Rural Area Development (농촌지역개발 민간컨설팅회사의 실태와 개선방안)

  • Kim, Jung Tae
    • Journal of Agricultural Extension & Community Development
    • /
    • v.21 no.2
    • /
    • pp.1-28
    • /
    • 2014
  • Private consulting firms currently participating in bottom-up rural area development projects are involved in nearly all areas of rural area development, and a policy environment that emphasizes the bottom-up approach will further expand their participation. Reviews of private consulting firms, which started out with high expectations, are now becoming rather negative. Expertise is the key issue in the controversy over private consulting firms, and existing analyses tend to locate the causes of the problems within the firms themselves. This study was conducted on the premise that the problems instead arise from how the policy was promoted, since the government authorities are responsible for managing and supervising the implementation of policies, not only for developing them. The current issues with consulting firms emerged because private consulting was hastily introduced through a government policy trend without sufficient consideration, and because the policy environment demanded short-term outcomes even though the purpose of bottom-up rural area development lies in the ideology of endogenous development focused on changes in residents' perceptions. Research was conducted to determine how the problems of private consulting firms that emerged in this context influenced the consulting market, using current data on the firms' business performance. For the type analysis, firms were divided into three groups: top performers including market leaders (9), excellent performers (36), and average performers (34). An analysis of the correlation between the business performance of each type and managerial resources such as each firm's expertise revealed a correlation between human resources and regional development only among excellent performers, and none among the other types.
These results imply that external factors other than a firm's capabilities (e.g., expertise) play a significant role in the selection of private consulting firms. Thus, government authorities should reflect on their error of hastily adopting private consulting firms without sufficient consideration and must urgently establish response measures.

ASSOCIATION STUDY OF ATTENTION-DEFICIT/HYPERACTIVITY DISORDER(ADHD) AND THE DOPAMINE TRANSPORTER(DAT1) GENE - CASE CONTROL DESIGN STUDY - (주의력결핍과잉행동 장애와 도파민 운반체 유전자간 연합연구 - 환자-대조군 디자인 연구 -)

  • Kim Boong-Nyun;Cho Soo-Churl
    • Journal of the Korean Academy of Child and Adolescent Psychiatry
    • /
    • v.16 no.2
    • /
    • pp.199-210
    • /
    • 2005
  • Objective : Attention deficit hyperactivity disorder (ADHD) affects $5-10\%$ of children in Korea, with more boys than girls being diagnosed. Despite the seriousness of ADHD, little is known about its causes. Current genetic epidemiologic studies indicate that ADHD is a heritable disorder. Until now, however, there have been very few genetic studies of ADHD in Korea. The aim of this study is to examine the association between the dopamine transporter gene type 1 (DAT1) and ADHD using a case-control design in Korean ADHD probands and normal controls. Materials and Method : The Child Psychiatric Genetic research team at Seoul National University Hospital, Clinical Research Institute recruited the ADHD probands using clinical interview/observation, diverse rating scales, and neuropsychological tests. To eliminate phenocopies of ADHD, the diagnosis of ADHD was based upon clinical data, psychometric data, and parent/teacher reports. A total of 85 ADHD probands were recruited as the final study subjects, and 100 independent normal adults participated in this study as the control group. For all ADHD probands and controls, the 3'-UTR-VNTR polymorphism of DAT1 was analyzed. Based on the DAT1 allele and genotype information, a chi-square test with a case-control design was performed. Results : A total of 85 probands and 100 controls were included in the genetic analysis. Four different alleles, 350 bp (7-repeat), 440 bp (9-repeat), 480 bp (10-repeat), and 520 bp (11-repeat), were found in the DAT1 gene of the study subjects. In the case-control analysis, ADHD probands and their parents had significantly more 9-repeat alleles and 9/10 genotypes. Also, the probands with the 9-repeat allele made more commission errors on the ADS. Conclusion : The positive association between ADHD and the DAT1 gene was replicated in this report, consistent with previous results for Caucasian children and Korean children with ADHD.
There are ongoing studies on other candidate genes such as DRD4 and DRD5, and it will be necessary to explore the association of these candidate genes in Korean children with ADHD. This ongoing genetic research will contribute to the understanding of the heterogeneous genetic and environmental etiologies of the ADHD phenotype, which will lead to the development of more comprehensive treatment and preventive interventions for ADHD.
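The case-control comparison of allele counts reduces to a Pearson chi-square test on a 2x2 table. A minimal Python sketch (our own helper; the counts used below are invented for illustration, since the abstract does not report the raw allele table):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    laid out as rows = cases/controls, columns = carriers/non-carriers
    of the allele of interest (e.g., the DAT1 9-repeat allele):
        cases:    a  b
        controls: c  d
    Uses the shortcut chi2 = n*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
```

With the hypothetical counts chi_square_2x2(30, 55, 15, 85), the statistic is about 10.28, exceeding the 3.84 critical value for one degree of freedom at p=0.05 — the shape of the positive association reported above.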


Analysis of Empirical Multiple Linear Regression Models for the Production of PM2.5 Concentrations (PM2.5농도 산출을 위한 경험적 다중선형 모델 분석)

  • Choo, Gyo-Hwang;Lee, Kyu-Tae;Jeong, Myeong-Jae
    • Journal of the Korean earth science society
    • /
    • v.38 no.4
    • /
    • pp.283-292
    • /
    • 2017
  • In this study, empirical models were established to estimate surface-level $PM_{2.5}$ concentrations over Seoul, Korea from 1 January 2012 to 31 December 2013. We used six different multiple linear regression models with aerosol optical thickness (AOT) and ${\AA}ngstr{\ddot{o}}m$ exponent (AE) data from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Terra and Aqua satellites, meteorological data, and planetary boundary layer depth (PBLD) data. The results showed that $M_6$ was the best empirical model; it used AOT, AE, relative humidity (RH), wind speed, wind direction, PBLD, and air temperature as input data. Statistical analysis showed that the $PM_{2.5}$ concentrations estimated with the $M_6$ model agreed with the observed $PM_{2.5}$ with a correlation of R=0.62 and a root mean square error of $RMSE=10.70{\mu}gm^{-3}$. In addition, our study shows that the relation strongly depends on the season because of the seasonal observation characteristics of AOT, with relatively better correlations in spring (R=0.66) and autumn (R=0.75) than in summer and winter (R of about 0.38 and 0.56, respectively). These results reflect cloud contamination in summer and the influence of snow/ice surfaces in winter, compared with the other seasons. Therefore, for the empirical multiple linear regression models used in this study, satellite-retrieved AOT was the dominant variable, and additional weather variables will be needed to improve the $PM_{2.5}$ results. The $PM_{2.5}$ concentrations calculated with the empirical multiple linear regression model will also be useful as a means of monitoring the atmospheric environment from satellite and ground meteorological data.
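A multiple linear regression like the $M_6$ model can be fitted by solving the normal equations. A self-contained Python sketch of generic ordinary least squares on toy data (our own code; in the paper the predictors would be AOT, AE, RH, wind, PBLD, and temperature, and the response the observed $PM_{2.5}$):

```python
def ols_fit(X, y):
    """Ordinary least squares via the normal equations (X'X) beta = X'y,
    solved by Gauss-Jordan elimination with partial pivoting.
    Each row of X should start with 1.0 for the intercept term;
    X is assumed to have full column rank."""
    k = len(X[0])
    # Build the augmented normal-equation system [X'X | X'y].
    M = [[sum(r[i] * r[j] for r in X) for j in range(k)] +
         [sum(r[i] * yi for r, yi in zip(X, y))] for i in range(k)]
    for col in range(k):
        # Partial pivoting: bring the largest remaining entry to the diagonal.
        piv = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(k):
            if r != col and M[col][col] != 0.0:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [M[i][k] / M[i][i] for i in range(k)]
```

On data generated exactly as $y = 2 + 3x_1 - x_2$, the fit recovers the coefficients $(2, 3, -1)$ up to floating-point error.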

An Analysis on Characteristics of Turbulence Energy Dissipation Rate from Comparison of Wind Profiler and Rawinsonde (연직바람관측장비와 레윈존데의 비교를 통한 난류 에너지 감소률의 특성 분석)

  • Kang, Woo Kyeong;Moon, Yun Seob;Jung, Ok Jin
    • Journal of the Korean Earth Science Society
    • /
    • v.37 no.7
    • /
    • pp.448-464
    • /
    • 2016
  • The purpose of this study is to optimize the parameters related to consensus coherency within the PCL 1300, the operating program of the wind profiler, through a validation of wind data between a rawinsonde and the wind profiler at the Chupungryeong site ($36^{\circ}13^{\prime}$, $127^{\circ}59^{\prime}$) in Korea, and then to analyze the diurnal and seasonal characteristics of the turbulence energy dissipation rate (${\varepsilon}$) on clear and rainy days from March 2009 to February 2010. In the comparison of wind data between the wind profiler and the rawinsonde during April 22-23, 2010, errors larger than $10\,m\,s^{-1}$ appeared above a height of 3,000 m in the zonal (u) and meridional (v) wind components. After removing differences of more than $10\,m\,s^{-1}$ in the u and v wind speeds between the two instruments, the correlation coefficients of these wind components were 0.92 and 0.88, respectively, and the root mean square errors were 3.07 and $1.06\,m\,s^{-1}$. Based on these results, the bias errors were small when the data processing time and the minimum available data within the PCL 1300 program were set to 30 minutes and 60%, respectively. In addition, a sensitivity analysis of the consensus coherency of the u and v components within the PCL 1300 program showed that the u components were underestimated in radial, instantaneous, and windbarbs coherency, whereas the v components were overestimated. Finally, with the optimized parameters of the PCL 1300 program, the diurnal and seasonal means of ${\varepsilon}$ at each height were higher on rainy days than on clear days because of the increase in vertical wind speed due to upward and downward motions. The mean ${\varepsilon}$ on clear and rainy days in winter was lower than in other seasons, owing to the stronger horizontal wind speed in winter. Consequently, when turbulence energy dissipation rates at vertical wind speeds of more than ${\pm}10\,cm\;s^{-1}$ were excluded for clear and rainy days, the mean ${\varepsilon}$ on rainy days was 6-7 times higher than on clear days; when they were included, it was 4-5 times higher.
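The quality-control step in this comparison (discard samples whose wind-component difference between the two instruments exceeds $10\,m\,s^{-1}$, then compute the correlation coefficient and RMSE on what remains) can be sketched as below. The data are synthetic; the study used actual Chupungryeong profiler and rawinsonde soundings.

```python
# Sketch of the wind-profiler vs. rawinsonde comparison: drop samples whose
# zonal-wind difference exceeds 10 m/s, then compute correlation and RMSE.
import numpy as np

rng = np.random.default_rng(1)
n = 300
u_sonde = rng.normal(5.0, 6.0, n)             # rawinsonde zonal wind (m/s)
u_prof = u_sonde + rng.normal(0.0, 3.0, n)    # profiler value with measurement noise
u_prof[::25] += 25.0                          # a few gross outliers, as found above 3,000 m

keep = np.abs(u_prof - u_sonde) <= 10.0       # the paper's 10 m/s rejection threshold
r = np.corrcoef(u_sonde[keep], u_prof[keep])[0, 1]
rmse = np.sqrt(np.mean((u_prof[keep] - u_sonde[keep]) ** 2))
print(f"kept {keep.sum()}/{n} samples, R = {r:.2f}, RMSE = {rmse:.2f} m/s")
```

The same filter would be applied independently to the meridional (v) component, yielding the paper's per-component correlation coefficients and RMSEs.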

Determination of Tumor Boundaries on CT Images Using Unsupervised Clustering Algorithm (비교사적 군집화 알고리즘을 이용한 전산화 단층영상의 병소부위 결정에 관한 연구)

  • Lee, Kyung-Hoo;Ji, Young-Hoon;Lee, Dong-Han;Yoo, Seoung-Yul;Cho, Chul-Koo;Kim, Mi-Sook;Yoo, Hyung-Jun;Kwon, Soo-Il;Chun, Jun-Chul
    • Journal of Radiation Protection and Research
    • /
    • v.26 no.2
    • /
    • pp.59-66
    • /
    • 2001
  • Determining the spatial location and shape of tumor boundaries is a key issue in fractionated stereotactic radiotherapy (FSRT). Consecutive transaxial plane images were acquired from a paraffin phantom and from 4 patients with brain tumors using helical computed tomography (HCT). The K-means classification algorithm was applied to convert the raw pixel values of the CT images into classified average pixel values. The classified images consist of 5 regions: tumor region (TR), normal region (NR), combination region (CR), uncommitted region (UR), and artifact region (AR). The major concern was how to separate the normal region from the tumor region within the combination region. Relative average deviation analysis was applied to reduce the average pixel values of the 5 regions to the 2 regions of normal and tumor by defining the maximum point among the average deviations of pixel values. We then drew the gross tumor volume (GTV) boundary by connecting the maximum points in the images with a semi-automatic contour method implemented in an IDL (Interactive Data Language) program. The error limit of the ROI boundary in the homogeneous phantom was estimated to be within ${\pm}1%$. In the 4 patient cases, we confirmed that the tumor lesions delineated by a physician and the lesions delineated automatically by the K-means classification algorithm and relative average deviation analysis were similar. These methods can turn an uncertain boundary between normal and tumor regions into a clear one. Therefore, the procedure will be useful in CT image-based treatment planning, especially when CT images intermittently fail to visualize the tumor volume in comparison with MRI images.
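The core unsupervised step in this abstract (K-means with K=5 on CT pixel intensities, replacing each pixel by its cluster's mean value) can be sketched as follows. The image is synthetic (five Gaussian intensity populations standing in for the TR/NR/CR/UR/AR regions), and the quantile-based initialization is an illustrative choice, not the paper's.

```python
# Sketch of K-means classification of CT pixel values into 5 regions,
# mapping each pixel to its cluster's average intensity.
import numpy as np

def kmeans_1d(values, k=5, iters=30):
    """Plain K-means on scalar pixel intensities, quantile-initialized."""
    centers = np.quantile(values, np.linspace(0.0, 1.0, k))
    for _ in range(iters):
        # Assign each pixel to the nearest cluster center
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        # Recompute each center as the mean of its assigned pixels
        centers = np.array([values[labels == j].mean() if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# Synthetic "CT slice": five intensity populations plus noise
rng = np.random.default_rng(42)
image = np.concatenate([rng.normal(mu, 5.0, 500) for mu in (0, 40, 80, 120, 160)])
labels, centers = kmeans_1d(image, k=5)

# Each raw pixel value replaced by its classified average pixel value
classified = centers[labels]
print(np.sort(centers).round(1))
```

The subsequent relative average deviation analysis in the paper would then collapse these 5 classified intensity levels into the 2 normal/tumor regions before the semi-automatic contouring.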
