• Title/Summary/Keyword: 최적 적용기간 (optimal application period)


Prediction of Spring Flowering Timing in Forested Area in 2023 (산림지역에서의 2023년 봄철 꽃나무 개화시기 예측)

  • Jihee Seo;Sukyung Kim;Hyun Seok Kim;Junghwa Chun;Myoungsoo Won;Keunchang Jang
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.25 no.4
    • /
    • pp.427-435
    • /
    • 2023
  • Changes in flowering time due to weather fluctuations impact plant growth and ecosystem dynamics, so accurate prediction of flowering timing is crucial for effective forest ecosystem management. This study uses a process-based model to predict flowering timing in 2023 for five major tree species in Korean forests. Models are developed based on nine years (2009-2017) of flowering data for Abeliophyllum distichum, Robinia pseudoacacia, Rhododendron schlippenbachii, Rhododendron yedoense f. poukhanense, and Sorbus commixta, distributed across 28 regions in the country, including mountains. Weather data from the Automatic Mountain Meteorology Observation System (AMOS) and the Korea Meteorological Administration (KMA) are used as model inputs. The Single Triangle Degree Days (STDD) and Growing Degree Days (GDD) models, known for their superior performance, are employed to predict flowering dates. Daily temperature readings at a 1 km spatial resolution are obtained by merging AMOS and KMA data. To improve prediction accuracy nationwide, random forest machine learning is used to generate region-specific correction coefficients. Applying these coefficients yields minimal prediction errors, particularly for Abeliophyllum distichum, Robinia pseudoacacia, and Rhododendron schlippenbachii, with root mean square errors (RMSEs) of 1.2, 0.6, and 1.2 days, respectively. Model performance is evaluated using ten random sampling tests per species, selecting the model with the highest R2. Except for Sorbus commixta, the models with correction coefficients applied achieve R2 values ranging from 0.07 to 0.7 and exhibit a final explanatory power of 0.75-0.9. This study provides valuable insights into seasonal changes in plant phenology, aiding in the identification of honey harvesting seasons, such as that of Robinia pseudoacacia, that are affected by abnormal weather conditions. Detailed information on flowering timing for various plant species and regions enhances understanding of the climate-plant phenology relationship.
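
As a rough illustration of the Growing Degree Days (GDD) idea behind such flowering-date models, the Python sketch below accumulates daily heat units until a species-specific threshold is reached. The base temperature, GDD threshold, and correction offset are hypothetical placeholders, not the values fitted in the study.

```python
# A minimal GDD-based flowering-date predictor (illustrative sketch).
# t_base, gdd_threshold, and correction_days are hypothetical placeholders,
# not the values fitted in the study.
from datetime import date, timedelta

def predict_flowering(daily_tmax, daily_tmin, start=date(2023, 1, 1),
                      t_base=5.0, gdd_threshold=180.0, correction_days=0.0):
    """Predict the flowering date by accumulating growing degree days.

    daily_tmax / daily_tmin: daily max/min temperatures (deg C) from `start`.
    t_base: base temperature below which no heat accumulates.
    gdd_threshold: species-specific heat sum required for flowering.
    correction_days: region-specific offset (e.g. from a random-forest correction).
    """
    gdd = 0.0
    for day, (tmax, tmin) in enumerate(zip(daily_tmax, daily_tmin)):
        gdd += max(0.0, (tmax + tmin) / 2.0 - t_base)  # daily heat accumulation
        if gdd >= gdd_threshold:
            return start + timedelta(days=day + correction_days)
    return None  # threshold not reached within the supplied record
```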

Evaluation of Site-specific Potential for Rice Production in Korea under the Changing Climate (지구온난화에 따른 우리나라 벼농사지대의 생산성 재평가)

  • Chung, U-Ran;Cho, Kyung-Sook;Lee, Byun-Woo
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.8 no.4
    • /
    • pp.229-241
    • /
    • 2006
  • Global air temperature has risen by 0.6°C over the last one hundred years due to increased atmospheric greenhouse gases, and this warming trend is projected to continue in the future. This study was carried out to evaluate spatial variations in rice production areas by simulating rice growth and development with projected high-resolution climate data for Korea for 2011-2100, geospatially interpolated from the 25 km gridded data based on the IPCC SRES A2 emission scenario. Satellite remote sensing data were used to pinpoint the rice-growing areas, and the corresponding climate data were aggregated to represent the official 'crop reporting county'. For the simulation experiment, we used a CERES-Rice model modified by introducing two equations: one to calculate the leaf appearance rate based on the effective temperature and the existing leaf number, and one to calculate the final number of leaves based on day length in the photoperiod-sensitive phase of rice. We tested the performance of this model using data sets obtained from experiments on transplanting dates and nitrogen fertilization rates over three years (2002 to 2004). The simulation results showed good performance in heading date prediction [R² = 0.9586 for early (Odaebyeo), R² = 0.9681 for medium (Hwasungbyeo), and R² = 0.9477 for late (Dongjinbyeo) maturity cultivars]. The modified CERES-Rice model was then used to simulate the growth and development of three Japonica varieties, representing early, medium, and late maturity classes, to project crop status for climatological normal years between 2011 and 2100. To compare temporal changes, three sets of data representing three climatological periods (2011-2040, 2041-2070, and 2071-2100) were successively used to run the model. Simulated growth and yield data of the three Japonica cultivars under the observed climate for 1971-2000 were set as the reference. Compared with the current normal, heading date was advanced by 7 days for 2011-2040 and 20 days for 2071-2100, and physiological maturity was advanced by 15 days for 2011-2040 and 30 days for 2071-2100. Rice yield was in general reduced by 6-25%, 3-26%, and 3-25% per 10a in the early, medium, and late maturity classes, respectively. However, mid to late maturing varieties showed an increased yield in northern Gyeonggi Province and in most of Gangwon Province in 2071-2100.
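
The sketch below illustrates, under assumed functional forms and coefficients, the kind of leaf-appearance and final-leaf-number bookkeeping the modified CERES-Rice model performs; the actual equations used in the study are not given in the abstract, so everything here is a placeholder.

```python
# Illustrative sketch of leaf-appearance-driven heading prediction.
# Functional forms and coefficients are hypothetical, not the study's equations.
def daily_leaf_appearance(effective_temp, leaf_number, a=0.012, b=0.0006):
    """Leaves appearing per day as a function of effective temperature and
    the number of leaves already present (rate slows as the plant ages)."""
    return max(0.0, effective_temp * (a - b * leaf_number))

def final_leaf_number(day_length_hours, base=14.0, sensitivity=0.8):
    """Final leaf number set by day length during the photoperiod-sensitive phase."""
    return base + sensitivity * max(0.0, day_length_hours - 12.0)

def days_to_heading(daily_effective_temp, day_length_hours):
    """Count days until the accumulated leaf number reaches the final leaf number."""
    target = final_leaf_number(day_length_hours)
    leaves = 0.0
    for day, t_eff in enumerate(daily_effective_temp, start=1):
        leaves += daily_leaf_appearance(t_eff, leaves)
        if leaves >= target:
            return day
    return None
```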

Demonstration and Operation of Pilot Plant for Short-circuit Nitrogen Process for Economic Treatment of High Concentration Nitrogen Wastewater (고농도 질소함유폐수의 경제적 처리를 위한 단축질소공정 파일럿플랜트 실증화 및 운영 결과)

  • Lee, Jae Myung;Jeon, Ji-hyeong;Choi, Hong-bok
    • Journal of the Korea Organic Resources Recycling Association
    • /
    • v.28 no.1
    • /
    • pp.53-64
    • /
    • 2020
  • A 2㎥/d combined wastewater treatment pilot plant containing a multi-stage, vertically stacked nitrification reactor was installed and operated for more than one year under short-circuit nitrogen process conditions (pH 8, DO 1 mg/L, and an internal return rate of 4Q from the nitrification to the denitrification reactor). For economical combined treatment of food wastewater and landfill leachate, the optimal combination ratio was sought by adjusting the food wastewater, with minimal oil content, to 5-25% of the total throughput. The three-phase centrifugal separator, introduced to effectively separate solids and oil from the food wastewater, removed about 52% of SS (from 116,000 mg/L to 55,700 mg/L) and about 48% of normal hexane (NH) extractables (from 53,200 mg/L to 27,800 mg/L). During the operational period, the average removal efficiencies of the combined wastewater treatment process were 99.3% for BOD, 94.2% for CODcr, 90% for CODmn, 70.1% for SS, 85.8% for T-N, and 99.2% for T-P. The average concentrations of BOD, CODcr, T-N, and T-P in the treated water all satisfied the discharge quality standard for landfill leachate ("Na" region), and SS satisfied the standard after the membrane process was applied. The on-site leachate gave the combined wastewater a relatively high nitrite nitrogen content, owing to intermittent aeration of the equalization tanks and differing monthly discharges. Where nitrite nitrogen accumulated, denitrification from nitrite nitrogen was observed rather than denitrification after complete nitrification. The average input of anti-foaming chemical during the operation period was about 2 L/d, which appears economical compared with the methanol input required to treat the same wastewater.
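
For reference, the removal efficiencies quoted above follow directly from the influent and effluent concentrations, as the small helper below shows; the numbers mirror the SS and normal hexane figures reported for the three-phase centrifugal separator.

```python
# Removal efficiency = (C_in - C_out) / C_in * 100, in percent.
def removal_efficiency(influent_mg_l, effluent_mg_l):
    return (influent_mg_l - effluent_mg_l) / influent_mg_l * 100.0

print(round(removal_efficiency(116_000, 55_700), 1))  # SS: ~52.0 %
print(round(removal_efficiency(53_200, 27_800), 1))   # normal hexane: ~47.7 %
```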

Preliminary Study on the Development of a Platform for the Optimization of Beach Stabilization Measures Against Beach Erosion III - Centering on the Effects of Random Waves Occurring During the Unit Observation Period, and Infra-Gravity Waves of Bound Mode, and Boundary Layer Streaming on the Sediment Transport (해역별 최적 해빈 안정화 공법 선정 Platform 개발을 위한 기초연구 III - 단위 관측 기간에 발생하는 불규칙 파랑과 구속모드의 외중력파, 경계층 Streaming이 횡단표사에 미치는 영향을 중심으로)

  • Chang, Pyong Sang;Cho, Yong Jun
    • Journal of Korean Society of Coastal and Ocean Engineers
    • /
    • v.31 no.6
    • /
    • pp.434-449
    • /
    • 2019
  • In this study, we develop a new cross-shore sediment transport module that accounts for the effects of infra-gravity waves of bound mode and boundary layer streaming on sediment transport, in addition to the well-known effects of wave asymmetry and undertow. In doing so, the effect on sediment transport of the individual random waves occurring during a unit observation period of 1 hr is also fully taken into account. To demonstrate how individual random waves affect sediment transport, we numerically simulate the nonlinear shoaling process of random waves over a beach of uniform slope. Numerical results show that the consistent frequency Boussinesq equation, the application of which has lately been extended to the surf zone, can reproduce accurately enough the saw-tooth profile observed without exception over the surf zone, the infra-gravity waves of bound mode, and the boundary-layer streaming. It is also shown that when the yearly highest random waves are modeled by equivalent nonlinear uniform waves, the maximum cross-shore transport rate exceeds, by as much as a factor of three, the rate obtained when the randomness is fully taken into account. In addition, to optimize the free parameter K in the longshore sediment module, we numerically trace the yearly shoreline change of Mang-Bang beach from 2017.4.26 to 2018.4.20 and optimize K by comparing the traced shoreline change with the measured one. Numerical results show that the optimized K for Mang-Bang beach is 0.17. With K = 0.17, via the yearly grand circulation process comprising severe erosion by the consecutively occurring yearly highest waves at the end of October and gradual recovery by swell over the winter and spring, the 18 m advance of the shoreline at the northern and southern ends of Mang-Bang beach and the 2.4 m retreat of the shoreline at its middle are successfully duplicated in the numerical simulation.
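
As a hedged sketch of how a free parameter K enters a longshore transport module, the snippet below uses a CERC-type formula with the optimized K = 0.17; the exact formulation used in the study is not stated in the abstract, so the formula and the example wave conditions are assumptions.

```python
# CERC-type longshore sediment transport estimate (illustrative sketch).
import math

def longshore_transport_rate(hb, alpha_b_deg, k=0.17, gamma_b=0.78,
                             rho=1025.0, rho_s=2650.0, porosity=0.4, g=9.81):
    """Volumetric longshore transport rate (m^3/s) from breaking wave height hb (m)
    and breaker angle alpha_b (deg), using a CERC-type formula with coefficient k."""
    alpha_b = math.radians(alpha_b_deg)
    coeff = k * rho * math.sqrt(g / gamma_b) / (16.0 * (rho_s - rho) * (1.0 - porosity))
    return coeff * hb ** 2.5 * math.sin(2.0 * alpha_b)

# Example (assumed conditions): 1.5 m breakers approaching at 10 degrees
print(longshore_transport_rate(1.5, 10.0))
```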

Perineal Skin Toxicity according to Irradiation Technique in Radiotherapy of Anal Cancer (항문암의 방사선치료 시 방사선 조사 기법에 따른 회음부 피부 독성)

  • You, Sei-Hwan;Seong, Jin-Sil;Koom, Woong-Sub
    • Radiation Oncology Journal
    • /
    • v.26 no.4
    • /
    • pp.222-228
    • /
    • 2008
  • Purpose: Various treatment techniques have been attempted for the radiotherapy of anal cancer because of acute side effects such as perineal skin reactions. This study was performed to investigate an optimal radiotherapy technique for anal cancer. Materials and Methods: The study subjects included 35 patients who underwent definitive concurrent chemoradiotherapy for anal cancer at Yonsei Cancer Center between 1990 and 2007. The patients' clinical data, including irradiation technique, were reviewed retrospectively. The primary lesion, regional lymph nodes, and both inguinal lymph nodes were irradiated with 41.4~45 Gy on a conventional schedule, followed by a boost dose to the primary lesion or metastatic lymph nodes. The radiotherapy technique was classified into four categories according to the irradiation field and the number of portals, and acute skin reactions and the associated treatment interruption periods were investigated for each of the four techniques. Results: Twenty-eight patients (80.0%) had grade 2 radiation dermatitis or greater, whereas 10 patients (28.6%) had grade 3 radiation dermatitis or greater during radiotherapy. Radiation dermatitis and the treatment interruption period were relatively lower in patients treated with the posterior-right-left three x-ray field technique with inguinal electron boost and in patients treated with the electron thunderbird technique; the interruption periods were 8.2±10.2 and 5.7±7.7 days for these two technique groups, respectively. Twenty-seven patients (77.1%) achieved complete remission at 1 month after radiotherapy, and the overall 5-year survival rate was 67.7%. Conclusion: Field size and beam arrangement can affect patient compliance in anal cancer radiotherapy, and a small x-ray field for the perineum appears helpful in decreasing severe radiation dermatitis.

A Case Study: Improvement of Wind Risk Prediction by Reclassifying the Detection Results (풍해 예측 결과 재분류를 통한 위험 감지확률의 개선 연구)

  • Kim, Soo-ock;Hwang, Kyu-Hong
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.23 no.3
    • /
    • pp.149-155
    • /
    • 2021
  • Early warning systems for weather risk management in the agricultural sector have been developed to predict potential wind damage to crops. These systems use the daily maximum wind speed to determine whether the critical wind speed that causes fruit drop is exceeded and provide the resulting weather risk information to farmers. In an effort to increase the accuracy of wind risk predictions, an artificial neural network for binary classification was implemented. In the present study, the daily wind speed and other weather data measured in 2019 at weather stations at sites of interest in Jeollabuk-do and Jeollanam-do, as well as Gyeongsangbuk-do and part of Gyeongsangnam-do, were used to train the neural network. These weather stations include 210 synoptic and automated weather stations operated by the Korea Meteorological Administration (KMA). The wind speed data collected at the same locations between January 1 and December 12, 2020 were used to validate the neural network model, and the data collected from December 13, 2020 to February 18, 2021 were used to evaluate the wind risk prediction performance before and after the use of the artificial neural network. The critical wind speed for damage risk was set at 11 m/s, the wind speed reported to cause fruit drop and damage. The maximum wind speeds were expressed using the Weibull distribution probability density function for the wind damage warning. The accuracy of wind damage risk prediction improved from 65.36% to 93.62% after re-classification using the artificial neural network, although the error rate also increased, from 13.46% to 37.64%. The machine learning approach used in the present study is likely to benefit cases in which a failure of the risk warning system to issue a prediction is a relatively serious issue.
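
A minimal sketch of the Weibull-based exceedance probability underlying such a warning, assuming illustrative shape and scale parameters rather than the fitted values from the study:

```python
# P(V > v_crit) for a Weibull-distributed daily maximum wind speed.
# shape_k and scale_c below are illustrative, not the study's fitted parameters.
import math

def weibull_exceedance(v_crit, shape_k, scale_c):
    """Probability that the daily maximum wind speed exceeds v_crit (m/s)."""
    return math.exp(-((v_crit / scale_c) ** shape_k))

risk = weibull_exceedance(11.0, shape_k=2.0, scale_c=6.5)
print(f"Probability of exceeding 11 m/s: {risk:.3f}")
```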

Studies on Epicotyl Grafting of Hardwood Scion of Walnut (호도(胡桃)나무 유태접목(幼台接木)에 관(関)한 연구(硏究))

  • Kim, Su In
    • Journal of Korean Society of Forest Science
    • /
    • v.55 no.1
    • /
    • pp.68-75
    • /
    • 1982
  • This study was carried out to improve the percent survival of walnut seedling grafting. Hardwood scions of walnut were grafted onto nurse seed-stocks of Juglans mandshurica Max. in an electric heating bed and then planted in the field. The results obtained were as follows. The optimum time for scion collection was from January to February. The best medium for the seed bed was sandy soil. The best grafting time was from early March to the 20th of March. The grafting success rate on the electric heating bed was 90%, and when the grafted seedlings in the heating bed were transplanted to the field, 90 percent of the seedlings survived until autumn. Crown gall, which occurs frequently in chestnut nurse grafts, did not appear in the Juglans mandshurica Max. grafted seedlings after outplanting. The grafted seedlings showed no physiological defects and developed normally for three years after grafting.


Characteristics of Residual Free Chlorine Decay in Reclaimed Water (하수재이용수의 유리잔류염소 수체감소 특성 연구)

  • Kang, Sungwon;Lee, Jaiyoung;Lee, Hyundong;Park, Jaehyun;Kwak, Pilljae;Oh, Hyunje
    • Journal of Korean Society of Environmental Engineers
    • /
    • v.35 no.4
    • /
    • pp.276-282
    • /
    • 2013
  • Reclaimed water has been highlighted as a representative alternative for addressing scarce water resources. This study examined the decay of residual free chlorine in reclaimed water by temperature (5, 15, 25°C) and initial injection concentration (1, 2, 4, 6 mg/L), and proposed a method for calculating the optimal chlorine dosage. Because the reclaimed water reacted with chlorine very quickly in the initial period compared with drinking water, the conventional general first-order decay model (Ct = C0·exp(-kb·t)) was not suitable. Accordingly, the decay of residual free chlorine could be estimated more accurately by applying the exponential first-order decay model Ct = a + b·exp(-kb·t) (r² = 0.872~0.988). The calculated bulk decay constant was highest, 653 day⁻¹, at an initial injection of 1 mg/L and 25°C, and lowest, 3.42 day⁻¹, at an initial injection of 6 mg/L and 5°C. The bulk decay constant tends to increase as temperature increases and to decrease as the initial injection concentration increases. A more accurate calculation of the optimal chlorine dosage could be obtained by splitting the entire reaction time into 0~30 min and 30~5,040 min and using the experimental results for 30~5,040 min. In addition, from the calculation of the optimal chlorine dosage by temperature, the relationship between initial chlorine demand (y) and temperature (x), y = 1.409 + 0.450x, was obtained for maintaining 0.2 mg/L of residual free chlorine 4 hours after chlorine injection.
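
The fitting step can be illustrated with SciPy's curve_fit applied to the exponential first-order decay model Ct = a + b·exp(-kb·t); the sample data below are synthetic and only stand in for the measured residual-chlorine series.

```python
# Fit Ct = a + b*exp(-k_b*t) to a (synthetic) residual free chlorine series.
import numpy as np
from scipy.optimize import curve_fit

def decay_model(t, a, b, k_b):
    return a + b * np.exp(-k_b * t)

t_min = np.array([0, 30, 120, 360, 1440, 5040], dtype=float)   # minutes
c_obs = np.array([2.00, 1.74, 1.17, 0.53, 0.31, 0.30])         # mg/L (illustrative)

# Convert time to days so that k_b comes out in day^-1, as reported above.
params, _ = curve_fit(decay_model, t_min / 1440.0, c_obs, p0=(0.3, 1.7, 5.0))
a, b, k_b = params
print(f"a = {a:.2f} mg/L, b = {b:.2f} mg/L, k_b = {k_b:.2f} day^-1")
```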

Robo-Advisor Algorithm with Intelligent View Model (지능형 전망모형을 결합한 로보어드바이저 알고리즘)

  • Kim, Sunwoong
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.2
    • /
    • pp.39-55
    • /
    • 2019
  • Recently, banks and large financial institutions have introduced many Robo-Advisor products. A Robo-Advisor is an automated service that produces an optimal asset allocation portfolio for investors using financial engineering algorithms, without any human intervention. Since its first introduction on Wall Street in 2008, the market size has grown to 60 billion dollars and is expected to expand to 2,000 billion dollars by 2020. Since Robo-Advisor algorithms provide asset allocation outputs to investors, mathematical or statistical asset allocation strategies are applied. The mean-variance optimization model developed by Markowitz is the typical asset allocation model. The model is a simple but quite intuitive portfolio strategy: assets are allocated so as to minimize portfolio risk while maximizing expected portfolio return using optimization techniques. Despite its theoretical background, both academics and practitioners find that the standard mean-variance optimization portfolio is very sensitive to the expected returns calculated from past price data, and corner solutions allocated to only a few assets are often found. The Black-Litterman optimization model overcomes these problems by choosing a neutral Capital Asset Pricing Model equilibrium point. Implied equilibrium returns of each asset are derived from the equilibrium market portfolio through reverse optimization. The Black-Litterman model uses a Bayesian approach to combine subjective views on the price forecasts of one or more assets with the implied equilibrium returns, resulting in new estimates of risk and expected returns. These new estimates can produce an optimal portfolio through the well-known Markowitz mean-variance optimization algorithm. If the investor does not have any views on the asset classes, the Black-Litterman optimization model produces the same portfolio as the market portfolio. What if the subjective views are incorrect? Surveys of the performance of stocks recommended by securities analysts show very poor results, so incorrect views combined with the implied equilibrium returns may produce very poor portfolio outputs for Black-Litterman model users. This paper suggests an objective investor view model based on Support Vector Machines (SVM), which have shown good performance in stock price forecasting. An SVM is a discriminative classifier defined by a separating hyperplane; linear, radial basis, and polynomial kernel functions are used to learn the hyperplanes. Input variables for the SVM are the returns, standard deviations, Stochastics %K, and price parity degree for each asset class. The SVM outputs expected stock price movements and their probabilities, which are used as input variables in the intelligent view model. The stock price movements are categorized into three phases: down, neutral, and up. The expected stock returns make up the P matrix, and their probability results are used in the Q matrix. The implied equilibrium return vector is combined with the intelligent view matrix, resulting in the Black-Litterman optimal portfolio. For comparison, the Markowitz mean-variance optimization model and the risk parity model are used, and the value-weighted market portfolio and the equal-weighted market portfolio are used as benchmark indexes. We collect the 8 KOSPI 200 sector indexes from January 2008 to December 2018, comprising 132 monthly index values. The training period is from 2008 to 2015 and the testing period is from 2016 to 2018. Our suggested intelligent view model combined with the implied equilibrium returns produced the optimal Black-Litterman portfolio. The out-of-sample portfolio showed better performance than the well-known Markowitz mean-variance optimization portfolio, the risk parity portfolio, and the market portfolios. The total return of the 3-year Black-Litterman portfolio is 6.4%, the highest value; its maximum drawdown is -20.8%, also the lowest value; and its Sharpe ratio, which measures the return-to-risk ratio, is 0.17, again the highest value. Overall, our suggested view model shows the possibility of replacing subjective analysts' views with an objective view model when practitioners apply Robo-Advisor asset allocation algorithms in real trading.
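
A minimal sketch of the Black-Litterman posterior return calculation into which such SVM-derived views (P, Q) would be fed; the two-asset numbers are illustrative, not the study's KOSPI 200 sector data.

```python
# Black-Litterman posterior expected returns (illustrative two-asset example).
import numpy as np

def black_litterman(pi, sigma, P, Q, omega, tau=0.05):
    """Combine implied equilibrium returns `pi` with views `Q` on view
    portfolios `P`, given view uncertainty `omega` and scaling `tau`."""
    ts = tau * sigma
    middle = np.linalg.inv(np.linalg.inv(ts) + P.T @ np.linalg.inv(omega) @ P)
    return middle @ (np.linalg.inv(ts) @ pi + P.T @ np.linalg.inv(omega) @ Q)

pi = np.array([0.04, 0.03])                      # implied equilibrium returns
sigma = np.array([[0.04, 0.01], [0.01, 0.02]])   # return covariance matrix
P = np.array([[1.0, -1.0]])                      # one view: asset 0 outperforms asset 1
Q = np.array([0.02])                             # view return from the view model
omega = np.array([[0.0005]])                     # view uncertainty
print(black_litterman(pi, sigma, P, Q, omega))   # posterior expected returns
```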

Corporate Default Prediction Model Using Deep Learning Time Series Algorithm, RNN and LSTM (딥러닝 시계열 알고리즘 적용한 기업부도예측모형 유용성 검증)

  • Cha, Sungjae;Kang, Jungseok
    • Journal of Intelligence and Information Systems
    • /
    • v.24 no.4
    • /
    • pp.1-32
    • /
    • 2018
  • Corporate defaults have a ripple effect on the local and national economy, in addition to affecting stakeholders such as managers, employees, creditors, and investors of the bankrupt companies. Before the Asian financial crisis, the Korean government analyzed only SMEs and tried to improve the forecasting power of a single default prediction model rather than developing various corporate default models; as a result, even large corporations, the so-called 'chaebol enterprises', went bankrupt. Even after that, the analysis of past corporate defaults focused on specific variables, and when the government restructured companies immediately after the global financial crisis, it focused only on certain main variables such as the debt ratio. A multifaceted study of corporate default prediction models is essential to protect diverse interests and to avoid situations like the Lehman Brothers case of the global financial crisis, in which everything collapses in a single moment. The key variables used in corporate default prediction vary over time: comparison of the analyses of Beaver (1967, 1968), Altman (1968), and Deakin (1972) shows that the major factors affecting corporate failure have changed, and Grice (2001) likewise found changes in the importance of the predictive variables in the Zmijewski (1984) and Ohlson (1980) models. However, past studies use static models, and most of them do not consider the changes that occur over time. Therefore, in order to construct consistent prediction models, it is necessary to compensate for the time-dependent bias by means of a time series analysis algorithm reflecting dynamic change. Based on the global financial crisis, which had a significant impact on Korea, this study uses 10 years of annual corporate data from 2000 to 2009. The data are divided into training, validation, and test sets spanning 7, 2, and 1 years, respectively. In order to construct a consistent bankruptcy model across the flow of time, we first train a deep learning time series model using the data before the financial crisis (2000~2006). Parameter tuning of the existing models and of the deep learning time series algorithm is conducted with validation data including the financial crisis period (2007~2008). As a result, we construct a model that shows a pattern similar to the training-data results and excellent prediction power. After that, each bankruptcy prediction model is retrained on the combined training and validation data (2000~2008), applying the optimal parameters found in the validation step. Finally, each corporate default prediction model is evaluated and compared using the test data (2009), based on the models trained over the nine years, and the usefulness of the corporate default prediction model based on the deep learning time series algorithm is demonstrated. In addition, by adding Lasso regression to the existing variable-selection methods (multiple discriminant analysis and the logit model), it is shown that the deep learning time series model based on the three bundles of variables is useful for robust corporate default prediction. The definition of bankruptcy used is the same as that of Lee (2015). The independent variables include financial information such as the financial ratios used in previous studies. Multivariate discriminant analysis, the logit model, and the Lasso regression model are used to select the optimal variable groups.
The multivariate discriminant analysis model proposed by Altman (1968), the logit model proposed by Ohlson (1980), non-time-series machine learning algorithms, and deep learning time series algorithms are then compared. Corporate data pose the limitations of nonlinear variables, multi-collinearity among variables, and lack of data. The logit model handles nonlinearity, the Lasso regression model solves the multi-collinearity problem, and the deep learning time series algorithm, using a variable data generation method, compensates for the lack of data. Big data technology, a leading technology of the future, is moving from simple human analysis to automated AI analysis and finally toward intertwined AI applications. Although the study of corporate default prediction models using time series algorithms is still in its early stages, the deep learning algorithm is much faster than regression analysis for corporate default prediction modeling and is more effective in prediction power. Through the Fourth Industrial Revolution, the current government and governments overseas are working hard to integrate such systems into the everyday life of their nations and societies, yet research on deep learning time series methods for the financial industry is still insufficient. This is an initial study on deep learning time series analysis of corporate defaults, and it is hoped that it will serve as comparative material for non-specialists who begin studies combining financial data and deep learning time series algorithms.
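
As an illustration of the kind of deep learning time series classifier compared in the study, the sketch below trains a small LSTM on placeholder firm-year financial-ratio sequences; the architecture, variable set, and data shapes are assumptions, not the study's specification.

```python
# Minimal LSTM default classifier on placeholder financial-ratio sequences.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_firms, n_years, n_ratios = 500, 7, 20          # firms x annual observations x ratios
X = np.random.rand(n_firms, n_years, n_ratios)   # placeholder ratio sequences
y = np.random.randint(0, 2, size=(n_firms,))     # placeholder labels (0 = solvent, 1 = default)

model = keras.Sequential([
    layers.Input(shape=(n_years, n_ratios)),
    layers.LSTM(32),                         # summarise the ratio sequence over time
    layers.Dense(1, activation="sigmoid"),   # probability of default
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["AUC"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
```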