• Title/Summary/Keyword: Function Optimization


Application of The Semi-Distributed Hydrological Model(TOPMODEL) for Prediction of Discharge at the Deciduous and Coniferous Forest Catchments in Gwangneung, Gyeonggi-do, Republic of Korea (경기도(京畿道) 광릉(光陵)의 활엽수림(闊葉樹林)과 침엽수림(針葉樹林) 유역(流域)의 유출량(流出量) 산정(算定)을 위한 준분포형(準分布型) 수문모형(水文模型)(TOPMODEL)의 적용(適用))

  • Kim, Kyongha;Jeong, Yongho;Park, Jaehyeon
    • Journal of Korean Society of Forest Science / v.90 no.2 / pp.197-209 / 2001
  • TOPMODEL, a semi-distributed hydrological model, is frequently applied to predict the amount of discharge, the main flow pathways, and water quality in a forested catchment, especially in a spatial dimension. TOPMODEL is a conceptual model rather than a physical one. Its main concepts are the topographic index and soil transmissivity, two components that can be used to predict the surface and subsurface contributing areas. This study was conducted to validate the applicability of TOPMODEL to small forested catchments in Korea. The experimental area is located in the Gwangneung forest operated by the Korea Forest Research Institute in Gyeonggi-do, near the Seoul metropolitan area. Two study catchments in this area have been monitored since 1979: one is a natural mature deciduous forest (22.0 ha) about 80 years old, and the other is a planted young coniferous forest (13.6 ha) about 22 years old. The data collected during two events in July 1995 and June 2000 at the mature deciduous forest and three events in July 1995 and 1999 and August 2000 at the young coniferous forest were used as the observed data sets. The topographic index was calculated using a $10m{\times}10m$ resolution raster digital elevation model (DEM). The distribution of the topographic index ranged from 2.6 to 11.1 at the deciduous catchment and from 2.7 to 16.0 at the coniferous catchment. Optimization using the forecasting efficiency as the objective function showed that the model parameter $m$ and the catchment mean of the log of surface saturated transmissivity, $lnT_0$, had high sensitivity. The optimized values of $m$ and $lnT_0$ were 0.034 and 0.038; 8.672 and 9.475 at the deciduous catchment, and 0.031, 0.032, and 0.033; 5.969, 7.129, and 7.575 at the coniferous catchment, respectively. The forecasting efficiencies of simulations using the optimized parameters were comparatively high: 0.958 and 0.909 at the deciduous catchment and 0.825, 0.922, and 0.961 at the coniferous catchment. The observed and simulated hyeto-hydrographs showed that the lag times to peak coincided well. Although the total runoff and peak flow of some events showed discrepancies between the observed and simulated output, TOPMODEL could predict the hydrologic output with an estimation error of less than 10% overall. Therefore, TOPMODEL is a useful tool for predicting runoff at ungauged forested catchments in Korea.

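The topographic index at the heart of TOPMODEL is $ln(a/tan\beta)$, where $a$ is the specific upslope contributing area and $tan\beta$ the local slope. The following minimal sketch (illustrative only, not the authors' code; the toy DEM, the 10 m cell size, and simple D8 steepest-descent routing are assumptions) shows one way to compute it from a raster DEM:

```python
# Sketch: TOPMODEL topographic index ln(a / tan(beta)) from a raster DEM
# using D8 single-flow-direction accumulation (no pit filling).
import numpy as np

def topographic_index(dem: np.ndarray, cell: float = 10.0) -> np.ndarray:
    """Return ln(a / tan(beta)) per cell for a small DEM."""
    rows, cols = dem.shape
    acc = np.full(dem.shape, cell * cell)  # contributing area, starts at own cell
    slope = np.zeros(dem.shape)
    # visit cells from highest to lowest so upslope area is final when passed on
    order = np.dstack(np.unravel_index(np.argsort(dem, axis=None)[::-1], dem.shape))[0]
    for r, c in order:
        best_drop, target = 0.0, None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dist = cell * (2 ** 0.5 if dr and dc else 1.0)
                    drop = (dem[r, c] - dem[rr, cc]) / dist
                    if drop > best_drop:
                        best_drop, target = drop, (rr, cc)
        slope[r, c] = max(best_drop, 1e-4)  # avoid tan(beta) = 0 on flats
        if target is not None:
            acc[target] += acc[r, c]        # pass area to the steepest neighbor
    # a = specific contributing area per unit contour length (divide by cell width)
    return np.log((acc / cell) / slope)

dem = np.array([[12.0, 11.0, 10.0],
                [11.0,  9.5,  9.0],
                [10.0,  9.0,  8.0]])
print(topographic_index(dem).round(2))
```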

A Study on the Optimization Methods of Security Risk Analysis and Management (경비위험 분석 및 관리의 최적화 방안에 관한 연구)

  • Lee, Doo-Suck
    • Korean Security Journal / no.10 / pp.189-213 / 2005
  • Risks arising from changes in society and the environment should be managed systematically, by evaluating them effectively and suggesting countermeasures. These days, enterprise risk management has become a new trend in the field. The first step in risk analysis is to recognize the risk factors, that is, to identify the vulnerabilities to loss in the security facilities. The second step is to assess the probability of loss for each risk factor, and the third step is to evaluate the criticality of the loss. The security manager then determines assessment grades and risk levels for each risk factor on the basis of the risk analysis, which comprises the assessments of vulnerability, probability of loss, and criticality. Expressing the result of risk analysis mathematically is of great importance for a scientific approach to risk management. Using the risk levels obtained from the risk analysis, the security manager can develop a comprehensive and supplementary security plan. In planning risk management measures to prepare against and minimize loss, insurance is one of the best loss-prevention programs; however, insurance in and of itself is no longer able to meet the security challenges faced by major corporations. The security manager has to consider cost-effectiveness and suggest productive risk management alternatives by using the security files that contain all information about security matters. He or she also has to reinforce the company regulations on security and safety and carry out repeated education on security and risk management. Risk management makes the most efficient before-the-loss arrangement for, and after-the-loss continuation of, a business. Above all, it is therefore very important to suggest the most cost-effective and realistic alternatives for optimizing risk management, and this function should be maintained and developed continuously and repeatedly.

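The three-step analysis described above (vulnerability, probability of loss, criticality) is often quantified as a simple product score. A minimal sketch, assuming hypothetical 1-5 grading scales and example risk factors that the paper does not specify:

```python
# Sketch: ranking risk factors by a probability x criticality level.
from dataclasses import dataclass

@dataclass
class RiskFactor:
    name: str
    probability: int  # 1 (rare) .. 5 (almost certain)
    criticality: int  # 1 (negligible) .. 5 (catastrophic)

    @property
    def level(self) -> int:
        """Risk level as the product of the two grades (1..25)."""
        return self.probability * self.criticality

def prioritize(factors):
    """Rank factors so the likeliest, costliest losses are handled first."""
    return sorted(factors, key=lambda f: f.level, reverse=True)

factors = [RiskFactor("unsecured loading dock", 4, 3),
           RiskFactor("server room fire", 2, 5),
           RiskFactor("petty theft", 5, 1)]
for f in prioritize(factors):
    print(f"{f.name}: level {f.level}")
```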

Opportunity Tree Framework Design For Optimization of Software Development Project Performance (소프트웨어 개발 프로젝트 성능의 최적화를 위한 Opportunity Tree 모델 설계)

  • Song Ki-Won;Lee Kyung-Whan
    • The KIPS Transactions: Part D / v.12D no.3 s.99 / pp.417-428 / 2005
  • Today, IT organizations perform projects with a vision related to marketing and financial profit. The objective of realizing the vision is to improve the organization's project-performing ability in terms of QCD (quality, cost, delivery). Organizations have made many efforts to achieve this objective through process improvement. Large companies such as IBM, Ford, and GE achieved over $80\%$ of their success through business process re-engineering based on information technology, rather than through simple computerization. It is important to collect, analyze, and manage data on performed projects to achieve the objective, but quantitative measurement is difficult because software is invisible and the effect and efficiency caused by a process change cannot be identified visibly; therefore, it is not easy to extract an improvement strategy. This paper measures and analyzes project performance, focusing on an organization's external effectiveness and internal efficiency (Quality, Delivery, Cycle time, and Waste). Based on the measured project performance scores, an OT (Opportunity Tree) model was designed for optimizing project performance. The design process is as follows. First, metadata are derived from projects and analyzed by a quantitative GQM (Goal-Question-Metric) questionnaire. Then, the project performance model is designed with the data obtained from the questionnaire, and the organization's performance score for each area is calculated. The value is revised by weighting the measured score for each area with vision weights from all stakeholders (CEO, middle managers, developers, investors, and customers). Through this, routes for improvement are presented and an optimized improvement method is suggested. Existing methods of software process improvement have been highly effective for individual processes, but somewhat unsatisfactory in their structural ability to develop and systematically manage strategies when applying the processes to projects. The proposed OT model provides a solution to this problem. The OT model is useful for providing an optimal improvement method in line with the organization's goals and, applied together with the proposed methods, can reduce risks that may occur in the course of process improvement. In addition, satisfaction with the improvement strategy can be increased by obtaining vision weights from all stakeholders through the qualitative questionnaire and reflecting them in the calculation. The OT model is also useful for optimizing the expansion of market and financial performance by controlling Quality, Delivery, Cycle time, and Waste.
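
The stakeholder-weighted revision of per-area scores can be illustrated with a small calculation. All area names, raw scores, and vision weights below are hypothetical, not the paper's data:

```python
# Sketch: revising per-area performance scores by averaged stakeholder
# vision weights; the lowest revised score marks the improvement route.
areas = ["quality", "delivery", "cycle_time", "waste"]
scores = {"quality": 72.0, "delivery": 65.0, "cycle_time": 58.0, "waste": 80.0}

# each stakeholder distributes a total weight of 1.0 across the four areas
vision_weights = {
    "ceo":       {"quality": 0.4, "delivery": 0.3, "cycle_time": 0.2,  "waste": 0.1},
    "manager":   {"quality": 0.3, "delivery": 0.3, "cycle_time": 0.3,  "waste": 0.1},
    "developer": {"quality": 0.3, "delivery": 0.2, "cycle_time": 0.3,  "waste": 0.2},
    "customer":  {"quality": 0.5, "delivery": 0.4, "cycle_time": 0.05, "waste": 0.05},
}

# average the stakeholders' weights, then revise each area's score
n = len(vision_weights)
avg_w = {a: sum(w[a] for w in vision_weights.values()) / n for a in areas}
revised = {a: scores[a] * avg_w[a] for a in areas}

for a in sorted(revised, key=revised.get):
    print(f"{a}: raw {scores[a]:.1f}, weighted {revised[a]:.2f}")
```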

Optimization of Image Tracking Algorithm Used in 4D Radiation Therapy (4차원 방사선 치료시 영상 추적기술의 최적화)

  • Park, Jong-In;Shin, Eun-Hyuk;Han, Young-Yih;Park, Hee-Chul;Lee, Jai-Ki;Choi, Doo-Ho
    • Progress in Medical Physics / v.23 no.1 / pp.8-14 / 2012
  • In order to develop a patient respiratory management system including a biofeedback function for 4-dimensional radiation therapy, this study investigated an optimal tracking algorithm for a moving target using an IR (infrared) camera as well as a commercial camera. A tracking system was developed with LabVIEW 2010. Motion phantom images were acquired using a camera (IR or commercial). After the acquired images were converted to binary images by applying a threshold value, several edge enhancement methods (Sobel, Prewitt, Differentiation, Sigma, Gradient, and Roberts) were applied. A target pattern was defined in the images, and images acquired from the moving target were tracked by matching the pre-defined tracking pattern; during matching, the coordinates of the tracking point were recorded. To assess the performance of each tracking algorithm, a score representing the accuracy of pattern matching was defined. To compare the algorithms objectively, the experiment was repeated 3 times for 5 minutes for each algorithm. The average value and standard deviation (SD) of the score were automatically calculated and saved in ASCII format. The score with thresholding only was 706 with an SD of 84. The averages and SDs for the algorithms combining an edge detection method with thresholding were 794 and 64 for Sobel, 770 and 101 for Differentiation, 754 and 85 for Gradient, 763 and 75 for Prewitt, 777 and 93 for Roberts, and 822 and 62 for Sigma, respectively. According to the score analysis, the most efficient tracking algorithm was the Sigma method. Therefore, 4-dimensional radiation therapy is expected to be more efficient if thresholding and the Sigma edge detection method are used together in target tracking.
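
A threshold-then-edge-then-match pipeline of this kind can be sketched with OpenCV in a few lines. This is not the authors' LabVIEW implementation; the file names, the threshold value, and the use of Sobel (rather than the Sigma filter the study found best) are assumptions for illustration:

```python
# Sketch: binarize, edge-enhance, then track a pre-defined pattern by
# normalized cross-correlation template matching.
import cv2

def enhance(img, thresh=127):
    """Binarize with a fixed threshold, then Sobel-edge-enhance."""
    _, binary = cv2.threshold(img, thresh, 255, cv2.THRESH_BINARY)
    gx = cv2.Sobel(binary, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(binary, cv2.CV_32F, 0, 1)
    return cv2.convertScaleAbs(cv2.magnitude(gx, gy))

frame = cv2.imread("phantom_frame.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("target_pattern.png", cv2.IMREAD_GRAYSCALE)

# enhance the frame and the pre-defined target pattern identically
edges = enhance(frame)
pattern = enhance(template)

# match the pattern and record the tracking-point coordinate and score
result = cv2.matchTemplate(edges, pattern, cv2.TM_CCOEFF_NORMED)
_, score, _, top_left = cv2.minMaxLoc(result)
print(f"match score = {score:.3f}, target at pixel {top_left}")
```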

Production of $[^{18}F]F_2$ Gas for Electrophilic Substitution Reaction (친전자성 치환반응을 위한 $[^{18}F]F_2$ Gas의 생산 연구)

  • Moon, Byung-Seok;Kim, Jae-Hong;Lee, Kyo-Chul;An, Gwang-Il;Cheon, Gi-Jeong;Chun, Kwon-Soo
    • Nuclear Medicine and Molecular Imaging / v.40 no.4 / pp.228-232 / 2006
  • Purpose: The electrophilic $^{18}F(T_{1/2}=110\;min)$ radionuclide in the form of $[^{18}F]F_2$ gas is of great significance for labeling radiopharmaceuticals for positron emission tomography (PET). However, its production in high yield and with high specific radioactivity still requires several targetry problems to be overcome. The aim of the present study was to develop a method suitable for the routine production of $[^{18}F]F_2$ for electrophilic substitution reactions. Materials and Methods: The target was designed as a water-cooled aluminum target chamber with a conical bore. Production of elemental fluorine was carried out via the $^{18}O(p,n)^{18}F$ reaction using a two-step irradiation protocol. In the first irradiation, the target, filled with highly enriched $^{18}O_2$, was irradiated with protons for $^{18}F$ production, and the produced $^{18}F$ was adsorbed on the inner surface of the target body. In the second irradiation, a mixed gas ($1\%\;[^{19}F]F_2/Ar$) was led into the target chamber, followed by a short proton irradiation for isotopic exchange between the carrier fluorine and the radiofluorine adsorbed in the target chamber. Production was optimized as a function of irradiation time, beam current, and $^{18}O_2$ loading pressure. Results: Production runs were performed under the following optimum conditions: the 1st irradiation for the nuclear reaction (15.0 bar of 97% enriched $^{18}O_2$, 13.2 MeV protons, 30 ${\mu}A$, 60-90 min irradiation), followed by recovery of the enriched oxygen via cryogenic pumping; the 2nd irradiation for the recovery of the adsorbed radiofluorine (12.0 bar of 1% $[^{19}F]fluorine/argon$ gas, 13.2 MeV protons, 30 ${\mu}A$, 20-30 min irradiation), followed by recovery of the $[^{18}F]fluorine$ for synthesis. The yield of $[^{18}F]fluorine$ at EOB (end of bombardment) was around $34{\pm}6.0$ GBq (n>10). Conclusion: The production of the electrophilic $^{18}F$ agent via the $^{18}O(p,n)^{18}F$ reaction was investigated in depth. In particular, the aluminum gas target proved very advantageous for the routine production of $[^{18}F]fluorine$. These results suggest the possibility of using $[^{18}F]F_2$ gas as an electrophilic substitution agent.
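
With the 110-minute half-life quoted above, the decay of an end-of-bombardment yield can be worked out directly. A minimal sketch, assuming the reported ~34 GBq EOB activity and hypothetical decay times:

```python
# Sketch: decay-correcting an end-of-bombardment (EOB) F-18 activity.
import math

T_HALF_MIN = 110.0                 # F-18 half-life (min), from the abstract
LAMBDA = math.log(2) / T_HALF_MIN  # decay constant, 1/min

def activity_after(a0_gbq: float, minutes: float) -> float:
    """Activity after `minutes` of decay: A = A0 * exp(-lambda * t)."""
    return a0_gbq * math.exp(-LAMBDA * minutes)

eob = 34.0                         # GBq at EOB, per the reported yield
for t in (0, 30, 60, 110):
    print(f"t = {t:>3} min: {activity_after(eob, t):5.1f} GBq")
```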

Development and Evaluation of Model-based Predictive Control Algorithm for Effluent $NH_4-N$ in $A^2/O$ Process ($A^2/O$ 공정의 유출수 $NH_4-N$에 대한 모델기반 예측 제어 알고리즘 개발 및 평가)

  • Woo, Dae-Joon;Kim, Hyo-Soo;Kim, Ye-Jin;Cha, Jae-Hwan;Choi, Soo-Jung;Kim, Min-Soo;Kim, Chang-Won
    • Journal of Korean Society of Environmental Engineers / v.33 no.1 / pp.25-31 / 2011
  • In this study, a model-based $NH_4-N$ predictive control algorithm using the influent pattern was developed and evaluated for effective control of an $A^2/O$ process. A pilot-scale $A^2/O$ process at the S wastewater treatment plant in B city was selected. The behavior of organic matter, nitrogen, and phosphorus in the biological reactors was described using the modified ASM3+Bio-P model, and a one-dimensional double-exponential function model was selected for modeling the secondary settlers. The effluent $NH_4-N$ concentration on the next day was predicted by model-based simulation using the influent pattern. After the simulation result was compared with the target effluent quality, the optimal operational condition able to meet the target was deduced through repeated simulation. The effluent $NH_4-N$ control schedule was then generated from the optimal operational condition and applied to the pilot-scale $A^2/O$ process on the next day. The DO concentration in the aerobic reactor was selected as the manipulated variable of the predictive control algorithm. Cases with and without control were compared to confirm the applicability of the control, and the $NH_4-N$ control schedule was applied in both summer and winter to check seasonal effects. Without control, the effluent $NH_4-N$ concentration exceeded the target effluent quality; with control, it remained within the target in both summer and winter. Compared with the case without the predictive control algorithm, applying the algorithm increased the air blower RPM by about 9.1% but decreased the effluent $NH_4-N$ concentration by about 45.2%. Therefore, it was concluded that the developed effluent $NH_4-N$ predictive control algorithm can be properly applied to a full-scale wastewater treatment process and is more efficient with respect to stable effluent quality.
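
The control loop described above (simulate tomorrow's effluent for candidate operating conditions, keep the cheapest condition that meets the target) can be reduced to a short sketch. The surrogate model, target limit, and influent pattern below are hypothetical placeholders for the calibrated ASM3+Bio-P simulation:

```python
# Sketch: repetitive simulation to build a next-day DO control schedule.
def predicted_nh4(do_setpoint: float, influent_nh4: float) -> float:
    """Toy stand-in for the calibrated ASM3+Bio-P simulation: assumes
    nitrification improves with DO and saturates (Monod-like response)."""
    removal = 0.999 * do_setpoint / (0.1 + do_setpoint)
    return influent_nh4 * (1.0 - removal)

TARGET_MG_L = 1.0                      # hypothetical effluent NH4-N limit
influent_pattern = [28.0, 35.0, 31.0]  # forecast influent NH4-N (mg/L) per shift

schedule = []
for nh4_in in influent_pattern:
    # lowest DO (cheapest aeration) that still meets the target
    candidates = [x * 0.25 for x in range(1, 17)]  # 0.25 .. 4.0 mg/L DO
    setpoint = next((do for do in candidates
                     if predicted_nh4(do, nh4_in) <= TARGET_MG_L),
                    candidates[-1])                # fall back to max aeration
    schedule.append(setpoint)

print("next-day DO schedule (mg/L):", schedule)
```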

Performance assessment of an urban stormwater infiltration trench considering facility maintenance (침투도랑 유지관리를 통한 도시 강우유출수 처리 성능 평가)

  • Reyes, N.J. D.G.;Geronimo, F.K.F.;Choi, H.S.;Kim, L.H.
    • Journal of Wetlands Research / v.20 no.4 / pp.424-431 / 2018
  • Stormwater runoff containing considerable amounts of pollutants such as particulates, organics, nutrients, and heavy metals contaminates natural bodies of water. Best management practices (BMPs) intended to reduce the volume of and treat the pollutants in stormwater runoff have been devised to serve as cost-effective measures of stormwater management; however, improper design and lack of proper maintenance can degrade a facility, making it unable to perform its intended function. This study evaluated an infiltration trench (IT) that went through a series of maintenance operations. A total of 41 rainfall events monitored from 2009 to 2016 were used to evaluate the pollutant removal capabilities of the IT. Assessment of the water quality and hydrological data revealed that inflow volume was the factor most related to the unit pollutant loads (UPL) entering the facility. Seasonal variations also affected the pollutant removal capabilities of the IT. During the summer season, increased rainfall depths and runoff volumes diminished the pollutant removal efficiency (RE) of the facility, since the larger volumes washed off larger pollutant loads and caused the IT to overflow. The system also exhibited reduced pollutant RE in the winter season due to frozen media layers and chemical mechanisms affected by low winter temperatures. Maintenance operations likewise had considerable effects on the performance of the IT. During the first two years of operation, the IT exhibited a decrease in pollutant RE due to aging and lack of proper maintenance. However, some events also showed reduced pollutant RE following maintenance, as a result of disturbed sediments that were not removed from the geotextile. Ultimately, the presented effects of maintenance operations on the pollutant RE of the system may lead to the optimization of maintenance schedules and procedures for BMPs of the same structure.
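
The event-based removal efficiency used in such evaluations is the fractional reduction of pollutant load between inflow and outflow. A minimal sketch with hypothetical event data:

```python
# Sketch: pollutant removal efficiency (RE) for one monitored event.
def event_load(volume_m3: float, conc_mg_l: float) -> float:
    """Pollutant load in grams: volume (m^3) x concentration (mg/L),
    since 1 m^3 at 1 mg/L carries exactly 1 g."""
    return volume_m3 * conc_mg_l

def removal_efficiency(v_in, c_in, v_out, c_out) -> float:
    """RE = (load_in - load_out) / load_in, as a percentage."""
    load_in = event_load(v_in, c_in)
    load_out = event_load(v_out, c_out)
    return 100.0 * (load_in - load_out) / load_in

# hypothetical TSS event: 52 m^3 in at 95 mg/L, 11 m^3 overflow at 60 mg/L
print(f"TSS removal: {removal_efficiency(52, 95, 11, 60):.1f}%")
```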

Assessment of water supply reliability in the Geum River Basin using univariate climate response functions: a case study for changing instreamflow managements (단변량 기후반응함수를 이용한 금강수계 이수안전도 평가: 하천유지유량 관리 변화를 고려한 사례연구)

  • Kim, Daeha;Choi, Si Jung;Jang, Su Hyung;Kang, Dae Hu
    • Journal of Korea Water Resources Association / v.56 no.12 / pp.993-1003 / 2023
  • Due to increasing greenhouse gas emissions, the global mean temperature has risen by 1.1℃ compared to pre-industrial levels, and significant changes are expected in the functioning of water supply systems. In this study, we assessed the impacts of climate change and instreamflow management on water supply reliability in the Geum River basin, Korea. We propose univariate climate response functions, in which mean precipitation and potential evaporation are coupled into a single explanatory variable, to assess the impacts of climate stress on multiple water supply reliabilities. To this end, natural streamflows were generated in the 19 sub-basins with the conceptual GR6J model; the simulated streamflows were then input into the Water Evaluation And Planning (WEAP) model. Dynamic optimization by WEAP allowed us to assess water supply reliability against the 2020 water demand projections. Results showed that, when minimizing the water shortage of the entire river basin under the 1991-2020 climate, water supply reliability was lowest in the Bocheongcheon among the sub-basins. In a scenario where the priority of instreamflow maintenance was raised to equal that of municipal and industrial water use, water supply reliability in the Bocheongcheon, Chogang, and Nonsancheon sub-basins decreased significantly. Stress tests with 325 sets of climate perturbations showed that water supply reliability in these three sub-basins decreased considerably under all climate stresses, while the sub-basins connected to large infrastructure did not change significantly. When the 2021-2050 climate projections were combined with the stress-test results, water supply reliability in the Geum River basin was expected to improve in general; if the priority of instreamflow maintenance is raised, however, water shortage is expected to worsen in geographically isolated sub-basins. We suggest that a climate response function built on a single explanatory variable can assess climate change impacts on many sub-basins' performance simultaneously.
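
A univariate climate response function of this kind can be illustrated by coupling precipitation and potential evaporation into one explanatory variable and regressing stress-test results on it. The perturbation grid, the coupling ratio, and the reliabilities below are synthetic, not the study's 325 perturbations:

```python
# Sketch: fitting a univariate climate response function to stress-test
# output, then evaluating it for a projected climate.
import numpy as np

# hypothetical stress-test grid: fractional changes in mean precipitation P
# and potential evaporation Ep, and the water-supply reliability (%) that
# each perturbed climate produced in the simulation model
p_change = np.array([0.8, 0.9, 1.0, 1.1, 1.2, 0.8, 1.0, 1.2])
ep_change = np.array([1.0, 1.0, 1.0, 1.0, 1.0, 1.1, 1.1, 1.1])
reliability = np.array([88.0, 92.5, 95.0, 96.8, 98.0, 85.5, 93.8, 97.2])

x = p_change / ep_change                   # the single coupled explanatory variable
coeffs = np.polyfit(x, reliability, deg=2) # quadratic response function
response = np.poly1d(coeffs)

# evaluate a projected climate: P +5%, Ep +8% relative to the baseline
print(f"predicted reliability: {response(1.05 / 1.08):.1f}%")
```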

Design and Implementation of MongoDB-based Unstructured Log Processing System over Cloud Computing Environment (클라우드 환경에서 MongoDB 기반의 비정형 로그 처리 시스템 설계 및 구현)

  • Kim, Myoungjin;Han, Seungho;Cui, Yun;Lee, Hanku
    • Journal of Internet Computing and Services / v.14 no.6 / pp.71-84 / 2013
  • Log data, which record the multitude of information created when operating computer systems, are utilized in many processes, from computer system inspection and process optimization to providing customized services to users. In this paper, we propose a MongoDB-based unstructured log processing system in a cloud environment for processing the massive amounts of log data of banks. Most of the log data generated during banking operations come from handling clients' business. Therefore, in order to gather, store, categorize, and analyze the log data generated while processing clients' business, a separate log data processing system needs to be established. However, in existing computing environments it is difficult to realize the flexible storage expansion needed to process massive amounts of unstructured log data and to execute the considerable number of functions needed to categorize and analyze them. Thus, in this study, we use cloud computing technology to realize a cloud-based log data processing system for unstructured log data that are difficult to process with the existing computing infrastructure's analysis tools and management systems. The proposed system uses an IaaS (Infrastructure as a Service) cloud environment to provide flexible expansion of computing resources, including storage space and memory, under conditions such as extended storage or a rapid increase in log data. Moreover, to overcome the processing limits of existing analysis tools when real-time analysis of the aggregated unstructured log data is required, the proposed system includes a Hadoop-based analysis module for quick and reliable parallel-distributed processing of massive amounts of log data. Furthermore, because the HDFS (Hadoop Distributed File System) stores data by generating copies of the block units of the aggregated log data, the proposed system offers automatic restore functions that allow it to continue operating after recovering from a malfunction. Finally, by establishing a distributed database using NoSQL-based MongoDB, the proposed system provides methods for effectively processing unstructured log data. Relational databases such as MySQL have complex schemas that are inappropriate for processing unstructured log data; moreover, their strict schemas cannot expand nodes when rapidly increasing data must be distributed across various nodes. NoSQL does not provide the complex computations that relational databases offer, but it can easily expand through node dispersion when the amount of data increases rapidly; it is a non-relational database with a structure appropriate for processing unstructured data. NoSQL data models are usually classified as Key-Value, column-oriented, or document-oriented. Of these, the representative document-oriented database, MongoDB, which has a free schema structure, is used in the proposed system: it makes it easy to process unstructured log data through its flexible schema, facilitates node expansion when data grow rapidly, and provides an Auto-Sharding function that automatically expands storage.
The proposed system is composed of a log collector module, a log graph generator module, a MongoDB module, a Hadoop-based analysis module, and a MySQL module. When the log data generated over the entire client business process of each bank are sent to the cloud server, the log collector module collects and classifies the data according to log type and distributes them to the MongoDB module and the MySQL module. The log graph generator module generates the results of the log analyses of the MongoDB module, the Hadoop-based analysis module, and the MySQL module per analysis time and type of aggregated log data, and provides them to the user through a web interface. Log data that require real-time analysis are stored in the MySQL module and provided in real time by the log graph generator module. The log data aggregated per unit time are stored in the MongoDB module and plotted in graphs according to the user's analysis conditions; they are also parallel-distributed and processed by the Hadoop-based analysis module. A comparative evaluation against a log data processing system that uses only MySQL, measuring log insertion and query performance, demonstrates the proposed system's superiority. Moreover, an optimal chunk size is confirmed through a MongoDB log-insert performance evaluation for various chunk sizes.
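
The flexible-schema storage that motivates the MongoDB module can be sketched with pymongo. The database and collection names and the sample documents are hypothetical:

```python
# Sketch: storing and aggregating unstructured log documents in MongoDB.
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
logs = client["bank_logs"]["events"]

# documents need no fixed schema; fields can differ per log type
logs.insert_many([
    {"type": "transfer", "branch": "A01", "amount": 250_000,
     "ts": datetime.now(timezone.utc)},
    {"type": "login", "channel": "mobile", "ok": True,
     "ts": datetime.now(timezone.utc)},
])

# aggregate event counts per log type, e.g. for a graph-generator module
pipeline = [{"$group": {"_id": "$type", "count": {"$sum": 1}}}]
for row in logs.aggregate(pipeline):
    print(row["_id"], row["count"])
```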

A Study on the Prediction Model of Stock Price Index Trend based on GA-MSVM that Simultaneously Optimizes Feature and Instance Selection (입력변수 및 학습사례 선정을 동시에 최적화하는 GA-MSVM 기반 주가지수 추세 예측 모형에 관한 연구)

  • Lee, Jong-sik;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems / v.23 no.4 / pp.147-168 / 2017
  • There have been many studies on accurate stock market forecasting in academia for a long time, and various forecasting models using various techniques now exist. Recently, many attempts have been made to predict the stock index using machine learning methods, including deep learning. Although both fundamental analysis and technical analysis are used in traditional stock investment, technical analysis is more useful for short-term prediction and for applying statistical and mathematical techniques. Most studies using these technical indicators have modeled stock price prediction as a binary classification (rising or falling) of future market movement, usually on the next trading day. However, such binary classification has many unfavorable aspects when it comes to predicting trends, identifying trading signals, or signaling portfolio rebalancing. In this study, we predict the stock index by expanding the existing binary scheme into a multi-class system of stock index trends (upward, boxed, and downward). Techniques such as multinomial logistic regression (MLOGIT), multiple discriminant analysis (MDA), or artificial neural networks (ANN) could address this multi-classification problem; instead, we propose an optimization model that uses a genetic algorithm as a wrapper around multi-classification support vector machines (MSVM), which have proved superior in prediction performance. In particular, the proposed model, named GA-MSVM, is designed to maximize performance by optimizing not only the kernel function parameters of the MSVM but also the selection of input variables (feature selection) and of training cases (instance selection). To verify the performance of the proposed model, we applied it to real data. The results show that the proposed method is more effective than the conventional MSVM, previously known to show the best prediction performance, as well as existing artificial intelligence and data mining techniques such as MDA, MLOGIT, and CBR. In particular, instance selection was confirmed to play a very important role in predicting the stock index trend, contributing more to the model's improvement than the other factors. To verify the usefulness of GA-MSVM, we applied it to forecasting the trend of Korea's real KOSPI200 stock index. Our research is primarily aimed at predicting trend segments in order to capture signal acquisition or short-term trend transition points. The experimental data set includes technical indicators, such as price and volatility indices, of the KOSPI200 stock index in Korea (2004-2017) and macroeconomic data (interest rate, exchange rate, S&P 500, etc.). Using a variety of statistical methods, including one-way ANOVA and stepwise MDA, 15 indicators were selected as candidate independent variables. The dependent variable, trend classification, was classified into three states: 1 (upward trend), 0 (boxed), and -1 (downward trend). For each class, 70% of the data was used for training and the remaining 30% for verification. To verify the performance of the proposed model, several comparative model experiments with MDA, MLOGIT, CBR, ANN, and MSVM were conducted.
The MSVM adopted the one-against-one (OAO) approach, which is known to be the most accurate among the various MSVM approaches. Although there are some limitations, the final experimental results demonstrate that the proposed model, GA-MSVM, performs at a significantly higher level than all comparative models.
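
The wrapper idea, a genetic algorithm evolving a 0/1 chromosome that jointly encodes feature and instance masks around a one-against-one SVM, can be sketched as follows. This is a simplified illustration on synthetic data, not the paper's implementation:

```python
# Sketch: GA wrapper jointly selecting features and training instances
# for a one-vs-one multi-class SVM (the core idea behind GA-MSVM).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=15, n_informative=6,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)
n_feat, n_inst = X_tr.shape[1], X_tr.shape[0]

def fitness(mask):
    """Validation accuracy of an SVM trained on the masked features/instances."""
    feat, inst = mask[:n_feat].astype(bool), mask[n_feat:].astype(bool)
    if feat.sum() == 0 or len(np.unique(y_tr[inst])) < 3:
        return 0.0  # degenerate chromosome
    model = SVC(kernel="rbf", decision_function_shape="ovo")  # OAO multi-class
    model.fit(X_tr[inst][:, feat], y_tr[inst])
    return model.score(X_va[:, feat], y_va)

pop = rng.integers(0, 2, size=(20, n_feat + n_inst))  # chromosomes: 0/1 genes
for gen in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]            # truncation selection
    cuts = rng.integers(1, n_feat + n_inst, size=10)
    kids = np.array([np.concatenate([parents[i % 10][:c],
                                     parents[(i + 1) % 10][c:]])
                     for i, c in enumerate(cuts)])     # one-point crossover
    flip = rng.random(kids.shape) < 0.02               # bit-flip mutation
    kids[flip] ^= 1
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("validation accuracy of best chromosome:", round(fitness(best), 3))
```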