• Title/Summary/Keyword: Statistical optimization


Production of Rapamycin in Streptomyces hygroscopicus from Glycerol-Based Media Optimized by Systemic Methodology

  • Kim, Yong Hyun;Park, Bu Soo;Bhatia, Shashi Kant;Seo, Hyung-Min;Jeon, Jong-Min;Kim, Hyun-Joong;Yi, Da-Hye;Lee, Ju-Hee;Choi, Kwon-Young;Park, Hyung-Yeon;Kim, Yun-Gon;Yang, Yung-Hun
    • Journal of Microbiology and Biotechnology / v.24 no.10 / pp.1319-1326 / 2014
  • Rapamycin, produced by the soil bacterium Streptomyces hygroscopicus, has the ability to suppress the immune system and is used as an antifungal, anti-inflammatory, antitumor, and immunosuppressive agent. In an attempt to increase the productivity of rapamycin, mutagenesis of wild-type Streptomyces hygroscopicus was performed using ultraviolet radiation, and the medium composition was optimized using glycerol (one of the cheapest starting substrates) by applying Plackett-Burman design and response surface methodology. Plackett-Burman design was used to analyze 14 medium constituents: M100 (maltodextrin), glycerol, soybean meal, soytone, yeast extract, $(NH_4)_2SO_4$, L-lysine, $KH_2PO_4$, $K_2HPO_4$, NaCl, $FeSO_4{\cdot}7H_2O$, $CaCO_3$, 2-(N-morpholino)ethanesulfonic acid, and the initial pH level. Glycerol, soytone, yeast extract, and $CaCO_3$ were selected for further analysis of their effect on rapamycin production. The individual and interaction effects of the four selected variables were determined by Box-Behnken design, suggesting that $CaCO_3$, soytone, and yeast extract have negative effects, whereas glycerol is a positive factor for rapamycin productivity. Medium optimization using statistical design resulted in a 45% increase in rapamycin production ($220.7{\pm}5.7$ mg/l) for the Streptomyces hygroscopicus mutant compared with the unoptimized production medium ($151.9{\pm}22.6$ mg/l), and raised production to nearly 588% of the level of wild-type Streptomyces hygroscopicus ($37.5{\pm}2.8$ mg/l). The change in pH showed that $CaCO_3$ is a critical and negative factor for rapamycin production.
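The screening and optimization workflow summarized in this abstract, Plackett-Burman screening followed by a Box-Behnken response surface, ultimately reduces to fitting a second-order polynomial to coded factor levels and locating its stationary point. The sketch below illustrates that step with numpy; the two factors, the coded design points, and the titer values are hypothetical placeholders, not data from the paper.

```python
# Minimal sketch of the response-surface step: fit a second-order model
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 to coded factor
# levels and locate its stationary point. Factor levels and titers are
# hypothetical placeholders, not data from the paper.
import numpy as np

# Coded levels (-1, 0, +1) for two illustrative factors, e.g. glycerol, soytone
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]], dtype=float)
y = np.array([150., 190., 120., 160., 170., 210., 180., 140., 200.])  # mg/l, hypothetical

def design_matrix(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# Stationary point of the fitted surface: solve grad(y) = 0.
# (In practice one also checks that the point is a maximum, not a saddle.)
b = beta[1:3]
B = np.array([[2 * beta[3], beta[5]], [beta[5], 2 * beta[4]]])
x_opt = np.linalg.solve(B, -b)
y_opt = design_matrix(x_opt[None, :]) @ beta
print("coded optimum:", x_opt, "predicted response:", y_opt[0])
```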

Why Gabor Frames? Two Fundamental Measures of Coherence and Their Role in Model Selection

  • Bajwa, Waheed U.;Calderbank, Robert;Jafarpour, Sina
    • Journal of Communications and Networks / v.12 no.4 / pp.289-307 / 2010
  • The problem of model selection arises in a number of contexts, such as subset selection in linear regression, estimation of structures in graphical models, and signal denoising. This paper studies non-asymptotic model selection for the general case of arbitrary (random or deterministic) design matrices and arbitrary nonzero entries of the signal. In this regard, it generalizes the notion of incoherence in the existing literature on model selection and introduces two fundamental measures of coherence, termed the worst-case coherence and the average coherence, among the columns of a design matrix. It utilizes these two measures of coherence to provide an in-depth analysis of a simple, model-order agnostic one-step thresholding (OST) algorithm for model selection and proves that OST is feasible for exact as well as partial model selection as long as the design matrix obeys an easily verifiable property, termed the coherence property. One of the key insights offered by the ensuing analysis is that OST can successfully carry out model selection even when methods based on convex optimization, such as the lasso, fail due to rank deficiency of the submatrices of the design matrix. In addition, the paper establishes that if the design matrix has reasonably small worst-case and average coherence, then OST performs near-optimally when either (i) the energy of any nonzero entry of the signal is close to the average signal energy per nonzero entry or (ii) the signal-to-noise ratio in the measurement system is not too high. Finally, two other key contributions of the paper are that (i) it provides bounds on the average coherence of Gaussian matrices and Gabor frames, and (ii) it extends the results on model selection using OST to low-complexity, model-order agnostic recovery of sparse signals with arbitrary nonzero entries. In particular, this part of the analysis implies that an Alltop Gabor frame together with OST can successfully carry out model selection and recovery of sparse signals, irrespective of the phases of the nonzero entries, even if the number of nonzero entries scales almost linearly with the number of rows of the Alltop Gabor frame.
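To make the one-step thresholding (OST) procedure concrete, the sketch below correlates the observations with the columns of a design matrix, keeps the columns whose correlation exceeds a threshold, and reports the worst-case coherence of the matrix. The Gaussian design, sparsity level, noise level, and threshold value are illustrative choices, not the paper's exact setup.

```python
# Minimal sketch of one-step thresholding (OST) for model selection:
# estimate the support of a sparse beta from y = X*beta + noise by
# thresholding the column correlations X^T y. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 256, 512, 4                         # rows, columns, nonzeros
X = rng.standard_normal((n, p)) / np.sqrt(n)  # columns have roughly unit norm
beta = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
beta[support] = rng.choice([-1.0, 1.0], size=k)
sigma = 0.05
y = X @ beta + sigma * rng.standard_normal(n)

corr = np.abs(X.T @ y)             # a single matrix-vector correlation step
threshold = 0.6                    # illustrative cut between signal and noise
estimated = np.flatnonzero(corr > threshold)

print("true support:     ", np.sort(support))
print("estimated support:", np.sort(estimated))

# Worst-case coherence of the column-normalized design matrix
Xn = X / np.linalg.norm(X, axis=0)
gram = np.abs(Xn.T @ Xn) - np.eye(p)
print("worst-case coherence:", gram.max())
```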

Design and Effects of Science Simulation Applications Using Flash and ActionScript 3.0: In the Composition of Material Chapter in Middle School Science Textbooks (Flash와 Actionscript 3.0을 이용한 과학 시뮬레이션 앱의 디자인 및 효과 -중학교 과학 '물질의 구성' 단원을 중심으로-)

  • Lee, Chang Youn;Park, Chulkyu;Hong, Hun-Gi
    • Journal of The Korean Association For Science Education / v.38 no.4 / pp.527-539 / 2018
  • Although simulations have been proposed as alternative content for inquiry activities, design cases that focus on the characteristics of science education are rare. This study suggested a definition and requirements for science simulations to clarify subject-specific design, and set up design guidelines that consider usability. The science simulations were then developed as apps for mobile devices, with 'Flash and ActionScript 3.0' selected as the development tool for its compatibility, functionality, ease of use, and optimization for mobile devices, with educational applicability in mind. In total, six science simulation apps were prepared for seven inquiry-activity classes among 10 science classes on the 'composition of material' chapter in middle school science 2 textbooks. The main advantages of the simulation apps expected from each design characteristic are also presented in this article. In implementing the science simulation apps, their educational effects were investigated by statistical comparison: 134 second-grade students at a coeducational middle school in Gyeonggi-do participated as an intervention group and a control group. Our results showed that the academic achievement and affective test scores of the intervention group were significantly higher than those of the control group (p < .05). In the questionnaire survey on usability, most students responded positively to the design of the science simulation apps. This study will contribute to expanding the horizon of design for science simulations as a design case in science education.

A VLSI Design of High Performance H.264 CAVLC Decoder Using Pipeline Stage Optimization (파이프라인 최적화를 통한 고성능 H.264 CAVLC 복호기의 VLSI 설계)

  • Lee, Byung-Yup;Ryoo, Kwang-Ki
    • Journal of the Institute of Electronics Engineers of Korea SD / v.46 no.12 / pp.50-57 / 2009
  • This paper proposes a VLSI architecture for a CAVLC hardware decoder, a tool that eliminates statistical redundancy in H.264/AVC video compression. Previous CAVLC hardware decoders used four stages to decode five code symbols, and their decoding performance suffered because of unnecessary idle cycles between state transitions; likewise, the computation of the valid bit length included an unnecessary idle cycle. This paper proposes a hardware architecture that eliminates these idle cycles efficiently. Two methods are applied to the architecture: one eliminates unnecessary use of the buffers storing decoded codes, yielding an efficient pipeline architecture, and the other uses shifter control to simplify the operations and control in the valid-bit-length calculation. The experimental results show that the proposed architecture needs only 89 cycles on average to decode one macroblock, improving performance by about 29% over previous designs. The synthesis results show that the design achieves a maximum operating frequency of 140 MHz with a hardware cost of about 11.5K gates in a 0.18 um CMOS process. Compared with previous designs, it can achieve low-power operation because it is implemented with high throughput and a low gate count.
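The shifter-control idea can be illustrated in software with a bit buffer that peeks a window of bits, decodes a symbol, and consumes exactly the symbol's length in a single shift, so no extra cycle is spent recomputing the valid bit length. The sketch below decodes unsigned Exp-Golomb codes (used for other H.264 syntax elements; CAVLC proper uses its own VLC tables, but the bit-shifting pattern is the same). It is a conceptual illustration, not the paper's RTL design.

```python
# Conceptual model of barrel-shifter style bitstream access for a VLC
# decoder: peek a window of bits, decode a symbol, then consume exactly
# the symbol's length in one shift. Illustrative only; not the paper's RTL.
class BitShifter:
    def __init__(self, data: bytes):
        self.buffer = int.from_bytes(data, "big")
        self.valid_bits = 8 * len(data)   # valid bit length tracked directly

    def peek(self, n: int) -> int:
        """Return the next n bits without consuming them."""
        return (self.buffer >> (self.valid_bits - n)) & ((1 << n) - 1)

    def consume(self, n: int) -> None:
        """Advance the stream by n bits (one 'shift', no idle cycle)."""
        self.valid_bits -= n

# Example: decode unsigned Exp-Golomb codes from a tiny synthetic bitstream
def decode_ue(bs: BitShifter) -> int:
    zeros = 0
    while bs.peek(1) == 0:        # count leading zero bits
        bs.consume(1)
        zeros += 1
    code = bs.peek(zeros + 1)     # read the remaining (zeros + 1) bits
    bs.consume(zeros + 1)
    return code - 1

bs = BitShifter(bytes([0b01011000]))  # '010' then '1' then padding
print(decode_ue(bs), decode_ue(bs))   # prints: 1 0
```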

A BPM Activity-Performer Correspondence Analysis Method (BPM 기반의 업무-수행자 대응분석 기법)

  • Ahn, Hyun;Park, Chungun;Kim, Kwanghoon
    • Journal of Internet Computing and Services / v.14 no.4 / pp.63-72 / 2013
  • Business Process Intelligence (BPI) is one of the emerging technologies in the knowledge discovery and analysis area. BPI deals with a series of techniques, from discovering knowledge to analyzing the discovered knowledge, in BPM-supported organizations. By means of BPI technology, we are able to provide the full functionality of control, monitoring, prediction, and optimization of process-supported organizational knowledge. In particular, we focus on the so-called BPM activity-performer affiliation networking knowledge, which represents the affiliation relationships between performers and activities in enacting a specific business process model. In this paper we devise a statistical analysis method to be applied to this knowledge, dubbed the activity-performer correspondence analysis method. The devised method consists of a series of pipelined phases, from the generation of a bipartite matrix to the visualization of the analysis result, and through the method we can analyze the degree of correspondence between a group of performers and a group of activities involved in a business process model or a package of business process models. In conclusion, we expect the activity-performer correspondence analysis method to improve the effectiveness and efficiency of human resource allotment and the degree of correlation between business activities and performers when planning and designing business process models and packages for BPM-supported organizations.
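The correspondence-analysis phase can be sketched directly from a bipartite activity-performer matrix: form the correspondence matrix, standardize the residuals against the independence model, and take an SVD to obtain coordinates for activities and performers. The 4x3 affiliation counts in the sketch below are hypothetical, not taken from the paper.

```python
# Minimal sketch of correspondence analysis on a bipartite activity-performer
# matrix: build the correspondence matrix, compute standardized residuals,
# and take their SVD to get row/column coordinates. Counts are hypothetical.
import numpy as np

# rows = activities, columns = performers; entries = affiliation counts
N = np.array([[5, 1, 0],
              [2, 4, 1],
              [0, 3, 6],
              [1, 0, 4]], dtype=float)

P = N / N.sum()                          # correspondence matrix
r = P.sum(axis=1)                        # row masses
c = P.sum(axis=0)                        # column masses
expected = np.outer(r, c)
S = (P - expected) / np.sqrt(expected)   # standardized residuals

U, sv, Vt = np.linalg.svd(S, full_matrices=False)

# Principal coordinates (first two dimensions) for plotting/interpretation
row_coords = (U * sv) / np.sqrt(r)[:, None]
col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]
inertia = sv**2 / (sv**2).sum()

print("activity coordinates:\n", row_coords[:, :2])
print("performer coordinates:\n", col_coords[:, :2])
print("explained inertia:", inertia[:2])
```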

Fingerprint Recognition Algorithm using Clique (클릭 구조를 이용한 지문 인식 알고리즘)

  • Ahn, Do-Sung;Kim, Hak-Il
    • Journal of the Korean Institute of Telematics and Electronics S / v.36S no.2 / pp.69-80 / 1999
  • Recently, the social demand for personal identification techniques has been expanding rapidly into a number of new application areas, and fingerprint recognition is among the most important of these technologies. Fingerprint recognition is well established, proven, cost-effective, and legally accepted, so it has received more attention than any other biometric technology. In this paper we propose a new on-line fingerprint recognition algorithm for non-inked (live-scan) input that meets the increasing security requirements of computing environments. A fingerprint recognition system consists of two distinct structural blocks: feature extraction and feature matching. The main topic of this paper is feature matching using fingerprint minutiae (ridge endings and bifurcations). Minutiae matching is composed of an alignment stage and a matching stage, and optimizing the alignment stage is the key to real-time (on-line) fingerprint recognition. The proposed clique-based alignment algorithm is strong in search-space optimization and on partially incomplete images. We built our own database to ensure generality. Using traditional statistical discriminant analysis, a 0.05% false acceptance rate (FAR) at an 8.83% false rejection rate (FRR), with an average matching time of 1.55 seconds on a Pentium system, was achieved. This makes it possible to construct a high-performance fingerprint recognition system.
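The align-then-match structure described in the abstract can be sketched with a simple reference-pair alignment: derive the rigid transform that superimposes one query minutia on one template minutia, align the query set, and count minutiae that land within distance and angle tolerances. The synthetic minutiae and the exhaustive reference-pair search below are a generic illustration, not the paper's clique-based alignment.

```python
# Minimal sketch of align-then-match for minutiae-based fingerprint matching:
# pick a reference minutia pair, derive the rigid transform that superimposes
# them, align the query minutiae, and count those within tolerance of a
# template minutia. Synthetic data; not the paper's clique-based algorithm.
import numpy as np

def rigid(minutiae, dtheta, t):
    """Rotate (x, y, angle) minutiae by dtheta, then translate by t."""
    c, s = np.cos(dtheta), np.sin(dtheta)
    xy = minutiae[:, :2] @ np.array([[c, -s], [s, c]]).T + t
    return np.column_stack([xy, (minutiae[:, 2] + dtheta) % (2 * np.pi)])

def align_by_reference(query, q_ref, t_ref):
    """Transform that maps query minutia q_ref onto template minutia t_ref."""
    dtheta = t_ref[2] - q_ref[2]
    c, s = np.cos(dtheta), np.sin(dtheta)
    xy = (query[:, :2] - q_ref[:2]) @ np.array([[c, -s], [s, c]]).T + t_ref[:2]
    return np.column_stack([xy, (query[:, 2] + dtheta) % (2 * np.pi)])

def match_score(template, query, d_tol=8.0, a_tol=np.pi / 12):
    """Count aligned query minutiae with a template minutia within tolerance."""
    score = 0
    for q in query:
        d = np.linalg.norm(template[:, :2] - q[:2], axis=1)
        a = np.abs((template[:, 2] - q[2] + np.pi) % (2 * np.pi) - np.pi)
        score += np.any((d < d_tol) & (a < a_tol))
    return int(score)

rng = np.random.default_rng(1)
template = np.column_stack([rng.uniform(0, 256, (20, 2)),
                            rng.uniform(0, 2 * np.pi, 20)])
query = rigid(template[:15], np.pi / 36, np.array([12.0, -7.0]))  # displaced copy

best = max(match_score(template, align_by_reference(query, q, t))
           for q in query for t in template)
print("best alignment score:", best, "of", len(query))
```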


Robust Optimal Design of Disc Brake Based on Response Surface Model Considering Standard Normal Distribution of Shape Tolerance (표준정규분포를 고려한 반응표면모델 기반 디스크 브레이크의 강건최적설계)

  • Lee, Kwang-Ki;Lee, Yong-Bum;Han, Seung-Ho
    • Transactions of the Korean Society of Mechanical Engineers A / v.34 no.9 / pp.1305-1310 / 2010
  • In a practical design process, extracting design-space information of a complex system in order to verify, improve, and optimize the design, taking into account the design variables and their shape tolerance, is very important. Finite element analysis has been successfully integrated with design of experiments, such as a D-optimal array, so that a response surface model and optimization tools can be obtained and the design variables can be optimized with them. Then, to guarantee the robustness of the design variables, a robust design should additionally be performed that takes into account the statistical variation of the shape tolerance of the optimized design variables. In this study, a new approach based on the response surface model is proposed in which the standard normal distribution of the shape tolerance is considered. By adopting this approach, it is possible to optimize variables and perform a robust design simultaneously. The approach can serve as a means of efficiently modeling the trade-off among many conflicting goals in applications of finite element analysis. A case study on the robust optimal design of disc brakes under thermal loading was carried out to handle multiple objective functions and constraints on the design variables, such as thermal deformation and weight.
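The robustness step can be illustrated by propagating a normally distributed shape tolerance through a fitted response surface and scoring each candidate design by the mean plus a multiple of the standard deviation of the response. The quadratic surface, tolerance, and weighting in the sketch below are hypothetical stand-ins, not the paper's disc-brake model.

```python
# Sketch of a robust-design evaluation on a response surface model: for a
# candidate design x, sample the shape tolerance from a normal distribution,
# push it through the fitted surface, and score mean + k*std so that designs
# on flat regions of the surface are preferred. The quadratic coefficients
# and tolerance are hypothetical, not the paper's brake model.
import numpy as np

rng = np.random.default_rng(0)

def response(x):
    """Hypothetical fitted RSM, e.g. thermal deformation vs. two shape variables."""
    x1, x2 = x[..., 0], x[..., 1]
    return 1.0 + 0.8 * x1 - 0.5 * x2 + 0.9 * x1**2 + 0.2 * x2**2 + 0.3 * x1 * x2

def robust_score(x, sigma=0.05, k=3.0, n_mc=4000):
    """Mean + k*std of the response under N(x, sigma^2 I) shape tolerance."""
    samples = x + sigma * rng.standard_normal((n_mc, 2))
    y = response(samples)
    return y.mean() + k * y.std()

# Coarse grid search over coded design variables in [-1, 1]^2
grid = np.linspace(-1, 1, 21)
candidates = [(robust_score(np.array([a, b])), a, b) for a in grid for b in grid]
score, a, b = min(candidates)
print(f"robust optimum near x1={a:.2f}, x2={b:.2f}, score={score:.3f}")
```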

Improvement of Rating Curve Fitting Considering Variance Function with Pseudo-likelihood Estimation (의사우도추정법에 의한 분산함수를 고려한 수위-유량 관계 곡선 산정법 개선)

  • Lee, Woo-Seok;Kim, Sang-Ug;Chung, Eun-Sung;Lee, Kil-Seong
    • Journal of Korea Water Resources Association / v.41 no.8 / pp.807-823 / 2008
  • This paper presents a technique for estimating discharge rating-curve parameters. In typical practice, the original non-linear rating curve is transformed into a simple linear regression model by log-transforming the measurements, without examining the effect of the log transformation. In this study, a pseudo-likelihood estimation model is developed to deal with the heteroscedasticity of the residuals in the original non-linear model. The parameters of the rating curve and the variance function of the errors are estimated simultaneously by the pseudo-likelihood estimation (P-LE) method. Simulated annealing, a global optimization technique, is adopted to minimize the log-likelihood of the weighted residuals. The P-LE model was applied to a hypothetical site where stage-discharge data were generated by incorporating various errors. The results of the P-LE model show smaller errors and narrower confidence intervals than those of the common log-transformed linear least squares (LT-LR) model. The threshold water level for segmentation of the discharge rating curve is also estimated within the P-LE procedure using the Heaviside function. Finally, the performance of the conventional log-transformed linear regression and of the developed P-LE model is computed and compared. After the statistical simulation, the developed method is applied to real data sets from five gauge stations in the Geum River basin. The results suggest that the developed strategy can be applied to real sites to determine weights that account for the error distributions of the observed discharge data.
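The variance-function idea behind the P-LE method can be sketched by alternating between a weighted fit of the rating curve Q = a(h - h0)^b and an update of a power-of-the-mean variance exponent estimated from the residuals. The data below are synthetic, and a Nelder-Mead solver stands in for the paper's simulated annealing; the Heaviside-based segmentation is omitted.

```python
# Sketch of pseudo-likelihood estimation for a rating curve Q = a*(h - h0)^b
# with a power-of-the-mean variance function Var(e) = s^2 * mu^(2*lam):
# alternate between weighted least squares for the curve and a residual-based
# update of the variance exponent. Synthetic data; the paper uses simulated
# annealing and Heaviside segmentation, which are not reproduced here.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
a_true, b_true, h0_true, lam_true = 35.0, 1.9, 0.3, 0.8
h = np.sort(rng.uniform(0.6, 4.0, 80))
mu = a_true * (h - h0_true) ** b_true
Q = mu + 0.05 * mu**lam_true * rng.standard_normal(h.size)  # heteroscedastic noise

def curve(params, h):
    a, b, h0 = params
    return a * np.clip(h - h0, 1e-6, None) ** b

theta = np.array([20.0, 1.5, 0.1])   # initial (a, b, h0)
lam = 0.0                            # start from constant variance
for _ in range(10):
    w = 1.0 / np.maximum(curve(theta, h), 1e-6) ** (2 * lam)   # weights

    def wls(p):
        return np.sum(w * (Q - curve(p, h)) ** 2)

    theta = minimize(wls, theta, method="Nelder-Mead").x
    # update the variance exponent by regressing log|residual| on log fitted Q
    res = Q - curve(theta, h)
    X = np.column_stack([np.ones(h.size), np.log(curve(theta, h))])
    coef, *_ = np.linalg.lstsq(X, np.log(np.abs(res) + 1e-12), rcond=None)
    lam = coef[1]

print("estimated (a, b, h0):", np.round(theta, 3), "variance exponent:", round(lam, 3))
```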

A Study on the Optimization of Anti-Jamming Trash Screen with Rake using by Response Surface Method (반응표면분석법을 이용한 제진기의 목메임 방지 개선 및 레이크 최적화)

  • Seon, Sang-Won;Yi, Won;Hong, Seok-Beom
    • Journal of the Korea Academia-Industrial cooperation Society / v.21 no.3 / pp.230-236 / 2020
  • A trash screen is installed in front of the inflow channel of drainage pumping stations, sewage treatment plants, and power plants to block floating contaminants. The bottleneck phenomenon, which decreases the water inflow, damages the trash screen when string-type obstacles are not removed and clog the space between the screens. In this paper, the apron was removed and the screen was expanded to prevent breakage due to the bottleneck phenomenon and string-type obstacles, and an extended rake was designed by adding an inner rake in the screen interspace to remove the bottleneck phenomenon and string-type obstacles. To design an inner rake that satisfies the allowable stresses of the existing rake, the experiment points were determined by the design-of-experiments method, using the inner-rake vertical length and the thickness of the reinforced section as parameters. The ANSYS Static Structural module and the statistical analysis tool R were used to obtain the optimized shape by the response surface method. The relative error between the response-surface results and the simulation results was 1.63% at the determined optimal design point, a rake length of 210.2 mm and a reinforcement-section thickness of 2 mm. A full-size test rake was then constructed for empirical testing, and approximately 97% of the bottleneck phenomenon and string-type obstacles could be removed.

Monitoring of Alcohol Fermentation Condition of Brown Rice Using Raw Starch Digesting Enzyme (생전분 분해효소를 이용한 현미 알콜발효조건의 모니터링)

  • 신진숙;이오석;김경은;정용진
    • Journal of the Korean Society of Food Science and Nutrition / v.32 no.3 / pp.375-380 / 2003
  • This study was carried out to establish alcohol fermentation conditions for uncooked brown rice. Response surface methodology (RSM) was applied to optimize and monitor the alcohol fermentation conditions with uncooked brown rice. Response surface regression analysis was conducted on the primary variables: the particle size of brown rice (20, 40, 60 mesh), the enzyme content (0.1, 0.3, 0.5%), and the agitation rate (0, 100, 200 rpm). The optimum conditions determined with SAS (Statistical Analysis System) were a particle size of 35~42 mesh and an enzyme content of 0.32~0.43%. The coefficient of determination ($R^2$) was significant at the 5~10% level for all components except reducing sugar. Predicted values at the optimum alcohol fermentation conditions agreed with the experimental values. During fermentation, the pH decreased from 6.25 to 4.34 and the total acidity increased from 0.15 to 0.2. The amino acidity decreased from 1.88 to 0.92, and the reducing sugar and total sugar contents decreased (213 mg% and 1,077 mg%, respectively). The alcohol content increased to 10% after 48 hr of fermentation.