• Title/Summary/Keyword: Optimization Model


A Deep Learning Based Approach to Recognizing Accompanying Status of Smartphone Users Using Multimodal Data (스마트폰 다종 데이터를 활용한 딥러닝 기반의 사용자 동행 상태 인식)

  • Kim, Kilho;Choi, Sangwoo;Chae, Moon-jung;Park, Heewoong;Lee, Jaehong;Park, Jonghun
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.1
    • /
    • pp.163-177
    • /
    • 2019
  • As smartphones have come into wide use, human activity recognition (HAR) tasks for recognizing the personal activities of smartphone users from multimodal data have been actively studied. The research area is expanding from the recognition of simple body movements of an individual user to the recognition of low-level and high-level behavior. However, HAR tasks for recognizing interaction behavior with other people, such as whether the user is accompanying or communicating with someone else, have received less attention so far, and previous research on recognizing interaction behavior has usually depended on audio, Bluetooth, and Wi-Fi sensors, which are vulnerable to privacy issues and require much time to collect enough data. In contrast, physical sensors, including accelerometer, magnetic field, and gyroscope sensors, are less vulnerable to privacy issues and can collect a large amount of data within a short time. In this paper, a method for detecting accompanying status with a deep learning model using only multimodal physical sensor data, such as accelerometer, magnetic field, and gyroscope readings, was proposed. The accompanying status was defined by redefining part of the user's interaction behavior: whether the user is accompanied by an acquaintance at close distance and whether the user is actively communicating with that acquaintance. A framework based on convolutional neural networks (CNN) and long short-term memory (LSTM) recurrent networks for classifying accompanying and conversation was proposed. First, a data preprocessing method consisting of time synchronization of multimodal data from different physical sensors, data normalization, and sequence data generation was introduced. We applied nearest-neighbor interpolation to synchronize the timestamps of data collected from different sensors. 
Normalization was performed for each x, y, z axis value of the sensor data, and the sequence data were generated using the sliding-window method. The sequence data then became the input to the CNN, where feature maps representing local dependencies of the original sequence are extracted. The CNN consisted of 3 convolutional layers and had no pooling layer, in order to preserve the temporal information of the sequence data. Next, the LSTM recurrent networks received the feature maps, learned long-term dependencies from them, and extracted features. The LSTM recurrent networks consisted of two layers, each with 128 cells. Finally, the extracted features were used for classification by a softmax classifier. The loss function of the model was the cross-entropy function, and the weights of the model were randomly initialized from a normal distribution with a mean of 0 and a standard deviation of 0.1. The model was trained with the adaptive moment estimation (ADAM) optimization algorithm, and the mini-batch size was set to 128. We applied dropout to the input values of the LSTM recurrent networks to prevent overfitting. The initial learning rate was set to 0.001 and decayed by a factor of 0.99 at the end of each training epoch. An Android smartphone application was developed and released to collect data. We collected smartphone data from a total of 18 subjects. Using the data, the model classified accompanying and conversation with 98.74% and 98.83% accuracy, respectively. Both the F1 score and accuracy of the model were higher than those of a majority-vote classifier, a support vector machine, and a deep recurrent neural network. In future research, we will focus on more rigorous multimodal sensor data synchronization methods that minimize timestamp differences. In addition, we will further study transfer learning methods that enable trained models tailored to the training data to be transferred to evaluation data that follows a different distribution. 
It is expected that a model capable of exhibiting robust recognition performance against changes in data not considered at the model training stage will be obtained.
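The preprocessing pipeline the abstract describes (nearest-neighbor time synchronization, per-axis normalization, sliding-window sequence generation) can be sketched as follows; the window size, step, and sampling rates here are illustrative assumptions, not values from the paper.

```python
import numpy as np

def nearest_sync(ref_times, times, values):
    """Resample a sensor stream onto reference timestamps by nearest interpolation."""
    idx = np.abs(times[None, :] - ref_times[:, None]).argmin(axis=1)
    return values[idx]

def normalize(data):
    """Z-score each axis (column) of a (samples, axes) array."""
    return (data - data.mean(axis=0)) / (data.std(axis=0) + 1e-8)

def sliding_windows(data, window, step):
    """Cut overlapping (window, axes) sequences for the CNN input."""
    return np.stack([data[i:i + window]
                     for i in range(0, len(data) - window + 1, step)])

# Illustrative use: synchronize a gyroscope stream to accelerometer timestamps,
# normalize per axis, and cut fixed-length sequences.
acc_t = np.arange(0.0, 10.0, 0.02)      # 50 Hz reference clock (hypothetical)
gyr_t = np.arange(0.0, 10.0, 0.021)     # slightly drifting sensor clock
gyr_v = np.random.randn(len(gyr_t), 3)  # x, y, z axis values
gyr_synced = nearest_sync(acc_t, gyr_t, gyr_v)
windows = sliding_windows(normalize(gyr_synced), window=128, step=64)
print(windows.shape)  # → (6, 128, 3)
```

Each window would then be fed to the CNN front end, whose feature maps the LSTM layers consume.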

An Ontology Model for Public Service Export Platform (공공 서비스 수출 플랫폼을 위한 온톨로지 모형)

  • Lee, Gang-Won;Park, Sei-Kwon;Ryu, Seung-Wan;Shin, Dong-Cheon
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.1
    • /
    • pp.149-161
    • /
    • 2014
  • The export of domestic public services to overseas markets contains many potential obstacles, stemming from different export procedures, the target services, and socio-economic environments. In order to alleviate these problems, the business incubation platform, as an open business ecosystem, can be a powerful instrument to support the decisions taken by participants and stakeholders. In this paper, we propose an ontology model and its implementation processes for a business incubation platform with an open and pervasive architecture to support public service exports. For the conceptual model of the platform ontology, export case studies are used for requirements analysis. The conceptual model shows the basic structure, with vocabulary and its meaning, the relationships between ontologies, and key attributes. For the implementation and testing of the ontology model, the logical structure is edited using the Protégé editor. The core engine of the business incubation platform is the simulator module, where the various contexts of export businesses should be captured, defined, and shared with other modules through ontologies. It is well known that an ontology, in which concepts and their relationships are represented using a shared vocabulary, is an efficient and effective tool for organizing meta-information to develop structural frameworks in a particular domain. The proposed model consists of five ontologies derived from a requirements survey of major stakeholders and their operational scenarios: service, requirements, environment, enterprise, and country. The service ontology contains several components that can find and categorize public services through a case analysis of public service exports. Key attributes of the service ontology are composed of categories including objective, requirements, activity, and service. 
The objective category, which has sub-attributes including operational body (organization) and user, acts as a reference to search and classify public services. The requirements category relates to the functional needs at a particular phase of system (service) design or operation. Sub-attributes of requirements are user, application, platform, architecture, and social overhead. The activity category represents business processes during the operation and maintenance phase. The activity category also has sub-attributes including facility, software, and project unit. The service category, with sub-attributes such as target, time, and place, acts as a reference to sort and classify the public services. The requirements ontology is derived from the basic and common components of public services and target countries. The key attributes of the requirements ontology are business, technology, and constraints. Business requirements represent the needs of processes and activities for public service export; technology represents the technological requirements for the operation of public services; and constraints represent the business law, regulations, or cultural characteristics of the target country. The environment ontology is derived from case studies of target countries for public service operation. Key attributes of the environment ontology are user, requirements, and activity. A user includes stakeholders in public services, from citizens to operators and managers; the requirements attribute represents the managerial and physical needs during operation; the activity attribute represents business processes in detail. The enterprise ontology is introduced from a previous study, and its attributes are activity, organization, strategy, marketing, and time. 
The country ontology is derived from the demographic and geopolitical analysis of the target country, and its key attributes are economy, social infrastructure, law, regulation, customs, population, location, and development strategies. The priority list of target services for a certain country and/or the priority list of target countries for a certain public service are generated by a matching algorithm. These lists are used as input seeds to simulate the consortium partners and the government's policies and programs. In the simulation, the environmental differences between Korea and the target country can be customized through a gap analysis and work-flow optimization process. When the process gap between Korea and the target country is too large for a single corporation to cover, a consortium is considered an alternative choice, and various alternatives are derived from the capability index of enterprises. For financial packages, a mix of various foreign aid funds can be simulated during this stage. It is expected that the proposed ontology model and the business incubation platform can be used by various participants in the public service export market. They could be especially beneficial to small and medium businesses that have relatively fewer resources and less experience with public service export. We also expect that the open and pervasive service architecture in a digital business ecosystem will help stakeholders find new opportunities through information sharing and collaboration on business processes.
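As a rough illustration of how the five ontologies and the matching algorithm might be organized in code, the sketch below encodes the attribute structure described above as plain dictionaries and scores a hypothetical service against a country's needs; the tags and the scoring rule are invented for illustration and are not the paper's implementation.

```python
# The five platform ontologies as nested attribute maps (attributes taken from
# the abstract; sub-attribute lists abbreviated where the abstract omits them).
ontologies = {
    "service":      {"objective": ["organization", "user"],
                     "requirements": ["user", "application", "platform",
                                      "architecture", "social overhead"],
                     "activity": ["facility", "software", "project unit"],
                     "service": ["target", "time", "place"]},
    "requirements": {"business": [], "technology": [], "constraints": []},
    "environment":  {"user": [], "requirements": [], "activity": []},
    "enterprise":   {"activity": [], "organization": [], "strategy": [],
                     "marketing": [], "time": []},
    "country":      {"economy": [], "social infrastructure": [], "law": [],
                     "regulation": [], "customs": [], "population": [],
                     "location": [], "development strategies": []},
}

def match_score(service_tags, country_tags):
    """Naive matcher: fraction of a country's needs covered by a service."""
    return len(set(service_tags) & set(country_tags)) / len(set(country_tags))

# Hypothetical tags for one service/country pair.
score = match_score({"e-government", "tax", "customs"}, {"tax", "customs"})
print(score)  # → 1.0
```

A real matching algorithm would rank all service/country pairs by such scores to produce the priority lists that seed the simulator.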

Computer Simulations of Hoffman Brain Phantom: Sensitivity Measurements and Optimization of Data Analysis of 〔Tc-99m〕ECD SPECT Before and After Acetazolamide Administration (Acetazolamide 사용전후 〔Tc-99m〕 ECD SPECT 데이타 분석 방법의 최적화 및 민감도 측정)

  • Kim, Hee-Joung;Lee, Hee-Kyung
    • Progress in Medical Physics
    • /
    • v.6 no.2
    • /
    • pp.71-81
    • /
    • 1995
  • Consecutive brain 〔Tc-99m〕ECD SPECT studies before and after acetazolamide (Diamox) administration have been performed with patients for the evaluation of cerebrovascular hemodynamic reserve. However, the quantitative potential of SPECT Diamox imaging is limited as a result of degrading factors such as finite detector resolution, attenuation, scatter, poor counting statistics, and methods of data analysis. Making physical measurements in phantoms filled with known amounts of radioactivity can help characterize and potentially quantify the sensitivities. However, it is often very difficult to make a realistic phantom simulating patients in clinical situations. By computer simulation, we studied the sensitivities of ECD SPECT before and after Diamox administration. The sensitivity is defined as ($\Delta$N/N)/($\Delta$S/S)$\times$100%, where $\Delta$N denotes the difference in mean counts between post- and pre-Diamox in the measured data, N denotes the mean counts before Diamox in the measured data, $\Delta$S denotes the difference in mean counts between post- and pre-Diamox in the model, and S denotes the mean counts before Diamox in the model. In clinical Diamox studies, the percentage change of radioactivity could be determined to measure changes in radioactivity concentration by Diamox after subtracting pre- from post-Diamox data. However, the optimal amount of subtraction for 100% sensitivity is not known, since this requires a thorough sensitivity analysis by computer simulation. For the consecutive brain SPECT imaging model before and after Diamox, when a 30% increase in radioactivity concentration was assigned for the Diamox effect in the model, the sensitivities were measured as 51.03, 73.40, 94.00, and 130.74% for 0, 100, 150, and 200% subtraction, respectively. 
Sensitivity analysis indicated that partial voluming effects due to finite detector resolution and statistical noise result in a significant underestimation of radioactivity measurements, and that the amount of underestimation depends on the % increase of radioactivity concentration and the % subtraction of pre- from post-Diamox data. The 150% subtraction appears to be optimal in clinical situations where we expect approximately 30% changes in radioactivity concentration. Computer simulation may be a powerful technique for studying the sensitivities of ECD SPECT before and after Diamox administration.
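The sensitivity definition above is simple enough to state directly in code; the numbers in the example are illustrative, not the paper's measurements.

```python
def sensitivity(pre_meas, post_meas, pre_model, post_model):
    """Sensitivity = ((ΔN/N) / (ΔS/S)) * 100%, where N is the measured mean
    count before Diamox and S the model mean count before Diamox."""
    dN_over_N = (post_meas - pre_meas) / pre_meas
    dS_over_S = (post_model - pre_model) / pre_model
    return dN_over_N / dS_over_S * 100.0

# Illustrative numbers (not from the paper): a 30% increase in the model that
# appears as only a 15% increase in the measured data gives 50% sensitivity.
print(sensitivity(100.0, 115.0, 100.0, 130.0))  # → 50.0
```

A sensitivity below 100% indicates the measured data underestimates the true change, which is exactly the effect the subtraction fraction is tuned to correct.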


Opportunity Tree Framework Design For Optimization of Software Development Project Performance (소프트웨어 개발 프로젝트 성능의 최적화를 위한 Opportunity Tree 모델 설계)

  • Song Ki-Won;Lee Kyung-Whan
    • The KIPS Transactions:PartD
    • /
    • v.12D no.3 s.99
    • /
    • pp.417-428
    • /
    • 2005
  • Today, IT organizations perform projects with visions related to marketing and financial profit. The objective of realizing the vision is to improve project-performing ability in terms of QCD. Organizations have made many efforts to achieve this objective through process improvement. Large companies such as IBM, Ford, and GE have achieved over $80\%$ success through business process re-engineering using information technology, rather than through the improvement effects of computerization alone. It is important to collect, analyze, and manage data on performed projects to achieve the objective, but quantitative measurement is difficult, as software is invisible and the effect and efficiency caused by process change are not visibly identified. Therefore, it is not easy to extract a strategy for improvement. This paper measures and analyzes project performance, focusing on organizations' external effectiveness and internal efficiency (Quality, Delivery, Cycle time, and Waste). Based on the measured project performance scores, an OT (Opportunity Tree) model was designed for optimizing project performance. The process of design is as follows. First, meta data are derived from projects and analyzed by a quantitative GQM (Goal-Question-Metric) questionnaire. Then, the project performance model is designed with the data obtained from the quantitative GQM questionnaire, and the organization's performance score for each area is calculated. The value is revised by integrating the measured scores by area with vision weights from all stakeholders (CEO, middle managers, developers, investors, and customers). Through this, routes for improvement are presented and an optimized improvement method is suggested. Existing methods to improve the software process have been highly effective in the division of processes, but somewhat unsatisfactory in their structural capacity to develop and systematically manage strategies by applying the processes to projects. 
The proposed OT model provides a solution to this problem. The OT model is useful for providing an optimal improvement method in line with the organization's goals, and it can reduce the risks that may occur in the course of process improvement when applied with the proposed methods. In addition, satisfaction with the improvement strategy can be increased by obtaining input on vision weights from all stakeholders through the qualitative questionnaire and reflecting it in the calculation. The OT model is also useful for optimizing market expansion and financial performance by managing Quality, Delivery, Cycle time, and Waste.
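A minimal sketch of the vision-weighted revision step might look like the following; the area scores, stakeholder groups, and weights are invented for illustration and are not the paper's data.

```python
# Hypothetical area scores from the quantitative GQM questionnaire, combined
# with vision weights elicited from each stakeholder group.
area_scores = {"Quality": 72.0, "Delivery": 65.0, "Cycle time": 58.0, "Waste": 80.0}
stakeholder_weights = {
    "CEO":       {"Quality": 0.40, "Delivery": 0.30, "Cycle time": 0.20, "Waste": 0.10},
    "manager":   {"Quality": 0.25, "Delivery": 0.25, "Cycle time": 0.25, "Waste": 0.25},
    "developer": {"Quality": 0.30, "Delivery": 0.20, "Cycle time": 0.30, "Waste": 0.20},
}

def revised_score(scores, weights_by_stakeholder):
    """Average the vision-weighted performance score over stakeholder groups."""
    totals = [sum(scores[a] * w[a] for a in scores)
              for w in weights_by_stakeholder.values()]
    return sum(totals) / len(totals)

print(round(revised_score(area_scores, stakeholder_weights), 2))  # → 68.22
```

The lowest-weighted-score areas would then become the branches of the Opportunity Tree to prioritize for improvement.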

A Framework for Creating Inter-Industry Service Models in the Convergence Era (융합 서비스 모델 개발 방법론 및 체계 연구)

  • Kwon, Hyeog-In;Ryu, Gui-Jin;Joo, Hi-Yeob;Kim, Man-Jin
    • Asia pacific journal of information systems
    • /
    • v.21 no.1
    • /
    • pp.81-101
    • /
    • 2011
  • In today's rapidly changing and increasingly competitive business environment, new product development in tune with market trends in a timely manner has been a matter of the utmost concern for all enterprises. Indeed, developing a sustainable new business has been a top priority not only for business enterprises, but also for the government policy makers accountable for the health of the national economy, as well as for decision makers in organizations of every type. Further, for a soft landing of new businesses, building a government-initiated industry base has been claimed to be necessary as a way to effectively boost corporate activities. However, the existing methodology for new service and new product development is not suitable for nurturing industry, because it is mainly focused on the research and development of corporate business activities instead of on industry nurturing. The approach for developing new business is based on 'innovation' and 'convergence.' Yet, convergence among technologies, supplies, businesses, and industries is believed to be more effective than innovation alone as a way to gain momentum. Therefore, it has become more important than ever to study a new methodology based on convergence in industry-level new product development (NPD) and new service development (NSD). In this research, therefore, we reviewed the restrictions in the existing new product and new service development methodologies and the existing business model development methodology. In doing so, we conducted an industry standard collaboration analysis of a new service model development methodology in the private sector and the public sector. This approach is fundamentally different from the existing one in that ours focuses on new business development under private management. The suggested framework can be categorized into the industry level and the service level. 
First, at the industry level, we define new business opportunities arising from convergence between businesses. For this, we analyze the existing industry at the industry level to identify the opportunities in a market and its business attractiveness, based on which the convergence industry is formulated. Also, through the analysis of environment and market opportunity at the industry level, we can trace how different industries are linked to one another, so as to extend the results of the study to develop better insights into industry expansion and new industry emergence. Then, at the service level, we elicit the service for the defined new business, which is composed of a private service and a supporting service for nurturing industry. The private service includes three steps: plan, design, do; the supporting service for nurturing industry has four steps: selection, environment building, business preparation, and do-and-see. The existing methodology focuses mainly on securing business competitiveness, building a business model for success, and offering new services based on the core competence of companies. The suggested methodology, on the other hand, suggests the necessity of service development, when new business opportunities arise, in relation to the opportunity analysis of supporting services based on a clear understanding of new business supporting infrastructure optimization. Meanwhile, we have performed case studies on the printing and publishing field with a strict procedure and development system to assure feasibility and practical application. Even though the printing and publishing industry is considered a typical knowledge convergence industry, it is also known as a low-demand and low-value industry in Korea. For this reason, we apply the new methodology and suggest the direction and the possibility of how the printing and publishing industry can be transformed into a core dynamic force for new growth. 
Finally, we suggest the base composition of services for industry promotion (public) and business opportunities for private-sector profitability (private).

The Business Model & Feasibility Analysis of the Han-Ok Residential Housing Block (한옥주거단지 사업모델구상 및 타당성 분석)

  • Choi, Sang-Hee;Song, Ki-Wook;Park, Sin-Won
    • Land and Housing Review
    • /
    • v.2 no.4
    • /
    • pp.453-461
    • /
    • 2011
  • This study derives a project model based on potential demand for Korean-style houses, focusing on the new town detached housing sites that LH supplies, tests the validity of the derived model, and presents the direction and supply methods of the projects. The existing high-class new town Korean-style housing developments that have been considered were found to have little business value due to problems in choice of location and discordance of demand, so six types of projects were established through changes in planned scale, combined use, and subdivision of plots of land based on the results of a survey. The type with the highest business value among the project models was block-type multifamily housing, which can be interpreted as the increase in total construction area leading to increased revenues from allotment sales due to economies of scale. The feasibility of a mass housing model in which small-scale Korean-style houses are combined with amenities was found to be high, and if the same project conditions as those of the block-type multifamily houses are applied, the business value of Korean-style tenement houses was also found to be high. Besides, the high-class housing models within block-type detached housing areas are typical projects that the private sector generally promotes, and their construction cost was found to be the most expensive, at 910 million won per house. In order to enhance the business value of Korean-style housing development, measures such as careful choice of location, diversification of demand classes, optimization of house sizes, and combination of uses are needed. And in order to adopt Korean-style houses in the detached housing sites, adjustments and division of the existing planned plots are needed, and strategies to cope with new demand by supplying Korean-style housing types of sites can be suggested. 
Also, breaking away from the existing uniform residential development methods, a development method based on supplying original land, i.e., natural land not yet developed beyond basic infrastructure (main roads, water, and sewage), can be considered. And since constructing buildings of more than one or two stories is impossible given the structure of the Korean-style roof and frame, original land in the form of hilly terrain is considered most suitable for large-scale development projects.

MDP(Markov Decision Process) Model for Prediction of Survivor Behavior based on Topographic Information (지형정보 기반 조난자 행동예측을 위한 마코프 의사결정과정 모형)

  • Jinho Son;Suhwan Kim
    • Journal of Intelligence and Information Systems
    • /
    • v.29 no.2
    • /
    • pp.101-114
    • /
    • 2023
  • In wartime, aircraft carrying out missions to strike deep into enemy territory are exposed to the risk of being shot down. As a key combat force in modern warfare, it takes a great deal of time, effort, and national budget to train the military flight personnel who operate high-tech weapon systems. Therefore, this study examined the path problem of predicting the route of emergency escape from enemy territory to a target point while avoiding obstacles, and through this, increased the possibility of safe recovery of downed military flight personnel. Previous studies approached this as a network-based problem, transforming it into TSP or VRP formulations or applying the Dijkstra algorithm, and treating it with optimization techniques. However, when the problem is approached as a network problem, it is difficult to reflect the dynamic factors and uncertainties of the battlefield environment that military flight personnel in distress will face. So, an MDP suitable for modeling dynamic environments was applied and studied. In addition, GIS was used to obtain topographic information data, and in designing the reward structure of the MDP, topographic information was reflected in more detail so that the model could be more realistic than in previous studies. In this study, a value iteration algorithm and a deterministic method were used to derive a path that allows military flight personnel in distress to move the shortest distance while making the most of topographical advantages. In addition, actual topographic information and the obstacles that military flight personnel may meet in the course of evasion and escape were added to increase the realism of the model. Through this, it was possible to predict by which route the military flight personnel would evade and escape in an actual situation. The model presented in this study can be applied to various operational situations through redesign of the reward structure. 
In actual situations, decision support based on scientific techniques that reflect various factors will be possible in predicting the escape route of military flight personnel in distress and in conducting combat search and rescue operations.
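The value-iteration approach described above can be illustrated on a toy grid, with terrain step costs standing in for the topography-based reward structure; the grid layout, costs, and goal reward are invented for illustration, not the paper's GIS data.

```python
GRID = [
    [1, 1, 4, 1],
    [1, 'X', 4, 1],   # 'X' marks an impassable obstacle, 4 a costly ridge
    [1, 1, 1, 'G'],   # 'G' is the recovery (goal) cell
]
GAMMA = 0.95
GOAL_REWARD = 10.0
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]

def reward(cell, grid, goal):
    """Reward for entering a cell: bonus at the goal, negative terrain cost elsewhere."""
    return GOAL_REWARD if cell == goal else -grid[cell[0]][cell[1]]

def value_iteration(grid, tol=1e-6):
    """Deterministic value iteration over the passable cells."""
    states = {(r, c) for r in range(len(grid)) for c in range(len(grid[0]))
              if grid[r][c] != 'X'}
    goal = next(s for s in states if grid[s[0]][s[1]] == 'G')
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            if s == goal:
                continue
            new_v = max(reward(n, grid, goal) + GAMMA * V[n]
                        for dr, dc in ACTIONS
                        if (n := (s[0] + dr, s[1] + dc)) in states)
            delta = max(delta, abs(new_v - V[s]))
            V[s] = new_v
        if delta < tol:
            return V, goal

def greedy_path(V, grid, goal, start, limit=20):
    """Follow the greedy (Bellman-optimal) action from start to the goal."""
    path = [start]
    while path[-1] != goal and len(path) < limit:
        s = path[-1]
        nbrs = [(s[0] + dr, s[1] + dc) for dr, dc in ACTIONS
                if (s[0] + dr, s[1] + dc) in V]
        path.append(max(nbrs, key=lambda n: reward(n, grid, goal) + GAMMA * V[n]))
    return path

V, goal = value_iteration(GRID)
print(greedy_path(V, GRID, goal, (0, 0)))  # → [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (2, 3)]
```

The derived path hugs the cheap left column rather than crossing the cost-4 cells, which is the same trade-off the paper's reward structure encodes with real terrain.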

Optimization of the Reaction Conditions and the Effect of Surfactants on the Kinetic Resolution of [R,S]-Naproxen 2,2,2-Trifluoroethyl Thioester by Using Lipase (리파아제를 이용한 라세믹 나프록센 2,2,2-트리플로로에틸 씨오에스터의 Kinetic Resolution에서 반응조건 최적화와 계면활성제 영향)

  • Song, Yoon-Seok;Lee, Jung-Ho;Cho, Sang-Won;Kang, Seong-Woo;Kim, Seung-Wook
    • KSBB Journal
    • /
    • v.23 no.3
    • /
    • pp.257-262
    • /
    • 2008
  • In this study, the reaction conditions for the lipase-catalyzed resolution of racemic naproxen 2,2,2-trifluoroethyl thioester were optimized, and the effect of surfactants was investigated. Among the organic solvents tested, isooctane showed the highest conversion (92.19%) in the hydrolytic reaction of (S)-naproxen 2,2,2-trifluoroethyl thioester. In addition, isooctane induced the highest initial reaction rate of (S)-naproxen 2,2,2-trifluoroethyl thioester ($V_s=2.34{\times}10^{-2}mM/h$), the highest enantioselectivity (E = 36.12), and the highest specific activity ($V_s/(E_t)=7.80{\times}10^{-4}mmol/h{\cdot}g$) of the lipase. Furthermore, reaction conditions such as temperature, concentrations of the substrate and enzyme, and agitation speed were optimized using response surface methodology (RSM), and the statistical analysis indicated that the optimal conditions were $48.2^{\circ}C$, 3.51 mM, 30.11 mg/mL, and 180 rpm, respectively. When the optimal reaction conditions were used, the conversion of (S)-naproxen 2,2,2-trifluoroethyl thioester was 96.5%, which is similar to the conversion (94.6%) predicted by the model. After optimization of the reaction conditions, the initial reaction rate, lipase specific activity, and conversion of (S)-naproxen 2,2,2-trifluoroethyl thioester increased by approximately 19.54%, 19.12%, and 4.05%, respectively. The effect of surfactants such as Triton X-100 and NP-10 was also studied, and NP-10 showed the highest conversion (89.43%), initial reaction rate of (S)-naproxen 2,2,2-trifluoroethyl thioester ($V_s=1.175{\times}10^{-2}mM/h$), and enantioselectivity (E = 59.24) of the lipase.
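To illustrate the RSM step on a single factor, the sketch below fits a second-order model to made-up conversion-versus-temperature data and locates its stationary point; the paper itself fits all four factors jointly, and none of the numbers here are from the study.

```python
import numpy as np

# Hypothetical conversion (%) measured at several temperatures (°C).
temps = np.array([30.0, 38.0, 44.0, 50.0, 56.0])
conv  = np.array([70.0, 85.0, 93.0, 95.0, 90.0])

# Second-order (quadratic) response model: conversion ≈ c2*T² + c1*T + c0.
c2, c1, c0 = np.polyfit(temps, conv, deg=2)

# Stationary point (vertex) of the fitted parabola — the predicted optimum.
t_opt = -c1 / (2.0 * c2)
print(round(t_opt, 1))
```

A full RSM analysis adds the other factors and their interaction terms, but the idea is the same: fit a quadratic surface, then solve for its stationary point.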

Intelligent Optimal Route Planning Based on Context Awareness (상황인식 기반 지능형 최적 경로계획)

  • Lee, Hyun-Jung;Chang, Yong-Sik
    • Asia pacific journal of information systems
    • /
    • v.19 no.2
    • /
    • pp.117-137
    • /
    • 2009
  • Recently, intelligent traffic information systems have enabled people to forecast traffic conditions before hitting the road. These convenient systems operate on the basis of data reflecting current road and traffic conditions as well as distance-based data between locations. Thanks to the rapid development of ubiquitous computing, tremendous context data have become readily available, making vehicle route planning easier than ever. Previous research on the optimization of vehicle route planning merely focused on finding the optimal distance between locations. Contexts reflecting road and traffic conditions were not seriously treated as a way to resolve optimal routing problems beyond distance-based route planning, because this kind of information does not have a significant impact on traffic routing until a complex traffic situation arises. Further, it was also not easy to take the traffic contexts fully into account in resolving optimal routing problems, because predicting dynamic traffic situations was regarded as a daunting task. However, with the rapid increase in traffic complexity, the importance of developing contexts reflecting data related to moving costs has emerged. Hence, this research proposes a framework designed to resolve an optimal route planning problem by taking full account of additional moving costs such as road traffic cost and weather cost, among others. Recent technological developments, particularly in the ubiquitous computing environment, have facilitated the collection of such data. This framework is based on the contexts of time, traffic, and environment, and addresses the following issues. First, we clarify and classify the diverse contexts that affect a vehicle's velocity and estimate the optimal moving cost based on dynamic programming, accounting for the context cost according to the variance of contexts. 
Second, the velocity reduction rate is applied to find the optimal route (shortest path) using the context data on the current traffic condition. The velocity reduction rate refers to the degree to which a vehicle's velocity is reduced under the relevant road and traffic contexts, derived from statistical or experimental data. Knowledge generated in this paper can be referenced by organizations that deal with road and traffic data. Third, in experimentation, we evaluate the effectiveness of the proposed context-based optimal route (shortest path) between locations by comparing it to the previously used distance-based shortest path. A vehicle's optimal route might change due to varying velocity caused by unexpected but potentially dynamic situations depending on the road condition. This study includes context variables such as 'road congestion', 'work', 'accident', and 'weather', which can alter the traffic condition. These contexts can affect a moving vehicle's velocity on the road. Since the context variables other than 'weather' are related to road conditions, the relevant data were provided by the Korea Expressway Corporation. The 'weather'-related data were obtained from the Korea Meteorological Administration. The aware contexts are classified as contexts causing a reduction of vehicle velocity, which determines the velocity reduction rate. To find the optimal route (shortest path), we introduced the velocity reduction rate into the context for calculating a vehicle's velocity, reflecting composite contexts when one event synchronizes with another. We then proposed a context-based optimal route (shortest path) algorithm based on dynamic programming. The algorithm is composed of three steps. In the first, initialization step, departure and destination locations are given, and the path step is initialized as 0. 
In the second step, moving costs between locations on the path are estimated, taking composite contexts into account, using the per-context velocity reduction rate as the path steps increase. In the third step, the optimal route (shortest path) is retrieved through back-tracking. In the provided research model, we designed a framework to account for context awareness, moving cost estimation (taking both composite and single contexts into account), and an optimal route (shortest path) algorithm (based on dynamic programming). Through illustrative experimentation using the Wilcoxon signed rank test, we showed that context-based route planning is much more effective than distance-based route planning. In addition, we found that the optimal solution (shortest paths) obtained through distance-based route planning might not be optimal in real situations, because road conditions are very dynamic and unpredictable while affecting most vehicles' moving costs. While more information is needed for a more accurate estimation of moving vehicles' costs, this study remains viable for applications that reduce moving costs through effective route planning. For instance, it could be applied to deliverers' decision making to enhance their decision satisfaction when they face unpredictable dynamic situations while driving. Overall, we conclude that taking the contexts into account as a part of costs is a meaningful and sensible approach to resolving the optimal route problem.
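A minimal sketch of the core idea, context-adjusted edge costs inside a shortest-path search, is shown below using Dijkstra's algorithm rather than the paper's dynamic-programming formulation; the network, base speed, and reduction rates are invented for illustration.

```python
import heapq

# Hypothetical velocity reduction rates per context; composite contexts are
# combined multiplicatively so the factor never reaches zero improperly.
REDUCTION = {"congestion": 0.4, "accident": 0.5, "rain": 0.2}
BASE_SPEED = 100.0  # km/h

edges = {  # node -> [(neighbor, distance_km, active contexts), ...]
    "A": [("B", 50.0, []), ("C", 40.0, ["congestion"])],
    "B": [("D", 60.0, ["rain"])],
    "C": [("D", 30.0, ["congestion", "accident"])],
    "D": [],
}

def travel_time(distance, contexts):
    """Edge cost in hours: distance over the context-reduced velocity."""
    factor = 1.0
    for c in contexts:
        factor *= 1.0 - REDUCTION[c]
    return distance / (BASE_SPEED * factor)

def shortest_time_path(edges, src, dst):
    """Dijkstra over context-adjusted travel times."""
    heap, seen = [(0.0, src, [src])], set()
    while heap:
        t, node, path = heapq.heappop(heap)
        if node == dst:
            return t, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, dist, ctx in edges[node]:
            if nxt not in seen:
                heapq.heappush(heap, (t + travel_time(dist, ctx), nxt, path + [nxt]))
    return float("inf"), []

t, path = shortest_time_path(edges, "A", "D")
print(path, round(t, 3))  # → ['A', 'B', 'D'] 1.25
```

Note that the distance-based shortest path here would be A-C-D (70 km versus 110 km), while the context-adjusted fastest path is A-B-D, which is exactly the kind of reversal the experiments above measure.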

Statistical Optimization of Culture Conditions of Probiotic Lactobacillus brevis SBB07 for Enhanced Cell Growth (프로바이오틱 Lactobacillus brevis SBB07의 균체량 증가를 위한 배양 조건 최적화)

  • Jeong, Su-Ji;Yang, Hee-Jong;Ryu, Myeong Seon;Seo, Ji Won;Jeong, Seong-Yeop;Jeong, Do-Youn
    • Journal of Life Science
    • /
    • v.28 no.5
    • /
    • pp.577-586
    • /
    • 2018
  • We recently reported the potential probiotic properties of Lactobacillus brevis SBB07 isolated from blueberries. The present study investigates the effect of culture conditions such as temperature, initial pH, culture time, and medium constituents for industrial application. The ingredients of the medium to improve cell growth were selected by Plackett-Burman design (PBD) and central composite design (CCD) within a desirable range. The PBD was applied with 19 factors: seven carbon sources, six nitrogen sources, and six microelements. Protease peptone, corn steep powder (CSP), and yeast extract were found to be significant factors for the growth of SBB07. The CCD was then applied at five levels to the three variables found from the PBD, and optimum values were determined for protease peptone, CSP, and yeast extract. For the growth of SBB07, the proposed optimal medium contained 2.0% protease peptone, 2.5% CSP, and 2.0% yeast extract, and the maximum dried-cell weight was predicted to be 2.93963 g/l. Model verification confirmed that the predicted and actual results were similar. Finally, the study investigated the effects of incubation temperature and initial pH on the optimized medium. It was confirmed that the dried-cell weight increased from $2.2933{\pm}0.0601g/l$ to $3.85{\pm}0.0265g/l$ compared to the basal medium at $37^{\circ}C$ and initial pH 8.0. Establishing the optimal culture conditions for SBB07 provides good potential for applications in probiotics and can serve as a foundation for the industrialization of materials.
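The run layout of a central composite design like the one used here can be generated mechanically; in the sketch below, the axial distance and the number of center points are illustrative choices, not the study's settings.

```python
from itertools import product

def ccd_points(k, alpha=1.682, n_center=6):
    """Central composite design in coded units: 2^k factorial corner points,
    2k axial (star) points at ±alpha, plus replicated center points."""
    factorial = [list(p) for p in product([-1.0, 1.0], repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = a
            axial.append(pt)
    center = [[0.0] * k for _ in range(n_center)]
    return factorial + axial + center

# Three coded factors, as for protease peptone, CSP, and yeast extract.
design = ccd_points(3)
print(len(design))  # → 20 runs: 8 factorial + 6 axial + 6 center
```

With alpha = 1.682 (about 2^(3/4)) the three-factor design is rotatable; each coded point is then mapped to an actual concentration range before the runs are performed and the quadratic response model is fitted.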