• Title/Summary/Keyword: Optimization Model


An Efficient Heuristic for Storage Location Assignment and Reallocation for Products of Different Brands at Internet Shopping Malls for Clothing (의류 인터넷 쇼핑몰에서 브랜드를 고려한 상품 입고 및 재배치 방법 연구)

  • Song, Yong-Uk;Ahn, Byung-Hyuk
    • Journal of Intelligence and Information Systems
    • /
    • v.16 no.2
    • /
    • pp.129-141
    • /
    • 2010
  • An Internet shopping mall for clothing operates a warehouse for packing and shipping products to fulfill its orders. All products in the warehouse are put into boxes by brand, and the boxes are stored in a row on shelves installed in the warehouse. To make picking and management easy, boxes of the same brand are placed side by side on the shelves. When new products arrive at the warehouse for storage, the products of a brand are put into boxes and those boxes are placed adjacent to the existing boxes of the same brand. If there is not enough space for the incoming boxes, however, some boxes of other brands must be moved away so that the incoming boxes can be placed adjacently in the resulting vacant spaces. We want to minimize the movement of existing boxes of other brands to other places on the shelves during the warehousing of incoming boxes, while keeping all boxes of the same brand side by side on the shelves. First, we define the adjacency of boxes by viewing the shelves as a one-dimensional series of storage spaces, i.e., cells, numbering the cells consecutively starting from one, and considering any two boxes to be adjacent if their cell numbers are consecutive. We then formulate the problem as an integer programming model to obtain an optimal solution. An integer programming formulation solved by Branch-and-Bound, however, may not be tractable, because it would take too long given the number of cells and boxes in the warehouse and the computing power available to the Internet shopping mall. As an alternative, we designed a fast heuristic for this reallocation problem that focuses only on the unused spaces (empty cells) on the shelves, which results in an assignment problem model. In this approach, the incoming boxes are assigned to empty cells and then reorganized so that the boxes of each brand are adjacent to each other. The objective of this approach is to minimize the movement of boxes during the reorganization while keeping the boxes of each brand adjacent. The approach, however, does not guarantee optimality for the original problem, that is, minimizing the movement of existing boxes while keeping boxes of the same brand adjacent. Even though the heuristic may produce a suboptimal solution, it obtains a satisfactory solution within a satisfactory time, which is acceptable to real-world experts. To assess the quality of the heuristic solutions, we randomly generate 100 problems, in which the number of cells ranges from 2,000 to 4,000, solve them with both our heuristic and the original integer programming approach using a commercial optimization software package, and compare the heuristic solutions with the corresponding optimal solutions in terms of solution time and the number of box movements. We also implement our heuristic in a storage location assignment system for the Internet shopping mall.
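
The empty-cell reallocation idea above maps naturally onto a classical assignment problem. The following is a minimal sketch of that core step in Python, assuming a simple distance-based movement cost and a known anchor cell for each brand's existing block; the cost definition, function names, and data layout are illustrative and not the authors' implementation.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def assign_boxes_to_empty_cells(empty_cells, incoming_brands, brand_anchor):
    """empty_cells: indices of vacant cells on the 1-D shelf.
    incoming_brands: brand label of each incoming box.
    brand_anchor: hypothetical dict, brand -> a cell inside that brand's block."""
    cost = np.empty((len(incoming_brands), len(empty_cells)))
    for i, brand in enumerate(incoming_brands):
        for j, cell in enumerate(empty_cells):
            # Assumed cost: distance from the vacant cell to the brand's block.
            cost[i, j] = abs(cell - brand_anchor.get(brand, cell))
    rows, cols = linear_sum_assignment(cost)    # Hungarian-type optimal assignment
    return [(box, empty_cells[cell]) for box, cell in zip(rows, cols)]
```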

Water Digital Twin for High-tech Electronics Industrial Wastewater Treatment System (II): e-ASM Calibration, Effluent Prediction, Process selection, and Design (첨단 전자산업 폐수처리시설의 Water Digital Twin(II): e-ASM 모델 보정, 수질 예측, 공정 선택과 설계)

  • Heo, SungKu;Jeong, Chanhyeok;Lee, Nahui;Shim, Yerim;Woo, TaeYong;Kim, JeongIn;Yoo, ChangKyoo
    • Clean Technology
    • /
    • v.28 no.1
    • /
    • pp.79-93
    • /
    • 2022
  • In this study, an electronics industrial wastewater activated sludge model (e-ASM), to be used as a Water Digital Twin, was calibrated against measurements from lab-scale and pilot-scale reactors treating real high-tech electronics industrial wastewater, and examined for its treatment performance, effluent quality prediction, and optimal process selection. For specialized modeling of a high-tech electronics industrial wastewater treatment system, the kinetic parameters of the e-ASM were identified by a sensitivity analysis and calibrated by the multiple response surface method (MRS). The calibrated e-ASM showed a compatibility of more than 90% with the experimental data from the lab-scale and pilot-scale processes. Four electronics industrial wastewater treatment processes (MLE, A2/O, 4-stage MLE-MBR, and Bardenpho-MBR) were implemented with the proposed Water Digital Twin to compare their removal efficiencies under various electronics industrial wastewater characteristics. Bardenpho-MBR stably removed more than 90% of the chemical oxygen demand (COD) and showed the highest nitrogen removal efficiency. Furthermore, an influent with a high TMAH concentration of 1,800 mg L-1 could be removed at 98% when the HRT of the Bardenpho-MBR process was more than 3 days. Hence, it is expected that the e-ASM in this study can be used as a Water Digital Twin platform with high compatibility in a variety of situations, including plant optimization, Water AI, and the selection of the best available technology (BAT) for a sustainable high-tech electronics industry.
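
As a rough illustration of response-surface calibration of kinetic parameters against measurements, the sketch below fits a quadratic surrogate of the simulation error and minimizes it; `run_easm`, the parameter bounds, and the single-response error metric are assumptions for illustration and do not reproduce the paper's multiple response surface (MRS) procedure.

```python
import numpy as np
from scipy.optimize import minimize

def calibrate(run_easm, observed, p_lo, p_hi, n_samples=60, seed=0):
    """run_easm(params) -> predicted effluent quality (hypothetical interface)."""
    p_lo, p_hi = np.asarray(p_lo, float), np.asarray(p_hi, float)
    rng = np.random.default_rng(seed)
    # 1) Sample the sensitive kinetic parameters within plausible ranges.
    samples = rng.uniform(p_lo, p_hi, size=(n_samples, p_lo.size))
    errors = np.array([np.mean((run_easm(p) - observed) ** 2) for p in samples])
    # 2) Fit a quadratic response surface to the simulation error.
    X = np.hstack([np.ones((n_samples, 1)), samples, samples ** 2])
    coef, *_ = np.linalg.lstsq(X, errors, rcond=None)
    k = p_lo.size
    surface = lambda p: coef[0] + coef[1:1 + k] @ p + coef[1 + k:] @ (p ** 2)
    # 3) Minimize the cheap surrogate instead of the full model.
    res = minimize(surface, x0=(p_lo + p_hi) / 2, bounds=list(zip(p_lo, p_hi)))
    return res.x
```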

Performance Optimization of Numerical Ocean Modeling on Cloud Systems (클라우드 시스템에서 해양수치모델 성능 최적화)

  • JUNG, KWANGWOOG;CHO, YANG-KI;TAK, YONG-JIN
    • The Sea:JOURNAL OF THE KOREAN SOCIETY OF OCEANOGRAPHY
    • /
    • v.27 no.3
    • /
    • pp.127-143
    • /
    • 2022
  • Recently, many attempts have been made to run numerical ocean models in cloud computing environments. A cloud computing environment can be an effective means of implementing numerical ocean models that require large-scale resources, or of quickly preparing a modeling environment for global or large-scale grids. Many commercial and private cloud computing systems provide technologies for High Performance Computing (HPC) such as virtualization, high-performance CPUs and instances, Ethernet-based high-performance networking, and remote direct memory access. These features facilitate ocean modeling experimentation on commercial cloud computing systems, and many scientists and engineers expect cloud computing to become mainstream in the near future. Analyzing the performance and features of commercial cloud services for numerical modeling is essential for selecting appropriate systems, as it can help minimize execution time and the amount of resources used. The effect of cache memory is large in the processing structure of an ocean numerical model, which handles input/output of data in multidimensional array structures, and network speed matters because of the communication patterns through which large amounts of data move. In this study, the performance of the Regional Ocean Modeling System (ROMS), the High Performance Linpack (HPL) benchmarking package, and the STREAM memory benchmark was evaluated and compared on commercial cloud systems to provide information for the transition of other ocean models to cloud computing. Through analysis of actual performance data and configuration settings obtained from virtualization-based commercial clouds, we evaluated the efficiency of the computing resources for various model grid sizes. We found that cache hierarchy and capacity are crucial to the performance of ROMS, which uses a large amount of memory, and that memory latency is also important. Increasing the number of cores to reduce the running time of numerical modeling is more effective with large grid sizes than with small ones. Our analysis results will be a helpful reference for constructing the best computing system in the cloud to minimize time and cost for numerical ocean modeling.
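
To make the memory-bandwidth aspect concrete, the snippet below probes sustained bandwidth with a STREAM-triad-like kernel in NumPy; it is only an illustrative stand-in for the official STREAM benchmark, and the array size is an assumed value chosen to exceed typical last-level caches.

```python
import time
import numpy as np

def triad_bandwidth(n=20_000_000, repeats=5):
    """Best-of-N timing of a STREAM-triad-like kernel: a = b + 2*c."""
    a = np.zeros(n)
    b = np.random.rand(n)
    c = np.random.rand(n)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        a[:] = b + 2.0 * c          # traverses three large arrays (plus a NumPy temporary)
        best = min(best, time.perf_counter() - t0)
    # Approximate: counts only the three 8-byte-double arrays per pass.
    return 3 * 8 * n / best / 1e9   # GB/s

if __name__ == "__main__":
    print(f"approximate triad bandwidth: {triad_bandwidth():.1f} GB/s")
```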

Classification of Carbon-Based Global Marine Eco-Provinces Using Remote Sensing Data and K-Means Clustering (K-Means Clustering 기법과 원격탐사 자료를 활용한 탄소기반 글로벌 해양 생태구역 분류)

  • Young Jun Kim;Dukwon Bae;Jungho Im;Sihun Jung;Minki Choo;Daehyeon Han
    • Korean Journal of Remote Sensing
    • /
    • v.39 no.5_3
    • /
    • pp.1043-1060
    • /
    • 2023
  • The acceleration of climate change in recent years has led to increased attention to 'blue carbon', the carbon captured by the ocean, yet our comprehension of marine ecosystems is still incomplete. This study classified and analyzed global marine eco-provinces using k-means clustering with carbon cycling in mind. We utilized five input variables over the past 20 years (2001-2020): Carbon-based Productivity Model (CbPM) net primary production (NPP), particulate inorganic and organic carbon (PIC and POC), sea surface salinity (SSS), and sea surface temperature (SST). A total of nine eco-provinces were classified through an optimization process, and the spatial distribution and environmental characteristics of each province were analyzed. Among them, five provinces showed characteristics of the open ocean, while four reflected characteristics of coastal and high-latitude regions. Furthermore, a qualitative comparison was conducted with previous studies of marine ecological zones to provide a detailed analysis of the features of the nine eco-provinces with respect to carbon cycling. Finally, we examined the changes in the nine eco-provinces over four past periods (2001-2005, 2006-2010, 2011-2015, and 2016-2020). Rapid changes in coastal ecosystems were observed; in particular, significant decreases were identified in the eco-provinces whose higher productivity is driven by large freshwater inflow. Our findings can serve as valuable reference material for marine ecosystem classification and coastal management that takes carbon cycling and ongoing climate change into account, and can also be employed in developing guidelines for the systematic management of coastal regions vulnerable to climate change.

Assessment of water supply reliability in the Geum River Basin using univariate climate response functions: a case study for changing instreamflow managements (단변량 기후반응함수를 이용한 금강수계 이수안전도 평가: 하천유지유량 관리 변화를 고려한 사례연구)

  • Kim, Daeha;Choi, Si Jung;Jang, Su Hyung;Kang, Dae Hu
    • Journal of Korea Water Resources Association
    • /
    • v.56 no.12
    • /
    • pp.993-1003
    • /
    • 2023
  • Due to increasing greenhouse gas emissions, the global mean temperature has risen by 1.1℃ compared to pre-industrial levels, and significant changes are expected in the functioning of water supply systems. In this study, we assessed the impacts of climate change and instreamflow management on water supply reliability in the Geum River basin, Korea. We proposed univariate climate response functions, in which mean precipitation and potential evaporation are coupled into a single explanatory variable, to assess the impacts of climate stress on multiple water supply reliabilities. To this end, natural streamflows were generated for the 19 sub-basins with the conceptual GR6J model. The simulated streamflows were then input into the Water Evaluation And Planning (WEAP) model. The dynamic optimization by WEAP allowed us to assess water supply reliability against the 2020 water demand projections. Results showed that when minimizing the water shortage of the entire river basin under the 1991-2020 climate, water supply reliability was lowest in the Bocheongcheon among the sub-basins. In a scenario where the priority of instreamflow maintenance is raised to equal that of municipal and industrial water use, water supply reliability in the Bocheongcheon, Chogang, and Nonsancheon sub-basins decreased significantly. The stress tests with 325 sets of climate perturbations showed that water supply reliability in these three sub-basins decreased considerably under all climate stresses, while the sub-basins connected to large infrastructure did not change significantly. When the 2021-2050 climate projections were combined with the stress test results, water supply reliability in the Geum River basin was expected to improve overall, but if the priority of instreamflow maintenance is raised, water shortages are expected to worsen in geographically isolated sub-basins. We suggest that a climate response function built on a single explanatory variable can be used to assess climate change impacts on the performance of many sub-basins simultaneously.
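
The univariate climate response function can be pictured as a single-variable regression of water supply reliability on a coupled climate index. The sketch below uses P - PET as the coupled explanatory variable and a quadratic fit; both choices are assumptions for illustration and may differ from the functional form used in the study.

```python
import numpy as np

def fit_response_function(P, PET, reliability, degree=2):
    """P, PET: mean precipitation and potential evaporation per climate perturbation;
    reliability: simulated water supply reliability for the same perturbations."""
    x = np.asarray(P) - np.asarray(PET)          # coupled explanatory variable (assumed form)
    coefs = np.polyfit(x, reliability, degree)   # least-squares polynomial fit
    return np.poly1d(coefs)

# Usage sketch: project reliability under a future climate with the fitted function.
# f = fit_response_function(P_stress, PET_stress, reliability_stress)
# projected_reliability = f(P_future - PET_future)
```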

Contrast Media in Abdominal Computed Tomography: Optimization of Delivery Methods

  • Joon Koo Han;Byung Ihn Choi;Ah Young Kim;Soo Jung Kim
    • Korean Journal of Radiology
    • /
    • v.2 no.1
    • /
    • pp.28-36
    • /
    • 2001
  • Objective: To provide a systematic overview of the effects of various parameters on contrast enhancement within the same population, an animal experiment as well as a computer-aided simulation study was performed. Materials and Methods: In the animal experiment, single-level dynamic CT through the liver was performed at 5-second intervals just after the injection of contrast medium for 3 minutes. Combinations of three different amounts (1, 2, 3 mL/kg), concentrations (150, 200, 300 mgI/mL), and injection rates (0.5, 1, 2 mL/sec) were used. The CT number of the aorta (A), portal vein (P) and liver (L) was measured in each image, and time-attenuation curves for A, P and L were thus obtained. The degree of maximum enhancement (Imax) and the time to reach peak enhancement (Tmax) of A, P and L were determined, and the times to equilibrium (Teq) were analyzed. In the computer-aided simulation model, a program based on the amount, flow, and diffusion coefficient of body fluid in various compartments of the human body was designed. The input variables were the concentrations, volumes and injection rates of the contrast media used. The program generated the time-attenuation curves of A, P and L, as well as liver-to-hepatocellular carcinoma (HCC) contrast curves. On each curve, we calculated and plotted the optimal temporal window (the time period above the lower threshold, which in this experiment was 10 Hounsfield units), the total area under the curve above the lower threshold, and the area within the optimal range. Results: A. Animal experiment: At a given concentration and injection rate, an increased volume of contrast medium led to increases in Imax of A, P and L. In addition, Tmax of A, P and L, as well as Teq, were prolonged in parallel with increases in injection time, and the time-attenuation curve shifted upward and to the right. For a given volume and injection rate, an increased concentration of contrast medium increased the degree of aortic, portal and hepatic enhancement, though Tmax of A, P and L remained the same; the time-attenuation curve shifted upward. For a given volume and concentration of contrast medium, changes in the injection rate had a prominent effect on aortic enhancement; portal venous and hepatic parenchymal enhancement also increased, though less prominently. An increase in the rate of contrast injection shifted the time-attenuation curve to the left and upward. B. Computer simulation: At a faster injection rate, there was minimal change in the degree of hepatic attenuation, though the duration of the optimal temporal window decreased. The area between 10 and 30 HU was greatest when contrast medium was delivered at a rate of 2-3 mL/sec. Although the total area under the curve increased in proportion to the injection rate, most of this increase was above the upper threshold, so the temporal window narrowed and the optimal area decreased. Conclusion: Increases in volume, concentration and injection rate all resulted in improved arterial enhancement. If cost is disregarded, increasing the injection volume is the most reliable way of obtaining good-quality enhancement. The optimal way of delivering a given amount of contrast medium can be calculated using a computer-based mathematical model.
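
As an illustration of the temporal-window analysis described for the simulated curves, the sketch below computes the duration above the 10 HU lower threshold, the area within the 10-30 HU range, and the total area above the lower threshold from a sampled liver-to-HCC contrast curve; the function and variable names are hypothetical.

```python
import numpy as np

def temporal_window(t_sec, contrast_hu, lower=10.0, upper=30.0):
    """t_sec, contrast_hu: sampled liver-to-HCC contrast curve."""
    t = np.asarray(t_sec, dtype=float)
    c = np.asarray(contrast_hu, dtype=float)
    above = c >= lower
    window = t[above][-1] - t[above][0] if above.any() else 0.0   # seconds above 10 HU
    clipped = np.clip(c, lower, upper) - lower                    # contribution within 10-30 HU
    optimal_area = np.trapz(np.where(above, clipped, 0.0), t)
    total_area = np.trapz(np.where(above, c - lower, 0.0), t)     # total area above 10 HU
    return window, optimal_area, total_area
```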


Verification of Gated Radiation Therapy: Dosimetric Impact of Residual Motion (여닫이형 방사선 치료의 검증: 잔여 움직임의 선량적 영향)

  • Yeo, Inhwan;Jung, Jae Won
    • Progress in Medical Physics
    • /
    • v.25 no.3
    • /
    • pp.128-138
    • /
    • 2014
  • In gated radiation therapy (gRT), residual motion causes beam delivery to irradiate not only the true extent of disease but also neighboring normal tissues. It is desired that the delivery cover the true extent (i.e., the clinical target volume, or CTV) as a minimum, even though the target moves during dose delivery. The objectives of our study are to validate whether the intended dose is actually delivered to the true target in gRT and to quantitatively understand the trend of dose delivery to the target and the neighboring normal tissues as the gating window (GW), motion amplitude (MA), and CTV size change. To fulfill these objectives, experimental and computational studies were designed and performed. A custom-made phantom with rectangle- and pyramid-shaped targets (CTVs) on a moving platform was scanned for four-dimensional imaging. Various GWs were selected and image integration was performed to generate planning targets (internal target volumes, or ITVs) that included the CTVs and internal margins (IMs). Planning was done conventionally for the rectangular target and with IMRT optimization for the pyramid target. Dose evaluation was then performed on a diode array aligned perpendicular to the gated beams, through measurements and computational modeling of dose delivery under motion. This study quantitatively demonstrated and analytically interpreted the impact of residual motion, including penumbral broadening for both targets, perturbed but secured dose coverage of the CTV, and significant doses delivered to the neighboring normal tissues. Dose-volume histogram analyses also demonstrated and interpreted the trend of dose coverage: for the ITV, coverage increased as GW or MA decreased or as CTV size increased; for the IM, it increased as GW or MA decreased; for the neighboring normal tissue, the opposite trend to that of the IM was observed. This study provides a clear understanding of the impact of residual motion and shows that, if breathing is reproducible, gRT is secure despite discontinuous delivery and target motion. The procedures and computational model can be used for commissioning, routine quality assurance, and patient-specific validation of gRT. More work needs to be done on patient-specific dose reconstruction on CT images.
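
A common way to model residual motion computationally is to blur a static dose profile with the probability density of target position inside the gating window. The sketch below assumes a uniform residual-motion PDF for illustration; the paper's computational model of dose delivery under motion may differ.

```python
import numpy as np

def blur_dose_with_residual_motion(dose_profile, pixel_mm, residual_amplitude_mm):
    """dose_profile: static 1-D dose along the motion axis (arbitrary units)."""
    half_width = max(int(round(residual_amplitude_mm / pixel_mm)), 0)
    kernel = np.ones(2 * half_width + 1)
    kernel /= kernel.sum()                      # uniform motion PDF inside the GW
    return np.convolve(dose_profile, kernel, mode="same")
```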

The Analysis on the Relationship between Firms' Exposures to SNS and Stock Prices in Korea (기업의 SNS 노출과 주식 수익률간의 관계 분석)

  • Kim, Taehwan;Jung, Woo-Jin;Lee, Sang-Yong Tom
    • Asia pacific journal of information systems
    • /
    • v.24 no.2
    • /
    • pp.233-253
    • /
    • 2014
  • Can the stock market really be predicted? Stock market prediction has attracted much attention from many fields, including business, economics, statistics, and mathematics. Early research on stock market prediction was based on random walk theory (RWT) and the efficient market hypothesis (EMH). According to the EMH, stock markets are largely driven by new information rather than by present and past prices; since such information is unpredictable, stock prices will follow a random walk. Despite these theories, Schumaker [2010] noted that people keep trying to predict the stock market using artificial intelligence, statistical estimates, and mathematical models. Mathematical approaches include percolation methods, log-periodic oscillations, and wavelet transforms to model future prices. Examples of artificial intelligence approaches, which deal with optimization and machine learning, are genetic algorithms, support vector machines (SVM), and neural networks. Statistical approaches typically predict the future using past stock market data. Recently, financial engineers have started to predict stock price movement patterns using SNS data. SNS is a place where people's opinions and ideas flow freely and affect others' beliefs. Through word-of-mouth in SNS, people share product usage experiences, subjective feelings, and the accompanying sentiment or mood with others. An increasing number of empirical analyses of sentiment and mood are based on textual collections of public user-generated data on the web. Opinion mining is a data mining domain that extracts public opinions exposed in SNS. There have been many studies on opinion mining from Web sources such as product reviews, forum posts, and blogs. In relation to this literature, we try to understand the effects of firms' SNS exposure on stock prices in Korea. Similarly to Bollen et al. [2011], we empirically analyze the impact of SNS exposure on stock return rates. We use Social Metrics by Daum Soft, an SNS big data analysis company in Korea. Social Metrics provides trends and public opinions in Twitter and blogs using natural language processing and analysis tools. It collects the sentences circulated on Twitter in real time, breaks them down into word units, and then extracts keywords. In this study, we classify firms' SNS exposures into two groups: positive and negative. To test the correlation and causal relationship between SNS exposures and stock price returns, we first collect 252 firms' stock prices and the KRX100 index in the Korea Stock Exchange (KRX) from May 25, 2012 to September 1, 2012. We also gather the public attitudes (positive, negative) toward these firms from Social Metrics over the same period. We conduct regression analysis between stock prices and the number of SNS exposures. Having checked the correlation between the two variables, we perform a Granger causality test to see the direction of causation between them. The results show that the number of total SNS exposures is positively related to stock market returns. The number of positive mentions also has a positive relationship with stock market returns. Conversely, the number of negative mentions has a negative relationship with stock market returns, but this relationship is not statistically significant. This means that the impact of positive mentions is statistically larger than the impact of negative mentions. We also investigate whether the impacts are moderated by industry type and firm size. We find that the SNS exposure impacts are larger for IT firms than for non-IT firms, and larger for small firms than for large firms. The Granger causality test shows that changes in stock price returns are caused by SNS exposures, while causation in the other direction is not significant. Therefore, the relationship between SNS exposures and stock prices has a uni-directional causality: the more a firm is exposed in SNS, the more likely its stock price is to increase, while stock price changes may not cause more SNS mentions.
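
The two statistical steps described above, a regression of returns on SNS exposure counts followed by a Granger causality test, can be sketched as follows; the column names and daily alignment of the series are assumptions for illustration.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.stattools import grangercausalitytests

def sns_vs_returns(returns: pd.Series, mentions: pd.Series, max_lag=3):
    """returns: daily stock return rates; mentions: daily SNS exposure counts."""
    df = pd.concat({"ret": returns, "sns": mentions}, axis=1).dropna()
    # 1) Contemporaneous regression of returns on SNS exposure counts.
    ols = sm.OLS(df["ret"], sm.add_constant(df["sns"])).fit()
    # 2) Granger test: do lagged SNS mentions help predict returns?
    #    (the second column is tested as the cause of the first column)
    granger = grangercausalitytests(df[["ret", "sns"]], maxlag=max_lag)
    return ols, granger
```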

Optimization of Conditions for the Microencapsulation of α-Tocopherol and Its Storage Stability (α-Tocopherol 미세캡슐화의 최적화 및 저장안정성 규명)

  • Chang, Pahn-Shick;Ha, Jae-Seok;Roh, Hoe-Jin;Choi, Jin-Hwan
    • Korean Journal of Food Science and Technology
    • /
    • v.32 no.4
    • /
    • pp.843-850
    • /
    • 2000
  • We produced microcapsules composed of α-tocopherol as the core material (Cm) and gelatinized polysaccharide as the wall material (Wm). First, we developed a simple, sensitive, and quantitative analysis method for the microencapsulated product using 5% cupric acetate pyridine solution. We then optimized the conditions of the microencapsulation process, namely the ratio of [Cm] to [Wm], the temperature of the dispersion fluid, and the emulsifier concentration, using response surface methodology (RSM). For the microencapsulation of α-tocopherol, the regression model for the yield of microencapsulation (YM, %) as a function of the independent variable was predicted as follows: YM = 99.77 - 1.76([Cm]:[Wm]) - 1.72([Cm]:[Wm])². From the ridge of maximum response, the optimum conditions for the microencapsulation of α-tocopherol were determined to be a [Cm] to [Wm] ratio of 4.6:5.4 (w/w), an emulsifier concentration of 0.49%, and a dispersion fluid temperature of 25.5°C. Finally, the microcapsules produced under the optimal conditions were analyzed for storage stability. The optimal storage conditions were found to be pH 9.0 and 25-35°C, and the storage stability of the microcapsules containing α-tocopherol was higher than 99% for a week at pH 9.0 and 25°C.
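
As a small worked example, the stationary point of the reported second-order model YM = 99.77 - 1.76x - 1.72x² (x being the coded [Cm]:[Wm] variable) can be located analytically; mapping this coded optimum back to the physical 4.6:5.4 ratio depends on the coding of the RSM design, which is not reproduced here.

```python
# Fitted model: YM = 99.77 - 1.76*x - 1.72*x**2, x = coded [Cm]:[Wm] variable
b0, b1, b2 = 99.77, -1.76, -1.72
x_star = -b1 / (2 * b2)                 # dYM/dx = b1 + 2*b2*x = 0
ym_star = b0 + b1 * x_star + b2 * x_star ** 2
print(f"coded stationary point x* = {x_star:.3f}, model value YM(x*) = {ym_star:.2f}%")
```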


Preliminary Study on the Development of a Platform for the Optimization of Beach Stabilization Measures Against Beach Erosion III - Centering on the Effects of Random Waves Occurring During the Unit Observation Period, and Infra-Gravity Waves of Bound Mode, and Boundary Layer Streaming on the Sediment Transport (해역별 최적 해빈 안정화 공법 선정 Platform 개발을 위한 기초연구 III - 단위 관측 기간에 발생하는 불규칙 파랑과 구속모드의 외중력파, 경계층 Streaming이 횡단표사에 미치는 영향을 중심으로)

  • Chang, Pyong Sang;Cho, Yong Jun
    • Journal of Korean Society of Coastal and Ocean Engineers
    • /
    • v.31 no.6
    • /
    • pp.434-449
    • /
    • 2019
  • In this study, we develop a new cross-shore sediment transport module that takes into account the effects of infra-gravity waves of bound mode and boundary-layer streaming on sediment transport, in addition to the well-known wave asymmetry and undertow. In doing so, the effect of the individual random waves occurring during the unit observation period of 1 hr on sediment transport is also fully taken into account. To demonstrate how the individual random waves affect sediment transport, we numerically simulate the nonlinear shoaling process of random waves over a beach of uniform slope. Numerical results show that with the frequency-consistent Boussinesq equations, whose application has recently been extended to the surf zone, we can accurately simulate the saw-tooth profile observed without exception over the surf zone, the infra-gravity waves of bound mode, and the boundary-layer streaming. It is also shown that when the yearly highest random waves are modeled as equivalent nonlinear uniform waves, the maximum cross-shore transport rate exceeds the rate obtained when the randomness is fully taken into account by as much as a factor of three. In addition, in order to optimize the free parameter K in the long-shore sediment module, we carry out a numerical simulation tracing the yearly shoreline change of Mang-Bang beach from 2017.4.26 to 2018.4.20 and optimize K by comparing the simulated shoreline change with the measured one. Numerical results show that the optimized K for Mang-Bang beach is 0.17. With K = 0.17, the yearly grand circulation process, comprising severe erosion by the consecutively occurring yearly highest waves at the end of October and gradual recovery over the winter and spring by swell, is successfully reproduced in the numerical simulation: the shoreline advances by 18 m at the northern and southern ends of Mang-Bang beach and retreats by 2.4 m at the middle.
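
The calibration of the free parameter K can be sketched as a one-dimensional fit that minimizes the mismatch between simulated and measured yearly shoreline change; `simulate_shoreline_change` below is a hypothetical stand-in for the full model driven by the long-shore sediment module.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def calibrate_K(simulate_shoreline_change, measured_change, k_bounds=(0.05, 0.5)):
    """simulate_shoreline_change(K) -> simulated yearly shoreline change (hypothetical)."""
    def rmse(K):
        predicted = np.asarray(simulate_shoreline_change(K))
        return float(np.sqrt(np.mean((predicted - measured_change) ** 2)))
    res = minimize_scalar(rmse, bounds=k_bounds, method="bounded")
    return res.x   # the paper reports K = 0.17 for Mang-Bang beach
```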