• Title/Abstract/Keywords: Realistic model experiments

Search results: 94 items

Fundamentals of Particle Fouling in Membrane Processes

  • Bhattacharjee Subir;Hong Seungkwan
    • Korean Membrane Journal / Vol. 7, No. 1 / pp. 1-18 / 2005
  • The permeate flux decline due to membrane fouling can be addressed from a variety of theoretical standpoints. Judicious selection of an appropriate theory is a key toward successful prediction of the permeate flux. The essential criterion for such a decision appears to be a detailed characterization of the feed solution and membrane properties. Modern theories are capable of accurately predicting several properties of colloidal systems that are important in membrane separation processes from fundamental information pertaining to the particle size, charge, and solution ionic strength. Based on such information, it is relatively straightforward to determine the properties of the concentrated colloidal dispersion in a polarized layer or the cake layer properties. Incorporation of such information in the framework of the standard theories of membrane filtration, namely, the convective diffusion equation coupled with an appropriate permeate transport model, can lead to reasonably accurate prediction of the permeate flux due to colloidal fouling. The schematic of the essential approach has been delineated in Figure 5. The modern approaches based on appropriate cell models appear to predict the permeate flux behavior in crossflow membrane filtration processes quite accurately without invoking novel theoretical descriptions of particle back transport mechanisms or depending on adjustable parameters. Such agreements have been observed for a wide range of particle sizes, ranging from small proteins like BSA (diameter ${\sim}6$ nm) to latex suspensions (diameter ${\sim}1\;{\mu}m$). There are, however, several areas that need further exploration. Some of these include: 1) A clear mechanistic description of the cake formation mechanisms that clearly identifies the disorder-to-order transition point in different colloidal systems. 2) Determining the structure of a cake layer based on the interparticle and hydrodynamic interactions instead of assuming a fixed geometrical structure on the basis of cell models. 3) Performing well controlled experiments where the cake deposition mechanism can be observed for small colloidal particles (< $1\;{\mu}m$). 4) A clear mechanistic description of the critical operating conditions (for instance, critical pressure) which can minimize the propensity of colloidal membrane fouling. 5) Developing theoretical approaches to account for polydisperse systems that can render the models capable of handling realistic feed solutions typically encountered in diverse applications of membrane filtration.
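As a rough, self-contained illustration of how a permeate-transport model produces flux-decline curves of the kind discussed above, the sketch below uses a simple resistance-in-series law with a growing cake layer (constant-pressure, dead-end filtration). It is not the cell-model formulation the abstract refers to, and every parameter value is an illustrative placeholder.

```python
# Minimal sketch (not the authors' cell model): constant-pressure dead-end
# filtration with a resistance-in-series flux law and a growing cake layer.
# All parameter values are illustrative placeholders.
import numpy as np

dP    = 1.0e5      # transmembrane pressure [Pa]
mu    = 1.0e-3     # permeate viscosity [Pa s]
Rm    = 1.0e12     # clean-membrane resistance [1/m]
alpha = 1.0e13     # specific cake resistance [m/kg]
cb    = 0.5        # bulk particle concentration [kg/m^3]

dt, t_end = 1.0, 3600.0          # time step and duration [s]
mc = 0.0                         # deposited cake mass per unit area [kg/m^2]
fluxes = []

for t in np.arange(0.0, t_end, dt):
    Rc = alpha * mc              # cake resistance grows with deposited mass
    J  = dP / (mu * (Rm + Rc))   # Darcy-type permeate flux [m/s]
    mc += J * cb * dt            # all convected particles assumed to deposit
    fluxes.append(J)

print(f"initial flux {fluxes[0]:.3e} m/s, flux after 1 h {fluxes[-1]:.3e} m/s")
```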

Dynamics of a Globular Protein and Its Hydration Water Studied by Neutron Scattering and MD Simulations

  • Kim, Chan-Soo;Chu, Xiang-Qiang;Lagi, Marco;Chen, Sow-Hsin;Lee, Kwang-Ryeol
    • 한국진공학회:학술대회논문집 / 한국진공학회 2011년도 제40회 동계학술대회 초록집 / pp. 21-21 / 2011
  • A series of Quasi-Elastic Neutron Scattering (QENS) experiments helps us to understand the single-particle (hydrogen atom) dynamics of a globular protein and its hydration water and the strong coupling between them. We also performed Molecular Dynamics (MD) simulations on a realistic model of the hydrated hen-egg Lysozyme powder having two proteins in the periodic box. By analyzing the Intermediate Scattering Function (ISF), we found the existence of a Fragile-to-Strong dynamic Crossover (FSC) phenomenon in the hydration water around the protein occurring at TL=$225{\pm}5K$. On lowering the temperature toward the FSC, the structure of the hydration water makes a transition from predominantly the High Density Liquid (HDL) form, a more fluid state, to predominantly the Low Density Liquid (LDL) form, a less fluid state, a transition that derives from the existence of a liquid-liquid critical point at an elevated pressure. We showed experimentally and confirmed theoretically that this sudden switch in the mobility of the hydration water around the protein triggers the dynamic transition (the so-called glass transition) of the protein at a temperature TD=220 K. The Mean Square Displacement (MSD) is the key quantity showing that the FSC underlies the strong coupling between the protein and its hydration water, since it suggests TL${\fallingdotseq}$TD. MD simulations with the TIP4P force field for water were performed to understand how the FSC temperature depends on the hydration level. We added water molecules to raise the hydration level of the protein hydration water from 0.30 to 0.45, 0.60, and 1.00 (1.00 being bulk water). These simulations confirm the existence of the FSC and the hydration-level dependence of the FSC temperature: the FSC temperature decreases with increasing hydration level. We also compared the hydration water around Lysozyme, B-DNA, and RNA; the similarity among them suggests that the FSC and this coupling are universal for globular proteins and biopolymers.
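The MSD analysis mentioned above can be pictured with a short sketch. The snippet below assumes a hypothetical NumPy file of unwrapped hydrogen-atom coordinates and simply averages squared displacements over atoms and time origins; it illustrates the quantity itself, not the authors' analysis pipeline.

```python
# Illustrative sketch only: mean square displacement (MSD) of hydrogen atoms
# from an MD trajectory stored as an array of shape (n_frames, n_atoms, 3).
# The file name and units are hypothetical placeholders.
import numpy as np

traj = np.load("hydrogen_positions_nm.npy")       # hypothetical unwrapped coordinates
n_frames = traj.shape[0]
max_lag = n_frames // 2

lags = np.arange(1, max_lag)
msd = np.empty(len(lags))
for i, lag in enumerate(lags):
    disp = traj[lag:] - traj[:-lag]                # displacements over this lag time
    msd[i] = np.mean(np.sum(disp**2, axis=-1))     # average over atoms and time origins

# Repeating this analysis across a temperature scan would reveal a crossover in
# how the long-time MSD depends on temperature, i.e. the FSC discussed above.
print(msd[:5])
```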


표적 할당 및 사격순서결정문제를 위한 최적해 알고리즘 연구 (Exact Algorithm for the Weapon Target Assignment and Fire Scheduling Problem)

  • 차영호;정봉주
    • 산업경영시스템학회지 / Vol. 42, No. 1 / pp. 143-150 / 2019
  • We focus on the weapon target assignment and fire scheduling problem (WTAFSP) with the objective of minimizing the makespan, i.e., the latest completion time of a given set of firing operations. In this study, we assume that there are m available weapons to fire at n targets (n > m). The artillery attack operation consists of a two-step sequential procedure: assignment of weapons to the targets, and scheduling of the firing operations against the targets assigned to each weapon. This problem is a combination of the weapon target assignment problem (WTAP) and the fire scheduling problem (FSP). To solve this problem, we define it with a mixed integer programming model. Then, we develop exact algorithms based on a dynamic programming technique. We also suggest how to find lower and upper bounds for a given problem. To evaluate the performance of the developed exact algorithms, computational experiments are performed on randomly generated problems. The results show that the suggested exact algorithm solves problems of medium size within a reasonable amount of computation time, but that the required computation time increases rapidly as the problem size grows. We report the results with analysis and give directions for future research. This study is meaningful in that it suggests an exact algorithm for a more realistic problem than those of existing studies, and it can provide a basis for developing algorithms that can solve larger problems.
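To make the assignment-plus-makespan structure concrete, the toy sketch below brute-forces the assignment of targets to weapons and scores each assignment by the largest total firing time on any weapon. It deliberately ignores the sequencing aspects of the real WTAFSP and uses made-up firing times; it is not the paper's dynamic-programming algorithm.

```python
# Toy brute-force sketch (not the paper's exact algorithm): assign each of n
# targets to one of m weapons and take the makespan as the largest total
# firing time on any weapon. Firing times below are hypothetical.
from itertools import product

fire_time = [   # fire_time[w][t]: time for weapon w to engage target t
    [3, 2, 4, 2],
    [2, 3, 3, 5],
]
m, n = len(fire_time), len(fire_time[0])

best_makespan, best_assignment = float("inf"), None
for assignment in product(range(m), repeat=n):       # weapon chosen for each target
    loads = [0] * m
    for target, weapon in enumerate(assignment):
        loads[weapon] += fire_time[weapon][target]
    makespan = max(loads)
    if makespan < best_makespan:
        best_makespan, best_assignment = makespan, assignment

print(best_makespan, best_assignment)                # minimal latest completion time
```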

Adaptive Learning Path Recommendation based on Graph Theory and an Improved Immune Algorithm

  • BIAN, Cun-Ling;WANG, De-Liang;LIU, Shi-Yu;LU, Wei-Gang;DONG, Jun-Yu
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 13, No. 5 / pp. 2277-2298 / 2019
  • Adaptive learning in e-learning has garnered researchers' interest, because learning resources can be recommended automatically to achieve a personalized learning experience. There are various ways to realize it; one of the realistic ways is adaptive learning path recommendation, in which learning resources are provided according to learners' requirements. This paper summarizes existing works and proposes an innovative approach. Firstly, a learner-centred concept map is created using graph theory based on the features of the learners and concepts. Then, the approach generates a linear concept sequence from the concept map using the proposed traversal algorithm. Finally, Learning Objects (LOs), which are the smallest concrete units that make up a learning path, are organized based on the concept sequences. To realize this step, we model it as a multi-objective combinatorial optimization problem, and an improved immune algorithm (IIA) is proposed to solve it. In the experimental stage, a series of simulated experiments is conducted on nine datasets with different levels of complexity. The results show that the proposed algorithm increases computational efficiency and effectiveness. Moreover, an empirical study is carried out to validate the proposed approach from a pedagogical point of view. Compared with a self-selection based approach and other evolutionary-algorithm-based approaches, the proposed approach produces better outcomes in terms of learners' homework, final exam grades and satisfaction.
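The immune-algorithm step can be pictured with a generic clonal-selection skeleton. The sketch below selects a subset of candidate Learning Objects for a single concept against a made-up fitness function; the data, operators, and parameters are all illustrative and do not reproduce the paper's IIA.

```python
# Generic clonal-selection sketch (not the paper's IIA): pick a subset of
# candidate Learning Objects so that total difficulty stays near a learner's
# target while coverage is maximised. All data and parameters are made up.
import random

random.seed(0)
difficulty = [0.2, 0.5, 0.3, 0.8, 0.6, 0.4]   # hypothetical LO attributes
coverage   = [0.3, 0.6, 0.4, 0.9, 0.7, 0.5]
target_difficulty = 1.2

def fitness(sel):
    d = sum(difficulty[i] for i, s in enumerate(sel) if s)
    c = sum(coverage[i]   for i, s in enumerate(sel) if s)
    return c - abs(d - target_difficulty)      # reward coverage, penalise mismatch

def mutate(sel, rate):
    return [1 - g if random.random() < rate else g for g in sel]

pop = [[random.randint(0, 1) for _ in difficulty] for _ in range(20)]
for _ in range(100):                           # clone the best, hypermutate, reselect
    pop.sort(key=fitness, reverse=True)
    elites = pop[:5]
    clones = [mutate(e, rate=0.1 * (rank + 1))
              for rank, e in enumerate(elites) for _ in range(4)]
    pop = elites + sorted(clones, key=fitness, reverse=True)[:15]

best = max(pop, key=fitness)
print(best, fitness(best))
```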

A study on the Performance of Hybrid Normal Mapping Techniques for Real-time Rendering

  • ZhengRan Liu;KiHong Kim;YuanZi Sang
    • International journal of advanced smart convergence / Vol. 12, No. 4 / pp. 361-369 / 2023
  • Achieving realistic visual quality while maintaining optimal real-time rendering performance is a major challenge in evolving computer graphics and interactive 3D applications. Normal mapping, as a core technology in 3D, has matured through continuous optimization and iteration. Hybrid normal mapping, as a newer blending model, has also made significant progress and has been applied in the 3D asset production pipeline. This study comprehensively explores hybrid normal techniques, analyzing Linear Blending, Overlay Blending, Whiteout Blending, UDN Blending, and Reoriented Normal Mapping, and focuses on how the various hybrid normal techniques can be used to balance rendering performance and visual fidelity. Considering computational efficiency, visual coherence, and adaptability in different 3D production scenes, we design comparative experiments to identify the optimal choices among the hybrid normal techniques by analyzing the code, measuring the performance of the different hybrid normal maps in the engine, and comparing the resulting data. The purpose of this research and summary of hybrid normal technology is to identify the most suitable choice for mainstream workflows based on objective reality. We summarize the hybrid normal mapping techniques and experimentally establish the advantages and disadvantages of each, so that practitioners can reasonably choose which hybrid normal mapping technique to apply to a given project.
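For reference, several of the blending formulas named above have widely published shader forms. The sketch below restates them in NumPy for unpacked tangent-space normals (Overlay blending, which operates on packed RGB values, is omitted); these are the commonly cited formulas, not code taken from the study.

```python
# Commonly published normal-blending formulas, restated in NumPy for unpacked
# tangent-space normals in [-1, 1]. Illustrative only; not the paper's code.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def linear_blend(n1, n2):
    return normalize(n1 + n2)

def udn_blend(n1, n2):
    return normalize(np.array([n1[0] + n2[0], n1[1] + n2[1], n1[2]]))

def whiteout_blend(n1, n2):
    return normalize(np.array([n1[0] + n2[0], n1[1] + n2[1], n1[2] * n2[2]]))

def rnm_blend(n1, n2):
    # Reoriented Normal Mapping: rotate the detail normal into the base normal's frame.
    t = n1 + np.array([0.0, 0.0, 1.0])
    u = n2 * np.array([-1.0, -1.0, 1.0])
    return normalize(t * np.dot(t, u) / t[2] - u)

base   = normalize(np.array([0.2, 0.1, 0.97]))
detail = normalize(np.array([-0.1, 0.3, 0.95]))
for f in (linear_blend, udn_blend, whiteout_blend, rnm_blend):
    print(f.__name__, f(base, detail))
```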

미시역학을 이용한 사질토의 이방적 탄성 변형 특성의 해석 (Micromechanical Analysis on Anisotropic Elastic Deformation of Granular Soils)

  • 정충기;정영훈
    • 한국지반공학회논문집 / Vol. 20, No. 5 / pp. 99-107 / 2004
  • The anisotropic deformation characteristics of soil are among the key properties for accurately understanding its pre-failure deformation behavior. Recent experimental studies have shown that the anisotropic elastic moduli observed in granular soils can be described by cross-anisotropic elasticity theory, and that the normal elastic modulus in each direction can be expressed as a power function of the normal stress acting in that direction. This anisotropy of the elastic moduli of granular soils is closely related to the micromechanical characteristics of the particles. Because a granular soil is a particulate assembly composed of numerous particles, the force-displacement relationship at the inter-particle contacts governs the macroscopic stress-strain relationship of the assembly. A micromechanical approach that interprets the deformation of granular soil in terms of inter-particle interactions is therefore one of the best ways to study the anisotropic deformation characteristics of soil. In this study, a numerical analysis program that predicts the anisotropic elastic deformation characteristics of soil was developed on the basis of micromechanical theory. A contact model that can simply represent the irregular contact conditions of real soil particles is proposed. An analytical solution is derived for determining the parameters required by the micromechanical model from the macroscopic elastic stress-strain relationships obtainable from conventional mechanical tests such as triaxial tests, and a concrete procedure is presented for obtaining the model parameters using this analytical solution together with the macroscopic elastic moduli measured in laboratory tests.
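The stress dependence noted above is commonly written as a power law. One typical empirical form (shown only as an illustration; the exact expression used in the paper may differ) is $E_i = C_E \, f(e) \, p_a \left( \sigma_i / p_a \right)^{n}$, where $E_i$ is the elastic modulus in direction $i$, $\sigma_i$ the normal stress acting in that direction, $p_a$ a reference pressure, $f(e)$ a void-ratio function, and $C_E$ and $n$ material constants.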

수자원 영향평가에 활용 가능한 지역기후변화 시나리오 연구 (A study on the regional climate change scenario for impact assessment on water resources)

  • 임은순;권원태;배덕효
    • 한국수자원학회논문집 / Vol. 39, No. 12 / pp. 1043-1056 / 2006
  • To understand the climate change caused by increasing greenhouse gases and to provide detailed climate information to various impact-assessment fields, a regional climate change scenario study based on greenhouse gas emission scenarios was carried out. In this study, a double-nested grid system using a regional climate model was constructed for dynamical downscaling, and scenarios were produced for a past 30-year period (1971-2000) and a future 30-year period (2021-2050). To establish confidence in the future scenario, the reference scenario was first validated against observations. The reference scenario successfully reproduced the seasonal and interannual variability and the daily frequency distributions of temperature and precipitation, not only over the Korean Peninsula in the nested grid but also over the East Asian domain of the mother grid. Compared with the global model used for the boundary conditions, it also showed a clear reduction of error both in the spatial characteristics and in the area-averaged time series. The future climate change projection is derived from the difference between the reference and future scenarios, and changes in the frequency and intensity of climate extremes as well as the mean changes were analyzed. According to the future scenario, a temperature rise of about $2^{\circ}C$ and a pronounced increasing trend in winter precipitation are projected for the Korean Peninsula by 2050. The significance of this study is that it presents a methodology for producing high-resolution scenarios that reflect the complex topographic characteristics of the Korean Peninsula and that it characterizes the climate at various spatial and temporal scales using the produced scenarios.

DNN을 이용한 응시 깊이 보정 (Correcting the gaze depth by using DNN)

  • 한석호;장훈석
    • 한국정보전자통신기술학회논문지 / Vol. 16, No. 3 / pp. 123-129 / 2023
  • If we can tell what a person is looking at from the gaze point, a great deal of information can be obtained. With advances in gaze-tracking technology, gaze-point information can be obtained through the software provided with various eye-tracking devices. However, it is difficult to estimate accurate information such as the actual gaze depth. If the output of an eye-tracking device could be corrected to the actual gaze depth, realistic, accurate, and reliable results could be obtained in various fields such as simulation, digital twins, and VR. In this paper, we therefore conduct an experiment in which the raw gaze depth is acquired through an eye-tracking device and its software and then corrected. In the experiment, a Deep Neural Network (DNN) model is designed, and the gaze depth values provided by the software are collected at specified distances from 300 mm to 10,000 mm. The collected data are used to train the designed DNN model so that its output corresponds to the actual gaze depth. Experiments with the corrected model yielded gaze depth values close to the true values at each specified distance from 300 mm to 10,000 mm: 297 mm, 904 mm, 1,485 mm, 2,005 mm, 3,011 mm, 4,021 mm, 4,972 mm, 6,027 mm, 7,026 mm, 8,043 mm, 9,021 mm, and 10,076 mm.
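A minimal sketch of the kind of correction model described above is given below: a small fully connected network regressing the true gaze depth from the raw depth reported by the tracker software. The data are synthetic stand-ins and the architecture is an assumption, not the paper's DNN.

```python
# Minimal sketch (assumed architecture, synthetic data): learn a mapping from
# the raw gaze depth reported by the tracker to the true viewing distance.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
true_depth = np.repeat(
    np.array([300, 1000, 1500, 2000, 3000, 5000, 10000], dtype=float), 50)
# Synthetic stand-in for the raw tracker output: biased and noisy.
raw_depth = true_depth * rng.normal(0.8, 0.1, true_depth.size) \
            + rng.normal(0.0, 50.0, true_depth.size)

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
model.fit(raw_depth.reshape(-1, 1), true_depth)      # raw -> corrected mapping

test_raw = np.array([[250.0], [820.0], [7900.0]])    # hypothetical raw readings
print(model.predict(test_raw))                       # corrected gaze depths [mm]
```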

2014년 특별관측 기간 동안 청미천 농경지에서의 WRF/Noah-MP 고해상도 수치모의 (High-Resolution Numerical Simulations with WRF/Noah-MP in Cheongmicheon Farmland in Korea During the 2014 Special Observation Period)

  • 송지애;이승재;강민석;문민규;이정훈;김준
    • 한국농림기상학회지 / Vol. 17, No. 4 / pp. 384-398 / 2015
  • In this study, a coupled WRF/Noah-MP system based on high-resolution terrain and land-cover data was constructed for the Cheongmicheon farmland, and the simulation results were compared with the Cheongmicheon summer special observation data from 21 August to 10 September 2014 to evaluate the land-surface and atmospheric simulation performance over the farmland. To examine how useful activating Noah-MP's dynamic vegetation is for short- and medium-range simulations of land-surface and atmospheric variables, an experiment without dynamic vegetation (CTL) and an experiment with it (DVG) were performed for the observation period on a two-way, sixfold nested grid system. The main results are threefold. 1) Because the CTL experiment overestimated daytime net shortwave radiation, it also tended to overestimate the sensible and latent heat fluxes and the Bowen ratio compared with the observations. The CTL air temperature generally followed the observations well, but the temperature rose faster after sunrise than observed. The timing of the minimum and maximum temperatures was simulated well; in particular, the daily minimum temperature was reproduced within $0.3^{\circ}C$ of the observations, an encouraging result for predicting leaf wetness duration, which is related to frost damage and to pests and diseases. The CTL 10 m winds tended to be overestimated in both the zonal and meridional components, and precipitation was also overestimated, although the onset and cessation of precipitation were generally captured well. 2) The DVG experiment, which activates Noah-MP's dynamic vegetation, produced simulations of the leaf area index, shortwave radiation, surface fluxes, Bowen ratio, temperature, wind, and precipitation that were overall closer to the observations than those of CTL. By prognosing the leaf area index in response to variations in precipitation, temperature, radiation, and available nutrients, the DVG experiment produced a larger leaf area index than CTL, which was closer to the measured values. Even in DVG, the post-sunrise temperature rise was faster than observed, which was attributed to the early growth of the mixed layer characteristic of the YSU boundary-layer scheme used in both CTL and DVG. The overestimation of wind and precipitation seen in CTL was also alleviated to some extent in DVG. 3) The improvement in simulation performance over the Cheongmicheon farmland with increasing horizontal resolution was marginal or negligible for the surface fluxes, temperature, wind, and precipitation; for a more accurate assessment, three-dimensional observations at multiple points over the farmland are needed, and consistency of the terrain and land-cover data across the model domains must be ensured.

도입주체에 따른 인터넷경로의 도입효과 (The Impact of the Internet Channel Introduction Depending on the Ownership of the Internet Channel)

  • 유원상
    • 마케팅과학연구 / Vol. 19, No. 1 / pp. 37-46 / 2009
  • The Census Bureau of the Department of Commerce announced in May 2008 that U.S. retail e-commerce sales for 2006 reached $107 billion, up from $87 billion in 2005 - an increase of 22 percent. From 2001 to 2006, retail e-sales increased at an average annual growth rate of 25.4 percent. The explosive growth of E-Commerce has caused profound changes in marketing channel relationships and structures in many industries. Despite the great potential implications for both academicians and practitioners, there still exists a great deal of uncertainty about the impact of the Internet channel introduction on distribution channel management. The purpose of this study is to investigate how the ownership of the new Internet channel affects the existing channel members and consumers. To explore the above research questions, this study conducts well-controlled mathematical experiments to isolate the impact of the Internet channel by comparing before and after the Internet channel entry. The model consists of a monopolist manufacturer selling its product through a channel system including one independent physical store before the entry of an Internet store. The addition of the Internet store to this channel system results in a mixed channel composed of two different types of channels. The new Internet store can be launched by the existing independent physical store, such as Bestbuy. In this case, the physical retailer coordinates the two types of stores to maximize the joint profits from the two stores. The Internet store can also be introduced by an independent Internet retailer such as Amazon. In this case, retail-level competition occurs between the two types of stores. Although the manufacturer sells only one product, consumers view each product-outlet pair as a unique offering. Thus, the introduction of the Internet channel provides two product offerings for consumers. The channel structures analyzed in this study are illustrated in Fig.1. It is assumed that the manufacturer acts as a Stackelberg leader maximizing its own profits with the foresight of the independent retailer's optimal responses, as typically assumed in previous analytical channel studies. As a Stackelberg follower, the independent physical retailer or independent Internet retailer maximizes its own profits, conditional on the manufacturer's wholesale price. The price competition between the two independent retailers is assumed to be a Bertrand Nash game. For simplicity, the marginal cost is set at zero, as typically assumed in this type of study. To explore the research questions above, this study develops a game theoretic model that possesses the following three key characteristics. First, the model explicitly captures the fact that an Internet channel and a physical store exist in two independent dimensions (one in physical space and the other in cyber space). This enables the model to demonstrate that the effect of adding an Internet store is different from that of adding another physical store. Second, the model reflects the fact that consumers are heterogeneous in their preferences for using a physical store and for using an Internet channel. Third, the model captures the vertical strategic interactions between an upstream manufacturer and a downstream retailer, making it possible to analyze the channel structure issues discussed in this paper. Although numerous previous models capture this vertical dimension of marketing channels, none simultaneously incorporates the three characteristics reflected in this model.
The analysis results are summarized in Table 1. When the new Internet channel is introduced by the existing physical retailer and the retailer coordinates both types of stores to maximize the joint profits from both stores, retail prices increase due to a combination of the coordination of the retail prices and the wider market coverage. The quantity sold does not significantly increase despite the wider market coverage, because the excessively high retail prices alleviate the market coverage effect to a degree. Interestingly, the coordinated total retail profits are lower than the combined retail profits of two competing independent retailers. This implies that when a physical retailer opens an Internet channel, the retailer could be better off managing the two channels separately rather than coordinating them, unless it has foresight of the manufacturer's pricing behavior. It is also found that the introduction of an Internet channel affects the power balance of the channel. Retail competition is intense when an independent Internet store joins a channel with an independent physical retailer. This implies that each retailer in this structure has weak channel power. Due to the intense retail competition, the manufacturer uses its channel power to increase its wholesale price and extract more of the total channel profit. The retailers, however, cannot increase retail prices accordingly because of the intense retail-level competition, leaving them with lower channel power. In this case, consumer welfare increases due to the wider market coverage and the lower retail prices caused by the retail competition. The model employed for this study is not designed to capture all the characteristics of the Internet channel. The theoretical model in this study can also be applied to any store that is not geographically constrained, such as TV home shopping or catalog sales via mail. The reasons the model in this study is named "Internet" are as follows: first, the most representative example of a store that is not geographically constrained is the Internet. Second, catalog sales usually determine their target markets using pre-specified mailing lists; in this respect, the model used in this study is closer to the Internet than to catalog sales. However, it would be a desirable future research direction to mathematically and theoretically distinguish the core differences among stores that are not geographically constrained. The model is simplified by a set of assumptions to obtain mathematical tractability. First, this study assumes that price is the only strategic tool for competition. In the real world, however, various marketing variables can be used for competition. Therefore, a more realistic model could be designed if the model incorporated various other marketing variables such as service levels or operating costs. Second, this study assumes a market with one monopoly manufacturer. Therefore, the results from this study should be carefully interpreted considering this limitation. Future research could relax this limitation by introducing manufacturer-level competition. Finally, some of the results are drawn from the assumption that the monopoly manufacturer is the Stackelberg leader. Although this is a standard assumption among game theoretic studies of this kind, we could gain deeper understanding and generalize our findings beyond this assumption if the model were analyzed under different game rules.
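The Stackelberg logic the model builds on can be illustrated with a stylized single-channel example, far simpler than the paper's dual-channel setting: linear demand $D = 1 - p$, zero marginal cost, a manufacturer leading with wholesale price $w$ and a single retailer following with retail price $p$. The sympy sketch below solves it by backward induction; the demand form is an assumption made only for this illustration.

```python
# Stylised single-channel Stackelberg sketch (not the paper's mixed-channel model):
# linear demand D = 1 - p, zero marginal cost, manufacturer leads with wholesale
# price w, retailer follows with retail price p. Solved by backward induction.
import sympy as sp

w, p = sp.symbols("w p", nonnegative=True)
demand = 1 - p

retailer_profit = (p - w) * demand
p_star = sp.solve(sp.diff(retailer_profit, p), p)[0]       # follower's best response p(w)

manufacturer_profit = w * demand.subs(p, p_star)
w_star = sp.solve(sp.diff(manufacturer_profit, w), w)[0]   # leader's optimal wholesale price

print("p(w) =", sp.simplify(p_star))                       # (w + 1)/2
print("w*   =", w_star)                                    # 1/2
print("p*   =", sp.simplify(p_star.subs(w, w_star)))       # 3/4
```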
