• Title/Summary/Keyword: Output processes

Spectrophotometric Study of Acidity and Complex Formation of Anti-Inflammatory Drug Piroxicam with Some Transition Metal Ions in Different Methanol/Water Mixtures by Chemometric Methods (Chemometric 방법에 의한 메탄올/물 계에서 전이 금속 이온과 소염제 Piroxicam의 산성도 및 착체 형성에 관한 분광광도법 연구)

  • Ghasemi, Jahan B.; Jalalvand, Alireza
    • Journal of the Korean Chemical Society / v.53 no.6 / pp.693-703 / 2009
  • The complex formation of the anti-inflammatory drug piroxicam (PX, 4-hydroxy-2-methyl-N-(2-pyridyl)-2H-1,2-benzothiazine-3-carboxamide 1,1-dioxide) with the transition metal ions Co(II), Ni(II), Cu(II) and Zn(II) in methanol (MeOH)/water binary mixtures was studied spectrophotometrically at 25 °C, constant pH = 5.0 and I = 0.1 M. The computer program SQUAD was used to extract the desired information from the spectral data. The outputs of the fitting processes were the stability constants, the standard deviations of the estimated stability constants, concentration distribution diagrams and spectral profiles of all species. The stability constants of the PX complexes with Co(II), Ni(II), Cu(II) and Zn(II) follow the order Cu(II) > Co(II) > Ni(II) ≈ Zn(II), which may be due to the different geometric tendencies of these metal ions. The acidity constants of PX were also determined under the above conditions from its absorption spectra at different pH values, using the computer program DATAN; the validity of the obtained acidity constants was checked with the well-known program SPECFIT/32. The effects of parameters such as solvent nature and cation characteristics on the stability and acidity constants are discussed in detail.
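
Determining acidity constants from absorption spectra at different pH values amounts, in the simplest single-wavelength case, to fitting a Henderson-Hasselbalch model to an absorbance-pH curve. The sketch below is only a minimal illustration of that idea with synthetic data, not the multivariate algorithms used by DATAN or SPECFIT/32:

```python
import numpy as np
from scipy.optimize import curve_fit

def absorbance(pH, pKa, A_acid, A_base):
    """Single-wavelength absorbance of a monoprotic acid.

    The observed absorbance is a mole-fraction-weighted average of the
    protonated (HA) and deprotonated (A-) forms, with the fractions
    given by the Henderson-Hasselbalch relation.
    """
    frac_base = 1.0 / (1.0 + 10.0 ** (pKa - pH))
    return A_acid * (1.0 - frac_base) + A_base * frac_base

# Synthetic "measured" data with a true pKa of 5.3 and a little noise.
pH_values = np.linspace(2.0, 9.0, 25)
rng = np.random.default_rng(0)
A_measured = absorbance(pH_values, 5.3, 0.20, 0.85) + rng.normal(0.0, 0.01, pH_values.size)

# Least-squares fit; p0 holds initial guesses for pKa and the two limiting absorbances.
popt, pcov = curve_fit(absorbance, pH_values, A_measured, p0=[5.0, 0.1, 1.0])
print(f"fitted pKa = {popt[0]:.2f} +/- {np.sqrt(pcov[0, 0]):.2f}")
```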

A Study for Hybrid Honeypot Systems (하이브리드 허니팟 시스템에 대한 연구)

  • Lee, Moon-Goo
    • Journal of the Institute of Electronics and Information Engineers / v.51 no.11 / pp.127-133 / 2014
  • Honeypot systems are implemented to protect information assets from various kinds of malicious code. A honeypot is designed either to divert attacks so that internal systems are not attacked, or to collect information about malicious code. Existing honeypot systems, however, are built mainly for information collection, so they actively induce attackers by setting up a disguised server or a disguised client and by providing disguised content. A disguised server must have its hardware reinstalled roughly once a year because of frequent disk input and output, while a disguised client is difficult to operate because the analysis of the acquired information is hard to automate and therefore requires skilled specialists. To solve these hardware and operational problems of existing honeypots, this paper proposes a hybrid honeypot. The proposed hybrid honeypot consists of a honeywall, an analysis server and an integrated console, and it handles attacks by classifying them into two types: a high-interaction server (type 1) for disguise (inducement) and a low-interaction server (type 2) for false responses (emulation), both connected to a common switch area. The hybrid honeypot therefore operates a low-interaction honeypot and a high-interaction honeypot together. The analysis server converts attack types into hash values, feeds them into a correlation-analysis algorithm and sends the results to the honeywall. The integrated monitoring console provides continuous monitoring, so the system is expected not only to provide analysis of recent hacking methods and attack tools but also to enable proactive security responses.
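
The abstract does not describe the hashing or correlation algorithm in detail, so the following is only a rough, hypothetical sketch of the general idea: fingerprint each captured attack event with a hash so that repeated attack patterns can be grouped before correlation analysis and forwarded as compact summaries.

```python
import hashlib
from collections import defaultdict

def event_fingerprint(dst_port: int, payload: bytes) -> str:
    """Hash the attack-relevant fields of a captured event.

    Identical (port, payload) combinations collapse to the same digest,
    so repeated use of the same exploit or tool can be counted together.
    """
    h = hashlib.sha256()
    h.update(str(dst_port).encode())
    h.update(payload)
    return h.hexdigest()

# Captured events: (source IP, destination port, payload bytes).
events = [
    ("10.0.0.5", 445, b"\x90\x90exploit-A"),
    ("10.0.0.9", 445, b"\x90\x90exploit-A"),
    ("10.0.0.7", 22,  b"brute-force-list"),
]

# Group events by fingerprint before sending summaries to the honeywall/console.
groups = defaultdict(list)
for src, port, payload in events:
    groups[event_fingerprint(port, payload)].append(src)

for digest, sources in groups.items():
    print(digest[:12], "seen from", sources)
```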

A Study on the Economic Efficiency of Capital Market (자본시장(資本市場)의 경제적(經濟的) 효율성(效率性)에 관한 연구(硏究))

  • Nam, Soo-Hyun
    • The Korean Journal of Financial Management / v.2 no.1 / pp.55-75 / 1986
  • This article analyses the economic efficiency of the capital market, which plays the role of allocating resources in terms of financial claims such as stocks and bonds. It provides various contributions to the welfare-theoretic aspects of modern capital market theory. The key feature that distinguishes the theory described here from traditional welfare theory is the presence of uncertainty. Securities have a time dimension, and the states and outcomes of the future are uncertain. The problem resulting from this uncertainty can be solved by a complete market, but that construct has little power to explain real stock markets; the capital market faces uncertainty precisely because it is a kind of incomplete market. Individuals and firms in the capital market make their consumption-investment decisions by their own criteria, i.e. the maximization of expected utility from intertemporal consumption and the maximization of the market value of the firm. We noted that the allocative decisions that had to be made in the economy could naturally be subdivided into two groups. One set of decisions concerns the allocation of first-period resources among consumption $C_i$, investment in risky firms $I_j$, and riskless investment M. The other decisions concern the distribution among individuals of the income available in the second period, $Y_i(\theta)$. Corresponding to this grouping, the theoretical analysis of efficiency has also been dichotomized: the optimality of the distribution of output in the second period is 'distributive efficiency' and the optimality of the allocation of first-period resources is 'the efficiency of investment'. For distributive efficiency we found that the conditions for attainability are the same as the conditions for market optimality. The necessary and sufficient conditions for attainability or market optimality are that (1) all utility functions are such that $-\frac{U_i'(Y_i)}{U_i''(Y_i)} = \mu_i + \lambda Y_i$, a linear risk tolerance function whose coefficients $\mu_i$ and $\lambda$ are independent of $Y_i$, and (2) expectations are homogeneous, i.e. $f_i(\theta) = f(\theta)$ for every i. On the other hand, for the efficiency of investment there is disagreement about the optimal investment level: the investment level under the market rule will not generally lead to a Pareto-optimal allocation of investment. This suboptimality is caused by (1) the difference between Diamond's decomposable production function and the mean-variance valuation model and (2) the choice between exclusive investment and competitive investment. In conclusion, this article has analysed the conditions and processes of Pareto-optimal allocation of resources in the capital market and tried to connect them with significant issues in modern finance.
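
As a brief worked illustration (standard textbook examples, not drawn from the article itself), the familiar utility families satisfy the linear risk tolerance condition $-U'(Y)/U''(Y) = \mu + \lambda Y$ as follows:

$$U(Y) = -e^{-aY} \;\Rightarrow\; -\frac{U'(Y)}{U''(Y)} = \frac{1}{a} \qquad (\mu = 1/a,\ \lambda = 0),$$
$$U(Y) = \ln Y \;\Rightarrow\; -\frac{U'(Y)}{U''(Y)} = Y \qquad (\mu = 0,\ \lambda = 1),$$
$$U(Y) = \frac{Y^{1-\gamma}}{1-\gamma} \;\Rightarrow\; -\frac{U'(Y)}{U''(Y)} = \frac{Y}{\gamma} \qquad (\mu = 0,\ \lambda = 1/\gamma).$$

Quadratic utility also belongs to this family, with $\lambda = -1$.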

Development of Intelligent Load Balancing Algorithm in Application of Fuzzy-Neural Network (퍼지-뉴럴 네트워크를 응용한 지능형 로드밸런싱 알고리즘 개발)

  • Chu, Gyo-Soo; Kim, Wan-Yong; Jung, Jae-Yun; Kim, Hag-Bae
    • The Journal of Korean Institute of Communications and Information Sciences / v.30 no.2B / pp.36-43 / 2005
  • This paper suggests a method for effectively applying a fuzzy-neural network model to an optimal load-distribution algorithm, considering the complexity and non-linearity of the web server environment. We use a clustered web server on a Linux system, consisting of a load balancer that distributes the network load and several real servers that process the load and respond to the clients. Previous work considered only fragmentary decision information such as the number of connections: because the distribution algorithm depended only on the total network throughput as its input, it proved inefficient at improving web server performance. The proposed algorithm monitors the overall state of both network input and output, infers the CPU and memory state of each real server, and distributes client requests effectively. In this paper the proposed model is compared with the previous method through simulations, and we analyse the results to develop an optimal, intelligent load-balancing model.
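
The paper's fuzzy-neural model is not reproduced in the abstract, so the following is only a simplified sketch of the underlying idea: estimate each real server's load from its CPU and memory state (here with a trivial fuzzy-OR rule) and route the next request to the least-loaded server.

```python
def load_score(cpu: float, mem: float) -> float:
    """Crude 'fuzzy' load estimate in [0, 1] from utilisation fractions.

    The max operator acts as a fuzzy OR: a server counts as busy if
    either its CPU or its memory is close to saturation.
    """
    return max(cpu, mem)

def pick_server(servers: dict[str, tuple[float, float]]) -> str:
    """Route the next client request to the real server with the lowest load score."""
    return min(servers, key=lambda name: load_score(*servers[name]))

# (cpu_utilisation, memory_utilisation) inferred for each real server.
real_servers = {
    "real1": (0.82, 0.40),
    "real2": (0.35, 0.55),
    "real3": (0.20, 0.90),
}
print("next request goes to:", pick_server(real_servers))  # real2
```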

Case of Service Design Process for Medical Space Focused on Users (사용자중심 의료공간을 위한 서비스디자인 프로세스의 적용사례)

  • Noh, Meekyung
    • Journal of The Korea Institute of Healthcare Architecture / v.21 no.4 / pp.27-36 / 2015
  • Purpose: Of late, the focus of service design has been moving toward emphasizing customer satisfaction and taking users' experience more seriously. Alongside this change in perspective, scholars in the area are paying more attention to service design methodology and process as well as to theory and real-world case studies. In the case of medical space, however, few studies have attempted to apply service design methods that are useful for deriving user-focused results. The author believes that case-study-oriented approaches are needed in this area more than ones focusing on theoretical aspects, and hopes thereby to expand the horizon toward the practical application of spatial design beyond service design methodology. Methods: In order to incorporate the strengths of service design methodology, which can reflect a variety of user opinions, this study introduces diverse tools within the framework of the double diamond process. In addition, it presents field cases that produced excellent results in medical space design, and it ends by summarizing a reasonable and comprehensive process for medical space design. Results: Medical service encompasses preventive medicine as well as the treatment of existing medical conditions. Establishing the platform of medical service design begins with a wide range of trend research, followed by a two-matrix design classification summarizing the results of that research. The draft design process is divided into five stages, composed of basic tools for establishing spatial flow lines created by matching service design tools with each stage of the space design process. Throughout, the most important elements to consider are communication and empathy. When service design is actually applied to space design, the output reflects the users' needs very well. A service design process for user-oriented medical space can thus be established through interaction over the final outcome and feedback on the results. Implications: Service design with the hospital at its center produces results that encompass the users' needs best. If the user-focused service design process for medical space can be extended to other space designs, the author believes it would enhance user satisfaction and minimize trial and error.

Recent Technological Advances in Optical Instruments and Future Applications for in Situ Stable Isotope Analysis of CH4 in the Surface Ocean and Marine Atmosphere (표층해수 내 용존 메탄 탄소동위원소 실시간 측정을 위한 광학기기의 개발 및 활용 전망)

  • Park, Mi-Kyung; Park, Sunyoung
    • The Sea: Journal of the Korean Society of Oceanography / v.23 no.1 / pp.32-48 / 2018
  • The mechanisms of $CH_4$ uptake into and release from the ocean are not well understood, due mainly to the complexity of the biogeochemical cycle and to the lack of regional-scale and process-scale observations in the marine boundary layer. Without a complete understanding of the oceanic mechanisms that control the carbon balance and cycles on various spatial and temporal scales, however, it is difficult to predict future perturbations of oceanic carbon levels and their influence on global and regional climates. High-frequency, high-precision continuous measurements of the carbon isotopic composition of dissolved $CH_4$ in the surface ocean and marine atmosphere can provide additional information about flux pathways and production/consumption processes occurring at the boundary of these two large reservoirs. This paper introduces recent advances in optical instruments for real-time $CH_4$ isotope analysis and assesses their potential for in situ, continuous measurement of the carbon isotopic composition of dissolved $CH_4$. Three commercially available laser absorption spectrometers, quantum cascade laser absorption spectroscopy (QCLAS), off-axis integrated cavity output spectroscopy (OA-ICOS) and cavity ring-down spectroscopy (CRDS), are discussed in comparison with conventional isotope ratio mass spectrometry (IRMS). Details of the functioning and performance of a CRDS isotope instrument for atmospheric $\delta^{13}C-CH_4$ are also given, showing its capability to detect localized methane emission sources.
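
All of the instruments mentioned above report carbon isotope ratios in the standard delta notation relative to the VPDB reference. As a minimal reminder of that convention (illustrative numbers only, not data from this paper):

```python
# delta13C in permil relative to the VPDB standard:
#   delta13C = (R_sample / R_VPDB - 1) * 1000, where R = 13C/12C.
R_VPDB = 0.011180  # commonly cited 13C/12C ratio of the VPDB reference

def delta13C(r_sample: float, r_standard: float = R_VPDB) -> float:
    """Convert a measured 13C/12C ratio to delta13C in permil."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Illustrative sample ratio, depleted in 13C as is typical of biogenic methane.
r_sample = 0.010400
print(f"delta13C = {delta13C(r_sample):.1f} permil")  # about -70 permil
```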

A Systemic Model for the Gifted Education (체제적인 영재교육을 위한 Renzulli의 전교 심화학습 모형(SEM)의 개성방안)

  • Park, Eun-Young
    • Journal of Gifted/Talented Education / v.10 no.2 / pp.1-23 / 2000
  • The Schoolwide Enrichment Model (SEM) is a representative model for gifted education. Because the model is rather conceptual in nature, it is hard for it to respond to the different interests and changing needs of gifted learners, and it does not provide teachers with specific procedures and prescriptions for the teaching-learning process. Therefore, the SEM needs to be modified into a systemic model that is more flexible and procedural. This paper proposes an Instructional Systems Design (ISD) model for the SEM. The systemic model for the SEM consists of five major steps: Planning, Diagnosis, Prescription, Implementation and Evaluation. The Planning step follows a six-stage procedure for initiating implementation of the SEM. The Diagnosis step has two phases: identifying students for participation in the SEM, and assessing the strengths, interests and talents of the learners and recording them in the Total Talent Portfolio (TTP). In the Prescription step, Curriculum Compacting is administered as a systematic procedure for modifying the curriculum for above-average-ability students. In the Implementation step, Enrichment Learning and Teaching is used as an instructional strategy designed to promote active engagement in learning for teachers and students. Whenever a step is completed, the Evaluation step follows. These five steps are repetitive, cyclical and interactive: each one becomes input for the next step, process for itself, and output for the previous step, and each step is monitored through a Review and Revision process. In conclusion, the paper suggests six strengths of the systemic model for the SEM. The model (1) provides specific procedures for the teaching-learning process; (2) has interactive relations among its components; (3) can be revised continuously to create the most effective system; (4) can be implemented more flexibly; (5) can be developed into a unique system for each school; and (6) facilitates communication between teachers and students.

Life Cycle Assessment on Process of Wet Tissue Production (물티슈 제조공정의 전과정 평가)

  • Ahn, Joong Woo
    • Clean Technology / v.24 no.4 / pp.269-274 / 2018
  • In this study, a life cycle assessment (LCA) of a wet tissue manufacturing process was performed. The process consists of preparation of the wetting agent (chemical liquid), impregnation of the nonwoven fabric with the wetting agent, and primary and secondary packaging. Data and information on the inputs and outputs of the actual process were collected from a specific company, and databases from the Korea Ministry of Environment and from some foreign countries (where Korean data were unavailable) were used to connect the upstream and downstream process flows. On this basis, the potential environmental impacts of the wet tissue manufacturing process were calculated. As a result of the characterization, Ozone Layer Depletion (OD) was 3.46E-06 kg $CFC_{11}$ eq., Acidification (AD) was 5.11E-01 kg $SO_2$ eq., Abiotic Resource Depletion (ARD) was 3.52E+00 $1\;yr^{-1}$, Global Warming (GW) was 1.04E+02 kg $CO_2$ eq., Eutrophication (EUT) was 2.31E-02 kg ${PO_4}^{3-}$ eq., Photochemical Oxidant Creation (POC) was 2.22E-02 kg $C_2H_4$ eq., Human Toxicity (HT) was 1.55E+00 kg 1,4-DCB eq. and Terrestrial Ecotoxicity (ET) was 5.82E-04 kg 1,4-DCB eq. To reduce the environmental impact of the manufacturing process, it is necessary to improve the overall process, as in other general cases, and to replace the raw materials, including packaging materials, with ones of lower environmental impact. The energy consumed in the manufacturing process emerged as a major issue, so other options such as alternative energy need to be considered; it is therefore recommended that the process system be redesigned to improve energy efficiency and to switch to an energy source with lower environmental impact. Due to the nature of LCA, the final results of this study can vary to some extent depending on the type of LCI DB employed and may not be representative of all wet tissue manufacturing processes in the current industry.
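
Characterization in LCA is essentially a weighted sum: each inventory flow is multiplied by the characterization factor of its impact category and the products are added up. The sketch below illustrates this with made-up inventory values and approximate IPCC GWP100 factors; the study itself relied on Korean and foreign LCI databases, not on these numbers.

```python
# Characterized impact = sum(inventory_amount * characterization_factor)
# over every elementary flow contributing to the impact category.

# Approximate global-warming characterization factors (kg CO2 eq. per kg).
GWP_FACTORS = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

def characterize(inventory: dict[str, float], factors: dict[str, float]) -> float:
    """Return the characterized impact score for one impact category."""
    return sum(amount * factors.get(flow, 0.0) for flow, amount in inventory.items())

# Hypothetical inventory (kg emitted per functional unit), for illustration only.
inventory = {"CO2": 50.0, "CH4": 0.5, "N2O": 0.02}
print(f"Global Warming: {characterize(inventory, GWP_FACTORS):.1f} kg CO2 eq.")  # about 69.3
```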

Estimation of GHGs Emission to Improvement of Facility Efficiency in the Food wastewater Treatment Process (식품폐수처리시설의 설비효율 개선에 따른 온실가스 배출량 평가)

  • An, Sang-Hyung; Song, Jang-Heon; Kim, San; Chung, Jin-Do
    • Journal of the Korea Academia-Industrial cooperation Society / v.20 no.2 / pp.378-384 / 2019
  • In food wastewater treatment facilities, the water-quality improvement effect and the greenhouse gas (GHG) emissions resulting from the change in electricity usage caused by changing the aeration-tank ventilation system were evaluated. The GHG emissions resulting from the change in electricity usage caused by changing the sludge dewatering, storage and transport methods were also evaluated. The total GHG emissions of the improved food wastewater treatment facility were divided into direct emissions from the treatment processes and indirect emissions from electricity usage. The water-quality improvement of the wastewater treatment plant was found to be 63.3% for the BOD removal rate, 42.0% for COD, 71.0% for SS and 39.6% for T-N. Calculating the output by applying both the direct GHG emissions (Scope 1) and the indirect GHG emissions (Scope 2) associated with the change in power consumption, a total reduction of 276.0 tCO2eq./yr (7.5%) was estimated, from 3,668.8 tCO2eq./yr before the improvement to 3,392.8 tCO2eq./yr after it. This result is due not to the water-quality improvement at the emission source, but to the reduction in electricity use, which lowered the GHG emissions.
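
The quoted reduction follows directly from the before/after totals; since the abstract does not break the totals down into their Scope 1 and Scope 2 parts, the small sketch below works with the totals only.

```python
# GHG accounting: total = Scope 1 (direct, process) + Scope 2 (indirect, electricity).
before_tCO2eq = 3668.8  # total emissions before the facility improvement (tCO2eq./yr)
after_tCO2eq = 3392.8   # total emissions after the improvement (tCO2eq./yr)

reduction = before_tCO2eq - after_tCO2eq
reduction_pct = reduction / before_tCO2eq * 100.0
print(f"reduction: {reduction:.1f} tCO2eq./yr ({reduction_pct:.1f}%)")
# reduction: 276.0 tCO2eq./yr (7.5%)
```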

Development of an Input File Preparation Tool for Offline Coupling of DNDC and DSSAT Models (DNDC 지역별 구동을 위한 입력자료 생성 도구 개발)

  • Hyun, Shinwoo; Hwang, Woosung; You, Heejin; Kim, Kwang Soo
    • Korean Journal of Agricultural and Forest Meteorology / v.23 no.1 / pp.68-81 / 2021
  • The agricultural ecosystem is one of the major sources of greenhouse gas (GHG) emissions. In order to search for climate change adaptation options that mitigate GHG emissions while maintaining crop yield, it is advantageous to integrate multiple models at a high spatial resolution. The objective of this study was to develop a tool to support integrated assessment of climate change impact by coupling the DSSAT model and the DNDC model. The DNDC Regional Input File Tool (DRIFT) was developed to prepare input data for the regional mode of the DNDC model using input and output data of the DSSAT model. In a case study, GHG emissions under climate change conditions were simulated using the input data prepared by the DRIFT. The time needed to prepare the input data increased with the number of grid points. Most of the steps took relatively little time, while most of the time was spent converting the daily flood-depth data of the DSSAT model into the flood periods of the DNDC model. Processing a large amount of data would still require a long time, which could be reduced by parallelizing some calculation processes. Extending the DRIFT to other models would help reduce the time required to prepare their input data.
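
The abstract notes that converting the DSSAT daily flood-depth output into DNDC flood periods dominated the preparation time. Neither file format is described here, so the sketch below only illustrates the core transformation: scanning a daily flood-depth series and emitting (start_day, end_day) intervals wherever the depth is positive.

```python
def flood_periods(daily_depth_mm: list[float]) -> list[tuple[int, int]]:
    """Convert a daily flood-depth series into (start_day, end_day) flood periods.

    Days are 1-based; a flood period is a maximal run of consecutive days
    with depth > 0. The DSSAT/DNDC file formats are not modelled here.
    """
    periods, start = [], None
    for day, depth in enumerate(daily_depth_mm, start=1):
        if depth > 0 and start is None:
            start = day                       # a flood period begins
        elif depth <= 0 and start is not None:
            periods.append((start, day - 1))  # the period ended yesterday
            start = None
    if start is not None:                     # still flooded on the last day
        periods.append((start, len(daily_depth_mm)))
    return periods

# Example: flooded on days 3-5 and 8-9 of a 10-day series.
print(flood_periods([0, 0, 30, 25, 10, 0, 0, 40, 35, 0]))  # [(3, 5), (8, 9)]
```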