• Title/Summary/Keyword: Green Computing

112 search results

Calculation of Wave-making Resistance using Neumann-Kelvin Theory (Neumann-Kelvin 이론을 사용한 조파저항 계산)

  • S.J. Kim; S.J. Lee
    • Journal of the Society of Naval Architects of Korea / v.29 no.3 / pp.71-79 / 1992
  • In order to obtain the wave-making resistance of a ship, the so-called Neumann-Kelvin problem is solved numerically. For computing the Havelock source, which is the Green's function of the problem, we adopted the method of Newman (1987) for the term representing the local disturbance and that of Baar and Price (1988) for the wave disturbance. In the numerical code we developed, the source strength is assumed to be bilinear on each panel and continuous over the hull surface. The wave-making resistance is calculated using the algorithm of de Sendagorta and Grases (1988), which makes use of the wave amplitude far downstream. The Wigley hull was chosen for the sample calculation, and our results showed good agreement with existing experimental and numerical results.
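
The far-field method cited in the abstract recovers the wave-making resistance from the free-wave amplitude spectrum far downstream. As a hedged sketch of the underlying relation (a classical textbook form, not necessarily the paper's exact formulation; the proportionality constant depends on how the amplitude function is normalized):

```latex
R_w \;\propto\; \rho\, U^{2} \int_{-\pi/2}^{\pi/2} \left| A(\theta) \right|^{2} \cos^{3}\theta \, d\theta
```

where U is the ship speed, \rho the water density, \theta the propagation direction of each free-wave component, and A(\theta) the amplitude spectrum extracted from the downstream wave pattern.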


A New N-time Systolic Array Architecture for the Vector Median Filter (N-time 시스톨릭 어레이 구조를 가지는 벡터 미디언 필터의 하드웨어 아키텍쳐)

  • Yang, Yeong-Yil
    • Journal of the Institute of Convergence Signal Processing / v.8 no.4 / pp.293-296 / 2007
  • In this paper, we propose a systolic array architecture for the vector median filter. In color image processing, the vector signal (i.e., the color) consists of three elements: red, green, and blue. The vector median filter is very effective at exploiting the correlation among the red, green, and blue elements. The computational complexity of the proposed architecture for computing the vector median of N vector signals is (N+2) clock periods, compared with (3N+1) clock periods for the previous method. In addition, the input vector signals can be loaded serially in the proposed architecture, whereas in the previous method N input vector signals must be loaded into the vector median filter in parallel at the first clock. The proposed architecture is implemented on an FPGA.
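
For reference, the operation that the proposed systolic array accelerates is the standard vector median: among N input vectors, the one whose aggregate distance to all the others is smallest. A minimal software sketch follows; the choice of the L1 norm and the NumPy formulation are assumptions for illustration, and the paper's actual contribution, the (N+2)-clock serial-loading hardware schedule, is not reproduced here.

```python
import numpy as np

def vector_median(pixels: np.ndarray) -> np.ndarray:
    """Vector median of N color vectors: the input vector whose summed distance
    to all other inputs is minimal. The L1 norm is an assumption for this sketch."""
    diff = pixels[:, None, :] - pixels[None, :, :]   # pairwise differences, N x N x 3
    dist = np.abs(diff).sum(axis=2)                  # N x N matrix of L1 distances
    scores = dist.sum(axis=1)                        # aggregate distance of each vector
    return pixels[np.argmin(scores)]

# toy usage: a 3x3 window of RGB pixels flattened to N = 9 vectors
window = np.random.randint(0, 256, size=(9, 3))
print(vector_median(window))
```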


Development of a Real-time Medical Imaging System Combined with Laser Speckle Contrast Imaging and Fluorescence Imaging (형광과 레이저 스펙클 대조도 이미징을 결합한 실시간 의료영상 시스템 개발)

  • Shim, Min Jae; Kim, Yikeun; Ko, Taek Yong; Choi, Jin Hyuk; Ahn, Yeh-Chan
    • Journal of Biomedical Engineering Research / v.42 no.3 / pp.116-124 / 2021
  • During surgery it is important to differentiate the target tissue (or organ) from the surrounding tissue before incision, and when the differentiated tissue must be preserved, the blood vessels connected to it must be preserved as well. Various non-invasive medical imaging methods have been developed for this purpose. We aimed to develop a medical imaging system that can simultaneously apply fluorescence imaging using indocyanine green (ICG) and laser speckle contrast imaging (LSCI) using laser speckle patterns. The system was designed to collect the images directed to the two cameras along a co-axial optical path and to equalize the optical path lengths of the two imaging arms. The same 785 nm light source was used for both fluorescence imaging and LSCI. The system outputs real-time images so that target tissues or blood vessels can be distinguished intuitively, and produces LSCI images at up to 37 fps through parallel processing. ICG fluorescence and blood flow in animal models were observed throughout the experiments.
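
The speckle-contrast computation underlying LSCI is well established: the local contrast K = sigma / mean of the raw speckle image drops where blood flow blurs the speckle pattern. Below is a minimal single-frame sketch, assuming a purely spatial 7x7 window; the paper's real-time parallel pipeline, camera geometry, and fluorescence channel are not modeled.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw: np.ndarray, win: int = 7) -> np.ndarray:
    """Spatial laser speckle contrast K = sigma / mean over a local window.
    The window size and the single-frame (spatial) variant are assumptions."""
    raw = raw.astype(np.float64)
    mean = uniform_filter(raw, size=win)
    mean_sq = uniform_filter(raw * raw, size=win)
    var = np.clip(mean_sq - mean * mean, 0.0, None)
    return np.sqrt(var) / (mean + 1e-12)             # guard against division by zero

# lower K means stronger speckle blurring, i.e. faster flow in that region
frame = np.random.rand(256, 256)
K = speckle_contrast(frame)
```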

Deriving Strategic Priorities of Green ICT Policy using AHP and ANP (AHP와 ANP 방법론을 이용한 그린 ICT 정책의 전략적 우선순위 도출 방안)

  • Shim, Yong-Ho; Byun, Gi-Seob; Lee, Bong-Gyou
    • Journal of Internet Computing and Services / v.12 no.1 / pp.85-98 / 2011
  • Recently, the world has faced a global environmental crisis driven by increasing energy consumption and global warming. Since the crisis directly affects political, economic, social, and environmental areas, many countries are preparing Green ICT policies to overcome it. However, although Green ICT policy provides many benefits by reducing environmental pollution and increasing energy efficiency, the Korean government has not yet prepared concrete measures for implementing it. The purpose of this study is to suggest priorities among policy goals for maximizing efficiency after introducing Green ICT policy in Korea. The major variables drawn for the analysis are eco-friendliness, technology evolution, economic efficiency, energy efficiency, and stable supply of energy. The variables are derived from the 'Low Carbon, Green Growth Act', and a survey of policy experts was conducted using AHP (Analytic Hierarchy Process) and ANP (Analytic Network Process) to prioritize them. The AHP result yields the order eco-friendliness, technology evolution, economic efficiency, energy efficiency, and stable supply of energy, while the ANP result yields technology evolution, energy efficiency, economic efficiency, eco-friendliness, and stable supply of energy. The analysis results can be used as a practical guideline for establishing related policies in the future.
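
For readers unfamiliar with AHP, the priority weights reported above are typically obtained as the principal eigenvector of a reciprocal pairwise-comparison matrix. The sketch below uses a purely hypothetical 3x3 matrix, not the survey data of the paper; ANP additionally models interdependence between criteria via a supermatrix and is not shown.

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    """Priority weights from an AHP pairwise-comparison matrix via the
    principal eigenvector (Saaty's method)."""
    vals, vecs = np.linalg.eig(pairwise)
    principal = vecs[:, np.argmax(vals.real)].real   # eigenvector of the largest eigenvalue
    w = np.abs(principal)
    return w / w.sum()

# hypothetical reciprocal matrix (Saaty 1-9 scale) comparing, say,
# eco-friendliness, technology evolution, and economic efficiency;
# it is NOT the survey data of the paper.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print(ahp_weights(A))
```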

Performance and Energy Oriented Resource Provisioning in Cloud Systems Based on Dynamic Thresholds and Host Reputation (클라우드 시스템에서 동적 임계치와 호스트 평판도를 기반으로 한 성능 및 에너지 중심 자원 프로비저닝)

  • Elijorde, Frank I.; Lee, Jaewan
    • Journal of Internet Computing and Services / v.14 no.5 / pp.39-48 / 2013
  • A cloud system has to deal with highly variable workloads resulting from dynamic usage patterns in order to keep the QoS within the predefined SLA. Aside from service-related aspects, another emerging concern is keeping energy consumption to a minimum, which requires cloud providers to consider the energy-performance trade-off when allocating virtualized resources in cloud data centers. In this paper, we propose a resource provisioning approach based on dynamic thresholds to detect the workload level of the host machines. The VM selection policy uses utilization data to choose a VM for migration, while the VM allocation policy assigns VMs to a host based on its service reputation. We evaluated our work through simulations, and the results show that it outperforms non-power-aware methods that do not support migration, as well as those based on static thresholds and a random selection policy.
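
A minimal sketch of the kind of policy combination described above: an upper utilization threshold adapted to each host's workload history, utilization-based VM selection, and reputation-based placement. The specific adaptation rule, the selection criterion, and all names below are illustrative assumptions, not the authors' exact algorithm.

```python
from dataclasses import dataclass, field
from statistics import mean, pstdev

@dataclass
class Host:
    name: str
    reputation: float                              # higher = better service history
    vm_utils: dict = field(default_factory=dict)   # vm_id -> CPU utilization share

    def utilization(self) -> float:
        return sum(self.vm_utils.values())

def dynamic_upper_threshold(history: list, k: float = 1.5) -> float:
    """Overload threshold adapted to the host's own workload history
    (mean + k * std dev); the adaptation rule and k are assumptions."""
    return mean(history) + k * pstdev(history)

def select_vm(host: Host) -> str:
    """Illustrative selection policy: migrate the most utilized VM."""
    return max(host.vm_utils, key=host.vm_utils.get)

def allocate(vm_id: str, util: float, hosts: list) -> Host:
    """Place the migrated VM on a non-overloaded host with the best reputation."""
    candidates = [h for h in hosts if h.utilization() + util < 1.0]
    if not candidates:
        raise RuntimeError("no suitable host")
    target = max(candidates, key=lambda h: h.reputation)
    target.vm_utils[vm_id] = util
    return target
```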

Computation of Green's Tensor Integrals in Three-Dimensional Magnetotelluric Modeling Using Integral Equations (적분방정식을 사용한 3차원 MT 모델링에서의 텐서 그린 적분의 계산)

  • Kim, Hee Joon; Lee, Dong Sung
    • Economic and Environmental Geology / v.27 no.1 / pp.41-47 / 1994
  • A fast Hankel transform (FHT) algorithm (Anderson, 1982) is applied to the numerical evaluation of the many Green's tensor integrals encountered in three-dimensional electromagnetic modeling using integral equations. Efficient computation of the Hankel transforms is obtained by a combination of the related and lagged convolutions available in the FHT. We express the Green's tensor integrals for a layered half-space and rewrite them in the form of related functions so that the FHT can be applied efficiently. Using the FHT, a complete matrix of related Hankel transforms can be calculated rapidly and accurately in about the same computation time as a single direct convolution. Computing times for a five-layer half-space show that the FHT is about 117 and 4 times faster than the conventional direct and multiple lagged convolution methods, respectively.
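
The integrals referred to above are Hankel transforms; digital-filter methods such as Anderson's FHT evaluate them as a discrete convolution with precomputed filter weights, and the related/lagged convolutions reuse one set of kernel evaluations across many such transforms. Schematically (the abscissae and weights are specific to the chosen filter and are not reproduced here):

```latex
f(r) \;=\; \int_{0}^{\infty} K(\lambda)\, J_{\nu}(\lambda r)\, d\lambda
     \;\approx\; \frac{1}{r} \sum_{i=1}^{M} K\!\left(\frac{b_{i}}{r}\right) W_{i}^{(\nu)}
```

where J_\nu is the Bessel function of order \nu and b_i, W_i^{(\nu)} are the filter abscissae and weights.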


A Study on Service Enhancement using Information Demand Analysis of Green Technologies (녹색기술 정보수요 분석을 통한 서비스 개선 방안 연구)

  • Suh, Min-Ho; Lee, Hoo-Min; Lee, Il-Hyung; Kwon, Young-Il
    • Journal of Internet Computing and Services / v.13 no.1 / pp.117-124 / 2012
  • Recently, the trends of open innovation and Web 2.0 have been changing information portal services, and the importance of information demand analysis in planning service strategy is being emphasized. The issues addressed in this study are how to identify that demand and how to use it to enhance the service level. This study presents the results of an analysis of green-technology information users' interests and directions for service enhancement. Preferred information types, additional functions, and open-innovation needs are analyzed in relation to the users' organizations, main job titles, and technological areas. Among these, the user's technological area is correlated with the stated needs, so we discuss the implications for planning service strategies by technological field. The main contribution of the study is a proposal for customized service strategies based on user demand, which can differ from one technology field to another.

Three-dimensional Cross-hole EM Modeling using the Extended Born Approximation (확장 Born 근사에 의한 시추공간 3차원 전자탐사 모델링)

  • Lee, Seong-Kon; Kim, Hee-Joon; Suh, Jung-Hee
    • Geophysics and Geophysical Exploration / v.2 no.2 / pp.86-95 / 1999
  • This paper presents an efficient three-dimensional (3-D) modeling algorithm using the extended Born approximation to the electric field integral equation. Numerical evaluation of the Green's tensor integral is performed in the spatial wavenumber domain. This approach makes it possible to reduce computing time, to handle smoothly varying conductivity models, and to remove the singularity problem encountered in the integration of the Green's tensor at a source point. The responses obtained by the 3-D modeling algorithm developed in this study are compared with those of the full integral equation for thin-sheet EM scattering. Extensive analyses of the algorithm's performance are made over a range of conductivity contrasts and source frequencies. The results show that the modeling algorithm is accurate up to a conductivity contrast of 1:16 over the frequency range of 100 Hz to 100 kHz. The extended Born approximation, however, may produce inaccurate results for some source and model configurations in which the electric field is discontinuous across a conductivity boundary. We also modeled a composite structure whose conductivity varies continuously; for a cross-hole source-receiver configuration such a model can be simulated successfully, which shows that the algorithm developed in this study is efficient for 3-D EM modeling.
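
For context, the electric-field integral equation and the extended Born approximation can be sketched as follows, in generic notation that is not necessarily that of the paper. The total field satisfies

```latex
\mathbf{E}(\mathbf{r}) \;=\; \mathbf{E}_{b}(\mathbf{r})
  \;+\; \int_{V} \overline{\mathbf{G}}(\mathbf{r},\mathbf{r}')\,\Delta\sigma(\mathbf{r}')\,\mathbf{E}(\mathbf{r}')\,dv'
```

and the extended Born approximation replaces the unknown internal field by \mathbf{E}(\mathbf{r}') \approx \overline{\boldsymbol{\Gamma}}(\mathbf{r}')\,\mathbf{E}_{b}(\mathbf{r}') with the scattering tensor

```latex
\overline{\boldsymbol{\Gamma}}(\mathbf{r}) \;=\;
  \left[\, \overline{\mathbf{I}}
  \;-\; \int_{V} \overline{\mathbf{G}}(\mathbf{r},\mathbf{r}')\,\Delta\sigma(\mathbf{r}')\,dv' \right]^{-1}
```

which is intended to retain much of the accuracy of the full solution at a cost closer to that of the ordinary Born approximation.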


A Resource Scheduling Algorithm Using Node Clustering in VDI Environment (VDI 환경에서 클러스터링을 이용한 자원 스케줄링 알고리즘)

  • Seo, Kyung-Seok; Lee, Bong-Hwan
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2012.05a / pp.360-363 / 2012
  • Recently, green IT has come to be considered an essential element due to continuously increasing energy consumption and abrupt oil price changes. Thus, IT infrastructure is being replaced with cloud computing platforms in order to reduce server heat and the energy consumption of data centers. In this paper, we implement an open-source-based cloud platform and propose a resource scheduling algorithm for a cloud VDI service using node clustering.
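
The abstract does not detail the clustering rule, so the following is only a generic illustration of clustering-based placement, with all thresholds and names being assumptions: nodes are grouped into load classes and each new VM request goes to the least-loaded node of the lightest non-empty cluster.

```python
# Generic illustration only (not the authors' algorithm).
def cluster_nodes(nodes, bounds=(0.3, 0.7)):
    """Group nodes into low/mid/high load classes by current utilization."""
    clusters = {"low": [], "mid": [], "high": []}
    for name, load in nodes.items():
        if load < bounds[0]:
            clusters["low"].append(name)
        elif load < bounds[1]:
            clusters["mid"].append(name)
        else:
            clusters["high"].append(name)
    return clusters

def schedule_vm(nodes, demand):
    """Place a VM on the least-loaded node of the lightest non-empty cluster."""
    clusters = cluster_nodes(nodes)
    for level in ("low", "mid", "high"):
        if clusters[level]:
            target = min(clusters[level], key=nodes.get)
            nodes[target] += demand      # book the requested capacity
            return target
    raise RuntimeError("no node available")

nodes = {"node1": 0.20, "node2": 0.55, "node3": 0.80}
print(schedule_vm(nodes, 0.10))          # -> node1
```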


Enhanced Prediction Algorithm for Near-lossless Image Compression with Low Complexity and Low Latency

  • Son, Ji Deok; Song, Byung Cheol
    • IEIE Transactions on Smart Processing and Computing / v.5 no.2 / pp.143-151 / 2016
  • This paper presents new prediction methods to improve the compression performance of a so-called near-lossless RGB-domain image coder, which is designed to effectively decrease the memory bandwidth of a system-on-chip (SoC) for image processing. First, variable-block-size (VBS) intra prediction is employed to eliminate spatial redundancy in the green (G) component of an input image on a pixel-line basis. Second, inter-color prediction (ICP) using spectral correlation is performed to predict the R and B components from the previously reconstructed G-component image. Experimental results show that, compared with an existing algorithm, the proposed algorithm improves coding efficiency by up to 30% for natural images and by about 50% for computer graphics (CG) images, at low computational cost.
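
The inter-color prediction step can be illustrated with a common linear-model form: fit the target component to the reconstructed G component over a block and code only the residual. The per-block least-squares model below is an assumption for illustration, not the paper's exact predictor.

```python
import numpy as np

def icp_predict(recon_g: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Illustrative inter-color prediction: fit target ~ a*G + b over the block
    by least squares and use the fitted values as the predictor, so only the
    residual needs to be coded."""
    g = recon_g.astype(np.float64).ravel()
    t = target.astype(np.float64).ravel()
    A = np.vstack([g, np.ones_like(g)]).T
    (a, b), *_ = np.linalg.lstsq(A, t, rcond=None)
    return a * recon_g + b

# toy 8x8 block in which R is strongly correlated with the reconstructed G
G = np.random.randint(0, 256, size=(8, 8)).astype(np.float64)
R = np.clip(0.9 * G + 10 + 3 * np.random.randn(8, 8), 0, 255)
residual = R - icp_predict(G, R)       # residual to be entropy-coded
```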