• Title/Summary/Keyword: Supercomputer Center


A Study on the Infra-Capacity Analysis for Optimal Operating Environments of Supercomputer Center (슈퍼컴퓨터센터의 최적 운영환경을 위한 기반시설 용량 산정에 관한 연구)

  • Ryu, Young-Hee;Sung, Jin-Woo;Kim, Duk-Su;Kil, Seong-Ho
    • KIEAE Journal
    • /
    • v.10 no.2
    • /
    • pp.19-24
    • /
    • 2010
  • Given the increasing demand for supercomputing, a dedicated supercomputer building is needed to house a supercomputer that promotes high-end R&D and provides public service infrastructure at the national level. KISTI, as a public supercomputer center operating the 4th supercomputer (with a capacity of 360 Tflops), is experiencing a shortage of infrastructure capacity caused by the increased system scale, and the situation is expected to grow more serious when the 5th and 6th supercomputers are installed. This study projects the performance level of the 5th supercomputer system and derives its optimal operating environment by assessing the required infrastructure capacity, and it explores ways to construct optimal operating environments through an infrastructure-capacity analysis of the supercomputer center. The study is useful for reviewing KISTI's conditions as the only supercomputer center in Korea, and it provides reference data, including a feasibility perspective and an analysis of infrastructure systems, for planning a new dedicated supercomputer center.
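The abstract does not spell out how the infra-capacity figures are derived; purely as an illustration of this kind of sizing, here is a minimal sketch that scales an assumed IT load by a PUE-style overhead factor. The function name, the PUE value, and the cooling split are assumptions, not values from the study.

```python
# Minimal sketch of an infrastructure-capacity estimate for a supercomputer
# facility. The paper does not publish its sizing formulas; this assumes a
# simple PUE-based model with hypothetical numbers for illustration only.

def facility_capacity(it_load_kw: float, pue: float, cooling_ratio: float = 0.8):
    """Estimate total facility power and cooling load from the IT load.

    it_load_kw    -- electrical load of the compute system itself (kW)
    pue           -- assumed Power Usage Effectiveness of the data center
    cooling_ratio -- assumed share of non-IT power that goes to cooling
    """
    total_power_kw = it_load_kw * pue          # IT load plus facility overheads
    overhead_kw = total_power_kw - it_load_kw  # cooling, UPS losses, lighting, ...
    cooling_kw = overhead_kw * cooling_ratio   # rough chiller/CRAC sizing basis
    return total_power_kw, cooling_kw

if __name__ == "__main__":
    # Hypothetical IT load of 3,000 kW at an assumed PUE of 1.5.
    total, cooling = facility_capacity(it_load_kw=3000, pue=1.5)
    print(f"Total facility power: {total:.0f} kW, cooling load: {cooling:.0f} kW")
```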

A Study on the Role and B/C Analysis of National Supported Supercomputing Center (국가 주도 슈퍼컴퓨터센터의 역할과 B/C 분석 및 발전방향)

  • 이정희
    • Journal of Korea Technology Innovation Society
    • /
    • v.1 no.3
    • /
    • pp.402-418
    • /
    • 1998
  • This study analyzes the role and B/C (benefit/cost) of the nationally supported supercomputing center in the course of Korea's informatization drive. The ETRI Supercomputing Center, the nationally supported supercomputing center, was established in 1967 as a laboratory of KIST (Korea Institute of Science and Technology) and has played a leading role as the national HPCC (High Performance Computing and Communication) facility in Korea. The B/C analysis of the ETRI Supercomputer Center shows that its benefits over the last 30 years were about twenty times its costs. The study suggests that the ETRI Supercomputer Center should be developed into a National Supercomputing Center (NSC) as soon as possible.
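As a reminder of how a benefit/cost ratio of this kind is obtained, here is a minimal sketch that discounts hypothetical annual benefit and cost streams to present value and takes their ratio. The streams and discount rate are illustrative placeholders, not the study's data.

```python
# Minimal benefit/cost (B/C) ratio sketch. The streams and discount rate below
# are hypothetical placeholders, not figures from the 1998 study.

def npv(cash_flows, rate):
    """Present value of a list of annual cash flows at a constant discount rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical 30-year streams (arbitrary currency units).
annual_benefits = [200.0] * 30   # e.g., value of research output enabled per year
annual_costs = [10.0] * 30       # e.g., operation and depreciation per year
rate = 0.05                      # assumed social discount rate

bc_ratio = npv(annual_benefits, rate) / npv(annual_costs, rate)
print(f"B/C ratio: {bc_ratio:.1f}")   # prints 20.0 with these made-up numbers
```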


Enabling Performance Intelligence for Application Adaptation in the Future Internet

  • Calyam, Prasad;Sridharan, Munkundan;Xu, Yingxiao;Zhu, Kunpeng;Berryman, Alex;Patali, Rohit;Venkataraman, Aishwarya
    • Journal of Communications and Networks
    • /
    • v.13 no.6
    • /
    • pp.591-601
    • /
    • 2011
  • Today's Internet, which provides communication channels with best-effort end-to-end performance, is rapidly evolving into an autonomic global computing platform. Achieving autonomicity in the Future Internet will require a performance architecture that (a) allows users to request and own 'slices' of geographically distributed host and network resources, (b) measures and monitors end-to-end host and network status, (c) enables analysis of the measurements within expert systems, and (d) provides performance intelligence in a timely manner for application adaptations that improve performance and scalability. We describe the requirements and design of one such "Future Internet performance architecture" (FIPA) and present our reference implementation of FIPA called 'OnTimeMeasure.' OnTimeMeasure comprises several measurement-related services that can interact with each other and with existing measurement frameworks to enable performance intelligence. We also explain our OnTimeMeasure deployment in the Global Environment for Network Innovations (GENI) infrastructure, a collaborative research initiative to build a sliceable Future Internet. Further, we present an application-adaptation case study in GENI that uses OnTimeMeasure-enabled performance intelligence in the context of dynamic resource allocation within thin-client-based virtual desktop clouds. We show how a virtual desktop cloud provider in the Future Internet can use this performance intelligence to increase cloud scalability while simultaneously delivering satisfactory user quality of experience.
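The abstract outlines a measure-analyze-adapt loop (requirements (b) through (d)). The sketch below illustrates such a control loop with hypothetical classes and thresholds; it is not the actual OnTimeMeasure API or the paper's adaptation policy.

```python
# Minimal sketch of a measure-analyze-adapt loop of the kind FIPA describes.
# The classes, names, and thresholds are hypothetical illustrations, not the
# actual OnTimeMeasure services or their APIs.

import random
import time

class MeasurementService:
    """Stands in for an end-to-end latency monitor on a network slice."""
    def sample_latency_ms(self) -> float:
        return random.uniform(20, 120)   # placeholder measurement

class DesktopCloudProvider:
    """Stands in for a virtual desktop cloud adapting its resource allocation."""
    def __init__(self):
        self.sessions_per_host = 8

    def adapt(self, latency_ms: float, threshold_ms: float = 90.0):
        # Shed load when measured latency threatens user quality of experience;
        # pack more sessions per host when there is clear headroom.
        if latency_ms > threshold_ms and self.sessions_per_host > 1:
            self.sessions_per_host -= 1
        elif latency_ms < threshold_ms / 2:
            self.sessions_per_host += 1

if __name__ == "__main__":
    monitor, provider = MeasurementService(), DesktopCloudProvider()
    for _ in range(5):
        latency = monitor.sample_latency_ms()
        provider.adapt(latency)
        print(f"latency={latency:5.1f} ms -> sessions/host={provider.sessions_per_host}")
        time.sleep(0.1)
```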

Preferences for Supercomputer Resources Using the Logit Model

  • Hyungwook Shim;Jaegyoon Hahm
    • Journal of information and communication convergence engineering
    • /
    • v.21 no.4
    • /
    • pp.261-267
    • /
    • 2023
  • Public research requiring large computational resources uses the supercomputers of the National Supercomputing Center in the Republic of Korea, whose average resource utilization rate over the past three years reached 80%. To ensure the operational stability of this national infrastructure, specialized centers have been established to distribute the computational demand concentrated in the national center. Building resources at an appropriate scale requires accurate prediction of computational demand, and in particular an estimate of the inflow and outflow of demand between the national and specialized centers. We therefore conducted a logit model analysis based on probabilistic utility theory to derive individual users' preferences for future supercomputer resources. The analysis shows that the computational demand share of the specialized centers is 59.5%, which exceeds the resource utilization plan of the existing specialized centers.
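The share estimate comes from a discrete-choice (logit) model. The paper's utility specification is not given in the abstract, so the following is a minimal multinomial-logit sketch with made-up systematic utilities, showing how choice shares of this kind are computed.

```python
# Minimal multinomial logit sketch for center-choice shares. The utilities are
# hypothetical; the paper's actual specification is not reproduced here.

import math

def logit_shares(utilities):
    """Choice probabilities P_i = exp(V_i) / sum_j exp(V_j)."""
    exps = [math.exp(v) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical systematic utilities V for two alternatives: the national center
# and a specialized center (e.g., from cost, queue wait, and software support).
V_national = 1.2
V_specialized = 1.6

shares = logit_shares([V_national, V_specialized])
print(f"national center share:    {shares[0]:.1%}")
print(f"specialized center share: {shares[1]:.1%}")   # ~59.9% with these values
```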

An Interface between Computing, Ecology and Biodiversity : Environmental Informatics

  • Stockwell, David;Arzberger, Peter;Fountain, Tony;Helly, John
    • The Korean Journal of Ecology
    • /
    • v.23 no.2
    • /
    • pp.101-106
    • /
    • 2000
  • The grand challenge for the 21st century is to harness knowledge of the earth's biological and ecological diversity to understand how they shape global environmental systems. This insight benefits both science and society. Biological and ecological data are among the most diverse and complex in the scientific realm, spanning vast temporal and spatial scales, distant localities, and multiple disciplines. Environmental informatics is an emerging discipline applying information science, ecology, and biodiversity to the understanding and solution of environmental problems. In this paper we give an overview of the experiences of the San Diego Supercomputer Center (SDSC) with this new multidisciplinary science, discuss the application of computing resources to the study of environmental systems, and outline strategic partnership activities in environmental informatics that are underway. We hope to foster interactions between ecology, biodiversity, and conservation researchers in the East Asia-Pacific Rim and those at SDSC and the Partnership for Biodiversity Informatics.


A Study on the Government's Investment Priorities for Building a Supercomputer Joint Utilization System

  • Hyungwook Shim;Jaegyoon Hahm
    • Asian Journal of Innovation and Policy
    • /
    • v.12 no.2
    • /
    • pp.200-215
    • /
    • 2023
  • The purpose of this paper is to analyze the Korean government's investment priorities for the establishment of a supercomputer joint utilization system using AHP. The AHP model was designed as a two-layer structure consisting of two areas, specialized infrastructure and a one-stop joint utilization service, with four evaluation items for the detailed tasks. For the weight of each evaluation item, a cost-efficiency index reflecting the annual budget was developed for the first time and applied to the weight calculation. For the AHP analysis, a survey of supercomputer experts was conducted, and priorities were derived from 22 responses that passed reliability verification. The analysis shows that the government's highest investment priorities are, in order, dividing the infrastructure among the Specialized Centers and building resources in stages. The results will be used to select economic promotion plans and to prepare strategies for establishing the government's supercomputer joint utilization system.
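AHP derives criterion weights from pairwise expert comparisons. The expert judgments themselves are not given in the abstract, so the sketch below uses a hypothetical 4x4 comparison matrix with the standard principal-eigenvector (power-iteration) method and Saaty's consistency check.

```python
# Minimal AHP weight sketch. The pairwise-comparison matrix below is
# hypothetical, not the expert judgments collected in the paper.

def ahp_weights(matrix, iters=100):
    """Approximate the principal eigenvector of a pairwise matrix by power iteration."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w_new)
        w = [x / s for x in w_new]
    return w

def consistency_ratio(matrix, w):
    """CR = ((lambda_max - n) / (n - 1)) / RI, with Saaty's RI = 0.90 for n = 4."""
    n = len(matrix)
    lam = sum(sum(matrix[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    return ((lam - n) / (n - 1)) / 0.90

# Hypothetical reciprocal pairwise matrix over four evaluation items.
A = [
    [1.0, 3.0, 2.0, 5.0],
    [1/3, 1.0, 1/2, 2.0],
    [1/2, 2.0, 1.0, 3.0],
    [1/5, 1/2, 1/3, 1.0],
]

w = ahp_weights(A)
print("weights:", [round(x, 3) for x in w])
print("consistency ratio:", round(consistency_ratio(A, w), 3))   # < 0.10 means acceptably consistent
```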

Power Control with Nearest Neighbor Nodes Distribution for Coexisting Wireless Body Area Network Based on Stochastic Geometry

  • Liu, Ruixia;Wang, Yinglong;Shu, Minglei;Zhao, Huiqi;Chen, Changfang
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.12 no.11
    • /
    • pp.5218-5233
    • /
    • 2018
  • Coexisting wireless body area networks (WBANs) pose a very challenging problem because of strong inter-network interference, which seriously affects energy consumption and the spectrum utilization ratio. In this paper, we study a power control strategy with nearest-neighbor node distribution for coexisting WBANs based on stochastic geometry. Using a homogeneous Poisson point process (PPP) model, the relationship between the transmission power and the network distribution is analytically derived so as to reduce interference to other devices. The goal of this paper is to increase the transmission success probability and throughput through the power control strategy. In addition, we evaluate the area spectral efficiency of simultaneously active WBANs in the same channel. Finally, extensive simulations are conducted to evaluate the power control algorithm.
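The paper's result is analytical; purely as an illustration of the underlying stochastic-geometry setup, the following Monte Carlo sketch draws PPP-distributed interferers and estimates the transmission success probability P(SIR > threshold) under Rayleigh fading for a few transmit-power levels. All parameter values are assumptions, not the paper's.

```python
# Monte Carlo sketch of link success probability when interferers follow a
# homogeneous Poisson point process (PPP). Parameters are illustrative and
# are not taken from the paper's analytical derivation.

import math
import random

def poisson(lam):
    """Poisson random variate via Knuth's method (fine for moderate lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def success_probability(tx_power, interferer_power, density, radius=50.0,
                        link_dist=2.0, alpha=3.0, sir_threshold=1.0, trials=5000):
    """Estimate P(SIR > threshold) with Rayleigh fading and path-loss exponent alpha."""
    area = math.pi * radius ** 2
    successes = 0
    for _ in range(trials):
        # Desired signal: exponential (Rayleigh) fading over the intra-WBAN link.
        signal = tx_power * random.expovariate(1.0) * link_dist ** (-alpha)
        # Interferers: a Poisson number of coexisting WBANs dropped uniformly in a disc.
        interference = 0.0
        for _ in range(poisson(density * area)):
            r = max(radius * math.sqrt(random.random()), 0.5)  # uniform in the disc
            interference += interferer_power * random.expovariate(1.0) * r ** (-alpha)
        if signal / max(interference, 1e-12) > sir_threshold:
            successes += 1
    return successes / trials

if __name__ == "__main__":
    # Raising the node's transmit power relative to the interferers' power
    # improves the success probability -- the trade-off power control manages.
    for p_tx in (0.5, 1.0, 2.0):
        est = success_probability(p_tx, interferer_power=1.0, density=0.005)
        print(f"tx_power={p_tx}: success probability ~ {est:.2f}")
```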

The Analysis of the Supercomputer Trends in Weather and Climate Research Areas (기상 및 기후 연구 분야의 슈퍼컴퓨터 보유 추이 분석)

  • Joh, Minsu;Park, Hyei-Sun
    • Atmosphere
    • /
    • v.15 no.2
    • /
    • pp.119-127
    • /
    • 2005
  • Predicting future weather and climate conditions in advance is challenging work. Since ENIAC was developed, weather and climate research has taken advantage of improvements in computer hardware: high-performance computers allow researchers to build high-quality models and thus make better predictions of what might happen in the future. Statistics on high-performance computers are of major interest not only to manufacturers but also to users such as weather and climate researchers. For this reason, the Top500 Supercomputer Sites Report has been released twice a year since 1993 to provide a reliable basis for tracking and detecting trends in high-performance computing. Using the Top500 Report, this article provides a short review of supercomputer trends in weather and climate research areas.
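As an illustration of the kind of tally such a review involves, the sketch below counts weather- and climate-related systems per Top500 release from a toy set of records. The entries and field layout are made up for illustration and do not reflect the actual Top500 data format.

```python
# Minimal sketch of counting weather/climate systems per Top500 release.
# The records and field names here are made up; a real analysis would load
# the published Top500 lists instead.

from collections import Counter

top500_records = [
    # (release, site, application_area) -- illustrative entries only
    ("2003-11", "Example Weather Service", "Weather and Climate Research"),
    ("2003-11", "Example University", "Research"),
    ("2004-06", "Example Meteorological Administration", "Weather and Climate Research"),
    ("2004-06", "Example Lab", "Finance"),
]

def weather_climate_counts(records):
    """Count weather/climate-attributed systems per Top500 release."""
    counts = Counter()
    for release, _site, area in records:
        if "weather" in area.lower() or "climate" in area.lower():
            counts[release] += 1
    return dict(counts)

print(weather_climate_counts(top500_records))
# {'2003-11': 1, '2004-06': 1} for this toy sample
```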

An Economic Analysis on the Operation Effect of Public Supercomputer (공용 슈퍼컴퓨터 운영효과에 대한 경제성 분석)

  • Lee, Hyung Jin;Choi, Youn Keun;Park, Jinsoo
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.23 no.4
    • /
    • pp.69-79
    • /
    • 2018
  • We perform a cost-benefit analysis, an economic analysis technique, to measure the effect of a shared public supercomputer. The costs of two alternatives, sharing the public supercomputer at a national center versus each organization in need deploying its own supercomputer, are estimated and compared for decision making. For the sharing case, the cost can be predicted directly from the results of the previous public supercomputer. The cost of individual introduction, however, varies considerably with the required system performance, location, human factors, and so on, and is therefore hard to predict. Accordingly, this study proposes an objective and valid method to estimate the cost of the individual cases. Finally, we analyze the economic effect of operating a public supercomputer by comparing the sharing cost with that of individual deployment. The results confirm that sharing the public supercomputer reduces operational cost by about 10.3 billion won annually compared with individual introduction, so a considerable economic effect can be expected.
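At its core the comparison sums the estimated cost of each organization introducing its own system and subtracts the cost of the shared center. The sketch below shows that bookkeeping with placeholder figures; the paper's cost-estimation method and its 10.3 billion won result are not reproduced here.

```python
# Minimal sketch of the sharing-vs-individual cost comparison. All figures are
# hypothetical placeholders, not the cost estimates from the paper.

# Assumed annual cost (billion KRW) if each organization ran its own system,
# reflecting its required performance, facility, and staffing.
individual_costs = {
    "org_A": 4.2,
    "org_B": 3.1,
    "org_C": 5.6,
    "org_D": 2.8,
}

shared_center_cost = 5.0   # assumed annual cost of the shared public supercomputer

total_individual = sum(individual_costs.values())
annual_saving = total_individual - shared_center_cost

print(f"individual total: {total_individual:.1f} bn KRW/yr")
print(f"shared center:    {shared_center_cost:.1f} bn KRW/yr")
print(f"annual saving:    {annual_saving:.1f} bn KRW/yr")
```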

A Study of the FEM Forming Analysis of the Al Powder Forging Piston (유한요소해석을 이용한 알루미늄분말단조 피스톤 성형해석에 관한 연구)

  • Kim, Ho-Yoon;Park, Chul-Woo;Kim, Hyun-Il;Park, Kyung-Seo;Kim, Young-Ho;Joe, Ho-Sung
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.34 no.10
    • /
    • pp.1543-1548
    • /
    • 2010
  • Powder metallurgy processes are used to form net-shape products and have been widely used in the production of automobile parts to improve manufacturing productivity. Powder-forging technology is developing rapidly because of its economic merits and the possibility of reducing the weight of automotive parts by replacing steel parts with aluminum ones. In the powder-forging process, products manufactured by powder metallurgy are forged in order to remove any pores inside them. Powder-forging technology can help expand the applications of powder metallurgy because it minimizes flash, reduces the number of forming stages, and allows grain refinement. At present, powder forging is widely used for manufacturing primary mechanical parts, as in the powder forging of aluminum-alloy pistons.