• Title/Summary/Keyword: Supercomputer

Search Results: 143

Visualization of Calculated Flow Fields Using Methods of Computer Graphics (컴퓨터 그래픽을 이용한 유동의 가시화)

  • Soon-Hung Han;Kyung-Ho Lee;Kyu-Ock Lee
    • Journal of the Society of Naval Architects of Korea / v.29 no.4 / pp.7-17 / 1992
  • Developments in the emerging field of Computational Fluid Dynamics (CFD), made possible by supercomputer technologies, introduce a new problem: analysing the massive amount of output produced. This problem is common to all fields of computational science and engineering. Scientific visualization addresses it by applying advanced technologies of computer graphics. Methods of scientific visualization are studied here to visualize calculated flow fields. Different visualization methods have been surveyed, analysed, and compared, and the iso-surface method was selected. Methods of constructing iso-surfaces from a 3-D data set have been studied, and a new iso-surface construction algorithm, which can be classified as a surface tiling method, has been developed (a generic iso-surface sketch follows this entry). To develop a portable visualization system, the international standard PHIGS PLUS and its X Window System implementation, PEX, were selected as the development environment. A prototype visualization system has been developed and applied to several well-known flow fields.

  • PDF
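The surface-tiling approach mentioned in the abstract can be illustrated with a present-day off-the-shelf routine. The sketch below is not the paper's algorithm or its PHIGS PLUS/PEX system; assuming NumPy and scikit-image are installed, it only shows how an iso-surface is tiled into triangles from a synthetic 3-D scalar field.

```python
# Minimal iso-surface sketch (marching cubes via scikit-image), NOT the paper's
# own surface-tiling algorithm; the scalar field below is synthetic test data.
import numpy as np
from skimage import measure

# Hypothetical 64x64x64 scalar field standing in for a calculated flow quantity.
x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
field = np.exp(-(x**2 + y**2 + z**2))

# Tile the iso-surface at the chosen threshold into a triangle mesh.
verts, faces, normals, values = measure.marching_cubes(field, level=0.5)
print(f"iso-surface: {len(verts)} vertices, {len(faces)} triangles")
```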

A "GAP-Model" based Framework for Online VVoIP QoE Measurement

  • Calyam, Prasad;Ekici, Eylem;Lee, Chang-Gun;Haffner, Mark;Howes, Nathan
    • Journal of Communications and Networks / v.9 no.4 / pp.446-456 / 2007
  • Increased access to broadband networks has led to a fast-growing demand for voice and video over IP (VVoIP) applications such as Internet telephony (VoIP), videoconferencing, and IP television (IPTV). For pro-active troubleshooting of VVoIP performance bottlenecks that manifest to end-users as impairments such as video frame freezing and voice dropouts, network operators cannot rely on actual end-users to report their subjective quality of experience (QoE). Hence, automated and objective techniques that provide real-time or online VVoIP QoE estimates are vital. Objective techniques developed to date estimate VVoIP QoE by performing frame-to-frame peak-signal-to-noise-ratio (PSNR) comparisons of the original video sequence and the reconstructed video sequence obtained from the sender side and receiver side, respectively. Since processing such video sequences is time-consuming and computationally intensive, existing objective techniques cannot provide online VVoIP QoE. In this paper, we present a novel framework that can provide online estimates of VVoIP QoE on network paths without end-user involvement and without requiring any video sequences. The framework features the "GAP-model", an offline model of QoE expressed as a function of measurable network factors such as bandwidth, delay, jitter, and loss. Using the GAP-model, our online framework can produce VVoIP QoE estimates in terms of "Good", "Acceptable", or "Poor" (GAP) grades of perceptual quality solely from the online measured network conditions (a toy grading sketch follows this entry).
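To make the grading step concrete, here is a toy mapping from online-measured network factors to GAP grades. The real GAP-model is fitted offline from subjective testing, so the thresholds below are invented purely for illustration.

```python
# Toy illustration only: the real GAP-model is fitted offline; these thresholds
# are made up for the sketch and are not taken from the paper.
def gap_grade(bandwidth_mbps: float, delay_ms: float, jitter_ms: float, loss_pct: float) -> str:
    """Map online-measured network factors to a 'Good'/'Acceptable'/'Poor' grade."""
    if bandwidth_mbps >= 2.0 and delay_ms <= 150 and jitter_ms <= 20 and loss_pct <= 0.5:
        return "Good"
    if bandwidth_mbps >= 1.0 and delay_ms <= 300 and jitter_ms <= 50 and loss_pct <= 2.0:
        return "Acceptable"
    return "Poor"

print(gap_grade(bandwidth_mbps=1.5, delay_ms=180, jitter_ms=30, loss_pct=1.0))  # -> "Acceptable"
```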

Authentication Method using Multiple Biometric Information in FIDO Environment (FIDO 환경에서 다중 생체정보를 이용한 인증 방법)

  • Chae, Cheol-Joo;Cho, Han-Jin;Jung, Hyun Mi
    • Journal of Digital Convergence / v.16 no.1 / pp.159-164 / 2018
  • Biometric information does not need to be stored separately and carries no risk of loss or theft. For this reason, it has attracted attention as an alternative to existing authentication means such as passwords and accredited certificates. However, personal information stored on a server may leak and cause privacy problems. To overcome this weakness, FIDO avoids leakage of personal information from the server by authenticating with biometric information stored on the user's device. In this paper, we propose a multiple-biometric authentication method that can be used in the FIDO environment. To utilize multiple pieces of biometric information, fingerprints and EEG signals are captured and used in the FIDO system (a generic fusion sketch follows this entry). The proposed method can resolve the limitations of existing two-factor authentication systems by authenticating with multiple biometric factors.
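The abstract does not specify how the fingerprint and EEG results are combined, so the sketch below shows one generic option, score-level fusion with invented weights and threshold. It illustrates multi-biometric decision making in general, not the paper's FIDO protocol.

```python
# Generic score-level fusion of two biometric matchers (fingerprint + EEG).
# Weights and threshold are hypothetical; this is not the paper's FIDO flow.
def fused_decision(fp_score: float, eeg_score: float,
                   w_fp: float = 0.6, w_eeg: float = 0.4,
                   threshold: float = 0.7) -> bool:
    """Accept the user if the weighted match score of both modalities is high enough.

    Scores are assumed to be normalized to [0, 1], with 1.0 a perfect match.
    """
    fused = w_fp * fp_score + w_eeg * eeg_score
    return fused >= threshold

# A strong fingerprint match plus a moderate EEG match passes in this toy setup.
print(fused_decision(fp_score=0.9, eeg_score=0.6))  # 0.78 >= 0.7 -> True
```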

User-friendly Web-based ezSIM Platform Development for SMBs (중소·중견기업을 위한 사용자 친화형 웹 기반 ezSIM 플랫폼 개발)

  • Yoon, Tae Ho;Park, Hyungwook;Sohn, Ilyoup;Hwang, Jae Soon;Seo, Dongwoo
    • Korean Journal of Computational Design and Engineering / v.20 no.1 / pp.65-74 / 2015
  • Structural and/or fluid analysis is increasingly becoming an essential design process at small and medium-sized businesses (SMBs), driven by the need for rapid design cycles and by certification requirements for parts supplied to large businesses (LBs). In this paper, we develop the web-based ezSIM platform installed on an integrated resource server. The ezSIM platform runs on heterogeneous Linux and Windows operating systems to give SMB users friendly access to the analysis services. The structural/fluid analysis service modules, which are built on public software and license-free open codes, are described. The convenience of the ezSIM service is demonstrated by comparing the responsiveness of graphic motion with that of a local PC, and the solving and pre-/post-processing interfaces with those of the KISTI supercomputer. The web-based ezSIM platform is shown to be a useful and essential platform for SMBs carrying out structural and/or fluid analysis.

Calculation of Sputter Yield using Monte Carlo Techniques (몬테카를로 방식에 의한 스퍼터율 계산에 관한 연구)

  • 반용찬;이제희;원태영
    • Journal of the Korean Institute of Telematics and Electronics D / v.35D no.12 / pp.59-67 / 1998
  • In this paper, a rigorous three-dimensional Monte Carlo approach is presented that simulates the sputter yield as a function of incident ion energy and incident angle, as well as the atomic ejection distribution of the target. The sputter yields of the target atoms (Cu, Al) have been calculated for different incident species over the incident energy range of 10 eV to 100 keV, and the results coincide with previously reported experiments. According to the simulation results, the calculated sputter yield tends to increase with the energy of the incident atoms. Our simulation revealed that the maximum sputter yield is obtained at about 10 keV for heavy ions, while for light ions the maximum occurs at energies below 1 keV. The sputter yield also increases with the angle of incidence and peaks near 68°. For the angular distribution of the sputtered particles, the fraction of atoms ejected normal to the surface increases with the angle of incidence. Furthermore, we conducted the parallel computation on a CRAY T3E supercomputer and built a GUI (graphical user interface) system running the sputter simulator (a toy Monte Carlo sketch follows this entry).

  • PDF
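The Monte Carlo structure of such a calculation can be sketched in a few lines: sample many incident ions, count the atoms each impact ejects, and average. The cascade model below is a made-up placeholder, not the binary-collision physics used in the paper.

```python
# Toy Monte Carlo sketch of a sputter-yield estimate: launch N ions, let a
# placeholder cascade model decide how many target atoms each ion ejects,
# and average. The cascade model is invented for illustration only.
import random

def ejected_atoms(energy_ev: float, angle_deg: float) -> int:
    """Hypothetical stand-in for a collision-cascade simulation of one ion impact."""
    # Crude illustrative trend: more energy and more oblique incidence -> more ejections.
    expected = 1e-3 * energy_ev**0.5 * (1.0 + angle_deg / 90.0)
    return sum(1 for _ in range(10) if random.random() < min(expected / 10, 1.0))

def sputter_yield(energy_ev: float, angle_deg: float, n_ions: int = 100_000) -> float:
    """Estimate yield = (ejected atoms) / (incident ions) by Monte Carlo sampling."""
    total = sum(ejected_atoms(energy_ev, angle_deg) for _ in range(n_ions))
    return total / n_ions

print(sputter_yield(energy_ev=10_000, angle_deg=68))  # roughly 0.18 with this toy model
```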

EDISON Platform to Supporting Education and Integration Research in Computational Science (계산과학 분야의 교육 및 융합연구 지원을 위한 EDISON 플랫폼)

  • Jin, Du-Seok;Jung, Young-Jin;Lee, Jong-Suk Ruth;Cho, Kum-Won;Jung, Hoe-Kyung
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2011.10a / pp.466-469 / 2011
  • Recently, computational science, a new theoretical and methodological approach, has become increasingly popular for analyzing and solving scientific problems in disciplines such as computational fluid dynamics, chemistry, physics, structural dynamics, computational design, and applied research. Computational science constructs mathematical models and quantitative analysis techniques and uses large computing resources to solve problems that are difficult to approach through physical experiments. In this paper, we present the R&D of the EDISON open integration platform, which allows anyone, including professors, researchers, industrial workers, and students, to upload advanced research results such as simulation software for use and sharing on a cyber-infrastructure of supercomputers and networks. The EDISON platform, which consists of three tiers (the EDISON application framework, EDISON middleware, and EDISON infra resources), provides web portals and user services for education and research in five areas (CFD, chemistry, physics, structural dynamics, and computational design).

  • PDF

Work Allocation Methods and Performance Comparisons on the Virtual Parallel Computing System based on the IBM Aglets (IBM Aglets를 기반으로 하는 가상 병렬 컴퓨팅 시스템에서 작업 할당 기법과 성능 비교)

  • Kim, Kyong-Ha;Kim, Young-Hak;Oh, Gil-Ho
    • Journal of KIISE: Computing Practices and Letters / v.8 no.4 / pp.411-422 / 2002
  • Recently, there has been active research on the VPCS (Virtual Parallel Computing System) based on multiple agents. Instead of a high-cost supercomputer, the VPCS uses personal computers or workstations dispersed all over the Internet to solve complex problems that require a huge number of calculations. It can be made up of either homogeneous or heterogeneous computers, depending on the resources available on the Internet. In this paper, we propose a new method for efficiently distributing worker agents and work packages on a VPCS based on IBM Aglets. Previous methods rely mainly on the master-slave pattern for distributing worker agents and work packages, but with these methods the workload at the central master increases dramatically as the number of agents grows. As a solution to this problem, our method lets the worker agents themselves distribute further worker agents and work packages (a hierarchical distribution sketch follows this entry). The proposed method is evaluated in several ways on the VPCS, and the results show notable improvements over the previous methods.
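The key idea of relieving the central master can be sketched as a tree-shaped hand-off in which each worker keeps a share of the packages and delegates the rest. The code below is only an illustration of that fan-out; it is not the IBM Aglets implementation from the paper.

```python
# Simplified sketch of delegating distribution to workers instead of a single
# master: each node keeps some work packages and forwards the rest to a fixed
# number of children, forming a tree.
from typing import List

def distribute(packages: List[str], fanout: int = 3, depth: int = 0) -> None:
    """Recursively split work packages among child workers (tree fan-out)."""
    if len(packages) <= 1 or depth > 4:
        for p in packages:
            print("  " * depth + f"process {p}")
        return
    # Keep one share locally, delegate the rest to `fanout` child workers.
    chunk = max(1, len(packages) // (fanout + 1))
    local, rest = packages[:chunk], packages[chunk:]
    for p in local:
        print("  " * depth + f"process {p}")
    for i in range(fanout):
        child = rest[i::fanout]          # round-robin split for the children
        if child:
            distribute(child, fanout, depth + 1)

distribute([f"pkg-{i}" for i in range(10)])
```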

Deployment and Performance Analysis of Data Transfer Node Cluster for HPC Environment (HPC 환경을 위한 데이터 전송 노드 클러스터 구축 및 성능분석)

  • Hong, Wontaek;An, Dosik;Lee, Jaekook;Moon, Jeonghoon;Seok, Woojin
    • KIPS Transactions on Computer and Communication Systems / v.9 no.9 / pp.197-206 / 2020
  • Collaborative research in HPC-based science applications requires rapid transfer of massive data sets between research collaborators over wide-area networks. To address this requirement, studies on enhancing data transfer performance between major superfacilities in the U.S. have recently been conducted. In this paper, we deploy multiple data transfer nodes (DTNs) over high-speed science networks in order to move large amounts of data rapidly out of the parallel filesystem of KISTI's Nurion supercomputer, and we perform transfer experiments between endpoints with approximately 130 ms round-trip time. We present and compare transfer throughput results for file sets of different sizes. In addition, we confirm that a DTN cluster with three nodes can provide about 1.8 and 2.7 times higher transfer throughput than a single node under two different concurrency and parallelism settings (a toy concurrency sketch follows this entry).
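The concurrency knob referred to above can be illustrated with a small thread-pool sketch in which several simulated transfers are kept in flight at once; the sleep call merely stands in for moving one file over a long round-trip path, and nothing here is the DTN tooling used in the paper.

```python
# Toy sketch of transfer concurrency: keep several transfers in flight at once
# so long per-file latency is hidden behind overlap. sleep() is a stand-in for
# a real wide-area transfer.
import time
from concurrent.futures import ThreadPoolExecutor

def fake_transfer(name: str, seconds: float = 0.5) -> str:
    time.sleep(seconds)                  # stand-in for moving one file over a ~130 ms RTT path
    return name

def transfer_many(files, concurrency: int = 4):
    """Run up to `concurrency` transfers at the same time and collect the results."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(fake_transfer, files))

start = time.time()
transfer_many([f"file_{i}.dat" for i in range(8)], concurrency=4)
print(f"8 transfers with concurrency 4 took {time.time() - start:.1f}s")  # ~1.0s instead of ~4.0s
```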

An Environmental Impact Assessment System for Microscale Winds Based on a Computational Fluid Dynamics Model (전산유체역학모형에 근거한 미기상 바람환경 영향평가 시스템)

  • Kim, Kyu Rang;Koo, Hae Jung;Kwon, Tae Heon;Choi, Young-Jean
    • Journal of Environmental Impact Assessment / v.20 no.3 / pp.337-348 / 2011
  • Urban environmental problems have become major issues during urbanization, and environmental impacts are now assessed during urban planning and development. Although environmental impact assessment treats meteorological impact as a minor component, changes in the wind environment during development can strongly affect the distribution of air temperature, humidity, and pollutants. Assessment of the local wind is therefore a prerequisite for any other meteorological impact assessment. Computational Fluid Dynamics (CFD) models are utilized in various fields, such as assessing the wind field around a new building under construction or post-analysis of a fire event over a mountain. However, CFD models require specially formatted input data and produce specific output files that must be analyzed with dedicated programs, and their huge demand for computing power is another hurdle to practical use. In this study, a CFD model and the related software processors were automated and integrated into a microscale wind environmental impact assessment system, with a supercomputer used to reduce the model's running time. The input data processor ingests development plans in CAD- or GIS-formatted files and produces input files for the CFD model; the output data processor produces various analytical graphs on user request (a pipeline sketch follows this entry). The system was used to assess the impact of a new building near an observatory on the wind field and showed the changes caused by the construction visually and quantitatively. The microscale wind assessment system will of course evolve as the models and processors improve, but the framework suggested here can serve as a basic system for such assessments.
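As a rough illustration of the automation described above, the sketch below chains an input processor, a model run, and an output processor into one pipeline. Every function name, file format, and command in it is a hypothetical placeholder rather than the actual system's interface.

```python
# Hypothetical pipeline sketch: input processor -> CFD run -> output processor.
# All names, formats, and commands are invented placeholders.
import subprocess
from pathlib import Path

def prepare_input(plan_file: Path, work_dir: Path) -> Path:
    """Convert a CAD/GIS development plan into a CFD-model input file (placeholder)."""
    cfd_input = work_dir / "domain.in"
    cfd_input.write_text(f"# generated from {plan_file.name}\n")
    return cfd_input

def run_cfd(cfd_input: Path) -> Path:
    """Submit the CFD run (here just a placeholder command) and return the output path."""
    subprocess.run(["echo", f"running CFD on {cfd_input}"], check=True)
    return cfd_input.with_name("wind_field.out")

def make_graphs(cfd_output: Path, work_dir: Path) -> Path:
    """Turn raw model output into analysis graphs (placeholder)."""
    report = work_dir / "report.txt"
    report.write_text(f"graphs derived from {cfd_output.name}\n")
    return report

work = Path("/tmp/wind_assessment")
work.mkdir(exist_ok=True)
print(make_graphs(run_cfd(prepare_input(Path("site_plan.dxf"), work)), work))
```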

Effective Distributed Supercomputing Resource Management for Large Scale Scientific Applications (대규모 과학응용을 위한 효율적인 분산 슈퍼컴퓨팅 자원관리 기술 연구)

  • Rho, Seungwoo;Kim, Jik-Soo;Kim, Sangwan;Kim, Seoyoung;Hwang, Soonwook
    • Journal of KIISE / v.42 no.5 / pp.573-579 / 2015
  • The nationwide supercomputing infrastructure in Korea consists of geographically distributed supercomputing clusters. We developed High-Throughput Computing as a Service (HTCaaS) on top of these distributed national supercomputing clusters to make it easier for scientists to explore large-scale and complex scientific problems. In this paper, we present our mechanism for dynamically managing computing resources and show its effectiveness through a case study of a real scientific application, drug repositioning. Specifically, we show that resource utilization, accuracy, reliability, and usability can be improved by applying our resource management mechanism, which uses waiting time and success rate to identify valid computing resources (a scoring sketch follows this entry). The results show a reduction in total job completion time and an improvement in overall system throughput.
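A minimal sketch of the waiting-time/success-rate idea is shown below; the data class, thresholds, and numbers are assumptions made for illustration and do not reproduce HTCaaS's actual policy.

```python
# Minimal sketch: filter clusters by recent success rate and queue waiting time.
# Thresholds and sample numbers are invented for illustration.
from dataclasses import dataclass
from typing import List

@dataclass
class Cluster:
    name: str
    avg_wait_s: float      # recent average queue waiting time
    success_rate: float    # fraction of recently submitted jobs that completed

def valid_resources(clusters: List[Cluster],
                    max_wait_s: float = 600.0,
                    min_success: float = 0.9) -> List[Cluster]:
    """Keep clusters whose recent behaviour suggests jobs will start soon and finish."""
    ok = [c for c in clusters if c.avg_wait_s <= max_wait_s and c.success_rate >= min_success]
    # Prefer the clusters with the shortest waiting time among the valid ones.
    return sorted(ok, key=lambda c: c.avg_wait_s)

clusters = [Cluster("A", 120, 0.98), Cluster("B", 1800, 0.99), Cluster("C", 90, 0.80)]
print([c.name for c in valid_resources(clusters)])  # ['A']
```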