• Title/Summary/Keyword: file distribution


Design and Implementation of Game Server using the Efficient Load Balancing Technology based on CPU Utilization (게임서버의 CPU 사용율 기반 효율적인 부하균등화 기술의 설계 및 구현)

  • Myung, Won-Shig;Han, Jun-Tak
    • Journal of Korea Game Society
    • /
    • v.4 no.4
    • /
    • pp.11-18
    • /
    • 2004
  • The on-line games in the past were played by only two persons exchanging data over one-to-one connections, whereas recent ones (e.g. MMORPG: Massively Multi-player Online Role-playing Game) enable tens of thousands of people to be connected simultaneously. Specifically, Korea has established an excellent network infrastructure that can hardly be found anywhere else in the world; almost every household has high-speed Internet access. What made this possible was, in part, the high density of population that accelerated the formation of a good Internet infrastructure. However, this rapid increase in the use of on-line games may lead to surging traffic that exceeds the limited Internet communication capacity, so that the connection to the games becomes unstable or the server fails. Expanding the servers could solve this problem, though this measure is very costly. To deal with this problem, the present study proposes a load distribution technology that connects, in the form of a local cluster, the game servers divided by the contents used in each on-line game, reduces the loads of specific servers using a load balancer, and enhances server performance for efficient operation. In this paper, a cluster system is proposed in which each game server in the system provides a different contents service and loads are distributed efficiently using game server resource information such as CPU utilization. Game servers having different contents are mutually connected and managed with a network file system to maintain the information consistency required to support resource information updates, deletions, and additions. Simulation studies show that our method performs better than other traditional methods: in terms of response time, our method shows shorter latency than RR (Round Robin) and LC (Least Connection) by about 12% and 10%, respectively.
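The core idea, weighting new session assignments by each server's reported CPU headroom, can be sketched briefly. This is a minimal illustration under assumed inputs, not the paper's implementation; the server names and utilization figures are hypothetical.

```python
import random

# Hypothetical registry of content-specific game servers in the cluster and
# their last reported CPU utilization (0.0-1.0), e.g. shared via the NFS.
servers = {"quest": 0.35, "battle": 0.80, "trade": 0.20}

def pick_server(utilization):
    """Weight each server by its idle capacity (1 - CPU utilization) so
    lightly loaded servers receive proportionally more new sessions."""
    names = list(utilization)
    weights = [1.0 - u for u in utilization.values()]
    return random.choices(names, weights=weights, k=1)[0]

print(pick_server(servers))  # most often "trade", rarely "battle"
```

Unlike Round Robin or Least Connection, such a policy reacts directly to measured server load, which is the property the paper's simulations credit for the shorter response times.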

A Hydrogeological Study on the Determination of the Optimal Sustainable Yield of Groundwater Resources in Cheju Island (제주도 지하수자원의 최적 개발가능량 선정에 관한 수리지질학적 연구)

  • 한정상;김창길;김남종;한규상
    • Proceedings of the Korean Society of Soil and Groundwater Environment Conference
    • /
    • 1994.07a
    • /
    • pp.184-215
    • /
    • 1994
  • The hydrogeologic data of 455 water wells, comprising geologic logs and aquifer tests, were analyzed to determine the hydrogeologic characteristics of Cheju island. The groundwater of Cheju island occurs in unconsolidated pyroclastic deposits interbedded in highly jointed basaltic and andesitic rocks as high-level, basal, and parabasal types under unconfined conditions. The average transmissivity and specific yield of the aquifer are about 29,300 m$^2$/day and 0.12, respectively. The total storage of groundwater is estimated at about 44 billion cubic meters (m$^3$). Average annual precipitation is about 3,390 million m$^3$, of which the average recharge is estimated at 1,494 million m$^3$, equivalent to 44.1% of annual precipitation, with 638 million m$^3$ of runoff and 1,256 million m$^3$ of evapotranspiration. Based on a groundwater budget analysis, the sustainable yield is about 620 million m$^3$ (41% of annual recharge), and the rest discharges into the sea. The geologic logs of recently drilled thermal water wells indicate that very low-permeability marine sediments (Sehwa-ri formation), composed of loosely cemented sandy silt derived mainly from volcanic ashes of the 1st-stage volcanic activity of the area, are situated at about 120$\pm$68 m below sea level. Another low-permeability sedimentary rock, the Segipo formation, which is deemed younger than the former marine sediment, occurs in the area covering the northwestern and western parts of Cheju at about 70 m below sea level. If these impermeable beds are distributed as a basal formation of the freshwater zone of Cheju, most of the groundwater in Cheju will be of para-basal type. These formations will be one of the most important hydrogeologic boundaries controlling groundwater occurrence in the area.
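The water-budget figures in the abstract can be checked with simple arithmetic; the short sketch below reproduces the quoted percentages from the stated volumes (all in million m³ per year).

```python
# Water-budget arithmetic from the abstract (units: million m^3 per year).
precipitation = 3390
runoff = 638
evapotranspiration = 1256

recharge = precipitation - runoff - evapotranspiration  # ~1496; abstract reports 1494
print(f"recharge fraction of precipitation: {recharge / precipitation:.1%}")  # ~44.1%

sustainable_yield = 620
print(f"yield as a share of recharge: {sustainable_yield / recharge:.0%}")  # ~41%
```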

Assessment for the Utility of Treatment Plan QA System according to Dosimetric Leaf Gap in Multileaf Collimator (다엽콜리메이터의 선량학적엽간격에 따른 치료계획 정도관리시스템의 효용성 평가)

  • Lee, Soon Sung;Choi, Sang Hyoun;Min, Chul Kee;Kim, Woo Chul;Ji, Young Hoon;Park, Seungwoo;Jung, Haijo;Kim, Mi-Sook;Yoo, Hyung Jun;Kim, Kum Bae
    • Progress in Medical Physics
    • /
    • v.26 no.3
    • /
    • pp.168-177
    • /
    • 2015
  • Quality assurance of treatment plans is recommended when patients are treated with IMRT, which is complex and delicate; to this end, treatment plan quality assurance software can be used to verify the delivered dose accurately before and after treatment. The purpose of this study is to evaluate the accuracy of treatment plan quality assurance software for each IMRT plan according to the MLC DLG (dosimetric leaf gap). A Novalis Tx with a built-in HD120 MLC was used in this study to acquire the MLC dynalog files to be imported into MobiusFx. To establish the IMRT plans, the Eclipse RTP system was used, and target and organ structures (multi-target, mock prostate, mock head/neck, and C-shape cases) were contoured in the I'mRT phantom. To verify the difference in dose distribution according to the DLG, the MLC dynalog files were imported into the MobiusFx software and the DLG values (0.5, 0.7, 1.0, 1.3, 1.6 mm) were changed in MobiusFx. For dose evaluation, the dose distribution was evaluated using the 3D gamma index with a 3% dose-difference criterion and a 3 mm distance-to-agreement, and the point dose was acquired using a CC13 ionization chamber at the isocenter of the I'mRT phantom. In the point-dose results, the mock head/neck and multi-target cases showed differences of about 4% and 3% at DLGs of 0.5 and 0.7 mm, respectively, and the other DLGs showed differences of less than 3%. The gamma index passing rates of the mock head/neck case were below 81% for the PTV and cord, and those of the multi-target case were below 30% for the center and superior targets at DLGs of 0.5 and 0.7 mm; however, the inferior target of the multi-target case and the parotid of the mock head/neck case had 100.0% passing rates at all DLGs. The point dose of the mock prostate case showed differences below 3.0% at all DLGs; however, the passing rate of the PTV was below 95% at DLGs of 0.5 and 0.7 mm, while the other DLGs were above 98%. The rectum and bladder had 100.0% passing rates at all DLGs. In the C-shape case, the point-dose differences were 3-9% except at the 1.3 mm DLG, and the passing rates of the PTV at 1.0 and 1.3 mm were 96.7% and 93.0%, respectively; the passing rates at the other DLGs were below 86%, while the core had a 100.0% passing rate at all DLGs. In this study, we verified that the accuracy of a treatment plan QA system can be affected by the DLG value. For precise quality assurance of treatment techniques that use MLC motion, such as IMRT and VMAT, an appropriate DLG value should be used in the linear accelerator and the RTP system.
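The 3%/3 mm gamma evaluation used above is a standard dose-comparison metric; the sketch below is a generic brute-force global-gamma pass-rate calculation over two equally sampled dose planes, not the MobiusFx implementation.

```python
import numpy as np

def gamma_pass_rate(ref, evl, spacing_mm, dd=0.03, dta_mm=3.0):
    """Brute-force global gamma analysis of two 2-D dose planes of equal
    shape. dd is the dose-difference criterion as a fraction of the
    reference maximum; dta_mm is the distance-to-agreement in mm."""
    ys, xs = np.meshgrid(*(np.arange(n) * spacing_mm for n in ref.shape),
                         indexing="ij")
    norm = dd * ref.max()
    gammas = np.empty(evl.shape)
    for idx in np.ndindex(evl.shape):
        # For each evaluated point, minimize over all reference points.
        dist2 = (ys - ys[idx]) ** 2 + (xs - xs[idx]) ** 2
        dose2 = (ref - evl[idx]) ** 2
        gammas[idx] = np.sqrt(dist2 / dta_mm**2 + dose2 / norm**2).min()
    return (gammas <= 1.0).mean()  # fraction of points passing (gamma <= 1)
```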

Estimation of Disease Code Accuracy of National Medical Insurance Data and the Related Factors (의료보험자료 상병기호의 정확도 추정 및 관련 특성 분석 -법정전염병을 중심으로-)

  • Shin, Eui-Chul;Park, Yong-Mun;Park, Yong-Gyu;Kim, Byung-Sung;Park, Ki-Dong;Meng, Kwang-Ho
    • Journal of Preventive Medicine and Public Health
    • /
    • v.31 no.3 s.62
    • /
    • pp.471-480
    • /
    • 1998
  • This study was undertaken in order to estimate the accuracy of the disease codes in the Korean National Medical Insurance data and to describe the characteristics related to that accuracy. To accomplish these objectives, 2,431 cases coded as notifiable acute communicable diseases (NACD) were randomly selected from the 1994 National Medical Insurance data file, and family medicine specialists reviewed the medical records to confirm the diagnostic accuracy and investigate the related factors. The major findings obtained from this study are as follows: 1. The accuracy rate of the disease codes for NACD in the National Medical Insurance data was very low, at 10.1% (95% C.I.: 8.8-11.4). 2. The reasons for inaccuracy in the disease codes were 1) claiming-process-related administrative errors by physician and non-physician personnel in medical institutions (41.0%), 2) input errors in the claims data by key punchers of the National Medical Insurer (31.3%), and 3) diagnostic errors by physicians (21.7%). 3. The characteristics significantly related to lower disease code accuracy in multiple logistic regression analysis were the location and level of the medical institution: institutions in Seoul showed lower accuracy than those in Kyonggi, and general hospitals, hospitals, and clinics showed lower accuracy than tertiary hospitals. The physician-related characteristics significantly lowering the disease code accuracy were sex, age group, and specialty: male physicians showed significantly lower accuracy than female physicians; physicians in their thirties and forties also showed significantly lower accuracy than those in their twenties, as did general physicians and other specialists compared with internal medicine/pediatric specialists. This study strongly suggests that a series of policies, such as 1) establishment of a peer review organization for National Medical Insurance data, 2) prompt nationwide expansion of the computerized claiming network of the National Medical Insurance, and 3) establishment and distribution of objective diagnostic criteria to physicians, is necessary to set up a national disease surveillance system utilizing National Medical Insurance claims data.
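The multiple logistic regression step can be illustrated with a small sketch; the data here are simulated and the column names invented, since the paper's record-review dataset is not public.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "region": rng.choice(["Seoul", "Kyonggi"], n),
    "level":  rng.choice(["tertiary", "general", "hospital", "clinic"], n),
})
# Simulate lower code accuracy in Seoul and outside tertiary hospitals,
# qualitatively matching the paper's findings.
p = 0.4 - 0.1 * (df.region == "Seoul") + 0.2 * (df.level == "tertiary")
df["accurate"] = (rng.random(n) < p).astype(int)

# Multiple logistic regression of accuracy on institution characteristics.
model = smf.logit("accurate ~ C(region) + C(level)", data=df).fit(disp=False)
print(model.params)  # log-odds relative to the baseline categories
```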

Development of Information Technology Infrastructures through Construction of Big Data Platform for Road Driving Environment Analysis (도로 주행환경 분석을 위한 빅데이터 플랫폼 구축 정보기술 인프라 개발)

  • Jung, In-taek;Chong, Kyu-soo
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.19 no.3
    • /
    • pp.669-678
    • /
    • 2018
  • This study developed information technology infrastructures for building a driving environment analysis platform using various kinds of big data, such as vehicle sensing data and public data. First, on the hardware side, a small platform server with a parallel structure for distributed big data processing was developed. Next, on the software side, programs for big data collection/storage, processing/analysis, and information visualization were developed. The collection software was developed as a collection interface using Kafka, Flume, and Sqoop. The storage software was developed to divide data between the Hadoop distributed file system and a Cassandra DB according to how the data are used. The processing software was developed for spatial-unit matching and time-interval interpolation/aggregation of the collected data by applying the grid index method. The analysis software was developed as an analytical tool based on the Zeppelin notebook for applying and evaluating the developed algorithms. Finally, the information visualization software was developed as a Web GIS engine program for providing and visualizing various kinds of driving environment information. As a result of the performance evaluation, the optimal number of executors, memory capacity, and number of cores for the development server were derived, and the computation performance was superior to that of the other cloud computing environments tested.
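Of the components above, the grid-index method used for spatial-unit matching is the easiest to sketch: bucket sensing records into fixed-size cells so nearby records can be joined without an all-pairs distance scan. The cell size and record fields below are assumptions for illustration, not the platform's actual schema.

```python
from collections import defaultdict

CELL_DEG = 0.001  # assumed grid resolution, roughly 100 m in latitude

def cell_of(lat, lon):
    """Map a coordinate to its (row, col) grid cell."""
    return (int(lat / CELL_DEG), int(lon / CELL_DEG))

# Build the grid index from vehicle sensing records (fields hypothetical).
index = defaultdict(list)
for rec in [{"lat": 37.5665, "lon": 126.9780, "speed_kph": 42.0}]:
    index[cell_of(rec["lat"], rec["lon"])].append(rec)

# Match a road-link point to the sensing records in the same cell.
print(index[cell_of(37.5665, 126.9780)])
```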

Distribution and Potential Human Risk Assessment of Trace Metals in Benthic Fish Collected from the Offshore of Busan, Korea (부산 연근해 저서어류 체내의 미량금속 분포 특성과 잠재적 인체 위해성 평가)

  • Choi, Jin Young;Kim, Kyoungrean
    • Journal of Korean Society of Environmental Engineers
    • /
    • v.37 no.6
    • /
    • pp.349-356
    • /
    • 2015
  • Trace metal concentrations in the tissue of four species of edible marine fish, olive flounder (Paralichthys olivaceus), Korean rockfish (Sebastes schlegelii), file fish (Stephanolepis cirrhifer), and abyssal searobin (Lepidotrigla abyssalis), collected near the Yongho wharf in Busan, were determined to assess the potential human health risk (HRA) of trace metals from fish consumption. The levels of Li, Cr, Ni, Cu, Zn, As, Cd, and Pb in the fish tissue were $0.005{\pm}0.009$, $0.77{\pm}0.30$, $0.29{\pm}0.34$, $0.49{\pm}0.14$, $15.96{\pm}2.52$, $10.62{\pm}4.67$, $0.001{\pm}0.002$, and $0.045{\pm}0.06$ mg/kg dw, respectively. The estimated daily intakes of Cu and Zn from the fish collected near the Yongho wharf were 0.0032% and 0.054-0.18% of the PMTDI (provisional maximum tolerable daily intake), and the estimated weekly intakes of As, Cd, and Pb were 13%, 0.0041%, and 0.020% of the PTWI (provisional tolerable weekly intake), values set by the JECFA (Joint FAO/WHO Expert Committee on Food Additives) to evaluate food safety. The lifetime cancer risk and target hazard for local residents due to consumption of those fish were found to be negligible.
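The intake estimates compare measured tissue concentrations against JECFA guidance values; a minimal sketch of that arithmetic follows, with the daily fish consumption, body weight, and PMTDI value chosen for illustration rather than taken from the paper.

```python
# Estimated daily intake (EDI) of Zn from fish consumption:
# EDI = tissue concentration x daily fish intake / body weight.
zn_conc_mg_per_kg = 15.96    # Zn in fish tissue, from the abstract (mg/kg dw)
fish_intake_kg_day = 0.05    # assumed daily fish consumption
body_weight_kg = 60.0        # assumed adult body weight

edi = zn_conc_mg_per_kg * fish_intake_kg_day / body_weight_kg  # mg/kg bw/day
pmtdi_zn = 1.0               # assumed PMTDI for Zn, mg/kg bw/day
print(f"Zn EDI = {edi:.4f} mg/kg bw/day ({edi / pmtdi_zn:.2%} of PMTDI)")
```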

Overlay Multicast for File Distribution using Virtual Sources (파일전송의 성능향상을 위한 다중 가상소스 응용계층 멀티캐스트)

  • Lee Soo-Jeon;Lee Dong-Man;Kang Kyung-Ran
    • Journal of KIISE:Information Networking
    • /
    • v.33 no.4
    • /
    • pp.289-298
    • /
    • 2006
  • Algorithms for application-level multicast often use trees to deliver data from the source to the multiple receivers. With a tree structure, the throughput experienced by the descendant nodes is determined by the performance of the slowest ancestor node; furthermore, the failure of an ancestor node suspends the session of all its descendant nodes. This paper focuses on the transmission of data using multiple virtual forwarders and suggests a scheme to overcome the drawbacks of plain tree-based application layer multicast schemes. The proposed scheme elects multiple forwarders in addition to the parent node of the delivery tree. A receiver receives data from the multiple forwarders as well as from the parent node, which increases the amount of data received per unit time; the multiple forwarders also help a receiver reduce the impact of the failure of an ancestor node. The proposed scheme includes a forwarder selection algorithm that avoids the receipt of duplicate packets. We implemented the proposed scheme using MACEDON, which provides a development environment for application layer multicast, and compared it with Bullet by deploying the implementation on PlanetLab, a global overlay network. The evaluation results show that the proposed scheme enhanced the throughput by about 20% and reduced the control overhead by over 90% compared with Bullet.
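One simple way to get duplicate-free delivery from several senders, in the spirit of the forwarder selection described above, is to partition the file's block indices among the parent and the elected forwarders. This is an illustrative sketch, not the paper's actual selection algorithm.

```python
def assign_blocks(num_blocks, senders):
    """Round-robin the block indices across senders so each block is
    streamed by exactly one sender and no duplicates are received."""
    plan = {s: [] for s in senders}
    for block in range(num_blocks):
        plan[senders[block % len(senders)]].append(block)
    return plan

# Parent plus two elected virtual forwarders (names hypothetical).
print(assign_blocks(10, ["parent", "forwarder-A", "forwarder-B"]))
# {'parent': [0, 3, 6, 9], 'forwarder-A': [1, 4, 7], 'forwarder-B': [2, 5, 8]}
```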

A Study on the System for AI Service Production (인공지능 서비스 운영을 위한 시스템 측면에서의 연구)

  • Hong, Yong-Geun
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.11 no.10
    • /
    • pp.323-332
    • /
    • 2022
  • As various services using AI technology are being developed, much attention is being paid to AI service production. Recently, as AI technology has come to be acknowledged as one of the ICT services, a lot of research is being conducted on general-purpose AI service production. In this paper, I describe research results on the systems side of AI service production, focusing on the distribution and production of machine learning models, which are the final steps of the general machine learning development procedure. Three different Ubuntu systems were built, and experiments were conducted on them using data from the COCO 2017 validation dataset, combining different AI models (RFCN, SSD-Mobilenet) and different communication methods (gRPC, REST) to request and perform AI services through TensorFlow Serving. Through various experiments, it was found that the type of AI model has a greater influence on AI service inference time than the communication method, and that, for an object detection AI service, the number and complexity of the objects in an image affect the inference time more than the file size of the image to be detected. In addition, it was confirmed that when an AI service is performed remotely rather than locally, inference takes more time even on a machine with good performance. Through the results of this study, it is expected that system designs suitable for service goals, AI model development, and efficient AI service production will become possible.
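For context, a REST inference request against TensorFlow Serving follows the pattern below; the host, port, model name, and payload shape are placeholders that depend on the served model.

```python
import json
import urllib.request

# TensorFlow Serving's REST endpoint has the form
# http://HOST:8501/v1/models/MODEL_NAME:predict
url = "http://localhost:8501/v1/models/ssd_mobilenet:predict"
payload = {"instances": [{"inputs": [[[0, 0, 0]]]}]}  # dummy 1x1 RGB image

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["predictions"])
```

The gRPC path instead uses the `tensorflow-serving-api` client stubs over a binary protocol, which is the comparison the experiments above measure.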

Comparison of the wall clock time for extracting remote sensing data in Hierarchical Data Format using Geospatial Data Abstraction Library by operating system and compiler (운영 체제와 컴파일러에 따른 Geospatial Data Abstraction Library의 Hierarchical Data Format 형식 원격 탐사 자료 추출 속도 비교)

  • Yoo, Byoung Hyun;Kim, Kwang Soo;Lee, Jihye
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.21 no.1
    • /
    • pp.65-73
    • /
    • 2019
  • MODIS (Moderate Resolution Imaging Spectroradiometer) data in the Hierarchical Data Format (HDF) have been processed using the Geospatial Data Abstraction Library (GDAL). Because of the relatively large data size, it would be preferable to build and install the data analysis tool for the greatest computing performance, which would differ by operating system and by the form of distribution, e.g., source code or binary package. The objective of this study was to examine the performance of GDAL for processing HDF files, which would guide the construction of a computer system for remote sensing data analysis. The differences in execution time were compared between the environments under which GDAL was installed. The wall clock time was measured after extracting the data for each variable in a MODIS data file using a tool built by linking against GDAL under combinations of operating system (Ubuntu and openSUSE), compiler (GNU and Intel), and distribution form. The MOD07 product, which contains atmosphere data, was processed for eight 2-D variables and two 3-D variables. The GDAL build compiled with the Intel compiler under Ubuntu had the shortest computation time. For openSUSE, the GDAL builds compiled using the GNU and Intel compilers had greater performance for the 2-D and 3-D variables, respectively. It was found that the wall clock time was considerably longer for GDAL compiled with the "--with-hdf4=no" configuration option or installed via the RPM package manager under openSUSE. These results indicate that the choice of environment under which GDAL is installed, e.g., operating system or compiler, has a considerable impact on the performance of a system for processing remote sensing data. Application of parallel computing approaches would improve the performance of data processing for HDF files, which merits further evaluation of these computational methods.
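A minimal timing harness of the kind described, using GDAL's Python bindings to pull every subdataset out of an HDF4 granule, might look like the following; the file name is a placeholder, and GDAL must have been built with HDF4 support for the open to succeed.

```python
import time
from osgeo import gdal

ds = gdal.Open("MOD07_L2.A2018001.0000.061.hdf")  # placeholder granule name

start = time.perf_counter()
for name, description in ds.GetSubDatasets():
    sub = gdal.Open(name)            # open one HDF4 subdataset
    array = sub.ReadAsArray()        # 2-D or 3-D depending on the variable
    print(description, array.shape)
print(f"wall clock time: {time.perf_counter() - start:.3f} s")
```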

Development of Preliminary Quality Assurance Software for GafChromic® EBT2 Film Dosimetry (GafChromic® EBT2 Film Dosimetry를 위한 품질 관리용 초기 프로그램 개발)

  • Park, Ji-Yeon;Lee, Jeong-Woo;Choi, Kyoung-Sik;Hong, Semie;Park, Byung-Moon;Bae, Yong-Ki;Jung, Won-Gyun;Suh, Tae-Suk
    • Progress in Medical Physics
    • /
    • v.21 no.1
    • /
    • pp.113-119
    • /
    • 2010
  • Software for GafChromic EBT2 film dosimetry was developed in this study. The software provides film calibration functions based on color channels, categorized into red, green, blue, and gray. Evaluations of corrections for the light scattering of a flat-bed scanner and for thickness differences in the active layer are available. Dosimetric results from EBT2 films can be compared with those from the treatment planning system ECLIPSE or the two-dimensional ionization chamber array MatriXX. Dose verification using EBT2 films is implemented by carrying out the following procedures: file import, noise filtering, background correction and active layer correction, dose calculation, and evaluation. Relative and absolute background corrections can be applied selectively. The calibration results and the fitting equation for the sensitometric curve are exported to files. After two different types of dose matrices are aligned through interpolation of the spatial pixel spacing, interactive translation, and rotation, profiles and isodose curves are compared. In addition, the gamma index and gamma histogram are analyzed according to the determined criteria of distance-to-agreement and dose difference. Performance was evaluated by dose verification in a 60°-enhanced dynamic wedge field and in intensity-modulated (IM) beams for prostate cancer. All pass ratios for the two types of tests were more than 99% in the evaluation, using a gamma histogram with 3 mm and 3% criteria. The software was developed for use in routine periodic quality assurance and complex IM beam verification. It can also be used as a dedicated radiochromic film software tool for analyzing dose distribution.
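The calibration step, fitting a sensitometric curve that maps net optical density back to dose, can be sketched as below; the calibration points and the cubic fit are invented for illustration and are not the software's actual model.

```python
import numpy as np

# Toy red-channel calibration data: delivered dose vs. net optical density.
dose_cGy = np.array([0.0, 50.0, 100.0, 200.0, 300.0, 400.0])
net_od   = np.array([0.00, 0.08, 0.15, 0.26, 0.35, 0.42])

# Fit dose as a polynomial function of net optical density.
to_dose = np.poly1d(np.polyfit(net_od, dose_cGy, deg=3))
print(f"netOD 0.20 -> {to_dose(0.20):.1f} cGy")
```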