• Title/Summary/Keyword: Amount of Cloud

Search Results: 390

Symbiotic Dynamic Memory Balancing for Virtual Machines in Smart TV Systems

  • Kim, Junghoon;Kim, Taehun;Min, Changwoo;Jun, Hyung Kook;Lee, Soo Hyung;Kim, Won-Tae;Eom, Young Ik
    • ETRI Journal / v.36 no.5 / pp.741-751 / 2014
  • Smart TV is expected to bring cloud services based on virtualization technologies to the home environment with hardware and software support. Although most physical resources can be shared among virtual machines (VMs) using a time sharing approach, allocating the proper amount of memory to VMs is still challenging. In this paper, we propose a novel mechanism to dynamically balance the memory allocation among VMs in virtualized Smart TV systems. In contrast to previous studies, where a virtual machine monitor (VMM) is solely responsible for estimating the working set size, our mechanism is symbiotic. Each VM periodically reports its memory usage pattern to the VMM. The VMM then predicts the future memory demand of each VM and rebalances the memory allocation among the VMs when necessary. Experimental results show that our mechanism improves performance by up to 18.28 times and reduces expensive memory swapping by up to 99.73% with negligible overheads (0.05% on average).
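The abstract does not spell out the prediction or rebalancing algorithm, so the following is only a minimal sketch of the symbiotic idea under stated assumptions: each guest periodically reports its memory usage, and a hypothetical VMM-side balancer uses an exponential moving average as the demand predictor and redistributes a fixed memory pool in proportion to predicted demand. All class names and parameters are illustrative.

```python
# Minimal sketch of symbiotic memory balancing (hypothetical names/parameters).
# Each VM reports its recent memory usage; the "VMM" predicts future demand
# with an exponential moving average and redistributes a fixed memory pool
# in proportion to the predicted demand.

from dataclasses import dataclass, field

@dataclass
class VMState:
    allocation_mb: int               # memory currently assigned to the VM
    predicted_demand_mb: float = 0.0
    history: list = field(default_factory=list)

class Balancer:
    def __init__(self, total_mb, min_mb=256, alpha=0.5):
        self.total_mb = total_mb     # size of the shared memory pool
        self.min_mb = min_mb         # floor so no VM is starved
        self.alpha = alpha           # EMA smoothing factor
        self.vms = {}

    def report(self, vm_id, used_mb):
        """Called periodically by each guest (the 'symbiotic' part)."""
        vm = self.vms.setdefault(vm_id, VMState(allocation_mb=self.min_mb))
        vm.history.append(used_mb)
        # Exponential moving average as a stand-in for the paper's predictor.
        vm.predicted_demand_mb = (self.alpha * used_mb +
                                  (1 - self.alpha) * (vm.predicted_demand_mb or used_mb))

    def rebalance(self):
        """Redistribute the pool in proportion to predicted demand."""
        total_demand = sum(v.predicted_demand_mb for v in self.vms.values()) or 1.0
        spare = self.total_mb - self.min_mb * len(self.vms)
        for vm in self.vms.values():
            share = vm.predicted_demand_mb / total_demand
            vm.allocation_mb = int(self.min_mb + share * spare)

# Example: two VMs with different usage patterns.
b = Balancer(total_mb=4096)
for used_a, used_b in [(500, 1500), (600, 1800), (550, 2200)]:
    b.report("vm-a", used_a)
    b.report("vm-b", used_b)
    b.rebalance()
print({vm_id: vm.allocation_mb for vm_id, vm in b.vms.items()})
```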

Study on Simulation of Dust Diffusion at Open Pit Mines (노천광산의 발파분진 비산영역 예측에 관한 연구)

  • 김복윤;이상권;조영도;김임호
    • Tunnel and Underground Space / v.8 no.3 / pp.194-199 / 1998
  • This research aimed to characterize the pattern of dust diffusion at an open pit limestone mine in order to assess its environmental impact on a high-voltage power transmission line. While it is relatively easy to measure the dust generation and size distribution of limestone dust at the blasting site, it is very difficult to estimate the expected area of dust diffusion and the amount of dust fall as a function of distance from the source. In this research, three-dimensional fluid dynamic simulation software (3D-Flow) was used to analyze these questions and to assess the impact of the blasting dust on the insulators of the transmission towers. It was verified that 3D-Flow is a reliable tool for simulating dust movement and that the limestone dust poses little hazard to the power transmission line.


Effect of Volatile Matter and Oxygen Concentration on Tar and Soot Yield Depending on Coal Type in a Laminar Flow Reactor (LFR에서 탄종에 따른 휘발분과 산소농도가 타르와 수트의 발생률에 미치는 영향)

  • Jeong, Tae Yong;Kim, Yong Gyun;Kim, Jin Ho;Lee, Byoung Hwa;Song, Ju Hun;Jeon, Chung Hwan
    • Korean Chemical Engineering Research / v.50 no.6 / pp.1034-1042 / 2012
  • This study was performed using an LFR (laminar flow reactor), which supports a range of coal research. In this study, the LFR was used to analyze coal flames, tar and soot yields, and char structures for two coals with different volatile contents. The results show that the volatile content and oxygen concentration have a significant effect on the length and width of the soot cloud, and that the length and width of the cloud under combustion conditions are smaller than those under a pyrolysis atmosphere. At sampling heights up to 50 mm, the tar and soot yields of Berau (sub-bituminous) coal, which contains a large amount of volatile matter, are lower than those of Glencore A.P. (bituminous) coal because tar is oxidized by the intrinsic oxygen component of the coal and by radicals such as OH-. On the other hand, at sampling heights above 50 mm, the tar and soot yields of Berau coal are higher than those of Glencore A.P. coal because of reactions of the residual volatile matter, tar, and light gases in the char and flame. These results confirm that the volatile matter content and the intrinsic oxygen component of a coal are significant parameters for the length and width of the soot cloud and for the soot yield. In addition, the B.E.T. results and SEM images of samples obtained from the particle separation system of the sampling probe support the yield results and confirm the pore development on the char surface caused by devolatilization.

Characteristics and Quality Control of Precipitable Water Vapor Measured by G-band (183 GHz) Water Vapor Radiometer (G-band (183 GHz) 수증기 라디오미터의 가강수량 특성과 품질 관리)

  • Kim, Min-Seong;Koo, Tae-Young;Kim, Ji-Hyoung;Jung, Sueng-Pil;Kim, Bu-Yo;Kwon, Byung Hyuk;Lee, Kwangjae;Kang, Myeonghun;Yang, Jiwhi;Lee, ChulKyu
    • Journal of the Korean earth science society / v.43 no.2 / pp.239-252 / 2022
  • Quality control methods for the first G-band vapor radiometer (GVR) mounted on a weather aircraft in Korea were developed using the GVR precipitable water vapor (PWV). The aircraft attitude information (pitch and roll angles) was applied in quality control to select the shortest vertical path of the GVR beam. In addition, quality control was applied to remove GVR PWV values of 20 mm or greater. It was found that the difference between the warm-load average power and the sky-load average power converged to near zero when the GVR PWV increased to 20 mm or higher. This could be due to the high brightness temperature of the substratus and mesoclouds, which was confirmed by the Communication, Ocean and Meteorological Satellite (COMS) data (cloud type, cloud top height, and cloud amount), the cloud combination probe (CCP), and the precipitation imaging probe (PIP). The GVR PWV before and after quality control on a cloudy day was quantitatively compared with that of the Local Data Assimilation and Prediction System (LDAPS): the root mean square difference (RMSD) decreased from 2.9 to 1.8 mm, and the RMSD with the Korea Local Analysis and Prediction System (KLAPS) decreased from 5.4 to 4.3 mm, showing improved accuracy. In addition, the effectiveness of the quality control for GVR PWV suggested in this study was verified by comparing the quality-controlled GVR PWV and the dropsonde PWV with the COMS PWV.
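As an illustration of the quality-control steps described above, here is a small sketch in Python. The 20 mm PWV rejection threshold comes from the abstract; the pitch/roll limit, column names, and sample values are assumptions made for the example.

```python
# Illustrative quality-control filter for airborne PWV retrievals.
# The >= 20 mm PWV rejection follows the abstract; the pitch/roll limit
# and column names are assumptions for the sake of the example.

import numpy as np
import pandas as pd

PWV_MAX_MM = 20.0         # reject PWV >= 20 mm (from the abstract)
ATTITUDE_LIMIT_DEG = 5.0  # hypothetical pitch/roll limit for a near-vertical beam

def quality_control(df):
    """Keep samples with small pitch/roll (shortest vertical path) and PWV < 20 mm."""
    ok_attitude = (df["pitch_deg"].abs() < ATTITUDE_LIMIT_DEG) & \
                  (df["roll_deg"].abs() < ATTITUDE_LIMIT_DEG)
    ok_pwv = df["pwv_mm"] < PWV_MAX_MM
    return df[ok_attitude & ok_pwv]

def rmsd(a, b):
    """Root mean square difference between two PWV series."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.sqrt(np.mean((a - b) ** 2)))

# Example with synthetic data: compare GVR PWV with a model (e.g. LDAPS) PWV.
df = pd.DataFrame({
    "pwv_mm":    [8.2, 12.5, 25.0, 9.9],
    "pitch_deg": [1.0, 2.5, 0.5, 8.0],
    "roll_deg":  [0.5, 1.0, 0.2, 1.5],
    "model_pwv": [8.0, 11.9, 14.0, 9.5],
})
qc = quality_control(df)
print("RMSD after QC:", rmsd(qc["pwv_mm"], qc["model_pwv"]))
```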

Application of Point Cloud Data for Transmission Power Line Monitoring (송전선 모니터링을 위한 포인트클라우드 데이터 활용)

  • Park, Joon-Kyu;Um, Dae-Yong
    • Journal of the Korea Academia-Industrial cooperation Society / v.19 no.11 / pp.224-229 / 2018
  • Korea is experiencing a rapid increase in electricity consumption due to rapid economic development, and many power transmission towers have been installed to provide a stable power supply. High-voltage transmission lines are mainly made of aluminum stranded wire, and the conductors are strung loosely so that some deflection (sag) is maintained. The degree of deflection has a great influence on the quality of the construction and the life of the cable, and as time passes, shrinkage and expansion occur repeatedly due to the weight of the cable and the surrounding environment. Therefore, periodic monitoring is essential for the management of power transmission lines. In this study, power transmission lines were monitored using 3D laser scanning technology. Data on the power transmission line in the study area were acquired, and point cloud based 3D geospatial information of the transmission line was extracted through data processing. The length of the transmission line and the amount of deflection were calculated using this 3D geospatial information, and the distance from surrounding obstacles could be calculated effectively. The results show the utility of 3D laser scanning technology for transmission line management. If a transmission line monitoring system using 3D laser scanning technology is developed, future research will contribute to more efficient transmission line management.
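A minimal sketch of how span length and sag (deflection) might be estimated from a conductor's point cloud is shown below; it is not the paper's actual processing chain. Sag is approximated as the maximum vertical drop of the points below the straight chord joining the two attachment points.

```python
# Minimal sketch: estimate span length and sag of a conductor from its
# 3D point cloud (N x 3 array). Illustrative only.

import numpy as np

def span_length_and_sag(points):
    """points: (N, 3) array of x, y, z for one conductor span."""
    pts = np.asarray(points, dtype=float)

    # Principal horizontal direction of the span (PCA on x, y).
    xy = pts[:, :2] - pts[:, :2].mean(axis=0)
    _, _, vt = np.linalg.svd(xy, full_matrices=False)
    s = xy @ vt[0]                              # position of each point along the span

    a, b = pts[s.argmin()], pts[s.argmax()]     # attachment (end) points
    span_length = float(np.linalg.norm(b - a))

    # Height of the chord a-b at each point's along-span position.
    t = (s - s.min()) / (s.max() - s.min())
    chord_z = a[2] + t * (b[2] - a[2])
    sag = float(np.max(chord_z - pts[:, 2]))    # max vertical drop below the chord
    return span_length, sag

# Example with a synthetic span exhibiting roughly 4 m of mid-span sag.
x = np.linspace(0.0, 100.0, 200)
z = 30.0 - 4.0 * np.sin(np.pi * x / 100.0)
cloud = np.column_stack([x, np.zeros_like(x), z])
print(span_length_and_sag(cloud))
```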

Radiative Properties at King Sejong Station in West Antarctica with the Radiative Transfer Model : A Surface UV-A and Erythemal UV-B Radiation Changes (대기 복사 모형에 의한 남극 세종기지에서의 복사학적 특징 : 지표면에서 UV-A와 Erythemal UV-B 자외선 양 변화)

  • Lee, Kyu-Tae;Lee, Bang-Yong;Won, Young-In;Jee, Joon-Bum;Lee, Won-Hak;Kim, Youn-Joung
    • Ocean and Polar Research / v.25 no.1 / pp.9-20 / 2003
  • A solar radiation model was used to investigate the UV radiation at the surface of King Sejong Station in West Antarctica. The results calculated by this model were compared with the values measured by a UV-Biometer and a UV-A meter during 1999-2000. In this study, the parameterization of the solar radiative transfer process was based on Chou and Lee (1996). The total ozone amounts measured by a Brewer ozone spectrophotometer and the aerosol amounts from Nakajima et al. (1996) were used as input data for the solar radiative transfer model, and the surface albedo was assumed to be 0.20 in summer and 0.85 in winter. A sensitivity test of the solar radiative transfer model was performed with respect to variations in total ozone, aerosol amount, and surface albedo. When the cosine of the solar zenith angle is 0.3, erythemal UV-B radiation decreased by 73% with a 200% increase of total ozone from 100 DU to 300 DU, but the decrease of UV-A radiation was only about 1%. For the same solar zenith angle, UV-A radiation decreased by 31.0% as the aerosol optical thickness varied from 0.0 to 0.3, whereas erythemal UV-B radiation decreased by only 6.1%. The increase of erythemal UV-B radiation with the variation of surface albedo was twice that of the UV-A increase. The surface erythemal UV-B and UV-A radiation calculated by the solar radiative transfer model were compared with the values measured on relatively clear days at King Sejong Station. The model-calculated erythemal UV-B radiation at the surface coincided well with the measured values except on cloudy days, but the difference between the model-calculated UV-A radiation and the measured values was large because of cloud scattering effects. Therefore, cloud property data are needed to calculate the UV radiation more accurately at King Sejong Station in West Antarctica.

Analysis of Factors for Korean Women's Cancer Screening through Hadoop-Based Public Medical Information Big Data Analysis (Hadoop기반의 공개의료정보 빅 데이터 분석을 통한 한국여성암 검진 요인분석 서비스)

  • Park, Min-hee;Cho, Young-bok;Kim, So Young;Park, Jong-bae;Park, Jong-hyock
    • Journal of the Korea Institute of Information and Communication Engineering / v.22 no.10 / pp.1277-1286 / 2018
  • In this paper, we provide an Apache Hadoop based cloud environment with flexible scalability of computing resources for the analysis of public medical information big data. In particular, it includes the ability to quickly and flexibly extend storage, memory, and other resources as log data accumulate or grow over time. In addition, when real-time analysis of the accumulated unstructured log data is required, the system adopts a Hadoop-based analysis module to overcome the processing limits of existing analysis tools, providing fast and reliable parallel distributed processing of large amounts of log data. For the big data analysis, frequency analysis and chi-square tests were performed. In addition, multivariate logistic regression analysis at a significance level of 0.05 and multivariate logistic regression analysis of the significant variables (p < 0.05) were performed, and the multivariate logistic regression analysis was carried out for each of the three models.
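The statistical steps named in the abstract (frequency analysis, chi-square test, logistic regression) can be illustrated on a small in-memory example; the Hadoop-based distributed processing is omitted, and the variables and data below are purely hypothetical.

```python
# Illustrative version of the statistical steps described in the abstract
# (frequency analysis, chi-square test, logistic regression). The column
# names and data are hypothetical; the actual study runs these analyses
# on Hadoop-processed public medical records.

import pandas as pd
from scipy.stats import chi2_contingency
import statsmodels.api as sm

df = pd.DataFrame({
    "screened":  [1, 0, 1, 1, 0, 1, 0, 1, 0, 1],   # cancer screening uptake
    "age_group": [0, 0, 1, 1, 0, 1, 1, 1, 0, 0],   # 0: <50, 1: >=50 (hypothetical)
    "insured":   [1, 0, 1, 0, 1, 1, 0, 1, 1, 0],
})

# Frequency analysis and chi-square test of screening vs. age group.
table = pd.crosstab(df["age_group"], df["screened"])
chi2, p, dof, _ = chi2_contingency(table)
print(table, f"\nchi2={chi2:.3f}, p={p:.3f}")

# Logistic regression; in the study, variables significant at p < 0.05
# would be carried into the multivariate model.
X = sm.add_constant(df[["age_group", "insured"]])
model = sm.Logit(df["screened"], X).fit(disp=0)
print(pd.DataFrame({"coef": model.params, "p_value": model.pvalues}))
```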

Selection of Optimal Variables for Clustering of Seoul using Genetic Algorithm (유전자 알고리즘을 이용한 서울시 군집화 최적 변수 선정)

  • Kim, Hyung Jin;Jung, Jae Hoon;Lee, Jung Bin;Kim, Sang Min;Heo, Joon
    • Journal of Korean Society for Geospatial Information Science / v.22 no.4 / pp.175-181 / 2014
  • The Korean government proposed a new initiative, 'Government 3.0', under which the administration will open its datasets to the public before requests are made. The city of Seoul is the front runner in the disclosure of government data. If we know which attributes are the governing factors for a given segmentation, these outcomes can be applied to real-world problems of marketing, business strategy, and administrative decision making. However, for the city of Seoul, selecting optimal variables from an open dataset of up to several thousand attributes would require an enormous amount of computation time, because it amounts to a combinatorial optimization that maximizes the dissimilarity measures between clusters. In this study, we acquired a 718-attribute dataset from Statistics Korea and conducted an analysis to select the most suitable variables, which differentiate Gangnam from the other districts, using a genetic algorithm and Dunn's index. We also utilized the Microsoft Azure cloud computing system to speed up the processing time. As a result, an optimal set of 28 variables was finally selected, and the validation showed that those 28 variables effectively separate Gangnam from the other districts under Ward's minimum-variance and K-means clustering algorithms.
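A compact sketch of the described search is given below: a genetic algorithm over binary attribute masks, with the fitness of each mask taken as Dunn's index of a K-means clustering on the selected columns. The random data, population size, and GA rates are illustrative stand-ins for the 718-attribute dataset and the Azure-scale run.

```python
# Genetic algorithm over binary attribute masks; fitness = Dunn's index of a
# K-means clustering on the selected columns. Data and parameters are illustrative.

import numpy as np
from sklearn.cluster import KMeans
from scipy.spatial.distance import pdist, cdist

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 12))            # stand-in for the 718-attribute dataset
N_CLUSTERS, POP, GENS, MUT = 3, 20, 30, 0.05

def dunn_index(X_sel, labels):
    """Minimum inter-cluster distance divided by maximum intra-cluster diameter."""
    clusters = [X_sel[labels == k] for k in np.unique(labels)]
    diam = max(pdist(c).max() if len(c) > 1 else 0.0 for c in clusters)
    inter = min(cdist(a, b).min()
                for i, a in enumerate(clusters) for b in clusters[i + 1:])
    return inter / diam if diam > 0 else 0.0

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    X_sel = X[:, mask.astype(bool)]
    labels = KMeans(N_CLUSTERS, n_init=10, random_state=0).fit_predict(X_sel)
    return dunn_index(X_sel, labels)

pop = rng.integers(0, 2, size=(POP, X.shape[1]))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]          # truncation selection
    cut = rng.integers(1, X.shape[1], size=POP // 2)       # one-point crossover
    children = np.array([np.concatenate([parents[i % len(parents)][:c],
                                          parents[(i + 1) % len(parents)][c:]])
                         for i, c in enumerate(cut)])
    children ^= (rng.random(children.shape) < MUT).astype(int)  # bit-flip mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected attribute indices:", np.flatnonzero(best))
```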

Selection of Transition Point through Calculation of Cumulative Toxic Load -Focused on Incheon Area- (누적독성부하 산정을 통한 주민소산 전환시점 선정에 관한 연구 -인천지역을 중심으로-)

  • Lee, Eun Ji;Han, Man Hyeong;Chon, Young Woo;Lee, Ik Mo;Hwang, Yong Woo
    • Journal of the Korean Society of Safety / v.35 no.6 / pp.15-24 / 2020
  • With the development of the chemical industry, chemical accidents are increasing every year, thereby increasing the risk posed by chemicals. The Ministry of Environment provides criteria for choosing between shelter-in-place and outdoor evacuation by material, duration of the accident, and distance from the toxic substance leak. However, the criteria for determining the transition point are not clear. The transition point means the time at which the evacuation method is switched from shelter-in-place to outdoor evacuation. The purpose of this study was therefore to calculate an appropriate transition point by comparing cumulative toxic loads. Namdong-gu in Incheon Metropolitan City was selected as the target area, considering the 2016 population statistics of Incheon Metropolitan City and the 2016 statistical survey of chemicals. The target materials were HCl, HF, and NH3. Dispersion modeling was performed with ALOHA, assuming that the entire inventory would leak over 10 min. The residents' evacuation scenarios were shelter-in-place, immediate outdoor evacuation, and outdoor evacuation at an appropriate time after shelter-in-place. Based on this method, appropriate transition points were identified for residents located at A (800 m away), B (1,200 m away), C (1,400 m away), and D (2,200 m away). For HCl, the appropriate transition points were after 15 min, 16 min, 17 min, and 20 min at A, B, C, and D, respectively. For HF, the appropriate transition points were before 1 min or after 16 min, before 4 min or after 19 min, before 5 min or after 20 min, and before 14 min or after 26 min at A, B, C, and D, respectively. For NH3, the appropriate transition point at A was before 4 min or after 16 min; the other locations were not within the chemical cloud. This study confirmed that the transition point that minimizes the cumulative toxic load can be obtained by a quantitative method, which makes it possible to select the evacuation method with the minimum cumulative toxic load quantitatively. In addition, if shelter-in-place is maintained without a transition to outdoor evacuation, the cumulative toxic load becomes larger than for outdoor evacuation; it was therefore confirmed that actions to reduce the indoor concentration of chemicals, such as ventilating after the chemical cloud has passed the site, are necessary.
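The following schematic sketch illustrates the idea of choosing a transition point by minimizing the cumulative toxic load. The outdoor concentration profile, indoor infiltration rate, and toxic-load exponent are assumptions for the example; the study itself derives concentrations from ALOHA dispersion modeling of HCl, HF, and NH3.

```python
# Schematic calculation of cumulative toxic load and the transition point
# (shelter-in-place -> outdoor evacuation). Concentration profile, infiltration
# rate, and toxic-load exponent are hypothetical.

import numpy as np

dt = 1.0                                             # time step [min]
t = np.arange(0.0, 60.0, dt)                         # 60-minute window
outdoor = 50.0 * np.exp(-((t - 15.0) / 8.0) ** 2)    # passing cloud [ppm], hypothetical

# First-order infiltration: indoor air slowly exchanges with outdoor air, so the
# indoor concentration lags the cloud and persists after it has passed.
ACH = 0.5 / 60.0                                     # air changes per minute (assumed)
indoor = np.zeros_like(outdoor)
for i in range(1, len(t)):
    indoor[i] = indoor[i - 1] + ACH * (outdoor[i - 1] - indoor[i - 1]) * dt

N_EXP = 1.0                                          # toxic-load exponent (assumed)

def cumulative_toxic_load(transition_idx):
    """Shelter-in-place until the transition index, outdoor exposure afterwards."""
    conc = np.concatenate([indoor[:transition_idx], outdoor[transition_idx:]])
    return float(np.sum(conc ** N_EXP) * dt)

loads = [cumulative_toxic_load(i) for i in range(len(t))]
best = int(np.argmin(loads))
print(f"lowest cumulative toxic load for a transition at t = {t[best]:.0f} min "
      f"(load = {loads[best]:.1f})")
```

In this toy setup the minimum falls where the decaying outdoor concentration drops below the lingering indoor concentration, which is the same qualitative behavior the abstract describes: waiting indoors until the cloud has largely passed and then evacuating outdoors.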

Investigations on Techniques and Applications of Text Analytics (텍스트 분석 기술 및 활용 동향)

  • Kim, Namgyu;Lee, Donghoon;Choi, Hochang;Wong, William Xiu Shun
    • The Journal of Korean Institute of Communications and Information Sciences / v.42 no.2 / pp.471-492 / 2017
  • The demand for and interest in big data analytics are increasing rapidly. The concept of big data includes not only existing structured data, but also various kinds of unstructured data such as text, images, videos, and logs. Among the various types of unstructured data, text data have gained particular attention because text is the most representative means of describing and delivering information. Text analysis is generally performed in the following order: document collection, parsing and filtering, structuring, frequency analysis, and similarity analysis. The results of the analysis can be presented through word clouds, word networks, topic modeling, document classification, and semantic analysis. Notably, there is an increasing demand to identify trending topics from the rapidly growing text data generated through various social media. Thus, research on and applications of topic modeling have been actively carried out in various fields, since topic modeling can extract the core topics from a huge amount of unstructured text documents and provide document groups for each topic. In this paper, we review the major techniques and research trends of text analysis. We also introduce some cases of applications that solve problems in various fields by using topic modeling.
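As a minimal illustration of the topic-modeling step described above, the sketch below runs LDA on a toy corpus with scikit-learn; the documents, number of topics, and parameters are placeholders.

```python
# Minimal sketch of the topic-modeling step: LDA extracts core topics from
# unstructured documents and groups documents by topic. Toy corpus only.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "cloud computing offers scalable storage and virtual machines",
    "memory allocation in virtual machines needs dynamic balancing",
    "topic modeling extracts core topics from text documents",
    "social media text is analyzed with topic modeling and word clouds",
]

# Structuring: document-term matrix after simple parsing/filtering.
vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(docs)

# Topic modeling with LDA (2 topics for this toy corpus).
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(dtm)              # per-document topic mixture

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"topic {k}:", ", ".join(top))
print("document-topic distribution:\n", doc_topics.round(2))
```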