• Title/Summary/Keyword: Cloud Data Center

Goal-driven Optimization Strategy for Energy and Performance-Aware Data Centers for Cloud-Based Wind Farm CMS

  • Elijorde, Frank; Kim, Sungho; Lee, Jaewan
    • KSII Transactions on Internet and Information Systems (TIIS), v.10 no.3, pp.1362-1376, 2016
  • A cloud computing system can be characterized by the provision of resources in the form of services to third parties on a leased, usage-based basis, as well as by the private infrastructures maintained and utilized by individual organizations. To attain the desired reliability and energy efficiency in a cloud data center, trade-offs need to be made between system performance and power consumption. Resolving these conflicting goals is often the major challenge encountered in the design of optimization strategies for cloud data centers. The work presented in this paper is directed towards the development of an Energy-efficient and Performance-aware Cloud System equipped with strategies for dynamically switching its optimization approach. Moreover, a platform is also provided for the deployment of a Wind Farm CMS (Condition Monitoring System) which allows ubiquitous access. Due to the geographically dispersed nature of wind farms, the CMS can take advantage of the cloud's highly scalable architecture in order to maintain reliable and efficient operation capable of handling multiple simultaneous users and huge amounts of monitoring data. Using the proposed cloud architecture, a Wind Farm CMS is deployed in a virtual platform to monitor and evaluate the aging conditions of the turbine's major components in concurrent, yet isolated working environments.
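
A rough, hypothetical Python sketch of the dynamic-switching idea described above is given below; the utilization and SLA-violation thresholds, metric names, and mode labels are assumptions for illustration, not the authors' actual strategy.

```python
# Hypothetical illustration of goal-driven mode switching between energy
# efficiency and performance; thresholds and actions are assumptions.
from dataclasses import dataclass

@dataclass
class ClusterMetrics:
    avg_cpu_utilization: float   # 0.0 - 1.0, averaged across hosts
    sla_violation_rate: float    # fraction of requests missing their SLA

def choose_optimization_mode(m, high_util=0.80, high_sla_viol=0.05):
    """Pick the optimization goal from the current cluster state."""
    if m.sla_violation_rate > high_sla_viol or m.avg_cpu_utilization > high_util:
        # Performance is at risk: spread load and wake standby hosts.
        return "performance"
    # Plenty of headroom: consolidate VMs and power down idle hosts.
    return "energy"

print(choose_optimization_mode(ClusterMetrics(0.55, 0.01)))  # energy
print(choose_optimization_mode(ClusterMetrics(0.90, 0.02)))  # performance
```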

CloudSwitch: A State-aware Monitoring Strategy Towards Energy-efficient and Performance-aware Cloud Data Centers

  • Elijorde, Frank; Lee, Jaewan
    • KSII Transactions on Internet and Information Systems (TIIS), v.9 no.12, pp.4759-4775, 2015
  • The reduction of power consumption in large-scale data centers is highly dependent on the use of virtualization to consolidate multiple workloads. However, these consolidation strategies must also take into account additional important parameters such as performance, reliability, and profitability. Resolving these conflicting goals is often the major challenge encountered in the design of optimization strategies for cloud data centers. In this paper, we put forward a data center monitoring strategy which dynamically alters its approach depending on the cloud system's current state. Results show that our proposed scheme outperforms strategies that focus only on a single metric, such as SLA-awareness or energy efficiency.
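
The abstract does not detail how the system's current state is determined; the sketch below is a hypothetical illustration of a state-aware monitor that classifies hosts by utilization and selects a strategy accordingly (threshold values and strategy names are assumptions, not the CloudSwitch algorithm).

```python
# Hypothetical state-aware monitoring sketch; thresholds and strategy
# names are illustrative assumptions, not the CloudSwitch algorithm.

def classify_host(cpu_utilization, under=0.30, over=0.85):
    if cpu_utilization < under:
        return "underloaded"   # candidate for consolidation / shutdown
    if cpu_utilization > over:
        return "overloaded"    # candidate for migrating VMs away
    return "normal"

def select_strategy(host_utilizations):
    states = [classify_host(u) for u in host_utilizations]
    if "overloaded" in states:
        return "sla-aware"         # protect performance first
    if states.count("underloaded") > len(states) // 2:
        return "energy-efficient"  # consolidate and power down idle hosts
    return "balanced"

print(select_strategy([0.10, 0.20, 0.25, 0.40]))  # energy-efficient
print(select_strategy([0.90, 0.50, 0.60, 0.70]))  # sla-aware
```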

Integration of Cloud and Big Data Analytics for Future Smart Cities

  • Kang, Jungho; Park, Jong Hyuk
    • Journal of Information Processing Systems, v.15 no.6, pp.1259-1264, 2019
  • Nowadays, cloud computing and big data analytics are central to many industries seeking to take advantage of the potential benefits of building future smart cities. The integration of cloud computing and big data analytics drives massive adoption in many organizations because it avoids the potential complexities of on-premise big data systems. With these two technologies, the manufacturing industry, healthcare systems, education, academia, etc. are developing rapidly, and they will offer various benefits as they expand their domains. In this issue, we present a summary of 18 high-quality articles, accepted following a rigorous review process, in the field of cloud computing and big data analytics.

Five Forces Model of Computational Power: A Comprehensive Measure Method

  • Wu, Meixi; Guo, Liang; Yang, Xiaotong; Xie, Lina; Wang, Shaopeng
    • KSII Transactions on Internet and Information Systems (TIIS), v.16 no.7, pp.2239-2256, 2022
  • In this paper, a model is proposed to comprehensively evaluate computational power. The five forces model of computational power solves the problem that the measurement units of different indexes are not unified during computational power evaluation. It combines the bidirectional projection method with the TOPSIS method. This model is more scientific and effective in evaluating the comprehensive state of computational power. Lastly, an example shows the validity and practicability of the model.
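
The bidirectional projection step is specific to the paper, but TOPSIS itself is a standard multi-criteria method; the minimal sketch below (assuming a toy decision matrix, equal weights, and all benefit-type criteria) ranks alternatives by their relative closeness to the ideal solution.

```python
# Minimal TOPSIS sketch for ranking alternatives on benefit-type criteria.
# The decision matrix and equal weights are illustrative assumptions; the
# paper's bidirectional projection step is not reproduced here.
import numpy as np

def topsis(matrix, weights):
    # 1. Vector-normalize each criterion column, then apply the weights.
    norm = matrix / np.linalg.norm(matrix, axis=0)
    weighted = norm * weights
    # 2. Ideal (best) and anti-ideal (worst) value per criterion.
    ideal, anti_ideal = weighted.max(axis=0), weighted.min(axis=0)
    # 3. Distances to both, then relative closeness in [0, 1].
    d_plus = np.linalg.norm(weighted - ideal, axis=1)
    d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
    return d_minus / (d_plus + d_minus)

# Toy example: 3 alternatives scored on 5 "forces" (higher is better).
scores = np.array([[0.8, 0.6, 0.9, 0.7, 0.5],
                   [0.6, 0.9, 0.7, 0.8, 0.6],
                   [0.9, 0.5, 0.6, 0.6, 0.9]])
closeness = topsis(scores, np.full(5, 0.2))
print(closeness, "best alternative:", int(np.argmax(closeness)))
```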

Outsourcing decryption algorithm of Verifiable transformed ciphertext for data sharing

  • Guangwei Xu; Chen Wang; Shan Li; Xiujin Shi; Xin Luo; Yanglan Gan
    • KSII Transactions on Internet and Information Systems (TIIS), v.18 no.4, pp.998-1019, 2024
  • Mobile cloud computing is a very attractive service paradigm that outsources users' data computing and storage from mobile devices to cloud data centers. To protect data privacy, users often encrypt their data to ensure secure data sharing before outsourcing. However, the bilinear and power operations involved in encryption and decryption make it difficult for mobile devices with weak computational power and limited network transmission capability to correctly obtain decryption results. To this end, this paper proposes an outsourcing decryption algorithm for verifiable transformed ciphertext. First, the algorithm uses the key blinding technique to divide the user's private key into two parts, i.e., the authorization key and the decryption secret key. Then, after obtaining the authorization key and the user's outsourced decryption request, the cloud data center performs the outsourcing decryption operation to achieve partial decryption of the encrypted data. A verifiable random function is used to ensure that the semi-trusted cloud data center actually performs the outsourcing decryption operation as required, so that the verifiability of the outsourced decryption is satisfied. Finally, the algorithm uses an authorization period to control the final decryption by the authorized user. Theoretical and experimental analyses show that the proposed algorithm reduces the computational overhead of ciphertext decryption while ensuring the verifiability of the outsourced decryption.
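
The paper's construction relies on pairing-based operations that are not reproduced here; purely as a simplified illustration of the key-blinding idea, the toy sketch below splits an ElGamal-style private key into an authorization key handed to the cloud for partial decryption and a decryption secret key kept by the user for final decryption (parameters and message encoding are assumptions, with no claim of matching the authors' scheme).

```python
# Toy ElGamal-style sketch of key blinding for outsourced decryption.
# Illustrative only: simplified parameters, no message encoding/padding,
# and not the paper's pairing-based construction. Requires Python 3.8+.
import secrets

P = 2**127 - 1   # a known Mersenne prime, used here only for illustration
G = 3

def keygen():
    x = secrets.randbelow(P - 1)             # full private key
    return x, pow(G, x, P)                   # (sk, pk)

def split_key(x):
    x1 = secrets.randbelow(P - 1)            # authorization key -> cloud
    x2 = (x - x1) % (P - 1)                  # decryption secret key -> user
    return x1, x2

def encrypt(pk, m):
    r = secrets.randbelow(P - 1)
    return pow(G, r, P), (m * pow(pk, r, P)) % P

def cloud_partial_decrypt(x1, c1):
    return pow(c1, x1, P)                    # heavy exponentiation done by the cloud

def user_final_decrypt(x2, c1, c2, partial):
    shared = (partial * pow(c1, x2, P)) % P  # reconstructs pk^r
    return (c2 * pow(shared, -1, P)) % P     # modular inverse removes the mask

x, pk = keygen()
x1, x2 = split_key(x)
c1, c2 = encrypt(pk, 42)
assert user_final_decrypt(x2, c1, c2, cloud_partial_decrypt(x1, c1)) == 42
```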

Public Key based Secure Data Management Scheme for the Cloud Data Centers in Public Institution (공공기관 클라우드 데이터 센터에 활용 가능한 공개키 기반의 안전한 데이터 관리 기법)

  • Wi, Yukyeong; Kwak, Jin
    • Journal of Digital Convergence, v.11 no.12, pp.467-477, 2013
  • Cloud computing has propagated rapidly, and there is growing interest in introducing cloud services in public institutions. Accordingly, domestic public institutions are devising plans to adopt cloud computing and, more specifically, are building cloud computing systems. However, for the introduction and revitalization of cloud services in public institutions, solutions are required for various security threats (e.g., attacks on storage availability, access by unauthorized attackers, data downloaded from uncertain identifiers, and decreased reliability of cloud data centers). Therefore, in this paper, we propose a public key based secure data management scheme for cloud data centers in public institutions. With the proposed scheme, only authorized users can access the data center, and public data can be managed systematically, securely, and efficiently by setting its importance and difficulty levels. As a result, cloud services for public institutions gain improved overall security and convenience.

Integrity verification of VM data collected in private cloud environment and reliability verification of related forensic tools (사설 클라우드 환경에서 수집된 VM 데이터의 무결성 입증과 관련 포렌식 도구의 신뢰성 검증)

  • Kim, Deunghwa; Jang, Sanghee; Park, Jungheum; Kang, Cheulhoon; Lee, Sangjin
    • Journal of the Korea Institute of Information Security & Cryptology, v.23 no.2, pp.223-230, 2013
  • Recently, a large number of corporations have been adopting cloud solutions in order to reduce IT-related costs. However, digital traces must be admissible as digital evidence in court, and integrity is one of the factors required for admissibility. In this context, this research performed integrity verification tests on VM data collected from well-known private cloud solutions such as Citrix, VMware, and MS Hyper-V. Based on the experiments, this paper suggests an effective way to verify the integrity of VM data collected in private cloud computing environments and reports an error in which EnCase fails to mount VHD (Virtual Hard Disk) files properly.
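
As a minimal illustration of the kind of hash-based integrity check involved (not the paper's full forensic procedure), the sketch below re-hashes a collected VM disk image in chunks and compares the digest with the value recorded at acquisition time; the file path, digest, and chunk size are hypothetical.

```python
# Minimal sketch: verify that a collected VM disk image (e.g., a VHD file)
# still matches the hash recorded at acquisition time. Path, digest, and
# chunk size are illustrative assumptions.
import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    """Stream the file in chunks so images larger than memory can be hashed."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_integrity(path, recorded_digest):
    """True if the collected image still matches the digest taken at acquisition."""
    return sha256_of_file(path) == recorded_digest

# Hypothetical usage (path and digest are placeholders):
# verify_integrity("evidence/vm01.vhd", "3a7b...digest-recorded-at-acquisition...")
```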

Massive 3D Point Cloud Visualization by Generating Artificial Center Points from Multi-Resolution Cube Grid Structure (다단계 정육면체 격자 기반의 가상점 생성을 통한 대용량 3D point cloud 가시화)

  • Yang, Seung-Chan; Han, Soo Hee; Heo, Joon
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography, v.30 no.4, pp.335-342, 2012
  • 3D point clouds are widely used in architecture, civil engineering, medicine, computer graphics, and many other fields. With improvements in 3D laser scanners, massive 3D point clouds whose file sizes exceed a computer's memory require efficient preprocessing and visualization. We suggest a data structure to solve this problem: during preprocessing, a 3D point cloud is gradually subdivided by an arbitrary-sized cube grid structure, and corresponding point cloud subsets are generated from the center of each grid cell. A massive 3D point cloud file was tested with two algorithms, QSplat and ours. Our grid-based algorithm was slower in preprocessing but rendered faster than QSplat. Our algorithm is also designed to support editing and segmentation using the original coordinates of the 3D point cloud.
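
To make the grid idea concrete, the sketch below (a rough illustration, not the authors' multi-resolution implementation) snaps points to a cube grid of an assumed cell size and emits one artificial center point per occupied cell; repeating with smaller cell sizes would yield the finer levels of a multi-resolution structure.

```python
# Illustrative sketch: generate one artificial center point per occupied
# cube-grid cell of a 3D point cloud. The cell size is an assumed parameter.
import numpy as np

def grid_center_points(points, cell_size):
    """points: (N, 3) array of XYZ coordinates; returns one point per occupied cell."""
    cell_idx = np.floor(points / cell_size).astype(np.int64)  # cell index of every point
    occupied = np.unique(cell_idx, axis=0)                    # one row per occupied cell
    return (occupied + 0.5) * cell_size                       # artificial center points

# Toy usage: 100,000 random points reduced to coarse cell centers.
pts = np.random.rand(100_000, 3) * 10.0
coarse = grid_center_points(pts, cell_size=1.0)
print(pts.shape, "->", coarse.shape)
```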

Secure and Efficient Privacy-Preserving Identity-Based Batch Public Auditing with Proxy Processing

  • Zhao, Jining; Xu, Chunxiang; Chen, Kefei
    • KSII Transactions on Internet and Information Systems (TIIS), v.13 no.2, pp.1043-1063, 2019
  • By delegating a proxy to process data before outsourcing, data owners with restricted access can enjoy flexible and powerful cloud storage services for productivity, but they still confront data integrity breaches. Identity-based data auditing, as a critical technology, can address this security concern efficiently and eliminate the complicated management of owners' public key certificates. Recently, Yu et al. proposed an Identity-Based Public Auditing for Dynamic Outsourced Data with Proxy Processing (https://doi.org/10.3837/tiis.2017.10.019). It aims to offer identity-based, privacy-preserving, and batch auditing for multiple owners' data on different clouds, while allowing proxy processing. In this article, we first demonstrate that this scheme is insecure in the sense that a malicious cloud could pass integrity auditing without the original data. Additionally, clouds and owners are able to recover the proxy's private key and thus impersonate it to forge tags for any data. Secondly, we propose an improved scheme, with provable security in the random oracle model, to achieve the desired secure identity-based, privacy-preserving, batch public auditing with proxy processing. Thirdly, based on theoretical analysis and performance simulation, our scheme shows better efficiency than the existing identity-based auditing scheme with proxy processing in the single-owner, single-cloud setting, which will benefit secure big data storage when extrapolated to real applications.
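
The scheme itself relies on bilinear pairings and identity-based tags, which are beyond a short sketch; purely to illustrate the challenge-response flavor of storage auditing, the toy example below has the owner precompute one-time challenge tokens so a verifier can later check that the cloud still holds the data (this simplification is neither identity-based nor privacy-preserving, and all names and sizes are assumptions).

```python
# Greatly simplified audit sketch (not the paper's identity-based,
# pairing-based scheme): the owner precomputes one-time challenge tokens
# before outsourcing; each later audit spends one token.
import hashlib, secrets

def make_tokens(data, n_tokens=5):
    """Owner side: keep (nonce, expected_digest) pairs, then outsource `data`."""
    tokens = []
    for _ in range(n_tokens):
        nonce = secrets.token_bytes(16)
        tokens.append((nonce, hashlib.sha256(nonce + data).digest()))
    return tokens

def cloud_respond(stored_data, nonce):
    """Cloud side: must use the actual stored data to answer the challenge."""
    return hashlib.sha256(nonce + stored_data).digest()

def audit(tokens, cloud_answer_fn):
    """Verifier side: spend one token and compare the cloud's answer."""
    nonce, expected = tokens.pop()
    return cloud_answer_fn(nonce) == expected

data = b"outsourced file contents" * 1000
tokens = make_tokens(data)
print(audit(tokens, lambda nonce: cloud_respond(data, nonce)))        # True
print(audit(tokens, lambda nonce: cloud_respond(data[:-1], nonce)))   # False (data damaged)
```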

Image Deduplication Based on Hashing and Clustering in Cloud Storage

  • Chen, Lu; Xiang, Feng; Sun, Zhixin
    • KSII Transactions on Internet and Information Systems (TIIS), v.15 no.4, pp.1448-1463, 2021
  • With the continuous development of cloud storage, plenty of redundant data exists in cloud storage, especially multimedia data such as images and videos. Data deduplication is a data reduction technology that significantly reduces storage requirements and increases bandwidth efficiency. To ensure data security, users typically encrypt data before uploading it; however, there is a contradiction between data encryption and deduplication. Existing deduplication methods for regular files cannot be applied to image deduplication because images need to be compared based on visual content. In this paper, we propose a secure image deduplication scheme based on hashing and clustering, which incorporates a novel perceptual hash algorithm based on Local Binary Patterns. In this scheme, the hash value of the image is used as the fingerprint for deduplication, and the image is transmitted in encrypted form. Images are clustered to reduce the time complexity of deduplication. The proposed scheme ensures the security of images and improves deduplication accuracy. A comparison with other image deduplication schemes demonstrates that our scheme performs somewhat better.
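
To give a feel for the perceptual-hash component (the clustering and encryption steps are omitted, and this is not the authors' exact algorithm), the rough sketch below computes 8-neighbour Local Binary Pattern codes with NumPy, reduces them to a fixed-length binary hash, and compares two hashes by Hamming distance; the hash length and any distance threshold are assumptions.

```python
# Rough sketch of an LBP-based perceptual hash for near-duplicate detection.
# Not the paper's exact algorithm; hash length and thresholds are assumptions.
import numpy as np

def lbp_codes(gray):
    """8-neighbour LBP code for each interior pixel of a 2-D grayscale image."""
    c = gray[1:-1, 1:-1]
    neighbours = [gray[:-2, :-2], gray[:-2, 1:-1], gray[:-2, 2:],
                  gray[1:-1, 2:], gray[2:, 2:],   gray[2:, 1:-1],
                  gray[2:, :-2],  gray[1:-1, :-2]]
    codes = np.zeros_like(c, dtype=np.uint8)
    for bit, n in enumerate(neighbours):
        codes |= ((n >= c).astype(np.uint8) << bit)
    return codes

def perceptual_hash(gray, bins=64):
    """Binary hash: histogram of LBP codes thresholded at its median."""
    hist, _ = np.histogram(lbp_codes(gray), bins=bins, range=(0, 256))
    return (hist > np.median(hist)).astype(np.uint8)

def hamming(h1, h2):
    return int(np.count_nonzero(h1 != h2))

# Toy usage: a slightly brightened copy should hash close to the original.
img = (np.random.rand(128, 128) * 255).astype(np.uint8)
copy = np.clip(img.astype(np.int16) + 5, 0, 255).astype(np.uint8)
print(hamming(perceptual_hash(img), perceptual_hash(copy)))  # small distance
```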