• Title/Summary/Keyword: Resource management, Machine Learning

A Prediction of Work-life Balance Using Machine Learning

  • Youngkeun Choi
    • Asia Pacific Journal of Information Systems / v.34 no.1 / pp.209-225 / 2024
  • This research uses machine learning in human resource management to predict employees' work-life balance. The study drew on a dataset from IBM Watson Analytics in the IBM Community for the machine learning analysis. A multinomial dependent variable capturing workers' work-life balance was examined with a Generalized Linear Model, with predictors treated as either continuous or categorical. The analysis highlights the complexity of assessing each variable's role and how a variable's impact varies with the type of model used. The study's outcomes are academically and practically relevant, showing how machine learning can deepen understanding of psychological variables such as work-life balance by analyzing employee profiles.
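As a rough illustration of the setup this abstract describes, the sketch below fits a multinomial logistic regression (one form of GLM) to predict a categorical work-life balance score from employee-profile features. The file name and column names are assumptions modeled on the public IBM HR Analytics dataset, not the paper's exact data.

```python
# Minimal sketch, assuming columns modeled on the public IBM HR
# Analytics dataset; the paper's exact variables may differ.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("ibm_hr_attrition.csv")      # hypothetical local copy
X = pd.get_dummies(df[["Age", "DistanceFromHome", "OverTime",
                       "JobSatisfaction", "MonthlyIncome"]])
y = df["WorkLifeBalance"]                     # ordinal categories 1..4

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)
clf = LogisticRegression(max_iter=1000)       # multinomial logit, a GLM
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```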

Machine learning-based Multi-modal Sensing IoT Platform Resource Management (머신러닝 기반 멀티모달 센싱 IoT 플랫폼 리소스 관리 지원)

  • Lee, Seongchan;Sung, Nakmyoung;Lee, Seokjun;Jun, Jaeseok
    • IEMEK Journal of Embedded Systems and Applications / v.17 no.2 / pp.93-100 / 2022
  • In this paper, we propose a machine learning-based method for supporting resource management on IoT software platforms in a multi-modal sensing scenario. We assume an IoT device running a oneM2M-compatible software platform that is connected to various sensors, such as PIR, sound, dust, ambient light, ultrasonic, and accelerometer sensors, through different embedded-system interfaces such as general-purpose input/output (GPIO), I2C, SPI, and USB. Based on a collected dataset that includes CPU usage and user-defined priorities, a machine learning model is trained to estimate the nice-value adjustment required for the observed resource usage pattern. The proposed method is validated against a rule-based control strategy, demonstrating its practicality for multi-modal sensing on IoT devices.
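A minimal sketch of the estimation step, under assumptions: a small regressor maps (CPU usage, user-defined priority) to a nice value, which is then applied with os.setpriority. The training pairs are invented for illustration, not the authors' dataset.

```python
# Illustrative sketch only: the feature/target pairs are invented,
# and the paper's actual model and features may differ.
import os
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# features: [cpu_usage_percent, user_priority (1 = high .. 3 = low)]
X = np.array([[5, 3], [10, 2], [40, 2], [70, 1], [90, 1], [95, 3]])
y = np.array([10, 5, 0, -5, -10, 15])   # illustrative target nice values

model = DecisionTreeRegressor(max_depth=3).fit(X, y)

def renice(pid: int, cpu_usage: float, priority: int) -> int:
    """Predict a nice value and apply it to the sensing process."""
    nice = int(model.predict([[cpu_usage, priority]])[0])
    os.setpriority(os.PRIO_PROCESS, pid, nice)   # Unix only; lowering
    return nice                                  # nice needs privileges
```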

An Engine for DRA in Container Orchestration Using Machine Learning

  • Gun-Woo Kim;Seo-Yeon Gu;Seok-Jae Moon;Byung-Joon Park
    • International Journal of Advanced Smart Convergence / v.12 no.4 / pp.126-133 / 2023
  • Recent advances in cloud service virtualization have shifted from a virtual machine-centric approach to a container-centric paradigm, offering advantages such as faster deployment and enhanced portability. Container orchestration has emerged as a key technology for efficiently managing and scheduling these containers. However, with the increasing complexity and diversity of heterogeneous workloads and service types, resource scheduling has become a challenging task. Although various research efforts address the challenges posed by diverse workloads and services, a systematic approach to container orchestration for effective cloud management has not yet been clearly defined. This paper proposes the DRA-Engine (Dynamic Resource Allocation Engine) for resource scheduling in container orchestration. The engine comprises the Request Load Procedure, the Required Resource Measurement Procedure, and the Resource Provision Decision Procedure. Through these components, the DRA-Engine dynamically allocates resources according to each application's requirements, offering a solution to the resource scheduling challenges of container orchestration.
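The three procedures suggest a simple pipeline: observe request load, estimate required resources, then decide what to provision. The sketch below mimics that flow with a linear demand predictor and a replica-count decision; the model and constants are assumptions, not the DRA-Engine's actual internals.

```python
# Assumed pipeline shape only; the abstract does not specify the
# DRA-Engine's procedures at this level of detail.
from dataclasses import dataclass
import numpy as np
from sklearn.linear_model import LinearRegression

@dataclass
class Decision:
    replicas: int
    cpu_millicores: int

# Request Load Procedure: historical (requests/s -> observed mCPU)
load = np.array([[100], [200], [400], [800]])
cpu  = np.array([120, 230, 480, 930])

# Required Resource Measurement Procedure: fit a demand model
predictor = LinearRegression().fit(load, cpu)

# Resource Provision Decision Procedure: size the deployment
def decide(current_rps: float, per_pod_mcpu: int = 250) -> Decision:
    needed = float(predictor.predict([[current_rps]])[0])
    replicas = max(1, int(np.ceil(needed / per_pod_mcpu)))
    return Decision(replicas, int(needed))

print(decide(600))   # ~700 mCPU -> 3 replicas of 250 mCPU each
```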

Hierarchical IoT Edge Resource Allocation and Management Techniques based on Convolutional Neural Networks in Distributed AIoT Environments (분산 AIoT 환경에서 합성곱신경망 기반 계층적 IoT Edge 자원 할당 및 관리 기법)

  • Yoon-Su Jeong
    • Advanced Industrial Science / v.2 no.3 / pp.8-14 / 2023
  • The majority of IoT devices already employ AIoT, but numerous issues still need to be resolved before AI applications can be deployed. To distribute IoT edge resources more effectively, this paper proposes a machine learning-based approach to managing them. The proposed method continuously improves IoT resource allocation by identifying IoT edge resource trends through machine learning, and uses convolutional learning to reliably sustain IoT edge resources whose demands constantly change. By storing each machine learning-based IoT edge resource as a hash value alongside the resource of the previous pattern, the approach can verify whether a resource matches an attack pattern in a distributed AIoT context. Experiments evaluate energy efficiency in three test scenarios and verify the integrity of IoT edge resources in complex environments with heterogeneous computational hardware.
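One concrete reading of the hash-storage idea, sketched below under assumptions, is a hash chain over successive resource snapshots, so any tampering with the recorded pattern history becomes detectable. The snapshot format and verification routine are illustrative, not the paper's protocol.

```python
# Sketch of a hash chain over resource snapshots; record format and
# verification are assumptions.
import hashlib
import json

def chain_hash(snapshot: dict, prev_hash: str) -> str:
    payload = json.dumps(snapshot, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def verify(ledger) -> bool:
    prev = "0" * 64
    for snapshot, stored in ledger:
        prev = chain_hash(snapshot, prev)
        if prev != stored:
            return False          # history altered -> possible attack
    return True

ledger, prev = [], "0" * 64
for snap in [{"node": "edge-1", "cpu": 0.42, "mem": 0.31},
             {"node": "edge-1", "cpu": 0.77, "mem": 0.35}]:
    prev = chain_hash(snap, prev)
    ledger.append((snap, prev))

print("resource history intact:", verify(ledger))
```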

Design of a ParamHub for Machine Learning in a Distributed Cloud Environment

  • Su-Yeon Kim;Seok-Jae Moon
    • International Journal of Internet, Broadcasting and Communication / v.16 no.2 / pp.161-168 / 2024
  • As big data models grow in size, distributed training is emerging as an essential element of large-scale machine learning. In this paper, we propose ParamHub for distributed data training. During training, this agent uses the provided data to adjust the model's structure, learning algorithm, hyperparameters, and bias, aiming to minimize the error between the model's predictions and the actual values. It operates autonomously, collecting and updating data in a distributed environment, which reduces the load-balancing burden that arises in a centralized system. Through communication between agents, resource management and the learning process can be coordinated, enabling efficient management of distributed data and resources. This approach enhances the scalability and stability of distributed machine learning systems while remaining flexible enough to apply in a variety of learning environments.
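The coordination described here resembles parameter-server averaging: each worker computes a local update and the hub merges the updates into the shared parameters. The toy loss and update rule below are assumptions, not ParamHub's actual protocol, which adds autonomous data collection and inter-agent coordination on top.

```python
# Generic parameter-hub round, assumed for illustration.
import numpy as np

class Hub:
    def __init__(self, dim: int):
        self.params = np.zeros(dim)

    def aggregate(self, updates):
        # average the workers' parameter deltas and apply them
        self.params += np.mean(updates, axis=0)

def local_update(params, shard, lr=0.1):
    grad = 2 * (params - shard.mean(axis=0))   # toy quadratic loss
    return -lr * grad

rng = np.random.default_rng(0)
shards = [rng.normal(loc=i, size=(50, 3)) for i in range(4)]  # 4 workers
hub = Hub(dim=3)
for _ in range(100):
    hub.aggregate([local_update(hub.params, s) for s in shards])
print(hub.params)   # converges near the mean of the shard means
```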

A Machine Learning-based Method for Virtual Network Function Resource Demand Prediction (기계학습 기반의 가상 네트워크 기능 자원 수요 예측 방법)

  • Kim, Hee-Gon;Lee, Do-Young;Yoo, Jae-Hyung;Hong, James Won-Ki
    • KNOM Review / v.21 no.2 / pp.1-9 / 2018
  • Network virtualization is a technology that creates independent virtual network environments on top of a physical network. It lets physical network resources be shared, reducing the cost of building a separate network for each user, and enables the network administrator to change the network configuration dynamically according to its purpose. Although the configuration can be changed dynamically, the management itself remains manual and therefore does not realize the full benefit of network virtualization. In this paper, we apply machine learning so that the network can learn on its own and be managed dynamically. The proposed approach dynamically allocates appropriate resources by predicting the resource demand of each VNF in a service function chain, a core and essential problem in virtual network management. Our goal is to predict VNF resource demand and allocate appropriate resources dynamically, reducing network operating costs while preventing service interruption.
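A common way to realize this kind of forecasting, sketched below under assumptions, is to predict a VNF's next-interval CPU demand from a sliding window of past usage and then provision with headroom. The synthetic trace, window size, and headroom factor are illustrative, not the paper's configuration.

```python
# Sliding-window demand forecast; all constants are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
trace = 50 + 20 * np.sin(np.linspace(0, 12, 300)) + rng.normal(0, 1, 300)

W = 10                                    # look-back window (intervals)
X = np.array([trace[i:i + W] for i in range(len(trace) - W)])
y = trace[W:]                             # next-interval CPU demand (%)

model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(X[:-50], y[:-50])               # hold out the last 50 points

def allocate(window, headroom=1.2):
    """Provision CPU above the forecast to absorb bursts."""
    return float(model.predict([window])[0]) * headroom

print(allocate(trace[-W:]))
```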

Deep Learning-based Delinquent Taxpayer Prediction: A Scientific Administrative Approach

  • YongHyun Lee;Eunchan Kim
    • KSII Transactions on Internet and Information Systems (TIIS) / v.18 no.1 / pp.30-45 / 2024
  • This study introduces an effective method for predicting individual local tax delinquencies using prevalent machine learning and deep learning algorithms. The evaluation of credit risk holds great significance in the financial realm, impacting both companies and individuals. While credit risk prediction has been explored with statistical and machine learning techniques, its application to tax-arrears prediction remains underexplored. We forecast individual local tax defaults in the Republic of Korea using machine and deep learning algorithms, including convolutional neural networks (CNN), long short-term memory (LSTM), and sequence-to-sequence (seq2seq) models. Our model incorporates diverse credit and public information, such as loan history, delinquency records, credit card usage, and public taxation data, offering richer insights than prior studies. The results highlight the superior predictive accuracy of the CNN model. Anticipating local tax arrears more effectively could lead to more efficient allocation of administrative resources. By leveraging advanced machine learning, this research offers a promising avenue for refining tax collection strategies and resource management.
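To make the CNN variant concrete, the sketch below trains a small 1D convolutional classifier over a per-taxpayer sequence of monthly features. The shapes and the synthetic data are assumptions; the study's real inputs combine credit and public taxation records.

```python
# Shapes and data are stand-ins; only the 1D-CNN model family
# reflects what the abstract names.
import numpy as np
import tensorflow as tf

T, F = 24, 8                                        # 24 months x 8 features
X = np.random.randn(1000, T, F).astype("float32")   # synthetic histories
y = (np.random.rand(1000) < 0.2).astype("float32")  # ~20% delinquent

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(T, F)),
    tf.keras.layers.Conv1D(32, kernel_size=3, activation="relu"),
    tf.keras.layers.GlobalMaxPooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC()])
model.fit(X, y, epochs=3, batch_size=64, verbose=0)
```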

Machine Learning SNP for Classification of Korean Abalone Species (Genus Haliotis) (전복류(Genus Haliotis)의 분류를 위한 단일염기변이 기반 기계학습분석)

  • Noh, Eun Soo;Kim, Ju-Won;Kim, Dong-Gyun
    • Korean Journal of Fisheries and Aquatic Sciences / v.54 no.4 / pp.489-497 / 2021
  • Climate change is affecting the evolutionary trajectories of individual species and ecological communities, partly through the creation of new species groups. As populations shift geographically and temporally in response to climate change, reproductive interactions between previously isolated species become inevitable and could lead to invasion, speciation, or even extinction. Four abalone species of the genus Haliotis occur along the Korean coastline, and they are important for commercial and fisheries resource management. In this study, genetic markers for fisheries resource management were discovered from genomic information as part of managing endemic species in response to climate change. A total of 2,161 single nucleotide polymorphisms (SNPs) were discovered using the genotyping-by-sequencing (GBS) method, and 41 SNPs were then selected for species classification based on their features. Machine learning analysis using these SNPs makes it possible to differentiate the four Haliotis species and their hybrids. In conclusion, the proposed machine learning method has potential for species classification within the genus Haliotis, and our results will provide valuable data for biodiversity conservation and the management of abalone populations in Korea.
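The classification step can be pictured as below: a genotype matrix of the 41 selected SNPs (alleles coded 0/1/2) feeds a standard classifier. The data and species labels here are synthetic placeholders, so the printed accuracy is chance-level; only the 41-SNP input width follows the abstract.

```python
# Synthetic stand-in data; real genotypes would carry the species
# signal that this random matrix lacks.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_snps = 120, 41
X = rng.integers(0, 3, size=(n_samples, n_snps))   # genotype calls 0/1/2
y = rng.choice(["species_A", "species_B",
                "species_C", "species_D"], n_samples)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())     # species-call accuracy
```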

5G Network Resource Allocation and Traffic Prediction based on DDPG and Federated Learning (DDPG 및 연합학습 기반 5G 네트워크 자원 할당과 트래픽 예측)

  • Seok-Woo Park;Oh-Sung Lee;In-Ho Ra
    • Smart Media Journal / v.13 no.4 / pp.33-48 / 2024
  • With the advent of 5G, characterized by Enhanced Mobile Broadband (eMBB), Ultra-Reliable Low Latency Communications (URLLC), and Massive Machine Type Communications (mMTC), efficient network management and service provision are becoming increasingly critical. This paper proposes a novel approach to the key challenges of 5G networks, namely ultra-high speed, ultra-low latency, and ultra-reliability, by dynamically optimizing network slicing and resource allocation with machine learning (ML) and deep learning (DL) techniques. The proposed methodology uses prediction models for network traffic and resource allocation, and employs Federated Learning (FL) to optimize network bandwidth and latency while enhancing privacy and security. The paper covers the implementation of algorithms and models such as Random Forest and LSTM in detail, presenting methodologies for automating and adding intelligence to 5G network operations. Finally, the performance gains achievable by applying ML and DL to 5G networks are validated through performance evaluation and analysis, and solutions for optimizing network slicing and resource management are proposed for various industrial applications.
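Of the two ingredients, the federated-learning step is easy to sketch: each base station fits a local traffic predictor and only the weights are averaged, so raw traffic never leaves the site. The linear model and synthetic traces below are assumptions; the paper pairs FL with DDPG-based slice resource control.

```python
# FedAvg over per-site linear traffic predictors; models and data
# are assumptions for illustration.
import numpy as np

def local_fit(X, y):
    w, *_ = np.linalg.lstsq(X, y, rcond=None)   # per-site least squares
    return w

true_w = np.array([0.5, -0.2, 0.1, 0.7])
site_weights = []
for seed in range(5):                           # five base stations
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(200, 4))               # e.g. lagged traffic features
    y = X @ true_w + rng.normal(0, 0.1, 200)
    site_weights.append(local_fit(X, y))

global_w = np.mean(site_weights, axis=0)        # FedAvg aggregation
print(global_w)                                 # close to true_w
```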

A supervised-learning-based spatial performance prediction framework for heterogeneous communication networks

  • Mukherjee, Shubhabrata;Choi, Taesang;Islam, Md Tajul;Choi, Baek-Young;Beard, Cory;Won, Seuck Ho;Song, Sejun
    • ETRI Journal / v.42 no.5 / pp.686-699 / 2020
  • In this paper, we propose a supervised-learning-based spatial performance prediction (SLPP) framework for next-generation heterogeneous communication networks (HCNs). Adaptive asset placement, dynamic resource allocation, and load balancing are critical network functions in an HCN for ensuring seamless network management and enhancing service quality. Although many existing systems use measurement data to react to network performance changes, accurate performance prediction is highly beneficial for supporting these network functions across different systems. Recent advances in complex statistical algorithms and computational efficiency have made machine learning ubiquitous for accurate data-based prediction. This paper proposes a robust network performance prediction framework that optimizes performance and resource utilization through a linear discriminant analysis-based prediction approach. Comparison with other machine-learning techniques on real-world data demonstrates that SLPP offers superior accuracy and computational efficiency for both stationary and mobile user conditions.
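Since the framework's predictor is LDA-based, a minimal version looks like the sketch below: classify a location's expected performance tier from radio features. The features and tier labels are assumptions standing in for the paper's measurement data.

```python
# LDA tier classifier on synthetic stand-in features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 4))            # e.g. RSRP, SINR, distance, load
y = np.digitize(X[:, 0] + 0.5 * X[:, 1], [-1.0, 1.0])   # 3 tiers

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
print("tier accuracy:", lda.score(X_te, y_te))
```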