• Title/Summary/Keyword: 5G Networks

A Study on the Analysis of Security Requirements through Literature Review of Threat Factors of 5G Mobile Communication

  • DongGyun Chu;Jinho Yoo
    • Journal of Information Processing Systems
    • /
    • v.20 no.1
    • /
    • pp.38-52
    • /
    • 2024
  • 5G is the 5th generation mobile network that provides enhanced mobile broadband, ultra-reliable and low-latency communications, and massive machine-type communications. New services can be provided through multi-access edge computing, network function virtualization, and network slicing, which are key technologies in 5G mobile communication. However, these new technologies also open new attack paths and threats. In this paper, we analyze the overall threats to 5G mobile communication through a literature review. First, the paper defines 5G mobile communication, analyzes its features and technology architecture, and summarizes possible security issues. In addition, it presents security threats from the perspective of the user devices, radio access network, multi-access edge computing, and core network that constitute 5G mobile communication. Security requirements for these threat factors are then derived through literature analysis. The purpose of this study is to conduct a fundamental analysis that examines and assesses the overall threat factors associated with 5G mobile communication. This makes it possible to protect the information and assets of individuals and organizations that use 5G mobile communication technology, respond to various threat situations, and raise the overall level of 5G security.

Research on the Implementation of 5G SA Test Network Test Bed Function Based on Service-Based Architecture (SBA 기반 5G SA 시험망 시스템 기능 구현에 관한 연구)

  • Park, Jea-Seok;Yoon, Mahn-Suk
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2022.05a
    • /
    • pp.529-531
    • /
    • 2022
  • The 5th generation mobile communication (5G) is being commercialized by major domestic and foreign mobile telecommunication operators and is spreading to general customers, mainly through smart devices such as smartphones, wearables, and IoT devices. When 5G was first introduced, 4G networks were combined with 5G access equipment through NSA (Non-Standalone) technology; recently, 5G convergence services are being realized by gradually evolving toward standalone 5G networks through SA (Standalone) technology. The purpose of this study is to develop a design plan for implementing the necessary service-oriented functions, from the perspective of communication network users, in a 5G SA equipment configuration based on the SBA (Service-Based Architecture) described in the 3GPP technical specifications. Through this research, it is expected that companies entering the 5G market will be able to access the 5G SA network easily, develop and refine specialized 5G convergence services, and improve product performance and quality.
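
In an SBA core, each network function registers itself with the NRF over a RESTful service-based interface before it can be discovered by other functions. As a rough illustration of the kind of call a test-bed component would make, the sketch below registers a hypothetical NF with an NRF using the Nnrf_NFManagement service (3GPP TS 29.510); the NRF address and all profile values are invented for illustration.

```python
# Hedged sketch of an NF registering with the NRF over the service-based
# interface (Nnrf_NFManagement, 3GPP TS 29.510). The NRF address and every
# profile value below are hypothetical test-bed examples.
import uuid
import requests

NRF_API_ROOT = "https://nrf.testbed.local"  # hypothetical NRF endpoint

nf_instance_id = str(uuid.uuid4())
nf_profile = {
    "nfInstanceId": nf_instance_id,
    "nfType": "AMF",           # the NF type registering itself
    "nfStatus": "REGISTERED",
    "ipv4Addresses": ["192.0.2.10"],  # documentation-range example address
}

# NFRegister is a PUT of the NF profile to .../nf-instances/{nfInstanceId}
resp = requests.put(
    f"{NRF_API_ROOT}/nnrf-nfm/v1/nf-instances/{nf_instance_id}",
    json=nf_profile,
    timeout=5,
)
print(resp.status_code)  # expect 201 Created on first registration
```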

Correlated Intelligent Reflecting Surface and Improved BER Performance of NOMA

  • Chung, Kyuhyuk
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.14 no.3
    • /
    • pp.79-84
    • /
    • 2022
  • Toward sixth generation (6G) mobile networks, the spectrum and energy efficiency of non-orthogonal multiple access (NOMA) transmission in fifth generation (5G) wireless systems has been improved by intelligent reflecting surface (IRS) technologies. However, the reflecting elements of an IRS tend to be correlated because they are placed close to each other on the surface. In this paper, we present an analysis of the correlated IRS in NOMA cellular networks. Specifically, we consider the bit-error rate (BER) performance of correlated-IRS NOMA networks. First, based on the central limit theorem, we derive an approximate analytical expression for the BER of correlated-IRS NOMA systems using the second moment of the channel gain. We then validate the proposed analytical BER against Monte Carlo simulations and show that they are in good agreement. In addition, we show numerically the BER improvement of correlated-IRS NOMA over conventional independent-IRS NOMA.
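
To make the validation step concrete, the sketch below runs a toy Monte Carlo BER experiment for an IRS-assisted BPSK link, comparing independent and correlated reflecting elements. It assumes perfect phase alignment at the IRS and an exponential correlation model rho^|i-j|; the element count, correlation coefficient, and SNR are illustrative and do not reproduce the paper's NOMA system.

```python
# Toy Monte Carlo BER for an IRS-assisted BPSK link, independent vs.
# correlated elements. Assumes perfect phase alignment at the IRS and an
# exponential correlation model; all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, trials, snr_db, rho = 16, 200_000, 0.0, 0.9

# element correlation matrix R[i, j] = rho^|i-j| and its Cholesky factor
idx = np.arange(N)
R = rho ** np.abs(idx[:, None] - idx[None, :])
C = np.linalg.cholesky(R)

def ber(correlated: bool) -> float:
    z = (rng.standard_normal((trials, N))
         + 1j * rng.standard_normal((trials, N))) / np.sqrt(2)
    h = z @ C.T if correlated else z       # per-element Rayleigh fading
    gain = np.abs(h).sum(axis=1) / N       # coherent combining across the IRS
    snr = 10 ** (snr_db / 10)
    noise = rng.standard_normal(trials) / np.sqrt(2 * snr)
    return float(np.mean(gain + noise < 0))  # BPSK symbol +1 transmitted

print(f"independent-IRS BER: {ber(False):.4e}")
print(f"correlated-IRS  BER: {ber(True):.4e}")
```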

Load Balancing Algorithm of Ultra-Dense Networks: a Stochastic Differential Game based Scheme

  • Xu, Haitao;He, Zhen;Zhou, Xianwei
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.9 no.7
    • /
    • pp.2454-2467
    • /
    • 2015
  • Increasing traffic and bandwidth requirements bring challenges to next-generation wireless networks (5G). As one of the main technologies in 5G networks, the Ultra-Dense Network (UDN) can be used to improve network coverage. In this paper, a radio-over-fiber based model is proposed to solve the load balancing problem in ultra-dense networks. A stochastic differential game is introduced for the load balancing algorithm, and the optimal load allocated to each access point (RAP) is formulated as a Nash equilibrium. It is proved that the optimal load can be achieved and that the stochastic differential game based scheme is applicable and acceptable. Numerical results are given to demonstrate the effectiveness of the optimal algorithm.
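
The equilibrium idea can be illustrated without the stochastic differential game machinery: at a Nash equilibrium no RAP can lower its cost by shifting load, which for a simple utilization cost means all RAPs end up equally utilized. The toy best-response iteration below, with invented capacities, converges to that balanced split.

```python
# Toy best-response iteration toward a balanced-load equilibrium across
# RAPs. Capacities, total load, and the utilization cost are invented;
# the paper derives the equilibrium from a stochastic differential game.
import numpy as np

capacity = np.array([1.0, 2.0, 1.5])   # assumed relative RAP capacities
total_load = 3.0
load = np.full(3, total_load / 3)      # start from a uniform split

for _ in range(2000):
    utilization = load / capacity      # per-unit cost seen at each RAP
    src, dst = np.argmax(utilization), np.argmin(utilization)
    delta = 0.1 * (utilization[src] - utilization[dst])
    load[src] -= delta                 # shift load from the most loaded RAP
    load[dst] += delta                 # to the least loaded one

# at the fixed point every RAP sees the same utilization
print("equilibrium load split:", np.round(load, 3))
```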

Embedding Mesh-Like Networks into Petersen-Torus(PT) Networks (메쉬 부류 네트워크를 피터슨-토러스(PT) 네트워크에 임베딩)

  • Seo, Jung-Hyun;Lee, Hyeong-Ok;Jang, Moon-Suk
    • The KIPS Transactions:PartA
    • /
    • v.15A no.4
    • /
    • pp.189-198
    • /
    • 2008
  • In this paper, we prove that mesh-like networks can be embedded into Petersen-Torus (PT) networks. Once an interconnection network G is embedded in H, a parallel algorithm designed for G can be applied to the interconnection network H. The torus is embedded into PT with dilation 5, link congestion 5, and expansion 1 using a one-to-one embedding. The honeycomb mesh is embedded into PT with dilation 5, link congestion 2, and expansion 5/3 using a one-to-one embedding. In addition, we derive the average dilation. Because the well-known torus and honeycomb mesh networks are embedded into PT with dilation and congestion of at most 5, the embedding algorithms can be used in both wormhole-routing and store-and-forward routing systems, and the one-to-one embeddings minimize the load placed on each processor during simulation.
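
For reference, dilation is measured as the longest host-graph distance between the images of two adjacent guest nodes. The sketch below computes it for a toy identity embedding of a 3x3 torus into itself; the Petersen-Torus topology itself is not reproduced here.

```python
# Measuring the dilation of an embedding: the longest host distance between
# the images of adjacent guest nodes. A 3x3 torus embedded into itself by
# the identity map stands in for the host; PT itself is not reproduced.
from collections import deque
from itertools import product

def torus_neighbors(v, n=3):
    x, y = v
    return [((x + 1) % n, y), ((x - 1) % n, y),
            (x, (y + 1) % n), (x, (y - 1) % n)]

def distance(src, dst, n=3):
    # plain breadth-first search over the n x n torus
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        v, d = queue.popleft()
        if v == dst:
            return d
        for w in torus_neighbors(v, n):
            if w not in seen:
                seen.add(w)
                queue.append((w, d + 1))

nodes = list(product(range(3), repeat=2))
phi = {v: v for v in nodes}  # identity embedding, so dilation should be 1
guest_edges = [(v, w) for v in nodes for w in torus_neighbors(v) if v < w]
dilation = max(distance(phi[v], phi[w]) for v, w in guest_edges)
print("dilation:", dilation)
```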

The Performance Analysis of Cognitive-based Overlay D2D Communication in 5G Networks

  • Abdullilah Alotaibi;Salman A. AlQahtani
    • International Journal of Computer Science & Network Security
    • /
    • v.24 no.2
    • /
    • pp.178-188
    • /
    • 2024
  • In the near future, billions of connected devices are expected to use fifth generation (5G) network services. Existing base stations (BSs) need to mitigate their loads without modification and at minimal monetary cost. The available spectrum resources are limited and must be exploited efficiently to meet the ever-increasing demand for services. Device-to-device (D2D) communication technology is likely to help satisfy the rapidly increasing capacity demand and effectively offload traffic from the BS by splitting transmissions between D2D users on one side and the cellular users and the BS on the other. In this paper, we propose applying overlay D2D communication with cognitive radio capability in 5G networks to exploit unused spectrum resources, taking dynamic spectrum access into account. The performance metrics, throughput and delay, are formulated and analyzed for a CSMA-based medium access control (MAC) protocol that uses a common control channel for device users to negotiate the data channel and resolve contention between those users. Device users can exploit cognitive radio to access the data channels concurrently in the common interference area. Previous studies have not estimated the achievable throughput and delay of D2D communication in 5G networks using cognitive radio with a CSMA-based MAC protocol to resolve contention. The performance analysis shows that applying cognitive radio capability in D2D communication and allocating a common control channel for device users improves the total aggregated network throughput by more than 60% compared with individual D2D throughput, without adding harmful interference to cellular network users. This approach can also reduce the delay.
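
As a rough illustration of contention on the common control channel, the sketch below simulates a slotted access scheme in which each D2D pair attempts with probability p per slot and a slot succeeds when exactly one pair transmits. It abstracts away carrier sensing and is not the paper's CSMA model; all parameters are invented.

```python
# Toy slotted contention on the common control channel: each of n D2D
# pairs attempts with probability p per slot; a slot succeeds when exactly
# one pair transmits. Carrier sensing is abstracted away; numbers invented.
import numpy as np

rng = np.random.default_rng(1)
n_pairs, p, slots = 10, 0.1, 100_000

attempts = rng.random((slots, n_pairs)) < p
successes = np.mean(attempts.sum(axis=1) == 1)  # fraction of useful slots

print(f"simulated success rate: {successes:.3f}")
print(f"analytic n*p*(1-p)^(n-1): {n_pairs * p * (1 - p) ** (n_pairs - 1):.3f}")
```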

Towards Scalable and Cost-efficient Software-Defined 5G Core Network

  • Park, Jong Han;Choi, Changsoon;Jeong, Sangsoo;Na, Minsoo;Jo, Sungho
    • Information and Communications Magazine
    • /
    • v.33 no.6
    • /
    • pp.18-26
    • /
    • 2016
  • Network virtualization and network functions virtualization (NFV) promise a number of attractive benefits and have thus driven mobile network operators to transform their previously static networks into more dynamic, software-defined networks. In this article, we share a mobile network operator's view of the software-defined 5G core network, based on implementation and deployment experience in the wild over the past few years. More specifically, we present a practical point of view from mobile network operators and elaborate on why some virtualization benefits, such as total cost of ownership (TCO) reduction, are not as easily realized as initially intended. We then describe the 5G visions, services, and requirements commonly agreed upon across mobile operators globally. Given these requirements, we introduce the desirable characteristics of a 5G mobile core network and its key enabling technologies.

Bi-LSTM model with time distribution for bandwidth prediction in mobile networks

  • Hyeonji Lee;Yoohwa Kang;Minju Gwak;Donghyeok An
    • ETRI Journal
    • /
    • v.46 no.2
    • /
    • pp.205-217
    • /
    • 2024
  • We propose a bandwidth prediction approach based on deep learning. The approach is intended to accurately predict the bandwidth of various types of mobile networks. We first use a machine learning technique, namely, the gradient boosting algorithm, to recognize the connected mobile network. Second, we apply a handover detection algorithm based on network recognition to account for vertical handover that causes the bandwidth variance. Third, as the communication performance offered by 3G, 4G, and 5G networks varies, we suggest a bidirectional long short-term memory model with time distribution for bandwidth prediction per network. To increase the prediction accuracy, pretraining and fine-tuning are applied for each type of network. We use a dataset collected at University College Cork for network recognition, handover detection, and bandwidth prediction. The performance evaluation indicates that the handover detection algorithm achieves 88.5% accuracy, and the bandwidth prediction model achieves a high accuracy, with a root-mean-square error of only 2.12%.
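
A minimal sketch of the kind of model described above, a bidirectional LSTM with a time-distributed output head, is shown below in Keras. The window length, feature count, and layer sizes are assumptions, not the paper's configuration.

```python
# Sketch of a bidirectional LSTM with a time-distributed regression head.
# Window length, feature count, and layer sizes are assumptions.
import tensorflow as tf

window, n_features = 20, 8  # assumed history length and per-step features

inputs = tf.keras.Input(shape=(window, n_features))
x = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(64, return_sequences=True))(inputs)
# one bandwidth estimate per time step
outputs = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1))(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
model.summary()
# Per-network adaptation: load weights pretrained on one network type,
# then fine-tune on traces from the target network (3G/4G/5G).
```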

A Study on Improving Data Poisoning Attack Detection against Network Data Analytics Function in 5G Mobile Edge Computing (5G 모바일 에지 컴퓨팅에서 빅데이터 분석 기능에 대한 데이터 오염 공격 탐지 성능 향상을 위한 연구)

  • Ji-won Ock;Hyeon No;Yeon-sup Lim;Seong-min Kim
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.33 no.3
    • /
    • pp.549-559
    • /
    • 2023
  • As mobile edge computing (MEC) gains attention as a core technology of 5G networks, edge AI based on mobile user data in the 5G network environment has recently been used in various fields. However, as in traditional AI security, there is a possibility of adversarial interference with the standard 5G network functions inside the core network that are responsible for core edge-AI functions. In addition, research on data poisoning attacks that can occur in the standalone-mode MEC environment defined in the 3GPP 5G standards is currently insufficient compared with research on existing LTE networks. In this study, we explore the threat model for an MEC environment using NWDAF, the network function responsible for the core edge-AI functionality in 5G, and, as a proof of concept, propose a feature selection method to improve the performance of detecting data poisoning attacks against a leaf NWDAF. With the proposed methodology, we achieved a maximum detection rate of 94.9% for Slowloris-based data poisoning attacks on NWDAF.
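
The general pattern of feature selection ahead of an attack detector can be sketched as follows with scikit-learn; the synthetic data stands in for NWDAF traffic statistics, and this is not the paper's pipeline.

```python
# Sketch of feature selection ahead of a poisoning detector. The synthetic
# dataset stands in for NWDAF traffic statistics; labels mark poisoned flows.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=30, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# keep the 8 features carrying the most mutual information with the label
selector = SelectKBest(mutual_info_classif, k=8).fit(X_tr, y_tr)
clf = RandomForestClassifier(random_state=0).fit(
    selector.transform(X_tr), y_tr)

pred = clf.predict(selector.transform(X_te))
print(f"detection rate (recall): {recall_score(y_te, pred):.3f}")
```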

Cloud Radio Access Network: Virtualizing Wireless Access for Dense Heterogeneous Systems

  • Simeone, Osvaldo;Maeder, Andreas;Peng, Mugen;Sahin, Onur;Yu, Wei
    • Journal of Communications and Networks
    • /
    • v.18 no.2
    • /
    • pp.135-149
    • /
    • 2016
  • Cloud radio access network (C-RAN) refers to the virtualization of base station functionalities by means of cloud computing. This results in a novel cellular architecture in which low-cost wireless access points, known as radio units or remote radio heads, are centrally managed by a reconfigurable centralized "cloud", or central, unit. C-RAN allows operators to reduce the capital and operating expenses needed to deploy and maintain dense heterogeneous networks. This critical advantage, along with spectral efficiency, statistical multiplexing and load balancing gains, make C-RAN well positioned to be one of the key technologies in the development of 5G systems. In this paper, a succinct overview is presented regarding the state of the art on the research on C-RAN with emphasis on fronthaul compression, baseband processing, medium access control, resource allocation, system-level considerations and standardization efforts.