Optimizing Open Radio Access Network Systems with LLAMA V2 for Enhanced Mobile Broadband, Ultra-Reliable Low-Latency Communications, and Massive Machine-Type Communications: A Framework for Efficient Network Slicing and Real-Time Resource Allocation

H. Ahmed Tahir, Walaa Alayed, Waqar ul Hassan, Thuan Dinh Do

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

This study presents an advanced framework that integrates LLAMA_V2, a large language model, into Open Radio Access Network (O-RAN) systems, with a focus on efficient network slicing for enhanced mobile broadband (eMBB), ultra-reliable low-latency communications (URLLC), and massive machine-type communications (mMTC) services. Sensors in IoT devices generate continuous data streams, enabling resource allocation through O-RAN's dynamic slicing combined with LLAMA_V2's optimization. LLAMA_V2 was selected for its superior ability to capture complex network dynamics, surpassing traditional AI/ML models. The proposed method combines sophisticated mathematical models with optimization and interfacing techniques to address challenges in resource allocation and slicing. LLAMA_V2 enhances decision making by offering explanations for policy decisions within the O-RAN framework and forecasts future network conditions using a lightweight LSTM model. It outperforms baseline models in key metrics such as latency reduction, throughput improvement, and packet loss mitigation, making it a strong candidate for 5G network applications in advanced industries.
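The forecasting component described above can be pictured with a minimal sketch: a lightweight LSTM that predicts per-slice load from a short window of recent KPI samples, whose output could then feed an O-RAN slicing or resource-allocation policy. The slice set, KPI features, tensor shapes, and PyTorch implementation below are illustrative assumptions, not the paper's actual model.

# Minimal sketch (assumptions, not the paper's implementation): a lightweight
# LSTM forecaster for per-slice traffic demand in an O-RAN setting.
import torch
import torch.nn as nn

SLICES = ["eMBB", "URLLC", "mMTC"]   # assumed slice set, taken from the paper's use cases

class SliceLoadForecaster(nn.Module):
    def __init__(self, n_features: int = 4, hidden: int = 32, horizon: int = 1):
        super().__init__()
        # A single LSTM layer keeps the model lightweight for near-real-time use.
        self.lstm = nn.LSTM(input_size=n_features, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, len(SLICES) * horizon)
        self.horizon = horizon

    def forward(self, kpi_window: torch.Tensor) -> torch.Tensor:
        # kpi_window: (batch, time_steps, n_features), e.g. throughput, latency,
        # packet loss, and active-user count sampled per scheduling interval.
        _, (h_last, _) = self.lstm(kpi_window)
        out = self.head(h_last[-1])                      # (batch, slices * horizon)
        return out.view(-1, self.horizon, len(SLICES))   # per-slice demand forecast

if __name__ == "__main__":
    model = SliceLoadForecaster()
    window = torch.randn(8, 24, 4)    # 8 samples, 24 time steps, 4 KPIs each
    forecast = model(window)
    print(forecast.shape)             # torch.Size([8, 1, 3])

In a deployment of this kind, such a forecast would typically be consumed by a non-real-time or near-real-time RIC application that adjusts slice resource shares; the abstract attributes the policy reasoning and its explanations to LLAMA_V2 rather than to the forecaster itself.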
Original language: English
Article number: 7009
Journal: Sensors
Volume: 24
Issue number: 21
Publication status: Published - Nov 2024

Bibliographical note

Publisher Copyright:
© 2024 by the authors.

Keywords

  • 5G
  • AI/ML
  • LLM
  • network slicing
  • O-RAN
  • resource allocation

