Graph Signal Processing and Machine Learning for Prediction Problems in Mobile Communications Systems

Bibliographic details
Year of defense: 2021
Main author: Ortega, Yosbel Rodríguez
Advisor: Not informed by the institution
Defense committee: Not informed by the institution
Document type: Thesis
Access type: Open access
Language: eng
Defending institution: Not informed by the institution
Graduate program: Not informed by the institution
Department: Not informed by the institution
Country: Not informed by the institution
Keywords in Portuguese:
Access link: http://www.repositorio.ufc.br/handle/riufc/59223
Abstract: Mobile communications in fifth-generation (5G) scenarios still face significant challenges due to the adoption of real-time applications and mission-critical services with stringent requirements, such as greater bandwidth and reliability, lower latency, and the capacity to support a massive number of user equipments (UEs) with intense data transfer between them and the base stations (BSs). To tackle some of these challenges, the present thesis analyzes solutions for prediction problems based on graph signal processing (GSP) jointly with machine learning (ML) techniques, in order to efficiently deal with two relevant use cases of mobile communication systems: (i) beam management and (ii) the increased feedback channel burden. Specifically, graph-based sampling and reconstruction tools are exploited to reduce the load on the feedback channel, which is used to report the UE measurements that feed the ML-based predictors. Before addressing these two use cases, an overview of graph-theoretic definitions is presented, introducing the tools and concepts employed throughout this thesis. Concerning the beam management problem, an adaptive beam tracking framework for highly directional communications is proposed, which exploits the samples in a historical UE dataset (HUD) to efficiently estimate and predict the channel state at the BS. First, a supervised learning algorithm, namely K-nearest neighbors (K-NN), is proposed and evaluated for channel tracking and prediction. Then, as preliminary stages of the K-NN algorithm, a graph-based sampling and reconstruction strategy, called K-nearest neighbors with reconstruction (K-NN-R), is proposed in order to reduce the beam search space during the beam measurement stage, allowing a more efficient use of the feedback channel.
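The K-NN idea above can be illustrated with a minimal sketch. The thesis's actual features, distance metric, and HUD contents are not specified here, so the example below assumes a hypothetical setup where each HUD entry is a UE measurement vector (e.g. a position estimate) labeled with the best beam index observed for it; the prediction is a majority vote among the K closest HUD samples:

```python
import numpy as np

def knn_predict_beam(hud_features, hud_beams, query, k=5):
    """Predict a best-beam index for a query UE measurement by a K-NN
    majority vote over a historical UE dataset (HUD).

    hud_features: (N, d) array of past UE measurement vectors (hypothetical
                  features, e.g. position estimates).
    hud_beams:    (N,) array of best-beam indices observed for those samples.
    query:        (d,) measurement vector of the current UE.
    """
    # Euclidean distance from the query to every HUD sample
    dists = np.linalg.norm(hud_features - query, axis=1)
    nearest = np.argsort(dists)[:k]      # indices of the K closest samples
    votes = hud_beams[nearest]
    # Majority vote among the K nearest neighbors
    return int(np.bincount(votes).argmax())

# Toy usage: two UE clusters, each served by a different beam
hud = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
beams = np.array([1, 1, 3, 3])
predicted = knn_predict_beam(hud, beams, np.array([0.05, 0.0]), k=3)
```

This is only a sketch of the plain K-NN stage; the K-NN-R variant described in the abstract additionally reconstructs unmeasured beam qualities from a graph-based sampling step before the vote, which is not reproduced here.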
After that, the block-diagonal structure of the graph that represents a cellular vehicle-to-everything (C-V2X) measurement dataset is exploited, and based on it, a graph-based sampling strategy, namely smart Laplacian sampling (SLS), is proposed such that (i) the graph structure/connectivity is preserved and (ii) the amount of sampled UE measurements is reduced. Finally, the SLS proposal is evaluated on a latency prediction problem, in which it is predicted whether a packet can be delivered within a predetermined latency constraint.
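The SLS strategy itself is specific to the thesis, but the underlying GSP machinery of Laplacian-based sampling and reconstruction can be sketched generically. The example below assumes an undirected graph over UE measurements and a signal that is approximately bandlimited to the first few Laplacian eigenvectors; it greedily selects sample vertices that keep the sampled low-frequency basis well conditioned, then reconstructs the full signal by least squares. This is a standard bandlimited-sampling sketch, not the SLS algorithm:

```python
import numpy as np

def graph_laplacian(adj):
    """Combinatorial Laplacian L = D - A of an undirected graph."""
    return np.diag(adj.sum(axis=1)) - adj

def sample_and_reconstruct(adj, signal, n_samples, bandwidth):
    """Sample a graph signal on a subset of vertices and reconstruct it,
    assuming the signal lies (approximately) in the span of the first
    `bandwidth` Laplacian eigenvectors (the low-frequency basis)."""
    L = graph_laplacian(adj)
    _, U = np.linalg.eigh(L)          # eigenvectors by increasing frequency
    Uk = U[:, :bandwidth]             # low-frequency basis
    # Greedy sampling: pick vertices maximizing the smallest singular
    # value of the sampled basis, so reconstruction stays well posed
    chosen = []
    for _ in range(n_samples):
        best, best_score = None, -np.inf
        for v in range(adj.shape[0]):
            if v in chosen:
                continue
            score = np.linalg.svd(Uk[chosen + [v], :],
                                  compute_uv=False).min()
            if score > best_score:
                best, best_score = v, score
        chosen.append(best)
    # Least-squares reconstruction from the sampled values only
    coef, *_ = np.linalg.lstsq(Uk[chosen, :], signal[chosen], rcond=None)
    return chosen, Uk @ coef
```

For an exactly bandlimited signal, sampling at least `bandwidth` well-chosen vertices recovers the signal perfectly, which is the sense in which graph sampling reduces the number of reported UE measurements without losing the information the predictor needs.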