This is a research code repository for GPS-aided beam prediction and tracking for mmWave communication.
There are three folders inside the `notebooks` folder:

- `minmaxgeo_uebsvector`: the proposed model for beam prediction and tracking that takes the min-max normalized UAV geodetic position (latitude and longitude) and the UAV-BS unit vector as input. This folder contains the code implementation of the work *GPS-Aided Deep Learning for Beam Prediction and Tracking in UAV mmWave Communication*.
- `uebsvector_logscaledheight`: the proposed model for beam prediction and tracking that takes the UAV-BS unit vector and the log-scaled UAV height as input. Read the tl;dr article about it here.
- `baseline`: the baseline models for beam prediction and tracking described in [3] (NOTE: these implementations are not the official code from the original authors).
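The input features named above can be sketched as follows. This is a minimal illustration of the assumed formulas (min-max scaling, unit vector from base station to UAV, `log1p` height), not necessarily the repository's exact preprocessing:

```python
# Hypothetical sketch of the input features; not the repository's actual code.
import math

def minmax_normalize(values):
    """Scale a sequence to [0, 1] using its own min and max."""
    vmin, vmax = min(values), max(values)
    return [(v - vmin) / (vmax - vmin) for v in values]

def uav_bs_unit_vector(uav_xyz, bs_xyz):
    """Unit vector pointing from the base station toward the UAV."""
    diff = [u - b for u, b in zip(uav_xyz, bs_xyz)]
    norm = math.sqrt(sum(d * d for d in diff))
    return [d / norm for d in diff]

def log_scaled_height(height_m):
    """Compress the height range with a log transform (log1p keeps 0 valid)."""
    return math.log1p(height_m)
```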
- `00_test_drone_cnn_ed_rnn_experiment.ipynb` and `01_drone_cnn_ed_rnn_experiment.ipynb`: train the proposed model.
- `02_visualization_combination.ipynb` and `03_visualization.ipynb`: visualize the evaluation metrics.
- `00_test_onnx.ipynb`: measure the inference time of the PyTorch model and the ONNX model. Read the article about it here.
- `00_test_dataset_label.ipynb`: visualize the label distribution under various data set splitting methods.
- `00_test_drone_base_prediction.ipynb`: train a baseline beam prediction model [3].
- `00_test_drone_base_tracking.ipynb`: train a baseline beam tracking model [3].
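The inference-time comparison in `00_test_onnx.ipynb` can be approximated with a generic timing helper. This is a hypothetical sketch (the function name and parameters are not from the notebook); pass it any zero-argument callable, e.g. a wrapper around a PyTorch `model(x)` call or an onnxruntime `session.run(...)` call:

```python
# Hypothetical latency-measurement helper; not the notebook's actual code.
import time
from statistics import mean

def measure_inference_ms(infer, n_warmup=10, n_runs=100):
    """Average latency in milliseconds of calling `infer()` once."""
    for _ in range(n_warmup):  # warm-up runs to trigger lazy initialization
        infer()
    timings = []
    for _ in range(n_runs):
        start = time.perf_counter()
        infer()
        timings.append((time.perf_counter() - start) * 1000.0)
    return mean(timings)
```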
- Clone this repo.
- Enter the repo directory through the terminal.
- Run `poetry install` to install the dependencies (run `pip install poetry` first if you haven't installed Poetry yet).
- Run `poetry update` to update the library versions.
- Run `poetry shell` to activate the virtual environment.
- Download and extract the Scenario 23 data set zip file from the DeepSense6G website [1].
- Create a `data/raw/` folder inside the repo.
- Put the data set into `data/raw/`. The dataset folder should be named in the format `Scenario{scenario_number}` and should contain a `scenario{scenario_number}.csv` file.
- To run a Jupyter notebook, open it and make sure the kernel is set to the Poetry environment.
- Inside the notebook, make sure to change the repository path to the correct path (e.g. `sys.path.append('F:/repo/gpsbeam')`).
- Run the notebook.
- To reproduce the proposed model result, run `01_drone_cnn_ed_rnn_experiment.ipynb`.
- Visualize the result by running `02_visualization_combination.ipynb` and `02_visualization.ipynb`.
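As a quick sanity check that the data set was placed as the steps above require (`data/raw/Scenario{scenario_number}/scenario{scenario_number}.csv`), a small helper like the following can be used. This is a hypothetical snippet, not part of the repository:

```python
# Hypothetical layout check for data/raw/Scenario{n}/scenario{n}.csv.
import re
from pathlib import Path

def find_scenario_csv(repo_root):
    """Return the first scenario CSV found under data/raw/, or raise."""
    raw = Path(repo_root) / "data" / "raw"
    for folder in sorted(raw.glob("Scenario*")):
        match = re.fullmatch(r"Scenario(\d+)", folder.name)
        if match:
            csv = folder / f"scenario{match.group(1)}.csv"
            if csv.is_file():
                return csv
    raise FileNotFoundError(f"No Scenario{{n}}/scenario{{n}}.csv found under {raw}")
```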
- The preprocessed dataset will be saved in `data/processed/`.
- Experiment results will be saved in `data/experiment_result/`.
This repository is inspired by the following repositories:
Thanks to the tutorials from the following sources:
[1] A. Alkhateeb, G. Charan, T. Osman, A. Hredzak, and N. Srinivas, “DeepSense 6G: large-scale real-world multi-modal sensing and communication datasets,” to be available on arXiv, 2022. [Online]. Available: https://www.DeepSense6G.net
[2] G. Charan, A. Hredzak, C. Stoddard, B. Berrey, M. Seth, H. Nunez, and A. Alkhateeb, “Towards real-world 6G drone communication: Position and camera aided beam prediction,” in Proc. IEEE Global Communications Conference (GLOBECOM), 2022, pp. 2951–2956.

[3] G. Charan and A. Alkhateeb, “Sensing-aided 6G drone communications: Real-world datasets and demonstration,” arXiv preprint arXiv:2412.04734, 2024. [Online]. Available: https://arxiv.org/abs/2412.04734