UAV-Networks-Routing is a Python simulator for experimenting with routing algorithms and MAC protocols on unmanned aerial vehicle networks.

Overview

UAV-Networks Simulator - Autonomous Networking - A.A. 20/21

UAV-Networks-Routing is a Python simulator for experimenting with routing algorithms and MAC protocols on unmanned aerial vehicle networks. The project requires Python 3 and several dependencies. This code is released for the course of Autonomous Networking - A.A. 2020-2021, to develop and test AI-based protocols.

Execution

To execute the UAV-Networks-Routing project from the terminal, clone the git project, place yourself in the UAV-Networks-Routing directory, and run:

python -m src.main

The simulation will start in a new window. The parameters of the simulation are set in src.utilities.config; have a look at the simulation setup in the configuration file to understand what is going on in the simulation.

Project structure

The project has the following structure:

.
├── README.md
├── data
│   └── tours
│       ├── RANDOM_missions1.json
│       ├── ...
│       └── RANDOM_missions90.json
└── src
    ├── main.py
    ├── drawing
    │   ├── color.py
    │   ├── picture.py
    │   ├── pp_draw.py
    │   └── stddraw.py
    ├── entities
    │   └── uav_entities.py
    ├── experiments
    ├── routing_algorithms
    │   └── georouting.py
    ├── simulation
    │   ├── metrics.py
    │   └── simulator.py
    └── utilities
        ├── config.py
        └── utilities.py

The entry point of the project is the src.main file. From there you can run simulations and extensive experimental campaigns by setting up an appropriate src.simulation.simulator.Simulator object.

At a high level, the two main directories are data and src. The directory data must contain all the data of the project, such as drone tours and other inputs and outputs. The directory src contains the source code, organized in several packages.

  • src.drawing contains all the classes needed to draw the simulation on screen. Typically you will want to get your hands on this directory if you want to change the appearance of the simulation, or display a new object or label on the area.

  • src.entities contains all the classes that define the behaviour and the structure of the main entities of the project, such as the Drone, Depot, Environment, Packet, and Event classes.

  • src.experiments contains the classes that handle experimental campaigns.

  • src.routing_algorithms contains all the classes modelling the routing algorithms; every routing algorithm should have its own class, see the section Adding routing algorithms below.

  • src.simulation contains all the classes that handle a simulation and its metrics.

  • src.utilities contains all the utilities and the configuration parameters. In particular, use the src.utilities.config file to specify all the constants and parameters for a one-shot simulation, which is ideal when one wants to evaluate the quality of a routing algorithm with frequent executions. Constants and parameters should always be added here and never be hard-coded, as in the sketch below.
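As a minimal sketch of this convention (the constant name and the helper function below are hypothetical, not part of the current code base), a new parameter is defined once in src.utilities.config and imported wherever it is needed:

# src/utilities/config.py -- hypothetical new constant
COMMUNICATION_RANGE = 100.0   # float: meters, maximum drone-to-drone transmission range

# any other module -- import the constant instead of hard-coding the value
import math
from src.utilities import config

def in_range(coord_a, coord_b):
    """True if two (x, y) points are within the configured communication range."""
    return math.dist(coord_a, coord_b) <= config.COMMUNICATION_RANGE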

Understand the project

This section gives a high-level overview of the project. Before adding any new file to the project as a contribution, you may want to run some simulations, understand the idea behind the simulator, and look at the routing algorithms available.

Make some simulations

Run a simulation from src.main: a live simulation will be displayed in a new window, and at the end of the simulation some metrics will be printed. In the main function, a Simulator object is instantiated with the default parameters coming from src.utilities.config. In order to run different executions and simulations, vary the parameters in the config file appropriately.

Let us make an example with an excerpt of the configuration file:

SIM_DURATION = 7000   # int: steps of simulation. # ***
TS_DURATION = 0.150   # float: seconds duration of a step in seconds.
SEED = 1              # int: seed of this simulation.

N_DRONES = 5          # int: number of drones. # ***
ENV_WIDTH = 1500      # float: meters, width of environment.
ENV_HEIGHT = 1500     # float: meters, height of environment.

# events
EVENTS_DURATION = SIM_DURATION   # int: steps, number of time steps that an event lasts.
D_FEEL_EVENT = 500               # int: steps, a new packet is felt (generated on the drone) every 'D_FEEL_EVENT' steps. # ***
P_FEEL_EVENT = .25               # float: probability that the drones feels the event generated on the drone. # ***

From this excerpt, one expects a simulation that lasts for 7000 steps of 0.150 seconds each, i.e. 1050 seconds of simulated time. The execution will run with seed 1, with 5 drones flying over an area of 1500 m × 1500 m. The events on the map last for the entire duration of the simulation. The drones are set to feel an event every 500 steps, but they feel it only with probability 0.25.
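A quick back-of-the-envelope check of these values (the arithmetic below simply restates the excerpt; the expected-packet figure is only a rough average):

SIM_DURATION = 7000    # steps
TS_DURATION = 0.150    # seconds per step
D_FEEL_EVENT = 500     # steps between event generations
P_FEEL_EVENT = 0.25    # probability that a generated event is felt

simulated_seconds = SIM_DURATION * TS_DURATION         # 1050.0 s of simulated time
event_opportunities = SIM_DURATION // D_FEEL_EVENT     # 14 events generated per drone
expected_packets = event_opportunities * P_FEEL_EVENT  # ~3.5 packets felt per drone, on average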

Simulator and K-Routing algorithm

In the simulator, time is simulated. A simulation lasts for SIM_DURATION steps, each lasting TS_DURATION seconds. During a single step, as one can see from src.simulation.simulator.Simulator.run(), essentially 4 things happen for every drone (a simplified sketch follows the list):

  1. it feels an event, if it's the right moment and if it is lucky enough to grasp it from the environment.

  2. it updates the packets in its buffer, deleting all the packets that are expired.

  3. it routes its buffer to its neighbours, if it has any.

  4. it sets its next waypoint and moves towards it; the waypoint can be either a point on the map or the depot, depending on what the routing algorithm decides for it.
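A minimal sketch of this per-step loop is given below; the method names are illustrative and may not match the actual Simulator code.

# Illustrative sketch of a single simulation step -- not the actual Simulator.run() code.
def simulation_step(cur_step, drones, routing_algorithm):
    for drone in drones:
        # 1. event sensing: only at the right step, and only with probability P_FEEL_EVENT
        drone.feel_event(cur_step)

        # 2. buffer maintenance: drop the packets whose TTL has expired
        drone.update_packets(cur_step)

        # 3. routing: hand packets over to neighbours (or the depot), as the algorithm decides
        routing_algorithm.routing(drone, drone.neighbours(), cur_step)

        # 4. mobility: pick the next waypoint (mission point or depot) and move towards it
        drone.set_next_target()
        drone.move(cur_step)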

The UAVs can follow any path/tour given by a JSON file (a dict id_drone : list of waypoints). Notice that a waypoint is a 2-tuple (x, y), the coordinates of the point. Events are generated right on the drone. If an event is successfully "felt", the drone generates a packet out of it and is responsible for bringing it to the depot according to the routing algorithm currently running. Packets have a TTL and can expire, to avoid infinite ping-pongs, which are in any case rare.
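For illustration, a tour file with the structure described above might look like the snippet below (the exact key format and values are made up); it can be read with a plain json.load:

import json

# Hypothetical content of a tour file, e.g. data/tours/RANDOM_missions1.json:
# a dict mapping each drone id to its list of (x, y) waypoints.
# {
#   "0": [[250.0, 300.0], [900.0, 1200.0], [1400.0, 100.0]],
#   "1": [[100.0, 100.0], [750.0, 750.0]]
# }

with open("data/tours/RANDOM_missions1.json") as f:
    tours = json.load(f)   # {id_drone: [[x, y], ...], ...}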

The routing algorithms in the project go under the directory src.routing_algorithms.

Adding routing algorithms

Routing algorithms should be implemented as a class extending the src.routing_algorithms.BASE_routing class. This requires the definition of the required methods, such as routing().

Once created, the class should be declared in the configuration file, specifically in the RoutingAlgorithm enumeration, in which it suffices to give a name to the enumeration variable and associate it with the class name. For instance, if the created routing algorithm class is named MyRouting, then add the enumeration variable MY_ROUTING = MyRouting to the RoutingAlgorithm enumeration in src.utilities.config.

To run a simulation with your new routing algorithm, just set the attribute ROUTING_ALGORITHM in the config file to the enumeration variable of your choice, as sketched below.
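Putting the steps together, a new algorithm might be outlined as follows; the exact module layout, constructor, and routing() signature of BASE_routing may differ in the actual code, so treat this only as a sketch.

# src/routing_algorithms/my_routing.py -- illustrative outline only
from src.routing_algorithms.BASE_routing import BASE_routing

class MyRouting(BASE_routing):

    def routing(self, drone, drones, cur_step):
        # decide, for each packet in the drone's buffer, which neighbour
        # (or the depot) should receive it
        pass

# src/utilities/config.py -- register and select the new algorithm
# MY_ROUTING = MyRouting            # inside the RoutingAlgorithm enumeration
# ROUTING_ALGORITHM = RoutingAlgorithm.MY_ROUTING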

Contacts

For further information contact Andrea Coletta at coletta[AT]di.uniroma1.it.

Thanks and License

The current version of the simulator is free for non-commercial use. The simulator was developed in collaboration with Matteo Prata, PhD student at La Sapienza, prata[AT]di.uniroma1.it.
