The Road Scene Understanding for the Visually Impaired (RSU-VI) project aims to build a Sidewalk Environment Detection System that improves the mobility of visually impaired people by combining GPS data with image segmentation models fine-tuned for sidewalk recognition.
The system will be tested on a route from the main train station in Erlangen to the University Library of Erlangen-Nuremberg (Schuhstrasse 1a).
To create a conda environment on Windows:

```bash
conda create -n myenv python=3.9
conda activate myenv
pip install torch==2.1.2 torchvision==0.16.2 torchaudio==2.1.2 --index-url https://download.pytorch.org/whl/cu121
pip install -r requirements.txt
```
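To verify the install, a quick sanity check (not part of the project's scripts):

```python
# Sanity check for the PyTorch install; not part of the project's scripts.
import torch

print(torch.__version__)          # expected: 2.1.2+cu121
print(torch.cuda.is_available())  # True only if a CUDA-capable GPU is visible
```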
Running `rsu_vi.py`:

Make sure `input_video.mp4` and the GPS file are in the `input` folder, and `model.onnx` is in the `model_weights` folder.

For video input:

```bash
python rsu_vi.py "input/input_video.mp4" "segmentation_output.avi" "model_weights/model.onnx" "input/new.gpx" --headless
```
For image folder input:

```bash
python rsu_vi.py "images" "segmentation_output" "model_weights/model.onnx" "input/new.gpx"
```
For camera input:

```bash
python rsu_vi.py 0 "cam_output.avi" "model_weights/model.onnx" "input/new.gpx"
```
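For orientation, here is a minimal sketch of the per-frame ONNX segmentation these commands drive. The input size, normalization, and output layout below are assumptions, not the exact preprocessing in `rsu_vi.py`:

```python
# Sketch of per-frame ONNX segmentation; preprocessing details are assumed.
import cv2
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model_weights/model.onnx")
input_name = session.get_inputs()[0].name

cap = cv2.VideoCapture("input/input_video.mp4")
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Assumed preprocessing: resize, BGR -> RGB, HWC -> NCHW float32 in [0, 1].
    img = cv2.resize(frame, (1024, 512))
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
    img = np.transpose(img, (2, 0, 1))[None]
    # Assumed output shape (1, num_classes, H, W); argmax gives per-pixel class IDs.
    logits = session.run(None, {input_name: img})[0]
    mask = logits.argmax(axis=1)[0].astype(np.uint8)
cap.release()
```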
Exporting ONNX Model (`onnx_export.py`):

```bash
python onnx_export.py --pytorch="fine_tuned_mapillary.ckpt"
```
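The export itself typically looks like the sketch below; `SegmentationModel` is a hypothetical stand-in for the project's actual model class, and the input size is an assumption:

```python
# Sketch of a checkpoint-to-ONNX export. SegmentationModel is a hypothetical
# placeholder for the model class defined in the training code.
import torch

model = SegmentationModel.load_from_checkpoint("fine_tuned_mapillary.ckpt")
model.eval()

dummy = torch.randn(1, 3, 512, 1024)  # assumed (N, C, H, W) input size
torch.onnx.export(
    model,
    dummy,
    "model_weights/model.onnx",
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
)
```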
Training Cityscapes Model (`city_training.py`):

Set `dataset_path` in `city_config.py`, then run:

```bash
python city_training.py
```
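As a rough illustration of what such a training run involves, the sketch below pairs torchvision's `Cityscapes` dataset with a generic segmentation model; the actual architecture, hyperparameters, and label handling in `city_training.py` will differ:

```python
# Illustrative Cityscapes training loop, not the project's actual pipeline.
import torch
from torch.utils.data import DataLoader
from torchvision import transforms
from torchvision.datasets import Cityscapes
from torchvision.models.segmentation import deeplabv3_resnet50

dataset_path = "/data/cityscapes"  # plays the role of dataset_path in city_config.py

train_set = Cityscapes(
    dataset_path, split="train", mode="fine", target_type="semantic",
    transform=transforms.ToTensor(),
    target_transform=transforms.PILToTensor(),
)
loader = DataLoader(train_set, batch_size=4, shuffle=True, num_workers=4)

model = deeplabv3_resnet50(num_classes=34)  # 34 raw Cityscapes label IDs
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = torch.nn.CrossEntropyLoss()

model.train()
for images, targets in loader:  # one epoch
    optimizer.zero_grad()
    logits = model(images)["out"]
    loss = criterion(logits, targets.squeeze(1).long())
    loss.backward()
    optimizer.step()
```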
Converting Mapillary Masks to Grayscale (`convert_masks_to_grayscale.py`):

Set `json_path`, `masks_path`, and `op_path` in `map_config.py`, then run:

```bash
python convert_masks_to_grayscale.py
```
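The conversion maps each RGB label color to a single integer class ID. A minimal sketch, assuming Mapillary Vistas-style masks and a config JSON that lists one color per label (the variable names mirror those in `map_config.py`):

```python
# Sketch of RGB-mask-to-class-ID conversion for Mapillary Vistas-style labels.
import json
import os
import numpy as np
from PIL import Image

json_path = "config.json"    # label definitions with one RGB color per class
masks_path = "masks"         # folder of RGB label masks
op_path = "masks_grayscale"  # output folder for single-channel masks

with open(json_path) as f:
    labels = json.load(f)["labels"]

os.makedirs(op_path, exist_ok=True)
for name in os.listdir(masks_path):
    rgb = np.array(Image.open(os.path.join(masks_path, name)).convert("RGB"))
    gray = np.zeros(rgb.shape[:2], dtype=np.uint8)
    for class_id, label in enumerate(labels):
        # Pixels matching this label's color get its integer class ID.
        gray[np.all(rgb == np.array(label["color"], dtype=np.uint8), axis=-1)] = class_id
    Image.fromarray(gray).save(os.path.join(op_path, name))
```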
Fine-tuning on Mapillary Dataset (`training_pipeline.py`):

Set `mapillary_train_path`, `mapillary_val_path`, `mapillary_test_path`, and `city_ckpt_path` in `map_config.py`, then run:

```bash
python training_pipeline.py
```
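Conceptually, this step seeds the model with the Cityscapes weights before training on Mapillary. A minimal sketch under the same assumptions as above (architecture and class count are placeholders; the training loop itself mirrors the earlier sketch and is omitted):

```python
# Sketch of warm-starting from the Cityscapes checkpoint; architecture
# and class count are assumptions, and the training loop is omitted.
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

city_ckpt_path = "cityscapes.ckpt"  # plays the role of city_ckpt_path in map_config.py

model = deeplabv3_resnet50(num_classes=66)  # assumed Mapillary label count
state = torch.load(city_ckpt_path, map_location="cpu")
weights = state.get("state_dict", state)  # Lightning nests weights under "state_dict"

# Copy only weights whose shapes match; the classification head differs
# between the Cityscapes and Mapillary label sets and is trained fresh.
reference = model.state_dict()
compatible = {k: v for k, v in weights.items()
              if k in reference and v.shape == reference[k].shape}
model.load_state_dict(compatible, strict=False)

# Fine-tuning typically uses a smaller learning rate than training from scratch.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)
```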
Performing Inference on an Image (`inference.py`):

Set `map_ckp_path`, `img_dir`, and `op_dir` in `map_config.py`, then run:

```bash
python inference.py
```
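A minimal sketch of what the inference step amounts to, under the same placeholder architecture as above; the variable names mirror `map_ckp_path`, `img_dir`, and `op_dir` in `map_config.py`:

```python
# Sketch of image-folder inference; architecture and preprocessing are assumed.
import os
import torch
from PIL import Image
from torchvision import transforms
from torchvision.models.segmentation import deeplabv3_resnet50

map_ckp_path = "fine_tuned_mapillary.ckpt"
img_dir = "images"
op_dir = "output"

model = deeplabv3_resnet50(num_classes=66)  # assumed Mapillary label count
state = torch.load(map_ckp_path, map_location="cpu")
model.load_state_dict(state.get("state_dict", state), strict=False)
model.eval()

to_tensor = transforms.ToTensor()
os.makedirs(op_dir, exist_ok=True)
with torch.no_grad():
    for name in os.listdir(img_dir):
        img = to_tensor(Image.open(os.path.join(img_dir, name)).convert("RGB"))
        # Per-pixel class IDs, saved as a single-channel grayscale mask.
        pred = model(img[None])["out"].argmax(dim=1)[0].byte()
        Image.fromarray(pred.numpy()).save(os.path.join(op_dir, name))
```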
The following warning can be safely ignored:

```
[W NNPACK.cpp:64] Could not initialize NNPACK! Reason: Unsupported hardware.
```
This project is licensed under the MIT License. See the LICENSE file for more information.
For questions or support, please contact the project team at nurulgofran@gmail.com.