Classify Object Behavior to Enhance the Safety of Autonomous Vehicles
Automatically classify the behavior of tracked objects to enhance the safety of autonomous systems.
This project is part of the Matlab-Simulink Challenge and focuses on classifying object behavior and assessing the risk it poses to autonomous vehicles. The project leverages MATLAB and Simulink to create a robust model for autonomous driving scenarios.
This project was invited to the ReadyTensor Computer Vision Competition.
The project aims to classify the behavior of objects within autonomous driving scenarios using a Recurrent Neural Network (RNN). The goal is to understand the behavior of objects and predict their risk level, which is crucial for the safe operation of autonomous vehicles.
It is certified by MathWorks with the Excellence in Innovation Certificate.
I am Jakub, currently enrolled as a computer science student pursuing a bachelor's degree. Since I have been working with MATLAB for some time now, I was more than excited to have the opportunity to take part in the MATLAB/Simulink Challenge.
I chose this particular project because I am not only interested in and enthusiastic about learning more about Artificial Intelligence, but I also really like writing MATLAB code with real-world applications.
To begin this project, I created a MATLAB project, initialized a Git repository, and connected it to this remote repository. I used the Automated Driving Toolbox™ and Deep Learning Toolbox™ to build, simulate, and train models.
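For readers who want to reproduce that setup, a rough sketch is shown below; the folder layout is an assumption, and only the remote URL is taken from the Getting Started section.

% Create a new MATLAB project in a local folder (folder name is an example)
proj = matlab.project.createProject(fullfile(pwd, 'Behavior-Classification-For-AV'));
cd(proj.RootFolder);

% Initialize Git and connect the remote repository via the system shell
!git init
!git remote add origin https://github.com/JakubSchwenkbeck/Behavior-Classification-For-AV.git

% Confirm that the required toolboxes are installed
ver('driving')   % Automated Driving Toolbox
ver('nnet')      % Deep Learning Toolbox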
The Scenario Designer from the Automated Driving Toolbox™ was used to create driving scenarios, including traffic, pedestrians, and different driving behaviors.
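The scenarios themselves were assembled interactively in the app, but the same Automated Driving Toolbox API is available programmatically. The sketch below is purely illustrative (it is not one of the scenarios shipped with this project) and only shows how a simple scene with an ego vehicle and a crossing pedestrian could be put together.

% A minimal programmatic scene: one road, an ego vehicle and a crossing pedestrian
scenario = drivingScenario('SampleTime', 0.1);
road(scenario, [0 0; 80 0], 'Lanes', lanespec(2));

egoVehicle = vehicle(scenario, 'ClassID', 1);
trajectory(egoVehicle, [5 -2 0; 75 -2 0], 15);      % waypoints and speed in m/s

pedestrian = actor(scenario, 'ClassID', 4, 'Length', 0.4, 'Width', 0.4, 'Height', 1.7);
trajectory(pedestrian, [40 6 0; 40 -6 0], 1.5);     % walks across the road

% The scenario can also be opened and refined interactively in the app
drivingScenarioDesigner(scenario);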
The data found here is created by running the Scene Builder simulation and then exporting the sensor data. The files do not contain the scenario itself, only the sensor data, so they can be opened in the regular MATLAB workspace.
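A minimal sketch of loading such an exported file is shown below; the folder and file names are placeholders, not the exact names used in the Data folder.

% Load exported sensor data into the workspace (file name is a placeholder)
sensorData = load(fullfile('Data', 'scenario1_sensorData.mat'));

% Inspect which variables the export contains
disp(fieldnames(sensorData));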
I chose an RNN to classify risky vs. safe behavior since it is ideal for processing sequences of time-series data and can capture temporal dependencies in object behavior.
layers = [
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits, 'OutputMode', 'sequence')
    dropoutLayer(0.2)
    fullyConnectedLayer(numUniqueValues)
    softmaxLayer
    classificationLayer];

Training Options:
options = trainingOptions('adam', ...
    'MaxEpochs', 2500, ...
    'InitialLearnRate', 1e-3, ...
    'MiniBatchSize', 32, ...
    'Shuffle', 'every-epoch', ...
    'ValidationFrequency', 50, ...
    'Verbose', false, ...
    'Plots', 'training-progress');
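With the layers and options above, training comes down to a single trainNetwork call. The variable names XTrain and YTrain below are assumptions standing in for the preprocessed sensor sequences and their labels; the actual preprocessing lives in the repository's training code.

% XTrain: cell array of [numFeatures x numTimeSteps] sequences from the sensor data
% YTrain: cell array of categorical label sequences ('safe' / 'risky') of matching length
% (variable names are assumptions, not necessarily the names used in the repository)
net = trainNetwork(XTrain, YTrain, layers, options);

% Classify a new sequence and inspect the per-time-step risk labels
predictedLabels = classify(net, XTrain{1});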
Visualization Folder: Contains various visualizations related to scenario testing, including 2D and 3D representations of object trajectories and behaviors.
Main.m: Main entry point to run the project, visualize results, and interact with the GUI.
VisGUI.m: A custom GUI for interacting with the different Scenario Visualizations.
The scenes are created in the Scene Builder and then converted into a program via the 'Export MATLAB Function' utility.
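An exported scenario function can then be run directly in code; the function name and output arguments below are placeholders for whatever the utility generates from the session.

% Run an exported scenario function (name is a placeholder for the generated function)
[scenario, egoVehicle] = exportedScenario();

% Step through the simulation and collect the poses of the tracked objects
poses = {};
while advance(scenario)
    poses{end+1} = targetPoses(egoVehicle); %#ok<AGROW>
end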
I extended the project by integrating a Java interface that connects a model performing object recognition. The integration between MATLAB and Java enhances the object risk classification model by incorporating visual input directly from the vehicle's sensors.
The RNN model achieved an accuracy ranging between 95% and 99% when tested on various driving scenarios. The confusion matrix generated after testing shows that the model is highly effective at distinguishing between safe and risky behaviors.
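The reported numbers come from the project's own test runs; a typical way to reproduce such an evaluation looks roughly like this (XTest, YTest and the row-vector orientation of the label sequences are assumptions).

% XTest / YTest: held-out sequences and their ground-truth labels (names are assumptions)
YPred = classify(net, XTest);

% Mean per-sequence accuracy over the test set
accuracy = mean(cellfun(@(p, t) mean(p(:) == t(:)), YPred, YTest));
fprintf('Test accuracy: %.2f %%\n', 100 * accuracy);

% Confusion matrix over all time steps (assumes label sequences are row vectors)
confusionchart(cat(2, YTest{:}), cat(2, YPred{:}));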
With the self-written model in Java and a self-written Java-MATLAB interface, the object recognition model in Java could receive visual input from the car and enhance the object risk classification with additional data such as labels or movement predictions.
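A minimal sketch of how such a Java model can be reached from MATLAB is shown below; the class path, class name and method name are placeholders rather than the actual interface used in this repository.

% Make the compiled Java classes visible to MATLAB (path is a placeholder)
javaaddpath(fullfile(pwd, 'java', 'classes'));

% Instantiate the recognition model (class and method names are placeholders)
recognizer = javaObject('ObjectRecognizer');

% Pass a camera frame to Java and get back a label for the detected object
frame = rand(224, 224, 3);                    % stand-in for a real camera frame
label = char(recognizer.classifyFrame(frame(:)));

% The returned label can then be appended to the feature sequence fed to the RNN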
This project demonstrated the effectiveness of using RNNs for classifying object behavior in autonomous driving scenarios. By leveraging MATLAB and Simulink, I was able to build a model that accurately predicts the risk level posed by different objects, which is essential for the safe operation of autonomous vehicles.
To get started with this project:
git clone https://github.com/JakubSchwenkbeck/Behavior-Classification-For-AV.git
cd Behavior-Classification-For-AV
Go to the Main.m function, which is the main entry point. It can run out of the box (with the right toolboxes installed). You can also run the GUI.m function to interact with the provided graphical user interface.
Download a small dataset:
Dropbox Link
This project is licensed under the MIT License. See the LICENSE file for details.