A Filtering and Detection Framework for Collaborative Localization

Ford Center for Autonomous Vehicles, University of Michigan

This work was completed at the lab as an independent research project by a group of two, with mentorship from Ankit Vora and Siddarth Agarwal of Ford Motor Company and Dr. Katie Skinner. My focus was on filtering and state estimation.

My contribution to this work: developing the filtering setup and the perception module simulator, integrating the two, and performing ablation studies.

Problem Description

Not all agents in a multi-agent autonomous-vehicle network possess equivalent sensing suites in terms of accuracy and precision. Ideally, the sensing suite available on one vehicle could be exploited to improve the localization of the other vehicles.

This collaborative localization problem is the focus of this work.

Project overview

We propose, and present results from, a filtering framework that combines pose information derived from vision (here, a perception simulator) with odometry to improve the localization of the ADAS vehicle following the smart vehicle.

Implementation - How?

The subject vehicle (the one following the smart vehicle) is called the 'challenge' vehicle. The perception simulator uses the relative pose (smart-to-challenge) to generate an additional observation of the challenge vehicle's pose, which is then fused with the proprioceptive information using an EKF.
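The fusion step above can be sketched as a standard EKF measurement update, where the perception-derived global pose is treated as a direct observation of the planar state [x, y, theta]. This is a minimal illustrative sketch, not the project's implementation; all matrix values are assumed placeholders.

```python
import numpy as np

def ekf_update(x, P, z, R):
    """Fuse a pose observation z (covariance R) into the
    odometry-predicted state x (covariance P)."""
    H = np.eye(3)                     # pose is observed directly
    y = z - H @ x                     # innovation
    y[2] = np.arctan2(np.sin(y[2]), np.cos(y[2]))  # wrap heading residual
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

x = np.array([1.0, 2.0, 0.1])         # odometry-predicted pose (assumed)
P = np.diag([0.5, 0.5, 0.05])         # prediction covariance (assumed)
z = np.array([1.2, 1.9, 0.12])        # perception-derived pose (assumed)
R = np.diag([0.1, 0.1, 0.01])         # observation noise (assumed)
x, P = ekf_update(x, P, z, R)
```

After the update, the state lies between the odometry prediction and the observation, weighted by their covariances, and the posterior covariance shrinks.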

Implementation - The Result?

Using the transformations described above, we were able to improve the localization accuracy of the ADAS (challenge) vehicle.
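The underlying transformation is a chaining of planar poses: composing the smart vehicle's global pose with the measured smart-to-challenge relative pose yields the observation of the challenge vehicle's global pose. A minimal SE(2) sketch, with illustrative values assumed:

```python
import numpy as np

def se2_matrix(x, y, theta):
    """Homogeneous transform for a planar pose (x, y, heading)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

def compose_global_pose(smart_pose, rel_pose):
    """Chain the smart vehicle's global pose with the
    smart-to-challenge relative pose to obtain an observation
    of the challenge vehicle's global pose."""
    T = se2_matrix(*smart_pose) @ se2_matrix(*rel_pose)
    theta = np.arctan2(T[1, 0], T[0, 0])
    return T[0, 2], T[1, 2], theta

# Smart vehicle at (10, 0) heading east; challenge vehicle 5 m behind it.
x, y, th = compose_global_pose((10.0, 0.0, 0.0), (-5.0, 0.0, 0.0))
```

Working in homogeneous coordinates keeps the composition a single matrix product, which also generalizes cleanly to SE(3) if full 6-DoF poses are used.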

Project outcome

An example of the results generated under various noise levels is shown here. 
