Interaction-Aware Vehicle Trajectory Prediction via Attention Mechanism and Beyond
Time: Tue 2022-06-21 14.00 - 15.00
Location: Zoom link https://kth-se.zoom.us/j/68330181904
Video link: https://kth-se.zoom.us/j/68330181904
Language: English
Respondent: Wenxuan Wu, DCS
Opponent: Ning Wang
Supervisor: Yuchao Li
Examiner: Jonas Mårtensson
Abstract: With the development of autonomous driving technology, vehicle trajectory prediction has become a hot topic in the intelligent traffic area. However, complex road conditions may pose multiple challenges to vehicle trajectory prediction models. To address these challenges, most recent studies focus on designing different neural network structures to learn vehicles' dynamics and interaction features for better prediction. In this thesis, we restrict our research scope to highway scenarios. Based on an experimental comparison among Vanilla-RNN, Vanilla-LSTM, and Vanilla-Transformer, we identify the best configuration of the dynamics-only encoder module and use it to design a novel model, the LSTM-attention model, for vehicle trajectory prediction. The objective of our design is to explore whether a self-attention-based encoder outperforms the pooling-based encoder used in most current baseline models. The experimental results on the interaction encoder module show that the self-attention-based encoder with 8 heads outperforms the pooling-based encoder at longer prediction horizons. To test the robustness of our LSTM-attention model, we also compare the prediction performance of a maneuver-based decoder and a maneuver-free decoder, and we find that the maneuver-based decoder performs better on the heavily imbalanced NGSIM dataset. Finally, to explore other latent interaction features that our LSTM-attention model might fuse, we analyze a graph-based encoder and a polar-based encoder, respectively. Based on this analysis, we identify more promising designs to exploit in future work.
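
To make the encoder comparison concrete, below is a minimal PyTorch sketch, not the thesis code: the module names, tensor shapes, and the hidden size of 64 are illustrative assumptions. It contrasts a pooling-based interaction encoder with an 8-head self-attention interaction encoder of the kind the abstract refers to.

# Minimal sketch (illustrative only, not the thesis implementation).
# Shapes: neighbour_feats (batch, n_neighbours, hidden_dim),
#         target_feat    (batch, 1, hidden_dim).
import torch
import torch.nn as nn

class PoolingInteractionEncoder(nn.Module):
    """Summarises neighbour encodings with a fixed, permutation-invariant max-pool."""
    def __init__(self, hidden_dim=64):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, neighbour_feats):
        # Project each neighbour, then keep the element-wise maximum over neighbours.
        pooled, _ = self.proj(neighbour_feats).max(dim=1)
        return pooled                        # (batch, hidden_dim)

class AttentionInteractionEncoder(nn.Module):
    """Lets the target vehicle attend to its neighbours with multi-head self-attention."""
    def __init__(self, hidden_dim=64, num_heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)

    def forward(self, target_feat, neighbour_feats):
        # Query: target vehicle encoding; keys/values: neighbour encodings.
        fused, _ = self.attn(target_feat, neighbour_feats, neighbour_feats)
        return fused.squeeze(1)              # (batch, hidden_dim)

In the pooling variant, every neighbour contributes through the same fixed aggregation, whereas the attention variant lets the target vehicle weight each neighbour differently per head, which is one plausible reason such an encoder can help at longer prediction horizons.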