
A Discrete Choice Model - Analyzing non-linear contributions to predictive performance

A machine learning approach investigating how neural networks' ability to learn non-linear phenomena efficiently can improve the predictive performance of a four-step travel demand model.

This paper builds and investigates a discrete choice model of activity-based travel demand using artificial neural networks with non-linear activations. This approach can be seen as an extension in complexity of the conventional multinomial logit (MNL) model. By using intermediate layers and many more parameters, variables are allowed to interact in a large number of combinations, letting the model learn the non-linear phenomena present in the data more efficiently.
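The contrast between the two model classes can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's architecture: the weights below are random placeholders, and the layer sizes (one hidden layer of 16 units, three alternatives) are assumptions for the example only. Both models map explanatory variables to choice probabilities via a softmax; the neural variant inserts a hidden ReLU layer so the utilities become non-linear functions of the inputs.

```python
import numpy as np

def softmax(u):
    # Numerically stable softmax: utilities -> choice probabilities.
    e = np.exp(u - u.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 4))          # 5 observations, 4 explanatory variables

# Conventional MNL: utilities are linear in the inputs.
W_lin = rng.normal(size=(4, 3))      # 3 choice alternatives
p_mnl = softmax(x @ W_lin)

# Neural extension: a hidden ReLU layer lets variables interact non-linearly.
W1 = rng.normal(size=(4, 16))
W2 = rng.normal(size=(16, 3))
h = np.maximum(0.0, x @ W1)          # non-linear activation
p_nn = softmax(h @ W2)

print(p_mnl.sum(axis=1))             # each row of probabilities sums to 1
```

In both cases the output is a valid probability distribution over the alternatives; only the shape of the utility function changes.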

In a four-step model framework, the final model combines three sub-models: generation, distribution, and mode choice. At each of the 108 10-minute intervals between 05:00 and 23:00, the generation model determines whether a simulated agent stays at its current activity or takes a trip in the next time step. The distribution and mode choice models are applied whenever a trip is performed. Tests on hold-out data show that successive non-linear additions in the distribution model increase performance, and that the proposed model structure can simulate travel demand at a very detailed level.
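The simulation loop implied by this structure might be sketched as follows. The three sub-model functions here are hypothetical stand-ins (uniform random draws with made-up zone and mode names), not the paper's trained networks; the sketch only shows how generation, distribution, and mode choice chain together over the 108 time steps.

```python
import random

random.seed(1)

N_STEPS = 108  # 10-minute intervals from 05:00 to 23:00

# Placeholder sub-models -- in the paper these are neural networks.
def generation_model(agent, t):
    # Decide whether the agent leaves its current activity this step.
    return random.random() < 0.05

def distribution_model(agent, t):
    # Choose a destination zone for the trip (hypothetical zone names).
    return random.choice(["zone_A", "zone_B", "zone_C"])

def mode_choice_model(agent, t, dest):
    # Choose a travel mode for the trip.
    return random.choice(["car", "transit", "walk"])

agent = {"zone": "zone_A", "activity": "home", "trips": []}
for t in range(N_STEPS):
    if generation_model(agent, t):            # generation: stay or travel?
        dest = distribution_model(agent, t)   # distribution: where to?
        mode = mode_choice_model(agent, t, dest)  # mode choice: how?
        agent["trips"].append((t, dest, mode))
        agent["zone"] = dest

print(len(agent["trips"]))  # number of trips this agent made during the day
```

The key structural point is that distribution and mode choice are only invoked conditionally on the generation model's decision, which matches the sequential use of the three sub-models described above.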

Figure: The probability of choosing a zone and performing one of the colored activities is visualized by each circle's size. The agent is currently positioned in the more central parts of Stockholm, indicated by the white circle, performing the work activity at 4 pm. In the top left corner, the red circle indicates the home activity. All activities other than home, work, and trips to the same zone have their probabilities scaled such that their maximum probability at each time is increased to 35% of a c
Belongs to: Department of Urban Planning and Environment
Last changed: May 02, 2022