Sunday, February 6, 2022

Taxi Dispatch

A question on OR Stack Exchange concerns dispatching taxis from a taxi stand. The assumptions are as follows.

  • There are $t$ taxis operating between points A and B, with all passenger traffic going from A to B. For concreteness, I'll refer to a "taxi stand" with a "line" of taxis, similar to what you might see at an airport or popular hotel.
  • The round-trip travel time for a taxi (taking passengers from A to B and deadheading back from B to A) is deterministic with value $T$. (Note: Deterministic travel time is the author's assumption, not mine.)
  • Each taxi has a capacity of $p$ passengers and is dispatched only when full.
  • Passengers arrive according to a Poisson process with parameter (rate) $\lambda$.
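As a quick illustration of that last assumption: Poisson arrivals are equivalent to independent exponential interarrival times, so simulated arrival times are easy to generate. Here is a minimal R sketch (the function name `simulate_arrivals` is mine, not something from the original code):

```r
# Arrival times of the first n passengers in a Poisson process with
# rate lambda: cumulative sums of i.i.d. exponential interarrival gaps.
simulate_arrivals <- function(n, lambda) {
  cumsum(rexp(n, rate = lambda))
}

set.seed(123)                         # for reproducibility
head(simulate_arrivals(10000, 0.7))   # first few arrival times
```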

The author was looking for the mean time between dispatches.

It has been a very long time since I taught queuing theory, and I have forgotten most of what I knew. My best guess for the mean time between dispatches, under the assumption that the system has reached steady state, was $p/\lambda$, the long-run average time required for $p$ passengers to arrive. To reach steady state, the average arrival rate ($\lambda$) must be less than the maximum full-blast service rate ($tp/T$). If arrivals occur faster than that, in the long run you will be dispatching a taxi every $T/t$ time units while either the queue explodes or there is significant balking/reneging.
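That reasoning is compact enough to encode directly. The following is a minimal sketch, not part of my original code: the helper `mean_dispatch_gap` is my own invention, and I write $T$ as `cycle_time` to avoid clashing with R's `T` shorthand for `TRUE`.

```r
# Predicted long-run mean time between dispatches.
# Stable case: p arrivals accumulate between dispatches, at rate lambda.
# Saturated case: t taxis cycle back-to-back, one dispatch every T/t.
mean_dispatch_gap <- function(t, p, cycle_time, lambda) {
  if (lambda < t * p / cycle_time) {
    p / lambda
  } else {
    cycle_time / t
  }
}
```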

To test whether my answer was plausible (under the assumption of a steady state), I decided to write a little discrete event simulation (DES). There are special-purpose DES programs, but it's been way too long since I used one (or had a license), so I decided to try writing one in R. A little googling turned up at least a couple of DES packages for R, but I did not want to commit much time to learning their syntax for a one-off experiment, plus I was not sure whether any of them handled batch processing. So I wrote my own from scratch.

My code is not exactly a paragon of efficiency. Given the simplicity of the problem and the fact that I would only be running it once or twice, I opted for minimal programmer time rather than minimal run time. That said, it can run 10,000 simulated passenger arrivals in a few seconds. The program uses one data frame as what is known in DES (or at least was known back in my day) as an "event calendar", holding, in chronological order, three types of events: passengers joining the queue, taxis being dispatched, and taxis returning.
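For anyone who wants the flavor of that structure, here is a rough sketch of such an event calendar in R. The names and layout are mine for illustration, not necessarily what is in the notebook.

```r
# The event calendar: one row per pending event, kept sorted by time.
calendar <- data.frame(time = numeric(0),
                       type = character(0))  # "arrival", "dispatch", "return"

# Insert an event while preserving chronological order.
schedule <- function(calendar, time, type) {
  calendar <- rbind(calendar, data.frame(time = time, type = type))
  calendar[order(calendar$time), ]
}

# The main loop repeatedly pops the earliest event and processes it:
# ev <- calendar[1, ]; calendar <- calendar[-1, ]; ...handle ev$type...
```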

The simulation starts with all taxis in line and no passengers present. I did not bother with a separate "warm-up" period to reach steady state, which would be required if I were striving for more accuracy. For a typical trial run ($t=5$, $p=4$, $T=20$, $\lambda=0.7$), my guess at the mean time between dispatches works out to 5.714, while the mean time computed from the run was 5.702. So I'm inclined to trust my answer (and to quit while I'm ahead).
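As a quick sanity check on those trial parameters (again writing $T$ as `cycle_time`):

```r
t <- 5; p <- 4; cycle_time <- 20; lambda <- 0.7
lambda < t * p / cycle_time  # TRUE: 0.7 < 1, so a steady state is attainable
p / lambda                   # 5.714286, the predicted mean time between dispatches
```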

If anyone is curious (and will forgive the lack of efficiency), you can take a look at my R notebook (which includes the code).
