Predicting staff shortages for a fire and rescue service
Our model accurately predicted 7 out of 10 staffing problems for a London fire and rescue service. This allowed better allocation of resources, producing a more efficient fire service.
Our client was a fire and rescue service operating in a capital city.
When the fire service is called out to an incident, each fire crew is made up of team members with complementary skills and training. Not every firefighter holds every skill, so staff absences can be a problem: at worst, a crew short of key skills leaves the service unable to respond to emergencies.
Unanticipated disruptions to the team frequently left crews unable to respond to emergencies, and at the start of each shift managers faced the time-consuming, tedious task of phoning around for replacement specialist crew members. We were asked to scrutinise and classify the data to predict when staff shortages would occur, and to provide a method for finding replacements.
First, we analysed the rich data set, alongside staff interviews, to determine how the data might be used to improve the process. We then simulated the occasions on which a fire engine could not be deployed because of understaffing.
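The understaffing condition described above can be sketched in a few lines. This is a hypothetical illustration, not the service's actual rules: the skill names and the requirement that every listed skill be covered by at least one crew member are assumptions for the example.

```python
# Hypothetical sketch: flag a shift as understaffed when the crew on duty
# does not collectively cover every required specialist skill.
# Skill names below are illustrative, not the real skill taxonomy.
REQUIRED_SKILLS = {"driver", "incident_command", "breathing_apparatus"}

def is_understaffed(crew_skills):
    """crew_skills: one set of skills per firefighter on the shift."""
    covered = set().union(*crew_skills) if crew_skills else set()
    # Understaffed if any required skill is missing from the combined crew.
    return not REQUIRED_SKILLS <= covered

# A crew with no incident commander would be flagged:
shift = [{"driver", "breathing_apparatus"}, {"breathing_apparatus"}]
print(is_understaffed(shift))  # True
```

Applying a check like this over historical rota and absence records yields the labelled examples (understaffed vs fully crewed shifts) that a classifier can learn from.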
We randomly split the data into three sets (training, validation and test) and made pre-processing adjustments before testing a variety of classifier models, including random forests and gradient boosted decision trees.
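A three-way split like the one above is typically done in two steps. This is a minimal sketch with placeholder data and an assumed 60/20/20 ratio (the source does not state the actual proportions):

```python
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))      # placeholder features (e.g. rota, leave, skills)
y = rng.integers(0, 2, size=1000)   # placeholder label: 1 = shift understaffed

# Step 1: hold out 40% of the data; step 2: split that half into
# validation and test sets, giving a 60/20/20 split overall.
X_train, X_tmp, y_train, y_tmp = train_test_split(
    X, y, test_size=0.4, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(
    X_tmp, y_tmp, test_size=0.5, random_state=42)
```

The training set fits each model, the validation set guides model selection and tuning, and the test set is touched only once, for the final performance figure.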
After tuning each estimator’s hyperparameters with a grid search, we found that gradient boosted decision trees outperformed all other classifiers. The model worked well: it classified 72% of the test data correctly.
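A grid search of this kind can be sketched with scikit-learn's `GridSearchCV`. The parameter grid and synthetic data below are assumptions for illustration, not the values used in the project:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in for the staffing data.
X, y = make_classification(n_samples=400, n_features=8, random_state=0)

# Illustrative grid: every combination is fitted and scored by cross-validation.
param_grid = {
    "n_estimators": [50, 100],
    "learning_rate": [0.05, 0.1],
    "max_depth": [2, 3],
}
search = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid, cv=3, scoring="accuracy")
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

`GridSearchCV` exhaustively evaluates all 8 combinations here; the best-scoring hyperparameters are then refitted on the full training set and assessed on the held-out test data.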
Our algorithms predicted with a high level of certainty when staff shortages were likely to occur, and which skills would need to be covered when they did. The model accurately predicted 7 out of 10 staffing problems, allowing better allocation of resources and producing a more efficient fire service.