Data-Driven Decisions for Law Enforcement in Toronto

 

For today’s leading deep learning methods and technology, attend the conference and training workshops at Predictive Analytics World for Government, Sept 17-21, 2018 in Washington, DC.

Data-driven decisions for law enforcement are not new; they have been used for over 20 years. New York City is often considered a pioneer in embracing methodologies and technologies to reduce crime. Under Rudy Giuliani’s leadership, the rate of serious crimes in New York City fell by 65% between 1993 and 2003. Although a number of factors were cited as the major reasons for this reduction, the use of data and technology such as Compstat was certainly one of the driving factors. The end result was better alignment and allocation of police resources across New York City’s five boroughs: the pin-mapping technology behind Compstat allowed the city to highlight areas where crime was increasing.

The success of New York City has led other jurisdictions, such as the Toronto Police Service (TPS), to explore the use of data and technology to better align police resources with crime. Let’s explore this in more detail.

The partnership between TPS and Environics Analytics (EA) broadened access to data and to the potential insights that could support more effective law enforcement. For example, TPS had all of its crime data available from the division level down to its most granular level, the neighbourhood. EA has over 22,000 variables comprising demographic, attitudinal, and behavioural information that can be used to develop profiles for a given level of geography.

Our first goal was to use existing TPS data to be responsive, at a given point in time, to prevailing workforce conditions. In embarking on this type of initiative, domain knowledge of TPS and its services was critical in identifying the specific workforce metrics used to evaluate law enforcement effectiveness. These metrics included average response/service time (overall and by time of day), the number and type of units dispatched, and so on. The most important component, however, was segmenting events or incidents into two priority segments ranked by severity: priority 1 covered the most severe incidents, such as murders or gunshots, while priority 2 covered less severe crimes, such as a break-in. In conjunction with TPS, we conducted a data discovery exercise to better understand these metrics and how they could be integrated into a measure of workforce demand. At the end of this exercise, we had developed spreadsheets and formulae that allowed us to estimate demand as certain workforce metrics changed. For example, by changing the number of incidents within a given priority segment and the average response time for those incidents, TPS could estimate the number of constables needed to meet the new conditions. This could be done at the city level or at the more granular division level.
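
To make that kind of arithmetic concrete, here is a minimal sketch in Python of a workload-based demand estimate for two priority segments. The function, figures, shift length, and utilization assumption are illustrative only; they are not the actual TPS spreadsheets or formulae.

```python
from dataclasses import dataclass

@dataclass
class PrioritySegment:
    incidents_per_day: float   # forecast incident volume for this priority
    avg_service_hours: float   # average unit time per incident (travel + on-scene)

def constables_needed(segment, hours_per_shift=8.0, utilization_target=0.75):
    """Estimate constables required to cover one segment's daily workload.

    Assumes each constable contributes hours_per_shift productive hours per day,
    of which only utilization_target can be spent on dispatched calls.
    """
    workload_hours = segment.incidents_per_day * segment.avg_service_hours
    available_hours = hours_per_shift * utilization_target
    return workload_hours / available_hours

# Illustrative figures only, at a division level.
priority_1 = PrioritySegment(incidents_per_day=40, avg_service_hours=2.0)
priority_2 = PrioritySegment(incidents_per_day=120, avg_service_hours=1.0)

total = sum(constables_needed(s) for s in (priority_1, priority_2))
print(f"Estimated constables needed: {total:.1f}")
```

Changing the incident volumes or average service times in either segment immediately changes the estimated constable count, which is the kind of sensitivity analysis described below.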

Once constable demand was determined, TPS also wanted to better understand how response or wait time would vary with the number of constables. We developed a series of queuing equations that assumed a Poisson distribution for the number of arriving calls and an exponential distribution for service times.
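
Those assumptions correspond to the classic M/M/c queue, for which the probability of waiting and the expected wait time can be computed with the Erlang C formula. The sketch below is one standard way to express this in Python; the article does not give TPS's exact equations, and the arrival and service rates shown are made up for illustration.

```python
from math import factorial

def erlang_c(arrival_rate, service_rate, servers):
    """Probability an arriving call must wait, for an M/M/c queue.

    arrival_rate: calls per hour (Poisson arrivals)
    service_rate: calls one unit can handle per hour (exponential service)
    servers:      number of units (constables) on duty
    """
    a = arrival_rate / service_rate          # offered load in erlangs
    rho = a / servers                        # utilization; must be < 1
    if rho >= 1:
        raise ValueError("Unstable system: utilization >= 1")
    numerator = (a ** servers / factorial(servers)) / (1 - rho)
    denominator = sum(a ** k / factorial(k) for k in range(servers)) + numerator
    return numerator / denominator

def expected_wait(arrival_rate, service_rate, servers):
    """Average time a call waits for a free unit (in the same time units as the rates)."""
    p_wait = erlang_c(arrival_rate, service_rate, servers)
    return p_wait / (servers * service_rate - arrival_rate)

# Illustrative numbers: 30 calls/hour, each tying up a unit for 20 minutes
# on average (service rate 3 per hour), with 12 units on duty.
print(f"Expected wait: {expected_wait(30, 3, 12) * 60:.1f} minutes")
```

Re-running the calculation with different numbers of units shows how quickly wait times fall (or climb) as staffing changes.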

All of the above work was invaluable in giving TPS the ability to conduct sophisticated sensitivity analyses to see how resource demand (number of constables) would change as certain workforce metrics were altered. Certainly, these solutions were effective in allocating the right resources to the conditions at a given point in time. But rather than dealing with an incident as effectively as possible once it occurred, could we be more pre-emptive by predicting these events before they occur? That question led to our second goal: the development of predictive models that allowed us to prioritize areas based on their likely crime rates.

One key factor in predicting high-crime areas was rather obvious: the number of crimes committed in the last year. Other factors included the age, income, and dwelling types that characterized a given geographical area. With a predictive model, we could now predict the level of crime within a given geographical area. Using the predicted crime rate as the end objective, TPS was then able to use our first solution (the workforce demand estimate) to determine the number of constables required under given conditions. Through these two solutions together (the predictive model and the workforce demand estimate), TPS could be prescriptive in estimating the number of constables based on the crime rate predicted for the next 12 months.
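
As a rough illustration of what such an area-level model might look like, here is a small sketch using scikit-learn. The feature names, the values, and the choice of a linear regression are assumptions for illustration only; the article does not specify the variables or modelling technique TPS and EA actually used.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# One row per geographic area: prior-year crime count plus demographic features.
areas = pd.DataFrame({
    "crimes_last_year": [410, 95, 230, 60, 310, 150],
    "median_age":       [31, 44, 37, 52, 29, 40],
    "median_income":    [52000, 81000, 64000, 90000, 48000, 70000],
    "pct_apartments":   [0.72, 0.18, 0.45, 0.10, 0.80, 0.33],
})
crimes_next_year = [395, 102, 240, 55, 330, 140]   # target observed a year later

model = LinearRegression().fit(areas, crimes_next_year)

# Score each area and rank from highest to lowest predicted crime level.
areas["predicted_crime"] = model.predict(areas)
print(areas.sort_values("predicted_crime", ascending=False))
```

Ranking areas by the predicted value is what allows the workforce demand estimate above to be driven by expected, rather than purely historical, crime levels.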

Another outcome of being more prescriptive in estimating the number of constables was that it enabled TPS to look at its current divisions/precincts and how they were staffed. By looking at what was predicted, TPS could redraw its division boundaries to better reflect the predicted number of constables: boundaries would expand for divisions where the predicted number of officers was expected to increase and contract for divisions where a decrease was expected.

This case represented the classic data science exercise in which success was achieved by combining the strong domain knowledge of TPS with Environics Analytics’ data products and data expertise. Yet all this hard work can only succeed through the adoption of a data-driven culture among all the key stakeholders within the organization. This was indeed the case for TPS, as data became the foundation that allowed it to become a much more nimble and effective organization in deploying the right resources to a given geographic area.

This article was written to highlight our upcoming presentation, Developing a Data-Driven Culture to Optimize Decision-Making in the Toronto Police Force, at the Predictive Analytics World for Government conference, Sept. 17-21, 2018, delivered by Rupen Seoni, Senior Vice President of Environics Analytics, and Ian Williams, Manager of the Business Intelligence and Analytics Unit, Toronto Police Service. We hope to see you there next month.

About the Author:

Richard Boire, B.Sc. (McGill), MBA (Concordia), is the founding partner of the Boire Filler Group, a nationally recognized expert in the database and data analytics industry, and is among the top experts in this field in Canada, with unique expertise and background experience. The Boire Filler Group was recently acquired by Environics Analytics, where he is currently Senior Vice President.

Mr. Boire’s mathematical and technical expertise is complemented by experience working at and with clients in B2C and B2B environments. He has previously worked at and with clients such as Reader’s Digest, American Express, Loyalty Group, and Petro-Canada, among many others, establishing his top-notch credentials.

After 12 years of progressive data mining and analytical experience, Mr. Boire established his own consulting company, Boire Direct Marketing, in 1994. He writes numerous articles for industry publications, is a much-sought-after speaker on data mining, and works closely with the Canadian Marketing Association in a number of areas, including education and its Database and Technology councils.
