Artificial Intelligence for us?

Have you ever wondered how Netflix knows what movies to recommend for you—or realized that if it is suggesting kids’ movies, that means someone else in the house must have been using your account? It turns out that this sort of technology, which relies on what is called “big data,” is also useful for calltaking and emergency response.

Consider the size of the data that we collectively work with. Based on some available information, if there were global coverage of emergency numbers, there would be approximately seven billion emergency calls (911, 999, 998, 997, 112, 111, 102, etc.) per year worldwide. That is around one call per person on average, although some studies have found that repeat callers account for a disproportionate share of the calls, with those who do call averaging more than two calls per year. In any case, this generates huge amounts of data, more than enough to qualify as “big data.”

Not unlike a Google search, imagine the day in the not-too-distant future when you’ve begun to type the caller’s problem statement, and the ProQA software knows what the caller’s answers are likely to be and autocompletes them before you have even asked all the questions. I predict that day is coming soon, although this kind of system requires large amounts of data to work from. How can that be possible? Let’s take a brief look at how these systems work and how they might improve the accuracy and speed of your calltaking process.
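For the curious, the core idea can be sketched in a few lines of code. The Python example below, with entirely invented call statements, completes a partially typed problem statement with the most frequent historical match; a real system would, of course, rely on far richer models and the large amounts of data described above.

```python
from collections import Counter, defaultdict

class AnswerPredictor:
    """Toy predictor: given a partial problem statement, suggest the most
    frequent full statement seen in historical calls. Illustrative only."""

    def __init__(self, historical_statements):
        # Map every prefix of every statement to a counter of full statements.
        self.by_prefix = defaultdict(Counter)
        for s in historical_statements:
            for i in range(1, len(s) + 1):
                self.by_prefix[s[:i].lower()][s] += 1

    def suggest(self, typed):
        """Return the most common historical statement starting with `typed`."""
        matches = self.by_prefix.get(typed.lower())
        if not matches:
            return None
        return matches.most_common(1)[0][0]

# Hypothetical historical problem statements.
history = [
    "chest pain, difficulty breathing",
    "chest pain after a fall",
    "chest pain, difficulty breathing",
    "fell from a ladder",
]
predictor = AnswerPredictor(history)
print(predictor.suggest("chest"))  # most frequent matching statement
```

A deployed system would weight suggestions by context (caller location, time of day, earlier answers), but the principle, completion from observed frequencies, is the same.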

In 1950, Alan Turing, a famous computer scientist, wondered, “Can computers think?” He proposed answering this question with the so-called Imitation Game. In this game, an interrogator would ask two players (one a human and the other a computer) a series of questions designed to determine which of the two players was the computer and which the human. If they reached the point where the player could not tell the difference between the two players’ answers, it would be determined that computers can, in fact, think.

Now, imagine a time in our near future when the computer’s flawless memory and calculations are available to the very real and human calltaker. Using Machine Learning (ML) tools, a branch of Artificial Intelligence (AI), these computers will recognize patterns that you did not even know existed—and do it almost instantly, from the moment the phone rings. Complicated patterns and connections could be discovered that would be too difficult for a human to identify unaided.

Although we recognize patterns all the time, as humans we can only handle a limited number of variables, while ML/AI tools can very quickly identify patterns across 15, 50, 100, and even thousands of different inputs and determine precisely what is important and what is not. If you were to draw out these concepts visually, it would look like a map of our nervous system, or perhaps like the pattern of a snowflake under a microscope. The purpose of using such pattern recognition, remember, is not to replace the emergency dispatcher—far from it. Rather, the process can recognize patterns across millions of calls, providing insight to the calltaker as they proceed through a call, similar to how doctors’ electronic decision support systems can offer them diagnostic and prescribing advice in real time.

One of the exciting aspects of this technology is that it is what is called a self-learning algorithm; that is, it gets more accurate as time goes on and it is presented with even more data. This kind of algorithm can learn from all dispatchers using the same system anywhere in the world simultaneously, and will learn which aspects of the calltaking and key questions might be regionalized or cultural, or even which ones change depending on the type of emergency. The questions that you need to ask and have answered might even change depending on the calltaker and the caller, customized for each scenario.

Another ML/AI method is known as Natural Language Processing (NLP), which aims to “understand” the meaning or intent of simple statements. When trained by humans, this method works well in handling text such as a physician’s or paramedic’s narrative in a medical record. It is less effective, though, at handling information that is less structured, such as the dispatch problem description, which includes all forms of language, as well as word choices with multiple meanings.

Think about all the different responses you get when you ask “Okay, tell me exactly what happened?” Then consider how differently each calltaker might enter these responses into ProQA. As standard as data processing may seem, it is a process that works better when there is only one thing or one kind of thing happening. As you know, that often is not the case at dispatch.

For this kind of complex question, we might use a process known as clustering, which tries to identify the “chief complaint” or “call type,” so that everything that follows can be standardized.
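As a rough illustration of what clustering means here, the toy Python sketch below groups invented problem descriptions by simple word overlap. Production systems would use far more sophisticated similarity measures and features; this is only the bare skeleton of the idea.

```python
def jaccard(a, b):
    """Word-overlap similarity between two descriptions (0.0 to 1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def cluster(descriptions, threshold=0.25):
    """Greedy single-pass clustering: each description joins the first
    cluster whose representative it sufficiently resembles, otherwise
    it starts a new cluster."""
    clusters = []  # list of (representative, members)
    for d in descriptions:
        for rep, members in clusters:
            if jaccard(d, rep) >= threshold:
                members.append(d)
                break
        else:
            clusters.append((d, [d]))
    return clusters

# Hypothetical caller problem descriptions.
calls = [
    "there was a shooting",
    "a shooting at the store",
    "he is not breathing",
    "the victim is not breathing",
]
groups = cluster(calls)  # two groups: shooting calls, breathing calls
```

Each cluster’s representative then plays the role of a candidate “chief complaint,” against which everything that follows can be standardized.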

For example, if the caller reports, “There was a shooting,” that will result in a rapid police response. If, however, the caller reports, “The victim is pulseless and not breathing,” you will instruct them to begin CPR and urgently send the paramedics. This system is designed to minimize the time required to determine what level, or priority, of dispatch is needed and what instructions should be given to reduce morbidity and mortality. However, at least for now, it does not tolerate ambiguity or changing circumstances very well.

That’s why, in my future work, I plan to evaluate several different algorithms that could be useful (behind the scenes) to help you more quickly and with more accuracy determine what responses are needed and which instructions should be provided.

Citation: Nudell, N. Artificial Intelligence for us? Annals of Emergency Dispatch & Response. 2017;5(1):5.

Challenges in Utilization of Statistical Analysis Software in Emergency Dispatch Data Analysis and Advances in Data and Technologies

Introduction

Emergency dispatch and emergency medical services (EMS) data keeps increasing every day—in quality, volume, and dimensionality. An Emergency Medical Dispatcher (EMD) at an emergency communication center is the primary link between the public caller requesting emergency medical assistance and EMS. Data collection starts when the EMD receives the call and ends when the patient is either treated on scene or admitted to the hospital. The data collected is mostly structured data containing both text and numerical data types.

Data generated from emergency dispatch is very valuable, as we can gain extremely beneficial insights from the data, such as identifying ways to reduce response times, determining the most commonly encountered types of emergencies (or Chief Complaints), and understanding the text data entered by the EMD to describe the caller’s emergency problem. These insights in turn can help to evolve dispatch protocols and to design and implement better training for calltakers, dispatchers, and other emergency communication center staff. However, collecting and analyzing the data can present challenges. When attempting to apply one common statistical analysis software (called R) to one large dispatch dataset, we encountered challenges that are typical of those that researchers and data analysts will continue to encounter as dispatch data grows in size and dimensionality.

Challenges and Potential Solutions

There are many tools that can be used for general data analysis and reporting results. Some of the tools are Tableau, MicroStrategy, and QlikView. These tools can be used for data analysis and for developing dashboards where emergency dispatchers can keep track of Key Performance Indicators in real time, depending on when they refresh the data from the database. For more sophisticated statistical data analysis, there are tools such as SAS, SPSS, STATA, and R program, which enable researchers to develop prediction models or perform other complicated analyses. 

Even these programs, though, present their own difficulties. One of the problems with R, for example, is that it was designed to hold all of its data in memory—and it is memory-intensive. Emergency dispatch data can contain many thousands of rows and can significantly exceed a computer’s memory when we analyze large datasets. Hence, to alleviate memory problems, we need to split the datasets prior to analysis. However, splitting the data degrades the insights we can gather.
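In code, the split-and-process workaround amounts to streaming the data in chunks so that only a small piece is ever held in memory. The Python sketch below, using a made-up `response_seconds` field, computes a mean this way:

```python
import csv
import io

def mean_response_time(csv_file, field="response_seconds", chunk_size=1000):
    """Compute a mean by accumulating rows in fixed-size chunks, so the
    full dataset never needs to fit in memory at once."""
    reader = csv.DictReader(csv_file)
    total, count = 0.0, 0
    chunk = []
    for row in reader:
        chunk.append(float(row[field]))
        if len(chunk) == chunk_size:
            total += sum(chunk)
            count += len(chunk)
            chunk = []  # release the chunk before reading more rows
    total += sum(chunk)  # leftover partial chunk
    count += len(chunk)
    return total / count if count else None

# Tiny in-memory stand-in for a large dispatch export.
data = io.StringIO("response_seconds\n300\n420\n360\n")
print(mean_response_time(data))  # 360.0
```

Running statistics like this avoid the memory ceiling, but, as noted above, any analysis that needs to see all rows at once (rather than a running summary) loses something when the data is split.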

Another potential challenge for dispatch data analysis is so-called “Big Data.” Big Data can involve both structured and unstructured data, resulting in datasets greater than 1 terabyte (TB) in size, and it has three main characteristics: volume, velocity, and variety. Volume is the amount of data generated, velocity is how frequently data is generated, and variety refers to the number of types of data generated. Emergency dispatch data may already qualify as Big Data based on volume, and sooner or later it will satisfy all three conditions: as the population grows, the total number of emergency cases—and thus the velocity of the data—also increases, and as technology becomes cheaper, more and more parameters can be collected, increasing the variety of the data.

Big Data cannot be analyzed using regular tools such as R, SPSS, STATA, or Tableau. With regular analysis tools, we pull data from its sources and load it directly into the tool for analysis; this is not a problem when the dataset is small. When the dataset is big, however, it is difficult to pull it from the source and load it into the analysis tool. Hence, the analysis programs need to reside where the data is—at the source. This way, analysis is performed without moving the data from its source. The Hadoop framework is the Big Data technology that has gained the most momentum recently. Hadoop consists of the Hadoop Distributed File System (HDFS) and MapReduce. HDFS stores the data, dividing it into smaller parts and distributing them across various servers or nodes. The MapReduce software framework processes the data in parallel across the cluster, in small, easily managed and processed chunks. Numerous vendors, such as Amazon Web Services, Cloudera, Hortonworks, and MapR Technologies, distribute open source Hadoop platforms. These may eventually offer solutions to some of the dispatch community’s Big Data challenges.
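The MapReduce idea itself is simple enough to sketch in a few lines. The Python example below simulates, on a single machine, the word count that Hadoop would distribute across a cluster; the records are invented stand-ins for dispatch problem descriptions.

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    """Map step: emit a (word, 1) pair for each word in one record."""
    return [(word, 1) for word in record.lower().split()]

def reduce_phase(pairs):
    """Reduce step: sum the counts for each key (word)."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# In Hadoop, the records would live in HDFS across many nodes and the
# map calls would run in parallel; here we run them sequentially.
records = ["chest pain reported", "pain in left arm", "chest pain"]
mapped = chain.from_iterable(map_phase(r) for r in records)
word_counts = reduce_phase(mapped)  # e.g., word_counts["pain"] == 3
```

Because each map call touches only one record, the work parallelizes naturally, which is exactly why the computation can be shipped to where the data lives instead of the other way around.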

Summary

Dispatch and EMS provide very rich data. Existing statistical analysis software tools can reveal trends and insights in that data that can help move dispatch forward. However, in the near future, as dispatch and EMS data come to be classified as Big Data, more sophisticated tools such as Hadoop will eventually be needed in order to gain better insights from the data. Therefore, keeping the growth of the data (in both size and dimensionality) in mind, it would be a rewarding investment to begin thinking about how to apply real-time analysis, reporting, and Big Data technologies.

Acknowledgement

Thanks to Chris Olola, PhD; Isabel Gardett, PhD; and Greg Scott, MBA, EMD-Q® for their mentorship and supervision of the pilot application of the concept of using the R analysis tool to mine problem-description data in emergency dispatch research.

Citation: Yerram SR. Challenges in utilization of statistical analysis software in emergency dispatch data analysis and advances in data and technologies. Annals of Emergency Dispatch & Response. 2016;4(2):5–6.

FBI National Data Exchange System’s On-Line Tool Enhances Dispatching by Law Enforcement Agencies throughout the US

The traffic stop began like any other. The officer radioed to dispatch with the license plate number and a National Crime Information Center (NCIC) search was conducted. The search turned up negative – showing nothing unusual about the vehicle or its owner – and the officer started a routine approach to the vehicle. The dispatcher then searched the license plate number through the Federal Bureau of Investigation’s (FBI) newest System, the National Data Exchange, or N-DEx for short. Moments before the officer reached the driver’s side window, dispatch returned with some relevant information – the vehicle had been associated with weapons charges only two months prior. With this new information, the officer implemented the protocol for a more dangerous situation, ordering the driver out of the vehicle with hands on his head. The subsequent search of the vehicle discovered a loaded, stolen weapon under the driver’s seat. A potentially disastrous situation was averted.

This hypothetical scenario demonstrates the potential of information sharing in the 21st Century. No longer do law enforcement and criminal justice professionals have to rely on limited information to do their jobs. Instead, they have a vast array of information on incidents, arrests, booking and incarceration, pre-trial investigations, as well as probation and parole information available to them with an authorized N-DEx account accessible via common web browsers.

The FBI’s National Data Exchange

The FBI’s N-DEx System contains data from local, tribal, state, and federal agencies. It is available 24/7, 365 days a year. The FBI also bears the full cost of development and maintenance of the System. “N-DEx was the FBI’s answer to the 9/11 Commission’s report calling for greater information sharing between all levels of law enforcement,” explains the N-DEx Acting Unit Chief, John Quinlan. Quinlan added that the men and women of the N-DEx Program Office are proactively engaging more and more agencies to further increase participation with N-DEx, both to obtain more data and to increase usage of the system.

N-DEx data types have evolved from incident reports to include other data types such as probation, parole, and booking reports, providing users with a wider array of investigative information. N-DEx became operational in 2008 and has been expanding its capabilities incrementally ever since. Version 2 of N-DEx added features such as full-text search, subscription and notification, enhanced link and geo-visualization, and collaboration tools. Version 3 then added new features such as the integrated person entity view and batch query, and enhanced existing features. Subsequent N-DEx builds created roles for the Criminal Justice Information Services (CJIS) Systems Officers (CSOs) to manage personnel, audits, and training. Future builds will continue to expand and improve N-DEx capabilities, further meeting the needs of the user community.

Access to N-DEx is brokered through the FBI’s Law Enforcement Enterprise Portal (LEEP). Along with other services such as Regional Information Sharing Systems (RISS) and the National Gang Intelligence Center (NGIC), LEEP also provides N-DEx as a service. After applying for and receiving access to N-DEx, law enforcement or criminal justice professionals can simply click on the N-DEx icon from LEEP and be taken to the default simple search screen of N-DEx, where they can craft key-word searches. One click away is the Targeted Person search, where a name and other weighted identifiers, such as a date of birth or Social Security Number, can be used to return all records in the system associated with an individual. In the case above, a license plate number was used to return any information associated with that vehicle.

N-DEx connects the dots between seemingly unrelated people, locations, property, offenses, and more. It correlates data within an agency’s own files and between the files of other agencies that contribute data. This powerful feature allows investigators to easily pull all records associated with an entity, and even view the associations in the form of a link diagram. Other site features include Geo-Visualization – placing a record or records on a map – as well as batch query capability, complex filtering options, and collaboration sites where other authorized users can share files of any size or type.

In an actual case in 2012, a Maryland Highway Patrol Officer stopped a truck transporting three storage containers. A search of the vehicle and driver information did not reveal useful information. The officer then took the additional step of searching the names on the bill of lading through the Law Enforcement Information Exchange (LInX). Because of the designed linkages between the two systems, this LInX search automatically triggered a query of the N-DEx System. Within seconds, N-DEx returned results indicating an individual associated with one of the containers was the subject of an ongoing federal investigation. In light of this new information, the Maryland Highway Patrol was able to collaborate immediately with the Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF) to track down the suspect. The vehicle was allowed to continue to its destination, where the person under investigation was intercepted when he claimed the storage container. The contents included 3,000 cartons of counterfeit cigarettes; the suspect was subsequently charged and convicted of multiple crimes.

As of August 2015, the N-DEx System had received files from over 5,300 agencies, contained approximately 500 million searchable records, and returned results for an average of 50,000 searches per week. As new users apply for access and begin incorporating N-DEx in their standard operations, and as new agencies realize the value of having their data in the System, N-DEx will continue to grow in both size and usage. Each day new records are added to the N-DEx System, creating new opportunities daily in the fight to solve and prevent crime. This is why the N-DEx Program Office will continue to actively pursue bringing new agencies on board and helping new users conduct searches of the system. As Kent County (MI) Sheriff Lawrence Stelma says, “I get just as much benefit from having my records in N-DEx, as I do from using the system. Investigators from other jurisdictions are constantly bumping into my investigations and assisting us in solving our county’s crimes.”

In this day and age, it is necessary to use all of the tools available in order to solve and prevent crime and promote public safety. N-DEx is one of those tools – a force multiplier – allowing users to access more information in less time. The N-DEx Program Office encourages police dispatchers to apply for N-DEx membership and utilize this valuable tool as they conduct their daily work. Together, partnerships between local law enforcement and the FBI can make the world a little safer.

Authors’ note: To obtain N-DEx access, visit LEEP at www.CJIS.gov and fill out the online application. If you already have a LEEP account, you can request N-DEx access by finding us in the “Services” section of LEEP and choosing “Request Access”. To contribute data to the System, contact the N-DEx Program Office’s help desk at ndex@leo.gov and your request will be routed to the Liaison Specialist associated with your state.

Citation: Wertheim K, Badgett K. FBI National Data Exchange System’s On-Line Tool Enhances Dispatching by Law Enforcement Agencies throughout the US. Annals of Emergency Dispatch & Response. 2015;3(2):36-37.

Close Collaboration Between Public Health and Emergency Services Agencies Leads to an Effective Ebola Response in North America

911 agencies, emergency medical services (EMS), and first responder agencies play a critical role in major disease outbreaks because they are a gateway to the overall healthcare system for a wide spectrum of patients who use 911 as their first point of access. For those patients, emergency dispatchers and emergency prehospital responders are the first professionals to communicate with, treat, and provide hospital transport for patients who are infected with communicable diseases. Public health agencies – such as the Centers for Disease Control and Prevention (CDC) at the national level, state and county health departments at the local level, and provincial health ministries in Canada – are responsible for developing the strategy and setting the policies and practices for managing disease outbreaks, infected patients, and those suspected to be infected. Members of these two professions – emergency services and public health – don’t necessarily interact on a routine basis, and they differ substantially in the focus of their work. Given this reality, the recent Ebola outbreak tested the abilities of these two disparate groups to work in concert, in an effective and seamless effort to contain the spread of a novel disease. 911 centers in particular often receive little attention when it comes to a comprehensive public health response to disease outbreaks. Nevertheless, 911 agencies willing to review and modify their day-to-day operations can play a critical role in the overall effort, through specific disease information gathering as well as responder notifications.

During the summer of 2014, public health authorities around the world became alarmed over the rapid spread of a strain of Ebola Viral Disease (EVD) that had become widespread in several West African countries. By July 30, 2014, Guinea, Sierra Leone, Liberia, and Nigeria had together reported confirmed and suspected case counts totaling 1,440, with 826 reported deaths. On August 8, 2014, the World Health Organization (WHO) announced that this EVD outbreak was the largest ever recorded, and declared it a Public Health Emergency of International Concern. By August 26th, the case count had reached 3,069, with 1,552 deaths – all in a handful of West African countries.

But on September 30, 2014, the crisis suddenly reached a new level when the first case of EVD diagnosed in the United States was reported in Dallas, Texas, in a patient who had recently traveled to the United States from Liberia. Since one of the patient’s close acquaintances had called 911, and the patient was subsequently transported to the hospital by a Dallas Fire Department paramedic crew, the issue of prehospital care management of EVD cases was placed immediately into the public health – and media – spotlight. Following the Dallas case, the CDC moved quickly to implement a series of public health policies and procedures aimed at preventing the spread of the disease in the U.S. population, and protecting healthcare and emergency services workers who may come in contact with Ebola patients.

On August 26, 2014, the CDC, recognizing the value of emergency services – including 911 agencies – in dealing with a disease outbreak, posted its first version of a document entitled “Interim Guidance for Emergency Medical Services (EMS) Systems and 9-1-1 Public Safety Answering Points (PSAPs) for Management of Patients with Known or Suspected Ebola Virus Disease in the United States.” The primary goal of this document was to provide nationwide direction for prehospital and public safety professionals to follow standardized protocols and procedures for patient management and treatment of suspected EVD patients from the first point of contact with a public safety or healthcare professional, to final discharge from a medical facility. For 911 agencies (PSAPs) the CDC’s recommendation was for local authorities to direct emergency dispatch agencies to use “modified caller queries” about Ebola when the risk of Ebola becomes elevated. Specifically, 911 callers should be queried about:

• travel to or from an Ebola-infected country;

• signs and symptoms of Ebola, such as fever, diarrhea, vomiting, etc.; and

• other risk factors, such as contact with someone who is sick with Ebola¹
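To see how such queries might be operationalized in software, consider the toy Python sketch below. The function parameters and risk labels are purely illustrative and are not taken from the CDC guidance or any actual protocol tool:

```python
def ebola_screen(traveled_to_affected_region, has_symptoms,
                 had_contact_with_case):
    """Toy decision sketch of the three CDC query categories.
    Real dispatch protocols use scripted, ordered questions rather
    than a single boolean function like this."""
    exposure = traveled_to_affected_region or had_contact_with_case
    if exposure and has_symptoms:
        return "high risk - notify responders and public health"
    if exposure:
        return "possible risk - continue targeted questioning"
    return "standard call handling"
```

Even this crude sketch shows why question ordering matters: exposure questions gate the symptom questions, so a scripted protocol can reach a safe disposition in the fewest steps.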

The CDC’s “Interim Guidance” was a good framework document for 911 agencies, yet for 911 centers to operationalize this recommended practice, a specific calltaking protocol format was required, with scripted, ordered questions and instructions to use when receiving calls involving potentially infected patients. To fill that need, the International Academies of Emergency Dispatch (IAED) took up the task of modifying its existing tool for querying 911 callers regarding infectious diseases – the Severe Respiratory Infection (SRI) Tool.

The IAED’s Chemical, Biological, Radiological, and Nuclear (CBRN) Fast-Track Committee – a subcommittee of the IAED Medical Council of Standards – was assigned the responsibility of creating a new template to address the points in the CDC’s guidance document. Formed shortly after the 2003 SARS (Severe Acute Respiratory Syndrome) outbreak in North America, the committee is made up of emergency medical services (EMS), public health, and 911 professionals, whose primary goal is to identify and evaluate emerging public health or public safety threats, and develop or update relevant emergency dispatch protocols and procedures for managing those threats.

The group’s biggest challenge during any novel disease outbreak is to stay ahead of the rapidly changing course of the disease, through a process of constant monitoring, teleconferencing and electronic communication among group members.

The committee had previously developed the SRI Tool for use in an outbreak of a flu-like, respiratory illness, such as SARS in 2003 or the H1N1 influenza pandemic in 2009. The SRI Tool was a useful template as a starting point, but EVD is categorized as a viral hemorrhagic fever, with a different pathology than these previous novel virus outbreaks. Further, the fact that EVD is transmitted through direct contact with a symptomatic patient’s body fluids – as opposed to the airborne transmission of the previous respiratory disease outbreaks – together with the geographical concentration of the disease in West Africa, meant that specific caller/patient queries needed to be developed to detect potential EVD patients. After reviewing the CDC recommendations, receiving input from user agencies, and discussing the content, wording, and ordering of specific questions and instructions, the committee completed a final draft by October 18th, less than three weeks after the first North American case had been confirmed in Dallas. The final version of the new protocol tool – renamed the Emerging Infectious Disease Surveillance (EIDS) Tool (SRI/MERS/Ebola) (Figure 1) – was released to North American users on October 20th, and the software version was released a week later. It was also released to other countries, in numerous languages, shortly thereafter.

Through my involvement in the IAED CBRN committee, my participation in conference calls with the CDC, NHTSA (the National Highway Traffic Safety Administration), and the Paramedic Chiefs of Canada, and numerous conversations with dozens of 911 agencies – whose managers, medical control authorities, and dispatchers worked long and hard to implement an effective dispatch response to EVD – I am confident in saying that all of these stakeholders have widely acknowledged the importance of 911’s role in responding to an emerging disease outbreak.

Thankfully, EVD has not spread in North America, largely due to the rapid and well-coordinated actions of the key players at the national, state, and local levels – not the least of whom are our emergency responders and emergency dispatchers. It is clear, therefore, that 911 centers, responder agencies, and public health can work together to implement a successful prehospital response to emerging disease outbreaks. National public health authorities at the CDC were able to provide guidance for emergency services, including EMS and 911. Specific protocols and procedures can be provided by an international standard-setting organization, such as the IAED, if it moves quickly to respond to the public health emergency at hand.

Finally, all stakeholder agencies must maintain an open line of communication, and be flexible enough to adjust and react to rapidly changing circumstances with well-thought-out actions.

Citation: Scott G. Close Collaboration Between Public Health and Emergency Services Agencies Leads to an Effective Ebola Response in North America. Ann Emerg Disp Resp. 2015;3(1):5-7.

1. Centers for Disease Control and Prevention (CDC). Interim Guidance for Emergency Medical Services (EMS) Systems and 9-1-1 Public Safety Answering Points (PSAPs) for Management of Patients with Known or Suspected Ebola Virus Disease in the United States (released October 2014). Retrieved from: http://www.cdc.gov/vhf/ebola/healthcare-us/emergency-services/ems-systems.html, accessed October 27, 2014.

Why Evidence-Based Decision-Making Matters

INTRODUCTION

One of my former mentors, Deputy Chief (Ret.) Jim Graham of the Chesterfield County (VA) Fire and EMS Department, was a huge proponent of the use of information, whether running a fireground operation, developing a new training program, or addressing budget reductions. One of his favorite sayings—one that stays with me to this day—was:

“We must constantly strive to become better at data-driven decision making, instead of following the ‘I think, I feel, or I believe’ model.”

However, he usually followed that up with the caveat that we should strive to ensure that we had good information, not just any information.

This is an area in which emergency dispatch could improve: the ability to collect accurate, complete, and meaningful data from across the USA (or the world) about common issues or problems, including recruitment, hiring, and retention of staff; leadership and management practices; quality assurance and improvement; and the impact of new technologies.

A data-driven success story

In 1965 and 1966, public pressure grew in the United States to increase the safety of cars, culminating with the publication of “Unsafe at Any Speed” by activist lawyer Ralph Nader and a report from the National Academy of Sciences entitled “Accidental Death and Disability—The Neglected Disease of Modern Society.” That 37-page booklet, commonly referred to as the “White Paper,” addressed the huge and costly problem of accidental deaths and injuries in the United States. In 1965, there were 52 million accidental injuries leading to 107,000 deaths, 10 million temporarily disabled persons, and 400,000 permanently impaired individuals. Those deaths and injuries cost an estimated $18 billion (about $129 billion in 2012 dollars). The White Paper stated that accidents were the leading cause of death for persons aged 1-37 and the fourth leading cause of death for all ages in 1965. For people under 75, motor vehicle accidents constituted the leading cause of accidental death.

Public opinion moved the United States Congress to enact the National Traffic and Motor Vehicle Safety Act in 1966. The Act created the National Highway Safety Bureau (now the National Highway Traffic Safety Administration) and empowered the federal government to set and administer new safety standards for motor vehicles and road traffic safety. The Act was one of a number of government initiatives responding to the increasing number of cars and associated fatalities and injuries on the road, following a period in which the number of people killed on the road had increased 6-fold, and the number of vehicles 11-fold, since 1925.

So what happened? As a nation, we took a systematic approach to solving the problem by creating the modern trauma care system and the Emergency Medical Services to reduce the mortality and morbidity rates for motor vehicle crashes. We enacted new standards, regulations, and requirements for automobile safety, so that the automobile manufacturers were motivated to engineer and build safer cars. The automobiles they started to produce included lap/shoulder belt restraint systems, air bag restraint systems, energy-absorbing steering columns, and vehicle chassis construction that dissipates crash energy to protect vehicle occupants.

Finally, planners and engineers and road builders started to design and build safer roads for those safer vehicles to travel on. Those road construction design improvements included guard rails to prevent vehicles from striking stationary objects, e.g., bridge abutments. Guard rails, along with better banking in road curves to promote better contact between the vehicle and the road, became important tools in preventing vehicles from leaving the road on tight curves or when crossing into on-coming traffic. The reduction of the rate of death attributable to motor-vehicle crashes in the United States represents the successful public health response to a great technologic advance of the 20th century—the motorization of America.

That’s the “why” (the deaths, injuries, and costs to society were unacceptable) and the “what” (measures were taken to reduce those numbers) concerning our nation’s response to accidental death and injuries. So how did we get there? We collected data from across the nation from hospitals, state health departments, law enforcement agencies, and many other organizations, data that helped us understand the accidental death and injury problem. We analyzed that data and turned it into meaningful information that drove the thousands of decisions that began in 1966 and ultimately reduced accidental deaths and injuries from motor vehicle crashes in the United States.

Data, decision making, and judgment

Unfortunately, we don’t seem to have taken all the lessons of the White Paper to heart.  Management education today is still largely about educating for judgment—developing future leaders’ pattern-matching abilities, usually via exposure to a lot of case studies and other examples, so that those leaders will be able to confidently navigate the business landscape.  Whether or not they’re in business school, these leaders are told that they can trust their guts and their instincts and that once they’ve gained some experience, they can make accurate assessments in the blink of an eye.

This is a most harmful misconception. Human intuition is real, but it’s also really faulty.  Human parole boards do much worse than simple formulas at determining which prisoners should be let back on the streets.  Highly trained pathologists don’t do as good a job as image analysis software at diagnosing breast cancer.  Purchasing professionals do worse than a straightforward algorithm at predicting which suppliers will perform well.  America’s top legal scholars were outperformed by a data-driven decision rule at predicting a year’s worth of Supreme Court case votes.

When presented with this evidence, a contemporary expert’s typical response is something like, “I know how important data and analysis are. That’s why I take them into account when I make decisions.” While this sounds right, it’s completely wrong. Here the research is clear: When experts apply their judgment to the output of a data-driven algorithm or mathematical model (in other words, when they give themselves the opportunity to second-guess it), they generally do worse than the algorithm alone. As sociologist Chris Snijders puts it, “What you usually see is [that] the judgment of the aided experts is somewhere in between the model and the unaided expert. So the experts get better if you give them the model. But still the model by itself performs better.”

Things get a lot better when we flip this sequence around and have the expert provide input to the model, instead of vice versa. When experts’ subjective opinions are quantified and added to an algorithm, its quality usually goes up. So pathologists’ estimates of how advanced a cancer is could be included as an input to the image-analysis software, the forecasts of legal scholars about how Supreme Court justices will vote on an upcoming case could improve the model’s predictive ability, and so on.
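As a toy illustration of this arrangement (the weights and values below are invented, not drawn from any of the studies mentioned), an expert's quantified estimate can enter a simple scoring model as one weighted input among several, rather than overriding the model's output:

```python
# Toy sketch: the expert's quantified estimate is one weighted input
# to a simple linear scoring model, not an override of its output.
# All weights and feature values here are illustrative only.

def risk_score(features, weights):
    """Weighted sum of input features."""
    return sum(f * w for f, w in zip(features, weights))

# features: [measured_factor_a, measured_factor_b, expert_estimate]
# (all scaled 0-1; the expert's opinion gets a modest weight)
weights = [0.5, 0.3, 0.2]
case = [0.8, 0.6, 0.9]

print(round(risk_score(case, weights), 2))  # combined score
```

In a real system the weights would themselves be fit from historical outcomes, so the data, not the modeler, decides how much the expert's opinion is worth.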

Of course, this is not going to be an easy switch to make in most organizations. Most of the people making decisions today believe they’re pretty good at it, certainly better than a soulless and stripped-down algorithm. They also tend to believe that using models and algorithms takes away from their decision-making authority and reduces their power and value. The first of these two perceptions is clearly wrong, the second a lot less so.

The practical conclusion is that we should turn many of our decisions, predictions, diagnoses, and judgments—both the trivial and the consequential—over to the algorithms. There’s just no controversy any more about whether doing so will give us better results. Emergency communications managers are no different from other kinds of experts in this regard. These leaders must improve their ability to collect and analyze data, and then trust it to make the right decisions, if they are to continue in their efforts to enhance the status of their profession and meet today’s increasingly complex challenges.

So how, if at all, will this great inversion of experts and algorithms come about? How can ECC managers get better results by being more truly data-driven? It’s going to take transparency, time, and consequences: transparency to make clear how flawed “expert” judgment can be, time to let this news diffuse and sink in, and consequences so that we care enough about bad decisions to go through the wrenching change needed to make better ones.

Why should emergency communications center (ECC) managers pursue evidence-based decision-making and problem solving for their centers?

The “standard” approach to problem solving in an ECC is similar to that taken by many other disciplines, particularly those in the allied public safety fields of law enforcement, fire protection, and EMS: fix blame rather than fix the problem. We do this because in many cases we have not accurately defined the problem. Many managers believe that this issue—not defining the problem—is the foundation for the “I think, I feel, or I believe” methodology for decision-making. It is a management approach that has been shown to be frustrating to management (because we keep re-managing the same or similar problems) and demoralizing to ECC staff (because the root causes are never identified and everyone, regardless of their individual job performance, is subjected to the “training fixes all problems” mentality of management).

Regardless of the scope and magnitude of a center’s operations, it is imperative that an ECC manager develop his or her knowledge, skills, and abilities—as well as those of their staff—to engage in evidence-based problem solving (EBP) and evidence-based decision-making (EBD). EBP and EBD can provide significant benefits to an ECC, including:

  • Better methodologies for developing information for presentation to community policy makers and decision makers when requesting necessary resources, e.g., new technology, additional staffing, revisions to policy, etc.
  •  Better methodologies for problem solving, such that true root cause(s) are properly identified.
  • Better methodologies for objective decision-making, such that each problem gets the best solution.
  • Improved morale among staff because problem-solving is objective and seeks to identify all causative facts for a given problem.
  • Improved recruitment and hiring outcomes for new staff members because “your center’s reputation precedes you.” (People like to work where the incumbent staff is happy, well-trained, and properly cared for by management.)
  • Improved customer satisfaction in the public safety agencies and communities for whom you provide services because: (1) changes and improvements to your internal processes are likely to result in improved service delivery; and (2) changes and improvements to your internal processes are better received by those customers when ECC management can provide the “why, what, when, and how” behind such improvements.

What has to happen in the future?

In order for data analysis to lead to better decision making, organizations must embrace some key data-based business approaches. A good place to start is to ensure that the managers in our ECCs have the requisite knowledge, skills, and abilities to engage in both evidence-based research and evidence-based problem solving. Call it statistical literacy if you like, but training and professional development for center managers should include the fundamental elements of data collection and statistical analysis.

Attaining this body of fundamental statistical knowledge will give the ECC manager the ability to meaningfully explore data to find new patterns and relationships (i.e., data mining)—for example, how the number of calls for service varies by time and day of the week. Managers can then use statistical and quantitative analysis to help explain why a certain result occurred.
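As a minimal sketch of this kind of exploration, assuming nothing more than a list of call timestamps exported from a CAD system (the records below are invented), calls can be bucketed by weekday and hour with just the Python standard library:

```python
from collections import Counter
from datetime import datetime

# Hypothetical call-for-service timestamps exported from a CAD system
calls = [
    "2024-01-01 08:15", "2024-01-01 17:40", "2024-01-02 08:05",
    "2024-01-05 23:10", "2024-01-06 17:55", "2024-01-08 08:30",
]

counts = Counter()
for ts in calls:
    dt = datetime.strptime(ts, "%Y-%m-%d %H:%M")
    counts[(dt.strftime("%A"), dt.hour)] += 1  # bucket by (weekday, hour)

# Busiest (weekday, hour) buckets first
for (day, hour), n in counts.most_common():
    print(f"{day} {hour:02d}:00  {n} call(s)")
```

The same bucket counts, trended over months, are exactly the kind of pattern that supports staffing and scheduling decisions.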

One example might be looking at the impact that a particular type of severe weather event, e.g., an ice storm, has on the number of calls for service received.
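One simple way to quantify such an impact, sketched here with entirely invented daily call counts, is to compare the event day against the spread of comparable non-event days:

```python
import statistics

# Hypothetical daily call volumes on comparable non-event days
baseline = [410, 395, 430, 402, 418, 425, 398]
ice_storm_day = 612  # hypothetical volume on the day of the ice storm

mean = statistics.mean(baseline)
sd = statistics.stdev(baseline)
z = (ice_storm_day - mean) / sd  # standard deviations above normal

print(f"baseline mean {mean:.0f}, storm day {ice_storm_day}, z = {z:.1f}")
# A z-score far above 2 indicates volume well outside normal variation
```

A result like this turns “the ice storm was busy” into a defensible number when arguing for surge staffing or mutual-aid agreements.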

The new computer and communications technologies that have been installed in many ECCs are vast repositories of potential data for data mining. The key word is potential: ECC managers and their staffs must increase their knowledge, skills, and abilities (KSAs) to use these systems. While there is much emphasis—and rightly so—on developing the KSAs of front-line personnel to use these technologies—e.g., CADS, GPS, Automatic Vehicle Locators (AVL), etc.—ECC managers need to take the education and training process to the next level. That next level is the ability to locate the right data, extract it, clean it, and organize it into useful management information.
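The “extract, clean, organize” step can be as unglamorous as the sketch below, which normalizes invented raw CAD export rows (the field names are assumptions, not any vendor's actual schema) into records fit for analysis:

```python
# Hypothetical raw rows from a CAD export; field names are invented.
raw_rows = [
    {"call_type": " EMS ", "priority": "1", "secs_to_dispatch": "62"},
    {"call_type": "FIRE",  "priority": "",  "secs_to_dispatch": "45"},
    {"call_type": "ems",   "priority": "2", "secs_to_dispatch": "bad"},
]

clean = []
for row in raw_rows:
    try:
        secs = int(row["secs_to_dispatch"])
    except ValueError:
        continue  # drop rows whose timing data is unusable
    clean.append({
        "call_type": row["call_type"].strip().upper(),  # normalize labels
        "priority": int(row["priority"]) if row["priority"] else None,
        "secs_to_dispatch": secs,
    })

print(clean)  # two usable records remain
```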

Defining the next level

Many ECCs currently have performance level goals and objectives for their daily operations that include, for example, the number of 911 and non-emergency calls answered and processed; the number of calls for service dispatched to public safety agencies that they service; the processing time for 911 calls; and the processing time for a call for service to be dispatched. These are practical and important measures for an ECC, but they are only the “tip of the iceberg” when it comes to the potentially valuable data that’s now available for EBD and EBP.
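Percentile measures are a natural next step beyond the simple counts and averages above, because averages hide the slow tail of calls. A sketch with invented processing times:

```python
import statistics

# Hypothetical 911 call-processing times in seconds (receipt to dispatch)
times = sorted([42, 55, 61, 48, 70, 95, 38, 66, 52, 120, 44, 58])

median = statistics.median(times)
deciles = statistics.quantiles(times, n=10)  # nine decile cut points
p90 = deciles[8]                             # 90th percentile

print(f"median {median}s, 90th percentile {p90}s")
# The 90th percentile exposes the slow tail that an average conceals
```

Tracking a 90th-percentile target alongside the mean is how a center can tell whether improvements are reaching its worst-handled calls, not just its typical ones.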

Successful managers use Business Analytics (BA), or the methodological analysis of past performance to identify issues and outcomes for the future, to objectively evaluate and make improvements to their operations through automation and optimization of business processes. These data-driven managers view their data as an organizational asset and leverage it for continuous improvement. In order for that to happen, managers must ensure that they are collecting and using quality data, and that both they and their subordinates understand both the technologies and the organization’s commitment to data-driven decision making.

Not all data is statistical

The applicable data that can be useful to enhancing the EBD and EBP capabilities of ECC managers should not be limited to statistical data that neatly fits into a spreadsheet. People are the most significant resource for any ECC, and there are many opportunities for ECC managers to collect and analyze data so that they can better match their resources (people and equipment) with their needs (answering and dispatching calls for service). Some specific areas in which data should be collected are:

Telephone Lines and Operational Staffing. Information should be collected on the number of telephone lines and operational staff employed by the ECC. Telephone service providers should also be contacted to determine which data is currently available and which needs to be collected to allow analysis of operational staffing and telephone line requirements.

Personnel and Fiscal Management. Managers must establish standards for workstation assignment frequency and a method for tracking workstation assignments. They must document the process by which workstation assignments are made and the number of hours that individuals staff workstations, then ensure that the process is understood and followed so that good data is being collected on personnel requirements for the ECC. Finally, they should develop a process for documenting the reasons that workstation positions must be staffed with personnel working overtime shifts and document the process by which overtime is scheduled. It can also be beneficial to establish a method for tracking who is offered overtime and who actually works overtime shifts.

Operational Procedures.  Data-focused managers establish a methodology for updating operational procedures and related documentation.  This should include the ability for stakeholder organizations, e.g., public safety agencies served, to submit data.

Quality Assurance.  It is also necessary to establish and implement quality assurance processes that measure customer and stakeholder satisfaction, ensure that operational procedures are followed, and identify areas for improvement.  For many ECCs, this will be in addition to the quality assurance processes that they already have in place for their Emergency Medical Dispatch (EMD) program.

Training of Personnel.  ECC managers have a fiduciary responsibility, as well as an operational responsibility, to ensure that their personnel receive cost-effective training that enables them to do their jobs safely, effectively, and efficiently.  The data collected for ensuring such efforts should be part of a learning management system (LMS) that includes job descriptions, training and certification requirements for both entry-level and incumbent staff, and recertification requirements.

SUMMARY

ECC managers who learn to use research tools to achieve meaningful data mining and interpretation for their center operations will be more successful in navigating current and future change drivers. They will be successful because they will have the ability to present objective information to stakeholders for decisions on policy, funding, staffing, and a host of other issues that are critical to the operation of today’s ECC. Those who do not will continue to see their centers understaffed, underfunded, and misunderstood by the citizens and public safety agencies to whom they provide services.

REFERENCES
  1. Wikipedia.com [Internet]. Unsafe at Any Speed. Available from: http://en.wikipedia.org/wiki/Unsafe_at_Any_Speed
  2. Wikipedia.com [Internet]. Accidental Death and Disability: The Neglected Disease of Modern Society. Available from: http://en.wikipedia.org/wiki/The_White_Paper
  3. McAfee A. Big Data’s Biggest Challenge? Convincing People NOT to Trust Their Judgment. Harvard Business Review Blog Network [Internet]. Boston, MA. Available from: http://blogs.hbr.org/2013/12/big-datas-biggest-challenge-convincing-people-not-to-trust-their-judgment/
  4. Ibid.
  5. Ibid.
  6. Newton PL. Strategic Planning for The Chesterfield Emergency Communications Center. National Fire Academy, Executive Fire Officer Program [Internet]. Emmitsburg, MD. Available from: www.usfa.fema.gov/pdf/efop/efo32186.pdf
  7. Knox County Emergency Communications District [Internet]. Knoxville, TN. 9-1-1 Public Safety Communications Center Strategic Plan. Available from: pdf.countyofdane.com/…unications_Strategic_Plan.pdf

Citation: Avsec R. Why evidence-based decision making matters. Ann Emerg Disp Resp 2014; 2(1): pgs.5–8


Research and the Realities of Police Dispatch

The reality of police dispatching is that there is nothing routine. Police calls change frequently simply due to the nature of the business. A perceived cold call of “breaking and entering” into a property can quickly turn into an in-progress “robbery” when it is discovered that a suspect is on the scene and has a weapon. A report of an “assault” can quickly turn into an “active assailant (shooter)” situation, one of the most dangerous and complex types of incidents. The constantly changing police world is just never routine.

Because of the constantly changing nature of policing and police dispatch, research is particularly important in this field to make sure we get things right, based on what’s happening in real police dispatch and law enforcement agencies. For example, one of the most pressing questions in police dispatch research right now is: How much time elapses from initial call receipt in the Communication Center to dispatch of responders? There is an expectation from the public that law enforcement agencies respond to their calls for service in a timely manner. I know of no Police Chief, Sheriff, or agency head who is not concerned with response times.

The amount of time that a call remains in the Communication Center prior to being dispatched is also of significant concern to agency administrators. At the same time, there is a concern that protocol use would increase these response times. That’s not what we see in agencies, but a scientific study comparing average time-to-dispatch in police protocol user agencies against the same data for non-protocol agencies would be beneficial in identifying the reality on the ground and showing agencies and agency administrators the benefits of protocol use. Many factors influence how long it takes a call to be processed, e.g., call volume, structured versus unstructured call-taking, personnel available to answer 9-1-1 calls, etc. Although multiple factors influence call processing times, in-depth analysis of data from before and after implementation of the Police Priority Dispatch System (PPDS) would help user agencies quantify one of its benefits. Of course, while call-processing times are highly important to agencies, the accuracy and completeness of the information gathered is also critically important.
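A pre/post comparison of the kind described above can begin with simple descriptive statistics. Everything below is hypothetical, a sketch of the analysis rather than real agency data:

```python
import statistics

# Hypothetical call-processing times (seconds), before and after a
# structured call-taking protocol was implemented
pre = [88, 95, 110, 79, 102, 91, 120, 85]
post = [90, 84, 96, 88, 101, 92, 87, 94]

pre_mean, post_mean = statistics.mean(pre), statistics.mean(post)
pre_sd, post_sd = statistics.stdev(pre), statistics.stdev(post)

print(f"pre:  mean {pre_mean:.1f}s, sd {pre_sd:.1f}")
print(f"post: mean {post_mean:.1f}s, sd {post_sd:.1f}")
# Reduced spread after implementation would indicate more consistent
# call handling even where the mean time barely moves
```

A real study would add a significance test and control for the other factors listed above (call volume, staffing, and so on), but even this level of summary turns the debate into one about numbers instead of impressions.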

For agencies that do not use some form of structured call-taking (a scripted process), call processing times may be longer or shorter, because the information gathered and forwarded to responding officers depends on the experience (or inexperience) of the call-taker. While an experienced call-taker may gather critical and relevant information, an inexperienced call-taker may not. I’m sure we all agree this is not a good situation. Structured call-taking benefits agencies, and most importantly responding officers, by ensuring that all call-takers ask the necessary and critical questions every time. There is no guessing about what to ask, as the key questions are provided to every call-taker via the PPDS. While call-processing times may go up, that cost can be justified by the improved accuracy of the information provided to responders. A study of pre- and post-PPDS implementation would be interesting and beneficial to agencies as they determine the value of the PPDS to their organization.

Dispatchers and dispatch administrators know their jobs and are, for the most part, excellent at what they do.  The law enforcement mindset is hard to change because the perception is that “we have always done it this way and it works.”  However, if administrators can be shown the benefits of change, change will be implemented.  Dispatch researchers should know that, as younger agency administrators are moving into decision making roles, there will be more openness to doing things differently in the future.  These new administrators grew up in the technology era and are more technologically savvy.  They recognize that there may be better ways to do things—and they know that they often involve technology.  Younger generation Emergency Dispatchers not only want the newest technologies, but literally can’t work without them.  Research showing the benefits of change is vital if necessary changes are to be instituted and ultimately accepted.

Citation: Knight C. Ann Emerg Disp Resp 2013; 1(2):5
