Category: CPSE Exchange

Strategic Planning: Improving Service and Planning for the Future

James E. White, Fire Chief (ret.), CFO-2008

by James E. White, CFO-2008
In my 34 years as a career firefighter, I can think of no tools more beneficial to me and my organization than those found within the Commission on Fire Accreditation International (CFAI) agency accreditation process. When I first became a chief officer in the early 1990s, it was difficult for me to find ways to legitimately improve my agency’s effectiveness. We needed the support of our community on so many fronts: new facilities, equipment, and personnel were among the glaring issues facing us. We struggled to gain the political support necessary to prove that the needed improvements would, in fact, change the outcomes of the events we were responding to. We needed a plan.
I was first exposed to the idea of fire service accreditation in 1997. At that time, the CFAI offered a model of agency self-assessment based on responses to numerous individual performance measures. The three main elements of the model have been maintained through the years: detailed responses to the self-assessment performance measures, a detailed community risk assessment/standard of cover, and a community-driven strategic plan. I figured this was just what we needed to take what I knew to be a great agency and make it extraordinary. Measuring our current performance and planning for continuous improvement was exactly what we needed. I’d like to say “and the rest is history,” but I would be wrong.
Finding this model was only the beginning of a real love affair with this process. The strategic planning exercises alone forced everyone to sit down and get on the same page about our future. Although a fire chief has the ability to take the department in any direction he sees fit, I learned quickly that true community-driven strategic planning would develop true goals and objectives with issues the community really wanted us to tackle. Continuous improvement was our goal, and our strategic plan set the course for us not only to improve but to improve in ways the community wanted us to. Once we had the community’s support, the whole plan began to come together. Having a plan is one thing; proving the changes worked is another.
Before adopting the CFAI model for agency accreditation, I could not tell you how effective we actually were. I couldn’t tell you how long it took the first unit to arrive on scene or, when it did arrive, how many firefighters it took to change the outcome of the event. The standard of cover process forced us to look inward at our performance. From measuring alarm processing and firefighter turn-out time to overall agency performance, today I can say there isn’t anything we don’t measure. We proved through accreditation that we do, in fact, what we say we are going to do, and all our actions are measurable.
Through the agency accreditation, we have moved from a fire department we knew was good to one that continuously improves. After 20 years now, I am often asked, why do you do accreditation? I normally respond, “We don’t really ‘do accreditation’; accreditation does us.” We use the process of self-assessment as our business plan. To those who question, I ask: If not the CFAI model, what business plan do you use?
We know that the CFAI process has made us a better, safer, and more accountable agency. Our people are better, service is better, and outcomes continue to improve. I have shared our work with hundreds of fire chiefs and fire departments across the country. Some will always look at this as only a feather in the helmet of the fire chief. To those who question accreditation as a waste of time, again, I would ask you: What’s your plan for improvement? Can you measure your performance honestly, and are you in fact changing outcomes with the service you provide? In the end, isn’t that what we are all about?

James E. White, CFO, MiFireE, has enjoyed a 37-year career in the fire service. He has served fire service agencies in northern Virginia, South Carolina, and Florida, and concluded his career as fire chief for Winter Park (FL) Fire Rescue. Jim is currently the Fire Training Program Director at Valencia College in Orlando (FL), where he oversees all state firefighter certification programs for the college. He has been an IAFC member for more than 25 years and remains credentialed as a Chief Fire Officer (CFO) designee through the Commission on Professional Credentialing (CPC).

Message from CPSE CEO, Preet Bassi, regarding COVID-19 impact on CPSE operations (Updated 09/04/20)

The Center for Public Safety Excellence (CPSE) continues to monitor COVID-19 matters on an ongoing basis, relying on verified sources and expert recommendations. CPSE understands the current situation is extremely fluid, and the safety of you, your agency personnel, and your communities remains the priority.
Unprecedented times call for unprecedented measures. CPSE is taking the following steps to ensure our continuity of operations and the safety of our staff, agencies and officers, contractors, and volunteers:

In-person workshops are resuming September 22, 2020. Visit the CPSE University for a list of upcoming workshops.
In-field technical advisor work has returned.
All non-essential staff travel for 2020 has been cancelled.
Headquarters staff is teleworking part of the week. We have the infrastructure in place to telework without significant delays or disruptions. Meetings will be conducted virtually, and the CPSE team continues to be available via email and phone.
The Commission on Fire Accreditation International (CFAI) cancelled the August 2020 hearings. This message outlines how specifically your agency may be affected by this cancellation.
A 90-day grace period for officers seeking re-designation remains in place.

CPSE is committed to working with our agencies seeking accreditation/reaccreditation and officers seeking designation/re-designation with maximum flexibility. Please reach out to program staff if you have any questions. We know you are on the frontlines of responding to this global pandemic and we want you to know that CPSE has your back.
Stay safe,
Preet Bassi, CAE
CPSE Chief Executive Officer

CPSE Learning Opportunities

On Demand Webinars

Resource Documents

Upcoming Webinars

Peer Assessor Training

QITA Workshops

At the Center for Public Safety Excellence® (CPSE®), we are excited to bring you the latest learning opportunities to advance your department and career.
Over the past couple of months, CPSE has been busy building out resources in the CPSE University. We encourage you to take the time to access these free resources.

On-Demand Accreditation Webinars
Creating a Strategic Plan – A webinar explaining the creation of a community-driven strategic plan.
Developing Strategic Plan Goals and Objectives – A webinar explaining how to develop effective goals and objectives for a strategic plan.
Performance Statements – A webinar explaining development of performance statements.
Writing the Four-Part Answer – A webinar explaining development of the four-part answer for the self-assessment manual.

Click here to view

Resource Documents
Annual Appraisals – A handout to assist with developing annual appraisals.
Critical Tasking – A handout to aid in conducting critical tasking for all emergency response types.
Engaging Stakeholders – A handout explaining how to engage stakeholders in the CRA/SOC process.
Fire Department Strategic Plans and Community Master Plans: Are They The Same? – A handout comparing and contrasting fire department strategic plans and community master plans.
Performance Statements – A handout outlining how to develop performance statements.
Writing the Four-Part Answer – A handout explaining development of the four-part answer for the self-assessment manual.

Upcoming Webinars
These live webinars are available at no cost and feature helpful advice from CPSE 2020 sponsors. Look for new sessions to be added in the coming weeks.
Evidence-based Deployment and Station Planning – Presented by Darkhorse Emergency – August 12, 1100-1200 EDT – Register
NEW! Identifying Risk in the Community, GIS Provides Answers! – Presented by Esri – August 26, 1200-1300 EDT – Register
NEW! Forecasting and Interpreting the Risk: A Quantitative Approach with the IAFF – Presented by the International Association of Fire Fighters – September 10, 1400-1500 EDT – Register

Peer Assessor Training Program
October 14 and 21 – Distance Learning Webinar – $400 – Register

Quality Improvement Through Accreditation (QITA)
Reduced Cost – For the remainder of 2020 a single registration is $625. Add a second registration from your department and receive a 25% discount off that registration.
September 22-24 – Temple, Texas – Instructor: Joe Powers – Register
October 6-8 – Fairfax City, Virginia – Instructor: Steve Olson – Register
October 13-15 – Pinehurst, North Carolina – Instructor: Tom O’Brien – Register
October 20-22 – Canton, Georgia – Instructor: Ernst Piercy – Register
November 2-4 – Bowling Green, Kentucky – Instructor: Gary West – Register
For questions or assistance, contact CPSE at 703-691-4620.

“You Don’t Know What You Don’t Know”

Battalion Chief David Farnum (CFO-2018)

The Organizational Benefits of Conducting a Complete Community Risk Assessment/Standard of Cover
by Battalion Chief David Farnum Jr. (CFO-2018)
“You don’t know what you don’t know.” At first this may sound like a nonsensical or disconcerting statement, but I would argue that it is critical to acknowledge it when developing your agency’s community risk assessment/standard of cover (CRA/SOC). Even as long-serving fire service professionals, we have knowledge gaps, and there may be risks within our communities that we have not yet identified but still need to address. Simon Sinek, an author on leadership and a TED Talk guru, asks in a blog post:
“Knowing we don’t know everything, what’s the best way to learn more? . . .
Most of us stay in our industry to help us be better at what we do. We read our own industry’s trades, we attend our own industry’s conferences, we talk to others from our industry and we take classes offered by “experts” from the inside. Though we may learn bits and pieces this way, we can never learn to innovate and solve problems or think in new ways like this.”
Applying this to the fire service, both as an industry and within our own organizations, we rarely “look outside.” If you started your career in a fire station, then you understand. Questions are asked, and answered, at the station’s kitchen table, resulting in complex problems being solved in 24 hours. But we also know the fragility of these answers – they’re only good until shift change.
The overriding organizational benefit of going through the CRA/SOC process is to couple resiliency with a depth of understanding and explanation, and this benefit can be broken down into three parts:

Revealing gaps in an existing risk assessment
Establishing response benchmarks for identified risks
Aligning internal and external communication with risk and response

You may be thinking “where do we get started?” and that’s a fair question. But if you’re not going to do this, then who is? Who else is better equipped to gather input from your community, from experts in your department, and from your command staff/leadership team?
Revealing gaps in an existing risk assessment
Conducting a complete CRA/SOC will assist your agency in revealing gaps in an existing risk assessment. Most fire service agencies have an established framework for performing some form of risk reduction, usually a blend of company-level, fire prevention, or high-risk population inspections on a recurring schedule, be it mandated or adopted. Over time, this is often reduced to working down a list. Staffing levels and workload mean that we move from one inspection to another, completing paperwork and recording results or deficiencies, but rarely thinking broadly about what we’re seeing; we are not inclined to seek more work by thinking about risk more broadly. Another integral part of the fire service is that we have addressed community need (and thereby risk) by expanding our operations over time.
Your local volunteer rescue squad is no longer able to respond to calls, so the fire department begins to train and respond as emergency medical technicians. A hazmat leak results in the loss of a major employer, so the fire department trains members to the technician level and adds apparatus. These are both reactive measures, not the deliberate result of an assessment process. Taking a step back and allowing your agency the time and space to integrate what you’re already doing with a strategic review of what you might have missed, or haven’t had time to address, is the most important part of conducting an open and honest review of community risk – the probability of an incident or event occurring and the consequence(s) to the community. Use your knowledge and intuition as a place to start, and then be methodical, because the CRA/SOC (if followed and implemented) impacts every aspect of your agency, from the pre-incident awareness of your firefighters, officers, and chiefs to how and why these members respond.
But how do you know the quality of these responses? The common refrain in the fire station is, “Well, do you see any columns of smoke?” That’s much too reductive, but it’s also revelatory – as an industry we gather relatively little information on our incidents, and what we gather is often simplistic. We wind up in a situation where “an undiagnosed gap in knowledge means you might not fully understand a problem,” as Art Markman outlined in his 2012 Harvard Business Review article. So, how does your agency ensure that it has better insight? One common tool is incident response data – that is, how quickly companies respond to calls for service and which apparatus and personnel respond on each call. As some of the few consistent and fixed data points we measure as an industry, it is important that these data are defined and collected accurately, so they may serve as a foundation (baseline) for analysis.
Establishing response benchmarks for identified risks
This brings me to the second organizational benefit of conducting a complete CRA/SOC: if you haven’t already, your agency will establish response benchmarks for identified risks, where a benchmark is the goal or objective that you are striving to meet. At its most simplistic, a benchmark will be used as a measure of each company’s response performance over time and will identify those incident types and locations where companies are unable to respond within each benchmark.
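As a rough sketch of the benchmark comparison described above: the company names, the response times, and the 90th-percentile convention below are all invented for illustration and are not drawn from the CFAI model; an agency would substitute its own adopted benchmarks and data.

```python
# Hypothetical sketch: measuring each company's response performance
# against an adopted benchmark. All names and numbers are illustrative.

def percentile_90(times):
    """Return the 90th-percentile value of a list of response times (minutes)."""
    ordered = sorted(times)
    # Index of the value at or below which 90% of responses fall.
    idx = max(0, int(round(0.9 * len(ordered))) - 1)
    return ordered[idx]

BENCHMARK_MIN = 6.0  # example goal: first unit on scene within 6 minutes, 90% of the time

responses = {
    "Engine 1": [4.2, 5.1, 5.8, 3.9, 5.5, 5.2, 4.8, 5.0, 5.9, 6.1],
    "Engine 2": [7.5, 8.1, 6.9, 7.8, 9.2, 8.4, 7.1, 8.8, 7.6, 9.0],
}

for company, times in responses.items():
    p90 = percentile_90(times)
    status = "meets" if p90 <= BENCHMARK_MIN else "misses"
    print(f"{company}: 90th percentile {p90:.1f} min -> {status} the {BENCHMARK_MIN:.1f}-min benchmark")
```

Repeated over time, a comparison like this is what surfaces the incident types and locations where companies are unable to respond within the benchmark.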

Fire service agencies, regardless of size or complexity, are linked to specific locations – fire stations and support facilities. But are these in the correct locations? Were they based on population growth trends that have changed since a station was built? To answer these and other similar questions, your ability to respond must be measured and analyzed against your established benchmarks. The map to the left shows the response performance of an agency’s first-due companies over a five-year period. This depth of analysis could only be done, however, after the agency completed the initial step of capturing its response data and establishing response benchmarks for each risk class and category for comparison. This intentional activity provides the foundation for the entire data analytics process. Now that you’ve identified your community’s risks and established expectations for response, what can you do with this information?
Aligning internal and external communication with risk and response
“Tell your story.”  This is the last and perhaps the most important organizational benefit of conducting a complete CRA/SOC. Your agency will have the opportunity to align internal and external communication with risk and response. Put a bit differently – develop a consistent way to communicate about community risk and how your agency plans, trains, and responds to mitigate those risks. This is important because no one else will be able to communicate your message as well as you can. In the current municipal fiscal environment, every department in every municipality is competing for reduced or limited funds. It is critically important that your message be clear and consistent. Every day your agency is communicating on multiple levels: within your organization, within your community, and to municipal staff and elected officials. Whoever is going to speak at a public event, at a municipal board meeting, or at a fire station, needs to do so in a way that demonstrates your agency’s understanding of community risk and how you deploy and respond.
Why is this critical? Most city staff, whether it’s during a budget development process or for a specific capital request, not only need, but require, data-supported justification. Elected officials who ask questions in public hearings are requesting that this information be presented comprehensively and that it be data driven. Shift your perspective and begin to see these meetings and requests as opportunities to explain what you’re doing and how you’re doing it.
You may ask – but what if we’ve established benchmarks and are far from meeting our response expectations? In this case, see it as an opportunity to detail the factors driving your response challenges. This can be neighborhood specific – a captain is now able to explain to a homeowners association why it takes a ladder company longer to respond; or it can be broader – an operations chief explains to the county board that a local EMS provider changed its response plans, increasing the agency’s time on scene for each incident, which in turn affects deployment agency-wide. The ability to provide examples like these comes from following the CRA/SOC development process.
In the preceding sections, I described three direct and tangible organizational benefits you will see after performing a complete CRA/SOC. Like accreditation as a whole, this is a process that needs review and periodic reappraisal, as no community’s needs are static, nor are its risks. Reviewing your existing risk assessment, establishing response benchmarks for identified risks, and aligning communication with risk and response take time and effort, but the result is a more focused and successful community-oriented fire service agency.
CPSE provides the following resources to assist your agency in developing a Community Risk Assessment/Standards of Cover:
Community Risk Assessment Standards of Cover, 6th Edition Publication
Quality Improvement Through Accreditation Course
Technical Advisor Program CRA-SOC Facilitation
CPSE University on-demand handouts and videos

David Farnum (CFO-2018) is a 19-year member of the Charlotte Fire Department, is the department’s Planning and Research Battalion Chief and serves as the department’s accreditation manager. David volunteers as a peer team leader with CFAI and is a member of CPSE’s Technical Advisor Program, with a focus on Community Risk Assessment/Standards of Cover.

Information Management: Connections to the CFAI Model

Ernst Piercy, CFO, Fire Chief (ret.), former CFAI Chair

by Chief Ernst Piercy, CFO
Contributors: Joseph Yacker, Information Systems Director, Spokane Valley Fire Department; Leonard Chan, Management Analyst, Houston Fire Department; and Xavier Anderson, Management Analyst, Los Alamos County Fire Department
The fire service, long steeped in tradition, has provided superb services to our communities for many years. However, the use of data and information management in general is in need of an upgrade, and not just technologically speaking.
When fire service leaders think of data collection, it is natural for them to refer to their familiarity with, arguably, one of the most successful data collection efforts in our industry: the National Fire Incident Reporting System (NFIRS). What distinguishes the NFIRS report is its completeness and its structure. It also has some weaknesses that can skew our data collection efforts as we move forward.
As anyone who has written an NFIRS report knows, the completeness of the standard is obvious. If you print out a blank report form, it goes on for pages. As you complete a report, you will eventually realize that just about anything you would want to report about an incident can be addressed somewhere in the form. It is clear that, while this design may have been built to answer some specific questions many years ago, it was also designed to describe an incident so completely that new ways to interpret and investigate the data would naturally evolve over the years. Unfortunately, the value of the data collected in the NFIRS database has suffered in a number of ways because of this completeness. It takes a significant amount of effort to thoroughly document an incident, and this has led to users taking shortcuts. Zero codes represent the default general code at the beginning of a series, and they have become a shortcut users commonly take to avoid entering the further detail that more accurate codes require.
So, what does this have to do with fire service accreditation? In short, nothing – and yet, everything. Clean data are the genesis of accurate reporting and must be ingrained in our daily lives. The modern fire service has, at its fingertips, an unprecedented number of tools with which to collect data, just at the time when it seems most poised to innovate its operations. Community risk reduction, community paramedicine, fire service accreditation, community outreach, and health and wellness all represent areas that are, relatively speaking, innovations to the fire service. They drive data collection and are themselves driven by data collection.
As you read this, you likely have 4-5 devices within your reach that can be utilized as platforms for data collection. Portable devices, coupled with a global data network, increase your reach, capability, and reliability on a daily basis. An overwhelming market of software tools stands ready to serve your needs. Even so, technology cannot solve our data collection problem. We still need to be able to understand and communicate our data collection objectives, and make decisions based on those understandings and objectives. Those decisions will drive the platforms, the networks, and the software, and allow us to leverage their power for success – but only if we put effort into both the design and execution of our data collection efforts.
Data as a flashlight
As an example, if a department’s community risk reduction program has analyzed census demographics, these could be overlaid with incident information about fires, their causes, and the losses they incur. Based on this, a conclusion could be made that many of the homes in the community are either not equipped with smoke alarms or are old enough that many alarms no longer operate properly. We commit to improving home safety, often by addressing smoke alarm needs. We apply for grants and create community partnerships; we acquire equipment and devise a plan. We may ask firefighters to check for functioning smoke alarms whenever they are in a home for any reason and equip them with the tools they need. Our focus on vulnerable populations increases, and we offer services before we are called into their homes. But we want to know whether our efforts result in measurable improvement. Data collection must dovetail with all of our efforts, not only to provide conclusive results at the end of the project (if there is one), but to dynamically drive continuous improvement during delivery.
How do we measure our progress?
There are a variety of ways, but let’s consider the following: input measures, which could include the value of resources, typically displayed as a number; output measures, which include a quantity or number of units; efficiency measures, which define the number of units used per output; service quality measures, which include customer satisfaction; and outcome measures, which are the gold standard of measurement. Outcome measures include the determination and evaluation of the results of an activity, as compared to your intended results.
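As an illustration of how these measure types relate, here is a minimal sketch using a hypothetical smoke-alarm installation program; every number below is invented for the example.

```python
# Illustrative sketch of the measure types described above, applied to a
# hypothetical smoke-alarm program. All figures are made up for the example.

program = {
    "budget_spent": 25_000.0,   # input measure: value of resources consumed
    "alarms_installed": 500,    # output measure: number of units produced
    "satisfaction_pct": 94.0,   # service quality measure: customer satisfaction
    "fire_deaths_before": 4,    # outcome baseline for the prior period
    "fire_deaths_after": 1,     # outcome result after the program
}

# Efficiency measure: resources used per unit of output.
cost_per_alarm = program["budget_spent"] / program["alarms_installed"]

# Outcome measure: the change in results, evaluated against intent.
deaths_prevented = program["fire_deaths_before"] - program["fire_deaths_after"]

print(f"Efficiency: ${cost_per_alarm:.2f} per alarm installed")
print(f"Outcome: fire deaths reduced by {deaths_prevented}")
```

Note that only the last figure speaks to outcomes; the others describe effort and activity, which is why outcome measures are the gold standard.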
How much data is enough?
So, let us consider the scope of your data collection, keeping in mind that it must support your outcome objectives. An easy way to go wrong is to buy into the (perhaps correct) notion that there are valuable nuggets of information available to us whose value will only be revealed later. The inclination, then, is to guess at what data to collect, well beyond what we know we need to support our current objectives. Is there a downside to collecting more data than you need? After all, you already have someone on site, so you may as well collect as much data as possible, correct? If that person is a robot, or an altruist, then the only risk is a few more minutes taken to collect more data. More than likely, however, the person working on your report has a real life, full of stresses, distractions, and daily requirements. If you are collecting more data than is obviously needed, they will start to take shortcuts. And there is no way to ensure that they will cut out the least valuable data in your request; they may even give up entirely on the data collection effort.
The answer is to start with the data you really need. As responses start to drive additional questions, add those to the data collection as you go. It is true that you may not get a full analysis of every record, but you may end up with a more complete data set in the end. Combine this conservative approach with a substantial and open discussion with your data collectors about why the data are needed and what questions you hope to answer. Involve them in the process; maybe they will be the ones who suggest the one critical question missing from your form. A quick reminder about the importance of both quantitative and qualitative data: quantitative data are measurements of quantities (numbers) and are critical to the measurement of your programs; qualitative data are descriptive – measurements regarding observed phenomena. Both are important, although quantitative data are certainly easier to validate. Finally, make sure your folks know you care about the data. A good dataset requires leadership.
So, how can we tie data collection into the fire service accreditation model? Here are three quick examples:
Strategic Planning. Your data are important, but stakeholder needs should be summarized, themes identified, and, where feasible, quantified. Collecting and tracking information from stakeholder feedback reassures the stakeholders that their feedback was considered. Using these data in the internal development of your strategic plan ensures that your outcomes are aligned with community expectations. In the ninth edition of the model, this is critical in criterion 3B, during the development of your goals and objectives, which serve as the foundation for your strategic plan.
Community Risk Assessment-Standards of Cover (CRA-SOC). Your risk assessment must have information that is useful and digestible for all personnel. The document should not be just a “front office” product that is compiled simply to check off a box. The data should help improve day-to-day situational awareness and increase knowledge of the community served. The document should not only help shape the top-level function of placement of units and stations, but also guide equipment, training, and public education strategies. In other words, data should drive deployment. A process must be in place to validate your data, including the development and implementation of an outlier policy. Your data sources should be listed, so updates can be done seamlessly while ensuring credibility with the AHJ and stakeholders. These data are critical to the development of and responses to criteria 2B and 2C of the fire and emergency services self-assessment model.
The CRA-SOC provides a real glimpse into the performance of the agency. It identifies strengths and challenges and, through an annual appraisal of each response program, provides a road map across the five years of data. In other words, collect data, then act upon it.
Writing to performance indicators in the model. Beyond just measuring performance times, many agencies struggle to truly measure how successfully they are meeting performance indicators. Too often, agencies just provide the generic “it is going well.” Stakeholders often lack context on what constitutes doing a good job. An incident may have looked good from the public’s perspective, but the crews may be frustrated by things such as non-working hydrants, late-arriving units, or even near-misses. So how do we determine how well we are doing? Metrics should not be established for the sake of creating metrics; the numbers should have meaning and not lead to unintended consequences. The temptation to mold the self-assessment model into a public relations document should be avoided. The measures used, and the model itself, are not designed to make an agency look good but rather to identify areas for improvement.
An argument could be made that data should be used in the response to all performance indicators in the model, but, admittedly, there are cases where that could be difficult. In situations where quantifiable data are not available, qualitative data (observed phenomena) should be used to appraise your programs. As an example, how do you address the appraisal for your response to performance indicator 2A.1? It states, “Service area boundaries for the agency are identified, documented, and legally adopted by the authority having jurisdiction.” If, within the description, you have told the reader that the service area boundaries are identified (and adopted), how do you appraise this? Certainly, quantifiable data may not be an option, although telling the reader how clearly defined your response areas are could lead to more accurate response plans, dropping borders, adding or deleting the need for mutual aid, etc.
On the other hand, there are situations whereby quantifiable data must be used to address your appraisals. In performance indicator 5E.1, which states in part, “Given its standards of cover and emergency deployment objectives, the agency meets its staffing, response time….” you should describe what your benchmarks (goals) are, and in the appraisal you would use data to provide your actual response performance, including any performance gaps. This should be replicated in each of the programs that follow (5F.1, 5G.1, etc.).
The benefits of data collection and information management include everything from data visualization (in both statistics and mapping) to justifying or shifting funding for your programs. With sound data collection, fire service leaders can readily justify the decisions they make for the future success of the organization.

Ernst Piercy is a retired fire chief with more than 35 years in the fire service, most recently serving as the Regional Fire Chief for Navy Region Southwest in San Diego, California.  He served as the accreditation manager in the Air Force Academy Fire Department’s successful bid for international fire service accreditation in March 2001 and led his team through successful re-accreditation bids in 2006 and 2011.
Chief Piercy is a 2011 graduate of the Senior Executives in State and Local Government Program at Harvard University, completed the Executive Fire Officer Program at the National Fire Academy in 2007, and has been a Chief Fire Officer (CFO) designee since 2003.