Let’s break this question down into two fundamental issues. We will tackle the issue of where information comes from first, then take a brief look at best practices. Information or data can hurt you or help you, often depending on where it comes from and how it is applied. Two concerns come to mind: Is the information from a reputable source, and is the information accurate and based on the current environment?
We will start with determining a reputable and reliable source. This is perhaps the easier part of the question; the more complex issue is whether the information, though from a reputable source, is the type of data that should be applied to our organization. We often overlook the second element of data application: Is it applicable to us? We have seen many organizations try to force-fit standards, meaning they take standards, policies and procedures that were established for a different type of organization and apply them to their own. They discover that not all standards were created for blanket or comprehensive application, and they are rudely awakened when they fail to comply with the standard they adopted. Think of the analogy of trying to force a square peg into a round hole.
Compare apples to apples
In our opinion, the adoption of standards, policies and procedures is a local choice, based on local criteria such as organizational model, the economic and political environment and local expectations of service. Identifying comparable organizations is one place to begin when seeking a reliable and reputable source. When seeking comparable organizations, you must compare apples to apples. For example, it makes no sense for a small rural fire department to seek comparable data from a large urban organization. It is not that the information is not useful, because it can be if applied in a prudent manner. However, the data being reviewed should be examined with two questions: Is it practical and realistic for us to adopt this information (policy or procedure) for our use? What are the implications if we do apply this information to our agency?
It would not be wise to apply the response metrics of a large urban fire department, which can place 16 response personnel on scene at a structure fire in six minutes, to a small rural department that does not have the response capacity of the larger organization. If we want to compare apples to apples, where do we look for that comparable agency? A first stop is the Center for Public Safety Excellence (CPSE), which identifies agencies that have successfully completed its accreditation program. The accreditation process identifies departments that have evaluated their services through a demanding self-analysis. In this manner, using comparable departments from the CPSE that have undergone the accreditation process provides reliable, accurate and comparable data.
Part two of our answer involves data. Is the information accurate and based on the current environment? Collecting data can be tricky. We first must begin with the end in mind. In other words, what are we attempting to demonstrate (prove, support, disprove, etc.)? We use a four-step process to collect our data:
1. Clarify your goal. Again, why are we collecting the data? For example, what problem are you trying to understand or solve by collecting this data? Be specific and focus on a single issue. Do not try to gain insight into a systemic (organizational) problem by collecting only one data point. For example, if your goal is to reduce response times, it is best to collect data from various data points, not just one. There is much more to response and response times than the one measure of time (how long it takes us to arrive on scene).
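To illustrate the point about multiple data points, here is a minimal sketch of how total response time can be broken into component intervals rather than treated as a single number. The record field names and timestamps are hypothetical, invented for this example; they are not drawn from any particular records management system.

```python
from datetime import datetime

# Hypothetical incident record; timestamps are illustrative only.
incident = {
    "call_received":    datetime(2023, 5, 1, 14, 0, 0),
    "units_dispatched": datetime(2023, 5, 1, 14, 1, 10),
    "units_enroute":    datetime(2023, 5, 1, 14, 2, 30),
    "units_on_scene":   datetime(2023, 5, 1, 14, 7, 45),
}

def response_segments(rec):
    """Break total response time into component intervals, in seconds."""
    return {
        # Time from call pickup to dispatch of units
        "alarm_handling": (rec["units_dispatched"] - rec["call_received"]).total_seconds(),
        # Time from dispatch until units begin traveling
        "turnout": (rec["units_enroute"] - rec["units_dispatched"]).total_seconds(),
        # Drive time from station to scene
        "travel": (rec["units_on_scene"] - rec["units_enroute"]).total_seconds(),
        # The single figure most departments report
        "total": (rec["units_on_scene"] - rec["call_received"]).total_seconds(),
    }

print(response_segments(incident))
```

Looking at the segments separately shows where the time actually goes: an organization with a long alarm-handling interval needs a very different fix than one with a long travel interval, even if their totals are identical.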