Understanding the TICP scorecard process

In last month’s issue, we took a look at one of the most significant developments affecting communications interoperability over the past year. Tactical interoperable communications plans (TICPs) were required by the Department of Homeland Security (DHS) as a condition of FY2006 grant funding for Urban Area Security Initiative (UASI) regions and for designated metropolitan areas in states without UASI regions. DHS also required limited, full-scale exercises of each of the 75 plans submitted, observed by independent peer evaluators and subject matter experts.

Homeland Security Secretary Michael Chertoff announced this past May that after completion of these exercises, DHS would produce a scorecard clearly reflecting the level of communications interoperability in each of these 75 metropolitan areas. Immediately following the completion of the TICP plan validation exercises, the department began development of scorecards with the goal of releasing them before the end of 2006.

DHS convened five panels of subject matter experts to develop the 75 scorecards. Each panel reviewed a host of materials for up to 15 metropolitan regions. Materials included the original TICP plan, peer reviewer comments in acceptance of the plans, evaluator findings during the plan validation exercises, after-action reports and improvement plans created in cooperation with each region following the exercises, and closeout self-assessments completed by each region late in the year.

Panel consensus was reached on eight elements grouped into three general areas—governance, standard operating procedures, and usage. The governance score was based on the state of maturity of five elements affecting tactical interagency communications:

1. Formal decision-making structures

2. Formal cross-jurisdictional agreements

3. Strategic plans

4. Funding

5. Leadership

The standard operating procedures score was based on two dimensions:

1. Communications policies, practices, and procedures

2. Command and control during incidents

The usage score was based solely on the familiarity and frequency of use demonstrated during the TICP validation exercise.

Each of the eight elements was evaluated based on a capabilities maturation model created by SAFECOM in development of its National Baseline Assessment survey and methodology. In development of the scorecards, which focus narrowly on tactical communications, four levels of maturity were used: early, intermediate, established, and advanced.

Across all eight elements, specific metrics distinguished each level. Though metrics varied by element, early development was generally indicated by little or no demonstrable evidence of progress. The intermediate and established levels were both marked by evident or demonstrated success, with the lesser level indicating some lingering difficulties. The advanced level was distinguished by proactive efforts to maintain and further develop the capability. For example, a jurisdiction would have been scored as advanced in usage of tactical interoperable communications capabilities if it demonstrated, and provided further evidence of, regular use of the interagency communications resources at its disposal.

Scorecard panelists first reviewed all available material individually and assigned their own scores for each element, ranging from 1 (early) to 4 (advanced). Panelists were then convened in person to develop consensus scores, language for the scorecards, and recommendations for improvement. The three final scores for governance, SOPs, and usage were then derived: the governance and SOP scores were the averages of their five and two constituent elements, respectively, while the usage score came from its single element.
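The aggregation described above can be sketched in a few lines of code. This is purely illustrative: the sample consensus scores are hypothetical, and the rule for mapping a fractional average onto one of the four maturity levels is an assumption, since the article does not specify how DHS handled non-integer averages.

```python
# Illustrative sketch of the scorecard aggregation described above.
# The sample scores and the rounding rule are assumptions for
# illustration; the article does not specify how fractional averages
# were mapped to the four maturity levels.

LEVELS = {1: "early", 2: "intermediate", 3: "established", 4: "advanced"}

def area_score(element_scores):
    """Average an area's 1-4 consensus element scores and round to the
    nearest whole maturity level (assumed rule)."""
    avg = sum(element_scores) / len(element_scores)
    return round(avg)

# Governance had five elements, SOPs two, and usage one.
governance = area_score([3, 2, 3, 2, 3])  # hypothetical consensus scores
sops = area_score([3, 3])
usage = area_score([2])

print(LEVELS[governance], LEVELS[sops], LEVELS[usage])
```

Under these sample scores, the governance average of 2.6 rounds up to the established level, matching the kind of whole-level result a quarter-filled "Harvey Ball" graphic can depict.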

The final scorecard—a two-page document succinctly presenting results—was prepared by DHS from the panelists’ work. The score for each of the three areas was graphically depicted with a “Harvey Ball”—a circle with one-quarter, one-half, three-quarters, or all of its area darkened. A separate block describing the communications technologies in use throughout the jurisdiction for interoperability appeared at the end of each scorecard. The scorecard development process itself was marked by a strong insistence on consistency across the panels and all 75 jurisdictions. Though similar wording was often used in findings and recommendations where appropriate, each jurisdiction was evaluated distinctly.

Individual scorecards were released to homeland security representatives in the subject cities and states on Dec. 22, 2006. The final report, entitled “Tactical Interoperable Communications Scorecards—Summary Report and Findings,” describing the methodology and containing scorecards for all 75 jurisdictions, was publicly released on Jan. 3, 2007. It was presented during a press conference held by Chertoff on a busy day in Washington, D.C., amidst memorial services for President Ford and warm-up for the dramatically reconfigured 110th Congress.

Because of its focus on one particular aspect of emergency communications—the tactical realm of first responders during the first operational period of an incident—the scorecard may or may not reflect the total interagency communications capabilities of a region. DHS took care to couch the purpose of the TIC plan, the validation exercises, and the scorecard itself in these terms. Unfortunately, popular press coverage of the scorecards at both the local and national levels has not conveyed this fine but important distinction.

Though some dissatisfaction was heard across the country upon release of the scorecards, the methodology behind their development was solid. The wide range of scores among large and small jurisdictions alike shows that communications interoperability issues are not the sole domain of major metropolises, nor are they always solved by the application of advanced technology. The scores show that there is much work to be done nationally, particularly on the people and procedural components of the advanced communications systems linking public safety agencies.

The DHS scorecard report and findings can be found online at http://www.dhs.gov/xlibrary/assets/grants-scorecard-report-010207.pdf.

Dan M. Hawkins is director of public safety programs for SEARCH, the National Consortium for Justice Information and Statistics, where he manages multiple technical assistance programs to public safety agencies nationwide funded by the U.S. Departments of Justice and Homeland Security. These programs provide assistance in automated systems development, planning and integration of justice information systems, and communications interoperability.

Published in Public Safety IT, Jan/Feb 2007
