The adoption of standards is one issue. Audits or reviews to determine whether standards are followed are another. That applies to internally set standards, since there are no industrywide standards at this stage. For example, earlier this year, the Government Accountability Office (GAO) issued a report on the TSA’s canine detection program. It discussed two programs: teams that screen airport cargo and teams that screen passengers.
The two federally funded programs are used in airports around the nation to detect explosives. The canine detection programs received $101 million in funding in 2012, up from $52 million in 2010. The increase was attributed to a larger stipend the TSA provides to law enforcement agencies that participate in the programs.
The report found that although the agency tracked the number of training minutes canine teams conducted each month, as well as the types of explosives and search areas used in training—in accordance with SWGDOG best practices—it did not analyze the data. An analysis of the data by the GAO revealed that a number of the TSA’s canine teams did not meet the TSA’s own training requirements for at least six months in 2012. Moreover, “this information was previously unknown to TSA until we brought it to its attention,” the GAO wrote.
The report also found that the TSA deployed the new passenger screening program at low-risk airports without studying its effectiveness. “TSA began deploying [passenger screening canine] teams in April 2011 prior to determining the teams’ operational effectiveness and before identifying where within the airport environment these teams would be most effectively utilized,” the report noted.