Auditing emergency management programmes: Measuring leading indicators of programme performance

Heather Tomsic
Received (in revised form) 2nd May, 2016

Standards, Training and Communications Coordinator, Safety, Security and Emergency Management Division, Corporate Services Department, Metro Vancouver, 16th Floor, 4330 Kingsway Avenue, Burnaby, BC V5H 4G8, Canada. Tel: +1 604-432-6246; e-mail: heather.tomsic@metrovancouver.org

Heather Tomsic's background includes heavy construction sites, pulp and paper mills, oil refineries and dam projects as a pressure vessel specialist. Since 1996 she has participated in national association committee work in the areas of industrial occupational analysis, occupational health and safety, and professional safety accreditation. Having completed her Masters in Adult Education at Simon Fraser University in 2001, she works at the corporate programme management level in occupational health and safety for a major public utility in British Columbia's lower mainland. As a management systems person and nationally accredited safety professional, her focus is the managed improvement of workplace conditions, safe work practices and emergency preparedness, in and around the industrial installations and processes she once built. An integral part of system improvement is its measurement and evaluation, and her project management and systems planning, development and measurement experience includes management and conduct of annual audits on her employer's behalf since 2011.

ABSTRACT

Emergency Management Programmes benefit from review and measurement against established criteria. By measuring current vs required programme elements for their actual currency, completeness and effectiveness, the resulting timely reports of achievements and documentation of identified gaps can effectively be used to rationally support prioritised improvement.
Audits, with their detailed, triangulated and objectively weighted processes, are the ultimate approach in terms of programme content measurement. Although Emergency Management is often presented as a wholly separate operational mechanism, distinct and functionally different from the organisation's usual management structure, this characterisation is only completely accurate while managing an emergency itself. Otherwise, an organisation's Emergency Management Programme is embedded within that organisation and dependent upon it. Therefore, the organisation's culture and structure of management, accountability and measurement must be engaged for the programme to exist, much less improve. A wise and successful Emergency Management Coordinator does not let the separate and distinct nature of managing an emergency obscure their realisation of the need for an organisation to understand and manage all of the other programme components as part of its regular business practices. This includes its measurement. Not all organisations are sufficiently large or capable of supporting the use of an audit. This paper proposes that alternate, less formal, yet effective mechanisms can be explored, as long as they reflect and support organisational management norms, including a process of relatively informal measurement focused on the organisation's own perception of key Emergency Management Programme performance indicators.

Journal of Business Continuity & Emergency Planning, Vol. 10, No. 1, pp. 57-75. Henry Stewart Publications, 1749-9216

Keywords: evaluation, audit, measurement, performance, leading indicators, management systems

INTRODUCTION

Why should organisations audit or otherwise measure their Emergency Management Programme?
This paper examines the applicability and benefits of developing and implementing Emergency Management Programme audit milestones to effect the identification and support of necessary, prioritised and iterative Emergency Management Programme improvements. Identifying areas of strength, gaps and critical growth requirements through an effective audit process, particularly those not readily discovered during plan testing, provides for sustainable, defensible and progressive Emergency Management Programme improvement:

• Stakeholders are involved in the process of evaluation and measurement of the Emergency Management Programme, regularly but infrequently sharing an opportunity for systematic, incremental and measured improvements.
• Rationale and methodology for systematic gap analysis produces focused recommendations that recognise and support existing areas of strength, and specifically address prioritised areas currently requiring improvement.
• Audit action plan creation is lodged at responsibility levels with the authority to act or delegate responsibility for each recommendation's implementation.
• Subsequent audit results provide ongoing comparable evaluations, validating implemented recommendations, while flagging iterative improvement requirements.

However, instead of solely depending upon use of a complex, time-consuming and burdensome audit, the writer introduces an alternative, less formal assessment of leading indicators of emergency management performance. As many organisations already conduct regulatory-driven annual audits of their safety management system, where possible, these key questions could be inserted into the existing annual system measurement rather than being conducted separately. By using this type of evaluation process, the parallels between any management system can be drawn upon, so that the outcomes of its measurement of effectiveness can be more easily accepted and understood by those charged with responsibility for its success.
By coordinating efforts to measure how well the organisation manages prevention of incidents (safety and security) with a concurrent measurement of its efforts to manage the eventual occurrence of serious, critical incidents, emergencies or events, pairing both systems' measurement will make synergistic gains apparent and harness targeted support for both systems' continuous, responsible and measurable improvement. Parallel measurement efforts mean that additional allocation of audit conduct resources is not required. Cost savings result for the organisation that implements this approach, identifying and resolving noted deficiencies in both systems' implementation in a systematic and prioritised way. Post-measurement, organisational efforts can be rationally focused within identified Emergency Management Programme areas requiring improvement. As managers recognise their necessity, these emergency management activities are subsequently integrated into operational norms, just as with safety management system requirements. This less formal approach may not be workable for Emergency Management Programmes which require a formal audit report to support year-to-year funding arguments to finance specific system improvements. However, a performance indicator-based measurement process may well produce a more general and accessible set of programme assessments, leading managers to a quicker and more complete understanding (and implementation) of needed improvements.

RATIONALE FOR DEVELOPING AN EMERGENCY MANAGEMENT PROGRAMME AUDIT PROCESS

Why develop an Emergency Management Programme audit process? Large public and private resources are currently assigned to emergency management activities at the personal, municipal, provincial and federal levels in Canada.
There is increasing public and private realisation that preparation is key to mitigation of, and recovery from, any type of emergency. This basic general understanding of the need for emergency preparedness, particularly at the organisational level, also necessitates that a suitable means of measurement be included, to ensure that time and energy are well spent on the implementation and improvement of an effective Emergency Management Programme. Regardless of current support, few organisations can justify creating or maintaining an Emergency Management Programme without having a process in place to measure and improve its effectiveness. This paper provides an opportunity for consideration of the necessity of a systematic approach to that measurement: Emergency Management Programme gap analysis, assessment of implementation effectiveness, or formal audit.

Measurement underpins any Emergency Management Programme:

Emergency management is a systematic and documented process to facilitate the assessment, prevention and mitigation of potential risks and their resultant impacts, while maintaining effective response and recovery operations to ensure the continuation of services during an emergency.1

Based on the realistic assessment of the hazards and risks to, and vulnerability of, the organisation's systems and services, all subsequent Emergency Management Programme development and improvement will benefit directly from measurement of each Emergency Management Programme activity's effectiveness, since and during its implementation, at designated intervals. A defensible continuous improvement process requires formalisation and structure, so that everyone's efforts to contribute to the emergency management processes are recognised and supported as they meet, or work to meet, the required levels of effective implementation.
Without standardised organisational measurement at designated intervals, Emergency Management Programme efforts and activities may be localised, isolated and even competitive, given finite resources. This can be avoided by understanding the current state of the programme's effectiveness and its areas of strength, and using key performance indicators to flag those elements requiring improvement. By establishing designated intervals of standardised measurement:

• Emergency incidents and/or events are not the only whole-system testing mechanism.
• Current or emerging hazards and risks and shared vulnerabilities become regularly assessed, removed as deemed appropriate, and/or up- or downgraded in impact.
• Organisational resources can be rationalised to support systematic improvement of identified areas of the whole Emergency Management Programme.
• Focused and systemic improvements mean that inter-agency, -organisational and -governmental efforts can be coordinated more efficiently, and organisational costs are reduced when duplication of efforts or mis-assignment of resources is avoided through effective, prioritised management.

This measured approach reduces the potential for large, sudden and unexpected gaps in emergency preparedness, which are normally made painfully apparent during an actual catastrophic emergency. It can assist in preventing overlapping planned dependence on limited area resources, or in discovering Emergency Management Programme limitations or outright failures in key areas previously untested by real-time occurrences. In short, measuring Emergency Management Programme effectiveness on an ongoing and regular basis prevents its uncontrolled and unwieldy development in higher visibility areas which are less cost-effective, as it prevents under-preparation in the lower visibility, higher yield areas of emergency management that may be less well known or understood.
SETTING THE STANDARD FOR MEASUREMENT

In terms of providing a solid base for an organisation's Emergency Management Programme, there is no more effective approach than collaboratively constructing an organisational standard of management practice for its systematic implementation. This organisationally endorsed standard of practice must clearly outline all required elements of the Emergency Management Programme, including designated and delegated responsibilities by organisational entity and level, for its implementation and ongoing maintenance to be sustained. Operational responsibilities must be centred around standardised Emergency Management Programme elements and meet all applicable federal, provincial and municipal legislative and regulatory requirements (in Canada). Not only does the process of its iterative development include and involve affected stakeholders, but, when completed, the document provides a known touchstone and framework for all organisational emergency management activities.

An emergency management standard of management practices which captures organisational expectations for identified activities lists programme accountabilities, role responsibilities and training (or certification) requirements for all involved, and provides an ideal base for either formal or informal objective measurement. Review asks the question: have we done what we have said will be done? Without this internally embedded ownership of the programme requirements, as with safety management systems, in many jurisdictions bare-bones legislative requirements alone may become the basis of emergency preparedness activities, and business continuity can subsequently become the key driver for both public and private enterprises' standards of practice.

When an emergency management standard is endorsed at senior organisational level, its requirements become regularised as part of the ongoing management plan, and the measurement process internalised.
Integral to its successful implementation are mandated requirements for three-deep management personnel to be identified, trained in, and actively participating in ongoing Emergency Management Programme exercises. Without knowledgeable, accountable participants understanding their interactive roles and responsibilities, progressive system improvements can move slowly or stall entirely at the initial stages of implementation. With clear management commitment to standardised programme activities, their clearly enunciated measurement mechanisms become the norm rather than an additional burden. Measurement of management activities related to Emergency Management Programme implementation is now an organisational expectation, as is the tracking of its effectiveness, using key performance indicator measurement.

The real benefit of an established standard of programme management approach is that, rather than a series of various organisational exercises being conducted to test only some system component implementation, actual full programme measurement can be conducted which effectively compares and contrasts current performance indicators against accessible, clear and standardised requirements. This measurement is done by an individual or team with substantial expertise in emergency management, against the requirements outlined in the standard of management practices (which includes legislated obligations where they apply), ensuring that the mechanism chosen for such programme measurement produces information compatible with the organisation's management structure.

DOCUMENTATION REVIEW

Review emergency management standard requirements to identify what the organisation has established as being needed to construct effective emergency management processes. Identify programme components and respective elements, role and responsibility expectations, comparing and contrasting them against accepted Emergency Management Programme principles and components.
This would include review of Emergency Plan documentation, communications, After Action Reports and, in time, existing Audit Action Plan recommendations. While the intrinsic value of a complete and effective Emergency Management Programme may be eminently evident to its Programme Coordinator, only by having established measurable, organisationally supported programme activities and outcomes will responsible managers be able to easily and effectively identify what is working well and what is missing. Does documentation indicate that standard requirements are met at the intervals which, and by/for the individuals whom, it stipulates?

OBSERVATION

Conduct site visits to confirm implementation of stated Emergency Management Programme requirements, including in situ application and/or implementation of received training, emergency management communications, information, activities and access to plans. Are those tasked with meeting standard requirements acting upon them as required by the standard?

REPRESENTATIVE INTERVIEWS

Stakeholders should regularly, but infrequently and without consecutive requests except in specific circumstances,2 be invited to participate in the organisation's Emergency Management Programme audit process. Using recognised system audit processes, representative stakeholder interviews can confirm the actual level of personal understanding, expectations and individual or group participation. Emergency Management Programme audit interviews would include designated programme personnel with formally assigned roles, management and supervisory representatives, and regular workers within the organisation, as well as key external agency representatives. Representative numbers of interviewees are identified during each audit cycle to participate in measurement of knowledge and activities in regard to their Emergency Management Programme.
Are participants supported to understand, and do they meet, their respective Emergency Management Programme standard roles?

TRIPLE PERSPECTIVE AUDITS

Coordination of documentation, observation and interview results will provide a triple perspective on what the organisation feels it has implemented, what is in situ at various sites, and how well understood and supported the Emergency Management Programme is at present. Most importantly, the results recognise de facto efforts and actual implementation strengths, as well as flagging those areas still requiring improvement. This initial set of findings is then filtered through knowledgeable analysis to prioritise subsequent action items for immediate and long-term improvements. These organisation-specific Emergency Management Programme action plans capture realistic and supportable recommendations for prioritised and defined improvement in needed programme areas. An annual action plan provides the basis of the subsequent audit examination: were recommendations realistic and supportable, and did they improve the programme?

Each Emergency Management Programme component is assessed against standard requirements:

• Hazard, risk and vulnerability assessment
• Prevention and mitigation
• Preparedness
• Response
• Recovery

This approach is not new, for '[t]o develop an effective, workable emergency management plan, planning should also recognize a hazard, risk and vulnerability analysis (HRVA) and a process for continuous improvement as key elements in the emergency management continuum'.4 However, while much time and effort is expended getting the basics of emergency management into place, at this point there appears to be little formalised, interval measurement built in to many Emergency Management Programmes, except in targeted exercise design elements, so that gaps in the operational activities become the focus for situational analysis or After Action Reports,5 rather than the programme itself.
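The triangulated, weighted component assessment described above can be sketched in code. This is a minimal illustration only: the component names come from the article, but the evidence weights, the 0.0-1.0 scoring scale and the flagging threshold are hypothetical placeholders, not values drawn from any standard.

```python
# Illustrative sketch of a triple-perspective audit score: each programme
# component is scored from three evidence streams (documentation review,
# site observation, representative interviews) and low scorers are flagged
# for the action plan. Weights and threshold are hypothetical assumptions.

COMPONENTS = [
    "Hazard, risk and vulnerability assessment",
    "Prevention and mitigation",
    "Preparedness",
    "Response",
    "Recovery",
]

# Hypothetical evidence weights; an organisation would set its own.
WEIGHTS = {"documentation": 0.4, "observation": 0.3, "interviews": 0.3}
THRESHOLD = 0.7  # components scoring below this are flagged (assumption)


def component_score(evidence):
    """Weighted average of the three evidence streams (each scored 0.0-1.0)."""
    return sum(WEIGHTS[stream] * evidence[stream] for stream in WEIGHTS)


def flag_gaps(results):
    """Return components whose triangulated score falls below the threshold,
    worst first, ready to become prioritised action-plan recommendations."""
    scored = {c: component_score(e) for c, e in results.items()}
    return sorted(
        (c for c, s in scored.items() if s < THRESHOLD),
        key=lambda c: scored[c],
    )


if __name__ == "__main__":
    # Example audit cycle with made-up evidence scores.
    results = {
        "Preparedness": {"documentation": 0.9, "observation": 0.8, "interviews": 0.7},
        "Recovery": {"documentation": 0.5, "observation": 0.4, "interviews": 0.6},
    }
    for component in flag_gaps(results):
        print(f"Flag for action plan: {component}")
```

The point of the sketch is the structure, not the numbers: because all three perspectives feed one comparable score per component, successive audit cycles can be compared directly, which is what makes the year-on-year validation of recommendations possible.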
The writer proposes that Emergency Management Programmes would benefit immensely from being systematically scrutinised once introduced and implemented, ensuring that audit activities examine all components of an organisation's Emergency Management Programme.

IDENTIFYING AND MEASURING EMERGENCY MANAGEMENT PROGRAMME COMPONENTS: ORGANISATIONAL HAZARD, RISK AND VULNERABILITY ANALYSIS (HRVA)

The Emergency Management Programme's HRVA component is the basis for the activities expanded upon in other programme components, and is to be revisited and examined carefully at least biennially. In a 24-month period, ongoing programme improvement activities and impactful events may well change the order of HRVA prioritisation, if not its contents. For some organisations, other types of business cycles can impact vulnerabilities, whether local or global in nature. A clear and current picture of the applicability of the 'Top Five Hazards' can provide a schedule for review, exercise or audit of Preparedness and Response Plans suited for anticipated seasonal or ancillary events, or other business cycle impacts. In the short term, hazard-specific and all-hazard Preparedness and Response Plans can directly benefit from more focused work, as relevancy is clearly established and plan preparations have specific organisational and operational benefits. Even low-level or slow-moving emergencies can often provide a good opportunity to test activation and response procedures against limited demands. Often, lower-level emergencies (a brief power outage; a small-scale wind storm; localised landslide or flooding) prompt organisational support to move towards more realistic and supportable Prevention, Preparedness and/or Response Plans. When a larger-scale emergency does occur, it need not occur locally to focus organisational energies on a previously lower-rated or unidentified vulnerability.
Instant access to world events can prompt a 'that could be us' moment which often triggers impetus for local plan improvements. The Emergency Management Programme's HRVA must be kept current and realistic for it to guide and justify resources. Little is gained, and much can be lost in Emergency Management Programme credibility, due to over-preparation in one area of risk at the expense of preparedness, suddenly evidenced, in another higher priority or emerging area of vulnerability. Current and accurate HRVA priorities are more easily aligned with organisational demands and resources, as realistic resiliency is seen as practically constructed for all participants in a measurable and defensible manner.

Documentation of HRVA review can be captured in management meeting minutes via specialised task force reports provided at least biennially to senior management. This not only captures the gains made and updates the organisational leadership, but provides opportunity for recognition of vital Emergency Management Programme efforts beyond the usually reported Response or Rescue activities, both of which are relatively short-term and costly. However, with all Emergency Management Programme component measurement cued similarly, coordinated efforts for measurement and reporting are best introduced and sustained in a standardised audit process.

IDENTIFYING AND MEASURING EMERGENCY MANAGEMENT PROGRAMME COMPONENTS: PREVENTION AND MITIGATION

Cost-saving in the long term, Prevention and Mitigation can be measured most effectively in terms of comparative resources remaining productive vs anticipated service or production losses during an emergency, preferably estimating as much as possible from external occurrences.
While over the last several decades service redundancy has not been popular for public sector utilities, globally illustrative examples of catastrophic events affecting large populations, such as the earthquake in Nepal, are gradually changing this outlook. As due diligence comes to the fore, the planned prevention of a geographic area's sudden loss of water supply following wide-scale emergencies (e.g. through potential secondary supply lines), and the necessity for continued sewage containment and treatment for industrial, commercial and residential purposes, mean that water and sewer services redundancy is becoming recognised as critical to public infrastructure preparedness.

Prevention and Mitigation capability requires that an organisation be committed to a complete understanding of its mandate for service (or operational) demands. Understanding the full effects of localised emergencies (incidents which are beyond normal operational incident response capabilities) ensures that such incidents can be anticipated, causal factors examined and prevention measures considered. Whether the measures prevent the emergency or mitigate its impact, innate resiliency is also being constructed, as such preparation supports and speeds recovery.

IDENTIFYING AND MEASURING EMERGENCY MANAGEMENT PROGRAMME COMPONENTS: PREPAREDNESS

Preparedness documentation review is conducted against standard requirements for preparedness, weighing existing supporting plans, procedures, guidelines, internal and external resource inventories, and emergency liaison and contact lists against needed ones. The emergency management standard effectively identifies each set of required programme components and their activities; observation is also conducted through careful measurement against its requirements.
Interviews reflect questions around the role responsibilities, activities and training requirements for respective participants, reflecting the existing abilities of the organisation in relation to its required capabilities.

IDENTIFYING AND MEASURING EMERGENCY MANAGEMENT PROGRAMME COMPONENTS: RESPONSE

Emergencies are those incidents which are beyond normal operational incident response capabilities. In an operational sense, regular incident prevention and mitigation are everyday occurrences, and operational incidents occur and are dealt with through well-developed standard operating protocols. Emergency response kicks in when this capability is exceeded and an extraordinary emergency response is required.

Structured into the conduct of every emergency response are well-developed and extensive sets of documentation forms and protocols. Emergency Management British Columbia (EMBC)8 has provided for a standardised provincial set of operational expectations to be met in addition to local, municipal and federal requirements. Whether emergency services are provided externally or internally, response plans require planning to ensure that they have adequate resources. Managing the current operational period, and planning for the next, requires the close attention of those directly involved in site-situated Incident Command Post and Emergency Operations Centre activities. Operational periods and role responsibilities switch from organisational to emergency management mode. This cannot occur effectively without training and exercise, no matter how well-read the individual. Prepared simulations and exercises pale beside actual, even small-scale, emergencies, as the reality of the necessity and consequences of in situ decision making becomes known, and experience replaces planning assumptions. In relative terms, this exhaustively documented component seems well measured.
Situation reports, incident briefings, incident objectives and organisation assignment lists, through to demobilisation checkout forms, provide ample sources of specific information contained in After Action Review report findings. However, measurement of response's actual effectiveness could better be its resiliency in the face of 'failure' upon reaching the end of sufficient response resources.

Understanding of the severity and frequency of the types of emergencies which the organisation faces is not best served by constant preparation for one large catastrophic event, which is why most organisations and governments have moved to an all-hazard approach. Using smaller emergencies and exercises or simulations to hone inter-workgroup, intra- and inter-organisational communications and liaison will serve the organisation better. Supporting active participation in emergency response activities at site, support or control locations involves and prepares more participants for actual response. Measurement of response requires a careful look at the types of emergencies faced and anticipated, and the comparative roles, experience, training and participation of those tasked with both formal and informal roles during an organisational emergency, to assess the degree of personal preparedness to which an individual subscribes. This can best be measured during the interview portion of the audit.

IDENTIFYING AND MEASURING EMERGENCY MANAGEMENT PROGRAMME COMPONENTS: RECOVERY

Minor emergencies can provide an organisation with opportunities to expand its recovery capabilities through an intentional scaling up of lessons learned. Extended or odd working hours caused by emergency management requirements, and elevated levels of risk and stress produced by an emergency environment, which are managed successfully, can provide opportunity for individuals to anticipate successful recovery.
BCERM's first three response goals are the key to identifying and confirming that recovery occurs to some extent at almost every emergency: are responders safe and healthy? Have lives been saved? Has potential suffering been reduced? If so, then recovery, even though it may be concurrent with an ongoing emergency, has begun. Establishing realistic and individually recognisable emergency goals allows the incremental holds or gains to be recognised and valued by those enmeshed within the emergency. This is key to resiliency. Preparing an individual with possible terms of success is key to promoting recovery. After ensuring the initial BCERM goals are met: is 'success' a graduated return to full operations? Is recovery a temporary but safe full shutdown of operations? Pre-definition of success ensures even small gains are recognised, and that the process of recovery is at least commencing.

Recovery entails multiple plan implementation, and is focused on ensuring the pace, timing and degree of recovery is optimised. Emergency Preparedness Programmes are mandated by federal10 and provincial legislation and regulation to ensure that they reflect locally applicable emergency needs, including plans for recovery. In essence, these layers of prescribed responsibilities ensure a suite of pre-planned activities, support and services for affected and involved people during and post-emergency. Emergency social services located at the community and municipal levels of emergency management programmes are readily identifiable as direct contributors to recovery during an emergency. Temporary in nature, they provide the basics of personal safety and security for those who come to their doors: literally, a safe haven for those who require it. Recovery to pre-emergency normalcy is rare, as the criticality of the event and permanent physical changes caused by its occurrence prevent 'going back in time' to before it occurred. This is true of any significant event.
However, as emergencies often have profound, negative, even deadly outcomes for some, the degree of individual and community recovery is most dependent upon its own definition of successful recovery. Lac-Mégantic will never 'fully recover', but it will continue to work to make those surviving members 'whole', and we are all sobered by the realisation of the impact that the tragic and deadly train derailment and petroleum product explosion had on its community.

EMERGENCY MANAGEMENT PROGRAMME STANDARD OF MANAGEMENT PRACTICES WITHIN ORGANISATIONAL CONTEXT14

Legislation and regulation may define externally mandated organisational requirements, but the emergency management standard of management practices must do much more than just restate these obligations.15 It must clearly provide the context in which the Emergency Management Programme and its plans exist, including organisational expectations for defined roles, emergency management accountabilities and required activity responsibilities. In this way, it sets the parameters under which all organisational emergency management activities fall. Measurement against its requirements will comprise two levels of accomplishment:

• Compliance activities mandated by legislation, regulation and organisational policy, including specific and required organisational programme and plan activities
• Programme and plan responsibilities required to support creation, implementation and documentation of the organisation, department, divisional or

GAP ANALYSIS AND PLAN IMPROVEMENT

The organisation's emergency management standard against which employee activity is to be measured must include clear, written requirements identifying measurable outcomes. It must detail by-role identification of accountabilities or responsibilities for defined emergency management activities, and their documentation, at the planning, preparation, training, exercising or activation, and evaluation stages.
Designated three-deep management personnel can then recognise and meet their Emergency Management Programme responsibilities by working through their required activities in the scheduled time frames, as they would in any management system structure. Where gaps exist in either activity (missing, incomplete or out-of-date conduct) or its documentation (accuracy, completeness or timely/systematic recording), the weighted importance of the component's element determines whether the gap is recorded as a recommendation requiring action to meet compliance requirements, or as an area of improvement identified to assist the department in meeting localised plan support requirements.
From the full suite of recommendations and areas for improvement, the organisation's emergency management coordinator creates an Executive Action Plan Report. It contains only the highest priorities for subsequent-period action and additionally required compliance activities, together with the highest priorities for departmental supporting activities affecting compliance. Department-based emergency management coordinators are able to use the full report's departmental activity details to identify their individual areas for improvement. In this way, adherence to the requirements of the emergency management standard can be identified, requirements given higher organisational priority for action item recommendations, and supporting activities flagged for local improvement.
The Executive Action Plan Report is presented to senior management for their review, discussion and endorsement. At this point, the recommendations for compliance with the corporate emergency management standard requirements comprise the set of activities for highest organisational focus during the next period. Progress on these highest-priority action items will then be reported at quarterly intervals through completion.
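The prioritisation step described above can be sketched in code. The following Python fragment is purely illustrative: the field names, weight scale (1-5) and threshold are hypothetical assumptions, not taken from the article, and serve only to show how weighted findings might be filtered into an Executive Action Plan Report.

```python
# Hypothetical sketch of the gap-prioritisation step: each finding is
# weighted by the assumed importance of the programme element it touches,
# and only high-weight compliance items reach the executive report.
from dataclasses import dataclass


@dataclass
class Finding:
    element: str   # programme element audited (invented example names)
    kind: str      # "compliance" or "improvement"
    weight: int    # assumed 1 (low) .. 5 (critical) importance


def executive_action_plan(findings, threshold=4):
    """Return compliance findings at or above the weight threshold,
    highest weight first; departmental items remain in the full report."""
    top = [f for f in findings
           if f.kind == "compliance" and f.weight >= threshold]
    return sorted(top, key=lambda f: f.weight, reverse=True)


findings = [
    Finding("Annual plan review", "compliance", 5),
    Finding("Exercise documentation", "improvement", 3),
    Finding("Manager EM training", "compliance", 4),
]
for f in executive_action_plan(findings):
    print(f.element, f.weight)
```

In this sketch, the "area of improvement" items stay in the full departmental report, mirroring the article's split between executive-level compliance actions and locally managed improvements.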
KEY CONCEPTS OF RATIONALISED AUDIT
Rather than work to introduce a complex, time-consuming and burdensome Emergency Management Programme audit tool, the writer proposes that consideration be given to the use of a limited number of questions around eight key leading management system performance indicators,16 in effect measuring the perceptions of those individuals tasked with the programme's implementation by asking them to define what is in place and how well it is designed, communicated and being implemented.
Many organisations already conduct regulatory-driven safety management system annual audits, and it may prove as effective to insert these eight key questions into existing annual audit processes18 instead of conducting Emergency Management Programme measurements separately. By coordinating efforts to measure how well the organisation manages prevention of incidents (safety and security) with a concurrent measurement of its efforts to manage the eventual or anticipated occurrence of serious, critical incidents, emergencies or events, the gains of both systems become apparent, consequently supporting both systems' continuous improvement more efficiently. Additional audit resources, time or effort will not be required, so the additional costs normally inherent for an organisation newly implementing a formal measurement approach for its emergency management system will not be incurred. Ongoing financial savings can be realised as the organisation resolves identified deficiencies in a systematic and prioritised way.
When supported organisational efforts can be rationally focused within the identified Emergency Management Programme areas requiring improvement, managers are able to recognise their identified deficiency areas and measure improvements: in a word, to manage their Emergency Management Programme responsibilities as they integrate these necessary activities into existing operational norms.
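The idea of folding a handful of emergency-management questions into an existing audit can be illustrated with a minimal sketch. The question identifiers (`em_q1`-`em_q8`) and the 1-5 Likert scale below are assumptions for illustration only; the article does not prescribe a scoring scheme.

```python
# Hypothetical sketch: eight emergency-management questions are appended
# to an existing safety audit, each answered on an assumed 1-5 Likert
# scale, and summarised as a separate sub-score so the emergency
# management system can be tracked alongside safety with no extra audit.


def subscore(responses, keys):
    """Mean 1-5 rating over the given question keys."""
    vals = [responses[k] for k in keys]
    return sum(vals) / len(vals)


EM_KEYS = [f"em_q{i}" for i in range(1, 9)]  # assumed question IDs

# Invented example responses: mostly 4s, with one weak area.
responses = {f"em_q{i}": 4 for i in range(1, 9)}
responses["em_q3"] = 2  # e.g. EM not yet valued as highly as safety

print(round(subscore(responses, EM_KEYS), 2))  # prints 3.75
```

Keeping the emergency-management items as a distinct sub-score, rather than blending them into the overall safety result, preserves the separate gap picture each system needs for its own improvement cycle.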
Much work has already been done by the Institute for Work and Health9 (IWH) in examining the reliability of a short questionnaire approach to system measurement, in comparison with a detailed examination of system existence and effectiveness through documentation review, field observation and representational interview (formal audit). The IWH reports the work that was done to design and conduct a confidential, system-based mechanism, with their objectives being to:
- Develop a short questionnaire to measure leading indicators of organisational occupational health and safety performance.
- Determine the face validity of questions for people who will need to administer the questions and those who will be using the information.
- Determine guidelines and recommendations to ensure data integrity.
- Define a clear process for administering the questionnaire and generating reports, including specifics of the report.
- Collect data to assess the reliability and validity of the survey tool.20
Their objectives parallel those of organisations with plans to measure Emergency Management Programme effectiveness, as only the first objective changes: develop a short questionnaire to measure leading indicators of organisational emergency management performance.
In 2009, 808 organisational questionnaires were administered by self-identified, IWH-selected, internal health and safety persons across nine different Health and Safety Associations. A total of 642 of the questionnaire results were compared directly with the firms' reported injury and illness rates. Organisational injury and illness rate data was then correlated with the organisation's questionnaire responses.
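The validation step the IWH performed, correlating per-firm questionnaire scores with injury and illness rates, can be sketched as follows. The data below is invented for illustration; the computation is a standard sample Pearson correlation, not the IWH's actual analysis.

```python
# Hypothetical sketch of the validation step: per-firm questionnaire
# means are correlated with per-firm injury/illness rates. A strong
# negative correlation would suggest the questionnaire behaves as a
# leading indicator. All data here is invented for illustration.
from statistics import mean, stdev


def pearson(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))


opm_scores   = [3.9, 4.2, 2.8, 3.1, 4.5]  # invented per-firm means
injury_rates = [2.1, 1.8, 4.0, 3.6, 1.2]  # invented per-firm rates

print(round(pearson(opm_scores, injury_rates), 2))
```

In this toy data, higher questionnaire scores accompany lower injury rates, so the coefficient comes out strongly negative; real validation would of course require the kind of firm-level dataset the IWH assembled.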
Results indicated that when the questions were used in their entirety, the short questionnaire was effective for measuring the organisation's 'Organisational Performance Metric (OPM) and was more accurate for both large and small firms when administered individually and in person'.21 This 'leading indicator' measurement approach is not restricted to single-province use, as it is currently being vetted in the local safety arena,22 as WorkSafeBC23 funds provincial research through the University of British Columbia's School of Population and Public Health to assess its use in private long-term care facilities.
The writer suggests that use of the Organisational Performance Metric's statements of eight areas of organisational performance would indeed provide effective programme measurement,24 and posits that these can be utilised as the basis of performance measurement for any management system, including emergency management.
USE OF ORGANISATIONAL PERFORMANCE METRICS
Using the OPM questionnaire's statements as a base, it is conceivable that measurement of an Emergency Management Programme's effectiveness may be possible by confirming organisational status against the following emergency management-focused statements:25
(1) Formal emergency management capability reviews are conducted at regular intervals as a normal part of our business practice.
(2) Everyone at this organisation values ongoing emergency management improvement.
(3) This organisation considers emergency management at least as important as safety, production and quality in the way that our work is done.
(4) Workers and supervisors have the information and training they need to be prepared for an emergency beyond normal operational incident response capabilities.
(5) Employees are always involved in decisions affecting their participation in emergency preparation, response and recovery.
(6) Those in charge of emergency management have the authority to make the changes they have identified as necessary.
(7) Those who act on their emergency management responsibilities receive positive recognition.
(8) Everyone has the tools, equipment and logistical support they need to be prepared for a workplace emergency beyond normal operational incident response.26
Inarguably, more work needs to be done before the above statements can be validated as effectively measuring or predicting an organisation's performance metric for emergency management. However, it is reasonable to posit that organisations committed to their Emergency Management Programme's effectiveness would take a step forward towards continuous improvement by looking at such defined measurement models. Gathering leading performance data by inserting similarly worded, Emergency Management Programme-focused questions into the annual conduct of their formal safety management system audit may be the effective first step. Validation of this conjoined approach would still require knowledgeable analysis of the correlation of responses with organisational Emergency Management Programme activities, for example:
- Observed participation in organisational exercises.
- Programme improvements reflecting ongoing application and understanding of emergency management principles.
- Systematic improvements of capabilities flowing from deficiencies identified and corrected from Emergency Response Plans' After Action Reports, and Plan Exercise and Event analyses.
CONCLUSION
Whether through formal (triple-perspective) audit processes or summative questionnaire interviews combined with subsequent activity and documentation review, measurement of existing Emergency Management Programme activities benefits the organisations supporting their existence. By identifying what is working well (meeting standardised performance targets, e.g.
management training completions), what is missing (gap identification, e.g. a process for conduct assignment for new managers) and what still needs to be prioritised for action (gap analysis, e.g. movement from a specific-hazard to an all-hazard approach), assessment of the existing Emergency Management Programme allows for more intelligent assignment and management of resources to prepare for emergencies and to prioritise and measure programme improvements.
In both formal audit and questionnaire approaches, effective measurement is dependent upon prior establishment of the standards of expectations for the purpose, scope, roles and responsibilities, training and activity requirements for identified individuals or positions. An emergency management standard which clearly establishes these measurable components allows for transparent and sustainable improvements in organisational preparedness in direct relation to its stated objectives. Measuring current activities and their documentation, or perceptions of current Emergency Management Programme attributes, against stated standard expectations allows for clearer understanding of achievements and outstanding requirements. Regularised measurement and prioritisation of resource allocation supports a cycle of continuous Emergency Management Programme improvement.
REFERENCES AND NOTES
(1) Justice Institute of British Columbia, Emergency Management Certificate Program, Emergency Management Planning, Module 1, p. 7. Available at: http://www.jibc.ca/programs-courses/schools-departments/school-public-safety/emergency-management-division/academic-programs/emergency-management-certificate (accessed 17th August, 2016)
(2) See Appendix A, Determine Representative Interviews for Stakeholder Participation.
(3) Ibid.
(4) Justice Institute of British Columbia, Emergency Management Certificate Program, Emergency Management Planning, Module 1, p. 16. Available at:
