These assessment tables are designed to organize and consolidate the information and observations collected during an SMS assessment. The tables can be detached from the guidebook and used during the assessment. They are based on OHSAS 18002 and the Canadian CSA Standard Z, adapted using ICAO Doc 9859 (2006) guidelines, and reviewed to address FAA AC 150/5200-37 (2007).

B.1 How to Use

The worksheets are a series of tables that contain all of the expectations associated with each SMS pillar and element. There is space for the assessment team to record information references, scores, and observations (justification for the given score). The tables shown in Annex A (Tables 32 through 36) are organized as follows:

• The first column contains SMS expectations;
• The second column is for references, i.e., the source of observations or information collected during the document review or interviews;
• The third column is for the score (from 0 through 5) assigned to each expectation; and
• The fourth column is for the observations/information collected by the assessment team that justify the assigned score.

References

During the assessment, team members should collect as much information as possible to reference their observations, including the name, position, and department of the person being interviewed; the observation location and time; and the document title, publication date, and reference number. If the expectations worksheets are going to be part of the final deliverable to the client, complete references may be omitted from the final version to protect the privacy of individuals. However, this information should be available to the assessment team during the scoring process and for future reference, if required.

Scoring

Scoring should be conducted by all members of the assessment team according to the methodology outlined in Section 6.6.
Pillar and element scoring should be done after all the team members' observations have been recorded on the worksheet (Annex C, Table 37).

Observations

Members of the assessment team should transfer their observations from their notebooks to the worksheets and score each expectation. There need only be one working copy of this
document, which is passed among team members. This may be done at the end of each day of the site visit.

B.2 SMS Scoring Methodology

Once all of the observations have been recorded, the team should score the SMS elements as a group. The team should reach consensus on the score assigned to each element; in the event of a disagreement, the Team Leader makes the final decision. The following criteria should be used.

Step 1: Score Expectations

• Expectations are scored first.
• They are given a score of "MEETS Expectation" or "BELOW Expectation" based on the information collected during the assessment. This is done to remove subjectivity. The score may have one or more comments associated with it that provide justification and context.
• Expectations may be scored by individual team members. Another team member may override the initial score if he/she can provide the information required to justify a change.
• Team members should come to a consensus on the score assigned to each expectation.

Step 2: Score Elements

• Sub-elements and elements are scored next.
• The assigned score is not based on a mathematical average of the expectation scores; however, expectation scores serve as a guide for pillar and element scoring.
• Sub-elements and elements are given a score of 0 through 4 as follows:
  – "0" is given when none of the expectations under the element are met (comments/justification required);
  – "1" is given when some of the expectations under the element are met (comments/justification required);
  – "2" is given when all expectations under the element are met (no comment required);
  – "3" is given when all the expectations under the element are met or exceeded; it may be assigned if the assessment team believes that the organization has done an exceptional job and deserves extra mention (comments/justification required); and
  – "4" is given in the event that the organization exhibits best practice for this element (rare, extremely subjective, and may only be assigned by an SMS expert with particular industry experience; comments/justification required).
• Combine the sub-element scores to assign element scores, as required. Again, the score is not based on a mathematical average; the sub-element scores serve as a guide for the element scores.
• Elements can only be assigned whole numbers; no decimals, please!
• All team members should agree on the element scores before assigning pillar scores.
• Element scores should be recorded on the scoring table presented in Annex C.

Step 3: Score Pillars

• Pillars are scored last, following a similar process to element scoring.
• The assigned score is not based on a mathematical average of the element scores. Element scoring serves as a guide for pillar scoring.
• Pillars are given a score of 0 through 4, following the same criteria used for the sub-element scores.
• Pillars can only be assigned whole numbers.
• All team members should agree on the final pillar scores.
• Pillar scores should be recorded on the scoring table presented in Annex C.
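The baseline part of the Step 2 rubric (scores 0 through 2) follows mechanically from whether the expectations under an element are met, while scores of 3 and 4 require team judgment and written justification. The sketch below is a hypothetical illustration only; the function name and the representation of expectation results as booleans are invented for this example and are not part of the guidebook's methodology:

```python
def suggested_element_score(expectations_met):
    """Suggest a baseline element score from expectation results.

    expectations_met: list of booleans, one per expectation under the
    element (True = "MEETS Expectation", False = "BELOW Expectation").

    Returns 0 (none met), 1 (some met), or 2 (all met). Scores of 3
    (exceptional) and 4 (best practice) are never suggested here:
    they require the assessment team's judgment.
    """
    if not expectations_met or not any(expectations_met):
        return 0  # none of the expectations are met
    if all(expectations_met):
        return 2  # all expectations are met
    return 1      # some, but not all, are met
```

A team might use such a helper only as a starting point; the final element scores, and later the pillar scores, remain a consensus decision recorded on the Annex C scoring table.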