
Animal-Vehicle Collision Data Collection (2007)

Chapter Three: Survey

Suggested Citation: "Chapter Three - Survey." National Academies of Sciences, Engineering, and Medicine. 2007. Animal-Vehicle Collision Data Collection. Washington, DC: The National Academies Press. doi: 10.17226/23138.

This chapter contains the methodology and results for the animal–vehicle collision and animal carcass data survey. [See the introduction (chapter one) for the definitions of AVC and AC data.]

METHODS

Survey Questions and Design

The survey consisted of three sections: (1) an introductory letter including several introductory questions, (2) AVC data questions, and (3) AC data questions. The full survey forms are included as Appendix B. If the DOT or DNR concerned did not collect AVC or AC data, the respondent filled out only the introductory questions. If the DOT or DNR concerned did collect AVC and/or AC data, the respondent was asked to complete the remaining section(s) of the survey (AVC and/or AC questions) as well.

The questions covered a wide range of topics related to AVC and AC data, starting with the reasons the DOT or DNR concerned did or did not collect these data and which road types and/or geographical areas were included. Other key sections of the survey focused on the parameters recorded and potential reporting thresholds; potential training and instruction for data collectors; data analyses and data sharing; and potential obstacles to implementing, advancing, or improving data collection and analyses. Finally, the respondents were asked to send in examples of data sheets used for the collection of AVC and AC data (Appendixes C and D).

The Topic Panel members requested that at least two key individuals be approached for each state or province: a representative of the DOT (with a focus on public safety) and a representative of the DNR (with a focus on natural resource conservation).

Interviewees and Response Method

The survey was sent to the official TRB representative for the DOT in each state and province (Table 1). In addition, the survey was sent to a known specialist at the DOT in each state and province, and to additional specialists at DOTs in selected states or provinces.
The survey was also sent to a known specialist at the DNR in each state and province, and to additional specialists at DNRs (Table 1). For DOTs and DNRs combined, the total was 247 contacts. The above-mentioned contacts occasionally forwarded the survey to others within their organization if they believed these individuals would be more knowledgeable with regard to the subject. The number of people who were forwarded the survey could not be tracked.

Apart from the list of the official TRB representatives for each state and province, the following sources were used to select potential contacts in each state or province: (1) the panel members' networks, (2) the Western Transportation Institute–Montana State University (WTI–MSU) network, and (3) suggestions from individuals at the state or provincial DOTs and DNRs.

The survey was posted on a website, and the interviewees were encouraged to fill out the survey on this website. The survey was also available in MS Word (with check boxes and drop-down menus) and PDF formats that could be returned by e-mail, fax, or mail. TRB sent the survey to the interviewees on March 6, 2006, with reminders sent on March 15, March 27, and April 3; the website was closed for responses on April 5, 2006. The Institutional Review Board for the Protection of Human Subjects at Montana State University declared that the questionnaire was exempt from review in accordance with the Code of Federal Regulations, Part 46, Section 101(b)(3) on February 9, 2006.

Crash Forms

In addition to the survey, and in addition to the AVC and AC forms that the interviewees forwarded in response to the survey, the crash forms posted on the website of the National Center for Statistics and Analysis of the NHTSA ("Crash Forms" 2006) for all 50 states were reviewed. The review focused on the following topics:

• Are animal–vehicle collisions recorded?
• Do the forms differentiate between wild and domestic species?
• Do the forms allow for the entry of the species name of the animal that is involved in a collision?
• Are there reporting thresholds (e.g., $1,000 in vehicle damage, a human injury, or a vehicle towed)?
• How is the location of the accident described [e.g., use of coordinates (GPS or map) and distance to the nearest landmark]?

The data for the 50 states ("Crash Forms" 2006) were supplemented with accident report forms from two provinces (British Columbia and Northwest Territories) (Appendix C) and the four responses from other Canadian provinces (Alberta, Manitoba, Newfoundland, and Nova Scotia) to the applicable portions of the survey.

Data Analysis

In some cases there was more than one respondent for an individual DOT or DNR. In such instances, the answers for these respondents were combined into one response, which resulted in a maximum of two responses for each state or province: one for a DOT and one for a DNR.

The responses were summarized by calculating the number and/or percentage of respondents that selected the different options or categories. The percentages were calculated as the number of responses in each category divided by the total number of respondents to that question. For these calculations, the maximum number of respondents was two for each state or province (one for the DOT and one for the DNR). In the text, percentages refer only to the respondents and responses relevant to specific questions. For example, there were 25 DOT respondents to the AVC survey. If 15 marked "yes" to a question, 8 marked "no," and 2 did not respond, the percentage "yes" is 65% (15/23) and the percentage "no" is 35% (8/23). Thus, it is important to realize that the percentages for different questions are based on different totals if the number of respondents differed. Finally, several questions permitted multiple responses, in which case the sum of the percentages in the categories could add up to more than 100%.

In certain cases, chi-square tests were run to determine whether responses differed by agency type (DOT vs. DNR) or nation (United States vs. Canada). In this synthesis report the term "significant" was reserved for P-values ≤0.05.
These statistical tests were conducted only when the expected sample sizes in each cell were ≥5, because chi-square tests with expected frequencies <5 generate unreliable results.

Data Summary Tables

The summary tables of the responses are included in the appendixes (Appendixes E, F, and G). The percentages in the summary tables are calculated differently than in the text: they are based on the number of agencies that responded to the survey as a whole, so that nonresponse to certain questions could be assessed. Using the previous example, with the 25 DOTs responding to the AVC survey, 15 answering "yes" to a question, 8 answering "no," and 2 not responding, in the summary tables these percentages appear as "yes" = 60% (15/25), "no" = 32% (8/25), and no response = 8% (2/25).

RESULTS

Respondents

For DOTs and DNRs combined the response rate was 88.9% (56 of 63 states and provinces) (Table 2). DOTs (63%) had a slightly higher response rate than DNRs (57%) (Table 2, Figures 1 and 2); therefore, DOTs and DNRs were similarly represented in the responses to the survey. (Note: some agencies did not answer all the questions, or all parts of one question, causing variable sample sizes within and between individual questions.) The response rate for the AVC portion was higher than for the AC portion of the survey (Table 2). Note that DOTs and DNRs responded to these portions of the survey only if they actually collected AVC or AC data.

Data Types (Introduction Survey and Crash Forms)

Based on the responses to the introductory questions from the survey, AVC data are collected or managed by more DOTs than DNRs (Figure 3). AC data are collected or managed by more responding DNRs than DOTs (Figure 3).
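The two percentage conventions described under Data Analysis and Data Summary Tables above can be sketched in a few lines. This is an illustrative sketch only; the counts are the worked example from those sections (25 DOT respondents to the AVC survey, 15 "yes," 8 "no," 2 with no response):

```python
# Illustrative sketch of the report's two percentage conventions.
responses = {"yes": 15, "no": 8, "no response": 2}
total_agencies = sum(responses.values())              # 25: all AVC-survey DOTs
answered = total_agencies - responses["no response"]  # 23: answered this question

# Text convention: denominator = respondents to this question only.
text_pct = {k: round(100 * v / answered)
            for k, v in responses.items() if k != "no response"}

# Summary-table convention: denominator = all agencies that took the survey,
# so nonresponse appears as its own category.
table_pct = {k: round(100 * v / total_agencies) for k, v in responses.items()}

print(text_pct)   # {'yes': 65, 'no': 35}
print(table_pct)  # {'yes': 60, 'no': 32, 'no response': 8}
```

This makes the difference concrete: the same 15 "yes" answers read as 65% in the text but 60% in the summary tables.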
TABLE 1 NUMBER AND TYPE OF INDIVIDUALS APPROACHED FOR SURVEY

Individuals Approached for Survey                          United States  Canada  Total
TRB representatives for DOT (one per state or province)         50          13      63
Known specialist for DOT (one per state or province)            50          13      63
Additional representatives for DOT                              43           7      50
Subtotal                                                       143          33     176
Known specialist for DNR (one per state or province)            50          13      63
Additional specialists for DNR                                   8           0       8
Subtotal                                                        58          13      71
Total                                                          201          46     247

TABLE 2 NUMBER OF STATES AND PROVINCES RESPONDING TO EACH SURVEY

Responding States and Provinces                            United States  Canada  Total
Response to some portion of AVC or AC survey (DOT or DNR)       43          13      56
Response to some portion of AVC or AC survey (DOT)              30          10      40
Response to some portion of AVC or AC survey (DNR)              30           6      36
Response to some portion of AVC survey (DOT or DNR)             25           8      33
Response to some portion of AVC survey (DOT)                    21           4      25
Response to some portion of AVC survey (DNR)                    10           1      11
Response to some portion of AC survey (DOT)                      …           …       …
Response to some portion of AC survey (DNR)                      …           …       …
Response to some portion of AC survey (DOT or DNR)              13           3      16

Based on a review of the crash forms, all responding states and provinces except one state record animal–vehicle collisions as at least a checkbox or code on the crash form.

FIGURE 1 Respondents to surveys by nation and agency (US DOT: 30; US DNR: 30; CAN DOT: 10; CAN DNR: 6).

FIGURE 2 Study area and respondents by state and province.

Absence of Animal–Vehicle Collision and Animal Carcass Data Collection Programs (Introduction Survey)

This section relates only to the DOTs and DNRs that reported that they do not collect AVC or AC data. Results from agencies that collect either AVC or AC data, or both, were excluded from this section.

For DOTs, the most common reasons for not collecting AVC or AC data were, equally, that they were not interested (n = 4; 29%) or that "someone else" collects such data (n = 4; 29%), with two responses each for the expense and the time involved, and "other" responses including "no demonstrated problem" and "AC pick-ups might be logged by road foremen but no one collects that data." Responses by DNRs differed somewhat. The most common reason DNRs do not collect AVC or AC data is that "someone else" collects such data

(n = 8; 53%), followed by the expense (n = 4; 27%) and the amount of time associated with data collection (n = 2; 13%).

FIGURE 3 Number of agencies from the United States and Canada that collect AVC and/or AC data.

DOT respondents had varying opinions on whether, in their professional opinion, their agency should begin to collect AVC or AC data. Of the eight respondents, three (38%) answered "yes," two (25%) answered "no," and three (38%) were undecided. Most of the DNR respondents (n = 8; 80%) believed that, in their professional opinion, their agency should not begin to collect AVC or AC data.

Next, the agencies were asked what changes would need to be made before their agency would begin collecting AVC or AC data. Most DOTs (n = 7; 39%) responded that a need had to be demonstrated first. Other changes included more funding (n = 4; 22%), better training (n = 3; 17%), and more personnel (n = 2; 11%). One DOT indicated that the development of a mechanism for field data entry would be required before their department would begin collecting AC or AVC data. Most of the responding DNRs (n = 8; 40%) also stated that a demonstrated need would be required. Other required changes included more funding (n = 5; 25%) and more personnel (n = 4; 20%).

AVC Survey

The AVC survey form can be found in Appendix B, with the summary data contained in Appendix F.

Rationale for AVC Data Collection and Roads and/or Areas Included (AVC Section 1)

Agencies were asked why they collect or manage AVC data by ranking reasons in order of importance, with 1 being most important. Most DOTs indicated that public safety was the primary reason for collecting AVC data (n = 20; 83%), with wildlife management or conservation the number two reason (n = 11; 58%) and accounting the third (n = 8; 57%) (Figure 4).
Other reasons given were that it is a legal requirement to report AVCs that result in property damage of $1,000 or greater (n = 2; Manitoba and South Dakota), and that the data allow for the identification of high-collision areas so that warning signs can be put in place (n = 2; Alberta and New Hampshire), which is closely linked to public safety as well. DNR respondents were almost equally divided between public safety and wildlife management/conservation as the primary reasons they collect or manage AVC data, with accounting the next most important reason (Figure 4). Other reasons why DNRs collect or manage AVC data included tracking diseases such as chronic wasting disease and rabies (n = 2).

On average, DNRs have collected AVC data for longer than DOTs: the average DOT has collected for 20.9 years (95% C.I. = 15.49, 26.40; n = 18), as compared with an average of 31.4 years for DNRs (95% C.I. = 20.91, 41.95; n = 7). However, this difference was not significant when tested with a two-sided, two-sample t-test (t = 1.734, P = 0.115). The Ohio and Nebraska DNRs have recorded AVC data since the 1950s, the longest recording period of all respondents. Note that some answers, including "many years ago" and "for ever," were unquantifiable and could not be used in the calculations.

Similar percentages of responding DOTs and DNRs reported that collection of AVC data was mandatory (n = 18; 75% and n = 6; 67%; P = 0.986).

Of the 25 responding DOTs, 24 (96%) collect data on Interstates, 24 (96%) collect data on arterial roads, 19 (76%) collect data on collector roads, and 13 (52%) collect data on

local roads. One of these DOTs collects data on Interstates only, and the Northwest Territories DOT collects data on all roads except Interstates because it has none. All 10 DNRs that responded to the question collect data on Interstates and arterial roads; 6 (60%) also collect data on collector roads, and 8 (80%) also collect data on local roads.

The geographic limits of the reporting area for DOTs included all roads in the state or province (n = 10; 43%), all state or federal lands (n = 7; 30%), and all public lands in their state or province (n = 4; 17%). The Alaska DOT reports on all areas where state police crash reports are completed, and the Manitoba DOT reports on all areas under provincial jurisdiction, excluding municipal roads. The geographic limits of the reporting area for the 11 responding DNRs contained all areas in the state or province (n = 5; 45%) or all state and/or federal lands (n = 3; 27%). Two respondents report on all roads on public lands in the state or province, and one reports on all areas with certain exceptions, such as military bases, certain federal lands, forest access roads, and tribal lands.

Overwhelmingly, agencies responded that the landscape surrounding the areas where they collect AVC data is both rural and urban (n = 32; 94%), with only the New Hampshire and Vermont DOTs indicating that the landscape is predominantly rural.

When asked what other organizations or individuals collect AVC data on the road systems that are covered, most agencies indicated that some branch of law enforcement is involved (n = 13). Other responses included other governmental branches (i.e., city or county; n = 4) and private organizations or individuals (i.e., nongovernmental organizations, interested members of the public; n = 4).
Correspondingly, when asked what other organizations or individuals collect AVC data on the road systems that are not covered, the agencies indicated that no organizations or individuals collect AVC data in these areas (n = 5) or that another government agency (i.e., city or county) was in charge of these data (n = 5).

AVC Parameters Recorded and Reporting Thresholds (AVC Section 2, Crash Forms)

Respondents were asked, "What organization(s) does the actual animal–vehicle data collection on the ground?" Multiple agencies collect AVC data; however, the Highway Patrol or other law enforcement agencies were selected most frequently, with 25 responses (45%) indicating their participation. DOTs and DNRs were roughly equal, with 13 and 11 responses (24% and 20%), respectively. Other answers included local contractors and private citizens. Data are often reported to DOTs and DNRs by drivers (n = 25; 48%) or by other agencies (n = 17; 34%). Other responses included local law enforcement (n = 6; 12%) and interested individuals (n = 2; 4%).

Based on the survey responses, most DOTs have reporting thresholds for AVCs (n = 16; 64%), whereas only a few DNRs do (n = 4; 33%). This difference was significant (P = 0.040). These thresholds generally involve a combination of human injury, property damage, and involvement of a certain species. Twelve respondents indicated that property damage generally needs to be in excess of $1,000 U.S. or Canadian, whereas two respondents noted that in excess of $500 in property damage would be required to report the collision, and one respondent stated that any amount of "reportable vehicle damage" would be sufficient to record the collision, although it was unclear what that threshold was. Nine DOTs and DNRs indicated that their thresholds depend on what animal species or groups of species were involved in the collision (e.g., deer, bear, and moose).
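The chi-square procedure described in the Data Analysis section can be sketched for a comparison like the reporting-threshold result above. The 2×2 counts below are taken from the text (DOTs: 16 of 25 with thresholds; DNRs: 4 of 12); the report does not state its exact test settings (e.g., whether a continuity correction was applied), so the p-value here is illustrative and need not match the reported P = 0.040:

```python
import math

# Sketch: 2x2 chi-square test of agency type vs. yes/no response, run only
# when every expected cell count is >= 5, as the methods section requires.
observed = [[16, 9],   # DOT: threshold, no threshold
            [4, 8]]    # DNR: threshold, no threshold

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Expected counts under independence: (row total * column total) / grand total.
expected = [[r * c / n for c in col_totals] for r in row_totals]
assert min(min(row) for row in expected) >= 5, "expected frequency < 5: unreliable"

chi2 = sum((observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
           for i in range(2) for j in range(2))

# For 1 degree of freedom, the chi-square survival function reduces to erfc.
p_value = math.erfc(math.sqrt(chi2 / 2))
print(f"chi2 = {chi2:.3f}, p = {p_value:.3f}")
```

The built-in expected-frequency guard mirrors the report's rule that tests are run only when all expected cell counts are ≥5.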
FIGURE 4 Ranked reasons why DOTs and DNRs collect AVC data.

Based on a review of the crash forms, all 50 states and 5 of the 6 responding provinces have thresholds under which vehicle collisions are not recorded (Figure 5). The most common threshold is a minimum estimated damage of $1,000 (22 states and 4 provinces), although many states have damage thresholds in the range of $500–$750 (19 states). Four states have reporting thresholds under $500, and two states (Alaska and Delaware) have reporting thresholds of more than

$1,000. Alberta, Connecticut, Maryland, and Texas have nonmonetary thresholds, including all reported crashes or crashes where the vehicle is towed. Note that five states will report collisions with less damage than the threshold if there is a human injury or fatality involved.

DOTs and DNRs described the search and reporting efforts as both "incidental" (DOT: n = 6, 29%; DNR: n = 3, 25%) and "monitoring" (DOT: n = 8, 38%; DNR: n = 5, 42%), with P = 0.838. Ten of the 11 "other" respondents clarified their answers by noting the importance of collision reporting in the data and how the AVC data may underestimate the true number of collisions. DOTs and DNRs (n = 11; 37%) stated that surveys or checks for AVCs largely occur as these collisions are reported or seen, whereas seven respondents (23%) indicated that checks occur daily (4 DOTs and 3 DNRs), 4 (13%) indicated that they occur weekly (3 DOTs and 1 DNR), 1 DNR checks for AVCs monthly, and 2 DOTs check annually. "Other" responses from DOTs included a review of countywide routes every 2–3 years and checks at lower frequencies for lower-classification highways.

DOTs and DNRs were asked which parameters they record as part of AVC reporting. Nineteen DOTs responded to all or parts of the question. Most of the responding DOTs always record the date (n = 19; 100%), time (n = 13; 76%), district or unit (n = 15; 79%), name of the observer (n = 12; 71%), road or route identification (ID) (n = 18; 95%), collision location (n = 14; 78%), occurrence of human fatalities (n = 14; 82%), human injuries (n = 12; 71%), and property damage (n = 12; 71%) (Table 3). Most DOTs never record the type of human injuries (n = 7; 47%), the sex (n = 9; 53%) or age (n = 11; 65%) of the animal concerned, or whether or not the animal carcass was removed (n = 9; 53%). Some DOTs always record the amount of property damage (n = 6; 38%), whereas others never do so (n = 5; 33%).
The same applies to the species of the animal (seven DOTs always record the species name, five usually, and three sometimes). DNR responses differed from DOT responses primarily in that the majority of DOT responses were either "always" or "never," whereas DNR responses also included the other categories (usually, sometimes, rarely; see Table 3). Interestingly, most DNRs (n = 7; 78%) always record the species name and always or usually include the sex (n = 6; 67%) of the animal involved.

FIGURE 5 Minimum reporting threshold for a collision based on a review of the crash forms (United States, British Columbia, and Northwest Territories) and the survey responses (Canada). No information was available for the provinces shown without shading.

Based on a review of the crash forms, the most common method of documenting AVCs is a checkbox or a code for the object of collision referring to "animal" only (19 states and 1 province) (Figure 6). In these cases, if a species name is to be recorded, it has to be entered in the crash narrative or the comments at the discretion of the recording official, and the information may not be accessible in the final crash database. The next most common method of entering AVCs is a checkbox or a code for "deer" and a checkbox or a code for "animal other than deer" (12 states). Eight states and two

FIGURE 6 How AVCs are indicated on crash forms. Provinces or states without shading did not collect AVC data on crash forms or represent states and provinces with missing data.

TABLE 3 ANIMAL–VEHICLE COLLISION PARAMETERS RECORDED BY DNRs AND DOTs (all in percentages)

                                           DNR                          DOT
Recorded Parameters            Alw  Usu  Som  Rar  Nev  NR   Alw  Usu  Som  Rar  Nev  NR
Date                            38   23    8    0    0  31    76    0    0    0    0  24
Time                            23    8   15   15    8  31    52    8    4    0    4  32
District/unit                   38   15    8    0    0  38    60    8    0    4    4  24
Name of observer                31   23    8    8    0  31    48    8    0    8    4  32
Road/route identification       31   15   15    0    0  38    72    4    0    0    0  24
Collision location              23   38    8    0    8  23    56   12    0    0    4  28
Human fatalities                38    8    8    0    8  38    56    0    0    0   12  32
Human injuries                  31    8   15    0    8  38    48    4    4    0   12  32
Type of injury                   8   23    0   15   15  38    24    0    4    4   28  40
Property damage                 15    8   15    8   15  38    48    8    0    0   12  32
Amount ($) of property damage    8    8   15    8   23  38    24    8    4    8   20  36
Species of animal               54   15    0    0    0  31    28   20   12    0    8  32
Sex of animal                   23   23    8    8    8  31     8    0   16    8   36  32
Age of animal                   15   15   15    8   15  31     4    0   12    8   44  32
Removal of animal               31   15   15    0    0  38    16    0    8    8   36  32

Column headings: Alw = Always; Usu = Usually; Som = Sometimes; Rar = Rarely; Nev = Never; NR = No Response. In the original table, shading marks the category with the most frequent response.

provinces allow multiple choices for wild species and/or domestic species. These states use checkboxes with species involved in collisions (e.g., Nevada has checkboxes for dog/coyote, burro, cattle, horse, deer, bear, antelope, bighorn sheep, elk, and other animal). Kansas has similar codes (deer, other wild animal, cow, horse, other domestic animal), but also allows the species name to be written in a space. Six states have checkboxes only for "wild animal" and "domestic animal," with no space for specific comments unless the officer records that type of information in the crash narrative. Four states and three provinces use checkboxes for "animal" adjacent to a line where the species of animal can be written.

AVC Location Recording and Spatial Resolution (AVC Section 2—Continued)

Based on the survey responses, most DOTs (n = 11; 58%) always use reference posts (miles or kilometers) to identify the location of a collision (Table 4). Most DOTs never use a GPS (n = 11; 69%) or map (n = 7; 44%) to record the location of the AVC. Some DOTs always use road sections to record the location of the AVC (n = 7; 39%), whereas others never do so (n = 4; 22%). The methods used by DNRs are more variable, with one DNR reporting collision data by house number or road intersection.

The precision of the spatial location of the AVC data is variable for both DOTs and DNRs. For most respondents the location is rarely or never within 1 yard or meter (DOTs: n = 10, 77%; DNRs: n = 6, 86%), 15 yards or meters (DOTs: n = 8, 67%; DNRs: n = 5, 83%), or 30 yards or meters (DOTs: n = 7, 58%; DNRs: n = 4, 57%). The AVC data from DOTs are always or usually accurate to 0.1 mile or kilometer (n = 13; 68%) or 1 mile or kilometer (n = 6; 50%), whereas the data from DNRs are rarely or never accurate to 0.1 mile or kilometer (n = 4; 58%). However, the data from DNRs are always or usually accurate to 1 mile or kilometer (n = 5; 63%).
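The spatial-resolution picture above can be made concrete with a small sketch: a collision logged against a reference post or road section is known only to within roughly half the post spacing or section length. All type and field names below are invented for illustration, as is the half-spacing uncertainty rule; none of this comes from the report:

```python
# Hypothetical sketch: normalizing milepost- and section-based AVC locations
# into one record type with an explicit worst-case uncertainty.
from dataclasses import dataclass

@dataclass
class AVCLocation:
    route_id: str
    milepoint: float        # position along the route, in miles
    uncertainty_mi: float   # worst-case location error, in miles

def from_reference_post(route_id: str, post_mile: float,
                        spacing_mi: float) -> AVCLocation:
    """A collision logged 'at the nearest post' is known only to within
    half the post spacing on either side."""
    return AVCLocation(route_id, post_mile, spacing_mi / 2)

def from_road_section(route_id: str, start_mi: float,
                      end_mi: float) -> AVCLocation:
    """A collision logged by road section is placed at the section midpoint,
    with half the section length as the worst-case error."""
    return AVCLocation(route_id, (start_mi + end_mi) / 2,
                       (end_mi - start_mi) / 2)

# Posts 1 mile apart (the most common DOT spacing in the survey) give at
# best +/- 0.5 mile; posts 0.1 mile apart give +/- 0.05 mile.
coarse = from_reference_post("US-89", post_mile=12.0, spacing_mi=1.0)
fine = from_reference_post("US-89", post_mile=12.3, spacing_mi=0.1)
print(coarse.uncertainty_mi, fine.uncertainty_mi)  # 0.5 0.05
```

The design choice is simply to carry the uncertainty with each record, so that data collected at different resolutions can be compared or filtered honestly.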
One DNR always reports locations within 1 yard or meter, whereas one DOT usually and two DOTs sometimes report locations at this resolution. One DNR sometimes reports locations within 15 yards or meters, whereas the Mississippi DOT always reports collisions at this resolution, and the Iowa, Kansas, and Minnesota DOTs sometimes do. The Connecticut DNR usually and the Rhode Island and Vermont DNRs sometimes report collision data to 30 yards or meters, and the Kansas DOT usually and the Colorado, Iowa, Maryland, and Minnesota DOTs sometimes report collisions at this resolution. Four DNRs noted that location resolution is variable, depending on the survey route and what references are available.

For DOTs, the reference posts (miles or kilometers) used in describing animal–vehicle collision locations were mostly 1 mile or 1 kilometer apart (n = 7; 44%), whereas only one DNR uses reference posts at this distance. Two DNRs and two DOTs use reference posts 0.1 mile apart. Two DOTs have reference posts 0.2 mile apart, and one DOT reports reference posts that are 500 ft apart. One DOT and one DNR use references based on roadway or geographic features, causing variable spatial resolution. Another DNR reports that major routes have reference posts every 2, 4, or 5 km, whereas minor routes have no reference posts. One DOT uses reference posts 2 km apart.

Based on a review of the crash forms, the most common method of locating a collision is based on distance from a roadway feature, such as an intersecting road, bridge, mile post, or other reference post (29 states and 4 provinces) (Figure 7). Twenty states record latitude and longitude or another coordinate-based system. We cross-checked the information from the crash forms, the instruction manuals accompanying the crash forms (if provided), and the survey data gathered to determine whether these coordinate locations are based on map coordinates or GPS.
We found that 14 states do use GPS units when available. Note that many of these states do not require the use of a GPS and that several states and provinces use maps to derive the coordinates of crash locations.

TABLE 4 HOW ANIMAL–VEHICLE COLLISION LOCATION DATA ARE REPORTED BY DNRs AND DOTs (all in percentages)

                                 DNR                          DOT
Recorded Parameters   Alw  Usu  Som  Rar  Nev  NR   Alw  Usu  Som  Rar  Nev  NR
GPS coordinates         0    8   15    8   23  46     4    0    4   12   44  36
Map coordinates        15    8   23    8   15  31     4    8   24    0   28  36
Miles/kilometers post   0    8   31    0   15  46    44   16    8    4    4  24
Road section            0   23   23    0    8  46    28   24    4    0   16  28
Other                   0    8    0    0    0  92     0    0    0    0   16  84

Column headings: Alw = Always; Usu = Usually; Som = Sometimes; Rar = Rarely; Nev = Never; NR = No Response. Note: In the original table, shading marks the category with the most frequent response.

Species and Species Groups Recorded for AVCs (AVC Section 2—Continued)

Amphibians are generally never recorded by DOTs and DNRs (Table 5). However, two DOTs do record amphibians
to the species (Vermont and Northwest Territories). The Kansas DOT records amphibians as “other wild animal.” The Vermont DNR records amphibians to “order.” In all, two DOTs and one DNR noted that they record all amphibian groups, endangered and otherwise (Vermont DOT and DNR and Northwest Territories DOT).

Reptiles are generally never identified by DOTs and DNRs (see Table 5). However, two DOTs record reptiles to genus (Mississippi and Northwest Territories), and the Vermont DNR records reptiles to the order. The Vermont DOT records endangered reptiles only, whereas the Northwest Territories DOT records all reptile groups.

FIGURE 7 Location system used by each state or province based on a review of the crash forms. If it was uncertain whether GPS or maps were used to derive coordinates for location, the state was assigned to the category for map coordinates. Unshaded states, provinces, and territories did not have information available.

TABLE 5
SPECIES GROUPS RECORDED BY DNRs AND DOTs IN ANIMAL–VEHICLE COLLISION DATA COLLECTION PROGRAMS (all in percentages)

                                      DNR                                            DOT
Recorded Parameters  Spe  Gen  Fam  Ord  Cla  Nev  Oth  NoR      Spe  Gen  Fam  Ord  Cla  Nev  Oth  NoR
Amphibians             0    0    0    8    0   62   15   15        8    0    0    0    0   52   12   28
Reptiles               0    0    0    8    0   46   23   23        0    8    0    0    0   56    4   32
Birds                 15    0    0    8    0   31   23   23        4   12    0    8    8   20   20   28
Large wild mammals    69    8    0    8    0    0   15    0       12   44    0    0    0    4   12   28
Small wild mammals    31    0    8    0    0    8   23   31        8   12    4    8    0   28    8   32
Domestic animals      15    X    X    X    0   23   38   23       40    X    X    X    0   12   20   28

Notes: Spe = Species; Gen = Genus; Fam = Family; Ord = Order; Cla = Class; Nev = Never; Oth = Other; NoR = No Response. Shaded areas in the original mark the category with the most frequent response. X = not an option for responses.

Birds are recorded by some DOTs and DNRs (see Table 5). Five DOTs never report birds and five noted that only large birds are generally reported, or that it is based on the vehicle operator’s description, which varies in detail. Of the DOTs,
Vermont records birds to species; Mississippi, Northwest Territories, and Wyoming record birds to genus; Colorado and South Dakota record birds to order; and Iowa and Manitoba record birds to class. Of the 10 responding DNRs, two report birds to species (Delaware and Kentucky), one reports birds to order (Vermont), four never report birds, and three report birds sporadically. Bird groups of interest to responding DOTs included all bird groups (n = 2; 13%), endangered species (n = 2; 13%), game birds (n = 1; 7%), and raptors (n = 3; 20%). Four DOTs (27%) noted that typically only large birds are recorded, because some DOTs have a damage threshold. The Colorado DOT records birds occasionally, based on the time and knowledge of its crews. Of the DNRs that report birds (n = 12; 75%), groups of interest include endangered species (n = 3; 25%), game birds (n = 3; 25%), and raptors (n = 3; 25%).

Large wild mammals (deer and larger) are recorded by most DOTs and DNRs (see Table 5). Most DOTs record large wild mammals to the genus, whereas most DNRs identify large wild mammals to the species. One DOT noted that, although it records large mammals to genus, they are recorded only as comments on the police AVC records, and their names are not entered into the database. One DNR (Nova Scotia) records only black bear, white-tailed deer, and moose (no other bear or deer species in their area), and one DNR records white-tailed deer only (Rhode Island). One DNR reports furbearers (Ohio). Large mammal groups of interest to DOTs include ungulates (n = 8), game species (n = 7), carnivores (n = 4), all species (n = 5), and endangered species (n = 2). DNRs mostly indicated interest in ungulates (n = 8), with the next highest responses for game species (n = 5), carnivores (n = 3), all species (n = 2), and endangered species and non-natives (Newfoundland).

Small wild mammals (smaller than deer) are recorded by only some DOTs and DNRs (see Table 5).
Of the 17 responding DOTs, 7 never report small mammals, and of the 9 responding DNRs, 4 report small mammals to species. Some DOTs identify small mammals to the genus or species (n = 5). Two other DOTs record small mammals as “other wild animals” if they are involved in crashes that meet the reporting thresholds, and one DOT noted that small wild mammals are recorded at the discretion of the field personnel and these observations are entered into the database. Groups of special interest to DOTs include all small mammals (n = 3), carnivores (n = 2), and one response each for endangered species and game mammals. Small mammal groups of interest to DNRs include carnivores (n = 4), game species (n = 3), and one response each for all small mammals, endangered species, and non-native species. One DNR reported that species are recorded depending on the interest of specific projects underway.

Domestic animals are identified by some DOTs and DNRs (see Table 5). Of the 18 responding DOTs, 10 report domestic animals to species, 3 never report domestic animals, and 1 of the 5 “other” responses stated that domestic animals are described as “all other animals” if they were involved in a crash that meets reporting thresholds. Five DOTs record all domestic animals (although some record only if reporting thresholds are met) and three record large species only. Three DNRs record large species only.

Portions of carcasses are frequently kept for further analysis by both DOTs (n = 9; 50%) and DNRs (n = 7; 54%). Further analyses include disease testing and a means to gather more information about population dynamics. Chronic wasting disease was the most frequently mentioned disease (n = 4; Connecticut, Kentucky, Rhode Island, and Virginia), followed by rabies (n = 2; Kentucky and Mississippi) and West Nile virus (n = 1; Connecticut). Samples to investigate the reproductive state (Nova Scotia DNR) and age (Missouri DNR) of the animal concerned are also gathered from carcasses.
One DOT noted that the DNR in the same state collects specific information from black bear carcasses; however, it is unclear what parameters are collected and for what purpose.

Training and Instruction for AVC Data Collectors (AVC Section 3)

Although AVC data are typically collected by law enforcement personnel, these organizations were not approached for this synthesis; the synthesis was restricted to DOTs and DNRs. Given that limitation, more responding DOTs (n = 9; 69%) than DNRs (n = 1; 11%) train their employees in AVC data collection (P = 0.093). The DOTs have variable training regimens. Four DOTs train employees once, one trains them every year, one trains them on the job, one trains them bi-yearly, and one trains them “periodically.” DOTs employ different training techniques, including literature (n = 3; 18%), on-the-job training (n = 8; 47%), seminars (n = 3; 18%), new employee training classes (n = 1; 6%), and police training academies (n = 1; 6%). The 11 responding DOTs train employees in filling out forms (n = 10; 91%), the purpose and importance of data collection (n = 9; 82%), and the importance of collecting accurate data (n = 6; 56%). DOTs do not always train employees regarding which AVCs to record (n = 5; 45%), how to identify species (n = 3; 27%), how to age carcasses (n = 1; 9%), how to use a GPS (n = 1; 9%), or how to enter and manage data (n = 1; 9%). None of the responding DOTs train their employees in carcass sexing or necropsy. Three DOTs provide their employees with data sheets or forms, and one provides aids to familiarize employees with the road system and related reporting software. One DOT (Mississippi) provides employees with species identification guides and GPS units to document AVC location information.

Only one responding DNR trains its employees. The training takes place in the field with experienced personnel and with a seminar.
The DNR trains its people in the purpose of data collection, the importance of collecting accurate data, how to fill out data collection forms, what collisions and carcasses should be recorded, how to identify species, how to age and sex carcasses, how to use a GPS, and how to obtain accurate location information, and supplements this with training by veterinarians to investigate potential diseases of the animals. However, the DNR does not train its employees in how to perform a necropsy or how to enter and manage data. The DNR provides its employees with data sheets or forms, but no other tools or materials.

AVC Data Analyses and Data Sharing (AVC Section 4)

Significantly more DOTs than DNRs share AVC data with other organizations (P = 0.024). Nineteen of 22 DOTs (86%) share their data, compared with 6 of 12 DNRs (50%). DOTs most frequently share data with DNRs (n = 7), followed by information released to the public (n = 4). Information is also shared with law enforcement agencies (n = 3), research groups (n = 2), auto insurers (n = 2), and any other organization that may be interested (n = 4). DNRs that share data most frequently do so internally or with other natural resource agencies (n = 2), whereas one shares information with the public, one shares information with stakeholders or “whomever requests it,” and one shares with DOTs.

Most responding DOTs (n = 17; 77%) and DNRs (n = 11; 91%) analyze AVC data. The differences between DOTs and DNRs were not significant (P = 0.561). DOTs noted that data analysis is also done by local DNRs (n = 2) or by law enforcement (n = 3); however, most responding DOTs noted that their data are analyzed by their own personnel (i.e., crash analysts, traffic engineers, highway technical staff, etc.; n = 12; 71%). Most responding DNRs noted that data are analyzed by a wildlife biologist (n = 8; 73%). The one DNR that does not analyze its own data reported that a research biologist for a deer project does the analysis. Data are analyzed annually by most responding DOTs (n = 8; 40%), although many also analyze data as needed or on request (n = 5; 25%).
Two DOTs analyze data as often as specific projects require, and two analyze data at intervals of longer than 1 year. Three DOTs analyze data more frequently than annually (i.e., continuously or quarterly). Similarly, most DNRs analyze data annually (n = 8; 67%), with three DNRs analyzing data as needed or on request and one analyzing as often as specific projects require.

Respondents were asked to describe the purpose(s) of data analysis. The 19 responding DOTs overwhelmingly responded that the identification of problem areas is the primary function of data analysis (n = 17; 89%), whereas only two (11%) included monitoring wildlife trends; disease monitoring (n = 1; 5%), other wildlife or ecological concerns (n = 2; 11%), and other transportation concerns (n = 3; 16%) were also cited. DOTs reported ancillary purposes, including investigating the frequency of deer–vehicle collisions, tracking shifts in populations of certain species and the spread of non-native species, providing data to a DNR, budgeting for future projects and identifying areas where maintenance needs to focus, and receiving reimbursement from the DNR for each deer removed. The 12 responding DNRs frequently described a dual purpose of monitoring wildlife trends (n = 8; 67%) and identification of problem areas (n = 7; 58%), whereas other DNRs indicated disease monitoring (n = 1; 8%), other wildlife or ecological concerns (n = 3; 25%), or other transportation concerns (n = 2; 17%). Other wildlife or ecological concerns include estimating age and sex composition, rates of reproduction, effects of winter severity, and collecting data on endangered species. Other concerns include determining what kinds of mitigation measures may be needed and where they may be installed, and investigating times of day, weather, and road conditions that may be associated with accidents.
DNRs reported ancillary purposes that include public relations, documentation of invasive or expanding species populations, and providing a basis for population goals.

DNRs and DOTs were asked which of the following data processing tools are used in data analysis: computer databases, frequency graphs, statistical cluster analysis, statistical analysis for trends, and GIS. All but 1 of the 19 responding DOTs use computer databases (n = 18; 95%), most use frequency graphs for kills along certain road sections (n = 13; 68%), and almost half use statistical cluster analysis (n = 9; 47%). Fewer than half of the respondents use statistical analysis for trends (n = 6; 32%) or GIS (n = 8; 42%). All but one of the responding DNRs use computer databases (n = 10; 91%); most perform statistical analysis for trends (n = 7; 64%) and use GIS (n = 6; 55%). Fewer than half of the DNR respondents use frequency graphs (n = 5; 45%) or statistical cluster analysis (n = 4; 36%).

Data are entered into one database by most states and provinces (75%). However, the DOT respondent from one province noted that data are put in a province-wide database, whereas the DNR respondent from that same province noted that they are not, suggesting that the DNR may not be aware of the database. Most responding DOTs and DNRs enter data in the centralized database on at least a monthly basis (n = 7, 39%; n = 4, 36%) or from 1 to 6 months after receiving the data (n = 3, 17%; n = 6, 55%). One DNR and two DOTs enter the data more than 6 months after data collection, and one DNR and two DOTs noted that the time between data collection and data entry varies widely.

The results of data collection and analysis are published annually by DOTs and DNRs (n = 8, 47%, and n = 7, 54%, respectively), with four DOTs (Maryland, New Hampshire, Ohio, and Wyoming) and two DNRs (Newfoundland and Nova Scotia) publishing as needed or on request.
One DOT and one DNR publish at intervals of longer than 1 year, and one DOT (Colorado) and one DNR (Manitoba) publish at intervals shorter than 1 year (i.e., monthly and quarterly). Three DOTs and two DNRs do not publish the results of their data for external review. Both DOTs (n = 13; 72%) and DNRs (n = 10; 83%) share results with the personnel who collect the data.

Data publication is often in electronic form, and the reports are either distributed through e-mail or posted on the Internet, with seven responding DOTs (46%) and five responding DNRs (45%) preferring this method. Two DNRs and two DOTs publish in different media depending on the request. One DNR and three DOTs send media to other agencies, and one DOT relies on public media (television). Other publication media include booklets, mail, and presentations. Most responding DOTs (n = 16; 89%) share results with other organizations or individuals, including DNRs, local law enforcement, non-profit groups, research groups, and the general public. All responding DNRs (n = 11; 100%) also share results with other organizations or individuals, including local agencies, hunters, trappers, and the general public.

All DOTs (n = 18; 100%) believe that the collection and analysis of AVC data leads to on-the-ground mitigation measures, whereas 82% of the DNRs (n = 9) responded similarly. Two DNRs indicated that the data do not lead to mitigation measures. Thirteen DOTs responded with examples of mitigation measures deployed based on AVC data. These include the use of warning signs (n = 13; 100%), crossing structures (including underpasses, multi-use bridges, and wildlife overpasses; n = 4; 31%), fencing (n = 5; 38%), alteration of vegetation along the right-of-way (n = 3; 23%), striping and rip-rap (n = 1; 8%), and lighting of problem areas (n = 1; 8%). Six DNRs responded with comments regarding what kinds of mitigation measures are employed. These include warning signs (n = 6; 100%), speed limits (n = 2; 33%), and changes to the habitat along the right-of-way.
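The identification of problem areas that drives where these mitigation measures are placed is, at its simplest, a frequency count of collision or carcass reports per road section. A minimal sketch of that idea, using hypothetical records and a made-up flagging threshold (neither the routes nor the threshold come from the survey):

```python
from collections import Counter

# Hypothetical collision records: (route, mile post rounded to nearest mile)
records = [("US-12", 4), ("US-12", 4), ("US-12", 5), ("US-12", 4),
           ("US-12", 9), ("SR-3", 1), ("SR-3", 1), ("US-12", 4)]

def hotspots(records, threshold=3):
    """Count reports per (route, mile) bin; flag bins at or above threshold."""
    counts = Counter(records)
    return {segment: n for segment, n in counts.items() if n >= threshold}

print(hotspots(records))  # flags mile 4 of US-12
```

Agencies that reported using statistical cluster analysis or GIS are doing a more rigorous version of this same binning, with spatial resolution limited by the reference-post spacing discussed earlier.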
Most responding DOTs (n = 14; 82%) indicated that the mitigation measures are put in place by the DOT alone, although one DOT indicated that mitigation results from cooperation between DNRs and DOTs. Three DOTs noted that other parties were involved, including planners, Transportation Management System Coordinators, transportation district management, local individuals, field personnel, and analysts. Similarly, five of the responding DNRs (55%) indicated that DOTs do the mitigation, with two respondents indicating that mitigation occurs through cooperation between DNRs and DOTs. One respondent noted that it depends on whether the mitigation is requested by a town, municipality, or DOT, and one believed the question was not applicable.

Potential Obstacles to Implementing or Improving AVC Programs (AVC Section 5)

According to the 17 responding DOTs, the most commonly reported problem with AVC programs is that AVCs are underreported (n = 7; 41%), whereas data quality (consistency, accuracy, and/or completeness) was identified as a problem by four DOTs, and the lack of spatial accuracy was also identified as a problem by four DOTs. One DOT believed that automated tools in the database could simplify data analysis, whereas another commented that changes to the database entry software would result in (partially) incompatible data. One DOT reported that the publication of yearly reports is often behind schedule. Two DOTs reported no problems with data collection.

Sixteen DOTs elaborated on how AVC data collection can be improved. The most frequent suggestion was to improve data quality in terms of consistency, accuracy, and completeness (n = 6; 38%). Improving spatial accuracy is important to 25% of respondents, increasing the accuracy of species identification is important to 19%, and increased resources (such as personnel time and training) are important to 13%.
One DOT indicated that improving the consistency of data reporting on a state-wide level would be beneficial. Another DOT indicated that public recognition of the value in collecting these data would be important, whereas yet another indicated that expanding and improving AVC data collection and integrating it with carcass removal data would be helpful. Two DOTs did not believe that their data collection methods needed improvement.

Of the eight responding DNRs, four (50%) have concerns with data quality (i.e., inconsistency, inaccuracy, and/or incompleteness). Spatial accuracy concerns two respondents (25%), one DNR mentioned underreporting, and another DNR has problems with incompatible methods used by data collectors and data analyzers. Two DNRs have problems with the interval between data collection, feedback, and analysis. Only one DNR reported no problems with data collection.

Of the nine responding DNRs, most (n = 6; 67%) believe that AVC data collection methods could be improved through increasing spatial accuracy, especially through incorporating GPS technology in the data collection procedures. Three DNRs (33%) also believe that improving data quality (making the data more consistent, accurate, and/or complete) is important. One DNR indicated that improving species identification would be helpful, whereas another DNR indicated that enhanced timeliness in filing reports would be helpful. Increased resources for data collection were important to two DNRs. One DNR believed that AVC data collection methods did not need to be improved.

The procedures for AVC data analyses are thought to have similar problems. Eleven DOTs indicated one or more problems with AVC data analyses, whereas five indicated no problems with existing data analyses. The most common data analysis concern for DOTs is the quality (consistency, accuracy, and completeness) of the data (45%), followed by spatial accuracy (27%). Three DOTs indicated that underreporting of AVCs causes problems in data analysis. Four of eight responding DNRs (50%) indicated that poor data quality was problematic. Spatial accuracy was problematic to three (38%) of responding DNRs. Three other DNRs (38%) indicated no problems with data analysis.

Thirteen DOTs offered ideas on how to improve AVC data analysis methods. Improving spatial accuracy (e.g., through the use of GPS technology) and improved spatial analyses (e.g., through the use of GIS) are important to five (38%) (Alaska, Alberta, Maryland, Utah, and Wyoming). Three DOTs (Minnesota, New Hampshire, and Wyoming) (23%) indicated that improving data quality (consistency, accuracy, and completeness) is important. Five DOTs (British Columbia, Maine, Manitoba, Maryland, and Wyoming) (38%) also indicated that improving the timeliness of data entry would facilitate data analysis. British Columbia added that more reporting from rural areas would be helpful. Similarly, most DNRs that responded with suggestions on how to improve AVC data analysis methods believe that the use of GIS and improving the spatial accuracy of the data (e.g., through the use of GPS technology) would benefit the data analyses (43%; Ohio, Ontario, and Rhode Island). The Ontario, Rhode Island, and Vermont DNRs (43%) indicated that timeliness with data entry would facilitate data analyses, and the Newfoundland DNR noted that data analysis for AVCs could be improved through changes in the database and data entry process. Ontario and Rhode Island indicated that including cluster analyses would be beneficial. Data dissemination is not regarded as a problem by DOTs (n = 11; 73%) or DNRs (n = 9; 100%).
Other comments reiterated that the use of GPS technology and GIS facilities is needed (one DOT), that there is little support for reducing AVCs and improving AVC data collection programs because AVCs form only a small portion (<1%) of the total number of collisions that result in human injuries or fatalities (one DOT), that not all engineers cared about the subject and that traffic planners needed to be involved with AVC data earlier in the planning process (one DOT), that coordinating data collection and dissemination with other state agencies could be problematic (one DOT), that making information available through the Internet may be beneficial (one DOT), and that a more formal annual report would aid in data dissemination (one DOT).

AC Survey

The AC survey can be found in Appendix B, with the summary data contained in Appendix G.

Rationale for AC Data Collection and Roads and/or Areas Included (AC Section 1)

Survey participants were asked why they collect or manage AC data, ranking responses in order of importance, with 1 being most important and 4 being least important. Responding DOTs ranked public safety (n = 5; 50%) and accounting (n = 4; 50%) as the top reasons to collect or manage AC data (rank 1), with wildlife management or conservation ranked as second most important (rank 2; n = 5; 50%) (Figure 8, upper). Other reasons DOTs collect or manage AC data include requests by the public and “research.” DNRs mostly ranked wildlife management or conservation as the most important reason (n = 9; 75%), with public safety ranking second (n = 5; 45%) (Figure 8, lower). Other reasons why DNRs collect or manage AC data include disease monitoring.

On average, DNRs have collected AC data longer than DOTs, with 22 years of collecting AC data for the average DNR (95% C.I. = 15.2, 28.9; n = 10) and 12.2 years for the average DOT (95% C.I. = 2.0, 22.4; n = 6), but the differences were not significant when tested with a two-sided, two-sample t-test (P = 0.153). The earliest collections of AC data were undertaken in 1966 by the Newfoundland DNR, 1978 by the Ohio and British Columbia DOTs, and 1979 by the Nova Scotia DNR.

Half of the responding DOTs reported that AC collection is mandatory (n = 5), and the other half reported it is either voluntary or semi-voluntary (n = 1 and 4). Of responding DNRs, 64% reported that the collection of AC data is mandatory (n = 7), whereas 36% reported it is voluntary or semi-voluntary (n = 1 and 3). These percentages were not statistically different (P = 0.850). Of the nine DOTs that responded, all collect data on Interstates (100%), eight (89%) collect data on arterial roads, five (55%) collect data on collector roads, and one (11%) collects data on local roads. Of the 12 DNRs that responded, 11 (92%) collect data on Interstates, 11 (92%) collect data on arterial roads, 10 (83%) collect data on collector roads, and 7 (58%) collect data on local roads. The Idaho DNR does not collect data on Interstates or arterial roads.

The geographic limits of the reporting area for the 10 responding DOTs primarily included all areas (or roads) under their jurisdiction, without further specification (n = 5; 50%). Two DOTs report on all roads in all areas within their states, and one DOT reports on “many of the main freeways and major arterials, especially in rural areas where collisions with animals are a concern.” The British Columbia DOT records data on all numbered highways under the agency’s jurisdiction, except for those maintained by the federal government, and the Maryland DOT records data statewide for all state-maintained roads, including Interstates. Another DOT noted that its geographic limits vary.
The geographic limits of the reporting area for the 12 responding DNRs included all roads in the entire state or province (n = 5; 31%), all roads in the state or province with the exception of some federal lands (Kentucky), forest roads (Newfoundland), and tribal lands (Wisconsin). The North Dakota DNR reports on all Interstate, state, and county highways in all areas, and the North Carolina DNR reports on all highways in the state. Two DNRs did not report geographic boundaries.

Responding agencies indicated that the landscape surrounding the areas where they collect AC data is both rural and urban (n = 18; 82%), with four respondents indicating that the surrounding landscape is predominantly rural (North Dakota DNR, Oklahoma DNR, Utah DOT, and Virginia DOT). When asked which other organizations or individuals collect AC data on the road systems that are covered by their agencies, most respondents indicated that no other agency or organization covers these roads (n = 7; 32%), with several respondents indicating that a branch of law enforcement also covers these roads (n = 6; 27%). Other responses included other governmental branches (i.e., city or county; n = 3; 14%) and private organizations or individuals (i.e., nongovernmental organizations, interested individuals; n = 4; 18%). Correspondingly, when asked what other organizations or individuals collect AC data on the roads not covered by their agency, most agencies did not respond (n = 14; 52%) or responded with “unknown” (n = 6; 22%). Other responses included DOT, DNR, law enforcement, other governmental agencies (i.e., city or county; n = 2), and that no other entities gather data on these roads (n = 1).

AC Parameters Recorded and Reporting Thresholds (AC Section 2)

Respondents were asked “Who reports the carcass to the agency or data collector?” Twenty-four agencies responded to this question, with 14 indicating that multiple agencies collect these data. The most frequent source of carcass data is DOTs (n = 16; 67%), followed by DNRs (n = 15; 63%) and highway patrols or other law enforcement agencies (n = 11; 46%). Other answers included private companies or the general public (n = 6; 25%). Typically, (other) agencies (n = 10; 100%) report the presence of a carcass to a DOT, although drivers report data to many DOTs as well (n = 6; 60%).
FIGURE 8 Ranked reasons why DOTs (upper) and DNRs (lower) collect AC data.

Other sources of carcass data include law enforcement and contractors (n = 2 each). Agencies (n = 11; 79%) and drivers (n = 12; 86%) are the most frequent data sources for animal carcasses for DNRs.

Roughly equal proportions of DOTs (n = 7; 70%) and DNRs (n = 8; 57%) have reporting thresholds for animal carcasses (P = 0.831). For DOTs, these thresholds usually involve a combination of carcass location and species involved. Most responding DOTs reported a threshold of whether the carcass was in the road (n = 5; 56%); in the right-of-way, even if not visible to drivers (n = 6; 67%); or in the right-of-way and visible to drivers (n = 6; 67%). Five DOTs responded that certain species must be involved for the carcass to be reported (56%). For DNRs, these thresholds usually involve certain species only (n = 7; 58%). The species of interest to both DOTs and DNRs were deer (n = 12); moose (n = 3); bear (n = 4); certain medium- and large-sized mammals, including livestock, furbearers, carnivores, and other ungulates; and birds (n = 8).

Search and reporting efforts for ACs were described as monitoring by most responding DOTs (n = 6; 75%), but as incidental by most responding DNRs (n = 10; 71%). These differences were not quite significant (P = 0.060). The Montana and Utah DOTs indicated that both monitoring and incidental reporting occur, depending on the routes. The frequency of checks for ACs is variable. Five DOTs (38%) search daily, two (15%) search weekly, two (15%) search daily and weekly (depending on road type and classification), and one (8%) reported that the frequency of surveys varies. DNRs often record ACs as they are encountered or reported (n = 6; 46%), although some DNRs perform daily searches (n = 2, with one additional DNR searching daily over a 1-month span); other DNRs search for ACs weekly (n = 1), daily and weekly (n = 1), or monthly (n = 1); another reports ACs incidentally; and two others reported only that the frequency of the checks varies.
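Comparisons such as P = 0.831 and P = 0.060 test whether a proportion (e.g., agencies with reporting thresholds) differs between two small groups of respondents. The report does not state which test was used; the sketch below applies Fisher's exact test to a hypothetical 2 × 2 table as one plausible approach for samples this small:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Rows are the two groups (e.g., DOTs vs. DNRs); columns are yes/no.
    Sums the hypergeometric probability of every table at least as
    extreme (i.e., no more probable) than the observed one.
    """
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def p_table(x):
        # P(x "yes" responses in group 1) under fixed margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Hypothetical counts (not the survey's raw data):
# 5 of 10 DOTs vs. 7 of 11 DNRs answering "yes"
print(round(fisher_exact_two_sided(5, 5, 7, 4), 3))
```

With groups this small, exact tests of this kind rarely reach significance unless the proportions differ substantially, which is consistent with the mostly non-significant P-values reported throughout the chapter.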
Agencies were asked which parameters they regularly record as a part of AC reporting (Table 6). Ten DOTs responded to all or parts of this question. Most responding DOTs either always or usually record the date (n = 10; 100%), district or unit (n = 8; 80%), road or route ID (n = 10; 100%), carcass location (n = 8; 80%), and species of the animal concerned (n = 8; 88%). Most DOTs record the observer’s name either always or usually, and the sex of the animal sometimes. Most DOTs never record the time, the age of the animal, or whether the carcass was removed (n = 5; 50%). Human fatalities, human injuries, types of injuries, presence of property damage, and estimated amount of property damage are never recorded by the responding DOTs.

Of the 16 DNRs that took the AC survey, 5 (31%) did not respond to this question. Most responding DNRs always or usually record the date (n = 10; 91%), district or unit (n = 10; 91%), the name of the observer (n = 7; 64%), road or route ID (n = 8; 73%), carcass location (n = 7; 64%), species of animal (n = 11; 100%), and whether the carcass was removed (n = 6; 55%). Most DNRs always or usually record the sex (n = 7; 64%) and age of the animal carcass (n = 6; 55%). Most DNRs (n = 8; 73%) never record the presence of human fatalities, human injuries, types of injuries, or the amount of property damage sustained as a result of this carcass. Another 64% never record whether property damage occurred.
TABLE 6
ANIMAL CARCASS PARAMETERS AND FREQUENCY OF RECORDING THESE PARAMETERS BY DNRs AND DOTs (all in percentages)

                                            DNR                              DOT
Recorded Parameters             Alw  Usu  Som  Rar  Nev  NoR     Alw  Usu  Som  Rar  Nev  NoR
Date                             50   13    6    0    0   31      82    9    0    0    0    9
Time                             19    6   13   13   19   31       9   18   18    0   45    9
District/unit                    50   13    6    0    0   31      64    9    0    0   18    9
Name of observer                 31   13   25    0    0   31      27   27   18    0   18    9
Road/route identification        31   19   13    0    6   31      73   18    0    0    0    9
Carcass location                 25   19   13    6    6   31      55   18    9    0    9    9
Human fatalities                  6    6    0    6   50   31       0    0    0    0   91    9
Human injuries                    6    0    0   13   50   31       0    0    0    0   91    9
Type of injury                    0    6    0   13   50   31       0    0    0    0   91    9
Property damage                   6    0    0   19   44   31       0    0    0    0   91    9
Amount ($) of property damage     0    6    0   13   50   31       0    0    0    0   91    9
Species of animal                50   19    0    0    0   31      64    9    0    0    9   18
Sex of animal                    25   19   13    6    6   31       9   18   36    9   18    9
Age of animal                    13   25    0   25    6   31       0    9   27   18   36    9
Removal of carcass               31    6   13    0   19   31      36    9    0    0   45    9

Note: Alw = Always; Usu = Usually; Som = Sometimes; Rar = Rarely; Nev = Never; NoR = No Response. Shaded areas in the original mark the category with the most frequent response.
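The parameters surveyed in Table 6 map naturally onto a simple record structure, with the always-recorded fields required and the sometimes-recorded fields optional. A hypothetical sketch of such a record (field names and values are illustrative, not any agency's actual schema):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CarcassRecord:
    """Hypothetical AC record covering the parameters surveyed in Table 6."""
    date: str                        # e.g., "2006-11-03"
    district: str                    # reporting district or unit
    route_id: str                    # road/route identification
    location_mile: float             # nearest mile or reference post
    species: str
    observer: Optional[str] = None
    time: Optional[str] = None
    sex: Optional[str] = None        # recorded only "sometimes" by many agencies
    age: Optional[str] = None
    carcass_removed: Optional[bool] = None

rec = CarcassRecord(date="2006-11-03", district="D5", route_id="US-12",
                    location_mile=4.0, species="white-tailed deer",
                    carcass_removed=True)
print(rec.species)
```

Making the rarely recorded fields optional mirrors the survey finding that agencies agree on a core of date, location, route, and species but diverge on everything else.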

AC Location Recording and Spatial Resolution (AC Section 2—continued)

Animal carcass location recording varied between DOTs and DNRs (Table 7). Most DOTs never use GPS technology (n = 8; 89%) or maps to derive coordinates (n = 6; 67%). Most DOTs always or usually use mile or kilometer reference posts (n = 9; 90%) and/or road sections (n = 8; 80%). Of the responding DNRs, most rarely or never make use of GPS technology (n = 6; 60%) or maps to derive coordinates (n = 6; 55%). DNRs sometimes use mile or kilometer reference posts (n = 5; 50%) and usually or sometimes record the road sections (n = 7; 78%). Other responses included the use of landmarks (e.g., 1 mile north of Swift River), zoogeographic region, or county name.

The accuracy of AC locations is generally 0.1 mile or kilometer or coarser, with only one of the nine DOTs using more accurate descriptions. The British Columbia DOT noted that it usually records ACs at 1 yard or meter, although location precision is only theoretically at the 1-meter level; in reality the locations are described less accurately. The Maryland DOT also rarely records carcass positions at 1 meter or yard and at 15 meters or yards, although it sometimes records carcasses at 30 yards or meters. Carcasses are always or usually recorded at the 0.1 mile or kilometer (n = 6; 67%) or 1 mile or kilometer level (n = 4; 57%).

Location accuracy of ACs is rarely finer than 0.1 mile or kilometer for DNRs, with the Kentucky DNR reporting that it always records ACs within 1 yard or meter. Idaho rarely records ACs within 1 yard or meter and 15 yards or meters, Idaho and South Dakota rarely record ACs within 30 yards or meters, and Vermont sometimes records ACs to 30 yards or meters. Two DNRs reported that they always record within 0.1 mile or kilometer (Nova Scotia and South Dakota), one DNR usually (Vermont), one DNR sometimes (Wyoming), one DNR rarely (Wisconsin), and four DNRs never report to this level of accuracy.
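The accuracy levels described here (1 yard or meter, 0.1 mile, 1 mile) amount to quantizing a continuous position along a route to the nearest reference interval. A small illustrative sketch; the function name and spacings are my own, not a procedure from any responding agency:

```python
def snap_to_reference(route_measure_mi: float, post_spacing_mi: float = 0.1) -> float:
    """Round a measured position along a route (in miles) to the nearest
    reference post, given the spacing between posts."""
    return round(route_measure_mi / post_spacing_mi) * post_spacing_mi

# A GPS-derived measure of 112.43 mi reported against 0.1-mile posts:
print(snap_to_reference(112.43))        # ~112.4
# The same measure on a road with posts only every whole mile:
print(snap_to_reference(112.43, 1.0))   # ~112.0
```

The coarser the post spacing, the larger the worst-case location error (half the spacing), which is why several respondents favored GPS over reference-post descriptions.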
Four DNRs usually record AC locations to 1 mile or kilometer, whereas two others sometimes, one rarely, and one never record at this accuracy level. Other DNR responses included the use of geographic references, county name, or zoogeographic region. Reference and mile posts used in determining location descriptions for ACs are usually 1 mile apart on the roads on which DOTs (n = 5) and DNRs (n = 4) collect data, and fewer are located at 0.1-mile intervals (DNR = 1; DOT = 3). The Maryland DOT uses reference posts located 500 ft apart.

Species and Species Groups Recorded for ACs (AC Section 2—continued)

Amphibians are generally not recorded by DOTs or DNRs (Table 8). Of the 10 DOTs responding, 9 (90%) never record amphibians, whereas one DOT almost never records amphibians. Of the 12 DNRs responding, only 1 (8%) reported amphibians to species level, although this DNR only incidentally reports amphibians. Other DNR responses included "our agency does not have jurisdiction over amphibians," that the question was not applicable to their area (Nova Scotia), and that amphibians are rarely reported (Kentucky).

Reptiles are also rarely recorded by DOTs and DNRs (see Table 8). Of the nine responding DOTs, eight never record reptiles, and one almost never records reptiles. Of the 11 DNRs responding, only 1 records reptiles to the species level, although reptiles are only incidentally reported. One DNR records all reptile groups to order (Kentucky), eight DNRs never record them, and one DNR noted that its agency does not have jurisdiction over reptiles.

Birds are generally recorded in more detail than reptiles or amphibians (see Table 8). Of the eight responding DOTs, the Wyoming DOT records all raptors to genus; the British Columbia DOT reports birds at the discretion of its personnel; the Idaho DOT records raptors and other "large birds"; Virginia identifies hawks and turkeys; and Maryland identifies turkeys, owls, and eagles.
TABLE 7
HOW ANIMAL CARCASS LOCATION DATA ARE REPORTED BY DNRs AND DOTs (all in percentages; each cell lists Always/Usually/Sometimes/Rarely/Never/No Response)

Recorded Parameter        DNR                  DOT
GPS coordinates           0/6/19/13/25/38      0/0/0/9/73/18
Map coordinates           6/6/19/19/19/31      0/0/18/9/55/18
Mile/kilometer post       6/6/31/13/6/38       55/27/9/0/0/9
Road section              6/25/19/0/6/44       36/36/0/0/18/9
Other                     13/6/6/0/0/75        0/0/0/0/9/91

Note: In the original table, shading marks the category with the most frequent response.

Four DOTs (50%) never record birds, and one DOT rarely records them. The Arizona DNR records game birds and turkeys to species, but noted that all birds except wild turkeys are incidentally reported. The Kentucky DNR records all birds to species, the New Hampshire DNR records endangered birds to species, and the Pennsylvania DNR records endangered birds to species but rarely collects them. The Idaho DNR noted that birds are rarely recorded, usually only for specific projects. Eight DNRs never record birds (62%).

Large wild mammals (deer size and greater) are the most often recorded animal group, with all responding DOTs recording large mammals (n = 7, 70%, classifying to species; n = 3, 30%, classifying to genus) (see Table 8). Large mammal groups of special interest to DOTs include all large wild mammals (n = 5; 50%) and game species (n = 5; 50%). Three DOTs record ungulates (Idaho, Iowa, and Utah), two record carnivores (Idaho and Utah), one records endangered species (Idaho), and one records non-native species (Idaho). All but one of the responding DNRs record large wild mammals (n = 12; 92%), with 11 classifying them by species and Arizona recording them to family. Ungulates were the large mammal group of highest interest to responding DNRs (n = 7; 54%). Other large mammal groups recorded by DNRs include all species (n = 2; Kentucky and Newfoundland), endangered species (n = 4; 31%), game species (n = 4; 31%), carnivores (n = 4; 31%), and non-native species (South Dakota).

Small mammals are classified to the species level by two responding DOTs (20%), to family by two DOTs (20%), are never recorded by four DOTs (40%), and are rarely recorded by two DOTs (20%) (see Table 8). The New York State DOT noted that the larger small mammals (i.e., coyotes or beaver) are regularly recorded. Small mammal groups of interest to DOTs included all species (n = 2) and larger small mammal species where identification is possible (n = 2). The British Columbia DOT records small wild mammal groups at the discretion of the maintenance contractors. Small mammals are identified to species by four responding DNRs (40%), whereas four respondents (40%) never and two respondents (20%) rarely record small mammals.
Small mammal groups of interest to DNRs include all small mammals, endangered species, carnivores, and non-native species (n = 1 each). One DNR was interested in furbearer species only.

More DOTs (n = 6; 60%) than DNRs (n = 2; 22%) record domesticated animals to the species level (see Table 8). Five DOTs record large species only (45%), whereas two DOTs responded with "other," elaborating that small species are occasionally recorded (n = 1) and that "dogs and cats etc." are recorded (n = 1). Domesticated animals are usually identified to species by only two of the nine responding DNRs, with one DNR never recording domestic animals. Six responding DNRs (67%) marked "other," but did not elaborate. When asked which groups of domestic animals are recorded, three DNRs noted large species only.

Both DNRs (n = 9; 69%) and DOTs (n = 6; 60%) keep portions of carcasses for further analysis. One DOT answered "yes" to this question, but noted that the DNR is the agency that collects data on black bears for further analysis. Further analyses included disease testing for chronic wasting disease (Arizona, Iowa, Kentucky, New York, South Dakota, and Wisconsin), West Nile virus (New York, British Columbia, and Wisconsin), and rabies (Kentucky). Reproductive data are also gathered from the carcasses (Missouri).

Training and Instruction for AC Data Collectors (AC Section 3)

Section 3 was designed to investigate what training, instruction, and other aids are provided to AC data collectors. More DOTs (n = 5; 50%) than DNRs (n = 2; 14%) train their AC data collectors; however, to obtain the appropriate sample size for the chi-square test (an expected count of five or more in each cell), the "don't know" answers (n = 2 for both DNR and DOT) were pooled with the "no" answers. With this stipulation, the differences were not significant (P = 0.149).
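The pooled comparison just described can be checked by hand. The sketch below is illustrative: the 2x2 counts (5 of 10 DOTs train their collectors vs. 2 of 14 DNRs) follow from the percentages in the text, and a Yates continuity correction is assumed here because it reproduces the reported P = 0.149; the report does not state which correction was used.

```python
import math

def yates_chi2_2x2(a, b, c, d):
    """Chi-square test with Yates continuity correction for the 2x2 table
    [[a, b], [c, d]]; returns (statistic, p_value) for 1 degree of freedom."""
    n = a + b + c + d
    cells = [(a, a + b, a + c), (b, a + b, b + d),
             (c, c + d, a + c), (d, c + d, b + d)]
    stat = 0.0
    for observed, row_total, col_total in cells:
        expected = row_total * col_total / n
        stat += (abs(observed - expected) - 0.5) ** 2 / expected
    # Survival function of the chi-square distribution with df = 1
    p_value = math.erfc(math.sqrt(stat / 2))
    return stat, p_value

# 5 of 10 DOTs train vs. 2 of 14 DNRs ("don't know" pooled with "no")
stat, p = yates_chi2_2x2(5, 5, 2, 12)
print(round(stat, 2), round(p, 3))  # 2.08 0.149
```

Without the continuity correction the same table gives P ≈ 0.06, which is why the choice of correction matters at these small sample sizes.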
TABLE 8
SPECIES GROUPS RECORDED BY DNRs AND DOTs IN ANIMAL CARCASS DATA COLLECTION PROGRAMS (all in percentages; each cell lists Species/Genus/Family/Order/Class/Never/Other/No Response)

Group                  DNR                       DOT
Amphibians             6/0/0/6/0/44/25/25        0/0/0/0/0/64/9/27
Reptiles               6/0/0/6/0/50/6/31         0/0/0/0/0/73/9/18
Birds                  25/0/0/0/0/31/13/31       0/9/0/18/9/36/9/18
Large wild mammals     69/0/6/0/0/6/0/19         64/27/0/0/0/0/0/9
Small wild mammals     25/0/0/0/0/25/13/38       18/0/18/0/0/36/18/9
Domestic animals       13/X/X/X/0/6/38/44        55/X/X/X/0/9/27/9

Notes: In the original table, shading marks the category with the most frequent response. X = not an option for responses.

Of the responding DOTs, two train their data collectors just once, one trains them yearly, one trains them annually or more frequently, and one selected "other" but did not specify further. One DOT uses literature combined with on-the-job training for its data collectors, whereas three train them on the job and one uses a seminar. The two DNRs that train their AC data collectors noted that their training was not specific to AC data collection, but that the information dissemination and general training could be applied to AC data collection. One DNR answered subsequent questions, implying that an additional DNR trains its data collectors.

Five DOTs responded to how they train their data collectors (Idaho, Montana, New York, Ohio, and Wyoming). All train their employees in the purpose of collecting the data, four train their data collectors in the importance of recording accurate information, four train in filling out forms (Idaho, Montana, Ohio, and Wyoming), three train in which ACs to record (Idaho, Ohio, and Wyoming), two train in species identification (Idaho and Wyoming), one trains in determining the age of a carcass (Wyoming), two train in obtaining accurate information (Idaho and Montana), and one trains in handling carcasses potentially infected with chronic wasting disease or West Nile virus and in carcass composting (New York). None of the DOTs train their data collectors in carcass sexing, necropsy, the use of GPS technology, or data entry or management. Only one DOT responded to the question asking what tools and materials are provided to AC data collectors; this DOT provides worker safety materials.

The three DNRs that train their data collectors train them in different aspects of data collection. One DNR trains them in filling out forms only. Two DNRs train their employees in the purpose of data collection and the importance of recording accurate information, filling out forms, which ACs to record, and taking accurate location information. One of these two DNRs also trains its data collectors in species identification, carcass aging, carcass sexing, necropsy, and the use of GPS technology. None of the DNRs trains its employees in data entry or management. Two DNRs responded to the question regarding the materials and tools provided to assist with AC data collection: the Newfoundland DNR provides its data collectors with specially designed data books, and the Arizona Game and Fish Department provides workers with species identification guides, GPS units, and necropsy kits.
AC Data Analyses and Data Sharing (AC Section 4)

A higher percentage of DOTs (n = 9; 90%) than DNRs (n = 8; 53%) share AC data with other organizations, although this difference was not significant (P = 0.197). The DOTs that share their data do so with DNRs (n = 4; 44%), interdepartmentally (n = 5; 56%), with consultants and academic institutions (n = 1; 11%), and with whomever requests the data (n = 1; 11%); one DOT shares data through GeoData Services data linkage efforts. Of the eight responding DNRs, three (38%) share their data with DOTs; others share with the general public (n = 4; 50%), interdepartmentally (n = 2; 25%), and with researchers (n = 1; 13%).

Most responding DOTs (n = 7; 78%) and DNRs (n = 11; 73%) analyze AC data. One DOT responded that its data are analyzed by a DNR, and one DOT noted that the data are analyzed by "various entities." DOTs indicated that data analyses were mainly performed by personnel within the DOT (n = 7; 78%), including highway safety technicians, TMS coordinators, and planners, with two DOTs (22%) sending data to wildlife biologists at DNRs. The three DNRs that do not analyze their own data remarked that the data are analyzed by a biologist or another conservation agency, or that they are only in the process of beginning data analysis. Data analyses for DNRs are all performed by wildlife biologists (n = 10 of 10 respondents).

Four DOTs analyze data annually (44%), three others analyze data annually and on request or depending on specific needs (33%), and three analyze data as needed only (33%). One DOT noted that data analysis frequency varies, and another DOT noted that data analysis occurs as time permits on a case-by-case basis. Data are analyzed annually by seven responding DNRs (64%), whereas one analyzes either annually or on request, one analyzes data only as needed or on request, and two reported that analysis frequency varies.

Respondents were asked to describe the purpose(s) of the data analyses.
DOTs overwhelmingly responded that the identification of problem areas is the primary function of the data (n = 8; 80%), with only two DOTs (20%) stating that wildlife and/or ecological reasons are the primary function of the analyses. Wildlife conservation and other ecological reasons were overwhelmingly selected as a secondary purpose of data collection by the six responding DOTs (n = 4; 67%). The 11 responding DNRs also indicated that identification of problem areas is a purpose of data analysis (n = 7; 64%), but monitoring wildlife population trends received five responses (45%), and other wildlife and/or ecological reasons received four responses (36%). When identifying other purposes that the data serve, three DNRs noted wildlife population monitoring or general wildlife/ecological reasons. One DNR also noted public relations, and one the importance of non-native species monitoring.

The agencies were asked which data processing tools are used in AC data analysis: computer databases, frequency graphs, statistical cluster analysis, statistical analysis for trends, and GIS. All but one of the responding DOTs use computer databases (n = 8; 89%). DOTs also use frequency graphs for road sections (n = 4, 44%; British Columbia, Iowa, Utah, and Wyoming) and GIS facilities (n = 4, 44%; Idaho, Iowa, Maryland, and New York), and, although less frequently, statistical cluster analyses (Iowa and Wyoming) and statistical analysis for trends (Iowa). All but two of the responding DNRs use computer databases (n = 9; 82%), and most use statistical analysis for trends (n = 6; 55%), but fewer use frequency graphs for road sections (North Dakota and South Dakota), statistical cluster analyses (Connecticut and Missouri), or a GIS (Arizona, Nova Scotia, and South Dakota).

Data are entered into one centralized database for most states and provinces (12 of 17 responding states and 2 of 3 responding provinces).
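The "frequency graphs for road sections" that several agencies reported using reduce to counting carcasses per section, with the highest-count sections flagged as candidate problem areas. A minimal sketch with invented sample data (the section labels and species below are not from the survey):

```python
from collections import Counter

# (road section, species) for each carcass record -- invented sample data
records = [("US-30 MP 110-111", "deer"), ("US-30 MP 110-111", "deer"),
           ("US-30 MP 112-113", "moose"), ("US-30 MP 110-111", "elk"),
           ("SR-9 MP 4-5", "deer")]

# Carcass frequency per road section
by_section = Counter(section for section, _ in records)

# Sections with the most carcasses are candidate problem areas
for section, count in by_section.most_common(2):
    print(section, count)
```

More formal cluster analyses, as reported by Iowa and Wyoming, refine this by testing whether the observed clustering exceeds what chance alone would produce.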
Most responding DOTs (n = 4; 44%) and DNRs (n = 4; 40%) noted that data entry into the centralized database occurs monthly or more frequently. The Iowa, Maryland, and Ohio DOTs noted that data entry would occur over 1 to 2 business days. One DOT estimated the time interval at 3 months, whereas another DOT noted it could take 1 to 6 months to have the data entered, and one DNR mentioned it could take 1 to 2 months. Three DNR respondents noted that data entry could take more than 6 months. Three DNR respondents and two DOT respondents noted that turnover between data collection and entry varies greatly.

DOTs commonly publish AC data at intervals of less than 1 year (n = 4; 40%) or on request (n = 2; 20%), with one agency publishing at intervals of more than 1 year. The Maryland DOT publishes the data on an intranet server concurrent with data entry. Responding DOTs publish in different manners depending on the request (n = 3), use the data internally or share it with other agencies and stakeholders (n = 3), use public media (n = 1), or vary in their publication methods. All responding DOTs (n = 9) share their results internally and with other organizations and individuals, including DNRs and the general public. DNRs (n = 7; 64%) generally publish their data yearly, with two respondents (18%) publishing data only in internal reports and two (18%) not currently publishing data. Data are published in a manner as requested by three DNRs, in a booklet or report by three others, and on the web by one. Eight of the responding DNRs (80%) share their results with other organizations or individuals, including DOTs, other local agencies, the Audubon Society, the general public, and/or whoever requests the data.

Most DOTs (n = 8; 88%) believe that collection and analysis of AC data lead to on-the-ground mitigation measures, but only 50% (n = 5) of responding DNRs agreed. One DOT believes that the data do not lead to mitigation measures. These differences were not significant (P = 0.185), although sample sizes were relatively low.
Eight DOTs responded with examples of mitigation measures that were put in place based on AC data. These included warning signs (n = 7), fencing (n = 5), and crossing structures (n = 3). One DOT indicated that it was working toward deploying mitigation in response to AC data. Five DNRs responded with comments regarding what kinds of mitigation measures are employed. The measures include warning signs (n = 4) and wildlife fencing and under- or overpasses (n = 1), and one DNR respondent noted that mitigation is planned but has not yet been implemented. These mitigation efforts are mostly attributed to DOTs (n = 11) and secondarily to DNRs (n = 3), law enforcement (n = 1), and other agencies (n = 1).

Potential Obstacles to Implementing or Improving AC Programs (AC Section 5)

The most common problem experienced by both DOTs (n = 6; 60%) and DNRs (n = 9; 64%) in data collection procedures is a lack of consistency. Reasons for the lack of consistency include personnel problems (i.e., getting all personnel to do equal levels of data collection, changing personnel, personnel not completing data sheets, and personnel recording information inconsistently) and inconsistency in reporting locations. Two DOTs noted that districts within the state differ in data collection procedures, which hampers data synthesis efforts. Other problems include the lack of a statewide database, inadequate follow-up procedures to verify certain data, inadequate staff time to collect data for animals other than deer and other large mammals, the state of the animal carcass when it is encountered or removed, that data collection is not mandatory, and that observations of some species are too low for "statistical reliability." Three DOTs and one DNR reported no problems with AC data collection.
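Several of the consistency problems listed above (incomplete data sheets, inconsistently recorded information) are the kind that an automated entry-time check can flag. A hedged sketch; the required fields and the range rule are assumptions for illustration, not rules from any surveyed agency:

```python
REQUIRED = ("observed_on", "route_id", "species")

def validation_errors(record: dict) -> list[str]:
    """Return a list of completeness/consistency problems for one AC record."""
    errors = [f"missing {field}" for field in REQUIRED if not record.get(field)]
    milepost = record.get("milepost")
    if milepost is not None and not (0 <= milepost <= 999):
        errors.append("milepost out of range")
    return errors

# A data sheet missing its route identification is flagged at entry time:
print(validation_errors({"observed_on": "2006-05-14", "species": "deer"}))
# ['missing route_id']
```

Running such a check when records enter the centralized database would catch incomplete sheets while the collector can still be asked for the missing information.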
Most responding DNRs and DOTs believe AC data collection methods can be improved by making data collection more consistent and/or improving the spatial accuracy of AC locations, especially through the use of GPS technology. Eight responding DOTs mentioned the need for increased data quality (i.e., consistency, accuracy, and completeness; n = 4; 50%), increased spatial accuracy (n = 4; 50%), and additional resources (n = 2; 20%), such as personnel and training. Four responding DNRs (40%) indicated that improving consistency in data collection is important, five (50%) mentioned improvements in the spatial accuracy of the data, two other DNRs mentioned the need for a centralized database, one DNR noted that considerable training and funding would be useful, and another DNR indicated the need for more tools (such as GPS units) to allow for more spatially accurate data collection. Five of the 18 respondents (28%) specifically mentioned coordinates obtained through GPS or maps, the use of GIS facilities, and the need for field computers integrated with a GPS unit that allow for digital data entry in the field and precise and consistent locations.

Data analyses have problems similar to those of data collection. Of the nine DOTs that responded, six (67%) believe data quality (i.e., consistency, accuracy, and completeness) is problematic for analysis, two DOTs believe that a lack of resources makes analyses more difficult, and one DOT believes that the lack of spatial accuracy presents difficulty with the analyses and that the inadequate data on "small animals" are also problematic. One DOT believes there are no problems with AC data analyses. Of the nine DNRs responding to this question, five (56%) believe that a lack of consistency in data collection is problematic for analysis, one believes that a lack of spatial accuracy is problematic, and two felt that inadequate resources make AC data analyses more difficult.
Two DOTs believe there are no problems with data analyses.

Of the five responding DOTs, four believe integration with GIS will improve analysis, four believe that faster and/or automated data entry will improve analysis, and two believe that more consistent data entry and collection will improve data analysis. One other DOT suggested cluster analyses. The eight responding DNRs believe that data analyses can be improved through integration with GIS (two DNRs), faster data entry (one DNR), more consistent data entry (one DNR), making reporting mandatory (one DNR), and obtaining better data (one DNR). Three DNRs believe data analyses do not need to be improved.

Most responding DOTs (n = 4; 57%) and DNRs (n = 8; 80%) believe there are no problems with AC data dissemination. The remaining responses included a need for more resources (two DOTs and one DNR) and that a lack of consistency or compatibility in the data and reporting procedures makes dissemination of data difficult (two DOTs and one DNR). Suggestions to improve AC data dissemination include:

• Dedicating personnel to this activity.
• Enhancing communication between DOTs and DNRs.
• Disseminating data electronically instead of on paper.
• Entering the data into a centralized database.
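The last two suggestions, electronic dissemination from a centralized database, can be as lightweight as a CSV export of the stored records. An illustrative sketch with invented sample records (the field names match no particular agency's schema):

```python
import csv
import io

# Invented sample records standing in for rows from a centralized AC database
records = [
    {"observed_on": "2006-05-14", "route_id": "US-30", "milepost": 112.4, "species": "deer"},
    {"observed_on": "2006-05-15", "route_id": "SR-9", "milepost": 4.7, "species": "moose"},
]

def to_csv(rows):
    """Serialize AC records to CSV text for electronic dissemination."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["observed_on", "route_id", "milepost", "species"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(to_csv(records).splitlines()[0])  # observed_on,route_id,milepost,species
```

A shared flat-file format like this sidesteps the compatibility problems respondents reported when each district or agency keeps data in its own layout.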

TRB’s National Cooperative Highway Research Program (NCHRP) Synthesis 370: Animal-Vehicle Collision Data Collection examines the extent to which data from animal–vehicle collision accident reports and animal carcass counts are collected, analyzed, and used throughout the United States and Canada.
