Practices for Developing Transparent Best Value Selection Procedures (2015)

Chapter Four: Best Value Case Examples That Support Transparency

INTRODUCTION

This chapter builds on the best value literature review and state-of-practice survey from the previous chapter. Its primary objective is to document the case examples and experiences of the agencies found to have the most effective best value experience. After explaining how the case examples were selected, the chapter provides readers with details to assist in developing fair, objective, and transparent best value procurement procedures.

SELECTION OF CASE EXAMPLES

The data from the national survey and literature review were used to select the state DOTs appropriate for further study. The following selection criteria were used:

1. Years of experience using best value procurement;
2. Use of best value procurement with different project delivery methods, including D-B-B, D-B, and CM/GC;
3. The number of best value projects;
4. Comprehensiveness and availability of best value process documentation; and
5. Willingness of agency personnel to participate in the study, as determined by the survey response.

Based on these criteria, 11 DOTs were invited to participate in the case example portion of the study. Participation required a structured interview, providing process documentation, and reviewing the final analysis for accuracy. Seven DOTs agreed to participate: California, Florida, Michigan, Minnesota, New York, Oregon, and Utah.

A structured interview protocol was used during data collection, and each DOT was interviewed using the same questions. The general question categories were:

1. Proposal evaluation criteria,
2. Selection methodology and award algorithm,
3. Structure of the evaluation committee,
4. Debriefing procedures,
5. Industry outreach efforts, and
6. Lessons learned.

Appendix B provides the complete list of protocol questions, and this chapter presents the findings in the same general order as the protocol.

CALIFORNIA DEPARTMENT OF TRANSPORTATION

Overview

With the passage of its D-B legislation in 2009, Caltrans began using best value procurement. Its thoroughly documented process builds on other state procedures. Similar to other DOTs, Caltrans uses a two-step procedure for best value selection:

• Step 1: RFQ/SOQ Evaluation—Prequalification of proposers; and
• Step 2: RFP/Proposal Evaluation—Selection of the final proposer.

In the first step, an RFQ is issued to solicit SOQs from interested proposers, which allows the agency to determine which proposers are qualified to successfully deliver the project. In the second step, Caltrans issues an RFP to the prequalified proposers, requesting that they submit proposals. After the evaluation is complete, the contract is awarded to the proposal offering the best value. It can be noted that Caltrans' initial authority required prequalification but did not allow shortlisting; its newer authority (AB 401) does allow shortlisting.

Evaluation Criteria/Award Algorithms

In the RFP, Caltrans lists the evaluation criteria, using both pass/fail and technical evaluation factors. Table 5 summarizes a typical list of RFP evaluation criteria, and Table 6 presents the adjectival rating guidelines for the technical factors.

The TRC evaluates the technical proposals against the technical factors and subfactors contained in the RFP, assessing and documenting the strengths and weaknesses of each proposal. It can be noted that proposals are evaluated only against the technical factors and are not compared with each other. After independently reviewing the proposals, the TRC meets to determine, by consensus, an adjectival rating ("Poor" to "Excellent") for each category and subcategory contained in the RFP. The adjectival ratings are then converted into a technical score based on adjectival conversion factors and weightings. Price proposals are evaluated only after the technical proposal evaluation: the lowest price is assigned the maximum points available for price, and the points for the other proposals are assigned on a prorated basis using the lowest price.
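The synthesis does not spell out the exact proration ratio Caltrans applies. The sketch below shows one common formulation, in which each proposal's price points are scaled by the ratio of the lowest price to that proposal's price; the prices and the 100-point scale are illustrative assumptions.

```python
def prorated_price_points(prices, max_points=100.0):
    """Assign price points: the lowest price earns max_points; every other
    proposal is prorated by lowest_price / its_price (a common proration;
    the exact Caltrans ratio is not stated in this synthesis)."""
    lowest = min(prices.values())
    return {name: max_points * lowest / price for name, price in prices.items()}

# Illustrative prices only:
print(prorated_price_points({"A": 6_700_000, "B": 6_500_000, "C": 6_300_000}))
# C (lowest price) receives 100.0; A and B receive about 94.0 and 96.9.
```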

TABLE 5
BEST VALUE EVALUATION CRITERIA

Pass/Fail factors: Legal; Financial; Technical/Quality; Management/Administration.
Evaluation criteria: Environmental Compliance and Public Outreach Plans; Responsiveness to RFP and Design Concept; Transportation Management Plan and Safety.

Source: Caltrans (2012b).

TABLE 6
RATING GUIDELINE

Excellent
• SOQ rating guideline: SOQ indicates significant strengths with few minor weaknesses, if any.
• Proposal rating guideline: The technical proposal demonstrates an approach with unique or innovative methods of approaching the proposed work with an outstanding level of quality. The technical proposal contains many significant strengths and few minor weaknesses, if any.

Very Good
• SOQ rating guideline: SOQ contains a few minor weaknesses that are outweighed by the strengths.
• Proposal rating guideline: The technical proposal demonstrates an approach offering unique or innovative methods of approaching the proposed work. The technical proposal contains many strengths that outweigh the weaknesses. Weaknesses, if any, are very minor and can be readily corrected.

Good
• SOQ rating guideline: SOQ contains weaknesses that are balanced by strengths.
• Proposal rating guideline: The technical proposal demonstrates an approach that offers an acceptable level of quality. The technical proposal contains strengths that are balanced by the weaknesses. Weaknesses are minor and can be corrected.

Fair
• SOQ rating guideline: SOQ contains weaknesses that are not offset by strengths. Weaknesses could adversely affect successful project performance.
• Proposal rating guideline: The technical proposal demonstrates an approach that marginally meets the RFP requirements/objectives. The weaknesses are not offset by the strengths. There are a significant number of weaknesses and very few strengths.

Poor
• SOQ rating guideline: SOQ contains significant weaknesses with very minor strengths, if any.
• Proposal rating guideline: The technical proposal demonstrates an approach that contains significant weaknesses/deficiencies and/or unacceptable quality. The technical proposal fails to meet the stated RFP requirements/objectives and/or lacks essential information and is conflicting and/or unproductive. There are a significant number of weaknesses and very few strengths, if any.

Source: Caltrans (2012a).

After completing the evaluations of the technical proposals and price proposals, Caltrans performs the final best value calculation to determine the Final Total Proposal Value (FTPV) using the formula shown below (a worked example follows the list of transparency strategies at the end of this subsection). The contract is awarded to the proposer with the lowest FTPV.

FTPV ($) = Proposal Price Value + Qualitative Value
Qualitative Value (QV) = Technical Score Value (in $) × (Maximum Technical Points − Technical Score Factor of Proposer)

where:
Technical Score Value (in $) = Lowest Proposal Price Value / Maximum Price Points
Technical Score Factor of Proposer = Maximum Technical Points × (Technical Score of Proposer / Highest Technical Score)

To increase the transparency and fairness of the evaluation process, Caltrans uses the following strategies:

• It provides in the RFP a detailed description of the technical evaluation factors, the objectives and requirements for each evaluation factor, their relative weights, and the information to be submitted.
• The rating of each evaluation factor is determined by a consensus of the TRC members.
• Price is considered only after completion of the technical proposal evaluation process.
• The adjectival conversion factors are kept sealed until the technical reviewers have completed evaluating all proposals.
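Expressed in code, the FTPV calculation gives the highest-rated proposal a qualitative penalty of zero and charges lower-rated proposals a dollar penalty for each technical point forgone. This is a minimal sketch of the published formula; the 100-point scales and the prices are illustrative assumptions.

```python
def ftpv(price, tech_score, lowest_price, highest_tech_score,
         max_tech_points=100.0, max_price_points=100.0):
    """Final Total Proposal Value (lower is better)."""
    # Dollar value of one technical point, anchored to the lowest price.
    tech_score_value = lowest_price / max_price_points
    # Proposer's technical score rescaled against the highest score received.
    tech_score_factor = max_tech_points * (tech_score / highest_tech_score)
    qualitative_value = tech_score_value * (max_tech_points - tech_score_factor)
    return price + qualitative_value

# Illustrative proposals: (price, technical score)
proposals = {"A": (10_000_000, 90.0), "B": (9_500_000, 80.0)}
lowest = min(price for price, _ in proposals.values())
best_tech = max(score for _, score in proposals.values())
for name, (price, score) in proposals.items():
    print(name, round(ftpv(price, score, lowest, best_tech)))
# A: 10,000,000; B: 10,555,556 -- A wins despite its higher price.
```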

Evaluation Committee

Caltrans uses two main committees to evaluate technical proposals: (1) the TRC, with its technical subcommittees, and (2) the process oversight committee. The TRC chairperson is the point of contact for the evaluators and is responsible for the proposal evaluation scoring and for documenting the evaluation process. The primary duty of the TRC and the technical subcommittees is to review the RFP and evaluation manual and assess the proposals.

First, the technical subcommittee submits to the TRC its suggestions on the strengths and weaknesses of the proposals. Next, the TRC evaluates the strengths and weaknesses and assigns an adjectival rating to each technical criterion. The score given by the TRC is the final score of the technical proposal.

The process oversight committee consists of a non-voting group of observers who perform the pass/fail evaluation of the proposals and observe the deliberations of the TRC. Figure 9 outlines the main steps of the technical proposal evaluation process.

FIGURE 9 Evaluation steps (Source: Caltrans 2012b): Step 1, pass/fail evaluation; Step 2, ATC responsiveness review; Step 3, technical proposal responsiveness; Step 4, technical proposal review; Step 5, technical scoring.

The roles and responsibilities of the evaluation committee members are as follows:

Technical Review Committee (TRC)
• Review the RFP and evaluation manual;
• Individually review and assess proposals;
• Review consensus strength and weakness reports from the technical subcommittees;
• Forward clarification requests to the TRC chair; and
• Determine the TRC consensus score, which becomes the official final technical proposal score for each proposal.

Technical Subcommittee
• Review the RFP and evaluation manual;
• Individually review and assess proposals;
• Record strengths and weaknesses on the worksheets provided;
• Forward clarification requests to the TRC chair; and
• Participate in consensus meetings.

Process Oversight Committee
• Provides a non-voting group of observers;
• Performs the pass/fail evaluation;
• Opens the price proposals at the public bid opening; and
• Submits price proposal information to the TRC chair.

Caltrans requires that the evaluation committee maintain and manage the fairness and integrity of the entire evaluation process; for example:

• Each member must execute a confidentiality agreement form, a conflict of interest agreement form, and a conflict of interest statement;
• Each member must have no contact with proposers during the process;
• No member may disclose the contents of proposals or proceedings; and
• Proposals and evaluation materials must be kept in a safe and secure location.

Training

Caltrans mandates training for evaluation committee members before the review and scoring of proposals. Committee members are required to sign a confidentiality and conflict of interest statement before the training session, which normally lasts two hours, and each member of the evaluation committee must participate in the session for a given project. During the training session, an overview of best value procurement and the rules and roles of the evaluation process are explained in detail. Further, the session provides a step-by-step description of how to rate the evaluation criteria, from the pass/fail factors to the technical factors of the proposals.

Debriefings

Upon request, Caltrans conducts debriefings for the unsuccessful proposers. The debriefing occurs in person approximately 90% of the time; the remaining 10% are conducted by phone, primarily when the proposers are not from the area. Caltrans tries to include members of the TRC in the debriefing meetings to help explain how each criterion was rated and the reasoning behind each rating. It is important to note that Caltrans does not discuss other proposals during the debriefings.

The debriefing meeting also outlines the strengths and weaknesses of the proposal based on the evaluators' comments from the technical proposal evaluation process. Caltrans asks proposers to submit their feedback on the evaluation process in order to continually improve its transparency and objectivity.

Lessons Learned

The Caltrans agency representative for this study provided the following lessons learned for developing and maintaining a transparent best value process:

• Evaluation criteria should be made as clear as possible to the proposers to ensure the success of the best value procurement process.
• The TRC must reach a consensus as a whole; it does not use averages of individual scores. The technical scoring committee can use a "+" and "-" system to reach consensus.
• The agency ensures that the RFP is well defined, as it believes this sets the project up for success.
• Clear communication between parties during the entire evaluation process is essential to achieving transparency in best value selection.
• Each project needs a single point of contact between the proposers and the agency in order to keep responses consistent.

FLORIDA DEPARTMENT OF TRANSPORTATION

Overview

FDOT uses best value procurement primarily for Adjusted Score Design-Build (ASDB) projects. ASDB is defined as follows:

[T]he contract award is based on the lowest adjusted score, which is determined by dividing the price proposal by the combined Expanded Letters of Interest score and technical proposal score. Under the ASDB procurement, a two phase process is used which combines the evaluation scores of the Expanded Letters of Interest (phase I) and the technical proposal (phase II). A maximum of 20 points may be awarded for the Phase I Expanded Letter of Interest, which would be added to the maximum of 80 points awarded for the Phase II technical proposal submittal (FDOT 2012).

Evaluation Criteria/Award Algorithms

FDOT selects best value projects in two phases: Phase I—evaluation of Expanded Letters of Interest (ELOIs), and Phase II—evaluation of technical and price proposals. FDOT typically uses a standard set of criteria for both phases; however, for any given project, this standard set may need to be modified to meet the facility's needs. To be successful, FDOT recommends that proposal evaluators participate in developing the evaluation criteria.

FDOT judges the relative ability of each submitting company or entity to perform the required services based on qualification information and the ELOI. Unless otherwise noted in the specific D-B advertisement, the criteria for evaluating the Phase I submittals include:

1. D-B firm name and prequalification.
2. Past performance evaluations, D-B project experience, organization, and staffing (0-7 total points):
– Contractor grades
– Professional consultant grades
– Performance history with other states or agencies, if none with the department
– D-B project experience of the contractor and professional consultant
– Similar types of work experience
– Environmental record
– Contractor experience modification rating (current year)
– D-B firm organization
– D-B firm staffing plan
– D-B firm coordination plan.
3. D-B project requirements and critical issues (0-13 total points):
– Understanding of D-B project requirements
– Identification of critical issues
– Outline for addressing critical issues.

To eliminate potential bias during the evaluation of ELOIs, FDOT uses the following strategies:

• Evaluate each responsive ELOI and compile information (i.e., data, comments, etc.) to support the ELOI scores;
• Check all evaluation categories to ensure minimum qualifications are met for the category; and
• Document the strengths and weaknesses of each proposer.

It can be noted that when the ELOI evaluation process is completed, proposal evaluators must attend the selection committee meeting to confirm their evaluations and scores. Once all proposers' scores are calculated, FDOT notifies each proposer of its ELOI score and the scores of all responsive proposers. Within 48 hours of receiving this information, proposers must declare their intent to participate in Phase II of the procurement process, the evaluation of price and technical proposals.

During Phase II, FDOT provides a template that includes a score for each item in the technical proposal (Table 7); the maximum number of points for the technical proposal is 80. FDOT notes that deviations from these items and their established point ranges must be approved by the Central Office. Also, for a particular project, the "credit will be given for" list under each item should be tailored to meet the facility's needs. FDOT also recognizes that the evaluation of ELOIs and technical proposals involves subjectivity.

TABLE 7
AN EXAMPLE OF TECHNICAL AND PRICE PROPOSAL EVALUATION IN FDOT

Example Technical Proposal (by RFP section):

1. Design (25-40 points). Credit will be given for the quality and suitability of the following elements:
• Structures design
• Roadway design and safety
• Drainage design
• Design coordination plan minimizing design changes
• Geotechnical investigation plan
• Geotechnical load test program
• Minimizing impacts to adjacent properties and structures through design
• Traffic control plan design
• Incident management plan
• Aesthetics
• Utility coordination and design

2. Construction (25-40 points). Credit will be given for the quality and suitability of the following elements:
• Safety
• Structures construction
• Roadway construction
• Drainage construction
• Construction coordination plan minimizing construction changes
• Minimizing impacts to adjacent properties and structures through construction
• Implementation of the environmental design and erosion/sediment control plan
• Implementation of the maintenance of traffic plan
• Implementation of the incident management plan
• Utility coordination and construction

3. Innovation (0-10 points). Credit will be given for introducing and implementing innovative design approaches and construction techniques that address the following elements:
• Minimize or eliminate utility relocations
• Materials
• Workmanship
• Enhance design and construction aspects related to future expansion of the transportation facility

4. Value Added (5-10 points). Credit will be given for the following value added features:
• Broadening the extent of the value added features of this RFP while maintaining existing threshold requirements
• Exceeding minimum material requirements to enhance durability of project components
• Providing additional value added project features proposed by the D-B firm

The following value added features have been identified by the department as being applicable to this project; the D-B firm may propose to broaden their extent:
• Value Added Asphalt: minimum value added period of 3 years
• Value Added Concrete Pavement: 5 years
• Value Added Bridge Components: 5 years
• Value Added Lighting: 3 years

Price Proposal: A price proposal guaranty in an amount of not less than five percent (5%) of the total bid amount shall accompany each proposer's price proposal.

The selection committee uses the following adjusted score formula:

Adjusted Score = (BPP + (PCT × TVC)) / TS

where:
BPP = Bid Price Proposal
PCT = Proposed Contract Time
TVC = Time Value Costs ($_______________ per day)
TS = Technical Score (combined scores from the ELOI and the technical proposal)

The final scoring would come out something like the example below:

Firm A: ELOI score 20, technical score 70, price $6.7 million, adjusted score 74,444
Firm B: ELOI score 18, technical score 62, price $6.5 million, adjusted score 81,250
Firm C: ELOI score 19, technical score 51, price $6.3 million, adjusted score 90,000

Source: FDOT (2012).
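A short sketch of the adjusted score computation reproduces the Table 7 example. The time-value term is assumed to be zero in that example, which is consistent with the published numbers (e.g., $6,700,000 / (20 + 70) ≈ 74,444).

```python
def adjusted_score(bid_price, eloi_score, tech_score,
                   contract_time_days=0, time_value_cost_per_day=0.0):
    """FDOT Adjusted Score Design-Build: (BPP + PCT * TVC) / TS, where TS
    combines the ELOI score (max 20) and the technical proposal score
    (max 80). The lowest adjusted score wins."""
    ts = eloi_score + tech_score
    return (bid_price + contract_time_days * time_value_cost_per_day) / ts

# Reproducing the Table 7 example (time value costs assumed zero):
firms = {"A": (6_700_000, 20, 70), "B": (6_500_000, 18, 62),
         "C": (6_300_000, 19, 51)}
for name, (price, eloi, tech) in firms.items():
    print(name, round(adjusted_score(price, eloi, tech)))  # 74444, 81250, 90000
```

Note that the lowest bidder (Firm C) does not win here: its weaker technical score more than offsets its price advantage.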

As a result, to reach unbiased and objective results, FDOT clearly states the roles of the evaluators as follows:

• Review the RFP and advertisement to gain a full understanding of the project and the proposers' expectations;
• Evaluate ELOIs based on the scoring criteria provided in the advertisement;
• Evaluate technical proposals based on the rating criteria provided in the RFP;
• Provide comments to defend scores—each score must be substantiated by comments;
• Keep comments concise, identifying the strengths and weaknesses of the proposal;
• Recognize that scoring does not need to be the same across all evaluators; however, each evaluator should be consistent across the teams in his or her scoring;
• Perform evaluations independently;
• Develop a good understanding of the evaluation criteria for each phase; and
• Attend meetings and make site visits.

Evaluation Committee

FDOT divides its evaluation committee into two groups. The first group is the technical review committee, which evaluates each proposal using direct point scoring and ranks all proposal scores. To maintain a transparent process, the technical review committee provides extensive comments and the rationale behind its ratings. During the evaluation process, individual TRC members evaluate the proposals independently, except where they may need to solicit advice from an expert, such as on the structural design of a bridge. Such independence helps FDOT arrive at a fair selection.

The second group is the best value selection committee. Based on the results from the first group, this group uses the adjusted score algorithm to identify the lowest adjusted score proposal. In the Central Office, FDOT requires that the selection committee include the appropriate assistant secretary or designee (who serves as chairperson), the appropriate director, and the appropriate office head or an appointee of the chairperson; the manager of the contractual services office serves as a non-voting member and as recording secretary at all meetings. In the districts, the selection committee contains the district secretary (who serves as chairperson), the appropriate director, and the appropriate office head or an appointee of the district secretary; a representative from the contracting unit serves as a non-voting member and as recording secretary at all meetings. It can be noted that by assigning non-voting members as recording secretaries, FDOT enhances the objectivity of the evaluation process.

Industry Outreach

In the interview, FDOT indicated that a strong connection with industry is important for implementing a best value approach. FDOT has worked closely with industry to develop guidelines and standard requirements for best value selection. Through this process, the agency learns industry's viewpoints and, correspondingly, industry gains a better understanding of the agency's expectations. This process indirectly improves the transparency and objectivity of the best value approach. Moreover, FDOT is willing to walk contractors and design firms through the "ins and outs" of the best value selection process to make sure they completely understand what is required.

Training

FDOT provides a number of workshops related to best value selection. Proposal evaluator training is one of the sessions designed to help the department establish a transparent best value approach. The main goal of this training is to convey the standards for evaluating ELOIs and technical proposals. The training provides participants with detailed information on the two-phase ASDB/best value procurement, ELOI requirements, ELOI evaluation criteria, guidelines for ELOI evaluation, technical proposal evaluation criteria, guidelines for technical evaluation, and industry feedback on best value procurement.

Procurement Meetings and Debriefings

FDOT conducts various procurement meetings and debriefings to foster transparency in best value selection. These meetings are briefly described in the following sections.

Pre-bid Meetings

FDOT conducts pre-bid meetings, which are public, to discuss project details and clarify any concerns. The main objectives of this meeting are to provide a setting for all parties to discuss the proposed project goals and objectives, clarify the evaluation criteria, and review other relevant issues. The outcomes of this meeting help finalize the RFP. FDOT notes that all proposers receive the same information from the pre-bid meeting in a timely manner. At the end of this meeting, the contracting unit and the project manager update the evaluation criteria if needed, and FDOT communicates any criteria changes to every proposal evaluator and proposer in a timely fashion.

One-on-One Alternative Technical Concept Meetings

After the pre-bid meeting, one-on-one meetings can be used to discuss proposers' ideas for Alternative Technical Concepts (ATCs). During these meetings, the proposer has an opportunity to bring up different ideas for the project, which the department can accept, request more information on, or deny. All information discussed regarding ATCs is kept confidential.

Page-Turn Meetings

In the "page-turn" meeting, FDOT meets formally with each proposer for 30 minutes after the technical proposals have been submitted. FHWA is invited to sit in on federal-aid oversight projects. The goal of the page-turn meeting is for the D-B firm to guide the TRC through the technical proposal, highlighting the sections that the firm wishes to emphasize. The meeting occurs between the date the technical proposal is due and the question and answer session, in accordance with the schedule of events section of the RFP, and the department terminates it promptly at the end of the allotted time. An audiotape or videotape record of all or part of the page-turn meeting is maintained and becomes part of the contract documents. The page-turn meeting does not constitute discussions or negotiations.

An unmodified aerial photograph or map of the project limits provided by the D-B firm is acceptable for reference during this meeting; it may not be left with the department upon conclusion of the meeting. Use of other visual aids, electronic presentations, handouts, etc., during the page-turn meeting is expressly prohibited. At the end of the 30 minutes, the TRC is allowed 5 minutes to ask questions pertaining to information highlighted by the D-B firm. Participation in the page-turn meeting is limited to five D-B firm representatives, and firms desiring to opt out of the meeting may do so by submitting a request to the department.

The page-turn meeting is the proposers' best opportunity to describe their proposal process and ideas to the evaluators. From the comments and feedback the state has received regarding the best value selection process, this meeting is the most important to industry, which believes FDOT should continue it.

Question and Answer Meetings

FDOT may meet formally with each proposer for a question and answer (Q&A) session. FHWA is invited to sit in on federal-aid oversight projects. The purpose of the Q&A session is to seek clarification and ask questions related to the technical proposal. The department may terminate the Q&A session promptly at the end of the allotted time and may audiotape or videotape all or part of the session; all such recordings become part of the contract documents. The Q&A session does not constitute "discussions" or negotiations. Proposers are not permitted to ask questions of the department except to clarify a question posed by the department. No supplemental materials, handouts, etc., may be presented in the Q&A session, and no additional time is allowed to research answers.

Within one week of the Q&A session, the D-B firm submits to the department a written clarification letter summarizing the answers provided during the session. The questions, answers, and written clarification letter become part of the contract documents and are considered by the department as part of the technical proposal. The D-B firm shall not include in the clarification letter information not discussed during the Q&A session. The department provides some (not necessarily all) proposed questions to each D-B firm, as they relate to its technical proposal, approximately 24 hours before the scheduled Q&A session.

Debriefings

Debriefings are important for developing transparent best value approaches. In its debriefing meetings, FDOT discusses how and why a proposal received a certain score. Because of Florida's Sunshine Law, all proposal information may be discussed unless something in the proposal, or an ATC within it, is deemed proprietary. FDOT stated that it has never had a protest go to court, because the process and how each score was arrived at are explained to all proposers during the debriefing meetings.

Lessons Learned

The FDOT agency representative for this study provided the following lessons learned for developing and maintaining a transparent best value process:

• The department must ensure that the advertisement and RFP are clear and concise.
• The TRC needs to be well trained so that its members have a comprehensive understanding of how the technical proposal will be scored and how to provide candid comments to support their scores.
• The department should conduct its various procurement meetings (i.e., pre-bid, one-on-one, page-turn, and Q&A meetings) and debriefings as public meetings to foster transparency.
• FDOT provides open access to pre-bid questions and posts responses on a website for all proposers to review.
• The agency conducts D-B training workshops and solicits feedback from the internal and external parties involved.
• Communication is key to obtaining transparency during the best value evaluation process.

MICHIGAN DEPARTMENT OF TRANSPORTATION

Overview

MDOT uses either a one-step or a two-step best value procurement process. In the one-step approach, all proposers submit technical qualifications and other required criteria before or simultaneously with their price proposals. In the two-step approach, an RFQ is issued in the first step to shortlist the proposers, who respond by submitting an SOQ for their team. After the shortlisting phase, all proposers are considered equal, and the criteria used for the RFQ are not included in the second step (the final technical proposal evaluation and scoring). During the interview, the MDOT project manager emphasized that the agency uses best value only on appropriate D-B and D-B-B projects that tangibly benefit from selection by non-price factors.

Evaluation Criteria/Award Algorithms

MDOT indicated in the interview that establishing a well-defined list of evaluation criteria is one of the most important factors in achieving a fair and transparent best value selection. MDOT does not have a standard template for best value projects; instead, it conducts best value selection on a project-by-project basis. To enhance the fairness of the evaluation process, MDOT's guide notes that "When developing the list of items to be evaluated and scored, the selection team should focus on project specific needs that can be objectively defined, evaluated, and scored. However, some subjectivity may be used as long as a consistent approach to scoring is documented by the selection team" (MDOT 2013, italics added).

For a given project, the selection team develops evaluation criteria for the technical portion of the evaluation. The technical criteria can be a single item (e.g., the aesthetics of a bridge or the approach to maintaining traffic) or multiple items (e.g., the contractor's qualifications, innovations, and understanding of the project). The following sections summarize the evaluation criteria and award algorithm for the M-21 over I-75 Bridge Replacement Project.

Evaluation Criteria Used for M-21 over I-75 Bridge Replacement Project

For this project it was imperative to minimize the impacts on public mobility while still keeping safety in mind. As a result, mobility carried a 50% weight in the technical score, 60% of which was based on user delay cost and 40% on the proposer's traffic management plan. In addition, clearly defined evaluation criteria and their weights were included in the RFP for transparency. Table 8 summarizes the evaluation criteria, along with their descriptions and weights (MDOT 2008).

TABLE 8
EVALUATION CRITERIA FOR M-21 OVER I-75 BRIDGE REPLACEMENT PROJECT

Mobility (maximum 50 points): MDOT's goal is to minimize impact to the traveling public while getting the work completed as quickly and safely as possible. Scoring will be greatest for those proposers who provide a mobility plan that minimizes impact to the traveling public while ensuring fast, efficient, and high-quality construction. MDOT will review and score mobility in two parts: Part 1 (30 points) for user delay costs and Part 2 (20 points) for the traffic management plan.

Progress Schedule (maximum 20 points): The scoring represents MDOT's goal to provide a project that is substantially completed with the shortest construction schedule.

Quality Assurance/Quality Control (QA/QC) (maximum 15 points): Provide a QA/QC plan that addresses both design and construction activities. This document should address how errors are minimized and what process is used to oversee work, and should show the authority of QA/QC reviewers to change or stop work.

Project Communications (maximum 10 points): Provide a communication plan that outlines both the internal communication of the design-build team and the proposed communication with MDOT, the firm performing design assistance, and the firm performing the construction engineering.

Aesthetics (maximum 5 points): Provide an explanation of how the proposal addresses a structure that has positive aesthetics and why.

Source: MDOT (2008).

Award Algorithm Used for M-21 over I-75 Bridge Replacement Project

This best value contract was awarded based on three rounds of evaluation. First, MDOT conducted an initial review of the technical proposals for responsiveness. Second, MDOT conducted a pass/fail evaluation; the minimum technical proposal score required to be responsive was 40 points. The proposals were evaluated against the following pass/fail criteria:

• The major participants and key personnel have not changed since the submission of the SOQ.
• The terms, conditions, ideas, concepts, and techniques of the proposal comply with all governmental rules.
• Proposer information, certifications, and documents are complete, accurate, and responsive.

Third, MDOT evaluated the technical proposals based on direct point scoring on a 100-point scale (Table 8). Finally, the best value contract was awarded based on a composite score using the following formula:

Final Best Value Score = (30% × Proposal Price) + (70% × (Proposal Price / (Technical Evaluation Score × 0.01)))
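Read as code, the composite formula blends 30% of the raw price with 70% of a technically adjusted price, so a lower composite score is better. This sketch assumes the parenthesization shown above (multiplying the 100-point technical score by 0.01 converts it to a fraction) and uses illustrative prices and scores.

```python
def mdot_best_value_score(price, technical_score):
    """M-21 composite best value score (lower is better). The 100-point
    technical score is converted to a fraction (score * 0.01), which
    inflates the effective price of weaker proposals; the exact
    parenthesization of the published formula is assumed."""
    return 0.30 * price + 0.70 * (price / (technical_score * 0.01))

# Illustrative proposals (a 40-point minimum technical score applies
# for responsiveness, per the pass/fail round above):
print(round(mdot_best_value_score(5_500_000, 95)))  # 5702632
print(round(mdot_best_value_score(5_000_000, 80)))  # 5875000
# The higher-priced but higher-scored proposal wins under this weighting.
```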

Evaluation Committee

The MDOT best value evaluation committee often includes a project manager, a construction engineer, and other personnel related to the project. The committee specifically includes a member of the statewide Central Selection Review Team to help mitigate biases and ensure a defensible evaluation process. Although MDOT does not use an oversight committee for the whole process, the presence of a Central Selection Review Team member helps increase the transparency and fairness of the evaluation. The evaluators are kept in isolation until they finish rating each technical proposal.

To maintain transparency, an MDOT project manager is the sole agency contact for clarification requests and other communications about the project, the RFP, and the proposal submittal, and MDOT does not accept oral requests for clarification (in person or by phone). Proposers are not allowed to discuss the RFQ or RFP with other MDOT staff members or MDOT consultants involved in the project before the contract is awarded, and MDOT staff members or consultants must notify the MDOT project manager if proposers discuss the project with them during the procurement phase.

Procurement Meetings and Debriefings

MDOT conducts both procurement meetings and debriefings to enhance the fairness and transparency of the selection process. For example, MDOT required a mandatory pre-bid meeting on the M-21 project that was open to all proposers. In this meeting, MDOT answered general questions related to the project and the best value selection. For specific questions, the proposers were required to submit requests for clarification, and the responses were then distributed to all proposers, providing each team with the same information.

Debriefings are conducted within 60 days of awarding a contract. The debriefing can be by phone, but it is typically conducted in person. In the debriefing meetings, the proposers are provided with information from the evaluation process regarding their scores and the strengths and weaknesses of their proposals. It should be noted that the debriefing may not include point-by-point comparisons of evaluation criteria between proposers. However, Michigan's Freedom of Information Act allows the proposers to request all information received by MDOT after the project has been awarded; it provides for certain information received by a state agency to be disclosed to the public. Debriefings cannot reveal any information exempt from release under the Freedom of Information Act.

Lessons Learned

The MDOT agency representative for this study provided the following lessons learned for developing and maintaining a transparent best value process:

• The list of evaluation criteria must be well defined.
• Evaluation criteria and award algorithms are to be developed on a project-by-project basis.
• The MDOT project manager serves as a single point of contact for clarification requests and other communications with proposers during best value selection.
• Procurement meetings and debriefings help enhance the fairness and transparency of the selection process.
• Michigan's Freedom of Information Act fosters transparency in best value selection.

MINNESOTA DEPARTMENT OF TRANSPORTATION

Overview

MnDOT has used best value procurement for both D-B and D-B-B projects. In 2013, MnDOT published a best value manual for D-B-B projects to increase the consistent use of best value selection (MnDOT 2013). In 2011, MnDOT published a manual for D-B projects, which likewise enhances consistency in best value selection (MnDOT 2011). Both manuals are heavily referenced in the descriptions of the process that follow.

Evaluation Criteria/Award Algorithms

Evaluation criteria are determined differently for D-B-B and D-B projects. The D-B-B best value manual recommends the following five main categories of best value evaluation criteria:

1. Qualifications of personnel: Depending on project characteristics, personnel with specific licensure, training, or certifications may add more value to the project. These criteria can be evaluated using qualitative scores or pass/fail ratings.
2. Experience of personnel on similar projects (pass/fail criteria): Personnel with experience on similar projects are required for the project to be successful.
3. Experience of contractor on similar projects (pass/fail criteria): Contractors with experience on projects of a similar size, type, or complexity may benefit the project.
4. Availability of key personnel, equipment, or materials (pass/fail criteria): Their availability will be critical to successfully completing the project; the contractor indicates the availability of these items in the proposal.
5. Ability to meet the completion date: Establish pass/fail criteria related to project completion requirements and request contractor completion dates (MnDOT 2013).

To enhance transparency, MnDOT requires that after an award all technical proposals be filed and that all technical and cost proposals be open to public inspection as required or permitted by the Minnesota Government Data Practices Act.

These documents are available for viewing, and the results are posted publicly in accordance with MnDOT standards (MnDOT 2013).

Different from the D-B-B best value process, D-B best value criteria and selection methodologies are always presented in the RFQ and RFP for a given project. Price is not considered or evaluated in step 1 of the procurement process (the SOQ evaluation). Table 9 summarizes a typical list of RFQ criteria.

TABLE 9
RFQ EVALUATION CRITERIA

Pass/Fail factors: Legal; Financial.
Technical/Quality factors: Submitter organization and experience; Key personnel experience; Project management approach; Project understanding.

Source: MnDOT (2011).

It can be noted that MnDOT does not provide a standard set of RFP criteria; instead, it develops technical evaluation criteria on a project-by-project basis. To enhance the fairness and transparency of the evaluation process, MnDOT indicates that the evaluation criteria should:

• Be clear, defendable, and easy for the proposers and the public to understand;
• Not overlap scoring criteria in the SOQ, especially with respect to key personnel who have already been evaluated in the SOQ;
• Focus on items that bring measurable value to the project;
• Be tailored to the individual project (avoid or minimize recycling criteria from project to project); and
• Be appropriately balanced against the weight of the price proposal.

Although the evaluation criteria differ between the RFQ and the RFP, MnDOT uses a five-point adjectival scoring system to rate both SOQs and proposals. Table 10 presents the evaluation guideline for these adjectival ratings. Each technical proposal receives a maximum score of 100 points: 50 points for responsiveness criteria and 50 points for technical merit. The adjusted score is determined by dividing the proposal price by the technical proposal score; the price proposals are reviewed for responsiveness after the calculation of adjusted scores. Table 11 illustrates an example of the evaluation process.

To enhance the transparency and fairness of the evaluation process, MnDOT uses the following strategies:

• The agency does not offer scorers any unique instruction that is not accessible or visible to the proposers.
• The agency provides in its RFQs and RFPs a detailed description of the technical evaluation factors, the objectives and requirements for each factor, the factors' relative weights, and the information to be submitted. The scoring criteria must speak for themselves, without any interpretation from those who created them.
• TRC members independently score each proposal by assigning a percentage based on the qualitative assessment ratings and multiplying that percentage by the maximum total points for each category.
• The TRC chair, with assistance from the Process Oversight Committee (POC), determines the average score for each technical proposal from all of the scores provided by the TRC members.
• The evaluation committee can use a clarification or communication process to resolve any ambiguities, errors, or omissions related to the criteria stated in the RFQs and RFPs.
• The rating process must be documented on the worksheet for each evaluation factor.
• Evaluation teams and the selection committee must clearly document the strengths, weaknesses, deficiencies, and risks associated with each factor on the worksheet.

Evaluation Committee

There are minor differences in the structure of the evaluation and selection committees between D-B and D-B-B best value projects. For D-B-B best value projects, the evaluation committee comprises the TRC, the POC, and Technical Advisors (TAs). The evaluation committee for D-B best value projects adds a technical subcommittee to the TRC, POC, and TAs. Minnesota State Statute 161.3420 requires that the TRC comprise at least five members, one of whom shall be the Associated General Contractors (AGC) representative (MnDOT 2011). Figure 10 shows the structure of the evaluation committee for the D-B best value selection process. It can be noted that the Commissioner of Transportation is responsible for appointing the evaluation committee members.

FIGURE 10 Evaluation committee structure (Source: MnDOT 2011).

The roles and responsibilities of the evaluation committee members are described in the following sections.

Process Oversight Committee (POC)

• A group of non-scoring observers (e.g., a program manager, an FHWA representative, or a representative from the protest official's office) appointed to observe the evaluation process and provide support to the TRC and TAs, if necessary.
• The POC may submit a written report and/or specific questions to the TRC chair to be used during any oral presentations.

TABLE 10
QUALITATIVE RATING GUIDE

Excellent (E), score 90-100%
SOQ description: Submitter has exceptional qualifications. SOQ supports an extremely strong expectation of successful project performance. SOQ indicates significant strengths with few minor weaknesses, if any. SOQ contains an outstanding level of quality.
Proposal description: Proposal demonstrates an approach with unique or innovative methods of approaching the proposed work with an exceptional level of quality. Proposal contains many significant strengths and few minor weaknesses, if any. There is very little risk that the proposer would fail to satisfy the requirements of the D-B contract.

Very Good (VG), score 75-89%
SOQ description: Submitter has strong qualifications. SOQ supports a very good expectation of successful project performance. SOQ contains a few minor weaknesses that are outweighed by the strengths.
Proposal description: Proposal demonstrates an approach offering unique or innovative methods of approaching the proposed work. Proposal contains many strengths that outweigh the weaknesses. There is little risk that the proposer would fail to satisfy the requirements of the D-B contract. Weaknesses, if any, are very minor and can be readily corrected.

Adequate (A), score 51-74%
SOQ description: Submitter has sufficient qualifications. SOQ supports an adequate expectation of successful project performance. SOQ contains weaknesses that are balanced by strengths.
Proposal description: Proposal demonstrates an approach that offers an adequate level of quality. Proposal contains strengths that are balanced by the weaknesses. There is some probability of risk that the proposer may fail to satisfy some of the requirements of the D-B contract. Weaknesses are minor and can be corrected.

Fair (F), score 25-50%
SOQ description: Submitter has limited qualifications. SOQ supports a fair expectation of successful project performance. SOQ contains weaknesses that are not offset by strengths. Weaknesses could adversely affect successful project performance.
Proposal description: Proposal demonstrates an approach that marginally meets RFP requirements and/or objectives. Proposal contains weaknesses that are not offset by the strengths. There are questions about the likelihood of success, and there is a risk that the proposer may fail to satisfy the requirements of the D-B contract.

Poor (P), score 0-24%
SOQ description: Submitter has little or no qualifications. SOQ supports a weak expectation of successful project performance. SOQ contains significant weaknesses with very minor strengths, if any.
Proposal description: Proposal demonstrates an approach that does not meet the stated RFP requirements and/or objectives, lacks essential information, is conflicting, is unproductive, and/or increases MnDOT's risk. Proposal contains many significant weaknesses and very minor strengths, if any. There is not a reasonable likelihood of success and a high risk that the proposer would fail.

Source: MnDOT (2011).

TABLE 11
EXAMPLE OF EVALUATION PROCESS

Proposer A: technical score 85.00, price $6,808,808.00, adjusted score (price/technical score) 80,103.62
Proposer B: technical score 82.61, price $7,496,356.00, adjusted score 90,743.93
Proposer C: technical score 93.40, price $7,218,533.00, adjusted score 77,286.22
Proposer D: technical score 89.72, price $6,406,360.00, adjusted score 71,403.92

Source: TH 2 Crookston Slope Stability Project (MnDOT 2014).
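A brief sketch, using the Table 11 figures, shows how a category score is built from the TRC members' independent percentage ratings (per the Table 10 bands) and how the adjusted score then folds in price. The three-member ratings in the usage example are illustrative assumptions.

```python
from statistics import mean

def category_score(member_percentages, max_points):
    """Each TRC member rates a category with a percentage chosen within the
    adjectival band of Table 10; the score is that percentage times the
    category maximum, averaged across members by the TRC chair."""
    return mean(pct * max_points for pct in member_percentages)

def mn_adjusted_score(price, technical_score):
    """MnDOT D-B adjusted score: proposal price divided by the 100-point
    technical score; the lowest adjusted score wins."""
    return price / technical_score

# Illustrative: three members rate one 50-point category "Very Good".
print(round(category_score([0.80, 0.78, 0.85], 50), 2))  # 40.5

# Reproducing Table 11 (TH 2 Crookston Slope Stability Project):
proposals = {"A": (6_808_808, 85.00), "B": (7_496_356, 82.61),
             "C": (7_218_533, 93.40), "D": (6_406_360, 89.72)}
for name, (price, score) in proposals.items():
    print(name, round(mn_adjusted_score(price, score), 2))
# D posts the lowest adjusted score (71,403.92) and would be selected.
```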

33 Technical Advisors (TAs) • TAs (i.e., members of the core project team) serve as advisors to the TRC and provide their input to the TRC members during the evaluation process. • These members do not score the proposals. Technical Review Committee (TRC) • TRC usually includes five members who perform the eval- uation and scoring of the proposals. There is a chair for this committee who is the head of the evaluation process. • Each TRC member performs an independent review of each submitted technical proposal. All TRC members have an equal weight in scoring the proposals. • TRC chair serves as a point of contact if a TRC member, Technical Subcommittee (TS) member, or TA has ques- tions relative to the evaluation process. • Submits written requests for clarification to proposers if the evaluation team determines that a proposal contains unclear information or otherwise needs clarification. Technical Subcommittee (TS) • TS members include individuals with expertise in specific fields relative to the technical scoring criteria. • TS members serve as advisors to TRC members dur- ing the evaluation process. They sometimes come to the evaluation meetings and make presentations. • TS members often submit their strength and weakness assessments to the TRC chair for distribution to the TRC members for consideration when completing the scoring matrices. With regard to the fairness and transparency of the selec- tion process, MnDOT requires the evaluation committee to maintain and manage the integrity of the entire evaluation process. The evaluation committee typically performs the following tasks: • All personnel sign certifications of confidentiality and non-disclosure, and statements concerning conflicts of interest. • The deliberations of all teams and committees and the knowledge of individual participants in the evaluation process must be held in the strictest confidence. • All information provided by the proposers or generated by the evaluation must be safeguarded. • No information regarding the contents of the proposals; the deliberations by the TRC, TS, or TA; recommenda- tions to the Commissioner of Transportation; or other information relating to the evaluation process is to be released or be publicly disclosed without the authoriza- tion of the TRC chair. • The TRC chair is responsible for all communication outside the proposal evaluation and TRC. Training MnDOT conducts training for the technical evaluation com- mittees before they are allowed to proceed to the evaluation process. Training is based on the MnDOT manual developed by the Office of Construction and Innovative Contracting. The training session often involves reviewing the evaluation manual and obtaining lessons learned from past evaluation processes or relevant case protests. The full TRC teams (i.e., the FHWA on the POC, AGC representative, etc.) are invited to the launch. The contractors and consultant are not invited to the training session. Training is intended to provide guidance to the evalua- tion committees to ensure it is done in a fair and transparent manner. During the training sessions, the evaluation commit- tee is updated on any changes in best value legislation and how to adapt these changes into the evaluation process. An overview of the best value procurement process is presented to all committee members to ensure that all evaluators fully understand best value concepts and the evaluation process. 
The evaluation committee members are also trained to determine what is appropriate and inappropriate when rating the evaluation criteria and how to avoid bias that may occur during the evaluation process.

Debriefings

MnDOT conducts oral debriefings if it receives a request from an unsuccessful proposer. The objective of the debriefings is to provide feedback on the proposer's SOQ and technical proposal. The debriefings for technical proposals are conducted within 60 days after the contract is awarded. If there is a protest over the award of the contract, MnDOT will delay the debriefing process. Debriefings can be conducted in person or by phone with the proposers. Unlike other DOTs, MnDOT encourages a local AGC member, who is part of the technical evaluation, to conduct debriefings with the unsuccessful proposers. One of the most important parts of the debriefings is to provide the comments on strengths and weaknesses of both the SOQs and technical proposals that were made during the evaluation process. In addition, MnDOT explains all evaluators' comments in detail to the proposers.

FIGURE 10 Evaluation committee structure (Source: MnDOT 2011).

MnDOT notes that the debriefings may not include point-by-point comparisons of the debriefed proposer's proposal with other proposals. To ensure the evaluation process is fair and transparent, the score breakdown of the SOQs and technical proposals and the evaluators' comments are placed on a public website. MnDOT keeps the proposals in electronic format so that staff can easily search for relevant information with regard to transparency in the evaluation process.

Lessons Learned

The MnDOT agency representative for this study provided the following lessons learned for developing and maintaining a transparent best value process:

• It is important that the evaluation criteria be written as clearly and fully as possible, and be balanced appropriately against cost. There is no hidden information in the evaluation manual.
• Evaluators must respect the process, read the proposals thoroughly, and make appropriate decisions.
• It may appear appropriate to include members of local partners on TRCs for projects in which they are very involved; however, it is important to be cautious. Such individuals may not be willing or able to submit to the process and may score based on the interests of their county or town instead of following the manual.
• It is suggested that the interview and clarification process be implemented appropriately. Interviewing is very useful for obtaining transparency because it helps clear up any misconceptions; scores based on misconceptions degrade industry acceptance of the process. Interviews are time-intensive for all involved; therefore, it may make sense to rely on the clarification process instead.
• Because all scoring comments are made public, the evaluators must record comments that reflect their opinions. Any comments that no longer reflect their views after the roundtable discussion must be corrected.
• It is suggested that non-proposing contractors be allowed to view the proceedings (under confidentiality, of course).
• The value of a point should be emphasized and used during the evaluation process, even for adjectival rating systems. The evaluators should not round their scores. For example, some individuals like to score 85, 90, and 95 only; when each point is worth $2.5 million, this scoring habit can be a problem.

NEW YORK STATE DEPARTMENT OF TRANSPORTATION

Overview

For the last three years, NYSDOT has utilized best value procurement in both D-B-B and D-B projects. The D-B-B process is conducted in one step but involves two parts. The first part contains traditional construction plans, proposals, bid items, and quantities. The second part includes a description of the technical evaluation factors, their relative weights, and the weighting of price versus technical evaluation factors.

For D-B projects, a two-step best value selection approach is used. In the first step, NYSDOT uses a two-way feedback clarification process among proposers, the project management team, the evaluation team, and the selection committee to resolve ambiguities, errors, omissions, or mistakes in an SOQ. In step 2, NYSDOT uses a communication process among the project management team, evaluation team, and selection committee to resolve ambiguities and uncertainties arising from the RFQs and RFPs during the evaluation process. For each best value project, NYSDOT prepares in advance a procurement management plan that outlines the evaluation factors, the evaluation teams, and the selection process.
Evaluation Criteria/Award Algorithms

D-B evaluation criteria are established in both the RFQ and RFP for a given project. Price is only considered and evaluated in the second step. NYSDOT indicated that, for some projects, the price component may account for 50% of the adjusted price award algorithm. For other projects, when the budget is more important, the price can reach up to 80% of the adjusted price algorithm. NYSDOT uses pass/fail and quality evaluation factors in both the RFQ and RFP. Table 12 summarizes a typical list of pass/fail and quality RFQ and RFP evaluation factors. The guidelines used to evaluate quality factors are presented in Table 13.

TABLE 12 BEST VALUE EVALUATION CRITERIA

Type of Factor | RFQ Evaluation Factors | RFP Evaluation Factors
Pass/Fail | Legal | Legal
Pass/Fail | Financial | Financial
Pass/Fail | SOQ responsiveness | Proposal responsiveness
Technical/Quality | Experience | Experience and qualifications
Technical/Quality | Past performance | Management approach
Technical/Quality | Capacity | Technical solutions
Technical/Quality | Project understanding | Project support

Source: NYSDOT (2011).

To enhance the transparency and fairness of the evaluation process, NYSDOT uses the following strategies:

• NYSDOT provides a detailed description of the quality evaluation factors, the objectives and requirements for each quality evaluation factor, the relative weights of the quality evaluation factors, and the information to be submitted in its RFQs and RFPs.
• The result of rating individual evaluation factors must be arrived at through consensus of the members of the evaluation teams and the selection committee, as applicable.
• Price is only evaluated in the RFP/proposals evaluation process.
• Evaluation teams and the selection committee can use the clarification or communication process to resolve any ambiguities, errors, and omissions related to the criteria stated in RFQs and RFPs.
• SOQ ratings do not carry over to the RFP/proposals evaluation process.
• The rating process must be documented on the worksheet for each evaluation factor.
• Evaluation teams and the selection committee must clearly document strengths, weaknesses, deficiencies, and risks associated with each factor in the worksheet.
• Narratives are required for each qualitative/descriptive rating.

NYSDOT indicated that "after the evaluation is complete, the selection committee will prepare a written evaluation narrative to accompany the qualitative/descriptive rating of each proposal. The narrative will include strengths, weaknesses, and deficiencies for each proposal and will fully support the qualitative/descriptive rating assigned" (NYSDOT 2011).

One example of a D-B best value project is the Tappan Zee Hudson River Crossing Project. This project defines "best value" as "the greatest overall benefit, under the specified selection criteria, obtained through the tradeoff between price and technical benefits" (New York State Thruway Authority 2012). The RFP for this project indicated that technical merits and price were given approximately equal weighting for best value evaluation.

TABLE 13 EVALUATION GUIDELINES

Exceptional
SOQ Rating Guidelines: The Proposer has provided information relative to its qualifications that is considered to significantly exceed stated objectives/requirements in a beneficial way and indicates a consistently outstanding level of quality. There are essentially no weaknesses.
Proposal Rating Guidelines: The Proposer has demonstrated an approach that is considered to significantly exceed stated criteria in a way that is beneficial to the department. This rating indicates a consistently outstanding level of quality, with very little or no risk that this Proposer would fail to meet the requirements of the solicitation. There are essentially no weaknesses.

Good
SOQ Rating Guidelines: The Proposer has presented information relative to its qualifications that is considered to exceed stated objectives/requirements and offers a generally better than acceptable quality. Weaknesses, if any, are very minor.
Proposal Rating Guidelines: The Proposer has demonstrated an approach that is considered to exceed stated criteria. This rating indicates a generally better than acceptable quality, with little risk that this Proposer would fail to meet the requirements of the solicitation. Weaknesses, if any, are very minor.

Acceptable
SOQ Rating Guidelines: The Proposer has presented information relative to its qualifications that is considered to meet the stated objectives/requirements and has an acceptable level of quality. Weaknesses are minor and can be corrected.
Proposal Rating Guidelines: The Proposer has demonstrated an approach that is considered to meet the stated criteria. This rating indicates an acceptable level of quality. The Proposal demonstrates a reasonable probability of success. Weaknesses are minor and can be readily corrected.

Potential to Become Acceptable
SOQ Rating Guidelines: N/A
Proposal Rating Guidelines: The Proposer has demonstrated an approach that fails to meet stated criteria, as there are weaknesses and/or deficiencies, but they are susceptible to correction through discussions. The response is considered marginal in terms of the basic content and/or amount of information provided for evaluation, but overall the Proposer is capable of providing an acceptable or better Proposal.

Unacceptable
SOQ Rating Guidelines: The SOQ fails to meet the stated objectives and/or requirements and/or lacks essential information and is conflicting and/or unproductive. Weaknesses/deficiencies are so major and/or extensive that a major revision to the SOQ would be necessary and/or are not correctable.
Proposal Rating Guidelines: The Proposer has demonstrated an approach that indicates significant weaknesses/deficiencies and/or unacceptable quality. The Proposal fails to meet the stated criteria and/or lacks essential information and is conflicting and/or unproductive. There is no reasonable likelihood of success; weaknesses/deficiencies are so major and/or extensive that a major revision to the Proposal would be necessary.

Source: NYSDOT (2011). N/A = not applicable.

Table 14 summarizes the quality and technical factors defined in the RFP. These factors and their sub-factors were rated using ten-level adjectival ratings, as shown in Table 15. This project received three proposals. The technical factors were evaluated by a nationally recognized team of subject-matter experts. The leaders of the technical review teams (the Authority's Value Assessment Team) summarized the strengths and weaknesses for each evaluation factor and submitted them to the selection committee.

As described earlier, all materials that could reveal a proposer's identity were removed. The three proposers were coded using nicknames: Catskills, Oneida, and Niagara. Table 16 presents the results of the technical rankings and price proposal evaluations. The project was awarded to Niagara, which combined a low price with an acceptable technical proposal.

To provide more transparency in the selection process, the scoring committee conducted a best value tradeoff comparison between Niagara and Oneida (Catskills was eliminated from the process because it ranked lower technically and was priced higher than Oneida). Tables 17 and 18 respectively summarize the superior elements of each proposal.

In contrast with the D-B best value procedure, NYSDOT has recently employed the best value approach for the traditional D-B-B delivery method. As summarized in the introduction to this section, the D-B-B best value process involves two parts. The first part contains traditional construction plans, proposals, bid items, and quantities. The second part includes a description of the technical evaluation factors, their relative weights, and the weighting of price versus technical evaluation factors.

TABLE 14 EVALUATION FACTORS

Factor: Design and Construction Solution
Sub-factors: Construction approach; Service life of the crossing; Maximize the public investment; Bridge, structures, and aesthetic design concepts; Geotechnical; Roadway design concept; New York State Thruway Authority operations and security

Factor: Management Approach
Sub-factors: Schedule; Organization and general management; Design management; Construction management

Factor: Key Personnel and Experience
Sub-factors: Key personnel; Experience of the firms; Past performance; Environmental compliance; Public outreach and coordination with stakeholders

Source: New York State Thruway Authority (2012).

TABLE 15 BEST VALUE EVALUATION CRITERIA
[Table content (the ten-level adjectival rating scale) is not reproduced here.] Source: New York State Thruway Authority (2012).

TABLE 16 RESULT OF TECHNICAL RANKINGS AND PRICE PROPOSAL EVALUATION
[Table content is not reproduced here.] *Rankings shown were determined prior to extensive communications and discussions with the three proposers. **In accordance with the RFP, the price evaluation is based on the Net Present Value (NPV) of each proposer's bid amount distributed over the duration of the contract. Source: New York State Thruway Authority (2012).

TABLE 17 NIAGARA'S PROPOSAL ADVANTAGES OVER ONEIDA
[Table content is not reproduced here.] Source: New York State Thruway Authority (2012).

The evaluation criteria and selection methodologies for D-B-B best value are established on a project-by-project basis after approval through Special Experimental Program (SEP) 14. Figure 11 illustrates an example of best value evaluation for D-B-B projects.

Evaluation Committee

There are slight differences in the structure of the evaluation/selection committee between D-B and D-B-B best value projects. For D-B projects, NYSDOT uses its own manual, which outlines the best value selection process. To achieve transparency and fairness in best value selection, NYSDOT requires the procurement management team, evaluation team, selection committee, and all individual participants to maintain and manage the integrity of the entire evaluation process. Examples of this requirement include:

• All personnel involved in the evaluation process must sign certifications of confidentiality and non-disclosure, and statements concerning conflicts of interest.
• The deliberations of all teams and committees and the knowledge of individual participants in the evaluation process must be held in the strictest confidence.
• All information provided by the proposers or generated by the evaluation must be safeguarded.
• The procurement management team sets rules, guidelines, and procedures for the safeguarding of all information.
• During the evaluation and selection process, only the chairperson of the selection committee can approve the release of any information.

Similar to other agencies, NYSDOT separates the evaluation of price and technical proposals. In addition, as mentioned previously, NYSDOT assigns a nickname to each proposal before evaluating the technical proposals so that the firms' identities are removed. This helps eliminate any favoritism or bias that may occur during the evaluation process. Finally, NYSDOT may use observers to enhance the transparency and fairness of the best value evaluation. Observers are designated in writing and held to the same standards of confidentiality, integrity, and absence of conflicts of interest as members of the evaluation teams and the selection committee.

Training

The NYSDOT D-B procedures manual includes a training module (NYSDOT 2011). The training module focuses on an overview of the D-B process, as well as best value procurement selection. The objectives of training are to make sure all evaluators fully understand the project goals and objectives, the evaluation process, the best value criteria, and how to conduct the evaluation.

TABLE 18 ONEIDA'S PROPOSAL ADVANTAGES OVER NIAGARA
[Table content is not reproduced here.] Source: New York State Thruway Authority (2012). LRT = light-rail transit.

FIGURE 11 Examples of evaluation criteria/award algorithms for D-B-B best value (Source: Foglietta 2012).

Project-specific training is provided to those involved in the procurement process in advance of reviewing the SOQs and RFP responses. These training sessions help the evaluators clarify any uncertainty and ambiguity before evaluating proposals. In addition, the agency, in cooperation with FHWA [as part of the Every Day Counts (EDC) initiatives], has conducted statewide training of design and construction staff from all 11 regional offices.

Debriefings

Debriefing is a key component in keeping the process transparent, and it helps the proposers understand their strengths and weaknesses as well as the rationale for why their proposals were not selected for award. It is viewed as a learning process that helps proposers be better prepared when participating in future projects. NYSDOT uses one-on-one meetings for debriefings with all proposers, including both selected and unsuccessful proposers. These debriefing meetings are conducted in person, often with the selection committee present, and only after the contract has been awarded.

This is not the only time that one-on-one meetings are used during the evaluation process. The proposers passing all pass/fail evaluation factors are invited to interview or make presentations regarding their proposals to the selection committee. To avoid potential biases caused by presentations or interviews, NYSDOT clearly states that if any issues or questions relating to the specifics of the project arise, they must be formally submitted as written questions. This process allows NYSDOT to address each question with a clarification that goes out to all proposers. The agency also needs to make sure that no proposer has any inside information regarding the evaluation process. Figure 12 presents an excerpt of the debriefing procedure from the Instructions to Proposers for the Tappan Zee Hudson River Crossing Project.

Lessons Learned

The NYSDOT agency representative for this study provided the following lessons learned for developing and maintaining a transparent best value process:

• Too many factors or sub-factors dilute the selection criteria. It is better to focus on the fewer criteria that are most important for the project. This will improve transparency by better communicating what is most important to the owner.
• The agency needs to provide the time necessary for a complete and accurate RFP. The agency needs to make sure that the RFP is well defined and comprehensive. Too many addenda to an issued RFP create confusion for all parties.
• Formally publishing all questions submitted by proposers along with the responses improves transparency, a procedure that can be done without disclosing which team submitted the questions.
• One-on-one meetings are very beneficial for communicating owner intent and improving transparency on the project. Even though the discussions may be "unofficial," they may result in formal questions being submitted and addenda being issued.
• Although it is policy not to disclose individual scores, sharing all technical scores and prices significantly supports the transparency and fairness of best value selection (the industry is in favor of this practice).

OREGON DEPARTMENT OF TRANSPORTATION

Overview

Oregon DOT (ODOT) has been implementing best value procurement procedures for both D-B and CM/GC delivery methods. Under the D-B delivery method, ODOT employs a two-step procurement process for best value selection.

• Step 1: RFQ and submittal of SOQs. The department shortlists the proposers after evaluating the SOQs.
In this step, ODOT evaluates both the pass/fail factors and the technical and quality factors listed in the RFQ against the evaluation criteria to shortlist the proposers that will submit final proposals. Typically, three firms are shortlisted for any particular project to submit proposals.

Debriefing of Unsuccessful Proposers

Unsuccessful Proposers shall be debriefed upon their written request submitted to the Agencies' Designated Representative within a reasonable time. Debriefings shall be provided at the earliest feasible time after a proposal is selected for award. The debriefing shall be conducted by a procurement official familiar with the rationale for the selection decision and contract award. Debriefing shall:
A. Be limited to discussion of the unsuccessful proposer's proposal and may not include specific discussion of a competing proposal;
B. Be factual and consistent with the evaluation of the unsuccessful proposer's proposal; and
C. Provide information on areas in which the unsuccessful proposer's technical proposal had weaknesses or deficiencies.
Debriefing may not include discussion or dissemination of the thoughts, notes, or rankings of individual members of the selection committee, but may include a summary of the rationale for the selection decision and contract award.

FIGURE 12 Debriefing process for best value selection, NYSDOT (Source: NYSDOT 2012).

• Step 2: RFP and submittal of proposals. The department evaluates the proposals and selects the final one. In step 2, ODOT evaluates the quality of the proposal based on the pass/fail and the technical and quality factors before reviewing the price proposal. Failure to achieve a "pass" rating on a pass/fail element may result in the proposal being declared non-responsive. Technical proposals determined to be non-responsive will not be considered further during the evaluation of the technical and quality factors. Price proposals are evaluated for price realism and reasonableness (ODOT 2006).

Under the CM/GC delivery method, ODOT follows a one-step procedure, utilizing only an RFP. Because there is no pre-qualification process, the RFP is released as a public document and is open to all proposers to submit proposals (ODOT 2008).

Evaluation Criteria/Award Algorithms

ODOT's best value process for D-B includes a list of evaluation criteria in the RFQ and RFP issued to proposers. The evaluation criteria consist of both pass/fail and technical and quality evaluation factors. Table 19 summarizes a typical list of pass/fail and technical and quality evaluation factors stated in the RFQ and RFP.

ODOT uses a direct scoring method to rate technical and quality factors for both SOQs and proposals. Table 20 presents a sample result from the SOQ evaluation process. The percentage rating guidelines used to evaluate technical and quality factors for the RFP are presented in Table 21.

The ODOT D-B manual specifies that the evaluation committee will complete a worksheet indicating the strengths, weaknesses, and deficiencies of each proposal for all the technical and quality factors, along with its comments, to support the evaluation process and the percentage ratings assigned for each factor. In addition, a proposer that does not achieve a "Pass" rating on the pass/fail evaluation criteria, receives a quality score of less than 21% for any technical and quality evaluation sub-factor, or receives a quality score of less than 41% for any technical and quality evaluation factor will not be eligible for selection (ODOT 2006).

The price proposals are opened after the final consensus scores for the technical and quality proposals are developed by the selection official. The price proposals are reviewed for price realism and reasonableness. The final score of the proposals is obtained from the best value selection formula developed by the agency representatives. The D-B best value formula is as follows:

Total Score = (Quality Weight × Qf) + (Price Weight × Pf)

Where:
Qf (Quality Factor) = Proposer's Total Quality Score / Highest Proposal Quality Score, and
Pf (Price Factor) = Lowest Proposal Price / Proposer's Price Amount.

TABLE 19 D-B BEST VALUE EVALUATION CRITERIA

Type of Factor | RFQ Evaluation Factors | RFP Evaluation Factors
Pass/Fail | Legal | Legal
Pass/Fail | Financial | Disadvantaged business enterprise
Technical/Quality | Experience | Proposer's organization and expertise
Technical/Quality | Past performance | Project controls and management
Technical/Quality | Backlog/capacity | Technical solutions
Technical/Quality | Project understanding | Context sensitive and sustainable solutions
Technical/Quality | Completion and connectivity | Diversity plan outline

Source: ODOT (2006).

TABLE 20 SOQ EVALUATION SCORING CONSENSUS SUMMARY

Factor | Points Available | A | B | C | D
Experience | 30 | 18 | 20 | 23 | 18
Past Performance | 30 | 15 | 16 | 17 | 16
Backlog/Capacity | 20 | 8 | 17 | 20 | 13
Project Understanding | 30 | 26 | 19 | 23 | 15
Completion & Connectivity | 10 | 8 | 7 | 8 | 5
Total | 120 | 75 | 79 | 91 | 67

Source: ODOT (2006).
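As an illustration of the step 1 shortlisting described above, the sketch below totals the consensus scores from Table 20 and keeps the top three submitters. The three-firm shortlist size follows the typical practice noted earlier; the code is a hypothetical illustration, not ODOT's actual tooling:

```python
# Illustrative shortlisting from the SOQ consensus scores in Table 20.
# Each tuple holds (points available, A, B, C, D); the shortlist size of
# three reflects ODOT's typical practice described in the text.
soq_scores = {
    "Experience":                (30, 18, 20, 23, 18),
    "Past Performance":          (30, 15, 16, 17, 16),
    "Backlog/Capacity":          (20,  8, 17, 20, 13),
    "Project Understanding":     (30, 26, 19, 23, 15),
    "Completion & Connectivity": (10,  8,  7,  8,  5),
}

submitters = ["A", "B", "C", "D"]
totals = {s: sum(row[i + 1] for row in soq_scores.values())
          for i, s in enumerate(submitters)}

shortlist = sorted(submitters, key=totals.get, reverse=True)[:3]
print(totals)                     # {'A': 75, 'B': 79, 'C': 91, 'D': 67}
print("Shortlisted:", shortlist)  # ['C', 'B', 'A']
```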

The quality and price weights are determined by the project development team during the development of the RFP and are stated in the RFPs issued to the proposers. Table 22 illustrates an example of a best value selection scoring process.

For CM/GC projects, the best value selection process includes the evaluation of two components: a project proposal and a price proposal. The project proposal is evaluated based on five categories. Table 23 illustrates an example of these five categories of CM/GC best value evaluation criteria. The CM/GC best value formula is calculated as follows:

Total Score = (Project Proposal Weight × Pf-1) + (Price Proposal Weight × Pf-2)

Where:
Pf-1 (Project Proposal Factor) = Proposer's Project Proposal Score / Highest Project Proposal Score; and
Pf-2 (Price Proposal Factor) = Lowest CM/GC Fee Percentage / Proposer's CM/GC Fee Percentage.

Table 24 shows an example of the best value selection scoring process.

Evaluation Committee

There are slight differences in the structure of the evaluation/selection committee between CM/GC and D-B best value projects. For CM/GC projects, the proposal evaluation team is made up of six to eight ODOT employees. Facilitators are staff members of the ODOT Office of Procurement who have extensive CM/GC experience in contracting, evaluation, and selection. ODOT uses a CM/GC manual to promote a standard set of procedures for the evaluation process (ODOT 2008).

For D-B projects, the evaluation committee often comprises a facilitator, technical evaluation support personnel (TESP), a scoring team with a chairperson, a selection official, and an observer.

TABLE 21 GUIDELINES FOR TECHNICAL/QUALITY FACTOR EVALUATION IN RFPs

81%–100%: The Proposer has demonstrated an approach that is considered to significantly exceed stated criteria in a way that is beneficial to the Agency. This rating indicates a consistently outstanding level of quality, with very little or no risk that this Proposer would fail to meet the requirements of the solicitation. There are essentially no weaknesses.

61%–80%: The Proposer has demonstrated an approach that is considered to exceed stated criteria. This rating indicates a generally better than acceptable quality, with little risk that this Proposer would fail to meet the requirements of the solicitation. Weaknesses, if any, are very minor.

41%–60%: The Proposer has demonstrated an approach that is considered to meet the stated criteria. This rating indicates an acceptable level of quality. The Proposal demonstrates a reasonable probability of success. Weaknesses are minor.

21%–40%: The Proposer has demonstrated an approach that fails to meet stated criteria, as there are weaknesses and/or deficiencies. The response is considered marginal in terms of the basic content and/or amount of information provided for evaluation. Modification would be required for the Proposal to be acceptable.

0%–20%: The Proposer has demonstrated an approach that indicates significant weaknesses/deficiencies. The Proposal fails to meet the stated criteria and/or lacks essential information and is conflicting and/or unproductive. There is little reasonable likelihood of success; weaknesses/deficiencies are so major and/or extensive that a major revision to the Proposal would be necessary.

Source: ODOT (2006).
TABLE 22 EXAMPLE OF SCORING PROCESS

PROPOSER A: Quality Proposal Score = 1850; Price Proposal = $45,259,600
Quality Proposal Score: (1850/1950 = 0.9487) × 60% = 0.5692
Price Proposal Score: ($44,900,000/$45,259,600 = 0.9921) × 40% = 0.3968
TOTAL SCORE: 0.5692 + 0.3968 = 0.9660

PROPOSER B: Quality Proposal Score = 1900; Price Proposal = $44,900,000
Quality Proposal Score: (1900/1950 = 0.9744) × 60% = 0.5846
Price Proposal Score: ($44,900,000/$44,900,000 = 1.0000) × 40% = 0.4000
TOTAL SCORE [Best Value]: 0.5846 + 0.4000 = 0.9846

PROPOSER C: Quality Proposal Score = 1950; Price Proposal = $49,259,450
Quality Proposal Score: (1950/1950 = 1.0000) × 60% = 0.6000
Price Proposal Score: ($44,900,000/$49,259,450 = 0.9116) × 40% = 0.3646
TOTAL SCORE: 0.6000 + 0.3646 = 0.9646

Source: ODOT (2006).
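The D-B formula and the Table 22 example can be cross-checked with a few lines of Python; the sketch assumes the 60/40 quality/price split used in the table and rounds each weighted component to four decimals, as the table does:

```python
# Reproduces the D-B best value scores in Table 22 using
# Total Score = (Quality Weight * Qf) + (Price Weight * Pf),
# with the 60%/40% quality/price split from the example. Each weighted
# component is rounded to four decimals, as in the table.
QUALITY_WEIGHT, PRICE_WEIGHT = 0.60, 0.40

proposals = {
    "A": {"quality": 1850, "price": 45_259_600},
    "B": {"quality": 1900, "price": 44_900_000},
    "C": {"quality": 1950, "price": 49_259_450},
}

highest_quality = max(p["quality"] for p in proposals.values())
lowest_price = min(p["price"] for p in proposals.values())

for name, p in proposals.items():
    quality_component = round(QUALITY_WEIGHT * p["quality"] / highest_quality, 4)
    price_component = round(PRICE_WEIGHT * lowest_price / p["price"], 4)
    total = quality_component + price_component
    print(f"Proposer {name}: {quality_component} + {price_component} = {total:.4f}")

# Matches Table 22: A = 0.9660, B = 0.9846 (best value), C = 0.9646.
```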

ODOT developed evaluation and selection plans that help committees maintain transparency and objectivity during the evaluation process. The roles and responsibilities of each evaluation committee member are briefly summarized here from the ODOT Design-Build Manual (ODOT 2006).

Facilitator

The facilitator is responsible for controlling and maintaining the integrity of the entire evaluation and selection process according to the evaluation plan. The facilitator works under the guidance and direction of the scoring team chairperson. Typically, the facilitator performs the following tasks:

• Retains confidentiality with regard to the evaluation process and is responsible for managing and monitoring the entire process for confidentiality, integrity, and procurement sensitivity.
• Provides training for the evaluation and selection process participants before the start of evaluations.
• Provides guidance and assistance for the evaluation and selection process participants throughout the entire evaluation and scoring process.
• Maintains a complete file of the proposal evaluation and selection process, including all individual and consensus worksheets, communication activities for clarifications, the summary of scores, recommendations from the scoring team, and the approval of the proposal quality scores by the selection official.

Technical Evaluation Support Personnel (TESP)

TESP members evaluate the proposals and provide comments on the strengths and weaknesses of the proposals based on the evaluation criteria issued in the RFP. TESP members are generally from ODOT's Technical Services staff (headquarters or regions) or from ODOT consultants who are familiar with alternative delivery methods or have played a technical role in the RFP preparation. It should be noted that TESP members do not "score" the proposals, but they do provide a technical assessment of the proposals. The typical roles and responsibilities of TESP include the following:

• Individually review the proposals and evaluate the specific response categories and subcategories, including any innovative solution sections assigned to them. Notably, consultation with other TESP members evaluating related quality response categories and subcategories is allowed and encouraged. However, consultation should be strictly limited to the specific coordination item or issue in question.
• Prepare concise questions for the proposers to clarify any problems in the proposals that may arise during the evaluation process.
• Provide briefings and oral presentations concerning evaluation comments to the scoring teams.

TABLE 23 CM/GC BEST VALUE EVALUATION CRITERIA

Category | Evaluation Factors | Total Points Available
I | Legal requirements | Pass/Fail
II | Proposer's organization and key personnel expertise | 1,400
III | CM/GC roles and responsibilities/goals | 600
IV | Project approach | 1,000
V | Diversity plan outline | Pass/Fail

Source: I-5 Willamette River Bridge (ODOT 2008).

TABLE 24 AN EXAMPLE OF CM/GC SCORING PROCESS
[Scoring example not reproduced here.] Source: I-5 Willamette River Bridge (ODOT 2008).
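The CM/GC formula given earlier works the same way as the D-B formula but trades off the project proposal score against the CM/GC fee percentage. Because the Table 24 figures are not reproduced above, the sketch below uses hypothetical weights, scores, and fees purely for illustration:

```python
# Hypothetical illustration of the CM/GC best value formula:
# Total Score = (Project Proposal Weight * Pf-1) + (Price Proposal Weight * Pf-2).
# All weights, scores, and fee percentages below are invented for the example;
# actual values are set in each RFP (see Table 24).
PROJECT_WEIGHT, PRICE_WEIGHT = 0.70, 0.30  # assumed split

proposers = {
    "X": {"project_score": 2700, "fee_pct": 6.5},
    "Y": {"project_score": 2850, "fee_pct": 7.2},
}

highest_score = max(p["project_score"] for p in proposers.values())
lowest_fee = min(p["fee_pct"] for p in proposers.values())

for name, p in proposers.items():
    pf1 = p["project_score"] / highest_score  # Project Proposal Factor
    pf2 = lowest_fee / p["fee_pct"]           # Price Proposal Factor
    total = PROJECT_WEIGHT * pf1 + PRICE_WEIGHT * pf2
    print(f"Proposer {name}: total score = {total:.4f}")

# Here Y's higher project score outweighs its higher fee (0.9708 vs. 0.9632).
```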

Scoring Team

The members of the scoring team are ODOT employees who have previous experience with similar projects (i.e., who comprehensively understand the project's evaluation categories and subcategories) or who are familiar with the evaluation process in best value selection. This team consists of a group of five or more individuals and a chairperson. The scoring team is responsible for evaluating and scoring the proposals based on the criteria stated in the RFP. Specifically, the team performs the following tasks:

• Reviews the proposals, evaluates all response categories, subcategories, and innovative solution sections, and assigns scores. It is important to note that scores are not to be shared among scoring team members.
• Evaluates categories, subcategories, and innovative solution sections in accordance with the objectives, requirements, and scoring guidelines contained in the RFP.
• The chairperson approves the initial and substitute assignments of members to the scoring team, the TESP, and the facilitator.
• The chairperson coordinates and ensures timely completion of the evaluation and re-evaluation processes among scoring team members.
• The chairperson provides a briefing and/or oral presentation concerning pass/fail ratings and technical and quality scores to the selection official, seeking approval of the final proposal scores.

Selection Official

The selection official is responsible for reviewing the results and the recommendation of the scoring team. If the process is clear and transparent, the selection official shall approve the final assigned scores for the proposers. It is noted that the selection official may remand specific category, subcategory, and innovative solution bonus point scores back to the scoring team for re-evaluation and rescoring prior to approval.

Observers

Observers are appointed by the scoring team chairperson or the selection official to make sure that the evaluation procedures are being followed and the process is fair and transparent.

To achieve transparency and fairness in the evaluation process, ODOT requires the facilitator, TESP, scoring team, selection official, and observer to maintain and manage the integrity of the entire evaluation process. Examples of this requirement include:

• All personnel involved in the evaluation process must sign certifications of confidentiality and non-disclosure, and statements concerning conflicts of interest.
• During the evaluation and selection process, only the selection official can approve the release of any information.
• All information provided by the proposers or generated by the evaluation must be safeguarded.
• Evaluation teams must submit written requests for proposer clarifications to the facilitator, who acts as an intermediary between the evaluation committee and the proposers. The facilitator keeps a copy of all communications and responses as part of the official record of the evaluation and selection process.

Training

ODOT mandates training for the evaluation committee before it evaluates the proposals. The evaluation and selection plan, training materials, and evaluation worksheets are provided for the training sessions. Evaluators receive the same training and guidance on the evaluation process. Training generally includes two 3- to 4-hour sessions:

• The first session consists of the larger group of TESP or evaluation and selection team members and provides the basics of reviewing, evaluating, and scoring a proposal.
• The second session consists of the smaller group of personnel tasked with either evaluating or scoring all the proposals (scoring team), or reviewing and confirming the evaluation and selection team scoring results (selection committee).

During the training, the evaluation personnel are also provided with the project scope, project schedule, evaluation and selection schedule, roles and responsibilities of each party, and objectives of the selection process. Technical evaluators are educated on what to look for and how to rate each item during the evaluation process.

Debriefings

ODOT conducts debriefings with unsuccessful proposers if requested. Typically, ODOT conducts debriefings with an individual proposer in person within the first 20 days after the contract is awarded. To improve the transparency and fairness of the selection process, ODOT allows the unsuccessful proposers to review the winning proposal and the scoring results of all other proposers.

Lessons Learned

The ODOT agency representative for this study provided the following lessons learned for developing and maintaining a transparent best value process:

• The agency provides a detailed description of the technical and quality evaluation factors, the objectives and

requirements for each technical and quality evaluation factor, and the relative weights of the technical and quality evaluation factors.
• Evaluation teams and selection committees use a clarification or communication process to resolve any ambiguities, errors, and omissions related to the criteria stated in the RFQ/RFP.
• Evaluation committees clearly document strengths, weaknesses, deficiencies, and risks associated with each factor in the evaluation worksheets.
• In-depth training for all evaluation and scoring team members is a requirement to ensure consistent and standardized scoring of proposals.
• The rating of evaluation factors must be arrived at through consensus of the committee members.
• Dissemination of the same information to all proposers in a timely manner helps foster the transparency of best value selection. All records of the procurement, evaluation, and selection process become part of the public record.

UTAH DEPARTMENT OF TRANSPORTATION

Overview

Utah DOT (UDOT) uses best value procurement for D-B projects. Similar to other DOTs, UDOT has been using a two-step procedure. In the first step, UDOT evaluates both the pass/fail and technical factors listed in the RFQ to shortlist the proposers that will submit final proposals. In the second step, UDOT evaluates the technical proposals before reviewing the price proposals. Failure to achieve a "pass" rating on a pass/fail element may result in the proposal being declared non-responsive. Technical proposals determined to be non-responsive will not be considered further during the evaluation process. Price proposals are evaluated based on proposal price, price accuracy, completeness, and reasonableness (UDOT 2012).

Evaluation Criteria/Award Algorithms

Evaluation criteria are stated in both the RFQ and RFP. Table 25 summarizes a typical list of pass/fail and technical and quality evaluation factors in the RFQ and RFP. Table 26 shows the relative importance of the technical factors considered during the evaluation process. The guidelines used to evaluate the quality and technical factors are presented in Table 27.

UDOT performs a risk analysis to determine the overall added value of each proposal. A risk analysis is often conducted in the following five key areas:

1. Maintenance of traffic,
2. Utilities,
3. Geotechnical,
4. Right-of-way, and
5. Schedule.

TABLE 25 BEST VALUE EVALUATION CRITERIA

Type of Factor | RFQ Evaluation Factors | RFP Evaluation Factors
Pass/Fail | Cover letter | Legal
Pass/Fail | Acknowledgement of receipt | Financial
Pass/Fail | Legal | Proposal responsiveness
Pass/Fail | Financial |
Pass/Fail | SOQ responsiveness |
Technical/Quality | Experience of firms | Maintenance of traffic
Technical/Quality | Past performance | Third party coordination
Technical/Quality | Demonstrated capacity | Roadway and drainage design
Technical/Quality | Organization and key managers | Structures and geotechnical design
Technical/Quality | | Right-of-way
Technical/Quality | | Public involvement
Technical/Quality | | Project management
Technical/Quality | | Project controls

Source: UDOT (2012).

TABLE 26 RELATIVE IMPORTANCE OF TECHNICAL FACTORS

Technical Factors | Relative Importance
Maintenance of Traffic | Critical
Third Party Coordination | Critical
Roadway and Drainage Design | Significant
Structural and Geotechnical Design | Significant
Right-of-Way | Significant
Public Involvement | Important
Project Management | Important
Project Controls | Important

Source: UDOT (2012).

The determination of a best value award is based on the following:

• Base build price (within the limits of construction funding);
• Option proposal price(s) (within the limits of construction funding);

• Time (the proposer's number of days for the substantial completion dates in the RFP);
• Technical merit;
• The risk analysis of the added value elements; and
• The best technical score.

UDOT notes that the proposal price carries the most weight in the best value selection process. The department has established a maximum limit on price proposals for best value selection. Each proposal within approximately 10% of the lowest price proposal will be evaluated for possible best value selection. To maintain fairness in the price proposal process, UDOT notes that proposals with prices that exceed this maximum limit are unlikely to be awarded.

Evaluation Committee

To establish a fair and uniform best value approach, UDOT uses three committees to evaluate proposals: (1) an analysis committee, (2) an evaluation committee, and (3) a selection committee. The analysis committee consists of technical experts. The analysis committee analyzes and evaluates the proposals based on the project goals, and its responsibilities include:

• Finding the facts within the proposals;
• Identifying the added values, risks, strengths, and weaknesses; and
• Identifying any deficiencies.

The evaluation committee typically consists of three to five members. The project manager or project director chairs this committee. The evaluation committee evaluates technical proposals, reviews ratings for technical factors, and assigns blinded aliases (blinded technical information) to each proposal. This committee must ensure that the evaluation process is based on the RFP evaluation criteria. Although the evaluation committee chair offers one-on-one meetings to each proposer, any communication after these meetings must follow the processes outlined in the RFP. The number of meetings may vary depending on the size and complexity of the project.

The selection committee consists of three UDOT senior leaders. This committee meets with the evaluation committee early in the process to discuss the project goals and objectives. The selection committee evaluates and assigns an overall rating to the technical proposals, together with the cost value of any price limit boundaries for technical enhancements. The selection committee also needs to approve updates to goals and evaluation criteria throughout the development of the project. The selection committee then reviews the blinded technical information from the evaluation committee, combined with the blinded price proposals, to make a determination of best value.

Training

UDOT conducts training for the analysis and evaluation committees before the technical proposal evaluation process.
TABLE 27 EVALUATION GUIDELINES

Exceptional
SOQ Rating Guidelines: The proposer has provided information relative to its qualifications that is considered to significantly exceed stated objectives/requirements in a beneficial way and indicates a consistently outstanding level of quality. There are essentially no weaknesses.
Proposal Rating Guidelines: The proposer has demonstrated an approach that is considered to significantly exceed stated criteria in a way that is beneficial to the department. This rating indicates a consistently outstanding level of quality, with very little or no risk that this proposer would fail to meet the requirements of the solicitation. There are essentially no weaknesses.

Good
SOQ Rating Guidelines: The proposer has presented information relative to its qualifications that is considered to exceed stated objectives/requirements and offers a generally better than acceptable quality. Weaknesses, if any, are very minor.
Proposal Rating Guidelines: The proposer has demonstrated an approach that is considered to exceed stated criteria. This rating indicates a generally better than acceptable quality, with little risk that this proposer would fail to meet the requirements of the solicitation. Weaknesses, if any, are very minor.

Acceptable
SOQ Rating Guidelines: The proposer has presented information relative to its qualifications that is considered to meet the stated objectives/requirements and has an acceptable level of quality. Weaknesses are minor and can be corrected.
Proposal Rating Guidelines: The proposer has demonstrated an approach that is considered to meet the stated criteria. This rating indicates an acceptable level of quality. The proposal demonstrates a reasonable probability of success. Weaknesses are minor and can be readily corrected.

Unacceptable
SOQ Rating Guidelines: The SOQ fails to meet the stated objectives and/or requirements and/or lacks essential information and is conflicting and/or unproductive. Weaknesses/deficiencies are so major and/or extensive that a major revision to the SOQ would be necessary and/or are not correctable.
Proposal Rating Guidelines: The proposer has demonstrated an approach that indicates significant weaknesses/deficiencies and/or unacceptable quality. The proposal fails to meet the stated criteria and/or lacks essential information and is conflicting and/or unproductive. Weaknesses/deficiencies are so major and/or extensive that a major revision to the proposal would be necessary.

Source: UDOT (2012).
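Returning to the price screen described before Table 27, the minimal sketch below illustrates the approximately-10% rule. The prices and the exact cutoff interpretation are hypothetical; UDOT's solicitation documents govern the actual limit:

```python
# Hypothetical illustration of UDOT's price screen: only proposals priced
# within roughly 10% of the lowest price proposal remain candidates for
# best value selection. Prices and the exact cutoff rule are invented.
prices = {"P1": 52_400_000, "P2": 49_800_000, "P3": 56_100_000}

lowest = min(prices.values())
cutoff = lowest * 1.10  # "approximately 10%" above the lowest price

candidates = {name: p for name, p in prices.items() if p <= cutoff}
print(f"Cutoff: ${cutoff:,.0f}")                        # Cutoff: $54,780,000
print("Evaluated for best value:", sorted(candidates))  # ['P1', 'P2']
```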

The agency believes that providing training for the analysis and evaluation committee members improves consistency and fairness. The purpose of training is to ensure that the process is followed as outlined in the RFP, the Instructions to Proposers, and the Evaluation and Selection Manual. UDOT also notes that one of the roles and responsibilities of the evaluation committee is to train the analysis committee(s). In addition, the evaluation committee members need to (1) sign a confidentiality form and conflict of interest statement and (2) limit communication about the proposals with others during the evaluation process. The main points of the training focus on the roles and responsibilities of committee members so that each member can understand and comfortably perform his or her job.

Debriefings

UDOT conducts debriefings with both successful and unsuccessful proposers if requested. Debriefings often include a summary of the rationale for the selection decision and highlight key points such as strengths, weaknesses, risks, innovations, or enhancements. The unsuccessful proposers are provided with a review of the comments about the strengths and weaknesses of their proposals made by the evaluation committee. The unsuccessful proposers also are allowed to review the winning proposals. A procurement official familiar with the rationale for the selection decision explains the evaluation process and how the score was established. The procurement official is also responsible for clarifying any ambiguity related to the evaluation process as well as answering any questions from proposers. Finally, UDOT maintains the proposals in the public record for up to a year.

Lessons Learned

The UDOT agency representative for this study provided the following lessons learned for developing and maintaining a transparent best value process:

• UDOT provides a detailed description of the technical evaluation factors, the objectives and requirements for each evaluation factor, the relative importance of the technical evaluation factors, and the information to be submitted in its RFQs and RFPs.
• Proposers correspond with the department regarding the RFP only through the department's designated point of contact. Any communication determined to be improper may result in disqualification.
• Evaluation teams must clearly document the strengths, weaknesses, deficiencies, and risks associated with each criterion.
• All personnel involved in the evaluation process must sign certifications of confidentiality and non-disclosure, and statements concerning conflicts of interest.
• Consultant services make sure that all the proposals are blinded and marked with aliases before forwarding them to the analysis committee for evaluation.
• Differences of opinion between committees, or between the selection committee and the selection official, are addressed through consensus; each side agrees on the resolution before moving to the next step in the process.
• Process witnesses are appointed to ensure there is no bias toward any proposer and to check whether the analysis and evaluation committees' ratings align with the project goals and evaluation criteria.

SUMMARY

This chapter documents case examples and experience from the agencies that were found to have the greatest best value experience. The agencies use a wide variety of evaluation criteria and select these criteria to align with unique project goals.
The study found that agencies use the adjusted bid, adjusted score, and weighted criteria award algorithms in combination with direct point evaluation rating methods to support transparency. The transparency stems from the concept that these algorithms most closely resemble low-bid procurement. However, other award algorithms and rating methods are in use on a project-by-project basis. For example, some agencies prefer to use adjectival ratings on complex D-B evaluations. The agencies provide project-based and/or programmatic training for best value procurement. Timely and comprehensive debriefings are common with these experienced agencies. These agencies provide examples of effective practices for industry outreach and continuous improvement of their best value processes. The most common lessons learned focused on clarity of evaluation criteria, well-defined RFPs and evaluation plans, in-depth evaluator training, thorough debriefings, and open communications through a single point of contact.
