
Best Value Procurement for Highway Construction: Legal Issues and Strategies (2023)

Chapter: II. BEST VALUE PROCUREMENT PROCESS

Suggested Citation:"II. BEST VALUE PROCUREMENT PROCESS." National Research Council. 2023. Best Value Procurement for Highway Construction: Legal Issues and Strategies. Washington, DC: The National Academies Press. doi: 10.17226/27175.

delays to the project, and reputational harm and diversion of resources due to litigation. Information and lessons learned from protests should be used to modify the best value procurement approach and the evaluation and scoring process. A key strategy for mitigating the risk of a successful protest is to ensure that the implementation of the best value process is well documented. This is also useful for collecting lessons learned to improve the agency's best value procurement approach for future procurements.

II. BEST VALUE PROCUREMENT PROCESS

A best value procurement process is more complex than a low bid procurement process. Best value procurements involve significant up-front investment to develop the instructions to proposers; determine evaluation criteria based on the project goals, objectives, and characteristics; assemble the evaluation team and a support team, which may include stakeholder representatives as well as agency and consultant personnel; establish a system for evaluations and develop award algorithms; and set up communication channels with proposing firms to ensure that the procurement process is fair and consistent for all proposing firms. Attention to detail in all these areas is vital to a successful best value procurement process. Communication protocols are particularly important to establish if the process includes ATCs and one-on-one meetings. As discussed in Section III, the procurement package must be carefully drafted to ensure that the requirements for submitting and evaluating proposals are clearly set out. The procuring agency must strictly adhere to the process detailed in the procurement package to avoid creating grounds for complaints or protests from proposers.

The added complexity associated with best value procurements would logically tend to increase the potential for errors in the procurement process, potentially resulting in a higher incidence of bid protests for best value procurement. On the other hand, the flexibility allowed to the procuring agency in a best value procurement makes it difficult for protesters to argue that the agency failed to comply with applicable requirements, so long as the evaluators comply with the requirements specified in the RFP and applicable law. Furthermore, many protests relating to best value procurements concern "pass/fail" factors or other situations that also provide a basis for protests in DBB procurements. While Figure 3 illustrates that the 33 transportation agencies surveyed perceive a greater risk of protests for best value procurements than for low bid, it is not evident from case law that there is a greater risk of successful bid protests for best value procurements, as relatively few best value cases have been published. The courts rarely overturn the agency's decision.

Figure 3. Perception by transportation agencies of potential bid protests occurring in best value and low bid procurements (n = 33 state transportation agencies): best value procurements are more likely to be subject to a bid protest than low bid procurements, 45%; low bid and best value procurements are equally likely to be subject to a bid protest, 24%; not sure, 24%; low bid procurements are more likely to be subject to a bid protest than best value procurements, 6%.

A. Best Value Procurement Approach

One of the first steps in best value procurement is to develop a methodology for making the selection decision. One tool that many agencies have used as part of the RFP planning process for complex highway projects is a workshop, including agency and consultant representatives, to discuss the project scope and the agency's goals and objectives for the project, to identify critical proposal components, and to make preliminary determinations regarding criteria for evaluating the submittals and their relative importance. The workshop sets the framework for obtaining approvals from senior management if needed and provides information necessary to draft the instructions to proposers.

A wealth of literature, both nationally and internationally, addresses the best value selection components.23

23. U. Ojiako et al., What is Best Value in Public Sector Building Construction?, Proceedings of the Institution of Civil Engineers - Management, Procurement and Law, 2014.

NCHRP Report
561: Best-Value Procurement Methods for Highway Construction Projects describes the four concepts (best value parameters, evaluation criteria, evaluation rating system, and award algorithms) that are shared in common by best value procurements for multiple highway construction projects and provide a foundation for the best value selection process, as shown in Figure 4. These concepts enable the agency to establish a framework for the best value procurement process that facilitates fairness and transparency.24

Figure 4. Best value concepts (see Scott, supra note 5).

B. Best Value Parameters

The best value parameters commonly include the basic performance triangle of construction (cost, time, and quality), with the addition of technical, qualification, and performance parameters.25 Once the agency identifies the goals for a project, it can then determine relevant parameters and consider their order of importance. That, in turn, enables the agency to develop its evaluation criteria, rating systems, and award algorithms consistent with its best value parameters (see Figure 4). Thus, identifying best value parameters for a project is the first step in executing a transparent best value procurement.26 Some common best value parameters used by transportation agencies are outlined below.

24. See Tran, supra note 3.
25. See Tran, supra note 3.
26. See Molenaar, supra note 1.

1. Cost/Price

Although cost is usually a highly significant parameter in best value award decisions, best value procurements allow the agency to balance cost against other project parameters to determine the best value proposal. Unlike low bid procurements, which typically look at a fixed price or unit prices applied to estimated construction quantities, best value procurement costs may be divided into three different components: preconstruction costs, initial costs (capital costs), and life cycle costs.

Where a best value selection process is used for DBB, fixed price DB, or P3 projects, the initial costs will be an easily ascertainable amount that can be considered in the tradeoff process or converted to a number for purposes of a formulaic selection. For Design-Build-Operate-Maintain (DBOM) and P3 projects, the evaluation must account for operations and maintenance costs, which are dependent on variable factors such as future traffic or ridership projections. For P3 projects, the evaluation must address finance-related elements, which require a subjective evaluation. For PDB/P3 and CMGC projects, where the contractor is selected at an early stage of the project design, proposers will not be able to determine initial costs with any degree of certainty. The price evaluation will need to focus on pricing elements such as preconstruction costs, fees and markups on initial costs, or unit prices to the extent that the project includes components for which unit pricing is feasible.27

27. An early PDB contract, for the Foothill-South Transportation Corridor in Orange County, California, included descriptions of "units" such as lineal foot of pavement and incorporated unit prices, providing for escalation to be applied and allowing the contractor the opportunity to request adjustments to the unit prices based on changed circumstances. The project did not move forward to the construction phase, so this method was never tested.

A life cycle cost analysis may be appropriate for a best value selection process where the proposer submits design concepts together with a fixed price, since it allows the agency to consider whether savings may justify the higher initial costs of a proposed design over time. However, if the agency wishes to consider life cycle costs as part of the evaluation process, it must require the proposer to submit sufficient information to enable the evaluators to assess life cycle costs. A life cycle cost analysis is simpler when the contract also includes operations and maintenance services, allowing the agency to rely on the pricing submitted for those services in determining whole life costs. But even if that is the case, the life cycle cost analysis will necessarily involve a degree of judgment by the evaluators before the operations and maintenance pricing can be incorporated into the award algorithm, since the operations and maintenance period will likely be shorter than the useful life of the facility, costs will be subject to escalation and changes, and the agency may incur costs during the operation and maintenance period separate from the proposed amounts.

2. Time/Schedule

Best value procurement encourages proposers to focus on time-cost tradeoffs that can optimize the project value to the agency and the users. This contrasts with most DBB projects, where the agency sets project schedule goals such as completion dates and lane opening milestones. A+B bidding is generally considered a low bid selection methodology, with days to completion given a dollar value added to the contract price. Low bid contracts often include lane rental provisions to encourage the contractor to perform work efficiently during times when it is less likely to disrupt traffic. Both concepts can be used in
best value procurements for alternative project delivery method projects. In addition, best value procurements allow the agency to consider the proposers' plans to accelerate the schedule with reference to the effect of the acceleration plans on the whole project and the surrounding community.

3. Qualifications/Past Performance

Best value procurement typically involves a shortlisting process to ensure that only the most qualified firms submit proposals, as opposed to a prequalification process that allows participation by all contractors meeting minimum requirements. Shortlisting assures that the qualified firms most likely to meet the agency's goals will be interested in proposing, without the need for concern that a less qualified firm might bid low to "buy the job." The shortlisting process entails a review of the same type of information typically included in prequalification questionnaires, such as financial strength, experience with relevant projects, involvement in fraudulent activities, and debarment, but can also include matters such as qualifications of key personnel proposed for the project and past performance record on previous projects.

FHWA's Design-Build Rule includes certain requirements that must be accounted for in evaluating past performance in DB best value procurements. 23 CFR § 636.205 specifically allows past performance to be used as an evaluation criterion, among other things requiring the agency to describe the approach to evaluating past performance in the RFQ/RFP and to permit proposers to provide information regarding corrective actions taken to address problems on prior projects. Section 636.206 provides that, if a proposer has no record of relevant past performance or if information on past performance is unavailable, the proposer may not be evaluated favorably or unfavorably on past performance.

The Design-Build Rule also includes restrictions concerning duplication of criteria in both the RFQ and RFP. Subject to certain exceptions, 23 CFR § 636.303(a) states that prequalification criteria should not also be included as proposal evaluation criteria, and § 636.303(b) limits proposal evaluation criteria "to the quality, quantity, value and timeliness of the product or service being proposed."

4. Quality

Best value procurement allows agencies to assess factors relevant to ultimate project quality, such as the proposer's quality management plans and proposed warranties exceeding contract requirements. Key personnel and past performance are also interrelated with project quality and can be reviewed as part of the quality criterion, subject to the constraints discussed in Section 3. In addition, DBOM and P3 contracts may include operations and maintenance, further assuring project quality.

5. Design Solutions

Best value procurements, especially in DB and P3 project delivery, permit agencies to take advantage of contractors' innovation by allowing contractors to propose innovative design solutions or efficient and creative constructability ideas to be considered in the evaluation process. An agency may further include a formal ATC process in the procurement, whereby proposers are permitted to submit, for agency approval in advance of the preparation of the proposal, proposed deviations from the technical requirements for the project that may be included in the proposal.28 In many cases, proposed design solutions are consistent with the contract specifications and do not require the submittal of an ATC.

28. FHWA permits the use of ATC processes in procurements for federally funded DB projects, provided the process complies with 23 CFR § 636.209.

C. Evaluation Criteria

1. Establishing Criteria

Agencies establish evaluation criteria (also called evaluation factors) based on specific project goals, objectives, and characteristics. The agency must develop a clear evaluation plan and associated criteria to make the selection process as fair and understandable as possible so that the agency makes an appropriate selection, the proposers understand the rationale for the selection, and the potential for bid protests or complaints is reduced. Best value involves consideration of price-related elements (for example, a fixed price, unit pricing, or payments to be made under a P3 agreement) as well as other factors such as technical and performance. The evaluation process also involves reviewing compliance with RFP requirements and assessing other "pass/fail" factors, which may include matters such as minimum qualifications, financial capacity, acceptable safety record, and provision of a responsive proposal. Technical factors differ based on the type of project and the agency's goals and objectives, but may include matters such as design solutions, construction staging and traffic management, environmental compliance approach, project management approach, project understanding, third party coordination, quality management approach, risk mitigation approach, public outreach plan, subcontracting plan, and past performance. For DBOM and P3 projects, evaluation factors typically include an approach to operations, maintenance, and handback (the requirements and process for the transfer of operations and maintenance of the project to the agency at the end of the contract term). For design-build-finance-operate-maintain (DBFOM) and P3 projects, factors would include the feasibility of the financing plan and cost-related information included in the financial proposal. Figure 5 provides the percentage of the 35 transportation agencies surveyed that utilize each evaluation criterion in their best value procurement processes.

Figure 5. Frequency of using different evaluation criteria in best value procurement (n = 35 state transportation agencies): price/cost 94%; technical approach/technical solutions 91%; time/schedule 91%; project management 89%; traffic management 83%; technical capability 77%; quality management 77%; environmental compliance 74%; safety compliance 74%; project delivery approach 71%; past performance 71%; management capability 66%; public information and communications 57%; right of way/utilities/third parties' coordination 54%; financial capability 34%; operations and maintenance factors 31%; financial plan 23%.

2. Difficulty of Evaluation

Each of the best value criteria identified in the RFP must be evaluated. Some criteria are easier to evaluate than others, resulting in greater transparency and reduced risk of claims that the process was unfair. Criteria requiring a more subjective evaluation make the evaluation process more difficult.

Figure 6. Ease in evaluating best value criteria (n = 33 state transportation agencies), scored from 0 (very difficult) to 100 (very easy): price/cost 79; time/schedule 70; financial capability 66; traffic management 66; past performance 62; technical approach/technical solutions 62; financial plan 61; environmental compliance 59; project delivery approach 59; technical capability 59; operations and maintenance factors 57; public information and communications 55; management capability 55; safety compliance 53; quality management 53; project management 52; right of way/utilities/third parties' coordination 49.

Figure 6 shows the survey results regarding the ease of evaluating specific
12 NCHRP LRD 90 be crucial for achieving project success.29 30 31 Evaluation criteria are categorized into price/cost, performance, technical solu- tions, and management solutions (see Table 2). Agencies should be aware that certain criteria may be more likely to result in protests than others. One area that presents a relatively signicant risk of challenge concerns past perfor- 29 See D. Gransberg, Does Low Bid Award Facilitate Wrongdoing? US Implications of Quebec’s Charbonneau Commission Report, 12 J. Leg. Aff. Dispute Resolut. Eng. Constr. (2020). 30 See Tran, supra note 3. 31 See M. Abdelrahman et al., Best value model based on project spe- cic characteristics, J. Constr. Eng. Manag. (2008). criteria, with a score from 0-100, with 100 being very easy to evaluate and zero being very dicult to evaluate. Price is the easiest criterion to evaluate according to the data collected in the national survey questionnaire, while evaluating ROW/ utilities/third-party coordination presents the most diculty. Selecting criteria that present an easier approach for the evalu- ation team based on their experience and knowledge can help a transportation agency and might therefore reduce the potential for protests associated with best value procurements. e fact that certain criteria may be dicult to evaluate does not mean that the agency should avoid considering them. In fact, one of the reasons for using best value procurements is to allow non-cost criteria to be considered since such criteria can Table 2. 
Categorized Evaluation Criteriaa Category Evaluation Criteria Parameter Price/Cost Price Evaluation Cost Low Bid Cost Lifecycle Cost Lifecycle Costs Financial & Bonding Requirements Prequalication Performance Past Experience/Performance Evaluation Past Project Performance Safety Record (or Plan) Past Project Performance Current Project Workload Past Project Performance Regional Performance Capacity (Political) Past Project Performance Key Personnel & Qualications Personnel Experience Technical Solutions Project Schedule Evaluation Time Trac Management Trac Control Technical Proposal Responsiveness Performance Specications Innovation & Aesthetics Performance Specications Site Utilities Plan Performance Specications Coordination Performance Specications Construction Methods Quality Parameter Using Performance Indicator Proposed Design Alternate & Experience Design with Bid Alternate Mix Designs & Alternates Performance Specications Environmental Protection/ Consideration Performance Specications Site Plan Performance Specications Management Solutions Utilization of Small Business Subcontractor Information Subcontractor Evaluation/Plan Subcontractor Information Management/Organization Plan Project Management Plans Construction Warranties Warranty Warranty Credit Warranty Credit Construction Engineering Inspection Quality Parameter Measured with % in Limits Quality Management Quality Management Plans Cultural Sensitivity Performance Specications Incentives/Disincentives Incentives/disincentives a See Scott, supra note 5.

NCHRP LRD 90 13 the likelihood of proposer challenges to award decisions. e evaluators in a best value procurement process rate price along with more subjective, non-price factors, thus potentially leading to allegations of unfairness.34 Considerable thought and analysis should be used in forming these teams, as well as in training evaluators regarding the criteria and process for evaluations to avoid claims of bias or inconsistency. e technical evaluation team members must be knowledgeable regarding both the tech- nical and the programmatic aspects of a highway construction project, enabling them to consider how specic proposals align with the project goals and objectives. It is also critical to assure that the technical evaluators have appropriate experience, with no personal stake in the outcome or other bias in favor of or adverse to any of the proposers.35 e team should include a balanced representation of the various technical,  managerial, nancial, and programmatic experiences needed to assess the criteria with reference to the project objectives and scope (such as designers and engineers, lawyers, construction managers, project managers, nancial advisers, and others). Figure 7 shows dierent types of disciplines commonly in- cluded on evaluation teams and the frequency of their use by the agencies surveyed. Best value procurement evaluation teams are oen limited to public agency representatives with relevant expertise (i.e., from procurement, project management, engineering/design, preconstruction, legal, nancial, contract- ing, and planning), who are supported by third-party consul- 34 See Tran, supra note 3. 35 Id. mance evaluation, which involves the potential for reputational harm to the proposer and presents due process concerns.32 How- ever, as noted above, most protests are resolved in favor of the agency and do not necessarily aect the selection decision. 
One example concerns a protest of an agency’s shortlisting decision where a proposer was not included on the shortlist due to its past performance record. In response to an appeal by the unsuc- cessful proposer, the State comptroller held that the evaluation of current claims, delays, and liquidated damages was consistent with the RFP description of the process for evaluation of a pro- poser’s past performance. e comptroller also acknowledged that procuring agencies should be given deference in developing appropriate evaluation factors to determine the most qualied rms to be shortlisted for a project.33 D. Evaluation Team Selection and Use e structure and hierarchy of an evaluation team and the makeup of the team, including technical, managerial, and pro- grammatic personnel, are relevant in assessing the transparency and fairness of the process, as well as in addressing organiza- tional conict issues. Careful attention to the selection of a pro- posal evaluation team is essential since the team plays a signi- cant role in ensuring fairness and transparency, thus reducing 32 See Scott, supra note 5. 33 In the matter of the Appeal led by Tully, Impregilo/Salini, JV chal- lenging the Determination of the NY. State Dep’t of Transp. concerning Selection of Shortlisted Contractors for the Kosciuszko Bridge Project SF20130339 (State of NY Oce of the State Comptroller Aug. 17, 2013). 30% 31% 36% 51% 51% 63% 73% 74% 74% 86% 87% 87% 87% 89% 92% 0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 3rd party representatives Operations & maintenance manager/director Representatives from other agencies Operations & maintenance personnel Planning personnel Contracting officer 3rd party procurement consultants Financial manager/director Construction division manager/director In-house legal Procurement and contracts group Pre-construction manager Engineering / design Project manager / Resident engineer Procurement division manager/director Figure 7. 
Frequency using common evaluation team members for best value procurements (n = 31 state transportation agencies).

14 NCHRP LRD 90 A technical evaluation committee is composed of subject matter experts (SMEs) who are responsible for reviewing pro- posals and providing recommended ratings in their areas of expertise. For complex procurements, the committee may have multiple subcommittees tasked with considering dierent as- pects of the proposal, with a committee member acting as the chair of each subcommittee. Aer completing technical evalua- tions, the committee submits a strengths and weaknesses analysis of each proposal to the selection committee to assist it in making rating decisions. e committee members/subcommittee chairs are usually public agency representatives to avoid potential or- ganizational conicts. Some agencies include consultants as full members of technical evaluation subcommittees, while others prefer to limit them to an advisory role. A process oversight committee is a non-scoring committee whose members function as observers of the evaluation process for consistency. Members could include FHWA representatives, transportation agency executives or consultant representatives. Sometimes, the agency appoints a single individual to play this oversight role, as was the case for the Tappan Zee project in New York, which utilized a single oversight consultant. tants. e legislation applicable to the Minnesota Department of Transportation (MnDOT) includes an unusual requirement in that it requires a Technical Review Committee of at least ve individuals, including one person nominated by the Minnesota chapter of the Associated General Contractors (AGC) and ap- proved by the Transportation Commissioner.36 is require- ment is intended to reduce the incidence of protests based on the premise that proposers are more likely to trust a process that one of their peers is a part of. 
Notably, the Minnesota Supreme Court upheld MnDOT’s selection of the high bidder for the DB contract to reconstruct the I-35W bridge aer its 2007 collapse based on a determination that the agency followed the statutory procedures—including the requirement to have an individual nominated by the AGC of Minnesota as a member of the evalu- ation committee.37 Transportation agencies hire third-party procurement specialists as advisors for best value procurement in highway construction. ird-party consultants can also help agencies analyze the RFP and best value procurement process to identify issues that could lead to bid protests or complaints. However, while third-party consultants may have experience and knowl- edge vital to the best value procurement process, agencies will need to make business decisions with respect to the best value procurement, relying on experience and knowledge provided by the third-party consultants in areas where the agency personnel lack relevant experience. A survey and review of case studies of state transportation agencies identied practices used to ensure clarity, fairness, impartiality, and transparency when implementing best value procurement.38 ese practices include thoughtful formation of the evaluation committees and the structure for those com- mittees. e makeup of the best value evaluation team diers slightly from agency to agency and project to project. As shown in Table 3, many agencies generally include: A selection committee that evaluates the overall proposal (technical and price) in accordance with the RFP. Members of the committee conduct their own independent reviews but, in many cases, will rely on the input received from technical evalu- ation committees/subcommittees. e committee may make the selection decision based on a majority vote or using a con- sensus approach. e committee chair is responsible for docu- mentation of the process and results. 
e selection committee may rely on advisors (e.g., third-party procurement specialists) to assist with the process and advise on RFP and other require- ments relevant to the selection committee’s decisions. 36 Minn. Stat. §§ 161.3420. 37 Sayer v. Minn. Dept. of Transp., 790 N.W.2d 151 (Minn. 2010). See also C. Lopez del Puerto et al., Emergency Megaproject Case Study Protest: e Interstate Highway 35 West Bridge, J. Leg. Aff. Dispute Resolut. Eng. Constr. (2017), available for download at: https://www. google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=&cad=rja&uact= 8&ved=2ahUKEwjh96fFyer4AhUIIzQIHaT_DBwQFnoECAYQAQ&url= https%3A%2F%2Fwww.researchgate.net%2Fpublication%2F315321639_ Emergency_Megaproject_Case_Study_Protest_e_Interstate_Highway_ 35_West_Bridge&usg=AOvVaw3SAxw8clyUraRd9sagbY_W. 38 See Tran, supra note 3. Table 3. Sample of State DOT Evaluation Committee Structuresa State DOT Evaluation Committee Structure California Technical Review Committee Technical Subcommittees Process Oversight Committee Florida Selection CommitteeTechnical Review Committee Michigan Technical CommitteeCentral Selection Review Team Member Minnesota Selection Committee Technical Review Committee Technical Subcommittees Process Oversight Committee Technical Advisors New York Selection TeamTechnical Evaluation Team Oregon Selection Ocial Scoring Team Facilitator Technical Evaluation Support Personnel Observers Utah Selection Committee Technical Evaluation Committee Technical Review Committee a See Tran, supra note 3.

NCHRP LRD 90 15

E. Evaluation Rating Systems and Award Algorithms

An evaluation rating system is a tool that assists in the transparency and fairness of the best value selection process when applied consistently to all proposals. These rating systems define how transportation agencies allocate weights or scores for each evaluation criterion. In addition, the agency must decide which award algorithm to use to combine the evaluation criteria and evaluation rating system to determine the recommended award for a highway project.

1. Evaluation Rating Systems

Figure 8 illustrates the common rating systems used by transportation agencies for best value procurements (Satisficing, Modified Satisficing, Adjectival Rating, and Direct Point Scoring, which are described below), with the systems organized by increasing complexity from left to right.

Figure 8. Best value evaluation rating systems and level of complexity.a
a See Scott, supra note 5.

Figure 9 shows the frequency of the various evaluation rating systems used by the transportation agencies surveyed in this study, including internally developed evaluation rating systems, which may be:

• Adjectival-based rating systems with consensus rating or other methods of determining rankings for each proposal in each criterion. In some cases, evaluators assign adjectival rates that are later converted to scores. The ratings for individual criteria are then combined to determine an overall rating, with reference to the relative order of importance stated in the RFP.
• Point-based rating systems with weightings assigned in the RFP; each proposal is evaluated against the criteria, weightings applied, and scores totaled. Generally, agencies use a 100-point scale.
• Determining an overall score using a consensus approach or averaging individual member scores.

One problem with determining the overall rating based on averaging is that outlying ratings may skew the result. Some systems address the possibility of wide variations in scores by eliminating the high and low scores and then taking the average of the remaining individual members’ scores. However, this does not necessarily produce a better result, as the outlying rating may be based on an issue noted by the evaluator that was not readily apparent to the others. This concern can be eliminated by using a consensus rating system coupled with documentation of rationales for the rating, so that the proposers understand the agency’s rationale for the selection, reducing the likelihood of bid protests.

Within each system, the rating scheme and relative importance of criteria vary depending on the project. The relative weightings are typically set based on their relationship to the achievement of prioritized goals. In some cases, the enabling law requires certain criteria to be included, supplemented by additional criteria developed by the agency related to the goals of the project. Technical ratings are combined with price as specified in the RFP.

Figure 9. Frequency of using different best value evaluation rating systems (n = 32). [Reported usage, in the chart’s order: Modified Satisficing 62%; Satisficing 62%; Adjectival Rating 77%; Direct Point Scoring 80%; Internally Developed Rating System 89%.]
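The outlier-handling step described above (dropping the high and low individual scores before averaging) can be sketched as follows. This is an illustrative sketch only; the function name and the 100-point scale are assumptions, not taken from the digest.

```python
def trimmed_average(scores):
    """Average evaluators' scores for a criterion after dropping the
    single highest and single lowest score, limiting the influence
    of an outlying rating on the overall result."""
    if len(scores) < 3:
        # Too few evaluators to trim; fall back to a plain average.
        return sum(scores) / len(scores)
    trimmed = sorted(scores)[1:-1]  # drop one low and one high score
    return sum(trimmed) / len(trimmed)

# Five evaluators score one criterion on a 100-point scale; one outlier (40)
# drags the plain average down to 78.0, while the trimmed average does not.
scores = [85, 88, 90, 87, 40]
print(sum(scores) / len(scores))   # plain average: 78.0
print(trimmed_average(scores))
```

As the digest notes, trimming does not necessarily produce a better result: the outlying rating may reflect a genuine issue one evaluator spotted, which is why consensus scoring with documented rationales is offered as the alternative.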

16 NCHRP LRD 90

Satisficing

This rating system, also known as “Go/No Go,” is the simplest of all the evaluation rating systems. Evaluators establish minimum standards for each evaluation criterion in the form of a statement of the minimum level of compliance that a proposal must meet to be considered acceptable. Then, based on the minimum values, evaluators decide whether proposals are acceptable or not. Satisficing is an all-or-nothing process, and it is not critical to determine an accurate value for alternatives, as alternatives are either accepted or not accepted.

The main advantage of using satisficing is that the system can reduce the number of alternatives to evaluate. The primary disadvantage of using a satisficing rating system is that it is not an appropriate evaluation methodology when an agency wants to evaluate value-added features or alternative designs.

Best value procurements include a satisficing element since agencies typically review proposals on a pass-fail basis, including determining whether the proposal is responsive to the RFP requirements. To support the transparency of the procurement process and more easily defend against protests, the agency should consider in advance what might be considered a fatal deficiency and what consequences would apply (for example, whether the proposer would be allowed an opportunity to cure or whether the agency would reject the proposal). In addition, the agency should ensure that the procurement file includes a narrative justification for the decisions made.

Bid protests related to this aspect of a best value evaluation are generally filed by proposers alleging they were dropped from consideration due to minor deficiencies or by proposers objecting to the agency’s decision to waive deficiencies.

Two cases involving DBB projects are illustrative of the former type of protest. In DeSilva Gates Construction, LP v. Caltrans,39 Caltrans rejected DeSilva’s bid for an alleged failure to comply with a California law requiring contractors to identify, with their bids, all subcontractors performing work over a specified amount. After the bid due date, the protester submitted a bidders list that included a subcontractor not identified in the original bid. This information exceeded the IFB requirements because the subcontract was below the statutory threshold for listing. The trial court upheld the protest, ruling that the statute does not prohibit listing subcontractors below the threshold and that rejection of the bid was improper. The appellate court upheld the trial court’s decision. In addition to determining that DeSilva’s bid did not violate the listing law, the court noted that Caltrans had allowed the next lowest bidder to cure a deficiency in its bid but did not offer DeSilva a similar opportunity, thus giving the next lowest bidder an unfair advantage.

In the West Virginia case of Wiseman Constr. Co. v. Maynard C. Smith Constr. Co.,40 the procuring agency had rejected a bid for failure to include references. The court held that the agency failed to act reasonably in rejecting the bid since (1) the agency admitted the bidder was qualified, (2) the IFB required bidders to identify references on the forms provided but did not include a form to list the references, (3) only three of the six bidders provided reference information, and (4) the agency did not contact any of the references provided by the other three bidders. The court determined that the agency would have had full discretion to reject the bid had the bid forms included a placeholder for references, but since the agency’s error caused the problem and it did not rely on the references in making the award decision, the agency’s failure to waive the requirement constituted an arbitrary abuse of discretion.

The following cases provide examples of protests relating to agency decisions to waive deficiencies. Silver Bow Construction v. State Department of Administration Division of General Services41 involved an Alaska agency’s decision to accept a 15-page proposal responding to an RFP for DB services that limited the page count to 10 pages, based on the procurement officer’s conclusion “that Alaska Commercial’s proposal did not contain more substance than the others, that it was not in the State’s best interest to ‘needlessly reduce competition’ by disqualifying acceptable proposals ‘strictly on form,’ and that all four proposals had technical deficiencies.” The court held that the agency did not abuse its discretion in waiving the requirement and did not violate the protester’s equal protection rights.

In Fluor-Astaldi-MCM v. Florida DOT,42 concerning an RFP for a P3 project, a losing proposer raised numerous objections to the agency’s decision to award the contract to another proposer. Among other grounds for protesting the award, the protester alleged that the contract time proposed by the successful firm was unrealistic, rendering the proposal non-responsive (the agency had considered this issue in the evaluation and concluded the proposed contract time was aggressive but feasible). The protester further claimed that the financial proposal was non-responsive since its lender was not providing the full credit needed and relied on a pool of banks to make up the difference, and objected to the process followed by the agency in determining that the financial proposal was acceptable (the RFP did not require a specific credit amount or preclude the use of a pool of creditors and permitted the agency to seek clarification of the financial proposal). The protest also included other allegations of deficiencies in the technical and financial proposals. The Division of Administrative Hearings applied Florida law to find that the protester failed to prove that the agency’s decision to accept the proposal

39 DeSilva Gates Construction, LP v. Department of Transportation, 242 Cal. App. 4th 1409, 1412 (Cal. Ct. App. 2015), available at https://casetext.com/case/desilva-gates-contruction-lp-v-dept-of-transp.
40 Wiseman Constr. Co. v. Maynard C. Smith Constr. Co., 236 W. Va. 351 (W. Va. 2015), available at https://casetext.com/case/wiseman-constr-co-v-maynard-c-smith-constr-co.
41 Available at https://law.justia.com/cases/alaska/supreme-court/2014/s-15087.html.
42 Recommended Order by Division of Administrative Hearings available at http://rules.elaws.us/doahcase/17-005800bid. Petition for writ of certiorari denied by Court of Appeal, available at https://cases.justia.com/florida/first-district-court-of-appeal/2017-17-4726.pdf?ts=1510610858.

was clearly erroneous, contrary to competition, arbitrary or capricious.

In some cases, proposers may protest non-responsiveness determinations despite clear language in the RFP. A protest filed by Bombardier objecting to the rejection of its proposal for the Core Systems DBOM Contract for the Honolulu High-Capacity Transit Corridor Project43 concerned an exception taken by the proposer to the contract provisions relating to limitations on liability. The proposer objected to the contract language in multiple submittals and engaged in discussions with the agency, asking for the provision to be changed, but the agency retained the existing language. The proposer submitted a best and final offer (BAFO), including a statement that the proposal assumed that the contractor’s indemnification obligations were inadvertently omitted from the cap on liability. The agency determined that the BAFO included an unacceptable condition and rejected it. The Administrative Hearings Officer for the protest determined that the protest was untimely, the proposal was properly rejected as conditional, the integrity of the procurement process would be undermined if the proposal was upheld since the agency had already published all of the proposed prices, and the city had satisfied its duty to conduct meaningful discussions with the proposer. The circuit court upheld the hearings officer’s decision. The appellate court held that the protest was timely but that the error was harmless because the city had satisfied its duty to conduct meaningful discussions, and the proposal was properly rejected as conditional. The court also held that Bombardier’s protest arguments would undermine the integrity of the procurement process.

Modified Satisficing

This rating system relates to varying degrees of responsiveness by different proposers. Modified satisficing uses a range of possible ratings to allow evaluators to rate a given category of a proposal across a variety of degrees. Therefore, nearly responsive proposals can be given lower ratings instead of being rejected, which means that proposals that meet or exceed the published criteria would receive higher ratings. The agency must be able to differentiate between minor deficiencies that do not eliminate a proposal from continuing in the competition and major or fatal deficiencies that require a proposal to be rejected. The simplest form of modified satisficing is a discrete rating system of “Red-Amber-Green”: Green = fully responsive; Amber = not responsive, but the deficiency is minor; Red = not responsive due to a fatal deficiency. The last step in evaluating proposals with a modified satisficing system is to combine the ratings for each criterion to establish an overall rating for each proposal.

Bid protests related to this system might occur due to a lack of clarity in the RFP regarding deficiencies that automatically result in a “fail” determination and ambiguities in the rating system used. Agencies must also consider whether waivers are permitted under applicable law and whether (as was the case for the DeSilva case, supra) the circumstances require the agency to waive a particular deficiency.

Adjectival Ratings (Definitions, Performance Indicators, and Differentiators)

As an extension of modified satisficing, the adjectival rating system includes ratings such as “acceptable,” “good,” “very good,” and “exceptional,” providing a description for each rating and explaining what it means in the context of proposal evaluations. The agency provides training to evaluators to ensure that they are aware of the requirements for different ratings with respect to the evaluation criteria and that each team member is “on the same page” with respect to the prerequisites for the ratings. This system involves a more continuous scale than modified satisficing.

The three elements of an adjectival rating system are definitions, performance indicators, and differentiators. Definitions must be clear and relevant to the specific evaluated factors. Indicators for each of the adjectival ratings assist evaluators in determining the appropriate ratings for evaluation elements (e.g., Excellent, Very Good, Good, Fair). Adjectival ratings are generally considered to improve consistency among evaluators since they create a common frame of reference, but consistency can be further improved by establishing differentiators for specific grades/ratings. Table 4 includes an excerpt from a recent state highway agency RFP, showing how adjectival ratings are determined for certain evaluation criteria.

Direct Point Scoring

This evaluation system is the most complex since it allows for more rating levels. Although numerical scores may appear to provide a more precise distinction of merit and create an appearance of objectivity due to the quantitative values produced, in fact the process allows the evaluators significant discretion in determining ratings and is thus inherently subjective. To some extent, the level of discretion can be limited by requiring evaluators to apply predetermined scales or preferences in making the rating decisions. RFPs for direct point scoring projects should define the scoring system in a way that allows the proposers to understand what the agency is looking for.

The main advantage of direct point scoring is the flexibility it provides the evaluators. However, this method presents a number of challenges. First, it is not uncommon for evaluators to spend time trying to determine a precise score, and the focus on minutiae may result in compression of scoring, with the result that the differential in scores between one proposer and another may not accurately reflect the relative values of the different proposals. Proponents of adjectival ratings believe that they make it easier for the evaluator to focus on important aspects of the proposals instead of making scoring distinctions based on minor differences. In addition, direct point scoring may present challenges in resolving potential inconsistencies in scoring between individual evaluators. For example, reaching a consensus regarding the numerical score for an evaluation factor may be more difficult than agreeing on whether or not the factor meets the requirements for a specific adjectival rating. Some of these

43 Bombardier Transp. (Holdings) USA Inc. v. Dir., Dep’t of Budget & Fiscal Servs., 289 P.3d 1049 (Haw. Ct. App. 2012).
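The Red-Amber-Green screen described under Modified Satisficing can be sketched as follows; the criterion names and the rule that any single Red is fatal are illustrative assumptions based on the description above, not the digest's prescription.

```python
def screen_proposal(ratings):
    """Modified-satisficing screen using a discrete Red-Amber-Green scale:
    Green = fully responsive; Amber = not responsive, but the deficiency
    is minor; Red = not responsive due to a fatal deficiency."""
    values = ratings.values()
    if "Red" in values:
        return "rejected"  # assumed rule: any fatal deficiency rejects the proposal
    if "Amber" in values:
        return "responsive with minor deficiencies"
    return "fully responsive"

# Hypothetical per-criterion ratings for one proposal
ratings = {"schedule": "Green", "quality plan": "Amber", "traffic control": "Green"}
print(screen_proposal(ratings))  # responsive with minor deficiencies
```

A real RFP would also need to define, in advance, which deficiencies trigger a Red rating, which is exactly the clarity issue the digest identifies as a source of protests.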

difficulties may be resolved by providing the evaluators with training and an explanation of differentiators between different ratings.

2. Award Algorithms

Best value award algorithms establish the method that will be used to combine best value parameters, evaluation criteria, and evaluation rating systems to enable the selection committee to make a final award recommendation. As shown in Table 5 and explained below, transportation agencies use several different methods. Award algorithms for best value procurement can be classified by three types: (a) meets technical criteria–low bid, (b) value unit price (e.g., fixed budget–best proposal, adjusted bid, adjusted score, weighted criteria, and quantitative cost–technical tradeoff), and (c) qualitative cost–technical tradeoff. Below is a summary of the award algorithms, including each value unit price option and how transportation agencies might apply each algorithm to reduce the potential of bid protests or complaints.

Figure 10 shows transportation agency use frequency for various award algorithms, including general and internally developed algorithms. Statutory requirements for algorithms constrain some agencies. For example, Arizona’s DB law requires the DOT to announce the technical proposal score for each proposer, then publicly open the sealed price proposals and divide each proposer’s price by its score to obtain an adjusted score. The department calculates cost divided by technical score and awards the proposer with the lowest adjusted number.

Examples of internally developed algorithms used by state transportation agencies surveyed include:

• A 30/70 to 40/60 technical score/price score weighted breakdown, with the highest overall score determining the apparent best value. Example (assuming a maximum technical score of 100):

  Offeror Score = 60 × (lowest bid ÷ offeror’s bid) + 40 × (offeror’s technical score ÷ 100)

• A hybrid cost algorithm using a curve-fitting formula that reduces the price score as price proposals increase.
• Assignment of a Quality Credit Percentage to each Technical Proposal based on the proposal’s overall consensus Technical Score. In order to determine the Quality Credit Percentage, the State Contract Officer uses a table based on the maximum quality credit percentage, which is established for each project. The Technical Review Committee may elect to assign point values to the nearest one-half of a point. In such an event, the Quality Credit Percentage is determined by linear interpolation. The Quality Credit Percentage is used along with cost comparison in a weight breakdown based on the project.

Table 4. Example of Adjectival Rating for Different Evaluated Areas

Excellent
- Project Management/Quality Management: The Project Management/Quality Management Value Added Responses (VARs) provide superior benefits and value and/or result in outstanding improvements in implementation or level of overall quality of the Project. There are no questions, concerns or weaknesses.
- Design and Construction Plan (D&C Plan): The D&C Plan greatly exceeds the requirements of the evaluation subfactor and provides superior benefits and value, and/or results in outstanding improvements in implementation and level of overall quality of the Project. There are no questions, concerns or weaknesses.

Very Good
- Project Management/Quality Management: The VARs provide significant benefits and value and/or result in meaningful improvements in implementation or level of overall quality of the Project. Questions, concerns or weaknesses are very minor.
- D&C Plan: The D&C Plan exceeds the requirements of the evaluation subfactor and provides significant benefits and value, and/or results in meaningful improvements in implementation and level of overall quality for the Project. Questions, concerns or weaknesses are very minor.

Good
- Project Management/Quality Management: The VARs provide added benefits and value and/or result in improvements in implementation or level of overall quality of the Project. Questions, concerns or weaknesses are minor.
- D&C Plan: The D&C Plan exceeds the requirements of the evaluation subfactor and provides added benefits and value, and/or results in improvements in implementation and level of overall quality for the Project. Questions, concerns or weaknesses are minor.

Meets Minimum
- Project Management/Quality Management: There are no VARs that provide added benefits and value or result in improvement in the implementation or quality of the Project. (The evaluation factor receives zero points.)
- D&C Plan: The D&C Plan is responsive and meets the minimum requirements of the evaluation subfactor. There are no unique or innovative characteristics. There may be questions, concerns or weaknesses.
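The Arizona-style weighted formula quoted earlier (60 points for price, 40 for technical, assuming a maximum technical score of 100) can be worked through numerically. The bid prices and technical scores below are hypothetical.

```python
def offeror_score(bid, lowest_bid, tech_score, max_tech=100):
    """60/40 price/technical weighting: the lowest bid earns the full
    60 price points; technical points scale with the technical score."""
    return 60 * (lowest_bid / bid) + 40 * (tech_score / max_tech)

# Hypothetical offers: (bid price, technical score out of 100)
offers = {"A": (1_200_000, 80), "B": (1_000_000, 70), "C": (1_100_000, 90)}
lowest = min(bid for bid, _ in offers.values())
for name, (bid, tech) in offers.items():
    print(name, round(offeror_score(bid, lowest, tech), 1))
# C's stronger technical score outweighs B's price advantage in this example.
```

Note how the structure of the formula guarantees the award can go to a proposer other than the low bidder only when the technical gap is large enough relative to the price gap.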

Table 5. Award Algorithm Methodsa

Meets Technical Criteria–Low Bid
- Algorithm: If T > Tmin, award to Pmin; if T < Tmin, the proposal is non-responsive. (T = Technical Score; P = Project Price)
- Award determination: lowest price.

Value Unit Price (award determined by numerical analysis using point scoring, a mathematical combination of price and non-price factors, or a quantitative tradeoff analysis)
- Adjusted Bid: AB = P / T; award to ABmin. (AB = Adjusted Bid)
- Adjusted Score: AS = (T × EE) / P; award to ASmax. (AS = Adjusted Score; EE = Engineer’s Estimate)
- Weighted Criteria: TS = W1S1 + W2S2 + … + WiSi + W(i+1)PS; award to TSmax. (TS = Total Score; Wi = Weight of Factor i; Si = Score of Factor i; PS = Price Score)
- Quantitative Cost–Technical Tradeoff: TIncrement = [(Tj/Ti) – 1] × 100%; PIncrement = [(Pj/Pi) – 1] × 100%. If TIncrement > PIncrement, award to Proposal i; if TIncrement < PIncrement, retain Proposal j for possible award and repeat with Proposal j+1; repeat the process until TIncrement > PIncrement.
- Fixed Price–Best Proposal: award to Tmax at the fixed price P.

Qualitative Cost–Technical Tradeoff
- Similar to the above, only with no quantitative analysis of the difference; the evaluation panel reaches consensus as to which proposal offers the best value in the proposed scope through a qualitative tradeoff analysis of cost and technical factors.

a See Scott, supra note 5.

Figure 10. Frequency of transportation agencies using various best value award algorithms (n = 30). [Reported usage, in the chart’s order: Meets Technical Criteria–Low Bid 43%; Qualitative Cost–Technical Tradeoff 63%; Value Unit Price 71%; Internally-Developed Algorithm 81%.]
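The adjusted bid and adjusted score formulas in Table 5 translate directly into code. This is a sketch; the variable names follow the table, and the worked example uses Contractor B's figures from Table 8.

```python
def adjusted_bid(price, tech_score):
    """AB = P / T; the proposal with the lowest adjusted bid wins."""
    return price / tech_score

def adjusted_score(tech_score, engineers_estimate, price):
    """AS = (T x EE) / P; the proposal with the highest adjusted score wins."""
    return tech_score * engineers_estimate / price

# Contractor B in Table 8: price $1,250,000, technical score 0.95
print(round(adjusted_bid(1_250_000, 0.95)))  # 1315789
```

The two formulas are mirror images: adjusted bid rewards a high technical score by shrinking the effective price, while adjusted score rewards a low price by inflating the effective technical score.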

• Points are allocated to each section of the proposal based on the relevance of that section to the project goals. This requires the agency to develop and prioritize a well-thought-out set of goals for each project.
• Assignment of a point value for each specific evaluation criterion as applied to the proposal. An evaluation team of no fewer than five people scores the proposal individually, and then a consensus meeting is held to determine a final score.
• A two-part technical score based on a combination of the scores received on the letter of interest and the technical proposal. The technical score is divided into the price proposal score, and the proposal with the lowest combined score is declared the best value proposal.

The Missouri DOT has used two algorithms to determine best value: one involving a pass/fail determination regarding price reasonableness and the other involving an integrated cost assessment. Both algorithms include a pass/fail determination regarding price reasonableness compared to an engineer’s estimate. For the former algorithm, if the price allocation is found to be unreasonable, that proposal is considered unresponsive and not considered further. For the latter, the price is evaluated further and scored, with points assigned that are then integrated into the team’s overall score for the proposal.

Meets Technical Criteria–Low Bid

Price is the most important criterion for this award algorithm (see Table 6). All non-cost criteria are evaluated using a predetermined rating system. Direct point scoring might be used to determine whether the technical proposal meets a minimum technical score. The agency determines a competitive range that includes proposals that meet the technical criteria and are considered fully responsive. The bidder within the competitive range with the lowest price proposal is selected. This variation of best value most closely resembles the traditional DBB low bid procurement process.

Fixed Budget–Best Proposal

This value unit price algorithm is a variation of the best value procurement procedure in which an agency stipulates the contract price in the proposal request, as well as the qualitative and technical evaluation factors for project elements upon which the selection will be determined. Each proposer submits a technical proposal addressing the qualitative and technical factors for the stipulated price. Proposals are then evaluated and rated based on the non-cost factors since the price is fixed, and the highest rated technical proposal is selected for award at the stipulated price. In a “build to budget” variation, the best value proposer is selected based on how many optional elements of the project the proposal commits to build. Table 7 provides an example of the fixed price–best proposal award algorithm.

Adjusted Bid

Adjusted bid best value procurement is a two-stage value unit price algorithm in which a bid is first analyzed based on technical merit and is then scored using pre-defined technical criteria. After the technical score is determined, the price component of the bid is opened and analyzed. The price component is then divided by the technical score, and the lowest adjusted bid is the winning bid. An example of an adjusted bid award algorithm is shown in Table 8.

Adjusted Score

Conversely to adjusted bid, adjusted score is a two-stage best value award algorithm in which the agency analyzes a technical bid and assigns a technical score. Then, the agency opens and analyzes the price component of the bid. The adjusted score is calculated by multiplying the technical score by the estimated total

Table 6. Meets Technical Criteria–Low Bid Award Algorithm Example

Offeror | Technical Score (60 Max; 40 Min) | Price Proposal Amount
Contractor A | 51 | $1,400,000
Contractor B | 53 | $1,200,000
Contractor C (Best Value) | 44 | $1,100,000
Contractor D | 39 | Non-responsive

Table 7. Fixed Price–Best Proposal Example

Offeror | Technical Score (100 Max)
Contractor A | 91
Contractor B (Best Value) | 93
Contractor C | 84
Contractor D | 79
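The selection in Table 6 can be reproduced with a short sketch. The dictionary layout is an assumption; Contractor D (technical score 39, below the 40-point minimum) is omitted from the data since the table lists no price for a non-responsive proposal.

```python
def meets_criteria_low_bid(proposals, min_tech=40):
    """Meets technical criteria-low bid: drop proposals below the minimum
    technical score, then award to the lowest price in the competitive range."""
    competitive = {name: p for name, p in proposals.items() if p["tech"] >= min_tech}
    return min(competitive, key=lambda name: competitive[name]["price"])

# Table 6 data (technical score: 60 max, 40 min)
table6 = {
    "Contractor A": {"tech": 51, "price": 1_400_000},
    "Contractor B": {"tech": 53, "price": 1_200_000},
    "Contractor C": {"tech": 44, "price": 1_100_000},
}
print(meets_criteria_low_bid(table6))  # Contractor C
```

Once the technical screen is passed, technical merit plays no further role: Contractor C wins on price despite having the lowest score in the competitive range, which is why this variation most resembles DBB low bid.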

project price (e.g., engineer’s estimate) and dividing the result by the submitted proposal bid price. The contractor with the highest adjusted score is awarded the project, as shown in Table 9.

Weighted Criteria

This is another two-stage value unit price award algorithm. The agency scores the technical proposal using predetermined criteria that are weighted by deemed importance. Then, the price proposals are scored by assigning the maximum price score to the lowest price proposal and determining the price proposal score of the other proposals by assigning a score proportional to the lowest bid score. This may be calculated by dividing the lowest price by the price in the proposal being evaluated and multiplying that number by the maximum price score. As shown in Table 10, the technical and price scores are then added to determine a proposal’s total score, and the proposal with the highest total score is awarded the project. This algorithm has been called one of the most intuitive and transparent approaches.44, 45 However, as noted previously, using number scores does not eliminate subjectivity in determining ratings. Furthermore, as is the case with all formula-based approaches, this algorithm may result in a designated “best value” proposal that, for reasons not considered in setting the formula, is not the right answer for the project.46 When a formula is used, agencies

44 See M. Calahorra-Jimenez et al., Importance of Noncost Criteria Weighing in Best Value Design Build US Highway Projects, J. Manage. Eng. (2021).
45 See Molenaar, supra note 1.
46 See, for example, the unpublished case of Siemens Transp. v. Metropolitan Council, No. C8-00-2213 (Minn. Ct. App. Jun. 19, 2001). That case involved a best value procurement for 18 light rail vehicles with an option for the agency to purchase additional cars at the same project

Table 8. Adjusted Bid Award Algorithm Example

Offeror | Technical Score | Price Proposal | Adjusted Bid
Contractor A | 0.85 | $1,200,000 | $1,411,765
Contractor B | 0.95 | $1,250,000 | $1,315,789
Contractor C (Best Value) | 0.90 | $1,150,000 | $1,277,777
Contractor D | 0.70 | $1,100,000 | $1,571,429

Table 9. Adjusted Score Award Algorithm Example

Offeror | Technical Score (1,000 Max) | Price Proposal | Calculation (Engineer’s Estimate = $10M) | Adjusted Score
Contractor A | 930 | $10,937,200 | 930,000,000 ÷ 10,937,200 | 85
Contractor B (Best Value) | 890 | $9,000,000 | 890,000,000 ÷ 9,000,000 | 99
Contractor C | 940 | $9,600,000 | 940,000,000 ÷ 9,600,000 | 98
Contractor D | 820 | $8,700,000 | 820,000,000 ÷ 8,700,000 | 94

Table 10. Weighted Criteria Award Algorithm Example

Offeror | Technical Score (60 Max) | Calculation of Price Score | Price Score (40 Max) | Total Score (100 Max)
Contractor A | 51 | (1,000,000 × 40) ÷ 1,200,000 | 33 | 51 + 33 = 84
Contractor B (Best Value) | 53 | (1,000,000 × 40) ÷ 1,250,000 | 32 | 53 + 32 = 85
Contractor C | 44 | (1,000,000 × 40) ÷ 1,100,000 | 36 | 44 + 36 = 80
Contractor D | 39 | (1,000,000 × 40) ÷ 1,000,000 | 40 | 39 + 40 = 79
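The arithmetic behind Table 10 can be reproduced as follows. This is a sketch; rounding price scores to whole points mirrors the table's values, though the digest does not state the exact rounding rule used.

```python
def price_score(price, lowest_price, max_price_score=40):
    """Lowest price earns the maximum price score; others are scaled
    by (lowest price / price) x maximum price score, per Table 10."""
    return round(lowest_price / price * max_price_score)

# Table 10 data: technical scores (60 max) and price proposals
technical = {"A": 51, "B": 53, "C": 44, "D": 39}
prices = {"A": 1_200_000, "B": 1_250_000, "C": 1_100_000, "D": 1_000_000}
lowest = min(prices.values())

totals = {k: technical[k] + price_score(prices[k], lowest) for k in prices}
print(totals)                       # {'A': 84, 'B': 85, 'C': 80, 'D': 79}
print(max(totals, key=totals.get))  # B, the best value in Table 10
```

Note how close the totals are (79 to 85): small shifts in either the weights or the rounding rule could change the winner, which is one reason the digest urges agencies to test formulas against different scenarios.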

should carefully consider different scenarios to ensure that the formula will produce the expected result.

Quantitative Cost–Technical Tradeoff

In this value unit price award algorithm, the agency uses a formula to assign a numerical value to the incremental difference between the highest technical score and other technical scores. The agency also uses a formula to provide a numerical value to the incremental difference between price scores. As shown in Table 11, the agency awards the contract to the proposer with the lowest price unless a proposal’s added price increment is offset by the proposal’s added technical score increment. This algorithm relies on the selection official to make intelligent decisions instead of constraining the selection process based on a predetermined formula.47

Table 11. Quantitative Cost–Technical Tradeoff Award Algorithm Example

Proposal | Price | Weighted Score | Price Increment | Score Increment
A (Best Value) | $4.0M | 300 | NA | NA
B | $4.3M | 400 | +8% | +33%
C | $4.4M | 405 | +3% | +1%

Note: NA = Not Applicable.

Qualitative Cost–Technical Tradeoff

For this award algorithm, an agency engages in an evaluation process that requires the agency to determine whether the benefits (if any) offered by higher-priced proposals in comparison to lower-priced proposals outweigh the cost differential between the proposals. The agency selects the proposer that it deems to provide the best value to the agency.48 The agency’s evaluation process must be consistent with the RFP documents and have a rational basis. Figure 11 is an example of a qualitative cost–technical tradeoff decision model.

Figure 11. An example of a qualitative cost–technical tradeoff decision model.

F. Procurement Procedures

The best value procurement procedures for a transportation project are generally constrained to some extent by governing statutes and regulations, as well as the agency’s internal policies,

46 (continued) price. Bombardier offered a price for lower quality vehicles that allowed the agency to order 22 vehicles within its budget, while Siemens offered a higher quality vehicle and a higher price. Even though Siemens received the highest score under the formula in the RFP, the agency relied on other language in the RFP stating that award would be made to the proposal that, when considered in its entirety, best conforms to the overall long-term interests of the council. The court upheld the district court’s decision not to enjoin award to Bombardier, finding that the agency’s award decision was not arbitrary, capricious, or unreasonable.
47 See Scott, supra note 5.
48 The following sources include descriptions of the process followed for a qualitative cost–technical tradeoff evaluation: Best Practices Procurement & Lessons Learned Manual, Federal Transit Administration, FTA Report No. 0105 (Washington, DC: 2016), available at https://www.transit.dot.gov/sites/fta.dot.gov/files/docs/funding/procurement/8286/fta-best-practices-procurement-and-lessons-learned-manual-2016.pdf; Nancy Smith et al., Public-Private Partnership (P3) Procurement: A Guide for Public Owners, Federal Highway Administration (Mar. 2019), available at https://www.fhwa.dot.gov/ipd/pdfs/p3/toolkit/p3_procurement_guide_0319.pdf. See also the description of the evaluation process for the Utah Department of Transportation’s I-15 Reconstruction Project, starting on p. 19 of the Department’s Initial Report dated Aug. 19, 1997, I-15 Reconstruction Project, Special Experimental Project No. 14 (SEP-14).
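The increment formulas behind Table 11 ([(Tj/Ti) − 1] × 100% and the price analogue, per Table 5) can be checked numerically; the table reports the results rounded to whole percentages (e.g., proposal B's +8% price and +33% score increments relative to A).

```python
def pct_increment(base, other):
    """Percentage increase of `other` over `base`: [(other / base) - 1] x 100."""
    return (other / base - 1) * 100

# Proposals A and B from Table 11 (price in $M, weighted technical score)
price_a, score_a = 4.0, 300
price_b, score_b = 4.3, 400

print(f"price +{pct_increment(price_a, price_b):.1f}%")  # price +7.5%
print(f"score +{pct_increment(score_a, score_b):.1f}%")  # score +33.3%
```

Whether B's +33% technical gain justifies its +8% price premium is, as the digest emphasizes, a judgment for the selection official; the formulas only quantify the increments being traded off.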

Best value procurement is becoming more popular with transportation agencies because it allows them to consider factors other than cost. However, best value procurement is also more complex and can lead to protests if not conducted properly.

NCHRP Legal Research Digest 90: Best Value Procurement for Highway Construction: Legal Issues and Strategies, from TRB's National Cooperative Highway Research Program, addresses the best value procurement systems used for highway projects and notes the flexibility regarding selection criteria, rating systems, and award algorithms.
