The Challenge of Applying the LPTA Process to the Procurement of Complex Services

November 2012

As this TASC white paper was going to press, Under Secretary of Defense Frank Kendall released his “Better Buying Power 2.0” initiatives. Among them was this:

When LPTA is used, define Technically Acceptable to ensure needed quality:
Industry has expressed concerns about the use of Lowest Price, Technically Acceptable (LPTA) selection criteria that essentially default to the lowest price bidder, independent of quality. Where LPTA is used, the Department needs to define TA appropriately to ensure adequate quality.

TASC welcomes this policy directive and hopes this white paper, and particularly the discussion and examples in Sections 4-6, will advance understanding within the acquisition community.

Executive Summary

In today’s tightening fiscal environment, parts of the Government are moving to increased reliance on the Lowest Price Technically Acceptable (LPTA) source selection method.1 Under defined circumstances, the LPTA process can control costs and result in best value for the Government. But the LPTA process, which can eliminate agency discretion to value technical and other non-cost superiority, is not appropriate for all acquisitions. Within the context of professional services, even under the best of circumstances, the LPTA method creates risks that need to be mitigated by a precisely drafted solicitation and a technically rigorous, carefully managed proposal evaluation process.

First and foremost, we recommend that solicitations for complex professional services avoid using the LPTA source selection process. In our view, such solicitations should instead adopt a classic best-value/cost-technical tradeoff approach to afford the Government flexibility to value technical superiority relative to potential cost savings.

However, if the Government decides to adopt the LPTA approach for such professional services, we recommend that the Government mitigate the risk of such an approach by: (a) setting the values and metrics for the acceptability of services (e.g., the size and breadth of the offeror’s team, domain expertise, personnel qualifications and certifications, and access to and mastery of processes, technology, and tools) high enough to ensure the Government receives true best value without sacrificing quality; and (b) including stringent past performance requirements.


1. The Best Value Continuum

Under Federal Acquisition Regulation (FAR) Part 15, an agency can obtain “best value” by using any one or a combination of source selection approaches in which the relative importance of cost or price may vary. Making cost or price dominant through an LPTA process is one such approach, and is appropriate in an acquisition in which both “the requirement is clearly definable” and “the risk of unsuccessful contract performance is minimal.” FAR 15.101. Conversely, when an acquisition has less definitive or changing requirements and greater performance risk, then technical or past performance considerations, or both, should play a more dominant role in achieving best value. Id.

By definition, the LPTA process is to be used when the best value is expected to result from selection of the technically acceptable proposal with the lowest evaluated price. FAR 15.101-2. LPTA source selections are prescribed for scenarios in which the Government would not realize any additional value from a proposal exceeding the Government’s minimum technical or performance requirements. The LPTA process is often used for acquisitions of commercial or non-complex services or supplies that are clearly defined and expected to be low risk.2 As a result, the LPTA process has traditionally been reserved for commodities or near-commodities (e.g., facility maintenance services, laundry services, and custodial services).

The Department of Defense (DoD) has made it clear that although the LPTA process can yield lower costs than other evaluation methods, it is by no means a panacea for budget pressures. Shay Assad, the DoD’s director of defense pricing, recently stated that “there’s absolutely no policy from DoD that says we want you to start using low-price, technically acceptable approaches.”3 He went on to stress that only after acquisition officials have fully examined their requirements and determined that there is no value in providing a level of service above a certain level should they make price the decisive factor.

One former Under Secretary of Defense for Acquisition, Technology and Logistics has sounded the alarm on over-reliance on the LPTA process:

The LPTA tool -- generally used for the purchase of items that are commodities where there is little difference among competing offerings -- is failure waiting to happen, an example of when low cost can become very expensive.

Of great concern is the fact that shifting to LPTA from the normal ‘best value’ buying practice -- where risk, performance and cost are all considered -- is increasingly being seen across the Federal Government in response to budget shrinkages. This is a dangerous trend, especially when acquiring critical professional services. The impact will be disastrous for the nation, and it must be stopped.

The nation cannot move to ‘buying cheap’ for critical services; the long-term costs are too high.4

Organizations such as the Professional Services Council have echoed this warning.5

2. LPTA in the Context of Complex Professional Services

With reference to the two-part test of FAR 15.101, first, few if any professional services for mission support would seem to have “clearly definable” requirements. Rather, those requirements tend to be flexible and evolving, for very good reasons anchored in the complexity of the missions those services support. Second, it is rare with professional services for mission support that “the risk of unsuccessful contract performance is minimal.” Generally speaking, mission support services tend to be critically important to national security and must work the “first time and every time.” This combination of complexity and mission essentiality equates to far more than “minimal” risk.

In such circumstances, adopting an LPTA approach puts mission success at risk and can increase total program costs once rework and related costs are factored in. Systems engineering and integration (SE&I), systems engineering and technical assistance (SETA), and advisory and assistance services (A&AS) providers support the Government in controlling the costs of the prime development contractors and the mission enterprises by providing verification of developer statements; identifying and recommending mitigations of execution risks; and assisting the Government in determining the most effective courses of action for the future. Stronger mission support services actually reduce total program costs. Indeed, relatively small expenditures for high-quality systems engineering, independent validation and verification, and similar services can drive enormous cost savings in the large development contracts for such systems.

The Government is not well served by using the LPTA source selection process for highly complex professional services, such as high-end engineering service contracts in support of space systems acquisition. The LPTA method is ill-suited for such acquisitions due to both the highly complex nature of the services to be provided and the level of expertise needed to minimize risk on numerous critical national security space programs.

Under a mission support contract providing such a broad range of high-end engineering and other technical advisory and assistance services, the Government will likely realize additional value from a proposal exceeding its bare minimum technical and performance requirements. Within the framework of FAR 15.101, it would be difficult for the Government to state requirements for such complex and variable services throughout the entire period of performance in a manner so “clearly defined” as to warrant the traditional LPTA approach of giving no extra value to technical excellence. In addition, the risk to the mission of unsuccessful performance under such a contract is far more than “minimal.” Successful performance may require more than merely satisfying all requirements in the statement of work. Mission failures could result from, for example, failure to stretch capabilities and technologies to address unforeseen circumstances; failure to bring proper expertise to bear; or a situation in which all deliveries are made but have minimal impact on the mission.

Even if the stated requirements were met and all deliverables made, unexpected cost growth on the development contract, breaches of Nunn-McCurdy thresholds, and similar outcomes could mean the service contractor was “unsuccessful,” as contemplated in FAR 15.101. Therefore, weighty consideration of non-cost/price factors such as technical qualifications, management acumen, and past performance seems warranted. For high-end engineering service contracts, we urge the Government to employ a classic best-value solicitation that retains its flexibility to make an informed cost-technical tradeoff and to select other than the lowest priced offer.

3. LPTA Risks

A. Program Risks

Employing an LPTA process in a solicitation for complex mission support services can result in, among other problems, significantly underbid work effort, increased risk of failure, and inability to meet customer needs on critical programs. An LPTA evaluation process would seriously degrade the Government’s opportunity to acquire engineering thought leadership and technical superiority and would increase the risk that the awardee will not have the technical qualifications to meet challenging current and future acquisition needs. In contrast, superior professional capabilities would help the Government save execution costs across a program in the long run.

Choosing a contractor based solely on the “lowest price” may cost the Government more in the end if rework and retraining become necessary. Any shortcomings in the performance of a high-end professional services contract could, in turn, increase overall program cost, including development contractor costs stemming from less-than-excellent systems engineering, verification and validation, and other work performed by the service contractor. The Government also loses value if schedule reliability and innovation are eliminated as evaluation criteria. Innovation can save money in the mission being supported, but it requires staff members who know more than how to accomplish the rote elements of the job; they have to understand the reasons for their activities so they can continuously rethink and improve how they and the Government approach each task. This is one reason an LPTA approach is often confined to basic services such as facility maintenance.

B. Systemic Risks

There are other significant risks inherent in the LPTA approach, beyond the immediate program. Extending LPTA into areas where it should not be used, such as non-commoditized, complex, mission-critical services, can cause systemic problems. Overuse of LPTA forces contractors to cut their general and administrative (G&A) and overhead costs to the bone. This is not all to the good.

For example, one of the first areas to be cut under LPTA-driven pressure is independent research and development (IR&D). This puts the government at risk of losing access to future technology innovation and consequently the cost savings associated with technical advance. The Cold War model of government-led technology innovation is obsolete except in a few isolated pockets of science and advanced technology. At the same time, the government has a difficult time getting all the technical innovation it needs from the commercial private sector. So the Government depends greatly on government contractor IR&D investments to get an adequate level of technical advancement and innovation -- and concomitant cost savings from substituting new technologies for labor and for more expensive, outmoded technologies. The LPTA model squeezes IR&D and therefore constricts government customers’ access to future technical innovation.

Learning and development (L&D) is another area susceptible to cost cutting under LPTA-driven pressures. But L&D is an area in which high-end professional-services contractors -- those who more often perform complex, mission-critical services -- must invest. L&D builds human capital; it helps transfer knowledge from more experienced (and more expensive) engineers to less experienced (and less expensive) engineers; it helps keep the entire workforce up-to-date on emerging technologies, tools, and processes; and it keeps young workers interested in and committed to government contracting. Without enough L&D, both the technical edge and the industrial base erode.

The overuse of the LPTA process runs another systemic risk: breaking up teams in the national asset class. In some areas, the institutional knowledge of a vital government mission resides as much within the industrial base as within the government itself, and sometimes even more so. (This is particularly true when the government program is led by active duty military personnel on fairly short rotations.) When a systems engineering or other mission support contract shifts from a classic best-value, cost-technical tradeoff approach to an LPTA approach, there is a substantial risk that the incumbent contractor will be displaced; indeed, that may be precisely what is intended. But not all of the incumbent team, and perhaps few of them, will move to the new, low-priced contractor. If the team is broken up, the government has lost an important mission capability, even if it saves money in the short run. Without that capability, the government will have a much harder time keeping the prime development contractors and the programs on schedule and within budget, not only on the immediate program but also on follow-on or kindred future programs.

In sum, overuse of LPTA will lead to undesirable systemic outcomes, including under-investment in technology and human capital development, and even the destruction of human capital accumulated into national-asset-class teams. The Government should guard against such overuse and should issue clearer guidance about when LPTA should be used and, conversely, when classic best-value, cost-technical tradeoff source selection should be maintained for complex, mission-critical work.

4. Defining “Technically Acceptable” in an LPTA Process

If the Government uses the LPTA evaluation process for high-end engineering services, the standards for technical performance should be precisely defined to set the bar sufficiently high to permit industry leaders to compete on price. The Government should ensure that, to receive an “acceptable” rating on technical criteria of such contracts, an offeror meets appropriately rigorous standards for technical approach, key personnel, and facilities.

The DoD prescribes that the acceptability of the product or service must be addressed in every LPTA source selection through consideration of one or more non-price evaluation factors/subfactors.6 These factors/subfactors are the uniform baseline against which each proposal is evaluated and identify the minimum requirements that are key to successful contract performance, enabling the Government to make a determination of acceptability. To be considered awardable, an offeror must receive an “acceptable” rating in every non-price factor/subfactor. (This approach can preclude evaluators from prioritizing or weighting subfactor scores, because award turns on whether a proposal minimally meets all criteria.) LPTA non-price factors/subfactors typically include technical and past performance criteria. The term “technical” refers to non-price factors other than past performance and may include the technical approach, key personnel and qualifications, and facilities.

Under the LPTA process, the assessment of whether offerors meet these standards is the Government’s only opportunity to weigh non-cost and technical abilities in making awards for these mission-critical services. Thus, the standards should be rigorous in all areas -- including the size and breadth of the offeror’s team, domain expertise, personnel qualifications and certifications, and access to and mastery of processes, technology, and tools -- even if that means some offerors are found unacceptable or only one offeror is technically acceptable.7 One way to make an LPTA assessment more robust is to require presentations by the key personnel to demonstrate their mastery of the domain and their ability to deal with varying priorities, changing needs, and unexpected situations. Scenario-based evaluations that require offerors to propose solutions to specific problems that may arise during the period of performance are another valuable tool for evaluating an offeror’s technical acceptability. The offeror should have to demonstrate that it understands the work well enough to solve problems and that it has the flexibility and adaptability to address changing priorities and circumstances -- qualities that a bare pass/fail assessment does not always bring out. Carefully drafted technical factors/subfactors help ensure that only truly competent offerors are deemed “acceptable,” mitigating the risk of unsuccessful performance under the contract.

5. Examples of Applying Rigorous Technical Requirements in an LPTA Process

We hope a few examples will be helpful. Consider an RFP requirement for “long-term analyses” -- a vital function affecting mission-critical work -- taken from a hypothetical high-end engineering services solicitation; it illustrates how an LPTA process can be made technically rigorous. Assume the RFP’s requirement for long-term analyses was stated simply as follows:

The contractor shall conduct long-term analyses supported by modeling and simulation (M&S), other software tools, requirements, and compliance documents when necessary. These analyses shall require responses in approximately 6–12 months. Types of analysis are shown in Appendix A.

Under the typical LPTA approach used for basic services, an offeror trying only to meet minimum requirements could satisfy this without demonstrating technical strength or an appreciation for the customer mission by proposing something like the following:

Using MS Excel spreadsheets, the contractor assigns weights, based on outside subject matter expert (SME) assessments, to factors that contribute to accomplishing some action (e.g., sensor detection). These results are then rolled up into 0–100% metrics indicating achievement of goals.

This answers the bare requirement to produce a series of recurring documents but demonstrates no understanding of the mission. In a classic LPTA approach, the Government would be hard pressed to find this technically unacceptable and thus would have to make award if this offeror were low-priced (or face the possibility of a protest), yet the offeror has only said that it knows how to use Excel and can produce reports.

In contrast, an offeror could propose the following technical approach that provides significantly more value to the mission:

Contractor will collaborate with stakeholders including outside experts, operators, warfighters, and others to collect requirements, document current systems, and identify threats. Contractor then leverages best-of-breed off-the-shelf tools and augments with organically developed M&S tools to evaluate current/modified systems, new concepts, enabling technologies, operating concepts, and infrastructure. Capability assessments of as-is and to-be architectures against identified requirements and guidance lead to identification of gaps and shortfalls as well as a quantification of impacts to the end users. Material and non-material solutions are evaluated to determine the best alternatives to resolve the identified shortfalls for a selected mission area. These alternatives are then assessed for cost versus technical performance measures (TPMs) to identify best-value solutions for programmatic consideration. Candidate system and enterprise architectures are then road-mapped along with these TPMs to show capability growth over time as older systems give way to new concepts, ensuring proper evolution of capabilities within projected costs throughout the planning horizon. A multi-resolution, or layered, simulation methodology (including engineering, engagement, mission, and campaign-level analyses) enables traceability of cause and effect from enterprise-wide impacts down to individual system performance using objective assessments.

This solution demonstrates a far deeper understanding of the mission and the technical excellence of the offeror’s team.

In order to elicit such a robust commitment, the RFP’s requirement for long-term analyses, in contrast to the cursory version above, could be stated as follows:

The contractor shall employ a comprehensive, quantitative architecting analysis framework to ensure consistent and repeatable analysis of architectures. The contractor shall identify performance gaps and other shortfalls at the engineering, engagement, mission, and campaign levels. The contractor shall model and assess material and non-material concepts that close architectural shortfalls, including concepts proposed by outside experts, Government labs, industry, and other sources. In the absence of viable concepts, the contractor shall propose appropriate concepts to close gaps and resolve shortfalls. The contractor shall collaborate with the community -- forming working groups and conducting technical interchange meetings as appropriate -- to gather and confirm analysis assumptions (e.g., threat definition, requirements interpretation, CONOPS) to feed modeling and cost estimation. The contractor shall document all inputs and outputs of this process, including threat definitions, scenario definitions, system and concept definitions, models, analysis results, etc. These analyses shall require responses within 6–12 months.

The same requirement has now been transformed into one where the “pass/fail” element of an LPTA approach becomes meaningful. To “pass,” the offeror must provide a robust account of its team’s breadth, domain expertise, tools and processes, past performance, etc.

The RFP could also require that offerors demonstrate their compliance with all elements of the statement of work. For example, it could ask competitors to address the requirements in scenarios such as the following:

  • Scenario 1: Show how you would support a communitywide architecture study with a comprehensive analysis strategy. Describe how you use your processes, tools, and expertise to identify architectural gaps and shortfalls and recommend best-value courses of action to close the gaps and resolve the shortfalls.
  • Scenario 2: A company has gained advocacy from congressional leaders for a new concept that it is proposing. As this concept did not exist at the time of the most recent architectural analysis, it was not included in the results of Scenario 1. Describe how you would accomplish the delta analysis and update the results and recommendations from Scenario 1.

The same rigor could be required in the Past Performance factor, stating, for example:

Show how you have applied your comprehensive architecture analysis process and proven models and tools to accomplish mission area architecting. Highlight the impact of quantitative analysis results on having recommendations accepted by customers, stakeholders, and the larger community.

Similarly, the Key Personnel requirement could provide:

Identify key personnel for leading analysis activities. Each must have 5 years of mission area experience, 5 years leading quantitative analyses, and 5 years leading mission area architecting efforts and must possess [describe educational requirements].

This focus on technical excellence can be reinforced throughout the solicitation, enabling the Government to evaluate a technical proposal meaningfully, yet still within an LPTA context, by requiring offerors to demonstrate compliance with all requirements in order to be found technically acceptable. This approach provides the Government the best value for complex services and avoids making an award to (or losing a bid protest to) an offeror that believes it has met bare minimum technical acceptability by stating that it knows Excel and can regularly produce spreadsheets.

6. Considering Past Performance in an LPTA Process

Past performance should also be used as an evaluation factor in the LPTA process for acquiring complex mission support services, though competitors are rated only as “acceptable” or “unacceptable.” The past performance evaluation assesses the likelihood that a competitor will successfully perform the required effort based on the record of recent performance relevant to the products and/or services outlined in the solicitation requirements. Carefully crafted application of the past performance factor is an essential part of mitigating the risks created by assessing only bare technical acceptability in an LPTA approach.

Precisely defined minimum past performance requirements in the solicitation and careful consideration of an offeror’s performance record are essential to ensuring that the Government receives best value under high-end professional service contracts. Such solicitations establish the criteria for relevancy and recency in relation to the specific requirements. Given the complexity of the contract requirements and the consequences of unsuccessful performance, the Government should emphasize and prioritize past performance in the solicitation requirements and set the performance criteria high enough to ensure that only competent providers are deemed “acceptable.”

Past performance evaluation factors can be included in the award criteria in innovative ways to derive best value from an LPTA process. One common approach, used by the Air Force in a number of procurements, is to award to the lowest price, technically acceptable offeror if that offeror received a past performance confidence assessment rating of “substantial confidence.” Otherwise, the agency would perform a tradeoff between past performance and price to make a best value determination.8 This approach embraces the cost savings hoped for in the LPTA approach while achieving the best value required by FAR 15.101, providing the Government the flexibility to assess an offeror’s credentials beyond barely meeting minimum technical criteria.

The past performance evaluation itself also assesses how well the contractor performed on those prior contracts. Acceptable sources of qualitative past performance information include information provided by the offeror as solicited, responses to questionnaires, information obtained from government databases, and interviews with program managers, contracting officers, and the Defense Contract Management Agency.9

In drafting the past performance requirements for high-end, complex professional services solicitations, the Government should ensure that prior contract performance is similar in size, scope, and, especially, complexity. The past performance evaluation should reflect proven success in technical forecasting, systems engineering services, and schedule compliance, as these criteria are all essential to successful performance under the contract. Consideration should also be given to requiring successful past performance in cost containment, innovation, thought leadership, and agility in adjusting to shifting priorities on prior contracts.

The past performance requirements under such solicitations should require offerors to demonstrate that they have done more than merely provide compliant services; offerors should have a demonstrated record of adding insight that has enabled previous customers to enact initiatives of significant impact and meaningful cost savings. Such innovation and thought leadership can reduce costs and shorten schedules, thus ensuring the Government receives best value under the contract.

Cost realism should also factor into the past performance evaluation, given the complexity of the mission-critical services provided under high-end professional service contracts. Cost realism analysis can be used for the limited purpose of measuring an offeror’s understanding of the requirements or to assess the risk inherent in an offeror’s proposal. Realism should also take into account the effects of shifting priorities and dealing with the unknown; today’s low-cost provider can become tomorrow’s expensive option if the solicitation’s evaluation criteria do not require -- as part of the pass/fail technical assessment -- that the offeror fully demonstrate its depth of experience, reach-back capabilities, agility, and adaptability. Inclusion of a cost realism evaluation factor as a past performance criterion is also consistent with GAO and DoD policy.10 Creating a high standard for “acceptable” past performance under high-end, complex professional services contracts is essential to mitigating the inherent risks of an LPTA source selection process.

7. Conclusion

In sum, we submit two recommendations:

First, we recommend that solicitations for complex and/or mission-critical services adopt a classic cost-technical tradeoff approach for achieving best value. The LPTA source selection method is simply not designed for these types of acquisitions. Under the FAR, LPTA should not be used unless the service requirements are “clearly definable” and “the risk of unsuccessful contract performance” is “minimal” -- and both tests must be met. In complex and/or risky procurements, there is clearly potential value to the Government in receiving a level of service above the minimal threshold. Innovation, scheduling rigor, program cost containment, and thought leadership can provide a higher quality of service that ultimately benefits the Government more and costs the Government less in the long run.

Second, if the LPTA source selection method is nonetheless chosen, we recommend that technical and past performance requirements be rigorously and precisely defined to ensure the Government receives something that is truly “technically acceptable.” The way to obtain the benefits and mitigate the risks of the LPTA process is to make award to the lowest-priced, technically acceptable offeror only if it achieves a high standard of technical capability and past performance (e.g., “substantial confidence”). Only by setting the technical and past performance criteria at a high level can the Government be confident that only the most qualified offerors are deemed “acceptable” -- even if that means finding a substantial number of offerors technically unacceptable. Applying detailed technical criteria, using scenario-based evaluations, placing high importance on past performance, and conducting price realism analyses are all ways to mitigate the risk of unsuccessful performance after the source selection is made.

 

1 There is also an accelerating trend toward de facto LPTA source selections, when the Government issues a best-value solicitation but effectively (and improperly) converts it into an LPTA procurement during the evaluation process. See, e.g., First Line Transportation Security, Inc. v. United States, U.S. Court of Federal Claims No. 11-375C (Sept. 27, 2011) (granting bid protest). That trend, however, is beyond the scope of this paper.

2 “Department of Defense Source Selection Procedures,” Memo, Office of the Under Secretary of Defense for Acquisition, Technology and Logistics (March 4, 2011) at A-1.

3 “GSA News Briefing,” GSA Daily News (April 30, 2012) at 11-12.

4 Jacques B. Gansler, “Shop for ‘best value,’ not ‘lowest price’,” Federal Times (June 12, 2011). In the interest of full disclosure, Dr. Gansler is now a member of the TASC Board of Directors, although he joined TASC well after he made these comments.

5 E.g., Professional Services Council, “Impacts of DoD Budget Reductions on the Services Sector Industrial Base and Potential Mitigation Strategies” (Nov. 29, 2011) at 3 (submittal to Deputy Secretary of Defense) (“While LPTA may serve the department appropriately in certain situations, industry is increasingly witnessing situations in which the department’s desire to access critical human capital and drive innovation are at odds with acquisition strategies that reward the lowest priced bidder and ignore best value propositions.”); see also Bob Lohfeld, “Will low-price contracting make us all losers?,” Washington Technology (June 4, 2012).

6 “Department of Defense Source Selection Procedures,” supra at A-1.

7 See, e.g., Kenjya Group, Inc.; Academy Solutions Group, LLC, B-406314, B-406314.2, 2012 CPD ¶ 141 (Apr. 11, 2012) (upholding NSA’s award under LPTA procurement for enterprise management and support services at Ft. Meade, where rigorous application of personnel requirements resulted in only one offeror being found technically acceptable).

8 See, e.g., ProLog, Inc., B-405051, 2012 CPD ¶ 84 (Aug. 3, 2011) (Air Force contract for supply and transportation services); CAE USA, Inc., B-404625, 2011 CPD ¶ 75 (Mar. 16, 2011) (Air Force contract for training support for the C-17 aircraft); Hillstrom’s Aircraft Services, B-403970.2, 2010 CPD ¶ 303 (Dec. 28, 2010) (Air Force contract for aircraft corrosion control services).

9 “Department of Defense Source Selection Procedures,” supra at A-3.

10 FAR 15.404-1(d)(1); see also Nova Technologies, B-405982.2 (May 16, 2012) (the evaluation of an offeror’s past performance is a matter of agency discretion, which GAO will not find improper unless unreasonable or inconsistent with the solicitation’s evaluation criteria); Raytheon Technical Services Company LLC, B-406136, B-406136.2 (Feb. 15, 2012) (“... an agency may provide in the RFP for the use of price realism analysis for the limited purpose of measuring an offeror’s understanding of the requirements or to assess the risk inherent in an offeror’s proposal”); AMEC Earth & Envtl., Inc., B-404959.2 (July 12, 2011) at 8 (“... The nature and extent of such a price realism analysis are matters within the agency’s discretion.”).
