2. Source Selection Techniques
Schematic of the best value continuum – more evaluation factors used moving from one end to the other:
Lowest Priced Technically Acceptable (LPTA)
• Requirements well defined; risk of unsuccessful performance minimal; and no value, need, or willingness to pay for higher performance
• Proposals rated acceptable or not acceptable; lowest cost wins
Value Adjusted Total Evaluated Price (VATEP) Tradeoff
• Sits between LPTA and the subjective tradeoff on the continuum
Subjective Trade-Off Method
• Award based on a cost/benefit trade-off analysis
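The two ends of the continuum can be contrasted with a minimal sketch. The offeror data, field names, and the dollars-per-benefit-point weighting are hypothetical illustrations, not part of any regulation; the point is only that LPTA ignores above-threshold merit while a tradeoff pays for it.

```python
# Illustrative sketch with hypothetical offeror data: how the two ends
# of the best value continuum pick an awardee.

offerors = [
    {"name": "A", "price": 90,  "acceptable": True,  "benefit": 1},
    {"name": "B", "price": 100, "acceptable": True,  "benefit": 5},
    {"name": "C", "price": 80,  "acceptable": False, "benefit": 0},
]

def lpta_award(offerors):
    """LPTA: among technically acceptable proposals, lowest price wins."""
    acceptable = [o for o in offerors if o["acceptable"]]
    return min(acceptable, key=lambda o: o["price"])

def tradeoff_award(offerors, value_per_benefit_point=3):
    """Tradeoff, crudely modeled: adjust each price by the dollar value
    the Government places on a benefit point, then compare."""
    acceptable = [o for o in offerors if o["acceptable"]]
    return min(acceptable,
               key=lambda o: o["price"] - value_per_benefit_point * o["benefit"])

print(lpta_award(offerors)["name"])      # A: lowest technically acceptable price
print(tradeoff_award(offerors)["name"])  # B: its benefits justify the higher price
```

Note that offeror C never wins under either method: an unacceptable proposal is out regardless of price.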
3. Source Selection Process
Industry View of the Black Hole – Poor Communication, No Schedule
• Spend $M in B&P to propose against a poorly written RFP and an unclear evaluation process
• Black hole: wait for the competitive range determination
• Eliminated from the competitive range? Elimination debrief:
  – Stiff
  – Formal
  – Uninformative
  – "What did we miss?"
• Still in the competitive range? Discussions:
  – Stiff
  – Formal
  – No true negotiation
  – "Trade electrons"
• Fix proposal; black hole: wait for the request for FPR
• Eliminated before FPR? Elimination debrief (as above)
• FPR submitted; fix proposal more; black hole: wait for the award decision
• Selected for award? Debrief:
  – Stiff
  – Formal
  – Uninformative
  – "Why did we win?"
  – "Who cares?"
• Not selected? Elimination debrief:
  – Stiff
  – Formal
  – Uninformative
  – "Why did we lose?"
  – "Was the evaluation conducted properly?"
  – "Will we ever know so we can improve?"
• Protest process → protest decision → stop
4. Competition Perspectives
(Chain of Issues – a Grueling Experience)
RFI(s)
• May require a "mini-proposal"
• Not soliciting industry input on acquisition parameters
• Some Gov't offices start restricting communication
Industry Day(s)
• Key Gov't officials may not attend
• Repeats RFI info – no new insights or industry input
• Stifled communication – "Why am I here?"
Draft RFP
• Unclear Section M
• Industry "inspects in" quality...or not
• Need to better define best value
RFP
• Unclear Section M
• Often dictates solution ("how") vs. objectives ("what")
• Offerors are surprised with changes since the draft RFP – hesitant to ask questions at this point
• Always assume there will be discussions, unless the RFP indicates "award w/o discussions"
Proposal(s) Received
• Submit what the Gov't wants, but plan to "fix" during discussions
• Don't know the schedule, so wait
Competitive Range/Discussions
• If eliminated, decide whether to protest or not
• Respond to ENs, which often are unclear; offeror tries to fix the proposal
• Offerors may want to ask questions, but are told they are limited to responding to ENs
• Discussions are too "stiff" and formal
Final Proposal Revision(s)
• Can I introduce more efficiencies or benefits?
• If eliminated, decide whether to protest or not
SSA Decision
• If eliminated, decide whether to protest or not
• How did the SSA evaluate colors and adjectival ratings, apply tradeoffs, and then decide best value?
5. Lessons Learned & Best Practices
• Get training
• Build an inchstone schedule chart early
• Spend time discussing/debating/agreeing on evaluation criteria
  – Is it important?
  – Will it be a discriminator?
• Use a risk assessment process
• Conduct industry days
  – Openness and transparency
• Need to improve the quality of RFPs and proposals
6. Lessons Learned & Best
Practices (contd)
Non Cost Team members participate in creating evaluation
criteria
Pay attention to cost/price impacts on technical performance
Particularly BOE/Technical Approach disconnects
Be aware of time sink events
Consensus meetings need to be time managed
Provide for evaluators that are slow readers
Use an electronic source selection tool
7. Lessons Learned & Best Practices (cont'd)
• Discuss ENs with offerors
  – Don't just mail them and hope for the best
• Start evaluation reports early in the process
  – Don't wait until the end to begin creation
• Create a CONOPS for the evaluation team
  – Early agreement on process, definitions, operations tempo, etc.
  – Level-sets the evaluation teams
• Document, document, document
8. Dynamic Source Selection Team Attributes
• Try to include source selection team members in the development of evaluation criteria
• Use a disciplined methodology to develop evaluation criteria
• Get the best people possible
• Develop and share an inchstone schedule
• Understand that fact-of-life issues occur during a source selection
• Make sure everyone reads and understands the requirement, evaluation criteria, definitions, etc., before proposals are received
• Get training for the team so everyone understands the process from beginning to end
9. Inchstone Schedule
Ensure all Members of SSET have Read and are Familiar with RFP, including SOW and Sections L and M
After Receipt of Proposals
Build Quick Look Briefing or Paper; Identify primes, major subcontractors and showstoppers
Start Proposal Evaluation
Schedule and Provide Quick Look Briefing to SSA and SSAC Chair if required by SSA (These are minimum, may involve entire SSAC
Conduct Weekly Interchange Meetings Between Management, Technical, Past Performance and Cost/Price Team Leads
Ensure Consensus Results are Completed
Review Evaluation Results, Significant Strengths, Strengths, Uncertainties, Deficiencies, Significant Weaknesses and Weaknesses
Review Subfactor Colors and Risks
Ensure ENs are Prepared
Competitive Range Process
Conduct Competitive Range Briefing to SSAC
Determine from SSA if Interim Results are To Be Released to Offerors
Release ENs Through the PCO
Review Evaluation Results, Significant Strengths, Strengths, Uncertainties, Deficiencies, Significant Weaknesses and Weaknesses
Review Subfactor Colors and Risks
Determine if Team is Ready to Close Discussions and Issue Request for Final Proposal Revision (FPR)
Final Proposal Revision Process
Conduct FPR Decision Brief
FPR Received and Evaluations Started
Ensure Consensus Results are Captured
Review Evaluation Results, Significant Strengths, Strengths, Uncertainties, Deficiencies, Significant Weaknesses, and Weaknesses
Review Subfactor Colors, and Risks
Final Decision Process
Conduct Final Decision Briefing with SSAC/SSA
Complete Draft of SSDD and SSA signs
Contract Award
11. Definitional Issues
Significant Strength
• An element of the proposal which exceeds an objective and/or requirement of the TORP and/or the SOO in a way that is beneficial to the Government
Significant Weakness
• A flaw in the proposal that significantly increases the likelihood of unsuccessful Task Order performance
Strength
• An aspect of an offeror's proposal that has merit or exceeds specified performance or capability requirements in a way that will be advantageous to the Government during contract performance
Blue/Outstanding
• Proposal indicates an exceptional approach and understanding of the requirements and contains multiple strengths
Purple/Good
• Proposal indicates a thorough approach and understanding of the requirements and at least one strength
Green/Acceptable
• Proposal indicates an adequate approach and understanding of the requirements
Yellow/Marginal
• Proposal has not demonstrated an adequate approach and understanding of the requirements
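For teams building their own electronic source selection aids, the combined color/adjectival scale above is naturally captured as a lookup table. The structure and function name below are hypothetical; the rating text paraphrases the definitions on this slide.

```python
# Illustrative sketch: the color/adjectival rating scale as a lookup
# table (hypothetical structure; rating text summarizes the slide).

RATING_SCALE = {
    "Blue":   ("Outstanding",
               "Exceptional approach and understanding; multiple strengths"),
    "Purple": ("Good",
               "Thorough approach and understanding; at least one strength"),
    "Green":  ("Acceptable",
               "Adequate approach and understanding of the requirements"),
    "Yellow": ("Marginal",
               "Has not demonstrated an adequate approach and understanding"),
}

def adjectival(color: str) -> str:
    """Return the adjectival rating paired with a color rating."""
    return RATING_SCALE[color][0]

print(adjectival("Purple"))  # Good
```

Keeping one authoritative table like this helps an evaluation team apply colors and adjectives consistently across factors and subfactors.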
12. Congressional Direction
FY18 NDAA (PL 115-91)
Sec. 818(a) Enhanced Post-Award Debriefing Rights
• Contract award >$100M: requirement for disclosure of the written, redacted source selection award determination
• Contract award >$10M but <$100M: option for a small business or nontraditional contractor to request the same disclosure
• Written or oral debriefing required for all contract awards and task or delivery orders >$10M
• Both unsuccessful and winning offerors are entitled to the disclosure and debriefing defined above
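The disclosure thresholds above can be sketched as a small decision function. The function name and return strings are hypothetical shorthand; consult the statute and implementing guidance for the authoritative rules.

```python
# Minimal sketch (hypothetical names) of the Sec. 818(a) disclosure
# thresholds described above; award_value is in dollars.

def redacted_ssdd_disclosure(award_value, small_biz_or_nontraditional=False):
    """Return how disclosure of the redacted source selection award
    determination is handled for a given award value."""
    if award_value > 100_000_000:
        return "required"                     # >$100M: disclosure required
    if award_value > 10_000_000 and small_biz_or_nontraditional:
        return "on request"                   # >$10M but <$100M: on request
    return "not required"

print(redacted_ssdd_disclosure(250_000_000))   # required
print(redacted_ssdd_disclosure(50_000_000,
                               small_biz_or_nontraditional=True))  # on request
```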
13. Congressional Direction
FY18 NDAA (PL 115-91)
Sec. 818(b) Opportunity for Follow-Up Questions
• Opportunity for a disappointed offeror to submit, within two business days after receiving a post-award debriefing, additional questions related to the debriefing
• The agency shall respond in writing to any additional questions within five business days after receipt of the questions. The agency shall not consider the debriefing to be concluded until it delivers its written responses to the disappointed offeror.
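The two-day and five-day windows above are business-day windows, which are easy to get wrong around weekends. A hedged sketch with the standard library, skipping weekends only; a real implementation would also exclude federal holidays (omitted here for brevity), and the dates shown are purely illustrative.

```python
# Sketch: computing the Sec. 818(b) business-day windows.
# Skips Saturdays/Sundays only; federal holidays are not handled.
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    d = start
    while days > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            days -= 1
    return d

debrief = date(2024, 6, 7)                                   # a Friday
question_deadline = add_business_days(debrief, 2)            # offeror's 2-day window
response_deadline = add_business_days(question_deadline, 5)  # agency's 5-day window
print(question_deadline, response_deadline)  # 2024-06-11 2024-06-18
```

Because the debriefing is not concluded until the agency's written responses are delivered, these dates also drive the clock for an offeror weighing a timely protest.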
14. Required Documentation
Reports
• Proposal Analysis Report (or something similar)
  – Organized as a stand-alone document and by offeror
  – Part of the administrative record
• Suggested outline:
  – Synopsis of the acquisition strategy
  – Overall synopsis of each offeror's proposal – can come from the Executive Summary
  – Evaluation results – if a color and risk scheme, overall factor/subfactor color and risk
  – By factor/subfactor, identify strengths, significant weaknesses, weaknesses, and deficiencies
• Comparative Analysis Report
  – If DoD >$100M with an SSAC, includes the award recommendation
15. Required Documentation
Source Selection Decision Document
• Use personal pronouns
• Part of the administrative record
• Must follow the evaluation criteria
• Avoid new information at all costs
  – New information is a target-rich environment for protest
• Provided to the GAO, and a redacted version to unsuccessful offerors (DoD)
• Signed only by the SSA
17. GAO Protest Footnotes
1. All entries in this chart are counted in terms of the docket numbers ("B" numbers) assigned by our Office, not the number of procurements challenged. Where a protester files a supplemental protest or multiple parties protest the same procurement action, multiple iterations of the same "B" number are assigned (i.e., .2, .3). Each of these numbers is deemed a separate case for purposes of this chart. Cases include protests, cost claims, and requests for reconsideration.
2. From the prior fiscal year.
3. Of the 2,672 cases closed in FY 2017, 256 are attributable to GAO's bid protest jurisdiction over task or delivery orders placed under indefinite-delivery/indefinite-quantity contracts.
4. Based on a protester obtaining some form of relief from the agency, as reported to GAO, either as a result of voluntary agency corrective action or our Office sustaining the protest. This figure is a percentage of all protests closed this fiscal year.
5. Alternative Dispute Resolution.
6. Percentage of cases resolved without a formal GAO decision after ADR.
7. Percentage of fully developed cases in which GAO conducted a hearing; not all fully developed cases result in a merit decision.