How to evaluate bids and proposals

It is common for agencies to use a scoring scheme to evaluate vendor proposals and bids. We recommend a different method that creates a detailed and defensible justification of the government’s vendor selection, which a scoring scheme does not. It also allows the government to give feedback to the vendors that didn’t receive the award by simply summarizing the proposal’s documented pros and cons.

As you will have explained in the solicitation for a performance-based services contract, this approach treats the three technical evaluation factors (technical approach, staffing approach, and similar experience), taken together, as significantly more important than price when evaluating the strength of a proposal.

Following are evaluation criteria for each of those technical factors. Each set includes positive signs and red flags to look for as you review proposals. They aren’t exhaustive, but should help an evaluation team get started and decide which vendors to interview.

Use our evaluator worksheet as a tool during the review process.


Technical approach

Ideally, the vendor proposes to use modern software development practices. The proposed approach should be appropriate for the scope of work and demonstrate technical proficiency.

Evaluate answers from the verbal interviews as part of the technical approach.

What to look for, with positive signs and red flags for each:

Competency
Positive signs:
  • Demonstrates knowledge of their preferred tools and methods, and is able to explain why they are appropriate for the project
Red flags:
  • Misidentifies core technologies in a way that shows inexperience communicating about or using them
  • Proposes a highly complex approach or uses highly complex language that confuses rather than clarifies
  • Proposes to outsource core technical competencies
  • Doesn’t mention using secure code practices
  • Doesn’t value testing code (a minimal example of an automated test follows this table)

Lack of novelty
Positive signs:
  • Recommends established software and infrastructure, as well as proven and effective design patterns

Lack of certainty
Positive signs:
  • Highlights areas of uncertainty in their technical approach (Since a vendor can’t know if a proposed approach will be effective until development begins, they should be candid that they can’t be sure.)

Vision
Positive signs:
  • Interprets the intended outcomes in a way that can enable the agency’s vision

Program goals
Positive signs:
  • Demonstrates a clear grasp of the agency’s mission and the project’s aims described in the solicitation
Red flags:
  • Doesn’t understand program goals that were described clearly in the solicitation

Open source software
Positive signs:
  • Has experience developing open source software
Red flags:
  • Doesn’t have experience developing open source software

Collaboration and communication
Positive signs:
  • Expects to work with an agency product owner and for that person to be an active team member, one who communicates proactively about risks and roadblocks

User research
Positive signs:
  • Expects to conduct regular and ongoing user research to understand user goals and needs, and to use research findings to build features that support those goals and needs
  • Includes how qualitative and quantitative data will be leveraged to inform product and design decisions
  • Has a plan to conduct user research and test everything from rough prototypes to finished software with actual users throughout the entire design and development process
  • Seeks research participants from diverse backgrounds
  • Describes target groups for research
  • Research will be done with people who will actually use the service, ideally people with diverse perspectives and differing abilities
  • Research plan involves people:
    • Who have disabilities or use assistive technologies
    • Who have limited digital skills or low literacy
    • Who may need help using the service in question
  • Research plan mentions:
    • Respect for participants
    • Informed consent
    • Potential harms and how they will be reduced
    • Diversity, inclusion, honesty, and transparency
  • Research plan methods are appropriate and the timeline is feasible
  • Combines user research with usability testing to ensure that features are meeting user needs
Red flags:
  • Doesn’t indicate that they will use user research to determine the design or the technical approach
  • Proposes a process that includes working for long stretches of time without interacting with the agency and/or users
  • Proposes using focus groups instead of structured one-on-one research interviews or usability testing sessions
  • Doesn’t use research methods appropriate to research goals (e.g., using surveys to uncover user needs or usability testing to validate user goals)
  • Design is described as User Acceptance Testing, performed only at the end of a project
  • Displays low maturity in UX research and design practices:
    • Research goals, questions, methods, and expected outcomes don’t align
    • Doesn't understand the difference between users and stakeholders
    • Doesn’t provide a user recruitment approach or interview protocol

User-centered design
Positive signs:
  • Follows a user-centered design process (They explain how they make design decisions in relation to broader user goals and specific needs learned through user research.)
  • Indicates that design is considered part of the cross-functional agile development team rather than operating in a silo
Red flags:
  • Proposes that requirements will be collected from the business owner, rather than determined according to user needs uncovered through research
  • Prioritizes aesthetics over usability and usefulness
  • Can’t explain their design decisions

Development infrastructure
Positive signs:
  • Focuses on automation, reliability, testability, infrastructure as code, etc.
  • Refers to modern automation and deployment tooling like Jenkins, Puppet, Chef, Travis CI, CircleCI, Kubernetes, Terraform, AWS, and Heroku

Accessibility
Positive signs:
  • Offers a specific, detailed description of how the team will build accessibility and testing into the development process
  • Lists applicable, up-to-date government accessibility standards
Red flags:
  • Doesn’t mention accessibility or explain how they will evaluate whether their software meets accessibility standards
  • Offers “shall comply” without citing specifics, such as Section 508 and the protocol for satisfying it

Other
Red flags:
  • Bypasses page-limit rules in their proposal by using a tiny font size, reduced leading, etc.
  • Proposes long-term staff augmentation
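
For evaluation team members who are less familiar with automated testing, the sketch below shows what a small unit test can look like in a code sample. It is an illustration only, written with Python’s built-in unittest module; the file name, function, and test cases are hypothetical and not drawn from any proposal.

    # minimal_test_example.py (hypothetical file, for illustration only)
    # A self-contained example of an automated unit test using Python's
    # built-in unittest module. Run with: python minimal_test_example.py
    import unittest


    def format_case_number(agency_code, sequence):
        """Format a case number such as 'GSA-00042' (hypothetical function)."""
        if sequence < 0:
            raise ValueError("sequence must be non-negative")
        return f"{agency_code.upper()}-{sequence:05d}"


    class FormatCaseNumberTests(unittest.TestCase):
        def test_pads_sequence_to_five_digits(self):
            self.assertEqual(format_case_number("gsa", 42), "GSA-00042")

        def test_rejects_negative_sequence(self):
            with self.assertRaises(ValueError):
                format_case_number("gsa", -1)


    if __name__ == "__main__":
        unittest.main()

A repository that values testing will contain many tests like these, and they will typically run automatically on every change through the continuous integration tooling named above.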

Staffing approach

You want evidence that the proposed staff actually have experience in their claimed areas of expertise.

In addition, if the developers have a presence on social coding platforms (for example, GitHub, GitLab, or Bitbucket), review their accounts to consider the following (the script sketch after this list shows one way to pull this information):

  • What kinds of projects have they worked on?
  • What languages have they worked with?
  • Is their code readable?
  • Does their code follow best practices for organization?
  • If their projects are open source, are they being actively used or forked?
  • Do their projects show expertise that doesn't appear in their qualifications?
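
One way to gather this information quickly is through the platform’s public API. The sketch below assumes the platform is GitHub and uses a placeholder username; it lists an account’s public repositories with the primary language, star and fork counts, and whether each repository is itself a fork. GitLab and Bitbucket offer comparable APIs. Treat it as a convenience, not a substitute for reading the code.

    # github_repo_overview.py (illustrative sketch; the username is a placeholder)
    # Lists an account's public repositories from the GitHub REST API, with the
    # primary language, star and fork counts, and whether the repo is itself a fork.
    # Note: unauthenticated requests to api.github.com are rate-limited.
    import json
    import urllib.request

    USERNAME = "example-developer"  # placeholder; replace with the proposed developer's account

    url = f"https://api.github.com/users/{USERNAME}/repos?per_page=100&sort=updated"
    with urllib.request.urlopen(url) as response:
        repos = json.load(response)

    for repo in repos:
        print(
            f"{repo['name']:<40} "
            f"language={repo['language'] or 'n/a':<12} "
            f"stars={repo['stargazers_count']:<6} "
            f"forks={repo['forks_count']:<6} "
            f"is_fork={repo['fork']}"
        )

The output gives a quick sense of the languages a developer works in and whether their projects are used by others; follow up by opening a few repositories and reading the code and commit history directly.
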
What to look for, with positive signs and red flags for each:

Team size and roles
Positive signs:
  • Fewer than 10 team members, each of whom has a clear role and purpose
Red flags:
  • Specifies too many key personnel, especially individuals whose expertise overlaps with that of agency staff
  • Over-staffs the bid (If a vendor proposes a team that consists of people with far more experience than necessary, or more people than necessary, it suggests they either don't understand modern software development practices or are just trying to over-staff the engagement.)
  • Under-staffs the bid (A vendor might try to win the bid by proposing a smaller team than it knows is needed for the project, with the plan of increasing the size of the team later.)
  • Proposes positions that aren’t needed in an iterative development project, such as business analysts, enterprise architects, delivery managers, etc.
  • Offers “access to a database of resumes” but does not name specific technical staff

Team capacity
Positive signs:
  • The team will be assigned to the project full-time and won’t split members’ time with other projects (Developers, user researchers, designers, and all other key personnel should be fully staffed. A Scrum master or agile coach can be an exception.)
Red flags:
  • The most qualified team member is allocated a small amount of time on the project
  • Proposed staff don’t currently work for the contractor, and no letters of intent from the proposed staff are provided
  • Key staff aren’t proposed to be full-time on the project, or the project is to be staffed mostly with part-time personnel

Technical team members’ specialized experience and knowledge
Positive signs:
  • Experience with modern software languages, such as Python, Ruby, PHP, C# (C Sharp), or JavaScript
  • Experience with web-based application programming interfaces (APIs), especially REST and GraphQL
  • Experience using Git for software version control
  • The lead developer’s skill set and experience will enable them to conduct the work required by the project
Red flags:
  • The proposed lead developer lacks sufficient qualifications
  • Proposes outdated software technologies that don’t have an active developer community, e.g., ColdFusion, ASP, or FoxPro
  • Lacks experience with test automation, DevOps, or test-driven development (TDD)
  • Proposed staff qualifications are copied in large part or completely from the internet
  • Key skills don’t appear in any qualifications, such as:
    • Agile development experience
    • Automated (unit/integration/end-to-end) testing
    • Continuous integration and continuous deployment
    • DevOps
    • Application programming interface (API) development and documentation
    • Open source software development
    • Cloud deployment
    • Building and testing public-facing sites and tools

Research, design, and product team members’ specialized experience and knowledge
Positive signs:
  • The lead user researcher’s background demonstrates:
    • Understanding of how research can inform and shape strategy, design, and development
    • Familiarity with a variety of user research and usability testing methods
    • Experience deciding the method or methods to use that suit a given research question
    • Experience recruiting research participants appropriate to a project
  • The lead UX designer’s background demonstrates:
    • Strong craft skills and experience generating concepts that reflect overall project strategy, user research, and user-centered design best practices
    • Experience and ability communicating those concepts visually via a variety of methods, including sketching, wireframing, prototypes, and more polished mock-ups
Red flags:
  • The company, proposed subcontractor, or proposed staff are responsible for poorly designed websites
  • Key skills don’t appear in any qualifications, such as:
    • Product management and strategy
    • User research, such as contextual inquiry, stakeholder interviews, and usability testing
    • User experience design
    • Sketching, wireframing, and/or prototyping, and user task flow development
    • Visual design
    • Content design, UX writing, and copywriting

Similar experience

As part of the solicitation, you will have asked vendors to submit code repositories for projects that are similar in size, scope, and complexity to what the agency needs. If you do not have someone on your evaluation team who is familiar with code repositories, you should find a technical advisor. (A short script after the table below sketches a few of the quick checks such an advisor might run.)

What to look for in each type of evaluation, with positive signs and red flags:

Technical evaluations
Positive signs:
  • Proper use of Git, with changes committed from personal accounts (not organizational accounts)
  • Use of a branching or merging strategy
  • Informative comments
  • Evidence of peer code reviews and collaboration (work was performed in a reasonable number of GitHub comments)
  • Use of a CI/CD pipeline
  • Code that conforms well to the quality expectations in the solicitation’s QASP or set of quality indicators
  • Substantial projects: the projects weren’t created just to have something to point to for this solicitation
  • Iterative incorporation of user feedback into their development process
  • Demonstrates the value of testing:
    • Testing is built into the development process
    • Tests are written well; test coverage is measured and covers most of the code
  • Use of a consistent code style
  • Code displays modularity and opportunities for reusability
  • A sensible data model approach
  • Code includes evidence of accessibility considerations (e.g., appropriate alt text, ARIA attributes)
  • Evidence of accessibility testing: at minimum, an automated scan; more importantly, manual testing
  • The project is set up to be easily deployable by any newly onboarded developer
Red flags:
  • No source code is submitted
  • There is no Git history, or there is only a single commit, which indicates that this is not the actual code repository and that the code was developed somewhere else (maybe not even with source control)
  • None of the provided code samples or described projects are similar in size, scope, and complexity to the project scenario in the RFQ
  • The code samples provided do not demonstrate an understanding of writing a modern, maintainable application
  • Code is undocumented; there are no code comments
  • No automated tests
  • The code has obvious vulnerabilities (e.g., missing SSL certificates, SQL injection, credentials checked into the code, use of unvalidated JWTs)
  • Tests are disabled, which suggests that developers may have turned testing off instead of fixing errors; there seems to be a practice of deleting tests or code until the code passes
  • Code appears sloppy; there are large sections commented out, unused imports and definitions, or dead code (code that is in the project but is never used)
  • There are no instructions for setting up the project, or the documentation (e.g., the README) is boilerplate
  • Code contains secrets such as passwords, personally identifiable information, or access tokens
  • The cited projects lead you to suspect the vendor didn’t create them
  • There’s a finished product but no code, or vice versa

Programmatic evaluations
Positive signs:
  • Work that is conceptually similar to the agency’s needs
  • Work that is centered on user needs
  • Work that was completed by a team of a size similar to the team they’re proposing
  • Design artifacts that show continuous and ongoing usability testing and that indicate a user-centered approach to iterative design and development
  • Illustrates getting stakeholder buy-in on research findings
  • Demonstrates that they are comfortable with complexity and challenges
  • Communicates openly and emphasizes transparency
  • Identifies what is important to each set of stakeholders and tailors their approach accordingly
  • Describes frameworks and tools that support iterative development, constant improvement, user-centered design, risk management, and product prioritization
Red flags:
  • The cited projects aren’t similar in size, scope, or complexity to the project described in the solicitation
  • Work that is led by solutionism (a predetermined solution, rather than user needs, drives the work)
  • The projects don’t include design artifacts and research plans, or the plans are incomplete
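
If your technical advisor wants a scriptable starting point, the sketch below runs a few of the checks described above against a locally cloned code sample: the size of the Git history, the number of distinct authors, a rough signal of collaboration, and the presence of common test and continuous-integration locations. The repository path and the directory and file names it checks are assumptions based on common conventions, and passing these checks is not by itself evidence of quality.

    # repo_quick_checks.py (illustrative sketch; REPO_PATH is a placeholder)
    # Surfaces a few signals from a locally cloned code sample: commit count,
    # distinct author emails, merge commits, and common test/CI locations.
    # Assumes git is installed and the repository has already been cloned.
    import pathlib
    import subprocess

    REPO_PATH = pathlib.Path("vendor-code-sample")  # placeholder path to the cloned repo


    def git(*args):
        """Run a git command inside the repository and return its output."""
        result = subprocess.run(
            ["git", "-C", str(REPO_PATH), *args],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip()


    # A single commit or a very short history suggests the code was developed
    # elsewhere, possibly without source control.
    commit_count = int(git("rev-list", "--count", "HEAD"))
    author_emails = set(git("log", "--format=%ae").splitlines())
    merge_count = int(git("rev-list", "--count", "--merges", "HEAD"))
    print(f"Commits: {commit_count}")
    print(f"Distinct author emails: {len(author_emails)}")
    print(f"Merge commits (one rough signal of review and collaboration): {merge_count}")

    # Common (not universal) locations for tests and CI configuration.
    test_locations = [p for p in ("tests", "test", "spec") if (REPO_PATH / p).exists()]
    ci_locations = [p for p in (".github/workflows", ".gitlab-ci.yml",
                                ".circleci", "Jenkinsfile") if (REPO_PATH / p).exists()]
    print(f"Test directories found: {test_locations or 'none'}")
    print(f"CI configuration found: {ci_locations or 'none'}")

Treat the output as a prompt for questions during vendor interviews rather than as a score.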

Evaluator worksheet

Our evaluator worksheet can help you evaluate proposals for a custom software project. Download a printable PDF or view it in your browser.
