Meet partners where they are
Meeting partners where they are encourages a more participatory, and therefore sustainable, design process.
We view our partners as co-creators of our design process. We are especially mindful of their:
- Ability to participate
- Design maturity
- Remote friendliness
- Access to tools
- Security and privacy norms
- Policy requirements
Partner participation is essential to our team’s principle of training design advocates. But just because an agency enters into an agreement with 18F doesn’t mean we take their ongoing participation as a given: we actively work to engage our partners every step of the way.
At a high level, to encourage participation we:
- Build trust
- Respect people’s time and expertise
- Respect our partner agency’s culture (beliefs, norms, etc.)
- Approach new contexts with humility
- Create appropriate opportunities for our partners to participate in the design process and to learn by doing
- Demonstrate success, especially with quick wins
- Clarify our purpose (for example, with a problem statement)
- Use plain language
- Build shared understanding whenever possible
We also assume that people are:
Ability to participate

A partner’s ability to participate inevitably affects our day-to-day work. We are honest and transparent whenever we feel that our partners are not participating as much as the work demands.
Design maturity

Some agencies explicitly choose to work with 18F so they can:
- Build something that’s more user-centered (their goal), and
- Mature their agency’s user-centered design practice (their subgoal)
These engagements begin by considering questions related to design maturity, such as:
- What is our partner agency’s design awareness and ability?
- Who in their agency already does design? Does that person have a full-time design role, or is design the side job of an engineer, program manager, etc.?
- How does the partner learn from those most impacted by design decisions?
- How often does our partner team prototype new ideas?
- How often does our partner team engage in critique?
- How do product changes ladder up to service design or organizational considerations?
By initially assessing design maturity, we're able to help our partners increase their skills over time, and better measure how far they’ve come.
As one example: 18F had a nine-week engagement in which our partner expressed interest in maturing their user-centered design practice. We determined our partner’s initial level of design maturity through stakeholder interviews with their in-house designers and through contextual inquiry [18F methods]. From this, we identified three learning objectives. We then created a lightweight curriculum using backward design, a curriculum design method in which you first define the desired learning outcomes, then decide how you’ll assess learning, and only then devise training materials.
At the beginning of each week, for the first six weeks of the engagement, we included 15-minute “homework” assignments in our weekly status emails. This got the team thinking about a particular aspect of user-centered design. Each assignment corresponded with a skill-building workshop held later in the week. This helped us meet the engagement goal and also helped the team mature its user-centered design practice. By the end of this engagement our agency partners were planning their own research and moderating their own usability tests.
Engaging partners in conversations about how they might level up their practice requires some willingness on their part. In cases where our partners don’t want to learn more about user-centered design, or don’t have the capacity to, we need to ask ourselves:
- How early in the process does it make sense to invite the partner in (when we’re sketching ideas, planning research, etc.)?
- How might we better incorporate our partner’s perspective into the creative process (via stakeholder interviews, design studios, etc.), so that they see design’s ability to facilitate constructive dialog?
- How could we more directly communicate design’s ability to meet their goals (such as demonstrating how proactive usability testing helps them reduce risk)?
Partners who are new to design will need to appreciate its direct benefits (like improved usability and customer adoption) before its indirect benefits (such as helping the team identify the most important problems for them to solve). We aim to give our partners something tangible, and let them experience the show before pulling back the curtain.
Remote friendliness

18F is a distributed team for many reasons. For example, being distributed allows us to hire people who would not traditionally join government or move to DC. It also allows us to include a broader cross-section of people when conducting design research. Being remote-first requires that we maintain a number of remote-friendly practices, discussed further on the 18F blog.
That said, the majority of agencies we partner with are not distributed teams. At the beginning of every engagement, it’s helpful to ask:
- What are our partner's normal working hours?
- What’s the highest-fidelity way for everyone to communicate? We prefer video, which allows us to communicate more expressively and to give non-verbal cues like a thumbs-up.
- What accommodations will we need to hold meetings, design studios, usability tests, synthesis exercises, etc. so that our remote colleagues can fully participate?
- Which activities should we prioritize conducting in person? (This might include stakeholder interviews or kickoff meetings.)
Depending on the answers to the above questions we might:
- Create a fully remote team to reinforce solid distributed work practices, or staff the project with at least one person who is local (for example, 18F staff who can commute to our partner’s offices).
- Modify activities to facilitate remote or in-person participation, depending on what the project calls for (for example, arranging digital sticky notes on a digital whiteboard rather than using paper sticky notes).
Access to tools

18F has unusually broad access (within the US federal government, at least) to web-based collaboration software, including chat, whiteboard, wireframing, and prototyping tools. Our agency partners may not have access to these tools, or may be prohibited from using them the way we do. For example, they may be able to video chat, but may be unable to share their screen.
We coordinate with partners to identify which combination of tools will work best for our collaboration, often using a mix of communication tools already in use by the partner agency and new tools to which 18F can share access.
Our collaborations are frequently anchored by at least one or two in-person sessions, where we may use analog tools such as sticky notes, pencil, and paper, or conduct in-person research such as observation that can later be documented in a digital form.
Security and privacy norms

Risk management is a big part of developing government digital services. 18F relies on a number of platforms to create secure, compliant-by-default websites and web applications, including Cloud.gov Pages, Cloud.gov, Login.gov, and Search.gov.
As our work moves closer to production, our partners may ask us to help them obtain an Authority to Operate (ATO) for the products or services we’ve helped create. We often begin this conversation by identifying who at our partner agency will play key roles in the authorization process (such as the authorizing official and system owner). It can also be helpful to ask about:
- Authentication (it’s okay to have short-term and long-term solutions)
- Our partner agency’s existing policies around account management
- Our partner’s use of tools authorized government-wide via the FedRAMP authorization program
We discuss privacy norms and relevant information practices to ensure mutual understanding of essential concepts and to identify differences in each agency’s approach. However, our norms are not our partner’s norms: just because GSA’s Privacy Office sanctions our design research program doesn’t mean that our partner agency’s privacy office will do the same (see legal and privacy). As we conduct design research on behalf of our agency partners, we may need to prompt conversation between the following GSA offices and their counterparts at our partner agencies:
- Privacy office
- Office of General Counsel
- Paperwork Reduction Act (PRA) Desk Officer
Beyond research, we engage with our partner’s privacy office when the systems we design will collect or use information specific to an individual. For systems that will interact with personal information, called personally identifiable information (PII) in the government, we work with our partners and their organization’s privacy office to assess the privacy impacts of that use, and document them in a Privacy Impact Assessment. If the system will routinely allow the government to retrieve PII by an identifier such as a Social Security number, then our partners will ensure there is a public notice and a legal purpose for processing that PII. These notices are called System of Records Notices (SORNs).
Policy requirements

Practicing user-centered design in government is complex. In some cases, we find that policy dictates decisions we otherwise thought were ours to make about how a product or service functions, or about how we carry out the user-centered design process. For example, as we helped build the new FEC.gov (as discussed on our blog), 18F conducted usability testing to see how our proposed designs might affect the site’s usability. If we were to suggest changes to the forms that campaigns use to file with the FEC, the FEC might be legally required to solicit public feedback in the Federal Register over a multi-month period.
As we collaboratively design with partners, we should ask:
- Which policies will shape our design? (See Digital.gov's list of requirements for federal government websites)
- Which policies will shape our design process? (Are we considering a research design that will require approval from our partner’s Paperwork Reduction Act Desk Officer? See the legal page for considerations.)
- Whose permission will be required if we need a policy exception?
We recognize our partners are working to deliver their missions in a complex ecosystem of regulatory, organizational, and technological policies and constraints. Taking the time upfront to agree on tools and practices helps set the foundation for a strong collaboration.