Research

User research will be a core aspect of the U.S. Web Design Standards project as it’s our main source of feedback and inspiration for future product development.

We plan to use a combination of research methods: quantitative research, like collecting web analytics to see how frequently the Standards are used around the federal government, and qualitative research, like remote and in-person observational studies to see whether the Standards are making government sites easier for people to use.

We’ll use this research to see how well the Standards are working and what needs to be improved. We can also provide user research services to teams that are using the Standards. These will be priced to allow teams around the federal government to engage with us and improve their product for their users. This benefits the Standards as well because we’ll get feedback on where they need to be improved and extended.

Analytics Reporting

Quarterly Reports

Once a quarter, the metrics displayed on the "about our work" page will be updated to reflect recent traffic and usage of the U.S. Web Design Standards. We will use this information to identify future improvements to the design patterns. You can use these metrics to help justify the adoption and continued use of the Standards.

Dependencies

Your product team has defined the metrics and funnels you want to track and report

One of your product team members must be familiar enough with web analytics to set up the tracking and reporting
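To make the first dependency concrete, here is a minimal sketch of the kind of funnel tracking a product team might define before setting up reporting. The step names and event data below are hypothetical, not part of any DAP schema:

```python
def funnel_report(events, steps):
    """Count how many distinct users reached each step of a funnel.

    events: iterable of (user_id, step_name) tuples, e.g. from web analytics.
    steps: ordered list of funnel step names, from entry to conversion.
    """
    reached = {step: set() for step in steps}
    for user_id, step in events:
        if step in reached:
            reached[step].add(user_id)
    # A user counts toward a step only if they also reached every prior step.
    report = []
    qualified = None
    for step in steps:
        users = reached[step] if qualified is None else reached[step] & qualified
        report.append((step, len(users)))
        qualified = users
    return report

# Hypothetical example: three users land on the site, one completes the form.
events = [
    ("u1", "landing"), ("u2", "landing"), ("u3", "landing"),
    ("u1", "form_start"), ("u2", "form_start"),
    ("u1", "form_submit"),
]
print(funnel_report(events, ["landing", "form_start", "form_submit"]))
# → [('landing', 3), ('form_start', 2), ('form_submit', 1)]
```

A report like this is what the analytics setup described below would automate against real traffic data.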


User Research Activities

Recruitment

One of our challenges is knowing who is interested in being part of our user research. To resolve this, we will implement several channels for federal designers and developers to signal their interest and provide their contact information to us. This will allow us to conduct research in a timely fashion.

Potential Methods

Re-activate the screener to intercept users of standards.usa.gov

Provide a “Contact Us” feature on standards.usa.gov

Perform outreach via federal listservs and other digital channels

Digital Service Team Remote Interviews

One of the most consistent ways we have collected feedback from our users has been by conducting remote interviews with digital service teams around the federal government. This has allowed us to collect direct feedback and suggestions for how to improve the product. Given this success, the team aims to continue using this method regularly, conducting 1-2 interviews per month.

Bi-annual Contextual Inquiries

Twice a year, the team will schedule a round of in-person interviews with teams around the federal government to observe how they have been using, or would like to use, the U.S. Web Design Standards. The information gathered during these "contextual inquiries" will ensure that the product continuously delivers what federal teams and the general public need.

Open House Events (with Remote Options)

The team will host open house events at GSA Headquarters in Washington, D.C., and remotely using web conferencing tools. During these events, the team will conduct training presentations and allow teams using the Standards to showcase their products and explain how they've implemented them. These events will be open to anyone in the federal government.

Research as a Service Offerings

Usability Testing

The best way to learn how well the Standards are working is to watch people using them. You can sign up to have researchers on the U.S. Web Design Standards team conduct a usability study with your users and collect feedback on your product. This benefits us both: we get feedback on how well the Standards work in different contexts and you get direct user feedback on your product.

Package 1 - Remote Usability Evaluation

We will recruit users of your product – usually no more than 20 – and observe them using it via remote, online web conferencing tools. This approach is meant to highlight high-level issues that your users are experiencing and provide guidance for future iterations of your product. Your team is encouraged to observe the sessions; our researchers will report back their findings.

Outcomes

Report with high-level findings and recommendations for the partner agency

Cost and Timeline

Cost: $65,000 to $80,000

Timeline: 4 to 5 weeks

Package 2 - In-Person Usability Evaluation

We will recruit users of your product – usually no more than 20 – and observe them using it in person at a location to be determined. This evaluation is meant to be an in-depth analysis of your product, exposing both general trends and specific issues that affect your users. Your team is encouraged to observe the sessions, on location and via remote conferencing tools, and our researchers will report back their findings.

Outcomes

Report with high-level findings and recommendations for the partner agency

Cost and Timeline

Web Analytics Setup

With enrollment in the Digital Analytics Program (DAP) required by the "Policies for Federal Agency Public Websites and Digital Services" memo, we would like to offer a service that helps digital service teams meet this requirement. Whether or not a team is using the Standards, integration with DAP should be consistent across the federal government, and we have the tools and knowledge to help.

Package 1 - DAP Integration

We will work with digital service teams to put in place the basic code needed to pass their web analytics into DAP. This is a lightweight service meant to assist teams that don't have the necessary skills on hand to meet the policy's requirements.
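For reference, the DAP integration itself is typically a single script tag added to every page. A sketch of what that looks like is below; the agency value is a placeholder, and the exact snippet should be taken from digital.gov's current DAP documentation:

```html
<!-- Digital Analytics Program participation snippet (agency=AGENCY is a placeholder) -->
<script async type="text/javascript" id="_fed_an_ua_tag"
  src="https://dap.digitalgov.gov/Universal-Federated-Analytics-Min.js?agency=AGENCY"></script>
```

Most of the effort in this package is confirming the snippet is present on all required pages and that data is flowing into DAP, rather than writing the tag itself.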

Outcomes

DAP code included on all required sites and pages

Integration into the DAP platform

Cost and Timeline

Cost: $8,000 to $16,000

Timeline: 1 to 2 Weeks

Package 2 - Basic Reporting and Dashboards

In addition to the services described in the package above, an analyst will work with the digital service team to set up basic reporting and dashboards that give the agency greater insight into how people use its digital products. The information gathered can help inform outreach, business and technical strategy, and user research activities the digital service team might want to perform.
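As an illustration of the kind of basic report this package could set up, here is a minimal sketch that summarizes pageview records into a top-pages report. The field names are assumptions standing in for an analytics export, not the DAP schema:

```python
from collections import defaultdict

def top_pages(records, n=3):
    """Aggregate pageview records into a top-pages report.

    records: list of dicts with 'page' and 'visits' keys
    (hypothetical fields standing in for an analytics export).
    n: number of pages to include in the report.
    """
    totals = defaultdict(int)
    for rec in records:
        totals[rec["page"]] += rec["visits"]
    # Sort pages by total visits, descending, and keep the top n.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:n]

# Hypothetical export rows for a Standards-based site.
records = [
    {"page": "/", "visits": 120},
    {"page": "/components", "visits": 80},
    {"page": "/", "visits": 40},
    {"page": "/getting-started", "visits": 95},
]
print(top_pages(records, n=2))
# → [('/', 160), ('/getting-started', 95)]
```

In practice, a dashboard would run aggregations like this continuously over live analytics data rather than a static list.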

Outcomes

DAP code included on all required sites and pages

Integration into the DAP platform

Requirements gathering of desired reports and dashboards

Report and dashboard setup

Cost and Timeline

Cost: $24,000 to $31,000

Timeline: 3 to 4 Weeks

Become part of the community

The U.S. Web Design Standards has grown into a blossoming, open source community of government engineers, content specialists, and designers. We currently support dozens of agencies and more than 100 sites, fueled by an active community of contributors both inside and outside of government.