Civic Innovation Corps

Utah Performance Measure Portal

As part of Coding It Forward’s inaugural Civic Innovation Corps program, I was a Design Fellow at the Utah Governor’s Office of Planning and Budget (GOPB). I led the design of a new performance measure portal used by the state’s 35 agencies spanning 280+ programs. Our goal was to provide a consolidated system for agencies to create measures and report performance data, and for executive and legislative analysts to make decisions based on this information.

TIMELINE

2021

ROLE

UX/UI Design Lead

TOOLS

Figma, LucidPress, Airtable

CLIENT

Utah Governor’s Office of Planning and Budget


0 / Background

Current User Experience

State agencies are required to report performance measures to executive and legislative analysts each fiscal year. This is particularly important for agencies that need additional funding for programs and for analysts to allocate taxpayer dollars effectively.

In accordance with HB326, newly passed legislation in the State of Utah, we were tasked with building a new performance measure reporting feature into an existing budget reporting web app, Budget Prep.

Currently, not only are agency performance measures and budgets reported in two separate systems, but each branch of government also enforces different reporting requirements and processes. I audited the existing performance measure and budget reporting systems to assess usability issues.

SMIS - Where performance measures are reported

  • Restrictive Reporting. Only certain metric formats are accepted within monthly time frames.

  • Confusing Navigation. A user typically needs to click around to find a specific performance measure.

  • Information Overload. Users are presented with jargon, data, and graphs that they may not know how to decipher.

Budget Prep - Where budgets are reported

  • Lack of Engagement with Instructions. A Budget Prep Guide is updated and sent out every year, but an estimated fewer than 10% of users actually read it.

  • Table Format Preferred, but Not Without Usability Issues. Certain functions are unintuitive, which can cause data entry or saving errors.

1 / User Research

Challenge: How can we make a bitter process sweeter?

After interviewing 15+ users and stakeholders across different agencies and government branches, we found that reporting performance measures is a convoluted process primarily seen as a means to request budget changes rather than a way for agencies to track and assess their performance over time.

After each interview I documented, summarized, and coded user feedback in Airtable.

Key Insights

Agency staff are short on time and resources, so performance measure reporting is not a priority. (Tenielle)

“Division staff are very involved in running their programs, but asking them to do performance measures is taking them out of their work. Divisions want to receive more funding and tell their stories, but they are spread thin.” (Dept. of Workforce Services)

Agencies need flexibility when creating and reporting performance measures. (Kamron)

“The more opportunity to adjust down the road (monthly, quarterly, as needed) in the system, the more accurate measures we will have.” (Governor’s Office of Economic Opportunity)

An ideal interface should be intuitive, both for agency users to report and analyst users to review data. (Brian, Dept. of Cultural and Community Engagement)

“We report measures anywhere from once a month to once a year. Additionally, the same person might not be reporting each time, so the less time we need to spend figuring the system out, the better.”

2 / Design Opportunity

HB326 required us to nest the new performance measure features within Budget Prep, the state’s existing budget reporting web app. After our user interviews, however, we also wanted the new features to deliver real value to agencies through a process that many users found burdensome.

This led to a team brainstorming session where we identified key pain points that we heard about in user interviews and ideated potential features or solutions that could address these challenges.

Pain points are in yellow, potential solutions are in green, and remaining questions are in red.

We then narrowed down our feature priorities based on which solutions would be most impactful for users and most feasible to execute within our 10-week timeline. With budget season approaching, we prioritized the agency experience, since agencies are the first touchpoint in the performance reporting process.

GOALS

Create Measure Form

Agency users can create, define, and edit their performance measures.

Measure Profile

Agencies can report measure data.

Manage Measures Dashboard

Agencies are able to review, approve, and track their performance over time.

3 / Design Decisions and Iterations

Each of the features underwent several rounds of testing and iteration. We continuously tested our designs and content over Zoom, ultimately gathering feedback from more than 30 users. Below are some of our user-informed design decisions.

Create a Measure - Tagging

Problem

Agencies are coming up with performance measures in response to budget line items rather than the other way around. This leads to misalignment between the budget and programs, as in some instances a line item can fund several programs and vice versa.

Solution

When creating a performance measure, agencies can “tag” it with the line item it corresponds to, along with other metadata such as impact area and scope. This gives both agencies and analysts a more holistic understanding of the measure and allows for easier search indexing.
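To make the idea concrete, here is a minimal sketch of how tagged measures might be modeled; the field names are illustrative assumptions, not the actual Budget Prep schema:

```typescript
// Hypothetical shape of a performance measure with line-item tags.
// Field names are illustrative, not the real Budget Prep schema.
interface PerformanceMeasure {
  id: string;
  name: string;
  definition: string;
  lineItemIds: string[]; // a measure may map to several line items, and vice versa
  impactArea: string;    // e.g. "Workforce" or "Public Health"
  scope: "Program" | "Division" | "Agency";
}

// Tags make search indexing straightforward: find every measure a line item funds.
function measuresForLineItem(
  measures: PerformanceMeasure[],
  lineItemId: string
): PerformanceMeasure[] {
  return measures.filter((m) => m.lineItemIds.includes(lineItemId));
}
```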

Report Measure Data - Spreadsheet Format

Users overwhelmingly preferred a spreadsheet-like interface so that they could move back and forth between Excel and the performance portal. I carefully thought through and tested the affordances this feature would need to support flexible data reporting while preventing user error.
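As a rough illustration of those affordances, a per-column validator along these lines keeps free-form input in check; the column types and messages are assumptions, not the portal’s actual rules:

```typescript
// Hypothetical per-column validation for the reporting table.
type ColumnType = "date" | "number" | "percent";

// Returns an error message for invalid input, or null if the cell is valid.
function validateCell(type: ColumnType, raw: string): string | null {
  switch (type) {
    case "date":
      return isNaN(Date.parse(raw)) ? "Enter a valid date (e.g. 2021-07-01)" : null;
    case "number":
      return raw.trim() === "" || isNaN(Number(raw)) ? "Enter a numeric value" : null;
    case "percent": {
      const n = Number(raw.replace("%", ""));
      return isNaN(n) || n < 0 || n > 100 ? "Enter a percentage from 0 to 100" : null;
    }
  }
}
```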

Measure Profiles and Dashboard - User Flow

Agency users should be able to (1) create a measure, (2) edit measure info, (3) report measure data, and (4) edit measure data. We wanted the distinction between these functions to be clear, and for navigation to be layered so that a user can quickly find the exact performance measure information they are looking for.
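One way to picture that layered navigation is as a nested route structure; the paths below are purely illustrative, not the portal’s actual routing:

```typescript
// Illustrative nested routes separating the four agency tasks.
const routes = {
  dashboard: "/measures",                                 // Manage Measures Dashboard
  create: "/measures/new",                                // (1) create a measure
  profile: "/measures/:measureId",                        // Measure Profile
  editInfo: "/measures/:measureId/edit",                  // (2) edit measure info
  reportData: "/measures/:measureId/report",              // (3) report measure data
  editData: "/measures/:measureId/report/:periodId/edit", // (4) edit measure data
};
```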

Final Solution

Create a Measure Form

Conditional questions and category tags guide the agency user through creating a new performance measure. Navigation headers visualize progress, and users cannot advance without completing the previous section.
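The gating rule itself is simple; a minimal sketch, assuming each form section tracks a completed flag:

```typescript
// Hypothetical gating logic mirroring the navigation headers:
// a section opens only when every earlier section is complete.
interface FormSection {
  title: string;
  completed: boolean;
}

function canOpenSection(sections: FormSection[], index: number): boolean {
  return sections.slice(0, index).every((s) => s.completed);
}

// Example: with sections Basics, Definition, and Targets,
// Targets stays locked until Basics and Definition are both complete.
```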

Final Solution

Measure Profile

Users can review and edit measure information and report measure data. Each column of the reporting table affords specific formatting to prevent reporting errors. Agencies can also provide context for a measure, which helps analysts understand trends and anomalies in the data.

Final Solution

Manage Performance Measures

Agencies can view all active, under review, and archived measures. This dashboard is also where agencies can be reminded of reporting requirements.
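Under the hood, that view reduces to partitioning measures by status; a sketch, reusing the illustrative status values from this case study:

```typescript
// Group measures into the dashboard's three tabs (illustrative).
type Status = "Active" | "Under Review" | "Archived";

function groupByStatus<T extends { status: Status }>(
  measures: T[]
): Record<Status, T[]> {
  const groups: Record<Status, T[]> = {
    Active: [],
    "Under Review": [],
    Archived: [],
  };
  for (const m of measures) {
    groups[m.status].push(m);
  }
  return groups;
}
```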

4 / Takeaways

Testing Content and Using Plain Language

Much of the feedback about the current reporting process concerned jargon and unclear directions. I worked closely with my supervisor to refine how content and buttons were worded to reduce ambiguity.

Leaning on Existing Programs

Rather than throwing away the existing SMIS design entirely, we used it as a starting point for the performance portal design. It was important to reflect programs with which users are already familiar.

Documentation and a Smooth Developer Handoff

I continuously documented design decisions and sent style guides to our team’s developer to ensure that the designs were feasible and implemented as envisioned. We also regularly attended stand-ups with Utah’s TTS (Technology Transformation Services) to smooth the handoff at the end of the internship. There’s no such thing as over-communicating, especially when working on a tight timeline.

Example of formatting and style documentation I created for the development team.
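For a sense of what such documentation hands off, style guides like this often reduce to a small set of shared design tokens; the values below are placeholders, not the actual GOPB palette or type scale:

```typescript
// Hypothetical design tokens of the kind a style guide gives developers.
export const tokens = {
  color: {
    primary: "#1d4ed8", // placeholder values only
    error: "#b91c1c",
    surface: "#ffffff",
  },
  font: {
    body: "16px/1.5 'Public Sans', sans-serif",
    label: "600 13px/1.2 'Public Sans', sans-serif",
  },
  spacing: { xs: 4, sm: 8, md: 16, lg: 24 }, // in px
};
```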

5 / Next Steps and Reflection

Within the 10-week internship, I was only able to prioritize the agency user experience; later iterations of the performance portal will incorporate how GOPB and LFA analysts review and approve performance measure submissions. Another highly requested feature we did not have time to implement was data visualization for performance data. GOPB and TTS hope to integrate data visualizations into Measure Profiles in the near future for the benefit of agencies and the public.

As a team, we needed to balance our ambitions with the limitations of time and entrenched bureaucratic processes. This perspective helped us narrow our focus and, as my supervisor put it, “make a bitter process a little sweeter.” Every step of the way, I received encouragement and trust in my work. I’m grateful to my team for growing with me, and to every public servant who shared their time and thoughts with us.