Court Forms Evaluation & Design Guide

Do you want to make your court or government forms more user-friendly, accessible, and impactful?

Use our evaluation rubrics & design guides to improve how your paper, PDF, or interactive forms work for your court users.

How to use this guide

Why Focus on Court Forms?

How to Evaluate Your Court Forms

How to Design Better Court Forms

Why Focus on Court Forms?

Forms are not just pieces of paper. They’re tools for people to get their story across to a powerful judge, clerk, administrator, lawyer, or someone else who can decide about their future.

How — and whether — you fill in a government form can determine whether:

  • you get to stay in your rental home, or you get evicted
  • a company saying you owe them money can garnish your wages or bank account
  • you get a restraining order against someone who is harassing you or threatening you with domestic violence
  • your past criminal record gets masked, sealed, or expunged
  • you can get your name and gender changed
  • you can get guardianship of a child whose parents aren’t able to take care of them.

Court and legal aid forms are meant to help a person through their justice journey, by eliciting key information from them & structuring it into the fields that the legal professionals need to evaluate the situation.

Court forms are a service, part of a wider justice journey a person is on.

From an Institution’s to a User’s Perspective on Forms

Court forms matter a lot to high-stakes financial, housing, and family matters. But often they’re designed within local, state, or county court groups that focus on the court’s own needs:

  • How a local judge works, and how they like to see people’s information and claims.
  • How a local clerk works, and how they process forms and enter them into a case management system.
  • How things have been done in the past, and upholding that accumulation of precedents.

That said, many court teams, access to justice commissions, and community advocates are open to shifting away from this institutional point of view. Especially with more focus on the justice gap and high numbers of people without lawyers (self-represented litigants), there’s more attention to how court forms can be more accessible, user-friendly, and supportive to people trying to participate in the justice system.

This page presents guides to evaluating your existing court forms from a user’s POV, and designing new paper or digital forms that are more accessible and effective for everyday people.

Evaluate Your Court Forms

Is this legal form as user-friendly & effective as it could be?

Forms can be evaluated with a combination of user testing, data analytics, expert evaluation, and costing. This combination of measurements can determine whether the form or technology tool is actually helping people do what it is meant to, in the most impactful and lowest-burden way.

For example, when evaluating a form, the evaluation process should use a combination of assessments that can measure:

  • phase 1 (“before the form”): discovery and uptake,
  • phase 2 (“during the form”): usability, accessibility, and usefulness of the form itself, and
  • phase 3 (“after the form”): effectiveness in getting the person to a successful filing, engagement with the justice system, and a just resolution of their legal problem.

Form Design Rubric

From our past user testing and team review of forms, we have created a Form Design Rubric. It can be used by Judicial Councils, Administrative Offices of the Courts, A2J Commissions, community groups, and university researchers who are interested in finding out “Are my court forms user-friendly & effective? And how can we redesign them to increase the court’s accessibility & equity?”

Use the Form Design Rubric to see where to improve your court/government form, to be more user-friendly & effective.

The Form Design Rubric has 11 categories for evaluation. Each category can be scored 1–5 (or even a 0 if the performance is truly that bad).

Notably, the Form Design Rubric is not just about plain language & visual design. Though these are often the factors discussed around forms’ accessibility, they are not the sole criteria by which to judge a form.

Rather, this Rubric recognizes that the form is a service — with a “before”, “during”, and “after”. The form is not effective just because it’s understandable and clear. It’s only effective if people are able to find it and trust it (the “before”), and if they are able to fill it in, file it, and finish all the other steps necessary for it to be valid (the “after”).

Thus, the Form Design Rubric has evaluation categories around Discoverability and Branding (for the “before”) and around Pricing, Time, Data, and Next Steps (for the “after”).

How To Use the Rubric to Run Multi-Stakeholder Evaluation

The Rubric can be used internally within a court or legal team, to get an initial understanding of where the form’s strengths and weaknesses are, as a tool for users. This could happen within a Forms Committee, a self-help center, or an Access to Justice Commission.

The Rubric should then also be used externally with community members, past litigants, prospective litigants, and advocates. The Rubric can be used to administer interviews, surveys, and data-gathering with these stakeholders.

Court researchers or university teams could use the Rubric to structure a 15–30 minute conversation with community members, getting their scores in the 11 categories. In addition to the scores, the questions can also be open-ended prompts to get more qualitative information about these factors.

The Rubric can also gather feedback from lawyers, navigators, and other advocates who have repeated interactions with people trying to fill in the forms or trying to litigate their case after the form has been filed. These advocates will have repeat experience and also be able to see the down-the-road effects of certain form choices — especially around the Complexity category.

That said, the advocates’ scores should be balanced against the community members’ scores. We might see advocates scoring a form high because it asks many detailed questions about claims and defenses. The advocates may feel the form is effective because it’s trying to elicit lots of possible information that can help a person later in the case, and also give them key information about what matters in the legal proceedings.

But community members’ scores might give a different read. They might score a form that asks many detailed questions as overly burdensome or confusing. They might say that this kind of form would push them to disengage or abandon the process.
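To make that comparison concrete, a team could tabulate each group’s average score per Rubric category and flag the categories where advocates and community members diverge. Here is a minimal sketch in Python — the scores, the three-category subset, and the divergence threshold are all hypothetical illustrations, not prescriptions from the Rubric itself:

```python
# Sketch: compare advocates' vs. community members' Rubric scores per category,
# and flag large divergences for discussion. All scores (0-5) are hypothetical.
from statistics import mean

scores = {
    # category: (advocate scores, community member scores)
    "Discoverability": ([4, 3, 4], [2, 1, 2]),
    "Complexity":      ([5, 4, 5], [1, 2, 1]),  # advocates value detail; users feel burden
    "Next Steps":      ([3, 3, 2], [3, 2, 3]),
}

for category, (advocates, community) in scores.items():
    gap = mean(advocates) - mean(community)
    flag = "  <- discuss: groups disagree" if abs(gap) >= 1.5 else ""
    print(f"{category}: advocates {mean(advocates):.1f}, community {mean(community):.1f}{flag}")
```

A printout like this doesn’t resolve the disagreement, but it turns “the advocates liked it and the users didn’t” into a specific agenda item per category.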

What the Rubric’s Results Can Do

The Form Design Rubric will not tell a Forms Committee or a community group the exact path of what to do, and how to balance out different stakeholders’ opinions about the best design of a form.

But the Form Design Rubric can structure these feedback and input sessions. It ensures that key factors are being considered, so that people’s needs, protections, and empowerment are central to these important pieces of paper.

After many stakeholders review a form with the Rubric, a Forms Committee will have an agenda: the hotspots where the form scores low. These are the places to work with professional visual designers, legal service experts, and community members to improve the form. This might include work to:

  • Clean up the visual design of the form, with more white space, alignment, organization, and narrative flow.
  • Provide more context, support, and off-ramps on a form webpage, cover sheet, or text message flow.
  • Improve discoverability with SEO techniques, partnerships with tech platforms, improvement of court websites, and links with other agencies.
  • Simplify and digitize notarization, signatures, payments, fee waivers, and filing options.

You can check out my earlier writings on designing a better court form, to start operationalizing what you find out from the rubric.

Can You Use this Rubric for Digital Forms and Tools?

We built this Design Rubric for a paper form or a digital PDF/.doc version of a form. It’s for a static letter-sized surface that a person is supposed to fill in. That might be with a print-out, or on a screen.

We didn’t create the Design Rubric for interactive form tools (what people usually call document assembly tools, expert systems, or guided interviews). That said, many of the categories still apply to these digital versions of forms.

We are working on another Design Rubric for court form document assembly tools that courts can use to evaluate these interactive forms. This second Rubric will have additional categories unique to web applications, but will still retain the categories in this first Rubric.

Evaluation of Forms’ Analytics and Outcomes

The Design Rubric is a tool to evaluate whether a form is *likely* to be effective with court users. It’s a set of principles and heuristics to get input, to make the strongest possible version of the form.

That said, court leaders will need another, complementary evaluation strategy to see if the forms are *actually* effective in making the court system more accessible, usable, and equitable.

The Rubric should not be the only tool that a Forms Committee or Access to Justice Committee uses to see if their forms are human-centered and accessible. Use the Rubric to make the best possible form design, website design, and process design. But then start gathering quantitative and qualitative data about how it works in practice.

This means gathering information about the form-related legal outcomes for a court user. Often this will be done through gathering court administrative data, as well as running reviews with clerk and judicial teams and surveys with court users.

Analytics and outcomes evaluation could include:

  • Discovery & reach rates of the form. How many people who are likely to need this form are actually finding it? What is its reach? This can be evaluated by estimating the expected audience with this legal problem or in need of this form/tool, then comparing that estimate to analytics or administrative data about how many people visit or access the form page.
  • Usage & completion rates of the form. Of those who find the form, how many actually engage with it and try to fill it in or file it? This can be tracked through website analytics or in-person clinic visits, to see rates of bounces, drop-offs, or incompletion.
  • Time to fill in the form. How long does it take to fill in? The time can be measured through website analytics of form tools, clinic and self-help center management records, or user tests in lab simulations.
  • Error rates in the form’s information. How many times do people fill in information incorrectly, not supplying the data the form was intended to gather? How often do they choose options arbitrarily, not because they actually want to (like opting to raise every defense and claim listed)? This can be measured by lab simulations with test users, or by sampling completed or filed forms and evaluating them for errors or arbitrary responses.
  • Filing rates. How many of the completed forms are filed with the court? This can be measured by comparing analytics about form tool usage or form downloads, versus filing rates in court administrative data.
  • The acceptance rate of the form’s output. Of the filed forms, how many are accepted by the clerk and entered into the record? This can be measured with court administrative data about rejection rates.
  • Continuance rates after the form. Are cases filed by a person using this form more or less likely to result in continuances during the case? Is the form helping the case be processed and resolved efficiently, or is it resulting in delays and confusion?

Most of these outcomes could be measured through administrative data held in software, self-help or legal aid centers’ client management systems, or court case management systems. There are also more subjective techniques for measuring the form’s outcomes:

  • Judicial usability of the form entries. When the judge and clerk are reviewing the case file, triaging it to the correct process, and making decisions about outcomes — is the information in the form usable and useful to them? This can be measured through benchmarking evaluation sessions with clerks and judge teams, in which they review a sample of filings to evaluate them based on criteria they might usually use as informal heuristics when reviewing a case. These sessions can help them formally identify the criteria that make a filing usable and useful to them, which the team can then use to score other filings.
  • Procedural Justice of the form. Does the person then feel the justice system is transparent, fair, and open to them? Or does the form experience make them feel the opposite? This can be measured with exit surveys of actual users of the form, or with lab simulations with test users.
  • Substantive Justice outcomes. Do people who use the form represent their case, claims, and evidence better? Do judicial decision-makers spend more time on their case and take care to apply the law correctly? Do they raise more claims and defenses persuasively? Do they end up with judgments in their favor more often? Substantive justice outcomes can be measured by case file reviews, to see how many claims and defenses are raised, how many moved forward with serious consideration by the judicial decision-makers, and how many were ultimately decided in favor of the litigant. It might also be assessed with exit surveys asking litigants whether they felt their problem was resolved and they received a just outcome.

These legal outcomes can indicate if a form is, in fact, increasing appearance rates, ongoing participation in the court, procedural justice, and substantive outcomes.

If possible, the team might also track longer-term life outcomes after the legal process is over. Is the conflict or crisis resolved? Does a person reduce their debt or financial stress? Is their housing and family more stable? Do they have more trust in the court and the government? These can be gathered through data-tracking in other public services, watching court records for new cases, and administering surveys and interviews with past court users.

This legal and life outcome evaluation can complement the Form Design Rubric — to see if this well-designed tool is indeed having the intended effect of improving the justice system, and helping a person through a legal crisis.

Is the Rubric Missing Anything?

When you use the Rubric, please be sure to note if you’re getting feedback that doesn’t seem to fit in any of the 11 categories.

Are community members, advocates, court staff, or others flagging another concern — something that would prevent a person from finding this form, filling it in, or filing it correctly?

Please let us know! We will refine and grow this Rubric accordingly, based on other factors that we haven’t come across yet in our own design research.

Design Guide for Better Court Forms

How do we make it more likely that people will engage with these forms, use them easily & correctly, and find value in filling them out?

This Design Guide will walk you through a combination of methods to do this improvement work:

  1. Following Visual and Court Form Design Rules, so that you are using best practices that past designers and experts have already developed.
  2. Running Local, Participatory Design Sessions, to get specific and diverse input from your community and their specific needs and context.

Visual and Court Form Design Principles

We have two sets of guiding principles for court form designers to follow. The first set contains general graphic or visual design principles that have been honed over the years in various (non-legal) fields. The second set contains court-form-specific principles that I distilled from the past decade of working on legal design at Stanford.

General Visual Design Rules

How do you lay out a paper-based (or digital PDF) form in the most user-friendly way? Here is a short rundown of useful visual design principles:

  1. Support the User Journey: by making it easy for a person to navigate the document, and guiding them through a ‘story’.
  2. Have a Clear, Strategic Hierarchy: prioritize the info & tasks, with clear navigation. Define a clear strategy that has a hierarchy — not all info is the same.
  3. Provide a Standard, Clear Layout: all content should be aligned, with a single, coherent visual language. Use a grid and possibly column design, to create distinct zones for the person to explore.
  4. Give Generous White Space: let the eye breathe, make people calm, and give space to the most important info.
  5. Deploy Selective Pops: use limited amounts of special fonts or colors to draw attention to high-priority info.
Key design principles that court form designers can use

If you want to get more into detailed communication design choices (like font, color, and accessibility), there are more resources available on general visual design rules.

Court Form Design: the key components of a form

These general principles are great, but they can be hard to translate to the specific world of court forms. We need to dive into the specifics of the court form’s components — and people’s experiences of a form.

Court forms usually contain this shortlist of components. These are the ‘materials’ we can use in our design. We can also check: does the form have all these key components, does it get the hierarchy right, and does it treat them consistently?

  • Credentials that signal the form is official for a jurisdiction
  • Title and purpose, what the form is & what it’s about
  • Instruction info for the user, so they know what to do overall, and then in each section
  • Questions and tasks, asking the user for key information and posing choices to them
  • Entry fields for the user to put information into
  • Insider fields for the court staff to mark notes and enter info
  • Links to more help and associated documents
  • Next Steps & Deadlines of how to get this form into the court correctly, and what to expect after

Court Form Design: How litigants experience a form

People don’t magically see a form, get excited, and spend the next two hours completing it before they walk it over to the court clerk’s office. Their experience is usually much more complicated and stressful.

We need to see the court form not as a static document, but as an experience. Understanding people’s experience of a form helps us figure out specific principles that can help increase engagement, usability, and usefulness.

From our user testing and observation sessions at the Legal Design Lab, we’ve found trends in how litigants use forms — especially when they are self-represented.

  • They scan them over quickly. What should I expect? How long is this going to take? Can I even do this? Am I up for this challenge?
  • They do work in bursts. They may have 1 burst when they first engage. But they’ll likely pause & disengage after they get tired. Hopefully, they’ll re-engage with later bursts!
  • They might get distracted or discouraged when they can’t understand or feel overwhelmed. Can you help make sure this doesn’t lead to disengagement? Instead, support them there.
  • They want to be ‘normal’ and strategic. The form should amplify their sense of legal capability — not make them feel lonely or dumb.

This is where we come back to the big goal of access to justice. Court forms could be an essential gateway to participating in the justice system. But they are hard! They ask for complex, high-stakes information. And people are often in a high-stress situation, afraid of getting things wrong, and with a lot of other things to do in their life. So if a court form is badly designed, it can disengage users, and shut down their access to justice.

Key Court Form Design Principles

So with that background, teams can follow some established principles for good court form design that should look familiar to any group that has used the Form Design Rubric.

  1. Have a clear navigation scheme & glance-able structure. Can a person ‘get’ the key zones of info & tasks within a 1-minute glance-over?
  2. Be calm & readable. Don’t overcrowd with info and tasks. Does it make the person feel more capable or less? Does it have distinct zones of work?
  3. Support stressed-out users. Does it have off-ramps to info, examples, & assistance — especially near the hardest tasks?
  4. Be easy to fill in. Have consistent, ample space to fill info in. Make it clear through spacing, boxes, and lines about what is ‘right’ and ‘normal’ to put in.
  5. Don’t prioritize ‘insider’ tasks & info over the users’ tasks. Are user tasks in high-priority places? Are insider tasks put in discrete, low-importance places?

Running a Design Process on Your Court Forms

Great! Now we have user-centered metrics to aim for (more engagement, usability, and usefulness), and we know some principles for how to get there. Let’s think even more concretely. How do we get courts, judicial councils, and forms committees to start improving the design of their court forms? How do we get more people-centered shifts, and more use of key design principles?

Kick off a Redesign by doing a User-Centered Design Review

If you’re inside the court, you can start a design process. I often like to do it in a series of workshops — beginning with design review sessions, involving many different stakeholders and especially court users.

You can lay out existing court forms on a board or a digital whiteboard, and then start going through what works or doesn’t. You can mark it up with post-its, or pen, or even cut it up and reorganize it.

We did this using a Miro board at today’s Form Camp session. A design review begins with a specific user’s POV. Today, we started with this user scenario:

A tenant in California has just been sued for eviction.

They’ve searched online & found a pdf of this form at the California courts’ webpage. https://www.courts.ca.gov/documents/ud105.pdf

How can we make it usable, useful, and engaging to this litigant?

So we situated our forms discussion in the user’s journey. They’ve come from a Google search, then to a court website, then to a digital pdf. Let’s imagine we are this tenant — and look at the pdf to see what we could improve.

Laying out the user’s journey on a whiteboard, for a forms design review

I took screenshots of the five pages of the California unlawful detainer answer form — the document a tenant must fill out in order to defend against an eviction lawsuit. I laid out the five pages on the digital whiteboard, then asked my team to answer a series of questions about how well the form lived up to the key design principles. I took notes on post-its as they gave feedback.

Doing the design review, with post-its capturing stakeholders’ comments about what could be improved

I prepared a series of specific questions, to get the stakeholders to critique the current design based on the court form and visual design principles.

User Journey Review of a court form

If a person sat down with this form, could they navigate it?

Have your team use the Form Design Rubric (see above) to go through a form together and spot problems, bright spots, and opportunities.

In a design workshop or roundtable setting, you can give the Rubric to stakeholders as “pre-work” for them to do on their own. They can then bring their scores to the conversation, to get to a group consensus about the form.

Or you can do a live, multi-person Rubric evaluation session. You might lead the conversation through the 11 categories, asking the group questions like:

  • At the start, does the document establish a clear, trustworthy relationship between the court & the user?
  • Do the tasks follow a logical, clear order? Are they grouped into clear ‘zones’ that make sense to a user? Are the zones labeled with clear Section Headings? Are there instructions/guidance about sections?
  • Are there ‘Offramp’ links for info & more help in the right place — in a context where the person might be looking for them?
  • At the end of the form, does it make the person confident about the next steps to take?
  • Do you have a clear hierarchy of information & tasks?
  • Your strategic ranking: Have you reviewed everything you want to convey & get from the user? What is most important? What is in the middle? What is least important for the user?
  • Giving the right treatment: For the most important things, have you put them:
    • In the prime locations
    • With bigger fonts
    • With ‘pop’ of color, font, or bold
  • Strong headings: Have you put strong, clear headings for the distinct ‘zones’?
  • Is the language plain, or is it full of legal jargon, code references, etc.?
  • Is the text presented in a large enough font, with enough line spacing, for it to be easy to read?
  • Does the text run all the way across the page, making lines too long to read comfortably?
  • Are the different zones of tasks cluttered together on the page? Or is there breathing white space at the margins and between zones?
  • Are the time, cost, and data privacy practices made clear & transparent, so a user knows what is happening and if they want to use this form?
  • Does the form do anything to help simplify all of the filing, payments, signatures, notarizations, and service of process tasks that are likely to come next for the litigant?

The Rubric’s questions can make sure the stakeholders are doing a critique based on the user’s POV and design principles derived from past best practices. If you don’t use these questions, it’s easy to start shifting back to an institutional-centered POV.

After design review workshops, the team can then move onto other kinds of workshops to generate prototypes, test them, and decide on final versions of forms.

What kinds of form design workshops can you run in your court?

Better Court Form Design will be a continual process

It would be great if I could publish a template for all court forms to follow. It might have hard rules about maximum page length, the ideal font size, and the perfect grid layout. Perhaps these patterns will emerge in the coming years, as we do more testing of form designs with a wide range of users.

But for now, the best way to make user-friendly court forms is to have a continual process of reviewing current forms with community members, drawing on established design principles, testing any new prototype with litigants and court staff, and using our people-centered indicators as the metrics to determine if a court form should be released to the public.

In any court form design, there will be lots of trade-offs. How many details, claims, defenses, and rights can we let a person know about — before we exhaust them to the point of disengagement? How much generous white space can we give a person, until the form becomes so lengthy that it turns them off?

That’s why a multi-stakeholder process, still rooted in the litigant’s (and staff members’) points of view is necessary to decide what works best in making these trade-offs. Ideally, more courts will be engaging in this design work, gathering even more principles and best practices, and leading towards more standardization of the best form designs.

That said — perhaps this whole conversation will be moot soon, if we move towards more interactive form-filling websites, with no more PDFs at all. We’ll then move on to the best design of software interfaces. There will be slightly different principles in play — but our metrics and goals should still be the same.

We want more people to be able to protect their rights in court and do so with a sense of confidence, capability, and dignity. Good court forms are fundamental to good access to justice. As more courts embrace user-centered design, we can start moving to this better future.

Form Experience Design Benchmark Principles

See Margaret Hagan’s design guide for court forms (especially in their paper, PDF version), with benchmark principles for these legal documents in particular.
