Accessibility in the Definition of Done: Why it matters & how to get it right

Including accessibility in your Definition of Done is one of the most effective ways to embed inclusive practice into agile delivery. It ensures digital products are not just functional but accessible to everyone, right from the start.

However, too many teams still forget one group entirely: disabled users. Here at Arc Inclusion, we believe that accessibility isn’t something you bolt on at the end. It’s something you bake in from the start and continue to build on.

Accessibility isn’t extra work; it’s part of the work. When it’s visible in sprint planning and embedded in your teams’ practices, you reduce rework, improve quality, and create better solutions for all users.

[Image: agile board with columns labelled To Do, Work, and Done]
Accessibility isn’t extra work; it’s part of the work. Make it visible in your sprint planning.

What is the goal of a Definition of Done?

The goal of a Definition of Done is to create a shared understanding within an agile team of what it means for work to be complete. Instead of relying on individual interpretations, the whole team agrees on a clear, consistent set of criteria that every item must meet before it can be considered finished.

It means:

  • Increasing transparency so everyone is clear on what ‘done’ really means

  • Ensuring quality and consistency across all work

  • Reducing rework and technical debt

  • Delivering increments that are potentially shippable at the end of each sprint

  • Supporting continuous improvement

Having a clear definition of what it means for work to be complete means that teams can work with confidence, knowing that each increment meets the agreed quality standards and is ready to deliver value to users.

Learn how to run inclusive user testing to confirm accessibility in practice.

Why include accessibility in the Definition of Done?

Adding accessibility to your Definition of Done helps move it from aspiration to expectation. It means accessibility isn’t dependent on one advocate remembering to raise it; it’s embedded in the process and supported by the whole team. This shift strengthens agile accessibility and delivers clear benefits, such as:

  • Creating a shared understanding across teams

  • Improving quality and reducing rework

  • Making inclusion part of your workflow, not just a separate project

Including accessibility in your DoD also helps you spot issues earlier, avoid legal and reputational risk, and deliver better experiences for every user.

What are examples of a Definition of Done with accessibility in mind?

Every team is different, but the checks you include should be practical, repeatable, and tailored to your workflows. In an agile Definition of Done, accessibility examples like these help teams embed inclusion into their everyday delivery. Here are some examples of what you might include:

1. Passes automated accessibility tests

Use tools like axe, WAVE, or Lighthouse to catch common issues. Automation won’t find everything, but it’s a great early warning system and safety net.
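To make the idea concrete, here is a minimal sketch of the kind of check these tools automate, written in plain Python rather than any real tool’s API. It flags `img` elements with no `alt` attribute, one of the most common issues automation catches early; real scanners like axe cover hundreds of rules beyond this.

```python
# Illustrative only: a tiny automated accessibility check in the spirit
# of axe/WAVE/Lighthouse. It flags <img> elements missing an alt
# attribute -- a frequent, easily automatable WCAG failure.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the start tag
        if tag == "img" and "alt" not in dict(attrs):
            self.issues.append("img element missing alt attribute")

checker = MissingAltChecker()
checker.feed('<img src="logo.png"><img src="chart.png" alt="Sales chart">')
print(checker.issues)  # one issue: the first img has no alt text
```

A check like this is cheap enough to run on every pull request, which is exactly how a Definition of Done item such as “passes automated accessibility tests” becomes enforceable rather than aspirational.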

2. Reviewed against WCAG 2.2 AA

Have someone on the team (or a specialist) check new features against WCAG success criteria. Focus on things like keyboard access, semantic structure, and contrast.
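Contrast is one of the easiest WCAG checks to reason about numerically. The sketch below implements the WCAG contrast-ratio formula (relative luminance of the lighter colour plus 0.05, divided by that of the darker plus 0.05); WCAG 2.2 AA requires at least 4.5:1 for normal-size text.

```python
# Sketch of the WCAG contrast-ratio calculation, following the
# relative-luminance definition in the WCAG specification.
def relative_luminance(r, g, b):
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg, bg):
    l1, l2 = relative_luminance(*fg), relative_luminance(*bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background is the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A reviewer armed with this formula (or a tool that implements it) can turn “check the contrast” from a judgement call into a pass/fail criterion in the Definition of Done.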

3. Content follows readability and accessibility standards

Good accessibility is about more than code. Clear, structured, plain language content makes a huge difference for users with cognitive differences, low literacy, or who rely on screen readers.

4. Live regions and dynamic content tested

If you’re working with modals, dropdowns, alerts, or other dynamic elements, make sure they’re properly announced by assistive tech.
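Assistive technologies need an explicit hint that a region of the page can change. As an illustrative fragment (not tied to any framework), a polite live region looks like this:

```html
<!-- A status region: screen readers announce changes to its text
     without moving keyboard focus. role="status" implies
     aria-live="polite"; stating both aids older assistive tech. -->
<div role="status" aria-live="polite">
  3 results found
</div>
```

When the div’s text is updated (for example after a search completes), the change should be announced automatically; verify this with a real screen reader such as NVDA or VoiceOver rather than assuming it works.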

5. Accessibility statement updated

If your organisation has a public accessibility statement, updating it when new features go live shows transparency and gives users a route to raise issues.

Embedding accessibility in the Definition of Done

Building accessibility into your DoD isn’t about slowing things down with red tape. Quite the opposite: it helps you deliver higher-quality, lower-risk work faster. For agile teams, this shift makes accessibility a natural part of delivery rather than an afterthought, ensuring that inclusion is integrated into every sprint.

It also supports long-term change by reinforcing three essential pillars:

  • Becoming Compliant: Demonstrating due diligence, getting your services accessible, and avoiding risk.

  • Staying Accessible: Giving your teams the capability, confidence, and tools to deliver accessible products.

  • Inclusive Culture: Embedding accessibility into everyday thinking, values, and practice.

How do you create a shared DoD?

Creating a shared Definition of Done is about building alignment across the whole agile team. It shouldn’t live in the head of one person; it should be visible, agreed, and applied consistently.

Here are some key steps to create yours:

  1. Review your current DoD: Is accessibility mentioned at all? If not, where could it be included?

  2. Choose realistic checks: Start small if needed. Even one or two well-defined steps can have a big impact.

  3. Train your teams: Help everyone understand what those checks mean and how to meet them.

  4. Create a feedback loop: Update your DoD as you learn, test, and improve.

Final Thoughts

At Arc Inclusion, we work with organisations to embed accessibility across the entire product lifecycle, from initial research and strategy through to design, development, testing, and rollout. For agile teams, this includes making sure accessibility is built into the Definition of Done so it becomes an everyday standard.

Whether you’re reviewing your agile processes or building a culture of accessibility from the ground up, our Digital Accessibility Maturity Assessment can help you pinpoint where you are now, uncover gaps, and map out your next steps to deliver truly accessible digital products.


FAQs

What’s the difference between a Definition of Done and acceptance criteria?

Although they may sound similar, there are some key differences between the two.

A DoD is a shared checklist that applies to all product backlog items, setting the standard for when work is complete across an agile team. It ensures the product increment meets the agreed quality standards and is useful and ready for future release. Creating a DoD is a collaborative process that can involve the entire team and sometimes even multiple teams across the business.

Acceptance criteria, on the other hand, are specific conditions for a single user story or feature. Their purpose is to identify whether the product increment fulfils the specific requirements of the product backlog item. They are normally the product owner’s responsibility but can also be delegated to developers.

Is a Definition of Done always essential?

A DoD is not always essential for every project, but it is crucial in agile methodologies. Without it, teams risk misunderstandings due to a lack of clarity, inconsistent quality, and unfinished work being marked as complete.

A clear and transparent DoD ensures teams are aligned and accountable, creating a shared understanding of when work is truly finished and reducing the chance of rework later.

What happens if your Definition of Done is weak or unclear?

A weak or unclear Definition of Done creates confusion and inconsistency. Without clear criteria, agile teams may:

  • Deliver features with poor or inconsistent quality

  • Accumulate technical debt that slows future progress and releases

  • Spend more time on rework and bug fixing

  • Lose trust with stakeholders and users when expectations aren’t met

  • Mark incomplete work as finished

In agile methodology, a weak DoD leaves teams in the dark, undermining accountability and making it harder to deliver reliable, valuable increments.

What is website accessibility monitoring?

Website accessibility monitoring is the process of scanning your website to detect issues that could prevent users with disabilities from using it. Automated web accessibility monitoring tools continuously check for accessibility issues across your site, providing instant alerts for new and updated content as well as reporting on your overall site health.

They track compliance with standards like the Web Content Accessibility Guidelines (WCAG) and show you how accessible your site is, where it should be, and what improvements to make to deliver a better experience for all users.

In addition to measuring compliance, they also provide a clear picture of your progress over time, so you can track the impact of your improvements and maintain ongoing accessibility.

What are the main types of accessibility monitoring?

The two main types are automated and manual monitoring. Together, they provide a comprehensive view of how accessible your site is and where improvements are needed.

  • Automated monitoring uses specialised web accessibility monitoring tools to scan your website for non-compliant features and common issues, such as missing alt text, poor colour contrast, or keyboard navigability problems. These tools can also provide instant alerts when site elements present accessibility risks, plus site health reports so you can prioritise issues.

  • Manual monitoring is where accessibility experts and testers review your site as a real user would, often using assistive technologies like screen readers. They will usually check how easy it is to navigate through pages, interact with content, and understand messages or instructions. The aim is to identify any areas that may present barriers for people with disabilities.

Why is accessibility monitoring important?

Accessibility monitoring is crucial for ensuring that everyone can use and experience your site in the same way, regardless of ability. It is also essential for staying compliant with standards like WCAG and with laws like the European Accessibility Act.

Without regular monitoring, accessibility issues can easily appear when new pages are added, content is updated, or designs are changed.

Continuous website accessibility monitoring gives you a framework to:

  • Stay compliant

  • Improve user experience

  • Respond to issues quickly

  • Track progress over time

How often should you monitor accessibility?

Accessibility monitoring should be integrated into your process rather than treated as a one-time check. Websites change frequently, with new pages, designs, and content updates, and each change can introduce accessibility issues.

Continuous monitoring, both manual and through an automated website monitor, is recommended to catch issues as soon as they appear, particularly after any big changes, such as adding interactive elements, redesigns, or updates to legal or accessibility guidelines.

Even without significant changes, monitoring should be a consistent part of your organisation’s website maintenance. The more you test, the better, but for those looking for an exact amount, once a month is a good starting point for catching emerging issues.
