Closing the justice gap in Alaska and British Columbia, Canada, is becoming a reality: AI-powered chatbots now provide self-represented litigants with accurate, user-friendly legal guidance.
Access to justice is a fundamental pillar of a fair and equitable society, yet only one in four respondents to the National Center for State Courts’ State of the States survey agreed that courts are doing enough to help individuals navigate the court system without an attorney. Many of these pro se (self-represented) litigants still face substantial barriers to obtaining legal assistance.
However, AI-powered chatbots now offer a promising solution by providing timely, tailored legal information to those in need — and two early examples are the chatbots Beagle+ and AVA.
Beagle+ makes Canadian law accessible in plain language
Beagle+ is a chatbot powered by generative AI (GenAI) and developed by People’s Law School in British Columbia. It gives step-by-step guidance on everyday legal problems: users describe their legal concerns in their own words, and the chatbot responds with appropriate information, links to relevant resources, and potential next steps. Drew Jackson, Digital & Content Lead at People’s Law School, led the effort to create Beagle+ with technical assistance from Chris McGrath, Founder of Tangowork. Jackson and McGrath worked together to launch Beagle+ in early 2024.
Central to the success of Beagle+ is its thoughtful design and user-centric approach. The team prioritized creating a system that is both empathetic and informative with a primary focus on providing users with clear, actionable guidance. The chatbot’s ability to integrate seamlessly with existing web resources without requiring dual data maintenance is another significant achievement because it reduces operational overhead while maintaining up-to-date legal content.
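The single-source approach described above — maintaining legal content only on the website and ingesting it into the chatbot automatically — can be sketched in a few lines. This is a minimal illustration, not People’s Law School’s actual pipeline; the sample page and chunking rules are assumptions.

```python
# Minimal sketch: ingest an existing self-help web page as chatbot
# knowledge, so the website remains the single place content is
# maintained. (Illustrative only; not the Beagle+ implementation.)
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text from a page, skipping script/style blocks,
    so the page can feed a knowledge base without dual data entry."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = False

    def handle_starttag(self, tag, attrs):
        self._skip = tag in ("script", "style")

    def handle_endtag(self, tag):
        self._skip = False

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

# In practice the page would be fetched from the organization's live site;
# here a small inline example stands in for it.
page = "<h1>Tenant rights</h1><p>Your landlord must give notice.</p>"
parser = TextExtractor()
parser.feed(page)
print(parser.chunks)
```

Rebuilding the knowledge base from the live pages on a schedule is one simple way to keep chatbot answers aligned with the published legal content.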
Although the tool is successful, Jackson and McGrath faced challenges throughout the development process. One key barrier was ensuring the chatbot did not draw on its underlying model’s training data to give incorrect legal advice. Another was improving the system’s ability to handle nuanced legal questions. To address these challenges, the team used iterative testing and refinement to achieve a 99% accuracy rate in legal conversations.
Alaska state court develops its first chatbot
The Alaska Court System (ACS) partnered with LawDroid, a legal technology company that has pioneered access to justice chatbots since 2016, and used a grant from the National Center for State Courts to develop an AI-powered chatbot called the Alaska Virtual Assistant, or AVA. The tool, which is in the final testing phase before launch, will help self-represented litigants navigate probate estate cases.
AVA uses enhanced retrieval-augmented generation (RAG), which combines information retrieval with GenAI for improved accuracy and context in responses grounded in the court’s existing self-help web content, according to Tom Martin, CEO and Founder of LawDroid. Notably, AVA provides citations to verifiable sources along with suggested follow-up questions to help self-represented litigants find the information they didn’t even know they needed. ACS and LawDroid have been testing both OpenAI’s GPT-4 and Anthropic’s Claude 3.5 Sonnet, comparing accuracy and tone. A decision has not yet been made on which model will ultimately be used, according to Jeannie Sato, Director of Access to Justice Services at ACS.
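The RAG pattern described above can be sketched compactly: retrieve the most relevant passages from a trusted knowledge base, then build a prompt that tells the model to answer only from those passages and cite them. The documents, keyword scoring, and prompt wording below are illustrative assumptions, not the ACS/LawDroid implementation (which would use a real model and likely embedding-based retrieval).

```python
# Minimal sketch of retrieval-augmented generation (RAG) with citations.
# The knowledge base, scoring, and prompt format are assumptions.
import re

# Tiny stand-in for a knowledge base built from court self-help pages.
KNOWLEDGE_BASE = [
    {"id": "probate-101", "url": "https://example.courts.gov/probate-basics",
     "text": "Probate is the court process for settling a deceased person's estate."},
    {"id": "small-estate", "url": "https://example.courts.gov/small-estates",
     "text": "Small estates under a value threshold may use a simplified affidavit."},
]

def _words(text: str) -> set[str]:
    return set(re.findall(r"[a-z']+", text.lower()))

def retrieve(question: str, top_k: int = 2) -> list[dict]:
    """Rank entries by naive keyword overlap with the question. A
    production system would use embeddings; the pattern is the same."""
    q = _words(question)
    scored = [(len(q & _words(doc["text"])), doc) for doc in KNOWLEDGE_BASE]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]

def build_prompt(question: str) -> str:
    """Assemble a grounded prompt: answer only from the retrieved
    passages, and cite the source id for every claim."""
    context = "\n".join(f"[{d['id']}] ({d['url']}) {d['text']}"
                        for d in retrieve(question))
    return ("Answer using ONLY the sources below. Cite the source id "
            f"for every claim.\nSources:\n{context}\n\nQuestion: {question}")

prompt = build_prompt("What is probate?")
# The prompt would then be sent to the chosen model (e.g., GPT-4 or Claude).
print(prompt)
```

Because the model is instructed to cite the retrieved sources, every answer carries a verifiable pointer back to the court’s own content — the property the article highlights.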
Managing the complexities of legal language and ensuring the chatbot’s responses are consistent and reliable were two main challenges experienced during the development of AVA. These were addressed through a combination of meticulous content review, the use of advanced AI models, and continuous collaboration with legal and technical experts. Also, substantial effort was spent to create a comprehensive knowledge base from existing web content to ensure external sources did not leak in and result in erroneous responses to prompts. The production of AVA also required rigorous testing and refinement to address inaccurate inferences and inconsistent responses.
What the courts can learn from AVA and Beagle+
The development of Beagle+ and AVA yielded several key lessons that courts and legal services organizations can benefit from, including:
Focus on user needs during development — When creating public-facing legal tools, the most important requirement is meeting the needs of the average self-represented user, who may have limited or no knowledge of the legal system. Beagle+ and AVA balance empathy with clear information to deliver user-centric guidance that is both compassionate and practical, with actionable insights and support. Additionally, both tools prioritize clear, concise language written at a reading level the general public can understand.
Collaborate with an interdisciplinary team — Both projects stressed the importance of having a multidisciplinary team that possesses legal and technical expertise along with a commitment to use plain language. This helps ensure that the chatbot is legally accurate, technically sound, and easy to understand.
Use iterative testing and human review — The development teams of both projects used rigorous, recurring testing and regular human review of responses, and restricted the chatbot to information from trusted sources (the knowledge base) to help ensure users receive correct legal guidance. Maintaining a system for documenting and preserving all prompts and responses helps track accuracy and allows the team to monitor progress over time. ACS found that instructing the model to include a citation to the source of the information can help confirm accuracy and improve user confidence.
Continuously evaluate and improve the chatbot — Both teams underscored the importance of ongoing refinements to the knowledge base, stemming from iterative testing and user feedback analysis to maintain accuracy and improve the chatbot’s performance over time.
Dedicate resources well — Cost is often a factor for smaller court systems as well as for nonprofits and legal aid organizations. However, the most important factor in resource planning is dedicating the appropriate amount of internal staff time to the AI project. Project managers should plan to dedicate at least 30% of one staff person’s time to build and review the knowledge base, evaluate and refine output, and fulfill other responsibilities, and allocate 30% of another person’s time for technical development.
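The prompt-and-response logging practice recommended above can be sketched as an append-only review log. The record fields, JSONL format, and review workflow here are assumptions for illustration, not either project’s actual tooling.

```python
# Minimal sketch: log every prompt/response exchange so human reviewers
# can mark each one accurate or not and the team can track accuracy over
# time. (Illustrative assumptions, not either project's real tooling.)
import datetime
import json
from pathlib import Path

LOG_FILE = Path("chatbot_review_log.jsonl")

def log_exchange(prompt: str, response: str, sources: list[str]) -> dict:
    """Append one exchange, with its cited sources, as a JSON line."""
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "prompt": prompt,
        "response": response,
        "sources": sources,
        "reviewed": False,   # a human reviewer flips this...
        "accurate": None,    # ...and records a verdict here
    }
    with LOG_FILE.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return record

def accuracy_rate() -> float:
    """Share of reviewed exchanges judged accurate — the kind of number
    a team can watch as the knowledge base is refined."""
    records = [json.loads(line) for line in LOG_FILE.read_text().splitlines()]
    reviewed = [r for r in records if r["reviewed"]]
    if not reviewed:
        return 0.0
    return sum(1 for r in reviewed if r["accurate"]) / len(reviewed)
```

Keeping the log append-only preserves the full history, so a drop in accuracy after a knowledge-base change is visible rather than silently overwritten.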
Conclusion
As AI-powered legal chatbots continue to evolve, they offer a promising path to bridge the justice gap and empower self-represented litigants. By learning from successful implementations like Beagle+ and AVA, courts and legal services organizations can develop more effective tools to increase access to justice for all.
Join us for part two of the Tech for All webinar on February 19 to delve deeper into the technical aspects of building and monitoring these AI tools.