Scaling Justice Archives - Thomson Reuters Institute https://blogs.thomsonreuters.com/en-us/topic/scaling-justice/ Thomson Reuters Institute is a blog from Thomson Reuters, the intelligence, technology and human expertise you need to find trusted answers.

Scaling Justice: AI is scaling faster than justice, revealing a dangerous governance gap /en-us/posts/ai-in-courts/scaling-justice-governance-gap/ Mon, 13 Apr 2026 16:57:55 +0000

Key takeaways:

      • AI frameworks need to keep up with implementation — While AI governance frameworks are being developed and enacted globally, their effectiveness depends on enforceable mechanisms within domestic justice systems.

      • Access to justice is essential for trustworthy AI regulation — Rights and protections are only meaningful if individuals can understand, challenge, and seek remedies for AI-driven decisions. Without operational access, governance frameworks risk remaining theoretical.

      • People-centered justice and human rights must anchor AI governance — Embedding human rights standards and ensuring equal access to justice in AI regulation strengthens public trust, accountability, and the credibility of both public institutions and private companies.


AI governance is accelerating across global, national, and local levels. As public investment in AI infrastructure expands, new oversight bodies are emerging to assess safety, risk, and accountability. The global policy conversation has shifted from principles to the implementation of meaningful guardrails and AI governance frameworks, which legislators are now drafting and enacting.

These developments reflect growing recognition that AI systems demand structured oversight and a shift from voluntary safeguards and standards to institutionalized governance. One critical dimension remains underdeveloped, however: how do these frameworks function in practice? Are they enforceable? Do they provide accountability? Do they ensure equal access?

AI governance will not succeed on the strength of international declarations or regulatory design alone; rather, domestic justice systems will determine whether it works. At this intersection, the connection between AI governance and access to justice becomes real.

In early February, leaders across government, the legal sector, international organizations, industry, and civil society convened for an expert discussion. The reflections that follow build on that dialogue and its urgency.

From principles to enforcement

Over the past decade, AI governance has evolved from hypothetical ethical guidelines to voluntary commitments, binding regulatory frameworks, and risk-based approaches. Despite these advancements, however, many past attempts to provide structure and governance have been quickly outpaced by technology and remain insufficient without enforcement mechanisms. As Anoush Rima Tatevossian of The Future Society observed: “The judicial community should have a role to play not only in shaping policies, but in how they are implemented.”

Frameworks establish expectations, while courts and dispute resolution mechanisms interpret rules, test rights, evaluate harm, assign responsibility, and determine remedies. If individuals are not empowered to safeguard their rights and cannot access these mechanisms, governance frameworks remain theoretical or are casually ignored.

This challenge reflects a broader structural constraint. Even without AI, legal systems struggle to meet demand. In the United States alone, 92% of people do not receive the help they need in accessing their rights in the justice system. Introducing AI into this environment without strengthening access can risk widening, rather than narrowing, the justice gap.


Please add your voice to Thomson Reuters’ flagship global study, which explores how the professional landscape continues to change.


Justice systems serve as the operational core of AI governance. By extending the rule of law into unregulated areas, they provide the infrastructure for accountability: interpreting regulatory provisions in specific cases, assessing whether AI-related harms violate legal standards, allocating responsibility across public and private actors, and providing accessible pathways for redress.

These frameworks also generate critical feedback. Disputes involving AI systems expose gaps in transparency, fairness, and accountability. Legal professionals see where governance frameworks first break down in real-world conditions, often long before policymakers do. As a result, these frameworks function as an early signal of policy effectiveness and rights protections.

Importantly, AI governance does not require entirely new legal foundations. Human rights frameworks already provide standards for legality, non-discrimination, due process, and access to remedy, and these standards apply directly to AI-enabled decision-making. “AI can assist judges but must never replace human judgment, accountability, or due process,” said Kate Fox Principi, Lead on the Administration of Justice at the United Nations (UN) Office of the High Commissioner for Human Rights (OHCHR), during the February panel.

Clearly, rights are only meaningful when individuals can exercise them — this constraint is not conceptual, it’s operational. Systems must be understandable, affordable, and responsive, and institutions should be capable of evaluating complex, technology-enabled disputes.

Trust, markets & accountability

Governance frameworks that do not account for these dynamics risk entrenching inequities rather than mitigating them. An individual’s ability to understand, challenge, and seek a remedy for automated decisions determines whether governance is credible. A people-centered justice approach asks whether individuals can meaningfully engage with the system, not just whether rules exist. For example, women face documented barriers to accessing justice in jurisdictions worldwide, and AI systems trained on biased data can replicate or amplify existing disparities in employment, financial services, healthcare, and criminal justice.

“Institutional agreement rings hollow when billions of people experience governance as remote, technocratic, and unresponsive to their actual lives,” said Alfredo Pizarro of the Permanent Mission of Costa Rica to the UN. “People-centered justice becomes essential.”

AI systems already shape outcomes across employment, financial services, housing, and justice. Entrepreneurs, law schools, courts, and legal services organizations are building AI-enabled tools that help people navigate legal processes and assert their rights more effectively. Governance design will determine whether these tools spread access to justice or introduce new barriers.

Private companies play a central role in developing and deploying AI systems. Their products shape economic and social outcomes at scale. For them, trust is not abstract; it is a success metric. “Innovation depends on trust,” explained Iain Levine, formerly of Meta’s Human Rights Policy Team. “Without trust, products will not be adopted.” And trust, in turn, depends on enforceability and equal access to remedy.

AI governance will succeed or fail based on access

As Pizarro also noted, justice provides “normative continuity across technological rupture.” Indeed, these principles already exist within international human rights law and people-centered justice; although they precede the advent of autonomous systems, they provide standards for evaluating discrimination, surveillance, and procedural fairness, and remain durable as new challenges to upholding justice and the rule of law emerge.

People-centered justice was not designed for legal systems addressing AI-related harms, but its outcome-driven orientation remains durable as new justice problems emerge.

The current stage presents an opportunity to align AI governance with access to justice from the outset. Beyond well-drafted rules, we need systems that people can use. And that means that any effective governance requires coordination between policymakers, legal professionals, and the public.


You can find other installments of our Scaling Justice blog series here

Scaling Justice: Unlocking the $3.3 trillion ethical capital market /en-us/posts/ai-in-courts/scaling-justice-ethical-capital/ Mon, 23 Mar 2026 17:12:28 +0000

Key takeaways:

      • An additional funding stream, not a replacement — Ethical capital has the potential to supplement existing access to justice infrastructure by introducing a justice finance mechanism that can fund cases with measurable social and environmental impact.

      • Technology as trust infrastructure — AI and smart technologies can provide the governance scaffolding required for ethical capital to flow at scale, including standardizing assessment, impact measurement, and oversight.

      • Capital is not scarce; allocation is — The true bottleneck is not the availability of funds; rather it’s the disciplined, investment-grade legal judgment required to evaluate risk, ensure compliance, and measure impact in a way that makes justice outcomes investable.


Kayee Cheung & Melina Gisler, Co-Founders of justice finance platform Edenreach, are co-authors of this blog post

Access to justice is typically framed as a resource problem — the idea that there are too few legal aid lawyers, too little philanthropic funding, and too many people navigating civil disputes alone. As a result, most individuals facing civil legal challenges do so without representation, often because they cannot afford it.

Yet this crisis exists alongside a striking paradox. While 5.1 billion people worldwide face unmet justice needs, an estimated $3.3 trillion in mission-aligned capital — held in donor-advised funds, philanthropic portfolios, private foundations, and impact investment vehicles — remains largely disconnected from solutions.

Unlocking even a fraction of this capital could introduce a meaningful parallel funding stream — one capable of supporting cases with potential impacts that currently fall outside traditional funding models. Rather than depending on charity or contingency, what if justice also attracted disciplined, impact-aligned investment in cases themselves, alongside funding that supports technology?

Recent efforts have expanded investor awareness of justice-related innovation. Programs like those from Village Capital have helped demystify the sector and catalyze funding for technology serving justice-impacted communities. Justice tech, or impact-driven direct-to-consumer legal tech, has grown exponentially in the last few years, along with increased investor interest and user awareness.

Litigation finance has also grown, but its structure is narrowly optimized for high-value commercial claims with a strong financial upside. Traditional funders typically seek 5- to 10-times returns, prioritizing large corporate disputes and excluding cases with significant social value but lower monetary recovery, such as consumer protection claims, housing code enforcement, environmental accountability, or systemic health negligence.

Justice finance offers a different approach. By channeling capital from the impact investment market toward the justice system and aligning legal case funding with established impact measurement frameworks, it reframes certain categories of legal action as dual-return opportunities, delivering both financial and social returns.

This is not philanthropy repackaged. It’s the idea that measurable justice outcomes can form the basis of an investable asset class, if they’re properly structured, governed, and evaluated.

Technology as trust infrastructure

While mission-aligned capital is widely available, the ability to evaluate legal matters with the necessary rigor remains limited. Responsibly allocating funds to legal matters requires complex expertise, including legal merit assessment, financial risk modeling, regulatory compliance, and impact evaluation. Cases must be considered not only for their likelihood of success and recovery potential, but also for measurable social or environmental outcomes.

Today, that assessment is largely manual and capacity-bound by small teams. The result is a structural bottleneck as capital waits on scalable, trusted evaluation and allocation.

Without a way to standardize and responsibly scale analysis of the double bottom line, however, justice funding remains bespoke, even when resources are available.

AI-enabled systems can play a transformative role by standardizing assessment frameworks and supporting disciplined capital allocation at scale. By encoding assessment criteria, decision pathways, and compliance safeguards and then mapping case characteristics to impact metrics, technology can enable consistency and allow legal and financial experts to evaluate exponentially more matters without lowering their standards.

And by integrating legal assessment, financial modeling, and impact alignment within a governed tech framework, justice finance platforms can function as the connective tissue. Through such a platform, impact metrics are applied consistently while human experts remain responsible for final determinations, thereby reducing friction, increasing transparency, and supporting auditability.

When incentives align

It’s no coincidence that many of the leaders exploring justice finance models are women. Globally, women experience legal problems at disproportionately higher rates than men yet are less likely to obtain formal assistance. Women also control significant pools of global wealth. Indeed, 75% of women believe investing responsibly is more important than returns alone, and female investors are almost twice as likely as male counterparts to prioritize environmental, social and corporate governance (ESG) factors when making investment decisions.

When those most affected by systemic barriers also shape capital allocation decisions, structural change becomes more feasible. Despite facing steep barriers in legal tech funding (just 2% goes to female founders), women represent a substantially higher share of founders in access-to-justice legal tech, compared with just 13.8% across legal tech overall.

This alignment between lived experience, innovation leadership, and capital stewardship creates an opportunity to reconfigure incentives in favor of meaningful change.

Expanding funding and impact

Justice financing will not resolve the justice gap on its own. Mission-focused tools for self-represented parties, legal aid, and court reform remain essential components of a functioning justice ecosystem. However, ethical capital represents an additional structural layer that can expand the range of cases and remedies that receive financial support.

Impact orientation can accommodate longer time horizons, alternative dispute resolution pathways, and remedies that extend beyond monetary damages. In certain matters, particularly those involving environmental harm, systemic consumer violations, or community-wide injustice, capital structured around impact metrics may identify and enable solutions that traditional litigation finance models do not prioritize.

For example, capital aligned with defined impact frameworks may support outcomes that include remediation programs, compliance reforms, or community investments alongside financial recovery. These approaches can create durable benefits that outlast a single judgment or settlement.

Of course, solving deep-rooted inequities and legal system complexity requires more than new tools and new investors. It requires designing capital pathways that are repeatable, accountable, and aligned with measurable public benefit.

Although justice finance may not be a fit for every case and has yet to see widespread uptake, it does have the potential to reach cases that currently fall through the cracks — cases that have merit, despite falling outside traditional litigation finance models and legal aid or impact litigation eligibility criteria.


You can find other installments of our Scaling Justice blog series here

Scaling Justice: Easing the UK’s employee rights crisis /en-us/posts/ai-in-courts/scaling-justice-uk-employee-rights-crisis/ Tue, 24 Feb 2026 18:37:39 +0000

Key takeaways:

      • An emerging employment tribunal crisis — The UK’s employment tribunal system is facing unprecedented backlogs, long wait times, and unaffordable legal representation, leaving many workers and small businesses unable to effectively resolve workplace disputes.

      • Process-oriented barriers to justice — Most claims are dismissed not because they lack merit, but due to claimants disengaging from a slow and complex process, with legal costs often exceeding the value of claims and legal aid unable to meet rising demand.

      • A potential role for legal technology — Mission-driven legal tech platforms are emerging to provide affordable, scalable support and help claimants stay engaged by offering a practical solution to improve access to justice.


When a worker in the United Kingdom is unfairly dismissed or denied wages, their path to resolution runs through employment tribunals, a specialized court system separate from civil courts. As in the United States, many workers and small businesses cannot afford legal representation and must navigate the process on their own.

With backlogs at all-time highs and affordable legal services at all-time lows, this system is coming under increasing pressure. Fortunately, mission-driven technology and data analysis are emerging to level the playing field and increase access to justice.

Current state by the numbers

According to an analysis of the Ministry of Justice’s Tribunal Statistics Quarterly and other data sources,* in the second quarter of 2025, employment tribunals resolved just 45% of incoming claims, adding 18,000 cases to the backlog in that quarter alone. In the past year, the open caseload has surged by 244%. This pressure is set to intensify as the incoming Employment Rights Act 2025 — the UK’s most significant overhaul of workplace protections in decades — extends protection to six million more workers in 2027.

As the backlog increases, so do wait times. In 2025, the average wait for resolution reached 25 weeks, more than double that of 2024, with some claim types, such as equal pay and discrimination, reaching up to 37 weeks. Some more complex cases are reported to have their final hearings scheduled as far out as 2029.

With only 8% of cases reaching a final hearing and the majority resolved through settlement or withdrawal, the growing backlog raises concerns about whether lengthy wait times influence how claimants choose to resolve their cases.

In the UK, a common threshold for legal affordability is a salary of £55,000, meaning around 65% of workers cannot afford legal representation. Legal aid and pro bono services exist to support those in need, but with growing funding constraints and rising demand, these services cannot reach nearly two-thirds of claimants.


You can find more insights about how courts are managing the impact of advanced technology from our Scaling Justice series here


Tribunal awards are largely calculated from salary. This can result in a claim’s value often being lower than the cost of legal representation to pursue it. In a typical hospitality case, for example, a worker owed £1,500 in unpaid wages (equivalent to 3½ weeks of pay) has a 92% chance of representing themselves and will wait an average of six months for resolution — without the pay owed, legal support, or outcome certainty.

The cost, both in time and resources, also falls on employers. In lower-margin industries such as hospitality, default judgments, in which an employer does not engage with proceedings, can reach as high as 37%, compared with a national average of around 6%. For these employers and for smaller businesses more broadly, the cost of legal support may also exceed the value of defending a claim.

With rising costs and growing delays, the risk for both employers and employees is that the system becomes inaccessible, leading to outcomes shaped by who can afford to sustain the process rather than case-by-case strength.

Where justice tech fits

The conventional assumption is that self-represented claimants are at a significant disadvantage when they go to court; yet the data is more nuanced. Self-represented claimants who reach a hearing prevail 44% of the time, compared to 52% for those with legal representation — a gap of eight percentage points.

The greater risk is not losing at hearing but never actually reaching one. Analysis of more than 2,700 struck-out, or dismissed, cases by employment rights platform Yerty found that the majority were dismissed not for lack of merit, but because claimants stopped engaging with the process. Only 6% were struck out for having no reasonable prospect of success. This suggests that the primary barrier may not be the absence of legal representation, but the ability to sustain engagement with a slow, complex, and often opaque process.

Increasing numbers of UK workers turning to AI tools like ChatGPT for legal support highlight not only the demand for affordable access but also the risks of general-purpose tools being used in legal contexts. Fabricated case law in tribunal submissions, for example, harms users and adds further pressure to an already overstretched system.


The conventional assumption is that self-represented claimants are at a significant disadvantage when they go to court; yet the data is more nuanced.


A new generation of legal technology platforms is emerging to fill this gap, with tools purpose-built for the specific circumstances of employment law. Yerty and Valla, among others, offer AI-powered guidance tailored to the UK tribunal process, providing affordable, scalable support previously out of reach for most workers. Government organizations are also moving in this direction. For example, ACAS, in its recent five-year strategy outlook, committed to exploring new digital services that offer faster, more accessible support.

Technology alone cannot address underfunding, judicial capacity, or fundamental power imbalances. However, if the majority of dismissed claims stem from disengagement rather than weak cases, and self-represented claimants prevail at comparable rates to those with lawyers, then the answer isn’t more lawyers — it’s better support upstream. Mission-driven legal technology can provide consistent, scalable guidance that helps both parties manage the process and avoid falling through the cracks.

The UK government’s own assessment of the Employment Rights Bill forecasts a 15% increase in claims by 2027 due to expanded eligibility. As noted above, the system is already under significant pressure before these reforms take effect, and traditional responses — more judges, more funding — too often take years to deliver.

While not a complete answer, justice tech can help address a real, measurable problem, that of keeping people engaged in a process that too often disengages them. For a hospitality worker owed back pay, a healthcare worker facing unfair dismissal, or a retail employee navigating a discrimination claim alone, that support could mean the difference between a case heard and one abandoned — and justice delayed or justice denied.


*Sources: Ministry of Justice Tribunal Statistics Quarterly (July-September 2025); Yerty analysis of 2,721 struck-out tribunal decisions and 8,761 case outcomes; ACAS Strategy 2025-2030; 2024 UK Judicial Attitude Survey, UCL Judicial Institute / UK Judiciary, February 2025.

Scaling Justice: How technology is reshaping support for self-represented litigants /en-us/posts/ai-in-courts/scaling-justice-technology-self-represented-litigants/ Fri, 23 Jan 2026 15:31:24 +0000

Key takeaways:

      • From scarcity to abundance — Technology has shifted the challenge in access to justice from scarcity of legal help to issues of accuracy, governance, and effective support. AI and digital tools now provide abundant legal information to self-represented litigants, but they raise new questions about reliability, oversight, and alignment with human needs.

      • The necessity of human-in-the-loop — Human involvement remains essential for meaningful resolution. While AI can explain procedures and guide users, real support often requires relational and institutional human guidance, especially for vulnerable populations facing anxiety, low literacy, or systemic bias.

      • One part of a bigger question — Systemic reform and broader approaches are needed beyond technological fixes because technology alone cannot solve deep-rooted inequities or the complexity of the legal system. Efforts should include prevention, alternative dispute resolution, and redesigning systems to prioritize just outcomes and accessibility.


Access to justice has long been framed as a problem of scarcity, with too few legal aid lawyers and insufficient funding forcing systems to be built in triage mode. Underlying this framing is the unspoken assumption that most people navigating civil legal problems would do so without meaningful help, often because their issues were not compelling or lucrative enough to justify legal representation.

This framing no longer holds, however. Legal information, once tightly controlled by legal professionals, publishers, and institutions, is now abundantly available. Large language models, search-based AI systems, and consumer-facing legal tools can explain civil procedure, identify relevant statutes, translate dense legalese into plain language, and generate step-by-step guidance in seconds.

Increasingly, self-represented litigants are actively using these tools, whether courts or legal aid organizations endorse them or not. Katherine Alteneder, principal at Access to Justice Innovation and former Director of the Self-Represented Litigation Network, notes: “This reality cannot be fully controlled, regulated out of existence, or ignored.”

And as Demetrios Karis, HFID and UX instructor at Bentley University, argues: “Withholding today’s AI tools from self-represented litigants is like withholding life-saving medicine because it has potential side effects. These systems can already help people avoid eviction, protect themselves from abuse, keep custody of their children, and understand their rights. Doing nothing is not a neutral choice.”

Thus, the central question is no longer whether technology can help self-represented litigants, but rather how it should be deployed — and with what expectations, safeguards, and institutional responsibilities.

Accuracy, error & tradeoffs

The baseline capabilities of general-purpose AI systems have advanced dramatically in a matter of months. For common use cases that self-represented litigants most likely seek — such as understanding process, identifying next steps, preparing for hearings, and locating authoritative resources — today’s frontier models routinely outperform well-funded legal chatbots developed at significant cost just a year or two ago.


The central question is no longer whether technology can help self-represented litigants, but rather how it should be deployed — and with what expectations, safeguards, and institutional responsibilities.


These performance gains raise important questions about the continued call for extensive customization to deliver basic legal information. However, performance improvements do not eliminate the need for careful design. Tom Martin, CEO and founder of LawDroid (and columnist for this blog), emphasizes that “minor tweaking” is subjective, and that grounding AI tools in high-quality sources, appropriate tone, and clear audience alignment remains essential, particularly when an organization takes responsibility and assumes liability for the tool’s voice and output.

Not surprisingly, few topics in the legal tech community generate more debate than AI accuracy, but it cannot be evaluated in isolation. Human lawyers make mistakes, static self-help materials become outdated, and informal advice from friends, family, or online forums is often wrong. Models should be evaluated against realistic alternatives, especially when the alternative is no help at all.

Off-the-shelf tools now perform surprisingly well at generating plain-language explanations, often drawing on primary law, court websites, and legal aid resources. In limited testing, inaccuracies tend to reflect misunderstandings or overgeneralizations rather than pure fabrication. While these errors are still serious, they may be easier to detect and correct with review.

Still, caution is key, in part because AI systems tend to tell people what they want to hear in order to keep them on the platform. Claudia Johnson of Western Washington University’s Law, Diversity, and Justice Center asks what an acceptable error rate is when tools are deployed to vulnerable populations, and she reminds organizations of their duty of care. Mistakes, especially those known and uncorrected, can carry legal, ethical, and liability consequences that cannot be ignored.

Knowledge bases are infrastructure, but more is needed

Vetted, purpose-built, and mission-focused solution ecosystems are emerging to fill the gap between infrastructure and problem-solving. The Justice Tech Directory from the Legal Services National Technology Assistance Project (LSNTAP) provides legal aid organizations, courts, and self-help centers with visibility into curated tools that incorporate guardrails, human review, and consumer protection in ways that general-purpose AI platforms do not.

Of course, this infrastructure does not exist in a vacuum. Indeed, these systems address the real needs of real people. While calls for human-in-the-loop systems are often framed as safeguards against technical failure, some of the most important reasons for human involvement are often relational and institutional. Even accurate information frequently fails to resolve legal problems without human support, particularly for people experiencing anxiety, shame, low literacy, or systemic bias within courts.


Not surprisingly, few topics in the legal tech community generate more debate than AI accuracy, but it cannot be evaluated in isolation.


A human in the loop can improve how self-represented litigants are treated by clerks, judges, and opposing parties. Institutional review models often provide this interaction at pre-filing document clinics, navigator-supported pipelines, and structured AI review workshops that integrate human judgment and augment human effort rather than replacing it.

Abundance and the limits of technology

Information does not automatically produce equity. Technology cannot make up for existing, persistent systemic issues, and several prominent voices caution against treating AI as a workaround for deeper system failures. Richard Schauffler of Principal Justice Solutions notes that the underlying problem with the use of AI in the legal world is that our legal process is overly complicated, mystified in jargon, inefficient, expensive, and deeply unsatisfying in terms of justice and fairness — and using AI to automate that process does not alter this fact.

Without changes at the courthouse level, upstream technological improvements may not translate into just outcomes. Bias, discrimination, and resource constraints cannot be solved by technology alone. Even perfect information from a lawyer does not equal power when structural inequities persist.

Further, abundance fundamentally changes the problem. As Alteneder notes, rather than access, the primary problem now is “governance, trust, filtering, and alignment with human values.” Similar patterns are seen in healthcare, journalism, and education. Without scaffolding, technology often widens gaps, benefiting those with greater capacity to interpret, prioritize, and act. For self-represented litigants, the most valuable support is often not answers, but navigation: what matters most now, which paths are realistic, when to escalate, and when legal action may not serve broader life needs.

Focusing solely on court-based self-help misses an opportunity to intervene earlier, especially on behalf of self-represented litigants. AI-enabled tools have the potential to identify upstream legal risk and connect people to mediation, benefits, or social services before disputes harden.


You can find more insights about how courts are managing the impact of advanced technology from our Scaling Justice series here

Scaling Justice: Unauthorized practice of law and the risk of AI over-regulation /en-us/posts/ai-in-courts/scaling-justice-unauthorized-practice-of-law/ Mon, 01 Dec 2025 19:35:29 +0000 https://blogs.thomsonreuters.com/en-us/?p=68596

Key insights:

      • Are regulations choking innovation? — Current regulatory efforts may be stifling innovation in AI-driven legal solutions, exacerbating the access to justice crisis and prioritizing lawyer business model protection over consumer needs.

      • Some safeguards already in place — Existing consumer protection laws and product liability laws already provide robust safeguards against potential AI-related harm, making it unnecessary to impose additional restrictive policies on AI-driven legal services.

      • A balanced regulatory approach is best — An approach that encourages responsible innovation, prioritizes consumer protection, and fosters a data-driven mindset can best unlock the transformative potential of AI in addressing critical gaps in access to justice.


As AI-driven legal solutions gain traction, calls for regulation have grown apace. Some are thoughtful, others ill-informed or protectionist, and many focus on the issue of unauthorized practice of law (UPL). While protecting the public is crucial, shielding the legal profession from competition is not. A large majority (92%) of low-income people currently receive no or insufficient legal assistance, and the ongoing uncertainty in the legal AI and UPL regulatory landscape is chilling innovation that could support them.

The legal profession has always struggled to provide affordable, accessible services, even as it simultaneously attempts to block those working ethically to bridge the gap with technology. Done right, legal industry regulation should balance protection with progress to avoid stifling innovation and exacerbating the access to justice crisis.

Consumer protection laws already provide robust safeguards against potential AI-related harms. Existing product liability laws and enforcement actions by state attorneys general ensure that consumers have recourse if AI legal tools cause harm. Despite these safeguards, concerns about unregulated AI filling the gaps in legal services persist.

It is time to upend the calculus of consumer harm and examine the motives behind regulation. Rather than forcing tech-based legal services to prove they cause no harm in order to avoid charges of UPL, regulators should be required to demonstrate, with data, that legal technology companies cause harm, and to weigh whether any ruling will constrain supply in the face of a catastrophic lack of access to justice.

Uneven regulatory efforts raise questions

Current regulatory efforts tend to focus on companies that directly serve legal consumers, while leaving broader AI models largely unchecked. This raises uncomfortable questions: Are we truly protecting the public, or merely constraining competition and thereby reinforcing barriers to innovation in the process?


You can find out more here


“If UPL’s purpose is protecting the public from nonlawyers giving legal advice — and if regulators define legal advice as applying law to facts — how many legal questions are asked of these Big Tech tools every day?” asks Damien Riehl, a veteran lawyer and innovator. “And if we won’t go after Big Tech, will regulators prosecute Small Legal Tech, which in turn utilizes Big Tech tools? If Big Tech isn’t violating UPL, then neither is Small Tech [by using Big Tech’s tools].”

Efforts to regulate the use of AI-based legal services are, de facto, another path to market constraint. Any attempt to regulate AI should be rooted in actual consumer experience. Justice tech companies, by definition, pursue mission-driven work to benefit consumers, but if an AI-driven tool causes harm, it should certainly be investigated and regulated. State bar associations are not waiting for harm to occur before considering regulating AI-driven legal help — and we must wonder why.

The risks of premature regulation

We must enable, not obstruct, AI-driven legal solutions and ensure that innovation remains a driving force in modernizing legal services. If restrictive policies make it difficult to develop cost-effective legal solutions, fewer consumers — particularly those with limited resources — will have access to legal assistance.

AI is developing far too quickly for a slower regulatory trajectory to keep up — any contemplated regulation would be evaluating last year’s technology, which is at best half as good as the latest iterations. Regulating AI-driven legal services now is akin to prior restraint, in which published or broadcast material is suppressed or prohibited before release because it is anticipated to cause future harm. That approach should not be applied to new technology; product liability law already allows us to look for evidence of actual harm.

By prioritizing consumers rather than lawyer business model protection, AI-enabled legal support would be monitored for potential harm with data collected and analyzed to bring to light any issues. That way, regulations could be built around that defined, data-backed harm. For instance, we might require certification protocols for privacy or security if those issues prove problematic.

Forward-thinking states are going further

In July, the Institute for the Advancement of the American Legal System (IAALS) released a new report that advocated for a phased approach to regulation, beginning with experimentation, education, and consumer protection, while gathering and evaluating data. Later phases could involve potential regulation based on what is learned. In this way, innovation is encouraged while consumer needs and public trust remain paramount.

Also this year, Colorado cut the proverbial Gordian Knot by releasing a non-prosecution policy — consistent with existing analysis of UPL complaints in the state — for AI tools focused on improving access to justice. Guiding principles include ensuring consumers have clarity about the services they receive and their limits, educating consumers on the risks inherent in relying on advice from non-lawyer sources, and including a lawyer in the loop. Utah, Washington, and Minnesota all have considered similar policies. And IAALS now is collaborating with Duke University’s Center on Law & Tech to create a toolkit and templates to make it easier for other states to adopt UPL non-prosecution or similar policies.

Yet some regulators seek the opposite, looking to define the exact types of business activity that will lead to UPL prosecution. While this framework is likely to become obsolete more quickly, it serves a similar purpose: providing clear guardrails that allow innovation to flourish while protecting consumers by clearly indicating the limitations of the software. Some proposals go further, specifically excluding tech products from UPL enforcement, provided they are accompanied by adequate disclosures that they are not a substitute for the advice of a licensed lawyer. Such policies are essential, and they can encourage entrepreneurs aiming to ameliorate the justice gap.

What’s next?

The legal and justice tech industries should aim for a regulatory framework that encourages responsible, iterative innovation — and participants should take some proactive steps, including: i) justice tech companies should participate in the discussion and share their business- and mission-focused perspectives to help shape any new regulations; and ii) regulators with internal non-prosecution policies should consider making them public to encourage entrepreneurs in their state.

These approaches would enable positive change for state residents, support overburdened legal aid organizations and courts, and foster a flourishing tech ecosystem aimed at serving unrepresented and under-represented parties.

The legal profession has not been able to ensure justice for all, making it even harder for low-income and unrepresented parties to find the help they need. Now, AI-driven legal service providers are moving forward on addressing critical gaps in access to justice.

With a measured and equitable approach to regulation that neither ignores AI’s risks nor overlooks its transformative potential, the legal industry and regulators must keep pace with today’s technology — and such efforts should not obstruct those legal providers who can bring the law closer to that ideal and help close the justice gap.


You can learn more about the challenges faced by justice tech providers here

Scaling Justice: How law schools are reimagining access to justice through technology /en-us/posts/ai-in-courts/scaling-justice-law-schools-reimagining-access-to-justice/ Fri, 25 Jul 2025 12:38:57 +0000 https://blogs.thomsonreuters.com/en-us/?p=66867

Key findings:

      • Law schools as innovation hubs — Several law schools across the US are becoming laboratories for access to justice, using technology and partnerships to develop scalable legal tools that help underserved communities.

      • Technology-driven experiential learning — Students are gaining hands-on experience by building digital tools — like document automation, chatbots, and AI-powered self-service tools — that expand legal services and improve service delivery.

      • Systemic impact on legal education — These programs offer not just innovation at the local level but present an opportunity for a structural shift in legal education and justice reform more broadly.


Law schools are stepping into a critical role as laboratories for tackling persistent access to justice challenges. As the legal system grapples with rising demand and constrained resources, many law schools are forging partnerships, launching clinics, and embracing technology to bridge the justice gap.

Experiential learning is increasingly important in the rapidly changing legal sector, and combined with technological innovation, it is helping to close justice gaps that traditional models have failed to address. A growing number of legal education programs have embraced this dual imperative, embedding students in tech-enabled, hybrid frontline legal services and having them build scalable, human-centered solutions. Students learn the law and deploy it through digital tools, virtual services, and data-informed strategies that augment their abilities and maximize impact.

Building for scale with guided tools

Many law school-based tech clinics currently leverage technology not just to serve clients, but to test new methods of delivery. Early pioneers in this space include collaborations with Pro Bono Net and schools like Chicago-Kent College of Law and Suffolk University Law School. Using the LawHelp Interactive platform, students recently built guided interviews and digital self-help forms with tools like A2J Author and HotDocs to assist unrepresented litigants navigating thorny legal issues such as divorce, eviction, and domestic violence.

These tools — designed to be used independently by non-lawyers — exemplify a modern variation on experiential education in which students create scalable, public-facing tools with measurable impact.

For example, Western New England School of Law’s Center for Social Justice uses document automation and online intake portals to provide services for criminal record expungement and LGBTQ+ legal support. Students provide targeted services at scale, reaching individuals who may otherwise be excluded from traditional legal systems due to transportation, financial, or cultural barriers.


As the legal system grapples with rising demand and constrained resources, many law schools are forging partnerships, launching clinics, and embracing technology to bridge the justice gap.


At Suffolk University’s Legal Innovation and Technology Clinic, law students collaborate with legal aid organizations and courts to develop scalable solutions for civil legal issues. Students are taught project management and computer programming to create powerful tools that directly assist unrepresented parties, including online guided interviews that help people navigate court processes.

Ohio State University’s Justice Tech Practicum brings together law students and computer science students to design, build, test, and refine technologies aimed at addressing access to justice issues. The program is currently working with the Self-Help Center of Franklin County to develop tools for tenants facing eviction.

Tech clinics as justice design labs

Law school clinics are increasingly functioning as innovation labs for system-level design — incubating, testing, and improving justice tools in real time. Suffolk University’s Legal Innovation and Technology Clinic is partnering with the American Arbitration Association to pilot tech-driven approaches to low-contest divorces and family law matters in Massachusetts. Students help design and test accessible digital tools that streamline dispute resolution processes.

At the University of Arizona, the Innovation for Justice program collaborated with the Alaska Legal Services Corporation to improve Benefactor, a digital tool that helps guide case managers, social workers, and community navigators through the Social Security disability application process. The Arizona UX for Justice team delivered a human-centered design roadmap for product refinement, legal empowerment, and broader implementation.

Legal Aid of North Carolina (LANC) is also tapping into the power of collaborative tech development. Through its Innovation Lab, LANC worked with law students from Duke and Vanderbilt universities to develop and refine its Legal Information Answers chatbot, with students conducting audits and user testing to optimize the client experience. Vanderbilt students also analyzed LANC lawyer workflows to identify how AI tools might improve staff effectiveness.

AI and data-driven legal empowerment

Law schools are also leveraging AI to improve access to justice, transparency, and user experience. VAILL, the Vanderbilt AI Law Lab, is collaborating with lawyers from the Legal Aid Society of Middle Tennessee, Vanderbilt Data Science Institute (DSI) students and staff, and courts to create and implement Day in Court, a tool to help unrepresented parties navigate court appearances successfully. The pilot will initially focus on small claims matters and provide a platform that can be replicated in other jurisdictions. VAILL and DSI also collaborated to create an advance directive tool, powered by generative AI (GenAI), for Tennesseans, a technology that will serve as the model for a future suite of self-service life planning tools.


Law school clinics are increasingly functioning as innovation labs for system-level design — incubating, testing, and improving justice tools in real time.


The Stanford Legal Design Lab engages students in service design, user research, and AI strategy in partnership with public interest organizations like Legal Services Corporation, the American Bar Association, Los Angeles courts, and legal aid groups around the country. Students conduct community interviews, run workshops, and develop accountability frameworks for AI-powered justice. In one project, Stanford students collaborated with the NAACP to refine and scale their Housing Navigator, an eviction prevention pilot that can help tenants navigate housing instability.

Implications for legal education

These are just a few examples of law school programs that are reimagining the student’s role not simply as a temporary service provider, but as a developer of justice infrastructure. What distinguishes these programs is not just innovation at the local level, but the structural insight they offer for legal education and justice reform more broadly. They suggest a model in which:

      • experiential learning is centered on service delivery;
      • technology is now integral, not peripheral, to legal education; and
      • students contribute to lasting justice infrastructure, not just one-time interventions.

These models also underscore the power of public-private collaboration. As more courts digitize their services, these student-built tools are increasingly integrated into formal legal processes. Organizations can expand their capacity without proportional increases in cost or headcount, while advancing digital literacy among students and clients alike. Law schools, in this context, are neither isolated nor purely academic — they are collaborators and facilitators in a broader ecosystem of justice.

As legal needs intensify amid economic and social strain, these programs offer more than isolated success stories. They present a blueprint for rethinking how legal services are delivered and who delivers them. By treating law students not only as future lawyers but also as present contributors — and by equipping them with technology to do so efficiently — these initiatives are helping to shift the access-to-justice paradigm from one of scarcity to one of scalability.

For policymakers, educators, and legal professionals, the message is clear: innovation doesn’t have to wait for graduation. It’s happening now — in clinics, classrooms, and cloud-based platforms — where tomorrow’s lawyers are building the infrastructure for a more just and accessible legal system today.


You can find more articles in our Scaling Justice series here

Scaling justice: How AI and ADR are reshaping legal access /en-us/posts/ai-in-courts/scaling-justice-ai-adr-reshaping-legal-access/ Mon, 09 Jun 2025 18:44:53 +0000 https://blogs.thomsonreuters.com/en-us/?p=66221

This article is part of an ongoing series titled Scaling Justice, by Maya Markovich and others in consultation with the Thomson Reuters Institute. This series aims to not only explore how justice technology fits within the modern legal system, but how technology companies themselves can scale as businesses while maintaining their access to justice mission.


Court systems worldwide are buckling under their own weight, backlogged and burdened with expensive processes. As the volume of civil disputes outpaces the capacity of traditional legal infrastructure and services to manage them, a new legal operating system is emerging, powered by AI and built to serve all people with legal problems.

The goal of this transformation is to resolve millions more disputes efficiently, affordably, and equitably. From legacy institutions to tech startups, stakeholders across the legal ecosystem are recognizing that the path forward requires more than just smarter tools — it demands a structural overhaul.

The civil justice bottleneck

Civil disputes have long presented a crisis of accessibility. In the United States, millions of civil litigants must navigate the legal system without representation. In 2022, the Legal Services Corporation found that 92% of low-income Americans received inadequate or no legal help for their civil legal problems, up from 86% in 2017. Debt collection cases account for more than one-quarter of civil dockets, and defendants are unrepresented in more than 90% of those cases. In state courts, 75% of civil cases — representing more than 130 million cases per year — involve at least one unrepresented party. The majority of these cases involve such serious life events as eviction, debt collection, family law, and consumer claims.

This market failure affects not only the least financially stable among us, but also those just above the line, small businesses, and more. And it’s not just the outcomes of these cases that impact those involved. Indeed, the day-to-day requirements of dealing with a legal matter for most people — taking time off work, making appearances, funding representation — all contribute to the ongoing stress and negative economic impact on the daily lives of the individuals and families involved.

The root of this crisis is not only a lack of legal aid funding or the impossibility of fielding enough legal professionals to meet the need; it is also the architecture of the system itself. The processes, timelines, and costs of traditional litigation are ill-suited for the volume and nature of modern, often data-intensive disputes.

The National Center for State Courts (NCSC) reports that only one-third (33%) of Americans believe state courts provide equal justice to all, regardless of race or income. This perception is rooted in barriers to access, opaque processes, and the overwhelming complexity of legal proceedings for those without proper representation. However, when a large portion of a country’s citizens see their legal system as inaccessible or biased, its legitimacy — and the public’s willingness to rely upon it — is challenged.

ADR and AI: Expanding the reach of legal remedies

Alternative dispute resolution (ADR) has long offered a more accessible, efficient, and cost-effective path to justice compared to traditional court proceedings. By removing many procedural and financial barriers that deter individuals from pursuing legal remedies, ADR can serve as a vital tool for closing the justice gap. Mediation and arbitration processes are typically faster, less adversarial, and more flexible, allowing for solutions tailored to reflect the real-world needs of the parties involved.

When combined with technology, ADR can reach broader populations by reducing reliance on physical court infrastructure, streamlining documentation, and enabling participation from any location. If implemented equitably, ADR can shift the justice system from a reactive, high-cost mechanism to an inclusive service that meets people where they are in every sense.

To address this gap, legal innovators are turning to AI to reimagine how disputes can be resolved. Rather than simply layering AI onto outdated workflows, some organizations are designing systems from the ground up, in which AI is not merely a feature but the foundation. These AI-native platforms can automate nearly every step in a dispute, from intake and record generation to decision issuance and appeals. Crucially, they are built with adaptability in mind. Users may choose to resolve certain disputes through fully automated processes while preserving the option for human oversight in more complex or sensitive cases.

Companies like eBay, PayPal, and Amazon use AI to route routine claims tied to delivery delays or minor warranty disputes through automated systems that can deliver resolutions within minutes or escalate disputes to a human decision-maker. Those involved in higher-stakes conflicts might opt for traditional arbitration with AI assisting in evidence organization and process management. This flexibility allows for a truly customizable “choose your own resolution” experience that enables exponentially more disputes to be resolved.

Unlocking access to justice

The legal sector is beginning to see increased investment in ADR technologies. Courts increasingly understand the benefit of private systems that resolve matters outside the courtroom and relieve overburdened dockets.

Notably, many of these efforts are emerging from nonprofit or mission-driven justice tech companies that aim to expand access to justice with a sustainable business model. This ethos, combined with open collaboration and public iteration, will give the emerging tech-powered legal operating system its staying power. Further, the rise of AI-powered tools to meet the need for dispute resolution at scale offers the opportunity to not only modernize the legal system but also dramatically expand access to justice. For millions of individuals and small businesses priced out of legal representation or overwhelmed by procedural complexity, automated or hybrid systems that can offer a faster, more accessible, and less intimidating path to resolution are a critical fix.

On its own, the unguided development of automated processes will not narrow the justice gap. However, if thoughtfully designed, these tools can help correct long-standing inequities — streamlining claims processes for unrepresented parties, demystifying legal language, and ensuring that outcomes are not dependent on a party’s resources or familiarity with the legal system.

We can now build legal infrastructure that truly scales to meet the needs of the public. Realizing this vision will require deliberate choices, and key questions remain, such as:

      • How do we ensure these systems are inclusive and accessible to people across socioeconomic, linguistic, and cultural divides?
      • What standards will govern fairness and due process in automated environments?
      • How can we make transparency and accountability inherent and necessary features?
      • What role should public institutions play in shaping and overseeing these private resolution platforms?

The future of the civil justice system lies not only in technical advances, but in a new vision of who the system serves. If guided by equity and integrity, with the goals of fairness at scale, AI has the potential to not only optimize justice but extend it to all.


You can read more posts in our Scaling Justice series here

Scaling Justice: Bridging the justice gap with advanced technology /en-us/posts/ai-in-courts/scaling-justice-bridging-justice-gap/ Fri, 02 May 2025 14:14:02 +0000 https://blogs.thomsonreuters.com/en-us/?p=65687

This article is part of an ongoing series titled Scaling Justice, by Maya Markovich and others in consultation with the Thomson Reuters Institute. This series aims to not only explore how justice technology fits within the modern legal system, but how technology companies themselves can scale as businesses while maintaining their access to justice mission.


Millions of people worldwide face barriers when seeking legal help, and this justice gap disproportionately affects low- to middle-income individuals and members of historically excluded communities.

The United States ranks 107th of 142 countries in affordability of legal support, and approximately 92% of low-income individuals receive inadequate or no legal assistance for their civil legal problems. Worse yet, in 75% to 95% of civil cases, at least one party is unrepresented, leaving more than 120 million people each year navigating the US legal system without support. And these broad numbers mask the racial and socioeconomic disparities that pervade the legal system, resulting in unjust outcomes and an overrepresentation of those without access to legal services in the justice system.

Moreover, our criminal and civil justice systems feed into each other in a negative loop for many people. An unpaid fine can lead to crushing debt and criminal liability, while the wait for public representation for a criminal offense can prevent a person from dealing with life-changing personal issues like maintaining housing, employment, or financial stability. The ripple effect of this extends beyond individuals to impact families, communities, and entire demographics.

Factors contributing to the justice gap

Certain societal factors, both past and present, contribute to the continued widening of the justice gap, including:

Economic disparities

Financial constraints are one of the most significant barriers to accessing justice. Attorney fees and court costs make legal representation inaccessible to most of those who need it, leaving them no alternative but to navigate complex legal issues alone. Financial burdens associated with legal disputes can also deter people from filing, defending, or following through with legitimate legal claims.

Implicit bias within the legal system

Implicit bias refers to unconscious attitudes and stereotypes that can affect decision-making within the legal system. Indeed, marginalized groups such as ethnic minorities, women, and immigrants often face systemic discrimination in both civil and criminal justice systems across the world. This bias can lead to unequal treatment, harsher penalties, and diminished trust in the legal system.

In addition, an entrenched bias towards pro se litigants impacts their experience of the legal system and can also influence the outcome. Unrepresented parties have reported that even when their documents are flawless, the statute is clear, and a letter citing the relevant law is included in their filings, some judges, prosecutors, and clerks assume they are incorrect when the other side is represented by counsel. When those represented by legal counsel are presumed more likely to have meritorious claims, unrepresented parties are denied justice. And while pro bono services and legal aid organizations are powerful drivers of justice equity, they are chronically under-resourced and overwhelmed.

Geographic limitations and digital divides

Geographic location can significantly impact access to legal services, especially for individuals who live in remote or underserved areas. These so-called legal deserts have limited availability of legal professionals and court facilities, often forcing community members to expend significant resources to travel for legal assistance. The digital divide further complicates these challenges — individuals with unreliable internet access or lower digital literacy are often restricted in their ability to access support for their legal problems.

How tech can bridge access to justice

Fortunately, justice tech — which encompasses an expansive array of digital tools that include online legal platforms, document automation software, virtual courtrooms, AI-powered legal assistance, and much more — has emerged as a way to address these disparities by helping expand access to legal resources, streamline processes, and improve outcomes.

In addition, several specific legal areas can be well served by justice tech tools:

      • Civil justice — document-preparation tools can help individuals navigate issues including family law, tenant rights, and small claims cases;

      • Criminal justice — litigants can access digital solutions that improve interactions with law enforcement, offer support for incarcerated individuals, and facilitate post-incarceration reintegration; and

      • Family and estate law — digital platforms can transform how individuals navigate matters such as divorce, child custody, bankruptcy, and trust management.

Other tools also offer comprehensive litigation support for unrepresented litigants to help manage their cases through guided legal education, document preparation, and case strategy. And some digital platforms support entrepreneurs, immigrants, and civil rights advocates by providing legal information, compliance tools, and resources for addressing discrimination and harassment.

This explosion of justice tech tools and platforms even offers the opportunity to reduce the likelihood of recidivism by connecting returning citizens with training, employment, housing, and other services — or by streamlining expungement and record-sealing to help users overcome legal obstacles that often hinder employment, housing, and reintegration.

Conclusion

The justice tech sector is actively transforming outdated and costly legal systems, while helping individuals overcome financial, geographic, and systemic barriers to level the playing field. AI-driven solutions can provide more affordable and streamlined legal support — provided that AI in a legal context is developed and delivered with a laser focus on consumer benefit, mitigating consumer harm, and ensuring transparent and unbiased results.

In this way, justice tech is not just an instrument for efficiency. It also represents a fundamental shift in our approach to legal access and a true alternative to the status quo. With a culture of innovation, mission focus, and accountability, justice tech can and should be part of the solution for a more accessible and fair legal system.


You can find out more about the impact of justice tech here

Scaling Justice: Breaking through global regulatory roadblocks for increased justice equity /en-us/posts/ai-in-courts/scaling-justice-breaking-through-roadblocks/ Wed, 16 Apr 2025 13:19:29 +0000 https://blogs.thomsonreuters.com/en-us/?p=65528

This article is part of an ongoing series by Maya Markovich and others in consultation with the Thomson Reuters Institute. This series aims not only to explore how justice technology fits within the modern legal system, but also how technology companies themselves can scale as businesses while maintaining their access to justice mission.


Despite the historic domination of business law in the field of legal tech, the industry has begun to include access to justice in discussions of tech application, entrepreneurship, and investment in solutions that aim to narrow the justice gap through technology. The opportunity is vast, with a total addressable market of 5 billion people worldwide.

Like any nascent sector, startups on the crest of this breaking wave of justice tech must work through a lack of market awareness and stakeholder inertia toward new models. They also face a gauntlet of regulatory obstacles in an antiquated profession that is reluctant to release its monopoly on the market. These restrictions hamper progress toward universal access to justice. With the advent of widely available AI tools, however, technology can and should be part of the solution, and clearing roadblocks to scale innovation in access to justice is essential.

Legal advice and services in the US and the UK

Regulations differ across regions, of course, but often have the same effect of hampering progress and market growth. In the United States, for example, the American Bar Association’s Model Rule of Professional Conduct 5.5 prohibits the unauthorized practice of law, precluding those who are not licensed attorneys from delivering legal services. Each of the 50 states has adopted some form of this rule, but there is no bright line between providing legal information and providing legal advice, leaving those who help pro se litigants to make their best guess.

Given the scale of the access to justice gap, lawyers alone cannot narrow it. Not only will there never be enough lawyers in the US to represent everyone who needs help with a legal issue, but it also does not make economic sense for them to take on clients in certain types of matters. Pro bono and legal aid organizations are doing incredible work, but they are under-resourced and overwhelmed. As tech tools reach true viability in the legal sector, the profession needs clarity on what constitutes unauthorized practice of law and where and how technology can fit in to address the access issue.

In the United Kingdom, legal advice and services are similarly regulated. Legal advice is defined as guidance on legal rights and obligations, while reserved legal services include activities such as litigation and conveyancing (property transfer), which can only be performed by authorized professionals. This regulatory framework, while designed to ensure quality of legal services, also limits their accessibility to those who can afford them.

Clients often face overwhelming legal jargon and documentation, with no support for implementation or follow-up once a case concludes. This lack of transparency and continuity further alienates individuals seeking resolution to their legal problems. Moreover, costs can spiral without a clear ceiling, often leaving clients worse off than they were before they engaged legal services.

The UK and US have thriving legal tech communities, yet their impact on access to justice has been limited. Current first touchpoints for legal advice, such as GOV.UK and Citizens Advice in the UK and local bar associations in the US, simply direct users to lawyers rather than offering self-serve solutions. Collaboration is essential: government bodies, courts, legal aid organizations, and lawyers should work with justice tech companies to harness the potential of generative AI (GenAI) to promote access to justice.

Exploratory frameworks can go beyond just providing innovators with access to regulators for Q&As and instead provide a platform for joint learning, user testing, and feedback-gathering that will enable evidence-based policy changes. This approach not only fosters trust and understanding but also paves the way for AI-driven legal services to gain the recognition needed to transform the sector.

The knotty state of current regulations

Without regulatory change, direct-to-consumer justice tech companies cannot help the millions of Americans who need it — because they are stuck. Indeed, some of the challenges that justice tech companies face are very well laid out in New York, especially around outdated rules of professional responsibility, which include hurdles such as:

      • vagueness of unauthorized practice of law statutes and revenue-sharing restrictions;
      • the inability to hire lawyers directly to provide legal advice;
      • the fact that, with very limited exceptions, only law firms can collect fees directly from clients seeking legal advice; anything else is considered “fee splitting” or revenue sharing, which is prohibited;
      • the fact that companies cannot hire paralegals with decades of experience to give any sort of public-facing advice to consumers unless the paralegals are supervised by a lawyer in a firm — yet a lawyer with zero years of experience in an area of law supervising a paralegal within a firm structure is allowed to deliver the same advice; and
      • the constant threat of lawsuits for the unauthorized practice of law, which are threatened monthly and can easily bankrupt a startup.

These outmoded regulations governing the unauthorized practice of law and revenue sharing are standing in the way of ameliorating the access to justice crisis. And while their underlying principles are incredibly important, there are no clear standards. Unauthorized practice of law determinations are often subjective, and they are overseen by bar groups that are frequently set up to protect lawyers’ market share, not the consumer.

The potential of GenAI

There is hope, however, in advanced technology. GenAI, particularly large language models (LLMs), holds much promise for the legal sector. By 2026, it is anticipated that 80% of legal cases will involve GenAI, significantly reducing time and resources. Further, GenAI has the potential to democratize access to justice by making legal advice affordable, accessible, and equitable. This scalability could provide exceptional value, far surpassing any incremental increases in the legal aid budget.

The UK and US stand at a similar crossroads, with the opportunity to lead the integration of GenAI into legal services and create a more accessible, affordable, and equitable legal system. The advent of GenAI presents a chance to build upon alternative business models, transforming legal services in a way that benefits billions of consumers rather than merely increasing margins for law firms.

To effectively promote adoption and impact, however, regulators should transition from merely facilitating advocacy to instead establishing regulatory sandboxes. These sandboxes would allow consumers to safely experiment with new technologies, while innovators would gain visibility and a platform to build trust. Meanwhile, regulators could collect feedback and evidence from real users to guide their policy changes.

With the demand for legal access outstripping the supply of professionals, GenAI has the potential to transform justice tech by ensuring its applications are affordable, available at all hours, and equitable for every user. It is also critical to the future of the legal profession, which is desperately in need of new and creative ways to adapt to and survive in our changing world.


You can find out more about the impact of justice tech here

Scaling Justice: How partnerships between lawyers and justice tech companies can drive opportunity /en-us/posts/ai-in-courts/partnerships-between-lawyers-justice-tech/ Thu, 20 Feb 2025 14:45:21 +0000 https://blogs.thomsonreuters.com/en-us/?p=64901

This article is part of an ongoing series by Maya Markovich and others in consultation with the Thomson Reuters Institute. This series aims not only to explore how justice technology fits within the modern legal system, but also how technology companies themselves can scale as businesses while maintaining their access to justice mission.


Ethical direct-to-consumer legal tech advancements are a win-win-win for lawyers, founders, and consumers. They hold particular promise for lawyers in small to midsize law firms, for whom working with technology partners can offer unique and transformative business growth opportunities.

From lowering overhead and increasing business success to elevating equity and expanding lawyers’ reach, collaboration with justice tech can offer lawyers significant competitive advantages as new realities redefine the profession. Not inconsequentially, it can also provide lawyers with the opportunity to do well while doing good.

A new, hybrid approach

Billions of people worldwide lack access to justice — a clear indicator that many people find accessing the legal system challenging, if not impossible. Many attempt to resolve life-altering problems in court on their own and often experience deeply unequal outcomes. To address this justice gap, justice tech firms have been building solutions for the public — and forging partnerships with lawyers and their firms is one way to bring these solutions further forward.


Justice tech offerings can successfully help bridge the justice gap in a few ways, including by placing an emphasis on transparency and by educating the user about their legal rights, options, and the process. And these platforms can do this in multiple languages and modalities to empower all users, and reduce the overwhelming confusion of the process, thus facilitating good decision-making.

In addition, justice tech offerings can allow for the automation of legal forms and provide a centralized dashboard or app to manage processes, pending tasks, and more. Often, the apps offer immediate updates and 24/7 access to accommodate user availability, even in legal deserts. They can also offer real-time support when a lawyer is needed, replicating the success of telehealth models, and allow for affordable, transparent fees for routine legal needs.

Finally, these tech offerings are now leveraging AI to better facilitate initial client intake, summarize rules, statutes, and case law, help with fact pattern analysis, provide quick answers to common legal questions, and suggest search terms.

An untapped opportunity

For investors and entrepreneurs, the justice gap is a massive, largely untapped opportunity: those billions of people globally who cannot access their rights represent the total addressable market in a field that critically needs innovation.

Legal professionals are also increasingly aware that, with the recent and impending waves of transformation, their own industry is no longer lawyer-centric. These changes present an opportunity for them to join with justice tech firms and capitalize on new business opportunities. Forward-thinking lawyers and justice tech companies share a collective goal: combining technology and legal expertise to make the law more accessible at scale.

Efforts to make the law more accessible have been underway for decades, but until a few years ago, most people had only heard of a few direct-to-consumer legal tech companies like LegalZoom, Rocket Lawyer, or Pro Bono Net. Venture capital was initially hesitant, and regulatory constraints were not yet tested. Now, however, recognizable direct-to-consumer brands serve thousands, and some of these justice tech entrepreneurs make access to justice their core mission.


One justice tech company exemplifies this mission-focused hybrid model by offering members of its pro se community the option to engage a licensed attorney in their area for limited-scope services, including legal advice or pre-filing document review. Another enables its users to complete their divorce procedures using proprietary tech alone or by adding on-demand legal, wellness, and financial services. And TurnSignl, a platform that quickly and efficiently connects a driver to a lawyer during traffic stops or accidents, makes legal help accessible at the moment of need.

These companies and more like them empower more people at the beginning of their issues through scalable and consumer-friendly tools to simplify court bureaucracy and translate complex legal concepts into plain language. They also offer seamless partnering with lawyers when the situation becomes too complicated for the user to handle on their own. And this enables lawyers to do what they do best — problem-solving and legal representation — without administrative overhead.

Why should lawyers partner with justice tech companies?

Justice tech-lawyer partnerships establish a sustainable framework that increases the public’s ability to access their rights while still preserving lawyers’ vital role. These collaborations can be especially important to solo and small to midsize law firm practices in several ways, including:

Driving profitability with new revenue streams — Lawyers working with justice tech companies can drive firm profitability through multiple new revenue streams and supplement their existing practice during slower times. Providing limited-scope services such as document review, legal advice, and negotiation strategy to clients can lead to full representation without requiring intake processes, scheduling, or business development investment.

Increasing business success — Partnership makes it easier for prospective clients to find representation via referrals and virtual marketplaces, allowing lawyers to build their brand and digital presence through co-marketing opportunities that tap into justice tech companies’ increasing brand recognition. Justice tech company partners can also market on lawyers’ behalf, answer customer service questions, and handle billing inquiries, which would allow lawyers to focus on the more rewarding work of counseling clients, digging into legal issues, and helping people navigate their legal problems.

Lowering overhead — Working with justice tech companies enables law firm leaders to outsource many pain points of managing a law firm and clients. Some justice tech companies handle appointment scheduling, billing, and administrative aspects of a legal practice, reducing the non-legal services workload for participating attorneys and ensuring prompt payment for services.

Offering a competitive advantage — In a rapidly evolving industry, lawyers who embrace tech-led solutions to maximize their value are enhancing their practices by differentiating themselves and setting new standards for client satisfaction. And since justice tech companies are customer-obsessed, collaborating with them enables lawyers to benefit from that consumer satisfaction and trust by association.

As technology challenges law firms’ traditional billable hour model, the legal industry is at an inflection point. Indeed, as the access to justice crisis expands, fewer clients will be willing or able to pay for traditional legal representation. In response, those in the legal profession need to rethink how they can best deliver legal services in order to remain relevant and profitable while refocusing on their ethical obligation to ensure they are helping, not hindering, public access to justice.


You can find out more about the impact of justice tech here
