The internet has been abuzz over a recent technological innovation: an Artificial Intelligence (AI)-powered entity set to defend a litigant in a United States court in a traffic violation case.
While Joshua Browder’s DoNotPay application has turned many heads, it remains to be seen how effective the AI-powered entity will be in defending litigants.
Bar & Bench’s Jelsyna Chacko spoke to technology law experts in academia and legal practice to gain perspectives on the potential of such a development in the Indian context.
The tool is assisting legal defence rather than performing the legal defence
Ameen Jauhar, Team Lead - Applied Law & Technology Research (ALTR) at Vidhi Centre for Law and Policy, clarified that the AI tool that is being deployed is an algorithm and not a robot per se.
"There is a difference between algorithms and robots. We tend to classify everything as the same. The current product is an app with an underlying algorithm. This algorithm is built as a large language model and is a generative algorithm (i.e. it can understand natural human language and produce outputs similarly in human language).
As a large language model, it will be able to hear arguments in court and generate textual responses to the same which may be used by a lawyer or a defendant while presenting oral arguments.”
Contextualizing it in the Indian scenario, Jauhar cited the example of a briefing counsel’s assistance to a senior counsel.
“A briefing counsel who is better versed with the facts or the case laws that are relevant to the proposition being argued, assists a senior counsel and keeps whispering or writing down those inputs. That is essentially what this tool is going to do. So yes, that is very much possible in an Indian courtroom, theoretically speaking. There is no bar against lawyers or litigators using such tools in India, at least not yet."
Pondering the practicality of an AI-powered app assisting in courtroom proceedings in India, Jauhar said,
“It will require the algorithm to be trained in Indian case laws, Indian jurisprudence, etc. Also it needs to be affordable. Right now, it is priced at $36 for 3 months in the US. For a law firm lawyer, it may be affordable, but for every litigating lawyer, it may not be. What I understand is that this app is being run by a corporation which is trying this kind of large language model to assist people in their legal defence across different kinds of cases. They’re starting with traffic violation cases, which is a safe bet.”
On the aspect of whether such a technology could replace lawyers in the future, Jauhar clarified the role of the AI tool and the prerequisites that need to be considered before such a technology is adopted in Indian courts.
“There’s a big series of events that needs to happen for bringing such technology in the Indian context. I’m not saying that this kind of technology cannot be a proof of concept that maybe 30-40 years down the line, you could have these kinds of algorithms actually being deployed in physical form, where you don’t need a human interface to communicate the arguments to the justice system. Now that is contingent on a lot of other things, not just technological innovation…
...Today, this can’t be the case, because under most jurisdictions, a legal representative needs to be someone enrolled in a Bar Council. This tool is not representing a client, it is aiding representation to a defendant in a traffic court with access to easily understandable information, legal jargon, etc. In most traffic courts, it is not really lawyers who go and appear. It is typically the defendants themselves…
...What remains to be seen is how effective the tool is, because ultimately it will be imparting certain information to the defendant. How well the defendant takes in that information and then further articulates, is something that is fully contingent on the defendant… What needs to be said is that the tool is assisting legal defence rather than doing or performing the legal defence,” he added.
While Joshua Browder has voluntarily taken on the liability to cover the costs of the defendant in case they lose the case, it is pertinent to question what potential legal liabilities could ensue in the Indian context. On this aspect, Jauhar emphasized the importance of contextualizing the role of the tool to the type of cases it is providing assistance on, and the resultant harm that could potentially be caused in case of an error.
“The argument of liability stems fundamentally from what role the tool is performing. If it’s just an application that is assisting in a way, the best way to deal with it would be under a product liability case. If the product itself is faulty, then like in conventional product liability jurisprudence, you should hold the manufacturer liable, provided you prove certain things and establish that fault. I would contextualize this argument of liability on a case-to-case basis, because this standard cannot remain the same.
An algorithm that is providing defence assistance for a traffic ticket versus an algorithm that provides defence assistance to someone accused of domestic violence and can potentially end up in jail; will be very different…since the potential for harm will be very different. The basic discourse on AI regulation is moving away from standardising liability and instead evaluating it on a use case basis.
Coming back to India, there are currently no laws against this, so you will have to go under product liability laws.”
On the aspect of privacy violations when a third-party application is involved in legal proceedings, Jauhar stressed the necessity of robust data protection laws regulating how such information is used and the purposes for which it is retained or processed.
“This will be like any other app that collects information… The way it will be protected is through data protection laws...It should only be accessible through consent and frameworks. Thereafter comes the purpose of processing such information. For instance, it is utilizing such information to create a profile and understand why someone jumped a traffic light. For that, the user puts in the information of the basic facts…The processing of such information should be limited to the function that needs to be performed. That is exactly where legislation is imperative.”
The way we see lawyers is going to dramatically change over the next 30-50 years
Speaking to Bar & Bench, Namit Oberoy, Founder of Indian Legal Tech, compared the use-case of the DoNotPay application to the AI-powered research tools used by lawyers.
“If you see any of the AI powered tools lawyers are using today, they are helping to predict what kind of cases are most likely to apply to your situation and the more context you give it, the more accurate it is. It is the exact same process a human is also doing… So that part, AI-enabled research and AI-enabled argumentation is already happening, not in the future, but today.”
“Today, technology exists to write basic petitions 100 times faster than a human can while being as accurate… When it comes to legal reasoning, if the same AI is also trained on legal text, say Motor Vehicles Act and some 10,000 cases related to it, then it can build those legal reasons. If you see ChatGPT, it is already doing that. It is not trained specifically on a legal model, but once you do that, it can make a legal argument also and it can do very well. After that, what the human ends up doing, what we call human-in-the-loop technology, is just supervising. That’s it.
We’re doing that in research already. In India we will start using it on reasoning. In larger law firms, there will still be a lot of restrictions and some scepticism, but in smaller practices, you’ll start seeing it. The only difference when it comes to applying it in India is that Indian courts are very risk-averse. Lawyers are also risk-averse. They’re not going to openly admit majority of their case was generated by an AI."
Commenting on the affordability of AI-powered technology like DoNotPay for smaller law firms and legal practitioners in India, Oberoy emphasized the larger picture of efficiency and cost-effectiveness when investing in an AI-powered tool.
“Let’s say you take the assistance of a chatbot to understand your rights and entitlements. The chatbot will help you with this information in a fraction of a second. There’s a certain cost to this piece of knowledge which is ‘x’. The alternative that exists today is that one approaches a lawyer and ends up paying 5x or 10x for legal advice. AI is relatively very cheap to you. If you have your own data, you can train on top of it…
...For a smaller law practice, their need is not one-time. If they have a specific expertise, say negotiable instruments, then they can easily train the AI for that. The cost, 2 years or 5 years or even 10 years down the road, the associate for whom you’re going to pay ₹50,000 a month, is going to do the same job as the AI, but 20-30 times less efficiently.
I think the discussion of cost has to be relative… A company like DoNotPay will not come to India at rates of $30-40. They’ll probably launch it for ₹500/month or so.”
On the acceptance and adaptability of the Indian legal fraternity to a third party technology such as DoNotPay’s in legal proceedings, Oberoy said that the challenge was more on the lines of socio-cultural acceptance than technology.
“I think the challenge here is not technological… Once the AI has context and semantic memory, then it’s only a question of whether it is happening in audio or text. So, the challenge is not technological at all. The challenge is cultural, it’s social and India is a very risk-averse environment. The Indian legal industry has a lot more structural barriers, which is why whenever a new legal technology solution is created and on the other side is a consumer, not a law firm, it’s always going to face backlash.”
Speaking on the potential privacy breaches that could occur with the use of AI-powered tools getting access to private data, Oberoy said,
“Things become a little complicated when a law firm uses OpenAI. They decide to train the AI model with the 50,000-odd cases they’ve done till date… The next time they’re creating a contract, the contract is not just optimized based on OpenAI’s own capabilities, it also learns from the previous contracts the firm has made. That’s when a law firm needs to think about whether it’s safe, because if all of their client data goes out, it breaches confidentiality. These situations are where law firms will need to think a little bit more.”
Addressing the question of whether such technology can jeopardize future generations of legal professionals in India, Oberoy said,
"The way we see lawyers is going to dramatically change over the next 30-50 years. In fact, legal work is not going to be similar. It’s not a bad thing, but old structures will have to make way for the new ones. That doesn’t mean AI is going to replace humans or AI tech is going to replace lawyers. It’s that lawyers who are leveraging AI to innovate are going to replace lawyers who are not leveraging AI…
...An AI cannot start a business, or start a practice. It’s a forward-looking lawyer who now has a sea of possibilities, who can think, 'Why are people doing this? I can provide this at one-tenth to my client and can serve 1,000 clients at a time.' That’s the person who becomes the new breed of lawyers. Today, that person will be an exception, 10 years down the road those people will be few, at some point in the future they’ll be the majority…”
We need to embrace technology, but with caution and responsibility
Supreme Court advocate and Founder of CyberSaathi, an initiative focused on cybersecurity in the digital space, NS Nappinai said,
“I believe that whilst AI is an effective tool particularly for process-driven work, the same would not apply to something like the practice of law or arguing in courts, as the same requires multiple skill sets and emotional quotient.
Advocacy is what inspired practitioners of my time and whilst it may be a dying art form in today’s practice regime, it still holds a critical place, particularly in litigation.
We cannot eschew technology and need to embrace it, but we have to do so with caution and responsibility. Using AI in law carries the same two warning signs.”
Commenting on the aspect of liability, Nappinai said,
“Ideally, there would be liability on the manufacturer and the software developers. However, recent case laws from the US have indicated otherwise, where the owners have instead been mulcted with liability. The trend appears to indicate that no robot can or ought to be left to autonomously act, but that human intervention would always be needed.”
No threat in the near future; an AI robot in court will take many years
Former Supreme Court judge, Justice Madan Lokur expressed his eagerness to witness the DoNotPay proceedings through live-streaming. He said,
“It is too early to say whether something similar will come in India. The hearing will take place next month in the US. I hope it is live-streamed so that we can witness a "historic" trial. Getting a robot to argue a case is not at all easy. That’s why the trial next month is for the simplest of cases, namely, a traffic offence.”
While many considerations need to be taken into account when speculating on the potential integration of AI-powered lawyers in the Indian context, it will be interesting to see how far-reaching an impact DoNotPay has in the global scenario.