
Robotic Automation in Legal Practice: Lola v. Skadden

Robotic automation in legal practice has always been a topic of wide discussion, but the analysis has been more theoretical than practical. This is because the legal service industry is considered rigid, and that rigidity lies in the fact that it is a self-regulated profession, or, arguably, a self-preserving monopoly. In the American context, the American Bar Association (‘ABA’), through its professional and ethical guidelines, makes the legal profession a rigid system heavily monopolised by lawyers, leaving a non-lawyer or non-member of the bar with almost no room for involvement in legal practice.


In this light, a matter before the United States Court of Appeals for the Second Circuit (‘Court’), Lola v. Skadden, Arps, Slate, Meagher & Flom (‘Lola Case’), has had a long-term impact on the workings of the legal industry. In this matter, the Court distinguished between work that can be done by a machine and work that must be done by a human lawyer, concluding that any work that can be done by a machine should not fall within the definition of ‘practice of law’ merely because it happens to be done by a human lawyer, as such work does not involve any independent legal judgement or discretion.


Therefore, work such as ensuring documents are free of typographical or grammatical errors, drafting briefs, preparing case documents and the like cannot be considered the practice of law, provided technological advancement is such that machines or artificial intelligence (‘AI’) can do the same work accurately. The Lola case has thereby allowed AI-infused legal services to encroach upon lower-level legal jobs and clerical work.


The Lola case is a watershed moment, as it has tackled the rigid monopoly of the legal service industry that is protected by the professional and ethical guidelines of the ABA. Rule 5.4 of the ABA Model Rules of Professional Conduct does not allow non-lawyer ownership of a law firm. The rule is founded on the principle that it is a lawyer’s duty to protect their client’s interests, and that non-lawyers cannot be trusted to protect those interests because they might commercialise them. Further, every American state has its own Unauthorised Practice of Law (‘UPL’) statutes, backed by the ABA at the national level, which make it illegal for anyone not admitted to the state’s bar to provide any kind of legal assistance. In this context, the Lola case has held that work that can be ‘completely’ performed by a machine or an AI system shall not be considered to fall within the definition of legal practice.


Considering the current Covid-19 situation, the Lola case can have the effect of narrowing the definition of ‘practice of law’: as each task is removed from that definition through further technological progress, lawyers will be forced to adapt and formulate new practices innovatively. There might even come a day when the continuous use of AI in the legal industry efficiently captures the patterns of a human lawyer’s legal thinking, which, combined with AI systems, may develop into a ‘legal mind’ able to provide opinions and assess the likely outcome of litigation.

However, this is a very simplified way of thinking. Firstly, work that requires complex thought processes is more difficult to automate and cannot simply be broken down into an algorithm; although, where the task is essentially a perusal of data, a smart contract system can be used to identify and scan for instances such as financial fraud or audit irregularities. Secondly, in the American context, the current professional and ethical guidelines of the ABA are based on the human level of competence, care and skill required in the legal field. If that work is instead performed by an AI, then either separate guidelines for AI-infused legal services will have to be enacted, or a human lawyer will once again have to determine whether the task performed by the AI complies with the level of legal competency currently required. Moreover, if an AI system adopted in the legal field stores data on a cloud-based service, then, depending on a country’s privacy laws, the place of storage, authenticity and ownership of that information will be heavily scrutinised and may be accorded lower evidentiary value when presented in court.


Given the advancement of AI in other industries, especially in automated decision-making, the courts may devise a test that an AI system or machine must satisfy in order to qualify to assist in the practice of law, with qualifications that include ethical design. While this is unlikely to replace human lawyers, it may certainly assist in cases that do not require an extensive application of law, for example summary suits involving a standard set of facts, such as non-payment of electricity bills, which do not require much scrutiny of law.

Hence, even though full automation of the legal industry by AI systems may not happen in the near future, there remains a wide range of opportunities for legal technology to advance within the ‘practice of law’. Various regulators have already moved from a merely ‘fact’-based system to a ‘data’-based system, building flexible regulations to create an inclusive environment for innovative and disruptive technologies.


The article is authored by Rishika Raghuwanshi, Co-head, BlockSuits.

1 Comment


shhubham
21 Aug 2020

Ah. What a case, particularly interesting if the same line is taken by Indian Courts.


Practice of law can only be done by "Advocates". Wonder what it means for any website which offers automated brief/pleading/agreement preparations.


(Not that any of the law is enforced)


Food for thought.


P. S. Crisp and well written Rishika!
