
The legal profession is one of the oldest and most formalised in human history. Yet the last two years have brought more change to it than the previous two decades. Artificial intelligence is no longer just a buzzword bandied about in Silicon Valley boardrooms: AI now sits inside courtroom preparation tools, contract review software, and legal research platforms.
For law students and early-career lawyers, AI's changing role in the profession makes this an uncertain but exciting time. It is exciting because those who embrace the change now will be far more attractive in an employment market hungry for lawyers who understand technology and AI. It is uncertain because those who do not will enter a profession where their training has not kept pace with the technology they are expected to use.
This blog examines the changing role of AI in legal education, the skills lawyers now need, and where the education of lawyers is falling short.

Before we get into what law schools need to teach, it’s worth considering just how quickly the legal profession itself is changing.
The share of North American law firms using AI has risen dramatically in a single year: from 19% in 2023 to 79% in 2024. Harvey AI, Westlaw Edge, Lexis+ AI, Kira Systems, and LawGeex are in daily use at firms of all sizes. What used to take junior associates hours in the library or poring over documents can now be done in minutes with AI.
This sends law schools a direct demand signal. Law firms now ask: does this candidate know how to use AI tools? Can they critically analyse AI output? Do they understand the ethical implications of AI in legal practice?
The answer, it seems, is: not yet.
Legal education has traditionally emphasised doctrine, case analysis, statutory interpretation, and legal writing. While these are important skills, the tools for practicing law have changed significantly. Most law school curricula have not kept pace with these changes.
Here is what is actively changing:
Traditionally, legal research meant students combing through legal databases, tracing cases, and interpreting the holdings of numerous decisions. AI has made all of this faster. Westlaw Edge and Lexis+, for example, can surface relevant case law, conflicting authorities, and legal principles in seconds. Students who are not familiar with these tools enter the profession at a disadvantage.
Several legal AI platforms can now assist with contract review by flagging missing clauses, risk areas, and unusual provisions, and by comparing contract language. Legistify, for example, can reduce contract review time by up to 60% for contracting organisations. Junior lawyers are now expected to oversee this process, not perform it manually.
M&A due diligence, a traditional proving ground for first-year associates, is changing with the advent of AI. Tools such as Kira Systems and LawGeex can examine large volumes of documents quickly, extracting key provisions and flagging unusual ones. The lawyer’s role in this area is now focused on review, evaluation, and validation.
With generative AI, lawyers can produce first drafts of their legal writing in minutes. Prompt engineering, the ability to give a model clear, well-structured instructions that elicit a useful legal result, is a professional skill in its own right, of equal importance to drafting.
In all these areas, the underlying legal judgement remains the lawyer’s, but the process through which that judgement is exercised has shifted completely.
Given the changing nature of legal practice, the following competencies are now required of law graduates entering the workforce:
The ability to understand what AI systems can and cannot do in a legal context. This includes a conceptual grasp of how large language models work, an understanding of hallucinations, the phenomenon by which a model produces plausible but incorrect answers, and a sense of when to trust an AI’s answer and when not to.
The ability to write precise, structured prompts that elicit accurate and relevant legal output. Vague prompts produce vague results. Lawyers who can prompt AI tools precisely work more efficiently and produce better work product.
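As an illustration only, the difference between a vague and a structured prompt can be sketched in a few lines of Python. The `build_research_prompt` helper below is hypothetical, not part of any real legal AI platform; it simply shows the elements (role, task, jurisdiction, output constraints) that a well-structured prompt makes explicit and a vague one leaves to chance:

```python
def build_research_prompt(role, task, jurisdiction, constraints):
    """Assemble a structured legal research prompt.

    A structured prompt spells out the role the model should adopt,
    the precise task, the governing jurisdiction, and the output
    constraints -- everything a vague prompt leaves implicit.
    """
    lines = [
        f"Role: You are assisting a {role}.",
        f"Task: {task}",
        f"Jurisdiction: {jurisdiction}",
        "Constraints:",
    ]
    lines += [f"- {c}" for c in constraints]
    return "\n".join(lines)


# A vague prompt leaves the model to guess scope and format.
vague = "Tell me about non-compete clauses."

# A structured prompt pins down role, task, jurisdiction, and output rules.
structured = build_research_prompt(
    role="junior associate at a commercial law firm",
    task="Summarise the enforceability of non-compete clauses in employment contracts.",
    jurisdiction="India",
    constraints=[
        "Cite only real, verifiable cases with full citations.",
        "Flag any point where the law is unsettled.",
        "Limit the summary to 300 words.",
    ],
)
print(structured)
```

The point is not the code itself but the habit it encodes: every element of the structured prompt narrows the model's answer towards something a lawyer can actually verify and use.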
AI tools produce information that looks credible but can be wrong. A lawyer must be able to review AI-generated output, whether a case summary, contract analysis, or legal brief, and distinguish what is accurate, what is questionable, and what is wrong.
Knowledge of how legal data is structured, what metadata means in the context of document review, what e-discovery tools can do, and how to interpret the results of legal analytics tools.
The use of AI in law raises important issues of client confidentiality, data privacy, and the duty of competence. Lawyers must understand the ethical rules governing AI in their jurisdiction and be able to advise clients on the relevant regulatory issues.
Legal technology is changing rapidly. The tools that matter today may be replaced or transformed within a few years. The ability to learn new tools, critically assess new technologies, and stay flexible may matter more than mastery of any specific tool.
Despite the pace of change in practice, law school curricula have yet to keep up. The structural gaps include the following:
Most law schools offer AI or legal technology only as an elective, so students who do not actively seek out this education graduate without any exposure to the topic.
Research indicates that only 9% of law students use generative AI in their work, and only 25% plan to use it in the near future. This is an alarmingly low rate compared with the workforce as a whole, suggesting that law students are unfamiliar with AI, suspicious of it, or lacking guidance on its use.
Many law teachers are themselves unfamiliar with current AI tools. Without faculty prepared to teach students how to use these tools and how to evaluate their output critically, even well-intentioned curriculum reform lacks substance.
Current examination formats, which emphasise individual memorisation and unassisted writing, leave little room to incorporate AI tools into the educational process in an educationally effective way.
While some law schools now teach students how to use AI tools, ethical considerations remain under-emphasised: how AI use intersects with confidentiality, disclosure, and professional responsibility.
In India, where legal tech adoption has been rapid and the corporate legal sector has grown substantially, most law schools have failed to train students in AI. The Indian legal AI market was valued at USD 29.5 million in 2024, and corporate legal teams are increasingly adopting AI tools, yet there remains a shortage of graduates ready to work in this space.
Despite these difficulties, there are positive examples of legal institutions incorporating AI into their education strategies.
University of Chicago Law School has adopted a phased approach through its Bigelow Program: students are advised not to use AI tools during their first semester so they can establish core legal research and writing skills, while in subsequent semesters AI tools are incorporated deliberately and students are expected to distinguish their own analysis from AI-generated content.
Villanova University Law School has partnered with the Harvey for Law Schools program, providing second- and third-year students with access to the Harvey AI platform, a legal AI tool. Students are provided with a controlled environment to use the tool, with guidance on how to interpret results.
Duke University’s Continuing Studies Division provides a 40-hour professional development course called “Embracing AI for Legal Professionals,” which includes legal research, document review, contract analysis, predictive analytics, and ethics.
At least eight law schools across the country have now made AI training mandatory in their first-year curriculum, acknowledging that the topic is no longer optional.
Together, these models show what a well-rounded approach looks like: foundational legal skills first, then hands-on use of AI tools, combined throughout with critical evaluation and ethics.
Students need not wait for their institutions to catch up. There are things law students and young lawyers can do now:
Legal tech platforms are playing an increasingly active role, not only in providing AI tools but also in raising awareness and building capabilities around the use of AI tools. Platforms like Legistify, which provide AI-based contract management and legal operations tools to Indian businesses, form part of a broader ecosystem that is changing the way in-house legal teams operate.
For law students and young lawyers, engaging with these platforms, whether through internships, product trials, or content, provides real-world exposure to how AI tools are actually used in corporate settings. Understanding how these tools are applied (e.g., contract management, clause analysis, and compliance) is immediately relevant to in-house legal teams, business law, and legal operations.
Legal tech is not a separate discipline from law; rather, it has become, and will continue to be, the infrastructure through which law operates, and understanding it will be a fundamental expectation of a lawyer.
The legal profession in 2030 will be significantly different from the legal profession in 2020. There are several trends that indicate the direction in which the legal profession is headed:
While the use of AI in legal research and document analysis is now well established, the next generation of legal AI will tackle more complex work, including case outcome prediction, compliance monitoring and analysis, and early dispute resolution support. Lawyers will increasingly oversee and validate AI’s work rather than do everything themselves.
Legal technologist, legal operations manager, AI compliance counsel, and legal data analyst are roles that a decade ago did not exist or existed only in niches. These roles are now being created, and law graduates with the right mix of legal and technological skills are well positioned to fill them.
It is estimated that up to 30% of law graduates may pursue alternative career paths by 2030, including regulatory compliance, legal tech consulting, policy, and in-house counsel roles at tech firms and startups. Legal training combined with AI competency opens this broader range of career opportunities.
As AI is increasingly used in legal contexts, the legal rules surrounding its use are also likely to evolve. Lawyers who are competent in AI and the legal rules surrounding its use are likely to be highly sought after.
Pressure from employers, bar associations, and students themselves will increasingly push law schools to make AI part of the regular curriculum rather than an elective.
The trend is clear. The legal profession is becoming a technology-enabled profession, and legal education is beginning to reflect that. Students who are developing these skills, even if on an informal and independent basis, are going to be ahead of the curve when they enter the profession.
AI is not replacing lawyers. It is changing what lawyers do, how they do it, and what they need to know before they start. The gap between where legal education is and where legal practice is represents both an opportunity and a challenge.
For law schools, the challenge is to integrate AI literacy, tool training, and ethics frameworks into the curriculum proper, rather than as an afterthought. For students, the opportunity is to act now rather than wait for law schools to catch up.
The lawyers who thrive in the next decade will be those who combine strong legal analysis with the ability to work effectively with AI systems: using them well, evaluating them critically, and exercising independent judgment where it counts most.
AI will certainly take over some tasks currently performed by lawyers, but it will not replace lawyers in a general sense. What it replaces is the time lawyers spend on tasks such as document review, initial research, and contract drafting. Work that involves judgment, counselling, negotiation, and advocacy will remain firmly in human hands. The shift is from doing everything manually to supervising and validating AI’s output.
Not necessarily. Law students should have a basic idea of how AI works, but most AI tools currently used in the legal profession are designed for non-technical users. It is more important for law students to know how to critically evaluate AI output, how to structure prompts, and how to spot what an AI tool is getting wrong.
The major ethical concerns are those related to confidentiality, such as what happens to client information when it is uploaded to a third-party AI platform; those related to accuracy, such as the potential for AI to produce false citations or legal analysis; those related to disclosure, such as whether a client must be advised when an AI is used on their matter; and those related to competence, such as the lawyer’s obligation to use technology responsibly and stay apprised of its capabilities and limitations.
Commonly used tools include Harvey AI, Casetext, Westlaw Edge, Lexis+ AI, and Legistify. For contract work, LawGeex and Luminance are common in corporate legal settings.
One useful test is whether students graduate able to use at least one legal AI tool proficiently, critically evaluate AI-generated legal output, recognise the ethical and professional responsibility implications of AI in practice, and understand how AI is used in the areas of law they are entering. A curriculum that covers all four of these areas is likely doing a good job of teaching AI.