
I recently wrote a piece bemoaning the lack of critical thinking skills among younger lawyers in the age of GenAI and wondering how the profession can hope to train lawyers to have these skills. I wondered whether the repetitive, tedious work that young lawyers traditionally had to do created experienced lawyers who could recognize patterns and solutions through exposure to multiple situations over their careers.

I also wondered what would happen to that wisdom gained through experience when that repetitive work is done by GenAI tools. What will happen to the ability to “think like a lawyer” in a world where so many tasks are now done with GenAI and automation?

After reading a recent interview with the economist Tyler Cowen, a professor at George Mason University, I believe law schools need to play a greater role in developing GenAI skills to facilitate the development of that wisdom.

Cowen’s Arguments

Cowen persuasively argues that colleges and the broader education system are simply failing to adapt to the demands of an AI world. He believes that education should focus on teaching students how to use AI effectively instead of devising ways to force them to avoid it. He also thinks that, in the future, those without AI skills will have a difficult time getting jobs and advancing.

In particular, Cowen believes educational institutions should double down on teaching how to use AI tools instead of focusing on things like rote homework (what purpose is there in giving homework assignments that ChatGPT can do instantaneously?) and rote memorization (I thought about this as I watched my grandson struggle with math problems whose answers could be found on a smartphone calculator). Instead, Cowen says we should prioritize critical thinking, creativity, adaptability, and individualized guidance: qualities that AI cannot replace (yet).

As an example of this kind of thinking, OpenAI recently introduced Study Mode, which, instead of giving students answers, forces them to think through problems. It employs techniques like the Socratic method and personalized hints designed to guide students to come up with their own answers. It is designed to teach students how to think critically.

For example, instead of my grandson trying to use pen and paper to recall from memory what 7×8 is, it would ask him what 7 times 8 means in words. In a legal context, instead of asking students to memorize the elements of negligence, it might ask them to identify what’s missing from a fact pattern to establish a prima facie case.

It is exactly this kind of tool that Cowen thinks educational institutions need to embrace.

Law Schools

These insights have particular urgency for legal education. Indeed, most of Cowen’s criticisms and suggested changes need to be front and center for law school leaders. It’s naïve to think that law students and lawyers aren’t going to use GenAI tools in virtually every aspect of their professional and personal lives. Rather than avoiding the subject or, worse yet, trying to stop the use of these tools, law schools should make GenAI tools a fundamental part of research, writing, and drafting training.

They need to focus not on memorization but on the critical thinking skills beginning lawyers used to get in the on-the-job, guild-type training system. As I discussed, that training came from repetitive and often tedious work that developed experienced lawyers who could recognize patterns and solutions based on exposure to similar situations. But much of that repetitive and tedious work may go away in a GenAI world.

The Socratic Method

As OpenAI’s Study Mode demonstrates, the Socratic method, which law schools have leaned into for years, could be ideal for just this type of training, if done right. Doing it right in the age of GenAI means not asking for the regurgitation of facts and holdings, but asking, for example, how a GenAI summary does or does not tease out the critical portion of a case and the nuance of the holding.

Or asking students to use GenAI to generate a list of potential issues raised by a factual scenario and then discussing what the tool got right, what it may have gotten wrong, and why. It’s asking students to generate an argument using ChatGPT and then discussing what’s missing. Law schools need to require students to disclose and explain when and how they use GenAI, and to show them what is right and wrong with GenAI outputs.

The Role of Adjunct Professors

But to do this, law schools need to partner more closely with actual practicing lawyers who can serve as adjunct professors. Law schools need to do away with the notion that adjuncts are second-class teachers.

Practicing lawyers can supply the insight that full-time professors, who lack that practice experience, cannot. Indeed, in the publish-or-perish world of law school tenure, the temptation to use GenAI to create law review tomes could lead to professors doing just what we want young lawyers to avoid: overreliance on GenAI instead of critical thinking and creativity.

It is the experienced lawyers, those with the accumulated wisdom, who are best equipped to spot things GenAI tools may have missed. Who can spot flaws in a chatbot’s reasoning. Who can separate the wheat from the chaff. By working with and mentoring law school students on these kinds of things, they can begin to impart these abilities.

Certainly, the academic versus practical debate has been going on ever since I was in law school. But the importance of the practical has changed with the advent of GenAI. GenAI provides the means to access information in ways never before imagined. The trick is to harness that information, and that requires practical guidance, not esoteric discussion.

But Will They?

Cowen makes one other disturbing point: he fears educational institutions will continue, through inertia, to employ traditional, outmoded methods that don’t prepare students for the brave new world. I fear those concerns are magnified when it comes to law schools. Law schools, like most lawyers, are slow to change. They have an entrenched system that values academic questions, scholarship, and prestige over practicality.

But GenAI is here to stay, and we, as a profession, have to teach ourselves and younger lawyers how to use the tools practically and still think critically. Adopting these approaches at the law school level would better assure that all lawyers understand how to use these tools effectively, not just those who go to work at big firms with the resources for this type of training.

Indeed, some law schools are recognizing these facts and offering courses on using AI, on teaching legal reasoning to AI models, and the like. But the effort needs to integrate GenAI into every law school class. Law schools should start by requiring AI literacy courses for all first-year students and mandating that every course syllabus include assignments that explicitly incorporate AI tools. In the age of AI, no course should omit training in and adoption of these tools, with an eye toward the future.

GenAI gives the concept of training law students to think like a lawyer a whole new meaning. Law schools have a responsibility to their students to play a crucial role in this evolution. They can’t shuck it off for old times’ sake.


Stephen Embry is a lawyer, speaker, blogger, and writer. He publishes TechLaw Crossroads, a blog devoted to the examination of the tension between technology, the law, and the practice of law.
