
If your legal team is exploring how to use generative AI in practice (and let's be honest, most are at least testing the waters), the conversation often jumps straight to tools and outputs. Which chatbot should we use? Can we trust it to draft something real? How do we control hallucinations?

But before you even get there, there’s a more basic question that many legal departments overlook. What are we feeding the AI in the first place?

In a recent episode of "Notes to My (Legal) Self," Linsey Krolik, a law professor at Santa Clara University and longtime in-house counsel at companies like PayPal and Arm, made a compelling case for what she calls AI literacy. But the bigger insight was between the lines: you can't use AI effectively in a legal setting without understanding the inputs, and that means contracts.

Watch the episode here:

The AI Moment Isn’t Coming. It’s Already Here.

“We are using generative AI today, whether we want to admit it or not,” Linsey said during the interview. “It’s happening. So get on board and we can learn together.”

That sense of collective learning, and the gap between curiosity and confidence, is something many in-house teams are experiencing firsthand. There is pressure to move quickly, reduce turnaround time, and do more with less. AI promises all of that. But as Linsey pointed out, we need to start with the basics.

She’s training law students to build real-world legal documents like terms of service and privacy policies for early-stage startups. These students are already experimenting with AI. They’re learning where it helps, where it fails, and how to critically assess its output. They are developing muscle memory not just in drafting, but in understanding why contracts are structured the way they are.

That foundational skill, contract literacy, is what too many practicing teams are missing.

AI Is A Mirror. If Your Contract Data Is A Mess, It Will Show.

When lawyers think about AI tools, it’s easy to focus on the output. What can it draft? What questions can it answer?

But what matters just as much is the underlying data. If your team can’t easily answer questions like “What are our standard payment terms across all NDAs?” or “Which vendor contracts auto-renew in the next 90 days?” then any AI solution you implement will be trying to find patterns in chaos.

Linsey emphasized that in-house teams are increasingly being asked, “Did you use AI for this?” And when the answer is no, the follow-up is often, “Why not?” That pressure to explore and adopt is growing. But AI isn’t magic. It won’t clean up your contract portfolio for you. It will only surface what’s already there, or worse, what’s missing.

Contract Literacy Isn’t Just Knowing Legal Terms. It’s Knowing The Business.

One of the sharpest observations Linsey made during the conversation was about how contract education has evolved. She’s moved beyond traditional legal writing assignments to include things like AI-assisted drafting and short business-style presentations.

Why? Because she understands that lawyers today don’t just write contracts. They explain them. They negotiate them. They implement them. And increasingly, they design workflows and data systems around them.

AI can support that work, but only when the lawyer understands what the business needs from the contract. If you can’t articulate the difference between what a procurement manager wants to know and what your finance lead needs to see, no AI tool will bridge that gap for you.

The Real Risk Isn’t AI. It’s Staying Unprepared.

Linsey acknowledged the ethical concerns around AI: confidentiality, accuracy, and unauthorized reliance. But she also made it clear that the bigger risk is paralysis.

“There’s a lot of uncertainty now,” she said. “But I think we need to start being more curious and less scared.”

She teaches her students to disclose when they use AI, to reflect on why they used it, and to evaluate the quality of the output. In doing so, they learn how to build trust in the tools and in their own judgment.

That same framework applies to in-house legal teams. Instead of asking whether AI is perfect, start asking whether your team is ready. Can you explain what your standard indemnity clause looks like? Can you audit vendor agreements for renewal triggers? Do you have a structured way to compare terms across contracts?

These are contract literacy questions. And until you can answer them confidently, AI will remain a shiny solution looking for a problem.

Want To Get AI-Ready? Start With Your Contracts.

Linsey Krolik is training the next generation of lawyers to think critically, use emerging tools responsibly, and work directly with the business. If today’s law students are learning to draft, structure, and analyze contracts with AI as a companion, then the rest of the legal world needs to catch up fast.

AI readiness starts with knowing what you have, what it means, and how to use it. That begins not with software, but with skill. Not with automation, but with understanding.

Contract literacy isn’t the end goal. It’s the starting line.

Watch the full interview with Linsey here.


Olga V. Mack is the CEO of TermScout, an AI-powered contract certification platform that accelerates revenue and eliminates friction by certifying contracts as fair, balanced, and market-ready. A serial CEO and legal tech executive, she previously led a company through a successful acquisition by LexisNexis. Olga is also a Fellow at CodeX, The Stanford Center for Legal Informatics, and the Generative AI Editor at law.MIT. She is a visionary executive reshaping how we law—how legal systems are built, experienced, and trusted. Olga teaches at Berkeley Law, lectures widely, and advises companies of all sizes, as well as boards and institutions. An award-winning general counsel turned builder, she also leads early-stage ventures including Virtual Gabby (Better Parenting Plan), Product Law Hub, ESI Flow, and Notes to My (Legal) Self, each rethinking the practice and business of law through technology, data, and human-centered design. She has authored The Rise of Product Lawyers, Legal Operations in the Age of AI and Data, Blockchain Value, and Get on Board, with Visual IQ for Lawyers (ABA) forthcoming. Olga is a 6x TEDx speaker and has been recognized as a Silicon Valley Woman of Influence and an ABA Woman in Legal Tech. Her work reimagines people’s relationship with law—making it more accessible, inclusive, data-driven, and aligned with how the world actually works. She is also the host of the Notes to My (Legal) Self podcast (streaming on Spotify, Apple Podcasts, and YouTube), and her insights regularly appear in Forbes, Bloomberg Law, Newsweek, VentureBeat, ACC Docket, and Above the Law. She earned her B.A. and J.D. from UC Berkeley. Follow her on LinkedIn and X @olgavmack.

The post Contract Literacy Is The Missing Link In AI Readiness, Says Linsey Krolik appeared first on Above the Law.