{"id":116225,"date":"2025-04-23T16:54:49","date_gmt":"2025-04-24T00:54:49","guid":{"rendered":"https:\/\/xira.com\/p\/2025\/04\/23\/california-bar-reveals-it-used-ai-for-exam-questions-because-of-course-it-did\/"},"modified":"2025-04-23T16:54:49","modified_gmt":"2025-04-24T00:54:49","slug":"california-bar-reveals-it-used-ai-for-exam-questions-because-of-course-it-did","status":"publish","type":"post","link":"https:\/\/xira.com\/p\/2025\/04\/23\/california-bar-reveals-it-used-ai-for-exam-questions-because-of-course-it-did\/","title":{"rendered":"California Bar Reveals It Used AI For Exam Questions, Because Of Course It Did"},"content":{"rendered":"<figure class=\"wp-block-image alignright size-large is-resized\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"620\" height=\"413\" src=\"https:\/\/i0.wp.com\/abovethelaw.com\/wp-content\/uploads\/sites\/4\/2022\/08\/AdobeStock_318590519-620x413.jpeg?resize=620%2C413&#038;ssl=1\" alt=\"\" class=\"wp-image-83373\" title=\"\"><figcaption><\/figcaption><\/figure>\n<p>Unaware that you\u2019re supposed to rip the band-aid off all at once, the embattled officials behind the California Bar Exam decided they hadn\u2019t had enough of the non-stop cavalcade of disastrous headlines about February\u2019s exam, so they just casually threw out there that they used AI to come up with some of the questions.<\/p>\n<p>Everyone\u2019s going to be chill about this, right? I mean, \u201cbar exams\u201d and \u201cartificial intelligence\u201d are two topics that famously elicit only rational, measured reactions.<\/p>\n<p>February\u2019s haphazard glitch-fest was the product of <a href=\"https:\/\/abovethelaw.com\/2024\/05\/california-bar-risks-going-bankrupt-rather-than-change-its-exam\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">a shotgun wedding revamp<\/a> that ditched the NCBE in an effort to save the state licensing operation from descending into bankruptcy. 
The NCBE\u2019s testing venue rules had pushed California\u2019s resources to the limit, with the most populous state in the union forced to book massive, expensive, and mostly inconvenient locations every time it offered the test. A new provider would offer the examiners the opportunity to do more remote testing and take advantage of multiple, smaller locations to save money. It sounded good on paper and, more or less, it\u2019s still a better path forward for California.<\/p>\n<p>But things didn\u2019t quite work out this time.<\/p>\n<p>They commissioned Kaplan to write the questions and Meazure to administer the test. Then there were practice test problems, technical problems galore, the more convenient venues didn\u2019t materialize, and the proctoring spawned subreddits\u2019 worth of horror stories. They ended up having to offer refunds and make-up tests before floating the possibility of <a href=\"https:\/\/abovethelaw.com\/2025\/03\/california-bar-exam-fiasco-considers-adding-even-more-chaos\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">inviting more chaos by throwing the whole project in the can<\/a>.<\/p>\n<p>Making this the perfect time to inform the not-at-all-stressed applicants that the questions might have also been hallucinated AI slop. 
THANKS, BYE!<\/p>\n<p><a href=\"https:\/\/www.latimes.com\/california\/story\/2025-04-23\/state-bar-of-california-used-ai-for-exam-questions\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">According to the LA Times<\/a>:<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>The State Bar of California said in a news release Monday that it will ask the California Supreme Court to adjust test scores for those who took its February bar exam.<\/p>\n<p>But it declined to acknowledge significant problems with its multiple-choice questions \u2014 even as it revealed that a subset of questions were recycled from a first-year law student exam, while others were developed with the assistance of AI by ACS Ventures, the State Bar\u2019s independent psychometrician.<\/p>\n<\/blockquote>\n<p>Despite what it sounds like, psychometricians are not villains from an L. Ron Hubbard book. But they also are not lawyers, which understandably troubled educators and applicants when they learned that a bunch of non-lawyers used AI to develop questions. Basically, their job is to measure test performance, not use ChatGPT to rewrite \u201cContracts for Dummies.\u201d<\/p>\n<p>Before rushing to rage, note that the AI-aided questions made up a \u201csmall subset\u201d of the exam, amounting to 23 of the 171 scored multiple-choice questions. It\u2019s also worth noting that developing questions isn\u2019t the same as handing over test design to the computers. The chair of the State Bar\u2019s Committee of Bar Examiners, Alex Chan, pushed back against that prospect, explaining, \u201cthe professors are suggesting that we used AI to draft all of the multiple choice questions, as opposed to using AI to vet them.\u201d<\/p>\n<p>Over and above whatever it means for AI to \u201cvet\u201d them, the bar also said that all questions were reviewed by subject matter experts. 
Of course, those same panels suspiciously lost a few credentialed law professors after the Bar worried that academics who worked with the NCBE in the past could raise copyright issues. Theoretically the <em>non-profit<\/em> NCBE could waive those in the public interest, but it\u2019s also a non-profit with <a href=\"https:\/\/projects.propublica.org\/nonprofits\/organizations\/362472009\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">$175 million in assets<\/a>, so draw your own conclusions about how that would\u2019ve gone down.<\/p>\n<p>There\u2019s nothing inherently wrong with AI helping out in this process. If the examiners have confidence in the subject matter experts reviewing the final product, it might be a useful brainstorming tool. Indeed, the AI doesn\u2019t seem as concerning as who allegedly used it. Garbage in, garbage out is a real problem, and whether it\u2019s writing or vetting the questions, having a non-lawyer on the other end of the keyboard raises risks. Some of the professors cited in the Times article also worry that it raises conflict of interest issues as the psychometrician is expected to vouch for the reliability of the questions on the back end, though the bar examiners stress that this process isn\u2019t subjective. <\/p>\n<p>Besides, the bar examiners said they were told to consider using AI by no less than the California Supreme Court \u2014 which oversees the process. A statement that prompted the California Supreme Court to immediately respond that it knew nothing about this until the press release dropped.<\/p>\n<p>WELL. OILED. MACHINE.<\/p>\n<p>There are a lot of risks in using AI generally and in building a life-changing exam specifically. Setting aside the inherent shadiness involved in making this announcement a couple months after the fact and having the ultimate authorities at the State Supreme Court reply, \u201cWhoa! 
Don\u2019t rope us into this mess!\u201d there\u2019s reason to believe the bar examiners probably used AI responsibly here.<\/p>\n<p>Still, Katie Moran, an associate professor at the University of San Francisco School of Law, told the Times that the State Bar should \u201crelease all 200 questions that were on the test for transparency and to allow future test takers a chance to get used to the different questions.\u201d That\u2019s the sort of proactive transparency that avoids these sorts of belated bombshells. It also puts more eyes on the questions and helps the authors \u2014 the human ones \u2014 refine questions in response to the wisdom of the subject matter expert crowd. <\/p>\n<p>\u201cShe also called on the State Bar to return to the multi-state bar exam for the July exams.\u201d Yeah\u2026 for the love of all that\u2019s holy, don\u2019t do that. Nothing is solved by more whiplash: there\u2019s almost no chance the examiners could lock down all the pricey venues they would need by July, and \u2014 the State Bar notes \u2014 one of the few things applicants hate more than this test is the lack of a remote option.<\/p>\n<p>Also, the NCBE questions consistently generate a ton of applicant complaints. While everyone in California \u2014 rightly \u2014 grumbles about these new questions, just wait until July when we start seeing the metric shit ton of rage over incoherent multi-state questions. Eventually\u2026 the NCBE bans talking about the substance of the exam, ostensibly to protect the sanctity of the exam for later test takers but with the useful side effect of keeping criticism tamped down until after results are released. TRANSPARENCY!<\/p>\n<p>The California Bar Exam is broken in a lot of ways and people should be angry about it. But without a time machine to go back a number of years to when they should\u2019ve started <em>planning<\/em> to move to this new test instead of slapping it together in a matter of months, it is what it is. 
But they\u2019re working on fixing it, even if starting it in February was a borderline irresponsible move. <\/p>\n<p>And of all the problems we\u2019ve heard throughout this process, using AI to develop some questions doesn\u2019t make the top tier. <\/p>\n<hr>\n<p><strong><em><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" class=\"alignright  wp-image-443318\" src=\"https:\/\/i0.wp.com\/abovethelaw.com\/wp-content\/uploads\/2016\/11\/Headshot-300x200.jpg?resize=188%2C125&#038;ssl=1\" alt=\"Headshot\" width=\"188\" height=\"125\" title=\"\"><a href=\"http:\/\/abovethelaw.com\/author\/joe-patrice\/\" target=\"_blank\" rel=\"noopener nofollow\">Joe Patrice<\/a>\u00a0is a senior editor at Above the Law and co-host of <a href=\"http:\/\/legaltalknetwork.com\/podcasts\/thinking-like-a-lawyer\/\" target=\"_blank\" rel=\"noopener nofollow\">Thinking Like A Lawyer<\/a>. Feel free to\u00a0<a href=\"mailto:joepatrice@abovethelaw.com\">email<\/a> any tips, questions, or comments. Follow him on\u00a0<a href=\"https:\/\/twitter.com\/josephpatrice\" target=\"_blank\" rel=\"noopener nofollow\">Twitter<\/a>\u00a0or <a href=\"https:\/\/bsky.app\/profile\/joepatrice.bsky.social\" rel=\"noopener nofollow\" target=\"_blank\">Bluesky<\/a> if you\u2019re interested in law, politics, and a healthy dose of college sports news. 
Joe also serves as a <a href=\"https:\/\/www.rpnexecsearch.com\/josephpatrice\" target=\"_blank\" rel=\"noopener nofollow\">Managing Director at RPN Executive Search<\/a>.<\/em><\/strong><\/p>\n<p>The post <a href=\"https:\/\/abovethelaw.com\/2025\/04\/california-bar-reveals-it-used-ai-for-exam-questions-because-of-course-it-did\/\" rel=\"nofollow noopener\" target=\"_blank\">California Bar Reveals It Used AI For Exam Questions, Because Of Course It Did<\/a> appeared first on <a href=\"https:\/\/abovethelaw.com\/\" rel=\"nofollow noopener\" target=\"_blank\">Above the Law<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Unaware that you\u2019re supposed to rip the band-aid off all at once, the embattled officials behind the California Bar Exam decided they hadn\u2019t had enough of the non-stop cavalcade of disastrous headlines about February\u2019s exam, so they just casually threw out there that they used AI to come up with some of the questions. Everyone\u2019s [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"_et_pb_use_builder":"","_et_pb_old_content":"","_et_gb_content_width":"","_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[16],"tags":[],"class_list":["post-116225","post","type-post","status-publish","format-standard","hentry","category-above_the_law"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/posts\/116225","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/comments?post=116225"}],"version-history":[{"count":0,"href":"https:\/\/
xira.com\/p\/wp-json\/wp\/v2\/posts\/116225\/revisions"}],"wp:attachment":[{"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/media?parent=116225"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/categories?post=116225"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/tags?post=116225"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}