Nearly two months after hundreds of prospective California lawyers complained that their bar exams were plagued by technical problems and irregularities, the state's legal licensing body has caused fresh outrage by admitting that some multiple-choice questions were developed with the aid of artificial intelligence.
The State Bar of California said in a news release Monday that it will ask the California Supreme Court to adjust test scores for those who took its February bar exam.
But it declined to acknowledge significant problems with its multiple-choice questions, even as it revealed that a subset of questions were recycled from a first-year law student exam, while others were developed with the assistance of ACS Ventures, the State Bar's independent psychometrician.
“The debacle that was the February 2025 bar exam is worse than we imagined,” said Mary Basick, assistant dean of academic skills at UC Irvine School of Law. “I'm almost speechless. Having the questions written by nonlawyers using artificial intelligence is just unbelievable.”
After taking the exam, Basick said, some test takers complained that certain questions felt as if they had been written by AI.
“I defended the bar,” Basick said. “‘No way! They wouldn't do that!’”
Having questions developed with AI by psychometricians not trained in law represented “an obvious conflict of interest,” Basick argued, because “these are the same psychometricians responsible for establishing that the questions are valid and reliable.”
“It's an astounding admission,” said Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation.
“The State Bar has admitted that it hired a company to have nonlawyers use AI to write questions that appeared on the actual bar exam,” she said. “They then paid that same company to assess and ultimately approve the exam questions, including the questions the company itself wrote.”
The State Bar, which is an administrative arm of the California Supreme Court, said Monday that the majority of the multiple-choice questions were developed by Kaplan Exam Services, a company it contracted with last year as it sought to save money.
According to a recent presentation by the State Bar, 100 of the 171 scored multiple-choice questions were made by Kaplan and 48 were drawn from a first-year law student exam. A smaller subset of 23 scored questions were made by ACS Ventures, the State Bar's psychometrician, and developed with artificial intelligence.
“We have confidence in the validity of [the multiple-choice questions] to accurately and fairly assess the legal competence of test takers,” Leah Wilson, the State Bar's executive director, said in a statement.
On Tuesday, a State Bar spokesperson told The Times that all questions, including the 29 scored and unscored questions from the agency's independent psychometrician that were developed with AI assistance, were reviewed by content validation panels and subject matter experts ahead of the exam for factors including legal accuracy, minimum competence and potential bias.
When measured for reliability, the State Bar told The Times, the combined multiple-choice questions from all sources, including AI, performed “above the psychometric target of 0.80.”
The State Bar also dismissed the notion of a conflict of interest.
“The process to validate questions and test for reliability is not a subjective one,” the State Bar said, “and the statistical parameters used by the psychometrician remain the same regardless of the source.”
Alex Chan, an attorney who chairs the State Bar's Committee of Bar Examiners, told The Times that only a small subset of questions used AI, and not necessarily to create the questions outright.
“The professors are suggesting that we used AI to draft all the multiple-choice questions, as opposed to using AI to vet them,” Chan said. “That is not my understanding.”
Chan noted that the California Supreme Court urged the State Bar in October to explore “the availability of any new technologies, such as artificial intelligence, that might innovate and improve upon the reliability and cost-effectiveness of such testing.”
“The court gave its guidance to consider the use of AI, and that's exactly what we are going to do,” Chan said.
But a spokesperson for California's highest court said Tuesday that the justices learned only this week that the State Bar had used AI to develop exam questions.
“Until yesterday's State Bar press release, the court was unaware that AI had been used to draft any of the multiple-choice questions,” the spokesperson said in a statement.
Last year, facing a $22-million deficit in its general fund, the State Bar decided to cut costs by ditching the National Conference of Bar Examiners' Multistate Bar Examination, a system used by most states, and moving to a new hybrid model of in-person and remote testing. It struck an $8.25-million deal with the test prep company Kaplan Exam Services to create test questions and hired Meazure Learning to administer the exam.
The State Bar's rollout of the new exams was riddled with problems. Some test takers reported being kicked off the online testing platforms or experiencing screens that lagged and displayed error messages. Others complained that the multiple-choice questions contained typos, included nonsensical questions and left out key facts.
The botched exams prompted some test takers to file a federal lawsuit against Meazure Learning. Meanwhile, state Senate Judiciary Committee Chair Thomas J. Umberg (D-Santa Ana) called for an audit of the State Bar, and the California Supreme Court ordered the State Bar to revert to traditional in-person administration of the July bar exams.
But the State Bar is moving forward with its new system of multiple-choice questions, even as some academic experts have repeatedly flagged problems with the quality of the February exam questions.
“Many have expressed concern about the speed with which the Kaplan questions were drafted and the quality of those questions,” Basick and Moran wrote in an April 16 public comment to the Committee of Bar Examiners. “The 50 released practice questions, which were heavily edited and rereleased just weeks before the exam, still contain numerous errors. This has further eroded our confidence in the quality of the questions.”
Historically, Moran said, exam questions written by the National Conference of Bar Examiners have taken years to develop.
Recycling some questions from a first-year law exam also raised red flags, Basick said. An exam that determines whether a person has learned enough in their first year of law school is different from one that determines whether a test taker is minimally competent to practice law, she argued.
“It's a very different standard,” she said. “It's not just, ‘Hey, do you know this rule?’ It's, ‘Do you know how to apply it in a situation where there's ambiguity and determine the correct course of action?’”
Moreover, using AI and recycling questions from a first-year law exam represented a significant change in bar exam preparation, Basick said. She argued that such a change required two years' notice under the California Business and Professions Code.
But the State Bar told The Times that the sources of the questions did not trigger that two-year notice requirement.
“The fact that there were multiple sources for the development of questions did not impact exam preparation,” the State Bar said.
Basick said she grew worried in early March when, she said, the State Bar removed her and other academic experts from their panels.
She said the State Bar argued that because the law professors had worked with questions written by the National Conference of Bar Examiners within the previous six months, their involvement could raise concerns about potential copyright infringement.
“Ironically, what they did instead was have nonlawyers write questions using artificial intelligence,” she said. “The place the artificial intelligence would have gotten its information from has to be NCBE questions, because there's nothing else available. What else would the artificial intelligence use?”
Since the February exam debacle, the State Bar has downplayed the idea that there were substantial problems with its multiple-choice questions. Instead, it has focused on the problems with Meazure Learning.
“We are examining the vendor's performance in meeting their contractual obligations,” the State Bar said in a document that listed the problems test takers encountered and highlighted the relevant performance expectations in the contract.
But critics accused the State Bar of shifting blame and argued that it had failed to acknowledge the seriousness of the problems with the multiple-choice questions.
Moran called on the State Bar to release all 200 questions that appeared on the test for the sake of transparency and to allow future test takers to familiarize themselves with the different types of questions. She also called on the State Bar to return to the Multistate Bar Examination for the July exams.
“They've just shown that they can't administer a fair test,” she said.
Chan said the Committee of Bar Examiners will meet on May 5 to discuss non-scoring adjustments and remedies. But he doubted the State Bar would release all 200 questions or return to the National Conference of Bar Examiners' exams in July.
The NCBE's exam security protocols would not allow any form of remote testing, he said, and recent State Bar surveys showed that nearly 50% of California bar applicants wanted to keep the remote option.
“We are not going back to the NCBE, at least not in the near term,” Chan said.