Introduction
In early 2021, following a three-year study focused on the current bar examination and the evolving landscapes of both the legal profession and legal education, the National Conference of Bar Examiners (NCBE) began to develop and implement the next generation of the bar exam (NextGen bar exam). This work includes multiple implementation research phases to test new question types and content and to establish administration and scoring processes prior to the NextGen bar exam’s debut in July 2026. The purposes of each research phase are:
Implementation Research Phase 1: Pilot Testing
- Determine efficacy of new question types
- Determine impact of providing legal resources
- Determine time needed to answer new question types
Implementation Research Phase 2: Field Testing
- Gather data on additional question types
- Confirm timing estimates
- Compare grading experiences for the NextGen bar exam and the current exam
- Generate initial question and test performance data
Implementation Research Phase 3: Prototype Exam Administration
- Test jurisdiction administration of a full-length, nine-hour examination via the delivery platform that will be used for the live exam
- Test the written-response grading system that will be used for the live exam
- Generate performance data to set the new score scale, establish concordance between the current score scale and the new scale, and provide information for jurisdictions’ determination of passing scores on the NextGen bar exam
Earlier this year, NCBE published a research brief describing findings from the first research phase, pilot testing. This field test research brief summarizes the second research phase, which focused on administering NextGen bar exam questions to a large sample of current law students and recent law graduates to determine the feasibility of the question formats and to conduct an initial evaluation of question-level performance. The brief begins with background information on the question types tested. It then summarizes the characteristics of the field test administrations, followed by a discussion of the lessons learned and how they will inform prototype testing.
Questions that field testing sought to answer included:
- Is it feasible to administer and score the proposed new question types?
- How much time does it take participants to respond to the different question types?
- How does the experience of grading the new exam compare to the experience of grading the current exam?
- Do the questions reduce performance differences that are not related to examinee competency?
Field Test Question Types
Three types of questions were included on the field test:
Multiple-Choice Questions: Standalone multiple-choice questions with either four answer options and one correct answer or six answer options and two correct answers. Note that some multiple-choice questions also appeared in the integrated question sets and longer performance tasks described below. (See Sample Multiple-Choice Questions.)
Integrated Question Sets: Each set was based on a common fact scenario and could include some legal resources (e.g., excerpts of statutes or judicial opinions) and/or supplemental documents (e.g., a police report or excerpt from a deposition). Integrated question sets included a mixture of multiple-choice and short written-response questions. In addition to testing doctrinal law, some integrated question sets focused on drafting or editing a legal document; other sets focused on counseling and/or dispute resolution. (See Sample Integrated Question Sets.)
Performance Tasks: These tasks required participants to demonstrate their ability to use fundamental lawyering skills in realistic situations, completing tasks that a beginning lawyer should be able to accomplish. These tasks could feature areas of doctrinal law not included in the NextGen Foundational Concepts and Principles, with the relevant legal resources provided. One of the longer performance tasks included multiple-choice questions and short written-response questions focused on research skills, followed by a longer writing assignment. (See Sample Performance Task.)
Field Test Administration Characteristics
The field test was administered on January 26 and 27, 2024, to 4,016 final-year law students and recent law graduates at 88 volunteer law schools across the United States. Participants completed a collection of field test questions (a “form”) that consisted of two hours of test content. By randomly assigning participants to different field test forms, we ensured that performance differences between forms could be interpreted as differences in form difficulty, not participant ability. Participants were compensated for their time.
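To make the logic of this design concrete, here is a minimal sketch of balanced random assignment. The participant count matches the field test, but the form labels and the assignment procedure itself are illustrative assumptions, not NCBE's actual process.

```python
# Illustrative sketch of random assignment to field test forms.
# Form labels and the shuffling procedure are assumptions for
# demonstration; only the participant count comes from the brief.
import random

random.seed(2024)  # fixed seed so the example is reproducible
participants = [f"participant_{i}" for i in range(1, 4017)]
forms = ["Form 1", "Form 2", "Form 3", "Form 4", "Form 5"]

# Shuffle, then deal participants to forms round-robin. Random
# assignment makes the five groups comparable in ability, so any
# difference in average scores between forms reflects form
# difficulty rather than who happened to take which form.
random.shuffle(participants)
assignment = {p: forms[i % len(forms)] for i, p in enumerate(participants)}
```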
Additional information regarding question types, participants, and forms is included below. The analyses conducted examine question difficulty, response times, grader performance, and group performance by question type and skill category.
Field Test Questions and Forms
A total of five field test forms were administered. Across the five forms, there were 112 standalone single-selection multiple-choice questions. The forms also included four drafting sets assessing writing and editing skills not easily covered by the longer written portions of the exam; five counseling sets assessing client-counseling, advising, negotiation, and dispute-resolution skills;[1] two performance tasks; and one research performance task.[2] In total, 155 individual questions were administered.
All Foundational Concepts and Principles and Foundational Skills were tested across the 155 questions.[3]
How Did Field Test Questions Perform?
To examine question performance, the proportion of possible points earned across participants (p-value) was calculated. A higher p-value indicates that the question is easier (i.e., participants tended to earn more points), and a lower p-value indicates that the question is more difficult (i.e., participants tended to earn fewer points). Summary statistics of p-values for each question type are presented in Table 1.
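As a concrete illustration of this calculation, the sketch below computes per-question p-values and type-level summary statistics. The data layout, column names, and values are hypothetical, not drawn from the actual field test results.

```python
# Hypothetical sketch: computing question p-values as the proportion
# of possible points earned, then summarizing by question type.
# Column names and data layout are assumptions, not NCBE's pipeline.
import pandas as pd

# One row per (participant, question): points earned and possible.
responses = pd.DataFrame({
    "question_id":     ["Q1", "Q1", "Q2", "Q2", "Q3", "Q3"],
    "question_type":   ["MCQ", "MCQ", "MCQ", "MCQ", "PT", "PT"],
    "points_earned":   [1, 0, 2, 1, 4, 5],
    "points_possible": [1, 1, 2, 2, 6, 6],
})

# p-value per question: total points earned across participants
# divided by total points possible; higher means easier.
by_question = (
    responses.groupby(["question_id", "question_type"])
    .sum(numeric_only=True)
)
by_question["p_value"] = (
    by_question["points_earned"] / by_question["points_possible"]
)

# Summary statistics of p-values for each question type (cf. Table 1).
summary = (
    by_question.reset_index()
    .groupby("question_type")["p_value"]
    .describe()
)
print(summary)
```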