Phase 3 Report of the Testing Task Force

Blueprint Development Committee and Test Design Committee Meetings

November 2020

Overview and Next Steps

The purpose of the Testing Task Force’s study over the past three years has been to gather the research necessary to inform how the bar exam should change to keep pace with a changing profession. The study has been conducted with transparency and has included input from stakeholders at every step of the way.

For Phase 3 of the Task Force’s work, two committees were convened to discuss test blueprint and design issues by working from the qualitative and quantitative data that were compiled in Phase 1 (stakeholder listening sessions) and Phase 2 (nationwide practice analysis). The charge of the Blueprint Development Committee (BDC) was to help determine what content should be tested on the bar exam, while the role of the Test Design Committee (TDC) was to recommend how that content should be assessed. The BDC consisted of newly licensed and experienced practitioners who provided input on exam content using the Phase 2 study results and applying their professional judgment and experience. The TDC was composed of legal educators and bar admission representatives who provided input on an effective structure and design for the exam. The TDC’s work was guided by the Phase 1 study results and by the professional judgment and experience of committee members in educating law school students and admitting newly licensed lawyers (NLLs) to the bar. This Phase 3 report summarizes the work completed by the BDC and the TDC.

Using the results from the BDC and TDC meetings, the Task Force is formulating a set of preliminary recommendations for the content and design of the next generation of the bar exam. The starting point for developing any exam is to consider the purpose of the test and the claim(s) to be made about examinee performance. Early in the study, the Task Force adopted the following claim for the intended use of scores from the bar exam: To protect the public by helping to ensure that those who are newly licensed possess the minimum knowledge and skills to perform activities typically required of an entry-level lawyer. This claim guided the work of the BDC and TDC and will serve as the foundation for the Task Force’s recommendations for the next generation of the bar exam.

Beyond fidelity to the claim, the content and design of the exam must be constrained by psychometric requirements relating to reliability, validity, and fairness. Therefore, the Task Force has consulted, and will continue to consult, with its internal and external assessment experts in formulating recommendations that satisfy the Standards for Educational and Psychological Testing (AERA, APA, NCME, 2014). The other key objectives guiding the Task Force as it develops recommendations include

  • ensuring that the depth and breadth of the content tested is carefully aligned with minimum competence for entry-level practice;
  • increasing emphasis on assessment of lawyering skills;
  • adopting an integrated approach to assessing knowledge and skills;
  • continuing to exercise vigilance to prevent bias and ensure fairness and accessibility for all examinees;
  • reducing the length of the exam to the extent possible without sacrificing reliability and validity of score interpretations;
  • administering the exam by computer either at test centers managed by vendors or on examinees’ laptops in jurisdiction-managed testing facilities;
  • keeping the exam affordable so that cost does not pose a barrier to entering the profession;
  • reducing the time between exam administration and announcement of results through greater efficiencies in scoring and grading while maintaining the high quality of processes used; and
  • maintaining the benefits of score portability realized through use of the Uniform Bar Exam.

The Task Force’s preliminary recommendations will be shared with bar admission authorities and the legal academy for comment before recommendations are finalized by the Task Force for submission to NCBE’s Board of Trustees in January 2021.

Blueprint Development Committee Meeting

BDC Panelists

The Testing Task Force (TTF) recruited 17 practicing lawyers to participate as panelists on the BDC; 14 of the panelists were female and 10 were people of color. In total, the panelists practiced in 13 jurisdictions and across a range of practice settings (private law firm, government, nonprofit organization, legal services/public interest, judicial law clerk, and in-house counsel). Each panelist indicated his or her area(s) of practice, which collectively included:

  • Administrative Law
  • Appellate
  • Business Law
  • Civil Rights
  • Commercial Law
  • Contracts
  • Criminal Law
  • Disability Rights
  • Education Law
  • Employee Benefits
  • Employment Law and Labor Relations
  • Estate Planning
  • Fair Housing
  • Family Law
  • Health Care Law
  • Insurance Coverage
  • Judicial/Clerkship
  • Juvenile
  • Personal Injury
  • Probate
  • Professional Regulation
  • Torts

Process

The BDC met from June 29 to July 1, 2020. The meeting was held virtually for five hours each day. Prior to the meeting, each panelist was provided a binder of materials that served as advance readings for the meeting:

  • Meeting Agenda
  • NCBE Testing Task Force Study Overview
  • Testing Task Force Phase 1 Executive Summary
  • Testing Task Force Phase 2 Report

Additional materials were provided closer to the meeting date for reference during the meeting:

  • Detailed results of Phase 2 Analysis
    • Tasks organized by overall frequency of performing with results provided by Practice Cluster
    • Knowledge Areas organized by overall importance with results provided by Practice Cluster
  • Proposed Blueprint Structure – tasks organized under seven skill domains and knowledge areas
  • Information about the structure of the current Uniform Bar Examination (UBE) and Multistate Professional Responsibility Examination (MPRE)
  • Current MPRE Subject Matter Outline

Dr. Chad Buckendahl and Dr. Susan Davis-Becker from ACS Ventures LLC (ACS) facilitated the meeting. Staff from NCBE (Kellie Early and Dr. Mark Raymond) and the Chair of the TTF (Hon. Cindy Martin) attended the meeting to observe. Following introductory remarks by Judge Martin, ACS led an orientation that began with a review of the primary claim for the intended use of scores from the bar exam:

To protect the public by helping to ensure that those who are newly licensed possess the minimum knowledge and skills to perform activities typically required of an entry-level lawyer.

ACS then provided an overview of the TTF study, the purpose and function of a test blueprint, and the plan for the three-day meeting. The facilitators walked the panelists through the meeting materials and explained how to interpret the results of the Phase 2 practice analysis. This orientation took the majority of the first day of the meeting.

The general discussion began after the orientation with a review of the job tasks from the practice analysis survey. Specifically, the full list of 179 tasks was reduced to the 136 tasks that were rated as being performed Frequently or Moderately by 50%¹ or more of the survey respondents (a brief illustrative sketch of this screening rule follows the list below). Further, the tasks identified for review were organized by the TTF and ACS under these seven skill domains:

  • Legal Research
  • Legal Writing and Drafting
  • Client Counseling and Advising
  • Issue Spotting and Evaluation
  • Investigation and Analysis
  • Negotiation and Dispute Resolution
  • Client Relationship and Management
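
For readers who find a concrete example helpful, the following minimal Python sketch illustrates the screening rule just described; the task names and rating shares are invented, and only the 50% threshold and its 3% margin-of-error relaxation (to 47%; see note 1) come from the report.

```python
# Minimal sketch of the task-screening rule (illustrative data only).
# A task is retained if the share of respondents who rated it as performed
# Frequently or Moderately is at least 50%, relaxed to 47% to account for
# the survey's 3% margin of error (see note 1).

THRESHOLD = 0.50
MARGIN_OF_ERROR = 0.03
CUTOFF = THRESHOLD - MARGIN_OF_ERROR  # 0.47

# Hypothetical ratings: fraction of respondents rating each task as
# performed Frequently or Moderately.
task_ratings = {
    "Research controlling case law": 0.91,
    "Draft an engagement letter": 0.62,
    "Argue before an appellate court": 0.12,
}

retained = [task for task, share in task_ratings.items() if share >= CUTOFF]
print(retained)  # tasks carried forward for BDC review
```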

The BDC reviewed each task and discussed its relevance to practice by NLLs based on the ratings collected during the practice analysis, including (1) the overall frequency ratings (how frequently the tasks were performed by NLLs), (2) the frequency ratings by Practice Cluster (survey respondents grouped by practice areas), and (3) the frequency ratings by those survey respondents identified as NLLs versus those who were not NLLs. The result of each task-level discussion was a recommendation as to whether the task should be included within that skill domain as being representative of the activities required of NLLs. The BDC also recommended consolidation of some tasks to eliminate overlap or redundancy.

After reviewing all 136 tasks in this manner, the BDC was asked to consider how much emphasis or weight should be given to the seven skill domains on the bar exam, considering three models: (1) equal weighting for each skill domain; (2) natural weighting, in which the weight is determined by the number of tasks under each skill domain; and (3) weighting based on the judgments of the BDC. The BDC panelists opted for the third model and applied their judgment to reach consensus on recommended weights for each skill domain. This activity concluded at the end of the second day.
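
As a concrete illustration of these three models, the Python sketch below compares them for three of the skill domains; the task counts are those later shown in Table 1, while the panelist weight judgments are invented.

```python
# Illustrative comparison of the three weighting models the BDC considered.
# Task counts for the three example domains are taken from Table 1; the
# panelist judgments below are hypothetical.

task_counts = {
    "Legal Research": 5,
    "Legal Writing and Drafting": 24,
    "Issue Spotting and Evaluation": 7,
}
total = sum(task_counts.values())

# (1) Equal weighting: every skill domain receives the same share.
equal = {domain: 1 / len(task_counts) for domain in task_counts}

# (2) Natural weighting: weight proportional to the number of tasks.
natural = {domain: count / total for domain, count in task_counts.items()}

# (3) Judgment-based weighting (the model the BDC chose): each panelist
# proposes weights, and the recommendation is the panel average.
panelist_judgments = [
    {"Legal Research": 0.20, "Legal Writing and Drafting": 0.50,
     "Issue Spotting and Evaluation": 0.30},
    {"Legal Research": 0.15, "Legal Writing and Drafting": 0.55,
     "Issue Spotting and Evaluation": 0.30},
]
judgment = {
    domain: sum(p[domain] for p in panelist_judgments) / len(panelist_judgments)
    for domain in task_counts
}
```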

The third day of the meeting was focused on reviewing the knowledge areas from the practice analysis. The full list of 77 knowledge areas from the practice analysis survey was reduced to 25 by prioritizing those areas that were rated as Important by 50%² or more of the survey respondents. The BDC reviewed each knowledge area and discussed its relevance to practice by NLLs based on the overall importance ratings, the importance ratings by Practice Cluster, and the importance ratings by those respondents identified as NLLs versus those who were not NLLs. The result of each knowledge area discussion was a recommendation as to whether the area should be included on the bar exam.

After making decisions about which knowledge areas to recommend for inclusion, the BDC considered how much emphasis or weight should be given to each knowledge area on the bar exam. The BDC also considered generally whether knowledge areas should be measured in a content-dependent context (necessary legal resources are not provided) or in a content-independent context (necessary legal resources are provided). There was not enough time at the end of the third day to reach consensus on weighting or on how the knowledge areas should be measured. Therefore, the facilitators introduced the topic, and the panelists were sent a post-meeting assignment to provide their recommendations.

Each panelist was also asked to complete an online evaluation of the blueprint development meeting.

Results

Skill Domains

In total, the BDC identified 103 tasks from the original list of 136 as representative of the seven skill domains identified for assessment on the bar exam: 9 of the original 136 tasks were consolidated to eliminate redundancy, and 24 tasks were recommended for exclusion, most because the BDC concluded that they exceeded the scope of an NLL’s practice. As an example, “Prepare or designate record for appellate or other post-judgment review” is a task that the BDC recommended for exclusion. The list of tasks that were considered and the results can be found in Appendix A.

Table 1 shows, for each skill domain, the number of tasks, a general description of the domain, and the recommended weighting. The weighting is shown as the average of the weights recommended by the BDC panelists; a range of roughly ±3% around that average is shown in parentheses.

Table 1. Skills Results

Skill Domain | Tasks | Description of Domain | Weighting (%)
Legal Research | 5 | Researching the Law, Written/Reading Comprehension, Critical/Analytical Thinking | 17.5 (15–20)
Legal Writing and Drafting | 24 | Written Expression, Critical/Analytical Thinking | 14.5 (12–17)
Client Counseling and Advising | 14 | Oral Expression, Oral Comprehension, Cultural Competence, Advocacy, Critical/Analytical Thinking, Problem Solving, Practical Judgment | 11.9 (10–15)
Issue Spotting and Evaluation | 7 | Identifying Issues, Observant, Critical/Analytical Thinking | 17.5 (15–20)
Investigation and Analysis | 17 | Interviewing/Questioning, Fact Gathering, Cultural Competence, Problem Solving | 17.5 (15–20)
Negotiation and Dispute Resolution | 23 | Negotiation Skills/Conflict Resolution, Creativity/Innovation, Expressing Disagreement, Written Expression, Oral Expression, Oral Comprehension, Advocacy, Practical Judgment | 11.9 (10–15)
Client Relationship and Management | 13 | Networking and Business Development, Resource Management/Prioritization, Organization, Strategic Planning, Managing Projects, Achievement/Goal Orientation, Practical Judgment, Decisiveness, Cultural Competence | 9.2 (7–12)

Knowledge Areas

In deciding which knowledge areas to recommend for inclusion on the exam, the BDC endorsed 11 of the original 25 areas that were rated as Important by 50% or more of the survey respondents. The list of knowledge areas that were considered and the results can be found in Appendix B.

The BDC further recommended that the following six knowledge areas should be excluded as stand-alone topics and coverage of these areas should be subsumed under other knowledge areas and skills:

  • Statutory Interpretation Principles → subsumed under Skills and Constitutional Law
  • Uniform Commercial Code → subsumed under Business Organizations or Contract Law
  • Remedies → subsumed under all knowledge areas
  • Civil Rights → subsumed under Constitutional Law
  • Landlord-Tenant Law → subsumed under Real Property and/or Contract Law
  • Debtor-Creditor Law → subsumed under Business Organizations and/or Contract Law

For each recommended knowledge area, Table 2 below shows the recommended weighting (average of BDC panelists’ judgments along with a range of ±3%) and measurement approach (reflecting the consensus of at least two-thirds of the panelists) based on the post-meeting survey completed by 15 panelists. The survey asked for their recommendations on

  1. the weighting to be given to each knowledge area; and
  2. the measurement focus for each knowledge area as either testing knowledge of legal doctrine (content-dependent—legal resources not provided) or applying skills in the area (content-independent—legal resources provided).

Table 2. Knowledge Area Results

Knowledge Area | Weighting (%) | Measurement Approach
Business Organizations | 7 (4–10) | Knowledge (content-dependent)
Professional Responsibility, Ethics | 7 (4–10) | Knowledge (content-dependent)
Legal Research Sources & Methods | 8 (5–11) | Applying skills (content-independent)
Constitutional Law | 9 (6–12) | Knowledge (content-dependent)
Dispute Resolution* | 9 (6–12) | Applying skills (content-independent)
Real Property | 9 (6–12) | Knowledge (content-dependent)
Torts | 9 (6–12) | Knowledge (content-dependent)
Evidence | 10 (7–13) | Knowledge (content-dependent)
Criminal Law & Procedure | 10 (7–13) | Knowledge (content-dependent)
Contract Law | 10 (7–13) | Knowledge (content-dependent)
Civil Procedure | 11 (8–14) | Knowledge (content-dependent)

* This knowledge area represents the combination of Alternative Dispute Resolution and Trial Advocacy and Practice.

In submitting their recommendations regarding the knowledge areas after the meeting, the BDC panelists were invited to provide comments supporting their recommendations. The survey responses of those panelists who provided comments are set out in Appendix C.

Finally, the panelists were provided with the opportunity to complete an evaluation of the meeting and results by answering four selected-response questions and one open-ended question allowing for additional comments (see Appendix D). In general, the panelists felt prepared for the meeting based on their advance reading, rated the orientation and training portion of the meeting as successful, and felt that the recommended knowledge areas and skill domains represent appropriate expectations for NLLs. There were mixed feelings on the amount of time allocated for discussions, with responses ranging from “more than enough time” to “not enough time.” This might be due to the need for a post-meeting assignment to complete the BDC’s work on weighting and measurement focus for the recommended knowledge areas.

Summary

Through review of the Phase 2 practice analysis results and discussion among the panelists, the BDC reached consensus and recommended 7 skill domains, comprising 103 representative tasks, and 11 knowledge areas for inclusion on the bar exam, with a targeted weighting for each. For 2 of the 11 knowledge areas—Legal Research Sources & Methods and Dispute Resolution—the BDC recommended that the assessment focus on applying skills in the area in a content-independent context (legal resources provided). Overall, the BDC suggested measurement targets of 30–40% weight on skills and 60–70% weight on knowledge for the bar exam.

Test Design Committee Meeting

TDC Panelists

The TTF invited each jurisdiction to nominate, via an online form, a representative (bar administrator, bar examiner, or justice) to serve on the TDC. The TTF selected from the nominees to achieve a mix of roles, jurisdiction sizes, and other demographic variables. The TTF also invited individual deans and faculty members from a variety of law schools to serve. The panel of 28 was composed of 11 educators, 9 bar examiners, 6 bar administrators, and 2 justices; 10 of the panelists were female and 7 were people of color. Each panelist had experience educating law students, administering the bar exam, serving as a bar examiner, or, in the case of the justices, serving as liaison between a state supreme court and the state’s board of bar examiners.

Process

The TDC completed its work between July 16 and August 4, 2020, through two virtual meetings of five hours per day over three days (Meeting 1 on July 16–17 and Meeting 2 on August 4), with an offline review of written materials before Meeting 1 and between the meetings. The third meeting day (August 4) was added after the meeting was switched from an in-person to a virtual format. Seven of the TDC panelists were not available on August 4; therefore, 28 panelists were present for Meeting 1 and 21 were present for Meeting 2.³ Those who could not attend Meeting 2 were given the opportunity to provide written input before and after the meeting.

Each panelist was provided with the following materials to read prior to Meeting 1:

  • Meeting Agenda
  • Testing Task Force Study Overview
  • TDC Role and Discussion Topics
  • Phase 1 Report Executive Summary
  • TTF Study Phase 2
  • BDC Meeting Summary
  • Summaries of Test Design Considerations

Dr. Chad Buckendahl and Dr. Susan Davis-Becker from ACS Ventures LLC (ACS) facilitated the meeting. Staff from NCBE (Kellie Early, Danielle Moreau, and Dr. Mark Raymond) and the Chair of the TTF (Hon. Cindy Martin) attended the meeting to observe. Judge Martin made introductory remarks and then ACS led an orientation that began with a review of the primary claim for the intended use of scores from the bar exam:

To protect the public by helping to ensure that those who are newly licensed possess the minimum knowledge and skills to perform activities typically required of an entry-level lawyer.

ACS then provided an overview of the TTF study, the purpose and function of a test design, and the plan for the three-day meeting. The ACS facilitators walked the panelists through the meeting materials and explained how each document related to the TDC’s work.

After the orientation, the panel was split into two groups, and an ACS facilitator guided each group through a discussion of the test design topics and specific questions listed below.

1. Structure

1.1 How many decisions will there be for the bar exam? [Will there be one overall pass/fail decision, or will candidates have to pass individual components?]

1.2 How many components will there be for the bar exam?

2. Knowledge Areas & Skill Domains

2.1 How should each skill domain be represented on the exam?

2.2 How should each knowledge area be represented on the exam?

3. Assessment Methods

3.1 What methods should be employed and to what degree?

4. Administration

4.1 When should the bar exam be administered? [At what point in their legal education would candidates be eligible to take the exam?]

4.2 How frequently should the bar exam be administered?

5. Score Interpretation and Use

5.1 What features are important for score portability?

6. Accessibility and Fairness

6.1 What flexibility is important in administration?

6.2 What accommodations should the design consider?

As noted in the TDC’s reading materials, the TTF decided that the next generation of the bar exam will be a computer-based test, administered either at testing centers or on examinees’ laptops, so the TDC did not discuss the issue of delivery mode.

The TDC panelists recognized the interconnectedness of these topics and spent the meeting time sharing their opinions and discussing advantages and challenges associated with the options. Most of the discussion time was spent on topics 1 (Structure), 2 (Knowledge Areas & Skill Domains), 3 (Assessment Methods), and 4 (Administration). Although there was some discussion of topics 5 (Score Interpretation and Use) and 6 (Accessibility and Fairness), no specific suggestions were formulated on these topics.

The ACS facilitators identified the general opinions expressed in the two group discussions during the first day, noting similarities and differences in viewpoints. At the beginning of the second day, the facilitators presented the main points from each group’s discussion to the entire TDC. The TDC then split again into its two groups to continue working through the design topics.

After Meeting 1, ACS consulted with NCBE staff and the TTF Chair to develop three draft design models that represented the range of opinions expressed by the TDC panelists on design topics. These three models were sent to the panelists prior to Meeting 2, and they were asked to comment via a survey. Responses to the survey were provided by 21 panelists.

At the beginning of Meeting 2, ACS shared the results of the survey regarding the three draft design models and presented a fourth model that consolidated several features from the original models and drew from the strengths of each, as identified through the survey responses.

At the conclusion of Meeting 2, the TDC was mostly in accord on several features. One topic in particular—the timing of administration of the exam—was not resolved. Therefore, ACS distributed a second survey to all 28 TDC panelists requesting their opinion on that topic. 

Each panelist was also asked to complete an online evaluation of the test design meeting.

Results

The results from Meeting 1 reflected a diverse set of opinions and perspectives as to what design features would best meet the claim outlined for the intended use of bar exam scores. The TDC was largely split on whether the design should use compensatory scoring (with scores on each component combined to produce one overall pass/fail decision for licensure) or conjunctive scoring (with scores on each component treated as separate pass/fail decisions and a requirement that candidates pass each component to be licensed). Under a compensatory design, candidates may compensate for a weak performance on one component with a strong performance on another. Under a conjunctive design, candidates must demonstrate the required level of proficiency on each component. The other design feature on which there was a diversity of opinions was whether to use a single-event (one exam administration taken after completion of law school) or a multi-event (exam administered as separate components with the option to take the first component during law school) administration model.⁴ Therefore, the three draft design models detailed in Appendix E were created using these decision points as the key differentiators.
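
The practical difference between the two scoring approaches can be expressed in a short Python sketch; the component names anticipate the two-component structure described below, and the score scale and cut scores are purely hypothetical.

```python
# Sketch contrasting compensatory and conjunctive scoring for a
# two-component exam. All cut scores and the scale are hypothetical.

PASS_COMBINED = 270  # hypothetical overall cut score (compensatory)
PASS_EACH = 135      # hypothetical per-component cut score (conjunctive)

def compensatory_pass(doctrine: float, skills: float) -> bool:
    """One overall decision: strength on one component can offset
    weakness on the other."""
    return doctrine + skills >= PASS_COMBINED

def conjunctive_pass(doctrine: float, skills: float) -> bool:
    """Separate decisions: the candidate must clear the cut score on
    every component independently."""
    return doctrine >= PASS_EACH and skills >= PASS_EACH

# A candidate with an uneven profile illustrates the difference:
print(compensatory_pass(160, 120))  # True:  160 + 120 = 280 >= 270
print(conjunctive_pass(160, 120))   # False: 120 < 135
```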

Each of the draft design models assumed that the bar exam would include two components—Application of Core Doctrinal Law and Application of Lawyering Skills—and would be administered using a range of assessment methods/formats. TDC panelists were instructed to recognize that the assessment of knowledge and skills is interconnected. That is, examinees apply lawyering skills, such as issue spotting and analysis, when demonstrating their knowledge of legal doctrine, and some knowledge of or familiarity with legal doctrine is generally needed when demonstrating lawyering skills.

A common feature of each draft model is a test of Knowledge of Professional Responsibility that would be administered separately from the bar exam and could be taken during law school or after graduation.

  • Model A: This model proposed single-event administration and compensatory scoring (component scores combined for one pass/fail decision). Candidates would take the exam as a single event after completing law school and would be required to retest on all components if they failed to achieve the combined passing score.
  • Model B: This model proposed multi-event administration and conjunctive scoring (separate pass/fail decision for each component). Candidates would be eligible, but not required, to take the Application of Core Doctrinal Law component after completing 60 credits of law school. The Application of Lawyering Skills component would be taken after completing law school. Candidates would be required to retest on only the component they failed to pass.
  • Model C: This model proposed a hybrid approach to both administration and scoring. Candidates would take the exam after completing law school but not necessarily as a single event, meaning they would have the option to take each component separately and would retake only the component(s) they failed. Hybrid compensatory scoring would be used, with one overall pass/fail decision based on a combined score but with a minimum required score set for each component.

The three draft test design models were emailed to all 28 TDC members after Meeting 1, along with an online survey; 21 members responded to the following questions:

  1. What they perceived as the strengths of each model that would best help NCBE meet the stated claim
  2. What they perceived as the weaknesses of each model that would detract from NCBE’s ability to meet the stated claim
  3. Their overall opinion of each model rated on the following scale:
    1. I like it and would support it
    2. I can live with it and would support it
    3. I do not like it and would not support it

The ratings for each model are shown in the table below, as well as a set of consolidated ratings where the responses of “I like it” and “I can live with it” are combined. Overall, the panel rated models A and B similarly but gave lower ratings to model C.

Ratings of Draft Design Models
Original Ratings | A | B | C
I like it and would support it | 6 | 7 | 7
I can live with it and would support it | 11 | 10 | 6
I do not like it and would not support it | 4 | 4 | 8

Consolidated Ratings | A | B | C
I like it OR I can live with it | 17 | 17 | 13
I do not like it and would not support it | 4 | 4 | 8

Additional feedback included the strengths and weaknesses the TDC perceived for each model. The full set of feedback is included in Appendix F and is briefly summarized below.

  • Model A: The TDC recognized that this single-event administration with compensatory scoring model was most similar to the current bar exam and would not require major changes for stakeholders. Many supported this model because compensatory scoring affords candidates the opportunity to balance their strengths and weaknesses, while a single-event administration requires them to pull together everything they learned in law school. Conversely, the TDC noted that, given the similarity to the current design, this model would not take advantage of the opportunity to enhance the design of the bar exam. Some highlighted that compensatory scoring adds to candidates’ stress about the exam because it requires unsuccessful candidates to retake the entire exam.
  • Model B: The TDC identified the benefits of this multi-event administration with conjunctive scoring model to be that it would allow the testing of core doctrinal knowledge closer to when it is learned, would be less stressful for candidates, and might be viewed by candidates as being fairer because they could focus on demonstrating their proficiency on each component separately. The option to take the Application of Core Doctrinal Law component while in law school allows candidates to make a choice that works best for them and may enable some candidates to get licensed more quickly. On the downside, some noted that allowing testing during law school might change the educational experience (e.g., change timing of taking courses, cause students to miss out on internship/externship/employment opportunities) and might add stress for candidates who are trying to prepare for a high-stakes exam while still in law school.
  • Model C: Of the three models, the TDC panelists provided the most consistent feedback on this hybrid model; they highlighted the advantages of allowing candidates to combine their scores on the two components for a single pass/fail decision but with minimum score requirements for each, and allowing candidates to retake only the component they failed. The most frequently noted weakness was the lack of opportunity for candidates to take a component earlier in law school and related concerns about delay in licensure after graduation because of the time needed to score the Application of Lawyering Skills component.

ACS focused on the most common positive and negative comments about each model and created Model D as a fourth option (see Appendix G) to share with the TDC during Meeting 2. This model adopts the hybrid scoring approach from Model C (component scores combined for a single pass/fail decision but with minimum score requirements for each component) and the optional multi-event administration approach from Model B (candidates may take the Application of Core Doctrinal Law component during law school if they wish).

The TDC discussed each design feature of Model D. The prevailing views of the TDC members who participated in Meeting 2 are summarized below.

Structure: The TDC generally supported the structure of two components (Application of Core Doctrinal Law and Application of Lawyering Skills) for the bar exam and a separate exam on Knowledge of Professional Responsibility. Pass/fail decisions for the bar exam would be based on a compensatory score for the exam but with minimum score requirements for each component. The compensatory score would be a weighted combination of the scores on the two components, and the TDC suggested either a 50/50 weighting (equal weight between the two components) or a 60/40 weighting with the higher weight allocated to the Application of Lawyering Skills component. Again, it is important to recognize the interdependence of knowledge and skills and that the two are not necessarily assessed separately.
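
As a rough illustration of this hybrid rule, the Python sketch below combines the two component scores using the suggested 60/40 weighting (with the higher weight on Lawyering Skills) and enforces a per-component minimum; the scale, cut score, and floors are hypothetical.

```python
# Sketch of the hybrid (compensatory-with-floors) rule: one pass/fail
# decision on a weighted combined score, plus a minimum score on each
# component. The 60/40 weighting follows the TDC's suggestion; all
# scores and cut points are hypothetical.

SKILLS_WEIGHT = 0.60     # Application of Lawyering Skills
DOCTRINE_WEIGHT = 0.40   # Application of Core Doctrinal Law
PASS_COMBINED = 135.0    # hypothetical cut on the weighted scale
COMPONENT_FLOOR = 120.0  # hypothetical per-component minimum

def hybrid_pass(doctrine: float, skills: float) -> bool:
    combined = DOCTRINE_WEIGHT * doctrine + SKILLS_WEIGHT * skills
    floors_met = doctrine >= COMPONENT_FLOOR and skills >= COMPONENT_FLOOR
    return combined >= PASS_COMBINED and floors_met

# A high combined score alone does not pass if one component is too weak:
print(hybrid_pass(150, 140))  # True:  combined = 144.0, both floors met
print(hybrid_pass(180, 110))  # False: combined = 138.0, but skills < 120
```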

Application of Core Doctrinal Law component: For this component, the TDC did not unanimously agree on the appropriateness of assessing some of the knowledge areas recommended for inclusion by the BDC; both the BDC and the TDC agreed, however, that the depth and breadth of coverage in the knowledge areas tested should be limited to the core legal principles that NLLs need to know without “looking it up” (i.e., they should be able to issue spot and know the basic rules but should not be expected to know “the exceptions to the exceptions”).

Application of Lawyering Skills component: The TDC showed unanimous support for measuring skills such as Legal Writing, Legal Research, Issue Spotting and Evaluation, and Investigation and Analysis. With respect to assessing legal writing skills, some TDC panelists suggested that the written responses be scored on both content and effectiveness of writing, with some proposing that the effectiveness of writing should enhance a candidate’s score only if the content is correct. For Professional Responsibility and Ethics, the TDC acknowledged the importance of the subject matter but did not want to see it tested as a core knowledge area on the bar exam because it would duplicate content tested on the Multistate Professional Responsibility Examination (MPRE). The TDC suggested that Professional Responsibility could serve as the context for questions in the Application of Lawyering Skills component to assess skills such as Issue Spotting and Evaluation, with the Model Rules of Professional Conduct being provided as a resource to use during testing. Some members of the TDC expressed strong concerns that the skills of Client Counseling and Advising, Client Relationship and Management, and Negotiation and Dispute Resolution could not be measured objectively and without bias, and the importance of those concerns is noted. In terms of methods for assessing skills, the TDC generally supported the idea of case studies (e.g., written fact scenarios or video simulations) using multiple item types (e.g., short answer, selected response, extended response) with a library of legal resources provided.

Administration: Most of the TDC panelists were supportive of the idea that candidates should have the option to take a component of the bar exam during law school, but a few panelists were adamantly opposed, voicing concerns about the impact on the law school curriculum and on law students. Near the end of the meeting, discussion arose about which component would be more appropriate for testing during law school if such an option were offered. There was not enough time remaining to resolve the issue, so a short follow-up survey was distributed to the TDC posing the following questions:

  1. Would you endorse candidates being able to take a component of the bar exam after completing 60 hours of law school?
  2. If there was an option for candidates to take a component of the bar exam after completing 60 hours of law school, which component do you think should be available?

The results of the survey, including comments, are provided in Appendix H. There were 21 responses, with 13 answering “yes” to the first question of whether candidates should be able to take a component after completing 60 hours of law school and 8 answering “no.” Thus, based on the yes/no votes, the results show a preference for allowing candidates to take components of the bar exam during law school. The comments accompanying the “no” votes showed that two of the eight respondents did want candidates to have the option to take a component in law school, but at an earlier point than after completing 60 credits. On the second question, 17 panelists selected “Doctrinal Law” and 4 selected “Lawyering Skills.” Although question #2 asked them to select either doctrine or skills, a couple of panelists indicated in their comments that they would prefer candidates have the option to take either component, or both, during law school.

Following completion of the meeting, each panelist was asked to complete an online evaluation of the test design process. The results are set out in Appendix I. Overall, the panelists felt they were very prepared for the first meeting and that the right amount of time was allocated to discussing each test design topic. Similar responses were provided regarding Meeting 2, with a few panelists indicating that more discussion time would have been beneficial. Panelists’ overall evaluation of the process was generally positive. 

Summary

Through review of the Phase 1 input from stakeholders, review of the BDC recommendations based on the Phase 2 practice analysis results, and review of the additional reading materials that were provided, the TDC discussed various options for test design. Their opinions on test design issues were not unanimous, and the TDC’s discussion rightly reflected the interconnectedness and complexity of some of the test design issues while providing valuable insight into the benefits and challenges of various approaches to those design issues.

Notes

1 To account for a margin of error of 3%, the list reviewed by the BDC included tasks rated as being performed Frequently or Moderately by 47% or more.

2 To account for a margin of error of 3%, the list reviewed by the BDC included knowledge areas rated as Important by 47% or more.

3 The panelists present for Meeting 2 consisted of 10 educators, 7 bar examiners, and 4 bar administrators.

4 Under either administration model, jurisdictions could permit candidates to take components that are to be completed “after law school” prior to graduation, as is the case with the current bar exam.