assessment

Going Back to the Basics: Low-Tech Assessment Methods in Large Doctrinal Classes


Teaching Idea for February.

By Sandra Simpson, Professor, Gonzaga University School of Law.

Even in large doctrinal courses, it is possible to engage and assess the entire class with low-tech methods.  I teach a Real Estate Transactions course to 60-plus students every spring.  One effective method is using 3M posterboards (poster-sized sticky notes) for groups to “publish” their work.  I used this method this week while we were reviewing covenants versus conditions: I needed to know where my students stood in their understanding of these basic contract terms, so I returned to this basic, low-tech assessment.

After I found the 3M posterboard pad (in a lonely, dusty corner closet), I posted 23 sheets around the room before the students arrived.  Once the students arrived[1], I had them form groups of three.[2]  I asked the groups to read the following clause: “Seller to provide the buyer with a certificate of occupancy prior to closing.”  The students were then asked to determine whether this clause creates a promise or a contingency.  After five minutes of group discussion, I asked random groups to argue for one reading or the other, and we discussed why the distinction matters.  Students soon realized the clause can be argued either way, which is not ideal for a real estate contract; the ambiguity can lead to litigation, affecting the parties’ contract rights.

For the next step, I asked the students to redraft the clause twice: once to create a clear promise and once to create a clear contingency.  The students wrote the two clauses on their 3M poster paper.  After every group had finished drafting and posted its paper on the wall, I asked the students to walk around and read the other groups’ clauses.  Each group marked the one it liked best (groups could not vote for their own).

After all the students sat down, we looked at the votes to identify the best clauses and to debrief the exercise.  The voting showed two very different drafting techniques tied for best, which highlighted some drafting issues and sparked a discussion of different methods for creating a promise or a contingency.  The entire exercise took 30 minutes, but it engaged the whole class.  As an additional bonus, the posterboards remained on the walls for the rest of the class period, allowing me to walk around (while students worked on another problem) and read all the students’ work, which created another opportunity to talk with the groups and answer lingering questions.

[1] It was really fun to listen to their reactions to the paper being posted around the room.  They were very curious and excited.

[2] You can form the groups yourself, particularly if you want to pair strong and weak students.

Visual Aids for the Law Classroom


By Aaron Caplan, Loyola Law School, Loyola Marymount University

Visual aids are not the most important thing a law teacher does in the classroom.  They can never substitute for well-chosen material, clear organization, thoughtfully chosen in-class activities, being a good explainer or being a good listener.  With that said, good visual aids can help students learn more effectively – and bad visual aids make learning harder.

A series of videos based on a presentation I gave at the AALS New Law Teachers Workshop in June 2019 explores what makes visual aids successful.  The first segment covers the psychology of multimedia learning, providing a theory for preparing visual aids that complement one’s lesson plan rather than detract from it.  The following segments provide examples of visual aids that I have used with success in various classes, including illustrations, visual renderings of legal texts, visualizations of concepts, and more.

The videos can be reached here:  www.lls.edu/CaplanVisualAids/

Assessing Legal Research Skills: A Fresh Approach


By Eric Voigt, Faulkner University, Jones School of Law

I have asked myself many times, “Self, could my first-year law students research a legal issue without any guidance from me?” You have probably asked yourself a similar question if you teach a skills-based course. This semester, I decided to create a new assessment measure to answer my question: an online research exam.

Summary of How I Teach Legal Research

My students learn to perform legal research through multiple methods. Students first read the assigned chapters from my textbook, Legal Research Demystified: A Step-by-Step Approach. Students then jump online and answer multiple-choice questions on Core Knowledge for Lawyers (https://coreknowledgeforlawyers.com). Core Knowledge automatically grades each answer and provides an explanation (similar to Core Grammar) to reinforce basic research concepts. Next, students complete guided research exercises using the research services and tools they just read about. During class, I discuss commonly missed questions and answer any remaining ones. Last, students must apply their research skills to the open memo problem—once again, with guidance from me.

Purpose of Online Research Exam

Despite those formative assessments, I wanted a higher degree of confidence that my students could “fly the research nest” and answer a legal question on any unfamiliar issue. To that end, I am creating an online research exam that my students will take this semester. I have one primary purpose: determine whether my students could find—and understand—relevant statutes and interpretive cases without guidance from me.

Content of Online Research Exam

For my research exam, students will not simply answer questions on research concepts (e.g., What is KeyCite?). Instead, students will resolve a client’s legal question using Westlaw or Lexis Advance. Specifically, they will research state statutes and update them, including confirming their validity, checking effective dates, and reviewing amendments. They will also need to find cases that have interpreted the statutes. Last, students will synthesize the relevant rules and authorities and predict the client’s likelihood of success.

Delivery Format of Online Research Exam

Students will complete my research exam electronically, directly on TWEN (The West Education Network), which is my course management system. (Next year, students will be able to complete the research exam on Carolina Academic Press’s platform, Core Knowledge.) Most of the exam consists of multiple-choice questions, but it also has a few fill-in-the-blank questions and one short-answer question. The final question, for instance, requires students to follow CRAC principles (Conclusion-Rule-Application-Conclusion) and write a few paragraphs on whether the client would prevail.

By placing the exam online, I can include questions that build upon prior ones, allowing me to assess students’ understanding of different steps of the research process. For example, suppose a student finds the wrong statutes in response to an initial question. I could still assess whether the student understands how to update the statutes by identifying the correct statutes in subsequent questions and asking about their validity and effective dates.

Because some questions provide the answers to prior ones, I will establish certain limits. Using TWEN’s advanced options, I will prevent students from downloading the exam and viewing any subsequent question until they have answered the question on their screen (called “sequential quizzing”). I will also have TWEN grade the first selected answer for each question, so a student cannot change an answer based on what the student learns from later questions.

I will have students take the exam outside of the classroom, so they will not be limited to our eighty-minute class periods. Students will have a three-day window to start the research exam; once started, they will have three continuous hours to complete it. Students will need the extra time to discern the relevant from the irrelevant authorities and to analyze how the law applies to the client’s situation.

TWEN’s Grading Features

TWEN has several useful grading features. It automatically grades the multiple-choice and fill-in-the-blank questions. For short-answer questions, a professor can electronically mark each one correct or incorrect and can even assign partial credit. TWEN then tallies each student’s scores on all questions. The professor can “release” the grades for all students, allowing each student to view only his or her own grade.

Benefits of an Online Research Exam

Assigning an online research exam has multiple benefits to professors and students, such as the following:

  • Professors assess students without giving up an in-person class meeting.
  • Professors who assign the exam in lieu of in-person meetings (permitted under the ABA rules) could free up an entire week of classes to provide feedback on students’ draft memos.
  • Professors can ascertain whether students have learned how to do “real” legal research.
  • Students receive their exam grade immediately upon completion.
  • Students discover any weak research skills before the deadline of the open memo.
  • Students gain confidence in researching on their own and learn skills that can be applied to the open memo problem.

In short, an online research exam is a good assessment tool for first-year and upper-level students. It could be assigned in an integrated research and writing course or a stand-alone research class. If you would like a copy of my research exam, please email me at evoigt@faulkner.edu.

Racial Anxiety


By Anastasia M. Boles, UA Little Rock, William H. Bowen School of Law

As law professors, we care deeply about our students.  We put a tremendous amount of effort into our teaching, advising student organizations, and serving as formal and informal mentors.  Unfortunately, science has taught us that unconscious racism may be operating to degrade our student interactions.  Many of us are familiar with the term “implicit bias.”  Over the last few decades, social psychologists have explored the ways implicit preferences and biases permeate society, including criminal justice, health, and education.

While less well known than implicit bias, a common consequence of unconscious racism is “racial anxiety,” the unconscious anxiety we may experience or exhibit when interacting with a person of a different race.  For example, racial anxiety can cause physical changes we may not consciously notice, such as nervousness, discomfort, stiffness, and decreased eye contact.  The experience of unconscious racial anxiety sets up a vicious cycle; we unconsciously minimize interactions that have made us uncomfortable in the past, even if we cannot name the source of the discomfort.  Racial anxiety expresses itself differently depending on race – people of color may be anxious about experiencing racism; whites may fear saying the wrong thing or being labeled a racist.  Whatever the cause, as our cognitive resources are directed to mitigating the racial anxiety we are experiencing, the quality of our personal interaction with the differently-raced person can degrade.[1]

Racial anxiety is likely present in the halls and classrooms of law schools as well.  Despite our best intentions, law professors may experience racial anxiety symptoms in cross-racial conversations and interactions with our students.  At the same time, our differently-raced students may experience racial anxiety as they interact with us.  Consider this common scenario: a white law professor and a student of color meet outside of class for the first time to review an exam, talk about an issue from class, or discuss a paper.  Racial anxiety can affect the professor’s ability to build rapport with the student, appear open and friendly, evaluate the student’s learning needs, engage the student’s questions, and build trust.  The student of color, if also affected by racial anxiety, is less able to ask questions, absorb feedback, and seek mentoring.  If either the law professor or law student experienced unconscious racial anxiety during the meeting, future interactions between the professor and student may be affected.  Now imagine the potential for racial anxiety to disrupt the law school classroom when a sensitive issue related to race comes up in class discussion.  Racial anxiety may degrade the ability or willingness of the professor to engage the issue.  The ensuing student discussion could suffer.  Our students require our full attention; if racial anxiety is depleting the attention we give, we should do something about it.

What can we do?  If racial anxiety operates in our unconscious minds, can we ever hope to banish it?  The great news is that we can.  To combat racial anxiety, psychologists recommend that we start by increasing our cross-racial interactions with our students.  Psychologists call this “intergroup contact.”  Strategies such as encouraging students to attend office hours to increase familiarity, attending and supporting student events with differently-raced students, and increasing the amount and depth of conversations with differently-raced students can help.  During cross-racial interactions, seek to understand cultural differences as well as to identify similarities; the goal is to recognize and appreciate the varying cultural backgrounds of our students – not minimize them.  The more law teachers and law students from different racial backgrounds interact with one another, the less potential there is for racial anxiety to disrupt those interactions.

[1] For more information about racial anxiety see here, and here.

The Compounding Effects of Assessment


By Lindsey P. Gustafson, UA Little Rock, William H. Bowen School of Law

If you’ve found your way to the Institute of Law Teaching and Learning, you are likely already a believer in formative assessment. We do have empirical evidence that formative assessment improves student learning in law: two recent studies have shown that students who received individualized feedback during the semester outperformed students who did not on final exams, and not just in the class where they received the feedback but in every single class they were taking.[1] One study’s authors note that the “likelihood of this occurring by chance is one in 256.”[2]
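A quick aside on that figure: one in 256 is exactly (1/2)^8, the chance that eight yes-or-no comparisons would all break the same way if feedback made no difference. Reading the study’s number this way is an inference here, not the authors’ own explanation, but the arithmetic is easy to check:

```python
# "One in 256" equals (1/2) ** 8: the probability that eight
# independent coin-flip comparisons all favor the feedback group
# purely by chance. Treating the study's figure as eight binary
# comparisons is an inference, not the authors' stated derivation.
p_by_chance = (1 / 2) ** 8
print(p_by_chance)  # 0.00390625, i.e., exactly 1/256
```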

But as we add formative assessments to students’ semesters, we must consider how we are altering the demands on their time. The middle of the semester, which has traditionally been the playground for the Socratic method and for legal writing assignments, may now be filled with a variety of assessment activities, and some of them may dominate students’ time in a way that impacts students’ learning in other classes. When our assessments interfere with students’ participation in other classes, or vice versa, the inferences that we draw from our assessments about student learning may not be valid. And an assessment that provides invalid data is worse than no assessment at all. Consequently, we must all consider our assessments as students experience them, “holistically and interactively.”[3]

How do we deeply coordinate assessments and avoid an assessment system that instead overwhelms students, clutters or fragments their learning, or discourages them early in their first semester? We must coordinate beyond shared calendars, starting in our own classrooms by ensuring that our own assessment activities, as a slice of the student-time pie, are designed with and justified by best practices that encourage an assessment’s validity. In a recent article, I’ve identified five relevant best practices:

  1. Make the assessments’ alignment with learning goals transparent to students and to other faculty members with whom we intend to coordinate: A clear alignment with learning goals helps students understand how the assessments will move them towards learning goals, and helps them make informed decisions about their allocation of time. A clear alignment also allows us to clearly communicate our assessment choices to other faculty members.
  2. Use rubrics to create a shared language of instruction: Once we identify learning goals, rubrics help us refine our communication with students. They see how they will be assessed, and we see with specificity what they have learned.
  3. Ensure the assessments encourage student autonomy: One particularly harmful potential outcome of a tightly orchestrated assessment system is that it may overly dictate student decisions, rather than facilitate student autonomy. Our assessment systems should build students’ feelings of autonomy, competence, and relatedness, which are fundamental to learning.
  4. Set high expectations and display confidence that students can meet those expectations: Students prone to maladaptive responses to feedback are likely to be overwhelmed and discouraged by frequent assessments. Explaining our high expectations and displaying confidence in students can help address these tendencies.
  5. Regularly review the entire assessment system, paying particular attention to students’ ownership of their own learning within the system.

When we ground our formative assessment decisions in best practices, we are better able to communicate our decisions to students, and better able to more deeply coordinate with other faculty members.


[1] See Daniel Schwarcz & Dion Farganis, The Impact of Individualized Feedback on Law Student Performance, 67 J. Legal Educ. 139, 142 (2017) (finding that formative assessment improved performance on final exams for students with below-median entering credentials); Ruth Colker et al., Formative Assessments: A Law School Case Study, 94 U. Det. Mercy L. Rev. 387 (2017) (finding the same); Carol Springer Sargent & Andrea A. Curcio, Empirical Evidence That Formative Assessments Improve Final Exams, 61 J. Legal Educ. 379, 383–84 (2012) (finding that formative assessment improved performance on final exams for students with above-median entering credentials); Andrea A. Curcio, Gregory Todd Jones & Tanya M. Washington, Developing an Empirical Model to Test Whether Required Writing Exercises or Other Changes in Large-Section Law Class Teaching Methodologies Result in Improved Exam Performance, 57 J. Legal Educ. 195, 197 (2007) (finding the same); Andrea A. Curcio, Gregory Todd Jones & Tanya M. Washington, Does Practice Make Perfect? An Empirical Examination of the Impact of Practice Essays on Essay Exam Performance, 35 Fla. St. U. L. Rev. 271, 280–82, 302–06 (2008) (finding the same).

[2] Schwarcz, supra note 1, at 142.

[3] See Harry Torrance, Formative Assessment at the Crossroads: Conformative, Deformative and Transformative Assessment, 38 Oxford Rev. of Educ. 323, 334 (2012) (noting that “assessment is always formative, but not necessarily in a positive way”).

Review: Feedback Distortion: The Shortcomings of Model Answers as Formative Feedback


By Lindsey P. Gustafson, UA Little Rock, William H. Bowen School of Law

Elizabeth Ruiz Frost, Feedback Distortion: The Shortcomings of Model Answers as Formative Feedback, 65 J. Legal Educ. 938 (2016)

Elizabeth Ruiz Frost’s article Feedback Distortion: The Shortcomings of Model Answers as Formative Feedback was published in 2016, but it continues to affect the way I design and critique my students’ assessment activities—both in my classroom and across our curriculum—as we respond to the ABA’s mandate for more formative assessment. Professor Frost posits that, while providing a model answer (either student- or professor-authored) in place of individual feedback may allow for efficient formative feedback, in most situations it does not provide effective formative feedback. She points to evidence that weaker students tend to misinterpret model answers and are less capable of accurately assessing their own work against the model.

In her article, Professor Frost considers the reasons beyond efficiency a professor may have for giving feedback through a model answer: learning through a model answer encourages students to self-teach, a skill they will rely on throughout their careers; model answers provide feedback quickly, while students are still primed for it; model answers will not alienate students with personalized, negative comments; and model answers are what students clamor for. Professor Frost explains why each of these reasons is inadequate to justify what she describes as a shift in the learning burden: the professor avoids learning how to provide effective feedback by forcing the student to learn how to improve from a model.

Model answers provide effective formative assessment only if students are able to compare their work with a model and see what they did wrong. Professor Frost roots the assumption that students can do this in the “Vicarious Learning and Self-Teaching models of education, which have pervaded legal teaching since the nineteenth century.” In fact, whether this feedback is effective depends first on the characteristics and mindset of the learners, and second on the type of knowledge the professor is assessing. As to the first variable, because weaker students are less self-aware, they face a “double curse”: “[t]he weakest students, who lack the ability to distinguish between the standard exemplified by a model answer and their own work, will learn the least from a model answer. So the students who need feedback most for continued learning will get the least.”

The second variable is relevant because model answers can provide effective feedback for questions of factual knowledge and concept identification. But for any assessment that requires higher-order thinking—where students need to demonstrate analysis, for example—model answers are less effective. Students instead need elaborative feedback.

Professor Frost ends her article with methods for using model answers to give feedback that best promote student learning: (a) providing an annotated model answer together with individualized feedback; (b) creating opportunities for remediation and reassessment for students after they have reviewed model answers; (c) using a student’s own work as a model answer; (d) requiring students to review model answers in small groups instead of individually; (e) providing multiple sample answers for review, including both strong and weak samples; and (f) focusing on metacognitive skills throughout so that students can better self-evaluate against model answers.

Several of her methods have worked for my students. Recently, I’ve noticed the first method recommended above working across the curriculum: students learn more from a model answer when the same skill (here, answering a midterm essay question) is tested in another course and personalized feedback is given there. In short, learning in one course is improved by the efforts of professors in other courses.

Being human to my students and letting them know I care


By Jane Korn, Gonzaga University School of Law

I have taught first-year law students for a long time.  Please do not ask how long!  But years ago, I became worried about the mental health and stress levels of my first-semester, first-year students.  I teach a four-credit, one-semester course in Civil Procedure during the first semester of law school.  On the last day of each week that I teach Civ Pro, I take a few minutes of class time and ask my students to tell me how they are doing.

The first time I do this, usually at the end of the first week of law school, I tell my students that it is my custom, from time to time, to take time out from Civ Pro and talk about anything they would like (with some limits).  Some years, it takes weeks for them to take me up on this offer; other years, they start right in.  They ask questions like the following:

  1. When should I start outlining?
  2. How much time should I spend studying every night?
  3. How important is getting involved in extracurricular activities?
  4. What if I don’t know what kind of law I want to practice?
  5. Do professors care about grammar and organization on a final exam? (I only answer what I expect and do not answer for other faculty)

I think that much of the time, they do not get a chance to ask a law professor these kinds of questions and can usually only ask upper-class students.  While we have faculty advisors, students may or may not feel comfortable asking them questions like those above.  They eventually do (and sometimes quickly) feel comfortable asking me a wide variety of questions.  They sometimes ask personal questions and, within reason, I answer them because it makes them feel more comfortable with me.  Questions on gossipy matters about other faculty are off limits.  If, for example, they complain about another professor, I handle the question with a smile and say something like, “You should ask that professor about this issue.”

I set aside class time for several reasons.  First, while I do worry about giving up valuable teaching time, lessening my students’ stress may make them more able to learn.  Second, students often feel like they are the only one with a particular concern during the first semester, with no way of knowing that others share the same concerns or questions.  Many of our first-year students are not from this area and are far from their support systems, at least until they can make friends at law school.  Knowing that other students have the same problems can lessen the feeling of isolation.  Answering questions in front of the entire group helps with that sense of isolation and of being the only one who doesn’t know something.  It also shows students that their concerns are important and credible.

Every year my teaching evaluations reflect this process positively.  Students feel like I care (which I do).  However, the reason I do it is to increase their comfort during those first few exciting, confusing, and terrifying months of law school.

Content Analysis Coding Practice


By Sandra Simpson, Gonzaga University School of Law

Hi All,

I have been doing training on assessment practices, both for in-class use and for institutional programmatic assessment.  To that end, I am learning many techniques, which I am employing in my class to find out which teaching methods are working and which are not.  I learned the following coding method, which allows me to assess my students’ answers to open-ended survey questions: I look for themes in their answers.  I describe the system below.  Once I see the themes, I am able to respond and make changes.  Please contact me should you have any questions about the methodology or what I do with the information.  In traditional, interactive fashion, there is a practice exercise at the end so readers can see how simple this method is.

Method:

Course goal:  Students will learn how to locate print and on-line sources which are complete and relevant to solving a factual problem.

Question posed:  You are asked to do many assignments and activities in this class to help develop your legal research skills.  Please identify an assignment or type of activity that you found most helpful in developing your research skills.  Please include in your answer a specific description of what about the assignment or activity helped you.

I asked my students this question because I wanted to hear the student perspective on which types of assignments and activities most effectively helped them develop research skills.  I give many assignments, but I was unsure which ones were useful.  I also wanted to know what about each assignment was helpful: step-by-step instructions, group work, lecture, or the flipped classroom model.  I developed a coding system so that I could analyze the results, devising the codes after reviewing a randomized 20% sample of student responses (every fifth response turned in).

  1. Identification number for class level: (accelerated student=1; a traditional 1L=2)
  2. Overall response: (0=no response/question was unanswered; 1= student provided a usable response; 2=state/implied that research skills were not strengthened in LRW I course; 3 =response was either not useful or could not be coded)
  3. Positive mention of a structured assignment that led the student through step-by-step instructions to develop research skills. (0=no; 1=yes)
  4. Positive mention of a structured assignment that required students to use or develop research skills, with no mention of step-by-step guidance being useful. (0=no; 1=yes)
  5. Positive mention of a structured assignment which required students to work collaboratively. (0=no; 1=yes)
  6. Positive mention of lecture on how-to-do research in print by LRW professor. (0=no; 1=yes)
  7. Positive mention of demonstration on how-to-do research on-line by the librarians. (0=no; 1=yes)
  8. Positive mention of video demonstration on how-to-do digest research in print which is uploaded to the TWEN page. (0=no; 1=yes)
  9. Positive mention of one-on-one assistance of a faculty member. (0=no; 1=yes)
  10. Positive mention of one-on-one assistance of a librarian. (0=no; 1=yes)

Use the coding scheme above to code the following three responses.  Each student gets one row on the coding sheet.

Student 1: Accelerated student: I learned the most about research when we did the mini assignments on finding cases in the digests in print.  It was most effective to me as we were allowed to work in groups, the professor gave us clear instructions as to each step in the process and I was able to watch the video on TWEN where the professor walked through an example.  Other assignments did not teach me as much when I had to struggle alone as I wasted a lot of time.

Student 2: Traditional 1L: This class and all my law classes have been a struggle for me.  I often don’t know where to go for help, and I am tired and stressed all the time.  The teacher seems to favor the three girls in the front row.  The rest of us aren’t encouraged to say anything.

Student 3: Traditional 1L: the assignment that taught me the most about research and really helped to develop my research was our first open memo.  What helped the most was struggling through the resources myself, asking for guidance from the librarian, and meeting personally with the professor who went to the library with me.  I found myself looking back at my lecture notes and the reading to remember how to do things.  This particular assignment helped bring it all together.  The other mini-assignments were too disjointed to help me much.

Coding sheet
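If you prefer to tally the codes mechanically rather than on a paper sheet, the scheme above maps naturally onto one row of integers per response. The short Python sketch below is my illustration, not part of the original exercise; the rows show one plausible coding of the three sample responses, and reasonable coders might differ (Student 2, for instance, could arguably be coded 2 rather than 3 on the overall-response variable).

```python
# Sketch: tallying coded open-ended survey responses.
# Field order follows the ten variables in the coding scheme above.
from collections import Counter

FIELDS = [
    "class_level",            # 1 = accelerated; 2 = traditional 1L
    "overall_response",       # 0-3, per variable 2 of the scheme
    "step_by_step",           # variables 3-10: 0 = no, 1 = yes
    "structured_no_steps",
    "group_work",
    "print_lecture",
    "librarian_demo",
    "twen_video",
    "faculty_one_on_one",
    "librarian_one_on_one",
]

# One plausible coding of the three sample responses (judgment calls).
rows = [
    dict(zip(FIELDS, [1, 1, 1, 0, 1, 0, 0, 1, 0, 0])),  # Student 1
    dict(zip(FIELDS, [2, 3, 0, 0, 0, 0, 0, 0, 0, 0])),  # Student 2
    dict(zip(FIELDS, [2, 1, 0, 1, 0, 0, 0, 0, 1, 1])),  # Student 3
]

# Keep only usable responses, then count mentions of each theme.
usable = [r for r in rows if r["overall_response"] == 1]
themes = Counter()
for r in usable:
    for field in FIELDS[2:]:
        themes[field] += r[field]

for theme, count in themes.most_common():
    print(theme, count)
```

With even a handful of responses coded this way, the theme counts make it easy to see which assignments students credit most often.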


Institute for Law Teaching and Learning