Racial Anxiety

By Anastasia M. Boles, UA Little Rock, William H. Bowen School of Law

As law professors, we care deeply about our students. We put a tremendous amount of effort into our teaching, advising student organizations, and serving as formal and informal mentors. Unfortunately, science has taught us that unconscious racism may be operating to degrade our interactions with students. Many of us are familiar with the term “implicit bias.” Over the last few decades, social psychologists have explored the ways implicit preferences and biases permeate society, including criminal justice, health, and education. Our interactions with students are no exception.

While lesser known than implicit bias, a common consequence of unconscious racism is “racial anxiety,” the unconscious anxiety we may experience or exhibit when interacting with a person of a different race. For example, racial anxiety can cause subtle physical changes in our bodies such as nervousness, discomfort, stiffness, and decreased eye contact. The experience of unconscious racial anxiety sets up a vicious cycle; we unconsciously minimize interactions that have made us uncomfortable in the past, even if we cannot name the source of the discomfort. Racial anxiety expresses itself differently depending on race: people of color may be anxious about experiencing racism; whites may fear saying the wrong thing or being labeled a racist. Whatever the cause, as our cognitive resources are directed to mitigating the racial anxiety we are experiencing, the quality of our personal interaction with the differently-raced person can degrade.[1]

Racial anxiety is likely present in the halls and classrooms of law schools as well. Despite our best intentions, law professors may experience racial anxiety symptoms in cross-racial conversations and interactions with our students. At the same time, our differently-raced students may experience racial anxiety as they interact with us. Consider this common scenario: a white law professor and a student of color meet outside of class for the first time to review an exam, talk about an issue from class, or discuss a paper. Racial anxiety can affect the professor’s ability to build rapport with the student, appear open and friendly, evaluate the student’s learning needs, engage the student’s questions, and build trust. The student of color, if also affected by racial anxiety, is less able to ask questions, absorb feedback, and seek mentoring. If either the law professor or the law student experienced unconscious racial anxiety during the meeting, future interactions between them may be affected. Now imagine the potential for racial anxiety to disrupt the law school classroom when a sensitive issue related to race comes up in class discussion. Racial anxiety may degrade the ability or willingness of the professor to engage the issue. The ensuing student discussion could suffer. Our students require our full attention; if racial anxiety is depleting the attention we give, we should do something about it.

What can we do? If racial anxiety operates in our unconscious minds, can we ever hope to banish it? The great news is that we can. To combat racial anxiety, psychologists recommend that we start by increasing our cross-racial interactions with our students. Psychologists call this “intergroup contact.” Strategies such as encouraging students to attend office hours to increase familiarity, attending and supporting events held by differently-raced students, and increasing the amount and depth of conversations with differently-raced students can all help. During cross-racial interactions, seek to understand cultural differences as well as to identify similarities; the goal is to recognize and appreciate the varying cultural backgrounds of our students, not minimize them. The more law teachers and law students from different racial backgrounds interact with one another, the less potential there is for racial anxiety to disrupt those interactions.

[1] For more information about racial anxiety see here and here.

The Compounding Effects of Assessment

By Lindsey P. Gustafson, UA Little Rock, William H. Bowen School of Law

If you’ve found your way to the Institute of Law Teaching and Learning, you are likely already a believer in formative assessment. We do have empirical evidence that formative assessment improves student learning in law: two recent studies have shown that students who received individualized feedback during the semester outperformed students who did not on final exams, and not just in the class where they received the feedback but in every single class they were taking.[1] One study’s authors note that the “likelihood of this occurring by chance is one in 256.”[2]

But as we add formative assessments to students’ semesters, we must consider how we are altering the demands on their time. The middle of the semester, which has traditionally been the playground for the Socratic method and for legal writing assignments, may now be filled with a variety of assessment activities, and some of them may dominate students’ time in a way that impacts their learning in other classes. When our assessments interfere with students’ participation in other classes, or vice versa, the inferences that we draw from our assessments about student learning may not be valid. And an assessment that provides invalid data is worse than no assessment at all. Consequently, we must all consider our assessments as students experience them, “holistically and interactively.”[3]

How do we deeply coordinate assessments and avoid an assessment system that instead overwhelms students, clutters or fragments their learning, or discourages them early in their first semester? We must coordinate beyond shared calendars, starting in our own classrooms by ensuring that our own assessment activities, as a slice of the student-time pie, are designed with and justified by best practices that encourage an assessment’s validity. In a recent article, I’ve identified five relevant best practices:

  1. Make the assessments’ alignment with learning goals transparent to students and to other faculty members with whom we intend to coordinate: A clear alignment with learning goals helps students understand how the assessments will move them towards learning goals, and helps them make informed decisions about their allocation of time. A clear alignment also allows us to clearly communicate our assessment choices to other faculty members.
  2. Use rubrics to create a shared language of instruction: Once we identify learning goals, rubrics help us refine our communication with students. They see how they will be assessed, and we see with specificity what they have learned.
  3. Ensure the assessments encourage student autonomy: One particularly harmful potential outcome of a tightly orchestrated assessment system is that it may overly dictate student decisions, rather than facilitate student autonomy. Our assessment systems should build students’ feelings of autonomy, competence, and relatedness, which are fundamental to learning.
  4. Set high expectations and display confidence that students can meet those expectations: Students prone to maladaptive responses to feedback are likely to be overwhelmed and discouraged by frequent assessments. Explaining our high expectations and displaying confidence in students can help address these tendencies.
  5. Regularly review the entire assessment system, paying particular attention to students’ ownership of their own learning within the system.

When we ground our formative assessment decisions in best practices, we are better able to communicate our decisions to students, and better able to more deeply coordinate with other faculty members.


[1] See Daniel Schwarcz & Dion Farganis, The Impact of Individualized Feedback on Law Student Performance, 67 J. Legal Educ. 139, 142 (2017) (finding that formative assessment improved performance on final exams for students with below-median entering credentials); Ruth Colker et al., Formative Assessments: A Law School Case Study, 94 U. Det. Mercy L. Rev. 387 (2017) (finding the same); Carol Springer Sargent & Andrea A. Curcio, Empirical Evidence That Formative Assessments Improve Final Exams, 61 J. Legal Educ. 379, 383–84 (2012) (finding that formative assessment improved performance on final exams for students with above-median entering credentials); Andrea A. Curcio, Gregory Todd Jones & Tanya M. Washington, Developing an Empirical Model to Test Whether Required Writing Exercises or Other Changes in Large-Section Law Class Teaching Methodologies Result in Improved Exam Performance, 57 J. Legal Educ. 195, 197 (2007) (finding the same); Andrea A. Curcio, Gregory Todd Jones & Tanya M. Washington, Does Practice Make Perfect? An Empirical Examination of the Impact of Practice Essays on Essay Exam Performance, 35 Fla. St. U. L. Rev. 271, 280-82, 302-306 (2008) (finding the same).

[2] Schwarcz, supra note 1, at 142.

[3] See Harry Torrance, Formative Assessment at the Crossroads: Conformative, Deformative and Transformative Assessment, 38 Oxford Rev. of Educ. 323, 334 (2012) (noting that “assessment is always formative, but not necessarily in a positive way”).

Review: Feedback Distortion: The Shortcomings of Model Answers as Formative Feedback

By Lindsey P. Gustafson, UA Little Rock, William H. Bowen School of Law

Elizabeth Ruiz Frost, Feedback Distortion: The Shortcomings of Model Answers as Formative Feedback, 65 J. Legal Educ. 938 (2016)

Elizabeth Ruiz Frost’s article Feedback Distortion: The Shortcomings of Model Answers as Formative Feedback was published in 2016, but it continues to affect the way I design and critique my students’ assessment activities—both in my classroom and across our curriculum—as we respond to the ABA’s mandate for more formative assessment. Professor Frost posits that, while providing a model answer (either student- or professor-authored) in place of individual feedback may allow for efficient formative feedback, in most situations it does not provide effective formative feedback. She points to evidence that weaker students tend to misinterpret model answers and are less capable of accurately assessing their own work against the model.

In her article, Professor Frost gives reasons, beyond efficiency, that a professor may have for giving feedback through a model answer: learning through a model answer encourages students to self-teach, a skill they will rely on throughout their careers; model answers provide feedback quickly, while students are still primed for it; model answers will not alienate students with personalized, negative comments; and model answers are what students clamor for. Professor Frost explains why each of these reasons is inadequate to justify what she describes as a shift in the learning burden: the professor avoids learning how to provide effective feedback by forcing the student to learn how to improve from a model.

Model answers provide effective formative assessment only if students are able to compare their work with a model and see what they did wrong. Professor Frost roots the assumption that students can do this in the “Vicarious Learning and Self-Teaching models of education, which have pervaded legal teaching since the nineteenth century.” In fact, whether this feedback is effective depends first on the characteristics and mindset of the learners, and second on the type of knowledge the professor is assessing. As to the first variable, because weaker students are less self-aware, they face a “double curse”: “[t]he weakest students, who lack the ability to distinguish between the standard exemplified by a model answer and their own work, will learn the least from a model answer. So the students who need feedback most for continued learning will get the least.”

The second variable is relevant because model answers can provide effective feedback for questions of factual knowledge and concept identification. But for any assessment that requires higher-order thinking, where students need to demonstrate analysis, for example, model answers are less effective. Students instead need elaborative feedback.

Professor Frost ends her article with methods for using model answers to give feedback that best promote student learning: (a) providing an annotated model answer together with individualized feedback; (b) creating opportunities for remediation and reassessment for students after they have reviewed model answers; (c) using a student’s own work as a model answer; (d) requiring students to review model answers in small groups instead of individually; (e) providing multiple sample answers for review, including both strong and weak samples; and (f) focusing on metacognitive skills throughout so that students can better self-evaluate against model answers.

Several of her methods have worked for my students. Recently, I’ve noticed the first method recommended above working across the curriculum: students learn more from a model answer when the same skill (here, answering a midterm essay question) is tested in another course and personalized feedback is given there. In short, learning in one course is improved by the efforts of professors in other courses.

Being human to my students and letting them know I care

By Jane Korn, Gonzaga University School of Law

I have taught first year law students for a long time. Please do not ask how long! But years ago, I became worried about the mental health and stress levels of my first-semester, first-year students. I teach a four-credit, one-semester course in Civil Procedure during the first semester of law school. On my last Civ Pro teaching day of each week, I take a few minutes out of class time and ask my students to tell me how they are doing.

The first time I do this, usually at the end of the first week of law school, I tell my students that it is my custom, from time to time, to take time out from Civ Pro and talk about anything they would like (with some limits). In some years, it takes weeks for them to take me up on this offer. Other years, they start right in. They ask questions like the following:

  1. When should I start outlining?
  2. How much time should I spend studying every night?
  3. How important is getting involved in extracurricular activities?
  4. What if I don’t know what kind of law I want to practice?
  5. Do professors care about grammar and organization on a final exam? (I answer only for my own expectations and do not answer for other faculty.)

I think that much of the time, they do not get a chance to ask a law professor these kinds of questions, and can usually only ask upper-class students. While we have faculty advisors, students may or may not feel comfortable asking them questions like the above. They eventually do (and sometimes quickly) feel comfortable asking me a wide variety of questions. They sometimes ask personal questions and, within reason, I answer them because it makes them feel more comfortable with me. Questions on gossipy matters about other faculty are off limits. If, for example, they complain about another professor, I handle the question with a smile and say something like: you should ask that professor about this issue.

I set aside class time for several reasons. First, while I do worry about giving up valuable teaching time, lessening the stress of my students may make them more able to learn. Second, students often feel like they are the only one with a particular concern during this first semester, and they often have no way of knowing that others share the same concerns or questions. In the first year, many of our students are not from this area and are far from their support systems, at least until they can make friends at law school. Knowing that other students have the same problems they do can lessen the feeling of isolation. Using class time to answer questions for the entire group may help them with this sense of isolation and of being the only one who doesn’t know something. It also lets them see that their concerns are important and credible.

Every year my teaching evaluations reflect this process positively.  Students feel like I care (which I do).  However, the reason I do it is to increase their comfort during those first few exciting, confusing, and terrifying months of law school.

Content Analysis Coding Practice

By Sandra Simpson, Gonzaga University School of Law

Hi All,

I have been doing training on assessment practices for in-class use and for institutional programmatic assessment. To that end, I am learning many techniques, which I am employing in my classes to find out which teaching methods are working and which are not. I learned the following coding method, which allows me to assess my students’ answers to open-ended survey questions. To use the coding method, I look for themes in their answers. I describe the system below. Please contact me should you have any questions about the methodology or about what I do with the information. In true interactive fashion, there is a practice exercise at the end so readers can see how simple this method is. Once I see themes, I am able to respond and make changes.

Method:

Course goal:  Students will learn how to locate print and on-line sources which are complete and relevant to solving a factual problem.

Question posed:  You are asked to do many assignments and activities in this class to help develop your legal research skills. Please identify an assignment or type of activity that you found most helpful in developing your research skills. Please include in your answer a specific description of what about the assignment or activity helped you.

I asked my students this question because I wanted to hear the student perspective on which types of assignments and activities most effectively helped them develop research skills. I give many assignments, but I was unsure which ones were useful. I also wanted to know what about each assignment was helpful: step-by-step instructions, group work, lecture, or the flipped classroom model. I developed a coding system so that I could analyze the results, devising the codes after reviewing a 20% sample of student responses. I randomized the selection: after the students turned in their responses, I chose every 5th one.
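The “every 5th response” selection described above can be sketched in a few lines of Python (the response list here is a hypothetical stand-in, purely for illustration):

```python
# Systematic 20% sample: take every 5th response, in the order turned in.
# "response_1" ... "response_30" are hypothetical placeholders for a class of 30.
responses = [f"response_{i}" for i in range(1, 31)]

sample = responses[4::5]  # indices 4, 9, 14, ... -> the 5th, 10th, 15th, ... responses
print(sample)             # 6 responses, i.e., a 20% sample
```

With any class size, slicing with a step of 5 yields roughly a 20% systematic sample while preserving the turn-in order.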

  1. Identification number for class level: (accelerated student=1; traditional 1L=2)
  2. Overall response: (0=no response/question was unanswered; 1=student provided a usable response; 2=stated/implied that research skills were not strengthened in the LRW I course; 3=response was either not useful or could not be coded)
  3. Positive mention of a structured assignment that led the student through step-by-step instructions in developing research skills. (0=no; 1=yes)
  4. Positive mention of a structured assignment that required students to use or develop research skills, with no mention of step-by-step guidance being useful. (0=no; 1=yes)
  5. Positive mention of a structured assignment that required students to work collaboratively. (0=no; 1=yes)
  6. Positive mention of the lecture on how to do research in print by the LRW professor. (0=no; 1=yes)
  7. Positive mention of the demonstration on how to do research on-line by the librarians. (0=no; 1=yes)
  8. Positive mention of the video demonstration on how to do digest research in print, uploaded to the TWEN page. (0=no; 1=yes)
  9. Positive mention of one-on-one assistance from a faculty member. (0=no; 1=yes)
  10. Positive mention of one-on-one assistance from a librarian. (0=no; 1=yes)

Use the coding scheme above to code the following three responses. Each student has one row.

Student 1: Accelerated student: I learned the most about research when we did the mini assignments on finding cases in the digests in print.  It was most effective to me as we were allowed to work in groups, the professor gave us clear instructions as to each step in the process and I was able to watch the video on TWEN where the professor walked through an example.  Other assignments did not teach me as much when I had to struggle alone as I wasted a lot of time.

Student 2: Traditional 1L: This class and all my law classes have been a struggle for me.  I often don’t know where to go for help, and I am tired and stressed all the time.  The teacher seems to favor the three girls in the front row.  The rest of us aren’t encouraged to say anything.

Student 3: Traditional 1L: the assignment that taught me the most about research and really helped to develop my research was our first open memo.  What helped the most was struggling through the resources myself, asking for guidance from the librarian, and meeting personally with the professor who went to the library with me.  I found myself looking back at my lecture notes and the reading to remember how to do things.  This particular assignment helped bring it all together.  The other mini-assignments were too disjointed to help me much.
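The mechanics of the exercise can also be shown in code. The sketch below is a hypothetical illustration, not part of the original exercise: it records one reader’s judgment-call codings of the three responses as rows under the ten-item scheme and tallies how many usable responses mention each theme. Your own codings may reasonably differ, especially on borderline items.

```python
# Ten-item codebook from the scheme above (item number -> short label).
CODEBOOK = {
    1: "class level (1=accelerated, 2=traditional 1L)",
    2: "overall response (0=none, 1=usable, 2=skills not strengthened, 3=not codable)",
    3: "structured assignment with step-by-step instructions",
    4: "structured assignment, no step-by-step guidance mentioned",
    5: "collaborative/group assignment",
    6: "lecture on print research by LRW professor",
    7: "librarian demonstration of on-line research",
    8: "TWEN video demonstration of digest research",
    9: "one-on-one assistance from a faculty member",
    10: "one-on-one assistance from a librarian",
}

# One row per student: values for items 1-10, in order. These codings are
# one reader's judgment calls (e.g., Student 3's "lecture notes" remark is
# coded as a positive mention of the lecture; others might code it 0).
rows = {
    "Student 1": [1, 1, 1, 0, 1, 0, 0, 1, 0, 0],
    "Student 2": [2, 3, 0, 0, 0, 0, 0, 0, 0, 0],  # vented frustration; not codable
    "Student 3": [2, 1, 0, 1, 0, 1, 0, 0, 1, 1],
}

def theme_counts(rows):
    """Tally how many usable responses (item 2 == 1) mention each theme (items 3-10)."""
    usable = [r for r in rows.values() if r[1] == 1]
    return {item: sum(r[item - 1] for r in usable) for item in range(3, 11)}

for item, total in theme_counts(rows).items():
    print(f"Item {item} ({CODEBOOK[item]}): {total}")
```

With a full class of responses coded this way, the per-item totals make the dominant themes (here, a tie among several assignment types) visible at a glance.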

Coding sheet

Item:       1    2    3    4    5    6    7    8    9    10
Student 1:
Student 2:
Student 3: