Racial Anxiety

By Anastasia M. Boles, UA Little Rock, William H. Bowen School of Law

As law professors, we care deeply about our students. We put a tremendous amount of effort into our teaching, advising student organizations, and serving as formal and informal mentors. Unfortunately, science has taught us that unconscious racism may be operating to degrade our interactions with students. Many of us are familiar with the term “implicit bias.” Over the last few decades, social psychologists have explored the ways implicit preferences and biases permeate society, including criminal justice, health care, and education – and they may be interfering with our student interactions as well.

While lesser known than implicit bias, a common consequence of unconscious racism is “racial anxiety,” the unconscious anxiety we may experience or exhibit when interacting with a person of a different race. For example, racial anxiety can cause subtle physical changes in our bodies such as nervousness, discomfort, stiffness, and decreased eye contact. The experience of unconscious racial anxiety sets up a vicious cycle: we unconsciously minimize interactions that have made us uncomfortable in the past, even if we cannot name the source of the discomfort. Racial anxiety expresses itself differently depending on race – people of color may be anxious about experiencing racism; whites may fear saying the wrong thing or being labeled a racist. Whatever the cause, as our cognitive resources are diverted to managing the racial anxiety we are experiencing, the quality of our personal interaction with the differently-raced person can degrade.[1]

Racial anxiety is likely present in the halls and classrooms of law schools as well. Despite our best intentions, law professors may experience racial anxiety symptoms in cross-racial conversations and interactions with our students. At the same time, our differently-raced students may experience racial anxiety as they interact with us. Consider this common scenario: a white law professor and a student of color meet outside of class for the first time to review an exam, talk about an issue from class, or discuss a paper. Racial anxiety can affect the professor’s ability to build rapport with the student, appear open and friendly, evaluate the student’s learning needs, engage the student’s questions, and build trust. The student of color, if also affected by racial anxiety, is less able to ask questions, absorb feedback, and seek mentoring. If either the professor or the student experienced unconscious racial anxiety during the meeting, future interactions between them may be affected. Now imagine the potential for racial anxiety to disrupt the law school classroom when a sensitive issue related to race comes up in class discussion. Racial anxiety may degrade the ability or willingness of the professor to engage the issue, and the ensuing student discussion could suffer. Our students require our full attention; if racial anxiety is depleting the attention we give, we should do something about it.

What can we do? If racial anxiety operates in our unconscious minds, can we ever hope to banish it? The great news is that we can. To combat racial anxiety, psychologists recommend that we start by increasing our cross-racial interactions with our students – what psychologists call “intergroup contact.” Strategies such as encouraging students to attend office hours to increase familiarity, attending and supporting events held by differently-raced students, and increasing the amount and depth of conversations with differently-raced students can all help. During cross-racial interactions, seek to understand cultural differences as well as to identify similarities; the goal is to recognize and appreciate the varying cultural backgrounds of our students – not to minimize them. The more law teachers and law students from different racial backgrounds interact with one another, the less potential there is for racial anxiety to disrupt those interactions.

[1] For more information about racial anxiety, see here and here.

The Compounding Effects of Assessment

By Lindsey P. Gustafson, UA Little Rock, William H. Bowen School of Law

If you’ve found your way to the Institute of Law Teaching and Learning, you are likely already a believer in formative assessment. We have empirical evidence that formative assessment improves student learning in law: two recent studies have shown that students who received individualized feedback during the semester outperformed students who did not on final exams – and not just in the class where they received the feedback, but in every class they were taking.[1] One study’s authors note that the “likelihood of this occurring by chance is one in 256.”[2]

But as we add formative assessments to students’ semesters, we must consider how we are altering the demands on their time. The middle of the semester, which has traditionally been the playground for the Socratic Method and for legal writing assignments, may now be filled with a variety of assessment activities, and some of them may dominate students’ time in a way that harms their learning in other classes. When our assessments interfere with students’ participation in other classes, or vice versa, the inferences we draw from our assessments about student learning may not be valid. And an assessment that provides invalid data is worse than no assessment at all. Consequently, we must all consider our assessments as students experience them, “holistically and interactively.”[3]

How do we deeply coordinate assessments and avoid an assessment system that instead overwhelms students, clutters or fragments their learning, or discourages them early in their first semester? We must coordinate beyond shared calendars, starting in our own classrooms by ensuring that our own assessment activities, as a slice of the student-time pie, are designed with and justified by best practices that encourage an assessment’s validity. In a recent article, I’ve identified five relevant best practices:

  1. Make the assessments’ alignment with learning goals transparent to students and to other faculty members with whom we intend to coordinate: A clear alignment with learning goals helps students understand how the assessments will move them towards learning goals, and helps them make informed decisions about their allocation of time. A clear alignment also allows us to clearly communicate our assessment choices to other faculty members.
  2. Use rubrics to create a shared language of instruction: Once we identify learning goals, rubrics help us refine our communication with students. They see how they will be assessed, and we see with specificity what they have learned.
  3. Ensure the assessments encourage student autonomy: One particularly harmful potential outcome of a tightly orchestrated assessment system is that it may overly dictate student decisions, rather than facilitate student autonomy. Our assessment systems should build students’ feelings of autonomy, competence, and relatedness, which are fundamental to learning.
  4. Set high expectations and display confidence that students can meet those expectations: Students prone to maladaptive responses to feedback are likely to be overwhelmed and discouraged by frequent assessments. Explaining our high expectations and displaying confidence in students can help address these tendencies.
  5. Regularly review the entire assessment system, paying particular attention to students’ ownership of their own learning within the system.

When we ground our formative assessment decisions in best practices, we are better able to communicate our decisions to students, and better able to more deeply coordinate with other faculty members.


[1] See Daniel Schwarcz & Dion Farganis, The Impact of Individualized Feedback on Law Student Performance, 67 J. Legal Educ. 139, 142 (2017) (finding that formative assessment improved performance on final exams for students with below-median entering credentials); Ruth Colker et al., Formative Assessments: A Law School Case Study, 94 U. Det. Mercy L. Rev. 387 (2017) (finding the same); Carol Springer Sargent & Andrea A. Curcio, Empirical Evidence That Formative Assessments Improve Final Exams, 61 J. Legal Educ. 379, 383–84 (2012) (finding that formative assessment improved performance on final exams for students with above-median entering credentials); Andrea A. Curcio, Gregory Todd Jones & Tanya M. Washington, Developing an Empirical Model to Test Whether Required Writing Exercises or Other Changes in Large-Section Law Class Teaching Methodologies Result in Improved Exam Performance, 57 J. Legal Educ. 195, 197 (2007) (finding the same); Andrea A. Curcio, Gregory Todd Jones & Tanya M. Washington, Does Practice Make Perfect? An Empirical Examination of the Impact of Practice Essays on Essay Exam Performance, 35 Fla. St. U. L. Rev. 271, 280–82, 302–06 (2008) (finding the same).

[2] Schwarcz & Farganis, supra note 1, at 142.

[3] See Harry Torrance, Formative Assessment at the Crossroads: Conformative, Deformative and Transformative Assessment, 38 Oxford Rev. of Educ. 323, 334 (2012) (noting that “assessment is always formative, but not necessarily in a positive way”).

Interactive Presentation Software

By Tonya Krause-Phelan, WMU-Cooley Law School

Many professors have been looking for meaningful ways to integrate technological tools into their course design. I am one of them. But as a professor who does not allow students to use laptops for notetaking, it was important to me that students recognize I was using technology for a limited and strategic purpose, not to be hip or gimmicky. It was also important that any technological tool I chose be one I could use quickly, easily, and strategically. A few years ago, while at an ILTL conference, a professor polled the audience during her presentation using Mentimeter. I was impressed, and after leaving the conference I explored ways to use this slick but simple app in my classes.

So, what is Mentimeter? Simply put, it is an interactive presentation software app that allows professors to interact with, collaborate with, and poll students (https://www.mentimeter.com/). The concept is simple: the professor asks a question, the class votes, and the students’ responses appear on the classroom screen as the results come in. To prepare the question that will appear on screen, the professor must sign up with Mentimeter. The website allows the professor to write questions from scratch or to use one of the site’s templates, with many different styles and formats to choose from. When ready to poll the class, the professor simply displays the question slide prepared in Mentimeter. Students are prompted to use their cell phones to go to the voting website, enter the code that appears on the question slide, and vote. As the students vote, their responses appear on the classroom screen; the professor can, however, choose to hide the results until everyone has voted. So far, I’ve incorporated Mentimeter with success using three specific formats: Word Clouds, Multiple Choice, and Questions from the Audience.

Word Clouds. With the Word Cloud format, I pose a question and the students’ answers create a work of art; it literally looks like a cloud made of words. As students respond, their answers rearrange the word cloud in real time to emphasize the most common words submitted by the class. This format is particularly useful for gauging students’ perceptions, understanding, and reflections. For example, I polled my Criminal Procedure students to gauge their understanding of the most important requirement of the Miranda rule before they read the case. Without fail, arrest is always the biggest word; in other words, students think arrest triggers the Miranda warnings. After students read the case and we analyzed it in class, their Word Cloud more accurately reflected the rule: custody, interrogation, silence, and lawyer became the largest words. When students compared the two word clouds, they had a clear visual of the wrong interpretation of the rule versus the correct application of the rule.

Multiple Choice. In Criminal Law, a first-term class, I have used the Multiple Choice format in its most basic way: to give students a multiple-choice question. With first-term students, this is a useful tool that allows me to guide them through the deductive reasoning process necessary to successfully navigate multiple-choice questions. But I have also used the Multiple Choice format in Criminal Procedure to administer a simulated photo identification procedure. After showing students the photo identification, I gave them a Mentimeter prompt with five choices, A–E (one for each person in the photo identification), and they made their identification by selecting the letter that represented the photograph of the alleged perpetrator. I hid the screen from students while they voted so they would not be influenced by other students’ selections.

Questions from the Audience. Another useful way to use Mentimeter is the Questions from the Audience format. At the end of major units, I often give students a few minutes to pause and reflect on what they have just learned. Using the Questions from the Audience format, students may ask questions as they process the information without interrupting other students. This format allows the professor to choose when and how the questions appear on screen: the professor can hide the questions while students are typing them, or can let them appear as bubbles, as scrolling questions, or one at a time. I typically hide the questions until all students have posted theirs, which gives me time to sort through the questions and determine how best to answer them. Depending on the number of questions, I answer them in class or use an exercise to help students figure out the answers. This format and process is also useful in review sessions hosted by the professor or teaching assistants.

There are many interactive apps available for classroom use, and Mentimeter is one of them. It is fun, interactive, and very user friendly, and its possible uses in the law school classroom are many. Give it a try. Neither you nor your students will be disappointed.

Review: Feedback Distortion: The Shortcomings of Model Answers as Formative Feedback

By Lindsey P. Gustafson, UA Little Rock, William H. Bowen School of Law

Elizabeth Ruiz Frost, Feedback Distortion: The Shortcomings of Model Answers as Formative Feedback, 65 J. Legal Educ. 938 (2016)

Elizabeth Ruiz Frost’s article Feedback Distortion: The Shortcomings of Model Answers as Formative Feedback was published in 2016, but it continues to affect the way I design and critique my students’ assessment activities—both in my classroom and across our curriculum—as we respond to the ABA’s mandate for more formative assessment. Professor Frost posits that, while providing a model answer (either student- or professor-authored) in place of individual feedback may allow for efficient formative feedback, in most situations it does not provide effective formative feedback. She points to evidence that weaker students tend to misinterpret model answers and are less capable of accurately assessing their own work against the model.

In her article, Professor Frost surveys the reasons beyond efficiency a professor may have for giving feedback through a model answer: learning through a model answer encourages students to self-teach, a skill they will rely on throughout their careers; model answers provide feedback quickly, while students are still primed for it; model answers will not alienate students with personalized, negative comments; and model answers are what students clamor for. Professor Frost explains why each of these reasons is inadequate to justify what she describes as a shift in the learning burden: the professor avoids learning how to provide effective feedback by forcing the student to learn how to improve from a model.

Model answers provide effective formative assessment only if students are able to compare their work with a model and see what they did wrong. Professor Frost roots the assumption that students can do this in the “Vicarious Learning and Self-Teaching models of education, which have pervaded legal teaching since the nineteenth century.” In fact, whether this feedback is effective depends first on the characteristics and mindset of the learners, and second on the type of knowledge the professor is assessing. As to the first variable, because weaker students are less self-aware, they face a “double curse”: “[t]he weakest students, who lack the ability to distinguish between the standard exemplified by a model answer and their own work, will learn the least from a model answer. So the students who need feedback most for continued learning will get the least.”

The second variable is relevant because model answers can provide effective feedback on questions of factual knowledge and concept identification. But for any assessment that requires higher-order thinking—where students need to demonstrate analysis, for example—model answers are less effective. Students instead need elaborative feedback.

Professor Frost ends her article with methods for using model answers to give feedback that best promote student learning: (a) providing an annotated model answer together with individualized feedback; (b) creating opportunities for remediation and reassessment for students after they have reviewed model answers; (c) using a student’s own work as a model answer; (d) requiring students to review model answers in small groups instead of individually; (e) providing multiple sample answers for review, including both strong and weak samples; and (f) focusing on metacognitive skills throughout so that students can better self-evaluate against model answers.

Several of her methods have worked for my students. Recently, I’ve noticed the first method recommended above working across the curriculum: students learn more from a model answer when the same skill (here, answering a midterm essay question) is tested in another course and personalized feedback is given there. In short, learning in one course is improved by the efforts of professors in other courses.

Review: Spaced Repetition: A Method for Learning More Law In Less Time

By Tonya Krause-Phelan, WMU-Cooley Law School

Spaced Repetition: A Method for Learning More Law In Less Time by Gabriel H. Teninbaum
17 JOURNAL OF HIGH TECHNOLOGY LAW 273 (2017)

Spaced Repetition explains why spaced repetition is much more than learning from flashcards. The article presents a concise tutorial on the psychological phenomenon known as spaced repetition and how it can help law students, bar preppers, and practitioners learn the law more quickly, effectively, and efficiently. Discovered in the 1800s, spaced repetition is a learning and memorization method that not only improves the way people learn and prepare for exams, but also fosters faster learning and greater retention. To understand how spaced repetition promotes learning and aids memory, it is important to consider the three related psychological phenomena that form a spaced repetition system: the forgetting curve, the spacing effect, and the testing effect.

The forgetting curve is the decline in the ability to recall information. This occurs because as soon as a person learns something, they begin to forget it. To combat the forgetting curve, spaced repetition cues learners to restudy immediately before the learned material is predicted to be forgotten. Research shows there is an ideal moment to reinforce learned information. Recalling the information at just the right time allows learners to not only keep the memory active, but to identify the information that has already been forgotten so it can be targeted for restudying.

The spacing effect requires study sessions to be properly spaced to slow the forgetting curve. Because of the forgetting curve’s steep initial decline, learners need to review information frequently at first. Over time, the spacing effect increases, allowing learners to wait longer between review sessions. If done correctly, the spacing can go from hours, to days, to weeks, to months, and even to years. As a result, material learned via spaced repetition in the first year of law school could be reviewed periodically throughout the second and third years and then easily recalled during bar review and the bar examination.
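For readers who want to see the expanding-interval idea made concrete, here is a minimal Python sketch of a spacing schedule. It is a generic illustration only: the doubling rule and the six-hour restudy reset are assumptions chosen for this example, not the algorithm behind SpacedRepetition.com or any other platform.

```python
def next_review(interval_days: float, recalled: bool) -> float:
    """Return the next review interval, in days, for one flashcard.

    A deliberately simplified scheduler: each successful recall
    roughly doubles the wait before the next review, so spacing
    grows from hours to days to weeks; a failed recall resets the
    card to a short interval so it can be restudied soon.
    """
    if not recalled:
        return 0.25  # forgotten: review again in about six hours
    return max(1.0, interval_days * 2)  # recalled: double the spacing

# A card recalled correctly at every session spreads out quickly:
interval = 0.25
schedule = []
for _ in range(6):
    interval = next_review(interval, recalled=True)
    schedule.append(interval)
# schedule is now [1.0, 2.0, 4.0, 8.0, 16.0, 32.0] days
```

Real systems tune the multiplier per card based on how confidently the learner answered, but even this crude version shows why early reviews cluster together while later ones can be months apart.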

The testing effect describes people’s improved ability to recall information they have actively retrieved. Learners experience the testing effect when they recall learned information by testing themselves rather than passively reviewing it. The benefit is even more pronounced when the assessment is followed by meaningful feedback that includes exposure to the correct answer. The most effective spaced repetition techniques involve answering questions that force learners to use their memory as much as possible, such as free recall, short answer, multiple choice, Cloze deletion exercises, and recognition. But spaced repetition can be much more than definitional flashcards and fill-in-the-blank exercises; it can also help learners apply complex content.

Early on, spaced repetition systems had to be created and used by hand. Today, however, mobile applications have opened up a whole new world of possibilities for spaced repetition platforms. While Spaced Repetition is a primer on the basics of spaced repetition systems, it also promotes the author’s web-based platform, SpacedRepetition.com. The author has built several key benefits into the platform: it is easily used on smartphones and mobile devices; it uses an algorithm to apply spaced repetition; it includes expertly created core content; it allows for editable content; it provides a third slide option (to include other pieces of black letter law or context); and its content is shareable.

Spaced repetition can help law students, bar preppers, and practitioners learn more effectively and efficiently. The author cautions, however, that spaced repetition requires more than just looking at flashcards; users must still learn how to organize, apply, and express the law. But if learners use spaced repetition outside of the classroom, legal educators can make more effective use of flipped classrooms as well as active learning and application exercises. While this article promotes the author’s platform, it is a worthwhile read for legal educators looking to understand and provide spaced repetition learning opportunities for their students.

 

Review: From Seminar to Simulation: Wading Out to the Third Wave

By Tonya Krause-Phelan, WMU-Cooley Law School

From Seminar to Simulation: Wading Out to the Third Wave by Margaret Moore Jackson
19 JOURNAL OF GENDER, RACE, AND JUSTICE 127 (2016)

From Seminar to Simulation: Wading Out to the Third Wave encourages legal educators to embrace simulation teaching in light of the newly-adopted ABA standards relating to experiential learning. Because ABA Standard 303(a)(3) requires students to complete at least six credits of experiential coursework, which can be earned in law clinics, field placements, or simulation courses, Professor Jackson suggests that simulation teaching can be integrated into existing courses by reformatting seminars – those upper-level, reading- and discussion-based courses that typically focus on specialized areas of law not usually tested on the bar exam. Reformatting a seminar as a simulation course allows faculty to accomplish two significant goals. First, it provides an experiential learning opportunity for students that meets, if not exceeds, the new requirement. Second, it creates an opportunity for students to develop and use professional values as they learn to apply the law.

Beyond meeting the new standards, including simulations as experiential teaching is a way professors can foster integrated learning. Many professors already incorporate classroom exercises and role play into their doctrinal classes. Even though these efforts are designed to develop students’ professional skills, they do not satisfy the ABA’s definition of a simulation course. To comply with Standard 304, a simulation course must reasonably approximate the experience of representing a client or performing other lawyering tasks, in a set of facts and circumstances devised or adopted by a faculty member. A simulation course also requires faculty to directly supervise the student’s performance, followed by faculty feedback and student self-evaluation. Finally, there must be a classroom instructional component.

From a faculty perspective, a potential barrier to merging simulation teaching and experiential learning into existing courses is its time-consuming nature. Faculty may also be apprehensive about how much subject matter will have to be sacrificed to carve out enough time for the simulation component. Despite the potential difficulties, there are many benefits to simulation teaching. For starters, it supports applied knowledge and introductory skills development by cementing students’ learning of the substantive law. Faculty can continue informal doctrinal teaching while students engage in simulated roles by structuring assignments that teach practical lawyering skills and also reinforce students’ learning of legal analysis. And because simulation teaching fosters concentrated learning of professional skills and values, it also promotes justice, underscores service to the community, and helps students overcome assumptions and inherent biases.

Although the ABA requirements for a simulation course appear formidable, Professor Jackson suggests that restructuring courses to provide students with six credits of experiential education might not be as daunting a task as some might think. Professor Jackson provides a template, based on her housing discrimination class, for converting a seminar course into a simulation course. But the format easily translates to any substantive class or seminar. Begin by identifying the competencies students should achieve by the end of the course. Make sure to envision these competencies in the context of the area of law; the objectives should be relevant and realistic for that area of practice. Be careful to limit the goals to a number that can be effectively implemented and assessed. Consider a format that focuses on repetition and refinement of targeted skills alongside increasingly elaborate doctrine.

For example, in Professor Jackson’s fair housing seminar, students were assigned to represent a hypothetical client. The assignments required students to know the applicable law, provide client advice based on the law and the particular situation, communicate with other lawyers, judges, and real estate professionals as the client’s case required, and be alert to potential injustices. Supplemental exercises included professional writing activities and oral presentations to a community audience. A final component of the exercises focused on client communication, designed to develop relational skills and empathy, dispel students’ false assumptions about the role of law in society, and develop students’ self-conceptions as professionals who promote justice.

Transitioning to simulation teaching provides faculty with opportunities to connect learning the law with developing the skills, instincts, and inclinations to use the law to promote justice. Whether you are a professor seeking to augment a doctrinal class with experiential learning exercises or one looking to dive into the full spectrum of simulation teaching, From Seminar to Simulation: Wading Out to the Third Wave provides the pedagogical support and procedural format for the transition.

 

Review: The Science of Equality, Vols I &amp; II

By Tonya Kowalski, Washburn University School of Law

Rachel D. Godsil, et al., The Science of Equality, Vols I & II

Attendees at this past summer’s biennial ALWD conference had the great fortune to learn about the latest research on addressing diversity-related challenges. Among the featured speakers was law professor Rachel Godsil, who identified very specific strategies for addressing bias in education, particularly implicit racial bias and related phenomena.

Prof. Godsil and her colleagues at The Perception Institute have published a series of highly readable, persuasive, and practical reports on these pernicious barriers to education. Among these reports are two volumes of The Science of Equality, linked below. Each report synthesizes and assesses the research, but also describes a series of empirically supported strategies for intervention.  For example, Volume 2 offers a simple, low-cost strategy for educators to use when providing written feedback. The “wise feedback” approach couples messages about high expectations with expressions of confidence in students’ ability to meet those expectations. Studies show that such messages vastly improved response rates and quality from students in a particular marginalized group.

The topics and strategies range from institutional to individual. Readers will find an array of proposals suitable for both classroom professors and administrators.

Notes and Links:

  • The Science of Equality in Education: The Impact of Implicit Bias, Racial Anxiety, and Stereotype Threat on Student Outcomes
  • The Science of Equality Vol. 2: The Effects of Gender Roles, Implicit Bias, and Stereotype Threat on the Lives of Women and Girls
  • Additional publications
  • ALWD is the Association of Legal Writing Directors
  • This post’s author is currently an ALWD board member but has no personal stake in The Perception Institute.

Review: Reframing the Socratic Method

By Tonya Krause-Phelan, WMU-Cooley Law School

Reframing the Socratic Method by Jamie R. Abrams
64 JOURNAL OF LEGAL EDUCATION 562 (2015)

Reframing the Socratic Method offers a fresh idea to redesign the Socratic Method from a professor-student exercise into an exercise that fosters diverse participation and develops essential lawyering skills. Professor Abrams acknowledges that the Socratic Method, used by law schools for over a century, has become the quintessential example of question-based learning. But contrary to many modern critics of the Socratic Method, Professor Abrams does not disparage the Socratic Method or call for its elimination. Nor does she endorse it. Instead, she encourages professors to restructure the Socratic Method in three ways to ensure it aligns with current innovations and reform: make it client-focused, research-focused, and skills-sensitization focused.

First, Professor Abrams suggests that the Socratic Method should focus primarily on the client instead of the case. Traditionally, Socratic dialogue begins by asking the student what happened in the case, which causes students to think about the case abstractly. As a result, students do not consider the case from the client’s point of view, nor do they scrutinize the decisions made by the lawyers in the case. With a few simple changes, professors can move the Socratic Method from a rule-based to a client-based task. Instead of asking students to recite the facts of the case, a client-based Socratic approach asks the student to explain what happened to the plaintiff or why the plaintiff sought counsel. These modified questions still highlight the relevant facts of the case, but they allow students to understand the facts from the client’s point of view and to consider the attorney-client relationship.

Next, Professor Abrams recommends that, instead of using the traditional Socratic focus on case outcomes and hypothetical questions, Socratic questions be reframed so that students use relevant legal authority to represent the client. Instead of asking a student to recite the court’s holding, students should be asked what precedent the client’s lawyer would have found in preparing the client’s case. These modified questions propel students to analyze the legal authority relied upon by the court, to understand how the precedent negatively or positively affected the client, and to understand the historical and social underpinnings of the legal precedent. To further insert research-based components into the Socratic dialogue, professors could require students to apply information contained in the case footnotes or to prepare supplemental material to answer research-focused questions. This allows students to develop the ability to assess the strengths and weaknesses of a client’s case.

Finally, Professor Abrams proposes that professors modify the Socratic dialogue to sensitize students to the broad range of legal skills needed to lawyer effectively. She acknowledges that this type of questioning may not be practical in every case. But when possible, the professor should ask questions that guide students to think about effective lawyering skills. These refined questions could range from what role settlement negotiations play in a client’s case to what ethical rules determine who the client is and how to meet the client’s objectives. Professor Abrams illustrates how reframing the Socratic Method in a commonly-taught constitutional law case, Reed v. Reed, changes the dynamic of instruction from a professor-student exchange to a student-propelled focus on the client, legal research, and effective lawyering skills.

Professor Abrams explains that reframing the Socratic Method achieves three benefits. First, it brings coherence and continuity to legal education. Second, it trains practice-ready lawyers, because students will be better prepared to give clients actual answers to actual questions and will be sensitized to how intensive legal research truly is. And third, it creates inviting and inclusive classrooms. While detailing specific examples for each type of modification she recommends, Professor Abrams illustrates how easily professors could modify the way they already use the Socratic Method to accomplish the current goals of innovation and reform.

 

 
