ChatGPT Exercise in the LRW Class


By Sandra Simpson, Professor, Gonzaga University School of Law

Professor Ashley B. Armstrong of the University of Connecticut School of Law has written a draft article examining the artificial intelligence tool known as ChatGPT and exploring its implications for legal writing classrooms.  The draft, titled Who’s Afraid of ChatGPT? An Examination of ChatGPT’s Implications for Legal Writing, can be found at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4336929.  This artificial intelligence is different because it generates original content for the requester, including attorneys and law students.

After reading Ashley’s draft, I reached out to her to discuss this new resource.  She provided the assignment laid out below for me to use in my classroom. The power of this assignment is that it gives LRW professors a way to have open discussions with their students about the ethical use of artificial intelligence as a student and as a professional.

Classroom Assignment

TO:              Associates
FROM:         Ashley Binetti Armstrong
DATE:          January 24, 2023

On November 30, 2022, OpenAI launched ChatGPT (Chat Generative Pre-trained Transformer). ChatGPT is an Artificial Intelligence interface that can generate human-like text in response to user queries. I would like you to test and analyze how ChatGPT performs on a series of legal research and writing tasks. I also would like to know what concerns ChatGPT might raise related to attorney ethics. Please complete the activities described and respond to the questions below.

1. Insert the following prompt into ChatGPT:

Write a legal memo based on the following facts and questions: We have a new client, Priyanka Patel. Patel was recently involved in a swimming accident at the Ellenbosch estate in Blueridge, CT. This remote Estate is owned by Caroline Ellenbosch and features a large lake, trails, playground, and cliffs known to be great for rock climbing. I briefly interviewed Patel this afternoon. She has a lot of expenses related to the injuries she suffered, and we need to figure out if she has grounds to sue the landowner. I am not sure if this is possible, and I would like you to investigate whether Ellenbosch has landowner immunity. Limit your research to Connecticut law. You may use unreported cases. Use proper Bluebook citation form and office memo format. Facts: On September 4, 2022, around 12pm Patel drove to the Estate. She paid a $10 parking fee to park onsite. On her climb, Patel passed at least two signs that warned against use of the lake and against swimming/diving. When she reached the summit, she dove into the lake and landed in a shallow spot. She broke her right leg and fractured her tailbone. The parking lot is owned by Ellenbosch and considered part of the estate. She charges a $10 fee per car to park in the lot. There are some free parking spots on the street, “but they are too far away for it to be worth it. Parking on site is so much more convenient, obviously.” Patel estimates that the free street parking is about .5 miles from the site. Terrain is uneven, uphill, no crosswalks, no sidewalks that she recalls. There is a small bike rack on site. There is no public transportation to the property. It seems like almost all visitors pay the parking fee.

2. Insert this prompt, next:

Can you provide a list of 10 other cases I should review?

3. Insert this prompt, next:

Using the cases from the previous response, please write a legal argument for Patel’s case, following the CREAC structure.

4. Using Westlaw or Lexis, look up the cases that ChatGPT provided in its responses. More specifically, if ChatGPT provided the following case “Czepiga v. Town of Manchester, 884 A.2d 1202 (Conn. 2005),” please tell me a) whether any case by that name exists on Westlaw/Lexis; and b) what result you get when you search for “884 A.2d 1202.” Include the list of cases and answers to questions a and b, below.

5. If ChatGPT provided any statutes, or any other sources in its responses, please look those up on Westlaw or Lexis. List the source and what the source is about (e.g., title of the statute and a 1-2 sentence summary), below.

6. Describe any observations about ChatGPT’s response to question 1, above. Consider: the accuracy of the response (researching on Lexis or Westlaw), the structure of the response (compared to what you’ve learned about successful legal writing), and anything else you would like to note.

7. Describe any observations about ChatGPT’s response to question 3, above. Consider: the accuracy of the response (researching on Lexis or Westlaw), the structure of the response (compared to what you’ve learned about successful legal writing in this course), and anything else you would like to note.

8. Please provide a short (~2-4 sentence) summary of the following Model Rules of Professional Conduct: 1.1, 1.3, 2.1, 3.3, and 4.1. You should review the text of the rule and the comments to the rule.

9. What concerns about rules 1.1, 1.3, 2.1, 3.3, and 4.1 might be raised when attorneys use ChatGPT?

10. Please provide a short (~2-4 sentence) summary of Model Rule of Professional Conduct 1.6. You should review the text of the rule and the comments to the rule.

11. What concerns about rule 1.6 might be raised if attorneys use ChatGPT? Under what circumstances?


My Classroom Assignment Reflection

We spent all 70 minutes of class working our way through the ChatGPT exercise provided by Ashley Armstrong in her draft article and the assignment above.  I started by asking the class what AI they use regularly. I made it clear to the students that I was not judging them, but rather was curious about what they are using. This opened up an honest discussion about artificial intelligence.  The students were using only Grammarly, spell check, brief checkers, and the like, but no product that produces original work the way ChatGPT does. That conversation was really interesting.

Then we got into ChatGPT and the worksheet. I had them work in groups and report out. They were particularly shocked by how bad the AI writing was and how much better they felt about their emerging skills. I then had them do the original research that was assigned to ChatGPT in the assignment. Many had forgotten how to come up with original search terms, limit their jurisdiction, and so on. Thus, we backed up and reviewed the research process.  Though this was a bit of a surprise to me, it was good to get that feedback and good to help them review their research skills.  Once they finished the research, they were mortified at how wrong the AI was. Again, they felt pretty good about their research skills compared to the computer.

After that, I assigned one MRPC to each team to look up, review, and discuss how ChatGPT implicates the rules. (I had the groups read the rule and the comments.) The students really engaged in this part of the discussion. They learned the MRPC while applying them to the use of AI in their practice.  Many of the rules surprised them; for example, most students had never considered that posting or entering client data into an internet resource could be a breach of confidentiality. (It’s a whole new world.)

The last thing we did was discuss what our class would like to do with this type of technology going forward. They universally agreed that ChatGPT was so wrong that it is dangerous to use, and that it would take more time to check its work than to do the work themselves. They said they would like to see how ChatGPT does with the projects we work on this semester. I am not sure what that looks like going forward, but we are going to start by feeding their fall final research assignment prompt into the AI and seeing what ChatGPT comes up with. I also read an article reporting that Westlaw and Lexis are looking to partner with ChatGPT so that it has access to their databases. Oh boy. The students were interested to see where this goes.

At the end of the class, we agreed that their work must be their own. If they want to use a tool like this in tandem with other resources (just as they would use a secondary source), that is up to them, but they are responsible for the end product and its accuracy.

I am not pretending to know what this looks like in the end, but for now, it felt good to talk about it. The key here is getting ahead of this technology rather than reacting to it.

Moving Forward

If any of this listserv’s readers decide to use this assignment, please let Ashley Armstrong know what your class did and share your reflections.  We are facing this new technology together!

Going Back to the Basics: Low-Tech Assessment Methods in Large Doctrinal Classes


Teaching Idea for February.

By Sandra Simpson, Professor, Gonzaga University School of Law.

Even in large doctrinal courses, it is possible to engage and assess the entire class with low-tech methods.  I teach a Real Estate Transactions course to 60-plus students every spring.  One effective method is having groups “publish” their work on 3M posterboards (poster-sized sticky notes).  I used this method this week while reviewing contract concepts.  In reviewing covenants versus conditions, I needed to know where my students were in terms of understanding these basic contract terms, so I returned to this basic, low-tech assessment method.

Once I found the 3M posterboard pad (in a lonely, dusty corner closet), I posted 23 pieces of paper around the room before the students arrived.  Once the students arrived[1], I had them form groups of three.[2]  I asked the groups to read the following clause: “Seller to provide the buyer with a certificate of occupancy prior to closing.”  The students were then asked to determine whether this clause creates a promise or a contingency.  After five minutes of group discussion, I asked random groups to argue whether it is a promise or a contingency.  We discussed why the distinction matters.  Students soon realized the clause can be argued either way, which is not ideal for a real estate contract; the ambiguity can lead to litigation, affecting the parties’ contract rights.

For the next step, I asked the students to redraft the clause to create a promise, and then to redraft it to create a contingency.  The students wrote the two clauses on their 3M poster paper.  After every group had finished drafting and posted its paper on the wall, I asked the students to walk around reading the other groups’ drafted clauses.  Each group marked the one they liked best (they could not vote for their own).

After all the students sat down, we looked at the votes to identify the best clauses and debriefed the exercise.  The voting showed two very different drafting techniques tied for best.  This highlighted some drafting issues and sparked a discussion of different methods for creating a promise or a contingency.  The entire exercise took 30 minutes, but it engaged the entire class.  An additional bonus was that the posterboards remained on the walls for the entire class period, allowing me to walk around (while students were working on another problem) and read all the students’ work, which created another opportunity to talk to the groups about their work and answer lingering questions.

[1] It was really fun to listen to their reactions to the paper being posted around the room.  They were very curious and excited.

[2] You can form the groups yourself, particularly if you want to pair strong and weak students.

Institute for Law Teaching and Learning