Page 19 - ACTL Journal_Sum24

Frederick has developed his own custom app using ChatGPT, called Case Analysis Pro (“Pro”). He gave Pro the case materials: the complaint, the answer, stipulations, depositions of the witnesses, and the documents – photos, police reports, advertising, the NTSB report, toxicology, and a diagram of the accident. Frederick asked Pro to “please provide the best theme for the plaintiff to use at trial in shaping argument for the jury.”
Pro chugged out:
“A compelling theme could be ‘Trust Betrayed: A Promise of Safety Undelivered.’ The theme underscores the tragedy of Laura Menninger’s death as a direct result of this trust [that consumers place in manufacturers] being betrayed, focusing on the emotional impact of losing a loved one due to a failure in the technology that was supposed to protect them. This narrative can effectively resonate with jurors by connecting on a human level.”
It was rather humorous to find non-sentient AI telling humans how to connect on a human level. Perhaps more humorous is that AI best-practice advice is to “be nice to your AI.”
Wolfe explained: “You will find that if you say nice things to it, like please and thank you, AI will work harder for you. There is a whole science, called prompt engineering, built around how you ask AI to do the tasks you want. Prompts are critical to the output you get. If instead of saying, ‘You’ve got it wrong, try it again,’ you say, ‘Good try. Let me give you suggestions on how to do it better,’ it will give you much better results.”
Carolyn Fairless offered some thoughts on the ethics of AI. “My initial reaction when I first learned about things like this was to say, ‘We don’t want our lawyers touching it.’ I then quickly came to realize my lawyers are going to touch it whether I tell them to or not, and so we need to educate them about some of the ethical issues that are coming up.”
Fairless continued, “I came up with eight different ethical rules that are squarely implicated by the things that we’re talking about here today. If you put client information into a chatbot where you have no control over what they’re doing with that information, they’re going to use that information to continue to train the generative AI, and you have just violated your duty of confidentiality to your client.”
Professor Grossman agreed. “One of the things that I would be very careful about is what you are putting into the software.”