Posts about law

ChatGPT goes to court

I attended a show-cause hearing for two attorneys and their firm who submitted nonexistent citations and then entirely fictitious cases manufactured by ChatGPT to federal court, and then tried to blame the machine. “This case is Schadenfreude for any lawyer,” said the attorneys’ attorney, misusing a word as ChatGPT might. “There but for the grace of God go I…. Lawyers have always had difficulty with new technology.”

The judge, P. Kevin Castel, would have none of it. At the end of the two-hour hearing in which he meticulously and patiently questioned each of the attorneys, he said it is “not fair to pick apart people’s words,” but he noted that the actions of the lawyers were “repeatedly described as a mistake.” The mistake might have been the first submission with its nonexistent citations. But “that is the beginning of the narrative, not the end,” as again and again the attorneys failed to do their work, to follow through once the fiction was called to their attention by opposing counsel and the court, to even Google the cases ChatGPT manufactured to verify their existence, let alone to read what “gibberish” — in the judge’s description — ChatGPT fabricated. And ultimately, they failed to fully take responsibility for their own actions.

Over and over again, Steven Schwartz, the attorney who used ChatGPT to do his work, testified to the court that “I just never could imagine that ChatGPT would fabricate cases…. It never occurred to me that it would be making up cases.” He thought it was a search engine — a “super search engine.” And search engines can be trusted, yes? Technology can’t be wrong, right?

Now it’s true that one may fault some large language models’ creators for giving people the impression that generative AI is credible when we know it is not — and especially Microsoft for later connecting ChatGPT with its search engine, Bing, no doubt misleading more people. But Judge Castel’s point stands: It was the lawyers’ responsibility — to themselves, their client, the court, and truth itself — to check the machine’s work. This is not a tale of technology’s failures but of humans’, as most are.

Technology got blamed for much this day. Lawyers faulted their legal search engine, Fastcase, for not giving this personal-injury firm, accustomed to state courts, access to federal cases (a billing screwup). They blamed Microsoft Word for their cut-and-paste of a bolloxed notarization. In a lovely Gutenberg-era moment, Judge Castel questioned them about the odd mix of fonts — Times Roman and something sans serif — in the fake cases, and the lawyer blamed that, too, on computer cut-and-paste. The lawyers’ lawyer said that with ChatGPT, Schwartz “was playing with live ammo. He didn’t know because technology lied to him.” When Schwartz went back to ChatGPT to “find” the cases, “it doubled down. It kept lying to him.” It made them up out of digital ether. “The world now knows about the dangers of ChatGPT,” the lawyers’ lawyer said. “The court has done its job warning the public of these risks.” The judge interrupted: “I did not set out to do that.” For the issue here is not the machine, it is the men who used it.

The courtroom was jammed, sending some to an overflow courtroom to listen. There were some reporters there, whose presence the lawyers noted as they lamented their public humiliation. The room was also filled with young, dark-suited law students and legal interns. I hope they listened well to the judge (and I hope the journalists did, too) about the real obligations of truth.

ChatGPT is designed to tell you what you want it to say. It is a personal propaganda machine that strings together words to satisfy the ear, with no expectation that it is right. Kevin Roose of The New York Times asked ChatGPT to reveal a dark soul and he was then shocked and disturbed when it did just what he had requested. Same for attorney Schwartz. In his questioning of the lawyer, the judge noted this important nuance: Schwartz did not ask ChatGPT for an explanation and case law regarding the somewhat arcane — especially to a personal-injury lawyer usually practicing in state courts — issues of bankruptcy, statutes of limitation, and international treaties in this case of an airline passenger’s knee and an errant snack cart. “You were not asking ChatGPT for an objective analysis,” the judge said. Instead, Schwartz admitted, he asked ChatGPT to give him cases that would bolster his argument. Then, when doubted about the existence of the cases by opposing counsel and judge, he went back to ChatGPT and it produced the cases for him, gibberish and all. And in a flash of apparent incredulity, when he asked ChatGPT, “Are the other cases you provided fake?” it responded as he doubtless hoped: “No, the other cases I provided are real.” It said they could be found on reputable legal databases such as LexisNexis and Westlaw, which Schwartz did not consult. The machine did as it was told; the lawyer did not. “It followed your command,” noted the judge. “ChatGPT was not supplementing your research. It was your research.”

Schwartz gave a choked-up apology to the court and his colleagues and his opponents, though as the judge pointedly remarked, he left out of that litany his own ill-served client. Schwartz took responsibility for using the machine to do his work but did not take responsibility for the work he did not do to verify the meaningless strings of words it spat out.

I have some empathy for Schwartz and his colleagues, for they will likely be a long-time punchline in jokes about the firm of Nebbish, Nebbish, & Luddite and the perils of technological progress. All the firm’s associates are now undergoing continuing legal education courses in the proper use of artificial intelligence (and there are already lots of those). Schwartz has the ill luck of being the hapless pioneer who came upon this new tool when it was three months in the world, and was merely the first to find a new way to screw up. His lawyers argued to the judge that he and his colleagues should not be sanctioned because they did not operate in bad faith. The judge has taken the case under advisement, but I suspect he might not agree, given their failure to follow through when their work was doubted.

I also have some anthropomorphic sympathy for ChatGPT, as it is a wronged party in this case: wronged by the lawyers and their blame, wronged by the media and their misrepresentations, wronged by the companies — Microsoft especially — that are trying to tell users just what Schwartz wrongly assumed: that ChatGPT is a search engine that can supply facts. It can’t. It supplies credible-sounding — but not credible — language. That is what it is designed to do. That is what it does, quite amazingly. Its misuse is not its fault.

I have come to believe that journalists should stay away from ChatGPT, et al., for creating that commodity we call content. Yes, AI has long been used to produce stories from structured and limited data: sports games and financial results. That works well, for in these cases, stories are just another form of data visualization. Generative AI is something else again. It picks any word in the language to place after another word based not on facts but on probability. I have said that I do see uses for this technology in journalism: expanding literacy, helping people who are intimidated by writing and illustration to tell their own stories rather than having them extracted and exploited by journalists, for example. We should study and test this technology in our field. We should learn about what it can and cannot do with experience, rather than misrepresenting its capabilities or perils in our reporting. But we must not have it do our work for us.

Besides, the world already has more than enough content. The last thing we need is a machine that spits out yet more. What the world needs from journalism is research, reporting, service, solutions, accountability, empathy, context, history, humanity. I dare tell my journalism students who are learning to write stories that writing stories is not their job; it is merely a useful skill. Their job as journalists is to serve communities and that begins with listening and speaking with people, not machines.


Image: Lady Justice casts off her scales for the machine, by DreamStudio

Posner’s dangerous thinking

Mike Masnick on techdirt points us to some dangerous and incomplete thinking from Judge Richard Posner on his blog. At the bottom, Posner writes:

Expanding copyright law to bar online access to copyrighted materials without the copyright holder’s consent, or to bar linking to or paraphrasing copyrighted materials without the copyright holder’s consent, might be necessary to keep free riding on content financed by online newspapers from so impairing the incentive to create costly news-gathering operations that news services like Reuters and the Associated Press would become the only professional, nongovernmental sources of news and opinion.

Good God. Posner is not just trying to mold the new world to old laws – which is problem enough – but is trying to change the law to protect the old world and its incumbents from the new world and its innovators. He is willing to throw out fair comment and free speech for them. That is dangerous.

Posner’s thinking is incomplete in a few ways. First, he is ignorant of the imperatives of the link economy. The links and discussion he wants to outlaw are precisely how content is distributed and how value is added to it in the new media economy.

Second, as Masnick points out, Posner assumes that journalism as it was done is journalism as it should be done: that the goal is to protect newsrooms, unchanged. But there are tremendous savings to be had thanks to the link economy: do what you do best, link to the rest.

Note how The New York Times and The Guardian – not to mention the Huffington Post and Andrew Sullivan – covered the Iran crisis. They linked. Links made their journalism complete. So did readers. The Times has three editors for every writer, but in the blog there was no need – no opening – for them. There was no need for production or design. The new news organization can and will operate at a different scale from the old one, because it can and because it must. So what is Posner protecting besides the old budget and payroll? He’s not protecting journalism – or rather, he’s protecting it only from progress.

No, sir, the news industry – and the law – must be updated for this new world and so must your thinking.

LATER: Here’s Matt Welch at Reason.

The craigslist (read: internet) witchhunt

The internet – in the form of the latest kerfuffle over craigslist – is exposing an anachronism of law in society.

I’ve seen reference lately to attorneys general and law-enforcement officials saying that the craigslist community policing itself isn’t enough. Said the Wall Street Journal: “Some large Internet communities are coming to a controversial conclusion: the Web can’t always police itself.” That’s why, they argue, they need to swoop in to save us from sex.

But the truth is that this episode only shows the gap between the law and the community. Craigslist’s community does police itself against the things that matter to it: fraud, spam, trolls. That’s how craigslist’s founder, Craig Newmark, spends his days, in customer service: policing against the things that bother and matter to his community. But sex? Who gives a damn? Clearly, the community doesn’t think it needs to be protected from that. So who are these cops protecting and from what?

That’s a fascinating aspect of the culture of the internet: It shows what really matters to a community and what does not matter and that, in turn, reveals how out of touch laws and those who make and enforce them can be. Craigslist is a society and it has its own laws and means of enforcement.

Can the law, like media, still be one-size-fits-all? Well, of course, to some extent, it must be. We need consistent laws across society that define everything from fraud to murder; that is the foundation of society. But within a society there are other societies. And so, in the U.K., there have long been religious courts that deal with disputes in the Jewish and Muslim communities. The laws of society still stand over them (thank God) and members of the community retain the right to call on those laws. Online, we also have communities that cut across borders and have their own rules of behavior. Indeed, even games become societies with laws and consequences. As Lawrence Lessig famously said, code is law, for it prescribes behavior exactly. Laws come into conflict with laws.

And so, once again, the internet becomes a threat to the control and power of an elite and they are exploiting craigslist – and the murderer who used it – to reassert their control. But it has the marks of a witchhunt. Craigslist’s blog this weekend writes about the attorney general of South Carolina going after it even though craigslist promotes these supposed sins less than others. The blog says: “And FWIW, telephone yellow pages and other local print media have both companies beat hands down as adult service ad venues for South Carolina. Any interest in targeting them for criminal prosecution? Didn’t think so.” This weekend, I was also glad to hear craigslist CEO Jim Buckmaster go on the offensive against the offended on On the Media.

I’ll be writing more about the law after the internet soon. I have lawyers on the brain.

(Disclosure: Craig Newmark is a friend and an investor in Daylife, where I’m a partner.)

For bloggers: A stay-out-of-jail card

My colleague at CUNY, Prof. Geanne Rosenberg, has just put up an online course for bloggers and media practitioners of any stripe with the 10 things you need to know to stay out of court.

It’s quick, clear, easy, and fun with videos and quizzes. This was produced with experts from the Berkman Center at Harvard and the Media Law Resource Center. The course is funded by the Knight Foundation and its Knight Citizens News Network.

The 10 rules to blog by:
1. Check your facts.
2. Avoid virtual vendettas.
3. Obey the law.
4. Weigh promises.
5. Reveal secrets selectively.
6. Consider what you copy.
7. Learn recording limits.
8. Don’t abuse anonymity.
9. Shun conflicts of interest.
10. Seek legal advice.

The press release says:

Each rule in the educational module is aimed at helping citizen journalists avoid lawsuits; each rule serves as an entry point for more in-depth material. While other educational materials on online publication are organized by legal doctrines such as libel, privacy, laws of access, and intellectual property law, the “Top Ten Rules” are organized around practical guidelines for safer and more effective journalistic conduct.

The module aims to educate citizen journalists about legal hotspots, help them distinguish between genuine legal problems and intimidation tactics, learn simple practical steps to reduce legal risk, find additional resources and information, understand rights related to news gathering, and recognize when to reach out for a lawyer’s advice.

I’m included in the credits but this is all Prof. Rosenberg — and good thing, since I don’t even play a lawyer on TV. All I did was say that I wish bloggers and citizen journalists had this kind of help and there was Knight to fund it and Geanne to write it. So here is a gift to bloggers from them and CUNY.

But wait, there’s more: For a graduate-level course with lots of in-depth details, the amazing Berkman is, at the same time, putting online a legal guide with information on such topics as setting up a publishing business.

The lowest common denominator of speech

Martin Stabe points to another legal story that is getting too little coverage here, with links to a news story and a FindLaw analysis about a New York court refusing to protect an American author from a UK “libel tourism” judgment over a book that sold a mere 23 copies in England.

What’s profoundly frightening about this is that we in America could find ourselves subject to the UK’s libel and privacy laws, which throw free speech to the wolves in defense of privacy.

In other words, thanks to the internet, we could be subject to the lowest common denominator of protection of speech against libel actions.

One could imagine it would get even worse: Couldn’t we be subject to Islamic theocracies’ prohibitions against criticizing Muhammad or dictators’ laws against criticizing them? If other legal systems can reach out to us and our speech here, then corporations — publishers, networks, service providers — will end up chilling our speech to limit their own risk.