This feature is part of ‘The Dotted Line’ series, which takes an in-depth look at the complex legal landscape of the construction industry. To see the entire series, click here.
Two and a half years ago, Construction Dive asked ChatGPT, OpenAI's large language model, to generate a construction contract for a design-bid-build job on a 600-unit mixed-use project in San Jose, California. The document, lawyers said, could be legally enforceable: It covered the scope of work, payment terms, a termination clause and sections on indemnification, insurance and change orders.
But it also came with missing clauses and poor risk management. Those same lawyers warned that using artificial intelligence alone to generate a contract was akin to opening Pandora's box: The risk simply wasn't worth it.
Construction lawyers now say the technology is here to stay, and that its capabilities have improved as it has advanced into countless aspects of daily life. But after two years of improvement, can it now write a solid construction contract? More to the point, should construction lawyers rely on it to do so?

“Oh, my God, no. I’d be horrified if I heard someone was using it to generate a contract,” said Megan Shapiro, a construction attorney and partner at the law firm Radoslovich Shapiro in Sacramento, California.
While AI has uses for construction lawyers, experts cautioned that using it as a wholesale, one-stop-shop for contracts can only lead to problems.
“It’s not something where you push a couple of buttons and you have a contract ready,” said Michael Vardaro, managing partner of the New York City-based law firm Zetlin & De Chiara.
The subject of hallucinations
Shapiro didn't mince words when it came to the accuracy of AI tools. She said it was "shocking" how wrong AI-generated content could be and how often it happened.
In fact, so-called hallucinations have been a massive problem for large language models. That has been the case since generative AI made its mainstream debut in 2022, an error developers promised would become less common as the technology evolved. Yet The New York Times reported in May that for newer AI systems, hallucination rates reached as high as 79%, depending on the test and the tool.
For lawyers, this presents a significant business and ethical issue as they seek to balance speed with accuracy. Case in point: The arrival of LLMs has led to "AI slop" in the legal system, or filings with false or misrepresented court cases, the Times reported on Nov. 7. A database that tracks instances of AI hallucination in legal systems worldwide included, at the time of publication, about 600 cases, most in the U.S.
Due diligence
Such problems present reputational and business risks, said Trent Cotney, a partner at the law firm Adams & Reese and head of its construction team. If a lawyer files a brief with AI-fabricated documents or claims, the presiding judge could consider sanctions.

“Not to mention the fact that there’s a reputational hit, and your client is obviously not going to be happy about that either,” Cotney said.
Shapiro said what keeps her up at night is the thought that her clients could be using AI on their own without talking to her about it. She fears that if someone at a construction company doesn't want to pay to call a lawyer about a legal matter, that person might try to solve the problem themselves.
"I think the risk to them is getting substantially worse, because I think consumer confidence in the output is going up, even though I think the quality of the output is going down," Shapiro said.
Finding an edge
With diligence and a keen eye, however, lawyers said there are ways AI can help legal professionals with their daily tasks.

Vardaro, for example, said one advantage of AI is its ability to take in documents and search for specific sections within them, a job that years ago fell to associates.
“Now, artificial intelligence can do that and give you sort of hits on that project file in minutes, where that might have taken weeks, if not months, to do,” Vardaro said.
Experts also cited AI's ability to review contracts for specific provisions. Cotney, who calls himself an early adopter of the technology, uses it as a gut check, especially when it comes to verifying references and citations.
“I use AI to check what I’ve done, identify if there’s something else to think about that I haven’t thought of, or help me tweak certain things or compile certain things,” Cotney said.
Shapiro will sometimes use AI in her work as a research tool, but that's where it stops for her. It helps her find and identify resources and avenues to explore that she wouldn't otherwise come across.
“Thinking about it as a lawyer or a junior associate, a young lawyer, and reviewing it that way, I definitely think there’s value there,” Shapiro said.
At the same time, lawyers said AI tools need to be used by people who already know the law, so they can flag when something the tool produces is suspect.
“It’s not a completely useless tool,” Shapiro said. “It’s just a tool that needs to be used by a user who has the knowledge and experience to take advantage of it in a safe, effective and efficient way.”
