Stanford Study Reveals Hallucinations in Lexis and Westlaw AI Tools
Read the whole thing here. Key takeaways:
US Supreme Court Chief Justice Roberts' annual Year-End Report on the Federal Judiciary noted "disturbing" reports of "hallucinations" in legal briefs: citations to false or, worse, non-existent legal authority.
Lexis and Westlaw market their AI tools as hallucination-free and reliable.
In this context, hallucinating means citing non-existent cases, among other egregious flaws.
WEXIS (Westlaw and Lexis) are supposed to be the gold standard. And yet ...
Their AIs hallucinate to a frightening degree.
I don't think we'll be replaced anytime soon. Our best resource is ourselves.
Lawyers have an ethical obligation of technological competence; its corollary is knowing which technologies not to fully trust at the moment.