My first reaction was “who gives a fuck?” Then I got to the part of the article that says:
His website, which also features the purple dragon and a bunch of busted links in the footer, says that the firm “integrates AI to lower the cost of legal services.”
Which is honestly a thousand times more concerning than how he chooses to display his silly logo. Dude is writing legal documents with AI. At least his lack of professionalism is obvious.
Pretty sure all law firms do that nowadays. And they are quite stupid if they don’t. As long as they check the output…
Pretty sure I’m not gonna hire you to do any professional work for me.
I would imagine they’d be stupid if they did use AI. I’ve seen people use AI to “write” technical documentation that I have had to review. That shit goes straight into the bin, because the time I spend fixing all the AI nonsense is about the same amount of time it would take me to write the document myself. It’s gotten to the point where I straight up reject all AI-generated documentation, because I know fixing it is a waste of time.
I imagine legal documents have to be at least as precise as technical documents, so if they’re actually checking the output, I seriously doubt they’re saving any time or money by using AI.
And anytime I see anyone advocating this crap, it’s always because it gets the job done “faster”. The rule is “fast, cheap, good: pick two”, and LLMs don’t break that rule.
Yeah, they get it done super fast, and super shitty. I’ve yet to see anyone explain how an LLM gets the job done better, not even the most rabid apologists.
LLMs have zero fidelity, and information without fidelity is just noise. They are not good at information work. In fact, I don’t see how you get information with fidelity without a person in the loop; on a fundamental, philosophical level, I don’t think it’s possible. Fidelity requires truth, truth requires meaning, and I don’t think you get a machine that understands meaning without AGI.