

*hippochrissy
That relationship is why, in every edition but fifth, healing is necromancy, and why Heal and Harm are identical in 3rd edition/Pathfinder: they're fundamentally the same spell, you just had to tweak the settings on one to get the other.
It's for prescription drugs.
You're not supposed to flush them or throw them away without taking precautions to make sure they don't end up in nature. Pill bottles aren't watertight and all.
Rather than see people flush them or whatever, pharmacies will often take them back, and these days they hold take-back events as an awareness thing.
Pretty sure the police do it just because they get drugs from arrests and already have what they need to do it properly.
A lot of our neurons are with us for our whole life. Early neuron degeneration is what causes Alzheimer's, Parkinson's, and similar disorders.
Not all neurons last a lifetime; some kinds die off and are replaced. But a good chunk of them aren't meant to replicate anymore, so they won't be freed of microplastics by bloodletting, and they would cause serious problems if microplastics harm their normal processes.
Regular cells die or split regularly. When they die, white blood cells eat them, and they’ll be part of filtering the blood.
Neurons don't, though, so there are still some concerns.
Oh that’s unfortunate. Well I don’t mind not supporting people like that so I’ll give it a go
Do you mean play Disco Elysium, or is there some drama associated with it?
I never had an issue with Gleba enemies.
You just don’t kill the little nests.
They can't expand the same way biters can: they only expand to wetlands, not dry land, and obviously not water. That means (at least on my world) they tend to cluster, and some areas just can't get an expansion from one wetland to the next.
That wetland rule seems to only apply to expansions; the original nests will be on dry land, but once cleared, they can't expand back to those spots.
I ended up sending a few souped-up laser-turret spidertrons to remotely clear out a bunch by hand, and the nests just failed to come back. I did have to kill a ton of nests, but by leaving the little nests alone, they don't expand back, and little nests can't send out expansion groups of their own.
Pain in the ass to save the little nests while killing all the big ones, but definitely worth it. Gleba is the hub world now; every planet except Aquilo can go directly to it and back.
I just wish I could send stuff ship to ship, but Gleba's resources (except stone) are infinitely renewable anyway, so rockets aren't a problem once it's set up.
If my math is right, it's just 3500 Gleba science (before productivity).
Ship in the other needed science and labs and research the tech quick.
I believe in you.
Well, the IRS says it is accurate.
It doesn't say accurate to what standard, but I think it's pretty clear that "tax law" is the default here.
Well, you can always produce coal in orbit from asteroids and ship it down
Sounds awesome. You got yourself a little spy.
Oh dang time flies when you’re having fun exploiting people
That’s a ferret.
Old, near the top, but it still flows down. Dunno exact age. Blonde, but not everyone loses hair color.
The difference is, if this were to happen and it was found later that a fabricated case crucial to the defense was used, that's a mistrial. Maybe even dismissed with prejudice.
Courts are bullshit sometimes, it’s true, but it would take deliberate judge/lawyer collusion for this to occur, or the incompetence of the judge and the opposing lawyer.
Is that possible? Sure. But the question was “will fictional LLM case law enter the general knowledge?” and my answer is “in a functioning court, no.”
If the judge and a lawyer are colluding, or if the judge and the opposing lawyer are both grossly incompetent, then we are far beyond an improper LLM citation.
TL;DR As a general rule, you have to prove facts in court. When that stops being true, liars win, no AI needed.
Right, the internet that's increasingly full of AI material.
Nah that means you can ask an LLM “is this real” and get a correct answer.
That defeats the point of a bunch of kinds of material.
Deepfakes, for instance. International espionage, propaganda, companies who want “real people”.
To those sources, any kind of simple is_ai flag is undesirable, and their unflagged output will end up back in every LLM's training data, even an LLM that was behaving and flagging its own output.
You'd need every LLM to do this, and there are open-source models and foreign ones. And as has already been proven, you can't rely on an LLM to detect a generated product without the flag.
The correct way to do it would be to instead organize a not-AI certification for real content. But that would severely limit training data. It could happen once quantity of data isn't the be-all end-all for a model, but I dunno when or if that'll be the case.
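The asymmetry between the two approaches can be sketched in a toy example (every name and key here is made up for illustration): a self-reported is_ai flag is just metadata anyone can strip, while a signed not-AI certificate can be checked against the certifier's key, so stripping it or transplanting it onto generated content fails verification.

```python
import hashlib
import hmac

# Hypothetical certifier's secret key. A real scheme would use
# asymmetric signatures with a public verification key.
CERTIFIER_KEY = b"example-secret"

def certify_human(content: bytes) -> bytes:
    """Certifier signs content it has vetted as human-made."""
    return hmac.new(CERTIFIER_KEY, content, hashlib.sha256).digest()

def verify_human(content: bytes, tag: bytes) -> bool:
    """Anyone holding the key can check the certificate."""
    expected = hmac.new(CERTIFIER_KEY, content, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

# A self-reported flag, by contrast, is just a dict entry:
flagged = {"text": "generated text", "is_ai": True}
stripped = {"text": flagged["text"]}  # trivially removed, no way to tell

# The certificate survives only on unmodified, actually-certified content:
doc = b"a human-written article"
tag = certify_human(doc)
assert verify_human(doc, tag)                # genuine cert checks out
assert not verify_human(b"llm output", tag)  # transplanted cert fails
```

This is only meant to show why the burden of proof works in one direction: absence of an is_ai flag proves nothing, but a valid not-AI certificate is hard to fake without the certifier's key.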
The CEOs discussed it and this year the limit is “at least one more”