To be fair if anyone is motivated to discover flaws in testing methodology and publicly disclose them right now it’s Labs.
I want to see AI Voyager episodes like people are doing for The Simpsons and Family Guy. Basic 3D models interacting with each other using generated dialog.
Just Janeway and crew wandering the Delta quadrant violating the prime directive in the name of coffee.
It’s getting hard to do just between AMD and Nvidia on Windows.
I’m old enough to remember the days when reviewers showed macro shots of the wires in half life 2 to test AA between different cards.
Does anyone even test that enabling “Ultra” settings results in the same configuration across vendors/generations? I’m pretty sure LTT Labs found cases where it wasn’t.
I found a polearm that happened to have decent base damage, and nothing else. Sold it for $20 on the RMAH and ended up using that to buy the expansion when I finally came back.
Can’t stand the way it works now either though. It’s basically one of those idle games now. You just play the same shit no matter what difficulty. The only difference is the number of zeros on the damage numbers as you gradually gear up to whatever the season armor is.
That’s what keeps people coming back to D2R. You get a new piece of gear and suddenly you can run areas that you couldn’t before. You have that carrot of maybe one day getting an enigma or eBotD, or you’ll get a good drop for another class and now you’re levelling up an alt so they can use that gear.
I’m content to wait a few years to see if it gets any better. I bought D3 on launch and didn’t come back until a year after the expansion.
With the amount of microtransactions in the game it’s only a matter of time before it goes on sale for like $15, or goes free to play. I’ll get it then.
It’s between Apple and Framework for me for my next laptop. The question is: do I want a laptop that I can infinitely repair and upgrade, or do I want a laptop that actually has battery life when I pull it out of my bag because it has a functioning sleep mode? Thanks Intel. Maybe make sure your processors are actually power efficient before axing S3 sleep.
My wife switched from iPhone to Android and has the opposite problem. She never dismisses her notifications, so her notification bar is constantly full and she’ll have 30+ notifications in the tray.
My OCD couldn’t handle that. If I get a message that I don’t want to respond to I leave it there because I know that eventually the OCD will overcome the social anxiety and I’ll have to respond to it.
There is no continent called “America”. We have North America and South America.
When someone says “South American” I don’t think Alabama I think Brazil or Argentina.
The term “North American” is commonly used when you’re describing something that applies to both Canada and the US. Eg. “North American sports teams”.
We commonly use the term “Central American” when referring to Mexico, El Salvador, etc. because even though they are technically in North America there is a strong cultural divide, similar to how the Middle East is technically Asia, but you’d never refer to someone from Saudi Arabia as “Asian”.
D4 was dead the moment they announced D2R. Why would I pay $80 for a game with microtransactions and battlepass when I can pay $50 for a game that comes complete in box?
They should have taken D2R, kept the mechanics, and just rolled out new classes, maps, and items.
I don’t want a new game, I just want more content.
Don’t forget that a large chunk of that money also goes to the creators. It’s significantly more than they get from showing you an ad.
My laptop is 4 years old at this point. I spent $2400 on it because I wanted something future-proof, and while it’s still plenty fast with its 10th-gen Intel processor and 32 GB of RAM, knowing that I could drop $500 and upgrade to the latest AMD or Intel chip makes me wish I could have held out another year and gotten the Framework.
Given that we’ve more or less peaked in terms of non-gaming performance, I probably won’t be buying another laptop until this one dies, but my next laptop will be a Framework, without question.
Google didn’t buy HTC. They bought the parts of the company responsible for making the first Pixel phone.
HTC is still a separate entity. They just don’t release 25 phones/year now, and all of their stuff is mid-range garbage.
My favourite Jellyfin feature is the one where it doesn’t ‘accidentally’ reset my dashboard every few weeks to promote its garbage free streaming partners. I don’t think I could give that feature up.
I have a $2 USB C cable I got off of Ali that I use to charge my laptop at 65W. It’s rated for 100W but I have no way of testing it.
It’s actually higher quality than any official Apple cable I’ve used, although that’s a pretty low bar.
The first USB-C Android phones were also only USB 2.0.
Although that was 8 years ago, when USB 3 was only just starting to become commonplace.
The SoC lacks the hardware. Even the USB C iPads with A series chips operate at 2.0 speeds. They can only do 5Gbit in host mode, like with an external SSD. Plugged in to a computer they are 2.0.
I would imagine future chips will have the capability, once the Pro chips trickle down to the base models.
If Debian works on your hardware and you just want something that works and doesn’t give you issues, then yes, it’s a good choice. It will just work happily in the background for years.
Fedora Server is a great choice if it’s something you want to continuously tinker with. Each release averages a little over 1 year of support, so you’ll want to do a dist upgrade after each new version comes out.
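For reference, the dist upgrade is only a few dnf commands. A sketch, assuming a recent Fedora and the system-upgrade plugin; the release number 39 here is just an example:

```shell
# Install the offline-upgrade plugin (if not already present)
sudo dnf install dnf-plugin-system-upgrade

# Download all packages for the target release (39 is an example)
sudo dnf system-upgrade download --releasever=39

# Reboot into the offline upgrade; the system comes back up on the new release
sudo dnf system-upgrade reboot
```

It’s worth checking the release notes for your specific jump first, since plugin packaging has moved around between Fedora versions.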
I’m currently considering switching to it on a couple of production servers I manage because they rely on PostGIS. EL9 and Debian rely on the official Postgres repositories rather than shipping their own .rpms/.debs, and that repository’s GIS packages are so unreliable I think it would be more stable on Arch. With Fedora Server, however, I can just install Postgres and PostGIS from the official community repo.
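On Fedora that install is one transaction from the distro’s own repos. A sketch, assuming current Fedora package names (postgresql-server, postgis):

```shell
# PostgreSQL server and PostGIS, both from Fedora's community repos
sudo dnf install postgresql-server postgis

# Initialize the cluster and start the service (Fedora-specific helper)
sudo postgresql-setup --initdb
sudo systemctl enable --now postgresql
```

After that, `CREATE EXTENSION postgis;` in the target database enables the GIS types and functions.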
Ubuntu deviates from accepted standards too often (Mir, Upstart, Snap) thanks to Canonical’s ham-fisted attempts to redefine Linux.
Arch has a tendency to break due to the maintainers’ commitment to staying true to upstream. Too often you end up on the Arch wiki looking up how to solve small issues that should have been fixed in the original PKGBUILD.
Gentoo: not everyone wants to compile everything from source.
Debian’s commitment to FOSS results in frequent incompatibilities (both SW and HW) out of the box.
Fedora is the perfect middle ground. It implements the latest technology standards as soon as they are stable (e.g. Wayland, Btrfs by default), stays fairly close and true to upstream while maintaining package stability, and overall just works with a large variety of packages.
Fedora is for people who use Linux as a tool rather than a hobby.
If a game can’t run on the Series S, it means it also can’t be ported to the PC. Turn down the resolution and graphics settings until you get the same fps target and continue on with your day.
I would expect any game from a developer that complains about this to be so poorly optimized that it runs on the bigger consoles like it would on the Series S, and likely to have garbage gameplay as well, because they spent all of their budget on graphics.
It won’t work as live action, and Sanderson is probably a little cagey after working on Amazon’s WoT. He would need to have absolute control over the writing.
Wax and Wayne in the style of Fullmetal Alchemist would be amazing though. Comedic but also serious.