  • This is a patch from the hardware vendor, so I assume the ask is not that the vendor take responsibility but that they not release buggy hardware in the first place. That is what I mean about the validation issue.

    The attack vector is described in the patch itself, so it isn’t purely theoretical.

    There is a comment from Linus about how this patch is only needed for some hardware and doesn’t apply to others, but I don’t see the relevance: different hardware is validated against different use cases, and the underlying logic may be entirely disparate.

    So my point about validation is simply that bugs happen. My question here is: what more should a hardware vendor do beyond submitting a kernel patch? You can’t just not have the bug, and if you recall the part, a competitor will keep theirs in the field, take all the market share, and roll the dice that their own bugs never get exploited.


  • Is this really the hardware vendor’s problem, though? It’s the consumer’s problem.

    I bring up full validation because the concern here is landing a speculative fix. If the ask is “why was the hardware like that in the first place?”, the answer is that it can’t be fully validated. If the ask is “why should a speculative fix go into the kernel?”, it is because consumers are not running top of tree: even if a fix has a chance of never being exploited, it needs to be pulled in years ahead so it lands in an LTS release that customers migrate to BEFORE the issue comes up.


  • Fully validating hardware is an insane task that hasn’t really been done in years. It would mean 5 years between chip releases and a 2-5X increase in production cost, and people wouldn’t follow the validated configs anyway. If we stuck to the fully validated hardware spec we would have 50-minute boot times and never go past a 3.5GHz clock.

    People already have the choice of whether they want to run on validated hardware. You can opt for a 2.8GHz part supporting 2666MT/s that is mostly tested and validated, or a 5GHz part supporting 6000MT/s that is only partially validated. They cost the same. Which do folks think people pick?



  • The intent is to make the distributed version more true to the real original. None of us got to see the original. The original is a bunch of data on various machines. What we saw was a low-quality save file of the original, cut down and watered down to the specs of 4:3 CRT televisions and broadcast hardware of the time. That version develops unintended artefacts when distributed on modern media.

    Now, this probably isn’t using the original source files, but it is possible. “Remaster” as a term is also used when they take the final master copy and rerun it through more modern technology to get a cleaner output, which is what I expect happened here.



  • Every PC will be using AI as we move forward, and thinking they won’t seems as head-in-the-sand to me as thinking the Internet would be a fad. Remember how awful the Internet was in the ’80s and ’90s? AI is in a similar spot today.

    Why would I read a manual when I can ask an AI to summarize it and point me to the pages so I can confirm? If I’m trying to do a task I know a million people have already solved, like writing Python code to translate XLSX and CSV to JSON and back (see the sketch at the end of this comment), why wouldn’t I use AI for that?

    Trusting AI outright and not reviewing its answers is silly, but doing research with AI is soooo much faster. Also, the majority of articles and manuals you find online that were written in the past year used AI anyway, and you can have Copilot spit the answer out to you WITH the original sources that the website/blog hides.

    The idea that AI isn’t trustworthy is silly, because no one is trustworthy. You should always have been double-checking things for yourself, but sitting and struggling through something for 2 days is foolish when AI can do 80% of the work for you in seconds.
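
    For what it’s worth, here is the kind of throwaway script I mean: a minimal sketch of the CSV↔JSON half using only Python’s standard-library csv and json modules. The file names are hypothetical, and XLSX would need a third-party package such as openpyxl, which I’ve left out.

    ```python
    # Minimal CSV <-> JSON round trip, standard library only.
    # XLSX support would need a third-party package (e.g. openpyxl), omitted here.
    import csv
    import json

    def csv_to_json(csv_path, json_path):
        # Read a CSV with a header row into a list of dicts, dump it as JSON.
        with open(csv_path, newline="", encoding="utf-8") as f:
            rows = list(csv.DictReader(f))
        with open(json_path, "w", encoding="utf-8") as f:
            json.dump(rows, f, indent=2)

    def json_to_csv(json_path, csv_path):
        # Read a JSON list of flat objects and write it back out as CSV.
        # Assumes a non-empty list; column order comes from the first object.
        with open(json_path, encoding="utf-8") as f:
            rows = json.load(f)
        with open(csv_path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
            writer.writeheader()
            writer.writerows(rows)

    # Hypothetical usage: round-trip data.csv -> data.json -> data_roundtrip.csv
    csv_to_json("data.csv", "data.json")
    json_to_csv("data.json", "data_roundtrip.csv")
    ```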