• Teanut@lemmy.world
    1 day ago

    Nvidia’s new Digits workstation, while expensive from a consumer standpoint, should be a great tool for local inference research. $3,000 for 128GB isn’t a crazy amount for a university or other researcher to spend, especially when you look at the price of the 5090.

    • Viri4thus@feddit.org
      21 hours ago

      Why would you buy a single-use behemoth when you can buy a Strix Halo with 128GB that works as an actual tablet/laptop, has all the functionality of the behemoth, and supports decades of legacy x86 software? Truly wondering why anyone would buy that NVIDIA thing for any reason other than pure ignorance and marketing saying NV is the AI company.

    • brucethemoose@lemmy.world
      17 hours ago

      Dense models that would fit in 100-ish GB, like Mistral Large, would be really slow on that box, and there isn’t a SOTA MoE at that size yet.

      So, unless you need tons of batching/parallel requests, it’s… kinda neither here nor there?
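      A rough sanity check of the “really slow” claim, assuming single-stream decode is memory-bandwidth bound (every generated token has to stream all the weights through memory once). The ~273 GB/s bandwidth figure for the Digits box and the ~100 GB quantized weight footprint are my assumptions, not numbers from this thread:

      ```python
      # Back-of-envelope decode speed for a memory-bandwidth-bound dense model.
      # Assumed inputs (hypothetical, for illustration):
      #   bandwidth_gb_s: usable memory bandwidth of the box in GB/s
      #   weights_gb:     size of the quantized model weights in GB
      def decode_tokens_per_second(bandwidth_gb_s: float, weights_gb: float) -> float:
          """Each decoded token reads all weights once, so tok/s ~ bandwidth / size."""
          return bandwidth_gb_s / weights_gb

      # ~273 GB/s over ~100 GB of weights -> about 2.7 tokens/s, single stream.
      print(decode_tokens_per_second(273, 100))
      ```

      Batching changes the picture because many parallel requests share each weight read, which is why the comment carves out the “tons of batching” case.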

      As someone else said, the calculus changes with cheaper Strix Halo boxes (assuming those mini PCs are under $3K).