• @boonhet@lemm.ee
    link
    fedilink
    English
    0
    edit-2
    6 months ago

    AI AI AI AI

    Yawn

    Wake me up if they figure out how to make this cheap enough to put in a normal person’s server.

    • @gravitas_deficiency@sh.itjust.works
      0
      6 months ago

      You can get a Coral TPU for 40 bucks or so.

      You can get an AMD APU with a NN-inference-optimized tile for under 200.

      Training can be done with any relatively modern GPU, with varying efficiency and capacity depending on how much you want to spend.

      What price point are you trying to hit?

      • @boonhet@lemm.ee
        0
        6 months ago

        What price point are you trying to hit?

        With regards to AI? None, tbh.

        With this super fast storage, I have other cool ideas, but I don’t think I can get enough bandwidth to saturate it.

        • @barsoap@lemm.ee
          0
          6 months ago

          With regards to AI? None, tbh.

          TBH, that might be enough. Stuff like SDXL runs on 4G cards (the trick is using ComfyUI, around 5-10 s/it), and reportedly smaller LLMs too (haven’t tried, not interested). And the reason I’m eyeing a 9070 XT isn’t AI, it’s finally upgrading my GPU; still, it would be a massive fucking boost for AI workloads.

          • @bassomitron@lemmy.world
            0
            6 months ago

            No, I think they’re saying they’re not interested in ML/AI. They want this super fast memory available for regular servers for other use cases.

              • caseyweederman
                0
                6 months ago

                I have a hard time believing anybody wants AI. I mean, AI as it is being sold to them right now.

                • @boonhet@lemm.ee
                  0
                  6 months ago

                  I mean, the image generators can be cool, and LLMs are great for bouncing ideas off at 4 AM when everyone else is sleeping. But I can’t imagine paying for AI, don’t want it integrated into most products, and don’t want to put a lot of effort into hosting a low-parameter model that performs way worse than ChatGPT without a paid plan. So you’re exactly right: it’s not being sold to me in a way that I would want to pay for it, or invest in hardware resources to host better models.