• @brlemworld@lemmy.world
    9 months ago (edited)

    I would point out that Google has been “carbon neutral” with its data centers for quite some time, unlike others who still rape the environment, ahem, AWS.

  • @repungnant_canary@lemmy.world
    9 months ago

    I’m genuinely curious where their penny-pinching went. All these tech companies shove ads down our throats and steal our privacy, justifying it by saying they operate at a loss and need to increase income. But suddenly they can afford to spend huge amounts on some shit that won’t bring them any more income. How do they justify that?

    • @conciselyverbose@sh.itjust.works
      9 months ago

      It’s another untapped market they can monopolize. (Or just run at a loss, because investors are happy with another imaginary pot of gold at the end of another rainbow.)

    • @HappycamperNZ@lemmy.world
      9 months ago

      Perception. If a company isn’t on the leading edge we don’t consider them the best.

      Regardless of whether you use them or not, if Google didn’t touch AI but Edge did, you’d believe Edge is more advanced.

    • @afraid_of_zombies@lemmy.world
      9 months ago

      Because data is king, and sessions are going to be worth a lot more than searches. Consider the following:

      1. Talk to a LLM about what product to buy

      2. Search online for a product to buy

      Which one gives out more information about yourself?

  • @afraid_of_zombies@lemmy.world
    9 months ago

    This is terrible. Why don’t we build nuclear power plants, roll out a carbon tax, and create incentives for companies to generate their own energy via renewables?

    You know the shit that we should have been doing before I was born.

    • @Ragnarok314159@sopuli.xyz
      9 months ago

      Yes, but now we can get much worse results and three pages of ads for ten times the energy cost. Capitalism at its finest.

  • @mctoasterson@reddthat.com
    9 months ago

    The annoying part is how many mainstream tech companies have ham-fisted AI into every crevice of every product. It isn’t necessary and I’m not convinced it results in a “better search result” for 90% of the crap people throw into Google. Basic indexed searches are fine for most use cases.

        • Balder
          9 months ago (edited)

          I always felt like I was alone in this thinking. I think anyone with a bit of a security mindset doesn’t want everything connected; besides, it makes devices more expensive and easier to break.

          • @afraid_of_zombies@lemmy.world
            9 months ago

            It definitely has to walk in the desert for a while. I know multiple people who like it for some stuff. Like cameras and managing air conditioning.

  • ben
    9 months ago

    I skimmed the article, but it seems to be assuming that Google’s LLM is using the same architecture as everyone else. I’m pretty sure Google uses their TPU chips instead of a regular GPU like everyone else. Those are generally pretty energy efficient.

    That, and they don’t seem to consider how many responses are simply cached for identical questions. A lot of Google searches are going to be identical anyway, because search suggestions funnel people into the same form of a question.
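The caching point above can be sketched as a normalized-query cache. This is an illustrative toy, not Google’s actual infrastructure; all names here are made up:

```python
# Toy sketch: search suggestions funnel users into the same canonical
# question, so a cache keyed on the normalized query can answer repeats
# without re-running the expensive model or search.
import hashlib


class AnswerCache:
    def __init__(self):
        self._store = {}
        self.hits = 0
        self.misses = 0

    @staticmethod
    def _key(query: str) -> str:
        # Normalize so trivially different phrasings collapse together.
        normalized = " ".join(query.lower().split())
        return hashlib.sha256(normalized.encode()).hexdigest()

    def get_or_compute(self, query, expensive_fn):
        k = self._key(query)
        if k in self._store:
            self.hits += 1
        else:
            self.misses += 1
            self._store[k] = expensive_fn(query)  # the costly LLM/search call
        return self._store[k]


cache = AnswerCache()
for q in ["how tall is mount everest",
          "How tall is Mount Everest",
          "how tall  is mount everest"]:
    cache.get_or_compute(q, lambda q: f"answer({q})")

print(cache.hits, cache.misses)  # → 2 1
```

Only the first phrasing pays the full cost; the two variants collapse onto the same cache key.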

    • @AlecSadler@sh.itjust.works
      9 months ago

      I hadn’t really heard of the TPU chips until a couple weeks ago when my boss told me about how he uses USB versions for at-home ML processing of his closed network camera feeds. At first I thought he was using NVIDIA GPUs in some sort of desktop unit and just burning energy…but I looked the USB things up and they’re wildly efficient and he says they work just fine for his applications. I was impressed.

      • ben
        9 months ago

        Yeah they’re pretty impressive for some at home stuff and they’re not even that costly.

      • @dan@upvote.au
        9 months ago

        The Coral is fantastic for use cases that don’t need large models. Object recognition for security cameras (using Blue Iris or Frigate) is a common use case, but you can also do things like object tracking (track where individual objects move in a video), pose estimation, keyphrase detection, sound classification, and more.

        It runs Tensorflow Lite, so you can also build your own models.

        Pretty good for a $25 device!

    • @kromem@lemmy.world
      9 months ago

      Exactly. The difference between a cached response and a live one, even for non-AI queries, is an order of magnitude.

      At this point, a lot of people just care about the ‘feel’ of anti-AI articles even if the substance is BS though.

      And then publishers just feed them whatever gets clicks and shares.

    • @dan@upvote.au
      9 months ago

      I’m pretty sure Google uses their TPU chips

      The Coral ones? They don’t have nearly enough RAM to handle LLMs. They only support small Tensorflow Lite models.

      They might have some custom-made non-public chips though - a lot of the big tech companies are working on that.

      instead of a regular GPU

      I wouldn’t call them regular GPUs… AI use cases often use products like the Nvidia H100, which are specifically designed for AI. They don’t have any video output ports.
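The RAM point can be made concrete with back-of-envelope arithmetic. The numbers below are my assumptions, not figures from the thread: a 7B-parameter model at fp16, and roughly 8 MB of on-chip memory for the Coral Edge TPU:

```python
# Back-of-envelope sketch: why an edge accelerator like the Coral
# can't hold an LLM. Weights alone for a 7B-parameter model at fp16
# (2 bytes/param) dwarf the Coral's on-chip memory budget.
def weights_gib(params: float, bytes_per_param: int) -> float:
    """Memory needed for model weights, in GiB."""
    return params * bytes_per_param / 2**30


llm = weights_gib(7e9, 2)   # ~13 GiB just for the weights
coral_mib = 8               # assumed on-chip memory, in MiB

print(f"7B LLM: {llm:.1f} GiB vs Coral: {coral_mib} MiB")  # → 7B LLM: 13.0 GiB vs Coral: 8 MiB
```

That gap is three to four orders of magnitude before you even count activations or the KV cache, which is why only small Tensorflow Lite models fit.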

  • @Lyricism6055@lemmy.world
    9 months ago

    I switched to Kagi like 6 months ago and I still love it. I almost never have to go back to Google, except for maps.

  • @DarkCloud@lemmy.world
    9 months ago

    If these guys gave a shit they’d focus on light-based (photonic) chips, which are still at a very early stage but promise big power savings.

  • @lone_faerie@lemmy.blahaj.zone
    9 months ago

    AI is just what crypto bros moved onto after people realized that was a scam. It’s immature technology that uses absurd amounts of energy for a solution in search of a problem, being pushed as the future, all for the prospect of making more money. Except this time it’s being backed by major corporations because it means fewer employees they have to pay.

    • @afraid_of_zombies@lemmy.world
      9 months ago

      energy for a solution in search of a problem,

      Except this time it’s being backed by major corporations because it means fewer employees they have to pay.

      Ah yes, the classic “it’s useless, and here’s a use for it” logic.

      • Traister101
        9 months ago

        I take it you haven’t had to go through an AI chatbot for support before, huh?

        • @afraid_of_zombies@lemmy.world
          9 months ago

          I have, and I don’t see the relevance. The argument is that it’s useless, and then it mentions a use case. If you want to say it’s crap I won’t argue the point, but you can’t say both X and not-X.

    • @pycorax@lemmy.world
      9 months ago

      There are legitimate uses of AI in certain fields like medical research and 3D reconstruction that aren’t just a scam. However, most of these are not consumer facing and the average person won’t really hear about them.

      It’s unfortunate that what you said is very true on the consumer side of things…

      • @Ragnarok314159@sopuli.xyz
        9 months ago

        I would love to see an AI make an ANSYS model that isn’t shit. They might be able to make cute pictures, but when it comes to making models for CFD or FEA, AI is a complete waste of time.

  • ArchRecord
    9 months ago

    If only they did what DuckDuckGo did: make it pop up only in very specific circumstances, primarily drawing from current summarized Wikipedia information in addition to its existing context, and let the user turn it off completely with one click of a settings toggle.

    I find it useful in DuckDuckGo because it’s out of the way, unobtrusive, and only pops up when necessary. I’ve tried using Google with its search AI enabled, and it was the most unusable search engine I’ve used in years.

      • ArchRecord
        9 months ago

        I haven’t had any problems myself.

        In fact, I regularly use their anonymized LLM Chat tab to help out with restructuring data, summarizing some more complex topics, or finding some info that doesn’t readily appear near the top of search. It’s made my search experience (again, specifically in my circumstance) much better than before.

  • @ForgottenFlux@lemmy.worldOP
    9 months ago

    Summary:

    • AI’s rapid growth has transformed digital life, but its significant environmental impact remains largely unchecked.
    • AI-powered features can consume up to 10 times more electricity than traditional searches, potentially equating to a country’s power usage.
    • The proliferation of energy-intensive data centers powering AI is outpacing the electric grid’s capacity, forcing utilities to maintain fossil fuel plants for reliability.
    • Estimates suggest AI could account for 9% of U.S. energy demand by 2030, substantially contributing to climate change.
    • Lack of industry transparency and mandatory reporting makes quantifying AI’s full environmental toll difficult.
    • Tech companies negotiate discounted utility rates, shifting costs to ratepayers and reducing incentives for energy efficiency.
    • Government regulation has been slow and industry-influenced, focusing on hypothetical future risks over current, tangible harms.
    • The burden of AI’s environmental impact disproportionately falls on Global South communities where data centers are located.
    • Tech companies resist mandatory disclosures, prioritizing profits over sustainability while the public bears the physical costs.
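A rough sanity check of the 10x bullet above, using commonly cited per-query estimates (~0.3 Wh for a traditional search, ~3 Wh for an LLM-assisted one) and an assumed global query volume. None of these figures come from the article; treat the result as an order-of-magnitude illustration only:

```python
# Rough arithmetic behind the "up to 10 times more electricity" claim.
SEARCH_WH = 0.3           # assumed Wh per traditional search
AI_SEARCH_WH = 3.0        # assumed Wh per LLM-assisted search
QUERIES_PER_DAY = 9e9     # assumed global daily search volume


def annual_twh(wh_per_query: float, per_day: float = QUERIES_PER_DAY) -> float:
    """Annual energy in TWh for a given per-query cost."""
    return wh_per_query * per_day * 365 / 1e12


plain = annual_twh(SEARCH_WH)
ai = annual_twh(AI_SEARCH_WH)
print(f"plain: {plain:.1f} TWh/yr, AI: {ai:.1f} TWh/yr, ratio: {ai / plain:.0f}x")
```

Under these assumptions, AI-assisted search alone lands in the ~10 TWh/yr range, comparable to the annual electricity consumption of a small country, which is the comparison the summary is gesturing at.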