• @rsuri@lemmy.world

    I use it occasionally. Recently I used it to convert a written specification in a document to a Java object. It was about 95% correct, but having to manually double-check everything and fix the errors eliminated much of the time savings.
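    Something like this, to give a concrete (and entirely hypothetical) flavour of that spec-to-Java translation — the class name, fields, and constraints are invented for illustration, not taken from the real spec:

    ```java
    // Hypothetical example: each prose rule in the spec becomes a field
    // plus a validation check. All names and limits here are invented.
    public record OrderSpec(
            String orderId,   // spec: "a unique, non-empty identifier"
            int quantity,     // spec: "must be between 1 and 999"
            boolean expedited // spec: "true when rush shipping was requested"
    ) {
        public OrderSpec {
            if (orderId == null || orderId.isBlank()) {
                throw new IllegalArgumentException("orderId must be non-empty");
            }
            if (quantity < 1 || quantity > 999) {
                throw new IllegalArgumentException("quantity must be between 1 and 999");
            }
        }
    }
    ```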

    However that’s a very ideal use case. Most often I forget it exists.

    • @rolaulten@startrek.website

      I use it a fair bit. Mind, it’s mostly things like formatting a giant JSON stdout into something I actually want to read…

      I also find it useful for sketching out an outline in pseudocode.
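      For the JSON case, here is a minimal sketch of that kind of pretty-printing step, assuming a Java/Jackson setup (the comment doesn’t say which language or tool was actually used):

      ```java
      import com.fasterxml.jackson.databind.JsonNode;
      import com.fasterxml.jackson.databind.ObjectMapper;

      // Reads a raw JSON blob from stdin (e.g. piped from another command)
      // and re-serializes it with indentation so it's skimmable.
      public class PrettyPrintJson {
          public static void main(String[] args) throws Exception {
              ObjectMapper mapper = new ObjectMapper();
              JsonNode tree = mapper.readTree(System.in);
              System.out.println(mapper.writerWithDefaultPrettyPrinter()
                                       .writeValueAsString(tree));
          }
      }
      ```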

    • Pennomi

      Flying cars exist, they’re just not cost effective. AFAICT there’s no GPT that is proficient at coding yet.

        • sepi

          The more people use ChatGPT to generate low-quality code they don’t understand, the more job security and the higher the salary I get.

      • @Sylvartas@lemmy.world

        As far as I know, the main problem with flying cars right now is that they are nowhere near as idiot-proof as a normal car, and they don’t really solve any transportation problem, since most countries’ aviation regulators would require them to take off and land exclusively at airports… where you can usually find plenty of planes that go much further (and are much more cost-effective, as you pointed out).

  • @tkw8@lemm.ee

    I’m shocked. There must be an error in this analysis. /s

    Maybe engage an AI coding assistant to massage the data analysis lol

  • Eager Eagle

    I like to use suggestions to feel superior when trash-talking the generated code.

    • @Kualk@lemm.ee

      We always have to ask which language it is auto-completing. If it’s a strictly typed language, existing tooling is already doing everything possible and I see no need for additional improvement. If it’s a loosely typed language, I can see how it could get a little more helpful, but without knowledge of the actual context I’m not sure it can get much more accurate.

      • @MeatsOfRage@lemmy.world

        Hell yea. Our unit test coverage went way up because you can blow through test creation in seconds. I had a large, complicated migration from one data set to another with specific mutations based on weird rules, and GPT got me 80% of the way there and, with a little nudging, basically got it perfect. Code that would’ve taken a few hours took about 6 prompts. If I’m curious about a new library I can get a working example right away to see how everything fits together. When these articles say there’s no benefit, I feel like people aren’t using these tools or don’t know how to use them effectively.
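        As a rough illustration of that kind of rule-based migration (the record shapes and the “weird rules” below are made up; the real migration wasn’t shared):

        ```java
        import java.util.List;

        // Hypothetical "old data set -> new data set" migration with
        // invented mutation rules, just to show the shape of the task.
        public class LegacyUserMigration {

            record LegacyUser(String name, String country, int ageInMonths) {}
            record NewUser(String displayName, String region, int ageInYears) {}

            static NewUser migrate(LegacyUser old) {
                // Rule 1: display names are upper-cased unless they start with "_".
                String display = old.name().startsWith("_") ? old.name() : old.name().toUpperCase();
                // Rule 2: a handful of legacy country codes collapse into one region.
                String region = switch (old.country()) {
                    case "UK", "IE" -> "EU-WEST";
                    case "US", "CA" -> "NA";
                    default -> "OTHER";
                };
                // Rule 3: ages were stored in months and are rounded down to years.
                return new NewUser(display, region, old.ageInMonths() / 12);
            }

            static List<NewUser> migrateAll(List<LegacyUser> legacy) {
                return legacy.stream().map(LegacyUserMigration::migrate).toList();
            }
        }
        ```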

        • @SkyeStarfall@lemmy.blahaj.zone

          Yeah, it’s useful, you just gotta keep it on a short leash, which is difficult when you don’t know what you’re doing

          Basically, it’s a useful tool for experienced developers that know what to look out for

          • A Wild Mimic appears!

            From the combined comments it looks like if you are a beginner or a pro then it’s great; if you have just enough knowledge to be dangerous (in German that’s proverbially “gefährliches Halbwissen”, “dangerous half-knowledge”) you should probably stay away from it :-)

    • @breckenedge@lemmy.world

      In my experience, most of the tech layoffs have been non-devs. PMs and Designers have been the hardest hit and often their roles are being eliminated.

    • @Warl0k3@lemmy.world

      It’s basically a template generator, which is really helpful when you’re generating boilerplate. It doesn’t save me much, if any, time to refactor or fill in that template, but it does save some mental fatigue that I can then spend on much more interesting problems.

      It’s a niche tool, but occasionally quite handy. Without leaps forward technically though, it’s never going to become more than that.

    • @dgmib@lemmy.world

      Just beware: sometimes the AI suggestions are scary good, and sometimes they’re batshit crazy.

      Just because AI suggests it, doesn’t mean it’s something you should use or learn from.

  • EleventhHour

    Devs who are punching above their weight, however, probably get great benefit from it. I would think it’s also an OK learning tool, except for how inaccurate it can be sometimes.

  • @9point6@lemmy.world

    My main use is skipping the blank-page problem when writing a new suite of tests, which, after about 10 minutes of refactoring, is often a good starting point.
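    A sketch of the kind of starting-point suite that skips the blank page, assuming JUnit 5; the slugify() helper is an invented stand-in so the example is self-contained, not the commenter’s real code under test:

    ```java
    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.*;

    class SlugifyTest {

        // Stand-in for the code under test, defined inline for the example.
        static String slugify(String input) {
            if (input == null) throw new IllegalArgumentException("input must not be null");
            return input.toLowerCase()
                        .replaceAll("[^a-z0-9]+", "-")
                        .replaceAll("(^-|-$)", "");
        }

        @Test
        void lowercasesAndHyphenatesSpaces() {
            assertEquals("hello-world", slugify("Hello World"));
        }

        @Test
        void collapsesPunctuationIntoSingleHyphens() {
            assertEquals("a-b-c", slugify("a, b & c"));
        }

        @Test
        void rejectsNullInput() {
            assertThrows(IllegalArgumentException.class, () -> slugify(null));
        }
    }
    ```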

  • Greg Clarke

    Generative AI is great for lots of programming tasks, like helping create regular expressions or converting syntax between languages. The main issue I’ve seen in codebases that rely heavily on generative AI is that the “solutions” often fix today’s bug while making future debugging more difficult. Generative AI makes it easy to go fast in the wrong direction. Used right, it’s a useful tool.
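    For example, the regex chore might look like this hypothetical key=value extractor for log lines (pattern and input are invented purely to illustrate the use case):

    ```java
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Pulls key=value pairs out of a log line, allowing quoted values.
    public class KeyValueExtractor {
        private static final Pattern PAIR = Pattern.compile("(\\w+)=(\"[^\"]*\"|\\S+)");

        public static void main(String[] args) {
            String line = "ts=2024-05-01T12:00:00Z level=warn msg=\"disk nearly full\" usage=91%";
            Matcher m = PAIR.matcher(line);
            while (m.find()) {
                System.out.println(m.group(1) + " -> " + m.group(2));
            }
        }
    }
    ```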

  • @TrickDacy@lemmy.world

    I truly don’t understand the tendency of people to hate these kinds of tools. Honestly seems like an ego thing to me.

    • @leftzero@lemmynsfw.com

      Having to deal with pull requests defecated by “developers” who blindly copy code from ChatGPT is a particularly annoying and depressing waste of time.

      At least back when they blindly copied code from Stack Overflow they had to read through the answers and comments and try to figure out which one fit their use case better and why, and maybe learn something… now they just assume the LLM is right (despite the fact that they asked the wrong question, and even if they had asked the right one it’d’ve given the wrong answer) and call it a day; no brain activity or learning whatsoever.

      • @TrickDacy@lemmy.world

        That is not a problem with the ai software, that’s a problem with hiring morons who have zero experience.

        • @leftzero@lemmynsfw.com

          No. LLMs are very good at scamming people into believing they’re giving correct answers. It’s practically the only thing they’re any good at.

          Don’t blame the victims, blame the scammers selling LLMs as anything other than fancy but useless toys.

          • @TrickDacy@lemmy.world

            Okay, before that comment I blamed your terrible employees who are incredibly bad at their jobs, but now maybe I should blame their leadership instead.

          • jungle

            Did you get scammed by the LLM? If not, what’s the difference between you and the dev you mentioned?

            • @leftzero@lemmynsfw.com

              I was lucky enough to not have access to LLMs when I was learning to code.

              Plus, over the years I’ve developed a good thick protective shell (or callus) of cynicism, spite, distrust, and absolute seething hatred towards anything involving computers, which younger developers yet lack.

              • jungle

                Sorry, you misunderstood my comment, which was very badly worded.

                I meant to imply that you, an experienced developer, didn’t get “scammed” by the LLM, and that the difference between you and the dev you mentioned is that you know how to program.

                I was trying to make the point that the issue is not the LLM but the developer using it.

                • @leftzero@lemmynsfw.com

                  And I’m saying that I could have been that developer if I were twenty years younger.

                  They’re not bad developers, they just haven’t yet been hurt enough to develop protective mechanisms against scams like these.

                  They are not the problem. The scammers selling the LLMs as something they’re not are.

    • @GreenKnight23@lemmy.world

      I sent a PR back to a Dev five times before I gave the work to someone else.

      they used AI to generate everything.

      surprise, there were so many problems it broke the whole stack.

      this is a routine thing this one dev does too. every PR has to be tossed back at least once. not expecting perfection, but I do expect it to not break the whole app.

        • @GreenKnight23@lemmy.world

          that depends on your definition of what a “terrible dev” is.

          of the three devs that I know have used AI, all were moderately acceptable devs before they relied on AI. this formed my opinion that AI code and the devs that use it are terrible.

          two of those three I no longer work with because they were let go for quality and productivity issues.

          so you can clearly see why my opinion of AI code is so low.

          • @TrickDacy@lemmy.world

            I would argue that it’s obvious if someone doesn’t know how to use a tool to do their job, they aren’t great at their job to begin with.

            Your argument is to blame the tool and excuse the person who is awful with the tool.

            • @GreenKnight23@lemmy.world

              my argument is that lazy devs use the tool because that’s what it was designed for.

              just calling a hammer a hammer.

              • @aesthelete@lemmy.world

                Some tools deserve blame. In the case of this one, you’re supposed to use it to automate certain things away, but that automation isn’t really reliable. If it has to be babysat to the extent that I’d argue it does, then it deserves some blame for being a crappy tool.

                If, for instance, getter and setter generating or refactor tools in IDEs routinely screwed up in the same ways, people would say that the tools were broken and that people shouldn’t use them. I don’t get how this is different just because of “AI”.

                • @TrickDacy@lemmy.world

                  Okay, so if the tool seems counterproductive for you, it’s pretty presumptuous to generalize that and assume it’s the same for everyone else too. I definitely do not have that experience.

              • @TrickDacy@lemmy.world

                Using a tool to speed up your work is not lazy. Using a tool stupidly is stupid. Anyone who thinks these tools are meant to replace humans using logic is misunderstanding them entirely.

                You remind me of some of my coworkers who would rather do the same mind-numbing task for hours every day than write a script that handles it. I judge them for thinking working smarter is “lazy”, and I think it’s a fair judgement. I see them as the lazy ones. They’d rather not think more deeply about the scripting aspect because it’s hard. They’d rather zone out and mindlessly click, copy/paste, etc. I’d rather analyze and break down the problem so I can solve it once and then move on to something more interesting to solve.

                • @aesthelete@lemmy.world

                  “They’d rather zone out and mindlessly click, copy/paste, etc. I’d rather analyze and break down the problem so I can solve it once and then move on to something more interesting to solve.”

                  From what I’ve seen of AI code in my time using it, it often is an advanced form of copying and pasting. It frequently takes problems that could be solved more efficiently, with fewer lines of code or by generalizing the problem, and does the (IMO evil) work of making the solution that used to require the most drudgery the easy one.

                • @GreenKnight23@lemmy.world

                  sometimes working smarter is actually putting the work in so you don’t have to waste time and stress about whether it’s going to work or not.

                  I get Dreamweaver vibes from AI-generated code. Sure, the website works. Looks exactly the way it should. Works exactly how it should. That HTML source though… fucking awful.

                  I can agree that AI augments the tools you can use. However, it’s being marketed as a replacement, and a lot of devs are using it as such.

                  shitty devs are enabled by shitty tools.

    • @tee9000@lemmy.world

      It’s really weird.

      I want to believe people aren’t this dumb, but I also don’t want to be crazy for suggesting such nonsensical sentiment is manufactured. Such is life in the disinformation age.

      Like, what are we going to do, tell all countries and fraudsters to stop using AI because it turns out it’s too much of a hassle?

        • @tee9000@lemmy.world

          You are speaking for everyone, so right away I don’t see this as an actual conversation, but as a decree of fact by someone I know nothing about.

          What are you saying is an important reminder? This article?

          By constant activism, do you mean anything that occurs outside of lemmy comments?

          Why would we not take LLMs seriously?

            • @tee9000@lemmy.world

              There’s no particular fuck-up mentioned in this article.

              The company that conducted the study this article speculates on said these tools are getting rapidly better and that they aren’t suggesting banning AI development assistants.

              Also, as quoted in the article, the use of these coding assistants is a process in and of itself. If you aren’t using AI carefully and iteratively, you won’t get good results with current models. How we interact with models is as important as the model’s capability. The article quotes that if models are used well, a coder can be 2x or 3x faster. Not sure about that personally… seems optimistic, depending on what’s being developed.

              It seems like a good discussion with no obvious conclusion given the infancy of the tech. Yet the article headline and accompanying image suggest it’s wreaking havoc.

              Reduction of complexity in this topic serves nobody. We should have the patience and impartiality to watch it develop and form opinions independently of commenter and headline sentiment. Groupthink has been particularly dumb on this topic from what I’ve seen.

    • @YungOnions@sh.itjust.works

      Typical lack of nuance on the Internet, sadly. Everything has to be Bad or Good. Black or White. AI is either The best thing ever™ or The worst thing ever™. No room for anything in between. Considering negative news generates more clicks, you can see why the media tend to take the latter approach.

      I also think much of the hate is just people jumping on the AI = bad bandwagon. Does it have issues? Absolutely. Is it perfect? Far from it. But the constant negativity has gotten tiresome. There’s a lot of fascinating discussion to be had around AI, especially in the art world, but God forbid you suggest it’s anything but responsible for the total collapse of civilisation as we know it…

    • @gaael@lemmy.world

      Also, when a tool increases your productivity but your salary and paid time off don’t increase, it’s a tool that only benefits the overlords and as such deserves to be hated.

      • @stephen01king@lemmy.zip

        Oh, so do you use a 13 year old PC because a newer one increases your productivity without increasing your salary and paid time off?

          • @TrickDacy@lemmy.world

            “I could request a new one, but why?”

            Gives an excellent argument for requesting a new one:

            “slow as all hell.”

          • @stephen01king@lemmy.zip

            I mean, you’re clearly using them because they still work, not because of a hatred for increasing productivity for the overlords. Your choice was based on reasonable logic, unlike the other guy.

      • @TrickDacy@lemmy.world

        Some people feel proud that their work got done quicker and also aren’t micromanaged so if they choose, yes actually they can have more time for their personal lives. Not everyone’s job is purely a transaction in which they do the absolute minimum they can do without being fired.

        I hope you feel better soon, because you’re clearly bitter and lashing out at whatever you can lash at.

        • ElectricMachman

          I’m glad you live in this fantasy world where more productivity = more personal time, but it doesn’t always work like that, especially in salaried positions. More productivity generally means more responsibility coming your way, which rarely results in an increased salary.

  • Eager Eagle

    lol, Uplevel’s “full report” saying devs using Copilot create 41% more bugs is 2 pages long and reads like promotional material.

    you can download it with a 10-minute email if you really want to see for yourself.

    just some meaningless numbers.

  • @Grandwolf319@sh.itjust.works

    Yep, by definition generative AI gets worse the more specific you get. If you need common templates, though, it’s almost as good as today’s Google.

  • @VonReposti@feddit.dk

    While I am not fond of AI, we do have access to it at work and I must admit that it saves some time in some cases. I’m not a developer with decades of experience in a single language, so one thing I use AI for is asking “Is it possible to do a one-liner in language X that does Y?” It works very well and the code is rarely unusable, but it is still up to my judgement whether the AI came up with a clever use of functions that I didn’t know about or whether it crammed stuff into a single unreadable line.
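    A sketch of that kind of “can this be a one-liner?” exchange, using a made-up task (counting word occurrences with Java streams); whether the one-liner is actually clearer than a small loop is exactly the judgement call described above:

    ```java
    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;

    public class OneLinerExample {
        public static void main(String[] args) {
            List<String> words = List.of("ant", "bee", "ant", "cat", "bee", "ant");
            // The "one-liner": group identical words and count each group.
            Map<String, Long> counts = words.stream()
                    .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
            System.out.println(counts); // e.g. {ant=3, bee=2, cat=1}
        }
    }
    ```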