• Takumidesh@lemmy.world · +33/−3 · 20 hours ago

    This is satire / trolling for sure.

    LLMs aren’t really at the point where they can spit out an entire program, including handling deployment, environments, etc. without human intervention.

    If this person is ‘not technical’ they wouldn’t have been able to successfully deploy and interconnect all of the pieces needed.

    The AI may have been able to spit out snippets, and those snippets may be very useful, but as it stands, it’s just not going to be able to write the software, stand up the DB, and deploy all of the services needed with no human supervision or overrides. With human guidance, sure, but without someone holding the AI’s hand it just won’t happen (remember, this person is ‘not technical’).

    • iAvicenna@lemmy.world · +1 · 8 hours ago

      My impression is that with some guidance it can put together a basic skeleton of complex stuff too. But you need a specialist level of knowledge to fix the mistakes that fail at compile time, or worse, the ones that compile but don’t at all achieve the intended result. To me it has been most useful for getting the correct arguments for argument-heavy libraries like plotly, remembering how to do stuff in bash, or learning something from scratch like three.js. As soon as you try to do something more complex than it can handle, it confidently starts cycling through the same couple of mistakes over and over. The keywords it spews in those mistakes can sometimes be helpful for directing your search online, though.

      So it has the potential to be helpful to a programmer, but it can’t yet replace programmers the way tech bros like to fantasize about.

    • Idk, I’ve seen some crazy complicated stuff woven together by people who can’t code. I’ve got a friend who has no job and is trying to make a living off coding while, for 15+ years, being totally unable to learn coding. Some of the things they make are surprisingly complex. Though also, and the person mentioned here may do similarly, they don’t ONLY use AI. They use GitHub a lot too. They make nearly nothing themselves, but go through GitHub and basically combine large chunks of code others have made with AI-generated code. Somehow they do it well enough to have done things with servers, cryptocurrency, etc., all while not knowing any coding language.

    • MyNameIsIgglePiggle@sh.itjust.works · +9 · 18 hours ago

      Claude code can make something that works, but it’s kinda over engineered and really struggles to make an elegant solution that maximises code reuse - it’s the opposite of DRY.

      I’m doing a personal project at the moment and used it for a few days, made good progress but it got to the point where it was just a spaghetti mess of jumbled code, and I deleted it and went back to implementing each component one at a time and then wiring them together manually.

      My current workflow is basically never let them work on more than one file at a time, and build the app one component at a time, starting at the ground level and then working up. For example:

      1. Create base classes that things will extend, then create an example data model class. Iterate on that architecture A LOT until it’s really elegant.

      2. Then I’ve been getting it to write me a generator for the models, rather than the actual model code itself.

      3. Then we start on the UI layer: we make a UI kit the app will use and reuse for different components.

      4. Then we make a UI component that will be used in a screen. I’m using Flutter as an example, so it would be a stateless component.

      5. We now write tests for the component.

      6. Now we do a screen, and I import each of the components.
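
      A rough sketch of that layering in Go (hypothetical names; the commenter is using Flutter/Dart, this just illustrates the base-abstraction → model → stateless-component ordering):

```go
package main

import "fmt"

// Level 1: a base abstraction that every data model will implement.
type Model interface {
	ID() string
	Validate() error
}

// Level 2: an example data model, iterated on until the shape feels right.
type Customer struct {
	Id   string
	Name string
}

func (c Customer) ID() string { return c.Id }

func (c Customer) Validate() error {
	if c.Name == "" {
		return fmt.Errorf("customer %s has no name", c.Id)
	}
	return nil
}

// Level 3+: a "stateless component" that only renders the model it is given,
// holding no state of its own — the part you write tests against first.
func RenderCustomerRow(c Customer) string {
	return fmt.Sprintf("[%s] %s", c.Id, c.Name)
}

func main() {
	c := Customer{Id: "42", Name: "Ada"}
	if err := c.Validate(); err != nil {
		panic(err)
	}
	fmt.Println(RenderCustomerRow(c))
}
```

      The point of the ordering is that each layer is small enough to review before the AI touches the next one.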

      It’s still very manual, but it’s getting better. You are still going to need a human coder, I think forever, but there are two big problems that aren’t being addressed, because people are just putting their heads in the sand and saying “nah, can’t do it”, or, like the clown OP in the post, thinking they can do it.

      1. Because dogs be clownin, the public perception of programming as a career will be devalued: “I’ll just make it myself!” Or, like my rich engineer uncle said to me when I was doing websites professionally: a 13-year-old can just make a website, why would I pay you so much to do it? THAT FUCKING SUCKS. But a similar attitude has existed before: “I’ll just hire Indians.” This is bullshit, but perception is important, and it’s going to require you to justify yourself for a lot more work.

      2. And this is the flip-side good news: the skills you have developed are going to be SO MUCH FUCKING HARDER TO LEARN. When you can just say “hey, generate me an app that manages customers and follow-ups” and something gets spat out, you aren’t going to go through the grind required to work out basic shit. People will simply not get to the same level they are now.

      That logic about how to scaffold and architect an app in a sensible way - USING AI TOOLS - is actually the new skillset. You need to know how to build the app, and then how to efficiently and effectively use the new tools to actually construct it. Then you need to be able to do code review for each change.

      </rant>

    • nick@midwest.social · +4 · 18 hours ago

      Mmmmmm no, Claude definitely is. You have to know what to ask it, but I generated an entire deadman’s switch daemon written in Go in like an hour with it, just to see if I could.

      • Takumidesh@lemmy.world · +9 · 18 hours ago

        So you did one simple program.

        SaaS involves a suite of tooling and software, not just a program that you build locally.

        You need, at a minimum, database deployments (with scaling and redundancy) and cloud software deployments (with scaling and redundancy).

        SaaS is a full stack product, not a widget you run on your local machine. You would need to deputize the AI to log into your AWS (sorry, it would need to create your AWS account) and fully provision your cloud infrastructure.

        • PeriodicallyPedantic@lemmy.ca · +1 · 13 hours ago

          Lol they don’t need scaling and redundancy to work. They just need scaling and redundancy to avoid being sued into oblivion when they lose all their customer data.

          As a full-time AI hater, I fully believe that some code-specialized AI can write and maybe even deploy a full-stack program, with basic input forms and CRUD, which is all you need to be a “SaaS”.

          It’s gonna suck, and be unmaintainable, and insecure, and fragile. But I bet it could do it and it’d work for a little while.

          • Maxxie@lemmy.blahaj.zone · +2 · 8 hours ago

            That’s not a “working SaaS”, though.

            It’s like calling hello world a “production-ready CLI application”.

            • PeriodicallyPedantic@lemmy.ca · +2 · 4 hours ago

              What makes it “working”, is that the Software part of Software as a Service, is available as a Service.

              The service doesn’t have to scale to a million users. It’s still a SaaS if it has one customer with like 4 users.

              Is this a pedantic argument? Yes.
              Are you starting a pedantic fight about the specific definition of SaaS? Also yes.

    • qaz@lemmy.world · +3 · edited · 18 hours ago

      It’s further along than you think. I spoke to someone today about it, and he told me it produced a basic SaaS app for him. He said that it looked surprisingly okay and the basic functionalities actually worked too. He did note that it kept using deprecated code, consistently made a few basic mistakes despite being told how to avoid them, and failed to produce nontrivial functionalities.

      He did say that it used very common libraries and we hypothesized that it functioned well because a lot of relevant code could be found on GitHub and that it might function significantly worse when encountering less popular frameworks.

      Still, it’s quite impressive, although not surprising considering it was only a matter of time before people started feeding the feedback of an IDE back into it.

    • jackeryjoo@lemmy.world · +3 · 19 hours ago

      We just built and deployed a fully functional AWS app for our team, entirely written by AI: from the Terraform, to the backing API, to the Angular frontend. All AI. I think AI is further along here than you suspect.
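
      For a sense of scale, the Terraform such a setup starts from is mostly boilerplate, which is exactly the kind of thing these tools do well. A heavily simplified, hypothetical fragment (resource names invented): a bucket for the Angular build artifacts, an execution role, and a Lambda behind an HTTP API.

```terraform
resource "aws_s3_bucket" "frontend" {
  bucket = "example-team-frontend"
}

resource "aws_iam_role" "api" {
  name = "example-api-role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action    = "sts:AssumeRole"
      Effect    = "Allow"
      Principal = { Service = "lambda.amazonaws.com" }
    }]
  })
}

resource "aws_lambda_function" "api" {
  function_name = "example-api"
  runtime       = "provided.al2"
  handler       = "bootstrap"
  role          = aws_iam_role.api.arn
  filename      = "api.zip"
}

resource "aws_apigatewayv2_api" "http" {
  name          = "example-http-api"
  protocol_type = "HTTP"
}
```

      The contested question in this thread is who runs `terraform apply` and holds the credentials, not whether the HCL can be generated.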

      • Takumidesh@lemmy.world · +6 · edited · 18 hours ago

        I’m skeptical. You are saying that your team has no hand in the provisioning and you deputized an AI with AWS keys and just let it run wild?

      • hubobes@sh.itjust.works · +5 · edited · 18 hours ago

        How? We have been trying to adopt AI for dev work for years now, and every time the next-gen tool or model gets released it fails spectacularly at basic things. And that’s just the technical stuff; I still have no idea how to tell it to implement our use cases, as it simply does not understand the domain.

        It is great at building things others have already built and that it could train on, but we don’t really have a use case for that.

    • Tja@programming.dev · +2 · 19 hours ago

      Might be satire, but I think some “products based on LLMs” (not LLMs alone) would be able to. There are pretty impressive demos out there, but honestly I haven’t tried them myself.