• HiddenLayer555@lemmy.ml
    12 hours ago

    It is in everyone’s interest to gradually adjust to the notion that technology can now perform tasks once thought to require years of specialized education and experience.

    The years of specialized education and experience aren’t for writing code in and of itself. Anyone with an internet connection can learn to do that fairly quickly. What takes years to develop is writing reliable, optimized, secure code; communicating and working efficiently with others; writing code that can be maintained by others long after you leave; knowing the theory behind why code written one way works better than code written another way; and knowing the qualitative and quantitative measures needed to even assess whether one piece of code is “better” than another. Source: Self-learned programming, started building stuff on my own, and then went through an actual computer science program. You miss so much nuance and underlying theory when you self-learn, and that translates directly into bad code that’s a nightmare to maintain.

    Finally, the most important thing about a person with years of specialized education and experience is that you can actually have a conversation with them about their code: ask them to explain in detail how it works and the process they used to write it, then ask follow-up questions and request further clarification. Trying to get AI to explain itself is a complete shitshow, and while humans do have a propensity to make shit up to cover their own/their coworkers’ asses, AI does it even when it makes no sense not to tell the truth, because it doesn’t really know what “the truth” is or why other people would want it.

    Will AI eventually catch up? Almost certainly, but we’re nowhere close to that right now. Currently it’s less like an actual professional developer and more like someone who knows just enough to copy-paste snippets from Stack Overflow and hack them together into a program that manages to compile.

    I think the biggest takeaway with AI programming is not that it can suddenly do just as well as someone with years of specialized education and experience, but that we’re going to get a lot more shitty software that looks professional on the surface but is a dumpster fire inside.

    • mindbleach@sh.itjust.works
      11 hours ago

      Self-learned programming, started building stuff on my own, and then went through an actual computer science program.

      Same. Starting with QBASIC, no less, which is an excellent source of terrible practices. At one point I created a code snippet that would perform a division and multiplication to find the remainder, because I’d never heard of modulo. Or functions.
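
      For the curious, the naive trick looks something like this (a minimal sketch in Python rather than QBASIC, with made-up values for illustration):

      ```python
      # Remainder the long way: divide, truncate, multiply, subtract.
      a, b = 17, 5
      quotient = a // b            # integer division (QBASIC's "\" operator)
      remainder = a - quotient * b
      print(remainder)             # 2

      # ...versus just using the modulo operator.
      print(a % b)                 # 2
      ```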

      Right now, this lets people skip the hair-pulling syntax errors and tell the computer what they think the program should be doing, in plain English. It’s not even “compilable pseudocode.” It’s high-level logic, nearly to the point that logic errors are all that can remain. It desperately needs some non-answer feedback states for when you tell it to “implement MP4 encoding” and expect that to Just Work.

      But it’s teaching people to write the comments first.

      we’re nowhere close to that right now.

      The distance from here to “oh shit” is shorter than we’d prefer. This tech works like a joke. “Chain of thought” apparently means telling the robot to act smarter… and it does. Which is almost less silly than Stable Diffusion removing every part of the marble that doesn’t look like Hatsune Miku. If it’s stupid, but it works… it’s still stupid. But it works.
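
      In practice that’s about as deep as it goes: same model, same question, just an extra instruction stapled onto the prompt. A rough sketch of the idea in Python (the question string is made up; plug in whatever LLM client you actually use):

      ```python
      question = "Why does this loop never terminate?"

      # One-shot prompt: just the question.
      plain_prompt = question

      # "Chain of thought" prompt: the same question plus the magic incantation.
      cot_prompt = (
          "Think through this step by step before giving your final answer.\n\n"
          + question
      )

      print(cot_prompt)
      ```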

      Someone’s gonna prompt “Write like Donald Knuth” and the robot’s gonna go, “Oh, you wanted good code? Why didn’t you say so.”