• Chickenstalker@lemmy.world · 1 year ago

    Thing is, computer tech tends to grow exponentially. If I were you, I’d start learning a backup skill right now.

    • Knusper@feddit.de · 1 year ago

      I’m also not worried. Software complexity generally grows in proportion to the complexity of the requirements. And on most projects I’ve been a part of, no one could have told you all the requirements even after we’d figured them out.

      The code + test code is usually the only document that describes the requirements. And with high-level languages, there’s not that much boilerplate around the codified requirements either. Besides, we can use LLMs for that boilerplate ourselves.
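      For instance, a small test is often the clearest statement of a requirement. A minimal sketch (hypothetical function and names, Python just as an example):

      # Hypothetical example: the test itself documents the requirement
      # "an expired coupon must not change the order total".
      from datetime import date

      def apply_coupon(total, discount, expires):
          # Apply a percentage discount unless the coupon has expired.
          if date.today() > expires:
              return total
          return total * (1 - discount)

      def test_expired_coupon_is_ignored():
          assert apply_coupon(100.0, 0.2, date(2000, 1, 1)) == 100.0

      def test_valid_coupon_reduces_total():
          assert apply_coupon(100.0, 0.2, date(9999, 1, 1)) == 80.0

      The tests state exactly what the system is supposed to do, in a way prose documents rarely manage to keep precise.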

    • Holyginz@lemmy.world · 1 year ago

      Lmao, I’m not a programmer, although I know how, and even if I were, I wouldn’t be worried, for good reason: AI requires explicit instructions for everything. So in order to use it to code, you need to be a programmer.

      • Meho_Nohome@sh.itjust.works · 1 year ago

        I’m not a programmer and I’ve used it to code. It rarely works the first time around, but I’m sure it will quickly improve and become more accurate.

    • RandomVideos@programming.dev · 1 year ago

      If an AI were made that was smarter than programmers, couldn’t it make a smarter AI, which could make an even smarter AI, and so on?