how tf am i supposed to get any work done now?

  • lime!@feddit.nu · 3 days ago

    think of it like this. you build small tools for internal use, so code quality, maintainability and documentation are not your highest priorities, right? most important is to ship the features needed by your colleagues right now. i’ve been in that same boat, building internal testing tools for a big multinational.

    say your tool contains shortcuts for interacting with some internal database. you don’t need auth because it’s all on the internal net, so it’s just a collection of shortcuts. you don’t really care about maintainability, because it’s all just temporary, so you throw every new request together in the fastest way you can, probably trying out new techniques to keep yourself entertained. you don’t really care about testability either, because you’re the only one testing: you can check that everything works before doing a release, and if something slips through, one of your colleagues will walk down the hall and tell you.

    now imagine it gets enough attention that your boss says “we want our customers to use this”. suddenly your priorities are upended: you absolutely need auth, you definitely need testability and you absolutely definitely need the tool to not mess things up. best possible world, you can reimplement the tool in a more manageable way, making sure every interface is consistent, documentation is up to snuff for users, and error handling is centralised.

    the claude cli leak is the opposite of that. it’s the worst code quality i’ve ever seen. it’s full of errors, repeated code, and overzealous exception handling. it is absolutely unmaintainable. the functionality for figuring out what type a file is and reading it into a proper object is 38 000 lines of typescript, excluding the class definitions. the entire thing is half a million lines. the code for uploading a pdf calls the code to upload jpegs, because if a file isn’t identified it’s automatically a jpeg. and jpegs can go through the same compression routine up to 22 times before it tries to upload them because the handler just calls itself repeatedly.
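    the self-calling compression routine described above can be pictured with a sketch like this (illustrative python, not the actual leaked typescript; `TARGET_SIZE`, `lossy_compress` and the halving stub are all made-up stand-ins):

```python
# illustrative sketch of the anti-pattern described above, NOT the leaked
# code: a handler that re-invokes itself instead of looping, bounded only
# by a hard-coded cap of 22 passes. TARGET_SIZE and lossy_compress are
# hypothetical stand-ins.
TARGET_SIZE = 100   # hypothetical "small enough to upload" threshold, bytes
MAX_PASSES = 22     # the pass cap mentioned above

def lossy_compress(data: bytes) -> bytes:
    # stand-in for a real JPEG re-encode; here it just halves the payload
    return data[::2]

def compress_jpeg(data: bytes, passes: int = 0) -> bytes:
    if passes >= MAX_PASSES or len(data) <= TARGET_SIZE:
        return data
    # instead of a loop, the handler calls itself again
    return compress_jpeg(lossy_compress(data), passes + 1)
```

    a plain loop with an explicit exit condition would express the same thing without tying recursion depth to the input.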

    and this is the code they thought was robust enough to withstand the internet. imagine what their internal tooling looks like.

    • MangoCats@feddit.it · 1 day ago

      because it’s all just temporary, so you throw every new request together in the fastest way you can

      I have been finding many use cases for “throwaway” AI-coded tools like parser-based calculators. The old rule of “if it takes you a day to write it and it speeds up a week-long task by even 20%, it’s a win” has had the cost of “writing it” come down by a big factor for simple things: a parser tool that might have taken 4 hours can now be certified good enough for internal use in 40 minutes. If it saves 3 hours of hand work, that turns “meh, maybe we’ll do this again sometime - I’d rather write the tool than drudge through 3 hours of brain-dead, error-prone work” into an overall big win.
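      a parser-based calculator of the kind mentioned above can be small enough to review in one sitting, which is what makes the quick certification plausible. a minimal hypothetical sketch (a recursive-descent evaluator for +, -, *, / and parentheses; not any particular tool from this thread):

```python
# minimal recursive-descent calculator: the sort of "throwaway" internal
# tool discussed above. grammar:
#   expr   := term (("+" | "-") term)*
#   term   := factor (("*" | "/") factor)*
#   factor := number | "(" expr ")"
import re

# a token is either a number or any single non-space character
TOKEN = re.compile(r"\s*(?:(\d+\.?\d*)|(\S))")

def tokenize(src):
    for num, op in TOKEN.findall(src):
        yield float(num) if num else op

def evaluate(src):
    tokens = list(tokenize(src)) + ["$"]  # "$" marks end of input
    pos = 0

    def next_tok():
        nonlocal pos
        pos += 1
        return tokens[pos - 1]

    def expr():
        val = term()
        while tokens[pos] in ("+", "-"):
            if next_tok() == "+":
                val += term()
            else:
                val -= term()
        return val

    def term():
        val = factor()
        while tokens[pos] in ("*", "/"):
            if next_tok() == "*":
                val *= factor()
            else:
                val /= factor()
        return val

    def factor():
        tok = next_tok()
        if tok == "(":
            val = expr()
            next_tok()  # consume ")"
            return val
        return tok  # already a float from tokenize

    return expr()
```

      e.g. `evaluate("(1+2)*3")` handles precedence and grouping; the whole tool is a page of code, which is roughly the scale where “glance over it and test it” review is realistic.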

      • lime!@feddit.nu · 1 day ago

        i will never understand that attitude. four hours of exploration, learning and puzzle-solving sounds like the best part of the job. an isolated, well-specified problem that can be completed in a day is like the most fun you can have programming. why swap that for an hour of code review?

        • MangoCats@feddit.it · 16 hours ago

          I guess it’s not puzzle solving for me anymore. The puzzle is solved in my head; the four hours of exploration are mostly occupied with looking up annoying details about how to do what’s in my head within the syntax and limitations of the latest re-invention of whatever language has been selected / is available in the environment. Both routes have to be tested, but glancing over provided source code and nodding “yeah, that looks right”, then testing the successful implementation, is a hell of a lot more satisfying for me than deciphering compiler errors, untangling version incompatibilities, looking up function signatures, etc.