Buried in the story was a deceptively simple question: does your AI agent count as an employee?

At a recent conference, Microsoft executive Rajesh Jha floated a provocative idea. In a future where companies deploy fleets of AI agents, those agents may need their own identities — logins, inboxes, and even seats inside software systems. If so, AI wouldn’t shrink software revenue. It could expand it.

  • CatAssTrophy@safest.space · 4 hours ago

    This gets close to an idea I heard long ago that I think has some merit.

    Hire an employee? You must not only pay them, but cover taxes to have them there. Buy a robot to replace them? It’s a business expense, no taxes!

    Okay, pay taxes for your robot usage. Use that money to fund UBI, social programs and/or retraining people for other jobs.

    • muusemuuse@sh.itjust.works · 3 hours ago

      Then they’ll just make one robot do multiple things. Suddenly the big company only has one taxable employee.

      • CatAssTrophy@safest.space · 36 minutes ago

        Depends. If the tax is based on jobs replaced, not on the abstractly defined number of robots that exist, it would still have an impact. Also, monolithic solutions tend to be inherently less efficient than similarly developed specialized ones, so consolidating everything into fewer robot models for a tax benefit would impose its own limit on efficiency.

        It’s an issue that could be accounted for, if there were sufficient political will. If taxes from automation were committed to the public good, there would likely be pretty widespread acceptance.
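
        A toy sketch of the distinction drawn above (illustrative only: the levy amounts and function names are made up, not taken from any real proposal): a tax keyed to the number of robots rewards consolidating work into one machine, while a tax keyed to jobs replaced does not.

        ```python
        # Hypothetical automation-tax comparison; every number here is made up.
        def tax_per_robot(num_robots: int, levy_per_robot: float) -> float:
            """Levy keyed to how many robots exist."""
            return num_robots * levy_per_robot

        def tax_per_job_replaced(jobs_replaced: int, levy_per_job: float) -> float:
            """Levy keyed to how many jobs the automation displaced."""
            return jobs_replaced * levy_per_job

        # One multi-purpose robot that replaces ten workers:
        print(tax_per_robot(1, 5_000.0))          # 5000.0  -> consolidation shrinks the bill
        print(tax_per_job_replaced(10, 5_000.0))  # 50000.0 -> the bill tracks the displaced jobs
        ```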

  • utopiah@lemmy.world · 6 hours ago

    That’s the beauty of totally arbitrary restrictions: you can change them however you want.

    Pay by seat? Pay by client? Pay by byte of data stored? Pay by backup location?

    … pay by moonphase? Pay by AI personality? Pay by virtual AI seat?

    Such BS, but why wouldn’t Microslop extend its business model? It’s worked well so far. It’s not about software, or datacenters, or AI; it’s just about entrenchment.

  • DarkSurferZA@lemmy.world · 6 hours ago

    Mmm, interesting. Would the AI companies then need to buy a license for all the information they stole to train their AI? Or would they need to buy a license every time someone uses micro-slop AI to ask it a question about something that has been trademarked?

    Or does licensing only apply to their software?

  • SpatchyIsOnline@lemmy.world · 11 hours ago

    So the “amazing tool of the future” that’s “going to make software developers obsolete” is also going to need to buy software licenses?

    Which one is it Microslop?

  • edgemaster72@lemmy.world · 14 hours ago

    MicroSlop: We have this AI for you to use so you can reduce workforce and associated costs

    Also Sloppy: j/k, fuck you pay me

  • bookmeat@fedinsfw.app · 15 hours ago

    Jesus, you don’t announce that kind of thing until you have your customers locked in! Amateur.

    • FauxLiving@lemmy.world · 13 hours ago

      The customers are already locked in: every company hoping to run the same rent-seeking play around AI is buying up all of the compute and storage hardware on the planet, which prices consumers out of everything except the soon-to-be-overpriced subscription service(s) they offer.

  • lowspeedchase@lemmy.dbzer0.com · 18 hours ago

    Reads: Our flagship operating system and services have gotten to the point of such terrible shite for humans that we need to pivot to a less discerning customer base.

  • LordMayor@piefed.social · 15 hours ago

    1. Integrate AI into the OS
    2. Demand purchase of a Windows license for the AI in the OS
    3. GOTO 2

    It’s an infinite amount of money from every customer!

    • pinball_wizard@lemmy.zip · 12 hours ago

      > It’s an infinite amount of money from every customer!

      But it’s okay, because there’s infinite money to be saved by laying off technical expert staff.

  • deliriousdreams@fedia.io · 18 hours ago

    If the AI Agent counts as an employee then the company “employing” it is liable for what it does.

    My guess is the argument will be that “it’s a tool”, not an employee, and therefore they take no responsibility. Though I’m sure that argument is not going to fly for very long. If your air hammer harms someone because the person operating it wasn’t using it correctly, you’re still liable.

    • village604@adultswim.fan · 15 hours ago

      What? Companies aren’t liable if the user doesn’t follow the instructions or warnings and hurts themselves.

      DeWalt isn’t liable because I was using their mini chainsaw while holding a branch with my bare hand and the saw bounced and cut me. I’m liable for being stupid.

      • deliriousdreams@fedia.io · 9 hours ago

        I don’t think you understand the context of the situation I was proposing. I am not supposing that DeWalt would be liable. But let’s say we work in a shop together and I’m using an air hammer to, I dunno, punch rivets. If I, as an employee of that shop, use the air hammer and something involving it happens to a coworker or a customer or whatever, it is extremely likely that the company I work for would be on the hook. Could they try to penalize me personally? Yes. Could the person who was injured sue me personally? Certainly. Would the company be off the hook if the air hammer malfunctioned and caused the injury? Maybe, and at that point I would expect the manufacturer to be liable. But my comment never mentioned the manufacturer.

        The context was companies using AI as a tool, not companies manufacturing AI.

    • gokayburuc@lemmy.world (OP) · 18 hours ago

      Chained fraud schemes are already being run on workflow systems like n8n, where AI agents are used together. It didn’t take long for people to build systems that generate deepfake voices imitating real people, directing users to buy a product or deposit money into an account. Many videos on this topic have surfaced in Türkiye, particularly on YouTube. If the users and system creators are to be penalized, then of course the activity logs of these agents can be used.

      However, if this is just being done to keep some agents out of the system through user license fees, it will completely backfire.
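
      A minimal sketch of the kind of per-agent activity log the comment alludes to (the schema and field names are hypothetical, not n8n’s actual log format); the point is that each agent action is recorded in a way that ties it back to the operator who deployed the agent.

      ```python
      # Hypothetical audit record for an AI-agent action; not any real platform's schema.
      import json
      from dataclasses import asdict, dataclass
      from datetime import datetime, timezone

      @dataclass
      class AgentAuditRecord:
          agent_id: str   # which agent acted
          operator: str   # the account or organization that deployed the agent
          action: str     # what the agent did, e.g. "send_voice_message"
          target: str     # who or what it acted on
          timestamp: str  # when it happened (UTC, ISO 8601)

      def log_agent_action(agent_id: str, operator: str, action: str, target: str) -> str:
          """Serialize one agent action as a JSON line for an append-only audit log."""
          record = AgentAuditRecord(
              agent_id=agent_id,
              operator=operator,
              action=action,
              target=target,
              timestamp=datetime.now(timezone.utc).isoformat(),
          )
          return json.dumps(asdict(record))

      print(log_agent_action("agent-42", "acme-corp", "send_voice_message", "customer-7"))
      ```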

    • FaceDeer@fedia.io · 17 hours ago

      I don’t see how this distinction affects the question of responsibility at all. If anything, “it’s an employee” gives the company more room for deniability.

  • WanderingThoughts@europe.pub · 16 hours ago

    The agent immediately makes a cost-benefit analysis, moves everything to open-source solutions, and contracts a coding AI agent to write a simple conversion interface.

    • pinball_wizard@lemmy.zip · 12 hours ago

      Yes! This is legitimately one of the ways the bubble may burst. Particularly if the AI gets substantially smarter, and just starts recommending full switches to existing libraries and software suites - at a cost of exactly one token, instead of churning out thousands of lines of slop code that require ongoing tokens to maintain.

  • db2@lemmy.world · 18 hours ago

    A house of cards built on top of ten other houses of cards. What could possibly go wrong.