• FauxLiving@lemmy.world · 4 days ago

    There is a certain brand of user (who may or may not be human) who draws the Venn diagram of ‘AI slop’ and ‘AI output’ as a single circle.

    They’ve taken the extremist position that AI should be uninvented and any use of AI is the worst thing that could possibly happen to any project and they’ll have an entire grab bag of misinformation-based memes to shotgun at you. Engaging with these people is about as productive as trying to convince a vaccine denier that vaccines don’t cause autism.

    I’m not saying that the user you replied to believes this, but the comment they wrote is indistinguishable from the comments of such a user.

    e: I’d also like to point out that these users are very much attracted to low-effort activism. This is why you see comments like mine being heavily downvoted but not many actual replies. They want to influence the discussion but don’t have the capability or motivation to step into the ring, so to speak, and defend their opinions.

    • ell1e@leminal.space · 4 days ago

      It’s less extremist if you look at how easily these LLMs will just plagiarize 1:1, apparently:

      https://github.com/mastodon/mastodon/issues/38072#issuecomment-4105681567

      Some define “AI slop” narrowly: output whose problems can be identified right away.

      Many others see “AI slop” as bringing many more problems beyond the immediate ones. From that view, it becomes difficult to see LLM output as anything but slop.

      • FauxLiving@lemmy.world · 4 days ago

        It’s extremist to take the fact that you CAN get plagiaristic output and to conclude that all other output is somehow tainted.

        You personally CAN quote copyrighted music and screenplays. If you’re an artist, then you also CAN produce copyright-violating works. Neither of these facts taints the other things you produce that are not copyright-infringing or plagiarized.

        In this situation, and in the current legal environment, the responsibility to not produce illegal and unlicensed code is on the human. The fact that the tool that they use has the capability to break the law does not mean that everything generated by it is tainted.

        Photoshop can be used to plagiarize and violate copyright too. It would be just as absurd to declare that all images created with Photoshop are somehow suspect or unusable because of the tool’s capability to violate copyright law.

        The fact that AI can, when specifically prompted, reproduce memorized segments of its training data has carried essentially no legal weight in any of the cases where it has been argued. It is of interest to scientists who study how AI models represent knowledge internally, not a foundation for a legal argument against the use of AI.

        • badgermurphy@lemmy.world · 4 days ago

          Sure, but if they can be demonstrated to ever plagiarize without attribution, and the default user behavior is to pencil-whip the output, which it is, then it becomes statistically certain that users are unwittingly plagiarizing other works.

          It’s like using a tool that usually bakes cookies, but every once in a great while it knocks over the building it’s in. It almost never does that, though.

          • FauxLiving@lemmy.world · 4 days ago

            Plagiarism and copyright violation are two different things: one is an ethical matter, the other a legal one.

            Copyright has a body of case law which helps determine when a work significantly infringes on the copyrighted work of another. Plagiarism has no body of law at all, it is an ethical construct and not a legal one.

            You can plagiarize something that has no copyright protection and you can infringe on copyright protection without plagiarizing. They’re not interchangeable concepts.

            In your example, some institutions would not allow such a device to operate on their property, but operating it would not be illegal, and the liability would fall on the person, not the oven.

            To further strain the metaphor: Linus is saying that you can use (possibly) exploding ovens, because he isn’t taking a moral stance on the topic, but you are responsible for any damages they cause, because the legal system requires that this be the case.

    • hperrin@lemmy.ca · 4 days ago

      According to the US Copyright Office, AI generated material cannot be copyrighted (unless of course it’s plagiarized copyrighted code). That’s reason enough to leave it out of the kernel. If the kernel’s license becomes unenforceable because of public domain code, the kernel is tainted.

      Edit: I don’t know why people are downvoting this. It’s literally just the truth: https://sciactive.com/human-contribution-policy/#More-Information

      • FauxLiving@lemmy.world · 4 days ago

        Copyright and license terms are two different categories of law. Copyright is created and enforced by the laws of the country that has jurisdiction; a license is a contract between two parties and is covered by contract law.

        A work can be ineligible for copyright protection and still be governed by the terms of the license it is provided under. If a project contains uncopyrightable code, that does not mean you cannot be held to the terms of the license: your use of the licensed work is granted on the condition that you follow those terms. You cannot be held liable for copyright infringement for using the uncopyrightable code, but using it in a manner the license does not allow makes you liable for breach of the license agreement.

        • hperrin@lemmy.ca · 4 days ago

          I think you’re misunderstanding what I’m saying. Any portions of the kernel that are public domain can be used by anyone for any purpose without following the terms of the GPL. AI generated code is public domain. To make sure all parts of the kernel are protected by the GPL, public domain code should not be accepted unless absolutely necessary.

          • FauxLiving@lemmy.world · 3 days ago

            I don’t see the problem. The GPL protects all of the code that is copyrighted, i.e. the code 100% made by humans, and accepting a submission created with AI tools doesn’t change that. Someone who has decided to violate the GPL would not find it simple to use only the generated, uncopyrighted portions without touching any other GPL code and thereby becoming subject to the GPL’s terms.

            These hypothetical GPL violators will have a hard time using lines 27-38 of ./kernel/events/ring_buffer.c to do anything, even if they technically can do so without releasing their code under the GPL. If they use any piece of GPL code, at all, anywhere, their entire project is required to follow the GPL. So while they could, technically, take lines 27-38 of ring_buffer.c and build an entire proprietary non-GPL kernel around them, it is in practice not feasible even if it is technically possible.
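            (For reference, the kernel marks licensing per file with SPDX tags; the exact wording varies by file, but a header along these lines is typical, so any lines extracted from such a file come out of a source file declared GPL-2.0:)

```c
// SPDX-License-Identifier: GPL-2.0
/*
 * Illustrative header only: the kernel declares licensing per file,
 * not per line, so individually "uncopyrightable" lines still ship
 * inside a GPL-2.0-tagged source file.
 */
```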

            • hperrin@lemmy.ca · 3 days ago

              So what happens thirty years from now when 95% of the kernel code is AI generated? It’ll be a lot easier to rewrite the parts that aren’t, and have a fully closed source kernel that you can use without following the GPL.

              • FauxLiving@lemmy.world · 3 days ago

                The short answer is that this is a slippery slope argument.

                The long answer is:

                In this hypothetical future where 95% of the Linux kernel is AI generated, it stands to reason that generating an OS kernel is possible (by definition of the hypothetical).

                If generating a full OS kernel is possible then people could generate a fully closed source kernel without using any of the 5% of GPL protected code in the Linux kernel.

                If you allow that it’s possible for AI to create a kernel with AI generated code then it will happen regardless of the status of the Linux kernel’s copyright protections.

                • hperrin@lemmy.ca · 3 days ago

                  Generating code costs a lot of money, as does the expertise to review the code. People aren’t going to want to spend the many millions of dollars to do that when they could use a GPL kernel. Of course if the kernel is not only free, but basically public domain, it solves all of their problems. They can modify it and keep those modifications closed source, the complete antithesis of what the GPL stands for.

                  • FauxLiving@lemmy.world · 3 days ago

                    Well, that isn’t the case now and isn’t likely to be the case anytime in the near future.

                    The rules are not written in stone; future Linus will have a better idea of the capabilities of future AI and can change the rules accordingly, as he has done since the beginning, to steer the Linux project in the right direction.