• ell1e@leminal.space
    1 month ago

    Sadly, it seems to be fairly common to have at least some AI slop code now. E.g. Lemmy itself appears to be planning to allow it too.

    It’s as if having slop would win you some prize.

    • CrypticCoffee@lemmy.ml
      1 month ago

      I think you misrepresented it. The Lemmy dev says AI use needs to be declared, that it’s useful for some operations, and that any AI-assisted code must pass review.

      I don’t use AI, and I think the code it produces is crap, but AI-assisted coding is different in skilled hands.

      The only issue here is your absolute “no” versus Lemmy’s pragmatic “no, unless used as a tool in small instances.” There are better hills to die on than this, and you’ll just lose support for whatever objective you have.

      • hperrin@lemmy.caOP
        1 month ago

        There are lots of legal problems with accepting any AI-generated code, regardless of its quality. For one, AI tends to reproduce copyrighted code without the proper license:

        https://youtu.be/xvuiSgXfqc4?t=247

        Another is that AI-generated code is not copyrightable, so even if it isn’t copying someone else’s code, it can’t be licensed under an open source license.

        • CrypticCoffee@lemmy.ml
          1 month ago

          I watched it. He used a prompt with the exact wording from an example, and obviously it’s the most logical continuation, so the AI would generate it. We know how they’re trained. But how many real-world prompts will start with the exact code and comment? That’s unlikely to ever happen, so it’s unlikely to be direct infringement. Someone could easily sue the AI companies with these examples to prove they infringe copyrighted work, but governments are going to protect them, so it won’t happen.

          It’s unlikely to get identical output without intending to, and it would also take the infringed party actively taking steps to sue over it. The statistical likelihood of that happening in the real world is low.

          I’m with you in wanting to watch the AI industry collapse. But we’re unfortunately a minority without lobbying power, so it won’t happen.