A bipartisan group of senators introduced a new bill to make it easier to authenticate and detect artificial intelligence-generated content and protect journalists and artists from having their work gobbled up by AI models without their permission.

The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) would direct the National Institute of Standards and Technology (NIST) to create standards and guidelines that help prove the origin of content and detect synthetic content, for example through watermarking. It also directs the agency to create security measures to prevent tampering, and it requires AI tools used for creative or journalistic content to let users attach provenance information to their work and prohibits that information from being removed. Under the bill, such content also could not be used to train AI models.
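The bill leaves the actual technical standard to NIST, but the general shape of content provenance is familiar from existing efforts such as C2PA: bind a manifest describing the content’s origin to the content itself with a cryptographic hash, then sign it so that removal or tampering is detectable. The sketch below is only a rough illustration of that idea; the manifest fields, the HMAC-based signing, and the hard-coded key are assumptions for the example, not anything the bill or NIST specifies.

```python
# Illustrative sketch only: the COPIED Act directs NIST to define the real
# standard, so every field and the signing scheme here are assumptions.
import hashlib
import hmac
import json
from datetime import datetime, timezone

SIGNING_KEY = b"demo-key"  # hypothetical; a real scheme would use public-key signatures


def attach_provenance(content: bytes, creator: str, tool: str) -> dict:
    """Build a tamper-evident provenance manifest for a piece of content."""
    manifest = {
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "creator": creator,
        "generator_tool": tool,
        "created_at": datetime.now(timezone.utc).isoformat(),
        "ai_generated": True,
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return manifest


def verify_provenance(content: bytes, manifest: dict) -> bool:
    """Check that the content matches the manifest and the manifest was not altered."""
    unsigned = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(manifest.get("signature", ""), expected)
            and unsigned["content_sha256"] == hashlib.sha256(content).hexdigest())


if __name__ == "__main__":
    article = b"Example AI-assisted article text."
    m = attach_provenance(article, creator="Jane Doe", tool="ExampleWriter 1.0")
    print(verify_provenance(article, m))         # True
    print(verify_provenance(article + b"!", m))  # False: content no longer matches
```

A production scheme would sign with a private key and publish the corresponding certificate so that anyone can verify a manifest without holding the signing key, which is roughly how C2PA works today.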

Content owners, including broadcasters, artists, and newspapers, could sue companies they believe used their materials without permission or tampered with authentication markers. State attorneys general and the Federal Trade Commission could also enforce the bill, which its backers say prohibits anyone from “removing, disabling, or tampering with content provenance information” outside of an exception for some security research purposes.

(A copy of the bill is in the article; here is the important part, imo:

Prohibits the use of “covered content” (digital representations of copyrighted works) with content provenance to either train an AI-/algorithm-based system or create synthetic content without the express, informed consent and adherence to the terms of use of such content, including compensation)
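Read literally, that provision implies a gating step in any training pipeline: before a work goes into the data set, check whatever provenance it carries for consent and terms of use. The sketch below is a hypothetical illustration of such a gate; the Work type and the ai_training_allowed field are invented for the example, since the bill leaves the actual provenance format to NIST.

```python
# Hypothetical pre-training filter; the provenance fields are invented for
# illustration and are not defined by the COPIED Act itself.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Work:
    text: str
    provenance: Optional[dict]  # None = no content provenance information attached


def usable_for_training(work: Work) -> bool:
    """Exclude covered content unless its provenance records express consent."""
    if work.provenance is None:
        # Simplification: the quoted provision targets works *with* provenance attached.
        return True
    return bool(work.provenance.get("ai_training_allowed", False))


corpus = [
    Work("a news article", {"creator": "Some Newspaper", "ai_training_allowed": False}),
    Work("a blog post", {"creator": "A Blogger", "ai_training_allowed": True}),
    Work("an old public-domain novel", None),
]

training_set = [w for w in corpus if usable_for_training(w)]
print(len(training_set))  # 2: the newspaper article is excluded
```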

  • e$tGyr#J2pqM8v@feddit.nl

    I don’t like AI but I hate intellectual property. And the people that want to restrict AI don’t seem to understand the implications that has. I am ok with copying, as I think copyright is a load of bollocks. But they aren’t even reproducing the content verbatim, are they? They’re ‘taking inspiration’, if you will, transforming it into something completely different. Seems like fair use to me. It’s just that people hate AI and hate the companies behind it, and don’t get me wrong, rightfully so, but that shouldn’t make us all stop thinking critically about intellectual property laws.

    • just another dev@lemmy.my-box.dev

      I’m the opposite, actually. I like generative AI. But as a creator who shares his work with the public for their (non-commercial) enjoyment, I am not okay with a billionaire industry training their models on my content without my permission, and then using those models as a money machine.

        • just another dev@lemmy.my-box.dev

          What are you basing that on?

          Content owners, including broadcasters, artists, and newspapers, could sue companies they believe used their materials without permission or tampered with authentication markers.

          It doesn’t say anything about the right applying just to giant tech companies; it specifically mentions artists among the protected content owners.

          • interdimensionalmeme@lemmy.ml

            That’s like saying you are just as protected regardless of which side of the moat you stand on.

            It’s pretty clear the way things are shaping up is that only the big tech elite will control AI, and they will lord it over us.

            The worst thing that could happen with AI, it falling into the hands of the elites, is happening.

            • just another dev@lemmy.my-box.dev

              I respectfully disagree. I think small-time AI (read: pretty much all the custom models on Hugging Face) will get a giant boost out of this, since they can get away with training on “custom” data sets, because they are too small to be held accountable.

              However, those models will become worthless for enterprise-level use, since they wouldn’t be able to account for the legality of their training data. In other words, once you make big bucks off of AI you’ll have to prove your models were sourced properly. But if you’re just creating a model for small-time use, you can get away with a lot.

              • interdimensionalmeme@lemmy.ml

                I am skeptical that this is how it will turn out. I don’t really believe there will be a path from $0 to challenging big tech without a roadblock of lawyers shutting you down with no way out along the way.

                • just another dev@lemmy.my-box.dev

                  I don’t think so either, but to me that is the purpose.

                  Somewhere between small time personal-use ML and commercial exploitation, there should be ethical sourcing of input data, rather than the current method of “scrape all you can find, fuck copyright” that OpenAI & co are getting away with.

                  • interdimensionalmeme@lemmy.ml

                    I mean, this is exactly the kind of regulation that Microsoft/OpenAI is begging for to cement their position. Then it’s just going to be a matter of digesting their surviving competitors until only one competitor remains, similar to the Intel/AMD relationship. Then they can have a 20-year period of stagnation while they progressively screw over customers and suppliers.

                    I think that’s the bad ending. By desperately trying to keep the old model of intellectual property going, they’re going to bring about the real AI nightmare: an elite few in control of the technology, with an unconstrained ability to leverage its benefits and further solidify their lead over everyone else.

                    The collective knowledge of humanity is not their exclusive property. It also isn’t the property of whoever is the latest person to lay a claim to an idea in effective perpetuity.

    • rekorse@lemmy.world

      Just because intellectual property laws can currently be exploited doesn’t mean there is no place for them at all.

      • e$tGyr#J2pqM8v@feddit.nl

        That’s an opinion you can have, but I can just as well hold mine, which is that restricting any form of copying is unnatural and harmful to society.

          • e$tGyr#J2pqM8v@feddit.nl

            That’s right. They can put their art up for sale, but if someone wants to take a free copy nothing should be able to stop them.

                • rekorse@lemmy.world

                  That would lead to most art being produced by people who are wealthy enough to afford to produce it for free, wouldn’t it?

                  What incentive would a working person have to work on becoming an artist? It’s not like artists are decided at birth or something.

                  • e$tGyr#J2pqM8v@feddit.nl

                    Most people who make art don’t make any money from it. Some make a little bit of money. A small number of people can make a living just by making art, and just a fraction of those actually get most of the money that’s being earned by artists, and then of course there is a lot of money being paid for art that never reaches the artist. The business as it is isn’t working very well for anyone except some big media companies. The complete lack of commercial success hasn’t stopped a lot of artists, and it won’t stop them in the future. Thank god, because it wouldn’t be the first time that, after decades of no commercial success whatsoever, such an outsider is discovered by the masses. Sure, lack of commercial success has stopped others, but that’s happening now just as it would without copyright laws.

                    If donating to artists of your own free will were the norm, and people knew that that’s the main source of income for certain types of artists, then I’m sure a lot of people would do so. And aside from private donations there could be governments and all sorts of institutions financing art. And if someone still can’t make a living, then still none of that could legitimize copyright in my view. We should strive for a world where everyone who wants to follow up on their creative impulses has the time and opportunity to do so, irrespective of their commercial success. But there should also be unrestricted access to knowledge, ideas, art, etc. Brilliant research, photography or music shouldn’t be reserved for those who can afford access. The public domain should be the norm so that our shared project of human creativity can reach its maximum potential.

                    Copyright seems to me to be a rather bizarre level of control over the freedom of others. It’s making something public for others to see, but then telling these people they’re not allowed to be inspired by it, they can’t take a free copy to show others, they can’t take the idea and do with it as they please. It’s severely limiting us culturally; it’s harming human creativity. And at the same time it’s hypocritical. Artistic ideas are often completely based on the ideas of others; everyone can see that the output is the result of a collective effort. The Beatles didn’t invent pop music, they just made some songs, precisely copying all that came before them, and then added a tiny bit of their own. And that’s not a criticism, that’s how human creativity functions. That’s what people should strive for. To limit copying is to limit humanity at its core.

                    Again, human creativity is very clearly a collective effort. But despite this fact, when someone gets successful suddenly it’s a personal achievement and they are allowed to ask for a lot of money for it. Well, my answer is: yes, they are allowed to ask, and I am very willing to pay, but they shouldn’t be allowed to go beyond asking; they shouldn’t be allowed to restrict access to something that has been published.

    • Adderbox76@lemmy.ca

      They’re ‘taking inspiration’ if you will, transforming it into something completely different.

      That is not at all what takes place with A.I.

      An A.I. doesn’t “learn” like a human does. It aggregates multiple chunks from multiple sources. It’s just really really tiny chunks so it’s hard to tell sometimes.

      That’s why you can ask two AIs to write a story based on the same prompt and some of their lines will be exactly the same. Because it’s not taking inspiration; it’s literally copying bits and pieces of other works, and it happens that they both chose that particular bit.

      If you do that when writing a paper in university, it’s called plagiarism.

      Get the fuck out of here with your “A.I. takes inspiration…” It copies, nothing more. It doesn’t add anything new to the sum total of the creative zeitgeist because it’s just remixes of things that already exist.

      • afraid_of_zombies@lemmy.world

        You can do the same thing with the Hardy Boys. You can find the same page word for word in different books. You can also do that with the Bible. The authors were plagiarizing each other.

        It doesn’t add anything new to the sum total of the creative zeitgeist because it’s just remixes of things that already exist.

        Do yourself a favor and never ever go into design of infrastructure equipment or eat at a Pizza Hut or get a drink from Starbucks or work for an American car company or be connected to Boeing.

        Everyone has this super impressive view of human creativity and I am waiting to see any of it. As far as I can tell, the less creative you are the more success you will have. But let me guess: you ride a Segway, wear those shoes with toes, have gone through every recipe of Julia Child’s, and compose novels that look like Finnegans Wake got into a car crash with E. E. Cummings and Gravity’s Rainbow.

        Now leave me alone while I eat the same burger as everyone else and watch reruns of Family Guy in my house that looks like all the other ones on the street.

      • ObliviousEnlightenment@lemmy.world

        Consider YouTube poop. I’m serious. Every clip in them is sourced from preexisting audio and video, and mixed or distorted in a comedic format. You could make an AI to make YouTube poops using those same clips and other “poops” as training data. What it outputs might be of lower quality, but in a technical sense it would be made in an identical fashion. And, to the chagrin of Disney, Nintendo, and Viacom, these are considered legally distinct entities, because I don’t watch Frying Nemo in place of Finding Nemo. So why would it be any different when an AI makes it?

      • Richard@lemmy.world

        You just reiterate what other anti-ML extremists have said like a sad little parrot. No, LLMs don’t just copy. They network information and associations and can output entirely new combinations of them. To do this, they make use of neural networks, which are computational concepts analogous to the way your brain works. If, according to you, LLMs just copy, then that’s all that you do as well.

      • LainTrain@lemmy.dbzer0.com

        it copies, nothing more … it’s just remixes of things that already exist.

        So it does do more than copying? Because as you said - it remixes.

        It sounds like the line you’re trying to draw is not only arbitrary, but you yourself can’t even stick with it for more than one sentence.

        Everything new is some unique combination of things that already exist, the elements it draws from are called sources and influences, and rules according to which they’re remixed are called techniques.

        Heck even re-arranging elements of just one thing is a unique and different thing, or is your favourite song and a remix of it literally the same? Or does the remix not have artistic value, even though someone out there probably likes the remix, but not the original?

        I think your confusion stems from the fact you’re a top shelf, grade-A Moron.

        You’re an organic, locally sourced and ethically produced idiot, and you need to learn how basic ML works, what “new” is, and glance at some basic epistemology and metaphysics, because you don’t even understand what “new” entails, before your reactionary rhetoric leads us all straight down to cyberpunk dystopias.