• deFrisselle@lemmy.sdf.org · 15 days ago

    Odd that there is no mention of the parents contacting the police and working through them to get the images taken down. Technically and legally the photos would be considered child porn. Since it’s over the Internet, it would bring Federal charges, even though there may be State charges as well. Something was handled wrong if all the kid is getting is probation.

    • wewbull@feddit.uk · 15 days ago

      Technically and legally the photos would be considered child porn

      I don’t think that has been tested in court. It would be a reasonable legal argument to say that the image isn’t a photo of anyone. It doesn’t depict reality, so it can’t depict anyone.

      I think at best you can argue it’s a form of photo manipulation, and the intent is to create a false impression about someone. A form of image-based libel, but I don’t think that’s currently a legal concept. It’s also a concept where you would have to protect works of fiction, otherwise you’ve just made the visual effects industry illegal if you’re not careful.

      In fact, that raises an interesting simile. We do not allow animals to be abused, but we allow images of animal abuse in films as long as they are faked. We allow images of human physical abuse as long as they are faked. Children are often in horror films, and creating the images we see is very strictly managed so that the child actor is not exposed to anything that could distress them. The resulting “works of art” are not under such limitations as far as I’m aware.

      What’s the line here? Parental consent? I think that could lead to some very concerning outcomes. We all know abusive parents exist.

      I say all of this, not because I want to defend anyone, but because I think we’re about to set some really bad legal precedents if we’re not careful. Ones that will potentially do a lot of harm. Personally, I don’t think the concept of any image, or any other piece of data, being illegal holds water. Police people’s actions, not data.

      • Todd Bonzalez@lemm.ee · 15 days ago

        I don’t think that has been tested in court.

        It has and it continues to be.

        And even if it hadn’t, that’s no excuse not to start.

        It would be a reasonable legal argument to say that the image isn’t a photo of anyone. It doesn’t depict reality, so it can’t depict anyone.

        It depicts a real child and was distributed intentionally because of who it depicts. Find me the legal definition of pornography that demands that pornography be a “depiction of reality”. Where do you draw the line with such a qualifier?

        I think at best you can argue it’s a form of photo manipulation, and the intent is to create a false impression about someone.

        It is by definition “photo manipulation”, but the intent is to sexually exploit a child against her will. If you want to argue that this counts as a legal form of free speech (as libel is, FYI), you can fuck right on off with that.

        A form of image-based libel, but I don’t think that’s currently a legal concept.

        Maybe actually know something about the law before you do all this “thinking”.

        It’s also a concept where you would have to protect works of fiction otherwise you’ve just made the visual effects industry illegal if you’re not careful.

        Oh no, not the sLiPpErY sLoPe!!!

        We do not allow animals to be abused, but we allow images of animal abuse in films as long as they are faked.

        Little girls are the same as animals, excellent take. /s

        Children are often in horror films, and creating the images we see is very strictly managed so that the child actor is not exposed to anything that could distress them.

        What kind of horror films are you watching that has naked children in sexual situations?

        What’s the line here?

        Don’t sexually exploit children.

        Parental consent?

        What the living fuck? Parental consent to make porn of their kids? This is insane.

        I say all of this, not because I want to defend anyone, but because I think we’re about to set some really bad legal precedents if we’re not careful.

        The bad legal precedent of banning the creation and distribution of child pornography depicting identifiable minors?

        Personally, I don’t think the concept of any image, or any other piece of data, being illegal holds water.

        Somebody check this guy’s hard drive…

    • suburban_hillbilly@lemmy.ml · 15 days ago

      photos

      They aren’t photos. They’re photorealistic drawings done by computer algorithms. This might seem like a tiny quibble to many, but as far as I can tell it is the crux of the entire issue.

      There isn’t any actual private information about the girls being disclosed. The algorithms, for example, do not and could not know about and produce an unseen birthmark, mole, tattoo, piercing, etc. A photograph would have that information. What is being shown is an approximation of what similar-looking girls in the training set look like, with the girls’ faces stitched on top. That is categorically different from something like revenge porn, which is purely private information specific to the individual.

      I’m sure it doesn’t feel all that different to the girls in the photos, or to the boys looking at them for that matter. There is some degree of harm here without question. But we must tread lightly, because there is real danger in categorizing algorithmic guesswork as reliable, which many authoritarian types are desperate to do.

      https://www.wired.com/story/parabon-nanolabs-dna-face-models-police-facial-recognition/

      This is the other side of the same coin. We cannot start treating the output of neural networks as facts. These are error-prone black boxes, and that fact must be driven hard into the consciousness of every living person.
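
      As a toy sketch of that point (nothing from the linked article, just an illustration; the choice of torchvision’s pretrained resnet18 and the noise input are my own assumptions), an image classifier will still hand back a top label and a probability even when the input depicts nothing at all, which is exactly why its output can’t be read as a statement of fact:

      ```python
      # Feed random noise to a pretrained ImageNet classifier: it still
      # returns a top label and a probability, even though the input
      # depicts nothing. Model choice (resnet18) is purely illustrative.
      import torch
      from torchvision.models import resnet18, ResNet18_Weights

      weights = ResNet18_Weights.DEFAULT
      model = resnet18(weights=weights).eval()

      noise = torch.rand(1, 3, 224, 224)  # not a photo of anything
      with torch.no_grad():
          probs = torch.softmax(model(noise), dim=1)

      conf, idx = probs.max(dim=1)
      print(weights.meta["categories"][idx.item()], round(conf.item(), 3))
      ```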

      For some, I’m sure purely unrelated, reason, I feel like reading Philip K. Dick again…

      • KillingTimeItself@lemmy.dbzer0.com · 15 days ago

        They aren’t photos. They’re photorealistic drawings done by computer algorithms. This might seem like a tiny quibble to many, but as far as I can tell it is the crux of the entire issue.

        Most phone cameras alter the original image with AI shit now; it’s really common, and they apply all kinds of weird corrections to make it look better. Plus, if it’s social media, there’s probably a filter somewhere in there. At what point does this become the Ship of Theseus?

        My point here is that if we’re arguing that AI images are, semantically, not photos, then most photos on the internet, including those of people, would also arguably not be photos to some degree.
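
        As a rough sketch of what I mean (simple Pillow filters standing in for whatever ML-based processing a phone or app actually applies; the file names and filter choices are made up for illustration):

        ```python
        # Each "enhancement" pass rewrites pixel data, so the image that
        # gets posted has drifted from what the sensor originally recorded.
        from PIL import Image, ImageEnhance, ImageFilter

        img = Image.open("snapshot.jpg")

        pipeline = [
            lambda im: im.filter(ImageFilter.MedianFilter(size=3)),   # denoise
            lambda im: im.filter(ImageFilter.UnsharpMask(radius=2)),  # sharpen
            lambda im: ImageEnhance.Color(im).enhance(1.3),           # "vivid" look
            lambda im: ImageEnhance.Contrast(im).enhance(1.1),        # contrast boost
        ]

        for step in pipeline:
            img = step(img)  # every pass replaces the previous version's pixels

        img.save("as_posted.jpg")
        ```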

        • suburban_hillbilly@lemmy.ml · 15 days ago

          The difference is that a manipulated photo starts with a photo. It actually contains recorded information about the subject. Deepfakes do not contain any recorded information about the subject unless that subject is also in the training set.

          Yes, it is semantics; it’s the reason why we have different words for photography and drawing, and they are not interchangeable.

          • Rekorse@lemmy.dbzer0.com · 14 days ago

            The deepfakes would contain the prompt image provided by the creator. The model did not create a whole new approximation of their face, since the entire pool it can pull from for that specific part is the single image or group of images provided by the prompter.

          • KillingTimeItself@lemmy.dbzer0.com · 14 days ago

            Deepfakes do not contain any recorded information about the subject unless that subject is also in the training set.

            This is explicitly untrue; they literally do. You are just factually wrong about this. While it may not be in the training data, how do you think it manages to replace the face of someone in one picture with the face of someone else in some other video?

            Do you think it just magically guesses? No, it literally uses a real picture of someone. In fact, back in the day with GANimation and early deepfake software, you literally had to train these AIs on pictures of the person you wanted it to do a faceswap on. Remember all those singing deepfakes that were super popular a couple of years ago? Yep, those were literally trained on real pictures.

            Regardless, you are still ignoring my point. My question was: how can we consider AI content to be “not a photo”, yet consider a photo that has been manipulated numerous times, through numerous different processes, and which is quite literally not the original photo, to still be a literal “photo”? To rephrase it more simply for you and other readers: “Why is AI-generated content not considered to be a photo, when a heavily altered photo that only vaguely resembles its original in most aspects is considered to be a photo?”

            You seem to have missed the point of my question entirely, and simply said something wrong instead.

            Yes it is semantics

            No, it’s not; this is a Ship of Theseus premise. Semantics is just how we contextualize and conceptualize things into word form. The problem is not semantics (the words are only used to convey the problem at hand); the problem is a philosophical conundrum that has existed for thousands of years.

            In fact, if we’re going by semantics here, “photograph” is technically rather broad, as it literally just defines itself as “something in the likeness of”, though it specifies that the likeness is taken by the method of photography. We could arguably drop that part and simply use it to refer to something that is a likeness of something else. And we see this in the contextual usage of words: a “photographic” copy is often used to describe something that is similar enough to something else that, in terms of a photograph, they appear to be the same thing.

            Think about scanning a paper document: that would be a photographic copy of some physical item. While it is literally taken via means of photography, in a contextual and semantic sense it just means that the digital copy is photographically equivalent to the physical copy.

      • daellat@lemmy.world · 15 days ago

        I’ve only read Do Androids Dream of Electric Sheep? by him. What other book(s) of his should I check out?