Okay, I can understand that the religious AI Jesus pictures might just be posted by a person, but then who comes up with some of these ideas? I've found Facebook pages that seem to be run by AI, and they keep posting the same kind of text, often political or religious content, sometimes without AI pictures at all. I know this might not make sense, since we're told AI is not a person and can't have actual thoughts and views. But AI learns from humans, so what if it learns so well that it picks up personal opinions?

  • Mikina@programming.dev · 2 months ago

    What you are describing is simply bias from the training dataset. The best way to think about an LLM is that it works in basically the same way as your phone keyboard's text prediction when you keep mashing the suggested next word (assuming your phone does that; mine always recommends the next word as I type).

    Would my phone keyboard eventually start recommending religious words and phrases? Yes, it will, if I'm using those phrases often. Does that mean my phone keyboard is religious? That sounds pretty weird, doesn't it?

    And it's not even hyperbole to say this kind of text prediction is similar to how LLMs work. It's just math that gives you the next word based on statistics: whatever is most likely given the previous words. That's exactly what LLMs do, nothing more. The only difference is that my keyboard has learned only from what I type, and in a somewhat simpler way than LLMs do, but the goal and the result are the same for both: text prediction.
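
    To make the "it's just statistics" point concrete, here's a toy sketch in Python. It's a simple bigram counter, which is far cruder than a real keyboard model or an LLM (the corpus and function names are made up for illustration), but the principle is the same: pick the most likely next word given what came before, based on whatever text it has seen.

    ```python
    # Minimal sketch of statistical next-word prediction (a toy bigram model).
    # Not how any real keyboard or LLM is implemented - just the core idea:
    # the most common follower of the previous word wins.
    from collections import Counter, defaultdict

    def train(corpus: str) -> dict:
        """Count which word follows which in the training text."""
        words = corpus.lower().split()
        follows = defaultdict(Counter)
        for prev, nxt in zip(words, words[1:]):
            follows[prev][nxt] += 1
        return follows

    def predict_next(follows: dict, word: str):
        """Return the most frequent follower of `word`, or None if unseen."""
        counts = follows.get(word.lower())
        return counts.most_common(1)[0][0] if counts else None

    # The "training data" is just whatever the model has been fed.
    # If that text is full of religious phrasing, the predictions will be too -
    # that's bias from the data, not a belief held by the model.
    corpus = "praise the lord and share this blessed picture share this blessed picture amen"
    model = train(corpus)

    print(predict_next(model, "share"))    # -> "this"
    print(predict_next(model, "blessed"))  # -> "picture"
    ```

    Swap the corpus for recipes and it starts predicting cooking words instead; the mechanism never changes, only the statistics it learned.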