• traches@sh.itjust.works · 24 days ago

    They don’t have to; algorithms do whatever they’re designed to do. Long division is an algorithm.
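    To make that concrete, here’s a toy long-division routine; it just carries out its steps and nothing else. (Purely illustrative, nothing from the thread itself.)

    ```python
    # Toy sketch: long division as a plain step-by-step algorithm.
    def long_division(dividend: int, divisor: int) -> tuple[int, int]:
        """Return (quotient, remainder) by processing one digit at a time."""
        quotient = 0
        remainder = 0
        for digit in str(dividend):
            remainder = remainder * 10 + int(digit)   # "bring down" the next digit
            quotient = quotient * 10 + remainder // divisor
            remainder = remainder % divisor
        return quotient, remainder

    print(long_division(1234, 7))  # (176, 2) -- matches 1234 // 7 and 1234 % 7
    ```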

    Profit motives are the issue here.

  • pelespirit@sh.itjust.works · 24 days ago

    They did a study around the 2020 elections and found that the following works with trolls:

    Respond once with the facts (if you must), and then walk away. I’ve found that Lemmy doesn’t need that most of the time; just downvoting seems to work. But if you’re on the place that shall not be named, this works.

  • ristoril_zip@lemmy.zip · 24 days ago

    What I don’t get about this: why, with all the analytics tools we have in this day and age, do companies continue to just happily pay for simple eyeball exposure?

    The only time they seem to have any pause at all on this model is if people post screenshots of ads for their products next to posts literally praising Nazis.

    These so-called AIs (LLMs) can learn to tell the difference between positive/happy/uplifting posts, neutral posts, and angry/sad/disturbing posts. The advertisers should be asking for their products to be featured next to the first and second groups of posts.
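    As a rough sketch of what that kind of brand-safety filter could look like (the keyword lists and function names here are made up; a real system would use a trained classifier):

    ```python
    # Illustrative only: score a post's tone and allow ads next to
    # positive or neutral posts, but not angry/sad/disturbing ones.
    POSITIVE = {"great", "love", "happy", "wonderful", "uplifting"}
    NEGATIVE = {"hate", "angry", "disturbing", "awful", "outrage"}

    def tone(post: str) -> str:
        words = set(post.lower().split())
        score = len(words & POSITIVE) - len(words & NEGATIVE)
        return "positive" if score > 0 else "negative" if score < 0 else "neutral"

    def ad_safe(post: str) -> bool:
        # Advertisers asking for the first and second groups only:
        return tone(post) in {"positive", "neutral"}

    print(ad_safe("what a wonderful day"))    # True
    print(ad_safe("this makes me so angry"))  # False
    ```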

    People engage based on anger, sure. They click posts and reply and whatnot. But do they click the ad next to a post that pisses them off and then buy the product?

    Or is this purely a subconscious intrusion effort? Do the advertisers just want their products in front of eyeballs regardless of what’s around the ad? It seems like the answer is “no” when they’re called out. But maybe it’s “yes” if they can get away with it?

  • RememberTheApollo_@lemmy.world · 24 days ago

    I’ve been participating in Threads (yeah, I know, should be ashamed) and I’m unfortunately a sucker for some of the ragebait, especially political.

    Guess what Threads pushes at me. A lot of the dumbest ragebait. Not people that actually want to have a conversation. My fault for being a sucker, but the algorithms work.

    Doesn’t really matter, I’m shadowbanned. Pissed off too many republican propagandists by refuting them, so as usual, the “report” button is their remedy.

  • RGB@lemmy.today · 24 days ago

    Algorithms simply determine which posts will get the most interaction and feed them to people. Does it benefit corps? Of course! But it’s driven by people who choose to engage with this content.
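    Something like this, in spirit; the fields and weights below are invented for illustration, not how any real platform actually scores posts:

    ```python
    # Hypothetical engagement-driven ranking: sort the feed by predicted
    # interaction, regardless of what the content actually is.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        likes: int
        comments: int
        shares: int

    def predicted_engagement(post: Post) -> float:
        # Comments and shares tend to signal stronger engagement than likes.
        return post.likes + 3 * post.comments + 5 * post.shares

    def rank_feed(posts: list[Post]) -> list[Post]:
        return sorted(posts, key=predicted_engagement, reverse=True)
    ```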

  • wizardbeard@lemmy.dbzer0.com · 24 days ago

    Wasn’t this literally the shady research that Facebook got caught doing with Cambridge Analytica? Specifically, tweaking a user’s feed to be more negative resulted in that user posting more negative things themselves and in more engagement overall.

    • sp3ctr4l@lemmy.zip · 24 days ago

      Yep!

      Facebook figured out how to monetize trolling.

      Over 10 years later, it’s destroyed society, but made them a lot of money.

    • TheReturnOfPEB@reddthat.com · edited · 24 days ago

      I wonder exactly how much of Hawaii Zuckerberg has to own before people start to question what they are getting from facebook.

  • AVincentInSpace@pawb.social · 24 days ago

    …as opposed to platforms like Lemmy, where the only political ideologies you’ll find are “leftists” who, when asked what they even believe, respond with “what are you, a cop?”

  • MouseKeyboard@ttrpg.network · 24 days ago

    For a long time, Facebook counted an angry react as equal to five likes when measuring engagement. It’s very much intentional.
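    In code terms, that weighting looks something like this (only the angry-equals-five-likes part comes from the reporting on Facebook; the rest is illustrative):

    ```python
    # Sketch of reaction-weighted engagement scoring.
    REACTION_WEIGHTS = {"like": 1, "angry": 5}  # one angry react == five likes

    def engagement_score(reactions: dict[str, int]) -> int:
        return sum(REACTION_WEIGHTS.get(kind, 1) * count
                   for kind, count in reactions.items())

    print(engagement_score({"like": 10, "angry": 2}))  # 20 -- anger catches up fast
    ```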