TL;DR: They're adding opt-in alt-text generation for blind people and an opt-in AI chat sidebar where you can choose the model used (including self-hosted ones).

  • Xuderis@lemmy.world · 4 months ago

    But what does it DO? How is it actually useful? An accessibility PDF reader is nice, but AI can do more than that

    "Our initial offering will include ChatGPT, Google Gemini, HuggingChat, and Le Chat Mistral"

    This is great, but again, what for?

    • Blisterexe@lemmy.zip (OP) · 4 months ago

      A lot of people use LLMs a lot, so it's useful for them, but it's also nice for summarizing long articles you don't have time to read. Not as good as reading the article, but better than skimming it.

      • Rogério Bordini@ursal.zone · 4 months ago

        @Blisterexe @Xuderis It's true; as a researcher, these models have helped me a lot in speeding up reading and finding specific information in scientific articles. As long as it respects privacy, I view this implementation favorably.

        • Blisterexe@lemmy.zip (OP) · 4 months ago

          It lets you use any model, so while you can use ChatGPT, you can also point it at a self-hosted model by editing about:config.
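          Roughly, going by the Firefox Labs docs at the time (pref names may shift between releases, and the localhost URL below is just an example for a local llamafile / llama.cpp web server), the same prefs in user.js form would look like:

              // opt in to the AI chatbot sidebar feature
              user_pref("browser.ml.chat.enabled", true);
              // allow localhost providers to show up as an option
              user_pref("browser.ml.chat.hideLocalhost", false);
              // point the sidebar at a self-hosted chat UI (example URL)
              user_pref("browser.ml.chat.provider", "http://localhost:8080");

          The sidebar basically embeds whatever page the provider pref points at, which is why a self-hosted web UI can work alongside the hosted options.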

          • Xuderis@lemmy.world · 4 months ago

            But what does using that in my browser get me? If I’m running llama2, I can already copy and paste text into the terminal if I want. Is this just saving me that step?