NodeBB

What can I use for an offline, self-hosted LLM client, preferably with images, charts, and Python code execution?

Category: Selfhosted
25 Posts · 14 Posters
  • [email protected] (#21), replying to [email protected]:

    > Ollama for the API, which you can integrate into Open WebUI. You can also integrate image generation with ComfyUI, I believe.
    >
    > It's less of a hassle to use Docker for Open WebUI, but Ollama works as a regular CLI tool.

    This is what I do; it's excellent.
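    The Ollama + Open WebUI pairing described above is typically run as two containers on one host. A minimal docker-compose sketch, under my assumptions (service names and the volume path are illustrative; check each project's docs for current image tags and settings):

    ```yaml
    # Hypothetical minimal stack: Ollama serves the API, Open WebUI is the client.
    services:
      ollama:
        image: ollama/ollama
        volumes:
          - ollama-data:/root/.ollama   # persist downloaded models across restarts
      open-webui:
        image: ghcr.io/open-webui/open-webui:main
        ports:
          - "3000:8080"                 # UI reachable at http://localhost:3000
        environment:
          # Point Open WebUI at the Ollama container on Ollama's default port.
          - OLLAMA_BASE_URL=http://ollama:11434
        depends_on:
          - ollama
    volumes:
      ollama-data:
    ```

    Keeping both services in one compose file is what avoids the upgrade headache raised later in the thread: one `docker compose pull && docker compose up -d` updates the pair together.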
  • [email protected] (#22), replying to [email protected]:

    > You should try https://cherry-ai.com/. It's the most advanced client out there. I personally use Ollama for running the models and the Mistral API for advanced tasks.

    But its website is in Chinese. Also, what's the GitHub?
  • [email protected] (#23), replying to [email protected]:

    > Ollama for the API, which you can integrate into Open WebUI. You can also integrate image generation with ComfyUI, I believe.
    >
    > It's less of a hassle to use Docker for Open WebUI, but Ollama works as a regular CLI tool.

    But won't this be a mish-mash of different Docker containers and projects, creating an installation, dependency, and upgrade nightmare?
        • C [email protected]

          But won't this be a mish-mash of different docker containers and projects creating an installation, dependency, upgrade nightmare?

          andrew0@lemmy.dbzer0.comA This user is from outside of this forum
          andrew0@lemmy.dbzer0.comA This user is from outside of this forum
          [email protected]
          wrote last edited by
          #24

          All the ones I mentioned can be installed with pip or uv if I am not mistaken. It would probably be more finicky than containers that you can put behind a reverse proxy, but it is possible if you wish to go that route. Ollama will also run system-wide, so any project will be able to use its API without you having to create a separate environment and download the same model twice in order to use it.

          1 Reply Last reply
          1
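    A quick way to see that system-wide sharing in action: any script on the machine can call Ollama's local HTTP API directly, with no per-project environment. A minimal stdlib-only sketch, assuming Ollama's documented default port 11434 and `/api/generate` endpoint (the model name is a placeholder):

    ```python
    import json
    from urllib import request, error

    OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address


    def build_generate_payload(model: str, prompt: str) -> dict:
        """Build the JSON body for a one-shot /api/generate request."""
        return {"model": model, "prompt": prompt, "stream": False}


    def generate(model: str, prompt: str) -> str:
        """Send a completion request to a locally running Ollama instance."""
        body = json.dumps(build_generate_payload(model, prompt)).encode()
        req = request.Request(
            f"{OLLAMA_URL}/api/generate",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]


    if __name__ == "__main__":
        try:
            print(generate("llama3", "Say hello in one word."))
        except (error.URLError, OSError):
            print("Ollama is not reachable on localhost:11434")
    ```

    Because every project talks to the same daemon, the model blobs under `~/.ollama` are downloaded once and shared.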
          • C [email protected]

            But its website is Chinese. Also what's the github?

            H This user is from outside of this forum
            H This user is from outside of this forum
            [email protected]
            wrote last edited by
            #25

            https://github.com/CherryHQ/cherry-studio

            1 Reply Last reply
            2