I installed Ollama but I don’t have any idea what to do with it.

Do you have any fun/original use cases for it? I’m a programmer so it doesn’t have to exist already.

  • alphakenny1@lemmy.world · 15 days ago

    Link it to Open WebUI, which makes things easier. Then you can feed it knowledge from, say, all the manuals in your house, or your home insurance policy, or something like that.
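
    For the Open WebUI link, the rough shape is below. This assumes Ollama is already listening on its default port 11434; the image name and flags are from memory, so check the Open WebUI README for the current ones.

    ```sh
    # Run Open WebUI in Docker and point it at the Ollama instance on the host.
    # Port 3000 is an arbitrary choice for the web UI; the volume keeps uploaded
    # documents ("knowledge") and chat history across container restarts.
    docker run -d \
      -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
      -v open-webui:/app/backend/data \
      --name open-webui \
      ghcr.io/open-webui/open-webui:main
    ```

    Once it's up, you create a knowledge collection in the UI, upload the PDFs (manuals, policy documents), and attach it to a model so answers get grounded in your own files.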

    Link it to “speaches” and then you can make a voice chat.
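
    I haven't set up speaches myself, but since it presents an OpenAI-compatible speech API, the basic loop is: speaches transcribes your microphone audio, Ollama writes the reply, and speaches can speak it back. A hypothetical sketch -- the port, API key placeholders and model names are assumptions, not documented values:

    ```python
    from openai import OpenAI

    # speaches (assumed to be serving on port 8000) does speech-to-text;
    # Ollama's OpenAI-compatible endpoint on 11434 does the chat reply.
    stt = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")
    llm = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

    with open("question.wav", "rb") as audio:
        # model name is a placeholder; use whichever STT model speaches exposes
        text = stt.audio.transcriptions.create(model="whisper-1", file=audio).text

    reply = llm.chat.completions.create(
        model="llama3",
        messages=[{"role": "user", "content": text}],
    ).choices[0].message.content

    print(reply)  # feed this into speaches' text-to-speech side for a full voice loop
    ```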

    Link it to continue.dev for coding
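
    For Continue, it's mostly just telling the extension where Ollama lives. Something along these lines in Continue's config file -- the format has changed between versions (JSON vs. YAML), so treat this as a sketch and check their docs:

    ```json
    {
      "models": [
        {
          "title": "Local llama3 via Ollama",
          "provider": "ollama",
          "model": "llama3",
          "apiBase": "http://localhost:11434"
        }
      ]
    }
    ```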

    I think a lot of the use cases come from developing system prompts. You can then make a "custom" model for a specific task, e.g. this model knows about my home insurance policy but writes back like it's a pirate with a stutter.
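
    That kind of "custom" model is just an Ollama Modelfile with the system prompt baked in. Something like this (base model, name and wording are my own choices):

    ```
    # Modelfile: bake a persona/system prompt into a reusable model
    FROM llama3
    PARAMETER temperature 0.8
    SYSTEM """
    You answer questions about my home insurance policy, but you reply
    in the voice of a pirate with a stutter.
    """
    ```

    Then `ollama create insurance-pirate -f Modelfile` and chat with it via `ollama run insurance-pirate`.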

    Nothing hugely useful, but it gets your toes wet.

  • The Hobbyist@lemmy.zip · 15 days ago

    Ollama is very useful but also rather barebones. I recommend installing Open-Webui to manage models and conversations. It will also be useful if you want to tweak more advanced settings like system prompts, seed, temperature and others.
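
    Those same knobs are exposed through Ollama's HTTP API, so whatever you dial in through Open-Webui you can also drive from a script. A minimal sketch against the default endpoint (model name and prompt are just placeholders):

    ```python
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3",
            "prompt": "Summarise the key exclusions in two sentences.",
            "system": "You are a terse assistant.",       # system prompt
            "options": {"temperature": 0.2, "seed": 42},   # sampling settings
            "stream": False,                               # return one JSON object
        },
        timeout=120,
    )
    print(resp.json()["response"])
    ```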

    You can install open-webui using docker or just pip, which is enough if you only care about serving yourself.
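
    The pip route looks roughly like this (open-webui has been picky about Python versions -- 3.11 at the time of writing -- so check the README first):

    ```sh
    pip install open-webui
    open-webui serve   # then browse to http://localhost:8080
    ```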

    Edit: open-webui also renders markdown, which makes formatting and reading much more appealing and useful.

    Edit2: you can also plug ollama into continue.dev, a vscode extension that brings LLM capabilities into your IDE.