Portable Ollama


Overview

A single zip file that runs Ollama-based LLMs without installing anything onto the host system.

Tech Stack

Python, Go

The Challenge

As a youngin, adapting a project I didn't write to fit my own vision was a new and worthwhile challenge.

The Solution

Writing this a couple of years after the fact, I believe the solution was live-patching the environment variables that Ollama reads so they pointed at the directory the server was run from.
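The idea can be sketched in a few lines of Python: build a copy of the environment that redirects Ollama's state into the bundle's own directory, then launch the bundled server with it. This is a minimal reconstruction from memory, not the original code; `OLLAMA_MODELS` and `OLLAMA_HOST` are real Ollama variables, but the exact set the project patched may have differed.

```python
import os
import subprocess
from pathlib import Path


def portable_env(base: Path) -> dict:
    """Build an environment that keeps Ollama's state inside `base`.

    Pointing OLLAMA_MODELS into the bundle means model weights are
    stored alongside the zip's contents instead of the user's home
    directory, so nothing is written to the host system's defaults.
    """
    env = os.environ.copy()
    env["OLLAMA_MODELS"] = str(base / "models")
    env["OLLAMA_HOST"] = "127.0.0.1:11434"  # keep the server local-only
    return env


if __name__ == "__main__":
    # Assume the bundled ollama binary sits next to this launcher script.
    base = Path(__file__).resolve().parent
    subprocess.run([str(base / "ollama"), "serve"], env=portable_env(base))
```

Because the child process inherits only the patched copy of the environment, the host's own Ollama configuration (if any) is never touched.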

What I Learned

Much like LLMs themselves, this was practice in working with a black box: going into the project with software I couldn't fully understand, and coming out with a modified product that fit my vision.