YACI stands for Yet Another Chat Interface
(those who know me know I love silly acronyms)
Recently I started following a great project called Ollama, which has greatly simplified most of the LLM deployments I’ve tried so far. Since the process was so insanely easy and fast, I decided it would be nice to use it to learn and test a few things I’ve been considering lately.
I would like to emphasize that this is mainly a personal project, built primarily for learning and testing a few things around Nuxt and Nitro. In its current state it is suitable only for personal/local use. If you want to put it in a production/public environment, you are on your own and solely responsible for it.
Copy the docker compose file, then in the same directory run:
docker compose up -d
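For reference, a minimal compose file might look like the sketch below. The YACI image name is a placeholder (use the one from the repo), and the env wiring is an assumption; the Ollama service mirrors the docker run command shown further down.

services:
  yaci:
    image: yaci:latest # placeholder: use the actual image/build config from the repo
    ports:
      - "3000:3000"
    environment:
      # assumption: the browser talks to Ollama directly, so this URL must be reachable from the browser
      - NUXT_PUBLIC_YACI_OLLAMA_BASE_URL=http://localhost:11434
    depends_on:
      - ollama
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama
volumes:
  ollama: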
Then open a browser, navigate to YACI (defaults to localhost:3000), and start chatting.
What is currently available is pretty limited, since I’m building YACI only in my free time. Some of the features are:
Limitations:
In the future, hopefully:
Modelfile
If you plan to access YACI and Ollama from something other than localhost, you should configure the OLLAMA_ORIGINS variable, as well as point Nuxt at the right address via the NUXT_PUBLIC_YACI_OLLAMA_BASE_URL environment variable.
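As a sketch, assuming YACI is served from 192.168.1.1:3000 and Ollama listens on the same host (the addresses are examples only):

# set on the Ollama side: allow requests from the YACI origin
export OLLAMA_ORIGINS="http://192.168.1.1:*"
# set on the YACI/Nuxt side: where the browser should reach Ollama
export NUXT_PUBLIC_YACI_OLLAMA_BASE_URL="http://192.168.1.1:11434"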
You could sponsor me if you’d like me to dedicate more of my non-free time to this project, or you could open an Issue/PR.
If you go for the latter: clone the repo, install the dependencies, and start developing YACI:
# pnpm
pnpm install --frozen-lockfile && pnpm run dev
Remember to start an Ollama instance; the easiest way is via:
docker run --env=OLLAMA_HOST=0.0.0.0 --env=OLLAMA_ORIGINS="http://192.168.1.1:*,http://HOSTNAME.local:*" --volume=ollama:/root/.ollama -p 11434:11434 --name=ollama -d ollama/ollama:latest
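Note that a fresh Ollama instance has no models; you’ll need to pull one before chatting, for example (the model name is just an example):

docker exec -it ollama ollama pull llama3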