YACI

YACI stands for Yet Another Chat Interface

(for those who know me: I love silly acronyms)

Recently I started following a great project called Ollama, which has greatly simplified most of the LLM deployments I’ve tried so far. Given how easy and fast the whole process is, I decided it would be nice to use it to learn and test a few things I’ve been considering lately.

I would like to emphasize that this is mainly a personal project, primarily for learning and testing a few things around Nuxt and Nitro. In its current state it is suitable only for personal/local use. If you want to put it in a production/public environment you are on your own and solely responsible for it.

How to use it

Copy the docker compose file, then in the same directory run:

docker compose up -d

Then open up a browser and navigate to YACI (defaults to localhost:3000) and start chatting.
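
If you don’t have the compose file at hand, a minimal sketch of what it could look like is below. The yaci image name is an assumption on my part, so check the repository for the actual file.

# Hypothetical minimal compose.yaml (the yaci image name is assumed, check the repo)
services:
  yaci:
    image: ghcr.io/sandros94/yaci:latest
    ports:
      - "3000:3000"
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama

volumes:
  ollama: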

Features

What is currently available is pretty limited, since I’m building YACI only in my free time.

Limitations:

  • Currently it is only possible to manage models published by Ollama. Custom ones still need to be managed via Ollama’s REST API (see the example after this list).
  • No LangChain, only the plain REST API, though I’m planning to experiment with it in the future.
  • No docs yet, but they are planned.
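
For reference, managing a custom model through Ollama’s own REST API looks roughly like this (double-check Ollama’s API docs for the exact payload, which may change between versions):

# Create a custom model from a Modelfile via Ollama's REST API
curl http://localhost:11434/api/create -d '{
  "name": "my-custom-model",
  "modelfile": "FROM llama2\nSYSTEM You are a helpful assistant."
}'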

In the future, hopefully:

  • Upload Modelfile
  • Generate and use

Ollama

If you plan to access YACI and Ollama from somewhere other than localhost, you should configure Ollama’s OLLAMA_ORIGINS variable, as well as tell Nuxt which address to use via the NUXT_PUBLIC_YACI_OLLAMA_BASE_URL env variable.
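
As a sketch, reusing the service names from the hypothetical compose file above and a placeholder LAN address of 192.168.1.1, that could translate to something like:

# Hypothetical: make both services reachable from a LAN address instead of localhost
services:
  ollama:
    environment:
      OLLAMA_HOST: "0.0.0.0"
      OLLAMA_ORIGINS: "http://192.168.1.1:*"
  yaci:
    environment:
      NUXT_PUBLIC_YACI_OLLAMA_BASE_URL: "http://192.168.1.1:11434"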

Contribute

You could sponsor me if you’d like me to dedicate more of my non-free time to this project, or you could open up an Issue/PR.

If you go for the latter: clone the repo, install dependencies and start developing YACI:

# pnpm
pnpm install --frozen-lockfile && pnpm run dev

Remember to start an Ollama instance; the easiest way is via:

docker run --env=OLLAMA_HOST=0.0.0.0 --env=OLLAMA_ORIGINS="http://192.168.1.1:*,http://HOSTNAME.local:*" --volume=ollama:/root/.ollama -p 11434:11434 --name=ollama -d ollama/ollama:latest