This starter template lets you get started with the Bee Agent Framework in seconds.
See the documentation to learn more.
Install dependencies with `npm ci` and configure the `.env` file (the default LLM provider is locally hosted Ollama). Then run the agent:

```shell
npm run start src/agent.ts
```

To run the agent with a custom prompt:

```shell
npm run start src/agent.ts <<< 'Hello Bee!'
```
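The `<<<` above is a Bash here-string: it feeds the string to the command's standard input. A minimal sketch of what it does, using `cat` as a stand-in for the agent runner, alongside a POSIX-portable pipe equivalent for shells without here-strings:

```shell
# Here-string (Bash/Zsh): the text after <<< becomes the command's stdin
cat <<< 'Hello Bee!'

# POSIX-portable equivalent using a pipe
printf 'Hello Bee!\n' | cat
```

Both commands print `Hello Bee!`.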
More examples can be found here.
> [!TIP]
> To use the Bee agent with the Python Code Interpreter, refer to the Code Interpreter section.
> [!TIP]
> To use the Bee agent with Bee Observe, refer to the Observability section.
> [!NOTE]
> A Docker distribution with Compose support is required; the following are supported:
The Bee Code Interpreter is a gRPC service that the agent uses to safely execute arbitrary Python code.
Start the Code Interpreter infrastructure and run the agent:

```shell
npm run infra:start --profile=code_interpreter
npm run start src/agent_code_interpreter.ts
```

> [!NOTE]
> The Code Interpreter runs on `http://127.0.0.1:50051`.
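Before starting the agent, you can verify that something is listening on the interpreter's port. A sketch assuming Bash, which exposes TCP connections via the `/dev/tcp` pseudo-device (this is a generic port probe, not part of the starter's tooling):

```shell
# Succeeds only if a listener accepts a TCP connection on 127.0.0.1:50051 (Bash-specific)
if (exec 3<>/dev/tcp/127.0.0.1/50051) 2>/dev/null; then
  echo "code interpreter is reachable"
else
  echo "code interpreter is not reachable"
fi
```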
Get complete visibility into the agent's inner workings via our observability stack.
Start the observability infrastructure and run the agent:

```shell
npm run infra:start --profile=observe
npm run start src/agent_observe.ts
```

Inspect the trace in MLFlow (the agent will print instructions on how to do that) at http://127.0.0.1:8080/#/experiments/0 (username: admin, password: password).

> [!TIP]
> The configuration file is `infra/observe/.env.docker`.