Version: v0.4.0

Llama Stack UI

The Llama Stack UI is a web-based interface for interacting with Llama Stack servers. Built with Next.js and React, it provides a visual way to work with agents, manage resources, and view logs.

Features

  • Logs & Monitoring: View chat completions, agent responses, and vector store activity
  • Vector Stores: Create and manage vector databases for RAG (Retrieval-Augmented Generation) workflows
  • Prompt Management: Create and manage reusable prompts

Prerequisites

You need a running Llama Stack server. The UI is a client that connects to the Llama Stack backend.

If you don't have a Llama Stack server running yet, see the Starting Llama Stack Server guide.
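
Before launching the UI, it can help to verify that the backend is reachable. A minimal check, assuming the server runs on the default port and exposes the /v1/health endpoint (present in recent Llama Stack releases; adjust if your version differs):

# Confirm the Llama Stack server responds before starting the UI
# (the /v1/health path is an assumption; check your server's API docs)
curl -s http://localhost:8321/v1/health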

Running the UI

Option 1: Using npx

The fastest way to get started is using npx:

npx llama-stack-ui

This will start the UI server on http://localhost:8322 (default port).

Option 2: Using Docker

Run the UI in a container:

docker run -p 8322:8322 llamastack/ui

Access the UI at http://localhost:8322.
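
Note that inside the container, localhost refers to the container itself, so the default backend URL will not reach a server running on your host machine. A sketch of one way around this, assuming your Llama Stack server listens on the host's port 8321 (host.docker.internal resolves to the host on Docker Desktop, and the --add-host mapping extends this to Linux):

# Point the containerized UI at a server running on the host
docker run -p 8322:8322 \
--add-host host.docker.internal:host-gateway \
-e LLAMA_STACK_BACKEND_URL=http://host.docker.internal:8321 \
llamastack/ui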

Environment Variables

The UI can be configured using the following environment variables:

Variable                   Description                       Default
LLAMA_STACK_BACKEND_URL    URL of your Llama Stack server    http://localhost:8321
LLAMA_STACK_UI_PORT        Port for the UI server            8322

If the Llama Stack server is running with authentication enabled, configure the UI to authenticate against it with the following environment variables:

Variable                Description                                                 Default
NEXTAUTH_URL            NextAuth URL for authentication                             http://localhost:8322
GITHUB_CLIENT_ID        GitHub OAuth client ID (optional, for authentication)      -
GITHUB_CLIENT_SECRET    GitHub OAuth client secret (optional, for authentication)  -
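
For example, to run the UI with GitHub OAuth enabled (a minimal sketch; the client ID and secret are placeholders from your GitHub OAuth app settings, not real values):

# Placeholder credentials from your GitHub OAuth app
export NEXTAUTH_URL=http://localhost:8322
export GITHUB_CLIENT_ID=<your-client-id>
export GITHUB_CLIENT_SECRET=<your-client-secret>
npx llama-stack-ui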

Setting Environment Variables

For npx:

LLAMA_STACK_BACKEND_URL=http://localhost:8321 \
LLAMA_STACK_UI_PORT=8080 \
npx llama-stack-ui

For Docker:

docker run -p 8080:8080 \
-e LLAMA_STACK_BACKEND_URL=http://localhost:8321 \
-e LLAMA_STACK_UI_PORT=8080 \
llamastack/ui
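
Either way, the UI should then answer on the overridden port. A quick sanity check, assuming the port 8080 used in the examples above:

# Confirm the UI is serving on the new port
curl -I http://localhost:8080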

Using the UI

Managing Resources

  • Vector Stores: Create vector databases for RAG workflows, view stored documents and embeddings
  • Prompts: Create and manage reusable prompt templates
  • Chat Completions: View the history of chat interactions (see the example after this list)
  • Responses: Browse detailed agent responses and tool calls
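
The logs pages are populated from records stored by the backend, so you can query the same data directly. A rough illustration, assuming your server exposes the OpenAI-compatible listing endpoint at this path (which may vary by Llama Stack version):

# List stored chat completions straight from the backend
# (assumed default path; check your server's API docs)
curl -s http://localhost:8321/v1/chat/completions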

Development

If you want to run the UI from source for development:

# From the project root
cd src/llama_stack_ui

# Install dependencies
npm install

# Set environment variables
export LLAMA_STACK_BACKEND_URL=http://localhost:8321

# Start the development server
npm run dev

The development server will start on http://localhost:8322 with hot reloading enabled.
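
To serve an optimized production build instead of the dev server, the standard Next.js workflow should apply (a sketch, assuming the default build and start scripts in package.json):

# Build and serve a production bundle
npm run build
npm run start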