inline::chromadb

Description

Chroma is an inline and remote vector database provider for Llama Stack. It allows you to store and query vectors directly within a Chroma database, so you are not limited to keeping vectors in memory or in a separate service.

Features

Chroma supports the following features, a few of which are sketched after this list:

  • Storing embeddings and their metadata
  • Vector search
  • Full-text search
  • Document storage
  • Metadata filtering
  • Multi-modal retrieval
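
For illustration, here is a minimal sketch that exercises document storage, vector search, and metadata filtering using the chromadb Python client directly (the path, collection name, documents, and metadata values are placeholders chosen for this example, not part of the Llama Stack provider):

import chromadb

# Open (or create) a persistent Chroma database on disk.
client = chromadb.PersistentClient(path="./demo_chroma_db")

# A collection holds documents, their embeddings, and metadata.
collection = client.get_or_create_collection(name="demo")

collection.add(
    ids=["doc-1", "doc-2"],
    documents=[
        "Chroma stores documents alongside their embeddings.",
        "Metadata filters can narrow a vector search.",
    ],
    metadatas=[{"topic": "storage"}, {"topic": "search"}],
)

# Vector search combined with a metadata filter.
results = collection.query(
    query_texts=["How are documents stored?"],
    n_results=1,
    where={"topic": "storage"},
)
print(results["documents"])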

Usage

To use Chroma in your Llama Stack project, follow these steps:

  1. Install the necessary dependencies.
  2. Configure your Llama Stack project to use Chroma.
  3. Start storing and querying vectors (see the sketch after this list).
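
As a rough sketch of steps 2–3, assuming a running Llama Stack server and the llama_stack_client package (the base URL, vector DB id, embedding model, and provider id below are placeholders, and the exact client API may differ between releases):

from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")

# Register a vector DB backed by the Chroma provider (ids and model are placeholders).
client.vector_dbs.register(
    vector_db_id="my_documents",
    embedding_model="all-MiniLM-L6-v2",
    embedding_dimension=384,
    provider_id="chromadb",
)

# Store a chunk, then query it back.
client.vector_io.insert(
    vector_db_id="my_documents",
    chunks=[{"content": "Chroma keeps vectors on disk.", "metadata": {"document_id": "doc-1"}}],
)
response = client.vector_io.query(vector_db_id="my_documents", query="Where are vectors kept?")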

Installation

You can install Chroma using pip:

pip install chromadb
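
If you want a quick sanity check that the package is importable (optional, just a convenience):

python -c "import chromadb; print(chromadb.__version__)"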

Documentation

See Chroma's documentation (https://docs.trychroma.com/) for more details about Chroma in general.

Configuration

Field        Type                                                   Required  Default  Description
db_path      str                                                    No
persistence  llama_stack.core.storage.datatypes.KVStoreReference   No                 Config for KV store backend

Sample Configuration

db_path: ${env.CHROMADB_PATH}
persistence:
  namespace: vector_io::chroma
  backend: kv_default
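
In a full Llama Stack run configuration this block typically sits under the vector_io entry of the providers section; the sketch below shows one plausible placement (the provider_id value is an arbitrary label chosen for this example):

providers:
  vector_io:
    - provider_id: chromadb
      provider_type: inline::chromadb
      config:
        db_path: ${env.CHROMADB_PATH}
        persistence:
          namespace: vector_io::chroma
          backend: kv_default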