List of Available Distributions
Here is a list of the distributions, provided out of the box, that you can use to start a Llama Stack server.
Selection of a Distribution / Template
Which template / distribution to choose depends on the hardware you have available for running LLM inference.
Do you want a hosted Llama Stack endpoint? If so, we suggest leveraging one of our partners who host Llama Stack endpoints, namely fireworks.ai and together.xyz. Read more under Remote-Hosted Endpoints.
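For example, once a partner has provisioned an endpoint for you, you can point the llama-stack-client CLI at it. This is a minimal sketch; the endpoint URL is a placeholder for whatever your provider gives you:

```bash
# Install the client CLI (assumes pip is available)
pip install llama-stack-client

# Point the client at your hosted endpoint (placeholder URL)
llama-stack-client configure --endpoint https://<your-hosted-endpoint>

# Sanity check: list the models the endpoint serves
llama-stack-client models list
```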
Do you have access to machines with GPUs? If you wish to run Llama Stack locally or on a cloud instance and host your own Llama Stack endpoint, we suggest the meta-reference-gpu or tgi templates.
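As a rough sketch, standing up a GPU-backed endpoint with the meta-reference-gpu template looks like the following; the image type and port are illustrative choices, so adjust them to your environment:

```bash
# Build a distribution from the meta-reference-gpu template
llama stack build --template meta-reference-gpu --image-type conda

# Start the server on a port of your choosing
llama stack run meta-reference-gpu --port 5001
```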
Are you running on a "regular" desktop or laptop? We suggest the ollama template for quick prototyping and getting started without having to worry about needing GPUs.
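For instance, assuming ollama is installed and has pulled a model (the model tag below is illustrative), a quick-start could look like:

```bash
# Load a model in ollama so the stack can use it (illustrative tag)
ollama run llama3.1:8b

# In another terminal: build and run the ollama-backed distribution
llama stack build --template ollama --image-type conda
llama stack run ollama --port 5001
```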
Do you have an API key for a remote inference provider like Fireworks, Together, etc.? If so, we suggest the matching template, such as fireworks or together.
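Here is a sketch using a Fireworks API key, passing the key through with the documented --env convention (check the template's guide for the exact variable name):

```bash
# Build the fireworks-backed distribution
llama stack build --template fireworks --image-type conda

# Run it, forwarding the provider API key to the server
llama stack run fireworks --port 5001 --env FIREWORKS_API_KEY=<your-key>
```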
Do you want to run Llama Stack inference on your iOS / Android device? We also provide templates for on-device inference: an iOS SDK is available today, with Android support coming soon.
If none of the above fit your needs, you can also build your own custom distribution.
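Custom builds go through the same CLI: running it with no template starts an interactive flow, or you can point it at a build config you author yourself (the file name below is hypothetical):

```bash
# Interactive: prompts for a distribution name, providers, etc.
llama stack build

# Or non-interactive, from your own build configuration
llama stack build --config my-custom-build.yaml
```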