Ollama on Brev

Get Started with Ollama!

Let's launch Ollama on Brev with just one command

First off, what is Ollama?

Ollama is an open-source tool that democratizes LLMs by letting anyone run them locally on their own machine. Ollama simplifies the otherwise complex process of setting up an LLM by packaging model weights, configuration, and data into a single bundle defined by a "Modelfile", which you can download and run on your own computer.
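To make that concrete, here's a minimal sketch of a Modelfile (the base model and settings below are illustrative; see Ollama's documentation for the full syntax):

FROM llama3
PARAMETER temperature 0.7
SYSTEM "You are a concise, helpful assistant."

You'd build and run it with ollama create my-model -f Modelfile, followed by ollama run my-model.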

Why run Ollama on Brev.dev?

Brev lets you easily provision a GPU and set up a Linux VM. That setup is ideal for running multiple, sophisticated models via Ollama, providing a seamless experience from model selection to execution.

Together, Ollama and Brev.dev offer a powerful combination for anyone looking to use LLMs without the traditional complexities of setup and optimization. Let's dive into how to get started with Ollama on Brev!

1. Create an account

Make an account on the Brev console.

2. Install the Brev CLI

In your terminal, download the Brev CLI:

brew install brevdev/homebrew-brev/brev && brev login

Check out the installation instructions if you need help.

3. Launch an instance

Now run the following command to launch Ollama with a specific model:

brev ollama -m <model name>

You can see the full list of available models here.
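For example, to launch an instance running Llama 3 (llama3 is just one illustrative choice from that list):

brev ollama -m llama3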

Hang tight for a couple of minutes while we provision an instance and load Ollama onto it!

4. Use your Ollama endpoint!

Once the instance is ready, we'll print a curl command in your terminal that you can use to hit your Ollama endpoint.
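The CLI gives you the exact command, but as a sketch, a request against Ollama's generate API looks like this (the instance URL is a placeholder, and 11434 is Ollama's default port):

curl http://<your-instance-url>:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?"
}'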

You just deployed Ollama with one command!

Working with Ollama gives you a quick way to get a model running. We'll be adding a lot more support for Ollama in the coming months - if you have any special requests, email us at eng@brev.dev and we'll be sure to add them!

🤙🦙🤙🦙🤙🦙🤙🦙🤙🦙🤙🦙🤙🦙🤙🦙🤙🦙🤙🦙🤙🦙🤙🦙