Ollama is a popular platform for local serving of LLMs. In this
tutorial, we’ll show how to integrate Ollama models into a Pixeltable
workflow.
Install Ollama
You’ll need a running Ollama server instance to query. There are
several ways to set one up.
Running on a Local Machine
If you’re running this notebook on your own Windows, macOS, or Linux
machine, you can install Ollama from: https://ollama.com/download
Running on Google Colab
Alternatively, if you’re running on Colab, you can install Ollama by
uncommenting and running the following code.
# To install Ollama on colab, uncomment and run the following
# three lines (this will also work on a local Linux machine
# if you don't already have Ollama installed).
# !curl -fsSL https://ollama.com/install.sh | sh
# import subprocess
# ollama_process = subprocess.Popen(['ollama', 'serve'], stderr=subprocess.PIPE)
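After launching the server in the background, it can take a moment before it starts accepting connections. A small helper like the following can poll the port until the server is up (this is a sketch, not part of the Ollama API; the host and port reflect Ollama's default of `127.0.0.1:11434`, and the timeout is an arbitrary choice):

```python
import socket
import time

def wait_for_server(host='127.0.0.1', port=11434, timeout=30.0):
    """Poll until a TCP connection to host:port succeeds, or give up."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True  # server is accepting connections
        except OSError:
            time.sleep(0.5)  # not up yet; retry shortly
    return False

# After starting `ollama serve`, you could wait for it with:
# wait_for_server()
```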
Running on a Remote Ollama Server
Alternatively, if you have access to an Ollama server running remotely,
you can uncomment and run the following lines, replacing the default URL
with the URL of your remote Ollama instance.
# To run the notebook against an instance of Ollama running on a
# remote server, uncomment the following lines and specify the URL.
# import os
# os.environ['OLLAMA_HOST'] = 'http://127.0.0.1:11434'
Once installation is complete, run the following commands to verify that
Ollama is working. The `pull` command downloads the model if it isn’t
already present, so it may take some time.
import ollama
ollama.pull('qwen2.5:0.5b')
ollama.generate('qwen2.5:0.5b', 'What is the capital of Missouri?')['response']
“The capital city of Missouri is Jefferson City. It’s located in the central part of the state and serves as the administrative center for the Midwestern U.S. territory of Missouri. The state is known for its rich history, particularly regarding the Missouri River, which runs through its central parts and provides access to major cities along the border with Illinois.”
Install Pixeltable
Now, let’s install Pixeltable and create a table for the demo.
%pip install -qU pixeltable
import pixeltable as pxt
from pixeltable.functions.ollama import chat
pxt.drop_dir('ollama_demo', force=True)
pxt.create_dir('ollama_demo')
t = pxt.create_table('ollama_demo.chat', {'input': pxt.String})
messages = [{'role': 'user', 'content': t.input}]
t.add_computed_column(output=chat(
messages=messages,
model='qwen2.5:0.5b',
# These parameters are optional and can be used to tune model behavior:
options={'num_predict': 300, 'top_p': 0.9, 'temperature': 0.5},
))
# Extract the response content into a separate column
t.add_computed_column(response=t.output.message.content)
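The expression `t.output.message.content` is a JSON path into the stored chat response: Pixeltable navigates the nested structure much as you would index a Python dict. Sketched on a plain dict (the response shape below mirrors Ollama's chat output, but the literal content string is a made-up example):

```python
# A chat response like the one stored in the `output` column
# (the content string here is a made-up example):
output = {
    'model': 'qwen2.5:0.5b',
    'message': {'role': 'assistant', 'content': 'Jefferson City.'},
}

# t.output.message.content corresponds to:
response = output['message']['content']
print(response)  # -> Jefferson City.
```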
Connected to Pixeltable database at: postgresql+psycopg://postgres:@/pixeltable?host=/Users/asiegel/.pixeltable/pgdata
Created directory `ollama_demo`.
Created table `chat`.
Added 0 column values with 0 errors.
Added 0 column values with 0 errors.
We can now insert our input prompts into the table. As always,
Pixeltable automatically updates the computed columns by calling the
relevant Ollama endpoint.
# Start a conversation
t.insert(input='What are the most popular services for LLM inference?')
t.select(t.input, t.response).show()
Computing cells: 100%|████████████████████████████████████████████| 3/3 [00:02<00:00, 1.18 cells/s]
Inserting rows into `chat`: 1 rows [00:00, 75.39 rows/s]
Computing cells: 100%|████████████████████████████████████████████| 3/3 [00:02<00:00, 1.17 cells/s]
Inserted 1 row with 0 errors.
Learn More
To learn more about advanced techniques such as RAG operations in
Pixeltable, check out the RAG Operations in Pixeltable tutorial.
If you have any questions, don’t hesitate to reach out.