This documentation page is also available as an interactive notebook: you can launch it in Kaggle or Colab, or download it for use with an IDE or local Jupyter installation.
Pixeltable’s OpenRouter integration lets you access models from many LLM providers through OpenRouter’s single, unified API.

Prerequisites

Important Notes

  • OpenRouter usage may incur costs based on the models you use and your usage volume.
  • Be mindful of sensitive data and consider security measures when integrating with external services.
First you’ll need to install the required libraries and enter your OpenRouter API key. (OpenRouter exposes an OpenAI-compatible API, so the integration uses the `openai` client library alongside `pixeltable`.)
%pip install -qU pixeltable openai
import os
import getpass

if 'OPENROUTER_API_KEY' not in os.environ:
    os.environ['OPENROUTER_API_KEY'] = getpass.getpass('Enter your OpenRouter API key:')
Now let’s create a Pixeltable directory to hold the tables for our demo.
import pixeltable as pxt

# Remove the 'openrouter_demo' directory and its contents, if it exists
pxt.drop_dir('openrouter_demo', force=True)
pxt.create_dir('openrouter_demo')
Connected to Pixeltable database at: postgresql+psycopg://postgres:@/pixeltable?host=/Users/asiegel/.pixeltable/pgdata
Created directory ‘openrouter_demo’.

Chat Completions

Create a Table: In Pixeltable, create a table with columns for your input data, then add computed columns to store the results from OpenRouter.
from pixeltable.functions import openrouter

# Create a table in Pixeltable and add a computed column that calls OpenRouter
t = pxt.create_table('openrouter_demo.chat', {'input': pxt.String})

messages = [{'role': 'user', 'content': t.input}]

t.add_computed_column(output=openrouter.chat_completions(
    messages=messages,
    model='anthropic/claude-3.5-sonnet',
    model_kwargs={
        # Optional dict with parameters compatible with the model
        'max_tokens': 300,
        'temperature': 0.7
    }
))
Created table ‘chat’.
Added 0 column values with 0 errors.
No rows affected.
# Parse the response into a new column
t.add_computed_column(response=t.output.choices[0].message.content)
Added 0 column values with 0 errors.
No rows affected.
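The `output` column stores the full JSON response. OpenRouter follows the OpenAI chat-completions schema, which is why the answer text is extracted at the path `choices[0].message.content`. A minimal sketch of the response shape (the field values here are illustrative, not real output):

```python
# Illustrative OpenRouter/OpenAI-style chat-completion payload
sample_output = {
    'id': 'gen-abc123',  # hypothetical response id
    'model': 'anthropic/claude-3.5-sonnet',
    'choices': [
        {
            'index': 0,
            'message': {
                'role': 'assistant',
                'content': 'There are about 40 species of felids.',
            },
            'finish_reason': 'stop',
        }
    ],
    'usage': {'prompt_tokens': 12, 'completion_tokens': 11, 'total_tokens': 23},
}

# The computed column t.output.choices[0].message.content follows this path:
answer = sample_output['choices'][0]['message']['content']
print(answer)  # -> There are about 40 species of felids.
```

Because the whole payload is stored, other fields such as `usage` remain queryable later (e.g. for token accounting) without re-calling the API.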
# Insert some example prompts
t.insert([
    {'input': 'How many species of felids have been classified?'},
    {'input': 'Can you make me a coffee?'}
])
t.select(t.input, t.response).head()
Inserting rows into `chat`: 2 rows [00:00, 166.89 rows/s]
Inserted 2 rows with 0 errors.

Using Different Models

One of OpenRouter’s key benefits is easy access to models from multiple providers. Let’s create a table that compares responses from Anthropic’s Claude 3.5 Sonnet, OpenAI’s GPT-4o Mini, and Meta’s Llama 3.1.
# Create a table to compare different models
compare_t = pxt.create_table('openrouter_demo.compare_models', {'prompt': pxt.String})

messages = [{'role': 'user', 'content': compare_t.prompt}]

# Add responses from different models
compare_t.add_computed_column(
    claude=openrouter.chat_completions(
        messages=messages,
        model='anthropic/claude-3.5-sonnet',
        model_kwargs={'max_tokens': 150}
    ).choices[0].message.content
)

compare_t.add_computed_column(
    gpt4=openrouter.chat_completions(
        messages=messages,
        model='openai/gpt-4o-mini',
        model_kwargs={'max_tokens': 150}
    ).choices[0].message.content
)

compare_t.add_computed_column(
    llama=openrouter.chat_completions(
        messages=messages,
        model='meta-llama/llama-3.1-8b-instruct',
        model_kwargs={'max_tokens': 150}
    ).choices[0].message.content
)

Created table ‘compare_models’.
Added 0 column values with 0 errors.
Added 0 column values with 0 errors.
Added 0 column values with 0 errors.
No rows affected.
# Insert a prompt and compare responses
compare_t.insert([{'prompt': 'Explain quantum entanglement in one sentence.'}])
compare_t.select(compare_t.prompt, compare_t.claude, compare_t.gpt4, compare_t.llama).head()
Inserting rows into `compare_models`: 1 rows [00:00, 131.36 rows/s]
Inserted 1 row with 0 errors.

Advanced Features: Provider Routing

OpenRouter lets you specify provider preferences, controlling the order in which providers are tried and whether fallbacks to other providers are allowed, which is useful for both reliability and cost optimization.
# Create a table with provider routing
routing_t = pxt.create_table('openrouter_demo.routing', {'input': pxt.String})

messages = [{'role': 'user', 'content': routing_t.input}]
routing_t.add_computed_column(
    output=openrouter.chat_completions(
        messages=messages,
        model='anthropic/claude-3.5-sonnet',
        model_kwargs={'max_tokens': 300},
        # Specify provider preferences
        provider={
            'order': ['Anthropic', 'OpenAI'],  # Try Anthropic first, then OpenAI
            'allow_fallbacks': True
        }
    )
)

routing_t.add_computed_column(response=routing_t.output.choices[0].message.content)
Created table ‘routing’.
Added 0 column values with 0 errors.
Added 0 column values with 0 errors.
No rows affected.
routing_t.insert([{'input': 'What are the primary colors?'}])
routing_t.select(routing_t.input, routing_t.response).head()
Inserting rows into `routing`: 1 rows [00:00, 142.11 rows/s]
Inserted 1 row with 0 errors.
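OpenRouter responses also report which provider actually served the request (a top-level `provider` field in the JSON), which is handy for verifying that routing behaved as expected. A minimal sketch with an illustrative payload (field values are made up):

```python
# Illustrative OpenRouter response fragment showing the provider field
sample_response = {
    'provider': 'Anthropic',  # hypothetical: the provider that served this request
    'model': 'anthropic/claude-3.5-sonnet',
    'choices': [
        {'message': {'role': 'assistant', 'content': 'Red, yellow, and blue.'}}
    ],
}

served_by = sample_response['provider']
print(served_by)  # -> Anthropic
```

In Pixeltable you could surface this the same way as the response text, e.g. with a computed column like `routing_t.output.provider` (assuming the field is present in the stored JSON).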

Advanced Features: Context Window Optimization

OpenRouter supports prompt transforms such as ‘middle-out’, which compresses prompts that would otherwise exceed a model’s context window.
# Create a table with transforms for long context optimization
transform_t = pxt.create_table('openrouter_demo.transforms', {'long_context': pxt.String})

messages = [{'role': 'user', 'content': transform_t.long_context}]
transform_t.add_computed_column(
    output=openrouter.chat_completions(
        messages=messages,
        model='openai/gpt-4o-mini',
        model_kwargs={'max_tokens': 200},
        # Apply middle-out transform for better long context handling
        transforms=['middle-out']
    )
)

transform_t.add_computed_column(response=transform_t.output.choices[0].message.content)
Created table ‘transforms’.
Added 0 column values with 0 errors.
Added 0 column values with 0 errors.
No rows affected.
# Example with longer context
long_text = """
Artificial intelligence has transformed many industries. Machine learning algorithms 
can now detect patterns in data that humans might miss. Deep learning has revolutionized 
computer vision and natural language processing. The future of AI looks promising with 
developments in areas like reinforcement learning and generative models.

Question: What are the main AI developments mentioned?
"""

transform_t.insert([{'long_context': long_text}])
transform_t.select(transform_t.response).head()

Inserting rows into `transforms`: 1 rows [00:00, 123.46 rows/s]
Inserted 1 row with 0 errors.
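Conceptually, ‘middle-out’ keeps the beginning and end of an over-long prompt and compresses or drops the middle, since models tend to attend most reliably to those regions. A toy illustration of the idea (not OpenRouter’s actual algorithm):

```python
def middle_out(tokens, budget):
    """Toy middle-out truncation: keep the head and tail, drop the middle."""
    if len(tokens) <= budget:
        return tokens
    head = budget // 2
    tail = budget - head
    return tokens[:head] + tokens[-tail:]

tokens = list(range(10))
print(middle_out(tokens, 4))  # -> [0, 1, 8, 9]
```

In practice you don’t implement this yourself: passing `transforms=['middle-out']` asks OpenRouter to apply its own compression server-side when the prompt is too long.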

Learn More

To learn more about advanced techniques like RAG operations in Pixeltable, check out the RAG Operations in Pixeltable tutorial. For more information about OpenRouter’s features and available models, visit:

  • OpenRouter Documentation
  • Available Models

If you have any questions, don’t hesitate to reach out.