🆕 Databricks

LiteLLM supports all models on Databricks

Usage

ENV VAR

import os 
os.environ["DATABRICKS_API_KEY"] = ""
os.environ["DATABRICKS_API_BASE"] = ""

Example Call

from litellm import completion
import os
## set ENV variables
os.environ["DATABRICKS_API_KEY"] = "databricks key"
os.environ["DATABRICKS_API_BASE"] = "databricks base url" # e.g.: https://adb-3064715882934586.6.azuredatabricks.net/serving-endpoints

# databricks dbrx call
response = completion(
    model="databricks/databricks-dbrx-instruct", 
    messages = [{ "content": "Hello, how are you?","role": "user"}]
)
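
The call returns an OpenAI-compatible response object, so the generated text can be read as shown below (a minimal sketch; the printed content depends on the model's reply):

# response follows the OpenAI chat-completion format
print(response.choices[0].message.content)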

Passing additional params - max_tokens, temperature

See all litellm.completion supported params here

# !pip install litellm
from litellm import completion
import os
## set ENV variables
os.environ["DATABRICKS_API_KEY"] = "databricks key"
os.environ["DATABRICKS_API_BASE"] = "databricks api base"

# databricks dbrx call
response = completion(
    model="databricks/databricks-dbrx-instruct", 
    messages = [{ "content": "Hello, how are you?","role": "user"}],
    max_tokens=20,
    temperature=0.5
)

Proxy

  model_list:
    - model_name: dbrx-instruct
      litellm_params:
        model: databricks/databricks-dbrx-instruct
        api_key: os.environ/DATABRICKS_API_KEY
        api_base: os.environ/DATABRICKS_API_BASE
        max_tokens: 20
        temperature: 0.5
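
With the config above saved as config.yaml, the proxy can be started with litellm --config config.yaml and called like any OpenAI-compatible endpoint. A minimal sketch, assuming the proxy is running locally on the default port 4000 and exposes the model_name dbrx-instruct from the config:

import openai

client = openai.OpenAI(
    api_key="anything",             # the proxy holds the real Databricks key
    base_url="http://0.0.0.0:4000"  # assumes the default local proxy address
)

response = client.chat.completions.create(
    model="dbrx-instruct",  # model_name from the config above
    messages=[{"content": "Hello, how are you?", "role": "user"}]
)
print(response.choices[0].message.content)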

Passing Databricks specific params - 'instruction'

For embedding models, Databricks lets you pass in an additional param 'instruction'. Full Spec

# !pip install litellm
from litellm import embedding
import os
## set ENV variables
os.environ["DATABRICKS_API_KEY"] = "databricks key"
os.environ["DATABRICKS_API_BASE"] = "databricks url"

# databricks bge call
response = embedding(
    model="databricks/databricks-bge-large-en",
    input=["good morning from litellm"],
    instruction="Represent this sentence for searching relevant passages:",
)

Proxy

  model_list:
    - model_name: bge-large
      litellm_params:
        model: databricks/databricks-bge-large-en
        api_key: os.environ/DATABRICKS_API_KEY
        api_base: os.environ/DATABRICKS_API_BASE
        instruction: "Represent this sentence for searching relevant passages:"
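
The same pattern works for embeddings through the proxy. A minimal sketch, assuming the proxy is running locally on the default port 4000 and exposes the model_name bge-large from the config above:

import openai

client = openai.OpenAI(
    api_key="anything",             # the proxy holds the Databricks credentials
    base_url="http://0.0.0.0:4000"  # assumes the default local proxy address
)

response = client.embeddings.create(
    model="bge-large",  # model_name from the config above
    input=["good morning from litellm"]
)
print(response.data[0].embedding[:5])  # first few values of the embedding vector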

Supported Databricks Chat Completion Models

Here's an example of using Databricks models with LiteLLM:

| Model Name | Command |
|------------|---------|
| databricks-dbrx-instruct | completion(model='databricks/databricks-dbrx-instruct', messages=messages) |
| databricks-meta-llama-3-70b-instruct | completion(model='databricks/databricks-meta-llama-3-70b-instruct', messages=messages) |
| databricks-llama-2-70b-chat | completion(model='databricks/databricks-llama-2-70b-chat', messages=messages) |
| databricks-mixtral-8x7b-instruct | completion(model='databricks/databricks-mixtral-8x7b-instruct', messages=messages) |
| databricks-mpt-30b-instruct | completion(model='databricks/databricks-mpt-30b-instruct', messages=messages) |
| databricks-mpt-7b-instruct | completion(model='databricks/databricks-mpt-7b-instruct', messages=messages) |
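
Any command from the table can be dropped into a script. A minimal sketch using databricks-meta-llama-3-70b-instruct from the table, assuming the env vars are set as shown earlier and with an illustrative prompt:

from litellm import completion

messages = [{"content": "Hello, how are you?", "role": "user"}]

# databricks meta-llama-3-70b call
response = completion(
    model="databricks/databricks-meta-llama-3-70b-instruct",
    messages=messages
)
print(response.choices[0].message.content)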

Supported Databricks Embedding Models

Here's an example of using Databricks embedding models with LiteLLM:

| Model Name | Command |
|------------|---------|
| databricks-bge-large-en | embedding(model='databricks/databricks-bge-large-en', input=input) |