Learn how to use LangChain with models on Predibase.

Setup

  • Create a Predibase account and generate an API key.
  • Install the Predibase Python client with pip install predibase.
  • Authenticate using your API key.

LLM

Predibase integrates with LangChain by implementing the LLM module. You can see a simple example below, or view the full notebook under LLM > Integrations > Predibase.
import os
os.environ["PREDIBASE_API_TOKEN"] = "{PREDIBASE_API_TOKEN}"

from langchain_community.llms import Predibase

# Optionally use `model_kwargs` to set new default "generate()" settings. For example:
model_kwargs = {
    "api_token": os.environ.get("HUGGING_FACE_HUB_TOKEN"),
    "max_new_tokens": 5,  # default is 256
}

model = Predibase(
    model="mistral-7b",
    predibase_api_key=os.environ.get("PREDIBASE_API_TOKEN"),
    predibase_sdk_version=None,  # optional parameter (defaults to the latest Predibase SDK version if omitted)
    **model_kwargs,
)

# Optionally use `kwargs` to dynamically overwrite "generate()" settings. For example:
kwargs = {
    "temperature": 0.5,  # default is the value in model_kwargs or 0.1 (initialization default)
    "max_new_tokens": 1024,  # default is the value in model_kwargs or 256 (initialization default)
}

response = model.invoke("Can you recommend me a nice dry wine?", **kwargs)
print(response)
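Because the Predibase LLM implements LangChain's Runnable interface, it also composes directly with prompt templates. Below is a minimal sketch; the template text and input are illustrative, not part of Predibase's documentation:

from langchain_core.prompts import PromptTemplate

# A hypothetical single-variable prompt; any template works the same way.
prompt = PromptTemplate.from_template("Suggest a wine pairing for {dish}.")

# PromptTemplate | LLM composes into a runnable chain (LCEL).
chain = prompt | model
print(chain.invoke({"dish": "grilled salmon"}))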
Predibase also supports Predibase-hosted and HuggingFace-hosted adapters that are fine-tuned on the base model given by the model argument:
import os
os.environ["PREDIBASE_API_TOKEN"] = "{PREDIBASE_API_TOKEN}"

from langchain_community.llms import Predibase

# Optionally use `model_kwargs` to set new default "generate()" settings. For example:
model_kwargs = {
    "api_token": os.environ.get("HUGGING_FACE_HUB_TOKEN"),
    "max_new_tokens": 5,  # default is 256
}

# The fine-tuned adapter is hosted at Predibase (adapter_version must be specified).
model = Predibase(
    model="mistral-7b",
    predibase_api_key=os.environ.get("PREDIBASE_API_TOKEN"),
    predibase_sdk_version=None,  # optional parameter (defaults to the latest Predibase SDK version if omitted)
    adapter_id="e2e_nlg",
    adapter_version=1,
    **model_kwargs,
)

# Optionally use `kwargs` to dynamically overwrite "generate()" settings. For example:
kwargs = {
    "temperature": 0.5,  # default is the value in model_kwargs or 0.1 (initialization default)
    "max_new_tokens": 1024,  # default is the value in model_kwargs or 256 (initialization default)
}

response = model.invoke("Can you recommend me a nice dry wine?", **kwargs)
print(response)
Predibase also supports HuggingFace-hosted adapters that are fine-tuned on the base model given by the model argument:
import os
os.environ["PREDIBASE_API_TOKEN"] = "{PREDIBASE_API_TOKEN}"

from langchain_community.llms import Predibase

# Optionally use `model_kwargs` to set new default "generate()" settings. For example:
model_kwargs = {
    "api_token": os.environ.get("HUGGING_FACE_HUB_TOKEN"),
    "max_new_tokens": 5,  # default is 256
}

# The fine-tuned adapter is hosted at HuggingFace (adapter_version does not apply and will be ignored).
model = Predibase(
    model="mistral-7b",
    predibase_api_key=os.environ.get("PREDIBASE_API_TOKEN"),
    predibase_sdk_version=None,  # optional parameter (defaults to the latest Predibase SDK version if omitted)
    adapter_id="predibase/e2e_nlg",
    **model_kwargs,
)

# Optionally use `kwargs` to dynamically overwrite "generate()" settings. For example:
kwargs = {
    "temperature": 0.5,  # default is the value in model_kwargs or 0.1 (initialization default)
    "max_new_tokens": 1024,  # default is the value in model_kwargs or 256 (initialization default)
}

response = model.invoke("Can you recommend me a nice dry wine?", **kwargs)
print(response)
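The generic Runnable batch API applies to any of the configurations above as well. A minimal sketch, assuming the model object from the previous example; the prompts and concurrency setting are illustrative:

# `.batch()` comes from LangChain's shared Runnable interface, not Predibase-specific code.
prompts = [
    "Name a dry white wine.",
    "Name a dry red wine.",
]
responses = model.batch(prompts, config={"max_concurrency": 2})
for response in responses:
    print(response)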
