This page covers how to use the Prediction Guard ecosystem within LangChain. It is split into two parts: installation and setup, then references to the specific Prediction Guard wrappers. This integration is maintained in the langchain-predictionguard package.

## Installation and setup

- Install the Prediction Guard LangChain partner package:

  ```bash
  pip install langchain-predictionguard
  ```

- Get a Prediction Guard API key (as described here) and set it as an environment variable (`PREDICTIONGUARD_API_KEY`).
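For example, on a Unix-like shell (the key value below is a placeholder):

```shell
# Replace the placeholder with your actual Prediction Guard API key
export PREDICTIONGUARD_API_KEY="<your-api-key>"
```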

## Prediction Guard LangChain integrations

| API | Description | Endpoint docs | Import | Usage example |
| --- | --- | --- | --- | --- |
| Chat | Build chat bots | Chat | `from langchain_predictionguard import ChatPredictionGuard` | ChatPredictionGuard.ipynb |
| Completions | Generate text | Completions | `from langchain_predictionguard import PredictionGuard` | PredictionGuard.ipynb |
| Text Embedding | Embed strings to vectors | Embeddings | `from langchain_predictionguard import PredictionGuardEmbeddings` | PredictionGuardEmbeddings.ipynb |

## Getting started

### Chat Model

#### Prediction Guard Chat

See a usage example.

```python
from langchain_predictionguard import ChatPredictionGuard
```

Usage:

```python
# If predictionguard_api_key is not passed, the default behavior is to use
# the `PREDICTIONGUARD_API_KEY` environment variable.
chat = ChatPredictionGuard(model="Hermes-3-Llama-3.1-8B")

chat.invoke("Tell me a joke")
```

### Embedding Model

#### Prediction Guard Embedding

See a usage example.

```python
from langchain_predictionguard import PredictionGuardEmbeddings
```

Usage:

```python
# If predictionguard_api_key is not passed, the default behavior is to use
# the `PREDICTIONGUARD_API_KEY` environment variable.
embeddings = PredictionGuardEmbeddings(model="bridgetower-large-itm-mlm-itc")

text = "This is an embedding example."
output = embeddings.embed_query(text)
```
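A query embedding comes back as a plain list of floats, and a common next step is comparing two embeddings with cosine similarity. A minimal standard-library sketch (the toy vectors below are stand-ins for real model output, which is much higher-dimensional):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the product of the norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for embeddings returned by embed_query
v1 = [1.0, 0.0, 1.0]
v2 = [1.0, 1.0, 0.0]
print(cosine_similarity(v1, v2))  # → 0.5
```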

### LLM

#### Prediction Guard LLM

See a usage example.

```python
from langchain_predictionguard import PredictionGuard
```

Usage:

```python
# If predictionguard_api_key is not passed, the default behavior is to use
# the `PREDICTIONGUARD_API_KEY` environment variable.
llm = PredictionGuard(model="Hermes-2-Pro-Llama-3-8B")

llm.invoke("Tell me a joke about bears")
```
