This page helps you get started with Goodfire chat models. For detailed documentation of all ChatGoodfire features and configurations, see the PyPI project page, or go directly to the Goodfire SDK docs. All Goodfire-specific functionality (e.g. SAE features, variants) is available via the main goodfire package. This integration wraps the Goodfire SDK.

Overview

Integration details

Class | Package | Local | Serializable | JS support | Downloads | Version
ChatGoodfire | langchain-goodfire | ❌ | ❌ | ❌ | PyPI - Downloads | PyPI - Version

Model features

Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs

Setup

To access Goodfire models, you'll need to create a Goodfire account, get an API key, and install the langchain-goodfire integration package.

Credentials

Head to Goodfire Settings to sign up for Goodfire and generate an API key. Once you've done this, set the GOODFIRE_API_KEY environment variable.
import getpass
import os

if not os.getenv("GOODFIRE_API_KEY"):
    os.environ["GOODFIRE_API_KEY"] = getpass.getpass("Enter your Goodfire API key: ")
To enable automated tracing of your model calls, set your LangSmith API key:
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")

Installation

The LangChain Goodfire integration lives in the langchain-goodfire package:
pip install -qU langchain-goodfire
Note: you may need to restart the kernel to use updated packages.

Instantiation

Now we can instantiate our model object and generate chat completions:
import goodfire
from langchain_goodfire import ChatGoodfire

base_variant = goodfire.Variant("meta-llama/Llama-3.3-70B-Instruct")

llm = ChatGoodfire(
    model=base_variant,
    temperature=0,
    max_completion_tokens=1000,
    seed=42,
)
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.

Invocation

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = await llm.ainvoke(messages)
ai_msg
AIMessage(content="J'adore la programmation.", additional_kwargs={}, response_metadata={}, id='run-8d43cf35-bce8-4827-8935-c64f8fb78cd0-0', usage_metadata={'input_tokens': 51, 'output_tokens': 39, 'total_tokens': 90})
print(ai_msg.content)
J'adore la programmation.
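
You can also chain the model with a prompt template, as with any other LangChain chat model. The snippet below is a minimal sketch; the prompt wording and input values are illustrative and not taken from the Goodfire docs.
from langchain_core.prompts import ChatPromptTemplate

# Illustrative translation prompt; any ChatPromptTemplate composes with llm the same way.
prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm
await chain.ainvoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)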

Goodfire-specific functionality

To use Goodfire-specific functionality such as SAE features and variants, you can use the goodfire package directly.
client = goodfire.Client(api_key=os.environ["GOODFIRE_API_KEY"])

pirate_features = client.features.search(
    "assistant should roleplay as a pirate", base_variant
)
pirate_features
FeatureGroup([
   0: "The assistant should adopt the persona of a pirate",
   1: "The assistant should roleplay as a pirate",
   2: "The assistant should engage with pirate-themed content or roleplay as a pirate",
   3: "The assistant should roleplay as a character",
   4: "The assistant should roleplay as a specific character",
   5: "The assistant should roleplay as a game character or NPC",
   6: "The assistant should roleplay as a human character",
   7: "Requests for the assistant to roleplay or pretend to be something else",
   8: "Requests for the assistant to roleplay or pretend to be something",
   9: "The assistant is being assigned a role or persona to roleplay"
])
pirate_variant = goodfire.Variant("meta-llama/Llama-3.3-70B-Instruct")

pirate_variant.set(pirate_features[0], 0.4)
pirate_variant.set(pirate_features[1], 0.3)

await llm.ainvoke("Tell me a joke", model=pirate_variant)
AIMessage(content='Why did the scarecrow win an award? Because he was outstanding in his field! Arrr! Hope that made ye laugh, matey!', additional_kwargs={}, response_metadata={}, id='run-7d8bd30f-7f80-41cb-bdb6-25c29c22a7ce-0', usage_metadata={'input_tokens': 35, 'output_tokens': 60, 'total_tokens': 95})
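
If you want every call to use the steered variant by default, you can also pass it when constructing the chat model instead of overriding the model per call. A minimal sketch, reusing the parameters from the instantiation above:
# Assumption: constructing with the steered variant behaves the same as
# passing model=pirate_variant on each ainvoke call.
pirate_llm = ChatGoodfire(
    model=pirate_variant,
    temperature=0,
    max_completion_tokens=1000,
    seed=42,
)

await pirate_llm.ainvoke("Tell me a joke")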

API reference

For detailed documentation of all ChatGoodfire features and configurations, head to the API reference.