---
title: Together AI
---
[Together AI](https://www.together.ai/) provides an API to query [50+ leading open-source models](https://docs.together.ai/docs/inference-models) in a couple lines of code.

This example goes over how to use LangChain to interact with Together AI models.
## Installation

```bash
pip install -U langchain-together
```

## Environment

To use Together AI, you'll need an API key, which you can find here: [api.together.ai/settings/api-keys](https://api.together.ai/settings/api-keys). This can be passed in as the init parameter `together_api_key` or set as the environment variable `TOGETHER_API_KEY`.
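For example, to set the environment variable in your shell before running the examples below (the key value shown is a placeholder):

```shell
# Make the Together AI API key available to the current shell session
export TOGETHER_API_KEY="your-api-key-here"
```

Code that reads `TOGETHER_API_KEY` from the environment will then pick it up automatically, without hardcoding the key in source files.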
## Example
```python
# Querying chat models with Together AI
from langchain_together import ChatTogether

# choose from our 50+ models here: https://docs.together.ai/docs/inference-models
chat = ChatTogether(
    # together_api_key="YOUR_API_KEY",
    model="meta-llama/Llama-3-70b-chat-hf",
)

# stream the response back from the model
for m in chat.stream("Tell me fun things to do in NYC"):
    print(m.content, end="", flush=True)

# if you don't want streaming, you can use the invoke method
# chat.invoke("Tell me fun things to do in NYC")
```
```python
# Querying code and language models with Together AI
from langchain_together import Together

llm = Together(
    model="codellama/CodeLlama-70b-Python-hf",
    # together_api_key="..."
)

print(llm.invoke("def bubble_sort(): "))
```