This notebook demonstrates how to use UC functions as LangChain tools with both the LangChain and LangGraph agent APIs. To learn how to create SQL or Python functions in UC, see the Databricks documentation (AWS|Azure|GCP). Do not omit function and parameter comments, which are critical for the LLM to call functions properly. In this example notebook, we create a simple Python function that executes arbitrary code and use it as a LangChain tool:
CREATE FUNCTION main.tools.python_exec (
  code STRING COMMENT 'Python code to execute. Remember to print the final result to stdout.'
)
RETURNS STRING
LANGUAGE PYTHON
COMMENT 'Executes Python code and returns its stdout.'
AS $$
  import sys
  from io import StringIO
  stdout = StringIO()
  sys.stdout = stdout
  exec(code)
  return stdout.getvalue()
$$
This executes within a secure, isolated environment inside a Databricks SQL warehouse.
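To make the function body's behavior concrete, here is a minimal local sketch (this is not the UC sandbox, just plain Python) of what the handler above does: it redirects stdout into a buffer, runs the supplied code with exec, and returns whatever was printed.

```python
# Local sketch of the python_exec function body above (NOT the UC runtime):
# capture stdout produced by exec() and return it as a string.
import sys
from io import StringIO

def python_exec(code: str) -> str:
    stdout = StringIO()
    original = sys.stdout
    sys.stdout = stdout
    try:
        exec(code)
    finally:
        sys.stdout = original  # always restore the real stdout
    return stdout.getvalue()

print(python_exec("print(1 + 2)"))  # → 3
```

This is why the function's parameter comment tells the LLM to print the final result: only output written to stdout makes it into the return value.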
pip install -qU databricks-sdk langchain-community databricks-langchain langgraph mlflow
Note: you may need to restart the kernel to use updated packages.
from databricks_langchain import ChatDatabricks

llm = ChatDatabricks(endpoint="databricks-meta-llama-3-70b-instruct")
from databricks_langchain.uc_ai import (
    DatabricksFunctionClient,
    UCFunctionToolkit,
    set_uc_function_client,
)

client = DatabricksFunctionClient()
set_uc_function_client(client)

tools = UCFunctionToolkit(
    # Include functions as tools using their qualified names.
    # You can use "{catalog_name}.{schema_name}.*" to get all functions in a schema.
    function_names=["main.tools.python_exec"]
).tools
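Note that the tool names in the agent outputs below appear as `main__tools__python_exec` rather than `main.tools.python_exec`: qualified UC names contain dots, which are not valid in tool names for most LLM providers. A plausible sketch of the mapping (my illustration, not the toolkit's actual implementation):

```python
# Illustration only: map a UC qualified name to the sanitized tool name
# seen in the agent traces, replacing dots with double underscores.
def to_tool_name(qualified_name: str) -> str:
    return qualified_name.replace(".", "__")

print(to_tool_name("main.tools.python_exec"))  # → main__tools__python_exec
```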
(Optional) To increase the retry timeout when waiting for a function execution response, set the environment variable UC_TOOL_CLIENT_EXECUTION_TIMEOUT. The default retry timeout is 120 seconds.


import os

os.environ["UC_TOOL_CLIENT_EXECUTION_TIMEOUT"] = "200"
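A sketch of how such a client-side setting is typically resolved (my illustration, assuming the documented 120-second default): use the override when the variable is set, otherwise fall back to the default.

```python
# Illustration: resolve the execution timeout from an environment mapping,
# falling back to the documented default of 120 seconds when unset.
def resolve_timeout(env: dict) -> int:
    return int(env.get("UC_TOOL_CLIENT_EXECUTION_TIMEOUT", "120"))

print(resolve_timeout({}))                                          # → 120
print(resolve_timeout({"UC_TOOL_CLIENT_EXECUTION_TIMEOUT": "200"}))  # → 200
```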

LangGraph agent example

from langchain.agents import create_agent


agent = create_agent(
    llm,
    tools,
    system_prompt="You are a helpful assistant. Make sure to use tools for information.",
)
agent.invoke({"messages": [{"role": "user", "content": "36939 * 8922.4"}]})
{'messages': [HumanMessage(content='36939 * 8922.4', additional_kwargs={}, response_metadata={}, id='1a10b10b-8e37-48c7-97a1-cac5006228d5'),
  AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_a8f3986f-4b91-40a3-8d6d-39f431dab69b', 'type': 'function', 'function': {'name': 'main__tools__python_exec', 'arguments': '{"code": "print(36939 * 8922.4)"}'}}]}, response_metadata={'prompt_tokens': 771, 'completion_tokens': 29, 'total_tokens': 800}, id='run-865c3613-20ba-4e80-afc8-fde1cfb26e5a-0', tool_calls=[{'name': 'main__tools__python_exec', 'args': {'code': 'print(36939 * 8922.4)'}, 'id': 'call_a8f3986f-4b91-40a3-8d6d-39f431dab69b', 'type': 'tool_call'}]),
  ToolMessage(content='{"format": "SCALAR", "value": "329584533.59999996\\n", "truncated": false}', name='main__tools__python_exec', id='8b63d4c8-1a3d-46a5-a719-393b2ef36770', tool_call_id='call_a8f3986f-4b91-40a3-8d6d-39f431dab69b'),
  AIMessage(content='The result of the multiplication is:\n\n329584533.59999996', additional_kwargs={}, response_metadata={'prompt_tokens': 846, 'completion_tokens': 22, 'total_tokens': 868}, id='run-22772404-611b-46e4-9956-b85e4a385f0f-0')]}

LangChain agent example

from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a helpful assistant. Make sure to use tools for information.",
        ),
        ("placeholder", "{chat_history}"),
        ("human", "{input}"),
        ("placeholder", "{agent_scratchpad}"),
    ]
)

agent = create_tool_calling_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
agent_executor.invoke({"input": "36939 * 8922.4"})
> Entering new AgentExecutor chain...

Invoking: `main__tools__python_exec` with `{'code': 'print(36939 * 8922.4)'}`


{"format": "SCALAR", "value": "329584533.59999996\n", "truncated": false}The result of the multiplication is:

329584533.59999996

> Finished chain.
{'input': '36939 * 8922.4',
 'output': 'The result of the multiplication is:\n\n329584533.59999996'}
