■ Demonstrates how to use the interrupt_before argument of the StateGraph class's compile method to pause execution before a specific node runs and then resume it.
※ The OPENAI_API_KEY environment variable is defined in the .env file.
※ The TAVILY_API_KEY environment variable is defined in the .env file.
▶ main.py
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from langchain_community.tools.tavily_search import TavilySearchResults
from typing_extensions import TypedDict
from typing import Annotated
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode
from langgraph.prebuilt import tools_condition
from langgraph.graph import START
from langgraph.graph import StateGraph
from langgraph.checkpoint.memory import MemorySaver

load_dotenv()

chatOpenAI = ChatOpenAI(model = "gpt-4o-mini")

tavilySearchResults = TavilySearchResults(max_results = 2)

toolList = [tavilySearchResults]

runnableBinding = chatOpenAI.bind_tools(toolList)

class State(TypedDict):
    messages : Annotated[list, add_messages]

stateGraph = StateGraph(State)

def chat(state : State):
    return {"messages" : [runnableBinding.invoke(state["messages"])]}

stateGraph.add_node("chatbot_node", chat)

toolNode = ToolNode(tools = toolList)

stateGraph.add_node("tools", toolNode)

stateGraph.add_conditional_edges("chatbot_node", tools_condition)

stateGraph.add_edge(START, "chatbot_node")

stateGraph.add_edge("tools", "chatbot_node")

memorySaver = MemorySaver()

compiledStateGraph = stateGraph.compile(
    checkpointer = memorySaver,
    interrupt_before = ["tools"]
)

userInput = "I'm learning LangGraph. Could you do some research on it for me?"

configurableDictionary = {"configurable" : {"thread_id" : "1"}}

generator = compiledStateGraph.stream({"messages" : [("user", userInput)]}, configurableDictionary, stream_mode = "values")

for addableValuesDict in generator:
    if "messages" in addableValuesDict:
        addableValuesDict["messages"][-1].pretty_print()

print()
print("-" * 50)

stateSnapshot = compiledStateGraph.get_state(configurableDictionary)

print(stateSnapshot.next)

print("-" * 50)
print()
print("-" * 50)

lastMessage = stateSnapshot.values["messages"][-1]

print(lastMessage.tool_calls)

print("-" * 50)
print()

# Passing `None` adds nothing new to the current state, so execution resumes as if it had never been interrupted.
generator = compiledStateGraph.stream(None, configurableDictionary, stream_mode = "values")

for addableValuesDict in generator:
    if "messages" in addableValuesDict:
        addableValuesDict["messages"][-1].pretty_print()

"""
(env) D:\localserver>python test.py
================================ Human Message =================================

I'm learning LangGraph. Could you do some research on it for me?
================================== Ai Message ==================================
Tool Calls:
  tavily_search_results_json (call_HAeStkMi9P59nZqDcEQ9W37X)
 Call ID: call_HAeStkMi9P59nZqDcEQ9W37X
  Args:
    query: LangGraph

--------------------------------------------------
('tools',)
--------------------------------------------------

--------------------------------------------------
[{'name': 'tavily_search_results_json', 'args': {'query': 'LangGraph'}, 'id': 'call_HAeStkMi9P59nZqDcEQ9W37X', 'type': 'tool_call'}]
--------------------------------------------------

================================== Ai Message ==================================
Tool Calls:
  tavily_search_results_json (call_HAeStkMi9P59nZqDcEQ9W37X)
 Call ID: call_HAeStkMi9P59nZqDcEQ9W37X
  Args:
    query: LangGraph
================================= Tool Message =================================
Name: tavily_search_results_json

[{"url": "https://www.langchain.com/langgraph", "content": "LangGraph is a framework for building and scaling agentic applications with LangChain Platform. It supports diverse control flows, human-agent collaboration, streaming, and deployment options for complex tasks."}, {"url": "https://www.datacamp.com/tutorial/langgraph-tutorial", "content": "LangGraph is a library within the LangChain ecosystem that simplifies the development of complex, multi-agent large language model (LLM) applications. Learn how to use LangGraph to create stateful, flexible, and scalable systems with nodes, edges, and state management."}]
================================== Ai Message ==================================

Here are some key insights about LangGraph:

1. **Overview**: LangGraph is a framework designed for building and scaling agentic applications within the LangChain Platform. It enables various control flows, human-agent collaboration, streaming, and deployment options for handling complex tasks. You can find more information on the official LangChain website [here](https://www.langchain.com/langgraph).

2. **Library Features**: As part of the LangChain ecosystem, LangGraph simplifies the development of complex, multi-agent applications using large language models (LLMs). It allows developers to create stateful, flexible, and scalable systems utilizing nodes, edges, and state management. A tutorial on how to work with LangGraph is available on DataCamp [here](https://www.datacamp.com/tutorial/langgraph-tutorial).

If you have specific aspects of LangGraph you want to learn more about, feel free to ask!
"""
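※ Because the state is checkpointed by MemorySaver, resuming does not have to happen immediately after the interruption. The sketch below is not part of the listing above; it is a minimal illustration of the typical human-in-the-loop use of interrupt_before, reusing compiledStateGraph and configurableDictionary from main.py. The approval prompt and the variable names introduced here are purely illustrative.

# Human-in-the-loop approval sketch (assumes the objects defined in main.py above).
stateSnapshot = compiledStateGraph.get_state(configurableDictionary)

if stateSnapshot.next == ("tools",):  # the graph is paused just before the "tools" node
    pendingToolCalls = stateSnapshot.values["messages"][-1].tool_calls
    print("The chatbot wants to run:", pendingToolCalls)

    if input("Run these tool calls? (y/n) ").strip().lower() == "y":
        # Resume from the checkpoint; passing None continues with the state as it is.
        for addableValuesDict in compiledStateGraph.stream(None, configurableDictionary, stream_mode = "values"):
            if "messages" in addableValuesDict:
                addableValuesDict["messages"][-1].pretty_print()
    else:
        print("Tool execution was not approved; the thread stays paused at the checkpoint.")

※ If the user declines, nothing is executed; the thread remains paused at the same checkpoint and can be resumed later with the same thread_id.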
▶ requirements.txt
aiohappyeyeballs==2.4.4
aiohttp==3.11.11
aiosignal==1.3.2
annotated-types==0.7.0
anyio==4.7.0
attrs==24.3.0
certifi==2024.12.14
charset-normalizer==3.4.1
colorama==0.4.6
dataclasses-json==0.6.7
distro==1.9.0
frozenlist==1.5.0
greenlet==3.1.1
h11==0.14.0
httpcore==1.0.7
httpx==0.28.1
httpx-sse==0.4.0
idna==3.10
jiter==0.8.2
jsonpatch==1.33
jsonpointer==3.0.0
langchain==0.3.13
langchain-community==0.3.13
langchain-core==0.3.28
langchain-openai==0.2.14
langchain-text-splitters==0.3.4
langgraph==0.2.60
langgraph-checkpoint==2.0.9
langgraph-sdk==0.1.48
langsmith==0.2.6
marshmallow==3.23.2
msgpack==1.1.0
multidict==6.1.0
mypy-extensions==1.0.0
numpy==2.2.1
openai==1.58.1
orjson==3.10.12
packaging==24.2
propcache==0.2.1
pydantic==2.10.4
pydantic-settings==2.7.0
pydantic_core==2.27.2
python-dotenv==1.0.1
PyYAML==6.0.2
regex==2024.11.6
requests==2.32.3
requests-toolbelt==1.0.0
sniffio==1.3.1
SQLAlchemy==2.0.36
tavily-python==0.5.0
tenacity==9.0.0
tiktoken==0.8.0
tqdm==4.67.1
typing-inspect==0.9.0
typing_extensions==4.12.2
urllib3==2.3.0
yarl==1.18.3
※ The packages were installed with the command: pip install python-dotenv langchain_community langchain_openai langgraph tavily-python