■ Shows how to create a RunnableLambda object with the filter_messages function and use it in a chain.
※ The OPENAI_API_KEY environment variable value is defined in the .env file.
▶ main.py
from dotenv import load_dotenv
from langchain_core.messages import SystemMessage
from langchain_core.messages import HumanMessage
from langchain_core.messages import AIMessage
from langchain_core.messages import filter_messages
from langchain_openai import ChatOpenAI

load_dotenv()

messageList = [
    SystemMessage("you are a good assistant", id="1"),
    HumanMessage("example input", id="2", name="example_user"),
    AIMessage("example output", id="3", name="example_assistant"),
    HumanMessage("real input", id="4", name="bob"),
    AIMessage("real output", id="5", name="alice"),
]

# Called without a message list, filter_messages returns a RunnableLambda
# that can be composed into a chain with the | operator.
filterRunnableLambda = filter_messages(exclude_names=["example_user", "example_assistant"])
chatOpenAI = ChatOpenAI(temperature=0)
runnableSequence = filterRunnableLambda | chatOpenAI
responseAIMessage = runnableSequence.invoke(messageList)
print(responseAIMessage.content)
"""
I'm sorry, I'm not sure what you mean by "bob real input." Can you please provide more context or clarify your request?
"""
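To see which messages the filter keeps without calling the OpenAI API, the name-exclusion logic can be sketched in plain Python. This is a simplified stand-in for filter_messages (the Message class and exclude_by_name function below are illustrative, not part of langchain_core):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Message:
    # A minimal message record with the fields the filter looks at.
    content: str
    id: str
    name: Optional[str] = None


messages = [
    Message("you are a good assistant", id="1"),
    Message("example input", id="2", name="example_user"),
    Message("example output", id="3", name="example_assistant"),
    Message("real input", id="4", name="bob"),
    Message("real output", id="5", name="alice"),
]


def exclude_by_name(msgs, exclude_names):
    # Keep messages whose name is missing or not in the exclusion list,
    # mirroring filter_messages(exclude_names=...).
    return [m for m in msgs if m.name not in exclude_names]


kept = exclude_by_name(messages, ["example_user", "example_assistant"])
print([m.id for m in kept])  # → ['1', '4', '5']
```

The two example_user/example_assistant messages are dropped, so only the SystemMessage and the "bob"/"alice" messages reach the model, which is why the model's reply refers to "bob real input".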
▶ requirements.txt
annotated-types==0.7.0
anyio==4.4.0
certifi==2024.6.2
charset-normalizer==3.3.2
distro==1.9.0
exceptiongroup==1.2.1
h11==0.14.0
httpcore==1.0.5
httpx==0.27.0
idna==3.7
jsonpatch==1.33
jsonpointer==3.0.0
langchain-core==0.2.9
langchain-openai==0.1.9
langsmith==0.1.82
openai==1.35.3
orjson==3.10.5
packaging==24.1
pydantic==2.7.4
pydantic_core==2.18.4
python-dotenv==1.0.1
PyYAML==6.0.1
regex==2024.5.15
requests==2.32.3
sniffio==1.3.1
tenacity==8.4.2
tiktoken==0.7.0
tqdm==4.66.4
typing_extensions==4.12.2
urllib3==2.2.2
※ The packages were installed with the pip install python-dotenv langchain-openai command.