■ Shows how to create a ConversationTokenBufferMemory object by passing the llm, max_token_limit, and return_messages arguments to the ConversationTokenBufferMemory class constructor.
※ The OPENAI_API_KEY environment variable is defined in the .env file.
▶ main.py
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from langchain.memory import ConversationTokenBufferMemory

load_dotenv()

chatOpenAI = ChatOpenAI(model = "gpt-4o-mini")

conversationTokenBufferMemory = ConversationTokenBufferMemory(
    llm = chatOpenAI,
    max_token_limit = 100,
    return_messages = True
)
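※ The following is a minimal usage sketch that is not part of the book's listing. It assumes the conversationTokenBufferMemory object created above and uses hypothetical conversation strings to show that load_memory_variables returns only the most recent messages that fit within max_token_limit.

# Hypothetical continuation of main.py: store a few turns, then read the buffer back.
conversationTokenBufferMemory.save_context(
    {"input": "Hello, my name is Alice."},                    # hypothetical user turn
    {"output": "Nice to meet you, Alice!"}                    # hypothetical AI turn
)
conversationTokenBufferMemory.save_context(
    {"input": "What can you do?"},
    {"output": "I can answer questions and chat with you."}
)

# Because max_token_limit = 100, older messages are pruned once the buffer
# exceeds 100 tokens (counted with the llm's tokenizer). With
# return_messages = True, the "history" value is a list of message objects.
print(conversationTokenBufferMemory.load_memory_variables({}))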
▶ requirements.txt
aiohappyeyeballs==2.4.4
aiohttp==3.11.11
aiosignal==1.3.2
annotated-types==0.7.0
anyio==4.8.0
async-timeout==4.0.3
attrs==24.3.0
certifi==2024.12.14
charset-normalizer==3.4.1
distro==1.9.0
exceptiongroup==1.2.2
frozenlist==1.5.0
greenlet==3.1.1
h11==0.14.0
httpcore==1.0.7
httpx==0.28.1
idna==3.10
jiter==0.8.2
jsonpatch==1.33
jsonpointer==3.0.0
langchain==0.3.14
langchain-core==0.3.29
langchain-openai==0.3.0
langchain-text-splitters==0.3.5
langsmith==0.2.10
multidict==6.1.0
numpy==1.26.4
openai==1.59.6
orjson==3.10.14
packaging==24.2
propcache==0.2.1
pydantic==2.10.5
pydantic_core==2.27.2
python-dotenv==1.0.1
PyYAML==6.0.2
regex==2024.11.6
requests==2.32.3
requests-toolbelt==1.0.0
sniffio==1.3.1
SQLAlchemy==2.0.37
tenacity==9.0.0
tiktoken==0.8.0
tqdm==4.67.1
typing_extensions==4.12.2
urllib3==2.3.0
yarl==1.18.3
※ The packages were installed with the pip install python-dotenv langchain langchain-openai command.