■ This example shows how to create chat models for several providers with the init_chat_model factory function.
※ The API key values are defined as environment variables in the .env file.
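※ The .env file below is only an illustrative sketch: OPENAI_API_KEY and ANTHROPIC_API_KEY are the variables the OpenAI and Anthropic integrations read by default, GOOGLE_APPLICATION_CREDENTIALS is the standard Google application-default-credentials path used for Vertex AI (a Google Cloud project must also be configured), and all values shown are placeholders.
▶ .env (example)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json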
▶ main.py
from dotenv import load_dotenv
from langchain.chat_models import init_chat_model

# Load the provider API keys defined in the .env file.
load_dotenv()

# Create one chat model per provider with the same temperature setting.
chatOpenAI    = init_chat_model("gpt-4o",                 model_provider="openai",          temperature=0)
chatAnthropic = init_chat_model("claude-3-opus-20240229", model_provider="anthropic",       temperature=0)
chatVertexAI  = init_chat_model("gemini-1.5-pro",         model_provider="google_vertexai", temperature=0)

print("GPT-4o      : " + chatOpenAI.invoke("what's your name").content)
print("Claude Opus : " + chatAnthropic.invoke("what's your name").content)
print("Gemini 1.5  : " + chatVertexAI.invoke("what's your name").content)

"""
GPT-4o      : I'm an AI created by OpenAI, and I don't have a personal name. You can call me Assistant! How can I help you today?
Claude Opus : My name is Claude. It's nice to meet you!
Gemini 1.5  : I am a large language model, trained by Google. I do not have a name.
"""
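※ As a variation, init_chat_model can also be called without a fixed model, returning a runnable whose model and provider are chosen per call through the config argument. The sketch below follows the LangChain documentation for this feature; the file name is illustrative, and it may require a slightly newer langchain release than the one pinned in requirements.txt.
▶ main_configurable.py (sketch)
from dotenv import load_dotenv
from langchain.chat_models import init_chat_model

load_dotenv()

# No model is fixed here, so "model" and "model_provider" become configurable fields.
configurable_chat = init_chat_model(temperature=0)

# Choose the model at invoke time via the "configurable" section of the config.
print(configurable_chat.invoke(
    "what's your name",
    config={"configurable": {"model": "gpt-4o"}},
).content)

print(configurable_chat.invoke(
    "what's your name",
    config={"configurable": {"model": "claude-3-opus-20240229", "model_provider": "anthropic"}},
).content)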
▶ requirements.txt
aiohttp==3.9.5
aiosignal==1.3.1
annotated-types==0.7.0
anthropic==0.29.0
anyio==4.4.0
async-timeout==4.0.3
attrs==23.2.0
cachetools==5.3.3
certifi==2024.6.2
charset-normalizer==3.3.2
defusedxml==0.7.1
distro==1.9.0
docstring_parser==0.16
exceptiongroup==1.2.1
filelock==3.15.4
frozenlist==1.4.1
fsspec==2024.6.0
google-api-core==2.19.1
google-auth==2.30.0
google-cloud-aiplatform==1.56.0
google-cloud-bigquery==3.25.0
google-cloud-core==2.4.1
google-cloud-resource-manager==1.12.3
google-cloud-storage==2.17.0
google-crc32c==1.5.0
google-resumable-media==2.7.1
googleapis-common-protos==1.63.2
greenlet==3.0.3
grpc-google-iam-v1==0.13.1
grpcio==1.64.1
grpcio-status==1.62.2
h11==0.14.0
httpcore==1.0.5
httpx==0.27.0
huggingface-hub==0.23.4
idna==3.7
jiter==0.5.0
jsonpatch==1.33
jsonpointer==3.0.0
langchain==0.2.5
langchain-anthropic==0.1.15
langchain-core==0.2.9
langchain-google-vertexai==1.0.5
langchain-openai==0.1.9
langchain-text-splitters==0.2.1
langsmith==0.1.82
multidict==6.0.5
numpy==1.26.4
openai==1.35.3
orjson==3.10.5
packaging==24.1
proto-plus==1.24.0
protobuf==4.25.3
pyasn1==0.6.0
pyasn1_modules==0.4.0
pydantic==2.7.4
pydantic_core==2.18.4
python-dateutil==2.9.0.post0
python-dotenv==1.0.1
PyYAML==6.0.1
regex==2024.5.15
requests==2.32.3
rsa==4.9
shapely==2.0.4
six==1.16.0
sniffio==1.3.1
SQLAlchemy==2.0.31
tenacity==8.4.2
tiktoken==0.7.0
tokenizers==0.19.1
tqdm==4.66.4
typing_extensions==4.12.2
urllib3==2.2.2
yarl==1.9.4
※ The packages were installed by running pip install python-dotenv langchain langchain-openai langchain-anthropic langchain-google-vertexai.