✅ Overview
- Study notes on the official LangChain docs: Build a Simple LLM Application with LCEL
- Building an English-translation AI service with LangChain
- Done in Python on Colab
✅ Installation and LangSmith Setup
!pip install langchain langchain-openai
- Configure LangSmith tracing and the API key
import os  # needed to set environment variables
import getpass  # lets you enter sensitive values such as API keys without echoing them
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_ENDPOINT"] = "https://api.smith.langchain.com"
os.environ["LANGCHAIN_API_KEY"] = getpass.getpass()
os.environ["LANGCHAIN_PROJECT"] = "langchain-test"
✅ Using a Language Model with Output Parsers
- Import and load the LLM model
os.environ["OPENAI_API_KEY"] = getpass.getpass()
from langchain_openai import ChatOpenAI
model = ChatOpenAI(model="gpt-4o-mini")
from langchain_core.messages import HumanMessage, SystemMessage
from langchain_core.output_parsers import StrOutputParser
messages = [
SystemMessage(content="Translate the following from English into Korean"),
HumanMessage(content="Hello, My name is DoYeong!!"),
]
parser = StrOutputParser()
result = model.invoke(messages)  # the model returns an AIMessage
parser.invoke(result)  # the parser extracts the message content as a plain string
- You can also do this with a chain!
chain = model | parser
chain.invoke(messages)
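- For reference, the LCEL pipe composes the two runnables, so the chain is roughly equivalent to calling the parser on the model's output by hand; the composed chain also inherits the other Runnable methods such as stream(). The snippet below is an illustrative sketch, not part of the tutorial:
# roughly what the chain does: feed the model's AIMessage into the parser
parser.invoke(model.invoke(messages))

# the composed chain also supports streaming the parsed string chunk by chunk
for chunk in chain.stream(messages):
    print(chunk, end="")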
✅ Prompt Templates
- PromptTemplates are a concept in LangChain designed to assist with this transformation. They take in raw user input and return data (a prompt) that is ready to pass into a language model.
- Let's build a template that can be filled in dynamically.
from langchain_core.prompts import ChatPromptTemplate
system_template = "Translate the following into {language}:"
prompt_template = ChatPromptTemplate.from_messages(
[("system", system_template), ("user", "{text}")]
)
result = prompt_template.invoke({"language": "korean", "text": "hi"})
result.to_messages()  # view the underlying list of messages (system + human)
print(result)  # the ChatPromptValue built from the filled-in template
chain = prompt_template | model | parser
chain.invoke({"language": "italian", "text": "Hi, My name is Doyoung"})
- "system"에서 실행할 프롬프트를 주입하고,
- "user"에서 사용자에게 input을 받는 프로세스