Overview — LangChain & LlamaIndex
In the realm of Generative AI applications, orchestration frameworks play a crucial role akin to high-level languages in software development. These frameworks offer a layer of abstraction, effectively concealing intricate vendor-specific details, diverse interactions with open-source model APIs, and notably, the burdensome boilerplate code.
Let’s explore two popular frameworks here: LangChain and LlamaIndex. In simple terms, these are wrappers around LLMs (large language models).
LangChain
from langchain_openai.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate
from langchain.schema.output_parser import StrOutputParser
application_prompt = """Given a word, find and write all the word forms like its noun, verb, adjective and collocations.
Reply with only the word forms, one on each line,
with no additional text.
Word:
{user_input}
"""
user_input = """Victory"""
llm = ChatOpenAI(temperature=0.7, max_tokens=500, model="gpt-4-1106-preview")
prompt = PromptTemplate(input_variables=["user_input"], template=application_prompt)
chain = prompt | llm | StrOutputParser()
result = chain.invoke({"user_input": user_input})
chain = prompt | llm | StrOutputParser()
This is where LangChain gets its name from. And we’re using what’s called the LangChain Expression Language, or LCEL.
LangChain Expression Language looks and works much like how you might pipe information from one command to…
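To make the piping idea concrete, here is a minimal sketch in plain Python, with no LangChain dependency. The `Runnable` class, the `fake_llm` stage, and the lambdas are all hypothetical stand-ins for illustration only; they show how an LCEL-style `|` pipeline composes stages so that each stage's output feeds the next, just like `prompt | llm | StrOutputParser()` above.

```python
class Runnable:
    """A toy stand-in for LangChain's runnable: wraps a function and
    overloads `|` so two stages compose into one pipeline."""

    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # self | other: run self first, then feed its output to other
        return Runnable(lambda x: other.fn(self.fn(x)))

    def invoke(self, x):
        return self.fn(x)


# Hypothetical stages mirroring prompt | llm | StrOutputParser()
prompt = Runnable(lambda d: f"Word: {d['user_input']}")
fake_llm = Runnable(lambda p: p.upper())  # stand-in for a real model call
parser = Runnable(lambda s: s.strip())

chain = prompt | fake_llm | parser
print(chain.invoke({"user_input": "victory"}))  # → WORD: VICTORY
```

The real LCEL runnables do much more (streaming, batching, async), but the composition mechanic is the same: `|` chains stages left to right.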