
Overview — Langchain & LlamaIndex

Venkatesh Subramanian
May 7, 2024


In the realm of Generative AI applications, orchestration frameworks play a crucial role akin to high-level languages in software development. These frameworks offer a layer of abstraction, effectively concealing intricate vendor-specific details, diverse interactions with open-source model APIs, and notably, the burdensome boilerplate code.

Let’s explore two popular frameworks here: LangChain and LlamaIndex.

In simple terms, these are wrappers around LLMs (Large Language Models).
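To make the “wrapper” idea concrete, here is a rough sketch of the same kind of request made directly against the OpenAI Python SDK, with no framework in between. The sketch assumes the v1 openai client and an OPENAI_API_KEY set in the environment; the prompt text is just illustrative.

from openai import OpenAI

# Direct, vendor-specific call: the request and response shapes below belong to OpenAI's SDK.
client = OpenAI()  # picks up OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[{"role": "user", "content": "List the word forms of: Victory"}],
    temperature=0.7,
    max_tokens=500,
)
print(response.choices[0].message.content)

A framework like LangChain hides this vendor-specific shape behind a common interface, so moving to another provider usually means swapping one model class rather than rewriting the call.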

LangChain

from langchain_openai.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate
from langchain.schema.output_parser import StrOutputParser

application_prompt =""" Given a word, find and write all the word forms like its noun, verb, adjective and collocations.
Reply with only the word forms, one on each line,
with no additional text.
Word:
{user_input}
"""
user_input = """Victory"""
llm = ChatOpenAI(temperature=0.7, max_tokens=500, model='gpt-4-1106-preview')
prompt = PromptTemplate(input_variables=["user_input"], template=application_prompt)
chain = prompt | llm | StrOutputParser()
result = chain.invoke({"user_input": user_input})

chain = prompt | llm | StrOutputParser()

This chaining is where LangChain gets its name, and we’re using what’s called the LangChain Expression Language, or LCEL.

LangChain Expression Language looks and works much like how you might pipe information from one command to the next on a Unix command line: each step’s output becomes the input of the step that follows it.
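As a rough sketch of that pipe-like composition, reusing the prompt and llm defined above, further steps can be chained on with the same | operator. The uppercasing step here is purely illustrative and not part of the original example.

from langchain_core.runnables import RunnableLambda

# Hypothetical post-processing step: transform the parsed string, much like piping into another command.
to_upper = RunnableLambda(lambda text: text.upper())

chain = prompt | llm | StrOutputParser() | to_upper
print(chain.invoke({"user_input": "Victory"}))

Each element receives the previous element’s output as its input, which is what makes the composition read like a shell pipeline.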


Written by Venkatesh Subramanian

Product development & Engineering Leader | Software Architect | AI/ML | Cloud computing | https://www.linkedin.com/in/venkatesh-subramanian-377451b4/
