LangChain in Action: Develop LLM-Powered Applications
- Development
- Feb 24, 2025

LangChain in Action: Develop LLM-Powered Applications is available for $69.99, holds an average rating of 4.51 across 365 reviews, includes 63 lectures, and has 2,846 subscribers.
You will learn how to:
- Master LangChain from basics to advanced features
- Understand and implement Retrieval Augmented Generation (RAG) using VectorStores
- Create and use powerful Autonomous Agents
- Grasp the functionalities and applications of the Indexing API
- Explore the LangSmith platform for production-ready applications
- Understand microservice architecture in the context of large language model (LLM) applications
- Use the new LangChain Expression Language with the Runnable Interface

This course is ideal for Python developers and AI enthusiasts.
Enroll now: LangChain in Action: Develop LLM-Powered Applications
Summary
Title: LangChain in Action: Develop LLM-Powered Applications
Price: $69.99
Average Rating: 4.51
Number of Lectures: 63
Original Price: $27.99
Quality Status: approved
Status: Live
This course provides an in-depth exploration of LangChain, a pivotal framework for developing generative AI applications. Aimed at both beginners and experienced practitioners in the AI world, the course starts with the fundamentals, such as basic usage of the OpenAI API, and progressively delves into the more intricate aspects of LangChain.
You’ll learn about the intricacies of input and output mechanisms in LangChain and how to craft effective prompt templates for OpenAI models. The course takes you through the critical components of LangChain, such as Chains, Callbacks, and Memory, teaching you to create interactive and context-aware AI systems.
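To make the prompt-template and conversation-memory ideas mentioned above concrete, here is a standard-library-only sketch of the pattern (the class names `PromptTemplate` and `ConversationBufferMemory` mirror LangChain's, but this is an illustrative toy, not the actual LangChain API):

```python
# Toy sketch of prompt templating plus buffer memory, stdlib only.

class PromptTemplate:
    """Fills named placeholders in a template string."""
    def __init__(self, template: str):
        self.template = template

    def format(self, **kwargs) -> str:
        return self.template.format(**kwargs)


class ConversationBufferMemory:
    """Keeps the full chat history and renders it into the prompt."""
    def __init__(self):
        self.turns = []

    def save(self, user: str, ai: str) -> None:
        self.turns.append((user, ai))

    def as_text(self) -> str:
        return "\n".join(f"Human: {u}\nAI: {a}" for u, a in self.turns)


template = PromptTemplate(
    "You are a helpful assistant.\n{history}\nHuman: {question}\nAI:"
)
memory = ConversationBufferMemory()
memory.save("Hi!", "Hello, how can I help?")
prompt = template.format(history=memory.as_text(), question="What is LangChain?")
print(prompt)
```

The key point is that memory is just prior turns rendered back into the next prompt; the model itself stays stateless.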
Midway, the focus shifts to advanced concepts like Retrieval Augmented Generation (RAG) and the creation of Autonomous Agents, enriching your understanding of intelligent system design. Topics like Hybrid Search, Indexing API, and LangSmith will be covered, highlighting their roles in enhancing the efficiency and functionality of AI applications.
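The retrieval half of RAG can be sketched without any vector database: score documents against the query, keep the top matches, and splice them into the prompt. The bag-of-words cosine similarity below is a deliberate stand-in for the dense embeddings and VectorStores the course actually covers:

```python
# Toy sketch of RAG retrieval: rank documents by similarity to the query,
# then prepend the best match as context. Real systems use learned
# embeddings and a vector store; word-overlap cosine stands in here.
import math
import re
from collections import Counter


def embed(text: str) -> Counter:
    """Bag-of-words 'embedding' (stand-in for a real embedding model)."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def retrieve(query: str, docs: list, k: int = 2) -> list:
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]


docs = [
    "LangChain chains compose prompts and models.",
    "PgVector stores embeddings inside Postgres.",
    "Kubernetes schedules containers across nodes.",
]
question = "How do I store embeddings in Postgres?"
context = retrieve(question, docs, k=1)
prompt = f"Context:\n{context[0]}\n\nQuestion: {question}"
```

Swapping `embed` for a real embedding model and `retrieve` for a vector-store query gives the production shape of the same pipeline.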
Toward the end, the course integrates theory with practical skills, introducing Microservice Architecture in large language model (LLM) applications and the LangChain Expression Language. This ensures not only a theoretical understanding of the concepts but also their practical applications.
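The pipe composition at the heart of the LangChain Expression Language can be illustrated in a few lines: each stage exposes `invoke`, and `|` chains stages left to right. This is a minimal sketch of the idea only; LangChain's real Runnable interface additionally provides batching, streaming, and async variants:

```python
# Sketch of LCEL-style piping: each stage is a "runnable" with invoke(),
# and | composes stages left to right.

class Runnable:
    def __init__(self, fn):
        self.fn = fn

    def invoke(self, x):
        return self.fn(x)

    def __or__(self, other):
        # Feed this stage's output into the next stage.
        return Runnable(lambda x: other.invoke(self.invoke(x)))


prompt = Runnable(lambda q: f"Answer briefly: {q}")
fake_llm = Runnable(lambda p: {"content": p.upper()})  # stands in for a model call
parser = Runnable(lambda msg: msg["content"])

chain = prompt | fake_llm | parser
result = chain.invoke("what is LCEL?")
print(result)
```

Replacing `fake_llm` with an actual chat model and `parser` with an output parser yields the familiar `prompt | llm | parser` shape the course teaches.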
This course is tailored for individuals with a foundational knowledge of Python, aiming to build or enhance their expertise in AI. The structured curriculum ensures a comprehensive grasp of LangChain, from basic concepts to complex applications, preparing you for the future of generative AI.
Course Curriculum
Chapter 1: Before we start
Lecture 1: What to expect from this course and how to get all resources
Lecture 2: Why this course is different
Lecture 3: Prerequisites
Lecture 4: Essential topics and terms (theory)
Lecture 5: Why this course does not cover open-source models like Llama 2
Lecture 6: Optional: Install Visual Studio Code
Lecture 7: Get the source files with Git from Github
Lecture 8: Create OpenAI Account and create API Key
Chapter 2: Preparation
Lecture 1: What we have to do before delving into LangChain
Lecture 2: Setup of a virtual environment
Lecture 3: Setup of the OpenAI API key as an environment variable
Lecture 4: Exploring the vanilla OpenAI package
Chapter 3: LangChain Basics
Lecture 1: IMPORTANT NOTE – LangChain 0.1 Code Changes
Lecture 2: LLM Basics
Lecture 3: Prompting Basics
Lecture 4: Theory: Prompt Engineering Basics
Lecture 5: Few Shot Prompting
Lecture 6: Chain-of-Thought Prompting
Lecture 7: Pipeline-Prompts
Lecture 8: Prompt Serialisation
Chapter 4: Chains – From basic to advanced chains
Lecture 1: Introduction to chains
Lecture 2: Basic chains – the LLMChain
Lecture 3: Response Schemas and OutputParsers
Lecture 4: LLMChain with multiple inputs
Lecture 5: SequentialChains
Lecture 6: RouterChains
Chapter 5: Callbacks
Lecture 1: Callbacks
Chapter 6: Memory
Lecture 1: Memory basics – ConversationBufferMemory
Lecture 2: ConversationSummaryMemory
Lecture 3: EXERCISE: Use Memory to build a streamlit Chatbot
Lecture 4: SOLUTION: Chatbot with Streamlit
Chapter 7: OpenAI Function Calling
Lecture 1: OpenAI Function Calling – Vanilla OpenAI Package
Lecture 2: Function Calling with LangChain [DEPRECATED]
Lecture 3: Limits and issues of the langchain Implementation [DEPRECATED]
Lecture 4: Tool/Function Calling with LangChain – The new way
Chapter 8: Retrieval Augmented Generation (RAG)
Lecture 1: RAG – Theory and building blocks
Lecture 2: Loaders and Splitters
Lecture 3: Embeddings – Theory and practice
Lecture 4: VectorStores and Retrievers
Lecture 5: RAG Service with FastAPI
Chapter 9: Agents
Lecture 1: Agents Basics – LLMs learn to use tools
Lecture 2: Agents with a custom RAG-Tool
Lecture 3: ChatAgents
Chapter 10: Indexing API
Lecture 1: Indexing API – keep your documents in sync
Lecture 2: PREREQUISITE: Docker Installation
Lecture 3: Setup of PgVector and RecordManager
Lecture 4: Indexing Documents in practice
Lecture 5: Document Retrieval with PgVector
Chapter 11: LangSmith
Lecture 1: Introduction to LangSmith (User Interface and Hub)
Lecture 2: LangSmith Projects
Lecture 3: LangSmith Datasets and Evaluation
Chapter 12: Microservice Architecture for LLM Applications
Lecture 1: Before you watch this section
Lecture 2: Introduction to Microservice Architecture
Lecture 3: How our Chatbot works in a Microservice Architecture
Lecture 4: Introduction to Docker
Lecture 5: Introduction to Kubernetes
Lecture 6: Deployment of the LLM Microservices to Kubernetes
Chapter 13: LangChain Expression Language (LCEL)
Lecture 1: LangChain Expression Language
Lecture 2: Intro to LangChain Expression Language
Lecture 3: LCEL Part 1 – Pipes and OpenAI Function Calling
Lecture 4: LCEL – Part 2 – VectorStores, ItemGetter, Tools
Lecture 5: LCEL – Part 3 – Arbitrary Functions, Runnable Interface, Fallbacks
Chapter 14: Congratulations!
Lecture 1: Thank you for participating in this course
Instructors

Markus Lang
Software Engineer – Python Developer – LLM Expert
Frequently Asked Questions
How long do I have access to the course materials?
You can view and review the lecture materials indefinitely, like an on-demand channel.
Can I take my courses with me wherever I go?
Definitely! If you have an internet connection, courses on Udemy are available on any device at any time. If you don’t have an internet connection, some instructors also let their students download course lectures. That’s up to the instructor though, so make sure you get on their good side!