
# LangChain.rb: Chains, Agents, and Memory for Ruby AI Apps
This is post #21 in the Ruby for AI series. We've been building everything from scratch: API calls, RAG pipelines, agents, streaming. That's great for understanding. But sometimes you want a framework that handles the plumbing. Enter LangChain.rb, Ruby's port of the popular LangChain library.

## What Is LangChain.rb?

LangChain.rb gives you pre-built abstractions for common AI patterns: LLM clients, prompt templates, chains, vector search, agents, and conversation memory. Instead of wiring up OpenAI calls and pgvector queries manually, you get composable building blocks.

Install it:

```shell
gem install langchainrb
# or in your Gemfile:
gem "langchainrb"
```

## Basic LLM Usage

Start with a simple LLM call:

```ruby
require "langchain"

llm = Langchain::LLM::OpenAI.new(
  api_key: ENV["OPENAI_API_KEY"],
  default_options: { temperature: 0.7, chat_model: "gpt-4o" }
)

response = llm.chat(messages: [{ role: "user", content: "Explain Ruby blocks in one paragraph." }])
puts response.chat_completion
```
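To see what a "prompt template" buys you, here's a hand-rolled sketch of the pattern in plain Ruby, in the spirit of the from-scratch posts earlier in this series. The `PromptSketch` class is hypothetical, not the gem's API; LangChain.rb ships its own template abstraction (check the gem docs for the exact class and signature):

```ruby
# Hand-rolled sketch of the prompt-template pattern: substitute named
# variables into a template string with {placeholder} slots.
# PromptSketch is an illustrative stand-in, NOT LangChain.rb's API.
class PromptSketch
  def initialize(template)
    @template = template
  end

  # Replace each {name} placeholder with the supplied keyword value.
  # Raises KeyError if a placeholder has no matching variable.
  def format(**vars)
    @template.gsub(/\{(\w+)\}/) { vars.fetch(Regexp.last_match(1).to_sym) }
  end
end

prompt = PromptSketch.new("Explain {topic} to a {audience} in one paragraph.")
puts prompt.format(topic: "Ruby blocks", audience: "beginner")
# => Explain Ruby blocks to a beginner in one paragraph.
```

The gem's version adds validation of declared input variables and loading templates from files, but the core idea is this simple substitution.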
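The conversation memory mentioned above boils down to keeping a bounded list of chat messages and replaying them on each call, so old turns get evicted before the prompt outgrows the model's context window. A minimal plain-Ruby sketch of the idea (the `WindowMemory` class is hypothetical, not the gem's API):

```ruby
# Minimal sketch of windowed conversation memory: keep only the last N
# messages so the replayed history stays within a context budget.
# WindowMemory is an illustrative stand-in, NOT LangChain.rb's API.
class WindowMemory
  def initialize(max_messages: 6)
    @max = max_messages
    @messages = []
  end

  def add(role, content)
    @messages << { role: role, content: content }
    @messages.shift while @messages.size > @max # evict oldest first
  end

  # The array you'd pass as `messages:` to an LLM chat call.
  def to_messages
    @messages.dup
  end
end

memory = WindowMemory.new(max_messages: 2)
memory.add("user", "Hi")
memory.add("assistant", "Hello! How can I help?")
memory.add("user", "What are Ruby blocks?")
memory.to_messages.size # => 2 (the first message was evicted)
```

LangChain.rb's memory handling is richer (it can also track token counts rather than message counts), but a rolling window like this is the essential mechanism.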



