
# LangChain + ODEI: Persistent World Models for Long-Running Agents
`ConversationBufferMemory` resets on restart. ODEI gives LangChain agents a persistent world model.

## Quick Integration

```python
from langchain.tools import tool
import requests

@tool
def check_action(action: str) -> str:
    """Check a proposed action against ODEI's guardrail."""
    r = requests.post(
        "https://api.odei.ai/api/v2/guardrail/check",
        json={"action": action, "severity": "medium"},
    ).json()
    return f"{r['verdict']}: {r.get('reasoning', '')[:200]}"

@tool
def query_memory(term: str) -> str:
    """Search the ODEI world model for a term."""
    r = requests.post(
        "https://api.odei.ai/api/v2/world-model/query",
        json={"queryType": "search", "searchTerm": term},
    ).json()
    return str(r)
```

## Session Continuity

Inject the world model at session start:

```python
def build_context():
    wm = requests.get("https://api.odei.ai/api/v2/world-model/live").json()
    active = [n["title"] for n in wm["nodes"] if n["domain"] == "TACTICS"]
    return f"Current tasks: {active}"
```

## Production Results

0 dup
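Because `build_context` mixes an HTTP call with pure filtering logic, the filter is easier to reason about (and test) in isolation. A minimal sketch of that separation, assuming the payload shape shown above — a `nodes` list whose entries carry `title` and `domain` keys, which is inferred from the fields the snippet reads rather than from ODEI's documented schema:

```python
# Sketch: the filtering step from build_context, factored out so it can be
# exercised without a live ODEI endpoint. The payload shape is an assumption
# inferred from the snippet, not a documented API contract.

def active_tasks(world_model: dict) -> list[str]:
    """Return titles of world-model nodes in the TACTICS domain."""
    return [n["title"] for n in world_model["nodes"] if n["domain"] == "TACTICS"]

def build_context(world_model: dict) -> str:
    """Render the session-start context string from a world-model payload."""
    return f"Current tasks: {active_tasks(world_model)}"

# Hypothetical payload shaped like a /api/v2/world-model/live response:
sample = {
    "nodes": [
        {"title": "Ship v2 API", "domain": "TACTICS"},
        {"title": "Long-term positioning", "domain": "STRATEGY"},
    ]
}
```

Keeping the network fetch in a thin wrapper around a pure function like this means the session-start logic can be unit-tested against fixtures even when the API is unavailable.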
Continue reading on Dev.to


