# AI Framework Integrations
Use Awareness as persistent memory for multi-agent frameworks. All integrations are available through the Python SDK and share a unified adapter pattern.
## Supported Frameworks
| Framework | Install | Adapter | Key Features |
|---|---|---|---|
| LangChain | `pip install awareness-memory-cloud[langchain]` | `MemoryCloudLangChain` | Retriever, RAG chains, tool registration |
| CrewAI | `pip install awareness-memory-cloud[crewai]` | `MemoryCloudCrewAI` | Cross-task memory, crew-member scoping |
| PraisonAI | `pip install awareness-memory-cloud[praisonai]` | `MemoryCloudPraisonAI` | Tool-based memory in agent workflows |
| AutoGen / AG2 | `pip install awareness-memory-cloud[autogen]` | `MemoryCloudAutoGen` | Multi-agent conversations with memory |
## Unified Adapter Pattern
All framework integrations extend `MemoryCloudBaseAdapter` and provide:
- `wrap_llm()` — intercept LLM calls to auto-recall and auto-capture (Interceptor pattern)
- `build_tools()` — generate framework-native tool definitions
- `get_tool_functions()` — get raw tool functions for manual registration
- `inject_into_messages()` — manual message-level injection
- `awareness_recall()` / `awareness_record()` — explicit memory read/write
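To make that surface concrete, here is a minimal, self-contained sketch of the same method shape. Everything below — the `FakeClient`, the word-overlap recall, the tool schemas — is illustrative stand-in code under assumed semantics, not the SDK's actual `MemoryCloudBaseAdapter`; `wrap_llm()` is covered in the Interceptor section that follows.

```python
class FakeClient:
    """Stand-in for MemoryCloudClient with a naive in-memory store."""
    def __init__(self):
        self.records = []

    def recall(self, memory_id, query):
        # Toy relevance: any shared word counts as a match.
        words = set(query.lower().split())
        return [r for r in self.records if words & set(r.lower().split())]

    def record(self, memory_id, text):
        self.records.append(text)


class ToyAdapter:
    """Sketch of the adapter surface; the real adapters add framework glue."""
    def __init__(self, client, memory_id):
        self.client = client
        self.memory_id = memory_id

    # Explicit memory read/write
    def awareness_recall(self, query):
        """Recall stored facts relevant to a query."""
        return self.client.recall(self.memory_id, query)

    def awareness_record(self, text):
        """Record a fact into memory."""
        self.client.record(self.memory_id, text)

    # Manual message-level injection: prepend recalled facts as a system message
    def inject_into_messages(self, messages):
        facts = self.awareness_recall(messages[-1]["content"])
        if facts:
            context = "Relevant memory:\n" + "\n".join(facts)
            return [{"role": "system", "content": context}] + messages
        return messages

    # Raw tool functions for manual registration
    def get_tool_functions(self):
        return {"awareness_recall": self.awareness_recall,
                "awareness_record": self.awareness_record}

    # Framework-native tool definitions (OpenAI-style schemas in this sketch)
    def build_tools(self):
        return [{"type": "function",
                 "function": {"name": name,
                              "description": fn.__doc__ or name,
                              "parameters": {"type": "object", "properties": {}}}}
                for name, fn in self.get_tool_functions().items()]


adapter = ToyAdapter(FakeClient(), "memory_123")
adapter.awareness_record("Alice prefers pytest over unittest")
msgs = adapter.inject_into_messages(
    [{"role": "user", "content": "What does Alice prefer?"}])
```

After `inject_into_messages()`, `msgs` starts with a system message carrying the recalled fact, followed by the original user message.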
## Interceptor: The Fastest Path
All framework adapters support the Interceptor pattern — wrap your LLM client once and get automatic memory injection:
```python
from memory_cloud import MemoryCloudClient
from memory_cloud.integrations.langchain import MemoryCloudLangChain
import openai

client = MemoryCloudClient(base_url="...", api_key="...")
mc = MemoryCloudLangChain(client=client, memory_id="memory_123")

# One line: all LLM calls get memory context
mc.wrap_llm(openai.OpenAI())
```
This works identically across all four frameworks — just swap the adapter class.
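Conceptually, wrapping an LLM amounts to patching its completion method so every call recalls context on the way in and captures the exchange on the way out. The sketch below illustrates that flow with stand-in classes (`FakeLLM`, a local list as the memory store); it is not the SDK's actual implementation.

```python
class FakeLLM:
    """Stand-in for a real LLM client; echoes how many messages it received."""
    def chat(self, messages):
        return {"role": "assistant", "content": f"(saw {len(messages)} messages)"}


class ToyInterceptor:
    """Sketch of the Interceptor pattern: auto-recall in, auto-capture out."""
    def __init__(self):
        self.store = []  # stand-in for the remote memory

    def recall(self, query):
        # Toy relevance: any shared word counts as a match.
        words = set(query.lower().split())
        return [m for m in self.store if words & set(m.lower().split())]

    def wrap_llm(self, llm):
        original = llm.chat

        def wrapped(messages):
            # Auto-recall: inject relevant memory as a system message.
            facts = self.recall(messages[-1]["content"])
            if facts:
                messages = [{"role": "system",
                             "content": "Memory:\n" + "\n".join(facts)}] + messages
            reply = original(messages)
            # Auto-capture: record the latest user turn.
            self.store.append(messages[-1]["content"])
            return reply

        llm.chat = wrapped
        return llm


llm = ToyInterceptor().wrap_llm(FakeLLM())
first = llm.chat([{"role": "user", "content": "deploys happen on Fridays"}])
second = llm.chat([{"role": "user", "content": "when do deploys happen?"}])
```

The first call sees only the user message; by the second call, the stored fact about deploys is recalled and injected, so the wrapped LLM receives two messages instead of one.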
## OpenAI / Anthropic Direct (No Framework)
For direct API usage without a framework, use the SDK's built-in `AwarenessInterceptor`:
```python
from memory_cloud import MemoryCloudClient, AwarenessInterceptor
import openai

client = MemoryCloudClient(base_url="...", api_key="...")
interceptor = AwarenessInterceptor(client=client, memory_id="mem-xxx")

oai = openai.OpenAI()
interceptor.wrap_openai(oai)
# All oai.chat.completions.create() calls now have memory injection
```
See the SDK Usage Guide for full interceptor options and query rewrite modes.
## Multi-User / Multi-Role
All framework integrations support `user_id` and `agent_role` for team scenarios:
```python
mc_dev = MemoryCloudLangChain(
    client=client, memory_id="memory_123",
    default_metadata={"agent_role": "developer", "user_id": "alice"}
)
mc_reviewer = MemoryCloudLangChain(
    client=client, memory_id="memory_123",
    default_metadata={"agent_role": "reviewer", "user_id": "bob"}
)
```
When querying with a user_id, results include both that user's data AND public (user_id=NULL) data.
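The scoping rule can be pictured as a simple filter: a query scoped to one user returns that user's rows plus public rows. The records and `scoped_recall` helper below are toy, local illustrations of the rule; in practice this filtering happens server-side in Memory Cloud.

```python
# Toy dataset: two user-scoped records and one public (user_id=None) record.
records = [
    {"user_id": "alice", "agent_role": "developer",
     "text": "alice: prefers pytest"},
    {"user_id": "bob", "agent_role": "reviewer",
     "text": "bob: blocks merges on missing tests"},
    {"user_id": None, "agent_role": None,
     "text": "team: main branch is protected"},
]


def scoped_recall(records, user_id):
    """Return the given user's records plus public (user_id=None) records."""
    return [r for r in records if r["user_id"] in (user_id, None)]


alice_view = scoped_recall(records, "alice")
```

`alice_view` contains Alice's record and the public team record, but nothing scoped to Bob.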
## TypeScript
The TypeScript SDK (`@awareness-sdk/memory-cloud`) provides the same core API and interceptor but does not yet include framework-specific adapters. Use the direct client or interceptor:
```typescript
import { MemoryCloudClient, AwarenessInterceptor } from "@awareness-sdk/memory-cloud";
import OpenAI from "openai";

const client = new MemoryCloudClient({ baseUrl: "...", apiKey: "..." });
const interceptor = await AwarenessInterceptor.create({ client, memoryId: "mem-xxx" });

const oai = new OpenAI();
interceptor.wrapOpenAI(oai);
```