Awareness

Documentation

Guides, SDK references, and ecosystem integrations for Awareness.

AI Framework Integrations

Use Awareness as persistent memory for multi-agent frameworks. All integrations are available through the Python SDK and share a unified adapter pattern.


Installation

pip install awareness-memory-cloud

# Framework-specific extras:
pip install awareness-memory-cloud[langchain]
pip install awareness-memory-cloud[crewai]
pip install awareness-memory-cloud[praisonai]
pip install awareness-memory-cloud[autogen]

Unified Adapter Pattern

All framework integrations extend MemoryCloudBaseAdapter and provide:

  • wrap_llm() — intercept LLM calls to auto-recall and auto-capture
  • build_tools() — generate framework-native tool definitions
  • get_tool_functions() — get raw tool functions for manual registration
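The three entry points can be sketched with a minimal, self-contained stub. `MemoryCloudBaseAdapter`'s internals are not documented on this page, so `StubAdapter` below is purely illustrative; only the three method names come from the list above:

```python
# Self-contained sketch of the shared adapter pattern. StubAdapter is an
# assumption for illustration; only the method names are from the docs.

class StubAdapter:
    """Mimics the three entry points every framework adapter exposes."""

    def __init__(self, memory_id: str):
        self.memory_id = memory_id
        self._store: list[str] = []

    def get_tool_functions(self) -> dict:
        # Raw callables, suitable for manual registration in any framework.
        def remember(text: str) -> str:
            self._store.append(text)
            return "stored"

        def recall(query: str) -> list[str]:
            return [t for t in self._store if query.lower() in t.lower()]

        return {"remember": remember, "recall": recall}

    def build_tools(self) -> list[dict]:
        # Framework-native tool definitions wrap the same raw functions.
        return [{"name": n, "fn": f}
                for n, f in self.get_tool_functions().items()]


adapter = StubAdapter(memory_id="memory_123")
tools = adapter.get_tool_functions()
tools["remember"]("Decided on JWT auth")
print(tools["recall"]("jwt"))  # ['Decided on JWT auth']
```

The real adapters follow the same shape: `get_tool_functions()` returns plain callables, and `build_tools()` wraps them in whatever definition format the target framework expects.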

LangChain

import openai

from memory_cloud import MemoryCloudClient
from memory_cloud.integrations.langchain import MemoryCloudLangChain

client = MemoryCloudClient(base_url="...", api_key="...")
mc = MemoryCloudLangChain(client=client, memory_id="memory_123")

# Option 1: Wrap the LLM client for auto-recall + auto-capture
mc.wrap_llm(openai.OpenAI())

# Option 2: Use as a LangChain Retriever
retriever = mc.as_retriever()
docs = retriever.invoke("What did we decide yesterday?")

Use cases:

  • RAG chains with persistent memory context
  • Agent tool registration for memory read/write
  • Retriever integration for LangChain pipelines

Example: examples/e2e_langchain_cloud.py


CrewAI

import openai

from memory_cloud import MemoryCloudClient
from memory_cloud.integrations.crewai import MemoryCloudCrewAI

client = MemoryCloudClient(base_url="...", api_key="...")
mc = MemoryCloudCrewAI(client=client, memory_id="memory_123")

# Wrap the LLM client
mc.wrap_llm(openai.OpenAI())

# Or get tool definitions for agent config
tools = mc.build_tools()

Use cases:

  • Crew agents that remember across task executions
  • Shared memory between crew members via user_id / agent_role
  • Long-running crews that accumulate knowledge over days

Example: examples/e2e_crewai_cloud.py


PraisonAI

import openai

from memory_cloud import MemoryCloudClient
from memory_cloud.integrations.praisonai import MemoryCloudPraisonAI

client = MemoryCloudClient(base_url="...", api_key="...")
mc = MemoryCloudPraisonAI(client=client, memory_id="memory_123")

# Wrap the LLM client
mc.wrap_llm(openai.OpenAI())

# Or get tool dicts for PraisonAI agent config
tools = mc.build_tools()

Use cases:

  • PraisonAI agents with cross-session memory
  • Tool-based memory access in agent workflows

Example: examples/e2e_praisonai_cloud.py


AutoGen / AG2

from memory_cloud import MemoryCloudClient
from memory_cloud.integrations.autogen import MemoryCloudAutoGen

client = MemoryCloudClient(base_url="...", api_key="...")
mc = MemoryCloudAutoGen(client=client, memory_id="memory_123")

# Hook into agent message processing
# (autogen_agent is your already-configured ConversableAgent /
# AssistantAgent instance)
mc.wrap_agent(autogen_agent)

Use cases:

  • Multi-agent conversations with persistent memory
  • Agent group chats that remember prior decisions
  • AutoGen workflows spanning multiple sessions

Example: examples/e2e_autogen_cloud.py
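Conceptually, wrap_agent patches the agent's reply path so each incoming message triggers a recall before the reply and a capture after it. The sketch below illustrates that hook pattern with stub classes; it is not the adapter's actual implementation, and `StubAgent`, `StubMemory`, and the local `wrap_agent` are assumptions for the example:

```python
# Illustrative hook pattern: recall before the agent replies, capture after.
# All names here are stubs; the documented entry point is
# MemoryCloudAutoGen.wrap_agent.

class StubAgent:
    def generate_reply(self, message: str) -> str:
        return f"reply to: {message}"


class StubMemory:
    def __init__(self):
        self.captured = []

    def recall(self, task: str) -> list:
        return [c for c in self.captured if task.lower() in c.lower()]

    def capture(self, text: str) -> None:
        self.captured.append(text)


def wrap_agent(agent, memory):
    original = agent.generate_reply

    def wrapped(message: str) -> str:
        context = memory.recall(message)         # recall before the reply
        # (a real wrapper would inject `context` into the agent's prompt)
        reply = original(message)                # delegate to the agent
        memory.capture(f"{message} -> {reply}")  # capture afterwards
        return reply

    agent.generate_reply = wrapped
    return agent


memory = StubMemory()
agent = wrap_agent(StubAgent(), memory)
agent.generate_reply("plan the auth refactor")
print(memory.captured)  # ['plan the auth refactor -> reply to: plan the auth refactor']
```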


OpenAI / Anthropic Interceptor

For direct API usage without a framework, call the client's recall and capture methods around your OpenAI or Anthropic requests:

from memory_cloud import MemoryCloudClient

client = MemoryCloudClient(base_url="...", api_key="...")

# Before each LLM call: retrieve relevant context
context = client.recall_for_task(
    memory_id="memory_123",
    task="What's the current auth implementation?",
    limit=8,
)

# After each LLM call: store the interaction
client.remember_step(
    memory_id="memory_123",
    text="User asked about auth. Explained JWT + refresh token approach.",
    source="custom-agent",
)
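The recall-then-capture pattern above folds naturally into a single helper. The sketch below runs standalone with a stub client so the flow is visible end to end; with the real MemoryCloudClient the two method calls are identical, and `call_llm` stands in for any OpenAI or Anthropic request:

```python
# Hedged sketch of the recall -> call -> capture loop. StubMemoryClient only
# exists so the example runs standalone; it mirrors the two documented
# methods, recall_for_task and remember_step.

class StubMemoryClient:
    def __init__(self):
        self._steps = []

    def recall_for_task(self, memory_id, task, limit=8):
        return self._steps[-limit:]

    def remember_step(self, memory_id, text, source="custom-agent"):
        self._steps.append(text)


def chat_with_memory(client, memory_id, user_message, call_llm):
    # 1. Retrieve relevant context before the LLM call.
    context = client.recall_for_task(memory_id=memory_id, task=user_message)
    # 2. Make the call with the recalled context available to the prompt.
    reply = call_llm(context, user_message)
    # 3. Store the interaction afterwards.
    client.remember_step(
        memory_id=memory_id,
        text=f"User: {user_message} | Assistant: {reply}",
    )
    return reply


client = StubMemoryClient()
echo_llm = lambda ctx, msg: f"[{len(ctx)} memories] ack: {msg}"
print(chat_with_memory(client, "memory_123", "hello", echo_llm))
# [0 memories] ack: hello
print(chat_with_memory(client, "memory_123", "again", echo_llm))
# [1 memories] ack: again
```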

Multi-User / Multi-Role

All framework integrations support user_id and agent_role for team scenarios:

# Each framework agent can have its own role
mc_dev = MemoryCloudLangChain(
    client=client, memory_id="memory_123",
    default_metadata={"agent_role": "developer", "user_id": "alice"}
)

mc_reviewer = MemoryCloudLangChain(
    client=client, memory_id="memory_123",
    default_metadata={"agent_role": "reviewer", "user_id": "bob"}
)

When querying with a user_id, results include both that user's data AND public (user_id=NULL) data.
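The visibility rule can be illustrated with stub data. The filter below is a sketch of the stated behavior, not the server's implementation:

```python
# Sketch of the rule above: a user_id query returns that user's entries
# plus public (user_id = NULL) entries. Stub data only.

entries = [
    {"text": "Team decision: use JWT", "user_id": None},  # public
    {"text": "Alice's draft notes", "user_id": "alice"},
    {"text": "Bob's review comments", "user_id": "bob"},
]

def visible_to(user_id):
    return [e["text"] for e in entries if e["user_id"] in (None, user_id)]

print(visible_to("alice"))  # public entry plus Alice's own notes
```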


TypeScript

The TypeScript SDK (@awareness-sdk/memory-cloud) provides the same core API but does not yet include framework-specific adapters. Use the direct client methods:

import { MemoryCloudClient } from "@awareness-sdk/memory-cloud";

const client = new MemoryCloudClient({ baseUrl: "...", apiKey: "..." });

// Recall
const ctx = await client.recallForTask({
  memoryId: "memory_123",
  task: "summarize recent progress",
});

// Remember
await client.rememberStep({
  memoryId: "memory_123",
  text: "Completed the auth refactor.",
});