The Two Core Operations
Elephantasm has two primary SDK functions:
- extract() — Capture events (messages, tool calls) for memory synthesis
- inject() — Retrieve a compiled memory pack for your LLM's context
Minimal Example
```python
from elephantasm import inject, extract, EventType

# Get memory context for your LLM
pack = inject()
system_prompt = f"You are a helpful assistant.\n\n{pack.as_prompt()}"

# Capture conversation events
extract(EventType.MESSAGE_IN, "Hello!", role="user")
extract(EventType.MESSAGE_OUT, "Hi there!", role="assistant")
```

The module-level functions (inject, extract) read ELEPHANTASM_API_KEY and ELEPHANTASM_ANIMA_ID from environment variables automatically.
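If you prefer to configure credentials in-process rather than in your shell, a minimal sketch follows (placeholder values, and assuming the variables are read when the helpers are first called):

```python
import os

# Sketch: set the credentials before using the module-level helpers.
# Variable names are the ones documented above; values are placeholders.
os.environ["ELEPHANTASM_API_KEY"] = "sk_live_..."
os.environ["ELEPHANTASM_ANIMA_ID"] = "..."

from elephantasm import inject

pack = inject()  # picks up the credentials from the environment
```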
Using the Explicit Client
For more control — multiple animas, custom endpoints, or dependency injection — use the client class directly:
```python
from elephantasm import Elephantasm, EventType

# Initialize with credentials
with Elephantasm(api_key="sk_live_...", anima_id="...") as client:
    # Get memory pack
    pack = client.inject()

    # Capture events
    client.extract(EventType.MESSAGE_IN, "Hello!", role="user")

    # Create a new anima
    anima = client.create_anima("my-agent", description="Personal assistant")
```
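For multiple animas, one straightforward pattern is to construct one client per anima. The sketch below reuses only the constructor arguments shown above; the anima IDs are placeholders:

```python
from elephantasm import Elephantasm, EventType

# Sketch: one client per anima (anima IDs below are placeholders).
with Elephantasm(api_key="sk_live_...", anima_id="support-anima-id") as support, \
     Elephantasm(api_key="sk_live_...", anima_id="research-anima-id") as research:
    support.extract(EventType.MESSAGE_IN, "My order is late.", role="user")
    research_pack = research.inject()
```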
Full Chat Example
A complete chat function with memory injection:
```python
import openai

from elephantasm import inject, extract, EventType

def chat(user_message: str) -> str:
    # Get memory context
    pack = inject()

    # Build prompt with memory
    messages = [
        {"role": "system", "content": f"You are a helpful assistant.\n\n{pack.as_prompt()}"},
        {"role": "user", "content": user_message},
    ]

    # Call your LLM
    response = openai.chat.completions.create(model="gpt-4", messages=messages)
    assistant_message = response.choices[0].message.content

    # Capture both sides of the conversation
    extract(EventType.MESSAGE_IN, user_message, role="user")
    extract(EventType.MESSAGE_OUT, assistant_message, role="assistant")

    return assistant_message
```
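Used in a simple loop, each turn both reads from and writes to memory. The driver below is hypothetical glue code, not part of the SDK:

```python
# Hypothetical driver: each chat() call injects the current memory pack and
# extracts both sides of the exchange for later synthesis.
while True:
    user_message = input("you> ")
    if not user_message:
        break
    print("assistant>", chat(user_message))
```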
What Happens Next
Once you extract events, the Dreamer background process automatically:
- Synthesizes events into structured memories
- Extracts knowledge from memory patterns
- Builds an identity fingerprint over time
The next time you call inject(), these curated layers are compiled into a deterministic memory pack for your agent's context.
Next Steps
- Authentication — Get your API key
- Events — Understand event types and capture options
- Memory Packs — How inject() builds context