Memories
The Memories resource handles episodic memory from conversations. Add conversations to extract facts automatically, and retrieve relevant context for new interactions.
client.memories.add()
POST /v1/memories
Add a conversation to memory. Facts are extracted asynchronously and stored in the knowledge graph.
Parameters
- `group_id` (str, required): Unique identifier for the user or group. Convention: "user:123" or "org:acme".
- `messages` (list[Message | dict], required): List of messages to add to memory. Each message should have `content`, `role_type`, and optionally `role`.
- `entity_types` (list[EntityTypeConfig], optional): Custom entity types to extract from the conversation.
- `link_to_org_graph` (bool, optional, default: True): Whether to link extracted entities to the organization graph.
- `link_to_ontology` (bool, optional, default: False): Whether to link entities to the ontology graph.
- `map_to_ontology` (bool, optional, default: False): Map extracted facts to ontology concepts.
- `ontology_domain` (str, optional): Domain to use for ontology mapping (required if `map_to_ontology=True`).
- `ontology_confidence_threshold` (float, optional, default: 0.7): Minimum confidence score for ontology mapping.
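The parameters above correspond to the request body for POST /v1/memories. As a rough sketch of the client-side validation they imply (the payload shape and the helper itself are assumptions for illustration, not the SDK's actual internals), note in particular that `ontology_domain` must accompany `map_to_ontology=True`:

```python
def build_add_payload(group_id, messages, *, link_to_org_graph=True,
                      link_to_ontology=False, map_to_ontology=False,
                      ontology_domain=None, ontology_confidence_threshold=0.7):
    """Assemble a hypothetical POST /v1/memories body with the documented defaults.

    Illustrative only; the real SDK builds and sends this internally.
    """
    # Documented constraint: ontology mapping needs a domain to map into.
    if map_to_ontology and ontology_domain is None:
        raise ValueError("ontology_domain is required when map_to_ontology=True")
    payload = {
        "group_id": group_id,
        "messages": messages,
        "link_to_org_graph": link_to_org_graph,
        "link_to_ontology": link_to_ontology,
        "map_to_ontology": map_to_ontology,
        "ontology_confidence_threshold": ontology_confidence_threshold,
    }
    if ontology_domain is not None:
        payload["ontology_domain"] = ontology_domain
    return payload
```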
Returns
`AddMessagesResponse` with `message` and `success` fields.
Example
```python
from memoair import MemoAir, Message

client = MemoAir()

# Add a conversation
response = client.memories.add(
    group_id="user:john",
    messages=[
        Message(
            content="I'm a Python developer working on AI projects",
            role_type="user",
            role="John"
        ),
        Message(
            content="That's great! What frameworks do you use?",
            role_type="assistant"
        ),
        Message(
            content="Mainly FastAPI for backends and PyTorch for ML",
            role_type="user",
            role="John"
        ),
    ],
    link_to_org_graph=True,  # Connect to organization knowledge
)

print(response.message)
# "Messages added successfully. Facts will be extracted asynchronously."
```

client.memories.get()
POST /v1/memories/retrieve
Retrieve relevant context from memory for the current conversation.
Parameters
- `group_id` (str, required): The group ID to retrieve memory for.
- `messages` (list[Message | dict], required): Current conversation context used to find relevant memories.
- `max_facts` (int, optional, default: 10): Maximum number of facts to retrieve.
- `center_node_uuid` (str, optional): UUID of a node to center the memory retrieval around.
Returns
`GetMemoryResponse` with `facts: list[FactResult]` and a `to_prompt_context()` helper method.
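The exact string `to_prompt_context()` produces is not specified here, but conceptually it flattens the retrieved facts into a block you can paste into a system prompt. A comparable formatter (a stand-in sketch, not the real method; `FactResult` is reduced to its `fact` field for illustration) might look like:

```python
from dataclasses import dataclass


@dataclass
class FactResult:
    """Simplified stand-in for the SDK's FactResult (only the fact text)."""
    fact: str


def facts_to_prompt_context(facts):
    """Join retrieved facts into a bulleted block for an LLM prompt.

    Illustrative equivalent of GetMemoryResponse.to_prompt_context();
    the real method's formatting may differ.
    """
    return "\n".join(f"- {f.fact}" for f in facts)
```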
Example
```python
from memoair import MemoAir, Message

client = MemoAir()

# Retrieve context for a new query
context = client.memories.get(
    group_id="user:john",
    messages=[
        Message(content="What frameworks should I use?", role_type="user")
    ],
    max_facts=10,
)

# Print retrieved facts
for fact in context.facts:
    print(f"- {fact.fact}")

# Use in your LLM prompt
prompt = f"""You are a helpful assistant.

What you know about this user:
{context.to_prompt_context()}

User: What frameworks should I use?
"""
```

client.memories.delete_group()
DELETE /v1/memories/group/{group_id}
Delete all memories for a group. Useful for GDPR compliance.
Parameters
- `group_id` (str, required): The group ID to delete all memories for.
Example
```python
from memoair import MemoAir

client = MemoAir()

# GDPR: Delete all user data
result = client.memories.delete_group(group_id="user:john")

if result.success:
    print("All user memories deleted")