Open-source memory
for AI agents.

cortex-engine gives your agent persistent semantic memory. Install, init, query. That's it.

$ npm install cortex-engine && npx fozikio init my-agent

Memory in ten lines

Give your agent persistent, semantic memory. No vector DB setup, no embedding pipelines — cortex-engine handles it.

agent.ts
import { CortexEngine } from 'cortex-engine';

const cortex = new CortexEngine({
  provider: 'sqlite',
  embeddings: 'ollama',
});

// Your agent observes something
await cortex.observe('User prefers TypeScript over Python');

// Later, it recalls relevant context
const memories = await cortex.query('language preferences');
// → [{ content: 'User prefers TypeScript...', salience: 0.92 }]

Up and running in one command

cortex-engine handles storage, embeddings, and MCP tooling. You focus on your agent.

Step 1

Install

npm install cortex-engine, then npx fozikio init. Local SQLite storage, Ollama embeddings, default config. No accounts needed.

Step 2

Configure

Choose your storage provider (Firestore, SQLite), embedding engine (Ollama, Vertex, OpenAI), and install plugins. Or just use the defaults.
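A configuration sketch swapping the defaults for hosted services. Only the `provider` and `embeddings` option names appear in the quickstart above; treat any other shape details here as assumptions, not the definitive config schema:

```typescript
import { CortexEngine } from 'cortex-engine';

// Hosted storage and embeddings instead of the local defaults.
const cortex = new CortexEngine({
  provider: 'firestore',  // or 'sqlite' (the default)
  embeddings: 'vertex',   // or 'ollama' (the default), 'openai'
});
```

Mixing and matching is the point: local SQLite plus Ollama for development, Firestore plus a hosted embedding engine in production.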

Step 3

Build

observe(), query(), validate(), evolve() — 17 core MCP tools plus plugin tools. Your agent remembers everything.
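To make the observe/query lifecycle concrete, here is a toy, dependency-free stand-in. This is not cortex-engine's implementation: the real engine is promise-based and ranks memories by embedding similarity, while this mock ranks by naive word overlap, and everything beyond the two method names is invented for illustration.

```typescript
type Memory = { content: string; salience: number };

// Toy stand-in for the observe()/query() pair. The real engine embeds
// text and ranks by vector similarity; this mock ranks by word overlap.
class ToyCortex {
  private store: string[] = [];

  observe(content: string): void {
    this.store.push(content);
  }

  query(text: string): Memory[] {
    const words = new Set(text.toLowerCase().split(/\s+/));
    return this.store
      .map((content) => {
        const tokens = content.toLowerCase().split(/\s+/);
        const hits = tokens.filter((t) => words.has(t)).length;
        return { content, salience: hits / tokens.length };
      })
      .filter((m) => m.salience > 0)
      .sort((a, b) => b.salience - a.salience);
  }
}

const toy = new ToyCortex();
toy.observe('User prefers TypeScript over Python');
toy.observe('User dislikes long meetings');
const hits = toy.query('typescript preferences');
// Only the TypeScript memory shares a word with the query.
```

The shape of the result (`content` plus a `salience` score) mirrors the quickstart output above; the scoring itself is where embeddings replace word overlap.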

Built in the open

Everything is MIT licensed. Read it, fork it, contribute. 10 stars on GitHub are worth more than 0 sales on Gumroad.