Autonomous AI Agents
Build agents that plan, execute tools, and deliver structured outputs with full deterministic control and observability.
Overview
Autonomous AI agents can plan multi-step tasks, use tools, and deliver structured results without constant human supervision. With Cognipeer, you get a deterministic runtime that keeps you in full control — no black-box graphs, just a transparent message-driven loop.
This use case walks you through building a research assistant agent that can search, analyse, and synthesise information using the Cognipeer ecosystem.
Architecture
- Agent SDK: the deterministic runtime that drives agent reasoning. It handles the core loop: resolve state → summarise context → call LLM → execute tools.
- Agent Server: exposes your agent as a production REST API with streaming, auth, and storage.
- Chat UI: a conversational interface for interacting with the agent in real time.
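The Agent SDK loop described above (resolve state → summarise context → call LLM → execute tools) can be pictured in a simplified, self-contained sketch. The types and names below are illustrative only, not the SDK's real API:

```typescript
// Minimal sketch of a message-driven agent loop: the model either answers
// or requests a tool call, and tool results are appended back to the transcript.
type Message = { role: "system" | "user" | "assistant" | "tool"; content: string };
type ToolCall = { name: string; args: Record<string, unknown> };
type LlmReply = { content?: string; toolCall?: ToolCall };

type Llm = (messages: Message[]) => LlmReply;
type ToolFn = (args: Record<string, unknown>) => string;

function runAgentLoop(
  llm: Llm,
  tools: Record<string, ToolFn>,
  messages: Message[],
  maxToolCalls = 10
): string {
  for (let i = 0; i <= maxToolCalls; i++) {
    const reply = llm(messages); // call the LLM with the current state
    if (reply.toolCall) {
      // execute the requested tool and feed its result back into the transcript
      const result = tools[reply.toolCall.name](reply.toolCall.args);
      messages.push({ role: "tool", content: result });
      continue;
    }
    return reply.content ?? ""; // no tool call means a final answer
  }
  throw new Error("maxToolCalls exceeded");
}
```

The key property is determinism at the loop level: each iteration is an ordinary function call, so you can log, replay, or bound it (as the `limits` option does later in this guide).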
1. Define Tools with Agent SDK
Start by creating the tools your agent will use. Each tool has a name, description, Zod schema, and an async function.
```typescript
import { createTool } from "@cognipeer/agent-sdk";
import { z } from "zod";

const searchWeb = createTool({
  name: "search_web",
  description: "Search the web for information",
  schema: z.object({
    query: z.string().describe("Search query"),
  }),
  func: async ({ query }) => {
    // Integrate with your preferred search API
    const response = await fetch(
      `https://api.search.example/?q=${encodeURIComponent(query)}`
    );
    return response.json();
  },
});
```
```typescript
const summarise = createTool({
  name: "summarise",
  description: "Summarise a piece of text",
  schema: z.object({
    text: z.string().describe("Text to summarise"),
    maxLength: z.number().optional().describe("Max summary length"),
  }),
  func: async ({ text, maxLength }) => {
    // Placeholder: naive truncation — swap in a real summariser as needed
    return { summary: text.slice(0, maxLength ?? 500) + "..." };
  },
});
```

2. Create the Agent
Create a smart agent with planning enabled. The agent will use a TODO list to break down complex tasks into steps.
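Conceptually, the TODO list is just a checklist the agent updates between turns: plan steps up front, work through them, and stop when all are done. A self-contained sketch (not the SDK's internal representation):

```typescript
// Illustrative TODO list: the planner records steps, the loop marks them done.
type TodoItem = { task: string; done: boolean };

class TodoList {
  private items: TodoItem[] = [];

  plan(tasks: string[]): void {
    this.items = tasks.map((task) => ({ task, done: false }));
  }

  complete(task: string): void {
    const item = this.items.find((i) => i.task === task);
    if (item) item.done = true;
  }

  next(): string | undefined {
    // next pending step, or undefined when the plan is exhausted
    return this.items.find((i) => !i.done)?.task;
  }

  finished(): boolean {
    return this.items.every((i) => i.done);
  }
}
```

Enabling `useTodoList` below asks the runtime to maintain this kind of plan for the agent automatically.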
```typescript
import { createSmartAgent, fromLangchainModel } from "@cognipeer/agent-sdk";
import { ChatOpenAI } from "@langchain/openai";

const model = fromLangchainModel(new ChatOpenAI({
  model: "gpt-4o",
  apiKey: process.env.OPENAI_API_KEY,
}));

const researchAgent = createSmartAgent({
  name: "ResearchAssistant",
  model,
  tools: [searchWeb, summarise],
  systemPrompt:
    "You are a research assistant. Break down complex research questions into steps, search for information, and provide synthesised answers.",
  useTodoList: true,
  limits: { maxToolCalls: 10, maxToken: 16000 },
  summarisation: { enabled: true, threshold: 12000 },
  tracing: { enabled: true },
});
```

3. Serve with Agent Server
Register the agent and expose it as a REST API with streaming support, persistent storage, and auto-generated Swagger docs.
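The `auth: { type: "bearer" }` option in the configuration below corresponds to the usual pattern of validating the `Authorization` header on each request. A framework-agnostic sketch of that check (the function name and shape here are illustrative, not the server's internals):

```typescript
// Minimal bearer-token check. Production code should also use a
// constant-time comparison and return proper 401 responses.
function checkBearer(
  headers: Record<string, string>,
  expectedToken: string
): boolean {
  const value = headers["authorization"] ?? "";
  const [scheme, token] = value.split(" ");
  return scheme === "Bearer" && token === expectedToken;
}
```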
```typescript
import express from "express";
import {
  createAgentServer,
  createPostgresProvider,
  createExpressMiddleware,
} from "@cognipeer/agent-server";

const storage = createPostgresProvider({
  connectionString: process.env.DATABASE_URL!,
});

const server = createAgentServer({
  basePath: "/api/agents",
  storage,
  swagger: { enabled: true, path: "/docs" },
  auth: { enabled: true, type: "bearer" },
});

// Register the agent
server.registerSDKAgent("research-assistant", researchAgent, {
  description: "A research assistant that searches and synthesises",
  version: "1.0.0",
});

const app = express();
app.use(express.json());
await storage.connect();
app.use(createExpressMiddleware(server));
app.listen(3000, () => console.log("Agent server on :3000"));
```

4. Connect Chat UI
Add a conversational interface with real-time streaming, tool visibility, and conversation history.
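Streaming chat UIs commonly consume server-sent events (SSE) under the hood; whether Cognipeer uses SSE specifically is an assumption here, but the parsing pattern is worth knowing if you ever build a custom client:

```typescript
// Extract the `data:` payloads from an SSE chunk — the usual transport
// for token-by-token streaming. (SSE usage by the Agent Server is assumed.)
function parseSseData(chunk: string): string[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data: "))
    .map((line) => line.slice("data: ".length));
}
```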
```tsx
import { Chat } from "@cognipeer/chat-ui";
import "@cognipeer/chat-ui/styles.css";

function ResearchApp() {
  return (
    <div style={{ height: "100vh" }}>
      <Chat
        baseUrl="http://localhost:3000/api/agents"
        agentId="research-assistant"
        authorization="Bearer your-token"
        theme="dark"
        enableFileUpload={true}
      />
    </div>
  );
}
```

Result
You now have a complete autonomous research agent that:
- Plans complex tasks using a TODO list
- Executes web searches and summarisation tools
- Streams real-time responses to the user
- Persists conversations in PostgreSQL
- Traces every step for observability
- Exposes interactive API docs at `/docs`
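To exercise the deployed agent without the Chat UI, any HTTP client will do. The route shape (`/:agentId/chat`) and body format below are guesses, so check the generated Swagger docs at `/docs` for the actual contract:

```typescript
// Build a request against the agent API. Path and payload shape are
// hypothetical — consult the Swagger docs at /docs for the real contract.
function buildChatRequest(
  baseUrl: string,
  agentId: string,
  message: string,
  token: string
) {
  return {
    url: `${baseUrl}/${agentId}/chat`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${token}`,
      },
      body: JSON.stringify({ message }),
    },
  };
}

// Usage (assuming the route above exists):
// const { url, init } = buildChatRequest(
//   "http://localhost:3000/api/agents", "research-assistant", "What is RAG?", "your-token");
// const res = await fetch(url, init);
```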