Executive Intelligence System is a multi-agent assistant that routes user requests through a Groq-powered supervisor into specialized worker agents for email, calendar, research, notes, and reporting. The system uses LangGraph for orchestration, FastAPI for the backend API, React + Vite for the UI, and PostgreSQL checkpointing for thread history and cross-run state persistence.
- Understands a freeform request and classifies the intent automatically.
- Routes the request to the correct worker agent using LangGraph conditional edges.
- Preserves thread history so the same conversation can continue across runs.
- Exposes synchronous and streaming execution modes.
- Shows execution trace, conversation history, and agent output in the UI.
- Generates PDF and Excel output files from previous agent runs.
- Uses Groq `llama-3.3-70b-versatile` to classify the current request.
- Routes to the email, calendar, research, notes, or report agent.
- Falls back to deterministic keyword routing if LLM classification fails.
- List recent Gmail messages.
- Read a specific message or the latest message.
- Create a Gmail draft.
- Send an email through Gmail.
- List upcoming Google Calendar events.
- Create events with summary, start time, end time, timezone, description, and location.
- Check calendar availability using Google free/busy queries.
- Run live Tavily web search.
- Summarize results with Groq.
- Return structured sources, key points, and follow-up questions.
- Search and read Notion pages.
- Create new Notion pages.
- Append content blocks to existing pages.
- Generate PDF summaries.
- Generate Excel summaries.
- Build reports from previously produced agent outputs in the same thread.
- Backend: FastAPI, Uvicorn, Python 3.11
- Frontend: React 18, Vite, JavaScript
- Orchestration: LangGraph `StateGraph`
- LLM: Groq via `langchain-groq`
- Search: Tavily
- Persistence: PostgreSQL via `PostgresSaver`
- Messaging and calendar integrations: Gmail API and Google Calendar API
- Notes integration: Notion API
- Report generation: ReportLab and openpyxl
- Retry logic: tenacity
- Containers: Docker and Docker Compose
- The frontend sends a request to `POST /invoke` or `GET /stream`.
- The backend loads prior thread state from PostgreSQL.
- The supervisor classifies the intent.
- LangGraph routes execution to the selected worker node.
- The selected worker agent returns structured JSON into `agent_output`.
- A final response node appends an assistant-style summary to the message history.
- The updated state is checkpointed to PostgreSQL and can later be retrieved through `GET /history/{thread_id}`.
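The lifecycle above can be condensed into a framework-free sketch. Everything here is illustrative: the real code runs these steps as LangGraph nodes with a PostgreSQL checkpointer, and the state keys shown (`messages`, `agent_output`) follow the description above, not the actual schema.

```python
# Minimal, framework-free sketch of the request lifecycle.
def classify(message: str) -> str:
    """Stand-in supervisor: the real one calls Groq with keyword fallback."""
    return "research" if "research" in message.lower() else "notes"

WORKERS = {  # stand-in worker nodes returning structured output
    "research": lambda m: {"summary": f"searched: {m}"},
    "notes": lambda m: {"summary": f"noted: {m}"},
}

def handle_request(state: dict, message: str) -> dict:
    state.setdefault("messages", []).append({"role": "user", "content": message})
    intent = classify(message)                           # supervisor node
    state["agent_output"] = WORKERS[intent](message)     # routed worker node
    summary = f"[{intent}] {state['agent_output']['summary']}"
    state["messages"].append({"role": "assistant", "content": summary})  # finalizer
    return state  # in the real system, checkpointed to PostgreSQL here
```

Calling `handle_request` twice with the same `state` dict mimics how thread history accumulates across runs under a shared `thread_id`.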
executive-intelligence/
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── agents/
│ ├── generated_reports/
│ ├── tests/
│ ├── graph.py
│ ├── main.py
│ ├── memory.py
│ ├── requirements.txt
│ └── state.py
├── docker-compose.yml
├── Dockerfile.backend
├── Dockerfile.frontend
├── frontend/
│ ├── index.html
│ ├── package.json
│ ├── vite.config.js
│ └── src/
├── secrets/
└── README.md
- Docker Desktop
- Python 3.11 if you want to run the backend locally without Docker
- Node.js 20 if you want to run the frontend locally without Docker
- A Groq API key
- A Tavily API key
- A Notion integration token and target parent page ID
- Google OAuth client credentials for Gmail and Calendar
- Copy `.env.example` to `.env` if `.env` does not already exist.
- Fill in the required values.
- Keep `.env` private. It must not be committed.
- The backend reads `.env` automatically at startup.
- `GROQ_API_KEY`: required for supervisor classification and research summarization.
- `TAVILY_API_KEY`: required for research agent live search.
- `NOTION_API_KEY`: required for notes agent access.
- `NOTION_PARENT_PAGE_ID`: required for Notion page creation.
- `GOOGLE_CLIENT_ID`: required for Gmail and Calendar OAuth when the credentials JSON is not mounted.
- `GOOGLE_CLIENT_SECRET`: required for Gmail and Calendar OAuth when the credentials JSON is not mounted.
- `GOOGLE_REDIRECT_URI`: must match the redirect URI configured in Google Cloud Console. Default is `http://localhost:8080/`.
- `DATABASE_URL`: PostgreSQL connection string used by LangGraph checkpointing.
- `REDIS_URL`: Redis connection string used by the stack.
- `VITE_API_BASE_URL`: frontend target for backend API calls.
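As an illustrative startup guard (not part of the codebase), the always-required variables from the list above can be validated before boot; the variable names are from this README, while the function itself is an assumption:

```python
import os

# Variables that are required regardless of how Google OAuth is provided.
REQUIRED = (
    "GROQ_API_KEY", "TAVILY_API_KEY", "NOTION_API_KEY",
    "NOTION_PARENT_PAGE_ID", "DATABASE_URL", "REDIS_URL",
)

def missing_env(env=None) -> list[str]:
    """Return the required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]
```

A startup hook could call `missing_env()` and refuse to boot with a clear error instead of failing later inside an agent.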
You can provide Google OAuth in one of two ways:
- Option 1: set `GOOGLE_CLIENT_ID` and `GOOGLE_CLIENT_SECRET` in `.env`.
- Option 2: place a credentials JSON file at `secrets/google_credentials.json` and keep `GOOGLE_CREDENTIALS_PATH=/app/secrets/google_credentials.json`.
If the JSON file is missing, the application automatically falls back to the env-based OAuth client configuration.
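The resolution order can be sketched as a small decision function. This is illustrative only: the paths and return labels are assumptions, and the real code builds actual credential objects rather than returning strings.

```python
from pathlib import Path

def google_auth_source(token_path: str, credentials_path: str, env: dict) -> str:
    """Illustrative decision order for Google OAuth configuration."""
    if Path(token_path).exists():
        return "token"   # reuse a token saved by a prior consent flow
    if Path(credentials_path).exists():
        return "json"    # mounted client-credentials JSON
    if env.get("GOOGLE_CLIENT_ID") and env.get("GOOGLE_CLIENT_SECRET"):
        return "env"     # env-based OAuth client config (the fallback)
    raise RuntimeError("no Google OAuth configuration found")
```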
- Open Google Cloud Console.
- Enable Gmail API.
- Enable Google Calendar API.
- Create an OAuth client.
- Add this redirect URI exactly: `http://localhost:8080/`.
- Put the client ID and secret into `.env`, or mount the credentials JSON at `secrets/google_credentials.json`.
- Start the stack.
- Trigger a Gmail or Calendar action.
- Watch backend logs with `docker compose logs -f api`.
- Open the Google consent URL printed by the backend.
- Complete the auth flow.
- The token file will be written to `secrets/google_token.json`.
- Gmail and Calendar share the same Google OAuth setup.
- The first successful auth is interactive.
- After token creation, future requests reuse `google_token.json`.
- The API container exposes port `8080` specifically for the Google redirect callback.
- Create an internal integration in Notion.
- Copy the integration token into `NOTION_API_KEY`.
- Share the target parent page with the integration.
- Put that parent page ID into `NOTION_PARENT_PAGE_ID`.
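For orientation, here is a sketch of the request body the notes agent would send to Notion's `pages.create` endpoint. The dict shapes follow Notion's public API; the helper function itself is hypothetical and not part of this codebase.

```python
def page_create_payload(parent_id: str, title: str, content: str) -> dict:
    """Build a Notion pages.create body: parent page, title property,
    and a single paragraph block as the initial content."""
    return {
        "parent": {"page_id": parent_id},
        "properties": {
            "title": {"title": [{"type": "text", "text": {"content": title}}]}
        },
        "children": [{
            "object": "block",
            "type": "paragraph",
            "paragraph": {"rich_text": [{"type": "text", "text": {"content": content}}]},
        }],
    }
```

The `parent_id` here is the value of `NOTION_PARENT_PAGE_ID`; if the integration has not been shared with that page, Notion rejects the request regardless of a valid token.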
From the project root:
docker compose up --build -d

Useful runtime commands:
docker compose ps
docker compose logs -f api
docker compose logs -f frontend
docker compose down

Live service URLs:
- Frontend UI: `http://localhost:5173`
- Backend API docs: `http://localhost:8000/docs`
- Backend OpenAPI JSON: `http://localhost:8000/openapi.json`
- Google OAuth callback port: `http://localhost:8080`
- PostgreSQL: `localhost:5432`
- Redis: `localhost:6379`
python -m venv .venv
.venv\Scripts\activate
pip install -r backend/requirements.txt
uvicorn backend.main:app --reload --host 0.0.0.0 --port 8000

cd frontend
npm install
npm run dev -- --host 0.0.0.0 --port 5173

- Open `http://localhost:5173`.
- Enter a request in the message box.
- Choose `Stream run` for live execution updates or `Sync invoke` for a single final response.
- Watch the right-side trace panel for streamed graph updates.
- Use the left-side history panel to inspect stored checkpoints for the active thread.
- Optionally edit the memory JSON panel before running a request.
Runs the graph synchronously and returns the final state.
Example body:
{
"message": "Research the latest semiconductor export controls",
"thread_id": "thread-001",
"user_id": "executive-user",
"memory": {
"priority": "high"
}
}

Streams state updates as server-sent events.
Example query:
/stream?message=query%3A+latest+AI+policy+changes&thread_id=thread-001&user_id=executive-user
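A client consuming the stream only needs a minimal SSE line parser like the sketch below. The exact shape of each event payload is an assumption (the backend emits LangGraph state updates); only the `data:` line framing is standard SSE.

```python
import json

def parse_sse(lines):
    """Yield parsed JSON payloads from the 'data:' lines of an SSE stream.
    `lines` can be any iterable of decoded text lines, e.g. from
    requests' iter_lines() against GET /stream."""
    for line in lines:
        if line.startswith("data:"):
            payload = line[len("data:"):].strip()
            if payload:  # skip keep-alive / empty data lines
                yield json.loads(payload)
```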
Returns checkpoint history for a thread in reverse chronological order.
query: Compare US and EU AI regulation updates this quarter
Draft an email
to: ops@example.com
subject: Budget review
body: Please send the updated budget deck before 3 PM.
Send an email
to: ops@example.com
subject: Follow up
body: Here is the final update.
Create event
summary: Leadership sync
start: 2026-03-14T10:00:00Z
end: 2026-03-14T10:30:00Z
timezone: UTC
description: Weekly operating review
location: Zoom
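The fields in the prompt above map onto a Google Calendar `events.insert` request body roughly as follows; the field names (`summary`, `start.dateTime`, `start.timeZone`, `description`, `location`) follow the Calendar API, while the helper function itself is illustrative.

```python
def event_body(summary, start, end, timezone, description=None, location=None):
    """Assemble a Google Calendar events.insert body from parsed fields."""
    body = {
        "summary": summary,
        "start": {"dateTime": start, "timeZone": timezone},
        "end": {"dateTime": end, "timeZone": timezone},
    }
    if description:
        body["description"] = description
    if location:
        body["location"] = location
    return body
```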
Check availability
start: 2026-03-14T10:00:00Z
end: 2026-03-14T11:00:00Z
timezone: UTC
Create page
title: Q2 Planning Notes
content: Capture risks, hiring needs, and revenue assumptions.
Append to page
page_id: your-page-id
content: Add another action item and owner.
Generate a PDF and Excel report
title: Weekly Executive Brief
- Thread history is persisted in PostgreSQL.
- Final assistant summaries are appended to the conversation state.
- PDF and Excel files are written under `backend/generated_reports/`.
- The Google OAuth token is written to `secrets/google_token.json` after the first auth.
python -m unittest backend/tests/test_graph_finalizer.py

- Open the UI and confirm it loads.
- Run a research query and confirm a streamed or synchronous result returns.
- Refresh history and confirm checkpoints appear for the same thread.
- Run a report after a successful research request and confirm files are created in `backend/generated_reports/`.
- Run a Notion page search and confirm the notes agent returns structured results.
- Trigger Gmail or Calendar and complete the first OAuth flow.
Use one of these:
- Set `GOOGLE_CLIENT_ID` and `GOOGLE_CLIENT_SECRET` in `.env`.
- Or mount `secrets/google_credentials.json`.
Then restart the API:

docker compose up -d --build api

- Confirm `http://localhost:8080/` is listed in the Google OAuth client redirect URIs.
- Confirm the Gmail API and Google Calendar API are enabled.
- Confirm the first consent flow was completed.
- Confirm `secrets/google_token.json` exists after auth.
- Reuse the same `thread_id` across related requests.
- Confirm the PostgreSQL container is healthy.
- Confirm the backend is reachable at `http://localhost:8000`.
- Reports require at least one previous successful agent output in the same thread.
- Check `backend/generated_reports/` for written files.
- Confirm `NOTION_API_KEY` is valid.
- Confirm `NOTION_PARENT_PAGE_ID` is set.
- Confirm the integration has access to the target Notion page.
- Confirm `GROQ_API_KEY` is valid.
- Confirm `TAVILY_API_KEY` is valid.
- Never commit `.env`.
- Never commit real secret JSON files.
- Rotate any API keys that were ever exposed in git history, logs, screenshots, or pasted templates.
The application is runnable through Docker: the UI is live, the backend endpoints work, thread history is persisted, report generation is wired, and research and other non-interactive flows are operational. Gmail and Calendar require a one-time Google OAuth consent before they become fully operational.