API keys and integration settings. Secrets are never sent back to the browser.
Each squad PO needs its own Slack app. Each app has a unique @mention alias.
SLACK_PO_LLM: Claude requires a valid API key; all other providers require Ollama running locally.
Saved to the database as SLACK_PO_INSTANCES.
Required for Upload / Load / Database source mode. Uses DATABASE_URL from above if left blank.
Your token budget for this deployment. Credits are managed by your admin.
Configure concurrent job slots and rate limits. Changes apply immediately to the running process.
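A minimal sketch of how concurrent job slots could be enforced, assuming a semaphore-based limiter; the class and method names here are hypothetical, not the product's actual API.

```python
import threading

class JobSlots:
    """Caps how many jobs run at once; the limit comes from settings."""

    def __init__(self, limit: int):
        self._sem = threading.Semaphore(limit)

    def run(self, fn):
        # Blocks until a slot is free, then executes the job.
        with self._sem:
            return fn()
```

A rate limit would layer a token bucket on top of the same slot check; both read their limits from the settings store so edits apply to the running process.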
Choose which events trigger outbound notifications. Configure Slack or webhook targets per event.
Set a default Slack Incoming Webhook URL and/or a generic webhook endpoint. These are used when you enable Slack or Webhook for any event below.
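For reference, a Slack Incoming Webhook accepts a JSON body with a `text` field. A hedged sketch of the outbound call (the function names and message format are illustrative, not the product's code):

```python
import json
import urllib.request

def build_slack_payload(event: str, detail: str) -> dict:
    """Minimal JSON body accepted by a Slack Incoming Webhook."""
    return {"text": f"*{event}*: {detail}"}

def notify(webhook_url: str, event: str, detail: str) -> None:
    """POST the payload to the configured webhook URL."""
    body = json.dumps(build_slack_payload(event, detail)).encode("utf-8")
    req = urllib.request.Request(
        webhook_url, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)  # raises on a non-2xx response
```

The generic webhook endpoint would receive the same payload, so one dispatch path serves both targets.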
Create and manage Nova user accounts.
Control how each agent is allowed to run. Changes are saved to the database and take effect within 30 seconds.
supervised — runs normally under human decision gates · allowed — runs with no restrictions · dry_run — agent is skipped entirely (the pipeline stops at that stage)
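The three run modes could be modeled as a small enum; this is an illustrative sketch, not the product's schema, and the helper name is hypothetical.

```python
from enum import Enum

class AgentMode(str, Enum):
    SUPERVISED = "supervised"  # runs normally
    ALLOWED = "allowed"        # runs with no restrictions
    DRY_RUN = "dry_run"        # agent is skipped; pipeline stops

def should_run(mode: AgentMode) -> bool:
    """Only dry_run prevents the agent from executing."""
    return mode is not AgentMode.DRY_RUN
```

Storing the string value (not the enum) in the database keeps the 30-second settings reload trivial.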
User invitations and access management.
Generate a one-time link (valid for 1 hour) or send a direct email invitation.
Opens your default mail app with a pre-filled invitation. A new link is generated automatically if none exists yet.
Nova Pool Analytics for Company A
Visual blueprints of the system's core design — memory, solution, communication, and LLM strategy.
Two-tier persistent memory: PostgreSQL + pgvector as primary store with JSON fallback. Embeddings generated via Ollama's nomic-embed-text model (768 dimensions).
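A sketch of the JSON fallback tier, assuming cosine-similarity search over stored embeddings (the real primary tier would use pgvector's distance operators in SQL; class and field names here are hypothetical):

```python
import json
import math
from pathlib import Path

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class JsonMemoryFallback:
    """Fallback tier: memories and embeddings persisted to a flat JSON file."""

    def __init__(self, path: str):
        self.path = Path(path)
        self.rows = json.loads(self.path.read_text()) if self.path.exists() else []

    def add(self, text: str, embedding: list[float]) -> None:
        self.rows.append({"text": text, "embedding": embedding})
        self.path.write_text(json.dumps(self.rows))

    def search(self, query_embedding: list[float], k: int = 3) -> list[str]:
        ranked = sorted(
            self.rows,
            key=lambda r: cosine(query_embedding, r["embedding"]),
            reverse=True,
        )
        return [r["text"] for r in ranked[:k]]
```

In production the embeddings would be the 768-dimensional vectors from nomic-embed-text; the toy 2-dimensional vectors below only exercise the ranking logic.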
End-to-end pipeline from file drop or API to a delivered feature — watcher, queue, orchestrator, gated agent stages, and dual-write persistence.
Queue-based async messaging with a 2-second worker poll loop. Human decision gates block stage progression until resolved via the dashboard.
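The poll loop and human gate can be sketched roughly as follows, assuming a thread-based worker (names and signatures are illustrative, not the product's code):

```python
import queue
import threading

# Flipped by the dashboard when a human approves the pending decision.
GATE_APPROVED = threading.Event()

def worker(jobs: "queue.Queue[str | None]", done: list, poll_seconds: float = 2.0):
    """Poll the queue; block at the human gate before advancing the stage."""
    while True:
        try:
            job = jobs.get(timeout=poll_seconds)  # the 2-second poll loop
        except queue.Empty:
            continue  # nothing queued; poll again
        if job is None:
            break  # sentinel: shut down cleanly
        GATE_APPROVED.wait()  # human decision gate blocks progression
        done.append(job)
```

In the real pipeline the gate state would live in the database so the dashboard can resolve it; an `Event` stands in for that here.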
Unified AI client routes per-agent LLM calls to either Claude (Anthropic cloud) or Ollama (local). Each agent's provider is configurable per feature.
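A minimal sketch of per-agent provider routing, with placeholder backends standing in for the real Anthropic and Ollama clients (all names here are assumptions for illustration):

```python
from typing import Callable

def call_claude(prompt: str) -> str:
    return f"[claude] {prompt}"  # placeholder for an Anthropic API call

def call_ollama(prompt: str) -> str:
    return f"[ollama] {prompt}"  # placeholder for a local Ollama call

PROVIDERS: dict[str, Callable[[str], str]] = {
    "claude": call_claude,
    "ollama": call_ollama,
}

class UnifiedClient:
    """Routes each agent's LLM calls to its configured provider."""

    def __init__(self, agent_providers: dict[str, str], default: str = "ollama"):
        self.agent_providers = agent_providers  # per-agent, set per feature
        self.default = default

    def complete(self, agent: str, prompt: str) -> str:
        provider = self.agent_providers.get(agent, self.default)
        return PROVIDERS[provider](prompt)
```

Because the mapping is plain data, re-reading it per feature gives the per-feature configurability described above without touching the client code.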