Most agent issues fall into a few buckets. Here’s how to clear them.

"No API key found"

The agent can’t find a valid API key. Options:
  • Run percio login and paste your key.
  • Set PERCIO_API_KEY in your shell: export PERCIO_API_KEY="pk_...".
  • If both are set, the env var wins.
Generate keys on the dashboard under Integrations → Agent integrations.
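The env-var route, sketched below ("pk_example" is a placeholder, not a real key):

```shell
# Export the key for this shell session; child processes inherit it,
# which is how the agent reads it.
export PERCIO_API_KEY="pk_example"

# Confirm a child process actually sees it (fails loudly if not set):
sh -c 'echo "${PERCIO_API_KEY:?PERCIO_API_KEY is not set}"'
```

Remember the env var takes precedence over a key saved by percio login, so unset it if you want the stored key to apply.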

"Daemon not reachable" or port conflicts

The lockfile is pointing at a daemon that isn’t actually running, or the port is in use by something else. Clear it:
percio stop
percio start
If that doesn’t work, try starting on a different port:
percio stop
percio start --port 5174 -d
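To check whether a port is actually free before retrying, a quick bind test works without any extra tools (5173 is an assumption based on the example above; use the port from your error message):

```shell
# Try to bind the port ourselves: success means nothing is listening on it,
# failure shows what kind of conflict you have.
python3 - <<'PY'
import socket

s = socket.socket()
try:
    s.bind(("127.0.0.1", 5173))
    print("port 5173 is free")
except OSError as exc:
    print(f"port 5173 is busy: {exc}")
finally:
    s.close()
PY
```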

The CLI hangs at startup

Rare, but can happen if something else on your machine is holding the lockfile. Delete it:
rm -f ~/.config/percio/daemon.lock
Then retry.
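Before deleting, it is worth confirming no daemon process is actually alive. A minimal sketch (the "percio" process-name pattern is an assumption; adjust it if your process list shows something else):

```shell
# Remove the stale lockfile only when no percio process is running.
if pgrep -f percio >/dev/null 2>&1; then
  echo "a percio process is still running; stop it first"
else
  rm -f ~/.config/percio/daemon.lock
  echo "stale lockfile removed"
fi
```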

command not found: percio

Your npm global bin directory isn’t on your PATH. Find it:
npm config get prefix
Add $(npm config get prefix)/bin to your shell’s PATH.
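The mechanics, shown with a throwaway directory standing in for $(npm config get prefix)/bin (the stub percio script below is fake, purely to show that command resolution works once the directory is on PATH):

```shell
# Stand-in for the npm global bin directory.
bindir="$(mktemp -d)"
printf '#!/bin/sh\necho ok\n' > "$bindir/percio"
chmod +x "$bindir/percio"

# Prepend it to PATH, exactly as you would with the real npm prefix:
export PATH="$bindir:$PATH"
command -v percio
```

To make the fix permanent, put the export PATH=... line (with the real npm prefix) in your shell profile.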

Cursor doesn’t see the Percio tools

  • Did you restart Cursor? MCP config changes require a reload.
  • Is the JSON valid? A missing comma or bracket breaks the whole file.
  • Is the key correct? Try the same key with percio personas list from your terminal. If that fails, the key is the problem.
  • Is npx installed? It comes with npm — npx --version should print a number.
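To check the JSON-validity point quickly, python3 -m json.tool is enough. The config below is illustrative only (your real file lives wherever Cursor stores MCP settings, commonly ~/.cursor/mcp.json):

```shell
# Write an illustrative config and validate it; a missing comma or
# bracket makes json.tool exit non-zero and report where parsing failed.
cat > /tmp/mcp-sample.json <<'EOF'
{
  "mcpServers": {
    "percio": {
      "command": "npx",
      "args": ["percio-mcp"]
    }
  }
}
EOF
python3 -m json.tool /tmp/mcp-sample.json >/dev/null && echo "valid JSON"
```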

MCP server starts but tools fail

Check what the MCP server actually sees. Run it manually:
PERCIO_API_KEY=pk_... percio-mcp
Any errors print to stderr.
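Since the errors go to stderr, redirect that stream to a file if the output scrolls past too fast. The stand-in below only demonstrates the redirection shape; replace the sh -c line with PERCIO_API_KEY=pk_... percio-mcp:

```shell
# Stand-in for the MCP server; it writes an error line to stderr.
sh -c 'echo "startup error" >&2' 2> /tmp/mcp-errors.log

# Stdout stays clean; the error landed in the log:
cat /tmp/mcp-errors.log
```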

Tests on localhost don’t find my app

  • Is the port right? http://localhost:3000 vs. http://localhost:5173 — make sure the URL matches your dev server.
  • Is the dev server running? Check with your browser first.
  • Is the app behind auth? Scope the scenario to unauthenticated pages, or pre-authenticate the browser context (requires more advanced setup — open an issue if you need this).
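A quick way to answer the first two questions from the terminal. The stand-in server below only exists so the check has something to hit; in practice you would point the URL at your already-running dev server:

```shell
# Stand-in dev server on the example port; skip this if yours is running.
python3 -m http.server 3000 --bind 127.0.0.1 >/dev/null 2>&1 &
srv=$!
sleep 1

# The actual check: does anything answer on that URL?
python3 -c "import urllib.request as u; print(u.urlopen('http://127.0.0.1:3000/').getcode())"

kill "$srv"
```

A 200 means the server is up and the port is right; a connection error means the agent will not reach it either.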

The test abandons immediately

Usually means the agent couldn’t interact with the page:
  • Cookie banner or modal blocking the viewport — include “dismiss any cookie banner first” in your scenario.
  • Heavy client-side routing that takes longer than the agent expects to wait — rerun with --max-steps higher if needed.
  • URL redirected to a login page — see above.

The test runs but results are generic

Almost always a scenario problem. See Writing good scenarios. The agent can only evaluate what you ask it to — a vague scenario produces a vague report.

Where to get help

  • Check core concepts if the terminology is what's tripping you up.
  • Ping Percio support with your test id or the output of the failing command.

What’s next