Confidant
Secure secret handoff and credential setup wizard for AI agents. Use when you need sensitive information from the user (API keys, passwords, tokens) or need...
Receive secrets from humans securely — no chat exposure, no copy-paste, no history leaks.
This is a human-in-the-loop process. You CANNOT retrieve the secret yourself.
❌ DO NOT curl/fetch the secret URL yourself — it's a web form for humans
❌ DO NOT skip sharing the URL — the user MUST receive it in chat
❌ DO NOT poll the API to check if the secret arrived — the script does this
❌ DO NOT proceed without confirming the secret was received
✅ Share URL → Wait → Confirm success → Use the secret silently
Run this once to install the CLI globally (avoids slow npx calls):

```bash
bash {skill}/scripts/setup.sh
```

`{skill}` is the absolute path to the directory containing this SKILL.md file. Agents can resolve it at runtime:

```bash
SKILL_DIR=$(find "$HOME" -name "SKILL.md" -path "*/confidant/skill*" -exec dirname {} \; 2>/dev/null | head -1)
# Then use: bash "$SKILL_DIR/scripts/setup.sh"
```
Need an API key from the user? One command:

```bash
bash {skill}/scripts/request-secret.sh --label "OpenAI API Key" --service openai
```
The script handles everything: it starts the server, creates the request, polls until the user submits, saves the secret to ~/.config/openai/api_key (chmod 600), and exits.

If the user is remote (not on the same network), add --tunnel:

```bash
bash {skill}/scripts/request-secret.sh --label "OpenAI API Key" --service openai --tunnel
```
This starts a localtunnel automatically (no account needed) and returns a public URL.
Output example:

```
🔐 Secure link created!
URL: https://gentle-pig-42.loca.lt/requests/abc123
(tunnel: localtunnel | local: http://localhost:3000/requests/abc123)
Save to: ~/.config/openai/api_key
```
Share the URL above with the user. Secret expires after submission or 24h.
Share the URL → user opens it → submits the secret → script saves to disk → done.
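The on-disk convention can be sketched as follows. This is a hypothetical re-implementation for illustration, not the script's actual code; `save_secret` is an invented helper name.

```shell
# Hypothetical sketch of the --service save convention:
# write the secret to ~/.config/<service>/api_key with 0600 permissions.
save_secret() {
  service=$1
  secret=$2
  dir="$HOME/.config/$service"
  mkdir -p "$dir"
  (umask 077; printf '%s' "$secret" > "$dir/api_key")
  chmod 600 "$dir/api_key"
  printf '%s\n' "$dir/api_key"
}
```

For example, `save_secret openai "sk-..."` would write `~/.config/openai/api_key`, readable only by the owner.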
Without `--service` or `--save`, the script still polls and prints the secret to stdout (useful for piping or manual inspection).
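In that mode an agent can capture the secret straight into a shell variable. A minimal sketch, using a placeholder `request_secret` function in place of the real script (which needs a running server and a human submitting the form):

```shell
# Placeholder standing in for a live run of:
#   bash {skill}/scripts/request-secret.sh --label "Password"
request_secret() { printf 'hunter2'; }

# Capture the secret in memory; it never appears in chat or shell history
PASSWORD=$(request_secret)
printf 'received %d characters\n' "${#PASSWORD}"
```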
**request-secret.sh — Request, receive, and save a secret (recommended)**

```bash
# Save to ~/.config/<service>/api_key (convention)
bash {skill}/scripts/request-secret.sh --label "SerpAPI Key" --service serpapi

# Save to explicit path
bash {skill}/scripts/request-secret.sh --label "Token" --save ~/.credentials/token.txt

# Save + set env var
bash {skill}/scripts/request-secret.sh --label "API Key" --service openai --env OPENAI_API_KEY

# Just receive (no auto-save)
bash {skill}/scripts/request-secret.sh --label "Password"

# Remote user — start tunnel automatically
bash {skill}/scripts/request-secret.sh --label "Key" --service myapp --tunnel

# JSON output (for automation)
bash {skill}/scripts/request-secret.sh --label "Key" --service myapp --json
```
| Flag | Description |
|---|---|
| `--label` | Description shown on the web form (required) |
| `--service` | Auto-save to `~/.config/<service>/api_key` |
| `--save` | Auto-save to explicit file path |
| `--env` | Set env var (requires `--service` or `--save`) |
| `--tunnel` | Start localtunnel if no tunnel detected (for remote users) |
| `--port` | Server port (default: 3000) |
| — | Max wait for startup (default: 30) |
| `--json` | Output JSON instead of human-readable text |
**check-server.sh — Server diagnostics (no side effects)**

```bash
bash {skill}/scripts/check-server.sh
bash {skill}/scripts/check-server.sh --json
```
Reports server status, port, PID, and tunnel state (ngrok or localtunnel).
The `request-secret.sh` script blocks until the secret is submitted (it polls continuously). Most agent runtimes (including OpenClaw's exec tool) impose execution timeouts that will kill the process before the user has time to submit.
Always run Confidant inside a tmux session:
```bash
# 1. Start server in tmux
tmux new-session -d -s confidant
tmux send-keys -t confidant "confidant serve --port 3000" Enter

# 2. Create request in a second tmux window
tmux new-window -t confidant -n request
tmux send-keys -t confidant:request "confidant request --label 'API Key' --service openai" Enter

# 3. Share the URL with the user (read from tmux output)
tmux capture-pane -p -t confidant:request -S -30

# 4. After user submits, check the result
tmux capture-pane -p -t confidant:request -S -10
```
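To hand the link to the user programmatically, the captured pane text can be grepped for the request URL. A sketch, where `extract_url` and the sample output shape are illustrative:

```shell
# Illustrative helper: pull the first request URL out of captured pane text
extract_url() {
  grep -oE 'https?://[^[:space:]]+/requests/[A-Za-z0-9]+' | head -n 1
}

# In a live session the input would come from:
#   tmux capture-pane -p -t confidant:request -S -30
sample='🔐 Secure link created!
URL: https://gentle-pig-42.loca.lt/requests/abc123
Save to: ~/.config/openai/api_key'

printf '%s\n' "$sample" | extract_url
```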
Why not `exec`? Agent runtimes typically kill processes after 30-60s. Since the script waits for human input (which can take minutes), it gets SIGKILL before completion. tmux keeps the process alive independently.
If your agent platform supports long-running background processes without timeouts, `exec` with `request-secret.sh` works fine. But when in doubt, use tmux.
- Don't `curl` the Confidant API directly — use the scripts.
- Don't start a tunnel manually — use `--tunnel` instead.
- Use `--tunnel` when the user is remote (not on the same machine/network).
- Use `--service` for API keys — cleanest convention.

Agents can branch on exit codes for programmatic error handling:
| Code | Constant | Meaning |
|---|---|---|
| 0 | — | Success — secret received (saved to disk or printed to stdout) |
| | | `--label` flag not provided |
| | MISSING_DEPENDENCY | Required dependency not installed |
| | | Server failed to start or died during startup |
| | | API returned empty URL — request not created |
| (from CLI) | | Request failed (expired, not found, etc.) |
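The branching pattern can be sketched with a stub standing in for the real script (the stub's exit code here is arbitrary, chosen just to exercise the failure path):

```shell
# Stub standing in for: bash {skill}/scripts/request-secret.sh ...
# Here it pretends the request failed with a nonzero exit code.
request_secret() { return 3; }

if request_secret; then
  status="secret received"
else
  status="request failed (exit $?)"
fi
echo "$status"
```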
With `--json`, all errors include a `"code"` field for programmatic branching:

```json
{ "error": "...", "code": "MISSING_DEPENDENCY", "hint": "..." }
```
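An agent can branch on that `code` field with plain string matching, no JSON parser needed. A sketch using a hard-coded sample payload (the field values are illustrative):

```shell
# Sample --json error payload (field values are illustrative)
resp='{"error":"required tool missing","code":"MISSING_DEPENDENCY","hint":"install it first"}'

# Branch on the "code" field with plain string matching
case "$resp" in
  *'"code":"MISSING_DEPENDENCY"'*) action="install-deps" ;;
  *'"code":'*)                     action="report-error" ;;
  *)                               action="no-error" ;;
esac
echo "$action"
```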
This is what the interaction should look like:
User: Can you set up my OpenAI key?

Agent: I'll create a secure link for you to submit your API key safely.
[runs: request-secret.sh --label "OpenAI API Key" --service openai --tunnel]

Agent: Here's your secure link — open it in your browser and paste your key:
🔐 https://gentle-pig-42.loca.lt/requests/abc123
The link expires after you submit or after 24h.

User: Done, I submitted it.

Agent: ✅ Received and saved to ~/.config/openai/api_key. You're all set!
⚠️ Notice: the agent SENDS the URL and WAITS. It does NOT try to access the URL itself.
- The script runs `confidant request --poll`, which blocks until the secret is submitted.
- With `--service` or `--save`: the secret is saved to disk (chmod 600), then destroyed on the server.
- Without `--service`/`--save`: the secret is printed to stdout, then destroyed on the server.

| Provider | Account needed | How |
|---|---|---|
| localtunnel (default) | No | `--tunnel` flag |
| ngrok | Yes (free tier) | Auto-detected if running on same port |
The script auto-detects both. If neither is running and `--tunnel` is passed, it starts localtunnel.
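The detection can be approximated by probing ngrok's local inspection API, which ngrok serves on 127.0.0.1:4040 by default. Whether the script checks this exact endpoint is an assumption; this is only a sketch of the idea:

```shell
# Probe ngrok's default local inspection API (127.0.0.1:4040)
if curl -fsS --max-time 2 http://127.0.0.1:4040/api/tunnels >/dev/null 2>&1; then
  echo "ngrok detected"
else
  echo "no tunnel detected; pass --tunnel to start localtunnel"
fi
```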
For edge cases not covered by the scripts:
```bash
# Start server only
confidant serve --port 3000 &

# Start server + create request + poll (single command)
confidant serve-request --label "Key" --service myapp

# Create request on running server
confidant request --label "Key" --service myapp

# Submit a secret (agent-to-agent)
confidant fill "<url>" --secret "<value>"

# Check status of a specific request
confidant get-request <id>

# Retrieve a delivered secret (by secret ID, not request ID)
confidant get <secret-id>
```
If `confidant` is not installed globally, run `bash {skill}/scripts/setup.sh` first, or prefix commands with `npx @aiconnect/confidant`.
⚠️ Only use direct CLI if the scripts don't cover your case.