Use scenarios

How to use tunr and Vibecoder features in real workflows.

Share a local app for a client demo

Expose your local dev server and share the URL. No deploy, no config.

tunr share -p 3000

Share the printed URL (e.g. https://abc123.tunr.sh) with your client.

Feedback widget: collect visual and error feedback

Viewers can pin feedback and report JS errors; you see them in the dashboard and terminal.

  1. Start the tunnel with the widget: tunr share -p 3000 --inject-widget
  2. Share the URL with your client.
  3. They click the feedback control on the page, highlight an element or describe an issue, and submit.
  4. Open Dashboard → Feedback to see submissions and mark them resolved.
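The flow above boils down to a small amount of per-submission state that the dashboard tracks. This sketch models it in Python; the field and class names (`Feedback`, `FeedbackBoard`, `element`, `resolved`) are illustrative assumptions, not tunr's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Feedback:
    """One widget submission: what was highlighted and what was said."""
    element: str           # e.g. a CSS selector for the pinned element
    message: str           # the reporter's description of the issue
    resolved: bool = False

@dataclass
class FeedbackBoard:
    """Minimal stand-in for the Dashboard -> Feedback view."""
    items: list = field(default_factory=list)

    def submit(self, element: str, message: str) -> Feedback:
        fb = Feedback(element, message)
        self.items.append(fb)
        return fb

    def open_items(self) -> list:
        """Submissions not yet marked resolved."""
        return [fb for fb in self.items if not fb.resolved]

board = FeedbackBoard()
fb = board.submit("#checkout", "Button overlaps the footer on mobile")
fb.resolved = True           # "mark them resolved" in the dashboard
print(len(board.open_items()))  # 0
```

Marking an item resolved hides it from the open list but keeps it in history, which matches the resolve-not-delete workflow described above.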

Demo mode + freeze: safe, read-only demos

Prevent writes and serve the last successful response if your local server crashes.

tunr share -p 3000 --demo --freeze

  • --demo: Intercepts POST, PUT, and DELETE requests so nothing is written.
  • --freeze: If the app crashes, the last successful response is served.
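Conceptually, the two flags compose like a small proxy layer in front of your app. This Python sketch shows the decision order under that assumption; the function names, cache shape, and error handling are illustrative, not tunr internals:

```python
WRITE_METHODS = {"POST", "PUT", "DELETE"}

def handle(method, path, call_upstream, cache, demo=True, freeze=True):
    """Proxy one request: block writes in demo mode, and replay the
    last good response if the upstream app is down and freeze is on."""
    if demo and method in WRITE_METHODS:
        return (403, "demo mode: writes are blocked")
    try:
        status, body = call_upstream(method, path)
    except ConnectionError:
        if freeze and path in cache:
            return cache[path]       # last successful response for this path
        return (502, "upstream unavailable")
    cache[path] = (status, body)     # remember the last success
    return (status, body)

def up(method, path):        # healthy local server
    return (200, "hello")

def down(method, path):      # crashed local server
    raise ConnectionError()

cache = {}
print(handle("GET", "/", up, cache))    # (200, 'hello'), now cached
print(handle("POST", "/", up, cache))   # (403, 'demo mode: writes are blocked')
print(handle("GET", "/", down, cache))  # (200, 'hello'), served from cache
```

Note the ordering: the demo check runs before the upstream call, so a blocked write never reaches your app at all, while freeze only kicks in once the upstream actually fails.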

MCP (Claude, Cursor, Windsurf)

Let AI tools manage tunnels via the Model Context Protocol.

tunr mcp

Configure your AI environment to use the tunr MCP server so agents can start/stop tunnels and open URLs.
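For Claude Desktop, MCP servers are registered in its JSON config file. An entry for tunr would plausibly look like the fragment below; the server name "tunr" and the command/args split are assumptions based on the tunr mcp command above, so check your tool's MCP docs for the exact shape:

```json
{
  "mcpServers": {
    "tunr": {
      "command": "tunr",
      "args": ["mcp"]
    }
  }
}
```

Cursor and Windsurf use a similar mcpServers-style entry in their own settings files.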

Path routing: frontend + API from one URL

Send different paths to different local ports.

tunr share --route /=3000 --route /api=8080

Frontend on 3000, API on 8080, single public URL.
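Route matching like this is typically longest-prefix: /api/users hits the /api route, and everything else falls through to /. The sketch below illustrates that rule in Python; the prefix-matching logic is an assumption about how such routing behaves, not documented tunr internals:

```python
def pick_port(path, routes):
    """Return the local port for the longest route prefix matching path."""
    matches = [
        prefix for prefix in routes
        if prefix == "/"                                   # catch-all route
        or path == prefix                                  # exact match
        or path.startswith(prefix.rstrip("/") + "/")       # sub-path match
    ]
    return routes[max(matches, key=len)]

routes = {"/": 3000, "/api": 8080}
print(pick_port("/api/users", routes))   # 8080
print(pick_port("/index.html", routes))  # 3000
```

The sub-path check appends a trailing slash before comparing, so /apix still routes to the frontend on 3000 rather than accidentally matching /api.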