
Quick Start Guide

Get up and running with Shinzo Platform in just a few minutes. This guide will walk you through instrumenting your MCP server and viewing product insights in the dashboard.

Prerequisites

Choose the setup that matches your MCP server's language: TypeScript/JavaScript, Python, or both. The examples in this guide use the TypeScript/JavaScript SDK.

Step 1: Install the SDK

Install the Shinzo instrumentation SDK in your MCP server project:
npm install @shinzolabs/instrumentation-mcp

Step 2: Get Your Ingest Token

  1. Sign up or log in at app.shinzo.ai.
  2. Navigate to Settings → Ingest Tokens.
  3. Copy your ingest token (it will look like abc123def456...).

Step 3: Instrument Your Server

Add telemetry to your MCP server with just a few lines of code:
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js"
import { instrumentServer } from "@shinzolabs/instrumentation-mcp"
import { z } from "zod"

const server = new McpServer({
  name: "my-awesome-server",
  version: "1.0.0",
  description: "My MCP server with telemetry"
})

// Add this after creating your server
const telemetry = instrumentServer(server, {
  serverName: "my-awesome-server",
  serverVersion: "1.0.0",
  exporterEndpoint: "https://api.app.shinzo.ai/telemetry/ingest_http",
  exporterAuth: {
    type: "bearer",
    token: "your-ingest-token-here" // Replace with your actual token
  }
})

// Continue with your existing server setup
server.tool(
  "example_tool",
  "An example tool",
  { message: z.string() }, // input schema as a zod shape
  async ({ message }) => ({
    content: [{ type: "text", text: `Hello, ${message}!` }]
  })
)
Replace your-ingest-token-here with your actual ingest token from Step 2.
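
Hard-coding the token is fine for a quick test, but it is easy to leak into version control. Below is a minimal sketch of reading it from an environment variable instead, assuming you export it as SHINZO_TOKEN (the same variable the complete example further down uses):

// `server` is the McpServer created above
const token = process.env.SHINZO_TOKEN
if (!token) {
  // Fail fast instead of exporting telemetry without credentials
  throw new Error("SHINZO_TOKEN environment variable is not set")
}

const telemetry = instrumentServer(server, {
  serverName: "my-awesome-server",
  serverVersion: "1.0.0",
  exporterEndpoint: "https://api.app.shinzo.ai/telemetry/ingest_http",
  exporterAuth: { type: "bearer", token }
})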

Step 4: Start Your Server

Run your instrumented MCP server as usual. Telemetry data will automatically be collected and sent to Shinzo whenever tools are executed.
Support for other capabilities like Resources, Prompts, and User Elicitation is coming soon!
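
To confirm telemetry is flowing, you can generate a little test traffic by calling a tool once from a small client script. Here is a minimal sketch using the MCP TypeScript client, assuming your instrumented server's entry point is server.js (adjust the command and tool name to match your setup):

import { Client } from "@modelcontextprotocol/sdk/client/index.js"
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js"

async function main() {
  // Launch the instrumented server as a subprocess and talk to it over stdio
  const transport = new StdioClientTransport({
    command: "node",
    args: ["server.js"] // assumption: your server entry point
  })

  const client = new Client({ name: "telemetry-smoke-test", version: "1.0.0" })
  await client.connect(transport)

  // Each tool call produces telemetry on the server side
  const result = await client.callTool({
    name: "example_tool",
    arguments: { message: "world" }
  })
  console.log(result)

  await client.close()
}

main().catch(console.error)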

Step 5: View Your Data

  1. Go to app.shinzo.ai.
  2. Navigate to the Dashboard to see real-time metrics.
  3. Check the Traces page to see individual tool executions.
  4. Visit Resources to monitor server performance.

Example: Complete MCP Server

Here’s a complete example of an instrumented MCP server:
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js"
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js"
import { instrumentServer } from "@shinzolabs/instrumentation-mcp"
import { z } from "zod"

const server = new McpServer({
  name: "weather-server",
  version: "1.0.0",
  description: "Weather information MCP server"
})

// Instrument the server
const telemetry = instrumentServer(server, {
  serverName: "weather-server",
  serverVersion: "1.0.0",
  exporterEndpoint: "https://api.app.shinzo.ai/telemetry/ingest_http",
  exporterAuth: {
    type: "bearer",
    token: process.env.SHINZO_TOKEN
  }
})

// Add a weather tool
server.tool(
  "get_weather",
  "Get weather for a city",
  { city: z.string() }, // "city" is required because the schema is not optional
  async ({ city }) => {
    // Simulate an API call
    const weather = {
      city,
      temperature: Math.round(Math.random() * 30 + 10),
      condition: "sunny"
    }

    return {
      content: [
        {
          type: "text",
          text: `Weather in ${weather.city}: ${weather.temperature}°C, ${weather.condition}`
        }
      ]
    }
  }
)

// Start the server over stdio
async function main() {
  const transport = new StdioServerTransport()
  await server.connect(transport)
}

main().catch(console.error)
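
Because the exporter reads process.env.SHINZO_TOKEN, set that variable before launching the server (for example in your shell or in your MCP client's server configuration); otherwise the exporter has no bearer token to authenticate with.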

What Gets Collected?

By default, Shinzo Platform collects:
  • Tool execution metrics: Response times, success/failure rates
  • Error tracking: Exceptions and error rates
  • Tool arguments: Not collected by default (privacy-first)
  • Response data: Not collected by default
  • PII: Automatically sanitized when detected

Next Steps

Now that your server is instrumented:

Troubleshooting

Not seeing data?

  1. Check your ingest token: Make sure it’s correct and active.
  2. Verify network access: Ensure your server can reach api.app.shinzo.ai.
  3. Check console logs: Switch the exporter to exporterType: "console" and inspect the local logs for anomalies (see the sketch below).
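
For example, you can temporarily swap the HTTP exporter for the console exporter to confirm that spans are being produced at all. A minimal sketch (option combinations can vary between SDK versions, so check the SDK reference):

const telemetry = instrumentServer(server, {
  serverName: "my-awesome-server",
  serverVersion: "1.0.0",
  // Print telemetry to stdout instead of sending it to Shinzo
  exporterType: "console"
})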

Performance concerns?

The SDK is designed to be lightweight with minimal performance impact:
  • Telemetry is exported in batches
  • Network calls are non-blocking
  • Sampling can be configured to reduce data volume (see the sketch below)
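
The exact sampling option depends on the SDK version; as a sketch only, with samplingRatio as a hypothetical placeholder name, it would look something like:

const telemetry = instrumentServer(server, {
  serverName: "my-awesome-server",
  serverVersion: "1.0.0",
  exporterEndpoint: "https://api.app.shinzo.ai/telemetry/ingest_http",
  exporterAuth: { type: "bearer", token: process.env.SHINZO_TOKEN },
  // Hypothetical option name: keep roughly 25% of telemetry to cut volume.
  // Check the SDK reference for the actual sampling setting.
  samplingRatio: 0.25
})
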
Need help? Contact us at [email protected]