Setting Up SUNRA MCP Server

The SUNRA Model Context Protocol (MCP) server provides seamless integration with popular development environments, allowing you to access SUNRA’s AI models directly from your editor or IDE. This guide will walk you through setting up the MCP server with various tools.

What is MCP?

Model Context Protocol (MCP) is a standardized way for AI assistants to securely access external resources and tools. The SUNRA MCP server allows you to:
  • List and search AI models available on SUNRA
  • Submit requests to any model endpoint
  • Check status and retrieve results from the queue system
  • Upload files and manage model schemas
  • Manage API authentication seamlessly

Prerequisites

Before setting up the MCP server, ensure you have:
  1. Node.js installed (version 18 or higher)
  2. SUNRA API key from your dashboard
  3. Your preferred editor/IDE installed
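You can confirm the Node.js requirement from a terminal; node and npx are installed together, and npx is what the editor configurations below rely on:
node --version   # should report v18 or later
npx --version    # confirms npx is available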

Installation

The SUNRA MCP server is available as an npm package:
npm install -g @sunra/mcp-server
Or use it directly with npx (recommended):
npx @sunra/mcp-server
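If you install globally, you can confirm the package is on your PATH with npm; with npx there is nothing to install up front, since the package is fetched on demand the first time your editor launches the server:
npm list -g @sunra/mcp-server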

Configuration by Editor

Cursor

Cursor supports MCP through its configuration file. Create or update .cursor/mcp.json in your project root:
{
  "mcpServers": {
    "sunra-mcp-server": {
      "command": "npx",
      "args": ["@sunra/mcp-server"]
    }
  }
}
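If you would rather keep the API key in this file than in your shell environment, the same entry can carry an env block, mirroring the VS Code example later in this guide (replace the placeholder with your own key):
{
  "mcpServers": {
    "sunra-mcp-server": {
      "command": "npx",
      "args": ["@sunra/mcp-server"],
      "env": {
        "SUNRA_KEY": "your-api-key-here"
      }
    }
  }
}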
Alternatively, add a global configuration in your user settings:
  1. Open Cursor Settings
  2. Navigate to “MCP Servers”
  3. Add a new server with:
    • Name: sunra-mcp-server
    • Command: npx
    • Args: @sunra/mcp-server

VS Code

For VS Code with MCP support (requires a compatible MCP extension), create .vscode/mcp.json:
{
  "mcpServers": {
    "sunra-mcp-server": {
      "command": "npx",
      "args": ["@sunra/mcp-server"],
      "env": {
        "SUNRA_KEY": "your-api-key-here"
      }
    }
  }
}

Cline

Cline supports MCP servers through its settings. Add to your Cline configuration:
{
  "mcpServers": {
    "sunra": {
      "command": "npx",
      "args": ["@sunra/mcp-server"],
      "description": "SUNRA AI model access"
    }
  }
}

Windsurf

For Windsurf IDE, configure MCP in the workspace settings:
{
  "mcp": {
    "servers": {
      "sunra-mcp-server": {
        "command": "npx",
        "args": ["@sunra/mcp-server"],
        "timeout": 30000
      }
    }
  }
}

Claude Desktop

Add to your Claude Desktop configuration file:
  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "sunra-mcp-server": {
      "command": "npx",
      "args": ["@sunra/mcp-server"]
    }
  }
}

Other MCP-Compatible Tools

For any other tool that supports MCP, use this general configuration pattern:
{
  "mcpServers": {
    "sunra-mcp-server": {
      "command": "npx",
      "args": ["@sunra/mcp-server"],
      "env": {
        "SUNRA_KEY": "${SUNRA_KEY}"
      }
    }
  }
}
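The ${SUNRA_KEY} placeholder assumes the key is already present in the environment of the process that launches your tool, and whether such placeholders are expanded at all varies by tool, so check its MCP documentation. One way to make sure the variable is visible (assuming you start the editor from a terminal) is:
export SUNRA_KEY="your-api-key-here"
code .   # or your editor's own CLI launcher, run from the same shell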

Environment Setup

Setting Your API Key

You can configure your SUNRA API key in several ways:

Option 1: Environment Variable

Set the key in your shell on macOS or Linux:
export SUNRA_KEY="your-api-key-here"
For Windows (Command Prompt):
set SUNRA_KEY=your-api-key-here
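The commands above only apply to the current shell session. To persist the key, append it to your shell profile (assuming zsh or bash on macOS/Linux; adjust the file name to your shell), or use setx on Windows:
echo 'export SUNRA_KEY="your-api-key-here"' >> ~/.zshrc
setx SUNRA_KEY "your-api-key-here"
New terminals, and editors launched from them, will then inherit the variable.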

Option 2: Configuration File

Some editors allow you to set environment variables directly in the MCP configuration:
{
  "mcpServers": {
    "sunra-mcp-server": {
      "command": "npx",
      "args": ["@sunra/mcp-server"],
      "env": {
        "SUNRA_KEY": "your-api-key-here"
      }
    }
  }
}

Option 3: Runtime Configuration

The MCP server also supports setting the API key at runtime using the set-sunra-key tool.
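For example, once the server is connected you can ask your assistant:
Use the set-sunra-key tool to set my SUNRA API key to your-api-key-here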

Available MCP Tools

Once configured, you’ll have access to these tools through your AI assistant:

Model Management

  • list-models - Browse all available AI models
  • search-models - Find models by keywords
  • model-schema - Get input/output schemas for specific models

Request Management

  • submit - Submit requests to model endpoints
  • status - Check request status and logs
  • result - Retrieve completed results
  • cancel - Cancel running requests
  • subscribe - Submit and wait for completion

File Management

  • upload - Upload files to SUNRA storage

Usage Examples

Listing Available Models

Use the list-models tool to show me what AI models are available.
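
Getting a Model Schema

Before submitting a request, you can ask for an endpoint's expected inputs (using the same example endpoint as below):
Use the model-schema tool to show the input schema for black-forest-labs/flux-1.1-pro/text-to-image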

Generating an Image

Use the submit tool to generate an image with the black-forest-labs/flux-1.1-pro/text-to-image endpoint. 
Use the prompt: "A serene mountain landscape at sunset"

Checking Request Status

Check the status of request ID: pd_xxxxxx
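
Retrieving a Result

Once a request reports as completed, ask for its output with the result tool; the request ID here uses the same placeholder format as above:
Use the result tool to retrieve the output for request ID: pd_xxxxxx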

Troubleshooting

Common Issues

MCP Server Not Found
  • Ensure Node.js is installed and accessible
  • Try installing globally: npm install -g @sunra/mcp-server
  • Verify the command path is correct
Authentication Errors
  • Check that your SUNRA_KEY environment variable is set
  • Verify your API key is valid at SUNRA Dashboard
  • Try setting the key using the set-sunra-key tool
Connection Timeouts
  • Increase timeout values in your configuration
  • Check your internet connection
  • Verify SUNRA API status
Permission Errors
  • Ensure proper file permissions for configuration files
  • Try running with appropriate user permissions

Getting Help

If you encounter issues:
  1. Check the SUNRA Documentation
  2. Review your editor’s MCP documentation
  3. Raise an issue on GitHub

Next Steps

Once your MCP server is configured:
  1. Explore Models: Use list-models to see all available AI capabilities
  2. Try Examples: Start with simple text-to-image or text-to-video generations
  3. Build Workflows: Combine multiple models for complex AI pipelines
  4. Monitor Usage: Track your API usage in the SUNRA Dashboard
The MCP integration makes it easy to incorporate powerful AI models directly into your development workflow, enabling rapid prototyping and seamless AI-powered features in your applications.