Fireworks AI MCP Server

Overview

The Fireworks AI MCP Server provides access to Fireworks' optimized model inference. It supports fast generation, function calling, and batch inference for open-source models.
Tools & Capabilities

- chat_completion: Generate text with optimized inference
- embed: Generate embeddings
- list_models: List available models
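
The sketch below shows one way these tools could be called from a Model Context Protocol client over stdio, using the official TypeScript SDK (@modelcontextprotocol/sdk). It is a minimal example, not taken from the project's docs: the argument names passed to chat_completion (model, messages) and the model id are assumptions, so check the tool schemas returned by listTools() for the server's actual parameters.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server over stdio, the same way the config in "Example Usage" does.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "mcp-server-fireworks-ai"],
  });

  const client = new Client({ name: "fireworks-example", version: "0.1.0" });
  await client.connect(transport);

  // Discover the exposed tools (chat_completion, embed, list_models) and their schemas.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Call chat_completion. The "model" and "messages" argument names are assumptions;
  // consult the inputSchema from listTools() for the real parameter names.
  const result = await client.callTool({
    name: "chat_completion",
    arguments: {
      model: "accounts/fireworks/models/llama-v3p1-8b-instruct", // example model id
      messages: [{ role: "user", content: "Say hello in one sentence." }],
    },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);
```

The embed and list_models tools can be called the same way with callTool; only the tool name and arguments change.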

Installation

```bash
npx -y mcp-server-fireworks-ai
```
Example Usage

Register the server in your MCP client configuration:

```json
{
  "mcpServers": {
    "fireworks-ai": {
      "command": "npx",
      "args": ["-y", "mcp-server-fireworks-ai"]
    }
  }
}
```
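
The server talks to the Fireworks inference API, so it presumably needs an API key. A common pattern is to pass secrets through an env block in the same config; the FIREWORKS_API_KEY variable name below is an assumption, so confirm the exact name the server reads in the project README.

```json
{
  "mcpServers": {
    "fireworks-ai": {
      "command": "npx",
      "args": ["-y", "mcp-server-fireworks-ai"],
      "env": {
        "FIREWORKS_API_KEY": "<your-fireworks-api-key>"
      }
    }
  }
}
```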

Quick Info

- Author: fireworks-ai
- Language: TypeScript
- Status: Stable
- Stars: 70
- Last Updated: Feb 12, 2026
