Morph + Vercel AI SDK

Stream code edits at 4500+ tokens/second using the Vercel AI SDK with Morph’s fast apply model. Use Vercel’s AI Gateway for unified billing, rate limits, and failover across 100+ AI models.

Setup

Option 1: Vercel AI Gateway

  1. Get an AI Gateway API key from Vercel
  2. Add it to your environment variables as OPENAI_API_KEY
  3. Install the AI SDK and the OpenAI-compatible provider used in the implementation below:
npm install ai@beta @ai-sdk/openai

Option 2: Direct API

  1. Get a Morph API key from the Morph dashboard
  2. Add it to your environment variables as MORPH_API_KEY
  3. Install the AI SDK and the OpenAI-compatible provider used in the implementation below:
npm install ai@beta @ai-sdk/openai
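
With Option 2, the route handler shown under Implementation stays the same; only the provider configuration changes so requests go straight to Morph instead of through the gateway. A minimal sketch, assuming Morph exposes an OpenAI-compatible endpoint at https://api.morphllm.com/v1 (verify the exact base URL in the Morph dashboard):

import { createOpenAI } from '@ai-sdk/openai'

// Point the OpenAI-compatible provider directly at Morph's API
// (base URL assumed; confirm it against your Morph dashboard)
const morph = createOpenAI({
  apiKey: process.env.MORPH_API_KEY!,
  baseURL: 'https://api.morphllm.com/v1',
})

// Use this in place of the gateway provider shown below
const model = morph('morph-v3-fast')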

Implementation

import { streamText } from 'ai'
import { createOpenAI } from '@ai-sdk/openai'

const openai = createOpenAI({
  apiKey: process.env.OPENAI_API_KEY!,
  baseURL: 'https://gateway.ai.vercel.com/v1',
  headers: {
    'X-Vercel-AI-Provider': 'morph',
  },
})

export async function POST(req: Request) {
  // Expect the original file, the edit instructions, and the edit snippet to merge
  const { originalCode, editInstructions, codeEdit } = await req.json()

  // Get the morph model through AI Gateway
  const model = openai('morph-v3-fast')

  // Call the language model with the prompt
  const result = streamText({
    model,
    messages: [
      {
        role: 'user',
        // Morph's apply format: the instruction, the original code, and the edit snippet to merge
        content: `<instruction>${editInstructions}</instruction>\n<code>${originalCode}</code>\n<update>${codeEdit}</update>`
      }
    ],
    topP: 1,
  })

  // Respond with a plain text streaming response
  return result.toTextStreamResponse()
}
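
For reference, here is a hypothetical client-side call to this route; the /api/apply path, the sample payload, and the codeEdit snippet are placeholders for illustration. It reads the merged code from the plain text stream returned above:

// Hypothetical client-side usage of the route above
async function applyEdit(): Promise<string> {
  const res = await fetch('/api/apply', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      originalCode: 'function add(a, b) {\n  return a + b\n}',
      editInstructions: 'Add an optional third argument that defaults to 0',
      codeEdit: 'function add(a, b, c = 0) {\n  return a + b + c\n}',
    }),
  })

  // Read the streamed text chunks as they arrive
  const reader = res.body!.getReader()
  const decoder = new TextDecoder()
  let mergedCode = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    mergedCode += decoder.decode(value, { stream: true })
  }
  return mergedCode
}
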
That’s it! Stream fast code edits with Morph using the Vercel AI SDK.