AI Content & Workflows · 8 min read

How to Integrate Groq with Your Headless CMS

Connect Groq to your headless CMS so published content can trigger fast LLM tasks like summaries, translations, classifications, and AI-ready answers.

Published April 29, 2026
01. Overview

What is Groq?

Groq is an AI inference platform for running open models, including Llama, Mixtral, and Gemma, through an OpenAI-compatible API. Teams use Groq when they need low-latency text generation, content classification, extraction, and agent responses inside production workflows. Its core capability is fast LLM inference, not content structuring, so it works best when paired with clean, typed content from a source like Sanity's Content Lake.
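Because the API is OpenAI-compatible, a first call can be sketched with nothing but an HTTP POST to Groq's REST endpoint. This is a minimal sketch, not production code: the model name and prompt are illustrative, `buildChatBody` is a hypothetical helper kept pure so it is easy to test, and `GROQ_API_KEY` must be set in the environment.

```typescript
// Build the request body separately so the pure part can be tested without a network call.
function buildChatBody(model: string, prompt: string) {
  return {model, messages: [{role: 'user', content: prompt}]};
}

// Minimal Groq chat completion over the OpenAI-compatible REST endpoint.
async function groqComplete(prompt: string): Promise<string> {
  const res = await fetch('https://api.groq.com/openai/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.GROQ_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(buildChatBody('llama-3.1-8b-instant', prompt)),
  });
  if (!res.ok) throw new Error(`Groq API error: ${res.status}`);
  const data = await res.json();
  return data.choices[0]?.message?.content ?? '';
}
```

The same request shape works with the official `groq-sdk` package, which is what the rest of this article uses.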


02. The case for integration

Why integrate Groq with a headless CMS?

If your editors publish 50 product updates, docs pages, or campaign articles a week, AI workflows get messy fast. Someone has to paste content into a model, write prompts, copy the result back, check formatting, and repeat it all when the source changes. Connecting Groq to a headless CMS changes that flow. A publish event can send only the fields Groq needs, ask a model for a summary, FAQ, label, translation, or draft response, and write the result back to the content record or ship it to an app.


03. Architecture

Architecture overview

A typical Sanity and Groq integration starts when an editor publishes or updates a document in Sanity Studio. A Sanity webhook fires on that mutation, or a Sanity Function runs directly on the content event. The handler receives the document ID, uses @sanity/client to fetch the full record from the Content Lake, and uses GROQ to select only the fields Groq needs, such as title, body text, product name, locale, category, and related references.

The handler then calls Groq's chat completions API through the groq-sdk package, passing the selected content and a task-specific prompt. Groq returns generated text, JSON, or labels. The handler can patch those results back into Sanity, send them to a frontend API, or cache them for a user-facing experience. The end user sees the result as an article summary, product answer, support response, translated snippet, or AI-generated recommendation, while the source content stays structured in the Content Lake.


04. Use cases

Common use cases

Publish-time summaries

Generate 2-sentence article summaries with Groq the moment a Sanity document is published, then save them to a reviewed field.

Fast content classification

Send titles, body text, and taxonomy references to Groq to suggest topics, audience labels, or routing metadata.

Draft translations

Use Groq to create first-pass localized drafts from structured Sanity fields, then let editors review them in Sanity Studio.

AI support answers

Fetch approved docs from the Content Lake, send compact context to Groq, and return fast answers in a help bot or product assistant.
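For that last pattern, keeping the context compact matters for both latency and token cost. One way to sketch the context-building step, assuming a simple doc shape and a hypothetical character budget (the `DocSnippet` interface and the 4000-character default are illustrative):

```typescript
// Shape of an approved doc fetched from the Content Lake (illustrative).
interface DocSnippet {
  title: string;
  body: string;
}

// Assemble a bounded context block from approved docs before sending it to Groq.
function buildContext(docs: DocSnippet[], maxChars = 4000): string {
  let context = '';
  for (const doc of docs) {
    const entry = `## ${doc.title}\n${doc.body}\n\n`;
    if (context.length + entry.length > maxChars) break; // stay under the budget
    context += entry;
  }
  return context.trimEnd();
}
```

The resulting string can be placed in the system or user message of the Groq call, keeping the prompt small and the answer grounded in approved content.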


05. Implementation

Step-by-step integration

  1. Create a Groq account and API key

    Sign up at GroqCloud, create an API key, and add it to your server environment as GROQ_API_KEY. For a TypeScript project, install the SDK with npm install groq-sdk.

  2. Model the Sanity fields Groq will read and write

    In Sanity Studio, add typed fields for the source content and AI output. For an article, that might include title, body, excerpt, locale, category references, aiSummary, aiTags, and aiStatus.
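A minimal sketch of such a document type as a plain schema object (the aiSummary, aiTags, and aiStatus names match this article; the rest of the fields and the status values are illustrative, so adapt them to your studio):

```typescript
// Article schema with separate, reviewable AI output fields.
// Register this object in your studio's schema types.
const article = {
  name: 'article',
  type: 'document',
  fields: [
    {name: 'title', type: 'string'},
    {name: 'body', type: 'array', of: [{type: 'block'}]},
    {name: 'excerpt', type: 'text'},
    {name: 'locale', type: 'string'},
    {name: 'category', type: 'reference', to: [{type: 'category'}]},
    // AI-written fields, kept apart from editorial fields so output can be reviewed
    {name: 'aiSummary', type: 'text'},
    {name: 'aiTags', type: 'array', of: [{type: 'string'}]},
    {
      name: 'aiStatus',
      type: 'string',
      options: {list: ['pending', 'needsReview', 'approved']},
    },
  ],
};
```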

  3. Write the GROQ query

    Use GROQ to fetch a small, model-ready payload. Join references when needed, for example category->title, author->name, and relatedProducts[]->title, instead of sending an entire document.
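Such a projection might look like this (document and field names are illustrative; pt::text flattens Portable Text to a plain string):

```typescript
// GROQ projection that returns a compact, model-ready payload:
// plain-text body plus joined reference titles, nothing else.
const articleForGroq = `
  *[_id == $id][0]{
    _id,
    title,
    locale,
    "body": pt::text(body),
    "category": category->title,
    "author": author->name,
    "related": relatedProducts[]->title
  }
`;
```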

  4. Create the event trigger

    Use a Sanity webhook filtered to publish events, or use Functions when you want server-side processing without running separate infrastructure. Pass the document ID and type to your handler.
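As a sketch, a GROQ-powered webhook configured in sanity.io/manage could use a filter and projection along these lines (shown here as strings; the article type is illustrative, and the exact filter syntax should be checked against Sanity's webhook docs):

```typescript
// Webhook filter: only fire for article documents.
const webhookFilter = `_type == "article"`;

// Webhook projection: send just the ID and type, and let the handler
// fetch everything else with its own GROQ query.
const webhookProjection = `{ "documentId": _id, "type": _type }`;
```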

  5. Call Groq from your server code

    Use groq-sdk to call chat.completions.create with a specific model, prompt, and response format. Keep the call server-side so your Groq API key never reaches the browser.
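For classification-style tasks, Groq's OpenAI-compatible API also accepts a JSON response format so the model returns parseable output. This sketch uses plain fetch instead of the SDK so the parsing helper stays self-contained; the model, prompt, and label shape are illustrative assumptions:

```typescript
// Ask Groq for topic labels as JSON, then validate the shape before trusting it.
async function classify(title: string, body: string): Promise<string[]> {
  const res = await fetch('https://api.groq.com/openai/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.GROQ_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'llama-3.1-8b-instant',
      response_format: {type: 'json_object'},
      messages: [
        {role: 'system', content: 'Reply with JSON: {"labels": ["..."]}.'},
        {role: 'user', content: `Title: ${title}\nBody: ${body}`},
      ],
    }),
  });
  const data = await res.json();
  return parseLabels(data.choices[0]?.message?.content ?? '{}');
}

// Defensive parsing: model output is untrusted input.
function parseLabels(raw: string): string[] {
  try {
    const parsed = JSON.parse(raw);
    return Array.isArray(parsed.labels)
      ? parsed.labels.filter((l: unknown) => typeof l === 'string')
      : [];
  } catch {
    return [];
  }
}
```

Validating the model's JSON before patching it into Sanity keeps malformed completions from corrupting content fields.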

  6. Test with real editorial cases

    Publish a draft article, confirm the webhook fires, inspect the payload sent to Groq, patch the result back to Sanity, and add editor review states before using the output in production.


06. Code

Code example

app/api/sanity-groq/route.ts (TypeScript)
import {createClient} from '@sanity/client';
import Groq from 'groq-sdk';

// Server-side clients: the write token and Groq key must never reach the browser.
const sanity = createClient({
  projectId: process.env.SANITY_PROJECT_ID!,
  dataset: process.env.SANITY_DATASET!,
  apiVersion: '2025-01-01',
  token: process.env.SANITY_WRITE_TOKEN!,
  useCdn: false, // always read fresh content after a publish event
});

const groq = new Groq({apiKey: process.env.GROQ_API_KEY!});

export async function POST(req: Request) {
  const {documentId} = await req.json();

  // Fetch a compact, model-ready payload: plain-text body plus joined category.
  const article = await sanity.fetch(`
    *[_id == $id][0]{
      _id,
      title,
      "body": pt::text(body),
      "category": category->title
    }
  `, {id: documentId});

  if (!article) return Response.json({ok: false}, {status: 404});

  const completion = await groq.chat.completions.create({
    model: 'llama-3.1-8b-instant',
    messages: [
      {role: 'system', content: 'Write clear editorial summaries.'},
      {role: 'user', content: `Summarize this article in 2 sentences.
Title: ${article.title}
Category: ${article.category}
Body: ${article.body}`}
    ],
  });

  const aiSummary = completion.choices[0]?.message?.content?.trim();

  // Write the result back and flag it for editor review before it ships.
  await sanity.patch(article._id)
    .set({aiSummary, aiStatus: 'needsReview'})
    .commit();

  return Response.json({ok: true, aiSummary});
}

07. Why Sanity

Build your Groq integration on Sanity

Sanity gives you the structured content foundation, real-time event system, and flexible APIs you need to connect Groq to editorial workflows and user-facing AI experiences.


08. Comparison

CMS approaches to Groq

Model-ready content
  Traditional CMS: Content is often mixed with layout, shortcodes, and rendered HTML, so you clean it before sending it to Groq.
  Sanity: Content Lake returns typed JSON, joined references, and Portable Text converted to plain text through GROQ queries.

AI workflow triggers
  Traditional CMS: AI tasks usually run through plugins, cron jobs, or manual copy and paste.
  Sanity: Webhooks or Functions can call Groq on publish, update, or delete events, with server-side logic close to the content change.

Field-level control
  Traditional CMS: Integrations may receive full pages or database records, which increases token use and prompt noise.
  Sanity: GROQ projections select only the fields Groq needs, including referenced categories, authors, and products in one query.

Editorial review
  Traditional CMS: Generated text often lands in plugin-specific fields or external tools, which can make review harder to track.
  Sanity: AI output can be written to schema fields like aiSummary and aiStatus, then reviewed in Sanity Studio with the rest of the content.

Multi-channel delivery
  Traditional CMS: The website is usually the primary destination, and AI outputs may need separate publishing steps for apps or agents.
  Sanity: The same structured content and Groq-generated fields can feed web, mobile, support bots, search experiences, and Agent Context.

09. Next steps

Keep building

Explore related integrations to complete your content stack.

Ready to try Sanity?

See how Sanity's Content Operating System powers integrations with Groq and 200+ other tools.