Generate llms.txt (table of contents) and llms-full.txt (full content) from your app’s pages, and optionally serve any page as Markdown via content negotiation (Accept: text/markdown).
- Standard: llmstxt.org
- Works with: Next.js App Router (`@llmtxt/next`), any Node.js framework (`@llmtxt/core`), and Next.js middleware (`@llmtxt/middleware`)
| Package | What it does |
|---|---|
| `@llmtxt/core` | Scan your `app/` directory and generate `llms.txt` + `llms-full.txt` |
| `@llmtxt/next` | Next.js App Router route handlers for `/llms.txt` and `/llms-full.txt` |
| `@llmtxt/middleware` | Next.js middleware: any page responds as Markdown on `Accept: text/markdown` |
| `@llmtxt/react` | Build-time generation for React SPAs (experimental; still in testing) |
- Quick Start — Next.js (App Router)
- Quick Start — Markdown for Agents (Next.js Middleware)
- Quick Start — React (Build-time, no backend)
- Quick Start — Other Frameworks
- How It Works
- API Reference (`@llmtxt/core`)
- Environment Variables (`@llmtxt/next`)
- Tips for Better Descriptions
- Development (this repo)
- License
```bash
npm install @llmtxt/next
```

```
src/app/
  llms.txt/
    route.ts
  llms-full.txt/
    route.ts
```

`src/app/llms.txt/route.ts` (zero-config):

```ts
export { GET } from '@llmtxt/next'
```

`src/app/llms-full.txt/route.ts` (zero-config):

```ts
import { createLlmsFullTxtHandler } from '@llmtxt/next'

export const GET = createLlmsFullTxtHandler()
```

Set one of:

- `NEXT_PUBLIC_APP_URL` (recommended)
- `VERCEL_URL` (usually already set on Vercel)

Tip: Set `NEXT_PUBLIC_APP_URL` explicitly in local development and production. `@llmtxt/next` will only fall back to `http://localhost:3000` when `NODE_ENV=development`.

Visit:

- `/llms.txt`
- `/llms-full.txt`
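The fallback order described in the tip above can be sketched as follows (an illustration only; `resolveBaseUrl` is a hypothetical name, not an actual export of `@llmtxt/next`):

```typescript
// Hypothetical sketch of the documented base-URL fallback order:
// NEXT_PUBLIC_APP_URL, then VERCEL_URL, then localhost in development.
export function resolveBaseUrl(env: Record<string, string | undefined>): string {
  if (env.NEXT_PUBLIC_APP_URL) return env.NEXT_PUBLIC_APP_URL
  if (env.VERCEL_URL) return `https://${env.VERCEL_URL}`
  if (env.NODE_ENV === 'development') return 'http://localhost:3000'
  throw new Error('Set NEXT_PUBLIC_APP_URL (or VERCEL_URL on Vercel).')
}
```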
```ts
// src/app/llms.txt/route.ts
import { createLlmsTxtHandler } from '@llmtxt/next'

export const GET = createLlmsTxtHandler({
  title: 'My SaaS App',
  summary: 'A tool that helps teams collaborate and ship faster.',
  exclude: ['api', '(auth)', '_*', 'admin'],
})
```

```ts
// src/app/llms-full.txt/route.ts
import { createLlmsFullTxtHandler } from '@llmtxt/next'

export const GET = createLlmsFullTxtHandler({
  fetchTimeoutMs: 8000,
  htmlToMarkdown: async (html) => {
    const { convert } = await import('html-to-text')
    return convert(html)
  },
})
```

Serve the Markdown version of any page when a client sends `Accept: text/markdown`.
```bash
npm install @llmtxt/middleware
```

```ts
// middleware.ts (project root)
export { middleware, config } from '@llmtxt/middleware'
```

This returns:

- `Content-Type: text/markdown; charset=utf-8`
- `Vary: Accept` (so CDNs cache HTML and Markdown separately)
- `x-markdown-tokens` (token estimate for context sizing)
- `Content-Signal` (content usage preferences)

Test it:

```bash
curl https://yoursite.com/blog/my-post -H "Accept: text/markdown"
```

For best Markdown quality in production, plug in `@mozilla/readability` + `turndown` (see `packages/middleware/README.md`).
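The negotiation itself boils down to checking the `Accept` header. A minimal sketch (illustrative only; the middleware's real matching, including `q`-value handling, may be more thorough):

```typescript
// Hypothetical sketch: does this Accept header ask for Markdown?
// Ignores q-values; the real middleware may weigh them.
export function wantsMarkdown(accept: string | null): boolean {
  if (!accept) return false
  return accept
    .split(',')
    .map((part) => part.split(';')[0].trim().toLowerCase())
    .includes('text/markdown')
}
```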
For React SPAs (Vite/CRA/React Router), generate the files at build time and ship them from public/.
```bash
npm install -D @llmtxt/react
```

Create a route list:

```ts
// llmtxt.routes.ts
import type { LlmtxtRoute } from '@llmtxt/react'

export const routes: LlmtxtRoute[] = [
  { path: '/', title: 'Home', description: 'What this site is about.' },
  { path: '/docs', title: 'Docs' },
]
```

Generate:

```ts
// scripts/generate-llms.ts
import path from 'path'
import { writeLlmsFiles } from '@llmtxt/react'
import { routes } from '../llmtxt.routes'

await writeLlmsFiles({
  routes,
  baseUrl: process.env.PUBLIC_SITE_URL!, // e.g. https://example.com
  outDir: path.join(process.cwd(), 'public'),
})
```

Run it after deploy (recommended) or against a preview server:
```bash
PUBLIC_SITE_URL=https://example.com node scripts/generate-llms.ts
```

Install:

```bash
npm install @llmtxt/core
```

```ts
// Example with Express; works the same with any Node.js framework.
import { generateLlmsTxt, generateLlmsFullTxt } from '@llmtxt/core'
import path from 'path'

app.get('/llms.txt', async (req, res) => {
  const txt = await generateLlmsTxt({
    appDir: path.join(process.cwd(), 'src/app'),
    baseUrl: 'https://example.com',
    title: 'My App',
    summary: 'What my app does.',
  })
  res.setHeader('Content-Type', 'text/plain; charset=utf-8')
  res.send(txt)
})

app.get('/llms-full.txt', async (req, res) => {
  const txt = await generateLlmsFullTxt({
    appDir: path.join(process.cwd(), 'src/app'),
    baseUrl: 'https://example.com',
  })
  res.setHeader('Content-Type', 'text/plain; charset=utf-8')
  res.send(txt)
})
```

`@llmtxt/core` scans your `app/` directory for `page.tsx`/`page.jsx`/`page.ts`/`page.js` files and produces a structured index of links + descriptions. LLMs use this like a sitemap to understand what your site covers.
Example output:

```
# My App

> A tool that helps teams ship faster.

## Blog

- [Getting Started](https://example.com/blog/getting-started): Introduction to the platform
- [Advanced Usage](https://example.com/blog/advanced): Deep-dives and recipes

---

*Generated by @llmtxt/core · 2025-01-01T00:00:00.000Z · 12 pages*
```
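The mapping from `app/` page files to URLs behind such an index can be sketched roughly like this (a simplification that ignores route groups, dynamic segments, and exclusions; `pageFileToRoute` is a hypothetical helper, not a package export):

```typescript
// Hypothetical sketch: map an App Router page file to its route path.
export function pageFileToRoute(relativePath: string): string {
  // Strip the trailing page.{ts,tsx,js,jsx} filename.
  const dir = relativePath.replace(/\/?page\.(t|j)sx?$/, '')
  return '/' + dir
}
```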
`llms-full.txt` uses the same scan, but also fetches each page and converts its HTML to plain text (or to Markdown, via a custom `htmlToMarkdown` converter). The result is one large file containing your site’s content, ready for ingestion.
@llmtxt/middleware adds content negotiation to your Next.js app: when a client requests any URL with Accept: text/markdown, it re-fetches that page as HTML, converts it to Markdown, and returns it with Content-Type: text/markdown, Vary: Accept, and x-markdown-tokens.
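The `x-markdown-tokens` value is an estimate. A common heuristic is roughly four characters per token; the middleware's actual estimator is not specified here, so treat this as an illustration:

```typescript
// Hypothetical chars/4 heuristic for a rough token count.
export function estimateTokens(markdown: string): number {
  return Math.ceil(markdown.length / 4)
}
```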
Search engines don’t directly “rank” llms.txt, but having clean, crawlable pages with accurate titles/descriptions helps both humans and models.
These files are primarily for LLM discoverability: they provide a structured summary and/or full content in a format that’s easy to ingest.
`generateLlmsTxt(options)` — generates `llms.txt` as a string.

`generateLlmsFullTxt(options)` — generates `llms-full.txt` as a string (fetches pages and converts HTML → text/Markdown).
Shared options:

| Option | Type | Default | Description |
|---|---|---|---|
| `appDir` | `string` | — | Absolute path to your `app/` directory |
| `baseUrl` | `string` | — | Public base URL, e.g. `https://example.com` |
| `exclude` | `string[]` | `['api', '_*', '(auth)', '(private)']` | Route segments to skip (supports `*` wildcards) |
| `extractDescription` | `(input) => string \| undefined` | First `//` comment or `metadata.description` | Custom description extractor |
| `maxDescriptionLength` | `number` | `150` | Truncate descriptions at N chars |
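The `*` wildcards in `exclude` patterns can be understood through a sketch like this (illustrative only; `isExcluded` is a hypothetical helper and the package's matcher may differ):

```typescript
// Hypothetical sketch: does a route segment match any exclude pattern?
// '*' matches any run of characters; everything else is literal.
export function isExcluded(segment: string, patterns: string[]): boolean {
  return patterns.some((pattern) => {
    const regex = new RegExp(
      '^' + pattern.split('*').map(escapeRegExp).join('.*') + '$',
    )
    return regex.test(segment)
  })
}

function escapeRegExp(s: string): string {
  return s.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')
}
```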
Additional options for `generateLlmsTxt`:

| Option | Type | Default | Description |
|---|---|---|---|
| `title` | `string` | `'Documentation'` | H1 title in the output |
| `summary` | `string` | — | One-liner below the title |
Additional options for `generateLlmsFullTxt`:

| Option | Type | Default | Description |
|---|---|---|---|
| `fetchTimeoutMs` | `number` | `5000` | Per-page fetch timeout |
| `htmlToMarkdown` | `(html) => Promise<string> \| string` | Built-in stripper | Custom HTML → Markdown converter |
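For intuition, a built-in-style fallback stripper might look like this (an illustration only; the package's actual built-in stripper is not reproduced here):

```typescript
// Hypothetical fallback converter: drop scripts/styles, strip tags,
// collapse whitespace. A real converter would preserve structure.
export function stripHtml(html: string): string {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, '')
    .replace(/<style[\s\S]*?<\/style>/gi, '')
    .replace(/<[^>]+>/g, ' ')
    .replace(/\s+/g, ' ')
    .trim()
}
```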
| Variable | Purpose |
|---|---|
| `NEXT_PUBLIC_APP_URL` | Your public app root URL, e.g. `https://example.com` — recommended for production and local dev |
| `VERCEL_URL` | Automatically set by Vercel; used as a fallback if `NEXT_PUBLIC_APP_URL` is missing |
Note: When `NODE_ENV=development`, `@llmtxt/next` will default to `http://localhost:3000` if neither variable is set. For predictable results, always set `NEXT_PUBLIC_APP_URL` in `.env.local` or your deployment environment.
This repository publishes three npm packages:
- `@llmtxt/core` — framework-agnostic generator for `llms.txt` and `llms-full.txt`
- `@llmtxt/next` — Next.js App Router route handlers for `llms.txt` and `llms-full.txt`
- `@llmtxt/middleware` — Next.js middleware that serves Markdown on `Accept: text/markdown`
Install only the packages you need:
```bash
npm install @llmtxt/core
npm install @llmtxt/next
npm install @llmtxt/middleware
```

See each package README for the full option reference.
The scanner extracts descriptions automatically. Help it by adding a comment at the top of each page:
```tsx
// The main dashboard showing user analytics and recent activity.
export default function DashboardPage() {
  ...
}
```

Or use Next.js metadata:

```ts
export const metadata = {
  description: 'The main dashboard showing user analytics.',
}
```

Install deps:

```bash
npm install
```

Build:

```bash
npm run build
```

Test:

```bash
npm test
```

MIT