# llms.txt
ng-oat provides machine-readable documentation following the llms.txt standard, so AI assistants and large language models can understand the library's full API surface without any manual copy-pasting.
## What is llms.txt?

llms.txt is an open standard that lets projects provide structured, machine-readable documentation that LLMs can consume directly. Think of it like robots.txt, but for AI assistants: it tells them exactly what your project offers and how to use it correctly.
When an AI assistant has access to an llms.txt file, it can generate more accurate code, suggest correct API usage, and avoid hallucinating component names or properties that don't exist.
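To make that concrete, here is a small, hypothetical TypeScript sketch of how a tool might pull component names and selectors out of an llms.txt-style document. The heading and bullet conventions it assumes mirror the ng-oat preview shown later on this page; other projects' files may differ.

```typescript
// Hypothetical sketch: extract component entries from an llms.txt-style
// document. Assumes "## Components" sections with "### <Name>" entries
// and "- Selector: `...`" bullets, as in the ng-oat preview below.

interface ComponentEntry {
  name: string;
  selector?: string;
}

function parseComponents(doc: string): ComponentEntry[] {
  const entries: ComponentEntry[] = [];
  let inComponents = false;
  let current: ComponentEntry | undefined;

  for (const line of doc.split("\n")) {
    if (line.startsWith("## ")) {
      // New top-level section; only "## Components" contains entries.
      inComponents = line.trim() === "## Components";
      current = undefined;
    } else if (inComponents && line.startsWith("### ")) {
      current = { name: line.slice(4).trim() };
      entries.push(current);
    } else if (current) {
      // Match bullets like: - Selector: `ng-oat-button`
      const m = line.match(/^- Selector:\s*`([^`]+)`/);
      if (m) current.selector = m[1];
    }
  }
  return entries;
}

// Example input shaped like the ng-oat preview:
const sample = [
  "## Components",
  "### NgOatButton",
  "- Selector: `ng-oat-button`",
  "### NgOatCard",
  "## Design Tokens",
].join("\n");

const parsed = parseComponents(sample);
// parsed[0] → { name: "NgOatButton", selector: "ng-oat-button" }
```

An assistant with this kind of structured access can ground its suggestions in the real component list instead of guessing.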
## ng-oat's llms.txt
The ng-oat documentation site serves two LLM-focused endpoints. You can point any AI tool at these URLs to give it full knowledge of the library:
| File | URL | Content |
|---|---|---|
| `llms.txt` | ng-oat.letsprogram.in/llms.txt | Overview, library summary, component list, and links to detailed docs |
| `llms-full.txt` | ng-oat.letsprogram.in/llms-full.txt | Complete reference: all 37 components with full APIs, 87 design tokens, 300+ utility classes, 12 recipes, theming guide, and setup instructions |
## What's included
The llms-full.txt file contains the complete ng-oat knowledge base:
### 📦 37 Components

Every component with its selector, import path, typed inputs, outputs, content slots, methods, and a working usage example.

### 🎨 87 Design Tokens

All CSS custom properties organized by category: colors, spacing, typography, radius, shadows, and transitions.

### 🧱 300+ Utility Classes

Every utility class across 30+ categories with responsive variant support (`sm:`/`md:`/`lg:`/`xl:`).

### 📖 12 Page Recipes

Full-page layout patterns (login, dashboard, pricing, blog, settings, and more), each with an HTML skeleton.

### 🌗 Theming Guide

Dark/light mode setup, `provideNgOatTheme`, `NgOatThemeRef`, `light-dark()`, and all 22 color tokens.

### 🚀 Setup Guide

Installation, `angular.json` config, provider setup, signal forms, and project structure recommendations.
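Since the theming guide centers on `provideNgOatTheme`, here is a hedged sketch of what wiring it into an Angular application might look like. The call shape is taken from the example in the llms.txt preview on this page; any options beyond `tokens` are not shown there, so treat the rest as an assumption, not the library's documented API.

```typescript
// app.config.ts — a sketch only; the option shape mirrors the docs'
// own example `provideNgOatTheme({ tokens: { ... } })`.
import { ApplicationConfig } from '@angular/core';
import { provideNgOatTheme } from '@letsprogram/ng-oat';

export const appConfig: ApplicationConfig = {
  providers: [
    // Override the primary color token at bootstrap time
    provideNgOatTheme({ tokens: { '--oat-primary': '#6366f1' } }),
  ],
};
```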
## How to use it
There are several ways to give your AI assistant access to ng-oat's documentation:
### Option 1: MCP Server (recommended)
The ng-oat MCP server exposes the same data as queryable tools, so your assistant can look up exactly what it needs on demand rather than loading everything at once. This is the most efficient approach.
```jsonc
// .vscode/mcp.json, recommended approach
{
  "servers": {
    "ng-oat": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@letsprogram/ng-oat-mcp"]
    }
  }
}
```

### Option 2: Direct URL in prompts
Paste the URL into your AI chat to give it the full library context. Most AI assistants can fetch and parse the content:
```
Use the ng-oat documentation at https://ng-oat.letsprogram.in/llms-full.txt
as reference for all ng-oat components, tokens, and utilities.
```

### Option 3: Custom instructions
Add the URL to your IDE's custom instructions or system prompt so it's always available. For example, in VS Code's `.github/copilot-instructions.md`:
```markdown
## ng-oat Reference

When working with @letsprogram/ng-oat, refer to the full API docs:
https://ng-oat.letsprogram.in/llms-full.txt

This contains all 37 components, 87 design tokens, 300+ utility classes,
and 12 page recipes for the ng-oat Angular component library.
```

### Option 4: Download locally
Fetch the file and keep it in your project for offline access:
```shell
curl -o llms-full.txt https://ng-oat.letsprogram.in/llms-full.txt
```

Some AI tools support loading local reference files; placing `llms-full.txt` in your project root can help.
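A locally cached copy goes out of date as the library evolves. As a sketch, assuming a Node.js environment, a small helper can decide when the file is worth re-downloading; the 7-day threshold here is an arbitrary assumption, not an ng-oat recommendation.

```typescript
// Sketch: re-fetch the cached llms-full.txt only when it is older than
// a chosen threshold. The 7-day figure is an arbitrary assumption.
import { existsSync, statSync } from "node:fs";

const MAX_AGE_MS = 7 * 24 * 60 * 60 * 1000; // one week

// Pure helper: is a file with this mtime stale at time `nowMs`?
function isStaleAge(mtimeMs: number, nowMs: number, maxAgeMs: number = MAX_AGE_MS): boolean {
  return nowMs - mtimeMs > maxAgeMs;
}

// Filesystem wrapper: a missing file always counts as stale.
function isStale(path: string, nowMs: number = Date.now()): boolean {
  if (!existsSync(path)) return true;
  return isStaleAge(statSync(path).mtimeMs, nowMs);
}
```

When `isStale("llms-full.txt")` returns true, rerun the curl command above to refresh the copy.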
## llms.txt file structure
Here's a preview of how the llms.txt file is structured:
```markdown
# ng-oat v0.2.4, Angular Component Library

> Built on Oat CSS. 37 components, 87 tokens, 300+ utilities.

## Components

### NgOatButton
- Selector: `ng-oat-button`
- Import: `import { NgOatButton } from '@letsprogram/ng-oat'`
- Inputs: variant, size, disabled, loading, type
- Example: `<ng-oat-button variant="primary">Save</ng-oat-button>`

### NgOatCard
...

## Design Tokens

| Token | Category | Light | Dark |
|----------------|----------|-------------|-------------|
| --primary | color | #574747 | #fafafa |
| --background | color | #fff | #09090b |
...

## Utility Classes

- Layout: hstack, vstack, center, flex, grid
- Spacing: m-0..m-12, p-0..p-12, gap-0..gap-12
- Responsive: sm:flex, md:grid, lg:hidden
...

## Recipes

- Login page, Dashboard, Pricing, Blog, Settings...

## Theming

- provideNgOatTheme({ tokens: { '--oat-primary': '#6366f1' } })
- NgOatThemeRef for runtime switching
- CSS light-dark() for automatic dark mode
```

## MCP Server vs llms.txt
Both approaches give AI assistants ng-oat knowledge, but they work differently:
| Feature | llms.txt | MCP Server |
|---|---|---|
| How it works | Static file, loaded all at once | Live server, queried on demand |
| Context usage | Consumes full context window | Only fetches what's needed |
| Setup effort | Just paste a URL | One JSON config per IDE |
| Code generation | Basic, AI interprets the text | Structured, dedicated generate tool |
| Best for | Quick reference, any AI tool | Deep IDE integration, precise queries |
**Recommendation:** Use the MCP server for daily development work. Keep llms.txt as a fallback for tools that don't support MCP yet.