This tool lets users assess how well their site's robots.txt file is optimized for AI-based crawlers, helping them improve their AI optimization strategies.
Enter a website URL to analyze its robots.txt file and see which AI crawlers are blocked. The tool provides step-by-step analysis with recommendations for optimization.
User Input → URL Validation → Fetch robots.txt → Parse Content →
AI Bot Analysis → Generate Recommendations → Display Results
Main
├── UrlInput
├── ProcessPanel
│ └── ProcessCard (x2)
│ ├── StatusBadge + IconComponent
│ └── CollapsibleContent
│ ├── RobotsTxtDisplay
│ └── AiBotAnalysisDisplay
├── ActionBar
└── RecommendationsPanel
└── RecommendationsDisplay
- User enters URL → `validation.ts` validates input
- User clicks analyze → `useRobotsQuery.ts` triggers API call
- API fetches robots.txt → `robots.ts` parses content
- `process-helpers.ts` analyzes the parsed content against the AI bots database
- Results update the UI through context and process steps
- Recommendations display based on analysis results
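As a rough end-to-end sketch of how these pieces chain together (the function names mirror the files above, but the import paths and exact signatures are assumptions, not the actual code):

```ts
// Hypothetical glue code: import paths and signatures are assumptions.
import { isValidUrl } from "@/lib/validation";
import { fetchRobotsTxt, parseRobotsTxt } from "@/lib/robots";
import { analyzeRobotsTxt } from "@/lib/process-helpers";

async function runAnalysis(url: string) {
  if (!isValidUrl(url)) throw new Error("Invalid URL");

  const raw = await fetchRobotsTxt(url); // proxied through /api/robots
  const parsed = parseRobotsTxt(raw);    // user agents, rules, sitemaps
  return analyzeRobotsTxt(parsed);       // which AI bots are blocked
}
```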
This app can essentially be divided into two kinds of components: UI-only components, and the components that wire everything together into a functional app.
- `UrlInput`: Renders an `Input` instance and is responsible for enabling users to enter the URL, showing an appropriate error message when an invalid URL is entered (validation is handled elsewhere; we'll explain it later).
- `ProcessPanel`, `ProcessCard`, and `CollapsibleContent`: `ProcessPanel` renders the processes as `ProcessCard`s (two in our case for now). Each `ProcessCard` has two parts: the top part shows high-level metadata about the process, while `CollapsibleContent` holds the actual content relevant to the process.
- `StatusBadge`: Serves as a nice visual indicator alongside the `IconComponent` and uses `STATUS_CONFIG` to show the appropriate indication (see the sketch after this list). The `IconComponent` is animated!
- `RobotsTxtDisplay` and `AiBotAnalysisDisplay`: The components that render inside `CollapsibleContent`.
- `RecommendationsPanel` and `RecommendationsDisplay`: Once we have results from our analysis utility, we conditionally render the recommendations section, in which `RecommendationsDisplay` shows what the user needs to improve in their robots.txt file.
- `src/components/display/display-components.tsx`: Holds a few common display components shared between `RobotsTxtDisplay`, `AiBotAnalysisDisplay`, and `RecommendationsDisplay`.
- `ActionBar`: Renders two `Button` instances to be used as our main search handlers.
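To give a flavor of how `StatusBadge` keys off `STATUS_CONFIG`, here's a minimal sketch; the status names and config fields are assumptions, not the actual shape:

```tsx
// Hypothetical STATUS_CONFIG: the real keys and fields may differ.
type Status = "pending" | "running" | "success" | "error";

const STATUS_CONFIG: Record<Status, { label: string; className: string }> = {
  pending: { label: "Pending", className: "text-gray-400" },
  running: { label: "Analyzing", className: "text-blue-500 animate-pulse" },
  success: { label: "Done", className: "text-green-500" },
  error: { label: "Failed", className: "text-red-500" },
};

function StatusBadge({ status }: { status: Status }) {
  const { label, className } = STATUS_CONFIG[status];
  return <span className={className}>{label}</span>;
}
```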
- `Main`: The central orchestration component that manages application flow and state coordination. It uses the `useAnalysis` context for view management, `useProcessSteps` for workflow tracking, and `useRobotsAnalysis` for robots.txt fetching through TanStack Query. The component handles form validation, user interactions, and conditionally renders the home or recommendations view based on application state.
- `useRobotsAnalysis`: A custom hook that extends `useRobotsQuery` with user-friendly methods like `canAnalyze` for URL validation and `startAnalysis` for manual query triggering. It provides an `isAnalyzing` flag mapped to the query's fetching state for loading indicators (sketched after this list).
- `useAnalysis` (AnalysisProvider): A context provider managing global application state including view navigation, analysis results, and URL state. It centralizes shared state management with computed properties like `hasAnalysisResult` for result-dependent UI rendering.
- `useProcessSteps`: Manages the step-by-step process visualization throughout the analysis workflow. It handles adding, updating, and resetting process steps with specialized methods like `startFetchStep` and `updateFromRobotsResult` for coordinating between the fetching and analysis phases.
- `isValidUrl`: A utility function providing comprehensive URL validation for robots.txt analysis. It handles various formats including localhost addresses, enforces HTTP/HTTPS protocols, and ensures URLs point to domain roots, where robots.txt must be located.
- `fetchRobotsTxt` and `parseRobotsTxt`: Utilities for robots.txt retrieval and parsing. `fetchRobotsTxt` calls our internal API endpoint to handle CORS complexities, while `parseRobotsTxt` processes raw content into structured data with user agents, rules, and sitemaps.
- `analyzeRobotsTxt`: The core analysis logic that determines which AI bots are blocked by robots.txt files. It processes parsed data against known AI bots, handles wildcard rules, checks root path disallows, and returns structured analysis results (see the sketch after this list).
- API Route (`/api/robots`): A Next.js API endpoint handling server-side robots.txt fetching to bypass CORS restrictions. It normalizes URLs, fetches content with proper timeouts and headers, and returns structured JSON responses matching the `RobotsTxtResult` interface (also sketched below).
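To ground the hooks a little, here's a minimal sketch of how `useRobotsAnalysis` could wrap TanStack Query. Only the `useQuery` calls are real library API; the query key, the disabled-by-default trigger, and the helper import paths are assumptions:

```ts
// Hypothetical sketch; only the TanStack Query API calls are real.
import { useQuery } from "@tanstack/react-query";
import { isValidUrl } from "@/lib/validation"; // assumed path
import { fetchRobotsTxt } from "@/lib/robots"; // assumed path

export function useRobotsAnalysis(url: string) {
  const query = useQuery({
    queryKey: ["robots", url],
    queryFn: () => fetchRobotsTxt(url),
    enabled: false, // run only when the user clicks "Start Analysis"
  });

  return {
    ...query,
    canAnalyze: isValidUrl(url),          // gate the analyze button
    startAnalysis: () => query.refetch(), // manual trigger
    isAnalyzing: query.isFetching,        // drives loading indicators
  };
}
```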
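To make the core analysis concrete, here's a condensed sketch of what `parseRobotsTxt` and `analyzeRobotsTxt` might do. The data shapes, the simplified grouping (one `User-agent` per group, sitemaps omitted), and the bot list are illustrative assumptions, not the actual implementation:

```ts
// Illustrative shapes only; the real parsed structure may differ.
interface RobotsGroup {
  userAgent: string;  // e.g. "*", "GPTBot"
  disallow: string[]; // raw Disallow paths
}

function parseRobotsTxt(raw: string): RobotsGroup[] {
  const groups: RobotsGroup[] = [];
  let current: RobotsGroup | null = null;

  for (const line of raw.split("\n")) {
    const [key, ...rest] = line.split(":");
    const value = rest.join(":").trim();
    switch (key.trim().toLowerCase()) {
      case "user-agent":
        current = { userAgent: value, disallow: [] };
        groups.push(current);
        break;
      case "disallow":
        current?.disallow.push(value);
        break;
    }
  }
  return groups;
}

// A few well-known AI crawlers; the app's actual database is larger.
const AI_BOTS = ["GPTBot", "ClaudeBot", "Google-Extended", "PerplexityBot"];

function analyzeRobotsTxt(groups: RobotsGroup[]) {
  const isRootBlocked = (g: RobotsGroup) =>
    g.disallow.some((p) => p === "/" || p === "/*");

  return AI_BOTS.map((bot) => {
    // A bot-specific group wins; otherwise the wildcard group applies.
    const specific = groups.find(
      (g) => g.userAgent.toLowerCase() === bot.toLowerCase()
    );
    const wildcard = groups.find((g) => g.userAgent === "*");
    const group = specific ?? wildcard;
    return { bot, blocked: group ? isRootBlocked(group) : false };
  });
}
```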
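And on the server side, a rough sketch of what the `/api/robots` route handler could look like with the App Router; the query parameter name and response shape are assumptions:

```ts
// app/api/robots/route.ts: a simplified sketch, not the actual handler.
import { NextResponse } from "next/server";

export async function GET(request: Request) {
  // The query parameter name is an assumption.
  const target = new URL(request.url).searchParams.get("url");
  if (!target) {
    return NextResponse.json({ error: "Missing url" }, { status: 400 });
  }

  try {
    // Resolve robots.txt against the domain root (the real code also
    // normalizes URLs, e.g. adding a missing protocol).
    const robotsUrl = new URL("/robots.txt", target).toString();
    const res = await fetch(robotsUrl, {
      headers: { "User-Agent": "aio-tool" },
      signal: AbortSignal.timeout(10_000), // don't hang on slow hosts
    });
    return NextResponse.json({ success: res.ok, content: await res.text() });
  } catch {
    return NextResponse.json(
      { success: false, error: "Failed to fetch robots.txt" },
      { status: 502 }
    );
  }
}
```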
- Next.js 14 with App Router - Full-stack React framework with server-side API routes for CORS-free robots.txt fetching
- TypeScript - Type-safe development across all components, hooks, and utilities
- TanStack Query - Data synchronization and caching for server state management
- React Context API - Global state management for view navigation and analysis results
- Tailwind CSS - Utility-first CSS framework for responsive, mobile-first design
- Real-time Process Tracking - Step-by-step visual feedback with animated status indicators
- Node.js 18+
- npm or yarn package manager
1. Clone the repository

   ```bash
   git clone https://github.com/deeepsig/aio-tool
   cd aio-tool
   ```

2. Install dependencies

   ```bash
   npm install
   # or
   yarn install
   ```

3. Start the development server

   ```bash
   npm run dev
   # or
   yarn dev
   ```

4. Open your browser and navigate to http://localhost:3000 to see the application.
- Enter any valid domain (e.g., `https://reddit.com` or `github.com`)
- Click "Start Analysis" to fetch and analyze the robots.txt file
- View the step-by-step process and results in real time
- Check the recommendations panel for optimization suggestions
Note
This project is still under active development, but the main functionality works as intended! The core features for analyzing robots.txt files and providing AI bot blocking insights are fully operational.
If you have suggestions for improvements or encounter any issues, please feel free to open an issue in the repository. Your feedback is greatly appreciated!
I'll be doing a detailed writeup about the design decisions and engineering approach behind this tool later. This was an incredibly fun project to build!
- A list of AI agents and robots - Ibelick
- Process card inspiration
- Linear
- Vercel
- John Pham
- Mariana Castilho
- Rauno!!
There are more, which I'll update later.