Feature: Generate documentation in LLM-friendly Markdown (#6555)
* feat(llms): LLM-friendly Markdown, ChatGPT and Claude links

  This enables LLM-friendly documentation for entire sections, allowing users to copy complete documentation sections with a single click. Lambda@Edge now generates .md files on-demand with:

  - Evaluated Hugo shortcodes
  - Proper YAML frontmatter with product metadata
  - Clean markdown without UI elements
  - Section aggregation (parent + children in a single file)

  The llms.txt files are now generated automatically during build from content structure and product metadata in data/products.yml, eliminating the need for hardcoded files and ensuring maintainability.

  **Testing:**
  - Automated markdown generation in test setup via cy.exec()
  - Implement dynamic content validation that extracts HTML content and verifies it appears in the markdown version

  **Documentation:** Documents LLM-friendly markdown generation

  **Details:**

  Add gzip decompression for S3 HTML files in the Lambda markdown generator. HTML files stored in S3 are gzip-compressed, but the Lambda was attempting to parse compressed data as UTF-8, causing JSDOM to fail to find article elements. This resulted in 404 errors for .md and .section.md requests.

  - Add zlib gunzip decompression in s3-utils.js fetchHtmlFromS3()
  - Detect gzip via the ContentEncoding header or magic bytes (0x1f 0x8b)
  - Add a configurable DEBUG constant for verbose logging
  - Add debug logging for buffer sizes and decompression in both files

  The decompression adds ~1-5ms per request but is necessary to parse HTML correctly. CloudFront caching minimizes Lambda invocations.

  Await async markdown conversion functions. The convertToMarkdown and convertSectionToMarkdown functions are async but weren't being awaited, causing the Lambda to return a Promise object instead of a string. This resulted in CloudFront validation errors: "The body is not a string, is not an object, or exceeds the maximum size"

  **Troubleshooting:** Set DEBUG for troubleshooting in the Lambda

* feat(llms): Add build-time LLM-friendly Markdown generation

  Implements static Markdown generation during the Hugo build.

  **Key Features:**
  - Two-phase generation: HTML→MD (memory-bounded), MD→sections (fast)
  - Automatic redirect detection via file-size check (skips Hugo aliases)
  - Product detection using the compiled TypeScript product-mappings module
  - Token estimation for LLM context planning (4 chars/token heuristic)
  - YAML serialization with description sanitization

  **Performance:**
  - ~105 seconds for 5,000 pages + 500 sections
  - ~300MB peak memory (safe for the 2GB CircleCI environment)
  - 23 files/sec conversion rate with controlled concurrency

  **Configuration Parameters:**
  - MIN_HTML_SIZE_BYTES (default: 1024) - Skip files below this threshold
  - CHARS_PER_TOKEN (default: 4) - Token estimation ratio
  - Concurrency: 10 workers (CI), 20 workers (local)

  **Output:**
  - Single pages: public/*/index.md (with frontmatter + content)
  - Section bundles: public/*/index.section.md (aggregated child pages)

  **Files Changed:**
  - scripts/build-llm-markdown.js (new) - Main build script
  - scripts/lib/markdown-converter.cjs (renamed from .js) - Core conversion
  - scripts/html-to-markdown.js - Updated import path
  - package.json - Updated exports for the .cjs module

  Related: Replaces Lambda@Edge on-demand generation (5s response time) with build-time static generation for production deployment.

* feat(deploy): Add staging deployment workflow and update CI

  Integrates LLM markdown generation into deployment workflows with a complete staging deployment solution.

  **CircleCI Updates:**
  - Switch from the legacy html-to-markdown.js to the optimized build:md
  - 2x performance improvement (105s vs 200s+ for 5,000 pages)
  - Better memory management (300MB vs variable)
  - Enables section bundle generation (index.section.md files)

  **Staging Deployment:**
  - New scripts/deploy-staging.sh for local staging deploys
  - Complete workflow: Hugo build → markdown generation → S3 upload
  - Environment-variable-driven configuration
  - Optional step skipping for faster iteration
  - CloudFront cache invalidation support

  **NPM Scripts:**
  - Added a deploy:staging command for convenience
  - Wraps the deploy-staging.sh script

  **Documentation:**
  - Updated DOCS-DEPLOYING.md with a comprehensive guide
  - Merged staging/production workflows with the Lambda@Edge docs
  - Build-time generation is now primary, with Lambda@Edge as fallback
  - Troubleshooting section with common issues
  - Environment variable reference
  - Performance metrics and optimization tips

  **Benefits:**
  - Manual staging validation before production
  - Consistent markdown generation across environments
  - Faster CI builds with the optimized script
  - Better error handling and progress reporting
  - Section aggregation for improved LLM context

  **Usage:**

  ```bash
  export STAGING_BUCKET="test2.docs.influxdata.com"
  export AWS_REGION="us-east-1"
  export STAGING_CF_DISTRIBUTION_ID="E1XXXXXXXXXX"
  yarn deploy:staging
  ```

  Related: Completes the build-time markdown generation implementation

* refactor: Remove Lambda@Edge implementation

  Build-time markdown generation has replaced Lambda@Edge on-demand generation as the primary method. Removed the Lambda code and updated documentation to focus on build-time generation and testing.

  Removed:
  - deploy/llm-markdown/ directory (Lambda@Edge code)
  - Lambda@Edge section from DOCS-DEPLOYING.md

  Added:
  - Testing and Validation section in DOCS-DEPLOYING.md
  - Focus on the build-time generation workflow

* feat: Add Rust HTML-to-Markdown prototype

  Implements the core markdown-converter.cjs functions in Rust for performance comparison.

  Performance results:
  - Rust: ~257 files/sec (10× faster)
  - JavaScript: ~25 files/sec average

  Recommendation: Keep JavaScript for now and implement incremental builds first. A Rust migration provides a 10× speedup but requires 3-4 weeks of integration effort.

  Files:
  - Cargo.toml: Rust dependencies (html2md, scraper, serde_yaml, clap)
  - src/main.rs: Core conversion logic + CLI benchmark tool
  - benchmark-comparison.js: Side-by-side performance testing
  - README.md: Comprehensive findings and recommendations

* fix(ui): improve dropdown positioning on viewport resize

  - Ensure the dropdown stays within viewport bounds (min 8px padding)
  - Reposition the dropdown on window resize and scroll events
  - Clean up event listeners when the dropdown closes

* chore(deps): add remark and unified packages for markdown processing

  Add remark-parse, remark-frontmatter, remark-gfm, and unified for enhanced markdown processing capabilities.

* fix(edge): add return to prevent trailing-slash redirect for valid extensions

  Without the return statement, the Lambda@Edge function would continue executing after the callback, eventually hitting the trailing-slash redirect logic. This caused .md files to redirect to URLs with trailing slashes, which returned 404 from S3.

* fix(md): add built-in product mappings and full URL support

  - Add URL_PATTERN_MAP and PRODUCT_NAME_MAP constants directly in the CommonJS module (the ESM product-mappings.js cannot be require()'d)
  - Update generateFrontmatter() to accept a baseUrl parameter and construct full URLs for the frontmatter url field
  - Update generateSectionFrontmatter() similarly for section pages
  - Update all call sites to pass the baseUrl parameter

  This fixes empty product fields and relative URLs in generated markdown frontmatter when served via Lambda@Edge.

* feat(md): add environment flag for base URL control

  Add an -e, --env flag to html-to-markdown.js to control the base URL in generated markdown frontmatter. This matches Hugo's -e flag behavior and allows generating markdown with staging or production URLs. Also update build-llm-markdown.js with similar environment support.

* feat(md): add Rust markdown converter and improve validation

  - Add a Rust-based HTML-to-Markdown converter with NAPI-RS bindings
  - Update Cypress markdown validation tests
  - Update deploy-staging.sh with a force upload flag

* deploy-staging.sh:

  - Defaults STAGING_URL to https://test2.docs.influxdata.com if not set
  - Exports it so yarn build:md -e staging can use it
  - Displays it in the summary

* Delete scripts/prototypes/rust-markdown/benchmark-comparison.js

* Delete scripts/prototypes directory

* fix(llms): Include the full URL for section page Markdown and the list of child pages

* feat(llms): clarify format selector text for the AI use case

  Update button and dropdown text to make the AI/LLM purpose clearer:
  - Button: "Copy page for AI" / "Copy section for AI"
  - Sublabel: "Clean Markdown optimized for AI assistants"
  - Section sublabel: "{N} pages combined as clean Markdown for AI assistants"

  Cypress tests updated and passing (13/13).

---------

Co-authored-by: Scott Anderson <scott@influxdata.com>
parent afc34b97b6
commit c2093c8212
@@ -42,6 +42,9 @@ jobs:
      - run:
          name: Hugo Build
          command: yarn hugo --environment production --logLevel info --gc --destination workspace/public
      - run:
          name: Generate LLM-friendly Markdown
          command: yarn build:md
      - persist_to_workspace:
          root: workspace
          paths:
@@ -0,0 +1,16 @@
{
  "permissions": {
    "allow": [
    ],
    "deny": [
      "Read(./.env)",
      "Read(./.env.*)",
      "Read(./secrets/**)",
      "Read(./config/credentials.json)",
      "Read(./build)"
    ],
    "ask": [
      "Bash(git push:*)"
    ]
  }
}
@@ -38,6 +38,7 @@ tmp

# TypeScript build output
**/dist/
**/dist-lambda/

# User context files for AI assistant tools
.context/*

@@ -45,3 +46,17 @@ tmp

# External repos
.ext/*

# Lambda deployment artifacts
deploy/llm-markdown/lambda-edge/markdown-generator/*.zip
deploy/llm-markdown/lambda-edge/markdown-generator/package-lock.json
deploy/llm-markdown/lambda-edge/markdown-generator/.package-tmp/
deploy/llm-markdown/lambda-edge/markdown-generator/yarn.lock
deploy/llm-markdown/lambda-edge/markdown-generator/config.json

# JavaScript/TypeScript build artifacts
*.tsbuildinfo
*.d.ts
*.d.ts.map
*.js.map
.eslintcache
@@ -4,5 +4,5 @@ routes:
    headers:
      Cache-Control: "max-age=630720000, no-transform, public"
    gzip: true
-  - route: "^.+\\.(html|xml|json|js)$"
+  - route: "^.+\\.(html|xml|json|js|md)$"
    gzip: true
@@ -0,0 +1,330 @@
# Deploying InfluxData Documentation

This guide covers deploying the docs-v2 site to staging and production environments, as well as LLM markdown generation.

## Table of Contents

- [Staging Deployment](#staging-deployment)
- [Production Deployment](#production-deployment)
- [LLM Markdown Generation](#llm-markdown-generation)
- [Testing and Validation](#testing-and-validation)
- [Troubleshooting](#troubleshooting)

## Staging Deployment

Staging deployments are manual and run locally with your AWS credentials.

### Prerequisites

1. **AWS Credentials** - Configure the AWS CLI with appropriate permissions:

   ```bash
   aws configure
   ```

2. **s3deploy** - Install the s3deploy binary:

   ```bash
   ./deploy/ci-install-s3deploy.sh
   ```

3. **Environment Variables** - Set the required variables:

   ```bash
   export STAGING_BUCKET="test2.docs.influxdata.com"
   export AWS_REGION="us-east-1"
   export STAGING_CF_DISTRIBUTION_ID="E1XXXXXXXXXX" # Optional
   ```

### Deploy to Staging

Use the staging deployment script:

```bash
yarn deploy:staging
```

Or run the script directly:

```bash
./scripts/deploy-staging.sh
```

### What the Script Does

1. **Builds the Hugo site** with the staging configuration (`config/staging/hugo.yml`)
2. **Generates LLM-friendly Markdown** (`yarn build:md`)
3. **Uploads to S3** using s3deploy
4. **Invalidates the CloudFront cache** (if `STAGING_CF_DISTRIBUTION_ID` is set)

### Optional Environment Variables

Skip specific steps for faster iteration:

```bash
# Skip the Hugo build (use the existing public/)
export SKIP_BUILD=true

# Skip markdown generation
export SKIP_MARKDOWN=true

# Build only (no S3 upload)
export SKIP_DEPLOY=true
```

### Example: Test Markdown Generation Only

```bash
SKIP_DEPLOY=true ./scripts/deploy-staging.sh
```
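The skip flags short-circuit individual steps. A minimal sketch of that gating logic, with the real commands (Hugo, `yarn build:md`, s3deploy) stubbed out so the behavior is easy to see:

```bash
# Each step runs unless its SKIP_* flag is set to "true"
# (flag names match the documented variables; step commands are stubbed)
run_step() {
  flag="$1"; name="$2"
  if [ "$flag" = "true" ]; then
    echo "skipped: $name"
  else
    echo "ran: $name"
  fi
}

SKIP_BUILD=true
SKIP_MARKDOWN=""
build_result=$(run_step "$SKIP_BUILD" hugo-build)
md_result=$(run_step "$SKIP_MARKDOWN" build-md)
echo "$build_result"   # → skipped: hugo-build
echo "$md_result"      # → ran: build-md
```

An unset or empty flag means the step runs, so a fresh shell always performs a full deploy.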
## Production Deployment

Production deployments are **automatic** via CircleCI when merging to `master`.

### Workflow

1. **Build Job** (`.circleci/config.yml`):
   - Installs dependencies
   - Builds the Hugo site with the production config
   - Generates LLM-friendly Markdown (`yarn build:md`)
   - Persists the workspace for the deploy job

2. **Deploy Job**:
   - Attaches the workspace
   - Uploads to S3 using s3deploy
   - Invalidates the CloudFront cache
   - Posts a success notification to Slack

### Environment Variables (CircleCI)

Production deployment requires the following environment variables set in CircleCI:

- `BUCKET` - Production S3 bucket name
- `REGION` - AWS region
- `CF_DISTRIBUTION_ID` - CloudFront distribution ID
- `SLACK_WEBHOOK_URL` - Slack notification webhook

### Trigger Production Deploy

```bash
git push origin master
```

CircleCI will automatically build and deploy.
## LLM Markdown Generation

Both staging and production deployments generate LLM-friendly Markdown files at build time.

### Output Files

The build generates two types of markdown files in `public/`:

1. **Single-page markdown** (`index.md`)
   - Individual page content with frontmatter
   - Contains: title, description, URL, product, version, token estimate

2. **Section bundles** (`index.section.md`)
   - Aggregated section with all child pages
   - Includes a child page list in the frontmatter
   - Optimized for LLM context windows
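A section bundle is, at its core, the parent page followed by its child pages in a single file. A simplified sketch of that aggregation over a throwaway directory tree that mirrors the `public/*/index.md` layout (the real script also generates merged frontmatter, which this omits):

```bash
# Build a toy section tree: one parent page, two child pages
root=/tmp/section-demo/get-started
mkdir -p "$root/setup" "$root/write"
printf '# Get started\n' > "$root/index.md"
printf '# Set up InfluxDB 3 Core\n' > "$root/setup/index.md"
printf '# Write data\n' > "$root/write/index.md"

# Concatenate the parent first, then each child, into index.section.md
out="$root/index.section.md"
cat "$root/index.md" > "$out"
for child in "$root"/*/index.md; do
  printf '\n' >> "$out"
  cat "$child" >> "$out"
done

heading_count=$(grep -c '^# ' "$out")
echo "$heading_count"   # → 3
```

Bundling children into one file is what lets an LLM consume a whole section in a single paste instead of one page at a time.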
### Generation Script

```bash
# Generate all markdown
yarn build:md

# Generate for a specific path
node scripts/build-llm-markdown.js --path influxdb3/core/get-started

# Limit the number of files (for testing)
node scripts/build-llm-markdown.js --limit 100
```

### Configuration

Edit `scripts/build-llm-markdown.js` to adjust:

```javascript
// Skip files smaller than this (Hugo alias redirects)
const MIN_HTML_SIZE_BYTES = 1024;

// Token estimation ratio
const CHARS_PER_TOKEN = 4;

// Concurrency (workers)
const CONCURRENCY = process.env.CI ? 10 : 20;
```
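Both size-based settings reduce to simple byte counts: files under `MIN_HTML_SIZE_BYTES` are treated as Hugo alias redirects and skipped, and token counts are estimated as bytes divided by `CHARS_PER_TOKEN`. A shell equivalent of the two heuristics (the sample file is illustrative):

```bash
# A 33-byte sample page
printf '%s' "InfluxDB stores time series data." > /tmp/page.md

size=$(wc -c < /tmp/page.md)

# Redirect-stub heuristic: anything under 1024 bytes would be skipped
if [ "$((size))" -lt 1024 ]; then
  verdict="skip"
else
  verdict="convert"
fi

# Token estimate: roughly 4 characters per token
estimated_tokens=$((size / 4))
echo "$verdict $estimated_tokens"   # → skip 8
```

The 4-chars-per-token ratio is a rough heuristic for English prose, not a tokenizer; it is only used to give readers a sense of how much context a page consumes.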
### Performance

- **Speed**: ~105 seconds for 5,000 pages + 500 sections
- **Memory**: ~300MB peak (safe for 2GB CircleCI)
- **Rate**: ~23 files/second with memory-bounded parallelism
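The memory bound comes from capping how many pages are converted at once rather than fanning out across the whole site. The same pattern sketched with `xargs -P` and a stubbed conversion step (the real script manages its worker pool inside Node.js; `cp` stands in for the HTML→MD conversion):

```bash
# 10 workers in CI, 20 locally, matching the documented CONCURRENCY values
if [ -n "$CI" ]; then workers=10; else workers=20; fi

mkdir -p /tmp/pages
for i in 1 2 3 4 5; do
  printf '<html>page %s</html>' "$i" > "/tmp/pages/page$i.html"
done

# Convert at most $workers files concurrently; cp stubs the conversion
ls /tmp/pages/*.html | xargs -P "$workers" -I {} cp {} {}.md

converted=$(ls /tmp/pages/*.html.md | wc -l | tr -d ' ')
echo "$converted"   # → 5
```

Peak memory scales with the worker count times the largest page held in memory, which is why CI uses the smaller pool.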
## Testing and Validation

### Local Testing

Test markdown generation locally before deploying:

```bash
# Prerequisites
yarn install
yarn build:ts
npx hugo --quiet

# Generate markdown for testing
yarn build:md

# Generate markdown for a specific path
node scripts/build-llm-markdown.js --path influxdb3/core/get-started --limit 10

# Run validation tests
node cypress/support/run-e2e-specs.js \
  --spec "cypress/e2e/content/markdown-content-validation.cy.js"
```

### Validation Checks

The Cypress tests validate:

- ✅ No raw Hugo shortcodes (`{{< >}}` or `{{% %}}`)
- ✅ No HTML comments
- ✅ Proper YAML frontmatter with required fields
- ✅ UI elements removed (feedback forms, navigation)
- ✅ GitHub-style callouts (Note, Warning, etc.)
- ✅ Properly formatted tables, lists, and code blocks
- ✅ Product context metadata
- ✅ Clean link formatting

See [DOCS-TESTING.md](DOCS-TESTING.md) for comprehensive testing documentation.
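For a quick spot-check without spinning up Cypress, the shortcode and HTML-comment checks reduce to a grep over the generated files. A sketch against throwaway sample files (paths are illustrative):

```bash
# Sample output: one clean file, one with an unevaluated shortcode
mkdir -p /tmp/md-check
printf 'No shortcodes here.\n' > /tmp/md-check/clean.md
printf 'Broken {{< product-name >}} remnant\n' > /tmp/md-check/dirty.md

# List files still containing shortcode or HTML-comment remnants
leftovers=$(grep -rl -e '{{<' -e '{{%' -e '<!--' /tmp/md-check)
echo "$leftovers"   # → /tmp/md-check/dirty.md
```

Running the same grep over `public/` after `yarn build:md` gives a fast signal before the full test suite.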
## Troubleshooting

### s3deploy Not Found

Install the s3deploy binary:

```bash
./deploy/ci-install-s3deploy.sh
```

Verify installation:

```bash
s3deploy -version
```

### Missing Environment Variables

Check that the required variables are set:

```bash
echo $STAGING_BUCKET
echo $AWS_REGION
```

Set them if missing:

```bash
export STAGING_BUCKET="test2.docs.influxdata.com"
export AWS_REGION="us-east-1"
```
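These checks can be scripted as a fail-fast guard before deploying. A minimal sketch (the `require_vars` helper is hypothetical, not part of deploy-staging.sh):

```bash
# Report any unset or empty variables from a list of names
require_vars() {
  missing=""
  for var in "$@"; do
    eval "value=\${$var:-}"
    [ -n "$value" ] || missing="$missing $var"
  done
  if [ -z "$missing" ]; then echo "ok"; else echo "missing:$missing"; fi
}

unset STAGING_BUCKET AWS_REGION
before=$(require_vars STAGING_BUCKET AWS_REGION)
export STAGING_BUCKET="test2.docs.influxdata.com" AWS_REGION="us-east-1"
after=$(require_vars STAGING_BUCKET AWS_REGION)
echo "$before"   # → missing: STAGING_BUCKET AWS_REGION
echo "$after"    # → ok
```

Failing before the Hugo build starts saves several minutes compared to discovering a missing bucket name at upload time.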
### AWS Permission Errors

Ensure your AWS credentials have the required permissions:

- `s3:PutObject` - Upload files to S3
- `s3:DeleteObject` - Delete old files from S3
- `cloudfront:CreateInvalidation` - Invalidate the cache

Check your AWS profile:

```bash
aws sts get-caller-identity
```

### Hugo Build Fails

Check for:

- Missing dependencies (`yarn install`)
- TypeScript compilation errors (`yarn build:ts`)
- Invalid Hugo configuration

Build Hugo separately to isolate the issue:

```bash
yarn hugo --environment staging
```

### Markdown Generation Fails

Check that:

- The Hugo build completed successfully
- TypeScript is compiled (`yarn build:ts`)
- Sufficient memory is available

Test markdown generation separately:

```bash
yarn build:md --limit 10
```

### CloudFront Cache Not Invalidating

If you see stale content after deployment:

1. Check that `STAGING_CF_DISTRIBUTION_ID` is set correctly
2. Verify AWS credentials have the `cloudfront:CreateInvalidation` permission
3. Invalidate manually:

   ```bash
   aws cloudfront create-invalidation \
     --distribution-id E1XXXXXXXXXX \
     --paths "/*"
   ```

### Deployment Timing Out

For large deployments:

1. **Skip markdown generation** if unchanged:

   ```bash
   SKIP_MARKDOWN=true ./scripts/deploy-staging.sh
   ```

2. **Use s3deploy's incremental upload**:
   - s3deploy only uploads changed files
   - The first deploy is slower; subsequent deploys are faster

3. **Check network speed**:
   - Large uploads require good bandwidth
   - Consider deploying from an AWS region closer to the S3 bucket
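The incremental-upload idea is hash comparison: record a checksum per file on one deploy, and on the next, upload only the files whose checksum changed. A toy illustration with `cksum` and a local manifest (the manifest file is hypothetical; s3deploy's own mechanism compares against S3 object metadata and differs in detail):

```bash
# Two deployed files, recorded in a manifest from the "previous" deploy
mkdir -p /tmp/site
printf 'v1' > /tmp/site/a.html
printf 'v1' > /tmp/site/b.html
(cd /tmp/site && cksum *.html) > /tmp/manifest

# Change one file, then list only the files whose checksum changed
printf 'v2' > /tmp/site/a.html
changed=$( (cd /tmp/site && cksum *.html) | diff - /tmp/manifest | grep '^<' | awk '{print $NF}' )
echo "$changed"   # → a.html
```

This is why the first deploy of a 5,000-page site is slow while later deploys touching a handful of pages finish quickly.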
## Deployment Checklist

### Before Deploying to Staging

- [ ] Run tests locally (`yarn lint`)
- [ ] Build Hugo successfully (`yarn hugo --environment staging`)
- [ ] Generate markdown successfully (`yarn build:md`)
- [ ] Set staging environment variables
- [ ] Have AWS credentials configured

### Before Merging to Master (Production)

- [ ] Test on staging first
- [ ] Verify LLM markdown quality
- [ ] Check for broken links (`yarn test:links`)
- [ ] Run code block tests (`yarn test:codeblocks:all`)
- [ ] Review CircleCI configuration changes
- [ ] Ensure all tests pass

## Related Documentation

- [Contributing Guide](DOCS-CONTRIBUTING.md)
- [Testing Guide](DOCS-TESTING.md)
- [CircleCI Configuration](.circleci/config.yml)
- [S3 Deploy Configuration](.s3deploy.yml)

261
DOCS-TESTING.md
@@ -11,12 +11,13 @@ This guide covers all testing procedures for the InfluxData documentation, inclu

## Test Types Overview

-| Test Type | Purpose | Command |
-|-----------|---------|---------|
-| **Code blocks** | Validate shell/Python code examples | `yarn test:codeblocks:all` |
-| **Link validation** | Check internal/external links | `yarn test:links` |
-| **Style linting** | Enforce writing standards | `docker compose run -T vale` |
-| **E2E tests** | UI and functionality testing | `yarn test:e2e` |
+| Test Type | Purpose | Command |
+| ----------------------- | ----------------------------------- | ---------------------------- |
+| **Code blocks** | Validate shell/Python code examples | `yarn test:codeblocks:all` |
+| **Link validation** | Check internal/external links | `yarn test:links` |
+| **Style linting** | Enforce writing standards | `docker compose run -T vale` |
+| **Markdown generation** | Generate LLM-friendly Markdown | `yarn build:md` |
+| **E2E tests** | UI and functionality testing | `yarn test:e2e` |

## Code Block Testing
@@ -70,7 +71,8 @@ See `./test/src/prepare-content.sh` for the full list of variables you may need.

For influxctl commands to run in tests, move or copy your `config.toml` file to the `./test` directory.

-> [!Warning]
+> \[!Warning]
+>
> - The database you configure in `.env.test` and any written data may be deleted during test runs
> - Don't add your `.env.test` files to Git. Git is configured to ignore `.env*` files to prevent accidentally committing credentials
@@ -111,6 +113,7 @@ pytest-codeblocks has features for skipping tests and marking blocks as failed.

#### "Pytest collected 0 items"

Potential causes:

- Check test discovery options in `pytest.ini`
- Use `python` (not `py`) for Python code block language identifiers:

  ```python
@@ -121,6 +124,215 @@ Potential causes:
  # This is ignored
  ```

## LLM-Friendly Markdown Generation

The documentation includes tooling to generate LLM-friendly Markdown versions of documentation pages, both locally via the CLI and on-demand via Lambda@Edge in production.

### Quick Start

```bash
# Prerequisites (run once)
yarn install
yarn build:ts
npx hugo --quiet

# Generate Markdown
node scripts/html-to-markdown.js --path influxdb3/core/get-started --limit 10

# Validate generated Markdown
node cypress/support/run-e2e-specs.js \
  --spec "cypress/e2e/content/markdown-content-validation.cy.js"
```

### Comprehensive Documentation

For complete documentation including prerequisites, usage examples, output formats, frontmatter structure, troubleshooting, and architecture details, see the inline documentation:

```bash
# View the first 150 lines in the terminal
head -150 scripts/html-to-markdown.js
```
The script documentation includes:

- Prerequisites and setup steps
- Command-line options and examples
- Output file types (single page vs section aggregation)
- Frontmatter structure for both output types
- Testing procedures
- Common issues and solutions
- Architecture overview
- Related files

### Related Files

- **CLI tool**: `scripts/html-to-markdown.js` - Comprehensive inline documentation
- **Core logic**: `scripts/lib/markdown-converter.js` - Shared conversion library
- **Lambda handler**: `deploy/llm-markdown/lambda-edge/markdown-generator/index.js` - Production deployment
- **Lambda docs**: `deploy/llm-markdown/README.md` - Deployment guide
- **Cypress tests**: `cypress/e2e/content/markdown-content-validation.cy.js` - Validation tests

### Frontmatter Structure

All generated markdown files include structured YAML frontmatter:

```yaml
---
title: Page Title
description: Page description for SEO
url: /influxdb3/core/get-started/
product: InfluxDB 3 Core
version: core
date: 2024-01-15T00:00:00Z
lastmod: 2024-11-20T00:00:00Z
type: page
estimated_tokens: 2500
---
```

Section pages include additional fields:

```yaml
---
type: section
pages: 4
child_pages:
  - title: Set up InfluxDB 3 Core
    url: /influxdb3/core/get-started/setup/
  - title: Write data
    url: /influxdb3/core/get-started/write/
---
```
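When checking generated files by hand, the frontmatter block can be pulled out with a short awk one-liner that prints everything between the first pair of `---` fences. A sketch against a throwaway file (the field subset is illustrative):

```bash
# A minimal generated file with three frontmatter fields
cat > /tmp/index.md <<'EOF'
---
title: Page Title
url: /influxdb3/core/get-started/
estimated_tokens: 2500
---
Page body.
EOF

# Print only the lines between the first pair of --- fences
frontmatter=$(awk '/^---$/{n++; next} n==1{print} n>=2{exit}' /tmp/index.md)
printf '%s\n' "$frontmatter" | grep -c ':'   # → 3
```

Piping the extracted block through a YAML parser (or simply counting `key:` lines as above) catches malformed frontmatter before it reaches the Cypress suite.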
### Testing Generated Markdown

#### Manual Testing

```bash
# Generate markdown with verbose output
node scripts/html-to-markdown.js --path influxdb3/core/get-started --limit 2 --verbose

# Check that files were created
ls -la public/influxdb3/core/get-started/*.md

# View generated content
cat public/influxdb3/core/get-started/index.md

# Check the frontmatter
head -20 public/influxdb3/core/get-started/index.md
```

#### Automated Testing with Cypress

The repository includes comprehensive Cypress tests for markdown validation:

```bash
# Run all markdown validation tests
node cypress/support/run-e2e-specs.js --spec "cypress/e2e/content/markdown-content-validation.cy.js"

# Test a specific content file
node cypress/support/run-e2e-specs.js \
  --spec "cypress/e2e/content/markdown-content-validation.cy.js" \
  content/influxdb3/core/query-data/execute-queries/_index.md
```

The Cypress tests validate:

- ✅ No raw Hugo shortcodes (`{{< >}}` or `{{% %}}`)
- ✅ No HTML comments
- ✅ Proper YAML frontmatter with required fields
- ✅ UI elements removed (feedback forms, navigation)
- ✅ GitHub-style callouts (Note, Warning, etc.)
- ✅ Properly formatted tables, lists, and code blocks
- ✅ Product context metadata
- ✅ Clean link formatting

### Common Issues and Solutions

#### Issue: "No article content found" warnings

**Cause**: The page doesn't have an `<article class="article--content">` element (common for index/list pages).

**Solution**: This is normal behavior; the converter skips pages without article content. To verify:

```bash
# Check the HTML structure
grep -l 'article--content' public/path/to/page/index.html
```

#### Issue: "Cannot find module" errors

**Cause**: TypeScript not compiled (product-mappings.js missing).

**Solution**: Build TypeScript first:

```bash
yarn build:ts
ls -la dist/utils/product-mappings.js
```

#### Issue: Memory issues when processing all files

**Cause**: Attempting to process thousands of pages at once.

**Solution**: Use the `--limit` flag to process in batches:

```bash
# Process 1000 files at a time
node scripts/html-to-markdown.js --limit 1000
```

#### Issue: Missing or incorrect product detection

**Cause**: Product mappings are not up to date, or the path doesn't match known patterns.

**Solution**:

1. Rebuild TypeScript: `yarn build:ts`
2. Check the product mappings in `assets/js/utils/product-mappings.ts`
3. Add new product paths if needed

### Validation Checklist

Before committing markdown generation changes:

- [ ] Run the TypeScript build: `yarn build:ts`
- [ ] Build the Hugo site: `npx hugo --quiet`
- [ ] Generate markdown for affected paths
- [ ] Run the Cypress validation tests
- [ ] Manually check sample output files:
  - [ ] Frontmatter is valid YAML
  - [ ] No shortcode remnants (`{{<`, `{{%`)
  - [ ] No HTML comments (`<!--`, `-->`)
  - [ ] Product context is correct
  - [ ] Links are properly formatted
  - [ ] Code blocks have language identifiers
  - [ ] Tables render correctly

### Architecture

The markdown generation uses a shared-library architecture:

```
docs-v2/
├── scripts/
│   ├── html-to-markdown.js       # CLI wrapper (filesystem operations)
│   └── lib/
│       └── markdown-converter.js # Core conversion logic (shared library)
├── dist/
│   └── utils/
│       └── product-mappings.js   # Product detection (compiled from TS)
└── public/                       # Generated HTML + Markdown files
```

The shared library (`scripts/lib/markdown-converter.js`) is:

- Used by local markdown generation scripts
- Imported by the docs-tooling Lambda@Edge for on-demand generation
- Tested independently with isolated conversion logic

For deployment details, see [deploy/lambda-edge/markdown-generator/README.md](deploy/lambda-edge/markdown-generator/README.md).

## Link Validation with Link-Checker

Link validation uses the `link-checker` tool to validate internal and external links in documentation files.
@@ -158,8 +370,8 @@ chmod +x link-checker
./link-checker --version
```

-> [!Note]
-> Pre-built binaries are currently Linux x86_64 only. For macOS development, use Option 1 to build from source.
+> \[!Note]
+> Pre-built binaries are currently Linux x86\_64 only. For macOS development, use Option 1 to build from source.

```bash
# Clone and build link-checker

@@ -188,11 +400,11 @@ cp target/release/link-checker /usr/local/bin/
curl -L -H "Authorization: Bearer $(gh auth token)" \
  -o link-checker-linux-x86_64 \
  "https://github.com/influxdata/docs-tooling/releases/download/link-checker-v1.2.x/link-checker-linux-x86_64"

curl -L -H "Authorization: Bearer $(gh auth token)" \
  -o checksums.txt \
  "https://github.com/influxdata/docs-tooling/releases/download/link-checker-v1.2.x/checksums.txt"

# Create docs-v2 release
gh release create \
  --repo influxdata/docs-v2 \

@@ -209,7 +421,7 @@ cp target/release/link-checker /usr/local/bin/
sed -i 's/link-checker-v[0-9.]*/link-checker-v1.2.x/' .github/workflows/pr-link-check.yml
```

-> [!Note]
+> \[!Note]
> The manual distribution is required because docs-tooling is a private repository and the default GitHub token doesn't have cross-repository access for private repos.

#### Core Commands
@ -230,6 +442,7 @@ link-checker config
|
|||
The link-checker automatically handles relative link resolution based on the input type:
|
||||
|
||||
**Local Files → Local Resolution**
|
||||
|
||||
```bash
|
||||
# When checking local files, relative links resolve to the local filesystem
|
||||
link-checker check public/influxdb3/core/admin/scale-cluster/index.html
|
||||
|
|
@ -238,6 +451,7 @@ link-checker check public/influxdb3/core/admin/scale-cluster/index.html
|
|||
```
|
||||
|
||||
**URLs → Production Resolution**
|
||||
|
||||
```bash
|
||||
# When checking URLs, relative links resolve to the production site
|
||||
link-checker check https://docs.influxdata.com/influxdb3/core/admin/scale-cluster/
|
||||
|
|
@ -246,6 +460,7 @@ link-checker check https://docs.influxdata.com/influxdb3/core/admin/scale-cluste
|
|||
```
|
||||
|
||||
**Why This Matters**
|
||||
|
||||
- **Testing new content**: Tag pages generated locally will be found when testing local files
|
||||
- **Production validation**: Production URLs validate against the live site
|
||||
- **No false positives**: New content won't appear broken when testing locally before deployment

@@ -321,6 +536,7 @@ The docs-v2 repository includes automated link checking for pull requests:
- **Results reporting**: Broken links reported as GitHub annotations with detailed summaries

The workflow automatically:

1. Detects content changes in PRs using the GitHub Files API
2. Downloads the latest link-checker binary from docs-v2 releases
3. Builds the Hugo site and maps changed content to public HTML files

@@ -405,6 +621,7 @@ docs-v2 uses [Lefthook](https://github.com/evilmartians/lefthook) to manage Git
### What Runs Automatically

When you run `git commit`, Git runs:

- **Vale**: Style linting (if configured)
- **Prettier**: Code formatting
- **Cypress**: Link validation tests
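As a rough sketch, a Lefthook configuration wiring these hooks up could look like the following. This is hypothetical — the glob patterns and commands are illustrative; see the repository's actual `lefthook.yml` for the real configuration:

```yaml
# Hypothetical lefthook.yml sketch — commands and globs are illustrative
pre-commit:
  commands:
    vale:
      glob: 'content/**/*.md'
      run: vale {staged_files}
    prettier:
      glob: '*.{js,ts,css,md}'
      run: npx prettier --write {staged_files}
```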

@@ -459,6 +676,7 @@ For JavaScript code in the documentation UI (`assets/js`):
```

3. Start Hugo: `yarn hugo server`

4. In VS Code, select the "Debug JS (debug-helpers)" configuration

Remember to remove debug statements before committing.

@@ -490,6 +708,18 @@ yarn test:codeblocks:stop-monitors
- Format code to fit within 80 characters
- Use long options in command-line examples (`--option` vs `-o`)

### Markdown Generation

- Build the Hugo site before generating markdown: `npx hugo --quiet`
- Compile TypeScript before generation: `yarn build:ts`
- Test on small subsets first using the `--limit` flag
- Use the `--verbose` flag to debug conversion issues
- Always run Cypress validation tests after generation
- Check sample output manually for quality
- Verify shortcodes are evaluated (no `{{<` or `{{%` in output)
- Ensure UI elements are removed (no "Copy page", "Was this helpful?")
- Test both single pages (`index.md`) and section pages (`index.section.md`)
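The shortcode and UI-element checks above can be automated with a small helper. This is a hypothetical sketch, not part of the repository's tooling; the function name `findMarkdownIssues` is invented for this example:

```javascript
// Hypothetical helper for the output checks described above —
// flags unevaluated Hugo shortcodes and leftover UI strings.
function findMarkdownIssues(markdown) {
  const issues = [];
  // Hugo shortcodes render as {{< ... >}} or {{% ... %}} in source;
  // none should survive into generated markdown.
  if (/\{\{[<%]/.test(markdown)) {
    issues.push('unevaluated Hugo shortcode ({{< or {{%)');
  }
  // UI chrome that should have been stripped during conversion.
  for (const uiText of ['Copy page', 'Was this helpful?']) {
    if (markdown.includes(uiText)) {
      issues.push(`leftover UI element: "${uiText}"`);
    }
  }
  return issues;
}
```

Running a check like this over a sample of generated `index.md` files catches the most common conversion regressions before the full Cypress suite runs.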

### Link Validation

- Test links regularly, especially after content restructuring

@@ -511,9 +741,14 @@ yarn test:codeblocks:stop-monitors
- **Scripts**: `.github/scripts/` directory
- **Test data**: `./test/` directory
- **Vale config**: `.ci/vale/styles/`
- **Markdown generation**:
  - `scripts/html-to-markdown.js` - CLI wrapper
  - `scripts/lib/markdown-converter.js` - Core conversion library
  - `deploy/lambda-edge/markdown-generator/` - Lambda deployment
  - `cypress/e2e/content/markdown-content-validation.cy.js` - Validation tests

## Getting Help

- **GitHub Issues**: [docs-v2 issues](https://github.com/influxdata/docs-v2/issues)
- **Good first issues**: [good-first-issue label](https://github.com/influxdata/docs-v2/issues?q=is%3Aissue+is%3Aopen+label%3Agood-first-issue)
- **InfluxData CLA**: [Sign here](https://www.influxdata.com/legal/cla/) for substantial contributions
@@ -160,14 +160,21 @@ function getVersionSpecificConfig(configKey: string): unknown {
   // Try version-specific config first (e.g., ai_sample_questions__v1)
   if (version && version !== 'n/a') {
     const versionKey = `${configKey}__v${version}`;
-    const versionConfig = productData?.product?.[versionKey];
-    if (versionConfig) {
-      return versionConfig;
+    const product = productData?.product;
+    if (product && typeof product === 'object' && !Array.isArray(product)) {
+      const versionConfig = product[versionKey];
+      if (versionConfig) {
+        return versionConfig;
+      }
     }
   }

   // Fall back to default config
-  return productData?.product?.[configKey];
+  const product = productData?.product;
+  if (product && typeof product === 'object' && !Array.isArray(product)) {
+    return product[configKey];
+  }
+  return undefined;
 }

 function getProductExampleQuestions(): string {
@@ -0,0 +1,666 @@
/**
 * Format Selector Component
 *
 * Provides a dropdown menu for users and AI agents to access documentation
 * in different formats (Markdown for LLMs, ChatGPT/Claude integration, MCP servers).
 *
 * FEATURES:
 * - Copy page/section as Markdown to clipboard
 * - Open page in ChatGPT or Claude with context
 * - Connect to MCP servers (Cursor, VS Code) - future enhancement
 * - Adaptive UI for leaf nodes (single pages) vs branch nodes (sections)
 * - Smart section download for large sections (>10 pages)
 *
 * UI PATTERN:
 * Matches Mintlify's format selector with dark dropdown, icons, and sublabels.
 * See `.context/Screenshot 2025-11-13 at 11.39.13 AM.png` for reference.
 */

interface FormatSelectorConfig {
  pageType: 'leaf' | 'branch'; // Leaf = single page, Branch = section with children
  markdownUrl: string;
  sectionMarkdownUrl?: string; // For branch nodes - aggregated content
  markdownContent?: string; // For clipboard copy (lazy-loaded)
  pageTitle: string;
  pageUrl: string;

  // For branch nodes (sections)
  childPageCount?: number;
  estimatedTokens?: number;
  sectionDownloadUrl?: string;

  // AI integration URLs
  chatGptUrl: string;
  claudeUrl: string;

  // Future MCP server links
  mcpCursorUrl?: string;
  mcpVSCodeUrl?: string;
}

interface FormatSelectorOption {
  label: string;
  sublabel: string;
  icon: string; // SVG icon name or class
  action: () => void;
  href?: string; // For external links
  target?: string; // '_blank' for external links
  external: boolean; // Shows ↗ arrow
  visible: boolean; // Conditional display based on pageType/size
  dataAttribute: string; // For testing (e.g., 'copy-page', 'open-chatgpt')
}

interface ComponentOptions {
  component: HTMLElement;
}

/**
 * Initialize format selector component
 * @param {ComponentOptions} options - Component configuration
 */
export default function FormatSelector(options: ComponentOptions) {
  const { component } = options;

  // State
  let isOpen = false;
  let config: FormatSelectorConfig = {
    pageType: 'leaf',
    markdownUrl: '',
    pageTitle: '',
    pageUrl: '',
    chatGptUrl: '',
    claudeUrl: '',
  };

  // DOM elements
  const button = component.querySelector('button') as HTMLButtonElement;
  const dropdownMenu = component.querySelector(
    '[data-dropdown-menu]'
  ) as HTMLElement;

  if (!button || !dropdownMenu) {
    console.error('Format selector: Missing required elements');
    return;
  }

  /**
   * Initialize component config from page context and data attributes
   */
  function initConfig(): void {
    // page-context exports individual properties, not a detect() function
    const currentUrl = window.location.href;
    const currentPath = window.location.pathname;

    // Determine page type (leaf vs branch)
    const childCount = parseInt(component.dataset.childCount || '0', 10);
    const pageType: 'leaf' | 'branch' = childCount > 0 ? 'branch' : 'leaf';

    // Construct markdown URL
    // Hugo generates markdown files as index.md in directories matching the URL path
    let markdownUrl = currentPath;
    if (!markdownUrl.endsWith('.md')) {
      // Ensure path ends with /
      if (!markdownUrl.endsWith('/')) {
        markdownUrl += '/';
      }
      // Append index.md
      markdownUrl += 'index.md';
    }

    // Construct section markdown URL (for branch pages only)
    let sectionMarkdownUrl: string | undefined;
    if (pageType === 'branch') {
      sectionMarkdownUrl = markdownUrl.replace('index.md', 'index.section.md');
    }

    // Get page title from meta or h1
    const pageTitle =
      document
        .querySelector('meta[property="og:title"]')
        ?.getAttribute('content') ||
      document.querySelector('h1')?.textContent ||
      document.title;

    config = {
      pageType,
      markdownUrl,
      sectionMarkdownUrl,
      pageTitle,
      pageUrl: currentUrl,
      childPageCount: childCount,
      estimatedTokens: parseInt(component.dataset.estimatedTokens || '0', 10),
      sectionDownloadUrl: component.dataset.sectionDownloadUrl,

      // AI integration URLs
      chatGptUrl: generateChatGPTUrl(pageTitle, currentUrl, markdownUrl),
      claudeUrl: generateClaudeUrl(pageTitle, currentUrl, markdownUrl),

      // Future MCP server links
      mcpCursorUrl: component.dataset.mcpCursorUrl,
      mcpVSCodeUrl: component.dataset.mcpVSCodeUrl,
    };

    // Update button label based on page type
    updateButtonLabel();
  }

  /**
   * Update button label: "Copy page for AI" vs "Copy section for AI"
   */
  function updateButtonLabel(): void {
    const label =
      config.pageType === 'leaf' ? 'Copy page for AI' : 'Copy section for AI';
    const buttonText = button.querySelector('[data-button-text]');
    if (buttonText) {
      buttonText.textContent = label;
    }
  }

  /**
   * Generate ChatGPT share URL with page context
   */
  function generateChatGPTUrl(
    title: string,
    pageUrl: string,
    markdownUrl: string
  ): string {
    // ChatGPT share URL pattern (as of 2025)
    // This may need updating based on ChatGPT's URL scheme
    const baseUrl = 'https://chatgpt.com';
    const markdownFullUrl = `${window.location.origin}${markdownUrl}`;
    const prompt = `Read from ${markdownFullUrl} so I can ask questions about it.`;
    return `${baseUrl}/?q=${encodeURIComponent(prompt)}`;
  }
  /**
   * Generate Claude share URL with page context
   */
  function generateClaudeUrl(
    title: string,
    pageUrl: string,
    markdownUrl: string
  ): string {
    // Claude.ai share URL pattern (as of 2025)
    const baseUrl = 'https://claude.ai/new';
    const markdownFullUrl = `${window.location.origin}${markdownUrl}`;
    const prompt = `Read from ${markdownFullUrl} so I can ask questions about it.`;
    return `${baseUrl}?q=${encodeURIComponent(prompt)}`;
  }

  /**
   * Fetch markdown content for clipboard copy
   */
  async function fetchMarkdownContent(): Promise<string> {
    try {
      const response = await fetch(config.markdownUrl);
      if (!response.ok) {
        throw new Error(`Failed to fetch Markdown: ${response.statusText}`);
      }
      return await response.text();
    } catch (error) {
      console.error('Error fetching Markdown content:', error);
      throw error;
    }
  }

  /**
   * Copy content to clipboard
   */
  async function copyToClipboard(text: string): Promise<void> {
    try {
      await navigator.clipboard.writeText(text);
      showNotification('Copied to clipboard!', 'success');
    } catch (error) {
      console.error('Failed to copy to clipboard:', error);
      showNotification('Failed to copy to clipboard', 'error');
    }
  }

  /**
   * Show notification (integrates with existing notifications module)
   */
  function showNotification(message: string, type: 'success' | 'error'): void {
    // TODO: Integrate with existing notifications module
    // For now, use a simple console log
    console.log(`[${type.toUpperCase()}] ${message}`);

    // Optionally add a simple visual notification
    const notification = document.createElement('div');
    notification.textContent = message;
    notification.style.cssText = `
      position: fixed;
      bottom: 20px;
      right: 20px;
      padding: 12px 20px;
      background: ${type === 'success' ? '#10b981' : '#ef4444'};
      color: white;
      border-radius: 6px;
      z-index: 10000;
      font-size: 14px;
    `;
    document.body.appendChild(notification);
    setTimeout(() => notification.remove(), 3000);
  }

  /**
   * Handle copy page action
   */
  async function handleCopyPage(): Promise<void> {
    try {
      const markdown = await fetchMarkdownContent();
      await copyToClipboard(markdown);
      closeDropdown();
    } catch (error) {
      console.error('Failed to copy page:', error);
    }
  }

  /**
   * Handle copy section action (aggregates child pages)
   */
  async function handleCopySection(): Promise<void> {
    try {
      // Fetch aggregated section markdown (includes all child pages)
      const url = config.sectionMarkdownUrl || config.markdownUrl;
      const response = await fetch(url);

      if (!response.ok) {
        throw new Error(
          `Failed to fetch section markdown: ${response.statusText}`
        );
      }

      const markdown = await response.text();
      await copyToClipboard(markdown);
      showNotification('Section copied to clipboard', 'success');
      closeDropdown();
    } catch (error) {
      console.error('Failed to copy section:', error);
      showNotification('Failed to copy section', 'error');
    }
  }

  /**
   * Handle download page action (for single pages)
   * Commented out - not needed right now
   */
  /*
  function handleDownloadPage(): void {
    // Trigger download of current page as markdown
    window.open(config.markdownUrl, '_self');
    closeDropdown();
  }
  */

  /**
   * Handle download section action
   * Commented out - not yet implemented
   */
  /*
  function handleDownloadSection(): void {
    if (config.sectionDownloadUrl) {
      window.open(config.sectionDownloadUrl, '_self');
      closeDropdown();
    }
  }
  */
  /**
   * Handle external link action
   */
  function handleExternalLink(url: string): void {
    window.open(url, '_blank', 'noopener,noreferrer');
    closeDropdown();
  }

  /**
   * Build dropdown options based on config
   */
  function buildOptions(): FormatSelectorOption[] {
    const options: FormatSelectorOption[] = [];

    // Option 1: Copy page/section
    if (config.pageType === 'leaf') {
      options.push({
        label: 'Copy page for AI',
        sublabel: 'Clean Markdown optimized for AI assistants',
        icon: 'document',
        action: handleCopyPage,
        external: false,
        visible: true,
        dataAttribute: 'copy-page',
      });
    } else {
      options.push({
        label: 'Copy section for AI',
        sublabel: `${config.childPageCount} pages combined as clean Markdown for AI assistants`,
        icon: 'document',
        action: handleCopySection,
        external: false,
        visible: true,
        dataAttribute: 'copy-section',
      });
    }

    // Option 1b: Download page (for leaf nodes)
    // Removed - not needed right now
    /*
    if (config.pageType === 'leaf' && config.markdownUrl) {
      options.push({
        label: 'Download page',
        sublabel: 'Download page as Markdown file',
        icon: 'download',
        action: handleDownloadPage,
        external: false,
        visible: true,
        dataAttribute: 'download-page',
      });
    }
    */

    // Option 2: Open in ChatGPT
    options.push({
      label: 'Open in ChatGPT',
      sublabel: 'Ask questions about this page',
      icon: 'chatgpt',
      action: () => handleExternalLink(config.chatGptUrl),
      href: config.chatGptUrl,
      target: '_blank',
      external: true,
      visible: true,
      dataAttribute: 'open-chatgpt',
    });

    // Option 3: Open in Claude
    options.push({
      label: 'Open in Claude',
      sublabel: 'Ask questions about this page',
      icon: 'claude',
      action: () => handleExternalLink(config.claudeUrl),
      href: config.claudeUrl,
      target: '_blank',
      external: true,
      visible: true,
      dataAttribute: 'open-claude',
    });

    // Future: Download section option
    // Commented out - not yet implemented
    /*
    if (config.pageType === 'branch') {
      const shouldShowDownload =
        (config.childPageCount && config.childPageCount > 10) ||
        (config.estimatedTokens && config.estimatedTokens >= 50000);

      if (shouldShowDownload && config.sectionDownloadUrl) {
        options.push({
          label: 'Download section',
          sublabel: `Download all ${config.childPageCount} pages (.zip with /md and /txt folders)`,
          icon: 'download',
          action: handleDownloadSection,
          external: false,
          visible: true,
          dataAttribute: 'download-section',
        });
      }
    }
    */

    // Future: MCP server options
    // Commented out for now - will be implemented as future enhancement
    /*
    if (config.mcpCursorUrl) {
      options.push({
        label: 'Connect to Cursor',
        sublabel: 'Install MCP Server on Cursor',
        icon: 'cursor',
        action: () => handleExternalLink(config.mcpCursorUrl!),
        href: config.mcpCursorUrl,
        target: '_blank',
        external: true,
        visible: true,
        dataAttribute: 'connect-cursor',
      });
    }

    if (config.mcpVSCodeUrl) {
      options.push({
        label: 'Connect to VS Code',
        sublabel: 'Install MCP Server on VS Code',
        icon: 'vscode',
        action: () => handleExternalLink(config.mcpVSCodeUrl!),
        href: config.mcpVSCodeUrl,
        target: '_blank',
        external: true,
        visible: true,
        dataAttribute: 'connect-vscode',
      });
    }
    */

    return options.filter((opt) => opt.visible);
  }
  /**
   * Get SVG icon for option
   */
  function getIconSVG(iconName: string): string {
    const icons: Record<string, string> = {
      document: `<svg viewBox="0 0 20 20" fill="none" xmlns="http://www.w3.org/2000/svg">
        <path d="M6 2C4.89543 2 4 2.89543 4 4V16C4 17.1046 4.89543 18 6 18H14C15.1046 18 16 17.1046 16 16V7L11 2H6Z" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
        <path d="M11 2V7H16" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
      </svg>`,
      chatgpt: `<svg viewBox="0 0 721 721" fill="none" xmlns="http://www.w3.org/2000/svg">
        <g clip-path="url(#clip0_chatgpt)">
          <path d="M304.246 294.611V249.028C304.246 245.189 305.687 242.309 309.044 240.392L400.692 187.612C413.167 180.415 428.042 177.058 443.394 177.058C500.971 177.058 537.44 221.682 537.44 269.182C537.44 272.54 537.44 276.379 536.959 280.218L441.954 224.558C436.197 221.201 430.437 221.201 424.68 224.558L304.246 294.611ZM518.245 472.145V363.224C518.245 356.505 515.364 351.707 509.608 348.349L389.174 278.296L428.519 255.743C431.877 253.826 434.757 253.826 438.115 255.743L529.762 308.523C556.154 323.879 573.905 356.505 573.905 388.171C573.905 424.636 552.315 458.225 518.245 472.141V472.145ZM275.937 376.182L236.592 353.152C233.235 351.235 231.794 348.354 231.794 344.515V238.956C231.794 187.617 271.139 148.749 324.4 148.749C344.555 148.749 363.264 155.468 379.102 167.463L284.578 222.164C278.822 225.521 275.942 230.319 275.942 237.039V376.186L275.937 376.182ZM360.626 425.122L304.246 393.455V326.283L360.626 294.616L417.002 326.283V393.455L360.626 425.122ZM396.852 570.989C376.698 570.989 357.989 564.27 342.151 552.276L436.674 497.574C442.431 494.217 445.311 489.419 445.311 482.699V343.552L485.138 366.582C488.495 368.499 489.936 371.379 489.936 375.219V480.778C489.936 532.117 450.109 570.985 396.852 570.985V570.989ZM283.134 463.99L191.486 411.211C165.094 395.854 147.343 363.229 147.343 331.562C147.343 294.616 169.415 261.509 203.48 247.593V356.991C203.48 363.71 206.361 368.508 212.117 371.866L332.074 441.437L292.729 463.99C289.372 465.907 286.491 465.907 283.134 463.99ZM277.859 542.68C223.639 542.68 183.813 501.895 183.813 451.514C183.813 447.675 184.294 443.836 184.771 439.997L279.295 494.698C285.051 498.056 290.812 498.056 296.568 494.698L417.002 425.127V470.71C417.002 474.549 415.562 477.429 412.204 479.346L320.557 532.126C308.081 539.323 293.206 542.68 277.854 542.68H277.859ZM396.852 599.776C454.911 599.776 503.37 558.513 514.41 503.812C568.149 489.896 602.696 439.515 602.696 388.176C602.696 354.587 588.303 321.962 562.392 298.45C564.791 288.373 566.231 278.296 566.231 268.224C566.231 199.611 510.571 148.267 446.274 148.267C433.322 148.267 420.846 150.184 408.37 154.505C386.775 133.392 357.026 119.958 324.4 119.958C266.342 119.958 217.883 161.22 206.843 215.921C153.104 229.837 118.557 280.218 118.557 331.557C118.557 365.146 132.95 397.771 158.861 421.283C156.462 431.36 155.022 441.437 155.022 451.51C155.022 520.123 210.682 571.466 274.978 571.466C287.931 571.466 300.407 569.549 312.883 565.228C334.473 586.341 364.222 599.776 396.852 599.776Z" fill="currentColor"/>
        </g>
        <defs>
          <clipPath id="clip0_chatgpt">
            <rect width="720" height="720" fill="white" transform="translate(0.607 0.1)"/>
          </clipPath>
        </defs>
      </svg>`,
      claude: `<svg viewBox="0 0 250 251" fill="none" xmlns="http://www.w3.org/2000/svg">
        <path d="M49.0541 166.749L98.2432 139.166L99.0541 136.75L98.2432 135.405H95.8108L87.5676 134.903L59.4595 134.151L35.1351 133.148L11.4865 131.894L5.54054 130.64L0 123.243L0.540541 119.607L5.54054 116.222L12.7027 116.849L28.5135 117.977L52.2973 119.607L69.4595 120.61L95 123.243H99.0541L99.5946 121.613L98.2432 120.61L97.1622 119.607L72.5676 102.932L45.9459 85.3796L32.027 75.2242L24.5946 70.0837L20.8108 65.3195L19.1892 54.7879L25.9459 47.2653L35.1351 47.8922L37.4324 48.5191L46.7568 55.6655L66.6216 71.0868L92.5676 90.1439L96.3514 93.2783L97.875 92.25L98.1081 91.5231L96.3514 88.6394L82.2973 63.1881L67.2973 37.2352L60.5405 26.4529L58.7838 20.0587C58.1033 17.3753 57.7027 15.1553 57.7027 12.4107L65.4054 1.87914L69.7297 0.5L80.1351 1.87914L84.4595 5.64042L90.9459 20.4348L101.351 43.6294L117.568 75.2242L122.297 84.6274L124.865 93.2783L125.811 95.9112H127.432V94.4067L128.784 76.6033L131.216 54.7879L133.649 26.7036L134.459 18.8049L138.378 9.27633L146.216 4.13591L152.297 7.01956L157.297 14.166L156.622 18.8049L153.649 38.1128L147.838 68.3285L144.054 88.6394H146.216L148.784 86.0065L159.054 72.4659L176.216 50.9012L183.784 42.3756L192.703 32.9724L198.378 28.4589H209.189L217.027 40.2442L213.514 52.4057L202.432 66.4478L193.243 78.3586L180.068 96.011L171.892 110.204L172.625 111.375L174.595 111.207L204.324 104.813L220.405 101.929L239.595 98.6695L248.243 102.682L249.189 106.819L245.811 115.219L225.27 120.234L201.216 125.124L165.397 133.556L165 133.875L165.468 134.569L181.622 136.032L188.514 136.408H205.405L236.892 138.79L245.135 144.181L250 150.826L249.189 155.966L236.486 162.361L219.459 158.349L179.595 148.82L165.946 145.435H164.054V146.563L175.405 157.722L196.351 176.528L222.432 200.851L223.784 206.869L220.405 211.633L216.892 211.132L193.919 193.83L185 186.057L165 169.131H163.649V170.886L168.243 177.656L192.703 214.392L193.919 225.676L192.162 229.311L185.811 231.568L178.919 230.314L164.459 210.129L149.73 187.561L137.838 167.25L136.402 168.157L129.324 243.73L126.081 247.616L118.514 250.5L112.162 245.736L108.784 237.962L112.162 222.541L116.216 202.481L119.459 186.558L122.432 166.749L124.248 160.131L124.088 159.688L122.637 159.932L107.703 180.415L85 211.132L67.027 230.314L62.7027 232.07L55.2703 228.183L55.9459 221.287L60.1351 215.144L85 183.549L100 163.865L109.668 152.566L109.573 150.932L109.04 150.886L42.973 193.955L31.2162 195.46L26.0811 190.696L26.7568 182.922L29.1892 180.415L49.0541 166.749Z" fill="currentColor"/>
      </svg>`,
      download: `<svg viewBox="0 0 20 20" fill="none" xmlns="http://www.w3.org/2000/svg">
        <path d="M10 3V13M10 13L14 9M10 13L6 9" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
        <path d="M3 15V16C3 16.5523 3.44772 17 4 17H16C16.5523 17 17 16.5523 17 16V15" stroke-width="1.5" stroke-linecap="round"/>
      </svg>`,
      cursor: `<svg viewBox="0 0 20 20" fill="none" xmlns="http://www.w3.org/2000/svg">
        <path d="M3 3L17 10L10 12L8 17L3 3Z" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
      </svg>`,
      vscode: `<svg viewBox="0 0 20 20" fill="none" xmlns="http://www.w3.org/2000/svg">
        <path d="M14 3L6 10L3 7L14 3Z" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
        <path d="M14 17L6 10L3 13L14 17Z" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
        <path d="M14 3V17" stroke-width="1.5" stroke-linecap="round"/>
      </svg>`,
    };
    return icons[iconName] || icons.document;
  }
  /**
   * Render dropdown options
   */
  function renderOptions(): void {
    const options = buildOptions();
    dropdownMenu.innerHTML = '';

    options.forEach((option) => {
      const optionEl = document.createElement(option.href ? 'a' : 'button');
      optionEl.classList.add('format-selector__option');
      optionEl.setAttribute('data-option', option.dataAttribute);

      if (option.href) {
        (optionEl as HTMLAnchorElement).href = option.href;
        if (option.target) {
          (optionEl as HTMLAnchorElement).target = option.target;
          (optionEl as HTMLAnchorElement).rel = 'noopener noreferrer';
        }
      }

      optionEl.innerHTML = `
        <span class="format-selector__icon">
          ${getIconSVG(option.icon)}
        </span>
        <span class="format-selector__label-group">
          <span class="format-selector__label">
            ${option.label}
            ${option.external ? '<span class="format-selector__external">↗</span>' : ''}
          </span>
          <span class="format-selector__sublabel">${option.sublabel}</span>
        </span>
      `;

      optionEl.addEventListener('click', (e) => {
        if (!option.href) {
          e.preventDefault();
          option.action();
        }
      });

      dropdownMenu.appendChild(optionEl);
    });
  }

  /**
   * Position dropdown relative to button using fixed positioning
   * Ensures dropdown stays within viewport bounds
   */
  function positionDropdown(): void {
    const buttonRect = button.getBoundingClientRect();
    const dropdownWidth = dropdownMenu.offsetWidth;
    const viewportWidth = window.innerWidth;
    const padding = 8; // Minimum padding from viewport edge

    // Always position dropdown below button with 8px gap
    dropdownMenu.style.top = `${buttonRect.bottom + 8}px`;

    // Calculate ideal left position (right-aligned with button)
    let leftPos = buttonRect.right - dropdownWidth;

    // Ensure dropdown doesn't go off the left edge
    if (leftPos < padding) {
      leftPos = padding;
    }

    // Ensure dropdown doesn't go off the right edge
    if (leftPos + dropdownWidth > viewportWidth - padding) {
      leftPos = viewportWidth - dropdownWidth - padding;
    }

    dropdownMenu.style.left = `${leftPos}px`;
  }

  /**
   * Handle resize events to reposition dropdown
   */
  function handleResize(): void {
    if (isOpen) {
      positionDropdown();
    }
  }

  /**
   * Open dropdown
   */
  function openDropdown(): void {
    isOpen = true;
    dropdownMenu.classList.add('is-open');
    button.setAttribute('aria-expanded', 'true');

    // Position dropdown relative to button
    positionDropdown();

    // Add listeners for repositioning and closing
    setTimeout(() => {
      document.addEventListener('click', handleClickOutside);
    }, 0);
    window.addEventListener('resize', handleResize);
    window.addEventListener('scroll', handleResize, true); // Capture scroll on any element
  }

  /**
   * Close dropdown
   */
  function closeDropdown(): void {
    isOpen = false;
    dropdownMenu.classList.remove('is-open');
    button.setAttribute('aria-expanded', 'false');
    document.removeEventListener('click', handleClickOutside);
    window.removeEventListener('resize', handleResize);
    window.removeEventListener('scroll', handleResize, true);
  }

  /**
   * Toggle dropdown
   */
  function toggleDropdown(): void {
    if (isOpen) {
      closeDropdown();
    } else {
      openDropdown();
    }
  }

  /**
   * Handle click outside dropdown
   */
  function handleClickOutside(event: Event): void {
    if (!component.contains(event.target as Node)) {
      closeDropdown();
    }
  }

  /**
   * Handle button click
   */
  function handleButtonClick(event: Event): void {
    event.preventDefault();
    event.stopPropagation();
    toggleDropdown();
  }

  /**
   * Handle escape key
   */
  function handleKeyDown(event: KeyboardEvent): void {
    if (event.key === 'Escape' && isOpen) {
      closeDropdown();
      button.focus();
    }
  }

  /**
   * Initialize component
   */
  function init(): void {
    // Initialize config
    initConfig();

    // Render options
    renderOptions();

    // Add event listeners
    button.addEventListener('click', handleButtonClick);
    document.addEventListener('keydown', handleKeyDown);

    // Set initial ARIA attributes
    button.setAttribute('aria-expanded', 'false');
    button.setAttribute('aria-haspopup', 'true');
    dropdownMenu.setAttribute('role', 'menu');
  }

  // Initialize on load
  init();

  // Expose for debugging
  return {
    get config() {
      return config;
    },
    openDropdown,
    closeDropdown,
    renderOptions,
  };
}
@@ -35,6 +35,7 @@ import DocSearch from './components/doc-search.js';
 import FeatureCallout from './feature-callouts.js';
 import FluxGroupKeysDemo from './flux-group-keys.js';
 import FluxInfluxDBVersionsTrigger from './flux-influxdb-versions.js';
+import FormatSelector from './components/format-selector.ts';
 import InfluxDBVersionDetector from './influxdb-version-detector.ts';
 import KeyBinding from './keybindings.js';
 import ListFilters from './list-filters.js';

@@ -65,6 +66,7 @@ const componentRegistry = {
   'feature-callout': FeatureCallout,
   'flux-group-keys-demo': FluxGroupKeysDemo,
   'flux-influxdb-versions-trigger': FluxInfluxDBVersionsTrigger,
+  'format-selector': FormatSelector,
   'influxdb-version-detector': InfluxDBVersionDetector,
   keybinding: KeyBinding,
   'list-filters': ListFilters,
@@ -1,12 +1,32 @@
/** This module retrieves browser context information and site data for the
/**
 * This module retrieves browser context information and site data for the
 * current page, version, and product.
 */
import { products } from './services/influxdata-products.js';
import { influxdbUrls } from './services/influxdb-urls.js';
import { getProductKeyFromPath } from './utils/product-mappings.js';

function getCurrentProductData() {
/**
 * Product data return type
 */
interface ProductDataResult {
  product: string | Record<string, unknown>;
  urls: Record<string, unknown>;
}

/**
 * Get current product data based on URL path
 */
function getCurrentProductData(): ProductDataResult {
  const path = window.location.pathname;
  const mappings = [

  interface ProductMapping {
    pattern: RegExp;
    product: Record<string, unknown> | string;
    urls: Record<string, unknown>;
  }

  const mappings: ProductMapping[] = [
    {
      pattern: /\/influxdb\/cloud\//,
      product: products.influxdb_cloud,

@@ -87,57 +107,58 @@ function getCurrentProductData() {
  return { product: 'other', urls: {} };
}

// Return the page context
// (cloud, serverless, oss/enterprise, dedicated, clustered, explorer, other)
function getContext() {
  if (/\/influxdb\/cloud\//.test(window.location.pathname)) {
    return 'cloud';
  } else if (/\/influxdb3\/core/.test(window.location.pathname)) {
    return 'core';
  } else if (/\/influxdb3\/enterprise/.test(window.location.pathname)) {
    return 'enterprise';
  } else if (/\/influxdb3\/cloud-serverless/.test(window.location.pathname)) {
    return 'serverless';
  } else if (/\/influxdb3\/cloud-dedicated/.test(window.location.pathname)) {
    return 'dedicated';
  } else if (/\/influxdb3\/clustered/.test(window.location.pathname)) {
    return 'clustered';
  } else if (/\/influxdb3\/explorer/.test(window.location.pathname)) {
    return 'explorer';
  } else if (
    /\/(enterprise_|influxdb).*\/v[1-2]\//.test(window.location.pathname)
  ) {
    return 'oss/enterprise';
  } else {
    return 'other';
  }
/**
 * Return the page context
 * (cloud, serverless, oss/enterprise, dedicated, clustered, core, enterprise, other)
 * Uses shared product key detection for consistency
 */
function getContext(): string {
  const productKey = getProductKeyFromPath(window.location.pathname);

  // Map product keys to context strings
  const contextMap: Record<string, string> = {
    influxdb_cloud: 'cloud',
    influxdb3_core: 'core',
    influxdb3_enterprise: 'enterprise',
    influxdb3_cloud_serverless: 'serverless',
    influxdb3_cloud_dedicated: 'dedicated',
    influxdb3_clustered: 'clustered',
    enterprise_influxdb: 'oss/enterprise',
    influxdb: 'oss/enterprise',
  };

  return contextMap[productKey || ''] || 'other';
}
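The refactored `getContext` replaces the regex `if`/`else` chain with a path → product key → context string lookup. A minimal sketch of that chain in isolation (an illustrative re-creation with a trimmed pattern table, not the shared module itself):

```typescript
// Trimmed-down version of the two-step lookup used by getContext().
const urlPatternMap: Record<string, string> = {
  '/influxdb3/core/': 'influxdb3_core',
  '/influxdb/cloud/': 'influxdb_cloud',
};

const contextMap: Record<string, string> = {
  influxdb3_core: 'core',
  influxdb_cloud: 'cloud',
};

// First hop: URL path to product key (substring match, as in product-mappings).
function productKeyFromPath(path: string): string | null {
  for (const [pattern, key] of Object.entries(urlPatternMap)) {
    if (path.includes(pattern)) return key;
  }
  return null;
}

// Second hop: product key to context string, with an 'other' fallback.
function contextForPath(path: string): string {
  return contextMap[productKeyFromPath(path) ?? ''] ?? 'other';
}
```

Keeping both hops table-driven means adding a product is a data change rather than another `else if` branch.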

// Store the host value for the current page
const currentPageHost = window.location.href.match(/^(?:[^/]*\/){2}[^/]+/g)[0];
const currentPageHost =
  window.location.href.match(/^(?:[^/]*\/){2}[^/]+/g)?.[0] || '';

function getReferrerHost() {
/**
 * Get referrer host from document.referrer
 */
function getReferrerHost(): string {
  // Extract the protocol and hostname of referrer
  const referrerMatch = document.referrer.match(/^(?:[^/]*\/){2}[^/]+/g);
  return referrerMatch ? referrerMatch[0] : '';
}

const context = getContext(),
  host = currentPageHost,
  hostname = location.hostname,
  path = location.pathname,
  pathArr = location.pathname.split('/').slice(1, -1),
  product = pathArr[0],
  productData = getCurrentProductData(),
  protocol = location.protocol,
  referrer = document.referrer === '' ? 'direct' : document.referrer,
  referrerHost = getReferrerHost(),
  // TODO: Verify this works since the addition of InfluxDB 3 naming
  // and the Core and Enterprise versions.
  version =
    /^v\d/.test(pathArr[1]) || pathArr[1]?.includes('cloud')
      ? pathArr[1].replace(/^v/, '')
      : 'n/a';
const context = getContext();
const host = currentPageHost;
const hostname = location.hostname;
const path = location.pathname;
const pathArr = location.pathname.split('/').slice(1, -1);
const product = pathArr[0];
const productData = getCurrentProductData();
const protocol = location.protocol;
const referrer = document.referrer === '' ? 'direct' : document.referrer;
const referrerHost = getReferrerHost();
// TODO: Verify this works since the addition of InfluxDB 3 naming
// and the Core and Enterprise versions.
const version =
  /^v\d/.test(pathArr[1]) || pathArr[1]?.includes('cloud')
    ? pathArr[1].replace(/^v/, '')
    : 'n/a';

export {
  context,
|
@@ -0,0 +1,126 @@
/**
 * Node.js module shim for TypeScript code that runs in both browser and Node.js
 *
 * This utility provides conditional imports for Node.js-only modules, allowing
 * TypeScript files to be bundled for the browser (via Hugo/esbuild) while still
 * working in Node.js environments.
 *
 * @module utils/node-shim
 */

/**
 * Detect if running in Node.js vs browser environment
 */
export const isNode =
  typeof process !== 'undefined' &&
  process.versions != null &&
  process.versions.node != null;

/**
 * Node.js module references (lazily loaded in Node.js environment)
 */
export interface NodeModules {
  fileURLToPath: (url: string) => string;
  dirname: (path: string) => string;
  join: (...paths: string[]) => string;
  readFileSync: (path: string, encoding: BufferEncoding) => string;
  existsSync: (path: string) => boolean;
  yaml: { load: (content: string) => unknown };
}

let nodeModulesCache: NodeModules | undefined;

/**
 * Lazy load Node.js modules (only when running in Node.js)
 *
 * This function dynamically imports Node.js built-in modules (`url`, `path`, `fs`)
 * and third-party modules (`js-yaml`) only when called in a Node.js environment.
 * In browser environments, this returns undefined and the imports are tree-shaken out.
 *
 * @returns Promise resolving to NodeModules or undefined
 *
 * @example
 * ```typescript
 * import { loadNodeModules, isNode } from './utils/node-shim.js';
 *
 * async function readConfig() {
 *   if (!isNode) return null;
 *
 *   const nodeModules = await loadNodeModules();
 *   if (!nodeModules) return null;
 *
 *   const configPath = nodeModules.join(__dirname, 'config.yml');
 *   if (nodeModules.existsSync(configPath)) {
 *     const content = nodeModules.readFileSync(configPath, 'utf8');
 *     return nodeModules.yaml.load(content);
 *   }
 * }
 * ```
 */
export async function loadNodeModules(): Promise<NodeModules | undefined> {
  // Early return for browser - this branch will be eliminated by tree-shaking
  if (!isNode) {
    return undefined;
  }

  // Return cached modules if already loaded
  if (nodeModulesCache) {
    return nodeModulesCache;
  }

  // This code path is never reached in browser builds due to isNode check above
  // The dynamic imports will be tree-shaken out by esbuild
  try {
    // Use Function constructor to hide imports from static analysis
    // This prevents esbuild from trying to resolve them during browser builds
    const loadModule = new Function('moduleName', 'return import(moduleName)');

    const [urlModule, pathModule, fsModule, yamlModule] = await Promise.all([
      loadModule('url'),
      loadModule('path'),
      loadModule('fs'),
      loadModule('js-yaml'),
    ]);

    nodeModulesCache = {
      fileURLToPath: urlModule.fileURLToPath,
      dirname: pathModule.dirname,
      join: pathModule.join,
      readFileSync: fsModule.readFileSync,
      existsSync: fsModule.existsSync,
      yaml: yamlModule.default as { load: (content: string) => unknown },
    };

    return nodeModulesCache;
  } catch (err) {
    if (err instanceof Error) {
      console.warn('Failed to load Node.js modules:', err.message);
    }
    return undefined;
  }
}

/**
 * Get the directory path of the current module (Node.js only)
 *
 * @param importMetaUrl - import.meta.url from the calling module
 * @returns Directory path or undefined if not in Node.js
 *
 * @example
 * ```typescript
 * import { getModuleDir } from './utils/node-shim.js';
 *
 * const moduleDir = await getModuleDir(import.meta.url);
 * ```
 */
export async function getModuleDir(
  importMetaUrl: string
): Promise<string | undefined> {
  const nodeModules = await loadNodeModules();
  if (!nodeModules) {
    return undefined;
  }

  const filename = nodeModules.fileURLToPath(importMetaUrl);
  return nodeModules.dirname(filename);
}
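The shim rests on two moves: runtime environment detection, and hiding dynamic imports behind a `Function` constructor so esbuild's static analysis never sees the module names. Both can be sketched in isolation (a minimal sketch, not the module itself; `node:path` stands in for the real dependency list):

```typescript
// Same detection logic as node-shim's isNode export.
const isNode =
  typeof process !== 'undefined' &&
  process.versions != null &&
  process.versions.node != null;

// Indirect import: the bundler cannot statically resolve the module name,
// so browser builds don't try to bundle Node-only modules.
const loadModule = new Function('m', 'return import(m)') as unknown as (
  m: string
) => Promise<unknown>;

async function joinPaths(...parts: string[]): Promise<string | undefined> {
  // In browser bundles this branch makes the rest dead code.
  if (!isNode) return undefined;
  const pathModule = (await loadModule('node:path')) as {
    join: (...p: string[]) => string;
  };
  return pathModule.join(...parts);
}
```

The trade-off of the `Function` indirection is that bundlers also cannot warn about typos in the module name; failures surface only at runtime, which is why the real module wraps the loads in `try`/`catch`.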

@@ -0,0 +1,234 @@
/**
 * Shared product mapping and detection utilities
 *
 * This module provides URL-to-product mapping for both browser and Node.js environments.
 * In Node.js, it reads from data/products.yml. In browser, it uses fallback mappings.
 *
 * @module utils/product-mappings
 */

import { isNode, loadNodeModules } from './node-shim.js';

/**
 * Product information interface
 */
export interface ProductInfo {
  /** Full product display name */
  name: string;
  /** Product version or context identifier */
  version: string;
}

/**
 * Full product data from products.yml
 */
export interface ProductData {
  name: string;
  altname?: string;
  namespace: string;
  menu_category?: string;
  versions?: string[];
  list_order?: number;
  latest?: string;
  latest_patch?: string;
  latest_patches?: Record<string, string>;
  latest_cli?: string | Record<string, string>;
  placeholder_host?: string;
  link?: string;
  succeeded_by?: string;
  detector_config?: {
    query_languages?: Record<string, unknown>;
    characteristics?: string[];
    detection?: {
      ping_headers?: Record<string, string>;
      url_contains?: string[];
    };
  };
  ai_sample_questions?: string[];
}

/**
 * Products YAML data structure
 */
type ProductsData = Record<string, ProductData>;

let productsData: ProductsData | null = null;

/**
 * Load products data from data/products.yml (Node.js only)
 */
async function loadProductsData(): Promise<ProductsData | null> {
  if (!isNode) {
    return null;
  }

  if (productsData) {
    return productsData;
  }

  try {
    // Lazy load Node.js modules using shared shim
    const nodeModules = await loadNodeModules();
    if (!nodeModules) {
      return null;
    }

    const __filename = nodeModules.fileURLToPath(import.meta.url);
    const __dirname = nodeModules.dirname(__filename);
    const productsPath = nodeModules.join(
      __dirname,
      '../../../data/products.yml'
    );

    if (nodeModules.existsSync(productsPath)) {
      const fileContents = nodeModules.readFileSync(productsPath, 'utf8');
      productsData = nodeModules.yaml.load(fileContents) as ProductsData;
      return productsData;
    }
  } catch (err) {
    if (err instanceof Error) {
      console.warn('Could not load products.yml:', err.message);
    }
  }

  return null;
}

/**
 * URL pattern to product key mapping
 * Used for quick lookups based on URL path
 */
const URL_PATTERN_MAP: Record<string, string> = {
  '/influxdb3/core/': 'influxdb3_core',
  '/influxdb3/enterprise/': 'influxdb3_enterprise',
  '/influxdb3/cloud-dedicated/': 'influxdb3_cloud_dedicated',
  '/influxdb3/cloud-serverless/': 'influxdb3_cloud_serverless',
  '/influxdb3/clustered/': 'influxdb3_clustered',
  '/influxdb3/explorer/': 'influxdb3_explorer',
  '/influxdb/cloud/': 'influxdb_cloud',
  '/influxdb/v2': 'influxdb',
  '/influxdb/v1': 'influxdb',
  '/enterprise_influxdb/': 'enterprise_influxdb',
  '/telegraf/': 'telegraf',
  '/chronograf/': 'chronograf',
  '/kapacitor/': 'kapacitor',
  '/flux/': 'flux',
};

/**
 * Get the product key from a URL path
 *
 * @param path - URL path (e.g., '/influxdb3/core/get-started/')
 * @returns Product key (e.g., 'influxdb3_core') or null
 */
export function getProductKeyFromPath(path: string): string | null {
  for (const [pattern, key] of Object.entries(URL_PATTERN_MAP)) {
    if (path.includes(pattern)) {
      return key;
    }
  }
  return null;
}

// Fallback product mappings (used in browser and as fallback in Node.js)
const PRODUCT_FALLBACK_MAP: Record<string, ProductInfo> = {
  influxdb3_core: { name: 'InfluxDB 3 Core', version: 'core' },
  influxdb3_enterprise: {
    name: 'InfluxDB 3 Enterprise',
    version: 'enterprise',
  },
  influxdb3_cloud_dedicated: {
    name: 'InfluxDB Cloud Dedicated',
    version: 'cloud-dedicated',
  },
  influxdb3_cloud_serverless: {
    name: 'InfluxDB Cloud Serverless',
    version: 'cloud-serverless',
  },
  influxdb3_clustered: { name: 'InfluxDB Clustered', version: 'clustered' },
  influxdb3_explorer: { name: 'InfluxDB 3 Explorer', version: 'explorer' },
  influxdb_cloud: { name: 'InfluxDB Cloud (TSM)', version: 'cloud' },
  influxdb: { name: 'InfluxDB', version: 'v1' }, // Will be refined below
  enterprise_influxdb: { name: 'InfluxDB Enterprise v1', version: 'v1' },
  telegraf: { name: 'Telegraf', version: 'v1' },
  chronograf: { name: 'Chronograf', version: 'v1' },
  kapacitor: { name: 'Kapacitor', version: 'v1' },
  flux: { name: 'Flux', version: 'v0' },
};

/**
 * Get product information from a URL path (synchronous)
 * Returns simplified product info with name and version
 *
 * @param path - URL path to check (e.g., '/influxdb3/core/get-started/')
 * @returns Product info or null if no match
 *
 * @example
 * ```typescript
 * const product = getProductFromPath('/influxdb3/core/admin/');
 * // Returns: { name: 'InfluxDB 3 Core', version: 'core' }
 * ```
 */
export function getProductFromPath(path: string): ProductInfo | null {
  const productKey = getProductKeyFromPath(path);
  if (!productKey) {
    return null;
  }

  // If we have cached YAML data (Node.js), use it
  if (productsData && productsData[productKey]) {
    const product = productsData[productKey];
    return {
      name: product.name,
      version: product.latest || product.versions?.[0] || 'unknown',
    };
  }

  // Use fallback map
  const fallbackInfo = PRODUCT_FALLBACK_MAP[productKey];
  if (!fallbackInfo) {
    return null;
  }

  // Handle influxdb product which can be v1 or v2
  if (productKey === 'influxdb') {
    return {
      name: path.includes('/v2') ? 'InfluxDB OSS v2' : 'InfluxDB OSS v1',
      version: path.includes('/v2') ? 'v2' : 'v1',
    };
  }

  return fallbackInfo;
}

/**
 * Initialize product data from YAML (Node.js only, async)
 * Call this in Node.js scripts to load product data before using getProductFromPath
 */
export async function initializeProductData(): Promise<void> {
  if (isNode && !productsData) {
    await loadProductsData();
  }
}

/**
 * Get full product data from products.yml (Node.js only)
 * Note: Call initializeProductData() first to load the YAML data
 *
 * @param productKey - Product key (e.g., 'influxdb3_core')
 * @returns Full product data object or null
 */
export function getProductData(productKey: string): ProductData | null {
  if (!isNode) {
    console.warn('getProductData() is only available in Node.js environment');
    return null;
  }

  // Use cached data (requires initializeProductData() to have been called)
  return productsData?.[productKey] || null;
}

/**
 * Export URL pattern map for external use
 */
export { URL_PATTERN_MAP };
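The fallback path through `getProductFromPath`, including the v1-vs-v2 refinement for the bare `influxdb` key, can be exercised with a trimmed-down re-creation (an illustrative sketch with a reduced pattern table, not an import of the module):

```typescript
// Trimmed re-creation of the fallback lookup in getProductFromPath().
interface ProductInfo {
  name: string;
  version: string;
}

const patternMap: Record<string, string> = {
  '/influxdb/v2': 'influxdb',
  '/influxdb/v1': 'influxdb',
  '/telegraf/': 'telegraf',
};

const fallbackMap: Record<string, ProductInfo> = {
  influxdb: { name: 'InfluxDB', version: 'v1' },
  telegraf: { name: 'Telegraf', version: 'v1' },
};

function productFromPath(path: string): ProductInfo | null {
  const key = Object.entries(patternMap).find(([p]) => path.includes(p))?.[1];
  if (!key) return null;
  // The 'influxdb' key covers both OSS major versions; refine from the path.
  if (key === 'influxdb') {
    const isV2 = path.includes('/v2');
    return {
      name: isV2 ? 'InfluxDB OSS v2' : 'InfluxDB OSS v1',
      version: isV2 ? 'v2' : 'v1',
    };
  }
  return fallbackMap[key] ?? null;
}
```

Because the pattern keys for v1 and v2 map to the same product key, the version split has to happen after the key lookup, which is why the real function special-cases `productKey === 'influxdb'`.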

@@ -0,0 +1,243 @@
/**
 * Format Selector Component Styles
 *
 * Dropdown menu for accessing documentation in LLM-friendly formats.
 * Uses theme colors to match light/dark modes.
 */

.format-selector {
  position: relative;
  display: inline-flex;
  align-items: center;
  margin-left: auto; // Right-align in title container
  margin-top: 0.5rem;

  // Position near article title
  .title & {
    margin-left: auto;
  }
}

.format-selector__button {
  display: inline-flex;
  align-items: center;
  gap: 0.5rem;
  padding: 0.5rem 0.75rem;
  background: $sidebar-search-bg;
  color: $article-text;
  border: 1px solid $nav-border;
  border-radius: $radius;
  font-size: 14px;
  font-weight: 500;
  line-height: 1;
  cursor: pointer;
  transition: all 0.2s ease;
  white-space: nowrap;
  box-shadow: 2px 2px 6px $sidebar-search-shadow;

  &:hover {
    border-color: $sidebar-search-highlight;
    box-shadow: 1px 1px 10px rgba($sidebar-search-highlight, .5);
  }

  &:focus {
    outline: 2px solid $sidebar-search-highlight;
    outline-offset: 2px;
  }

  &[aria-expanded='true'] {
    border-color: $sidebar-search-highlight;

    .format-selector__button-arrow svg {
      transform: rotate(180deg);
    }
  }
}

.format-selector__button-icon {
  display: inline-flex;
  align-items: center;
  justify-content: center;
  width: 16px;
  height: 16px;

  svg {
    width: 100%;
    height: 100%;
    color: $nav-item;
  }
}

.format-selector__button-text {
  font-size: 14px;
  font-weight: 500;
}

.format-selector__button-arrow {
  display: inline-flex;
  align-items: center;
  justify-content: center;
  width: 12px;
  height: 12px;
  margin-left: 0.25rem;

  svg {
    width: 100%;
    height: 100%;
    transition: transform 0.2s ease;
  }
}

// Dropdown menu
.format-selector__dropdown {
  position: fixed; // Use fixed to break out of parent stacking context
  // Position will be calculated by JavaScript to align with button
  min-width: 280px;
  max-width: 320px;
  background: $article-bg;
  border: 1px solid $nav-border;
  border-radius: 8px;
  box-shadow: 2px 2px 6px $article-shadow;
  padding: 0.5rem;
  z-index: 10000; // Higher than sidebar and other elements
  opacity: 0;
  visibility: hidden;
  transform: translateY(-8px);
  transition: all 0.2s ease;
  pointer-events: none;

  &.is-open {
    opacity: 1;
    visibility: visible;
    transform: translateY(0);
    pointer-events: auto;
  }
}

// Dropdown options (buttons and links)
.format-selector__option {
  display: flex;
  align-items: flex-start;
  gap: 0.75rem;
  width: 100%;
  padding: 0.75rem;
  background: transparent;
  color: $article-text;
  border: none;
  border-radius: $radius;
  text-align: left;
  text-decoration: none;
  cursor: pointer;
  transition: background 0.15s ease;

  &:hover {
    background: $sidebar-search-bg;
    color: $nav-item-hover;
  }

  &:focus {
    outline: 2px solid $sidebar-search-highlight;
    outline-offset: -2px;
  }

  &:not(:last-child) {
    margin-bottom: 0.25rem;
  }
}

.format-selector__icon {
  display: inline-flex;
  align-items: center;
  justify-content: center;
  width: 20px;
  height: 20px;
  flex-shrink: 0;
  margin-top: 2px; // Align with first line of text

  svg {
    width: 100%;
    height: 100%;

    // Support both stroke and fill-based icons
    stroke: $nav-item;

    // For fill-based icons (like OpenAI Blossom), use currentColor
    [fill]:not([fill="none"]):not([fill="white"]) {
      fill: $nav-item;
    }
  }
}

.format-selector__label-group {
  display: flex;
  flex-direction: column;
  gap: 0.25rem;
  flex: 1;
  min-width: 0; // Allow text truncation
}

.format-selector__label {
  display: flex;
  align-items: center;
  gap: 0.5rem;
  font-size: 14px;
  font-weight: 500;
  line-height: 1.3;
  color: $article-text;
}

.format-selector__external {
  display: inline-flex;
  align-items: center;
  font-size: 12px;
  color: $nav-item;
  margin-left: 0.25rem;
  opacity: 0.7;
}

.format-selector__sublabel {
  font-size: 12px;
  line-height: 1.4;
  color: $nav-item;
}

// Responsive adjustments
@media (max-width: 768px) {
  .format-selector {
    // Stack vertically on mobile
    margin-left: 0;
    margin-top: 1rem;
  }

  .format-selector__dropdown {
    right: auto;
    left: 0;
    min-width: 100%;
    max-width: 100%;
  }
}

// Theme styles are now automatically handled by SCSS variables
// that switch based on the active theme (light/dark)

// Ensure dropdown appears above other content
.format-selector__dropdown {
  isolation: isolate;
}

// Animation for notification (temporary toast)
@keyframes slideInUp {
  from {
    transform: translateY(100%);
    opacity: 0;
  }
  to {
    transform: translateY(0);
    opacity: 1;
  }
}

// Add smooth transitions
* {
  box-sizing: border-box;
}
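Because the dropdown uses `position: fixed` to escape the parent stacking context, its coordinates have to be computed in script from the trigger button's viewport rect, as the comment in the stylesheet notes. A minimal sketch of that calculation (function name, parameters, and clamping behavior are assumptions for illustration, not taken from the component):

```typescript
// Compute fixed-position coordinates that left-align a dropdown
// under its trigger button, clamped to stay inside the viewport.
interface Rect {
  left: number;
  bottom: number;
}

function dropdownPosition(
  buttonRect: Rect, // e.g. from button.getBoundingClientRect()
  dropdownWidth: number,
  viewportWidth: number,
  gap = 4
): { top: number; left: number } {
  const top = buttonRect.bottom + gap;
  // If the button sits near the right edge, shift the menu left so it fits.
  const left = Math.min(buttonRect.left, viewportWidth - dropdownWidth);
  return { top, left: Math.max(0, left) };
}
```

In the browser this would typically run on open and again on scroll/resize, since fixed positioning does not track the button automatically.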

@@ -2,7 +2,7 @@
  display: flex;
  flex-direction: row;
  position: relative;
  overflow: hidden;
  overflow: visible; // Changed from hidden to allow format-selector dropdown
  border-radius: $radius 0 0 $radius;
  min-height: 700px;
  @include gradient($landing-artwork-gradient);

@@ -35,5 +35,6 @@
  "layouts/v3-wayfinding";

// Import Components
@import "components/influxdb-version-detector";
@import "components/influxdb-version-detector",
  "components/format-selector";

@@ -55,6 +55,24 @@ outputFormats:
    mediaType: application/json
    baseName: pages
    isPlainText: true
  llmstxt:
    mediaType: text/plain
    baseName: llms
    isPlainText: true
    notAlternative: true
    permalinkable: true
    suffixes:
      - txt

outputs:
  page:
    - HTML
  section:
    - HTML
    # llmstxt disabled for sections - using .md files via Lambda@Edge instead
  home:
    - HTML
    - llmstxt # Root /llms.txt for AI agent discovery

# Asset processing configuration for development
build:
@ -139,13 +139,15 @@ describe('InfluxDB Version Detector Component', function () {
|
|||
// Each describe block will visit the page once
|
||||
|
||||
describe('Component Data Attributes', function () {
|
||||
beforeEach(() => {
|
||||
cy.visit('/test-version-detector/');
|
||||
// The trigger is an anchor element with .btn class, not a button
|
||||
cy.contains(modalTriggerSelector, 'Detect my InfluxDB version').click();
|
||||
});
|
||||
|
||||
it('should not throw JavaScript console errors', function () {
|
||||
cy.visit('/test-version-detector/');
|
||||
cy.contains(modalTriggerSelector, 'Detect my InfluxDB version').click();
|
||||
|
||||
// Wait for modal to be visible
|
||||
cy.get('[data-component="influxdb-version-detector"]', {
|
||||
timeout: 5000,
|
||||
}).should('be.visible');
|
||||
|
||||
cy.window().then((win) => {
|
||||
const logs = [];
|
||||
const originalError = win.console.error;
|
||||
|
|
@ -177,7 +179,10 @@ describe('InfluxDB Version Detector Component', function () {
|
|||
cy.get('[data-component="influxdb-version-detector"]')
|
||||
.eq(0)
|
||||
.within(() => {
|
||||
cy.get('.option-button').contains('Yes, I know the URL').click();
|
||||
cy.get('#q-url-known .option-button')
|
||||
.contains('Yes, I know the URL')
|
||||
.should('be.visible')
|
||||
.click();
|
||||
it('should suggest legacy editions for custom URL or hostname', function () {
|
||||
cy.get('#url-input', { timeout: 10000 })
|
||||
.clear()
|
||||
|
|
@ -358,8 +363,23 @@ describe('InfluxDB Version Detector Component', function () {
|
|||
});
|
||||
|
||||
it('should handle cloud context detection', function () {
|
||||
// Click "Yes, I know the URL" first
|
||||
cy.get('.option-button').contains('Yes, I know the URL').click();
|
||||
cy.visit('/test-version-detector/');
|
||||
cy.contains(modalTriggerSelector, 'Detect my InfluxDB version').click();
|
||||
|
||||
// Wait for the button within the modal and question to be interactable
|
||||
cy.get('[data-component="influxdb-version-detector"]', { timeout: 5000 })
|
||||
.should('be.visible')
|
||||
.within(() => {
|
||||
cy.get('#q-url-known', { timeout: 5000 })
|
||||
.should('be.visible')
|
||||
.within(() => {
|
||||
cy.contains('.option-button', 'Yes, I know the URL', {
|
||||
timeout: 5000,
|
||||
})
|
||||
.should('be.visible')
|
||||
.click();
|
||||
});
|
||||
});
|
||||
|
||||
// Wait for URL input question to appear and then enter cloud context
|
||||
cy.get('#q-url-input', { timeout: 10000 }).should('be.visible');
|
||||
|
|
@ -367,7 +387,7 @@ describe('InfluxDB Version Detector Component', function () {
|
|||
.should('be.visible')
|
||||
.clear()
|
||||
.type('cloud 2');
|
||||
cy.get('.submit-button').click();
|
||||
cy.get('#q-url-input .submit-button').click();
|
||||
|
||||
// Should proceed to next step - either show result or start questionnaire
|
||||
// Don't be too specific about what happens next, just verify it progresses
|
||||
|
|
@ -381,8 +401,23 @@ describe('InfluxDB Version Detector Component', function () {
|
|||
});
|
||||
|
||||
it('should handle v3 port detection', function () {
|
||||
// Click "Yes, I know the URL" first
|
||||
cy.get('.option-button').contains('Yes, I know the URL').click();
|
||||
cy.visit('/test-version-detector/');
|
||||
cy.contains(modalTriggerSelector, 'Detect my InfluxDB version').click();
|
||||
|
||||
// Wait for the button within the modal and question to be interactable
|
||||
cy.get('[data-component="influxdb-version-detector"]', { timeout: 5000 })
|
||||
.should('be.visible')
|
||||
.within(() => {
|
||||
cy.get('#q-url-known', { timeout: 5000 })
|
||||
.should('be.visible')
|
||||
.within(() => {
|
||||
cy.contains('.option-button', 'Yes, I know the URL', {
|
||||
timeout: 5000,
|
||||
})
|
||||
.should('be.visible')
|
||||
.click();
|
||||
});
|
||||
});
|
||||
|
||||
// Wait for URL input question to appear and then test v3 port detection (8181)
|
||||
cy.get('#q-url-input', { timeout: 10000 }).should('be.visible');
|
||||
|
|
@ -390,7 +425,7 @@ describe('InfluxDB Version Detector Component', function () {
|
|||
.should('be.visible')
|
||||
.clear()
|
||||
.type('http://localhost:8181');
|
||||
cy.get('.submit-button').click();
|
||||
cy.get('#q-url-input .submit-button').click();
|
||||
|
||||
// Should progress to either result or questionnaire
|
||||
cy.get('body', { timeout: 15000 }).then(($body) => {
|
||||
|
|
@ -408,10 +443,18 @@ describe('InfluxDB Version Detector Component', function () {
|
|||
cy.visit('/test-version-detector/');
|
||||
// The trigger is an anchor element with .btn class, not a button
|
||||
cy.contains(modalTriggerSelector, 'Detect my InfluxDB version').click();
|
||||
|
||||
// Wait for modal to be visible
|
||||
cy.get('[data-component="influxdb-version-detector"]', {
|
||||
timeout: 5000,
|
||||
}).should('be.visible');
|
||||
});
|
||||
it('should start questionnaire for unknown URL', function () {
|
||||
// Click "Yes, I know the URL" first
|
||||
cy.get('.option-button').contains('Yes, I know the URL').click();
|
||||
cy.get('#q-url-known .option-button')
|
||||
.contains('Yes, I know the URL')
|
||||
.should('be.visible')
|
||||
.click();
|
||||
|
||||
cy.get('#url-input').clear().type('https://unknown-server.com:9999');
|
||||
cy.get('.submit-button').click();
|
||||
|
|
@ -422,7 +465,10 @@ describe('InfluxDB Version Detector Component', function () {
|
|||
|
||||
it('should complete basic questionnaire flow', function () {
|
||||
// Click "Yes, I know the URL" first
|
||||
cy.get('.option-button').contains('Yes, I know the URL').click();
|
||||
cy.get('#q-url-known .option-button')
|
||||
.contains('Yes, I know the URL')
|
||||
.should('be.visible')
|
||||
.click();
|
||||
|
||||
// Start questionnaire
|
||||
cy.get('#url-input')
|
||||
|
|
@ -469,7 +515,10 @@ describe('InfluxDB Version Detector Component', function () {
|
|||
|
||||
it('should NOT recommend InfluxDB 3 for Flux users (regression test)', function () {
|
||||
// Click "Yes, I know the URL" first
|
||||
cy.get('.option-button').contains('Yes, I know the URL').click();
|
||||
cy.get('#q-url-known .option-button')
|
||||
.contains('Yes, I know the URL')
|
||||
.should('be.visible')
|
||||
.click();
|
||||
|
||||
cy.get('#url-input').should('be.visible').clear().type('cloud 2');
|
||||
cy.get('.submit-button').click();
|
||||
|
|
@ -639,7 +688,10 @@ describe('InfluxDB Version Detector Component', function () {
|
|||
questionnaireScenarios.forEach((scenario) => {
|
||||
it(`should handle questionnaire scenario: ${scenario.name}`, function () {
|
||||
// Click "Yes, I know the URL" first
|
||||
cy.get('.option-button').contains('Yes, I know the URL').click();
|
||||
cy.get('#q-url-known .option-button')
|
||||
.contains('Yes, I know the URL')
|
||||
.should('be.visible')
|
||||
.click();
|
||||
|
||||
// Start questionnaire
|
||||
cy.get('#url-input').clear().type('https://unknown-server.com:9999');
|
||||
|
|
@@ -669,7 +721,10 @@ describe('InfluxDB Version Detector Component', function () {

  it('should NOT recommend InfluxDB 3 for 5+ year installations (time-aware)', function () {
    // Click "Yes, I know the URL" first
    cy.get('.option-button').contains('Yes, I know the URL').click();
    cy.get('#q-url-known .option-button')
      .contains('Yes, I know the URL')
      .should('be.visible')
      .click();

    cy.get('#url-input').clear().type('https://unknown-server.com:9999');
    cy.get('.submit-button').click();
@@ -693,7 +748,10 @@ describe('InfluxDB Version Detector Component', function () {

  it('should apply -100 Flux penalty to InfluxDB 3 products', function () {
    // Click "Yes, I know the URL" first
    cy.get('.option-button').contains('Yes, I know the URL').click();
    cy.get('#q-url-known .option-button')
      .contains('Yes, I know the URL')
      .should('be.visible')
      .click();

    cy.get('#url-input').clear().type('https://unknown-server.com:9999');
    cy.get('.submit-button').click();
@@ -716,7 +774,10 @@ describe('InfluxDB Version Detector Component', function () {

    const cloudPatterns = ['cloud 2', 'cloud v2', 'influxdb cloud 2'];

    // Test first pattern in current session
    cy.get('.option-button').contains('Yes, I know the URL').click();
    cy.get('#q-url-known .option-button')
      .contains('Yes, I know the URL')
      .should('be.visible')
      .click();
    cy.get('#url-input').clear().type(cloudPatterns[0]);
    cy.get('.submit-button').click();
    cy.get('.question.active').should('be.visible');
@@ -725,7 +786,10 @@ describe('InfluxDB Version Detector Component', function () {

  // Navigation and interaction tests
  it('should allow going back through questionnaire questions', function () {
    // Click "Yes, I know the URL" first
    cy.get('.option-button').contains('Yes, I know the URL').click();
    cy.get('#q-url-known .option-button')
      .contains('Yes, I know the URL')
      .should('be.visible')
      .click();

    // Start questionnaire
    cy.get('#url-input').clear().type('https://unknown-server.com:9999');
@@ -747,7 +811,10 @@ describe('InfluxDB Version Detector Component', function () {

  it('should allow restarting questionnaire from results', function () {
    // Click "Yes, I know the URL" first
    cy.get('.option-button').contains('Yes, I know the URL').click();
    cy.get('#q-url-known .option-button')
      .contains('Yes, I know the URL')
      .should('be.visible')
      .click();

    // Complete a questionnaire
    cy.get('#url-input').clear().type('https://unknown-server.com:9999');
@@ -779,11 +846,19 @@ describe('InfluxDB Version Detector Component', function () {

    cy.visit('/test-version-detector/');
    // The trigger is an anchor element with .btn class, not a button
    cy.contains(modalTriggerSelector, 'Detect my InfluxDB version').click();

    // Wait for modal to be visible
    cy.get('[data-component="influxdb-version-detector"]', {
      timeout: 5000,
    }).should('be.visible');
  });

  it('should handle empty URL input gracefully', function () {
    // Click "Yes, I know the URL" first
    cy.get('.option-button').contains('Yes, I know the URL').click();
    cy.get('#q-url-known .option-button')
      .contains('Yes, I know the URL')
      .should('be.visible')
      .click();

    cy.get('#url-input').clear();
    cy.get('.submit-button').click();
@@ -794,7 +869,10 @@ describe('InfluxDB Version Detector Component', function () {

  it('should handle invalid URL format gracefully', function () {
    // Click "Yes, I know the URL" first
    cy.get('.option-button').contains('Yes, I know the URL').click();
    cy.get('#q-url-known .option-button')
      .contains('Yes, I know the URL')
      .should('be.visible')
      .click();

    cy.get('#url-input').clear().type('not-a-valid-url');
    cy.get('.submit-button').click();
@@ -809,11 +887,19 @@ describe('InfluxDB Version Detector Component', function () {

    cy.visit('/test-version-detector/');
    // The trigger is an anchor element with .btn class, not a button
    cy.contains(modalTriggerSelector, 'Detect my InfluxDB version').click();

    // Wait for modal to be visible
    cy.get('[data-component="influxdb-version-detector"]', {
      timeout: 5000,
    }).should('be.visible');
  });

  it('should only show InfluxDB 3 products when SQL is selected', function () {
    // Click "Yes, I know the URL" first
    cy.get('.option-button').contains('Yes, I know the URL').click();
    cy.get('#q-url-known .option-button')
      .contains('Yes, I know the URL')
      .should('be.visible')
      .click();

    // Start questionnaire with unknown URL
    cy.get('#url-input').clear().type('https://unknown-server.com:9999');
@@ -0,0 +1,289 @@
/**
 * E2E tests for LLM-friendly format selector component
 * These tests validate the format selector dropdown for both leaf nodes (single pages)
 * and branch nodes (sections with children).
 */

describe('LLM Format Selector', () => {
  // Test configuration
  const LEAF_PAGE_URL = '/influxdb3/core/get-started/setup/';
  const SMALL_SECTION_URL = '/influxdb3/core/get-started/'; // Section with ≤10 pages
  const LARGE_SECTION_URL = '/influxdb3/core/query-data/'; // Section with >10 pages (if exists)

  /**
   * Setup: Generate markdown files for test paths
   * This runs once before all tests in this suite
   */
  before(() => {
    cy.log('Generating markdown files for test paths...');

    // Generate markdown for get-started section (small section + leaf page)
    cy.exec(
      'node scripts/html-to-markdown.js --path influxdb3/core/get-started',
      {
        failOnNonZeroExit: false,
        timeout: 60000,
      }
    ).then((result) => {
      if (result.code !== 0) {
        cy.log(
          'Warning: get-started markdown generation had issues:',
          result.stderr
        );
      }
    });

    // Generate markdown for query-data section (large section)
    cy.exec(
      'node scripts/html-to-markdown.js --path influxdb3/core/query-data --limit 15',
      {
        failOnNonZeroExit: false,
        timeout: 60000,
      }
    ).then((result) => {
      if (result.code !== 0) {
        cy.log(
          'Warning: query-data markdown generation had issues:',
          result.stderr
        );
      }
      cy.log('Markdown files generated successfully');
    });
  });
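The before() hook above (and the equivalent hooks in the validation spec later in this diff) repeat the same generator invocation; a tiny shared helper could give the specs one source of truth for the command string. A sketch only — `markdownGenCommand` is hypothetical and not an existing helper in this suite:

```javascript
// Hypothetical helper, not part of the suite: build the html-to-markdown
// generator invocation used by the before() hooks.
function markdownGenCommand(path, extraArgs = '') {
  return `node scripts/html-to-markdown.js --path ${path} ${extraArgs}`.trim();
}
```

Each hook would then pass `markdownGenCommand(...)` to cy.exec with the same `failOnNonZeroExit`/`timeout` options.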
  describe('Format Selector - Leaf Nodes (Single Pages)', () => {
    beforeEach(() => {
      cy.visit(LEAF_PAGE_URL);

      // Wait for component initialization
      cy.window().should((win) => {
        expect(win.influxdatadocs).to.exist;
        expect(win.influxdatadocs.instances).to.exist;
        expect(win.influxdatadocs.instances['format-selector']).to.exist;
      });
    });

    it('should display format selector button with correct label', () => {
      cy.get('[data-component="format-selector"]')
        .should('exist')
        .should('be.visible');

      cy.get(
        '[data-component="format-selector"] .format-selector__button'
      ).should('contain', 'Copy page for AI');
    });

    describe('Dropdown functionality', () => {
      beforeEach(() => {
        // Open dropdown once for all tests in this block
        cy.get(
          '[data-component="format-selector"] .format-selector__button'
        ).trigger('click');

        // Wait for dropdown animation (0.2s transition + small buffer)
        cy.wait(300);

        // Verify dropdown is open
        cy.get('[data-dropdown-menu].is-open')
          .should('exist')
          .should('be.visible');
      });

      it('should display dropdown menu with all options', () => {
        // Check that dropdown has options
        cy.get('[data-dropdown-menu].is-open [data-option]').should(
          'have.length.at.least',
          3
        ); // copy-page, open-chatgpt, open-claude
      });

      it('should display "Copy page for AI" option', () => {
        cy.get('[data-dropdown-menu].is-open [data-option="copy-page"]')
          .should('be.visible')
          .should('contain', 'Copy page for AI')
          .should('contain', 'Clean Markdown optimized for AI assistants');
      });

      it('should display "Open in ChatGPT" option with external link indicator', () => {
        cy.get('[data-dropdown-menu].is-open [data-option="open-chatgpt"]')
          .should('be.visible')
          .should('contain', 'Open in ChatGPT')
          .should('contain', 'Ask questions about this page')
          .should('contain', '↗')
          .should('have.attr', 'href')
          .and('include', 'chatgpt.com');
      });

      it('should display "Open in Claude" option with external link indicator', () => {
        cy.get('[data-dropdown-menu].is-open [data-option="open-claude"]')
          .should('be.visible')
          .should('contain', 'Open in Claude')
          .should('contain', 'Ask questions about this page')
          .should('contain', '↗')
          .should('have.attr', 'href')
          .and('include', 'claude.ai');
      });

      it('should display icons for each option', () => {
        cy.get('[data-dropdown-menu].is-open [data-option]').each(($option) => {
          cy.wrap($option).find('.format-selector__icon').should('exist');
        });
      });

      it('should open AI integration links in new tab', () => {
        cy.get(
          '[data-dropdown-menu].is-open [data-option="open-chatgpt"]'
        ).should('have.attr', 'target', '_blank');

        cy.get(
          '[data-dropdown-menu].is-open [data-option="open-claude"]'
        ).should('have.attr', 'target', '_blank');
      });
    });
  });
  describe('Format Selector - Branch Nodes (Small Sections)', () => {
    beforeEach(() => {
      cy.visit(SMALL_SECTION_URL);

      // Wait for component initialization
      cy.window().should((win) => {
        expect(win.influxdatadocs).to.exist;
        expect(win.influxdatadocs.instances).to.exist;
        expect(win.influxdatadocs.instances['format-selector']).to.exist;
      });
    });

    it('should show "Copy section for AI" label for branch nodes', () => {
      cy.get(
        '[data-component="format-selector"] .format-selector__button'
      ).should('contain', 'Copy section for AI');
    });

    describe('Dropdown functionality', () => {
      beforeEach(() => {
        // Open dropdown once for all tests in this block
        cy.get(
          '[data-component="format-selector"] .format-selector__button'
        ).trigger('click');

        // Wait for dropdown animation
        cy.wait(300);

        // Verify dropdown is open
        cy.get('[data-dropdown-menu].is-open')
          .should('exist')
          .should('be.visible');
      });

      it('should display "Copy section for AI" option with page count', () => {
        cy.get('[data-dropdown-menu].is-open [data-option="copy-section"]')
          .should('be.visible')
          .should('contain', 'Copy section for AI')
          .should(
            'contain',
            'pages combined as clean Markdown for AI assistants'
          );
      });

      it('should NOT show "Download section" option for small sections', () => {
        cy.get(
          '[data-dropdown-menu].is-open [data-option="download-section"]'
        ).should('not.exist');
      });

      it('should display ChatGPT and Claude options', () => {
        cy.get(
          '[data-dropdown-menu].is-open [data-option="open-chatgpt"]'
        ).should('be.visible');

        cy.get(
          '[data-dropdown-menu].is-open [data-option="open-claude"]'
        ).should('be.visible');
      });
    });
  });
  describe('Format Selector - Branch Nodes (Large Sections)', () => {
    beforeEach(() => {
      // Skip if large section doesn't exist
      cy.visit(LARGE_SECTION_URL, { failOnStatusCode: false });

      // Wait for component initialization if it exists
      cy.window().then((win) => {
        if (win.influxdatadocs && win.influxdatadocs.instances) {
          expect(win.influxdatadocs.instances['format-selector']).to.exist;
        }
      });
    });

    it('should show "Download section" option for large sections (>10 pages)', () => {
      // First check if this is actually a large section
      cy.get('[data-component="format-selector"]').then(($selector) => {
        const childCount = $selector.data('child-count');

        if (childCount && childCount > 10) {
          cy.get('[data-component="format-selector"] button').trigger('click');

          cy.wait(300);

          cy.get(
            '[data-dropdown-menu].is-open [data-option="download-section"]'
          )
            .should('be.visible')
            .should('contain', 'Download section')
            .should('contain', '.zip');
        } else {
          cy.log('Skipping: This section has ≤10 pages');
        }
      });
    });
  });

  describe('Markdown Content Quality', () => {
    it('should contain actual page content from HTML version', () => {
      // First, get the HTML version and extract some text
      cy.visit(LEAF_PAGE_URL);

      // Get the page title from h1
      cy.get('h1')
        .first()
        .invoke('text')
        .then((pageTitle) => {
          // Get some body content from the article
          cy.get('article')
            .first()
            .invoke('text')
            .then((articleText) => {
              // Extract a meaningful snippet (first 50 chars of article text, trimmed)
              const contentSnippet = articleText.trim().substring(0, 50).trim();

              // Now fetch the markdown version
              cy.request(LEAF_PAGE_URL + 'index.md').then((response) => {
                expect(response.status).to.eq(200);

                const markdown = response.body;

                // Basic structure checks
                expect(markdown).to.include('---'); // Frontmatter delimiter
                expect(markdown).to.match(/^#+ /m); // Has headings

                // Content from HTML should appear in markdown
                expect(markdown).to.include(pageTitle.trim());
                expect(markdown).to.include(contentSnippet);

                // Clean markdown (no raw HTML or Hugo syntax)
                expect(markdown).to.not.include('<!DOCTYPE html>');
                expect(markdown).to.not.include('<div');
                expect(markdown).to.not.match(/\{\{[<%]/); // No shortcodes
                expect(markdown).to.not.include('<!--'); // No HTML comments
                expect(markdown).to.not.match(/\{\{-?\s*end\s*-?\}\}/); // No {{end}}

                // Not empty
                expect(markdown.length).to.be.greaterThan(100);
              });
            });
        });
    });
  });
});
@@ -0,0 +1,608 @@
/**
 * E2E tests for Markdown content validation
 * Validates that generated Markdown files:
 * - Don't contain raw Hugo shortcodes
 * - Don't contain HTML comments
 * - Have proper frontmatter
 * - Have valid Markdown structure
 * - Contain expected content
 */

import {
  validateMarkdown,
  validateFrontmatter,
  validateTable,
  containsText,
  doesNotContainText,
} from '../../support/markdown-validator.js';

describe('Markdown Content Validation', () => {
  // Test URLs for different page types
  const LEAF_PAGE_URL = '/influxdb3/core/get-started/setup/';
  const SECTION_PAGE_URL = '/influxdb3/core/get-started/';
  const ENTERPRISE_INDEX_URL = '/influxdb3/enterprise/';

  /**
   * Setup: Generate markdown files for test paths
   * This runs once before all tests in this suite
   */
  before(() => {
    cy.log('Generating markdown files for test paths...');

    // Generate markdown for get-started section
    cy.exec(
      'node scripts/html-to-markdown.js --path influxdb3/core/get-started',
      {
        failOnNonZeroExit: false,
        timeout: 60000,
      }
    ).then((result) => {
      if (result.code !== 0) {
        cy.log(
          'Warning: get-started markdown generation had issues:',
          result.stderr
        );
      }
    });

    // Generate markdown for enterprise index page
    cy.exec(
      'node scripts/html-to-markdown.js --path influxdb3/enterprise --limit 1',
      {
        failOnNonZeroExit: false,
        timeout: 60000,
      }
    ).then((result) => {
      if (result.code !== 0) {
        cy.log(
          'Warning: enterprise markdown generation had issues:',
          result.stderr
        );
      }
      cy.log('Markdown files generated successfully');
    });
  });
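The `markdown-validator.js` module imported above is not part of this diff; for review context, the result shape the tests rely on (an object with an `info` map of booleans, as used by `validation.info.hasLists` further down) can be sketched roughly like this — `validateMarkdownSketch` is a hypothetical, simplified stand-in, not the real module:

```javascript
// Hypothetical, simplified stand-in for validateMarkdown from
// ../../support/markdown-validator.js (the real module is not in this diff).
function validateMarkdownSketch(markdown) {
  return {
    info: {
      // Bulleted or numbered list items anywhere in the document
      hasLists: /^\s*(?:[-*+]|\d+\.)\s+/m.test(markdown),
      // ATX headings (#, ##, ...)
      hasHeadings: /^#{1,6}\s+/m.test(markdown),
    },
  };
}
```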
  describe('Markdown Format - Basic Validation', () => {
    it('should return 200 status for markdown file requests', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        expect(response.status).to.eq(200);
      });
    });

    it('should have correct content-type for markdown files', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Note: Hugo may serve as text/plain or text/markdown depending on config
        expect(response.headers['content-type']).to.match(
          /text\/(plain|markdown)/
        );
      });
    });

    it('should be accessible at URL/index.md', () => {
      // Hugo generates markdown as index.md in directory matching URL path
      // Note: llmstxt.org spec recommends /path/index.html.md, but we use
      // /path/index.md for cleaner URLs and Hugo compatibility
      cy.visit(`${LEAF_PAGE_URL}`);
      cy.url().then((htmlUrl) => {
        const markdownUrl = htmlUrl + 'index.md';
        cy.request(markdownUrl).then((response) => {
          expect(response.status).to.eq(200);
        });
      });
    });
  });

  describe('Frontmatter Validation', () => {
    it('should start with YAML frontmatter delimiters', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        expect(response.body).to.match(/^---\n/);
      });
    });

    it('should include required frontmatter fields', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        const frontmatterMatch = response.body.match(/^---\n([\s\S]*?)\n---/);
        expect(frontmatterMatch).to.not.be.null;

        const frontmatter = frontmatterMatch[1];
        expect(frontmatter).to.include('title:');
        expect(frontmatter).to.include('description:');
        expect(frontmatter).to.include('url:');
      });
    });

    it('should include product context in frontmatter', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        const frontmatterMatch = response.body.match(/^---\n([\s\S]*?)\n---/);
        const frontmatter = frontmatterMatch[1];

        expect(frontmatter).to.include('product:');
        expect(frontmatter).to.include('product_version:');
      });
    });

    it('should include date and lastmod fields', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        const frontmatterMatch = response.body.match(/^---\n([\s\S]*?)\n---/);
        const frontmatter = frontmatterMatch[1];

        expect(frontmatter).to.include('date:');
        expect(frontmatter).to.include('lastmod:');
      });
    });
  });
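The frontmatter tests above repeat the same extraction regex four times; if the pattern grows, a small shared helper could centralize it. A sketch under that assumption — `extractFrontmatter` is hypothetical and not part of the suite:

```javascript
// Hypothetical helper, not part of the test suite: pull the YAML frontmatter
// block out of a markdown string, or return null if there is none.
function extractFrontmatter(markdown) {
  const match = markdown.match(/^---\n([\s\S]*?)\n---/);
  return match ? match[1] : null;
}
```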
  describe('Shortcode Evaluation', () => {
    it('should NOT contain raw Hugo shortcodes with {{< >}}', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Check for common shortcode patterns
        expect(response.body).to.not.include('{{<');
        expect(response.body).to.not.include('>}}');
      });
    });

    it('should NOT contain raw Hugo shortcodes with {{% %}}', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        expect(response.body).to.not.include('{{%');
        expect(response.body).to.not.include('%}}');
      });
    });

    it('should have evaluated product-name shortcode', () => {
      cy.request(`${ENTERPRISE_INDEX_URL}index.md`).then((response) => {
        // Should contain "InfluxDB 3 Enterprise" not "{{< product-name >}}"
        expect(response.body).to.include('InfluxDB 3 Enterprise');
        expect(response.body).to.not.include('{{< product-name >}}');
      });
    });

    it('should have evaluated req shortcode for required markers', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Should NOT contain raw {{< req >}} shortcode
        expect(response.body).to.not.include('{{< req >}}');
        expect(response.body).to.not.include('{{< req ');
      });
    });

    it('should have evaluated code-placeholder shortcode', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Should NOT contain code-placeholder shortcodes
        expect(response.body).to.not.include('{{< code-placeholder');
        expect(response.body).to.not.include('{{% code-placeholder');
      });
    });
  });

  describe('Comment Removal', () => {
    it('should NOT contain HTML comments', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Check for HTML comment patterns
        expect(response.body).to.not.include('<!--');
        expect(response.body).to.not.include('-->');
      });
    });

    it('should NOT contain source file comments', () => {
      cy.request(`${ENTERPRISE_INDEX_URL}index.md`).then((response) => {
        // Check for the "SOURCE - content/shared/..." comments
        expect(response.body).to.not.include('SOURCE -');
        expect(response.body).to.not.include('//SOURCE');
        expect(response.body).to.not.include('content/shared/');
      });
    });

    it('should NOT contain editorial comments', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Common editorial comment patterns
        expect(response.body).to.not.match(/<!-- TODO:/i);
        expect(response.body).to.not.match(/<!-- NOTE:/i);
        expect(response.body).to.not.match(/<!-- FIXME:/i);
      });
    });
  });
  describe('UI Element Removal', () => {
    it('should NOT contain format selector button text', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Should not contain UI button text from format selector
        expect(response.body).to.not.include('Copy page');
        expect(response.body).to.not.include('Copy section');
      });
    });

    it('should NOT contain page feedback form text', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        expect(response.body).to.not.include('Was this page helpful?');
        expect(response.body).to.not.include('Thank you for your feedback');
      });
    });

    it('should NOT contain navigation breadcrumbs', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Should not contain navigation UI text
        expect(response.body).to.not.match(/Home\s*>\s*InfluxDB/);
      });
    });
  });

  describe('Content Quality', () => {
    it('should not have excessive consecutive newlines', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Should not have more than 2 consecutive newlines (3+ \n in a row)
        expect(response.body).to.not.match(/\n\n\n+/);
      });
    });

    it('should have proper markdown headings', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Should contain markdown headings (# or ##)
        expect(response.body).to.match(/^# /m);
      });
    });

    it('should have properly formatted code blocks', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Code blocks should use ``` fences
        const codeBlockMatches = response.body.match(/```/g);
        if (codeBlockMatches) {
          // Code blocks should come in pairs (opening and closing)
          expect(codeBlockMatches.length % 2).to.eq(0);
        }
      });
    });

    it('should preserve language identifiers in code blocks', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Look for common language identifiers after ```
        if (response.body.includes('```')) {
          expect(response.body).to.match(
            /```(?:bash|sh|python|js|go|sql|json|yaml)/
          );
        }
      });
    });

    it('should have properly formatted links', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Links should be in markdown format [text](url)
        const linkMatches = response.body.match(/\[.+?\]\(.+?\)/g);

        if (linkMatches) {
          // Each link should have both text and URL
          linkMatches.forEach((link) => {
            expect(link).to.match(/\[.+\]\(.+\)/);
          });
        }
      });
    });

    it('should NOT have broken link conversions (text without URLs)', () => {
      cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
        // Links should NOT be plain text where there should be a link
        // This was a previous bug where links were stripped

        // If we see content that looks like it should be linked, verify it is
        if (response.body.includes('Telegraf')) {
          // Should have links to Telegraf, not just plain "Telegraf" everywhere
          expect(response.body).to.match(/\[.*Telegraf.*\]\(.*\/telegraf\//i);
        }
      });
    });
  });
  describe('Tab Delimiters', () => {
    const TAB_PAGE_URL = '/influxdb3/enterprise/write-data/client-libraries/';
    const CODE_TAB_PAGE_URL =
      '/influxdb3/core/query-data/execute-queries/influxdb3-cli/';

    before(() => {
      // Generate markdown for pages with tabs
      cy.exec(
        'node scripts/html-to-markdown.js --path influxdb3/enterprise/write-data/client-libraries --limit 1',
        {
          failOnNonZeroExit: false,
          timeout: 60000,
        }
      );

      cy.exec(
        'node scripts/html-to-markdown.js --path influxdb3/core/query-data/execute-queries/influxdb3-cli --limit 1',
        {
          failOnNonZeroExit: false,
          timeout: 60000,
        }
      );
    });

    it('should convert tabs-wrapper to heading delimiters', () => {
      cy.request(`${TAB_PAGE_URL}index.md`).then((response) => {
        // Should contain tab delimiter headings (e.g., #### Go ####)
        expect(response.body).to.match(/^#### \w+ ####$/m);

        // Should NOT contain raw tab link patterns
        expect(response.body).to.not.match(/\[Go\]\(#\)\[Node\.js\]\(#\)/);
        expect(response.body).to.not.match(/\[Python\]\(#\)\[Java\]\(#\)/);
      });
    });

    it('should convert code-tabs-wrapper to heading delimiters', () => {
      cy.request(`${CODE_TAB_PAGE_URL}index.md`).then((response) => {
        // Should contain tab delimiter headings for code tabs
        expect(response.body).to.match(/^#### \w+ ####$/m);

        // Should NOT contain raw code-tab link patterns
        expect(response.body).to.not.match(/\[string\]\(#\)\[file\]\(#\)/);
        expect(response.body).to.not.match(/\[SQL\]\(#\)\[InfluxQL\]\(#\)/);
      });
    });

    it('should preserve first tab name in delimiter heading', () => {
      cy.request(`${TAB_PAGE_URL}index.md`).then((response) => {
        // For tabs like [Go](#)[Node.js](#)[Python](#), should have #### Go ####
        if (response.body.includes('Go')) {
          expect(response.body).to.match(/^#### Go ####$/m);
        }
      });
    });

    it('should have delimiter headings followed by content', () => {
      cy.request(`${TAB_PAGE_URL}index.md`).then((response) => {
        // Tab delimiter headings should be followed by actual content
        const delimiterMatches = response.body.match(/^#### \w+ ####$/gm);

        if (delimiterMatches && delimiterMatches.length > 0) {
          // Each delimiter should be followed by content (not another delimiter immediately)
          delimiterMatches.forEach((delimiter) => {
            const delimiterIndex = response.body.indexOf(delimiter);
            const afterDelimiter = response.body.substring(
              delimiterIndex + delimiter.length,
              delimiterIndex + delimiter.length + 200
            );

            // Should have content after delimiter (not just another delimiter)
            expect(afterDelimiter.trim()).to.not.match(/^#### \w+ ####/);
          });
        }
      });
    });
  });
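The tab-delimiter checks above all scan for the same `#### Name ####` heading pattern; a shared helper would make that intent explicit in one place. A sketch only — `findTabDelimiters` is hypothetical and not part of the suite:

```javascript
// Hypothetical helper, not part of the test suite: collect all
// "#### Name ####" tab-delimiter headings from a markdown string.
function findTabDelimiters(markdown) {
  return markdown.match(/^#### \w+ ####$/gm) || [];
}
```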
  describe('Section Pages - Child Content', () => {
    beforeEach(() => {
      // Note: Current implementation may not aggregate child pages
      // These tests document the expected behavior for future implementation
    });

    it('should be accessible at section URL with index.md', () => {
      cy.request(`${SECTION_PAGE_URL}index.md`).then((response) => {
        expect(response.status).to.eq(200);
      });
    });

    it('should have section page frontmatter', () => {
      cy.request(`${SECTION_PAGE_URL}index.md`).then((response) => {
        expect(response.body).to.match(/^---\n/);
        expect(response.body).to.include('title:');
        expect(response.body).to.include('url:');
      });
    });

    it('should reference child pages in content', () => {
      cy.request(`${SECTION_PAGE_URL}index.md`).then((response) => {
        // Section page should at least link to child pages
        expect(response.body).to.match(/\[.*Set up.*\]\(.*setup.*\)/i);
        expect(response.body).to.match(/\[.*Write.*\]\(.*write.*\)/i);
        expect(response.body).to.match(/\[.*Query.*\]\(.*query.*\)/i);
      });
    });

    // Future enhancement: section pages should include full child content
    it.skip('should include full content of child pages with delimiters', () => {
      cy.request(`${SECTION_PAGE_URL}index.md`).then((response) => {
        // Should contain section header
        expect(response.body).to.include('# Get started with InfluxDB 3 Core');

        // Should contain child page delimiters
        expect(response.body).to.match(/---\n\n## .*:/);

        // Should contain content from child pages
        expect(response.body).to.include('Set up InfluxDB 3 Core');
        expect(response.body).to.include('Write data');
        expect(response.body).to.include('Query data');
      });
    });
  });

  describe('Multiple Product Validation', () => {
    const PRODUCTS = [
      { url: '/influxdb3/core/', name: 'InfluxDB 3 Core', version: 'core' },
      {
        url: '/influxdb3/enterprise/',
        name: 'InfluxDB 3 Enterprise',
        version: 'enterprise',
      },
    ];

    PRODUCTS.forEach((product) => {
      describe(`${product.name} (${product.version})`, () => {
        it('should have correct product metadata', () => {
          cy.request(`${product.url}index.md`).then((response) => {
            expect(response.body).to.include(`product: ${product.name}`);
            expect(response.body).to.include(
              `product_version: ${product.version}`
            );
          });
        });

        it('should not contain shortcodes', () => {
          cy.request(`${product.url}index.md`).then((response) => {
            expect(response.body).to.not.include('{{<');
            expect(response.body).to.not.include('{{%');
          });
        });

        it('should not contain HTML comments', () => {
          cy.request(`${product.url}index.md`).then((response) => {
            expect(response.body).to.not.include('<!--');
          });
        });
      });
    });
  });
|
||||
describe('Markdown Rendering Quality', () => {
|
||||
it('should render GitHub-style callouts', () => {
|
||||
cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
|
||||
// Should contain GitHub-style callout syntax if callouts are present
|
||||
if (response.body.match(/Note|Warning|Important|Tip|Caution/i)) {
|
||||
expect(response.body).to.match(
|
||||
/> \[!(Note|Warning|Important|Tip|Caution)\]/
|
||||
);
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
it('should render tables in markdown format', () => {
|
||||
cy.request(`${SECTION_PAGE_URL}index.md`).then((response) => {
|
||||
// If tables are present, they should use markdown table syntax
|
||||
if (
|
||||
response.body.includes('Tool') &&
|
||||
response.body.includes('Administration')
|
||||
) {
|
||||
expect(response.body).to.match(/\|.*\|.*\|/); // Table rows
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
it('should render lists in markdown format', () => {
|
||||
cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
|
||||
// Validate using Markdown parser instead of regex
|
||||
const validation = validateMarkdown(response.body);
|
||||
expect(validation.info.hasLists).to.be.true;
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('Table Content - Enterprise Get Started', () => {
|
||||
const ENTERPRISE_GET_STARTED_URL = '/influxdb3/enterprise/get-started/';
|
||||
|
||||
before(() => {
|
||||
// Ensure markdown is generated for this specific page
|
||||
cy.exec(
|
||||
'node scripts/html-to-markdown.js --path influxdb3/enterprise/get-started',
|
||||
{
|
||||
failOnNonZeroExit: false,
|
||||
timeout: 60000,
|
||||
}
|
||||
);
|
||||
});
|
||||
|
||||
it('should have valid table structure with expected headers', () => {
|
||||
cy.request(`${ENTERPRISE_GET_STARTED_URL}index.md`).then((response) => {
|
||||
const validation = validateMarkdown(response.body);
|
||||
|
||||
// Should have at least one table
|
||||
expect(validation.info.hasTables).to.be.true;
|
||||
expect(validation.info.tableCount).to.be.greaterThan(0);
|
||||
|
||||
// Find the tools comparison table (should have Tool, Administration, Write, Query headers)
|
||||
const toolsTable = validation.info.tables.find(
|
||||
(table) =>
|
||||
table.headers.some((h) => h.toLowerCase().includes('tool')) &&
|
||||
table.headers.some((h) =>
|
||||
h.toLowerCase().includes('administration')
|
||||
) &&
|
||||
table.headers.some((h) => h.toLowerCase().includes('write')) &&
|
||||
table.headers.some((h) => h.toLowerCase().includes('query'))
|
||||
);
|
||||
|
||||
expect(toolsTable).to.exist;
|
||||
|
||||
// Validate table structure using semantic validator
|
||||
const tableValidation = validateTable(
|
||||
toolsTable,
|
||||
['Tool', 'Administration', 'Write', 'Query'],
|
||||
2 // At least 2 rows (header + data)
|
||||
);
|
||||
|
||||
expect(tableValidation.valid).to.be.true;
|
||||
if (!tableValidation.valid) {
|
||||
cy.log('Table validation errors:', tableValidation.errors);
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
it('should include expected tools in table content', () => {
|
||||
cy.request(`${ENTERPRISE_GET_STARTED_URL}index.md`).then((response) => {
|
||||
const validation = validateMarkdown(response.body);
|
||||
const toolsTable = validation.info.tables[0]; // First table should be tools table
|
||||
|
||||
// Convert table cells to flat array for easier searching
|
||||
const allCells = toolsTable.cells.flat().map((c) => c.toLowerCase());
|
||||
|
||||
// Check for key tools (case-insensitive content check)
|
||||
expect(allCells.some((c) => c.includes('influxdb3'))).to.be.true;
|
||||
expect(allCells.some((c) => c.includes('http api'))).to.be.true;
|
||||
expect(allCells.some((c) => c.includes('explorer'))).to.be.true;
|
||||
expect(allCells.some((c) => c.includes('telegraf'))).to.be.true;
|
||||
expect(allCells.some((c) => c.includes('grafana'))).to.be.true;
|
||||
});
|
||||
});
|
||||
|
||||
it('should have consistent column count in all table rows', () => {
|
||||
cy.request(`${ENTERPRISE_GET_STARTED_URL}index.md`).then((response) => {
|
||||
const validation = validateMarkdown(response.body);
|
||||
const toolsTable = validation.info.tables[0];
|
||||
|
||||
// All rows should have the same number of columns
|
||||
const columnCounts = toolsTable.cells.map((row) => row.length);
|
||||
const uniqueCounts = [...new Set(columnCounts)];
|
||||
|
||||
expect(uniqueCounts.length).to.equal(
|
||||
1,
|
||||
`Table has inconsistent column counts: ${uniqueCounts.join(', ')}`
|
||||
);
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('Regression Tests - Known Issues', () => {
|
||||
it('should NOT contain localhost URLs in frontmatter', () => {
|
||||
cy.request(`${ENTERPRISE_INDEX_URL}index.md`).then((response) => {
|
||||
expect(doesNotContainText(response.body, 'http://localhost')).to.be
|
||||
.true;
|
||||
});
|
||||
});
|
||||
|
||||
it('should NOT contain horizontal rule duplicates at end', () => {
|
||||
cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
|
||||
// Should not end with multiple * * * in a row
|
||||
expect(response.body).to.not.match(/\* \* \*\n\n\* \* \*$/);
|
||||
});
|
||||
});
|
||||
|
||||
it('should NOT contain UI element text (Copy page, Was this helpful, etc)', () => {
|
||||
cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
|
||||
// Regression: UI elements were appearing in markdown
|
||||
expect(doesNotContainText(response.body, 'Copy page')).to.be.true;
|
||||
expect(doesNotContainText(response.body, 'Was this page helpful')).to.be
|
||||
.true;
|
||||
expect(doesNotContainText(response.body, 'Submit feedback')).to.be.true;
|
||||
});
|
||||
|
||||
cy.request(`${SECTION_PAGE_URL}index.md`).then((response) => {
|
||||
expect(doesNotContainText(response.body, 'Copy section')).to.be.true;
|
||||
});
|
||||
});
|
||||
|
||||
it('should NOT contain Support section content', () => {
|
||||
cy.request(`${LEAF_PAGE_URL}index.md`).then((response) => {
|
||||
// Support sections should be removed during conversion
|
||||
expect(doesNotContainText(response.body, 'InfluxDB Discord')).to.be
|
||||
.true;
|
||||
expect(doesNotContainText(response.body, 'Customer portal')).to.be.true;
|
||||
});
|
||||
});
|
||||
});
|
||||
});
|
||||
|
|
@@ -0,0 +1,95 @@
/**
 * Minimal tests for page-context module usage
 * Verifies that page-context exports are available to modules that import them
 */

describe('Page Context Module', () => {
  const testUrls = [
    {
      url: 'http://localhost:1315/influxdb3/core/get-started/',
      expectedContext: 'core',
      expectedProduct: 'influxdb3',
      description: 'InfluxDB 3 Core',
    },
    {
      url: 'http://localhost:1315/influxdb3/enterprise/get-started/',
      expectedContext: 'enterprise',
      expectedProduct: 'influxdb3',
      description: 'InfluxDB 3 Enterprise',
    },
    {
      url: 'http://localhost:1315/influxdb/cloud/',
      expectedContext: 'cloud',
      expectedProduct: 'influxdb',
      description: 'InfluxDB Cloud (TSM)',
    },
  ];

  testUrls.forEach(({ url, expectedContext, expectedProduct, description }) => {
    describe(`${description}`, () => {
      beforeEach(() => {
        cy.visit(url);
      });

      it('should load page-context module exports', () => {
        cy.window().then((win) => {
          // Access the main.js global which loads page-context
          expect(win.influxdatadocs).to.exist;
        });
      });

      it('should have correct context value for v3-wayfinding.js usage', () => {
        // v3-wayfinding.js imports: context, host, hostname, path, protocol, referrer, referrerHost
        cy.window().then((win) => {
          // These values should be available in the window for modules to use
          expect(win.location.pathname).to.include(expectedProduct);
          expect(win.location.protocol).to.match(/https?:/);
          expect(win.location.hostname).to.exist;
        });
      });

      it('should have correct values for page-feedback.js usage', () => {
        // page-feedback.js imports: hostname, path, product, protocol, version
        cy.window().then((win) => {
          const pathname = win.location.pathname;
          const pathArr = pathname.split('/').filter((s) => s);

          // Verify the product is extractable from the path
          expect(pathArr[0]).to.equal(expectedProduct);

          // Verify required context properties exist
          expect(win.location.hostname).to.exist;
          expect(win.location.protocol).to.exist;
        });
      });

      it('should provide consistent product detection', () => {
        cy.window().then((win) => {
          const pathname = win.location.pathname;

          // Verify the path matches the expected context
          if (expectedContext === 'core') {
            expect(pathname).to.include('/influxdb3/core');
          } else if (expectedContext === 'enterprise') {
            expect(pathname).to.include('/influxdb3/enterprise');
          } else if (expectedContext === 'cloud') {
            expect(pathname).to.include('/influxdb/cloud');
          }
        });
      });
    });
  });

  describe('Fallback behavior', () => {
    it('should handle unknown product paths gracefully', () => {
      // Visit a path that doesn't match any product
      cy.visit('http://localhost:1315/');

      cy.window().then((win) => {
        // Should still have basic location properties
        expect(win.location.pathname).to.exist;
        expect(win.location.hostname).to.exist;
      });
    });
  });
});
@@ -0,0 +1,260 @@
/**
 * Markdown Validation Helper for Cypress Tests
 *
 * Uses remark/unified to parse and validate Markdown structure
 * instead of brittle regex patterns.
 */

import { unified } from 'unified';
import remarkParse from 'remark-parse';
import remarkGfm from 'remark-gfm';
import remarkFrontmatter from 'remark-frontmatter';
import { visit } from 'unist-util-visit';

/**
 * Parse Markdown and return the AST (Abstract Syntax Tree)
 */
export function parseMarkdown(markdown) {
  return unified()
    .use(remarkParse)
    .use(remarkGfm)
    .use(remarkFrontmatter, ['yaml'])
    .parse(markdown);
}

/**
 * Validate Markdown structure and return validation results
 */
export function validateMarkdown(markdown) {
  const ast = parseMarkdown(markdown);
  const results = {
    valid: true,
    errors: [],
    warnings: [],
    info: {
      hasFrontmatter: false,
      frontmatter: null,
      hasLists: false,
      hasTables: false,
      tableCount: 0,
      tables: [],
      headings: [],
      codeBlocks: [],
      links: [],
    },
  };

  // Extract frontmatter
  visit(ast, 'yaml', (node) => {
    results.info.hasFrontmatter = true;
    try {
      // Store raw frontmatter for later parsing
      results.info.frontmatter = node.value;
    } catch (error) {
      results.errors.push(`Invalid YAML frontmatter: ${error.message}`);
      results.valid = false;
    }
  });

  // Check for lists
  visit(ast, 'list', (node) => {
    results.info.hasLists = true;
  });

  // Check for tables and validate structure
  visit(ast, 'table', (node) => {
    results.info.hasTables = true;
    results.info.tableCount++;

    const table = {
      rows: node.children.length,
      columns: node.children[0]?.children.length || 0,
      headers: [],
      cells: [],
    };

    // Extract headers from the first row
    if (node.children[0]) {
      node.children[0].children.forEach((cell) => {
        const text = extractText(cell);
        table.headers.push(text);
      });
    }

    // Extract all cell content
    node.children.forEach((row) => {
      const rowCells = [];
      row.children.forEach((cell) => {
        rowCells.push(extractText(cell));
      });
      table.cells.push(rowCells);
    });

    results.info.tables.push(table);
  });

  // Extract headings
  visit(ast, 'heading', (node) => {
    results.info.headings.push({
      depth: node.depth,
      text: extractText(node),
    });
  });

  // Extract code blocks
  visit(ast, 'code', (node) => {
    results.info.codeBlocks.push({
      lang: node.lang || null,
      value: node.value,
    });
  });

  // Extract links
  visit(ast, 'link', (node) => {
    results.info.links.push({
      url: node.url,
      title: node.title || null,
      text: extractText(node),
    });
  });

  return results;
}

/**
 * Extract text content from a node (recursively handles all node types)
 */
function extractText(node) {
  if (!node) {
    return '';
  }

  if (node.type === 'text') {
    return node.value;
  }

  // Handle inline code
  if (node.type === 'inlineCode') {
    return node.value;
  }

  // Handle links - extract the text children
  if (node.type === 'link') {
    if (node.children) {
      return node.children.map(extractText).join('');
    }
    return '';
  }

  // Handle emphasis, strong, etc. - recursively extract children
  if (node.children) {
    return node.children.map(extractText).join('');
  }

  // For any other node type with a value
  if (node.value) {
    return node.value;
  }

  return '';
}

/**
 * Check if content contains specific text (case-insensitive)
 */
export function containsText(markdown, searchText) {
  return markdown.toLowerCase().includes(searchText.toLowerCase());
}

/**
 * Check if content does NOT contain specific text (case-insensitive)
 */
export function doesNotContainText(markdown, searchText) {
  return !containsText(markdown, searchText);
}

/**
 * Validate that frontmatter has the required fields
 */
export function validateFrontmatter(frontmatter, requiredFields) {
  const errors = [];

  if (!frontmatter) {
    return { valid: false, errors: ['No frontmatter found'] };
  }

  // Parse YAML frontmatter
  let parsed;
  try {
    // Simple YAML parsing - split by lines and extract key-value pairs
    parsed = {};
    const lines = frontmatter.split('\n');
    lines.forEach((line) => {
      const match = line.match(/^([^:]+):\s*(.*)$/);
      if (match) {
        parsed[match[1].trim()] = match[2].trim();
      }
    });
  } catch (error) {
    return {
      valid: false,
      errors: [`Failed to parse frontmatter: ${error.message}`],
    };
  }

  // Check required fields
  requiredFields.forEach((field) => {
    if (!parsed[field]) {
      errors.push(`Missing required frontmatter field: ${field}`);
    }
  });

  return {
    valid: errors.length === 0,
    errors,
    data: parsed,
  };
}
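
/*
 * Usage sketch (illustrative; not part of the module's exports):
 *
 *   const { valid, data } = validateFrontmatter(
 *     'title: Get started\nproduct: InfluxDB 3 Core',
 *     ['title', 'product']
 *   );
 *   // valid === true, data.product === 'InfluxDB 3 Core'
 *
 * Note: parsing is line-based "key: value" extraction only; nested YAML
 * values are not supported.
 */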

/**
 * Validate table structure
 */
export function validateTable(tableInfo, expectedHeaders = null, minRows = 0) {
  const errors = [];

  if (!tableInfo) {
    return { valid: false, errors: ['Table not found'] };
  }

  // Check column count consistency
  const columnCounts = tableInfo.cells.map((row) => row.length);
  const uniqueCounts = [...new Set(columnCounts)];
  if (uniqueCounts.length > 1) {
    errors.push(`Inconsistent column count: ${uniqueCounts.join(', ')}`);
  }

  // Check expected headers
  if (expectedHeaders) {
    expectedHeaders.forEach((header) => {
      if (
        !tableInfo.headers.some((h) =>
          h.toLowerCase().includes(header.toLowerCase())
        )
      ) {
        errors.push(`Missing expected header: ${header}`);
      }
    });
  }

  // Check minimum rows
  if (tableInfo.rows < minRows) {
    errors.push(
      `Table has ${tableInfo.rows} rows, expected at least ${minRows}`
    );
  }

  return {
    valid: errors.length === 0,
    errors,
  };
}
@@ -103,7 +103,7 @@ exports.handler = (event, context, callback) => {

   // If the file has a valid extension, return the request unchanged
   if (validExtensions[parsedPath.ext]) {
-    callback(null, request);
+    return callback(null, request);
   }

   ////////////////////// START PRODUCT-SPECIFIC REDIRECTS //////////////////////
@@ -0,0 +1,226 @@
# llms.txt Generation System

This directory contains Hugo templates for automatically generating `llms.txt` files following the [llmstxt.org](https://llmstxt.org/) specification.

## Overview

The llms.txt format helps LLMs discover and understand documentation structure. Hugo generates these files automatically during the build process.
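
For orientation, a generated file has this shape (illustrative entries; real content comes from `data/products.yml` and each section's pages):

```
# InfluxDB 3 Core

> InfluxDB 3 Core is the open source, high-performance time series database.

## Main documentation sections

- [Get started](/influxdb3/core/get-started/): Set up and write your first data
- [Query data](/influxdb3/core/query-data/): Query with SQL
```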

## Template Files

### `index.llms.txt`

- **Location**: `/layouts/index.llms.txt`
- **Output**: `/llms.txt` (site-level)
- **Type**: Hugo template
- **Purpose**: Primary entry point for LLM discovery
- **Content**: Dynamically generated from `data/products.yml` with:
  - Product descriptions from data files
  - Links organized by product category
  - Conditional rendering for optional products

### `section.llms.txt`

- **Location**: `/layouts/_default/section.llms.txt`
- **Output**: Product- and section-level llms.txt files (for example, `/influxdb3/core/llms.txt`)
- **Type**: Hugo template
- **Purpose**: Provide curated navigation for specific products and sections
- **Content**: Dynamically generated from:
  - Product metadata in `data/products.yml`
  - Section content and child pages
  - Page descriptions

## Hugo Configuration

In `config/_default/hugo.yml`:

```yaml
outputFormats:
  llmstxt:
    mediaType: text/plain
    baseName: llms
    isPlainText: true
    notAlternative: true
    permalinkable: true
    suffixes:
      - txt

outputs:
  section:
    - HTML
    - llmstxt # Generates llms.txt for all sections
  home:
    - HTML
    - llmstxt # Generates the root /llms.txt
```

## Generated Files

After building with `hugo`:

```
public/
├── llms.txt                     # Site-level discovery file
├── influxdb3/
│   ├── core/
│   │   ├── llms.txt             # InfluxDB 3 Core product index
│   │   ├── get-started/
│   │   │   └── llms.txt         # Section-level index
│   │   └── query-data/
│   │       └── llms.txt         # Section-level index
│   ├── cloud-dedicated/
│   │   └── llms.txt             # Cloud Dedicated product index
│   └── cloud-serverless/
│       └── llms.txt             # Cloud Serverless product index
├── telegraf/
│   └── v1/
│       └── llms.txt             # Telegraf product index
└── flux/
    └── v0/
        └── llms.txt             # Flux product index
```

## llmstxt.org Specification Compliance

### Required Elements

- ✅ **H1 header**: Product or section name
- ✅ **Curated links**: Intentionally selective, not exhaustive

### Optional Elements

- ✅ **Blockquote summary**: Brief product/section description
- ✅ **Content paragraphs**: Additional context (no headings allowed)
- ✅ **H2-delimited sections**: Organize links by category
- ✅ **Link format**: `[Title](url): Description`

### Key Rules

1. **H1 is required** - Only the product/section name
2. **Content sections cannot have headings** - Use paragraphs only
3. **Curate, don't list everything** - Be selective with links
4. **Use relative URLs** - LLMs resolve them in context
5. **"Optional" section** - Signals skippable secondary content

## Customizing llms.txt Files

### Site level (`/llms.txt`)

Edit `/layouts/index.llms.txt` directly. This template is curated by hand so top-level product links stay precise.

### Product and section level

The `/layouts/_default/section.llms.txt` template automatically generates llms.txt files for all sections.

**To customize a specific product's llms.txt:**

1. Create a product-specific template following Hugo's template lookup order:

   ```
   layouts/influxdb3/core/section.llms.txt   # Specific to Core
   layouts/influxdb3/section.llms.txt        # All InfluxDB 3 products
   layouts/_default/section.llms.txt         # Default for all sections
   ```

2. **Example: Custom template for InfluxDB 3 Core**

   Create `/layouts/influxdb3/core/section.llms.txt`:

   ```
   # InfluxDB 3 Core

   > InfluxDB 3 Core is the open source, high-performance time series database.

   {{- /* Custom curated sections */ -}}

   ## Getting Started

   - [Install InfluxDB 3 Core](/influxdb3/core/install/): Installation guide
   - [Quick start](/influxdb3/core/get-started/): Get started in 5 minutes

   ## Guides

   - [Write data](/influxdb3/core/write-data/): Write data guide
   - [Query with SQL](/influxdb3/core/query-data/sql/): SQL query guide
   ```

### Using Product Metadata from data/products.yml

Templates access product metadata like this:

```go-html-template
{{- $product := index .Site.Data.products "influxdb3_core" -}}
{{- $productName := $product.name -}}         {{/* "InfluxDB 3 Core" */}}
{{- $productAltname := $product.altname -}}   {{/* Alternative name */}}
{{- $productVersion := $product.latest -}}    {{/* "core" */}}
```

## Testing llms.txt Files

### Build and Check Output

```bash
# Build the Hugo site
./node_modules/.bin/hugo --quiet

# Check generated llms.txt files
ls -la public/llms.txt
ls -la public/influxdb3/core/llms.txt

# View content
cat public/llms.txt
cat public/influxdb3/core/llms.txt
```

### Validate Against Specification

Check that generated files follow the llmstxt.org spec:

1. ✅ Starts with a single H1 header
2. ✅ Optional blockquote summary after the H1
3. ✅ Content sections have no headings
4. ✅ H2 sections organize curated links
5. ✅ Link format: `[Title](url): Description`
6. ✅ Relative URLs resolve correctly
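
The first check can also be scripted; a minimal sketch (shown against a temp file, not the real build output):

```shell
# Spot-check an llmstxt.org basic: the file must start with a single H1.
check_llms() {
  head -n 1 "$1" | grep -q '^# ' && echo "H1 OK" || echo "Missing H1"
}

# Sample file standing in for a generated llms.txt
printf '# InfluxData Documentation\n\n> Summary\n' > /tmp/llms-sample.txt
check_llms /tmp/llms-sample.txt   # prints "H1 OK"
```

Run `check_llms public/llms.txt` after a build to verify real output.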

### Test LLM Discovery

```bash
# Test with curl (simulates LLM access)
curl https://docs.influxdata.com/llms.txt
curl https://docs.influxdata.com/influxdb3/core/llms.txt

# Verify the content type
curl -I https://docs.influxdata.com/llms.txt
# Should return: Content-Type: text/plain
```

## Build Process

llms.txt files are generated automatically during:

1. **Local development**: `hugo server` regenerates them on file changes
2. **Production build**: `hugo --quiet` generates all llms.txt files
3. **CI/CD**: The build pipeline includes llms.txt generation

## Maintenance

### Adding a New Product

1. Add product metadata to `data/products.yml`
2. llms.txt auto-generates using the `section.llms.txt` template
3. Optionally create a custom template in `layouts/<product>/section.llms.txt`

### Updating the Site-Level llms.txt

Edit `/layouts/index.llms.txt` to add or remove product links.

### Troubleshooting

**Problem**: llms.txt file not generated
**Solution**: Check that the output format is configured in `config/_default/hugo.yml`

**Problem**: Content includes HTML tags
**Solution**: Use the `| plainify` filter in the template

**Problem**: URLs are absolute instead of relative
**Solution**: Use `.RelPermalink` instead of `.Permalink`
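
For example, a link-list loop that applies both of the last two fixes (this mirrors the `range .Pages` rendering in `section.llms.txt`; `plainify` strips any HTML from descriptions):

```go-html-template
{{ range .Pages }}
- [{{ .Title }}]({{ .RelPermalink }}){{ with .Description }}: {{ . | plainify }}{{ end }}
{{- end }}
```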

## Resources

- [llmstxt.org specification](https://llmstxt.org/)
- [Hugo output formats](https://gohugo.io/templates/output-formats/)
- [InfluxData products.yml](../../data/products.yml)
@@ -52,7 +52,11 @@
      </div>
    </div>

    <!-- {{ partial "article.html" . }} -->
    <div class="article">
      <article class="article--content">
        {{ partial "article/format-selector.html" . }}
      </article>
    </div>
    <div class="copyright">© {{ now.Year }} InfluxData, Inc.</div>
  </div>
</div>
@@ -0,0 +1,55 @@
{{- /*
  Generate llms.txt file for landing pages following the https://llmstxt.org specification

  Elements:
  1. H1 with project/site name (required)
  2. Blockquote with brief summary (optional)
  3. Zero or more markdown sections with NO headings allowed (optional)
  4. H2-delimited file list sections with URLs (optional)
  5. "Optional" H2 section for secondary information (optional)
*/ -}}
{{- $productKey := "" -}}
{{- $productName := "" -}}
{{- $productDescription := "" -}}

{{- /* Detect product from the URL path */ -}}
{{- if hasPrefix .RelPermalink "/influxdb/cloud/" -}}
  {{- $productKey = "influxdb_cloud" -}}
{{- else if hasPrefix .RelPermalink "/influxdb/v2" -}}
  {{- $productKey = "influxdb" -}}
{{- end -}}

{{- /* Get product data from products.yml */ -}}
{{- if $productKey -}}
  {{- $product := index .Site.Data.products $productKey -}}
  {{- if $product -}}
    {{- $productName = $product.name -}}
    {{- $productDescription = $product.description -}}
  {{- end -}}
{{- end -}}

{{- /* Use the product name or fall back to the page title */ -}}
{{- $h1Title := .Title -}}
{{- if $productName -}}
  {{- $h1Title = $productName -}}
{{- end -}}

# {{ $h1Title }}
{{- with $productDescription }}

> {{ . }}
{{- end }}
{{- with .Description }}

> {{ . }}
{{- end }}

This is the landing page for {{ $h1Title }} documentation. Select a section below to get started.
{{- /* List main documentation sections if available */ -}}
{{- if .Pages }}

## Main documentation sections
{{ range .Pages }}
- [{{ .Title }}]({{ .RelPermalink }}){{ with .Description }}: {{ . }}{{ end }}
{{- end }}
{{- end -}}
@@ -0,0 +1,47 @@
{{- /*
  Root /llms.txt file following the https://llmstxt.org specification

  This is the main discovery file for AI agents.
  It points to aggregated .section.md files for each major product area.

  Per the llmstxt.org spec:
  - H1 with site/project name (required)
  - Blockquote with brief summary (optional)
  - Content sections with details (optional)
  - H2-delimited file lists with curated links (optional)
*/ -}}
# InfluxData Documentation

> Documentation for the InfluxDB time series database and related tools, including Telegraf, Chronograf, and Kapacitor.

This documentation covers all InfluxDB versions and ecosystem tools. Each section provides comprehensive guides, API references, and tutorials.

## InfluxDB 3

- [InfluxDB 3 Core](influxdb3/core/index.section.md): Open source time series database optimized for real-time data
- [InfluxDB 3 Enterprise](influxdb3/enterprise/index.section.md): Enterprise features including clustering and high availability
- [InfluxDB Cloud Dedicated](influxdb3/cloud-dedicated/index.section.md): Dedicated cloud deployment with predictable performance
- [InfluxDB Cloud Serverless](influxdb3/cloud-serverless/index.section.md): Serverless cloud deployment with usage-based pricing
- [InfluxDB Clustered](influxdb3/clustered/index.section.md): Self-managed clustered deployment
- [InfluxDB 3 Explorer](influxdb3/explorer/index.md): Web-based data exploration tool

## InfluxDB 2

- [InfluxDB OSS v2](influxdb/v2/index.section.md): Open source version 2.x documentation
- [InfluxDB Cloud (TSM)](influxdb/cloud/index.section.md): Managed cloud service based on InfluxDB 2.x

## InfluxDB 1

- [InfluxDB OSS v1](influxdb/v1/index.section.md): Open source version 1.x documentation
- [InfluxDB Enterprise v1](enterprise_influxdb/v1/index.section.md): Enterprise features for version 1.x

## Tools and Integrations

- [Telegraf](telegraf/v1/index.section.md): Plugin-driven server agent for collecting and sending metrics
- [Chronograf](chronograf/v1/index.section.md): User interface and administrative component
- [Kapacitor](kapacitor/v1/index.section.md): Real-time streaming data processing engine
- [Flux](flux/v0/index.section.md): Functional data scripting language

## API References

For API documentation, see the API reference section within each product's documentation.
@@ -4,6 +4,7 @@
   <h1>{{ .RenderString .Title }}</h1>
   {{ partial "article/supported-versions.html" . }}
   {{ partial "article/page-meta.html" . }}
+  {{ partial "article/format-selector.html" . }}
 </div>
 {{ partial "article/special-state.html" . }}
 {{ partial "article/stable-version.html" . }}
@@ -0,0 +1,87 @@
{{/*
  Format Selector Component

  Provides a dropdown menu for accessing documentation in LLM-friendly formats.
  Supports both leaf nodes (single pages) and branch nodes (sections with children).

  Features:
  - Copy page/section as Markdown
  - Open in ChatGPT or Claude
  - Download section ZIP (for large sections)
  - Future: MCP server integration

  UI Pattern: Matches Mintlify's format selector style
*/}}

{{- $childCount := 0 -}}
{{- $isSection := false -}}

{{/* Determine if this is a section (branch node) by checking for child pages */}}
{{- if .IsSection -}}
  {{- $isSection = true -}}
  {{- range .Pages -}}
    {{- $childCount = add $childCount 1 -}}
  {{- end -}}
{{- end -}}

{{/* Calculate estimated tokens (rough estimate: ~500 tokens per page) */}}
{{- $estimatedTokens := mul $childCount 500 -}}

{{/* Construct the section download URL if applicable */}}
{{- $sectionDownloadUrl := "" -}}
{{- if $isSection -}}
  {{- $sectionDownloadUrl = printf "%s-download.zip" .RelPermalink -}}
{{- end -}}

{{/* Only show the format selector on documentation pages, not on special pages */}}
{{- if not (in .RelPermalink "/search") -}}

<div
  class="format-selector"
  data-component="format-selector"
  data-child-count="{{ $childCount }}"
  data-estimated-tokens="{{ $estimatedTokens }}"
  {{- if $sectionDownloadUrl }}
  data-section-download-url="{{ $sectionDownloadUrl }}"
  {{- end }}
  {{- /* Future MCP server URLs - commented out for now */ -}}
  {{- /* data-mcp-cursor-url="/docs/mcp/cursor-setup/" */ -}}
  {{- /* data-mcp-vscode-url="/docs/mcp/vscode-setup/" */ -}}
>
  {{/* Button triggers dropdown */}}
  <button
    class="format-selector__button"
    aria-haspopup="true"
    aria-expanded="false"
    aria-label="{{ if $isSection }}Copy section for AI{{ else }}Copy page for AI{{ end }} - Access documentation in different formats"
  >
    <span class="format-selector__button-icon">
      {{/* Document icon SVG */}}
      <svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
        <path d="M9 1H3C2.44772 1 2 1.44772 2 2V14C2 14.5523 2.44772 15 3 15H13C13.5523 15 14 14.5523 14 14V6L9 1Z" stroke="currentColor" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
        <path d="M9 1V6H14" stroke="currentColor" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
      </svg>
    </span>
    <span class="format-selector__button-text" data-button-text>
      {{ if $isSection }}Copy section for AI{{ else }}Copy page for AI{{ end }}
    </span>
    <span class="format-selector__button-arrow">
      {{/* Dropdown arrow icon */}}
      <svg width="12" height="12" viewBox="0 0 12 12" fill="none" xmlns="http://www.w3.org/2000/svg">
        <path d="M3 4.5L6 7.5L9 4.5" stroke="currentColor" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
      </svg>
    </span>
  </button>

  {{/* Dropdown menu - populated by TypeScript */}}
  <div
    class="format-selector__dropdown"
    data-dropdown-menu
    role="menu"
    aria-label="Format options"
  >
    {{/* Options will be dynamically rendered by the TypeScript component */}}
  </div>
</div>

{{- end -}}
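For a hypothetical section page with 12 child pages (the URL and counts here are invented for illustration), the wrapper above renders roughly as follows. Note that because the download URL is `printf "%s-download.zip" .RelPermalink`, the permalink's trailing slash lands immediately before `-download.zip`:

```html
<div
  class="format-selector"
  data-component="format-selector"
  data-child-count="12"
  data-estimated-tokens="6000"
  data-section-download-url="/influxdb3/core/admin/-download.zip"
>
```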
@@ -0,0 +1,70 @@
{{- /*
  Generate an llms.txt file for a section following the https://llmstxt.org specification

  Required: H1 with project/product name
  Optional: Blockquote with brief summary
  Optional: Content sections (NO HEADINGS allowed)
  Optional: H2-delimited file list sections with curated links
*/ -}}
{{- $productKey := "" -}}
{{- $productName := "" -}}
{{- $productDescription := "" -}}
{{- $sectionName := .Title -}}

{{- /* Detect product from URL path */ -}}
{{- if hasPrefix .RelPermalink "/influxdb3/core/" -}}
  {{- $productKey = "influxdb3_core" -}}
{{- else if hasPrefix .RelPermalink "/influxdb3/enterprise/" -}}
  {{- $productKey = "influxdb3_enterprise" -}}
{{- else if hasPrefix .RelPermalink "/influxdb3/cloud-dedicated/" -}}
  {{- $productKey = "influxdb3_cloud_dedicated" -}}
{{- else if hasPrefix .RelPermalink "/influxdb3/cloud-serverless/" -}}
  {{- $productKey = "influxdb3_cloud_serverless" -}}
{{- else if hasPrefix .RelPermalink "/influxdb3/clustered/" -}}
  {{- $productKey = "influxdb3_clustered" -}}
{{- else if hasPrefix .RelPermalink "/influxdb/cloud/" -}}
  {{- $productKey = "influxdb_cloud" -}}
{{- else if hasPrefix .RelPermalink "/influxdb/v2" -}}
  {{- $productKey = "influxdb" -}}
{{- else if hasPrefix .RelPermalink "/telegraf/" -}}
  {{- $productKey = "telegraf" -}}
{{- else if hasPrefix .RelPermalink "/chronograf/" -}}
  {{- $productKey = "chronograf" -}}
{{- else if hasPrefix .RelPermalink "/kapacitor/" -}}
  {{- $productKey = "kapacitor" -}}
{{- else if hasPrefix .RelPermalink "/flux/" -}}
  {{- $productKey = "flux" -}}
{{- else if hasPrefix .RelPermalink "/influxdb3_explorer/" -}}
  {{- $productKey = "influxdb3_explorer" -}}
{{- end -}}

{{- /* Get product data from products.yml */ -}}
{{- if $productKey -}}
  {{- $product := index .Site.Data.products $productKey -}}
  {{- if $product -}}
    {{- $productName = $product.name -}}
  {{- end -}}
{{- end -}}

{{- /* Use the product name for root product sections, otherwise use the section title */ -}}
{{- $h1Title := $sectionName -}}
{{- $rootSections := slice "/influxdb3/core/" "/influxdb3/enterprise/" "/influxdb3/cloud-dedicated/" "/influxdb3/cloud-serverless/" "/influxdb3/clustered/" "/influxdb/cloud/" "/influxdb/v2/" "/telegraf/v1/" "/chronograf/v1/" "/kapacitor/v1/" "/flux/v0/" "/influxdb3_explorer/" -}}
{{- if and $productName (in $rootSections .RelPermalink) -}}
  {{- $h1Title = $productName -}}
{{- end -}}

# {{ $h1Title }}
{{- with .Description }}

> {{ . }}
{{- end }}
{{- with .Content }}
{{ . | plainify | truncate 500 }}
{{- end }}
{{- /* Only list child pages if there are any */ -}}
{{- if .Pages }}

## Pages in this section
{{ range .Pages }}
- [{{ .Title }}]({{ .RelPermalink }}){{ with .Description }}: {{ . }}{{ end }}
{{- end }}
{{- end -}}
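For a root product section, the template above produces output shaped like the following (the titles and descriptions here are placeholders, not actual rendered values):

```markdown
# InfluxDB 3 Core

> Product description from the page's frontmatter, if set.

First ~500 plain-text characters of the section page's content...

## Pages in this section

- [Child page title](/influxdb3/core/child-page/): Child page description
```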
package.json

@@ -1,9 +1,13 @@
 {
   "private": true,
-  "name": "docs-v2",
+  "name": "@influxdata/docs-site",
   "version": "1.0.0",
   "description": "InfluxDB documentation",
   "license": "MIT",
+  "exports": {
+    "./markdown-converter": "./scripts/lib/markdown-converter.cjs",
+    "./product-mappings": "./dist/utils/product-mappings.js"
+  },
   "bin": {
     "docs": "scripts/docs-cli.js"
   },

@@ -13,6 +17,7 @@
  "devDependencies": {
    "@eslint/js": "^9.18.0",
    "@evilmartians/lefthook": "^1.7.1",
    "@types/js-yaml": "^4.0.9",
    "@vvago/vale": "^3.12.0",
    "autoprefixer": ">=10.2.5",
    "cypress": "^14.0.1",

@@ -27,19 +32,29 @@
    "postcss-cli": ">=9.1.0",
    "prettier": "^3.2.5",
    "prettier-plugin-sql": "^0.18.0",
    "remark": "^15.0.1",
    "remark-frontmatter": "^5.0.0",
    "remark-gfm": "^4.0.1",
    "remark-parse": "^11.0.0",
    "typescript": "^5.8.3",
    "typescript-eslint": "^8.32.1",
    "unified": "^11.0.5",
    "winston": "^3.16.0"
  },
  "dependencies": {
    "@types/turndown": "^5.0.6",
    "axios": "^1.12.0",
    "glob": "^10.3.10",
    "gray-matter": "^4.0.3",
    "jquery": "^3.7.1",
    "js-cookie": "^3.0.5",
    "js-yaml": "^4.1.1",
    "jsdom": "^27.2.0",
    "lefthook": "^1.10.10",
    "markdown-link": "^0.1.1",
    "mermaid": "^11.10.0",
    "p-limit": "^5.0.0",
    "turndown": "^7.2.2",
    "vanillajs-datepicker": "^1.3.4"
  },
  "scripts": {

@@ -52,6 +67,10 @@
   "build:agent:instructions": "node ./helper-scripts/build-agent-instructions.js",
   "build:ts": "tsc --project tsconfig.json --outDir dist",
   "build:ts:watch": "tsc --project tsconfig.json --outDir dist --watch",
+  "build:md": "node scripts/build-llm-markdown.js",
+  "build:md:legacy": "node scripts/html-to-markdown.js",
+  "build:md:verbose": "node scripts/html-to-markdown.js --verbose",
+  "deploy:staging": "sh scripts/deploy-staging.sh",
   "lint": "LEFTHOOK_EXCLUDE=test lefthook run pre-commit && lefthook run pre-push",
   "pre-commit": "lefthook run pre-commit",
   "test": "echo \"Run 'yarn test:e2e', 'yarn test:links', 'yarn test:codeblocks:all' or a specific test command. e2e and links test commands can take a glob of file paths to test. Some commands run automatically during the git pre-commit and pre-push hooks.\" && exit 0",
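The new `exports` map restricts and routes subpath imports of the package. A minimal sketch of the lookup it enables (exact-match only, not Node's full resolution algorithm, which also handles patterns and conditions; the helper name is illustrative):

```javascript
// Simplified sketch of how the package.json "exports" map routes subpath
// imports such as require('@influxdata/docs-site/markdown-converter').
const exportsMap = {
  './markdown-converter': './scripts/lib/markdown-converter.cjs',
  './product-mappings': './dist/utils/product-mappings.js',
};

function resolveSubpath(specifier, pkgName = '@influxdata/docs-site') {
  // Only subpaths of this package are resolvable
  if (!specifier.startsWith(pkgName + '/')) return null;
  const subpath = '.' + specifier.slice(pkgName.length);
  // Subpaths absent from the map are not importable at all
  return exportsMap[subpath] ?? null;
}
```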
@@ -0,0 +1,171 @@
# Documentation Build Scripts

## html-to-markdown.js

Converts Hugo-generated HTML files to fully rendered Markdown with evaluated shortcodes, dereferenced shared content, and comments removed.

### Purpose

This script generates production-ready Markdown output for LLM consumption and user downloads. The generated Markdown:

- Has all Hugo shortcodes evaluated to text (e.g., `{{% product-name %}}` → "InfluxDB 3 Core")
- Includes dereferenced shared content in the body
- Removes HTML and Markdown comments
- Adds product context to frontmatter
- Mirrors the HTML version, but in clean Markdown format

### Usage

```bash
# Generate all markdown files (run after Hugo build)
yarn build:md

# Generate with verbose logging
yarn build:md:verbose

# Generate for a specific path
node scripts/html-to-markdown.js --path influxdb3/core

# Generate a limited number of files for testing
node scripts/html-to-markdown.js --limit 10

# Combine options
node scripts/html-to-markdown.js --path telegraf/v1 --verbose
```

### Options

- `--path <path>`: Process a specific path within `public/` (default: process all)
- `--limit <n>`: Limit the number of files to process (useful for testing)
- `--verbose`: Enable detailed logging of conversion progress

### Build Process

1. **Hugo generates HTML** (with all shortcodes evaluated):
   ```bash
   npx hugo --quiet
   ```

2. **Script converts HTML to Markdown**:
   ```bash
   yarn build:md
   ```

3. **Generated files**:
   - Location: `public/**/index.md` (alongside `index.html`)
   - Git status: Ignored (the entire `public/` directory is gitignored)
   - Deployment: Generated at build time, like API docs

### Features

#### Product Context Detection

Automatically detects and adds product information to frontmatter:

```yaml
---
title: Set up InfluxDB 3 Core
description: Install, configure, and set up authorization...
url: /influxdb3/core/get-started/setup/
product: InfluxDB 3 Core
product_version: core
date: 2025-11-13
lastmod: 2025-11-13
---
```

Supported products:

- InfluxDB 3 Core, Enterprise, Cloud Dedicated, Cloud Serverless, Clustered
- InfluxDB v2, v1, Cloud (TSM), Enterprise v1
- Telegraf, Chronograf, Kapacitor, Flux
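Conceptually, detection is a prefix lookup from the page's URL path into a URL-to-product table. A minimal sketch (the entries and helper name here are illustrative; the real mapping is the `PRODUCT_MAP` used by the script):

```javascript
// Illustrative sketch of product detection: match the page's URL path
// against known prefixes and return the frontmatter fields to add.
// The real map covers all supported products.
const PRODUCT_MAP = [
  ['/influxdb3/core/', { product: 'InfluxDB 3 Core', product_version: 'core' }],
  ['/influxdb3/enterprise/', { product: 'InfluxDB 3 Enterprise', product_version: 'enterprise' }],
  ['/telegraf/', { product: 'Telegraf', product_version: 'v1' }],
];

function detectProduct(urlPath) {
  const hit = PRODUCT_MAP.find(([prefix]) => urlPath.startsWith(prefix));
  return hit ? hit[1] : null; // null → no product fields added to frontmatter
}
```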
#### Turndown Configuration
|
||||
|
||||
Custom Turndown rules for InfluxData documentation:
|
||||
|
||||
- **Code blocks**: Preserves language identifiers
|
||||
- **GitHub callouts**: Converts to `> [!Note]` format
|
||||
- **Tables**: GitHub-flavored markdown tables
|
||||
- **Lists**: Preserves nested lists and formatting
|
||||
- **Links**: Keeps relative links intact
|
||||
- **Images**: Preserves alt text and paths
|
||||
|
||||
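To illustrate the callout rule, the replacement half of a Turndown rule can be written as a pure function. This is a sketch, not the repository's actual rule: the `data-callout` attribute and the function name are assumptions, but the signature mirrors Turndown's `replacement(content, node)`:

```javascript
// Sketch of a Turndown replacement function that rewrites a callout
// element as a GitHub-style "> [!Note]" block. The attribute name is an
// assumption for illustration.
function calloutReplacement(content, node) {
  const kind = node.getAttribute('data-callout') || 'note';
  const label = kind[0].toUpperCase() + kind.slice(1).toLowerCase();
  const quoted = content
    .trim()
    .split('\n')
    .map((line) => `> ${line}`)
    .join('\n');
  return `\n> [!${label}]\n${quoted}\n`;
}
```

In a real setup this function would be registered with `turndownService.addRule('callout', { filter, replacement })`.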
#### Content Extraction
|
||||
|
||||
Extracts only article content (removes navigation, footer, etc.):
|
||||
- Target selector: `article.article--content`
|
||||
- Skips files without article content (with warning)
|
||||
|
||||
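A dependency-free sketch of this extraction step (the real script uses JSDOM's `querySelector`; this string-based version only illustrates the behavior, including the skip path for pages without article content):

```javascript
// Simplified extraction: return the inner HTML of the first
// <article class="article--content"> element, or null if the page has
// none (e.g., navigation pages or redirects, which are skipped with a warning).
function extractArticle(html) {
  const open = html.indexOf('<article class="article--content"');
  if (open === -1) return null;
  const start = html.indexOf('>', open) + 1;
  const close = html.indexOf('</article>', start);
  return close === -1 ? null : html.slice(start, close).trim();
}
```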
### Integration
|
||||
|
||||
**Local Development:**
|
||||
```bash
|
||||
# After making content changes
|
||||
npx hugo --quiet && yarn build:md
|
||||
```
|
||||
|
||||
**CircleCI Build Pipeline:**
|
||||
|
||||
The script runs automatically in the CircleCI build pipeline after Hugo generates HTML:
|
||||
|
||||
```yaml
|
||||
# .circleci/config.yml
|
||||
- run:
|
||||
name: Hugo Build
|
||||
command: yarn hugo --environment production --logLevel info --gc --destination workspace/public
|
||||
- run:
|
||||
name: Generate LLM-friendly Markdown
|
||||
command: node scripts/html-to-markdown.js
|
||||
```
|
||||
|
||||
**Build order:**
|
||||
1. Hugo builds HTML → `workspace/public/**/*.html`
|
||||
2. `html-to-markdown.js` converts HTML → `workspace/public/**/*.md`
|
||||
3. All files deployed to S3
|
||||
|
||||
**Production Build (Manual):**
|
||||
```bash
|
||||
npx hugo --quiet
|
||||
yarn build:md
|
||||
```
|
||||
|
||||
**Watch Mode:**
|
||||
For development with auto-regeneration, run Hugo server and regenerate markdown after content changes:
|
||||
```bash
|
||||
# Terminal 1: Hugo server
|
||||
npx hugo server
|
||||
|
||||
# Terminal 2: After making changes
|
||||
yarn build:md
|
||||
```
|
||||
|
||||
### Performance
|
||||
|
||||
- **Processing speed**: ~10-20 files/second
|
||||
- **Full site**: 5,581 HTML files in ~5 minutes
|
||||
- **Memory usage**: Minimal (processes files sequentially)
|
||||
- **Caching**: None (regenerates from HTML each time)
|
||||
|
||||
### Troubleshooting
|
||||
|
||||
**No article content found:**
|
||||
```
|
||||
⚠️ No article content found in /path/to/file.html
|
||||
```
|
||||
- File doesn't have `article.article--content` selector
|
||||
- Usually navigation pages or redirects
|
||||
- Safe to ignore
|
||||
|
||||
**Shortcodes still present:**
|
||||
- Run after Hugo has generated HTML, not before
|
||||
- Hugo must complete its build first
|
||||
|
||||
**Missing product context:**
|
||||
- Check that URL path matches patterns in `PRODUCT_MAP`
|
||||
- Add new products to the map if needed
|
||||
|
||||
### See Also
|
||||
|
||||
- [Plan document](../.context/PLAN-markdown-rendering.md) - Architecture decisions
|
||||
- [API docs generation](../api-docs/README.md) - Similar pattern for API reference
|
||||
- [Package.json scripts](../package.json) - Build commands
|
||||
|
|
@@ -0,0 +1,453 @@
#!/usr/bin/env node
/**
 * Build LLM-friendly Markdown from Hugo-generated HTML
 *
 * This script generates static .md files at build time for optimal performance.
 * Two-phase approach:
 *   1. Convert HTML → individual page markdown (memory-bounded parallelism)
 *   2. Combine pages → section bundles (fast string concatenation)
 */

import { glob } from 'glob';
import fs from 'fs/promises';
import { readFileSync } from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';
import { dirname } from 'path';
import { createRequire } from 'module';
import yaml from 'js-yaml';
import pLimit from 'p-limit';

// Get __dirname equivalent in ESM
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

// Create a require function for CommonJS modules
const require = createRequire(import.meta.url);
const { convertToMarkdown } = require('./lib/markdown-converter.cjs');

// ============================================================================
// CONFIGURATION
// ============================================================================

/**
 * Minimum file size threshold for processing HTML files.
 * Files smaller than this are assumed to be Hugo alias redirects and skipped.
 *
 * Hugo alias redirects are typically 300-400 bytes (simple meta refresh pages).
 * Content pages are typically 30KB-100KB+.
 *
 * Set to 0 to disable redirect detection (process all files).
 *
 * @default 1024 (1KB) - Safe threshold with a large margin
 */
const MIN_HTML_SIZE_BYTES = 1024;

/**
 * Approximate character-to-token ratio for estimation.
 * Used to estimate token count from markdown content length.
 *
 * @default 4 - Rough heuristic (4 characters ≈ 1 token)
 */
const CHARS_PER_TOKEN = 4;

// ============================================================================
// PHASE 1: HTML → MARKDOWN CONVERSION
// ============================================================================

/**
 * Phase 1: Convert all HTML files to individual page markdown.
 * Uses memory-bounded parallelism to avoid OOM in CI.
 */
async function buildPageMarkdown() {
  console.log('📄 Converting HTML to Markdown (individual pages)...\n');
  const startTime = Date.now();

  // Find all HTML files
  const htmlFiles = await glob('public/**/index.html', {
    ignore: ['**/node_modules/**', '**/api-docs/**'],
  });

  console.log(`Found ${htmlFiles.length} HTML files\n`);

  // Memory-bounded concurrency:
  // CircleCI medium (2GB RAM): 10 workers is safe
  // Local development (16GB RAM): 20 workers is faster
  const CONCURRENCY = process.env.CI ? 10 : 20;
  const limit = pLimit(CONCURRENCY);

  let converted = 0;
  let skipped = 0;
  const errors = [];

  // Map all files to limited-concurrency tasks
  const tasks = htmlFiles.map((htmlPath) =>
    limit(async () => {
      try {
        // Check the file size before reading (skip Hugo alias redirects)
        if (MIN_HTML_SIZE_BYTES > 0) {
          const stats = await fs.stat(htmlPath);
          if (stats.size < MIN_HTML_SIZE_BYTES) {
            skipped++;
            return; // Skip redirect page
          }
        }

        // Read HTML
        const html = await fs.readFile(htmlPath, 'utf-8');

        // Derive the URL path for frontmatter
        const urlPath = htmlPath
          .replace(/^public/, '')
          .replace(/\/index\.html$/, '/');

        // Convert to markdown (JSDOM + Turndown processing)
        const markdown = await convertToMarkdown(html, urlPath);

        if (!markdown) {
          skipped++;
          return;
        }

        // Write the .md file next to the .html
        const mdPath = htmlPath.replace(/index\.html$/, 'index.md');
        await fs.writeFile(mdPath, markdown, 'utf-8');

        converted++;

        // Progress logging
        if (converted % 100 === 0) {
          const elapsed = ((Date.now() - startTime) / 1000).toFixed(1);
          const rate = ((converted / (Date.now() - startTime)) * 1000).toFixed(0);
          const memUsed = (process.memoryUsage().heapUsed / 1024 / 1024).toFixed(0);
          console.log(
            `  ✓ ${converted}/${htmlFiles.length} (${rate}/sec, ${elapsed}s elapsed, ${memUsed}MB memory)`
          );
        }
      } catch (error) {
        errors.push({ file: htmlPath, error: error.message });
        console.error(`  ✗ ${htmlPath}: ${error.message}`);
      }
    })
  );

  // Execute all tasks (p-limit ensures only CONCURRENCY run simultaneously)
  await Promise.all(tasks);

  const duration = ((Date.now() - startTime) / 1000).toFixed(1);
  const rate = ((converted / (Date.now() - startTime)) * 1000).toFixed(0);

  console.log(`\n✅ Converted ${converted} files (${rate}/sec)`);
  if (MIN_HTML_SIZE_BYTES > 0) {
    console.log(
      `⏭️  Skipped ${skipped} files (Hugo alias redirects < ${MIN_HTML_SIZE_BYTES} bytes)`
    );
  } else {
    console.log(`⏭️  Skipped ${skipped} files (no article content)`);
  }
  console.log(`⏱️  Phase 1 time: ${duration}s`);

  if (errors.length > 0) {
    console.log(`⚠️  ${errors.length} errors occurred`);
  }

  console.log('');

  return { converted, skipped, errors };
}

/**
 * Phase 2: Build section bundles by combining individual markdown files.
 * Fast string concatenation with minimal memory usage.
 */
async function buildSectionBundles() {
  console.log('📦 Building section bundles...\n');
  const startTime = Date.now();

  // Find all sections (directories with index.md + child index.md files)
  const sections = await findSections();

  console.log(`Found ${sections.length} sections\n`);

  let built = 0;
  const errors = [];

  // High concurrency is OK here - just string operations, minimal memory
  const limit = pLimit(50);

  const tasks = sections.map((section) =>
    limit(async () => {
      try {
        // Read the parent markdown
        const parentMd = await fs.readFile(section.mdPath, 'utf-8');

        // Read all child markdowns
        const childMds = await Promise.all(
          section.children.map(async (child) => ({
            markdown: await fs.readFile(child.mdPath, 'utf-8'),
            url: child.url,
            title: child.title,
          }))
        );

        // Combine markdown files (string manipulation only)
        const combined = combineMarkdown(parentMd, childMds, section.url);

        // Write the section bundle
        const sectionMdPath = section.mdPath.replace(/index\.md$/, 'index.section.md');
        await fs.writeFile(sectionMdPath, combined, 'utf-8');

        built++;

        if (built % 50 === 0) {
          console.log(`  ✓ Built ${built}/${sections.length} sections`);
        }
      } catch (error) {
        errors.push({ section: section.url, error: error.message });
        console.error(`  ✗ ${section.url}: ${error.message}`);
      }
    })
  );

  await Promise.all(tasks);

  const duration = ((Date.now() - startTime) / 1000).toFixed(1);
  console.log(`\n✅ Built ${built} section bundles`);
  console.log(`⏱️  Phase 2 time: ${duration}s`);

  if (errors.length > 0) {
    console.log(`⚠️  ${errors.length} errors occurred`);
  }

  console.log('');

  return { built, errors };
}

/**
 * Find all sections (parent pages with child pages)
 */
async function findSections() {
  const allMdFiles = await glob('public/**/index.md');
  const sections = [];

  for (const mdPath of allMdFiles) {
    const dir = path.dirname(mdPath);

    // Find child directories with index.md
    const childMdFiles = await glob(path.join(dir, '*/index.md'));

    if (childMdFiles.length === 0) continue; // Not a section

    sections.push({
      mdPath: mdPath,
      url: dir.replace(/^public/, '') + '/',
      children: childMdFiles.map((childMdPath) => ({
        mdPath: childMdPath,
        url: path.dirname(childMdPath).replace(/^public/, '') + '/',
        title: extractTitleFromMd(childMdPath),
      })),
    });
  }

  return sections;
}

/**
 * Extract the title from a markdown file (quick regex, no full parsing)
 */
function extractTitleFromMd(mdPath) {
  try {
    const content = readFileSync(mdPath, 'utf-8');
    const match = content.match(/^---[\s\S]+?title:\s*(.+?)$/m);
    return match ? match[1].trim() : 'Untitled';
  } catch {
    return 'Untitled';
  }
}

/**
 * Combine parent and child markdown into a section bundle
 */
function combineMarkdown(parentMd, childMds, sectionUrl) {
  // Parse parent frontmatter + content
  const parent = parseMarkdown(parentMd);

  // Parse child frontmatter + content
  const children = childMds.map(({ markdown, url, title }) => {
    const child = parseMarkdown(markdown);

    // Remove the h1 heading (the title is re-added as an h2 to avoid a duplicate)
    const contentWithoutH1 = child.content.replace(/^#\s+.+?\n+/, '');

    return {
      title: child.frontmatter.title || title,
      url: child.frontmatter.url || url, // Use the full URL from frontmatter
      content: `## ${child.frontmatter.title || title}\n\n${contentWithoutH1}`,
      tokens: child.frontmatter.estimated_tokens || 0,
    };
  });

  // Calculate total tokens
  const totalTokens =
    (parent.frontmatter.estimated_tokens || 0) +
    children.reduce((sum, c) => sum + c.tokens, 0);

  // Sanitize the description (remove newlines, truncate to a reasonable length)
  let description = parent.frontmatter.description || '';
  description = description
    .replace(/\s+/g, ' ') // Replace all whitespace (including newlines) with a single space
    .trim()
    .substring(0, 500); // Truncate to 500 characters max

  // Build the section frontmatter object (serialized to YAML below)
  const frontmatterObj = {
    title: parent.frontmatter.title,
    description: description,
    url: parent.frontmatter.url || sectionUrl, // Use the full URL from parent frontmatter
    product: parent.frontmatter.product || '',
    type: 'section',
    pages: children.length + 1,
    estimated_tokens: totalTokens,
    child_pages: children.map((c) => ({
      url: c.url,
      title: c.title,
    })),
  };

  // Serialize to YAML (handles special characters properly)
  const sectionFrontmatter =
    '---\n' +
    yaml
      .dump(frontmatterObj, {
        lineWidth: -1, // Disable line wrapping
        noRefs: true, // Disable anchors/aliases
      })
      .trim() +
    '\n---';

  // Combine all content
  const allContent = [parent.content, ...children.map((c) => c.content)].join(
    '\n\n---\n\n'
  );

  return `${sectionFrontmatter}\n\n${allContent}\n`;
}

/**
 * Parse markdown into frontmatter + content
 */
function parseMarkdown(markdown) {
  const match = markdown.match(/^---\n([\s\S]+?)\n---\n\n([\s\S]+)$/);

  if (!match) {
    return { frontmatter: {}, content: markdown };
  }

  try {
    const frontmatter = yaml.load(match[1]);
    const content = match[2];
    return { frontmatter, content };
  } catch (error) {
    console.warn('Failed to parse frontmatter:', error.message);
    return { frontmatter: {}, content: markdown };
  }
}

// ============================================================================
// COMMAND-LINE ARGUMENT PARSING
// ============================================================================

/**
 * Parse command-line arguments
 */
function parseArgs() {
  const args = process.argv.slice(2);
  const options = {
    environment: null,
  };

  for (let i = 0; i < args.length; i++) {
    if ((args[i] === '-e' || args[i] === '--env') && args[i + 1]) {
      options.environment = args[++i];
    }
  }

  return options;
}

// Parse arguments and set the environment
const cliOptions = parseArgs();
if (cliOptions.environment) {
  process.env.HUGO_ENV = cliOptions.environment;
}

/**
 * Main execution
 */
async function main() {
  console.log('🚀 Building LLM-friendly Markdown\n');

  // Show the environment if specified
  if (cliOptions.environment) {
    console.log(`🌍 Environment: ${cliOptions.environment}\n`);
  }

  console.log('════════════════════════════════\n');

  const overallStart = Date.now();

  // Phase 1: Generate individual page markdown
  const pageResults = await buildPageMarkdown();

  // Phase 2: Build section bundles
  const sectionResults = await buildSectionBundles();

  // Summary
  const totalDuration = ((Date.now() - overallStart) / 1000).toFixed(1);
  const totalFiles = pageResults.converted + sectionResults.built;

  console.log('════════════════════════════════\n');
  console.log('📊 Summary:');
  console.log(`   Pages: ${pageResults.converted}`);
  console.log(`   Sections: ${sectionResults.built}`);
  console.log(`   Total: ${totalFiles} markdown files`);
  console.log(`   Skipped: ${pageResults.skipped} (no article content)`);

  const totalErrors = pageResults.errors.length + sectionResults.errors.length;
  if (totalErrors > 0) {
    console.log(`   Errors: ${totalErrors}`);
  }

  console.log(`   Time: ${totalDuration}s\n`);

  // Exit with an error code if there were errors
  if (totalErrors > 0) {
    process.exit(1);
  }
}

// Run the script (note: this also runs on import; there is no
// "called directly" guard)
main().catch((error) => {
  console.error('Fatal error:', error);
  process.exit(1);
});

// Export functions for testing
export {
  buildPageMarkdown,
  buildSectionBundles,
  findSections,
  combineMarkdown,
  parseMarkdown,
};
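The `CHARS_PER_TOKEN` heuristic in the script above reduces to a one-line estimator. A sketch (the rounding choice is an assumption, since this excerpt doesn't show where the constant is applied):

```javascript
// Token estimation used for LLM context planning: ~4 characters per token.
const CHARS_PER_TOKEN = 4;

function estimateTokens(markdown) {
  return Math.ceil(markdown.length / CHARS_PER_TOKEN);
}
```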
@@ -0,0 +1,182 @@
#!/bin/bash
#
# Deploy docs-v2 to the staging environment
#
# This script handles the complete staging deployment workflow:
#   1. Build the Hugo site with the staging config
#   2. Generate LLM-friendly Markdown
#   3. Deploy to the S3 staging bucket
#   4. Invalidate the CloudFront cache
#
# Usage:
#   ./scripts/deploy-staging.sh
#
# Required environment variables:
#   STAGING_BUCKET    - S3 bucket name (e.g., new-docs-test-docsbucket-1ns6x5tp79507)
#   AWS_REGION        - AWS region (e.g., us-east-1)
#   AWS_ACCESS_KEY_ID - AWS access key (passed to s3deploy)
#   AWS_SECRET_KEY    - AWS secret key (passed to s3deploy)
#
# Optional environment variables:
#   STAGING_CF_DISTRIBUTION_ID - CloudFront distribution ID (for cache invalidation)
#   STAGING_URL   - Staging site URL (default: https://test2.docs.influxdata.com)
#   SKIP_BUILD    - Set to 'true' to skip the Hugo build (use the existing public/)
#   SKIP_MARKDOWN - Set to 'true' to skip markdown generation
#   SKIP_DEPLOY   - Set to 'true' to build only (no S3 upload)
#

set -e # Exit on error

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color

# Helper functions
info() {
  echo -e "${BLUE}ℹ${NC} $1"
}

success() {
  echo -e "${GREEN}✓${NC} $1"
}

warning() {
  echo -e "${YELLOW}⚠${NC} $1"
}

error() {
  echo -e "${RED}✗${NC} $1"
  exit 1
}

# Validate required environment variables
validate_env() {
  local missing=()

  if [ -z "$STAGING_BUCKET" ]; then
    missing+=("STAGING_BUCKET")
  fi

  if [ -z "$AWS_REGION" ]; then
    missing+=("AWS_REGION")
  fi

  if [ ${#missing[@]} -gt 0 ]; then
    error "Missing required environment variables: ${missing[*]}"
  fi

  # Set the default staging URL if not provided
  if [ -z "$STAGING_URL" ]; then
    STAGING_URL="https://test2.docs.influxdata.com"
  fi
  export STAGING_URL

  success "Environment variables validated"
}

# Check if s3deploy is installed
check_s3deploy() {
  if ! command -v s3deploy &> /dev/null; then
    error "s3deploy not found. Install with: deploy/ci-install-s3deploy.sh"
  fi
  success "s3deploy found: $(s3deploy -V | head -1)"
}

# Build the Hugo site
build_hugo() {
  if [ "$SKIP_BUILD" = "true" ]; then
    warning "Skipping Hugo build (SKIP_BUILD=true)"
    return
  fi

  info "Building Hugo site with staging config..."
  yarn hugo --environment staging --logLevel info --gc --destination public
  success "Hugo build complete"
}

# Generate LLM-friendly Markdown
build_markdown() {
  if [ "$SKIP_MARKDOWN" = "true" ]; then
    warning "Skipping markdown generation (SKIP_MARKDOWN=true)"
    return
  fi

  info "Generating LLM-friendly Markdown..."
  yarn build:md -e staging
  success "Markdown generation complete"
}

# Deploy to S3
deploy_to_s3() {
  if [ "$SKIP_DEPLOY" = "true" ]; then
    warning "Skipping S3 deployment (SKIP_DEPLOY=true)"
    return
  fi

  info "Deploying to S3 bucket: $STAGING_BUCKET"
  s3deploy -source=public/ \
    -bucket="$STAGING_BUCKET" \
    -region="$AWS_REGION" \
    -distribution-id="${STAGING_CF_DISTRIBUTION_ID}" \
    -key="$AWS_ACCESS_KEY_ID" \
    -secret="$AWS_SECRET_KEY" \
    -force \
    -v
  success "Deployment to S3 complete"
}

# Report CloudFront cache invalidation (s3deploy performs the invalidation
# itself when a distribution ID is passed)
invalidate_cloudfront() {
  if [ "$SKIP_DEPLOY" = "true" ] || [ -z "$STAGING_CF_DISTRIBUTION_ID" ]; then
    return
  fi

  info "CloudFront cache invalidation initiated"
  info "Distribution ID: $STAGING_CF_DISTRIBUTION_ID"
  success "Cache will be invalidated by s3deploy"
}

# Print summary
print_summary() {
  echo ""
  echo "════════════════════════════════════════"
  success "Staging deployment complete!"
  echo "════════════════════════════════════════"
  echo ""
  info "Staging URL: $STAGING_URL"
  if [ -n "$STAGING_CF_DISTRIBUTION_ID" ]; then
    info "CloudFront: $STAGING_CF_DISTRIBUTION_ID"
    warning "Cache invalidation may take 5-10 minutes"
  fi
  echo ""
}

# Main execution
main() {
  echo ""
  echo "════════════════════════════════════════"
  info "docs-v2 Staging Deployment"
  echo "════════════════════════════════════════"
  echo ""

  validate_env
  check_s3deploy

  echo ""
  build_hugo
  build_markdown

  echo ""
  deploy_to_s3
  invalidate_cloudfront

  print_summary
}
|
||||
|
||||
# Run main function
|
||||
main
|
||||
|
|
@@ -0,0 +1,461 @@
#!/usr/bin/env node

/**
 * HTML to Markdown Converter CLI for InfluxData Documentation
 *
 * Generates LLM-friendly Markdown from Hugo-generated HTML documentation.
 * This script is the local CLI companion to the Lambda@Edge function that serves
 * Markdown on-demand at docs.influxdata.com.
 *
 * ## Architecture
 *
 * The core conversion logic lives in ./lib/markdown-converter.cjs, which is shared
 * between this CLI tool and the Lambda@Edge function in deploy/llm-markdown/.
 * This ensures local builds and the production Lambda use identical conversion logic.
 *
 * ## Prerequisites
 *
 * Before running this script, you must:
 *
 * 1. Install dependencies:
 *    ```bash
 *    yarn install
 *    ```
 *
 * 2. Compile TypeScript (for product mappings):
 *    ```bash
 *    yarn build:ts
 *    ```
 *
 * 3. Build the Hugo site:
 *    ```bash
 *    npx hugo --quiet
 *    ```
 *
 * ## Usage
 *
 * Basic usage:
 * ```bash
 * node scripts/html-to-markdown.js [options]
 * ```
 *
 * ## Options
 *
 * --path <path>    Process a specific content path relative to the public/ directory
 *                  Example: influxdb3/core/get-started
 *
 * --limit <n>      Limit the number of files to process (useful for testing)
 *                  Example: --limit 10
 *
 * -e, --env <env>  Set environment (development, staging, production)
 *                  Controls the base URL in frontmatter (matches Hugo's -e flag)
 *                  Example: -e staging
 *
 * --verbose        Enable detailed logging showing each file processed
 *
 * ## Examples
 *
 * Generate Markdown for all documentation:
 * ```bash
 * node scripts/html-to-markdown.js
 * ```
 *
 * Generate Markdown for InfluxDB 3 Core documentation:
 * ```bash
 * node scripts/html-to-markdown.js --path influxdb3/core
 * ```
 *
 * Generate Markdown for a specific section (testing):
 * ```bash
 * node scripts/html-to-markdown.js --path influxdb3/core/get-started --limit 10
 * ```
 *
 * Generate with verbose output:
 * ```bash
 * node scripts/html-to-markdown.js --path influxdb3/core --limit 5 --verbose
 * ```
 *
 * Generate Markdown with staging URLs:
 * ```bash
 * node scripts/html-to-markdown.js --path influxdb3/core -e staging
 * ```
 *
 * ## Output Files
 *
 * This script generates two types of Markdown files:
 *
 * 1. **Single page**: `index.md`
 *    - Mirrors the HTML page structure
 *    - Contains YAML frontmatter with title, description, URL, and product info
 *    - Located alongside the source `index.html`
 *
 * 2. **Section aggregation**: `index.section.md`
 *    - Combines the parent page and all child pages in one file
 *    - Optimized for LLM context windows
 *    - Only generated for pages that have child pages
 *    - Enhanced frontmatter includes the child page list and a token estimate
 *
 * ## Frontmatter Structure
 *
 * Single page frontmatter:
 * ```yaml
 * ---
 * title: Page Title
 * description: Page description from meta tags
 * url: /influxdb3/core/path/to/page/
 * product: InfluxDB 3 Core
 * version: core
 * ---
 * ```
 *
 * Section aggregation frontmatter includes additional fields:
 * ```yaml
 * ---
 * title: Section Title
 * description: Section description
 * url: /influxdb3/core/section/
 * type: section
 * pages: 5
 * estimated_tokens: 12500
 * product: InfluxDB 3 Core
 * version: core
 * child_pages:
 *   - url: /influxdb3/core/section/page1/
 *     title: Page 1 Title
 *   - url: /influxdb3/core/section/page2/
 *     title: Page 2 Title
 * ---
 * ```
 *
 * ## Testing Generated Markdown
 *
 * Use Cypress to validate generated Markdown:
 * ```bash
 * node cypress/support/run-e2e-specs.js \
 *   --spec "cypress/e2e/content/markdown-content-validation.cy.js"
 * ```
 *
 * ## Common Issues
 *
 * **Error: Directory not found**
 * - Solution: Run `npx hugo --quiet` first to generate HTML files
 *
 * **"No article content found" warnings**
 * - This is normal for alias/redirect pages
 * - The script skips these pages automatically
 *
 * **Memory issues with large builds**
 * - Use `--path` to process specific sections
 * - Use `--limit` for testing with small batches
 * - The script includes periodic garbage collection hints
 *
 * ## Related Files
 *
 * - Core logic: `scripts/lib/markdown-converter.cjs`
 * - Lambda handler: `deploy/llm-markdown/lambda-edge/markdown-generator/index.js`
 * - Product detection: `dist/utils/product-mappings.js` (compiled from TypeScript)
 * - Cypress tests: `cypress/e2e/content/markdown-content-validation.cy.js`
 */

import fs from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';
import {
  convertToMarkdown,
  convertSectionToMarkdown,
} from './lib/markdown-converter.cjs';

const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);

// Command-line arguments and default options
const args = process.argv.slice(2);
const options = {
  publicDir: path.join(__dirname, '..', 'public'),
  limit: null,
  verbose: false,
  specificPath: null,
  environment: null,
};

// Parse command-line arguments
for (let i = 0; i < args.length; i++) {
  if (args[i] === '--path' && args[i + 1]) {
    options.specificPath = args[++i];
  } else if (args[i] === '--limit' && args[i + 1]) {
    options.limit = parseInt(args[++i], 10);
  } else if ((args[i] === '-e' || args[i] === '--env') && args[i + 1]) {
    options.environment = args[++i];
  } else if (args[i] === '--verbose') {
    options.verbose = true;
  }
}

// Set the HUGO_ENV environment variable based on the --env flag
// (matches Hugo's -e flag behavior)
if (options.environment) {
  process.env.HUGO_ENV = options.environment;
  console.log(`🌍 Environment set to: ${options.environment}`);
}
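
The flag-parsing loop above can be exercised in isolation. A minimal standalone sketch (`parseArgs` is an illustrative helper name, not part of the script):

```javascript
// Standalone sketch of the flag-parsing loop above.
// parseArgs is an illustrative helper name, not part of the script.
function parseArgs(args) {
  const options = {
    specificPath: null,
    limit: null,
    environment: null,
    verbose: false,
  };
  for (let i = 0; i < args.length; i++) {
    if (args[i] === '--path' && args[i + 1]) {
      options.specificPath = args[++i];
    } else if (args[i] === '--limit' && args[i + 1]) {
      options.limit = parseInt(args[++i], 10);
    } else if ((args[i] === '-e' || args[i] === '--env') && args[i + 1]) {
      options.environment = args[++i];
    } else if (args[i] === '--verbose') {
      options.verbose = true;
    }
  }
  return options;
}

const parsed = parseArgs(['--path', 'influxdb3/core', '--limit', '5', '--verbose']);
console.log(parsed.specificPath); // influxdb3/core
```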

/**
 * Check if a directory is a section (has child directories with index.html)
 */
function isSection(dirPath) {
  try {
    const files = fs.readdirSync(dirPath);
    return files.some((file) => {
      const fullPath = path.join(dirPath, file);
      const stat = fs.statSync(fullPath);
      return (
        stat.isDirectory() && fs.existsSync(path.join(fullPath, 'index.html'))
      );
    });
  } catch (error) {
    return false;
  }
}

/**
 * Find all child page HTML files in a section
 */
function findChildPages(sectionPath) {
  try {
    const files = fs.readdirSync(sectionPath);
    const childPages = [];

    for (const file of files) {
      const fullPath = path.join(sectionPath, file);
      const stat = fs.statSync(fullPath);

      if (stat.isDirectory()) {
        const childIndexPath = path.join(fullPath, 'index.html');
        if (fs.existsSync(childIndexPath)) {
          childPages.push(childIndexPath);
        }
      }
    }

    return childPages;
  } catch (error) {
    console.error(
      `Error finding child pages in ${sectionPath}:`,
      error.message
    );
    return [];
  }
}

/**
 * Convert a single HTML file to Markdown using the shared library
 */
async function convertHtmlFileToMarkdown(htmlFilePath) {
  try {
    const htmlContent = fs.readFileSync(htmlFilePath, 'utf-8');

    // Derive the URL path from the file path
    const relativePath = path.relative(options.publicDir, htmlFilePath);
    const urlPath =
      '/' + relativePath.replace(/\/index\.html$/, '/').replace(/\\/g, '/');

    // Use the shared conversion function
    const markdown = await convertToMarkdown(htmlContent, urlPath);
    if (!markdown) {
      return null;
    }

    // Write to index.md in the same directory
    const markdownFilePath = htmlFilePath.replace(/index\.html$/, 'index.md');
    fs.writeFileSync(markdownFilePath, markdown, 'utf-8');

    if (options.verbose) {
      console.log(`  ✓ Converted: ${relativePath}`);
    }

    return markdownFilePath;
  } catch (error) {
    console.error(`  ✗ Error converting ${htmlFilePath}:`, error.message);
    return null;
  }
}
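
The URL derivation in `convertHtmlFileToMarkdown` is pure string manipulation and can be sketched standalone (`toUrlPath` is an illustrative helper name, not part of the script):

```javascript
// Standalone sketch of the file-path → URL-path derivation used above:
// strip the trailing index.html and normalize path separators.
// toUrlPath is an illustrative helper name, not part of the script.
function toUrlPath(relativePath) {
  return '/' + relativePath.replace(/\/index\.html$/, '/').replace(/\\/g, '/');
}

console.log(toUrlPath('influxdb3/core/get-started/index.html'));
// → /influxdb3/core/get-started/
```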

/**
 * Aggregate section and child page Markdown using the shared library
 */
async function aggregateSectionMarkdown(sectionHtmlPath) {
  try {
    const sectionDir = path.dirname(sectionHtmlPath);

    // Read section HTML
    const sectionHtml = fs.readFileSync(sectionHtmlPath, 'utf-8');

    // Derive URL path
    const sectionUrlPath =
      '/' +
      path
        .relative(options.publicDir, sectionHtmlPath)
        .replace(/\/index\.html$/, '/')
        .replace(/\\/g, '/');

    // Find and read child pages
    const childPaths = findChildPages(sectionDir);
    const childHtmls = [];

    for (const childPath of childPaths) {
      try {
        const childHtml = fs.readFileSync(childPath, 'utf-8');
        const childUrl =
          '/' +
          path
            .relative(options.publicDir, childPath)
            .replace(/\/index\.html$/, '/')
            .replace(/\\/g, '/');

        childHtmls.push({ html: childHtml, url: childUrl });
      } catch (error) {
        if (options.verbose) {
          console.warn(`  ⚠️  Could not read child page: ${childPath}`);
        }
      }
    }

    // Use the shared conversion function
    const markdown = await convertSectionToMarkdown(
      sectionHtml,
      sectionUrlPath,
      childHtmls
    );

    return markdown;
  } catch (error) {
    console.error(
      `Error aggregating section ${sectionHtmlPath}:`,
      error.message
    );
    return null;
  }
}

/**
 * Find all HTML files recursively
 */
function findHtmlFiles(dir, fileList = []) {
  const files = fs.readdirSync(dir);

  for (const file of files) {
    const filePath = path.join(dir, file);
    const stat = fs.statSync(filePath);

    if (stat.isDirectory()) {
      findHtmlFiles(filePath, fileList);
    } else if (file === 'index.html') {
      fileList.push(filePath);
    }
  }

  return fileList;
}

/**
 * Main function
 */
async function main() {
  console.log('🚀 Starting HTML to Markdown conversion...\n');

  const startDir = options.specificPath
    ? path.join(options.publicDir, options.specificPath)
    : options.publicDir;

  if (!fs.existsSync(startDir)) {
    console.error(`❌ Error: Directory not found: ${startDir}`);
    console.error('   Run "npx hugo --quiet" first to generate HTML files.');
    process.exit(1);
  }

  console.log(`📂 Scanning: ${path.relative(process.cwd(), startDir)}`);

  const htmlFiles = findHtmlFiles(startDir);

  // Sort files by depth (shallow first) so root index.html files are processed first
  htmlFiles.sort((a, b) => {
    const depthA = a.split(path.sep).length;
    const depthB = b.split(path.sep).length;
    return depthA - depthB;
  });

  const totalFiles = options.limit
    ? Math.min(htmlFiles.length, options.limit)
    : htmlFiles.length;

  console.log(`📄 Found ${htmlFiles.length} HTML files`);
  if (options.limit) {
    console.log(
      `🎯 Processing first ${totalFiles} files (--limit ${options.limit})`
    );
  }
  console.log('');

  let converted = 0;
  let skipped = 0;
  let sectionsGenerated = 0;

  const filesToProcess = htmlFiles.slice(0, totalFiles);

  for (let i = 0; i < filesToProcess.length; i++) {
    const htmlFile = filesToProcess[i];

    if (!options.verbose && i > 0 && i % 100 === 0) {
      console.log(`  Progress: ${i}/${totalFiles} files...`);
    }

    // Generate regular index.md
    const result = await convertHtmlFileToMarkdown(htmlFile);
    if (result) {
      converted++;
    } else {
      skipped++;
    }

    // Check whether this is a section and generate aggregated Markdown
    const htmlDir = path.dirname(htmlFile);
    if (result && isSection(htmlDir)) {
      try {
        const sectionMarkdown = await aggregateSectionMarkdown(htmlFile);
        if (sectionMarkdown) {
          const sectionFilePath = htmlFile.replace(
            /index\.html$/,
            'index.section.md'
          );
          fs.writeFileSync(sectionFilePath, sectionMarkdown, 'utf-8');
          sectionsGenerated++;

          if (options.verbose) {
            const relativePath = path.relative(
              options.publicDir,
              sectionFilePath
            );
            console.log(`  ✓ Generated section: ${relativePath}`);
          }
        }
      } catch (error) {
        console.error(
          `  ✗ Error generating section for ${htmlFile}:`,
          error.message
        );
      }
    }

    // Periodic garbage collection hint every 100 files
    if (i > 0 && i % 100 === 0 && global.gc) {
      global.gc();
    }
  }

  console.log('\n✅ Conversion complete!');
  console.log(`   Converted: ${converted} files`);
  console.log(`   Sections:  ${sectionsGenerated} aggregated files`);
  console.log(`   Skipped:   ${skipped} files`);
  console.log(`   Total:     ${totalFiles} files processed`);
}

// Run main function
main();

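The shallow-first ordering used in `main()` above can be sketched in isolation (the sample paths are illustrative):

```javascript
// Standalone sketch of the shallow-first sort used in main():
// paths with fewer separators are processed before deeper ones,
// so root index.html files come first.
const sep = '/'; // path.sep on POSIX
const files = [
  '/public/influxdb3/core/get-started/index.html',
  '/public/index.html',
  '/public/influxdb3/index.html',
];
files.sort((a, b) => a.split(sep).length - b.split(sep).length);
console.log(files[0]); // /public/index.html
```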
@@ -0,0 +1,635 @@
/**
 * Markdown Converter Library
 *
 * Core conversion logic for transforming HTML to Markdown.
 * This library is used by both:
 * - docs-v2 build scripts (html-to-markdown.js)
 * - the docs-tooling Lambda@Edge function
 *
 * Exports reusable functions for HTML→Markdown conversion.
 */

const TurndownService = require('turndown');
const { JSDOM } = require('jsdom');
const path = require('path');
const fs = require('fs');
const yaml = require('js-yaml');

// Try to load the Rust converter (about 10x faster); fall back to JavaScript
let rustConverter = null;
let USE_RUST = false;
try {
  rustConverter = require('../rust-markdown-converter');
  USE_RUST = true;
  console.log('✓ Rust markdown converter loaded');
} catch (err) {
  console.log('ℹ Using JavaScript converter (Rust not available)');
  rustConverter = null;
}

// Built-in product mappings (fallback because the ESM product-mappings
// module can't be required from CommonJS)
const URL_PATTERN_MAP = {
  '/influxdb3/core/': 'influxdb3_core',
  '/influxdb3/enterprise/': 'influxdb3_enterprise',
  '/influxdb3/cloud-dedicated/': 'influxdb3_cloud_dedicated',
  '/influxdb3/cloud-serverless/': 'influxdb3_cloud_serverless',
  '/influxdb3/clustered/': 'influxdb3_clustered',
  '/influxdb3/explorer/': 'influxdb3_explorer',
  '/influxdb/cloud/': 'influxdb_cloud',
  '/influxdb/v2': 'influxdb_v2',
  '/influxdb/v1': 'influxdb_v1',
  '/enterprise_influxdb/': 'enterprise_influxdb',
  '/telegraf/': 'telegraf',
  '/chronograf/': 'chronograf',
  '/kapacitor/': 'kapacitor',
  '/flux/': 'flux',
};

const PRODUCT_NAME_MAP = {
  influxdb3_core: { name: 'InfluxDB 3 Core', version: 'core' },
  influxdb3_enterprise: { name: 'InfluxDB 3 Enterprise', version: 'enterprise' },
  influxdb3_cloud_dedicated: {
    name: 'InfluxDB Cloud Dedicated',
    version: 'cloud-dedicated',
  },
  influxdb3_cloud_serverless: {
    name: 'InfluxDB Cloud Serverless',
    version: 'cloud-serverless',
  },
  influxdb3_clustered: { name: 'InfluxDB Clustered', version: 'clustered' },
  influxdb3_explorer: { name: 'InfluxDB 3 Explorer', version: 'explorer' },
  influxdb_cloud: { name: 'InfluxDB Cloud (TSM)', version: 'cloud' },
  influxdb_v2: { name: 'InfluxDB OSS v2', version: 'v2' },
  influxdb_v1: { name: 'InfluxDB OSS v1', version: 'v1' },
  enterprise_influxdb: { name: 'InfluxDB Enterprise v1', version: 'v1' },
  telegraf: { name: 'Telegraf', version: 'v1' },
  chronograf: { name: 'Chronograf', version: 'v1' },
  kapacitor: { name: 'Kapacitor', version: 'v1' },
  flux: { name: 'Flux', version: 'v0' },
};

// Note: the ESM product-mappings module can't be required from CommonJS,
// so the built-in mappings above are used instead.
let productMappings = null;

// Debug mode - set to true to enable verbose logging
const DEBUG = false;

// Product data cache
let productsData = null;

/**
 * Detect the base URL for the current environment
 * @returns {string} Base URL (http://localhost:1313, staging URL, or production URL)
 */
function detectBaseUrl() {
  // Check environment variables first
  if (process.env.BASE_URL) {
    return process.env.BASE_URL;
  }

  // Check whether the Hugo dev server is running on localhost
  if (
    process.env.HUGO_ENV === 'development' ||
    process.env.NODE_ENV === 'development'
  ) {
    return 'http://localhost:1313';
  }

  // Check for the staging environment
  if (
    process.env.HUGO_ENV === 'staging' ||
    process.env.DEPLOY_ENV === 'staging'
  ) {
    return process.env.STAGING_URL || 'https://test2.docs.influxdata.com';
  }

  // Default to production
  return 'https://docs.influxdata.com';
}

/**
 * Initialize product data
 * Uses the product-mappings module (compiled from TypeScript) when available
 */
async function ensureProductDataInitialized() {
  if (productsData) {
    return;
  }

  if (productMappings && productMappings.initializeProductData) {
    try {
      await productMappings.initializeProductData();
      productsData = true; // Mark as initialized
    } catch (err) {
      console.warn('Failed to initialize product-mappings:', err.message);
      productsData = true; // Mark as initialized anyway to avoid retries
    }
  } else {
    productsData = true; // Mark as initialized (fallback mode)
  }
}

/**
 * Get product info from a URL path
 * Uses the built-in URL pattern maps for detection
 */
function getProductFromPath(urlPath) {
  // Find the matching product key from URL patterns
  for (const [pattern, productKey] of Object.entries(URL_PATTERN_MAP)) {
    if (urlPath.includes(pattern)) {
      const productInfo = PRODUCT_NAME_MAP[productKey];
      if (productInfo) {
        return productInfo;
      }
    }
  }
  return null;
}
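A self-contained sketch of the pattern lookup above, with the maps trimmed to two entries for brevity (the trimmed maps are illustrative; the real ones are defined above):

```javascript
// Trimmed-down sketch of getProductFromPath(): the first matching URL
// pattern wins and maps to a product name/version pair.
const URL_PATTERN_MAP = {
  '/influxdb3/core/': 'influxdb3_core',
  '/telegraf/': 'telegraf',
};
const PRODUCT_NAME_MAP = {
  influxdb3_core: { name: 'InfluxDB 3 Core', version: 'core' },
  telegraf: { name: 'Telegraf', version: 'v1' },
};

function getProductFromPath(urlPath) {
  for (const [pattern, key] of Object.entries(URL_PATTERN_MAP)) {
    if (urlPath.includes(pattern)) return PRODUCT_NAME_MAP[key] || null;
  }
  return null;
}

console.log(getProductFromPath('/influxdb3/core/get-started/'));
// → { name: 'InfluxDB 3 Core', version: 'core' }
```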

/**
 * Detect product context from a URL path
 */
function detectProduct(urlPath) {
  return getProductFromPath(urlPath);
}

/**
 * Configure Turndown for InfluxData documentation
 */
function createTurndownService() {
  const turndownService = new TurndownService({
    headingStyle: 'atx',
    codeBlockStyle: 'fenced',
    fence: '```',
    emDelimiter: '*',
    strongDelimiter: '**',
    // Note: linkStyle: 'inline' breaks link conversion in Turndown 7.2.2.
    // Using the default 'referenced' style, which works correctly.
    bulletListMarker: '-',
  });

  // Preserve code block language identifiers
  turndownService.addRule('fencedCodeBlock', {
    filter: function (node, options) {
      return (
        options.codeBlockStyle === 'fenced' &&
        node.nodeName === 'PRE' &&
        node.firstChild &&
        node.firstChild.nodeName === 'CODE'
      );
    },
    replacement: function (content, node, options) {
      const code = node.firstChild;
      const language = code.className.replace(/^language-/, '') || '';
      const fence = options.fence;
      return `\n\n${fence}${language}\n${code.textContent}\n${fence}\n\n`;
    },
  });

  // Improve list item handling - ensure proper spacing
  turndownService.addRule('listItems', {
    filter: 'li',
    replacement: function (content, node, options) {
      content = content
        .replace(/^\n+/, '') // Remove leading newlines
        .replace(/\n+$/, '\n') // Single trailing newline
        .replace(/\n/gm, '\n    '); // Indent nested content

      let prefix = options.bulletListMarker + '   '; // Dash + 3 spaces for unordered lists
      const parent = node.parentNode;

      if (parent.nodeName === 'OL') {
        const start = parent.getAttribute('start');
        const index = Array.prototype.indexOf.call(parent.children, node);
        prefix = (start ? Number(start) + index : index + 1) + '. ';
      }

      return (
        prefix +
        content +
        (node.nextSibling && !/\n$/.test(content) ? '\n' : '')
      );
    },
  });

  // Convert HTML tables to Markdown tables
  turndownService.addRule('tables', {
    filter: 'table',
    replacement: function (content, node) {
      // Get all rows from thead and tbody
      const theadRows = Array.from(node.querySelectorAll('thead tr'));
      const tbodyRows = Array.from(node.querySelectorAll('tbody tr'));

      // If there is no thead/tbody, fall back to all tr elements
      const allRows =
        theadRows.length || tbodyRows.length
          ? [...theadRows, ...tbodyRows]
          : Array.from(node.querySelectorAll('tr'));

      if (allRows.length === 0) return '';

      // Extract headers from the first row
      const headerRow = allRows[0];
      const headers = Array.from(headerRow.querySelectorAll('th, td')).map(
        (cell) => cell.textContent.trim()
      );

      // Build the separator row
      const separator = headers.map(() => '---').join(' | ');

      // Extract data rows (skip the first row, which is the header)
      const dataRows = allRows
        .slice(1)
        .map((row) => {
          const cells = Array.from(row.querySelectorAll('td, th')).map((cell) =>
            cell.textContent.trim().replace(/\n/g, ' ')
          );
          return '| ' + cells.join(' | ') + ' |';
        })
        .join('\n');

      return (
        '\n| ' +
        headers.join(' | ') +
        ' |\n| ' +
        separator +
        ' |\n' +
        dataRows +
        '\n\n'
      );
    },
  });

  // Handle GitHub-style callouts (notes, warnings, etc.)
  turndownService.addRule('githubCallouts', {
    filter: function (node) {
      return (
        node.nodeName === 'BLOCKQUOTE' &&
        node.classList &&
        (node.classList.contains('note') ||
          node.classList.contains('warning') ||
          node.classList.contains('important') ||
          node.classList.contains('tip') ||
          node.classList.contains('caution'))
      );
    },
    replacement: function (content, node) {
      const type = Array.from(node.classList).find((c) =>
        ['note', 'warning', 'important', 'tip', 'caution'].includes(c)
      );
      const label =
        {
          note: 'Note',
          warning: 'Warning',
          caution: 'Caution',
          important: 'Important',
          tip: 'Tip',
        }[type] || 'Note';

      return `\n> [!${label}]\n> ${content.trim().replace(/\n/g, '\n> ')}\n\n`;
    },
  });

  // Remove navigation, footer, and other non-content elements
  turndownService.remove([
    'nav',
    'header',
    'footer',
    'script',
    'style',
    'noscript',
    'iframe',
    '.format-selector', // Remove format selector buttons (Copy page, etc.)
    '.page-feedback', // Remove page feedback form
    '#page-feedback', // Remove feedback modal
  ]);

  return turndownService;
}
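
Two of the custom Turndown rules above are, at their core, plain string assembly. A standalone sketch of each (`formatTable` and `formatCallout` are illustrative helper names, not part of the library):

```javascript
// Standalone sketches of the string assembly inside the 'tables' and
// 'githubCallouts' rules. formatTable and formatCallout are illustrative
// helper names, not part of the library.
function formatTable(headers, rows) {
  const separator = headers.map(() => '---').join(' | ');
  const dataRows = rows
    .map((cells) => '| ' + cells.join(' | ') + ' |')
    .join('\n');
  return (
    '\n| ' + headers.join(' | ') + ' |\n| ' + separator + ' |\n' + dataRows + '\n\n'
  );
}

function formatCallout(type, content) {
  const label =
    {
      note: 'Note',
      warning: 'Warning',
      caution: 'Caution',
      important: 'Important',
      tip: 'Tip',
    }[type] || 'Note';
  return `\n> [!${label}]\n> ${content.trim().replace(/\n/g, '\n> ')}\n\n`;
}

console.log(formatTable(['Name', 'Type'], [['host', 'tag'], ['usage', 'field']]));
console.log(formatCallout('warning', 'Back up your data first.'));
```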

/**
 * Extract article content from HTML
 * @param {string} htmlContent - Raw HTML content
 * @param {string} contextInfo - Context info for error messages (file path or URL)
 * @returns {Object|null} Object with title, description, and content, or null if not found
 */
function extractArticleContent(htmlContent, contextInfo = '') {
  const dom = new JSDOM(htmlContent);
  const document = dom.window.document;

  try {
    // Find the main article content
    const article = document.querySelector('article.article--content');

    // Debug logging
    if (DEBUG) {
      console.log(`[DEBUG] Looking for article in ${contextInfo}`);
      console.log(`[DEBUG] HTML length: ${htmlContent.length}`);
      console.log(`[DEBUG] Article found: ${!!article}`);
    }

    if (!article) {
      // Try alternative selectors to aid debugging
      if (DEBUG) {
        const anyArticle = document.querySelector('article');
        const articleContent = document.querySelector('.article--content');
        console.log(`[DEBUG] Any article element: ${!!anyArticle}`);
        console.log(`[DEBUG] .article--content element: ${!!articleContent}`);
      }

      console.warn(
        `  ⚠️  No article content found in ${contextInfo}. This is typically not a problem and indicates an aliased path.`
      );
      return null;
    }

    // Remove unwanted elements from the article before conversion
    const elementsToRemove = [
      '.format-selector', // Format selector buttons
      '.page-feedback', // Page feedback form
      '#page-feedback', // Feedback modal
      '.feedback-widget', // Any feedback widgets
      '.helpful', // "Was this page helpful?" section
      '.feedback.block', // Footer feedback/support section
      'hr', // Horizontal rules (often used as separators before the footer)
    ];

    elementsToRemove.forEach((selector) => {
      const elements = article.querySelectorAll(selector);
      elements.forEach((el) => el.remove());
    });

    // Extract metadata
    const title =
      document.querySelector('h1')?.textContent?.trim() ||
      document.querySelector('title')?.textContent?.trim() ||
      'Untitled';

    const description =
      document
        .querySelector('meta[name="description"]')
        ?.getAttribute('content') ||
      document
        .querySelector('meta[property="og:description"]')
        ?.getAttribute('content') ||
      '';

    // Get the content before closing the DOM
    const content = article.innerHTML;

    return {
      title,
      description,
      content,
    };
  } finally {
    // Clean up JSDOM to prevent memory leaks
    dom.window.close();
  }
}

/**
 * Generate frontmatter for a Markdown file (single page)
 * @param {Object} metadata - Object with title, description, and content
 * @param {string} urlPath - URL path for the page
 * @param {string} baseUrl - Base URL for full URL construction
 * @returns {string} YAML frontmatter as a string
 */
function generateFrontmatter(metadata, urlPath, baseUrl = '') {
  const product = detectProduct(urlPath);

  // Sanitize the description (remove newlines, truncate to a reasonable length)
  let description = metadata.description || '';
  description = description
    .replace(/\s+/g, ' ') // Replace all whitespace (including newlines) with a single space
    .trim()
    .substring(0, 500); // Truncate to 500 characters max

  // Add a token estimate (rough heuristic: 4 chars per token)
  const contentLength = metadata.content?.length || 0;
  const estimatedTokens = Math.ceil(contentLength / 4);

  // Build the full URL (baseUrl + path)
  const fullUrl = baseUrl ? `${baseUrl.replace(/\/$/, '')}${urlPath}` : urlPath;

  // Build the frontmatter object (serialized to YAML below)
  const frontmatterObj = {
    title: metadata.title,
    description: description,
    url: fullUrl,
    estimated_tokens: estimatedTokens,
  };

  if (product) {
    frontmatterObj.product = product.name;
    if (product.version) {
      frontmatterObj.version = product.version;
    }
  }

  // Serialize to YAML (handles special characters properly)
  return (
    '---\n' +
    yaml
      .dump(frontmatterObj, {
        lineWidth: -1, // Disable line wrapping
        noRefs: true, // Disable anchors/aliases
      })
      .trim() +
    '\n---'
  );
}
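
The description sanitization and the 4-chars-per-token estimate used in `generateFrontmatter` can be sketched standalone (`sanitizeDescription` and `estimateTokens` are illustrative helper names, not part of the library):

```javascript
// Standalone sketch of the description sanitization and the
// 4-chars-per-token estimate used when building frontmatter.
function sanitizeDescription(description) {
  return (description || '').replace(/\s+/g, ' ').trim().substring(0, 500);
}

function estimateTokens(contentLength) {
  return Math.ceil(contentLength / 4); // rough heuristic: 4 chars ≈ 1 token
}

console.log(sanitizeDescription('  Query data\n  with SQL.  ')); // Query data with SQL.
console.log(estimateTokens(10000)); // 2500
```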

/**
 * Generate enhanced frontmatter for a section aggregation
 * @param {Object} metadata - Object with title, description, and content
 * @param {string} urlPath - URL path for the section
 * @param {Array} childPages - Array of child page objects with url, title, and content
 * @param {string} baseUrl - Base URL for full URL construction
 * @returns {string} YAML frontmatter as a string
 */
function generateSectionFrontmatter(metadata, urlPath, childPages, baseUrl = '') {
  const product = detectProduct(urlPath);

  // Sanitize the description (remove newlines, truncate to a reasonable length)
  let description = metadata.description || '';
  description = description
    .replace(/\s+/g, ' ') // Replace all whitespace (including newlines) with a single space
    .trim()
    .substring(0, 500); // Truncate to 500 characters max

  // Add a token estimate (rough heuristic: 4 chars per token),
  // counting the parent page plus all child pages
  const contentLength = metadata.content?.length || 0;
  const childContentLength = childPages.reduce(
    (sum, child) => sum + (child.content?.length || 0),
    0
  );
  const totalLength = contentLength + childContentLength;
  const estimatedTokens = Math.ceil(totalLength / 4);

  // Build the full URL (baseUrl + path)
  const fullUrl = baseUrl ? `${baseUrl.replace(/\/$/, '')}${urlPath}` : urlPath;
  const normalizedBaseUrl = baseUrl ? baseUrl.replace(/\/$/, '') : '';

  // Build the frontmatter object (serialized to YAML below)
  const frontmatterObj = {
    title: metadata.title,
    description: description,
    url: fullUrl,
    type: 'section',
    pages: childPages.length,
    estimated_tokens: estimatedTokens,
  };

  if (product) {
    frontmatterObj.product = product.name;
    if (product.version) {
      frontmatterObj.version = product.version;
    }
  }

  // List child pages with full URLs
  if (childPages.length > 0) {
    frontmatterObj.child_pages = childPages.map((child) => ({
      url: normalizedBaseUrl ? `${normalizedBaseUrl}${child.url}` : child.url,
      title: child.title,
    }));
  }

  // Serialize to YAML (handles special characters properly)
  return (
    '---\n' +
    yaml
      .dump(frontmatterObj, {
        lineWidth: -1, // Disable line wrapping
        noRefs: true, // Disable anchors/aliases
      })
      .trim() +
    '\n---'
  );
}
|
||||
|
||||
/**
 * Convert HTML content to Markdown (single page)
 * @param {string} htmlContent - Raw HTML content
 * @param {string} urlPath - URL path for the page (for frontmatter)
 * @returns {Promise<string|null>} Markdown content with frontmatter, or null if conversion fails
 */
async function convertToMarkdown(htmlContent, urlPath) {
  await ensureProductDataInitialized();

  // Detect the base URL for the environment
  const baseUrl = detectBaseUrl();
  if (DEBUG) {
    console.log(`[DEBUG] Base URL detected: ${baseUrl} (NODE_ENV=${process.env.NODE_ENV}, HUGO_ENV=${process.env.HUGO_ENV}, BASE_URL=${process.env.BASE_URL})`);
  }

  // Use the Rust converter if available (10× faster)
  if (USE_RUST && rustConverter) {
    try {
      return rustConverter.convertToMarkdown(htmlContent, urlPath, baseUrl);
    } catch (err) {
      console.warn(`Rust conversion failed for ${urlPath}, falling back to JavaScript:`, err.message);
      // Fall through to the JavaScript implementation
    }
  }

  // JavaScript fallback implementation
  const turndownService = createTurndownService();
  const metadata = extractArticleContent(htmlContent, urlPath);

  if (!metadata) {
    return null;
  }

  // Convert HTML to markdown
  let markdown = turndownService.turndown(metadata.content);

  // Clean up excessive newlines and separator artifacts
  markdown = markdown
    .replace(/\n{3,}/g, '\n\n')
    .replace(/\* \* \*\s*\n\s*\* \* \*/g, '')
    .replace(/\* \* \*\s*$/g, '')
    .trim();

  // Generate frontmatter with the full URL
  const frontmatter = generateFrontmatter(metadata, urlPath, baseUrl);

  return `${frontmatter}\n\n${markdown}\n`;
}
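The post-conversion cleanup chain used above can be extracted as a pure function for illustration (same regexes, illustrative name):

```javascript
// Cleanup applied after Turndown conversion: collapse runs of 3+ newlines,
// drop doubled "* * *" separator artifacts, and drop a trailing separator.
function cleanupMarkdown(markdown) {
  return markdown
    .replace(/\n{3,}/g, '\n\n')
    .replace(/\* \* \*\s*\n\s*\* \* \*/g, '')
    .replace(/\* \* \*\s*$/g, '')
    .trim();
}

console.log(cleanupMarkdown('Intro\n\n\n\nBody\n\n* * *\n')); // "Intro\n\nBody"
```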
/**
 * Convert section HTML with child pages to aggregated Markdown
 * @param {string} sectionHtml - HTML content of the section index page
 * @param {string} sectionUrlPath - URL path for the section
 * @param {Array} childHtmls - Array of objects with {html, url} for each child page
 * @returns {Promise<string|null>} Aggregated markdown content, or null if conversion fails
 */
async function convertSectionToMarkdown(
  sectionHtml,
  sectionUrlPath,
  childHtmls
) {
  await ensureProductDataInitialized();

  // Detect the base URL for the environment
  const baseUrl = detectBaseUrl();

  // Use the Rust converter if available (10× faster)
  if (USE_RUST && rustConverter) {
    try {
      return rustConverter.convertSectionToMarkdown(sectionHtml, sectionUrlPath, childHtmls, baseUrl);
    } catch (err) {
      console.warn(`Rust section conversion failed for ${sectionUrlPath}, falling back to JavaScript:`, err.message);
      // Fall through to the JavaScript implementation
    }
  }

  // JavaScript fallback implementation
  const turndownService = createTurndownService();

  // Extract section metadata and content
  const sectionMetadata = extractArticleContent(sectionHtml, sectionUrlPath);
  if (!sectionMetadata) {
    return null;
  }

  // Convert section content to markdown
  let sectionMarkdown = turndownService.turndown(sectionMetadata.content);
  sectionMarkdown = sectionMarkdown
    .replace(/\n{3,}/g, '\n\n')
    .replace(/\* \* \*\s*\n\s*\* \* \*/g, '')
    .replace(/\* \* \*\s*$/g, '')
    .trim();

  // Process child pages
  const childContents = [];
  const childPageInfo = [];

  for (const { html, url } of childHtmls) {
    const childMetadata = extractArticleContent(html, url);
    if (childMetadata) {
      let childMarkdown = turndownService.turndown(childMetadata.content);
      childMarkdown = childMarkdown
        .replace(/\n{3,}/g, '\n\n')
        .replace(/\* \* \*\s*\n\s*\* \* \*/g, '')
        .replace(/\* \* \*\s*$/g, '')
        .trim();

      // Remove the first h1 heading (page title) to avoid redundancy,
      // since we're adding it as an h2 heading
      childMarkdown = childMarkdown.replace(/^#\s+.+?\n+/, '');

      // Add the child page title as a heading
      childContents.push(`## ${childMetadata.title}\n\n${childMarkdown}`);

      // Track child page info for frontmatter
      childPageInfo.push({
        url: url,
        title: childMetadata.title,
        content: childMarkdown,
      });
    }
  }

  // Generate section frontmatter with child page info and full URLs
  const frontmatter = generateSectionFrontmatter(
    { ...sectionMetadata, content: sectionMarkdown },
    sectionUrlPath,
    childPageInfo,
    baseUrl
  );

  // Combine the section content with its child pages
  const allContent = [sectionMarkdown, ...childContents].join('\n\n---\n\n');

  return `${frontmatter}\n\n${allContent}\n`;
}
// Export all functions for CommonJS
module.exports = {
  detectProduct,
  createTurndownService,
  extractArticleContent,
  generateFrontmatter,
  generateSectionFrontmatter,
  convertToMarkdown,
  convertSectionToMarkdown,
};
@@ -0,0 +1,7 @@
target/
node_modules/
*.node
.cargo/
Cargo.lock
*.d.ts
index.js
@@ -0,0 +1,38 @@
[package]
name = "rust-markdown-converter"
version = "0.1.0"
edition = "2021"

[lib]
crate-type = ["cdylib"]

[dependencies]
# NAPI for Node.js bindings
napi = { version = "2.16", features = ["serde-json"] }
napi-derive = "2.16"

# HTML parsing and conversion
html2md = "0.2"
scraper = "0.20"

# YAML frontmatter
serde_yaml = "0.9"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"

# Regex for text processing
regex = "1.11"
lazy_static = "1.5"

# Date/time for timestamps
chrono = "0.4"

[build-dependencies]
napi-build = "2.1"
serde_yaml = "0.9"

[profile.release]
lto = true
codegen-units = 1
opt-level = 3
strip = true
@@ -0,0 +1,119 @@
extern crate napi_build;

use std::env;
use std::fs;
use std::path::Path;

fn main() {
    napi_build::setup();

    // Generate product mappings from products.yml
    generate_product_mappings();
}

fn generate_product_mappings() {
    // Tell Cargo to rerun this build script if products.yml changes
    println!("cargo:rerun-if-changed=../../data/products.yml");

    let out_dir = env::var("OUT_DIR").unwrap();
    let dest_path = Path::new(&out_dir).join("product_mappings.rs");

    // Read products.yml
    let products_path = "../../data/products.yml";
    let yaml_content = fs::read_to_string(products_path)
        .expect("Failed to read products.yml");

    // Parse YAML using serde_yaml
    let products: serde_yaml::Value = serde_yaml::from_str(&yaml_content)
        .expect("Failed to parse products.yml");

    // Generate Rust code for the URL pattern map
    let mut mappings = Vec::new();

    if let serde_yaml::Value::Mapping(products_map) = products {
        for (key, value) in products_map.iter() {
            if let (serde_yaml::Value::String(_product_key), serde_yaml::Value::Mapping(product_data)) = (key, value) {
                // Extract the name
                let name = product_data.get(&serde_yaml::Value::String("name".to_string()))
                    .and_then(|v| v.as_str())
                    .unwrap_or("");

                // Extract the namespace
                let namespace = product_data.get(&serde_yaml::Value::String("namespace".to_string()))
                    .and_then(|v| v.as_str())
                    .unwrap_or("");

                // Extract the versions array (if it exists)
                let versions = product_data.get(&serde_yaml::Value::String("versions".to_string()))
                    .and_then(|v| v.as_sequence())
                    .map(|seq| {
                        seq.iter()
                            .filter_map(|v| v.as_str())
                            .collect::<Vec<_>>()
                    })
                    .unwrap_or_default();

                // Build URL patterns from namespace and versions data.
                // Convert a namespace like "influxdb3_explorer" to the URL path "/influxdb3/explorer/"
                let url_base = if namespace.contains('_') {
                    let parts: Vec<&str> = namespace.split('_').collect();
                    format!("/{}/", parts.join("/"))
                } else {
                    format!("/{}/", namespace)
                };

                if !versions.is_empty() {
                    // For products with versions, create a mapping for each version
                    for version in &versions {
                        // Build the URL: base + version (without a trailing slash for now; added later if needed)
                        let url_pattern = url_base.trim_end_matches('/').to_string() + "/" + version;

                        // Try to get a version-specific name (e.g., name__v2, name__cloud)
                        let version_key = format!("name__{}", version.replace("-", ""));
                        let version_name = product_data.get(&serde_yaml::Value::String(version_key))
                            .and_then(|v| v.as_str())
                            .unwrap_or(name);

                        mappings.push(format!(
                            "        m.insert(\"{}\", (\"{}\", \"{}\"));",
                            url_pattern, version_name, version
                        ));
                    }
                } else if !namespace.is_empty() {
                    // For products without versions, use the namespace directly.
                    // Extract the version identifier from the latest field.
                    let latest = product_data.get(&serde_yaml::Value::String("latest".to_string()))
                        .and_then(|v| v.as_str())
                        .unwrap_or("");

                    // Use the base path without a trailing slash for pattern matching
                    let url_pattern = url_base.trim_end_matches('/').to_string();

                    mappings.push(format!(
                        "        m.insert(\"{}/\", (\"{}\", \"{}\"));",
                        url_pattern, name, latest
                    ));
                }
            }
        }
    }

    // Generate the Rust code
    let generated_code = format!(
        r#"// Auto-generated from data/products.yml - DO NOT EDIT MANUALLY
// Note: HashMap and lazy_static are already imported in lib.rs

lazy_static! {{
    pub static ref URL_PATTERN_MAP: HashMap<&'static str, (&'static str, &'static str)> = {{
        let mut m = HashMap::new();
{}
        m
    }};
}}
"#,
        mappings.join("\n")
    );

    fs::write(&dest_path, generated_code)
        .expect("Failed to write product_mappings.rs");
}
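The namespace-to-URL-path rule the build script applies can be sketched in JavaScript (illustrative function name):

```javascript
// Mirror of the build.rs rule: a namespace like "influxdb3_explorer"
// maps to "/influxdb3/explorer/"; namespaces without underscores map
// to "/<namespace>/".
function namespaceToUrlBase(namespace) {
  return namespace.includes('_')
    ? `/${namespace.split('_').join('/')}/`
    : `/${namespace}/`;
}

console.log(namespaceToUrlBase('influxdb3_explorer')); // "/influxdb3/explorer/"
console.log(namespaceToUrlBase('telegraf'));           // "/telegraf/"
```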
@@ -0,0 +1,31 @@
{
  "name": "@influxdata/rust-markdown-converter",
  "version": "0.1.0",
  "description": "High-performance HTML to Markdown converter for InfluxData documentation (Rust + napi-rs)",
  "main": "index.js",
  "types": "index.d.ts",
  "napi": {
    "binaryName": "rust-markdown-converter",
    "targets": [
      "aarch64-apple-darwin",
      "x86_64-unknown-linux-gnu",
      "aarch64-unknown-linux-gnu"
    ]
  },
  "scripts": {
    "artifacts": "napi artifacts",
    "build": "napi build --platform --release",
    "build:debug": "napi build --platform",
    "prepublishOnly": "napi prepublish -t npm",
    "test": "cargo test",
    "universal": "napi universal",
    "version": "napi version"
  },
  "devDependencies": {
    "@napi-rs/cli": "^3.4.1"
  },
  "engines": {
    "node": ">= 10"
  },
  "license": "MIT"
}
@@ -0,0 +1,696 @@
/*!
 * Rust Markdown Converter Library
 *
 * High-performance HTML to Markdown converter for InfluxData documentation.
 * This library provides Node.js bindings via napi-rs for seamless integration.
 *
 * Features:
 * - Custom Turndown-like conversion rules
 * - Product detection from URL paths
 * - GitHub-style callout support
 * - UI element removal
 * - YAML frontmatter generation
 */

#[macro_use]
extern crate napi_derive;

use napi::Result;
use scraper::{Html, Selector};
use serde::{Deserialize, Serialize};
use regex::Regex;
use lazy_static::lazy_static;
use std::collections::HashMap;

// ============================================================================
// Product Detection
// ============================================================================

#[napi(object)]
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ProductInfo {
    pub name: String,
    pub version: String,
}

// Include the auto-generated product mappings from data/products.yml.
// This file is generated at compile time by build.rs.
include!(concat!(env!("OUT_DIR"), "/product_mappings.rs"));

fn detect_product(url_path: &str) -> Option<ProductInfo> {
    for (pattern, (name, version)) in URL_PATTERN_MAP.iter() {
        if url_path.contains(pattern) {
            return Some(ProductInfo {
                name: name.to_string(),
                version: version.to_string(),
            });
        }
    }
    None
}
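The lookup performed by `detect_product` amounts to "first pattern contained in the path wins". A JavaScript sketch of the same idea; the two entries in the map below are illustrative stand-ins, not the generated `URL_PATTERN_MAP`:

```javascript
// Hypothetical pattern table for illustration only.
const URL_PATTERNS = new Map([
  ['/influxdb3/core', { name: 'InfluxDB 3 Core', version: 'core' }],
  ['/telegraf/', { name: 'Telegraf', version: 'v1' }],
]);

// Return the product info for the first pattern contained in the path,
// or null when nothing matches.
function detectProduct(urlPath) {
  for (const [pattern, info] of URL_PATTERNS) {
    if (urlPath.includes(pattern)) return info;
  }
  return null;
}

console.log(detectProduct('/influxdb3/core/get-started/').name); // "InfluxDB 3 Core"
```

Note that in the Rust version the iteration order of a `HashMap` is unspecified, so overlapping patterns could match nondeterministically; the JavaScript `Map` sketch iterates in insertion order.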
// ============================================================================
// HTML Processing
// ============================================================================

/// Remove unwanted UI elements from HTML
fn clean_html(html: &str) -> String {
    let document = Html::parse_document(html);
    let mut cleaned = html.to_string();

    // Configurable list of CSS selectors for elements to remove from article content.
    // Add new selectors here to remove unwanted UI elements, forms, navigation, etc.
    let remove_selectors = vec![
        // Navigation and structure
        "nav",
        "header",
        "footer",

        // Scripts and styles
        "script",
        "style",
        "noscript",
        "iframe",

        // UI widgets and controls
        ".format-selector",
        ".format-selector__button",
        "button[aria-label*='Copy']",
        "hr",

        // Feedback and support sections (inside article content)
        ".helpful", // "Was this page helpful?" form
        "div.feedback.block", // Block-level feedback sections (combined class selector)
        ".feedback", // General feedback sections (must come after the more specific .feedback.block)
        ".page-feedback",
        "#page-feedback",
        ".feedback-widget",
        ".support", // Support section at the bottom of pages
    ];

    for selector_str in remove_selectors {
        if let Ok(selector) = Selector::parse(selector_str) {
            for element in document.select(&selector) {
                // Get the full HTML of the element to remove
                let element_html = element.html();
                cleaned = cleaned.replace(&element_html, "");
            }
        }
    }

    cleaned
}

/// Replace icon spans with text checkmarks.
/// Converts <span class="inline cf-icon Checkmark_New large"></span> to ✓
fn replace_icon_spans(html: &str) -> String {
    let document = Html::parse_document(html);
    let mut result = html.to_string();

    // Select icon spans (specifically checkmark icons used in tables).
    // The selector matches any span that has both cf-icon and Checkmark_New classes.
    if let Ok(selector) = Selector::parse("span[class*='cf-icon'][class*='Checkmark_New']") {
        for element in document.select(&selector) {
            // Build the full element HTML to replace (an empty span with classes)
            let class_attr = element.value().attr("class").unwrap_or("");
            let full_html = format!("<span class=\"{}\"></span>", class_attr);

            // Replace with a checkmark character
            result = result.replace(&full_html, "✓");
        }
    }

    result
}
/// Extract article content from HTML
fn extract_article_content(html: &str) -> Option<(String, String, String)> {
    let document = Html::parse_document(html);

    // Find the main article content
    let article_selector = Selector::parse("article.article--content").ok()?;
    let article = document.select(&article_selector).next()?;

    // Extract the title
    let title = if let Ok(h1_sel) = Selector::parse("h1") {
        document
            .select(&h1_sel)
            .next()
            .map(|el| el.text().collect::<Vec<_>>().join(" "))
            .or_else(|| {
                if let Ok(title_sel) = Selector::parse("title") {
                    document
                        .select(&title_sel)
                        .next()
                        .map(|el| el.text().collect::<Vec<_>>().join(" "))
                } else {
                    None
                }
            })
            .unwrap_or_else(|| "Untitled".to_string())
    } else {
        "Untitled".to_string()
    };

    // Extract the description from meta tags
    let description = if let Ok(meta_sel) = Selector::parse("meta[name='description']") {
        document
            .select(&meta_sel)
            .next()
            .and_then(|el| el.value().attr("content"))
            .or_else(|| {
                if let Ok(og_sel) = Selector::parse("meta[property='og:description']") {
                    document
                        .select(&og_sel)
                        .next()
                        .and_then(|el| el.value().attr("content"))
                } else {
                    None
                }
            })
            .unwrap_or("")
            .to_string()
    } else {
        String::new()
    };

    // Get the cleaned article HTML
    let content = clean_html(&article.html());

    Some((title, description, content))
}
// ============================================================================
// Markdown Conversion
// ============================================================================

lazy_static! {
    // Regex patterns for post-processing
    static ref EXCESSIVE_NEWLINES: Regex = Regex::new(r"\n{3,}").unwrap();
    static ref SEPARATOR_ARTIFACTS: Regex = Regex::new(r"\* \* \*\s*\n\s*\* \* \*").unwrap();
    static ref TRAILING_SEPARATOR: Regex = Regex::new(r"\* \* \*\s*$").unwrap();
    static ref CODE_FENCE: Regex = Regex::new(r"```(\w+)?\n").unwrap();
}

/// Convert HTML blockquote callouts to GitHub-style callouts
fn convert_callouts(markdown: &str, html: &str) -> String {
    let document = Html::parse_document(html);
    let mut result = markdown.to_string();

    // Process both <blockquote> elements and <div class="block ..."> callouts
    let selectors = vec![
        "blockquote.note",
        "blockquote.warning",
        "blockquote.important",
        "blockquote.tip",
        "blockquote.caution",
        "div.block.note",
        "div.block.warning",
        "div.block.important",
        "div.block.tip",
        "div.block.caution",
    ];

    for selector_str in selectors {
        if let Ok(callout_sel) = Selector::parse(selector_str) {
            // Determine the callout type from the selector
            let callout_type = if selector_str.ends_with("note") {
                "note"
            } else if selector_str.ends_with("warning") {
                "warning"
            } else if selector_str.ends_with("caution") {
                "caution"
            } else if selector_str.ends_with("important") {
                "important"
            } else if selector_str.ends_with("tip") {
                "tip"
            } else {
                "note"
            };

            for element in document.select(&callout_sel) {
                let label = match callout_type {
                    "note" => "Note",
                    "warning" => "Warning",
                    "caution" => "Caution",
                    "important" => "Important",
                    "tip" => "Tip",
                    _ => "Note",
                };

                // Convert the callout content to markdown, preserving structure
                let callout_html = element.html();
                let callout_markdown = html2md::parse_html(&callout_html);

                if !callout_markdown.trim().is_empty() && callout_markdown.len() > 10 {
                    // Build a GitHub-style callout
                    let mut callout_lines = vec![format!("> [!{}]", label)];

                    // Process the markdown line by line, preserving headings and structure
                    for line in callout_markdown.lines() {
                        let trimmed = line.trim();
                        if !trimmed.is_empty() {
                            // Preserve markdown headings (#### becomes > ####)
                            callout_lines.push(format!("> {}", trimmed));
                        }
                    }

                    // Check for modal trigger links and add annotations
                    if let Ok(modal_sel) = Selector::parse("a.influxdb-detector-trigger, a[onclick*='toggleModal']") {
                        if element.select(&modal_sel).next().is_some() {
                            // Add an annotation about the interactive modal
                            callout_lines.push("> *(Interactive feature in HTML: Opens version detector modal)*".to_string());
                        }
                    }

                    let callout = callout_lines.join("\n") + "\n";

                    // Try to find and replace the callout in the markdown.
                    // Extract the first line of content (likely a heading or distinctive text).
                    let first_content_line = callout_markdown.lines()
                        .map(|l| l.trim())
                        .find(|l| !l.is_empty() && l.len() > 3)
                        .unwrap_or("");

                    if !first_content_line.is_empty() {
                        // Try to find this content in the markdown
                        if let Some(idx) = result.find(first_content_line) {
                            // Find the end of this section (next heading or double newline)
                            let after_start = &result[idx..];
                            if let Some(section_end) = after_start.find("\n\n") {
                                let end_idx = idx + section_end;
                                result.replace_range(idx..end_idx, &callout);
                            }
                        }
                    }
                }
            }
        }
    }

    result
}
/// Convert HTML tables to Markdown format
fn convert_tables(markdown: &str, html: &str) -> String {
    let document = Html::parse_document(html);
    let mut result = markdown.to_string();

    if let Ok(table_sel) = Selector::parse("table") {
        for table in document.select(&table_sel) {
            // Get the headers
            let mut headers = Vec::new();
            if let Ok(th_sel) = Selector::parse("thead th, thead td") {
                for th in table.select(&th_sel) {
                    headers.push(th.text().collect::<Vec<_>>().join(" ").trim().to_string());
                }
            }

            // If there's no thead, try the first tr
            if headers.is_empty() {
                if let Ok(tr_sel) = Selector::parse("tr") {
                    if let Some(first_row) = table.select(&tr_sel).next() {
                        if let Ok(cell_sel) = Selector::parse("th, td") {
                            for cell in first_row.select(&cell_sel) {
                                headers.push(cell.text().collect::<Vec<_>>().join(" ").trim().to_string());
                            }
                        }
                    }
                }
            }

            if headers.is_empty() {
                continue;
            }

            // Build the separator row
            let separator = headers.iter().map(|_| "---").collect::<Vec<_>>().join(" | ");

            // Get the data rows
            let mut data_rows = Vec::new();
            if let Ok(tr_sel) = Selector::parse("tbody tr, tr") {
                for (idx, row) in table.select(&tr_sel).enumerate() {
                    // Skip the first row if it was used for headers
                    if idx == 0 && table.select(&Selector::parse("thead").unwrap()).next().is_none() {
                        continue;
                    }

                    let mut cells = Vec::new();
                    if let Ok(cell_sel) = Selector::parse("td, th") {
                        for cell in row.select(&cell_sel) {
                            cells.push(cell.text().collect::<Vec<_>>().join(" ").trim().replace('\n', " "));
                        }
                    }

                    if !cells.is_empty() {
                        data_rows.push(format!("| {} |", cells.join(" | ")));
                    }
                }
            }

            // Build the markdown table
            let md_table = format!(
                "\n| {} |\n| {} |\n{}\n\n",
                headers.join(" | "),
                separator,
                data_rows.join("\n")
            );

            // This is an approximate replacement
            result.push_str(&md_table);
        }
    }

    result
}
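The Markdown table shape that `convert_tables` emits can be sketched in JavaScript (illustrative function name, same format string as the Rust version):

```javascript
// Build a pipe-delimited Markdown table: header row, "---" separator row,
// then one "| a | b |" row per data row.
function buildMarkdownTable(headers, rows) {
  const separator = headers.map(() => '---').join(' | ');
  const dataRows = rows.map((cells) => `| ${cells.join(' | ')} |`);
  return `\n| ${headers.join(' | ')} |\n| ${separator} |\n${dataRows.join('\n')}\n\n`;
}

console.log(buildMarkdownTable(['Name', 'Type'], [['host', 'string']]));
```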
/// Add headings to delimit tabbed content in markdown.
/// Finds patterns like [Go](#)[Node.js](#)[Python](#) and replaces them with a heading.
fn add_tab_delimiters_to_markdown(markdown: &str) -> String {
    use regex::Regex;

    // Pattern to match 2+ consecutive tab links
    let tabs_pattern = Regex::new(r"(\[[^\]]+\]\(#\)){2,}").unwrap();
    let first_tab_re = Regex::new(r"\[([^\]]+)\]\(#\)").unwrap();

    tabs_pattern.replace_all(markdown, |caps: &regex::Captures| {
        let full_match = &caps[0];

        // Extract the first tab name
        if let Some(first_cap) = first_tab_re.captures(full_match) {
            format!("#### {} ####", &first_cap[1])
        } else {
            full_match.to_string()
        }
    }).to_string()
}
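The same tab-link rewrite expressed in JavaScript, with the same two regexes (illustrative function name):

```javascript
// Collapse runs of two or more [Label](#) tab links into a single
// "#### Label ####" heading named after the first tab.
function addTabDelimiters(markdown) {
  return markdown.replace(/(\[[^\]]+\]\(#\)){2,}/g, (match) => {
    const first = match.match(/\[([^\]]+)\]\(#\)/);
    return first ? `#### ${first[1]} ####` : match;
  });
}

console.log(addTabDelimiters('[Go](#)[Node.js](#)[Python](#)')); // "#### Go ####"
```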
/// Post-process markdown to clean up formatting
fn postprocess_markdown(markdown: &str, html: &str, remove_h1: bool) -> String {
    let mut result = markdown.to_string();

    if remove_h1 {
        // Remove the first h1 heading (the title is already in the frontmatter).
        // Match both formats:
        // 1. ATX style: # Title\n
        // 2. Setext style: Title\n=====\n
        let h1_atx_pattern = Regex::new(r"^#\s+.*?\n+").unwrap();
        let h1_setext_pattern = Regex::new(r"^.+?\n=+\s*\n+").unwrap();

        // Try ATX style first
        if h1_atx_pattern.is_match(&result) {
            result = h1_atx_pattern.replace(&result, "").to_string();
        } else {
            // Try Setext style
            result = h1_setext_pattern.replace(&result, "").to_string();
        }
    }

    // Convert callouts
    result = convert_callouts(&result, html);

    // Convert tables (html2md might not handle them well)
    result = convert_tables(&result, html);

    // Add tab delimiters for tabbed content
    result = add_tab_delimiters_to_markdown(&result);

    // Remove UI element text that shouldn't be in the markdown
    result = result.replace("Copy section", "");
    result = result.replace("Copy page", "");
    result = result.replace(" Copy to clipboard", "");

    // Remove HTML comments (<!--SOURCE-->, <!--pytest-codeblocks:...-->, etc.)
    let comment_pattern = Regex::new(r"<!--.*?-->").unwrap();
    result = comment_pattern.replace_all(&result, "").to_string();

    // Remove feedback and support sections at the bottom.
    // Match "Was this page helpful?" to the end of the document.
    let feedback_section = Regex::new(r"(?s)Was this page helpful\?.*$").unwrap();
    result = feedback_section.replace(&result, "").to_string();

    // Also remove the "Support and feedback" heading if it somehow remains
    let support_section = Regex::new(r"(?s)#{2,6}\s+Support and feedback\s*\n.*$").unwrap();
    result = support_section.replace(&result, "").to_string();

    // Clean up excessive newlines
    result = EXCESSIVE_NEWLINES.replace_all(&result, "\n\n").to_string();

    // Remove separator artifacts
    result = SEPARATOR_ARTIFACTS.replace_all(&result, "").to_string();
    result = TRAILING_SEPARATOR.replace_all(&result, "").to_string();

    result.trim().to_string()
}
/// Fix code block language identifiers.
/// html2md doesn't preserve language classes, so we need to extract them from the HTML
/// and add them to the markdown code fences.
fn fix_code_block_languages(markdown: &str, html: &str) -> String {
    let document = Html::parse_document(html);
    let mut result = markdown.to_string();

    // Find all code blocks with language classes or data-lang attributes
    if let Ok(code_selector) = Selector::parse("code[class*='language-'], code[data-lang]") {
        for code_element in document.select(&code_selector) {
            let mut lang: Option<String> = None;

            // Try to extract the language from the class (e.g., "language-bash" -> "bash")
            if let Some(class_attr) = code_element.value().attr("class") {
                for class in class_attr.split_whitespace() {
                    if class.starts_with("language-") {
                        lang = Some(class[9..].to_string()); // Skip the "language-" prefix
                        break;
                    }
                }
            }

            // Fall back to the data-lang attribute if the class didn't have a language
            if lang.is_none() {
                if let Some(data_lang) = code_element.value().attr("data-lang") {
                    lang = Some(data_lang.to_string());
                }
            }

            // If we found a language identifier, add it to the markdown fence
            if let Some(lang_str) = lang {
                // Get the code content
                let code_text = code_element.text().collect::<Vec<_>>().join("");
                let code_text = code_text.trim();

                // Find the code block in the markdown (without a language identifier).
                // Look for the ```\n<code>\n``` pattern.
                let fence_pattern = format!("```\n{}\n```", code_text);
                let fence_with_lang = format!("```{}\n{}\n```", lang_str, code_text);

                // Replace the first occurrence only
                if result.contains(&fence_pattern) {
                    result = result.replacen(&fence_pattern, &fence_with_lang, 1);
                }
            }
        }
    }

    result
}
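The fence-relabeling step can be sketched in JavaScript (illustrative function name; the fence string is built dynamically to avoid nesting literal backtick fences):

```javascript
// Given the language recovered from the HTML <code> element, relabel the
// first unlabeled fence wrapping the same code text. Mirrors the
// replacen(..., 1) single-replacement behavior of the Rust version.
const FENCE = '`'.repeat(3);

function fixCodeFence(markdown, codeText, lang) {
  const bare = `${FENCE}\n${codeText}\n${FENCE}`;
  const labeled = `${FENCE}${lang}\n${codeText}\n${FENCE}`;
  // String#replace with a string pattern replaces only the first occurrence
  return markdown.includes(bare) ? markdown.replace(bare, labeled) : markdown;
}

console.log(fixCodeFence(`${FENCE}\ninflux query\n${FENCE}`, 'influx query', 'bash'));
```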
/// Convert HTML to Markdown
|
||||
fn html_to_markdown(html: &str, remove_h1: bool) -> String {
|
||||
// Pre-process HTML
|
||||
let html = replace_icon_spans(html);
|
||||
// Note: tab delimiters are added in post-processing on markdown, not HTML preprocessing
|
||||
|
||||
// Use html2md for basic conversion
|
||||
let markdown = html2md::parse_html(&html);
|
||||
|
||||
// Apply post-processing
|
||||
let markdown = postprocess_markdown(&markdown, &html, remove_h1);
|
||||
|
||||
// Fix code block language identifiers
|
||||
fix_code_block_languages(&markdown, &html)
|
||||
}

// ============================================================================
// Frontmatter Generation
// ============================================================================

#[derive(Debug, Serialize)]
struct Frontmatter {
    title: String,
    description: String,
    url: String,
    #[serde(skip_serializing_if = "Option::is_none")]
    product: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    product_version: Option<String>,
    date: String,
    lastmod: String,
    estimated_tokens: usize,
}

fn generate_frontmatter(
    title: &str,
    description: &str,
    url_path: &str,
    content_length: usize,
    base_url: &str,
) -> String {
    let product = detect_product(url_path);

    // Sanitize the description: drop control characters (except newlines),
    // collapse whitespace, and cap at 500 characters
    let description = description
        .chars()
        .filter(|c| !c.is_control() || *c == '\n')
        .collect::<String>()
        .split_whitespace()
        .collect::<Vec<_>>()
        .join(" ")
        .chars()
        .take(500)
        .collect::<String>();

    // Estimate tokens (4 chars per token)
    let estimated_tokens = (content_length + 3) / 4;

    // Generate the current timestamp in ISO 8601 format
    let now = chrono::Utc::now().to_rfc3339_opts(chrono::SecondsFormat::Secs, true);

    // Convert the relative URL to a full URL using the provided base URL
    let full_url = format!("{}{}", base_url, url_path);

    let frontmatter = Frontmatter {
        title: title.to_string(),
        description,
        url: full_url,
        product: product.as_ref().map(|p| p.name.clone()),
        product_version: product.as_ref().map(|p| p.version.clone()),
        date: now.clone(),
        lastmod: now,
        estimated_tokens,
    };

    match serde_yaml::to_string(&frontmatter) {
        Ok(yaml) => format!("---\n{}---", yaml),
        Err(_) => "---\n---".to_string(),
    }
}
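The sanitization chain and the 4-chars-per-token heuristic above use only the standard library, so their behavior can be exercised in isolation. A small sketch (function names are illustrative, not part of this module):

```rust
/// Drop control characters (except newlines), collapse whitespace, and cap at
/// 500 characters -- mirroring the description-sanitization chain above.
fn sanitize_description(description: &str) -> String {
    description
        .chars()
        .filter(|c| !c.is_control() || *c == '\n')
        .collect::<String>()
        .split_whitespace()
        .collect::<Vec<_>>()
        .join(" ")
        .chars()
        .take(500)
        .collect::<String>()
}

/// Ceiling division by 4: the CHARS_PER_TOKEN estimate used in the frontmatter.
fn estimate_tokens(content_length: usize) -> usize {
    (content_length + 3) / 4
}

fn main() {
    // Tabs and BEL are control characters, so they are stripped; the newline
    // survives the filter but is then collapsed by split_whitespace.
    assert_eq!(
        sanitize_description("A  multi-line\n\tdescription\u{0007}"),
        "A multi-line description"
    );
    assert_eq!(estimate_tokens(0), 0);
    assert_eq!(estimate_tokens(9), 3);
}
```

Note that `+ 3` makes the division round up, so any non-empty content estimates at least one token.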

// ============================================================================
// Node.js API (napi-rs bindings)
// ============================================================================

/// Convert HTML to Markdown with frontmatter
///
/// # Arguments
/// * `html_content` - Raw HTML content
/// * `url_path` - URL path for the page (used for frontmatter generation)
/// * `base_url` - Base URL for the site (e.g., "http://localhost:1313" or "https://docs.influxdata.com")
///
/// # Returns
/// Markdown string with YAML frontmatter, or null if conversion fails
#[napi]
pub fn convert_to_markdown(html_content: String, url_path: String, base_url: String) -> Result<Option<String>> {
    match extract_article_content(&html_content) {
        Some((title, description, content)) => {
            // For single pages, remove the h1 since the title is in the frontmatter
            let markdown = html_to_markdown(&content, true);
            let frontmatter = generate_frontmatter(&title, &description, &url_path, markdown.len(), &base_url);

            // Product info is already in the frontmatter; no need to duplicate it in the content
            Ok(Some(format!("{}\n\n{}\n", frontmatter, markdown)))
        }
        None => Ok(None),
    }
}

/// Convert section HTML with child pages to aggregated Markdown
///
/// # Arguments
/// * `section_html` - HTML content of the section index page
/// * `section_url_path` - URL path for the section
/// * `child_htmls` - Array of child page objects with `{html, url}` structure
/// * `base_url` - Base URL for the site (e.g., "http://localhost:1313" or "https://docs.influxdata.com")
///
/// # Returns
/// Aggregated markdown content, or null if conversion fails
#[napi(object)]
pub struct ChildPageInput {
    pub html: String,
    pub url: String,
}

#[napi]
pub fn convert_section_to_markdown(
    section_html: String,
    section_url_path: String,
    child_htmls: Vec<ChildPageInput>,
    base_url: String,
) -> Result<Option<String>> {
    // Extract section metadata
    let (section_title, section_description, section_content) = match extract_article_content(&section_html) {
        Some(data) => data,
        None => return Ok(None),
    };

    // For section pages, keep the h1 title in the content
    let section_markdown = html_to_markdown(&section_content, false);

    // Process child pages
    let mut child_contents = Vec::new();
    let mut total_length = section_markdown.len();

    for child in child_htmls {
        if let Some((title, _desc, content)) = extract_article_content(&child.html) {
            // For child pages, remove the h1 since each child is added under an h2
            let child_markdown = html_to_markdown(&content, true);

            // Add the child as an h2 heading followed by its URL
            let full_child_url = format!("{}{}", base_url, child.url);
            child_contents.push(format!("## {}\n\n**URL**: {}\n\n{}", title, full_child_url, child_markdown));
            total_length += child_markdown.len();
        }
    }

    // Generate frontmatter
    let frontmatter = generate_frontmatter(
        &section_title,
        &section_description,
        &section_url_path,
        total_length,
        &base_url,
    );

    // Combine all content
    let mut all_content = vec![section_markdown];
    all_content.extend(child_contents);
    let combined = all_content.join("\n\n---\n\n");

    Ok(Some(format!("{}\n\n{}\n", frontmatter, combined)))
}
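The final aggregation step reduces to joining the section Markdown and each child's Markdown with horizontal-rule separators. A dependency-free sketch of that join (the function name is illustrative, not part of this module):

```rust
/// Join section and child Markdown with `---` separators, as the
/// aggregation step in convert_section_to_markdown does above.
fn join_with_rules(section_md: String, children: Vec<String>) -> String {
    let mut all_content = vec![section_md];
    all_content.extend(children);
    // Blank lines around the rule keep it from attaching to adjacent paragraphs
    all_content.join("\n\n---\n\n")
}

fn main() {
    let combined = join_with_rules(
        "# Section".to_string(),
        vec!["## Child A".to_string(), "## Child B".to_string()],
    );
    assert_eq!(combined, "# Section\n\n---\n\n## Child A\n\n---\n\n## Child B");
}
```

With no children, the result is just the section Markdown with no trailing rule, since `join` only inserts separators between elements.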

/// Detect product from URL path
#[napi]
pub fn detect_product_from_path(url_path: String) -> Result<Option<ProductInfo>> {
    Ok(detect_product(&url_path))
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_product_detection() {
        let product = detect_product("/influxdb3/core/get-started/");
        assert!(product.is_some());
        let p = product.unwrap();
        assert_eq!(p.name, "InfluxDB 3 Core");
        assert_eq!(p.version, "core");
    }

    #[test]
    fn test_html_to_markdown() {
        let html = "<p>Hello <strong>world</strong>!</p>";
        let md = html_to_markdown(html, false);
        assert!(md.contains("Hello **world**!"));
    }
}

@@ -0,0 +1,779 @@
# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.
# yarn lockfile v1


"@emnapi/core@^1.5.0":
  version "1.7.1"
  resolved "https://registry.yarnpkg.com/@emnapi/core/-/core-1.7.1.tgz#3a79a02dbc84f45884a1806ebb98e5746bdfaac4"
  integrity sha512-o1uhUASyo921r2XtHYOHy7gdkGLge8ghBEQHMWmyJFoXlpU58kIrhhN3w26lpQb6dspetweapMn2CSNwQ8I4wg==
  dependencies:
    "@emnapi/wasi-threads" "1.1.0"
    tslib "^2.4.0"

"@emnapi/runtime@^1.5.0":
  version "1.7.1"
  resolved "https://registry.yarnpkg.com/@emnapi/runtime/-/runtime-1.7.1.tgz#a73784e23f5d57287369c808197288b52276b791"
  integrity sha512-PVtJr5CmLwYAU9PZDMITZoR5iAOShYREoR45EyyLrbntV50mdePTgUn4AmOw90Ifcj+x2kRjdzr1HP3RrNiHGA==
  dependencies:
    tslib "^2.4.0"

"@emnapi/wasi-threads@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@emnapi/wasi-threads/-/wasi-threads-1.1.0.tgz#60b2102fddc9ccb78607e4a3cf8403ea69be41bf"
  integrity sha512-WI0DdZ8xFSbgMjR1sFsKABJ/C5OnRrjT06JXbZKexJGrDuPTzZdDYfFlsgcCXCyf+suG5QU2e/y1Wo2V/OapLQ==
  dependencies:
    tslib "^2.4.0"

"@inquirer/ansi@^1.0.2":
  version "1.0.2"
  resolved "https://registry.yarnpkg.com/@inquirer/ansi/-/ansi-1.0.2.tgz#674a4c4d81ad460695cb2a1fc69d78cd187f337e"
  integrity sha512-S8qNSZiYzFd0wAcyG5AXCvUHC5Sr7xpZ9wZ2py9XR88jUz8wooStVx5M6dRzczbBWjic9NP7+rY0Xi7qqK/aMQ==

"@inquirer/checkbox@^4.3.2":
  version "4.3.2"
  resolved "https://registry.yarnpkg.com/@inquirer/checkbox/-/checkbox-4.3.2.tgz#e1483e6519d6ffef97281a54d2a5baa0d81b3f3b"
  integrity sha512-VXukHf0RR1doGe6Sm4F0Em7SWYLTHSsbGfJdS9Ja2bX5/D5uwVOEjr07cncLROdBvmnvCATYEWlHqYmXv2IlQA==
  dependencies:
    "@inquirer/ansi" "^1.0.2"
    "@inquirer/core" "^10.3.2"
    "@inquirer/figures" "^1.0.15"
    "@inquirer/type" "^3.0.10"
    yoctocolors-cjs "^2.1.3"

"@inquirer/confirm@^5.1.21":
  version "5.1.21"
  resolved "https://registry.yarnpkg.com/@inquirer/confirm/-/confirm-5.1.21.tgz#610c4acd7797d94890a6e2dde2c98eb1e891dd12"
  integrity sha512-KR8edRkIsUayMXV+o3Gv+q4jlhENF9nMYUZs9PA2HzrXeHI8M5uDag70U7RJn9yyiMZSbtF5/UexBtAVtZGSbQ==
  dependencies:
    "@inquirer/core" "^10.3.2"
    "@inquirer/type" "^3.0.10"

"@inquirer/core@^10.3.2":
  version "10.3.2"
  resolved "https://registry.yarnpkg.com/@inquirer/core/-/core-10.3.2.tgz#535979ff3ff4fe1e7cc4f83e2320504c743b7e20"
  integrity sha512-43RTuEbfP8MbKzedNqBrlhhNKVwoK//vUFNW3Q3vZ88BLcrs4kYpGg+B2mm5p2K/HfygoCxuKwJJiv8PbGmE0A==
  dependencies:
    "@inquirer/ansi" "^1.0.2"
    "@inquirer/figures" "^1.0.15"
    "@inquirer/type" "^3.0.10"
    cli-width "^4.1.0"
    mute-stream "^2.0.0"
    signal-exit "^4.1.0"
    wrap-ansi "^6.2.0"
    yoctocolors-cjs "^2.1.3"

"@inquirer/editor@^4.2.23":
  version "4.2.23"
  resolved "https://registry.yarnpkg.com/@inquirer/editor/-/editor-4.2.23.tgz#fe046a3bfdae931262de98c1052437d794322e0b"
  integrity sha512-aLSROkEwirotxZ1pBaP8tugXRFCxW94gwrQLxXfrZsKkfjOYC1aRvAZuhpJOb5cu4IBTJdsCigUlf2iCOu4ZDQ==
  dependencies:
    "@inquirer/core" "^10.3.2"
    "@inquirer/external-editor" "^1.0.3"
    "@inquirer/type" "^3.0.10"

"@inquirer/expand@^4.0.23":
  version "4.0.23"
  resolved "https://registry.yarnpkg.com/@inquirer/expand/-/expand-4.0.23.tgz#a38b5f32226d75717c370bdfed792313b92bdc05"
  integrity sha512-nRzdOyFYnpeYTTR2qFwEVmIWypzdAx/sIkCMeTNTcflFOovfqUk+HcFhQQVBftAh9gmGrpFj6QcGEqrDMDOiew==
  dependencies:
    "@inquirer/core" "^10.3.2"
    "@inquirer/type" "^3.0.10"
    yoctocolors-cjs "^2.1.3"

"@inquirer/external-editor@^1.0.3":
  version "1.0.3"
  resolved "https://registry.yarnpkg.com/@inquirer/external-editor/-/external-editor-1.0.3.tgz#c23988291ee676290fdab3fd306e64010a6d13b8"
  integrity sha512-RWbSrDiYmO4LbejWY7ttpxczuwQyZLBUyygsA9Nsv95hpzUWwnNTVQmAq3xuh7vNwCp07UTmE5i11XAEExx4RA==
  dependencies:
    chardet "^2.1.1"
    iconv-lite "^0.7.0"

"@inquirer/figures@^1.0.15":
  version "1.0.15"
  resolved "https://registry.yarnpkg.com/@inquirer/figures/-/figures-1.0.15.tgz#dbb49ed80df11df74268023b496ac5d9acd22b3a"
  integrity sha512-t2IEY+unGHOzAaVM5Xx6DEWKeXlDDcNPeDyUpsRc6CUhBfU3VQOEl+Vssh7VNp1dR8MdUJBWhuObjXCsVpjN5g==

"@inquirer/input@^4.3.1":
  version "4.3.1"
  resolved "https://registry.yarnpkg.com/@inquirer/input/-/input-4.3.1.tgz#778683b4c4c4d95d05d4b05c4a854964b73565b4"
  integrity sha512-kN0pAM4yPrLjJ1XJBjDxyfDduXOuQHrBB8aLDMueuwUGn+vNpF7Gq7TvyVxx8u4SHlFFj4trmj+a2cbpG4Jn1g==
  dependencies:
    "@inquirer/core" "^10.3.2"
    "@inquirer/type" "^3.0.10"

"@inquirer/number@^3.0.23":
  version "3.0.23"
  resolved "https://registry.yarnpkg.com/@inquirer/number/-/number-3.0.23.tgz#3fdec2540d642093fd7526818fd8d4bdc7335094"
  integrity sha512-5Smv0OK7K0KUzUfYUXDXQc9jrf8OHo4ktlEayFlelCjwMXz0299Y8OrI+lj7i4gCBY15UObk76q0QtxjzFcFcg==
  dependencies:
    "@inquirer/core" "^10.3.2"
    "@inquirer/type" "^3.0.10"

"@inquirer/password@^4.0.23":
  version "4.0.23"
  resolved "https://registry.yarnpkg.com/@inquirer/password/-/password-4.0.23.tgz#b9f5187c8c92fd7aa9eceb9d8f2ead0d7e7b000d"
  integrity sha512-zREJHjhT5vJBMZX/IUbyI9zVtVfOLiTO66MrF/3GFZYZ7T4YILW5MSkEYHceSii/KtRk+4i3RE7E1CUXA2jHcA==
  dependencies:
    "@inquirer/ansi" "^1.0.2"
    "@inquirer/core" "^10.3.2"
    "@inquirer/type" "^3.0.10"

"@inquirer/prompts@^7.8.4":
  version "7.10.1"
  resolved "https://registry.yarnpkg.com/@inquirer/prompts/-/prompts-7.10.1.tgz#e1436c0484cf04c22548c74e2cd239e989d5f847"
  integrity sha512-Dx/y9bCQcXLI5ooQ5KyvA4FTgeo2jYj/7plWfV5Ak5wDPKQZgudKez2ixyfz7tKXzcJciTxqLeK7R9HItwiByg==
  dependencies:
    "@inquirer/checkbox" "^4.3.2"
    "@inquirer/confirm" "^5.1.21"
    "@inquirer/editor" "^4.2.23"
    "@inquirer/expand" "^4.0.23"
    "@inquirer/input" "^4.3.1"
    "@inquirer/number" "^3.0.23"
    "@inquirer/password" "^4.0.23"
    "@inquirer/rawlist" "^4.1.11"
    "@inquirer/search" "^3.2.2"
    "@inquirer/select" "^4.4.2"

"@inquirer/rawlist@^4.1.11":
  version "4.1.11"
  resolved "https://registry.yarnpkg.com/@inquirer/rawlist/-/rawlist-4.1.11.tgz#313c8c3ffccb7d41e990c606465726b4a898a033"
  integrity sha512-+LLQB8XGr3I5LZN/GuAHo+GpDJegQwuPARLChlMICNdwW7OwV2izlCSCxN6cqpL0sMXmbKbFcItJgdQq5EBXTw==
  dependencies:
    "@inquirer/core" "^10.3.2"
    "@inquirer/type" "^3.0.10"
    yoctocolors-cjs "^2.1.3"

"@inquirer/search@^3.2.2":
  version "3.2.2"
  resolved "https://registry.yarnpkg.com/@inquirer/search/-/search-3.2.2.tgz#4cc6fd574dcd434e4399badc37c742c3fd534ac8"
  integrity sha512-p2bvRfENXCZdWF/U2BXvnSI9h+tuA8iNqtUKb9UWbmLYCRQxd8WkvwWvYn+3NgYaNwdUkHytJMGG4MMLucI1kA==
  dependencies:
    "@inquirer/core" "^10.3.2"
    "@inquirer/figures" "^1.0.15"
    "@inquirer/type" "^3.0.10"
    yoctocolors-cjs "^2.1.3"

"@inquirer/select@^4.4.2":
  version "4.4.2"
  resolved "https://registry.yarnpkg.com/@inquirer/select/-/select-4.4.2.tgz#2ac8fca960913f18f1d1b35323ed8fcd27d89323"
  integrity sha512-l4xMuJo55MAe+N7Qr4rX90vypFwCajSakx59qe/tMaC1aEHWLyw68wF4o0A4SLAY4E0nd+Vt+EyskeDIqu1M6w==
  dependencies:
    "@inquirer/ansi" "^1.0.2"
    "@inquirer/core" "^10.3.2"
    "@inquirer/figures" "^1.0.15"
    "@inquirer/type" "^3.0.10"
    yoctocolors-cjs "^2.1.3"

"@inquirer/type@^3.0.10":
  version "3.0.10"
  resolved "https://registry.yarnpkg.com/@inquirer/type/-/type-3.0.10.tgz#11ed564ec78432a200ea2601a212d24af8150d50"
  integrity sha512-BvziSRxfz5Ov8ch0z/n3oijRSEcEsHnhggm4xFZe93DHcUCTlutlq9Ox4SVENAfcRD22UQq7T/atg9Wr3k09eA==

"@napi-rs/cli@^3.4.1":
  version "3.4.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/cli/-/cli-3.4.1.tgz#424d505b0c57e87f6d869d81d354833a51657fa6"
  integrity sha512-ayhm+NfrP5Hmh7vy5pfyYm/ktYtLh2PrgdLuqHTAubO7RoO2JkUE4F991AtgYxNewwXI8+guZLxU8itV7QnDrQ==
  dependencies:
    "@inquirer/prompts" "^7.8.4"
    "@napi-rs/cross-toolchain" "^1.0.3"
    "@napi-rs/wasm-tools" "^1.0.1"
    "@octokit/rest" "^22.0.0"
    clipanion "^4.0.0-rc.4"
    colorette "^2.0.20"
    debug "^4.4.1"
    emnapi "^1.5.0"
    es-toolkit "^1.39.10"
    js-yaml "^4.1.0"
    semver "^7.7.2"
    typanion "^3.14.0"

"@napi-rs/cross-toolchain@^1.0.3":
  version "1.0.3"
  resolved "https://registry.yarnpkg.com/@napi-rs/cross-toolchain/-/cross-toolchain-1.0.3.tgz#8e345d0c9a8aeeaf9287e7af1d4ce83476681373"
  integrity sha512-ENPfLe4937bsKVTDA6zdABx4pq9w0tHqRrJHyaGxgaPq03a2Bd1unD5XSKjXJjebsABJ+MjAv1A2OvCgK9yehg==
  dependencies:
    "@napi-rs/lzma" "^1.4.5"
    "@napi-rs/tar" "^1.1.0"
    debug "^4.4.1"

"@napi-rs/lzma-android-arm-eabi@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-android-arm-eabi/-/lzma-android-arm-eabi-1.4.5.tgz#c6722a1d7201e269fdb6ba997d28cb41223e515c"
  integrity sha512-Up4gpyw2SacmyKWWEib06GhiDdF+H+CCU0LAV8pnM4aJIDqKKd5LHSlBht83Jut6frkB0vwEPmAkv4NjQ5u//Q==

"@napi-rs/lzma-android-arm64@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-android-arm64/-/lzma-android-arm64-1.4.5.tgz#05df61667e84419e0550200b48169057b734806f"
  integrity sha512-uwa8sLlWEzkAM0MWyoZJg0JTD3BkPknvejAFG2acUA1raXM8jLrqujWCdOStisXhqQjZ2nDMp3FV6cs//zjfuQ==

"@napi-rs/lzma-darwin-arm64@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-darwin-arm64/-/lzma-darwin-arm64-1.4.5.tgz#c37a01c53f25cb7f014870d2ea6c5576138bcaaa"
  integrity sha512-0Y0TQLQ2xAjVabrMDem1NhIssOZzF/y/dqetc6OT8mD3xMTDtF8u5BqZoX3MyPc9FzpsZw4ksol+w7DsxHrpMA==

"@napi-rs/lzma-darwin-x64@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-darwin-x64/-/lzma-darwin-x64-1.4.5.tgz#555b1dd65d7b104d28b2a12d925d7059226c7f4b"
  integrity sha512-vR2IUyJY3En+V1wJkwmbGWcYiT8pHloTAWdW4pG24+51GIq+intst6Uf6D/r46citObGZrlX0QvMarOkQeHWpw==

"@napi-rs/lzma-freebsd-x64@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-freebsd-x64/-/lzma-freebsd-x64-1.4.5.tgz#683beff15b37774ec91e1de7b4d337894bf43694"
  integrity sha512-XpnYQC5SVovO35tF0xGkbHYjsS6kqyNCjuaLQ2dbEblFRr5cAZVvsJ/9h7zj/5FluJPJRDojVNxGyRhTp4z2lw==

"@napi-rs/lzma-linux-arm-gnueabihf@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-linux-arm-gnueabihf/-/lzma-linux-arm-gnueabihf-1.4.5.tgz#505f659a9131474b7270afa4a4e9caf709c4d213"
  integrity sha512-ic1ZZMoRfRMwtSwxkyw4zIlbDZGC6davC9r+2oX6x9QiF247BRqqT94qGeL5ZP4Vtz0Hyy7TEViWhx5j6Bpzvw==

"@napi-rs/lzma-linux-arm64-gnu@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-linux-arm64-gnu/-/lzma-linux-arm64-gnu-1.4.5.tgz#ecbb944635fa004a9415d1f50f165bc0d26d3807"
  integrity sha512-asEp7FPd7C1Yi6DQb45a3KPHKOFBSfGuJWXcAd4/bL2Fjetb2n/KK2z14yfW8YC/Fv6x3rBM0VAZKmJuz4tysg==

"@napi-rs/lzma-linux-arm64-musl@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-linux-arm64-musl/-/lzma-linux-arm64-musl-1.4.5.tgz#c0d17f40ce2db0b075469a28f233fd8ce31fbb95"
  integrity sha512-yWjcPDgJ2nIL3KNvi4536dlT/CcCWO0DUyEOlBs/SacG7BeD6IjGh6yYzd3/X1Y3JItCbZoDoLUH8iB1lTXo3w==

"@napi-rs/lzma-linux-ppc64-gnu@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-linux-ppc64-gnu/-/lzma-linux-ppc64-gnu-1.4.5.tgz#2f17b9d1fc920c6c511d2086c7623752172c2f07"
  integrity sha512-0XRhKuIU/9ZjT4WDIG/qnX7Xz7mSQHYZo9Gb3MP2gcvBgr6BA4zywQ9k3gmQaPn9ECE+CZg2V7DV7kT+x2pUMQ==

"@napi-rs/lzma-linux-riscv64-gnu@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-linux-riscv64-gnu/-/lzma-linux-riscv64-gnu-1.4.5.tgz#63c2a4e1157586252186e39604370d5b29c6db85"
  integrity sha512-QrqDIPEUUB23GCpyQj/QFyMlr8SGxxyExeZz9OWFnHfb70kXdTLWrHS/hEI1Ru+lSbQ/6xRqeoGyQ4Aqdg+/RA==

"@napi-rs/lzma-linux-s390x-gnu@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-linux-s390x-gnu/-/lzma-linux-s390x-gnu-1.4.5.tgz#6f2ca44bf5c5bef1b31d7516bf15d63c35cdf59f"
  integrity sha512-k8RVM5aMhW86E9H0QXdquwojew4H3SwPxbRVbl49/COJQWCUjGi79X6mYruMnMPEznZinUiT1jgKbFo2A00NdA==

"@napi-rs/lzma-linux-x64-gnu@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-linux-x64-gnu/-/lzma-linux-x64-gnu-1.4.5.tgz#54879d88a9c370687b5463c7c1b6208b718c1ab2"
  integrity sha512-6rMtBgnIq2Wcl1rQdZsnM+rtCcVCbws1nF8S2NzaUsVaZv8bjrPiAa0lwg4Eqnn1d9lgwqT+cZgm5m+//K08Kw==

"@napi-rs/lzma-linux-x64-musl@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-linux-x64-musl/-/lzma-linux-x64-musl-1.4.5.tgz#412705f6925f10f45122bd0f3e2fb6e597bed4f8"
  integrity sha512-eiadGBKi7Vd0bCArBUOO/qqRYPHt/VQVvGyYvDFt6C2ZSIjlD+HuOl+2oS1sjf4CFjK4eDIog6EdXnL0NE6iyQ==

"@napi-rs/lzma-wasm32-wasi@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-wasm32-wasi/-/lzma-wasm32-wasi-1.4.5.tgz#4b74abfd144371123cb6f5b7bad5bae868206ecf"
  integrity sha512-+VyHHlr68dvey6fXc2hehw9gHVFIW3TtGF1XkcbAu65qVXsA9D/T+uuoRVqhE+JCyFHFrO0ixRbZDRK1XJt1sA==
  dependencies:
    "@napi-rs/wasm-runtime" "^1.0.3"

"@napi-rs/lzma-win32-arm64-msvc@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-win32-arm64-msvc/-/lzma-win32-arm64-msvc-1.4.5.tgz#7ed8c80d588fa244a7fd55249cb0d011d04bf984"
  integrity sha512-eewnqvIyyhHi3KaZtBOJXohLvwwN27gfS2G/YDWdfHlbz1jrmfeHAmzMsP5qv8vGB+T80TMHNkro4kYjeh6Deg==

"@napi-rs/lzma-win32-ia32-msvc@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-win32-ia32-msvc/-/lzma-win32-ia32-msvc-1.4.5.tgz#e6f70ca87bd88370102aa610ee9e44ec28911b46"
  integrity sha512-OeacFVRCJOKNU/a0ephUfYZ2Yt+NvaHze/4TgOwJ0J0P4P7X1mHzN+ig9Iyd74aQDXYqc7kaCXA2dpAOcH87Cg==

"@napi-rs/lzma-win32-x64-msvc@1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma-win32-x64-msvc/-/lzma-win32-x64-msvc-1.4.5.tgz#ecfcfe364e805915608ce0ff41ed4c950fdb51b8"
  integrity sha512-T4I1SamdSmtyZgDXGAGP+y5LEK5vxHUFwe8mz6D4R7Sa5/WCxTcCIgPJ9BD7RkpO17lzhlaM2vmVvMy96Lvk9Q==

"@napi-rs/lzma@^1.4.5":
  version "1.4.5"
  resolved "https://registry.yarnpkg.com/@napi-rs/lzma/-/lzma-1.4.5.tgz#43e17cdfe332a3f33fa640422da348db3d8825e1"
  integrity sha512-zS5LuN1OBPAyZpda2ZZgYOEDC+xecUdAGnrvbYzjnLXkrq/OBC3B9qcRvlxbDR3k5H/gVfvef1/jyUqPknqjbg==
  optionalDependencies:
    "@napi-rs/lzma-android-arm-eabi" "1.4.5"
    "@napi-rs/lzma-android-arm64" "1.4.5"
    "@napi-rs/lzma-darwin-arm64" "1.4.5"
    "@napi-rs/lzma-darwin-x64" "1.4.5"
    "@napi-rs/lzma-freebsd-x64" "1.4.5"
    "@napi-rs/lzma-linux-arm-gnueabihf" "1.4.5"
    "@napi-rs/lzma-linux-arm64-gnu" "1.4.5"
    "@napi-rs/lzma-linux-arm64-musl" "1.4.5"
    "@napi-rs/lzma-linux-ppc64-gnu" "1.4.5"
    "@napi-rs/lzma-linux-riscv64-gnu" "1.4.5"
    "@napi-rs/lzma-linux-s390x-gnu" "1.4.5"
    "@napi-rs/lzma-linux-x64-gnu" "1.4.5"
    "@napi-rs/lzma-linux-x64-musl" "1.4.5"
    "@napi-rs/lzma-wasm32-wasi" "1.4.5"
    "@napi-rs/lzma-win32-arm64-msvc" "1.4.5"
    "@napi-rs/lzma-win32-ia32-msvc" "1.4.5"
    "@napi-rs/lzma-win32-x64-msvc" "1.4.5"

"@napi-rs/tar-android-arm-eabi@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-android-arm-eabi/-/tar-android-arm-eabi-1.1.0.tgz#08ae6ebbaf38d416954a28ca09bf77410d5b0c2b"
  integrity sha512-h2Ryndraj/YiKgMV/r5by1cDusluYIRT0CaE0/PekQ4u+Wpy2iUVqvzVU98ZPnhXaNeYxEvVJHNGafpOfaD0TA==

"@napi-rs/tar-android-arm64@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-android-arm64/-/tar-android-arm64-1.1.0.tgz#825a76140116f89d7e930245bda9f70b196da565"
  integrity sha512-DJFyQHr1ZxNZorm/gzc1qBNLF/FcKzcH0V0Vwan5P+o0aE2keQIGEjJ09FudkF9v6uOuJjHCVDdK6S6uHtShAw==

"@napi-rs/tar-darwin-arm64@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-darwin-arm64/-/tar-darwin-arm64-1.1.0.tgz#8821616c40ea52ec2c00a055be56bf28dee76013"
  integrity sha512-Zz2sXRzjIX4e532zD6xm2SjXEym6MkvfCvL2RMpG2+UwNVDVscHNcz3d47Pf3sysP2e2af7fBB3TIoK2f6trPw==

"@napi-rs/tar-darwin-x64@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-darwin-x64/-/tar-darwin-x64-1.1.0.tgz#4a975e41932a145c58181cb43c8f483c3858e359"
  integrity sha512-EI+CptIMNweT0ms9S3mkP/q+J6FNZ1Q6pvpJOEcWglRfyfQpLqjlC0O+dptruTPE8VamKYuqdjxfqD8hifZDOA==

"@napi-rs/tar-freebsd-x64@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-freebsd-x64/-/tar-freebsd-x64-1.1.0.tgz#5ebc0633f257b258aacc59ac1420835513ed0967"
  integrity sha512-J0PIqX+pl6lBIAckL/c87gpodLbjZB1OtIK+RDscKC9NLdpVv6VGOxzUV/fYev/hctcE8EfkLbgFOfpmVQPg2g==

"@napi-rs/tar-linux-arm-gnueabihf@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-linux-arm-gnueabihf/-/tar-linux-arm-gnueabihf-1.1.0.tgz#1d309bd4f46f0490353d9608e79d260cf6c7cd43"
  integrity sha512-SLgIQo3f3EjkZ82ZwvrEgFvMdDAhsxCYjyoSuWfHCz0U16qx3SuGCp8+FYOPYCECHN3ZlGjXnoAIt9ERd0dEUg==

"@napi-rs/tar-linux-arm64-gnu@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-linux-arm64-gnu/-/tar-linux-arm64-gnu-1.1.0.tgz#88d974821f3f8e9ee6948b4d51c78c019dee88ad"
  integrity sha512-d014cdle52EGaH6GpYTQOP9Py7glMO1zz/+ynJPjjzYFSxvdYx0byrjumZk2UQdIyGZiJO2MEFpCkEEKFSgPYA==

"@napi-rs/tar-linux-arm64-musl@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-linux-arm64-musl/-/tar-linux-arm64-musl-1.1.0.tgz#ab2baee7b288df5e68cef0b2d12fa79d2a551b58"
  integrity sha512-L/y1/26q9L/uBqiW/JdOb/Dc94egFvNALUZV2WCGKQXc6UByPBMgdiEyW2dtoYxYYYYc+AKD+jr+wQPcvX2vrQ==

"@napi-rs/tar-linux-ppc64-gnu@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-linux-ppc64-gnu/-/tar-linux-ppc64-gnu-1.1.0.tgz#7500e60d27849ba36fa4802a346249974e7ecf74"
  integrity sha512-EPE1K/80RQvPbLRJDJs1QmCIcH+7WRi0F73+oTe1582y9RtfGRuzAkzeBuAGRXAQEjRQw/RjtNqr6UTJ+8UuWQ==

"@napi-rs/tar-linux-s390x-gnu@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-linux-s390x-gnu/-/tar-linux-s390x-gnu-1.1.0.tgz#cfc0923bfad1dea8ef9da22148a8d4932aa52d08"
  integrity sha512-B2jhWiB1ffw1nQBqLUP1h4+J1ovAxBOoe5N2IqDMOc63fsPZKNqF1PvO/dIem8z7LL4U4bsfmhy3gBfu547oNQ==

"@napi-rs/tar-linux-x64-gnu@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-linux-x64-gnu/-/tar-linux-x64-gnu-1.1.0.tgz#5fdf9e1bb12b10a951c6ab03268a9f8d9788c929"
  integrity sha512-tbZDHnb9617lTnsDMGo/eAMZxnsQFnaRe+MszRqHguKfMwkisc9CCJnks/r1o84u5fECI+J/HOrKXgczq/3Oww==

"@napi-rs/tar-linux-x64-musl@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-linux-x64-musl/-/tar-linux-x64-musl-1.1.0.tgz#f001fc0a0a2996dcf99e787a15eade8dce215e91"
  integrity sha512-dV6cODlzbO8u6Anmv2N/ilQHq/AWz0xyltuXoLU3yUyXbZcnWYZuB2rL8OBGPmqNcD+x9NdScBNXh7vWN0naSQ==

"@napi-rs/tar-wasm32-wasi@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-wasm32-wasi/-/tar-wasm32-wasi-1.1.0.tgz#c1c7df7738b23f1cdbcff261d5bea6968d0a3c9a"
  integrity sha512-jIa9nb2HzOrfH0F8QQ9g3WE4aMH5vSI5/1NYVNm9ysCmNjCCtMXCAhlI3WKCdm/DwHf0zLqdrrtDFXODcNaqMw==
  dependencies:
    "@napi-rs/wasm-runtime" "^1.0.3"

"@napi-rs/tar-win32-arm64-msvc@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-win32-arm64-msvc/-/tar-win32-arm64-msvc-1.1.0.tgz#4c8519eab28021e1eda0847433cab949d5389833"
  integrity sha512-vfpG71OB0ijtjemp3WTdmBKJm9R70KM8vsSExMsIQtV0lVzP07oM1CW6JbNRPXNLhRoue9ofYLiUDk8bE0Hckg==

"@napi-rs/tar-win32-ia32-msvc@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-win32-ia32-msvc/-/tar-win32-ia32-msvc-1.1.0.tgz#4f61af0da2c53b23f7d58c77970eaa4449e8eb79"
  integrity sha512-hGPyPW60YSpOSgzfy68DLBHgi6HxkAM+L59ZZZPMQ0TOXjQg+p2EW87+TjZfJOkSpbYiEkULwa/f4a2hcVjsqQ==

"@napi-rs/tar-win32-x64-msvc@1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar-win32-x64-msvc/-/tar-win32-x64-msvc-1.1.0.tgz#eb63fb44ecde001cce6be238f175e66a06c15035"
  integrity sha512-L6Ed1DxXK9YSCMyvpR8MiNAyKNkQLjsHsHK9E0qnHa8NzLFqzDKhvs5LfnWxM2kJ+F7m/e5n9zPm24kHb3LsVw==

"@napi-rs/tar@^1.1.0":
  version "1.1.0"
  resolved "https://registry.yarnpkg.com/@napi-rs/tar/-/tar-1.1.0.tgz#acecd9e29f705a3f534d5fb3d8aa36b3266727d0"
  integrity sha512-7cmzIu+Vbupriudo7UudoMRH2OA3cTw67vva8MxeoAe5S7vPFI7z0vp0pMXiA25S8IUJefImQ90FeJjl8fjEaQ==
  optionalDependencies:
    "@napi-rs/tar-android-arm-eabi" "1.1.0"
    "@napi-rs/tar-android-arm64" "1.1.0"
    "@napi-rs/tar-darwin-arm64" "1.1.0"
    "@napi-rs/tar-darwin-x64" "1.1.0"
    "@napi-rs/tar-freebsd-x64" "1.1.0"
    "@napi-rs/tar-linux-arm-gnueabihf" "1.1.0"
    "@napi-rs/tar-linux-arm64-gnu" "1.1.0"
    "@napi-rs/tar-linux-arm64-musl" "1.1.0"
    "@napi-rs/tar-linux-ppc64-gnu" "1.1.0"
    "@napi-rs/tar-linux-s390x-gnu" "1.1.0"
    "@napi-rs/tar-linux-x64-gnu" "1.1.0"
    "@napi-rs/tar-linux-x64-musl" "1.1.0"
    "@napi-rs/tar-wasm32-wasi" "1.1.0"
    "@napi-rs/tar-win32-arm64-msvc" "1.1.0"
    "@napi-rs/tar-win32-ia32-msvc" "1.1.0"
    "@napi-rs/tar-win32-x64-msvc" "1.1.0"

"@napi-rs/wasm-runtime@^1.0.3":
  version "1.0.7"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-runtime/-/wasm-runtime-1.0.7.tgz#dcfea99a75f06209a235f3d941e3460a51e9b14c"
  integrity sha512-SeDnOO0Tk7Okiq6DbXmmBODgOAb9dp9gjlphokTUxmt8U3liIP1ZsozBahH69j/RJv+Rfs6IwUKHTgQYJ/HBAw==
  dependencies:
    "@emnapi/core" "^1.5.0"
    "@emnapi/runtime" "^1.5.0"
    "@tybys/wasm-util" "^0.10.1"

"@napi-rs/wasm-tools-android-arm-eabi@1.0.1":
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools-android-arm-eabi/-/wasm-tools-android-arm-eabi-1.0.1.tgz#a709f93ddd95508a4ef949b5ceff2b2e85b676f7"
  integrity sha512-lr07E/l571Gft5v4aA1dI8koJEmF1F0UigBbsqg9OWNzg80H3lDPO+auv85y3T/NHE3GirDk7x/D3sLO57vayw==

"@napi-rs/wasm-tools-android-arm64@1.0.1":
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools-android-arm64/-/wasm-tools-android-arm64-1.0.1.tgz#304b5761b4fcc871b876ebd34975c72c9d11a7fc"
  integrity sha512-WDR7S+aRLV6LtBJAg5fmjKkTZIdrEnnQxgdsb7Cf8pYiMWBHLU+LC49OUVppQ2YSPY0+GeYm9yuZWW3kLjJ7Bg==

"@napi-rs/wasm-tools-darwin-arm64@1.0.1":
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools-darwin-arm64/-/wasm-tools-darwin-arm64-1.0.1.tgz#dafb4330986a8b46e8de1603ea2f6932a19634c6"
  integrity sha512-qWTI+EEkiN0oIn/N2gQo7+TVYil+AJ20jjuzD2vATS6uIjVz+Updeqmszi7zq7rdFTLp6Ea3/z4kDKIfZwmR9g==

"@napi-rs/wasm-tools-darwin-x64@1.0.1":
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools-darwin-x64/-/wasm-tools-darwin-x64-1.0.1.tgz#0919e63714ee0a52b1120f6452bbc3a4d793ce3c"
  integrity sha512-bA6hubqtHROR5UI3tToAF/c6TDmaAgF0SWgo4rADHtQ4wdn0JeogvOk50gs2TYVhKPE2ZD2+qqt7oBKB+sxW3A==

"@napi-rs/wasm-tools-freebsd-x64@1.0.1":
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools-freebsd-x64/-/wasm-tools-freebsd-x64-1.0.1.tgz#1f50a2d5d5af041c55634f43f623ae49192bce9c"
  integrity sha512-90+KLBkD9hZEjPQW1MDfwSt5J1L46EUKacpCZWyRuL6iIEO5CgWU0V/JnEgFsDOGyyYtiTvHc5bUdUTWd4I9Vg==

"@napi-rs/wasm-tools-linux-arm64-gnu@1.0.1":
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools-linux-arm64-gnu/-/wasm-tools-linux-arm64-gnu-1.0.1.tgz#6106d5e65a25ec2ae417c2fcfebd5c8f14d80e84"
  integrity sha512-rG0QlS65x9K/u3HrKafDf8cFKj5wV2JHGfl8abWgKew0GVPyp6vfsDweOwHbWAjcHtp2LHi6JHoW80/MTHm52Q==

"@napi-rs/wasm-tools-linux-arm64-musl@1.0.1":
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools-linux-arm64-musl/-/wasm-tools-linux-arm64-musl-1.0.1.tgz#0eb3d4d1fbc1938b0edd907423840365ebc53859"
  integrity sha512-jAasbIvjZXCgX0TCuEFQr+4D6Lla/3AAVx2LmDuMjgG4xoIXzjKWl7c4chuaD+TI+prWT0X6LJcdzFT+ROKGHQ==

"@napi-rs/wasm-tools-linux-x64-gnu@1.0.1":
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools-linux-x64-gnu/-/wasm-tools-linux-x64-gnu-1.0.1.tgz#5de6a567083a83efed16d046f47b680cbe7c9b53"
  integrity sha512-Plgk5rPqqK2nocBGajkMVbGm010Z7dnUgq0wtnYRZbzWWxwWcXfZMPa8EYxrK4eE8SzpI7VlZP1tdVsdjgGwMw==

"@napi-rs/wasm-tools-linux-x64-musl@1.0.1":
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools-linux-x64-musl/-/wasm-tools-linux-x64-musl-1.0.1.tgz#04cc17ef12b4e5012f2d0e46b09cabe473566e5a"
  integrity sha512-GW7AzGuWxtQkyHknHWYFdR0CHmW6is8rG2Rf4V6GNmMpmwtXt/ItWYWtBe4zqJWycMNazpfZKSw/BpT7/MVCXQ==

"@napi-rs/wasm-tools-wasm32-wasi@1.0.1":
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools-wasm32-wasi/-/wasm-tools-wasm32-wasi-1.0.1.tgz#6ced3bd03428c854397f00509b1694c3af857a0f"
  integrity sha512-/nQVSTrqSsn7YdAc2R7Ips/tnw5SPUcl3D7QrXCNGPqjbatIspnaexvaOYNyKMU6xPu+pc0BTnKVmqhlJJCPLA==
  dependencies:
    "@napi-rs/wasm-runtime" "^1.0.3"

"@napi-rs/wasm-tools-win32-arm64-msvc@1.0.1":
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools-win32-arm64-msvc/-/wasm-tools-win32-arm64-msvc-1.0.1.tgz#e776f66eb637eee312b562e987c0a5871ddc6dac"
  integrity sha512-PFi7oJIBu5w7Qzh3dwFea3sHRO3pojMsaEnUIy22QvsW+UJfNQwJCryVrpoUt8m4QyZXI+saEq/0r4GwdoHYFQ==

"@napi-rs/wasm-tools-win32-ia32-msvc@1.0.1":
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools-win32-ia32-msvc/-/wasm-tools-win32-ia32-msvc-1.0.1.tgz#9167919a62d24cb3a46f01fada26fee38aeaf884"
  integrity sha512-gXkuYzxQsgkj05Zaq+KQTkHIN83dFAwMcTKa2aQcpYPRImFm2AQzEyLtpXmyCWzJ0F9ZYAOmbSyrNew8/us6bw==

"@napi-rs/wasm-tools-win32-x64-msvc@1.0.1":
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools-win32-x64-msvc/-/wasm-tools-win32-x64-msvc-1.0.1.tgz#f896ab29a83605795bb12cf2cfc1a215bc830c65"
|
||||
integrity sha512-rEAf05nol3e3eei2sRButmgXP+6ATgm0/38MKhz9Isne82T4rPIMYsCIFj0kOisaGeVwoi2fnm7O9oWp5YVnYQ==
|
||||
|
||||
"@napi-rs/wasm-tools@^1.0.1":
|
||||
version "1.0.1"
|
||||
resolved "https://registry.yarnpkg.com/@napi-rs/wasm-tools/-/wasm-tools-1.0.1.tgz#f54caa0132322fd5275690b2aeb581d11539262f"
|
||||
integrity sha512-enkZYyuCdo+9jneCPE/0fjIta4wWnvVN9hBo2HuiMpRF0q3lzv1J6b/cl7i0mxZUKhBrV3aCKDBQnCOhwKbPmQ==
|
||||
optionalDependencies:
|
||||
"@napi-rs/wasm-tools-android-arm-eabi" "1.0.1"
|
||||
"@napi-rs/wasm-tools-android-arm64" "1.0.1"
|
||||
"@napi-rs/wasm-tools-darwin-arm64" "1.0.1"
|
||||
"@napi-rs/wasm-tools-darwin-x64" "1.0.1"
|
||||
"@napi-rs/wasm-tools-freebsd-x64" "1.0.1"
|
||||
"@napi-rs/wasm-tools-linux-arm64-gnu" "1.0.1"
|
||||
"@napi-rs/wasm-tools-linux-arm64-musl" "1.0.1"
|
||||
"@napi-rs/wasm-tools-linux-x64-gnu" "1.0.1"
|
||||
"@napi-rs/wasm-tools-linux-x64-musl" "1.0.1"
|
||||
"@napi-rs/wasm-tools-wasm32-wasi" "1.0.1"
|
||||
"@napi-rs/wasm-tools-win32-arm64-msvc" "1.0.1"
|
||||
"@napi-rs/wasm-tools-win32-ia32-msvc" "1.0.1"
|
||||
"@napi-rs/wasm-tools-win32-x64-msvc" "1.0.1"
|
||||
|
||||
"@octokit/auth-token@^6.0.0":
|
||||
version "6.0.0"
|
||||
resolved "https://registry.yarnpkg.com/@octokit/auth-token/-/auth-token-6.0.0.tgz#b02e9c08a2d8937df09a2a981f226ad219174c53"
|
||||
integrity sha512-P4YJBPdPSpWTQ1NU4XYdvHvXJJDxM6YwpS0FZHRgP7YFkdVxsWcpWGy/NVqlAA7PcPCnMacXlRm1y2PFZRWL/w==
|
||||
|
||||
"@octokit/core@^7.0.6":
|
||||
version "7.0.6"
|
||||
resolved "https://registry.yarnpkg.com/@octokit/core/-/core-7.0.6.tgz#0d58704391c6b681dec1117240ea4d2a98ac3916"
|
||||
integrity sha512-DhGl4xMVFGVIyMwswXeyzdL4uXD5OGILGX5N8Y+f6W7LhC1Ze2poSNrkF/fedpVDHEEZ+PHFW0vL14I+mm8K3Q==
|
||||
dependencies:
|
||||
"@octokit/auth-token" "^6.0.0"
|
||||
"@octokit/graphql" "^9.0.3"
|
||||
"@octokit/request" "^10.0.6"
|
||||
"@octokit/request-error" "^7.0.2"
|
||||
"@octokit/types" "^16.0.0"
|
||||
before-after-hook "^4.0.0"
|
||||
universal-user-agent "^7.0.0"
|
||||
|
||||
"@octokit/endpoint@^11.0.2":
|
||||
version "11.0.2"
|
||||
resolved "https://registry.yarnpkg.com/@octokit/endpoint/-/endpoint-11.0.2.tgz#a8d955e053a244938b81d86cd73efd2dcb5ef5af"
|
||||
integrity sha512-4zCpzP1fWc7QlqunZ5bSEjxc6yLAlRTnDwKtgXfcI/FxxGoqedDG8V2+xJ60bV2kODqcGB+nATdtap/XYq2NZQ==
|
||||
dependencies:
|
||||
"@octokit/types" "^16.0.0"
|
||||
universal-user-agent "^7.0.2"
|
||||
|
||||
"@octokit/graphql@^9.0.3":
|
||||
version "9.0.3"
|
||||
resolved "https://registry.yarnpkg.com/@octokit/graphql/-/graphql-9.0.3.tgz#5b8341c225909e924b466705c13477face869456"
|
||||
integrity sha512-grAEuupr/C1rALFnXTv6ZQhFuL1D8G5y8CN04RgrO4FIPMrtm+mcZzFG7dcBm+nq+1ppNixu+Jd78aeJOYxlGA==
|
||||
dependencies:
|
||||
"@octokit/request" "^10.0.6"
|
||||
"@octokit/types" "^16.0.0"
|
||||
universal-user-agent "^7.0.0"
|
||||
|
||||
"@octokit/openapi-types@^27.0.0":
|
||||
version "27.0.0"
|
||||
resolved "https://registry.yarnpkg.com/@octokit/openapi-types/-/openapi-types-27.0.0.tgz#374ea53781965fd02a9d36cacb97e152cefff12d"
|
||||
integrity sha512-whrdktVs1h6gtR+09+QsNk2+FO+49j6ga1c55YZudfEG+oKJVvJLQi3zkOm5JjiUXAagWK2tI2kTGKJ2Ys7MGA==
|
||||
|
||||
"@octokit/plugin-paginate-rest@^14.0.0":
|
||||
version "14.0.0"
|
||||
resolved "https://registry.yarnpkg.com/@octokit/plugin-paginate-rest/-/plugin-paginate-rest-14.0.0.tgz#44dc9fff2dacb148d4c5c788b573ddc044503026"
|
||||
integrity sha512-fNVRE7ufJiAA3XUrha2omTA39M6IXIc6GIZLvlbsm8QOQCYvpq/LkMNGyFlB1d8hTDzsAXa3OKtybdMAYsV/fw==
|
||||
dependencies:
|
||||
"@octokit/types" "^16.0.0"
|
||||
|
||||
"@octokit/plugin-request-log@^6.0.0":
|
||||
version "6.0.0"
|
||||
resolved "https://registry.yarnpkg.com/@octokit/plugin-request-log/-/plugin-request-log-6.0.0.tgz#de1c1e557df6c08adb631bf78264fa741e01b317"
|
||||
integrity sha512-UkOzeEN3W91/eBq9sPZNQ7sUBvYCqYbrrD8gTbBuGtHEuycE4/awMXcYvx6sVYo7LypPhmQwwpUe4Yyu4QZN5Q==
|
||||
|
||||
"@octokit/plugin-rest-endpoint-methods@^17.0.0":
|
||||
version "17.0.0"
|
||||
resolved "https://registry.yarnpkg.com/@octokit/plugin-rest-endpoint-methods/-/plugin-rest-endpoint-methods-17.0.0.tgz#8c54397d3a4060356a1c8a974191ebf945924105"
|
||||
integrity sha512-B5yCyIlOJFPqUUeiD0cnBJwWJO8lkJs5d8+ze9QDP6SvfiXSz1BF+91+0MeI1d2yxgOhU/O+CvtiZ9jSkHhFAw==
|
||||
dependencies:
|
||||
"@octokit/types" "^16.0.0"
|
||||
|
||||
"@octokit/request-error@^7.0.2":
|
||||
version "7.1.0"
|
||||
resolved "https://registry.yarnpkg.com/@octokit/request-error/-/request-error-7.1.0.tgz#440fa3cae310466889778f5a222b47a580743638"
|
||||
integrity sha512-KMQIfq5sOPpkQYajXHwnhjCC0slzCNScLHs9JafXc4RAJI+9f+jNDlBNaIMTvazOPLgb4BnlhGJOTbnN0wIjPw==
|
||||
dependencies:
|
||||
"@octokit/types" "^16.0.0"
|
||||
|
||||
"@octokit/request@^10.0.6":
|
||||
version "10.0.7"
|
||||
resolved "https://registry.yarnpkg.com/@octokit/request/-/request-10.0.7.tgz#93f619914c523750a85e7888de983e1009eb03f6"
|
||||
integrity sha512-v93h0i1yu4idj8qFPZwjehoJx4j3Ntn+JhXsdJrG9pYaX6j/XRz2RmasMUHtNgQD39nrv/VwTWSqK0RNXR8upA==
|
||||
dependencies:
|
||||
"@octokit/endpoint" "^11.0.2"
|
||||
"@octokit/request-error" "^7.0.2"
|
||||
"@octokit/types" "^16.0.0"
|
||||
fast-content-type-parse "^3.0.0"
|
||||
universal-user-agent "^7.0.2"
|
||||
|
||||
"@octokit/rest@^22.0.0":
|
||||
version "22.0.1"
|
||||
resolved "https://registry.yarnpkg.com/@octokit/rest/-/rest-22.0.1.tgz#4d866c32b76b711d3f736f91992e2b534163b416"
|
||||
integrity sha512-Jzbhzl3CEexhnivb1iQ0KJ7s5vvjMWcmRtq5aUsKmKDrRW6z3r84ngmiFKFvpZjpiU/9/S6ITPFRpn5s/3uQJw==
|
||||
dependencies:
|
||||
"@octokit/core" "^7.0.6"
|
||||
"@octokit/plugin-paginate-rest" "^14.0.0"
|
||||
"@octokit/plugin-request-log" "^6.0.0"
|
||||
"@octokit/plugin-rest-endpoint-methods" "^17.0.0"
|
||||
|
||||
"@octokit/types@^16.0.0":
|
||||
version "16.0.0"
|
||||
resolved "https://registry.yarnpkg.com/@octokit/types/-/types-16.0.0.tgz#fbd7fa590c2ef22af881b1d79758bfaa234dbb7c"
|
||||
integrity sha512-sKq+9r1Mm4efXW1FCk7hFSeJo4QKreL/tTbR0rz/qx/r1Oa2VV83LTA/H/MuCOX7uCIJmQVRKBcbmWoySjAnSg==
|
||||
dependencies:
|
||||
"@octokit/openapi-types" "^27.0.0"
|
||||
|
||||
"@tybys/wasm-util@^0.10.1":
|
||||
version "0.10.1"
|
||||
resolved "https://registry.yarnpkg.com/@tybys/wasm-util/-/wasm-util-0.10.1.tgz#ecddd3205cf1e2d5274649ff0eedd2991ed7f414"
|
||||
integrity sha512-9tTaPJLSiejZKx+Bmog4uSubteqTvFrVrURwkmHixBo0G4seD0zUxp98E1DzUBJxLQ3NPwXrGKDiVjwx/DpPsg==
|
||||
dependencies:
|
||||
tslib "^2.4.0"
|
||||
|
||||
ansi-regex@^5.0.1:
|
||||
version "5.0.1"
|
||||
resolved "https://registry.yarnpkg.com/ansi-regex/-/ansi-regex-5.0.1.tgz#082cb2c89c9fe8659a311a53bd6a4dc5301db304"
|
||||
integrity sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==
|
||||
|
||||
ansi-styles@^4.0.0:
|
||||
version "4.3.0"
|
||||
resolved "https://registry.yarnpkg.com/ansi-styles/-/ansi-styles-4.3.0.tgz#edd803628ae71c04c85ae7a0906edad34b648937"
|
||||
integrity sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==
|
||||
dependencies:
|
||||
color-convert "^2.0.1"
|
||||
|
||||
argparse@^2.0.1:
|
||||
version "2.0.1"
|
||||
resolved "https://registry.yarnpkg.com/argparse/-/argparse-2.0.1.tgz#246f50f3ca78a3240f6c997e8a9bd1eac49e4b38"
|
||||
integrity sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q==
|
||||
|
||||
before-after-hook@^4.0.0:
|
||||
version "4.0.0"
|
||||
resolved "https://registry.yarnpkg.com/before-after-hook/-/before-after-hook-4.0.0.tgz#cf1447ab9160df6a40f3621da64d6ffc36050cb9"
|
||||
integrity sha512-q6tR3RPqIB1pMiTRMFcZwuG5T8vwp+vUvEG0vuI6B+Rikh5BfPp2fQ82c925FOs+b0lcFQ8CFrL+KbilfZFhOQ==
|
||||
|
||||
chardet@^2.1.1:
|
||||
version "2.1.1"
|
||||
resolved "https://registry.yarnpkg.com/chardet/-/chardet-2.1.1.tgz#5c75593704a642f71ee53717df234031e65373c8"
|
||||
integrity sha512-PsezH1rqdV9VvyNhxxOW32/d75r01NY7TQCmOqomRo15ZSOKbpTFVsfjghxo6JloQUCGnH4k1LGu0R4yCLlWQQ==
|
||||
|
||||
cli-width@^4.1.0:
|
||||
version "4.1.0"
|
||||
resolved "https://registry.yarnpkg.com/cli-width/-/cli-width-4.1.0.tgz#42daac41d3c254ef38ad8ac037672130173691c5"
|
||||
integrity sha512-ouuZd4/dm2Sw5Gmqy6bGyNNNe1qt9RpmxveLSO7KcgsTnU7RXfsw+/bukWGo1abgBiMAic068rclZsO4IWmmxQ==
|
||||
|
||||
clipanion@^4.0.0-rc.4:
|
||||
version "4.0.0-rc.4"
|
||||
resolved "https://registry.yarnpkg.com/clipanion/-/clipanion-4.0.0-rc.4.tgz#7191a940e47ef197e5f18c9cbbe419278b5f5903"
|
||||
integrity sha512-CXkMQxU6s9GklO/1f714dkKBMu1lopS1WFF0B8o4AxPykR1hpozxSiUZ5ZUeBjfPgCWqbcNOtZVFhB8Lkfp1+Q==
|
||||
dependencies:
|
||||
typanion "^3.8.0"
|
||||
|
||||
color-convert@^2.0.1:
|
||||
version "2.0.1"
|
||||
resolved "https://registry.yarnpkg.com/color-convert/-/color-convert-2.0.1.tgz#72d3a68d598c9bdb3af2ad1e84f21d896abd4de3"
|
||||
integrity sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==
|
||||
dependencies:
|
||||
color-name "~1.1.4"
|
||||
|
||||
color-name@~1.1.4:
|
||||
version "1.1.4"
|
||||
resolved "https://registry.yarnpkg.com/color-name/-/color-name-1.1.4.tgz#c2a09a87acbde69543de6f63fa3995c826c536a2"
|
||||
integrity sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==
|
||||
|
||||
colorette@^2.0.20:
|
||||
version "2.0.20"
|
||||
resolved "https://registry.yarnpkg.com/colorette/-/colorette-2.0.20.tgz#9eb793e6833067f7235902fcd3b09917a000a95a"
|
||||
integrity sha512-IfEDxwoWIjkeXL1eXcDiow4UbKjhLdq6/EuSVR9GMN7KVH3r9gQ83e73hsz1Nd1T3ijd5xv1wcWRYO+D6kCI2w==
|
||||
|
||||
debug@^4.4.1:
|
||||
version "4.4.3"
|
||||
resolved "https://registry.yarnpkg.com/debug/-/debug-4.4.3.tgz#c6ae432d9bd9662582fce08709b038c58e9e3d6a"
|
||||
integrity sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA==
|
||||
dependencies:
|
||||
ms "^2.1.3"
|
||||
|
||||
emnapi@^1.5.0:
|
||||
version "1.7.1"
|
||||
resolved "https://registry.yarnpkg.com/emnapi/-/emnapi-1.7.1.tgz#5cbb09ca201c648417077f2d8825289c106461de"
|
||||
integrity sha512-wlLK2xFq+T+rCBlY6+lPlFVDEyE93b7hSn9dMrfWBIcPf4ArwUvymvvMnN9M5WWuiryYQe9M+UJrkqw4trdyRA==
|
||||
|
||||
emoji-regex@^8.0.0:
|
||||
version "8.0.0"
|
||||
resolved "https://registry.yarnpkg.com/emoji-regex/-/emoji-regex-8.0.0.tgz#e818fd69ce5ccfcb404594f842963bf53164cc37"
|
||||
integrity sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==
|
||||
|
||||
es-toolkit@^1.39.10:
|
||||
version "1.42.0"
|
||||
resolved "https://registry.yarnpkg.com/es-toolkit/-/es-toolkit-1.42.0.tgz#c9e87c7e2d4759ca26887814e6bc780cf4747fc5"
|
||||
integrity sha512-SLHIyY7VfDJBM8clz4+T2oquwTQxEzu263AyhVK4jREOAwJ+8eebaa4wM3nlvnAqhDrMm2EsA6hWHaQsMPQ1nA==
|
||||
|
||||
fast-content-type-parse@^3.0.0:
|
||||
version "3.0.0"
|
||||
resolved "https://registry.yarnpkg.com/fast-content-type-parse/-/fast-content-type-parse-3.0.0.tgz#5590b6c807cc598be125e6740a9fde589d2b7afb"
|
||||
integrity sha512-ZvLdcY8P+N8mGQJahJV5G4U88CSvT1rP8ApL6uETe88MBXrBHAkZlSEySdUlyztF7ccb+Znos3TFqaepHxdhBg==
|
||||
|
||||
iconv-lite@^0.7.0:
|
||||
version "0.7.0"
|
||||
resolved "https://registry.yarnpkg.com/iconv-lite/-/iconv-lite-0.7.0.tgz#c50cd80e6746ca8115eb98743afa81aa0e147a3e"
|
||||
integrity sha512-cf6L2Ds3h57VVmkZe+Pn+5APsT7FpqJtEhhieDCvrE2MK5Qk9MyffgQyuxQTm6BChfeZNtcOLHp9IcWRVcIcBQ==
|
||||
dependencies:
|
||||
safer-buffer ">= 2.1.2 < 3.0.0"
|
||||
|
||||
is-fullwidth-code-point@^3.0.0:
|
||||
version "3.0.0"
|
||||
resolved "https://registry.yarnpkg.com/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz#f116f8064fe90b3f7844a38997c0b75051269f1d"
|
||||
integrity sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==
|
||||
|
||||
js-yaml@^4.1.0:
|
||||
version "4.1.1"
|
||||
resolved "https://registry.yarnpkg.com/js-yaml/-/js-yaml-4.1.1.tgz#854c292467705b699476e1a2decc0c8a3458806b"
|
||||
integrity sha512-qQKT4zQxXl8lLwBtHMWwaTcGfFOZviOJet3Oy/xmGk2gZH677CJM9EvtfdSkgWcATZhj/55JZ0rmy3myCT5lsA==
|
||||
dependencies:
|
||||
argparse "^2.0.1"
|
||||
|
||||
ms@^2.1.3:
|
||||
version "2.1.3"
|
||||
resolved "https://registry.yarnpkg.com/ms/-/ms-2.1.3.tgz#574c8138ce1d2b5861f0b44579dbadd60c6615b2"
|
||||
integrity sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==
|
||||
|
||||
mute-stream@^2.0.0:
|
||||
version "2.0.0"
|
||||
resolved "https://registry.yarnpkg.com/mute-stream/-/mute-stream-2.0.0.tgz#a5446fc0c512b71c83c44d908d5c7b7b4c493b2b"
|
||||
integrity sha512-WWdIxpyjEn+FhQJQQv9aQAYlHoNVdzIzUySNV1gHUPDSdZJ3yZn7pAAbQcV7B56Mvu881q9FZV+0Vx2xC44VWA==
|
||||
|
||||
"safer-buffer@>= 2.1.2 < 3.0.0":
|
||||
version "2.1.2"
|
||||
resolved "https://registry.yarnpkg.com/safer-buffer/-/safer-buffer-2.1.2.tgz#44fa161b0187b9549dd84bb91802f9bd8385cd6a"
|
||||
integrity sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==
|
||||
|
||||
semver@^7.7.2:
|
||||
version "7.7.3"
|
||||
resolved "https://registry.yarnpkg.com/semver/-/semver-7.7.3.tgz#4b5f4143d007633a8dc671cd0a6ef9147b8bb946"
|
||||
integrity sha512-SdsKMrI9TdgjdweUSR9MweHA4EJ8YxHn8DFaDisvhVlUOe4BF1tLD7GAj0lIqWVl+dPb/rExr0Btby5loQm20Q==
|
||||
|
||||
signal-exit@^4.1.0:
|
||||
version "4.1.0"
|
||||
resolved "https://registry.yarnpkg.com/signal-exit/-/signal-exit-4.1.0.tgz#952188c1cbd546070e2dd20d0f41c0ae0530cb04"
|
||||
integrity sha512-bzyZ1e88w9O1iNJbKnOlvYTrWPDl46O1bG0D3XInv+9tkPrxrN8jUUTiFlDkkmKWgn1M6CfIA13SuGqOa9Korw==
|
||||
|
||||
string-width@^4.1.0:
|
||||
version "4.2.3"
|
||||
resolved "https://registry.yarnpkg.com/string-width/-/string-width-4.2.3.tgz#269c7117d27b05ad2e536830a8ec895ef9c6d010"
|
||||
integrity sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==
|
||||
dependencies:
|
||||
emoji-regex "^8.0.0"
|
||||
is-fullwidth-code-point "^3.0.0"
|
||||
strip-ansi "^6.0.1"
|
||||
|
||||
strip-ansi@^6.0.0, strip-ansi@^6.0.1:
|
||||
version "6.0.1"
|
||||
resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-6.0.1.tgz#9e26c63d30f53443e9489495b2105d37b67a85d9"
|
||||
integrity sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==
|
||||
dependencies:
|
||||
ansi-regex "^5.0.1"
|
||||
|
||||
tslib@^2.4.0:
|
||||
version "2.8.1"
|
||||
resolved "https://registry.yarnpkg.com/tslib/-/tslib-2.8.1.tgz#612efe4ed235d567e8aba5f2a5fab70280ade83f"
|
||||
integrity sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w==
|
||||
|
||||
typanion@^3.14.0, typanion@^3.8.0:
|
||||
version "3.14.0"
|
||||
resolved "https://registry.yarnpkg.com/typanion/-/typanion-3.14.0.tgz#a766a91810ce8258033975733e836c43a2929b94"
|
||||
integrity sha512-ZW/lVMRabETuYCd9O9ZvMhAh8GslSqaUjxmK/JLPCh6l73CvLBiuXswj/+7LdnWOgYsQ130FqLzFz5aGT4I3Ug==
|
||||
|
||||
universal-user-agent@^7.0.0, universal-user-agent@^7.0.2:
|
||||
version "7.0.3"
|
||||
resolved "https://registry.yarnpkg.com/universal-user-agent/-/universal-user-agent-7.0.3.tgz#c05870a58125a2dc00431f2df815a77fe69736be"
|
||||
integrity sha512-TmnEAEAsBJVZM/AADELsK76llnwcf9vMKuPz8JflO1frO8Lchitr0fNaN9d+Ap0BjKtqWqd/J17qeDnXh8CL2A==
|
||||
|
||||
wrap-ansi@^6.2.0:
|
||||
version "6.2.0"
|
||||
resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-6.2.0.tgz#e9393ba07102e6c91a3b221478f0257cd2856e53"
|
||||
integrity sha512-r6lPcBGxZXlIcymEu7InxDMhdW0KDxpLgoFLcguasxCaJ/SOIZwINatK9KY/tf+ZrlywOKU0UDj3ATXUBfxJXA==
|
||||
dependencies:
|
||||
ansi-styles "^4.0.0"
|
||||
string-width "^4.1.0"
|
||||
strip-ansi "^6.0.0"
|
||||
|
||||
yoctocolors-cjs@^2.1.3:
|
||||
version "2.1.3"
|
||||
resolved "https://registry.yarnpkg.com/yoctocolors-cjs/-/yoctocolors-cjs-2.1.3.tgz#7e4964ea8ec422b7a40ac917d3a344cfd2304baa"
|
||||
integrity sha512-U/PBtDf35ff0D8X8D0jfdzHYEPFxAI7jJlxZXwCSez5M3190m+QobIfh+sWDWSHMCWWJN2AWamkegn6vr6YBTw==
|
||||