Merge pull request #6190 from influxdata/jts-qol-influxdb3-release-helper-scripts
Jts qol influxdb3 release helper scripts (pull/6198/head)
commit d7151cea85

@@ -0,0 +1,192 @@
# enhance-release-notes

Analyze GitHub PRs referenced in release notes and enhance descriptions following Google Developer Documentation style.

## Overview

This command improves release note descriptions by:

1. Fetching PR data from the GitHub API
2. Analyzing code changes and PR content
3. Generating clear, action-oriented descriptions
4. Following Google Developer Documentation principles
5. Creating a descriptive commit message

## Usage

```
enhance-release-notes <release-notes-file> [--dry-run]
```

## Process

### 1. Extract PR References

- Scan the release notes file for GitHub PR links
- Extract PR numbers and repository information
- Example pattern: `([#26574](https://github.com/influxdata/influxdb/pull/26574))`
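
The extraction step can be sketched as a small regex pass over the release notes. This is a minimal illustration, not the tool's actual implementation; the function name `extractPRReferences` and the exact pattern are assumptions.

```javascript
// Match links like ([#26574](https://github.com/influxdata/influxdb/pull/26574)).
// Hypothetical sketch: the real tool may use a different pattern.
const PR_LINK = /\[#(\d+)\]\((https:\/\/github\.com\/([\w.-]+)\/([\w.-]+)\/pull\/\d+)\)/g;

function extractPRReferences(markdown) {
  const refs = [];
  for (const m of markdown.matchAll(PR_LINK)) {
    refs.push({ number: Number(m[1]), url: m[2], owner: m[3], repo: m[4] });
  }
  return refs;
}
```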

### 2. Fetch PR Data

For each PR, collect:

- PR title and description
- Files modified (to determine component scope)
- Labels and metadata
- Code change statistics
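
Each item above maps onto a GitHub REST endpoint. A sketch of how the endpoint URLs could be built (the helper name `prEndpoints` is hypothetical; the default base URL would be swapped for a GitHub Enterprise host):

```javascript
// GitHub REST API endpoints supplying the PR data listed above.
// GET /repos/{owner}/{repo}/pulls/{number} returns title, body, labels,
// and change statistics; GET .../files lists modified file paths.
function prEndpoints({ owner, repo, number }, apiBase = 'https://api.github.com') {
  const pr = `${apiBase}/repos/${owner}/${repo}/pulls/${number}`;
  return {
    details: pr,          // title, description, labels, additions/deletions
    files: `${pr}/files`, // modified paths, used for component detection
  };
}
```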

### 3. Analyze and Categorize

**Component Detection** (based on file paths):

- `src/database/`, `catalog/`, `schema/` → Database operations
- `cmd/`, `cli/` → CLI commands
- `api/`, `http/` → HTTP API
- `src/query/`, `sql/` → Query engine
- `src/auth/`, `token/` → Authentication
- `storage/`, `parquet/`, `wal/` → Storage engine
- `license/` → License management
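
The mapping above can be implemented as an ordered rule table where the first matching path prefix wins. A sketch (function and table names are illustrative):

```javascript
// Ordered path-substring rules mirroring the table above; first match wins.
const COMPONENT_RULES = [
  [['src/database/', 'catalog/', 'schema/'], 'Database operations'],
  [['cmd/', 'cli/'], 'CLI commands'],
  [['api/', 'http/'], 'HTTP API'],
  [['src/query/', 'sql/'], 'Query engine'],
  [['src/auth/', 'token/'], 'Authentication'],
  [['storage/', 'parquet/', 'wal/'], 'Storage engine'],
  [['license/'], 'License management'],
];

function detectComponent(filePaths) {
  for (const [prefixes, component] of COMPONENT_RULES) {
    if (filePaths.some((p) => prefixes.some((pre) => p.includes(pre)))) {
      return component;
    }
  }
  return 'Other';
}
```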

**Change Type Detection**:

- `feat:` or "add", "new" → Feature
- `fix:` or "resolve", "correct" → Bug fix
- `perf:` or "optim", "faster" → Performance improvement
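
A sketch of the keyword heuristic, checking conventional-commit prefixes before looser keywords so that a title like "fix: add guard" classifies as a bug fix (the function name and exact precedence are assumptions):

```javascript
// Classify a PR title using the keywords listed above.
// Conventional-commit prefixes take priority over loose keyword matches.
function detectChangeType(title) {
  const t = title.toLowerCase();
  if (t.startsWith('feat:')) return 'Feature';
  if (t.startsWith('fix:')) return 'Bug fix';
  if (t.startsWith('perf:')) return 'Performance improvement';
  if (/(optim|faster)/.test(t)) return 'Performance improvement';
  if (/\b(resolve|correct)/.test(t)) return 'Bug fix';
  if (/\b(add|new)\b/.test(t)) return 'Feature';
  return 'Other';
}
```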

### 4. Generate Google Developer Documentation Style Descriptions

**Principles**:

- Clear, concise, action-oriented language
- Focus on what developers can do
- Avoid marketing speak ("enhanced", "improved", "better")
- Use specific, concrete benefits
- Start with action verbs when possible

**Templates**:

**Database Operations**:

- `hard.*delet.*date` → "Set custom hard deletion dates for deleted databases and tables"
- `retention.*period` → "Configure automatic data expiration for databases"
- `schema.*updat` → "Modify database schema after creation"

**CLI Commands**:

- `help.*text` → "Access help documentation for commands"
- `show.*license` → "View license details including expiration and limits"
- `object.*store.*required` → "Specify object store configuration when starting the server"

**HTTP API**:

- `v1.*query.*endpoint.*ns` → "Use nanosecond precision by default in V1 API CSV responses"
- `trigger.*request_path` → "Configure processing engine triggers with request paths"

**Query Engine**:

- `csv.*precision` → "Get consistent timestamp precision in CSV output"
- `query.*performance` → "Execute queries without performance degradation"

**Authentication**:

- `token.*creation` → "Generate tokens with additional configuration options"
- `admin.*token.*expiration` → "Set expiration dates for admin tokens"

**Storage Engine**:

- `aws.*credential.*reload` → "Automatically refresh AWS credentials from files"
- `wal.*replay.*concurrency` → "Control memory usage during database startup"
- `corrupt.*wal.*recovery` → "Recover from corrupted write-ahead log files"

**Fallback Patterns**:

- Features: "Use [functionality] to [specific action]"
- Bug fixes: "Avoid [specific problem] when [specific action]"
- Performance: "Execute [operation] without [specific issue]"
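
The template tables above amount to a first-match lookup from a case-insensitive pattern to a replacement description. A sketch with a few of the entries (the function name is illustrative; the full tool would carry the complete tables and the fallback patterns):

```javascript
// A subset of the template tables above: pattern → canned description.
// First matching pattern wins; unmatched titles fall back to the caller's default.
const TEMPLATES = [
  [/hard.*delet.*date/i, 'Set custom hard deletion dates for deleted databases and tables'],
  [/show.*license/i, 'View license details including expiration and limits'],
  [/wal.*replay.*concurrency/i, 'Control memory usage during database startup'],
];

function describeFromTemplates(prTitle, fallback) {
  for (const [pattern, description] of TEMPLATES) {
    if (pattern.test(prTitle)) return description;
  }
  return fallback; // e.g. a description built from the fallback patterns above
}
```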

### 5. Enhancement Format

Transform:

```markdown
- **Database management**: Allow hard_deleted date of deleted schema to be updated ([#26574](https://github.com/influxdata/influxdb/pull/26574))
```

Into:

```markdown
- **Database operations**: Set custom hard deletion dates for deleted databases and tables ([#26574](https://github.com/influxdata/influxdb/pull/26574))
```

### 6. Output Processing

**Dry Run Mode**:

- Show before/after comparison
- List all proposed changes
- Don't modify the file

**Apply Mode**:

- Replace descriptions in the original file
- Preserve all formatting and PR links
- Log successful enhancements

### 7. Create Descriptive Commit Message

After enhancing the release notes, generate a commit message:

**Format**:

```
docs: enhance release notes with specific user benefits

- Transform generic descriptions into action-oriented language
- Add specific benefits following Google Developer Documentation style
- Focus on what developers can do with each change
- Enhanced [X] descriptions across [Y] components

Enhanced components: [list of components modified]
```

**Example**:

```
docs: enhance v3.2.1 release notes with specific user benefits

- Transform generic descriptions into action-oriented language
- Add specific benefits following Google Developer Documentation style
- Focus on what developers can do with each change
- Enhanced 8 descriptions across database, CLI, and API components

Enhanced components: Database operations, CLI commands, HTTP API
```

## Error Handling

- **Missing GitHub token**: Warn about rate limits, continue with the public API
- **Private repos**: Skip PRs that can't be accessed
- **Invalid PR URLs**: Log the error and skip the enhancement
- **API rate limits**: Implement exponential backoff
- **Network issues**: Retry, then fall back to the original description
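
The backoff-and-fallback behavior can be sketched as follows. The base delay, cap, and retry count are assumptions, not prescribed values:

```javascript
// Exponential backoff with a cap, for handling API rate limits.
function backoffDelay(attempt, baseMs = 1000, maxMs = 60000) {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

// Retry a fetch-like operation; on repeated failure, return the
// caller-supplied fallback (e.g. the original description).
async function withRetries(fn, fallback, retries = 3) {
  for (let attempt = 0; attempt < retries; attempt++) {
    try {
      return await fn();
    } catch {
      if (attempt < retries - 1) {
        await new Promise((resolve) => setTimeout(resolve, backoffDelay(attempt)));
      }
    }
  }
  return fallback;
}
```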

## Configuration

**Environment Variables**:

- `GITHUB_TOKEN`: Personal access token for GitHub API access

**GitHub Enterprise Support**:

- Detect GitHub Enterprise URLs in PR links
- Use the appropriate API base URL

## Implementation Notes

1. **Rate Limiting**: Respect GitHub API rate limits (5,000 requests/hour authenticated, 60/hour unauthenticated)
2. **Caching**: Consider caching PR data to avoid repeated API calls during development
3. **Validation**: Verify PR URLs match the expected format before making API calls
4. **Preservation**: Maintain all existing formatting, spacing, and non-PR content
5. **Atomic Updates**: Only modify the file if all enhancements succeed (or provide partial success options)

## Example Usage

```bash
# Dry run to see proposed changes
enhance-release-notes release-notes-v3.2.1.md --dry-run

# Apply enhancements
enhance-release-notes release-notes-v3.2.1.md

# With verbose output
enhance-release-notes release-notes-v3.2.1.md --verbose
```

## Success Criteria

1. All PR descriptions follow Google Developer Documentation style
2. Descriptions focus on specific developer actions and benefits
3. No marketing language or vague improvements
4. Component categories are accurate based on code changes
5. Original formatting and PR links are preserved
6. The commit message clearly describes the enhancement approach

@@ -0,0 +1,16 @@
Please analyze and fix the GitHub issue: $ARGUMENTS.

Follow these steps:

1. Use `gh issue view` to get the issue details
2. Understand the problem described in the issue
3. Search the codebase for relevant files, using your knowledge of the project structure and the issue description
4. Implement the necessary changes to fix the issue
5. Write and run tests (store them in the `tests/` directory) to verify the fix
6. Create a descriptive commit message
7. Ensure the code passes linting and type checking
8. Push
9. Ensure the code passes pre-push tests
10. Create a PR

Remember to use the GitHub CLI (`gh`) for all GitHub-related tasks.

@@ -73,6 +73,12 @@ Help document InfluxData products by creating clear, accurate technical content
- **Repository**: https://github.com/influxdata/docs-v2
- **Framework**: Hugo static site generator

## Abbreviations and shortcuts

- `gdd`: Google Developer Documentation style
- `3core`: InfluxDB 3 Core
- `3ent`: InfluxDB 3 Enterprise

## Style guidelines

- Follow Google Developer Documentation style guidelines

@@ -0,0 +1,429 @@
name: Audit Documentation

on:
  workflow_dispatch:
    inputs:
      version:
        description: 'Version to audit (must exist in git tags, e.g., v3.0.0 or "local" for dev containers)'
        required: false
        default: 'local'
      create_issue:
        description: 'Create GitHub issue with audit results'
        required: false
        type: boolean
        default: false

jobs:
  cli-3-core:
    name: Audit InfluxDB 3 Core CLI
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: 'yarn'

      - name: Install dependencies
        run: yarn install --frozen-lockfile

      - name: Set up Docker
        if: github.event.inputs.version == 'local' || github.event_name == 'schedule'
        run: |
          docker compose up -d influxdb3-core
          sleep 10  # Wait for containers to be ready

      - name: Run Core CLI audit
        run: |
          VERSION="${{ github.event.inputs.version || 'local' }}"
          node ./helper-scripts/influxdb3-monolith/audit-cli-documentation.js core "$VERSION"

      - name: Upload audit reports
        uses: actions/upload-artifact@v4
        with:
          name: cli-audit-3-core-${{ github.event.inputs.version || 'local' }}
          path: helper-scripts/output/cli-audit/
          retention-days: 30

  cli-3-enterprise:
    name: Audit InfluxDB 3 Enterprise CLI
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: 'yarn'

      - name: Install dependencies
        run: yarn install --frozen-lockfile

      - name: Set up Docker
        if: github.event.inputs.version == 'local' || github.event_name == 'schedule'
        run: |
          docker compose up -d influxdb3-enterprise
          sleep 10  # Wait for containers to be ready

      - name: Run Enterprise CLI audit
        run: |
          VERSION="${{ github.event.inputs.version || 'local' }}"
          node ./helper-scripts/influxdb3-monolith/audit-cli-documentation.js enterprise "$VERSION"

      - name: Upload audit reports
        uses: actions/upload-artifact@v4
        with:
          name: cli-audit-3-enterprise-${{ github.event.inputs.version || 'local' }}
          path: helper-scripts/output/cli-audit/
          retention-days: 30

  cli-3-influxctl:
    name: Audit InfluxDB 3 influxctl CLI
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: 'yarn'

      - name: Install dependencies
        run: yarn install --frozen-lockfile

      - name: Run influxctl CLI audit
        run: |
          VERSION="${{ github.event.inputs.version || 'local' }}"
          echo "influxctl CLI audit not yet implemented"
          # TODO: Implement influxctl CLI audit
          # node ./helper-scripts/influxdb3-distributed/audit-influxctl-cli.js $VERSION

          # Create placeholder report (unquoted EOF so $VERSION and $(date) expand)
          mkdir -p helper-scripts/output/cli-audit
          cat > "helper-scripts/output/cli-audit/influxctl-audit-$VERSION.md" << EOF
          # influxctl CLI Audit Report

          **CLI:** influxctl
          **Version:** $VERSION
          **Date:** $(date)
          **Status:** Placeholder - audit not yet implemented

          ## TODO
          - Implement influxctl CLI help extraction
          - Compare against clustered and cloud-dedicated documentation
          - Generate patches for missing documentation
          EOF

      - name: Upload audit reports
        uses: actions/upload-artifact@v4
        with:
          name: cli-audit-3-influxctl-${{ github.event.inputs.version || 'local' }}
          path: helper-scripts/output/cli-audit/
          retention-days: 30

  api-3-core:
    name: Audit InfluxDB 3 Core API
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Run Core API audit
        run: |
          VERSION="${{ github.event.inputs.version || 'local' }}"
          echo "Core API audit not yet implemented"
          # TODO: Implement Core API audit
          # node ./helper-scripts/influxdb3-monolith/audit-api-documentation.js core $VERSION

          # Create placeholder report (unquoted EOF so $VERSION and $(date) expand)
          mkdir -p helper-scripts/output/api-audit
          cat > "helper-scripts/output/api-audit/core-api-audit-$VERSION.md" << EOF
          # InfluxDB 3 Core API Audit Report

          **API:** InfluxDB 3 Core
          **Version:** $VERSION
          **Date:** $(date)
          **Status:** Placeholder - audit not yet implemented

          ## TODO
          - Implement API endpoint discovery
          - Compare against OpenAPI specs
          - Validate documentation examples
          EOF

      - name: Upload audit reports
        uses: actions/upload-artifact@v4
        with:
          name: api-audit-3-core-${{ github.event.inputs.version || 'local' }}
          path: helper-scripts/output/api-audit/
          retention-days: 30

  api-3-enterprise:
    name: Audit InfluxDB 3 Enterprise API
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Run Enterprise API audit
        run: |
          VERSION="${{ github.event.inputs.version || 'local' }}"
          echo "Enterprise API audit not yet implemented"
          # TODO: Implement Enterprise API audit
          # node ./helper-scripts/influxdb3-monolith/audit-api-documentation.js enterprise $VERSION

          # Create placeholder report (unquoted EOF so $VERSION and $(date) expand)
          mkdir -p helper-scripts/output/api-audit
          cat > "helper-scripts/output/api-audit/enterprise-api-audit-$VERSION.md" << EOF
          # InfluxDB 3 Enterprise API Audit Report

          **API:** InfluxDB 3 Enterprise
          **Version:** $VERSION
          **Date:** $(date)
          **Status:** Placeholder - audit not yet implemented

          ## TODO
          - Implement API endpoint discovery
          - Compare against OpenAPI specs
          - Validate documentation examples
          - Check enterprise-specific endpoints
          EOF

      - name: Upload audit reports
        uses: actions/upload-artifact@v4
        with:
          name: api-audit-3-enterprise-${{ github.event.inputs.version || 'local' }}
          path: helper-scripts/output/api-audit/
          retention-days: 30

  api-3-cloud-dedicated:
    name: Audit InfluxDB 3 Cloud Dedicated API
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Run Cloud Dedicated API audit
        run: |
          VERSION="${{ github.event.inputs.version || 'local' }}"
          echo "Cloud Dedicated API audit not yet implemented"
          # TODO: Implement Cloud Dedicated API audit
          # node ./helper-scripts/influxdb3-distributed/audit-api-documentation.js cloud-dedicated $VERSION

          # Create placeholder report (unquoted EOF so $VERSION and $(date) expand)
          mkdir -p helper-scripts/output/api-audit
          cat > "helper-scripts/output/api-audit/cloud-dedicated-api-audit-$VERSION.md" << EOF
          # InfluxDB 3 Cloud Dedicated API Audit Report

          **API:** InfluxDB 3 Cloud Dedicated
          **Version:** $VERSION
          **Date:** $(date)
          **Status:** Placeholder - audit not yet implemented

          ## TODO
          - Implement management API audit
          - Implement data API audit
          - Compare against OpenAPI specs
          - Validate documentation examples
          EOF

      - name: Upload audit reports
        uses: actions/upload-artifact@v4
        with:
          name: api-audit-3-cloud-dedicated-${{ github.event.inputs.version || 'local' }}
          path: helper-scripts/output/api-audit/
          retention-days: 30

  api-3-clustered:
    name: Audit InfluxDB 3 Clustered API
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Run Clustered API audit
        run: |
          VERSION="${{ github.event.inputs.version || 'local' }}"
          echo "Clustered API audit not yet implemented"
          # TODO: Implement Clustered API audit
          # node ./helper-scripts/influxdb3-distributed/audit-api-documentation.js clustered $VERSION

          # Create placeholder report (unquoted EOF so $VERSION and $(date) expand)
          mkdir -p helper-scripts/output/api-audit
          cat > "helper-scripts/output/api-audit/clustered-api-audit-$VERSION.md" << EOF
          # InfluxDB 3 Clustered API Audit Report

          **API:** InfluxDB 3 Clustered
          **Version:** $VERSION
          **Date:** $(date)
          **Status:** Placeholder - audit not yet implemented

          ## TODO
          - Implement management API audit
          - Implement data API audit
          - Compare against OpenAPI specs
          - Validate documentation examples
          EOF

      - name: Upload audit reports
        uses: actions/upload-artifact@v4
        with:
          name: api-audit-3-clustered-${{ github.event.inputs.version || 'local' }}
          path: helper-scripts/output/api-audit/
          retention-days: 30

  api-3-cloud-serverless:
    name: Audit InfluxDB 3 Cloud Serverless API
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Run Cloud Serverless API audit
        run: |
          VERSION="${{ github.event.inputs.version || 'local' }}"
          echo "Cloud Serverless API audit not yet implemented"
          # TODO: Implement Cloud Serverless API audit
          # node ./helper-scripts/influxdb3-distributed/audit-api-documentation.js cloud-serverless $VERSION

          # Create placeholder report (unquoted EOF so $VERSION and $(date) expand)
          mkdir -p helper-scripts/output/api-audit
          cat > "helper-scripts/output/api-audit/cloud-serverless-api-audit-$VERSION.md" << EOF
          # InfluxDB 3 Cloud Serverless API Audit Report

          **API:** InfluxDB 3 Cloud Serverless
          **Version:** $VERSION
          **Date:** $(date)
          **Status:** Placeholder - audit not yet implemented

          ## TODO
          - Implement management API audit
          - Implement data API audit
          - Compare against OpenAPI specs
          - Validate documentation examples
          EOF

      - name: Upload audit reports
        uses: actions/upload-artifact@v4
        with:
          name: api-audit-3-cloud-serverless-${{ github.event.inputs.version || 'local' }}
          path: helper-scripts/output/api-audit/
          retention-days: 30

  create-audit-issues:
    name: Create Issues from Audit Results
    runs-on: ubuntu-latest
    needs: [
      cli-3-core,
      cli-3-enterprise,
      cli-3-influxctl,
      api-3-core,
      api-3-enterprise,
      api-3-cloud-dedicated,
      api-3-clustered,
      api-3-cloud-serverless
    ]
    if: always() && (github.event_name == 'schedule' || github.event.inputs.create_issue == 'true')

    steps:
      - uses: actions/checkout@v4

      - name: Download all audit reports
        uses: actions/download-artifact@v4
        with:
          path: audit-reports/

      - name: Create issues from audit results
        uses: actions/github-script@v7
        with:
          script: |
            const fs = require('fs');
            const path = require('path');

            // Find all audit report directories
            const reportDirs = fs.readdirSync('audit-reports');

            for (const reportDir of reportDirs) {
              const reportPath = path.join('audit-reports', reportDir);
              const files = fs.readdirSync(reportPath);

              for (const file of files) {
                if (file.endsWith('.md')) {
                  const content = fs.readFileSync(path.join(reportPath, file), 'utf8');

                  // Only create issues if there are actual problems (not placeholders)
                  const hasIssues = content.includes('⚠️ Missing from docs') ||
                    content.includes('ℹ️ Documented but not in CLI') ||
                    content.includes('API endpoint mismatch');

                  if (hasIssues) {
                    const auditType = reportDir.replace(/-(local|\d+\.\d+\.\d+)$/, '');

                    await github.rest.issues.create({
                      owner: context.repo.owner,
                      repo: context.repo.repo,
                      title: `Documentation Audit Issues - ${auditType}`,
                      body: `## Audit Results\n\n${content}`,
                      labels: ['documentation', 'audit', auditType.includes('cli') ? 'cli-audit' : 'api-audit']
                    });

                    console.log(`Created issue for ${auditType}`);
                  }
                }
              }
            }

  audit-summary:
    name: Generate Summary Report
    runs-on: ubuntu-latest
    needs: [
      cli-3-core,
      cli-3-enterprise,
      cli-3-influxctl,
      api-3-core,
      api-3-enterprise,
      api-3-cloud-dedicated,
      api-3-clustered,
      api-3-cloud-serverless
    ]
    if: always()

    steps:
      - uses: actions/checkout@v4

      - name: Download all artifacts
        uses: actions/download-artifact@v4
        with:
          path: audit-artifacts/

      - name: Generate summary
        run: |
          echo "# Documentation Audit Summary" > summary.md
          echo "Date: $(date)" >> summary.md
          echo "Version: ${{ github.event.inputs.version || 'local' }}" >> summary.md
          echo "" >> summary.md

          # Add results from each audit type
          for dir in audit-artifacts/*/; do
            if [ -d "$dir" ]; then
              echo "## $(basename "$dir")" >> summary.md
              # Loop instead of `[ -f "$dir"/*.md ]`, which breaks when a
              # directory contains more than one report.
              for report in "$dir"*.md; do
                [ -f "$report" ] && cat "$report" >> summary.md
              done
              echo "" >> summary.md
            fi
          done

      - name: Upload summary
        uses: actions/upload-artifact@v4
        with:
          name: audit-summary
          path: summary.md
          retention-days: 30
@ -0,0 +1,503 @@
|
|||
name: InfluxDB 3 Release Documentation
|
||||
|
||||
on:
|
||||
workflow_dispatch:
|
||||
inputs:
|
||||
product:
|
||||
description: 'Product being released'
|
||||
required: true
|
||||
type: choice
|
||||
options:
|
||||
- core
|
||||
- enterprise
|
||||
- clustered
|
||||
- cloud-dedicated
|
||||
- cloud-serverless
|
||||
version:
|
||||
description: 'Release tag name (must exist in git tags, e.g., v3.0.0 or "local" for dev)'
|
||||
required: true
|
||||
type: string
|
||||
previous_version:
|
||||
description: 'Previous release tag name (must exist in git tags, e.g., v2.9.0)'
|
||||
required: true
|
||||
type: string
|
||||
dry_run:
|
||||
description: 'Dry run (do not create PRs or issues)'
|
||||
required: false
|
||||
type: boolean
|
||||
default: true
|
||||
|
||||
jobs:
|
||||
generate-release-notes-core-enterprise:
|
||||
name: Generate Release Notes (Core/Enterprise)
|
||||
runs-on: ubuntu-latest
|
||||
if: contains(fromJSON('["core", "enterprise"]'), github.event.inputs.product)
|
||||
outputs:
|
||||
generated: ${{ steps.generate.outputs.generated }}
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
|
||||
- name: Set up Node.js
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: '18'
|
||||
cache: 'yarn'
|
||||
|
||||
- name: Install dependencies
|
||||
run: yarn install --frozen-lockfile
|
||||
|
||||
- name: Generate release notes
|
||||
id: generate
|
||||
run: |
|
||||
echo "Generating Core/Enterprise release notes for ${{ github.event.inputs.product }} v${{ github.event.inputs.version }}"
|
||||
|
||||
# Create output directory
|
||||
mkdir -p helper-scripts/output/release-notes
|
||||
|
||||
# Note: This generates placeholder release notes since the actual repositories
|
||||
# (influxdb and influxdb_pro) are not available in the GitHub Actions environment.
|
||||
# To generate actual release notes, the script would need to be run locally with:
|
||||
# node ./helper-scripts/common/generate-release-notes.js \
|
||||
# --config ./helper-scripts/common/config/influxdb3-core-enterprise.json \
|
||||
# ${{ github.event.inputs.previous_version }} \
|
||||
# ${{ github.event.inputs.version }}
|
||||
|
||||
# Create structured placeholder that matches the expected format
|
||||
cat > helper-scripts/output/release-notes/release-notes-${{ github.event.inputs.product }}-${{ github.event.inputs.version }}.md << EOF
|
||||
> [!Note]
|
||||
> #### InfluxDB 3 Core and Enterprise relationship
|
||||
>
|
||||
> InfluxDB 3 Enterprise is a superset of InfluxDB 3 Core.
|
||||
> All updates to Core are automatically included in Enterprise.
|
||||
> The Enterprise sections below only list updates exclusive to Enterprise.
|
||||
|
||||
## ${{ github.event.inputs.version }} {date="$(date +'%Y-%m-%d')"}
|
||||
|
||||
### Core
|
||||
|
||||
#### Features
|
||||
|
||||
- TODO: Add Core features for ${{ github.event.inputs.product }} ${{ github.event.inputs.version }}
|
||||
|
||||
#### Bug Fixes
|
||||
|
||||
- TODO: Add Core bug fixes for ${{ github.event.inputs.product }} ${{ github.event.inputs.version }}
|
||||
|
||||
### Enterprise
|
||||
|
||||
All Core updates are included in Enterprise. Additional Enterprise-specific features and fixes:
|
||||
|
||||
#### Features
|
||||
|
||||
- TODO: Add Enterprise-specific features for ${{ github.event.inputs.version }}
|
||||
|
||||
#### Bug Fixes
|
||||
|
||||
- TODO: Add Enterprise-specific bug fixes for ${{ github.event.inputs.version }}
|
||||
EOF
|
||||
|
||||
echo "generated=true" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Upload release notes
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: release-notes-${{ github.event.inputs.product }}-${{ github.event.inputs.version }}
|
||||
path: helper-scripts/output/release-notes/
|
||||
retention-days: 30
|
||||
|
||||
# generate-release-notes-distributed:
|
||||
# name: Generate Release Notes (Distributed)
|
||||
# runs-on: ubuntu-latest
|
||||
# if: contains(fromJSON('["clustered", "cloud-dedicated", "cloud-serverless"]'), github.event.inputs.product)
|
||||
# outputs:
|
||||
# generated: ${{ steps.generate.outputs.generated }}
|
||||
|
||||
# steps:
|
||||
# - uses: actions/checkout@v4
|
||||
|
||||
# - name: Set up Node.js
|
||||
# uses: actions/setup-node@v4
|
||||
# with:
|
||||
# node-version: '18'
|
||||
# cache: 'yarn'
|
||||
|
||||
# - name: Install dependencies
|
||||
# run: yarn install --frozen-lockfile
|
||||
|
||||
# - name: Generate release notes
|
||||
# id: generate
|
||||
# run: |
|
||||
# echo "Generating distributed product release notes for ${{ github.event.inputs.product }} v${{ github.event.inputs.version }}"
|
||||
|
||||
# # Create output directory
|
||||
# mkdir -p helper-scripts/output/release-notes
|
||||
|
||||
# # Note: This generates placeholder release notes since the actual repositories
|
||||
# # for distributed products are not available in the GitHub Actions environment.
|
||||
# # To generate actual release notes, the script would need to be run locally with:
|
||||
# # node ./helper-scripts/common/generate-release-notes.js \
|
||||
# # --config ./helper-scripts/common/config/influxdb3-clustered.json \
|
||||
# # ${{ github.event.inputs.previous_version }} \
|
||||
# # ${{ github.event.inputs.version }}
|
||||
|
||||
# # Create structured placeholder for distributed products
|
||||
# cat > helper-scripts/output/release-notes/release-notes-${{ github.event.inputs.product }}-${{ github.event.inputs.version }}.md << EOF
|
||||
# ## ${{ github.event.inputs.version }} {date="$(date +'%Y-%m-%d')"}
|
||||
|
||||
# ### Features
|
||||
|
||||
# - TODO: Add features for ${{ github.event.inputs.product }} ${{ github.event.inputs.version }}
|
||||
|
||||
# ### Bug Fixes
|
||||
|
||||
# - TODO: Add bug fixes for ${{ github.event.inputs.product }} ${{ github.event.inputs.version }}
|
||||
|
||||
# ### Performance Improvements
|
||||
|
||||
# - TODO: Add performance improvements for ${{ github.event.inputs.product }} ${{ github.event.inputs.version }}
|
||||
# EOF
|
||||
|
||||
# echo "generated=true" >> $GITHUB_OUTPUT
|
||||
|
||||
# - name: Upload release notes
|
||||
# uses: actions/upload-artifact@v4
|
||||
# with:
|
||||
# name: release-notes-${{ github.event.inputs.product }}-${{ github.event.inputs.version }}
|
||||
# path: helper-scripts/output/release-notes/
|
||||
# retention-days: 30
|
||||
|
||||
audit-cli-documentation:
|
||||
name: Audit CLI Documentation
|
||||
needs: generate-release-notes-core-enterprise
|
||||
runs-on: ubuntu-latest
|
||||
if: needs.generate-release-notes-core-enterprise.outputs.generated == 'true' && contains(fromJSON('["core", "enterprise"]'), github.event.inputs.product)
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
|
||||
- name: Set up Node.js
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: '18'
|
||||
cache: 'yarn'
|
||||
|
||||
- name: Install dependencies
|
||||
run: yarn install --frozen-lockfile
|
||||
|
||||
- name: Pull Docker images for version
|
||||
run: |
|
||||
VERSION="${{ github.event.inputs.version }}"
|
||||
PRODUCT="${{ github.event.inputs.product }}"
|
||||
|
||||
if [ "$PRODUCT" == "both" ]; then
|
||||
docker pull influxdb:${VERSION}-core || true
|
||||
docker pull influxdb:${VERSION}-enterprise || true
|
||||
else
|
||||
docker pull influxdb:${VERSION}-${PRODUCT} || true
|
||||
fi
|
||||
|
||||
- name: Run CLI audit
|
||||
run: |
|
||||
PRODUCT="${{ github.event.inputs.product }}"
|
||||
VERSION="${{ github.event.inputs.version }}"
|
||||
|
||||
node ./helper-scripts/influxdb3-monolith/audit-cli-documentation.js $PRODUCT $VERSION
|
||||
|
||||
- name: Upload CLI audit reports
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: cli-audit-release-${{ github.event.inputs.product }}-${{ github.event.inputs.version }}
|
||||
path: helper-scripts/output/cli-audit/
|
||||
retention-days: 90
|
||||
|
  # audit-distributed-documentation:
  #   name: Audit Distributed Products Documentation
  #   needs: generate-release-notes-distributed
  #   runs-on: ubuntu-latest
  #   if: needs.generate-release-notes-distributed.outputs.generated == 'true' && contains(fromJSON('["clustered", "cloud-dedicated", "cloud-serverless"]'), github.event.inputs.product)

  #   steps:
  #     - uses: actions/checkout@v4

  #     - name: Set up Node.js
  #       uses: actions/setup-node@v4
  #       with:
  #         node-version: '18'
  #         cache: 'yarn'

  #     - name: Install dependencies
  #       run: yarn install --frozen-lockfile

  #     - name: Run distributed products audit
  #       run: |
  #         PRODUCT="${{ github.event.inputs.product }}"
  #         VERSION="${{ github.event.inputs.version }}"

  #         echo "Auditing distributed product: $PRODUCT v$VERSION"
  #         # TODO: Implement distributed products audit for release
  #         # This would audit API docs, deployment guides, configuration references
  #         # node ./helper-scripts/influxdb3-distributed/audit-documentation.js $PRODUCT $VERSION

  #         # For now, create placeholder report
  #         mkdir -p helper-scripts/output/distributed-audit
  #         cat > helper-scripts/output/distributed-audit/release-audit-$PRODUCT-$VERSION.md << 'EOF'
  #         # Release Audit Report - Distributed Products

  #         **Product:** $PRODUCT
  #         **Version:** $VERSION
  #         **Date:** $(date)
  #         **Status:** Placeholder - audit not yet implemented

  #         ## Areas to Audit
  #         - API documentation completeness
  #         - Deployment guide accuracy
  #         - Configuration reference updates
  #         - Integration guide updates
  #         - Version-specific feature documentation

  #         ## TODO
  #         - Implement API documentation audit
  #         - Implement deployment guide audit
  #         - Implement configuration reference audit
  #         - Implement integration guide audit
  #         EOF

  #     - name: Upload distributed audit reports
  #       uses: actions/upload-artifact@v4
  #       with:
  #         name: distributed-audit-release-${{ github.event.inputs.product }}-${{ github.event.inputs.version }}
  #         path: helper-scripts/output/distributed-audit/
  #         retention-days: 90
  create-documentation-pr:
    name: Create Documentation PR
    needs: [generate-release-notes-core-enterprise, audit-cli-documentation]
    runs-on: ubuntu-latest
    if: github.event.inputs.dry_run != 'true' && always() && (needs.generate-release-notes-core-enterprise.result == 'success')

    steps:
      - uses: actions/checkout@v4

      - name: Download artifacts
        uses: actions/download-artifact@v4
        with:
          path: artifacts/

      - name: Create release branch
        run: |
          BRANCH="release-docs-${{ github.event.inputs.product }}-${{ github.event.inputs.version }}"
          git checkout -b $BRANCH
          echo "BRANCH=$BRANCH" >> $GITHUB_ENV

      - name: Copy release notes to docs
        run: |
          # Download the generated release notes artifact
          PRODUCT="${{ github.event.inputs.product }}"
          VERSION="${{ github.event.inputs.version }}"

          # Determine the target documentation file based on product
          case "$PRODUCT" in
            "core"|"enterprise")
              TARGET_FILE="content/shared/v3-core-enterprise-release-notes/_index.md"
              SOURCE_FILE="artifacts/release-notes-${PRODUCT}-${VERSION}/release-notes-${PRODUCT}-${VERSION}.md"
              ;;
            "clustered")
              TARGET_FILE="content/influxdb3/clustered/reference/release-notes/_index.md"
              SOURCE_FILE="artifacts/release-notes-${PRODUCT}-${VERSION}/release-notes-${PRODUCT}-${VERSION}.md"
              ;;
            "cloud-dedicated")
              TARGET_FILE="content/influxdb3/cloud-dedicated/reference/release-notes/_index.md"
              SOURCE_FILE="artifacts/release-notes-${PRODUCT}-${VERSION}/release-notes-${PRODUCT}-${VERSION}.md"
              ;;
            "cloud-serverless")
              TARGET_FILE="content/influxdb3/cloud-serverless/reference/release-notes/_index.md"
              SOURCE_FILE="artifacts/release-notes-${PRODUCT}-${VERSION}/release-notes-${PRODUCT}-${VERSION}.md"
              ;;
            *)
              echo "Unknown product: $PRODUCT"
              exit 1
              ;;
          esac

          # Check if source file exists
          if [ -f "$SOURCE_FILE" ]; then
            echo "Copying release notes from $SOURCE_FILE to $TARGET_FILE"

            # For Core/Enterprise, prepend to existing file (new releases go at the top)
            if [ "$PRODUCT" = "core" ] || [ "$PRODUCT" = "enterprise" ]; then
              # Create temporary file with new content + existing content
              cp "$SOURCE_FILE" temp_release_notes.md
              echo "" >> temp_release_notes.md
              cat "$TARGET_FILE" >> temp_release_notes.md
              mv temp_release_notes.md "$TARGET_FILE"
            else
              # For other products, replace the file
              cp "$SOURCE_FILE" "$TARGET_FILE"
            fi

            echo "Release notes successfully copied to documentation"
          else
            echo "Warning: Release notes file not found at $SOURCE_FILE"
            echo "Available files in artifacts:"
            find artifacts/ -type f -name "*.md" || echo "No markdown files found"
          fi

      - name: Create Pull Request
        uses: peter-evans/create-pull-request@v5
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          branch: ${{ env.BRANCH }}
          title: "docs: Release documentation for ${{ github.event.inputs.product }} v${{ github.event.inputs.version }}"
          body: |
            ## Release Documentation Update

            This PR contains documentation updates for **${{ github.event.inputs.product }} v${{ github.event.inputs.version }}**

            ### Included Updates:
            - [ ] Release notes
            - [ ] Version updates
            - [ ] CLI documentation audit results

            ### Artifacts:
            - [Release Notes](https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }})
            - [CLI Audit Report](https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }})

            ### Manual Review Needed:
            Please review the CLI audit report for any missing or outdated documentation that needs to be updated.

            ---
            *This PR was automatically generated by the release workflow.*
          labels: |
            documentation
            release
            ${{ github.event.inputs.product }}
          draft: true
  create-audit-issue:
    name: Create Audit Issue
    needs: [audit-cli-documentation]
    runs-on: ubuntu-latest
    if: github.event.inputs.dry_run != 'true' && always() && (needs.audit-cli-documentation.result == 'success')

    steps:
      - uses: actions/checkout@v4

      - name: Download audit reports
        uses: actions/download-artifact@v4
        with:
          path: audit-reports/

      - name: Create issue from audit
        uses: actions/github-script@v7
        with:
          script: |
            const fs = require('fs');
            const path = require('path');
            const product = '${{ github.event.inputs.product }}';
            const version = '${{ github.event.inputs.version }}';

            let auditReports = [];
            let hasIssues = false;

            // Check for CLI audit report
            const cliAuditPath = `audit-reports/cli-audit-release-${product}-${version}`;
            if (fs.existsSync(cliAuditPath)) {
              const files = fs.readdirSync(cliAuditPath);
              const cliAuditFile = files.find(f => f.includes('documentation-audit'));
              if (cliAuditFile) {
                const report = fs.readFileSync(path.join(cliAuditPath, cliAuditFile), 'utf8');
                const hasMissingOptions = report.includes('⚠️ Missing from docs');
                const hasExtraOptions = report.includes('ℹ️ Documented but not in CLI');
                if (hasMissingOptions || hasExtraOptions) {
                  auditReports.push({
                    type: 'CLI',
                    content: report
                  });
                  hasIssues = true;
                }
              }
            }

            // Check for distributed audit report
            const distributedAuditPath = `audit-reports/distributed-audit-release-${product}-${version}`;
            if (fs.existsSync(distributedAuditPath)) {
              const files = fs.readdirSync(distributedAuditPath);
              const distributedAuditFile = files.find(f => f.includes('release-audit'));
              if (distributedAuditFile) {
                const report = fs.readFileSync(path.join(distributedAuditPath, distributedAuditFile), 'utf8');
                // For now, always include distributed audit reports since they're placeholders
                auditReports.push({
                  type: 'Distributed Products',
                  content: report
                });
                hasIssues = true;
              }
            }

            if (hasIssues && auditReports.length > 0) {
              // Create comprehensive issue
              const issueBody = [
                '## Release Documentation Audit Results',
                '',
                `The following documentation issues were found during the release of **${product} v${version}**:`,
                '',
                ...auditReports.map(report => [
                  `### ${report.type} Audit`,
                  '',
                  report.content,
                  ''
                ]).flat(),
                '### Action Items:',
                '- [ ] Review and update documentation for missing or outdated content',
                '- [ ] Verify all examples work with the new version',
                '- [ ] Update any version-specific content',
                '- [ ] Remove documentation for deprecated features',
                '',
                '---',
                '*This issue was automatically generated during the release process.*'
              ].join('\n');

              await github.rest.issues.create({
                owner: context.repo.owner,
                repo: context.repo.repo,
                title: `Documentation Updates Needed - ${product} v${version}`,
                body: issueBody,
                labels: ['documentation', 'release', product, 'audit']
              });

              console.log('Created issue for documentation updates');
            } else {
              console.log('No documentation issues found - skipping issue creation');
            }
  influxdb3-monolith-release-summary:
    name: Release Summary
    needs: [generate-release-notes-core-enterprise, audit-cli-documentation, create-documentation-pr, create-audit-issue]
    runs-on: ubuntu-latest
    if: always()

    steps:
      - name: Generate summary
        run: |
          echo "# Release Documentation Summary" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "## Release Information" >> $GITHUB_STEP_SUMMARY
          echo "- **Product**: ${{ github.event.inputs.product }}" >> $GITHUB_STEP_SUMMARY
          echo "- **Version**: ${{ github.event.inputs.version }}" >> $GITHUB_STEP_SUMMARY
          echo "- **Previous Version**: ${{ github.event.inputs.previous_version }}" >> $GITHUB_STEP_SUMMARY
          echo "- **Dry Run**: ${{ github.event.inputs.dry_run }}" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          echo "## Workflow Results" >> $GITHUB_STEP_SUMMARY
          echo "| Step | Status |" >> $GITHUB_STEP_SUMMARY
          echo "|------|--------|" >> $GITHUB_STEP_SUMMARY
          echo "| Generate Release Notes (Core/Enterprise) | ${{ needs.generate-release-notes-core-enterprise.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| CLI Documentation Audit | ${{ needs.audit-cli-documentation.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Create Documentation PR | ${{ needs.create-documentation-pr.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Create Audit Issue | ${{ needs.create-audit-issue.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          if [ "${{ github.event.inputs.dry_run }}" == "true" ]; then
            echo "**Note**: This was a dry run. No PRs or issues were created." >> $GITHUB_STEP_SUMMARY
          fi
@ -0,0 +1,107 @@
name: Prepare Documentation Release

on:
  workflow_dispatch:
    inputs:
      product:
        description: 'Product to release'
        required: true
        type: choice
        options:
          - core
          - enterprise
          - cloud-serverless
          - cloud-dedicated
      version:
        description: 'Version number (e.g., 3.2.1)'
        required: true
      release_type:
        description: 'Release type'
        required: true
        type: choice
        options:
          - major
          - minor
          - patch
          - hotfix

jobs:
  prepare-release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Create release branch
        run: |
          git checkout -b docs-release-v${{ inputs.version }}

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: 'yarn'

      - name: Install dependencies
        run: yarn install --frozen-lockfile

      - name: Generate release notes
        run: |
          # Note: This workflow assumes release notes are generated manually or from tagged releases
          # For Core/Enterprise products, the script needs repository access which would require
          # checking out the influxdb and influxdb_pro repositories first

          echo "Warning: Release notes generation requires access to InfluxDB source repositories"
          echo "For now, creating a placeholder file that should be replaced with actual release notes"

          # Create output directory
          mkdir -p helper-scripts/output/release-notes

          # Create placeholder release notes file
          cat > helper-scripts/output/release-notes/release-notes-v${{ inputs.version }}.md << EOF
          ## v${{ inputs.version }} {date="$(date +'%Y-%m-%d')"}

          ### Features

          - TODO: Add features for ${{ inputs.product }} v${{ inputs.version }}

          ### Bug Fixes

          - TODO: Add bug fixes for ${{ inputs.product }} v${{ inputs.version }}

          <!--
          Note: This is a placeholder file generated by the workflow.
          To generate actual release notes with commit history, run:

          For Core/Enterprise:
          node ./helper-scripts/common/generate-release-notes.js \\
          --config ./helper-scripts/common/config/influxdb3-core-enterprise.json \\
          v$(echo "${{ inputs.version }}" | sed 's/^v//') \\
          v${{ inputs.version }}

          For other products:
          node ./helper-scripts/common/generate-release-notes.js \\
          --config ./helper-scripts/common/config/[product-config].json \\
          v$(echo "${{ inputs.version }}" | sed 's/^v//') \\
          v${{ inputs.version }}
          -->
          EOF

      - name: Update product versions
        run: |
          # Script to update data/products.yml
          ./helper-scripts/common/update-product-version.sh \
            --product ${{ inputs.product }} \
            --version ${{ inputs.version }}

      - name: Create release checklist issue
        uses: actions/github-script@v7
        with:
          script: |
            const checklist = require('./.github/scripts/release-checklist.js');
            await checklist.createIssue({
              github,
              context,
              product: '${{ inputs.product }}',
              version: '${{ inputs.version }}',
              releaseType: '${{ inputs.release_type }}'
            })
@ -0,0 +1,61 @@
name: Trigger Documentation Update on Release

on:
  # Can be triggered by external workflows using repository_dispatch
  repository_dispatch:
    types: [influxdb3-release]

# Can also be triggered via GitHub API
# Example:
# curl -X POST \
#   -H "Authorization: token $GITHUB_TOKEN" \
#   -H "Accept: application/vnd.github.v3+json" \
#   https://api.github.com/repos/influxdata/docs-v2/dispatches \
#   -d '{"event_type":"influxdb3-release","client_payload":{"product":"core","version":"3.0.0","previous_version":"2.9.0"}}'

jobs:
  trigger-release-workflow:
    name: Trigger Release Documentation
    runs-on: ubuntu-latest

    steps:
      - name: Validate payload
        run: |
          if [ -z "${{ github.event.client_payload.product }}" ]; then
            echo "Error: product is required in client_payload"
            exit 1
          fi

          if [ -z "${{ github.event.client_payload.version }}" ]; then
            echo "Error: version is required in client_payload"
            exit 1
          fi

          if [ -z "${{ github.event.client_payload.previous_version }}" ]; then
            echo "Error: previous_version is required in client_payload"
            exit 1
          fi

          echo "Received release notification:"
          echo "Product: ${{ github.event.client_payload.product }}"
          echo "Version: ${{ github.event.client_payload.version }}"
          echo "Previous Version: ${{ github.event.client_payload.previous_version }}"

      - name: Trigger release documentation workflow
        uses: actions/github-script@v7
        with:
          script: |
            await github.rest.actions.createWorkflowDispatch({
              owner: context.repo.owner,
              repo: context.repo.repo,
              workflow_id: 'influxdb3-release.yml',
              ref: 'master',
              inputs: {
                product: '${{ github.event.client_payload.product }}',
                version: '${{ github.event.client_payload.version }}',
                previous_version: '${{ github.event.client_payload.previous_version }}',
                dry_run: '${{ github.event.client_payload.dry_run || 'false' }}'
              }
            });

            console.log('Successfully triggered release documentation workflow');
|
@ -11,6 +11,7 @@ node_modules
/content/influxdb*/**/api/**/*.html
!api-docs/**/.config.yml
/api-docs/redoc-static.html*
/helper-scripts/output/*
/telegraf-build
!telegraf-build/templates
!telegraf-build/scripts
@ -1,60 +0,0 @@
Doc is a public custom GPT for OpenAI ChatGPT used to help write and style InfluxData and InfluxDB documentation.

## Introduction

You write technical software documentation for InfluxData. The public web site is https://docs.influxdata.com and the source repository is https://github.com/influxdata/docs-v2.
Documentation provides step-by-step guides and reference documentation for InfluxDB and associated clients (CLIs, client libraries (SDKs), and Telegraf (https://docs.influxdata.com/telegraf/v1/)), and the legacy v1 components Kapacitor and Chronograf.

## Instruction

When a user asks a question and doesn't include a product from the list below, ask them which product in the list they are using, along with the version and query language:

InfluxDB OSS 1.x (AKA "OSS v1")
- Documentation: https://docs.influxdata.com/influxdb/v1/
- Query languages: v1.8+ supports InfluxQL and Flux
- Clients: Telegraf, influx CLI, v1 client libraries
InfluxDB Enterprise (AKA "v1 Enterprise")
- Documentation: https://docs.influxdata.com/enterprise_influxdb/v1/
- Query languages: v1.8+ supports InfluxQL and Flux
- Clients: Telegraf, influx CLI, v1 client libraries
InfluxDB OSS 2.x (AKA "OSS v2", "OSS (TSM)")
- Documentation: https://docs.influxdata.com/influxdb/v2/
- Query languages: InfluxQL and Flux
- Clients: Telegraf, influx CLI, v2 client libraries
InfluxDB Cloud (TSM) (AKA "Cloud 2")
- Documentation: https://docs.influxdata.com/influxdb/cloud/
- Query languages: InfluxQL and Flux
- Clients: Telegraf, influx CLI, v2 client libraries
InfluxDB 3 Clustered (AKA "Clustered", "v3 Clustered")
- Documentation: https://docs.influxdata.com/influxdb3/clustered/
- Query languages: SQL and InfluxQL
- Clients: Telegraf, influxctl CLI, `influxdb3-` (v3) client libraries
InfluxDB 3 Cloud Dedicated (AKA "Cloud Dedicated", "v3 Cloud Dedicated", "Dedicated", "CST (Cloud single-tenant)")
- Documentation: https://docs.influxdata.com/influxdb3/cloud-dedicated/
- Query languages: SQL and InfluxQL
- Clients: Telegraf, influxctl CLI, `influxdb3-` (v3) client libraries
InfluxDB 3 Cloud Serverless (AKA "Cloud Serverless", "v3 Cloud", "Serverless", "Cloud multi-tenant")
- Documentation: https://docs.influxdata.com/influxdb3/cloud-serverless/
- Query languages: SQL and InfluxQL
- Clients: Telegraf, influx CLI, `influxdb3-` (v3) client libraries
InfluxDB 3 Core (AKA "Core", "InfluxDB 3 OSS", "v3 Core", "v3 free")
- Documentation: https://docs.influxdata.com/influxdb3/core/
- Query languages: SQL and InfluxQL
- Clients: Telegraf, influxdb3 CLI, `influxdb3-` (v3) client libraries
InfluxDB 3 Enterprise (AKA "Enterprise", "v3 Enterprise")
- Documentation: https://docs.influxdata.com/influxdb3/enterprise/
- Query languages: SQL and InfluxQL
- Clients: Telegraf, influxdb3 CLI, `influxdb3-` (v3) client libraries

If I ask about a REST API or SDK (client library) and don't specify a product, ask which product.
For API client libraries, refer to the documentation and to the source repositories in https://github.com/InfluxCommunity for the version-specific client library.

When writing documentation, always use Google Developer Documentation style guidelines and Markdown format.
If writing REST API reference documentation follow YouTube Data API style and Google Developer Documentation style guidelines.

The project uses the Hugo static site generator to build the documentation.
The site uses JavaScript and jQuery.
For information about linting, tests (using pytests for codeblocks), shortcode <shortcode_name>, refer to https://github.com/influxdata/docs-v2/blob/master/README.md and https://github.com/influxdata/docs-v2/blob/master/CONTRIBUTING.md.
If something in CONTRIBUTING.md needs clarification, then give me the suggested revision for CONTRIBUTING.md in Markdown.

The community forum is https://community.influxdata.com/ and should not be used as a primary source of information, but might contain useful suggestions or solutions to specific problems from users.
compose.yaml
@ -1,6 +1,7 @@
# This is a Docker Compose file for the InfluxData documentation site.
## Run documentation tests for code samples.
name: influxdata-docs
# Configure your credentials in the following secrets files.
secrets:
  influxdb2-admin-username:
    file: ~/.env.influxdb2-admin-username

@ -8,6 +9,10 @@ secrets:
    file: ~/.env.influxdb2-admin-password
  influxdb2-admin-token:
    file: ~/.env.influxdb2-admin-token
  influxdb3-core-admin-token:
    file: ~/.env.influxdb3-core-admin-token
  influxdb3-enterprise-admin-token:
    file: ~/.env.influxdb3-enterprise-admin-token
services:
  local-dev:
    build:

@ -302,6 +307,9 @@ services:
  influxdb3-core:
    container_name: influxdb3-core
    image: influxdb:3-core
    # Set variables (except your auth token) for Core in the .env.3core file.
    env_file:
      - .env.3core
    ports:
      - 8282:8181
    command:

@ -319,14 +327,18 @@ services:
      - type: bind
        source: test/.influxdb3/core/plugins
        target: /var/lib/influxdb3/plugins
    environment:
      - INFLUXDB3_AUTH_TOKEN=/run/secrets/influxdb3-core-admin-token
    secrets:
      - influxdb3-core-admin-token
  influxdb3-enterprise:
    container_name: influxdb3-enterprise
    image: influxdb:3-enterprise
    # Set license email and other variables (except your auth token) for Enterprise in the .env.3ent file.
    env_file:
      - .env.3ent
    ports:
      - 8181:8181
    # Change the INFLUXDB3_LICENSE_EMAIL environment variable to your email address. You can also set it in a `.env` file in the same directory as this compose file. Docker Compose automatically loads the .env file.
    # The license email option is only used the first time you run the container; you can't change the license email after the first run.
    # The server stores the license in the data directory in the object store and the license is associated with the cluster ID and email.
    command:
      - influxdb3
      - serve

@ -336,7 +348,8 @@ services:
      - --object-store=file
      - --data-dir=/var/lib/influxdb3/data
      - --plugin-dir=/var/lib/influxdb3/plugins
      - --license-email=${INFLUXDB3_LICENSE_EMAIL}
    environment:
      - INFLUXDB3_AUTH_TOKEN=/run/secrets/influxdb3-enterprise-admin-token
    volumes:
      - type: bind
        source: test/.influxdb3/enterprise/data

@ -344,6 +357,8 @@ services:
      - type: bind
        source: test/.influxdb3/enterprise/plugins
        target: /var/lib/influxdb3/plugins
    secrets:
      - influxdb3-enterprise-admin-token
  telegraf-pytest:
    container_name: telegraf-pytest
    image: influxdata/docs-pytest
|
|||
|
|
@ -95,6 +95,19 @@ export default [
    },
  },

  // Configuration for Node.js helper scripts
  {
    files: ['helper-scripts/**/*.js'],
    languageOptions: {
      globals: {
        ...globals.node,
      },
    },
    rules: {
      // Node.js specific rules
    },
  },

  // Configuration for specific file patterns
  {
    files: ['**/*.js'],
@ -1,36 +1,143 @@
-# InfluxData documentation helper scripts
+# InfluxData Documentation Helper Scripts

-This directory contains scripts designed to help make specific maintenance
-processes easier.
+This directory contains scripts to assist with InfluxDB documentation workflows, including release notes generation, CLI/API documentation auditing, and version management.

-## InfluxDB Clustered release artifacts
+## Directory Structure

-**Script:** `./clustered-release-artifacts.sh`
+```
+helper-scripts/
+├── common/                 # Shared scripts used across all products
+├── influxdb3-monolith/     # Scripts for InfluxDB 3 Core & Enterprise
+├── influxdb3-distributed/  # Scripts for InfluxDB 3 Clustered & Cloud Dedicated
+├── cloud-serverless/       # Scripts for InfluxDB Cloud Serverless
+└── output/                 # Generated outputs from all scripts
+```

-Each InfluxDB Clustered release has the following associated artifacts that need
-to be provided with the release notes:
+## Product Categories

-- `example-customer.yaml`
-- `app-instance-schema.json`
+### InfluxDB 3 Monolith
+- **Products**: InfluxDB 3 Core, InfluxDB 3 Enterprise
+- **Deployment**: Single binary deployment
+- **Scripts Location**: `influxdb3-monolith/`

-This script uses an InfluxDB Clustered pull secret to pull down the required
-assets and store them in `static/downloads/clustered-release-artifacts/<RELEASE>`.
+### InfluxDB 3 Distributed
+- **Products**: InfluxDB 3 Clustered, InfluxDB 3 Cloud Dedicated
+- **Deployment**: Distributed/Kubernetes deployment
+- **Scripts Location**: `influxdb3-distributed/`

-1. **Set up the pull secret:**
+### Cloud Serverless
+- **Product**: InfluxDB Cloud Serverless
+- **Deployment**: Fully managed cloud service
+- **Scripts Location**: `cloud-serverless/`

-   The **Clustered Pull Secret** (config.json) is available in Docs Team
-   1Password vault. Download the pull secret and store it in the
-   `/tmp/influxdbsecret` directory on your local machine.
+## Common Scripts

-2. Install dependencies:
-
-   - [Install `crane`](https://github.com/google/go-containerregistry/tree/main/cmd/crane#installation).
-   - [Install `jq`](https://jqlang.org/download/)
+### `common/generate-release-notes.js`
+JavaScript ESM script that generates release notes by analyzing git commits across multiple repositories. Supports flexible configuration for different InfluxDB products and output formats.

-3. From the root of the docs project directory, run the following command to
-   execute the script. Provide the release version as an argument to the
-   script--for example:
+**Usage:**
+```bash
+node common/generate-release-notes.js [options] <from_version> <to_version> [repo_paths...]
+```

-   ```sh
-   sh ./helper-scripts/clustered-release-artifacts.sh 20250508-1719206
-   ```
**Options:**
- `--config <file>` - Load configuration from JSON file
- `--format <type>` - Output format: 'integrated' or 'separated'
- `--no-fetch` - Skip fetching latest commits from remote
- `--pull` - Pull latest changes (use with caution)
- `--no-pr-links` - Omit PR links from commit messages

**Examples:**
```bash
# Using configuration file (recommended)
node common/generate-release-notes.js --config common/config/influxdb3-core-enterprise.json v3.1.0 v3.2.0

# Traditional command-line arguments
node common/generate-release-notes.js v3.1.0 v3.2.0 ~/repos/influxdb ~/repos/influxdb_pro
```
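The exact schema of the JSON files in `common/config/` isn't shown here, but a config presumably captures the repository paths and output settings that would otherwise be passed on the command line. A hypothetical sketch of such a file (every field name below is illustrative, not taken from the actual config):

```json
{
  "product": "influxdb3-core-enterprise",
  "repositories": [
    { "name": "influxdb", "path": "~/repos/influxdb" },
    { "name": "influxdb_pro", "path": "~/repos/influxdb_pro" }
  ],
  "format": "integrated",
  "prLinks": true
}
```

Check the files under `common/config/` for the real field names before writing a new config.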
### `common/update-product-version.sh`
Updates product version numbers in `data/products.yml` and related documentation files.

**Usage:**
```bash
./common/update-product-version.sh --product <product> --version <version>
```

**Example:**
```bash
./common/update-product-version.sh --product core --version 3.2.1
```

## Product-Specific Scripts

### InfluxDB 3 Monolith (Core & Enterprise)

See [`influxdb3-monolith/README.md`](influxdb3-monolith/README.md) for detailed documentation.

**Key Scripts:**
- `audit-cli-documentation.sh` - Audits CLI commands against existing documentation
- `setup-auth-tokens.sh` - Sets up authentication tokens for local containers

### InfluxDB 3 Distributed (Clustered & Cloud Dedicated)

See [`influxdb3-distributed/README.md`](influxdb3-distributed/README.md) for detailed documentation.

**Key Scripts:**
- `clustered-release-artifacts.sh` - Downloads release artifacts for Clustered releases

## Output Directory

All scripts write their outputs to organized subdirectories:

```
output/
├── release-notes/   # Generated release notes
├── cli-audit/       # CLI documentation audit reports
├── api-audit/       # API documentation audit reports
└── artifacts/       # Downloaded release artifacts
```

## GitHub Workflow Integration

These scripts are integrated with GitHub Actions workflows:

- **Workflow**: `.github/workflows/prepare-release.yml`
- **Uses**: `generate-release-notes.js`, `update-product-version.sh`

## Quick Start

1. **Clone the repository**
   ```bash
   git clone https://github.com/influxdata/docs-v2.git
   cd docs-v2/helper-scripts
   ```

2. **Make scripts executable**
   ```bash
   find . -name "*.sh" -type f -exec chmod +x {} \;
   ```

3. **Run a script**
   ```bash
   # Generate release notes
   node common/generate-release-notes.js --config common/config/influxdb3-core-enterprise.json v3.1.0 v3.2.0

   # Audit CLI documentation
   node influxdb3-monolith/audit-cli-documentation.js core local
   ```

## Contributing

When adding new scripts:

1. Place in the appropriate product directory
2. Follow naming conventions (lowercase with hyphens)
3. Include comprehensive help text and documentation
4. Update the relevant README files
5. Test with all applicable products
6. Ensure outputs go to the `output/` directory

## Archived Scripts

Deprecated scripts are moved to `archive/` subdirectories. These scripts are kept for reference but should not be used in active workflows.
@ -0,0 +1,62 @@
|
|||
# InfluxDB Cloud Serverless Helper Scripts
|
||||
|
||||
This directory contains scripts specific to InfluxDB Cloud Serverless documentation workflows.
|
||||
|
||||
## Overview
|
||||
|
||||
InfluxDB Cloud Serverless is a fully managed cloud service that requires different documentation maintenance approaches compared to self-hosted products.
|
||||
|
||||
## Scripts (Planned)

### audit-api-documentation.sh (TODO)

Audit API documentation against the Cloud Serverless API endpoints.

**Usage:**

```bash
./audit-api-documentation.sh [version]
```

### update-pricing-information.sh (TODO)

Update pricing and billing documentation based on current Cloud Serverless offerings.

**Usage:**

```bash
./update-pricing-information.sh
```

### validate-tutorial-links.sh (TODO)

Validate that tutorial links and examples work with current Cloud Serverless endpoints.

**Usage:**

```bash
./validate-tutorial-links.sh
```

## Considerations for Cloud Serverless

Unlike self-hosted products, Cloud Serverless:

- Has no CLI tool to audit
- Uses exclusively HTTP API endpoints
- Has dynamic pricing that may need regular updates
- Requires authentication against live cloud services for testing
- Has region-specific endpoints and limitations

## Future Development

As Cloud Serverless documentation needs evolve, this directory will be expanded with:

- API endpoint validation scripts
- Tutorial testing automation
- Pricing documentation updates
- Regional documentation maintenance
- Authentication and permissions testing

## Integration

These scripts will integrate with the main documentation workflow via:

- GitHub Actions for automated testing
- Scheduled runs for pricing updates
- PR validation for API changes
- Integration with common utility functions

@ -0,0 +1,135 @@
# Common Helper Scripts

This directory contains scripts that are shared across all InfluxDB documentation products.

## Scripts

### generate-release-notes.js

JavaScript ESM script that generates release notes by analyzing git commits between two versions across multiple repositories. Supports flexible configuration for different InfluxDB products and output formats.

**Usage:**

```bash
node generate-release-notes.js [options] <from_version> <to_version> [repo_paths...]
```

**Options:**

- `--config <file>` - Load configuration from JSON file (recommended)
- `--format <type>` - Output format: `integrated` or `separated`
- `--no-fetch` - Skip fetching latest commits from remote
- `--pull` - Pull latest changes (use with caution)
- `--no-pr-links` - Omit PR links from commit messages

**Examples:**

```bash
# Using configuration file (recommended for InfluxDB 3 Core/Enterprise)
node generate-release-notes.js --config config/influxdb3-core-enterprise.json v3.1.0 v3.2.0

# Using configuration file for other products
node generate-release-notes.js --config config/influxdb3-clustered.json v1.0.0 v1.1.0

# Traditional command-line arguments
node generate-release-notes.js v3.1.0 v3.2.0 ~/repos/influxdb ~/repos/influxdb_pro

# Skip fetch for faster local testing
node generate-release-notes.js --no-fetch v3.1.0 v3.2.0 ~/repos/influxdb
```

**Configuration Files:**

- `config/influxdb3-core-enterprise.json` - InfluxDB 3 Core/Enterprise separated format
- `config/influxdb3-clustered.json` - InfluxDB 3 Clustered integrated format
- `config/influxdb-v2.json` - InfluxDB v2.x integrated format

**Output:**

- Creates `release-notes-<version>.md` in `../output/release-notes/`
- Supports two formats:
  - **Integrated**: All repositories' changes combined in unified sections
  - **Separated**: Primary repository first, then secondary repositories (ideal for Core/Enterprise)
- Includes sections for Features, Bug Fixes, Breaking Changes, Performance, and API changes
- Automatically links to GitHub pull requests (configurable per repository)

### update-product-version.sh

Updates product version information in documentation configuration files.

**Usage:**

```bash
./update-product-version.sh --product <product> --version <version>
```

**Supported Products:**

- `core` - InfluxDB 3 Core
- `enterprise` - InfluxDB 3 Enterprise
- `clustered` - InfluxDB 3 Clustered
- `cloud-dedicated` - InfluxDB 3 Cloud Dedicated
- `cloud-serverless` - InfluxDB Cloud Serverless

**Example:**

```bash
# Update Core to version 3.2.1
./update-product-version.sh --product core --version 3.2.1

# Update Clustered to version 2024.1
./update-product-version.sh --product clustered --version 2024.1
```

**What it updates:**

- `data/products.yml` - Main product version configuration
- Docker Compose example files
- Installation instructions
- Download links

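The `data/products.yml` update amounts to bumping a single key. A minimal sketch of that edit (the YAML fragment is illustrative, and `sed` is used here as a dependency-free stand-in for the `yq` call the script actually makes):

```bash
# Illustrative: bump latest_patch for a product in a products.yml-style file.
# The real script uses yq; sed stands in here so the sketch has no dependencies.
PRODUCT=core
VERSION=3.2.1
printf 'influxdb3_%s:\n  latest_patch: "3.2.0"\n' "$PRODUCT" > /tmp/products.yml
# Replace the quoted patch version in place (-i.bak works on GNU and BSD sed)
sed -i.bak "s/latest_patch: \".*\"/latest_patch: \"${VERSION}\"/" /tmp/products.yml
grep latest_patch /tmp/products.yml
```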
## Library Functions

### lib/docker-utils.sh

Shared Docker utility functions used by other scripts.

**Available Functions:**

- `check_docker_running()` - Verify Docker daemon is running
- `container_exists()` - Check if a container exists
- `container_running()` - Check if a container is running
- `pull_image()` - Pull Docker image with retry logic
- `load_auth_token()` - Load authentication tokens from secret files

**Usage in scripts:**

```bash
source "$(dirname "$0")/../common/lib/docker-utils.sh"

if container_running "influxdb3-core"; then
  echo "Container is running"
fi
```

## Integration with GitHub Actions

These scripts are designed to work in both local development and CI/CD environments:

**Local Development:**

- Assumes Docker Desktop or Docker Engine is installed
- Uses local file paths for repositories
- Can work with running containers

**GitHub Actions:**

- Automatically detects CI environment
- Uses workspace paths
- Handles authentication via secrets

## Best Practices

1. **Error Handling**: All scripts use `set -e` to exit on errors
2. **Logging**: Color-coded output for better readability
3. **Validation**: Input validation before processing
4. **Idempotency**: Scripts can be run multiple times safely
5. **Documentation**: Comprehensive help text in each script

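The conventions above can be sketched in a single skeleton (illustrative only; this is not one of the repo's actual scripts):

```bash
#!/bin/bash
# Sketch of the listed conventions: strict mode, color-coded logging,
# and input validation before any processing happens.
set -e
RED='\033[0;31m'; GREEN='\033[0;32m'; NC='\033[0m'

process() {
  local input="${1:-}"
  # Validate before processing
  if [ -z "$input" ]; then
    printf "${RED}Error: missing argument${NC}\n" >&2
    return 1
  fi
  printf "${GREEN}Processing %s${NC}\n" "$input"
}

process "data/products.yml"
```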
## Adding New Common Scripts

When adding scripts to this directory:

1. Ensure they are truly product-agnostic
2. Follow existing naming conventions
3. Add comprehensive documentation
4. Include error handling and validation
5. Update this README
6. Test with all supported products

@ -0,0 +1,17 @@
{
  "outputFormat": "integrated",
  "repositories": [
    {
      "name": "influxdb",
      "path": "../influxdb",
      "label": "influxdb",
      "includePrLinks": true
    },
    {
      "name": "plutonium",
      "path": "https://github.com/influxdata/plutonium",
      "label": "enterprise_v1",
      "includePrLinks": false
    }
  ]
}

@ -0,0 +1,11 @@
{
  "outputFormat": "integrated",
  "repositories": [
    {
      "name": "influxdb",
      "path": "../influxdb",
      "label": "influxdb",
      "includePrLinks": true
    }
  ]
}

@ -0,0 +1,25 @@
{
  "outputFormat": "separated",
  "primaryRepo": "influxdb",
  "repositories": [
    {
      "name": "influxdb",
      "path": "../influxdb",
      "label": "influxdb",
      "includePrLinks": true
    },
    {
      "name": "influxdb_pro",
      "path": "../influxdb_pro",
      "label": "influxdb_pro",
      "includePrLinks": true
    }
  ],
  "separatedTemplate": {
    "header": "> [!Note]\n> #### InfluxDB 3 Core and Enterprise relationship\n>\n> InfluxDB 3 Enterprise is a superset of InfluxDB 3 Core.\n> All updates to Core are automatically included in Enterprise.\n> The Enterprise sections below only list updates exclusive to Enterprise.",
    "primaryLabel": "Core",
    "secondaryLabel": "Enterprise",
    "secondaryIntro": "All Core updates are included in Enterprise. Additional Enterprise-specific features and fixes:",
    "comment": "If you plan to run .claude enhance-release-notes after this, you need to include PR links. For Enterprise, remove the links after running .claude enhance-release-notes."
  }
}

@ -0,0 +1,15 @@
{
  "outputFormat": "core-enterprise",
  "repositories": [
    {
      "name": "influxdb",
      "path": "/Users/ja/Documents/github/influxdb",
      "label": "influxdb"
    },
    {
      "name": "influxdb_pro",
      "path": "/Users/ja/Documents/github/influxdb_pro",
      "label": "influxdb_pro"
    }
  ]
}

File diff suppressed because it is too large

@ -0,0 +1,222 @@
# Generate Release Notes

A JavaScript ESM script to generate release notes for InfluxDB projects by analyzing git commits between two versions.

## InfluxDB 3 Core/Enterprise

This script supports the InfluxDB 3 Core/Enterprise relationship and tagged releases.

## InfluxDB 3 Clustered

See the Clustered [release process](https://github.com/influxdata/project-clustered?tab=readme-ov-file#release-process).

## Features

- **Flexible repository support**: Handle single or multiple repositories
- **Multiple output formats**:
  - **Integrated**: All repositories' changes combined in unified sections
  - **Separated**: Primary repository first, then secondary repositories
- **Merge commit support**: Extracts features and fixes from merge commit bodies
- **Conventional commit parsing**: Supports `feat:`, `fix:`, `perf:`, etc.
- **PR link generation**: Automatically links to GitHub pull requests (configurable per repository)
- **JSON configuration**: Full configuration support via JSON files
- **Enhanced commit messages**: Categorizes commits based on affected areas (database, CLI, API, etc.)
- **Customizable templates**: Configure headers, labels, and intro text for separated format

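The conventional-commit parsing listed above boils down to a prefix match on the commit subject. A minimal sketch (illustrative only; the actual categorization logic lives in `generate-release-notes.js`):

```bash
# Illustrative categorization of conventional-commit subjects.
# Matches bare "feat:" prefixes and scoped forms like "fix(cli):".
categorize_commit() {
  case "$1" in
    feat:*|feat\(*\):*) echo "Features" ;;
    fix:*|fix\(*\):*)   echo "Bug Fixes" ;;
    perf:*|perf\(*\):*) echo "Performance" ;;
    *)                  echo "Other" ;;
  esac
}

categorize_commit "feat: add hard_delete update support"   # Features
categorize_commit "fix(cli): add help text"                # Bug Fixes
```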
## Usage

### Basic Usage

```bash
# Single repository, standard format
node generate-release-notes.js v3.1.0 v3.2.0 /path/to/repo

# Multiple repositories (auto-detects Core/Enterprise format)
node generate-release-notes.js v3.1.0 v3.2.0 /path/to/influxdb /path/to/influxdb_pro

# Skip fetching from remote
node generate-release-notes.js --no-fetch v3.1.0 v3.2.0 /path/to/repo

# Pull latest changes (use with caution)
node generate-release-notes.js --pull v3.1.0 v3.2.0 /path/to/repo

# Omit PR links from release notes
node generate-release-notes.js --no-pr-links v3.1.0 v3.2.0 /path/to/repo
```

### Advanced Usage

```bash
# Explicit format specification
node generate-release-notes.js --format separated v3.1.0 v3.2.0 /path/to/influxdb /path/to/influxdb_pro

# Using JSON configuration
node generate-release-notes.js --config config.json v3.1.0 v3.2.0

# Using product-specific configurations
node generate-release-notes.js --config config/influxdb3-core-enterprise.json v3.2.0 v3.2.1
node generate-release-notes.js --config config/influxdb-v2.json v2.7.0 v2.7.1
node generate-release-notes.js --config config/influxdb3-clustered.json v1.0.0 v1.1.0
```

### Configuration File

Create a JSON configuration file for complex setups:

```json
{
  "outputFormat": "separated",
  "primaryRepo": "influxdb",
  "repositories": [
    {
      "name": "influxdb",
      "path": "/path/to/influxdb",
      "label": "Core",
      "includePrLinks": true
    },
    {
      "name": "influxdb_pro",
      "path": "/path/to/influxdb_pro",
      "label": "Enterprise",
      "includePrLinks": false
    }
  ],
  "separatedTemplate": {
    "header": "> [!Note]\n> Custom header text here",
    "primaryLabel": "Primary Repository",
    "secondaryLabel": "Secondary Repositories",
    "secondaryIntro": "Additional features and fixes from secondary repositories:"
  }
}
```

## Output Formats

### Integrated Format

All repositories' changes are combined in unified sections with repository labels and enhanced descriptions:

```markdown
## v3.2.1 {date="2025-07-03"}

### Features

- [influxdb] **Database management**: Allow hard_deleted date of deleted schema to be updated ([#26574](https://github.com/influxdata/influxdb/pull/26574))
- [influxdb_pro] **License management**: Amend license info ([#987](https://github.com/influxdata/influxdb/pull/987))

### Bug Fixes

- [influxdb] **CLI**: Add help text for the new update subcommand ([#26569](https://github.com/influxdata/influxdb/pull/26569))
```

When using `--no-pr-links`, the PR links are omitted:

```markdown
## v3.2.1 {date="2025-07-03"}

### Features

- [influxdb] **Database management**: Allow hard_deleted date of deleted schema to be updated
- [influxdb_pro] **License management**: Amend license info

### Bug Fixes

- [influxdb] **CLI**: Add help text for the new update subcommand
```

### Separated Format

Primary repository changes are shown first, followed by secondary repository changes. Ideal for products where one repository is a superset of another:

```markdown
> [!Note]
> #### InfluxDB 3 Core and Enterprise relationship
>
> InfluxDB 3 Enterprise is a superset of InfluxDB 3 Core.
> All updates to Core are automatically included in Enterprise.
> The Enterprise sections below only list updates exclusive to Enterprise.

## v3.2.1 {date="2025-07-03"}

### Core

#### Features

- **Database management**: Allow hard_deleted date of deleted schema to be updated ([#26574](https://github.com/influxdata/influxdb/pull/26574))

### Enterprise

All Core updates are included in Enterprise. Additional Enterprise-specific features and fixes:

#### Features

- **License management**: Amend license info ([#987](https://github.com/influxdata/influxdb/pull/987))
```

## Configuration Options

### Repository Configuration

Each repository in the configuration can have:

- `name`: Repository identifier
- `path`: Path to the repository
- `label`: Label used in output
- `includePrLinks`: Whether to include PR links (boolean)

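As a rough sketch of what `includePrLinks` controls, a commit subject ending in a PR number can be rewritten as a markdown PR link. The function name and exact output shape below are assumptions for illustration; the PR-number extraction mirrors the `grep -oE '#[0-9]+'` approach used by the earlier bash script:

```bash
# Illustrative: turn a trailing "(#N)" in a commit subject into a markdown
# PR link. Not the tool's actual API; a sketch of the transformation.
format_pr_link() {
  local subject="$1" repo_url="$2" pr clean
  pr=$(echo "$subject" | grep -oE '#[0-9]+' | head -1 | tr -d '#')
  if [ -z "$pr" ]; then
    echo "$subject"
    return
  fi
  # Strip the trailing "(#N)" and append the markdown link instead
  clean="${subject% (#$pr)}"
  echo "${clean} ([#${pr}](${repo_url}/pull/${pr}))"
}

format_pr_link "fix: add help text for the update subcommand (#26569)" \
  "https://github.com/influxdata/influxdb"
```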
### Separated Format Template

When using separated format, you can customize:

- `header`: Markdown header text shown at the top
- `primaryLabel`: Section label for primary repository
- `secondaryLabel`: Section label for secondary repositories
- `secondaryIntro`: Introduction text for secondary section
- `primaryRepo`: Name or index of the primary repository

## Migration from Bash

This JavaScript version replaces the previous bash script with the following improvements:

- **Better error handling**: More robust git command execution
- **Flexible configuration**: JSON-based configuration support
- **Cleaner code structure**: Object-oriented design with clear separation of concerns
- **Enhanced regex handling**: Fixed merge commit parsing issues
- **Cross-platform compatibility**: Works on all platforms with Node.js

## Output Location

Generated release notes are saved to `helper-scripts/output/release-notes/release-notes-{version}.md`.

## Requirements

- Node.js (ES modules support)
- Git repositories with the specified version tags
- Access to the git repositories (local or remote)

## Command Line Options

- `--no-fetch`: Skip fetching latest commits from remote
- `--pull`: Pull latest changes (implies fetch) - use with caution
- `--no-pr-links`: Omit PR links from commit messages (default: include links)
- `--config <file>`: Load configuration from JSON file
- `--format <type>`: Output format: `integrated` or `separated`
- `-h, --help`: Show help message

## Example Configurations

### InfluxDB 3 Core/Enterprise

See `config/influxdb3-core-enterprise.json` for a configuration that:

- Uses separated format
- Sets `influxdb` as primary repository (Core)
- Sets `influxdb_pro` as secondary repository (Enterprise)
- Includes PR links for Core, excludes them for Enterprise
- Adds a custom header explaining the Core/Enterprise relationship

### InfluxDB v2

See `config/influxdb-v2.json` for a simple single-repository configuration using integrated format.

### InfluxDB 3 Clustered

See `config/influxdb3-clustered.json` for Kubernetes operator release notes.

@ -0,0 +1,205 @@
#!/bin/bash
# Docker utility functions shared across helper scripts

# Color codes
export RED='\033[0;31m'
export GREEN='\033[0;32m'
export YELLOW='\033[1;33m'
export BLUE='\033[0;34m'
export NC='\033[0m' # No Color

# Check if Docker is running
check_docker_running() {
  if ! docker info > /dev/null 2>&1; then
    echo -e "${RED}Error: Docker is not running${NC}"
    echo "Please start Docker Desktop or Docker Engine"
    return 1
  fi
  return 0
}

# Check if a container exists
container_exists() {
  local container_name=$1
  docker ps -a --format '{{.Names}}' | grep -q "^${container_name}$"
}

# Check if a container is running
container_running() {
  local container_name=$1
  docker ps --format '{{.Names}}' | grep -q "^${container_name}$"
}

# Pull Docker image with retry logic
pull_image() {
  local image=$1
  local max_retries=3
  local retry_count=0

  echo -e "${BLUE}Pulling image: $image${NC}"

  while [ $retry_count -lt $max_retries ]; do
    if docker pull "$image"; then
      echo -e "${GREEN}✓ Successfully pulled $image${NC}"
      return 0
    fi

    retry_count=$((retry_count + 1))
    if [ $retry_count -lt $max_retries ]; then
      echo -e "${YELLOW}Retry $retry_count/$max_retries...${NC}"
      sleep 2
    fi
  done

  echo -e "${RED}✗ Failed to pull $image after $max_retries attempts${NC}"
  return 1
}

# Load authentication token from secret file
load_auth_token() {
  local product=$1
  local token_var_name=$2

  local secret_file="$HOME/.env.influxdb3-${product}-admin-token"

  if [ -f "$secret_file" ] && [ -s "$secret_file" ]; then
    local token
    token=$(cat "$secret_file")
    # Export the named variable directly; avoids eval, which would break
    # (or worse) on tokens containing quotes
    export "${token_var_name}=${token}"
    return 0
  else
    echo -e "${YELLOW}Warning: No token found in $secret_file${NC}"
    return 1
  fi
}

# Start container if not running
ensure_container_running() {
  local container_name=$1
  local service_name=${2:-$container_name}

  if ! container_running "$container_name"; then
    echo -e "${YELLOW}Starting $container_name...${NC}"

    if docker compose up -d "$service_name"; then
      # Wait for container to be ready
      local max_wait=30
      local waited=0

      while [ $waited -lt $max_wait ]; do
        if container_running "$container_name"; then
          echo -e "${GREEN}✓ $container_name is running${NC}"
          return 0
        fi
        sleep 1
        waited=$((waited + 1))
      done

      echo -e "${RED}✗ Timeout waiting for $container_name to start${NC}"
      return 1
    else
      echo -e "${RED}✗ Failed to start $container_name${NC}"
      return 1
    fi
  fi

  return 0
}

# Execute command in container
exec_in_container() {
  local container_name=$1
  shift

  if ! container_running "$container_name"; then
    echo -e "${RED}Error: Container $container_name is not running${NC}"
    return 1
  fi

  # Pass remaining arguments through as-is so quoting is preserved
  docker exec "$container_name" "$@"
}

# Get container health status
container_health() {
  local container_name=$1

  if ! container_exists "$container_name"; then
    echo "not_found"
    return
  fi

  local status=$(docker inspect --format='{{.State.Status}}' "$container_name" 2>/dev/null)
  echo "${status:-unknown}"
}

# Wait for container to be healthy
wait_for_healthy() {
  local container_name=$1
  local timeout=${2:-60}

  echo -e "${BLUE}Waiting for $container_name to be healthy...${NC}"

  local elapsed=0
  while [ $elapsed -lt $timeout ]; do
    local health=$(docker inspect --format='{{.State.Health.Status}}' "$container_name" 2>/dev/null || echo "no_health_check")

    if [ "$health" = "healthy" ] || [ "$health" = "no_health_check" ]; then
      echo -e "${GREEN}✓ $container_name is ready${NC}"
      return 0
    fi

    sleep 2
    elapsed=$((elapsed + 2))
    echo -n "."
  done

  echo -e "\n${RED}✗ Timeout waiting for $container_name to be healthy${NC}"
  return 1
}

# Validate Docker image tag format
validate_image_tag() {
  local product=$1
  local version=$2

  # Check version format
  if [[ ! "$version" =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]] && [ "$version" != "local" ]; then
    echo -e "${RED}Error: Invalid version format: $version${NC}"
    echo "Expected format: X.Y.Z (e.g., 3.2.0) or 'local'"
    return 1
  fi

  # Check product name
  case "$product" in
    core|enterprise|clustered|cloud-dedicated)
      return 0
      ;;
    *)
      echo -e "${RED}Error: Invalid product: $product${NC}"
      echo "Valid products: core, enterprise, clustered, cloud-dedicated"
      return 1
      ;;
  esac
}

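Note that the version check above accepts plain `X.Y.Z` strings or the literal `local`, so git-style tags such as `v3.2.0` must be passed without the leading `v`. A standalone restatement of just that check, for illustration:

```bash
# Standalone restatement of validate_image_tag's version check (illustrative)
is_valid_version() {
  [[ "$1" =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]] || [ "$1" = "local" ]
}

is_valid_version "3.2.0"  && echo "3.2.0: accepted"
is_valid_version "local"  && echo "local: accepted"
is_valid_version "v3.2.0" || echo "v3.2.0: rejected (leading v)"
```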
# Get the correct Docker image name for a product
get_docker_image() {
  local product=$1
  local version=$2

  case "$product" in
    core|enterprise)
      echo "influxdb:${version}-${product}"
      ;;
    clustered)
      echo "us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:${version}"
      ;;
    cloud-dedicated)
      # Cloud Dedicated typically uses the same image as clustered
      echo "us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:${version}"
      ;;
    *)
      return 1
      ;;
  esac
}

@ -0,0 +1,23 @@
#!/bin/bash
# Automatically updates version numbers in products.yml

set -e

PRODUCT=""
VERSION=""

while [[ $# -gt 0 ]]; do
  case $1 in
    --product) PRODUCT="$2"; shift 2 ;;
    --version) VERSION="$2"; shift 2 ;;
    *) echo "Unknown option: $1"; exit 1 ;;
  esac
done

if [ -z "$PRODUCT" ] || [ -z "$VERSION" ]; then
  echo "Usage: $0 --product <product> --version <version>"
  exit 1
fi

# Update products.yml using yq
yq eval -i ".influxdb3_${PRODUCT}.latest_patch = \"${VERSION}\"" data/products.yml

# Update Docker compose examples (-E enables the extended-regex "+" quantifier)
find compose-examples/ -name "*.yml" -exec sed -i -E "s/influxdb:3-${PRODUCT}:[0-9.]+/influxdb:3-${PRODUCT}:${VERSION}/g" {} \;

echo "✅ Updated version to ${VERSION} for ${PRODUCT}"

@ -0,0 +1,194 @@
#!/usr/bin/env node

/**
 * Git tag validation utility
 * Validates that provided version strings are actual git tags in the repository
 */

import { spawn } from 'child_process';

/**
 * Execute a command and return the output
 * @param {string} command - Command to execute
 * @param {string[]} args - Command arguments
 * @param {string} cwd - Working directory
 * @returns {Promise<string>} Command output
 */
function execCommand(command, args = [], cwd = process.cwd()) {
  return new Promise((resolve, reject) => {
    const child = spawn(command, args, { cwd, stdio: 'pipe' });
    let stdout = '';
    let stderr = '';

    child.stdout.on('data', (data) => {
      stdout += data.toString();
    });

    child.stderr.on('data', (data) => {
      stderr += data.toString();
    });

    child.on('close', (code) => {
      if (code === 0) {
        resolve(stdout.trim());
      } else {
        reject(
          new Error(`Command failed: ${command} ${args.join(' ')}\n${stderr}`)
        );
      }
    });
  });
}

/**
 * Get all git tags from the repository
 * @param {string} repoPath - Path to the git repository
 * @returns {Promise<string[]>} Array of tag names
 */
async function getGitTags(repoPath = process.cwd()) {
  try {
    const output = await execCommand(
      'git',
      ['tag', '--list', '--sort=-version:refname'],
      repoPath
    );
    return output ? output.split('\n').filter((tag) => tag.trim()) : [];
  } catch (error) {
    throw new Error(`Failed to get git tags: ${error.message}`);
  }
}

/**
 * Validate that a version string is an existing git tag
 * @param {string} version - Version string to validate
 * @param {string} repoPath - Path to the git repository
 * @returns {Promise<boolean>} True if version is a valid tag
 */
async function isValidTag(version, repoPath = process.cwd()) {
  if (!version || version === 'local') {
    return true; // 'local' is a special case for development
  }

  const tags = await getGitTags(repoPath);
  return tags.includes(version);
}

/**
 * Validate multiple version tags
 * @param {string[]} versions - Array of version strings to validate
 * @param {string} repoPath - Path to the git repository
 * @returns {Promise<{
 *   valid: boolean,
 *   errors: string[],
 *   availableTags: string[]
 * }>} Validation result
 */
async function validateTags(versions, repoPath = process.cwd()) {
  const errors = [];
  const availableTags = await getGitTags(repoPath);

  for (const version of versions) {
    if (version && version !== 'local' && !availableTags.includes(version)) {
      errors.push(`Version '${version}' is not a valid git tag`);
    }
  }

  return {
    valid: errors.length === 0,
    errors,
    availableTags: availableTags.slice(0, 10), // Return top 10 most recent tags
  };
}

/**
 * Validate version inputs and exit with error if invalid
 * @param {string} version - Current version
 * @param {string} previousVersion - Previous version (optional)
 * @param {string} repoPath - Path to the git repository
 */
async function validateVersionInputs(
  version,
  previousVersion = null,
  repoPath = process.cwd()
) {
  const versionsToCheck = [version];
  if (previousVersion) {
    versionsToCheck.push(previousVersion);
  }

  const validation = await validateTags(versionsToCheck, repoPath);

  if (!validation.valid) {
    console.error('\n❌ Version validation failed:');
    validation.errors.forEach((error) => console.error(`  - ${error}`));

    if (validation.availableTags.length > 0) {
      console.error('\n📋 Available tags (most recent first):');
      validation.availableTags.forEach((tag) => console.error(`  - ${tag}`));
    } else {
      console.error('\n📋 No git tags found in repository');
    }

    console.error(
      '\n💡 Tip: Use "local" for development/testing with local containers'
    );
    process.exit(1);
  }

  console.log('✅ Version tags validated successfully');
}

/**
 * Get the repository root path (where the .git directory is located)
 * @param {string} startPath - Starting path to search from
 * @returns {Promise<string>} Path to repository root
 */
async function getRepositoryRoot(startPath = process.cwd()) {
  try {
    const output = await execCommand(
      'git',
      ['rev-parse', '--show-toplevel'],
      startPath
    );
    return output;
  } catch (error) {
    throw new Error(
      `Not a git repository or git not available: ${error.message}`
    );
  }
}

export {
  getGitTags,
  isValidTag,
  validateTags,
  validateVersionInputs,
  getRepositoryRoot,
};

// CLI usage when run directly
if (import.meta.url === `file://${process.argv[1]}`) {
  const args = process.argv.slice(2);

  if (args.length === 0) {
    console.log('Usage: node validate-tags.js <version> [previous-version]');
    console.log('Examples:');
    console.log('  node validate-tags.js v3.0.0');
    console.log('  node validate-tags.js v3.0.0 v2.9.0');
    console.log(
      '  node validate-tags.js local  # Special case for development'
    );
    process.exit(1);
  }

  const [version, previousVersion] = args;

  try {
    const repoRoot = await getRepositoryRoot();
    await validateVersionInputs(version, previousVersion, repoRoot);
    console.log('All versions are valid git tags');
  } catch (error) {
    console.error(`Error: ${error.message}`);
    process.exit(1);
  }
}

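The validation above ultimately reduces to membership in the output of `git tag --list`. A throwaway-repo sketch of that underlying check (assumes `git` is installed; the tag names are examples):

```bash
# Illustrative: the git command behind getGitTags, run in a throwaway repo
tmp=$(mktemp -d)
git -C "$tmp" init -q
git -C "$tmp" -c user.email=ci@example.com -c user.name=ci \
  commit -q --allow-empty -m "init"
git -C "$tmp" tag v3.1.0
git -C "$tmp" tag v3.2.0

# Newest-first, matching the --sort flag used by the validator
tags=$(git -C "$tmp" tag --list --sort=-version:refname)
echo "$tags" | grep -qx "v3.2.0" && echo "v3.2.0 is a valid tag"
echo "$tags" | grep -qx "v9.9.9" || echo "v9.9.9 is not a tag"
rm -rf "$tmp"
```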
@ -1,354 +0,0 @@
#!/bin/bash

# Script to generate release notes for InfluxDB v3.x releases
# Usage: ./generate-release-notes.sh [--no-fetch] [--pull] <from_version> <to_version> <primary_repo_path> [additional_repo_paths...]
#
# Options:
#   --no-fetch    Skip fetching latest commits from remote
#   --pull        Pull latest changes (implies fetch) - use with caution as it may change your working directory
#
# Example: ./generate-release-notes.sh v3.1.0 v3.2.0 /path/to/influxdb /path/to/influxdb_pro /path/to/influxdb_iox
# Example: ./generate-release-notes.sh --no-fetch v3.1.0 v3.2.0 /path/to/influxdb
# Example: ./generate-release-notes.sh --pull v3.1.0 v3.2.0 /path/to/influxdb /path/to/influxdb_pro

set -e

# Parse command line options
FETCH_COMMITS=true
PULL_COMMITS=false

while [[ $# -gt 0 ]]; do
  case $1 in
    --no-fetch)
      FETCH_COMMITS=false
      shift
      ;;
    --pull)
      PULL_COMMITS=true
      FETCH_COMMITS=true
      shift
      ;;
    -*)
      echo "Unknown option $1"
      exit 1
      ;;
    *)
      break
      ;;
  esac
done

# Parse remaining arguments
|
||||
FROM_VERSION="${1:-v3.1.0}"
|
||||
TO_VERSION="${2:-v3.2.0}"
|
||||
PRIMARY_REPO="${3:-/Users/ja/Documents/github/influxdb}"
|
||||
|
||||
# Collect additional repositories (all arguments after the third)
|
||||
ADDITIONAL_REPOS=()
|
||||
shift 3 2>/dev/null || true
|
||||
while [ $# -gt 0 ]; do
|
||||
ADDITIONAL_REPOS+=("$1")
|
||||
shift
|
||||
done
|
||||
|
||||
# Colors for output
|
||||
RED='\033[0;31m'
|
||||
GREEN='\033[0;32m'
|
||||
YELLOW='\033[0;33m'
|
||||
BLUE='\033[0;34m'
|
||||
NC='\033[0m' # No Color
|
||||
|
||||
echo -e "${BLUE}Generating release notes for ${TO_VERSION}${NC}"
|
||||
echo -e "Primary Repository: ${PRIMARY_REPO}"
|
||||
if [ ${#ADDITIONAL_REPOS[@]} -gt 0 ]; then
|
||||
echo -e "Additional Repositories:"
|
||||
for repo in "${ADDITIONAL_REPOS[@]}"; do
|
||||
echo -e " - ${repo}"
|
||||
done
|
||||
fi
|
||||
echo -e "From: ${FROM_VERSION} To: ${TO_VERSION}\n"
|
||||
|
||||
# Function to extract PR number from commit message
|
||||
extract_pr_number() {
|
||||
echo "$1" | grep -oE '#[0-9]+' | head -1 | sed 's/#//'
|
||||
}
|
||||
|
||||
# Function to get commits from a repository
|
||||
get_commits_from_repo() {
|
||||
local repo_path="$1"
|
||||
local pattern="$2"
|
||||
local format="${3:-%h %s}"
|
||||
|
||||
if [ -d "$repo_path" ]; then
|
||||
git -C "$repo_path" log --format="$format" "${FROM_VERSION}..${TO_VERSION}" 2>/dev/null | grep -E "$pattern" || true
|
||||
fi
|
||||
}
|
||||
|
||||
# Function to analyze API-related commits
|
||||
analyze_api_changes() {
|
||||
local repo_path="$1"
|
||||
local repo_name="$2"
|
||||
|
||||
if [ ! -d "$repo_path" ]; then
|
||||
return
|
||||
fi
|
||||
|
||||
# Look for API-related file changes
|
||||
local api_files=$(git -C "$repo_path" diff --name-only "${FROM_VERSION}..${TO_VERSION}" 2>/dev/null | grep -E "(api|handler|endpoint|route)" | head -10 || true)
|
||||
|
||||
# Look for specific API endpoint patterns in commit messages and diffs
|
||||
local api_commits=$(git -C "$repo_path" log --format="%h %s" "${FROM_VERSION}..${TO_VERSION}" 2>/dev/null | \
|
||||
grep -iE "(api|endpoint|/write|/query|/ping|/health|/metrics|v1|v2|v3)" || true)
|
||||
|
||||
if [ -n "$api_files" ] || [ -n "$api_commits" ]; then
|
||||
echo " Repository: $repo_name"
|
||||
if [ -n "$api_files" ]; then
|
||||
echo " Modified API files:"
|
||||
echo "$api_files" | while read -r file; do
|
||||
echo " - $file"
|
||||
done
|
||||
fi
|
||||
if [ -n "$api_commits" ]; then
|
||||
echo " API-related commits:"
|
||||
echo "$api_commits" | while read -r commit; do
|
||||
echo " - $commit"
|
||||
done
|
||||
fi
|
||||
echo
|
||||
fi
|
||||
}
|
||||
|
||||
# Get the release date
|
||||
RELEASE_DATE=$(git -C "$PRIMARY_REPO" log -1 --format=%ai "$TO_VERSION" | cut -d' ' -f1)
|
||||
echo -e "${GREEN}Release Date: ${RELEASE_DATE}${NC}\n"
|
||||
|
||||
# Create array of all repositories
|
||||
ALL_REPOS=("$PRIMARY_REPO")
|
||||
for repo in "${ADDITIONAL_REPOS[@]}"; do
|
||||
ALL_REPOS+=("$repo")
|
||||
done
|
||||
|
||||
# Fetch latest commits from all repositories (if enabled)
|
||||
if [ "$FETCH_COMMITS" = true ]; then
|
||||
if [ "$PULL_COMMITS" = true ]; then
|
||||
echo -e "${YELLOW}Pulling latest changes from all repositories...${NC}"
|
||||
echo -e "${RED}Warning: This will modify your working directories!${NC}"
|
||||
else
|
||||
echo -e "${YELLOW}Fetching latest commits from all repositories...${NC}"
|
||||
fi
|
||||
|
||||
for repo in "${ALL_REPOS[@]}"; do
|
||||
if [ -d "$repo" ]; then
|
||||
repo_name=$(basename "$repo")
|
||||
|
||||
if [ "$PULL_COMMITS" = true ]; then
|
||||
echo -e " Pulling changes in $repo_name..."
|
||||
if git -C "$repo" pull origin 2>/dev/null; then
|
||||
echo -e " ${GREEN}✓${NC} Successfully pulled changes in $repo_name"
|
||||
else
|
||||
echo -e " ${RED}✗${NC} Failed to pull changes in $repo_name (trying fetch only)"
|
||||
if git -C "$repo" fetch origin 2>/dev/null; then
|
||||
echo -e " ${GREEN}✓${NC} Successfully fetched from $repo_name"
|
||||
else
|
||||
echo -e " ${RED}✗${NC} Failed to fetch from $repo_name (continuing with local commits)"
|
||||
fi
|
||||
fi
|
||||
else
|
||||
echo -e " Fetching from $repo_name..."
|
||||
if git -C "$repo" fetch origin 2>/dev/null; then
|
||||
echo -e " ${GREEN}✓${NC} Successfully fetched from $repo_name"
|
||||
else
|
||||
echo -e " ${RED}✗${NC} Failed to fetch from $repo_name (continuing with local commits)"
|
||||
fi
|
||||
fi
|
||||
else
|
||||
echo -e " ${RED}✗${NC} Repository not found: $repo"
|
||||
fi
|
||||
done
|
||||
else
|
||||
echo -e "${YELLOW}Skipping fetch (using local commits only)${NC}"
|
||||
fi
|
||||
|
||||
# Collect commits by category from all repositories
|
||||
echo -e "\n${YELLOW}Analyzing commits across all repositories...${NC}"
|
||||
|
||||
# Initialize variables
|
||||
FEATURES=""
|
||||
FIXES=""
|
||||
BREAKING=""
|
||||
PERF=""
|
||||
API_CHANGES=""
|
||||
|
||||
# Collect commits from all repositories
|
||||
for repo in "${ALL_REPOS[@]}"; do
|
||||
if [ -d "$repo" ]; then
|
||||
repo_name=$(basename "$repo")
|
||||
echo -e " Analyzing $repo_name..."
|
||||
|
||||
# Features
|
||||
repo_features=$(get_commits_from_repo "$repo" "^[a-f0-9]+ feat:" | sed "s/^[a-f0-9]* feat: /- [$repo_name] /")
|
||||
if [ -n "$repo_features" ]; then
|
||||
FEATURES="$FEATURES$repo_features"$'\n'
|
||||
fi
|
||||
|
||||
# Fixes
|
||||
repo_fixes=$(get_commits_from_repo "$repo" "^[a-f0-9]+ fix:" | sed "s/^[a-f0-9]* fix: /- [$repo_name] /")
|
||||
if [ -n "$repo_fixes" ]; then
|
||||
FIXES="$FIXES$repo_fixes"$'\n'
|
||||
fi
|
||||
|
||||
# Breaking changes
|
||||
repo_breaking=$(get_commits_from_repo "$repo" "^[a-f0-9]+ .*(BREAKING|breaking change)" | sed "s/^[a-f0-9]* /- [$repo_name] /")
|
||||
if [ -n "$repo_breaking" ]; then
|
||||
BREAKING="$BREAKING$repo_breaking"$'\n'
|
||||
fi
|
||||
|
||||
# Performance improvements
|
||||
repo_perf=$(get_commits_from_repo "$repo" "^[a-f0-9]+ perf:" | sed "s/^[a-f0-9]* perf: /- [$repo_name] /")
|
||||
if [ -n "$repo_perf" ]; then
|
||||
PERF="$PERF$repo_perf"$'\n'
|
||||
fi
|
||||
|
||||
# API changes
|
||||
repo_api=$(get_commits_from_repo "$repo" "(api|endpoint|/write|/query|/ping|/health|/metrics|v1|v2|v3)" | sed "s/^[a-f0-9]* /- [$repo_name] /")
|
||||
if [ -n "$repo_api" ]; then
|
||||
API_CHANGES="$API_CHANGES$repo_api"$'\n'
|
||||
fi
|
||||
fi
|
||||
done
|
||||
|
||||
# Analyze API changes in detail
|
||||
echo -e "\n${YELLOW}Analyzing HTTP API changes...${NC}"
|
||||
for repo in "${ALL_REPOS[@]}"; do
|
||||
repo_name=$(basename "$repo")
|
||||
analyze_api_changes "$repo" "$repo_name"
|
||||
done
|
||||
|
||||
# Generate markdown output
|
||||
OUTPUT_FILE="release-notes-${TO_VERSION}.md"
|
||||
cat > "$OUTPUT_FILE" << EOF
|
||||
## ${TO_VERSION} {date="${RELEASE_DATE}"}
|
||||
|
||||
### Features
|
||||
|
||||
EOF
|
||||
|
||||
# Add features
|
||||
if [ -n "$FEATURES" ]; then
|
||||
echo "$FEATURES" | while IFS= read -r line; do
|
||||
if [ -n "$line" ]; then
|
||||
PR=$(extract_pr_number "$line")
|
||||
# Clean up the commit message
|
||||
CLEAN_LINE=$(echo "$line" | sed -E 's/ \(#[0-9]+\)$//')
|
||||
if [ -n "$PR" ]; then
|
||||
echo "$CLEAN_LINE ([#$PR](https://github.com/influxdata/influxdb/pull/$PR))" >> "$OUTPUT_FILE"
|
||||
else
|
||||
echo "$CLEAN_LINE" >> "$OUTPUT_FILE"
|
||||
fi
|
||||
fi
|
||||
done
|
||||
else
|
||||
echo "- No new features in this release" >> "$OUTPUT_FILE"
|
||||
fi
|
||||
|
||||
# Add bug fixes
|
||||
cat >> "$OUTPUT_FILE" << EOF
|
||||
|
||||
### Bug Fixes
|
||||
|
||||
EOF
|
||||
|
||||
if [ -n "$FIXES" ]; then
|
||||
echo "$FIXES" | while IFS= read -r line; do
|
||||
if [ -n "$line" ]; then
|
||||
PR=$(extract_pr_number "$line")
|
||||
CLEAN_LINE=$(echo "$line" | sed -E 's/ \(#[0-9]+\)$//')
|
||||
if [ -n "$PR" ]; then
|
||||
echo "$CLEAN_LINE ([#$PR](https://github.com/influxdata/influxdb/pull/$PR))" >> "$OUTPUT_FILE"
|
||||
else
|
||||
echo "$CLEAN_LINE" >> "$OUTPUT_FILE"
|
||||
fi
|
||||
fi
|
||||
done
|
||||
else
|
||||
echo "- No bug fixes in this release" >> "$OUTPUT_FILE"
|
||||
fi
|
||||
|
||||
# Add breaking changes if any
|
||||
if [ -n "$BREAKING" ]; then
|
||||
cat >> "$OUTPUT_FILE" << EOF
|
||||
|
||||
### Breaking Changes
|
||||
|
||||
EOF
|
||||
echo "$BREAKING" | while IFS= read -r line; do
|
||||
if [ -n "$line" ]; then
|
||||
PR=$(extract_pr_number "$line")
|
||||
CLEAN_LINE=$(echo "$line" | sed -E 's/ \(#[0-9]+\)$//')
|
||||
if [ -n "$PR" ]; then
|
||||
echo "$CLEAN_LINE ([#$PR](https://github.com/influxdata/influxdb/pull/$PR))" >> "$OUTPUT_FILE"
|
||||
else
|
||||
echo "$CLEAN_LINE" >> "$OUTPUT_FILE"
|
||||
fi
|
||||
fi
|
||||
done
|
||||
fi
|
||||
|
||||
# Add performance improvements if any
|
||||
if [ -n "$PERF" ]; then
|
||||
cat >> "$OUTPUT_FILE" << EOF
|
||||
|
||||
### Performance Improvements
|
||||
|
||||
EOF
|
||||
echo "$PERF" | while IFS= read -r line; do
|
||||
if [ -n "$line" ]; then
|
||||
PR=$(extract_pr_number "$line")
|
||||
CLEAN_LINE=$(echo "$line" | sed -E 's/ \(#[0-9]+\)$//')
|
||||
if [ -n "$PR" ]; then
|
||||
echo "$CLEAN_LINE ([#$PR](https://github.com/influxdata/influxdb/pull/$PR))" >> "$OUTPUT_FILE"
|
||||
else
|
||||
echo "$CLEAN_LINE" >> "$OUTPUT_FILE"
|
||||
fi
|
||||
fi
|
||||
done
|
||||
fi
|
||||
|
||||
# Add HTTP API changes if any
|
||||
if [ -n "$API_CHANGES" ]; then
|
||||
cat >> "$OUTPUT_FILE" << EOF
|
||||
|
||||
### HTTP API Changes
|
||||
|
||||
EOF
|
||||
echo "$API_CHANGES" | while IFS= read -r line; do
|
||||
if [ -n "$line" ]; then
|
||||
PR=$(extract_pr_number "$line")
|
||||
CLEAN_LINE=$(echo "$line" | sed -E 's/ \(#[0-9]+\)$//')
|
||||
if [ -n "$PR" ]; then
|
||||
echo "$CLEAN_LINE ([#$PR](https://github.com/influxdata/influxdb/pull/$PR))" >> "$OUTPUT_FILE"
|
||||
else
|
||||
echo "$CLEAN_LINE" >> "$OUTPUT_FILE"
|
||||
fi
|
||||
fi
|
||||
done
|
||||
fi
|
||||
|
||||
# Add API analysis summary
|
||||
cat >> "$OUTPUT_FILE" << EOF
|
||||
|
||||
### API Analysis Summary
|
||||
|
||||
The following endpoints may have been affected in this release:
|
||||
- v1 API endpoints: \`/write\`, \`/query\`, \`/ping\`
|
||||
- v2 API endpoints: \`/api/v2/write\`, \`/api/v2/query\`
|
||||
- v3 API endpoints: \`/api/v3/*\`
|
||||
- System endpoints: \`/health\`, \`/metrics\`
|
||||
|
||||
Please review the commit details above and consult the API documentation for specific changes.
|
||||
|
||||
EOF
|
||||
|
||||
echo -e "\n${GREEN}Release notes generated in: ${OUTPUT_FILE}${NC}"
|
||||
echo -e "${YELLOW}Please review and edit the generated notes before adding to documentation.${NC}"
|
||||
echo -e "${BLUE}API changes have been automatically detected and included.${NC}"
|
||||
|
|
@@ -0,0 +1,36 @@

# InfluxDB 3 distributed (Cloud Dedicated and Clustered) documentation helper scripts

This directory contains scripts designed to help make specific maintenance
processes easier.

## InfluxDB Clustered release artifacts

**Script:** `./clustered-release-artifacts.sh`

Each InfluxDB Clustered release has the following associated artifacts that need
to be provided with the release notes:

- `example-customer.yaml`
- `app-instance-schema.json`

This script uses an InfluxDB Clustered pull secret to pull down the required
assets and store them in `static/downloads/clustered-release-artifacts/<RELEASE>`.

1. **Set up the pull secret:**

   The **Clustered Pull Secret** (`config.json`) is available in the Docs Team
   1Password vault. Download the pull secret and store it in the
   `/tmp/influxdbsecret` directory on your local machine.

2. **Install dependencies:**

   - [Install `crane`](https://github.com/google/go-containerregistry/tree/main/cmd/crane#installation).
   - [Install `jq`](https://jqlang.org/download/).

3. **Run the script:**

   From the root of the docs project directory, run the following command to
   execute the script. Provide the release version as an argument. For example:

   ```sh
   sh ./helper-scripts/clustered-release-artifacts.sh 20250508-1719206
   ```
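After the script finishes, you can sanity-check that both artifacts landed where expected. The directory layout comes from the description above; the helper function below is a convenience sketch, not part of the repository's scripts:

```sh
# Verify that both release artifacts exist for a given release.
check_artifacts() {
  local release="$1"
  local dir="static/downloads/clustered-release-artifacts/${release}"
  local f status=0
  for f in example-customer.yaml app-instance-schema.json; do
    if [ -f "${dir}/${f}" ]; then
      echo "OK: ${f}"
    else
      echo "MISSING: ${f}"
      status=1
    fi
  done
  return $status
}

check_artifacts 20250508-1719206 || echo "Re-run clustered-release-artifacts.sh for this release."
```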
|
@ -0,0 +1,373 @@
|
|||
# InfluxDB 3 Monolith (Core and Enterprise) Helper Scripts
|
||||
|
||||
This directory contains helper scripts specifically for InfluxDB 3 Core and Enterprise (monolith deployments), as opposed to distributed/clustered deployments.
|
||||
|
||||
## Overview
|
||||
|
||||
These scripts help with documentation workflows for InfluxDB 3 Core and Enterprise, including CLI change detection, authentication setup, API analysis, and release preparation.
|
||||
|
||||
## Prerequisites
|
||||
|
||||
- **Docker and Docker Compose**: For running InfluxDB 3 containers
|
||||
- **Node.js 16+**: For running JavaScript ESM scripts
|
||||
- **Active containers**: InfluxDB 3 Core and/or Enterprise containers running via `docker compose`
|
||||
- **Secret files**: Docker Compose secrets for auth tokens (`~/.env.influxdb3-core-admin-token` and `~/.env.influxdb3-enterprise-admin-token`)
|
||||
|
||||
## Scripts
|
||||
|
||||
### 🔐 Authentication & Setup
|
||||
|
||||
#### `setup-auth-tokens.sh`
|
||||
Creates and configures authentication tokens for InfluxDB 3 containers.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
./setup-auth-tokens.sh [core|enterprise|both]
|
||||
```
|
||||
|
||||
**What it does:**
|
||||
- Checks existing tokens in secret files (`~/.env.influxdb3-core-admin-token` and `~/.env.influxdb3-enterprise-admin-token`)
|
||||
- Starts containers if not running
|
||||
- Creates admin tokens using `influxdb3 create token --admin`
|
||||
- Updates appropriate secret files with new tokens
|
||||
- Tests tokens to ensure they work
|
||||
|
||||
**Example:**
|
||||
```bash
|
||||
# Set up both Core and Enterprise tokens
|
||||
./setup-auth-tokens.sh both
|
||||
|
||||
# Set up only Enterprise
|
||||
./setup-auth-tokens.sh enterprise
|
||||
```
|
||||
|
||||
### 🔍 CLI Documentation Audit
|
||||
|
||||
#### `audit-cli-documentation.js`
|
||||
JavaScript ESM script that audits InfluxDB 3 CLI commands against existing documentation to identify missing or outdated content.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
node audit-cli-documentation.js [core|enterprise|both] [version|local]
|
||||
```
|
||||
|
||||
**Features:**
|
||||
- Compares actual CLI help output with documented commands
|
||||
- Identifies missing documentation for new CLI options
|
||||
- Finds documented options that no longer exist in the CLI
|
||||
- Supports both released versions and local containers
|
||||
- Generates detailed audit reports with recommendations
|
||||
- Handles authentication automatically using Docker secrets
|
||||
|
||||
**Examples:**
|
||||
```bash
|
||||
# Audit Core documentation against local container
|
||||
node audit-cli-documentation.js core local
|
||||
|
||||
# Audit Enterprise documentation against specific version
|
||||
node audit-cli-documentation.js enterprise v3.2.0
|
||||
|
||||
# Audit both products against local containers
|
||||
node audit-cli-documentation.js both local
|
||||
```
|
||||
|
||||
**Output:**
|
||||
- `../output/cli-audit/documentation-audit-{product}-{version}.md` - Detailed audit report
|
||||
- `../output/cli-audit/parsed-cli-{product}-{version}.md` - Parsed CLI structure
|
||||
- `../output/cli-audit/patches/{product}/` - Generated patches for missing documentation
|
||||
|
||||
### 🛠️ CLI Documentation Updates
|
||||
|
||||
#### `apply-cli-patches.js`
|
||||
JavaScript ESM script that applies generated patches to update CLI documentation with missing options.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
node apply-cli-patches.js [core|enterprise|both] [--dry-run]
|
||||
```
|
||||
|
||||
**Features:**
|
||||
- Applies patches generated by `audit-cli-documentation.js`
|
||||
- Updates CLI reference documentation with missing options
|
||||
- Supports dry-run mode to preview changes
|
||||
- Maintains existing documentation structure and formatting
|
||||
- Creates backups before applying changes
|
||||
|
||||
**Examples:**
|
||||
```bash
|
||||
# Preview changes without applying (dry run)
|
||||
node apply-cli-patches.js core --dry-run
|
||||
|
||||
# Apply patches to Enterprise documentation
|
||||
node apply-cli-patches.js enterprise
|
||||
|
||||
# Apply patches to both products
|
||||
node apply-cli-patches.js both
|
||||
```
|
||||
|
||||
**Output:**
|
||||
- Updates CLI reference documentation files in place
|
||||
- Creates backup files with `.backup` extension
|
||||
- Logs all changes made to the documentation
|
||||
|
||||
## Quick Start Guide
|
||||
|
||||
### 1. Initial Setup
|
||||
|
||||
```bash
|
||||
# Navigate to the monolith scripts directory
|
||||
cd helper-scripts/influxdb3-monolith
|
||||
|
||||
# Make scripts executable
|
||||
chmod +x *.sh
|
||||
|
||||
# Set up authentication for both products
|
||||
./setup-auth-tokens.sh both
|
||||
|
||||
# Restart containers to load new secrets
|
||||
docker compose down && docker compose up -d influxdb3-core influxdb3-enterprise
|
||||
```
|
||||
|
||||
### 2. CLI Documentation Audit
|
||||
|
||||
```bash
|
||||
# Start your containers
|
||||
docker compose up -d influxdb3-core influxdb3-enterprise
|
||||
|
||||
# Audit CLI documentation
|
||||
node audit-cli-documentation.js core local
|
||||
node audit-cli-documentation.js enterprise local
|
||||
|
||||
# Review the output
|
||||
ls ../output/cli-audit/
|
||||
```
|
||||
|
||||
### 3. Development Workflow
|
||||
|
||||
```bash
|
||||
# Audit documentation for both products
|
||||
node audit-cli-documentation.js both local
|
||||
|
||||
# Check the audit results
|
||||
cat ../output/cli-audit/documentation-audit-core-local.md
|
||||
cat ../output/cli-audit/documentation-audit-enterprise-local.md
|
||||
|
||||
# Apply patches if needed (dry run first)
|
||||
node apply-cli-patches.js both --dry-run
|
||||
```
|
||||
|
||||
### 4. Release Documentation Updates
|
||||
|
||||
For release documentation, use the audit and patch workflow:
|
||||
|
||||
```bash
|
||||
# Audit against released version
|
||||
node audit-cli-documentation.js enterprise v3.2.0
|
||||
|
||||
# Review missing documentation
|
||||
cat ../output/cli-audit/documentation-audit-enterprise-v3.2.0.md
|
||||
|
||||
# Apply patches to update documentation
|
||||
node apply-cli-patches.js enterprise
|
||||
|
||||
# Verify changes look correct
|
||||
git diff content/influxdb3/enterprise/reference/cli/
|
||||
```
|
||||
|
||||
## Container Integration
|
||||
|
||||
The scripts work with your Docker Compose setup:
|
||||
|
||||
**Expected container names:**
|
||||
- `influxdb3-core` (port 8282)
|
||||
- `influxdb3-enterprise` (port 8181)
|
||||
|
||||
**Docker Compose secrets:**
|
||||
- `influxdb3-core-admin-token` - Admin token for Core (stored in `~/.env.influxdb3-core-admin-token`)
|
||||
- `influxdb3-enterprise-admin-token` - Admin token for Enterprise (stored in `~/.env.influxdb3-enterprise-admin-token`)
|
||||
- `INFLUXDB3_LICENSE_EMAIL` - Enterprise license email (set in `.env.3ent` env_file)
|
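The container names, ports, and secret paths listed above can be expressed as a Compose file. This is a minimal sketch only: the service names, host ports, secret file paths, and `.env.3ent` env_file come from this README, while the image tags and the container-side port are assumptions:

```yaml
# Hypothetical compose.yaml fragment matching the expectations above.
services:
  influxdb3-core:
    image: influxdb:3-core        # image tag is an assumption
    ports:
      - "8282:8181"               # host port 8282 per this README
    secrets:
      - influxdb3-core-admin-token

  influxdb3-enterprise:
    image: influxdb:3-enterprise  # image tag is an assumption
    env_file:
      - .env.3ent                 # provides INFLUXDB3_LICENSE_EMAIL
    ports:
      - "8181:8181"
    secrets:
      - influxdb3-enterprise-admin-token

secrets:
  influxdb3-core-admin-token:
    file: ~/.env.influxdb3-core-admin-token
  influxdb3-enterprise-admin-token:
    file: ~/.env.influxdb3-enterprise-admin-token
```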

## Use Cases

### 📋 Release Documentation

1. **Pre-release audit:**
   ```bash
   node audit-cli-documentation.js core v3.2.0
   ```
2. **Review audit results and update documentation**
3. **Apply patches for missing content**
4. **Test that documented commands work correctly**

### 🔬 Development Testing

1. **Audit local development:**
   ```bash
   node audit-cli-documentation.js enterprise local
   ```
2. **Verify new features are documented**
3. **Test authentication setup**
4. **Apply patches to keep docs current**

### 🚀 Release Preparation

1. **Final audit before release:**
   ```bash
   node audit-cli-documentation.js both local
   ```
2. **Apply all pending patches**
3. **Update examples and tutorials**
4. **Verify all CLI commands work as documented**

## Output Structure

```
helper-scripts/
├── output/
│   └── cli-audit/
│       ├── documentation-audit-core-local.md          # CLI documentation audit report
│       ├── documentation-audit-enterprise-v3.2.0.md   # CLI documentation audit report
│       ├── parsed-cli-core-local.md                   # Parsed CLI structure
│       ├── parsed-cli-enterprise-v3.2.0.md            # Parsed CLI structure
│       └── patches/
│           ├── core/                                  # Generated patches for Core
│           │   ├── influxdb3-cli-patch-001.md
│           │   └── influxdb3-cli-patch-002.md
│           └── enterprise/                            # Generated patches for Enterprise
│               ├── influxdb3-cli-patch-001.md
│               └── influxdb3-cli-patch-002.md
└── influxdb3-monolith/
    ├── README.md                                      # This file
    ├── setup-auth-tokens.sh                           # Auth setup
    ├── audit-cli-documentation.js                     # CLI documentation audit
    └── apply-cli-patches.js                           # CLI documentation patches
```

## Error Handling

### Common Issues

**Container not running:**

```bash
# Check status
docker compose ps

# Start a specific service
docker compose up -d influxdb3-core
```

**Authentication failures:**

```bash
# Recreate tokens
./setup-auth-tokens.sh both

# Test manually
docker exec influxdb3-core influxdb3 create token --admin
```

**Version not found:**

```bash
# Check that the versioned images are available
docker pull influxdb:3.2.0-core
docker pull influxdb:3.2.0-enterprise
```

### Debug Mode

Enable debug output for troubleshooting:

```bash
DEBUG=1 node audit-cli-documentation.js core local
```

## Integration with CI/CD

### GitHub Actions Example

```yaml
- name: Audit CLI Documentation
  run: |
    cd helper-scripts/influxdb3-monolith
    node audit-cli-documentation.js core ${{ env.VERSION }}

- name: Upload CLI Audit Results
  uses: actions/upload-artifact@v3
  with:
    name: cli-audit
    path: helper-scripts/output/cli-audit/
```

### CircleCI Example

```yaml
- run:
    name: CLI Documentation Audit
    command: |
      cd helper-scripts/influxdb3-monolith
      node audit-cli-documentation.js enterprise v3.2.0

- store_artifacts:
    path: helper-scripts/output/cli-audit/
```

## Best Practices

### 🔒 Security

- Secret files (`~/.env.influxdb3-*-admin-token`) are stored in your home directory and not in version control
- Rotate auth tokens regularly by re-running `setup-auth-tokens.sh`
- Use minimal token permissions when possible

### 📚 Documentation

- Run audits early in the release cycle
- Review all audit reports for missing content
- Apply patches to keep documentation current
- Test that all documented commands work correctly

### 🔄 Workflow

- Use the `local` version for development testing
- Audit against released versions for release prep
- Generate patches before documentation updates
- Validate changes with stakeholders

## Troubleshooting

### Script Permissions

```bash
chmod +x *.sh
```

### Missing Dependencies

```bash
# Node.js
node --version  # Should be 16 or higher

# Docker Compose
docker compose version
```

### Container Health

```bash
# Check container logs
docker logs influxdb3-core
docker logs influxdb3-enterprise

# Test basic connectivity
docker exec influxdb3-core influxdb3 --version
```

## Contributing

When adding new scripts to this directory:

1. **Follow naming conventions**: Use lowercase with hyphens
2. **Add usage documentation**: Include help text in scripts
3. **Handle errors gracefully**: Use proper exit codes
4. **Test with both products**: Ensure Core and Enterprise compatibility
5. **Update this README**: Document new functionality

## Related Documentation

- [InfluxDB 3 Core CLI Reference](/influxdb3/core/reference/cli/)
- [InfluxDB 3 Enterprise CLI Reference](/influxdb3/enterprise/reference/cli/)
@@ -0,0 +1,277 @@

#!/usr/bin/env node

/**
 * Apply CLI documentation patches generated by audit-cli-documentation.js
 * Usage: node apply-cli-patches.js [core|enterprise|both] [--dry-run]
 */

import { promises as fs } from 'fs';
import { join, dirname } from 'path';
import { fileURLToPath } from 'url';
// node:process has a default export, not a named `process` export
import process from 'node:process';

const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

// Color codes
const Colors = {
  RED: '\x1b[0;31m',
  GREEN: '\x1b[0;32m',
  YELLOW: '\x1b[1;33m',
  BLUE: '\x1b[0;34m',
  NC: '\x1b[0m', // No Color
};

async function fileExists(path) {
  try {
    await fs.access(path);
    return true;
  } catch {
    return false;
  }
}

async function ensureDir(dir) {
  await fs.mkdir(dir, { recursive: true });
}

async function extractFrontmatter(content) {
  const lines = content.split('\n');
  if (lines[0] !== '---') return { frontmatter: null, content };

  const frontmatterLines = [];
  let i = 1;
  while (i < lines.length && lines[i] !== '---') {
    frontmatterLines.push(lines[i]);
    i++;
  }

  if (i >= lines.length) return { frontmatter: null, content };

  const frontmatterText = frontmatterLines.join('\n');
  const remainingContent = lines.slice(i + 1).join('\n');

  return { frontmatter: frontmatterText, content: remainingContent };
}

async function getActualDocumentationPath(docPath, projectRoot) {
  // Check if the documentation file exists and has a source field
  const fullPath = join(projectRoot, docPath);

  if (await fileExists(fullPath)) {
    const content = await fs.readFile(fullPath, 'utf8');
    const { frontmatter } = await extractFrontmatter(content);

    if (frontmatter) {
      // Look for source: field in frontmatter
      const sourceMatch = frontmatter.match(/^source:\s*(.+)$/m);
      if (sourceMatch) {
        const sourcePath = sourceMatch[1].trim();
        return sourcePath;
      }
    }
  }

  return docPath;
}

async function applyPatches(product, dryRun = false) {
  const patchDir = join(
    dirname(__dirname),
    'output',
    'cli-audit',
    'patches',
    product
  );
  const projectRoot = join(__dirname, '..', '..');

  console.log(
    `${Colors.BLUE}📋 Applying CLI documentation patches for ${product}${Colors.NC}`
  );
  if (dryRun) {
    console.log(
      `${Colors.YELLOW}🔍 DRY RUN - No files will be created${Colors.NC}`
    );
  }
  console.log();

  // Check if patch directory exists
  if (!(await fileExists(patchDir))) {
    console.log(`${Colors.YELLOW}No patches found for ${product}.${Colors.NC}`);
    console.log("Run 'yarn audit:cli' first to generate patches.");
    return;
  }

  // Read all patch files
  const patchFiles = await fs.readdir(patchDir);
  const mdFiles = patchFiles.filter((f) => f.endsWith('.md'));

  if (mdFiles.length === 0) {
    console.log(
      `${Colors.YELLOW}No patch files found in ${patchDir}${Colors.NC}`
    );
    return;
  }

  console.log(`Found ${mdFiles.length} patch file(s) to apply:\n`);

  // Map patch files to their destination
  const baseCliPath = `content/influxdb3/${product}/reference/cli/influxdb3`;
  const commandToFile = {
    'create-database.md': `${baseCliPath}/create/database.md`,
    'create-token.md': `${baseCliPath}/create/token/_index.md`,
    'create-token-admin.md': `${baseCliPath}/create/token/admin.md`,
    'create-trigger.md': `${baseCliPath}/create/trigger.md`,
    'create-table.md': `${baseCliPath}/create/table.md`,
    'create-last_cache.md': `${baseCliPath}/create/last_cache.md`,
    'create-distinct_cache.md': `${baseCliPath}/create/distinct_cache.md`,
    'show-databases.md': `${baseCliPath}/show/databases.md`,
    'show-tokens.md': `${baseCliPath}/show/tokens.md`,
    'delete-database.md': `${baseCliPath}/delete/database.md`,
    'delete-table.md': `${baseCliPath}/delete/table.md`,
    'query.md': `${baseCliPath}/query.md`,
    'write.md': `${baseCliPath}/write.md`,
  };

  let applied = 0;
  let skipped = 0;

  for (const patchFile of mdFiles) {
    const destinationPath = commandToFile[patchFile];

    if (!destinationPath) {
      console.log(
        `${Colors.YELLOW}⚠️  Unknown patch file: ${patchFile}${Colors.NC}`
      );
      continue;
    }

    // Get the actual documentation path (handles source: frontmatter)
    const actualPath = await getActualDocumentationPath(
      destinationPath,
      projectRoot
    );
    const fullDestPath = join(projectRoot, actualPath);
    const patchPath = join(patchDir, patchFile);

    // Check if destination already exists
    if (await fileExists(fullDestPath)) {
      console.log(
        `${Colors.YELLOW}⏭️  Skipping${Colors.NC} ${patchFile} - destination already exists:`
      );
      console.log(`   ${actualPath}`);
      skipped++;
      continue;
    }

    if (dryRun) {
      console.log(`${Colors.BLUE}🔍 Would create${Colors.NC} ${actualPath}`);
      console.log(`   from patch: ${patchFile}`);
      if (actualPath !== destinationPath) {
        console.log(`   (resolved from: ${destinationPath})`);
      }
      applied++;
    } else {
      try {
        // Ensure destination directory exists
        await ensureDir(dirname(fullDestPath));

        // Copy patch to destination
        const content = await fs.readFile(patchPath, 'utf8');

        // Update the menu configuration based on product
        let updatedContent = content;
        if (product === 'enterprise') {
          updatedContent = content
            .replace('influxdb3/core/tags:', 'influxdb3/enterprise/tags:')
            .replace(
              'influxdb3_core_reference:',
              'influxdb3_enterprise_reference:'
            );
        }

        await fs.writeFile(fullDestPath, updatedContent);

        console.log(`${Colors.GREEN}✅ Created${Colors.NC} ${actualPath}`);
        console.log(`   from patch: ${patchFile}`);
        if (actualPath !== destinationPath) {
          console.log(`   (resolved from: ${destinationPath})`);
        }
        applied++;
      } catch (error) {
        console.log(
          `${Colors.RED}❌ Error${Colors.NC} creating ${actualPath}:`
        );
        console.log(`   ${error.message}`);
      }
    }
  }

  console.log();
  console.log(`${Colors.BLUE}Summary:${Colors.NC}`);
  console.log(`- Patches ${dryRun ? 'would be' : ''} applied: ${applied}`);
  console.log(`- Files skipped (already exist): ${skipped}`);
  console.log(`- Total patch files: ${mdFiles.length}`);

  if (!dryRun && applied > 0) {
    console.log();
    console.log(
      `${Colors.GREEN}✨ Success!${Colors.NC} Created ${applied} new ` +
        'documentation file(s).'
    );
    console.log();
    console.log('Next steps:');
    console.log('1. Review the generated files and customize the content');
    console.log('2. Add proper examples with placeholders');
    console.log('3. Update descriptions and add any missing options');
    console.log('4. Run tests: yarn test:links');
  }
}

async function main() {
  const args = process.argv.slice(2);
  const product =
    args.find((arg) => ['core', 'enterprise', 'both'].includes(arg)) || 'both';
  const dryRun = args.includes('--dry-run');

  if (args.includes('--help') || args.includes('-h')) {
    console.log(
      'Usage: node apply-cli-patches.js [core|enterprise|both] [--dry-run]'
    );
    console.log();
    console.log('Options:');
    console.log(
      '  --dry-run    Show what would be done without creating files'
    );
    console.log();
    console.log('Examples:');
    console.log(
      '  node apply-cli-patches.js                    # Apply patches for both products'
    );
    console.log(
      '  node apply-cli-patches.js core --dry-run     # Preview core patches'
    );
    console.log(
      '  node apply-cli-patches.js enterprise         # Apply enterprise patches'
    );
    process.exit(0);
  }

  try {
    if (product === 'both') {
      await applyPatches('core', dryRun);
      console.log();
      await applyPatches('enterprise', dryRun);
    } else {
      await applyPatches(product, dryRun);
    }
  } catch (error) {
    console.error(`${Colors.RED}Error:${Colors.NC}`, error.message);
    process.exit(1);
  }
}

// Run if called directly
if (import.meta.url === `file://${process.argv[1]}`) {
  main();
}
}
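The enterprise branch of `applyPatches` above rewrites two frontmatter keys with chained string replacements; a minimal standalone sketch of that step (the sample frontmatter below is invented for illustration):

```javascript
// Sketch of the menu rewrite applyPatches performs for enterprise patches:
// core-specific frontmatter keys are swapped for their enterprise equivalents.
function rewriteMenuForEnterprise(content) {
  return content
    .replace('influxdb3/core/tags:', 'influxdb3/enterprise/tags:')
    .replace('influxdb3_core_reference:', 'influxdb3_enterprise_reference:');
}

// Hypothetical frontmatter resembling a generated CLI doc stub.
const sample = [
  'influxdb3/core/tags: [cli]',
  'menu:',
  '  influxdb3_core_reference:',
  '    parent: influxdb3 cli',
].join('\n');

console.log(rewriteMenuForEnterprise(sample));
```

Note that `String.prototype.replace` with a string pattern only replaces the first occurrence, which matches the script's assumption that each key appears once in the frontmatter.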
@@ -0,0 +1,974 @@
#!/usr/bin/env node

/**
 * Audit CLI documentation against current CLI help output
 * Usage: node audit-cli-documentation.js [core|enterprise|both] [version]
 * Example: node audit-cli-documentation.js core 3.2.0
 */

import { spawn } from 'child_process';
import { promises as fs } from 'fs';
import { homedir } from 'os';
import { join, dirname } from 'path';
import { fileURLToPath } from 'url';
import {
  validateVersionInputs,
  getRepositoryRoot,
} from '../common/validate-tags.js';

const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

// Color codes
const Colors = {
  RED: '\x1b[0;31m',
  GREEN: '\x1b[0;32m',
  YELLOW: '\x1b[1;33m',
  BLUE: '\x1b[0;34m',
  NC: '\x1b[0m', // No Color
};
class CLIDocAuditor {
  constructor(product = 'both', version = 'local') {
    this.product = product;
    this.version = version;
    this.outputDir = join(dirname(__dirname), 'output', 'cli-audit');

    // Token paths - check environment variables first (Docker Compose), then
    // fall back to local files
    const coreTokenEnv = process.env.INFLUXDB3_CORE_TOKEN;
    const enterpriseTokenEnv = process.env.INFLUXDB3_ENTERPRISE_TOKEN;

    // fileExists() is async and can't be awaited in a constructor, so treat a
    // set environment variable as "running in Docker Compose with secrets" and
    // let loadTokens() verify the file when it reads the tokens.
    if (coreTokenEnv) {
      this.coreTokenFile = coreTokenEnv;
      this.enterpriseTokenFile = enterpriseTokenEnv;
    } else {
      // Running locally
      this.coreTokenFile = join(homedir(), '.env.influxdb3-core-admin-token');
      this.enterpriseTokenFile = join(
        homedir(),
        '.env.influxdb3-enterprise-admin-token'
      );
    }

    // Commands to extract help for
    this.mainCommands = [
      'create',
      'delete',
      'disable',
      'enable',
      'query',
      'show',
      'test',
      'update',
      'write',
    ];
    this.subcommands = [
      'create database',
      'create token admin',
      'create token',
      'create trigger',
      'create last_cache',
      'create distinct_cache',
      'create table',
      'show databases',
      'show tokens',
      'show system',
      'delete database',
      'delete table',
      'delete trigger',
      'update database',
      'test wal_plugin',
      'test schedule_plugin',
    ];

    // Map for command tracking during option parsing
    this.commandOptionsMap = {};
  }

  async fileExists(path) {
    try {
      await fs.access(path);
      return true;
    } catch {
      return false;
    }
  }

  async ensureDir(dir) {
    await fs.mkdir(dir, { recursive: true });
  }

  async loadTokens() {
    let coreToken = null;
    let enterpriseToken = null;

    try {
      if (await this.fileExists(this.coreTokenFile)) {
        const stat = await fs.stat(this.coreTokenFile);
        if (stat.size > 0) {
          coreToken = (await fs.readFile(this.coreTokenFile, 'utf8')).trim();
        }
      }
    } catch {
      // Token file doesn't exist or can't be read
    }

    try {
      if (await this.fileExists(this.enterpriseTokenFile)) {
        const stat = await fs.stat(this.enterpriseTokenFile);
        if (stat.size > 0) {
          enterpriseToken = (
            await fs.readFile(this.enterpriseTokenFile, 'utf8')
          ).trim();
        }
      }
    } catch {
      // Token file doesn't exist or can't be read
    }

    return { coreToken, enterpriseToken };
  }

  runCommand(cmd, args = []) {
    return new Promise((resolve) => {
      const child = spawn(cmd, args, { encoding: 'utf8' });
      let stdout = '';
      let stderr = '';

      child.stdout.on('data', (data) => {
        stdout += data.toString();
      });

      child.stderr.on('data', (data) => {
        stderr += data.toString();
      });

      child.on('close', (code) => {
        resolve({ code, stdout, stderr });
      });

      child.on('error', (err) => {
        resolve({ code: 1, stdout: '', stderr: err.message });
      });
    });
  }
  async extractCurrentCLI(product, outputFile) {
    process.stdout.write(
      `Extracting current CLI help from influxdb3-${product}...`
    );

    await this.loadTokens();

    if (this.version === 'local') {
      const containerName = `influxdb3-${product}`;

      // Check if container is running
      const { code, stdout } = await this.runCommand('docker', [
        'ps',
        '--format',
        '{{.Names}}',
      ]);
      if (code !== 0 || !stdout.includes(containerName)) {
        console.log(` ${Colors.RED}✗${Colors.NC}`);
        console.log(`Error: Container ${containerName} is not running.`);
        console.log(`Start it with: docker compose up -d influxdb3-${product}`);
        return false;
      }

      // Extract comprehensive help
      let fileContent = '';

      // Main help
      const mainHelp = await this.runCommand('docker', [
        'exec',
        containerName,
        'influxdb3',
        '--help',
      ]);
      fileContent += mainHelp.code === 0 ? mainHelp.stdout : mainHelp.stderr;

      // Extract all subcommand help
      for (const cmd of this.mainCommands) {
        fileContent += `\n\n===== influxdb3 ${cmd} --help =====\n`;
        const cmdHelp = await this.runCommand('docker', [
          'exec',
          containerName,
          'influxdb3',
          cmd,
          '--help',
        ]);
        fileContent += cmdHelp.code === 0 ? cmdHelp.stdout : cmdHelp.stderr;
      }

      // Extract detailed subcommand help
      for (const subcmd of this.subcommands) {
        fileContent += `\n\n===== influxdb3 ${subcmd} --help =====\n`;
        const cmdParts = [
          'exec',
          containerName,
          'influxdb3',
          ...subcmd.split(' '),
          '--help',
        ];
        const subcmdHelp = await this.runCommand('docker', cmdParts);
        fileContent +=
          subcmdHelp.code === 0 ? subcmdHelp.stdout : subcmdHelp.stderr;
      }

      await fs.writeFile(outputFile, fileContent);
      console.log(` ${Colors.GREEN}✓${Colors.NC}`);
    } else {
      // Use specific version image
      const image = `influxdb:${this.version}-${product}`;

      process.stdout.write(`Extracting CLI help from ${image}...`);

      // Pull image if needed
      const pullResult = await this.runCommand('docker', ['pull', image]);
      if (pullResult.code !== 0) {
        console.log(` ${Colors.RED}✗${Colors.NC}`);
        console.log(`Error: Failed to pull image ${image}`);
        return false;
      }

      // Extract help from specific version
      let fileContent = '';

      // Main help
      const mainHelp = await this.runCommand('docker', [
        'run',
        '--rm',
        image,
        'influxdb3',
        '--help',
      ]);
      fileContent += mainHelp.code === 0 ? mainHelp.stdout : mainHelp.stderr;

      // Extract subcommand help
      for (const cmd of this.mainCommands) {
        fileContent += `\n\n===== influxdb3 ${cmd} --help =====\n`;
        const cmdHelp = await this.runCommand('docker', [
          'run',
          '--rm',
          image,
          'influxdb3',
          cmd,
          '--help',
        ]);
        fileContent += cmdHelp.code === 0 ? cmdHelp.stdout : cmdHelp.stderr;
      }

      await fs.writeFile(outputFile, fileContent);
      console.log(` ${Colors.GREEN}✓${Colors.NC}`);
    }

    return true;
  }
  async parseCLIHelp(helpFile, parsedFile) {
    const content = await fs.readFile(helpFile, 'utf8');
    const lines = content.split('\n');

    let output = '# CLI Commands and Options\n\n';
    let currentCommand = '';
    let inOptions = false;

    for (const line of lines) {
      // Detect command headers
      if (line.startsWith('===== influxdb3') && line.endsWith('--help =====')) {
        currentCommand = line
          .replace('===== ', '')
          .replace(' --help =====', '')
          .trim();
        output += `## ${currentCommand}\n\n`;
        inOptions = false;
        // Initialize options list for this command
        this.commandOptionsMap[currentCommand] = [];
      }
      // Detect options sections
      else if (line.trim() === 'Options:') {
        output += '### Options:\n\n';
        inOptions = true;
      }
      // Parse option lines
      else if (inOptions && /^\s*-/.test(line)) {
        // Extract option and description
        const optionMatch = line.match(/--[a-z][a-z0-9-]*/);
        const shortMatch = line.match(/\s-[a-zA-Z],/);

        if (optionMatch) {
          const option = optionMatch[0];
          const shortOption = shortMatch
            ? shortMatch[0].replace(/[,\s]/g, '')
            : null;

          // Extract description by removing option parts
          let description = line.replace(/^\s*-[^\s]*\s*/, '');
          description = description.replace(/^\s*--[^\s]*\s*/, '').trim();

          if (shortOption) {
            output += `- \`${shortOption}, ${option}\`: ${description}\n`;
          } else {
            output += `- \`${option}\`: ${description}\n`;
          }

          // Store option with its command context
          if (currentCommand && option) {
            this.commandOptionsMap[currentCommand].push(option);
          }
        }
      }
      // Reset options flag for new sections
      else if (/^[A-Z][a-z]+:$/.test(line.trim())) {
        inOptions = false;
      }
    }

    await fs.writeFile(parsedFile, output);
  }

  findDocsPath(product) {
    if (product === 'core') {
      return 'content/influxdb3/core/reference/cli/influxdb3';
    } else if (product === 'enterprise') {
      return 'content/influxdb3/enterprise/reference/cli/influxdb3';
    }
    return '';
  }
  async extractCommandHelp(content, command) {
    // Find the section for this specific command in the CLI help
    const lines = content.split('\n');
    let inCommand = false;
    let helpText = [];
    const commandHeader = `===== influxdb3 ${command} --help =====`;

    for (let i = 0; i < lines.length; i++) {
      if (lines[i] === commandHeader) {
        inCommand = true;
        continue;
      }
      if (inCommand && lines[i].startsWith('===== influxdb3')) {
        break;
      }
      if (inCommand) {
        helpText.push(lines[i]);
      }
    }

    return helpText.join('\n').trim();
  }

  async generateDocumentationTemplate(command, helpText) {
    // Parse the help text to extract description and options
    const lines = helpText.split('\n');
    let description = '';
    let usage = '';
    let options = [];
    let inOptions = false;

    for (let i = 0; i < lines.length; i++) {
      const line = lines[i];

      if (i === 0 && !line.startsWith('Usage:') && line.trim()) {
        description = line.trim();
      }
      if (line.startsWith('Usage:')) {
        usage = line.replace('Usage:', '').trim();
      }
      if (line.trim() === 'Options:') {
        inOptions = true;
        continue;
      }
      if (inOptions && /^\s*-/.test(line)) {
        const optionMatch = line.match(/--([a-z][a-z0-9-]*)/);
        const shortMatch = line.match(/\s-([a-zA-Z]),/);
        if (optionMatch) {
          const optionName = optionMatch[1];
          const shortOption = shortMatch ? shortMatch[1] : null;
          let optionDesc = line
            .replace(/^\s*-[^\s]*\s*/, '')
            .replace(/^\s*--[^\s]*\s*/, '')
            .trim();

          options.push({
            name: optionName,
            short: shortOption,
            description: optionDesc,
          });
        }
      }
    }

    // Generate markdown template
    let template = `---
title: influxdb3 ${command}
description: >
  The \`influxdb3 ${command}\` command ${description.toLowerCase()}.
influxdb3/core/tags: [cli]
menu:
  influxdb3_core_reference:
    parent: influxdb3 cli
weight: 201
---

# influxdb3 ${command}

${description}

## Usage

\`\`\`bash
${usage || `influxdb3 ${command} [OPTIONS]`}
\`\`\`

`;

    if (options.length > 0) {
      template += `## Options

| Option | Description |
|--------|-------------|
`;

      for (const opt of options) {
        const optionDisplay = opt.short
          ? `\`-${opt.short}\`, \`--${opt.name}\``
          : `\`--${opt.name}\``;
        template += `| ${optionDisplay} | ${opt.description} |\n`;
      }
    }

    template += `
## Examples

### Example 1: Basic usage

{{% code-placeholders "PLACEHOLDER1|PLACEHOLDER2" %}}
\`\`\`bash
influxdb3 ${command} --example PLACEHOLDER1
\`\`\`
{{% /code-placeholders %}}

Replace the following:

- {{% code-placeholder-key %}}\`PLACEHOLDER1\`{{% /code-placeholder-key %}}: Description of placeholder
`;

    return template;
  }
  async extractFrontmatter(content) {
    const lines = content.split('\n');
    if (lines[0] !== '---') return { frontmatter: null, content };

    const frontmatterLines = [];
    let i = 1;
    while (i < lines.length && lines[i] !== '---') {
      frontmatterLines.push(lines[i]);
      i++;
    }

    if (i >= lines.length) return { frontmatter: null, content };

    const frontmatterText = frontmatterLines.join('\n');
    const remainingContent = lines.slice(i + 1).join('\n');

    return { frontmatter: frontmatterText, content: remainingContent };
  }

  async getActualContentPath(filePath) {
    // Get the actual content path, resolving source fields
    try {
      const content = await fs.readFile(filePath, 'utf8');
      const { frontmatter } = await this.extractFrontmatter(content);

      if (frontmatter) {
        const sourceMatch = frontmatter.match(/^source:\s*(.+)$/m);
        if (sourceMatch) {
          let sourcePath = sourceMatch[1].trim();
          // Handle relative paths from project root
          if (sourcePath.startsWith('/shared/')) {
            sourcePath = `content${sourcePath}`;
          }
          return sourcePath;
        }
      }
      return null; // No source field found
    } catch {
      return null;
    }
  }

  async parseDocumentedOptions(filePath) {
    // Parse a documentation file to extract all documented options
    try {
      const content = await fs.readFile(filePath, 'utf8');
      const options = [];

      // Look for options in various patterns:
      // 1. Markdown tables with option columns
      // 2. Option lists with backticks
      // 3. Code examples with --option flags

      // Pattern 1: Markdown tables (| Option | Description |)
      const tableMatches = content.match(/\|\s*`?--[a-z][a-z0-9-]*`?\s*\|/gi);
      if (tableMatches) {
        for (const match of tableMatches) {
          const option = match.match(/--[a-z][a-z0-9-]*/i);
          if (option) {
            options.push(option[0]);
          }
        }
      }

      // Pattern 2: Backtick-enclosed options in text
      const backtickMatches = content.match(/`--[a-z][a-z0-9-]*`/gi);
      if (backtickMatches) {
        for (const match of backtickMatches) {
          const option = match.replace(/`/g, '');
          options.push(option);
        }
      }

      // Pattern 3: Options in code blocks
      const codeBlockMatches = content.match(/```[\s\S]*?```/g);
      if (codeBlockMatches) {
        for (const block of codeBlockMatches) {
          const blockOptions = block.match(/--[a-z][a-z0-9-]*/gi);
          if (blockOptions) {
            options.push(...blockOptions);
          }
        }
      }

      // Pattern 4: Environment variable mappings (INFLUXDB3_* to --option)
      const envMatches = content.match(
        /\|\s*`INFLUXDB3_[^`]*`\s*\|\s*`--[a-z][a-z0-9-]*`\s*\|/gi
      );
      if (envMatches) {
        for (const match of envMatches) {
          const option = match.match(/--[a-z][a-z0-9-]*/);
          if (option) {
            options.push(option[0]);
          }
        }
      }

      // Remove duplicates and return sorted
      return [...new Set(options)].sort();
    } catch {
      return [];
    }
  }
  async auditDocs(product, cliFile, auditFile) {
    const docsPath = this.findDocsPath(product);
    const sharedPath = 'content/shared/influxdb3-cli';
    const patchDir = join(this.outputDir, 'patches', product);
    await this.ensureDir(patchDir);

    let output = `# CLI Documentation Audit - ${product}\n`;
    output += `Generated: ${new Date().toISOString()}\n\n`;

    // GitHub base URL for edit links
    const githubBase = 'https://github.com/influxdata/docs-v2/edit/master';
    const githubNewBase = 'https://github.com/influxdata/docs-v2/new/master';

    // VSCode links for local editing
    const vscodeBase = 'vscode://file';
    const projectRoot = join(__dirname, '..', '..');

    // Check for missing documentation
    output += '## Missing Documentation\n\n';

    let missingCount = 0;
    const missingDocs = [];

    // Map commands to expected documentation files
    const commandToFile = {
      'create database': 'create/database.md',
      'create token': 'create/token/_index.md',
      'create token admin': 'create/token/admin.md',
      'create trigger': 'create/trigger.md',
      'create table': 'create/table.md',
      'create last_cache': 'create/last_cache.md',
      'create distinct_cache': 'create/distinct_cache.md',
      'show databases': 'show/databases.md',
      'show tokens': 'show/tokens.md',
      'delete database': 'delete/database.md',
      'delete table': 'delete/table.md',
      query: 'query.md',
      write: 'write.md',
    };

    // Extract commands from CLI help
    const content = await fs.readFile(cliFile, 'utf8');
    const lines = content.split('\n');

    for (const line of lines) {
      if (line.startsWith('===== influxdb3') && line.endsWith('--help =====')) {
        const command = line
          .replace('===== influxdb3 ', '')
          .replace(' --help =====', '');

        if (commandToFile[command]) {
          const expectedFile = commandToFile[command];
          const productFile = join(docsPath, expectedFile);
          const sharedFile = join(sharedPath, expectedFile);

          const productExists = await this.fileExists(productFile);
          const sharedExists = await this.fileExists(sharedFile);

          let needsContent = false;
          let targetPath = null;
          let stubPath = null;

          if (!productExists && !sharedExists) {
            // Completely missing
            needsContent = true;
            targetPath = productFile;
          } else if (productExists) {
            // Check if it has a source field pointing to missing content
            const actualPath = await this.getActualContentPath(productFile);
            if (actualPath && !(await this.fileExists(actualPath))) {
              needsContent = true;
              targetPath = actualPath;
              stubPath = productFile;
            }
          } else if (sharedExists) {
            // Shared file exists, check if it has content
            const actualPath = await this.getActualContentPath(sharedFile);
            if (actualPath && !(await this.fileExists(actualPath))) {
              needsContent = true;
              targetPath = actualPath;
              stubPath = sharedFile;
            }
          }

          if (needsContent && targetPath) {
            const githubNewUrl = `${githubNewBase}/${targetPath}`;
            const localPath = join(projectRoot, targetPath);

            output += `- **Missing**: Documentation for \`influxdb3 ${command}\`\n`;
            if (stubPath) {
              output += `  - Stub exists at: \`${stubPath}\`\n`;
              output += `  - Content needed at: \`${targetPath}\`\n`;
            } else {
              output += `  - Expected: \`${targetPath}\` or \`${sharedFile}\`\n`;
            }
            output += `  - [Create on GitHub](${githubNewUrl})\n`;
            output += `  - Local: \`${localPath}\`\n`;

            // Generate documentation template
            const helpText = await this.extractCommandHelp(content, command);
            const docTemplate = await this.generateDocumentationTemplate(
              command,
              helpText
            );

            // Save patch file
            const patchFileName = `${command.replace(/ /g, '-')}.md`;
            const patchFile = join(patchDir, patchFileName);
            await fs.writeFile(patchFile, docTemplate);

            output += `  - **Template generated**: \`${patchFile}\`\n`;

            missingDocs.push({ command, file: targetPath, patchFile });
            missingCount++;
          }
        }
      }
    }

    if (missingCount === 0) {
      output += 'No missing documentation files detected.\n';
    } else {
      output += '\n### Quick Actions\n\n';
      output +=
        'Copy and paste these commands to create missing documentation:\n\n';
      output += '```bash\n';
      for (const doc of missingDocs) {
        const relativePatch = join(
          'helper-scripts/output/cli-audit/patches',
          product,
          `${doc.command.replace(/ /g, '-')}.md`
        );
        output += `# Create ${doc.command} documentation\n`;
        output += `mkdir -p $(dirname ${doc.file})\n`;
        output += `cp ${relativePatch} ${doc.file}\n\n`;
      }
      output += '```\n';
    }

    output += '\n';
    // Check for outdated options in existing docs
    output += '## Existing Documentation Review\n\n';

    // Parse CLI help first to populate commandOptionsMap
    const parsedFile = join(
      this.outputDir,
      `parsed-cli-${product}-${this.version}.md`
    );
    await this.parseCLIHelp(cliFile, parsedFile);

    // For each command, check if documentation exists and compare content
    const existingDocs = [];
    for (const [command, expectedFile] of Object.entries(commandToFile)) {
      const productFile = join(docsPath, expectedFile);
      const sharedFile = join(sharedPath, expectedFile);

      let docFile = null;
      let actualContentFile = null;

      // Find the documentation file
      if (await this.fileExists(productFile)) {
        docFile = productFile;
        // Check if it's a stub with source field
        const actualPath = await this.getActualContentPath(productFile);
        actualContentFile = actualPath
          ? join(projectRoot, actualPath)
          : join(projectRoot, productFile);
      } else if (await this.fileExists(sharedFile)) {
        docFile = sharedFile;
        actualContentFile = join(projectRoot, sharedFile);
      }

      if (docFile && (await this.fileExists(actualContentFile))) {
        const githubEditUrl = `${githubBase}/${docFile}`;
        const localPath = join(projectRoot, docFile);
        const vscodeUrl = `${vscodeBase}/${localPath}`;

        // Get CLI options for this command
        const cliOptions = this.commandOptionsMap[`influxdb3 ${command}`] || [];

        // Parse documentation content to find documented options
        const documentedOptions =
          await this.parseDocumentedOptions(actualContentFile);

        // Find missing options (in CLI but not in docs)
        const missingOptions = cliOptions.filter(
          (opt) => !documentedOptions.includes(opt)
        );

        // Find extra options (in docs but not in CLI)
        const extraOptions = documentedOptions.filter(
          (opt) => !cliOptions.includes(opt)
        );

        existingDocs.push({
          command,
          file: docFile,
          actualContentFile: actualContentFile.replace(
            join(projectRoot, ''),
            ''
          ),
          githubUrl: githubEditUrl,
          localPath,
          vscodeUrl,
          cliOptions,
          documentedOptions,
          missingOptions,
          extraOptions,
        });
      }
    }

    if (existingDocs.length > 0) {
      output += 'Review these existing documentation files for accuracy:\n\n';

      for (const doc of existingDocs) {
        output += `### \`influxdb3 ${doc.command}\`\n`;
        output += `- **File**: \`${doc.file}\`\n`;
        if (doc.actualContentFile !== doc.file) {
          output += `- **Content**: \`${doc.actualContentFile}\`\n`;
        }
        output += `- [Edit on GitHub](${doc.githubUrl})\n`;
        output += `- [Open in VS Code](${doc.vscodeUrl})\n`;
        output += `- **Local**: \`${doc.localPath}\`\n`;

        // Show option analysis
        if (doc.missingOptions.length > 0) {
          output += `- **⚠️ Missing from docs** (${doc.missingOptions.length} options):\n`;
          for (const option of doc.missingOptions.sort()) {
            output += `  - \`${option}\`\n`;
          }
        }

        if (doc.extraOptions.length > 0) {
          output += `- **ℹ️ Documented but not in CLI** (${doc.extraOptions.length} options):\n`;
          for (const option of doc.extraOptions.sort()) {
            output += `  - \`${option}\`\n`;
          }
        }

        if (doc.missingOptions.length === 0 && doc.extraOptions.length === 0) {
          output += `- **✅ Options match** (${doc.cliOptions.length} options)\n`;
        }

        if (doc.cliOptions.length > 0) {
          output += `- **All CLI Options** (${doc.cliOptions.length}):\n`;
          const uniqueOptions = [...new Set(doc.cliOptions)].sort();
          for (const option of uniqueOptions) {
            const status = doc.missingOptions.includes(option) ? '❌' : '✅';
            output += `  - ${status} \`${option}\`\n`;
          }
        }
        output += '\n';
      }
    }

    output += '\n## Summary\n';
    output += `- Missing documentation files: ${missingCount}\n`;
    output += `- Existing documentation files: ${existingDocs.length}\n`;
    output += `- Generated templates: ${missingCount}\n`;
    output += '- Options are grouped by command for easier review\n\n';

    output += '## Automation Suggestions\n\n';
    output +=
      '1. **Use generated templates**: Check the `patches` directory for pre-filled documentation templates\n';
    output +=
      '2. **Batch creation**: Use the shell commands above to quickly create all missing files\n';
    output +=
      '3. **CI Integration**: Add this audit to your CI pipeline to catch missing docs early\n';
    output +=
      '4. **Auto-PR**: Create a GitHub Action that runs this audit and opens PRs for missing docs\n\n';

    await fs.writeFile(auditFile, output);
    console.log(`📄 Audit complete: ${auditFile}`);

    if (missingCount > 0) {
      console.log(
        `📝 Generated ${missingCount} documentation templates in: ${patchDir}`
      );
    }
  }
  async run() {
    console.log(
      `${Colors.BLUE}🔍 InfluxDB 3 CLI Documentation Audit${Colors.NC}`
    );
    console.log('=======================================');
    console.log(`Product: ${this.product}`);
    console.log(`Version: ${this.version}`);
    console.log();

    // Ensure output directory exists
    await this.ensureDir(this.outputDir);

    if (this.product === 'core') {
      const cliFile = join(
        this.outputDir,
        `current-cli-core-${this.version}.txt`
      );
      const auditFile = join(
        this.outputDir,
        `documentation-audit-core-${this.version}.md`
      );

      if (await this.extractCurrentCLI('core', cliFile)) {
        await this.auditDocs('core', cliFile, auditFile);
      }
    } else if (this.product === 'enterprise') {
      const cliFile = join(
        this.outputDir,
        `current-cli-enterprise-${this.version}.txt`
      );
      const auditFile = join(
        this.outputDir,
        `documentation-audit-enterprise-${this.version}.md`
      );

      if (await this.extractCurrentCLI('enterprise', cliFile)) {
        await this.auditDocs('enterprise', cliFile, auditFile);
      }
    } else if (this.product === 'both') {
      // Core
      const cliFileCore = join(
        this.outputDir,
        `current-cli-core-${this.version}.txt`
      );
      const auditFileCore = join(
        this.outputDir,
        `documentation-audit-core-${this.version}.md`
      );

      if (await this.extractCurrentCLI('core', cliFileCore)) {
        await this.auditDocs('core', cliFileCore, auditFileCore);
      }

      // Enterprise
      const cliFileEnt = join(
        this.outputDir,
        `current-cli-enterprise-${this.version}.txt`
      );
      const auditFileEnt = join(
        this.outputDir,
        `documentation-audit-enterprise-${this.version}.md`
      );

      if (await this.extractCurrentCLI('enterprise', cliFileEnt)) {
        await this.auditDocs('enterprise', cliFileEnt, auditFileEnt);
      }
    } else {
      console.error(`Error: Invalid product '${this.product}'`);
      console.error(
        'Usage: node audit-cli-documentation.js [core|enterprise|both] [version]'
      );
      process.exit(1);
    }

    console.log();
    console.log(
      `${Colors.GREEN}✅ CLI documentation audit complete!${Colors.NC}`
    );
    console.log();
    console.log('Next steps:');
    console.log(`1. Review the audit reports in: ${this.outputDir}`);
    console.log('2. Update missing documentation files');
    console.log('3. Verify options match current CLI behavior');
    console.log('4. Update examples and usage patterns');
  }
}

// Main execution
async function main() {
  const args = process.argv.slice(2);
  const product = args[0] || 'both';
  const version = args[1] || 'local';

  // Validate product
  if (!['core', 'enterprise', 'both'].includes(product)) {
    console.error(`Error: Invalid product '${product}'`);
    console.error(
      'Usage: node audit-cli-documentation.js [core|enterprise|both] [version]'
    );
    console.error('Example: node audit-cli-documentation.js core 3.2.0');
    process.exit(1);
  }

  // Validate version tag
  try {
    const repoRoot = await getRepositoryRoot();
    await validateVersionInputs(version, null, repoRoot);
  } catch (error) {
    console.error(`Version validation failed: ${error.message}`);
    process.exit(1);
  }

  const auditor = new CLIDocAuditor(product, version);
  await auditor.run();
}

// Run if called directly
if (import.meta.url === `file://${process.argv[1]}`) {
  main().catch((err) => {
    console.error('Error:', err);
    process.exit(1);
  });
}

export { CLIDocAuditor };
|
||||
|
@@ -0,0 +1,164 @@
#!/bin/bash
# Set up authentication tokens for InfluxDB 3 Core and Enterprise containers
# Usage: ./setup-auth-tokens.sh [core|enterprise|both]

set -e

# Color codes
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m'

# Parse arguments
TARGET=${1:-both}

echo -e "${BLUE}🔐 InfluxDB 3 Authentication Setup${NC}"
echo "=================================="
echo ""

# Check for and load existing secret files
SECRET_CORE_FILE="$HOME/.env.influxdb3-core-admin-token"
SECRET_ENT_FILE="$HOME/.env.influxdb3-enterprise-admin-token"

if [ -f "$SECRET_CORE_FILE" ]; then
  echo "✅ Found existing Core token secret file"
else
  echo "📝 Creating new Core token secret file: $SECRET_CORE_FILE"
  touch "$SECRET_CORE_FILE"
fi

if [ -f "$SECRET_ENT_FILE" ]; then
  echo "✅ Found existing Enterprise token secret file"
else
  echo "📝 Creating new Enterprise token secret file: $SECRET_ENT_FILE"
  touch "$SECRET_ENT_FILE"
fi

echo ""

# Function to set up auth for a product
setup_auth() {
  local product=$1
  local container_name="influxdb3-${product}"
  local port
  local secret_file

  case "$product" in
    "core")
      port="8282"
      secret_file="$SECRET_CORE_FILE"
      ;;
    "enterprise")
      port="8181"
      secret_file="$SECRET_ENT_FILE"
      ;;
  esac

  echo -e "${BLUE}Setting up $(echo ${product} | awk '{print toupper(substr($0,1,1)) tolower(substr($0,2))}') authentication...${NC}"

  # Check if a token already exists in the secret file
  if [ -s "$secret_file" ]; then
    local existing_token
    existing_token=$(cat "$secret_file")
    echo "✅ Token already exists in secret file"
    echo "   Token: ${existing_token:0:20}..."

    # Test whether the existing token works
    echo -n "🧪 Testing existing token..."
    if docker exec "${container_name}" influxdb3 show databases --token "${existing_token}" --host "http://localhost:${port}" > /dev/null 2>&1; then
      echo -e " ${GREEN}✓ Working${NC}"
      return 0
    else
      echo -e " ${YELLOW}⚠ Not working, will create new token${NC}"
    fi
  fi

  # Check if the container is running
  if ! docker ps --format '{{.Names}}' | grep -q "^${container_name}$"; then
    echo "🚀 Starting ${container_name} container..."
    if ! docker compose up -d "${container_name}"; then
      echo -e "${RED}❌ Failed to start container${NC}"
      return 1
    fi

    echo -n "⏳ Waiting for container to be ready..."
    sleep 5
    echo -e " ${GREEN}✓${NC}"
  else
    echo "✅ Container ${container_name} is running"
  fi

  # Create an admin token
  echo "🔑 Creating admin token..."

  local token_output
  if token_output=$(docker exec "${container_name}" influxdb3 create token --admin 2>&1); then
    # Extract the token from the "Token: " line
    local new_token
    new_token=$(echo "$token_output" | grep "^Token: " | sed 's/^Token: //' | tr -d '\r\n')

    echo -e "✅ ${GREEN}Token created successfully!${NC}"
    echo "   Token: ${new_token:0:20}..."

    # Update the secret file
    echo "${new_token}" > "$secret_file"

    echo "📝 Updated secret file: $secret_file"

    # Test the new token
    echo -n "🧪 Testing new token..."
    if docker exec "${container_name}" influxdb3 show databases --token "${new_token}" --host "http://localhost:${port}" > /dev/null 2>&1; then
      echo -e " ${GREEN}✓ Working${NC}"
    else
      echo -e " ${YELLOW}⚠ Test failed, but token was created${NC}"
    fi

  else
    echo -e "${RED}❌ Failed to create token${NC}"
    echo "Error output: $token_output"
    return 1
  fi

  echo ""
}
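The `grep | sed | tr` pipeline in `setup_auth` can be exercised on its own. This sketch feeds it a made-up sample of `influxdb3 create token --admin` output (the surrounding text and the token value are hypothetical; only the `Token: ` line format matters):

```shell
# Hypothetical CLI output; the pipeline only cares about the "Token: " line
token_output="New token created successfully!
Token: apiv3_0123456789abcdef"

# Same extraction pipeline as in setup_auth above
new_token=$(echo "$token_output" | grep "^Token: " | sed 's/^Token: //' | tr -d '\r\n')
echo "$new_token"
```

The `tr -d '\r\n'` guards against a stray carriage return if the CLI output came through a TTY or Windows pipe.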
# Main execution
case "$TARGET" in
  "core")
    setup_auth "core"
    ;;
  "enterprise")
    setup_auth "enterprise"
    ;;
  "both")
    setup_auth "core"
    setup_auth "enterprise"
    ;;
  *)
    echo "Usage: $0 [core|enterprise|both]"
    exit 1
    ;;
esac

echo -e "${GREEN}🎉 Authentication setup complete!${NC}"
echo ""
echo "📋 Next steps:"
echo "1. Restart containers to load new secrets:"
echo "   docker compose down && docker compose up -d influxdb3-core influxdb3-enterprise"
echo "2. Test CLI commands with authentication:"
echo "   ./detect-cli-changes.sh core 3.1.0 local"
echo "   ./detect-cli-changes.sh enterprise 3.1.0 local"
echo ""
echo "📄 Your secret files now contain:"

# Show Core tokens
if [ -f "$SECRET_CORE_FILE" ] && [ -s "$SECRET_CORE_FILE" ]; then
  token_preview=$(head -c 20 "$SECRET_CORE_FILE")
  echo "   $SECRET_CORE_FILE: ${token_preview}..."
fi

# Show Enterprise tokens
if [ -f "$SECRET_ENT_FILE" ] && [ -s "$SECRET_ENT_FILE" ]; then
  token_preview=$(head -c 20 "$SECRET_ENT_FILE")
  echo "   $SECRET_ENT_FILE: ${token_preview}..."
fi
@@ -1,725 +0,0 @@
// generate-cli-docs.js
const { execSync } = require('child_process');
const fs = require('fs');
const path = require('path');

const OUTPUT_DIR = path.join(__dirname, 'content', 'shared', 'influxdb3-cli');
const BASE_CMD = 'influxdb3';
const DEBUG = true; // Set to true for verbose logging

// Debug logging function
function debug(message, data) {
  if (DEBUG) {
    console.log(`DEBUG: ${message}`);
    if (data) console.log(JSON.stringify(data, null, 2));
  }
}

// Function to remove ANSI escape codes
function stripAnsiCodes(str) {
  // Regular expression to match ANSI escape codes (ESC or CSI byte, then the sequence)
  // eslint-disable-next-line no-control-regex
  return str.replace(/[\u001b\u009b][[()#;?]*(?:[0-9]{1,4}(?:;[0-9]{0,4})*)?[0-9A-ORZcf-nqry=><]/g, '');
}
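A quick sanity check of the pattern above (assuming Node; `\u001b` is the ESC byte that terminal color sequences begin with):

```javascript
// Same pattern as stripAnsiCodes above
const ANSI_RE = /[\u001b\u009b][[()#;?]*(?:[0-9]{1,4}(?:;[0-9]{0,4})*)?[0-9A-ORZcf-nqry=><]/g;
const strip = (s) => s.replace(ANSI_RE, '');

// "\u001b[0;32m" turns the text green; "\u001b[0m" resets formatting
console.log(strip('\u001b[0;32mUsage:\u001b[0m influxdb3 [OPTIONS]'));
// → "Usage: influxdb3 [OPTIONS]"
```

Stripping these sequences before running the section regexes matters because a colored `Usage:` header would otherwise not start at the expected character.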
// Ensure output directories exist
function ensureDirectoryExistence(filePath) {
  const dirname = path.dirname(filePath);
  if (fs.existsSync(dirname)) {
    return true;
  }
  ensureDirectoryExistence(dirname);
  fs.mkdirSync(dirname);
}

// Get all available commands and subcommands
function getCommands() {
  try {
    debug('Getting base commands');
    let helpOutput = execSync(`${BASE_CMD} --help`).toString();
    helpOutput = stripAnsiCodes(helpOutput); // Strip ANSI codes
    debug('Cleaned help output received', helpOutput);

    // Find all command sections (Common Commands, Resource Management, etc.)
    const commandSections = helpOutput.match(/^[A-Za-z\s]+:\s*$([\s\S]+?)(?=^[A-Za-z\s]+:\s*$|\n\s*$|\n[A-Z]|\n\n|$)/gm);

    if (!commandSections || commandSections.length === 0) {
      debug('No command sections found in help output');
      return [];
    }

    debug(`Found ${commandSections.length} command sections`);

    let commands = [];

    // Process each section to extract commands
    commandSections.forEach(section => {
      // Extract command lines (ignoring section headers)
      const cmdLines = section.split('\n')
        .slice(1) // Skip the section header
        .map(line => line.trim())
        .filter(line => line && !line.startsWith('-') && !line.startsWith('#')); // Skip empty lines, flags, and comments

      debug('Command lines in section', cmdLines);

      // Extract command names and descriptions
      cmdLines.forEach(line => {
        // Handle commands with aliases (like "query, q")
        const aliasMatch = line.match(/^\s*([a-zA-Z0-9_,-\s]+?)\s{2,}(.+)$/);

        if (aliasMatch) {
          // Get primary command and any aliases
          const commandParts = aliasMatch[1].split(',').map(cmd => cmd.trim());
          const primaryCmd = commandParts[0]; // Use the first as primary
          const description = aliasMatch[2].trim();

          commands.push({
            cmd: primaryCmd,
            description: description
          });

          debug(`Added command: ${primaryCmd} - ${description}`);
        }
      });
    });

    debug('Extracted commands', commands);
    return commands;
  } catch (error) {
    console.error('Error getting commands:', error.message);
    if (DEBUG) console.error(error.stack);
    return [];
  }
}

// Get subcommands for a specific command
function getSubcommands(cmd) {
  try {
    debug(`Getting subcommands for: ${cmd}`);
    let helpOutput = execSync(`${BASE_CMD} ${cmd} --help`).toString();
    helpOutput = stripAnsiCodes(helpOutput); // Strip ANSI codes
    debug(`Cleaned help output for ${cmd} received`, helpOutput);

    // Look for sections containing commands (similar to top-level help)
    let subcommands = [];

    // Try to find a dedicated "Commands:" section first
    const commandsMatch = helpOutput.match(/Commands:\s+([\s\S]+?)(?=^[A-Za-z\s]+:\s*$|\n\s*$|\n[A-Z]|\n\n|$)/m);

    if (commandsMatch) {
      debug(`Found dedicated Commands section for ${cmd}`);
      const cmdLines = commandsMatch[1].split('\n')
        .map(line => line.trim())
        .filter(line => line && !line.startsWith('-') && !line.startsWith('#')); // Skip empty lines, flags, comments

      cmdLines.forEach(line => {
        const match = line.match(/^\s*([a-zA-Z0-9_,-\s]+?)\s{2,}(.+)$/);
        if (match) {
          // Get primary command name (before any commas for aliases)
          const commandName = match[1].split(',')[0].trim();
          const description = match[2].trim();

          subcommands.push({
            cmd: `${cmd} ${commandName}`,
            description: description
          });

          debug(`Added subcommand: ${cmd} ${commandName} - ${description}`);
        }
      });
    } else {
      // Look for sections like "Common Commands:", "Resource Management:", etc.
      const sectionMatches = helpOutput.match(/^[A-Za-z\s]+:\s*$([\s\S]+?)(?=^[A-Za-z\s]+:\s*$|\n\s*$|\n[A-Z]|\n\n|$)/gm);

      if (sectionMatches) {
        debug(`Found ${sectionMatches.length} sections with potential commands for ${cmd}`);

        sectionMatches.forEach(section => {
          const cmdLines = section.split('\n')
            .slice(1) // Skip the section header
            .map(line => line.trim())
            .filter(line => line && !line.startsWith('-') && !line.startsWith('#')); // Skip empty lines, flags, comments

          cmdLines.forEach(line => {
            const match = line.match(/^\s*([a-zA-Z0-9_,-\s]+?)\s{2,}(.+)$/);
            if (match) {
              // Get primary command name (before any commas for aliases)
              const commandName = match[1].split(',')[0].trim();
              const description = match[2].trim();

              subcommands.push({
                cmd: `${cmd} ${commandName}`,
                description: description
              });

              debug(`Added subcommand from section: ${cmd} ${commandName} - ${description}`);
            }
          });
        });
      }
    }

    debug(`Extracted ${subcommands.length} subcommands for ${cmd}`, subcommands);
    return subcommands;
  } catch (error) {
    debug(`Error getting subcommands for ${cmd}:`, error.message);
    return [];
  }
}
// Helper functions to generate descriptions for different command types
function getQueryDescription(cmd, fullCmd) {
  return ` executes a query against a running {{< product-name >}} server.`;
}

function getWriteDescription(cmd, fullCmd) {
  return ` writes data to a running {{< product-name >}} server.`;
}

function getShowDescription(cmd, fullCmd) {
  const cmdParts = cmd.split(' ');
  const resourceType = cmdParts.length > 1 ? cmdParts[1] : 'resources';
  return ` lists ${resourceType} in your {{< product-name >}} server.`;
}

function getCreateDescription(cmd, fullCmd) {
  const cmdParts = cmd.split(' ');
  const createType = cmdParts.length > 1 ? cmdParts[1] : 'resources';
  return ` creates ${createType} in your {{< product-name >}} server.`;
}

function getDeleteDescription(cmd, fullCmd) {
  const cmdParts = cmd.split(' ');
  const deleteType = cmdParts.length > 1 ? cmdParts[1] : 'resources';
  return ` deletes ${deleteType} from your {{< product-name >}} server.`;
}

function getServeDescription(cmd, fullCmd) {
  return ` starts the {{< product-name >}} server.`;
}

function getDefaultDescription(cmd, fullCmd) {
  return `.`;
}

// Helper functions to generate examples for different command types
function getQueryExample(cmd) {
  return {
    title: 'Query data using SQL',
    code: `${BASE_CMD} ${cmd} --database DATABASE_NAME "SELECT * FROM home"`
  };
}

function getWriteExample(cmd) {
  return {
    title: 'Write data from a file',
    code: `${BASE_CMD} ${cmd} --database DATABASE_NAME --file data.lp`
  };
}

function getShowExample(cmd) {
  const cmdParts = cmd.split(' ');
  const resourceType = cmdParts.length > 1 ? cmdParts[1] : 'resources';
  return {
    title: `List ${resourceType}`,
    code: `${BASE_CMD} ${cmd}`
  };
}

function getCreateExample(cmd) {
  const cmdParts = cmd.split(' ');
  const resourceType = cmdParts.length > 1 ? cmdParts[1] : 'resource';
  return {
    title: `Create a new ${resourceType}`,
    code: `${BASE_CMD} ${cmd} --name new-${resourceType}-name`
  };
}

function getDeleteExample(cmd) {
  const cmdParts = cmd.split(' ');
  const resourceType = cmdParts.length > 1 ? cmdParts[1] : 'resource';
  return {
    title: `Delete a ${resourceType}`,
    code: `${BASE_CMD} ${cmd} --name ${resourceType}-to-delete`
  };
}

function getServeExample(cmd) {
  return {
    title: 'Start the InfluxDB server',
    code: `${BASE_CMD} serve --node-id my-node --object-store file --data-dir ~/.influxdb3_data`
  };
}

function getDefaultExample(fullCmd, cmd) {
  return {
    title: `Run the ${fullCmd} command`,
    code: `${BASE_CMD} ${cmd}`
  };
}

// Generate frontmatter for a command
function generateFrontmatter(cmd) {
  const parts = cmd.split(' ');
  const lastPart = parts[parts.length - 1];
  const fullCmd = cmd === '' ? BASE_CMD : `${BASE_CMD} ${cmd}`;

  // Determine a good description based on the command
  let description = '';
  if (cmd === '') {
    description = `The ${BASE_CMD} CLI runs and interacts with the {{< product-name >}} server.`;
  } else {
    const cmdParts = cmd.split(' ');
    const lastCmd = cmdParts[cmdParts.length - 1];

    // Use the description helper functions for consistency
    switch (lastCmd) {
      case 'query':
      case 'q':
        description = `The \`${fullCmd}\` command${getQueryDescription(cmd, fullCmd)}`;
        break;
      case 'write':
      case 'w':
        description = `The \`${fullCmd}\` command${getWriteDescription(cmd, fullCmd)}`;
        break;
      case 'show':
        description = `The \`${fullCmd}\` command${getShowDescription(cmd, fullCmd)}`;
        break;
      case 'create':
        description = `The \`${fullCmd}\` command${getCreateDescription(cmd, fullCmd)}`;
        break;
      case 'delete':
        description = `The \`${fullCmd}\` command${getDeleteDescription(cmd, fullCmd)}`;
        break;
      case 'serve':
        description = `The \`${fullCmd}\` command${getServeDescription(cmd, fullCmd)}`;
        break;
      default:
        description = `The \`${fullCmd}\` command${getDefaultDescription(cmd, fullCmd)}`;
    }
  }

  // Create the frontmatter
  let frontmatter = `---
title: ${fullCmd}
description: >
  ${description}
`;

  // Add source attribute for shared files
  if (cmd !== '') {
    // Build the path relative to the /content/shared/influxdb3-cli/ directory
    const relativePath = cmd.split(' ').join('/');
    frontmatter += `source: /shared/influxdb3-cli/${relativePath === '' ? '_index' : relativePath}.md
`;
  }

  // Close the frontmatter
  frontmatter += `---

`;

  return frontmatter;
}
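For reference, tracing `generateFrontmatter` above with a hypothetical nested command such as `create database` (which takes the `create` branch of the switch) yields frontmatter of roughly this shape:

```yaml
---
title: influxdb3 create database
description: >
  The `influxdb3 create database` command creates database in your {{< product-name >}} server.
source: /shared/influxdb3-cli/create/database.md
---
```

The `source` path mirrors the command hierarchy under `/content/shared/influxdb3-cli/`, which is what lets the per-product reference pages reuse one shared page.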
// Generate Markdown for a command
function generateCommandMarkdown(cmd) {
  try {
    debug(`Generating markdown for command: ${cmd}`);
    const fullCmd = cmd === '' ? BASE_CMD : `${BASE_CMD} ${cmd}`;
    let helpOutput = execSync(`${fullCmd} --help`).toString();
    helpOutput = stripAnsiCodes(helpOutput); // Strip ANSI codes
    debug(`Cleaned help output for ${fullCmd} received`, helpOutput);

    // Extract sections from help output
    const usageMatch = helpOutput.match(/Usage:\s+([\s\S]+?)(?:\n\n|$)/);
    const usage = usageMatch ? usageMatch[1].trim() : '';

    const argsMatch = helpOutput.match(/Arguments:\s+([\s\S]+?)(?:\n\n|$)/);
    const args = argsMatch ? argsMatch[1].trim() : '';

    // Store option sections separately
    const optionSections = {};
    const optionSectionRegex = /^([A-Za-z\s]+ Options?|Required):\s*$([\s\S]+?)(?=\n^[A-Za-z\s]+:|^$|\n\n)/gm;
    let sectionMatch;
    while ((sectionMatch = optionSectionRegex.exec(helpOutput)) !== null) {
      const sectionTitle = sectionMatch[1].trim();
      const sectionContent = sectionMatch[2].trim();
      debug(`Found option section: ${sectionTitle}`);
      optionSections[sectionTitle] = sectionContent;
    }

    // Fallback if no specific sections found
    if (Object.keys(optionSections).length === 0) {
      const flagsMatch = helpOutput.match(/(?:Flags|Options):\s+([\s\S]+?)(?:\n\n|$)/);
      if (flagsMatch) {
        debug('Using fallback Flags/Options section');
        optionSections['Options'] = flagsMatch[1].trim();
      }
    }
    debug('Extracted option sections', optionSections);

    // Format flags as a table, processing sections and handling duplicates/multi-line descriptions
    let flagsTable = '';
    const addedFlags = new Set(); // Track added long flags
    const tableRows = [];
    const sectionOrder = ['Required', ...Object.keys(optionSections).filter(k => k !== 'Required')]; // Prioritize Required

    for (const sectionTitle of sectionOrder) {
      if (!optionSections[sectionTitle]) continue;

      const sectionContent = optionSections[sectionTitle];
      const lines = sectionContent.split('\n');
      let i = 0;
      while (i < lines.length) {
        const line = lines[i];
        // Regex to capture flag and start of description
        const flagMatch = line.match(/^\s+(?:(-\w),\s+)?(--[\w-]+(?:[=\s]<[^>]+>)?)?\s*(.*)/);

        if (flagMatch) {
          const shortFlag = flagMatch[1] || '';
          const longFlagRaw = flagMatch[2] || ''; // Might be empty if only a short flag exists (unlikely here)
          const longFlag = longFlagRaw.split(/[=\s]/)[0]; // Get only the flag name, e.g., --cluster-id from --cluster-id <CLUSTER_ID>
          let description = flagMatch[3].trim();

          // Check for multi-line description (indented lines following)
          let j = i + 1;
          while (j < lines.length && lines[j].match(/^\s{4,}/)) { // Look for lines with significant indentation
            description += ' ' + lines[j].trim();
            j++;
          }
          i = j; // Move main index past the multi-line description

          // Clean description
          description = description
            .replace(/\s+\[default:.*?\]/g, '')
            .replace(/\s+\[env:.*?\]/g, '')
            .replace(/\s+\[possible values:.*?\]/g, '')
            .trim();

          // Check if required based on section
          const isRequired = sectionTitle === 'Required';

          // Add to table if not already added
          if (longFlag && !addedFlags.has(longFlag)) {
            // Use longFlagRaw, which includes the placeholder, for display
            tableRows.push(`| \`${shortFlag}\` | \`${longFlagRaw.trim()}\` | ${isRequired ? '_({{< req >}})_ ' : ''}${description} |`);
            addedFlags.add(longFlag);
            debug(`Added flag: ${longFlag} (Required: ${isRequired})`);
          } else if (!longFlag && shortFlag && !addedFlags.has(shortFlag)) {
            // Handle case where only a short flag exists (though unlikely for this CLI)
            tableRows.push(`| \`${shortFlag}\` | | ${isRequired ? '_({{< req >}})_ ' : ''}${description} |`);
            addedFlags.add(shortFlag); // Use short flag for tracking if no long flag
            debug(`Added flag: ${shortFlag} (Required: ${isRequired})`);
          } else if (longFlag) {
            debug(`Skipping duplicate flag: ${longFlag}`);
          } else {
            debug(`Skipping flag line with no long or short flag found: ${line}`);
          }
        } else {
          debug(`Could not parse flag line in section "${sectionTitle}": ${line}`);
          i++; // Move to next line if current one doesn't match
        }
      }
    }
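The flag-line regex in the loop above can be hard to read at a glance. This sketch runs it against a hypothetical clap-style help line (the sample line and flag names are invented for illustration):

```javascript
// Same pattern used in the flag-table loop above
const FLAG_RE = /^\s+(?:(-\w),\s+)?(--[\w-]+(?:[=\s]<[^>]+>)?)?\s*(.*)/;

// Hypothetical help line in the two-space-separated format clap emits
const line = '  -d, --database <DATABASE_NAME>  The database to operate on';
const m = line.match(FLAG_RE);

const shortFlag = m[1] || '';
const longFlagRaw = m[2] || '';                 // "--database <DATABASE_NAME>"
const longFlag = longFlagRaw.split(/[=\s]/)[0]; // strip the <PLACEHOLDER> part
console.log(shortFlag, longFlag, m[3].trim());
// → "-d --database The database to operate on"
```

Group 2 deliberately keeps the `<PLACEHOLDER>` so the rendered table can show the value syntax, while the split-off `longFlag` is what the duplicate-tracking `Set` keys on.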
    if (tableRows.length > 0) {
      // Sort rows alphabetically by long flag, putting required flags first
      tableRows.sort((a, b) => {
        const isARequired = a.includes('_({{< req >}})_');
        const isBRequired = b.includes('_({{< req >}})_');
        if (isARequired && !isBRequired) return -1;
        if (!isARequired && isBRequired) return 1;
        // Extract long flag for sorting (second column content between backticks)
        const longFlagA = (a.match(/\|\s*`.*?`\s*\|\s*`(--[\w-]+)/) || [])[1] || '';
        const longFlagB = (b.match(/\|\s*`.*?`\s*\|\s*`(--[\w-]+)/) || [])[1] || '';
        return longFlagA.localeCompare(longFlagB);
      });
      flagsTable = `| Short | Long | Description |\n| :---- | :--- | :---------- |\n${tableRows.join('\n')}`;
    }

    // Extract description from help text (appears before Usage section or other sections)
    let descriptionText = '';
    // Updated regex to stop before any known section header
    const descMatches = helpOutput.match(/^([\s\S]+?)(?=Usage:|Common Commands:|Examples:|Options:|Flags:|Required:|Arguments:|$)/);
    if (descMatches && descMatches[1]) {
      descriptionText = descMatches[1].trim();
    }

    // Example commands
    const examples = [];
    // Updated regex to stop before any known section header
    const exampleMatch = helpOutput.match(/(?:Example|Examples):\s*([\s\S]+?)(?=\n\n|Usage:|Options:|Flags:|Required:|Arguments:|$)/i);

    if (exampleMatch) {
      // Found examples in help output; use them
      const exampleBlocks = exampleMatch[1].trim().split(/\n\s*#\s+/); // Split by lines starting with # (section comments)

      exampleBlocks.forEach((block, index) => {
        const lines = block.trim().split('\n');
        const titleLine = lines[0].startsWith('#') ? lines[0].substring(1).trim() : `Example ${index + 1}`;
        const codeLines = lines.slice(titleLine === `Example ${index + 1}` ? 0 : 1) // Skip title line if we extracted it
          .map(line => line.replace(/^\s*\d+\.\s*/, '').trim()) // Remove numbering like "1. "
          .filter(line => line);
        if (codeLines.length > 0) {
          examples.push({ title: titleLine, code: codeLines.join('\n') });
        }
      });

    } else {
      // Fallback example generation
      if (cmd === '') {
        // ... (existing base command examples) ...
      } else {
        // ... (existing command-specific example generation using helpers) ...
      }
    }

    // Construct markdown content
    const frontmatter = generateFrontmatter(cmd);
    let markdown = frontmatter;

    markdown += `The \`${fullCmd}\` command`;

    // Use extracted description if available, otherwise fall back
    if (descriptionText) {
      markdown += ` ${descriptionText.toLowerCase().replace(/\.$/, '')}.`;
    } else if (cmd === '') {
      markdown += ` runs and interacts with the {{< product-name >}} server.`;
    } else {
      // Fallback description generation using helpers
      const cmdParts = cmd.split(' ');
      const lastCmd = cmdParts[cmdParts.length - 1];
      switch (lastCmd) {
        case 'query': case 'q': markdown += getQueryDescription(cmd, fullCmd); break;
        case 'write': case 'w': markdown += getWriteDescription(cmd, fullCmd); break;
        case 'show': markdown += getShowDescription(cmd, fullCmd); break;
        case 'create': markdown += getCreateDescription(cmd, fullCmd); break;
        case 'delete': markdown += getDeleteDescription(cmd, fullCmd); break;
        case 'serve': markdown += getServeDescription(cmd, fullCmd); break;
        default: markdown += getDefaultDescription(cmd, fullCmd);
      }
    }

    markdown += `\n\n## Usage\n\n<!--pytest.mark.skip-->\n\n\`\`\`bash\n${usage}\n\`\`\`\n\n`;

    if (args) {
      markdown += `## Arguments\n\n${args}\n\n`;
    }

    if (flagsTable) {
      markdown += `## Options\n\n${flagsTable}\n\n`;
    }

    if (examples.length > 0) {
      markdown += `## Examples\n\n`;
      examples.forEach(ex => {
        markdown += `### ${ex.title}\n\n<!--pytest.mark.skip-->\n\n\`\`\`bash\n${ex.code}\n\`\`\`\n\n`;
      });
    }

    return markdown;
  } catch (error) {
    console.error(`Error generating markdown for '${cmd}':`, error.message);
    if (DEBUG) console.error(error.stack);
    return null;
  }
}

// Generate reference page with proper frontmatter that imports from shared content
function generateReferencePage(cmd, product) {
  // Skip the base command since it's not typically needed as a reference
  if (cmd === '') {
    return null;
  }

  const parts = cmd.split(' ');
  const fullCmd = cmd === '' ? BASE_CMD : `${BASE_CMD} ${cmd}`;

  // Build the appropriate menu path
  let menuParent;
  if (parts.length === 1) {
    menuParent = 'influxdb3'; // Top-level command
  } else {
    // For nested commands, the parent is the command's parent command
    menuParent = `influxdb3 ${parts.slice(0, -1).join(' ')}`;
  }

  // Determine a good description
  let description;
  const lastCmd = parts.length > 0 ? parts[parts.length - 1] : '';

  switch (lastCmd) {
    case 'query':
    case 'q':
      description = `Use the ${fullCmd} command to query data in your {{< product-name >}} instance.`;
      break;
    case 'write':
    case 'w':
      description = `Use the ${fullCmd} command to write data to your {{< product-name >}} instance.`;
      break;
    case 'show': {
      const showType = parts.length > 1 ? parts[1] : 'resources';
      description = `Use the ${fullCmd} command to list ${showType} in your {{< product-name >}} instance.`;
      break;
    }
    case 'create': {
      const createType = parts.length > 1 ? parts[1] : 'resources';
      description = `Use the ${fullCmd} command to create ${createType} in your {{< product-name >}} instance.`;
      break;
    }
    case 'delete': {
      const deleteType = parts.length > 1 ? parts[1] : 'resources';
      description = `Use the ${fullCmd} command to delete ${deleteType} from your {{< product-name >}} instance.`;
      break;
    }
    case 'serve':
      description = `Use the ${fullCmd} command to start and run your {{< product-name >}} server.`;
      break;
    default:
      description = `Use the ${fullCmd} command.`;
  }

  // Build the path to the shared content
  const sharedPath = parts.join('/');

  // Create the frontmatter for the reference page
  const frontmatter = `---
title: ${fullCmd}
description: >
  ${description}
menu:
  ${product}:
    parent: ${menuParent}
    name: ${fullCmd}
weight: 400
source: /shared/influxdb3-cli/${sharedPath}.md
---

<!-- The content for this page is at
// SOURCE content/shared/influxdb3-cli/${sharedPath}.md
-->`;

  return frontmatter;
}
|
||||
|
||||
// Create the reference page files for different product variants
|
||||
async function createReferencePages(cmd) {
|
||||
if (cmd === '') return; // Skip the base command
|
||||
|
||||
// Define the InfluxDB products that use this CLI
|
||||
const products = [
|
||||
{ id: 'influxdb3_core', path: 'influxdb3/core' },
|
||||
{ id: 'influxdb3_enterprise', path: 'influxdb3/enterprise' }
|
||||
];
|
||||
|
||||
// Generate reference pages for each product
|
||||
for (const product of products) {
|
||||
const frontmatter = generateReferencePage(cmd, product.id);
|
||||
if (!frontmatter) continue;
|
||||
|
||||
const parts = cmd.split(' ');
|
||||
const cmdPath = parts.join('/');
|
||||
|
||||
// Create the directory path for the reference file
|
||||
const refDirPath = path.join(__dirname, '..', 'content', product.path, 'reference', 'cli', 'influxdb3', ...parts.slice(0, -1));
|
||||
const refFilePath = path.join(refDirPath, `${parts[parts.length - 1]}.md`);
|
||||
|
||||
// Create directory if it doesn't exist
|
||||
ensureDirectoryExistence(refFilePath);
|
||||
|
||||
// Write the reference file
|
||||
fs.writeFileSync(refFilePath, frontmatter);
|
||||
console.log(`Generated reference page: ${refFilePath}`);
|
||||
}
|
||||
}
|
||||
|
||||
// Process a command and its subcommands recursively
async function processCommand(cmd = '', depth = 0) {
  debug(`Processing command: "${cmd}" at depth ${depth}`);

  // Generate markdown for this command
  const markdown = generateCommandMarkdown(cmd);
  if (!markdown) {
    console.error(`Failed to generate markdown for command: ${cmd}`);
    return;
  }

  // Create the file path and write the content
  let filePath;
  if (cmd === '') {
    // Base command
    filePath = path.join(OUTPUT_DIR, '_index.md');
  } else {
    const parts = cmd.split(' ');
    const dirPath = path.join(OUTPUT_DIR, ...parts.slice(0, -1));
    const fileName = parts[parts.length - 1] === '' ? '_index.md' : `${parts[parts.length - 1]}.md`;
    filePath = path.join(dirPath, fileName);

    // For commands with subcommands, also create an index file
    if (depth < 3) { // Limit recursion depth
      try {
        const subcommandOutput = execSync(`${BASE_CMD} ${cmd} --help`).toString();
        if (subcommandOutput.includes('Commands:')) {
          const subDirPath = path.join(OUTPUT_DIR, ...parts);
          const indexFilePath = path.join(subDirPath, '_index.md');
          ensureDirectoryExistence(indexFilePath);
          fs.writeFileSync(indexFilePath, markdown);
          debug(`Created index file: ${indexFilePath}`);
        }
      } catch (error) {
        debug(`Error checking for subcommands: ${error.message}`);
      }
    }
  }

  ensureDirectoryExistence(filePath);
  fs.writeFileSync(filePath, markdown);
  console.log(`Generated: ${filePath}`);

  // Create reference pages for this command
  await createReferencePages(cmd);

  // Get and process subcommands
  if (depth < 3) { // Limit recursion depth
    const subcommands = getSubcommands(cmd);
    debug(`Found ${subcommands.length} subcommands for "${cmd}"`);

    for (const subCmd of subcommands) {
      await processCommand(subCmd.cmd, depth + 1);
    }
  }
}
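`getSubcommands` is also defined earlier in the script and not shown in this hunk. As a sketch of the parsing involved — assuming clap-style help output with a two-space-indented `Commands:` section, which is an assumption about the influxdb3 CLI's help format — a minimal equivalent might look like:

```javascript
// Hedged sketch of subcommand discovery from `--help` output.
// The real getSubcommands implementation lives earlier in the script;
// the help-text layout assumed here (two-space indent, name, description)
// is illustrative, not confirmed from the influxdb3 CLI.
function parseSubcommands(parentCmd, helpText) {
  const lines = helpText.split('\n');
  const start = lines.findIndex((line) => line.trim() === 'Commands:');
  if (start === -1) return [];

  const subcommands = [];
  for (const line of lines.slice(start + 1)) {
    const match = line.match(/^\s{2}(\S+)\s+(.*)$/);
    if (!match) break; // end of the indented command list
    const name = match[1];
    if (name === 'help') continue; // skip the auto-generated help entry
    subcommands.push({
      cmd: parentCmd ? `${parentCmd} ${name}` : name,
      desc: match[2].trim(),
    });
  }
  return subcommands;
}
```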
// Main function
async function main() {
  try {
    debug('Starting documentation generation');

    // Process the base command
    await processCommand();

    // Get top-level commands
    const commands = getCommands();
    debug(`Found ${commands.length} top-level commands`);

    if (commands.length === 0) {
      console.warn('Warning: No commands were found. Check the influxdb3 CLI help output format.');
    }

    // Process each top-level command
    for (const { cmd } of commands) {
      await processCommand(cmd, 1);
    }

    console.log('Documentation generation complete!');
  } catch (error) {
    console.error('Error in main execution:', error.message);
    if (DEBUG) console.error(error.stack);
    // Signal failure to callers such as CI
    process.exitCode = 1;
  }
}

// Run the script
main();

@@ -106,17 +106,17 @@ pre-push:
      - assets/*.{js,mjs,css,scss}
      - layouts/*.html
      - content/example.md
    files: /bin/ls content/example.md
    run: |
      node cypress/support/run-e2e-specs.js --spec "cypress/e2e/content/article-links.cy.js" {files}
      echo "Running shortcode examples test due to changes in: {staged_files}"
      node cypress/support/run-e2e-specs.js --spec "cypress/e2e/content/article-links.cy.js" content/example.md
      exit $?

  e2e-links:
    tags: test,links
    glob: 'content/*.{md,html}'
    run: |
      echo "Running link checker for: {push_files}"
      yarn test:links {push_files}
      echo "Running link checker for: {staged_files}"
      yarn test:links {staged_files}
      exit $?

# Manage Docker containers

@@ -64,7 +64,12 @@
    "test:links:telegraf": "node cypress/support/run-e2e-specs.js --spec \"cypress/e2e/content/article-links.cy.js\" content/telegraf/**/*.{md,html}",
    "test:links:shared": "node cypress/support/run-e2e-specs.js --spec \"cypress/e2e/content/article-links.cy.js\" content/shared/**/*.{md,html}",
    "test:links:api-docs": "node cypress/support/run-e2e-specs.js --spec \"cypress/e2e/content/article-links.cy.js\" /influxdb3/core/api/,/influxdb3/enterprise/api/,/influxdb3/cloud-dedicated/api/,/influxdb3/cloud-dedicated/api/v1/,/influxdb/cloud-dedicated/api/v1/,/influxdb/cloud-dedicated/api/management/,/influxdb3/cloud-dedicated/api/management/",
    "test:shortcode-examples": "node cypress/support/run-e2e-specs.js --spec \"cypress/e2e/content/article-links.cy.js\" content/example.md"
    "test:shortcode-examples": "node cypress/support/run-e2e-specs.js --spec \"cypress/e2e/content/article-links.cy.js\" content/example.md",
    "audit:cli": "node ./helper-scripts/influxdb3-monolith/audit-cli-documentation.js both local",
    "audit:cli:3core": "node ./helper-scripts/influxdb3-monolith/audit-cli-documentation.js core local",
    "audit:cli:3ent": "node ./helper-scripts/influxdb3-monolith/audit-cli-documentation.js enterprise local",
    "audit:cli:apply": "node ./helper-scripts/influxdb3-monolith/apply-cli-patches.js both",
    "audit:cli:apply:dry": "node ./helper-scripts/influxdb3-monolith/apply-cli-patches.js both --dry-run"
  },
  "type": "module",
  "browserslist": [

@@ -0,0 +1,26 @@
# [core3,enterprise3]
# Bearer auth works with v1 /query
curl -v http://localhost:8181/query \
  --header "Authorization: Bearer ${INFLUXDB3_ENTERPRISE_ADMIN_TOKEN}" \
  --data-urlencode "db=sensors" \
  --data-urlencode "q=SELECT * FROM home"

# Bearer auth works with v1 /write
curl -v "http://localhost:8181/write?db=sensors" \
  --header "Authorization: Bearer ${INFLUXDB3_ENTERPRISE_ADMIN_TOKEN}" \
  --data-raw "sensors,location=home temperature=23.5 1622547800"

# Basic auth works with v1 /write
curl -v "http://localhost:8181/write?db=sensors" \
  --user "admin:${INFLUXDB3_ENTERPRISE_ADMIN_TOKEN}" \
  --data-raw "sensors,location=home temperature=23.5 1622547800"

# URL auth works with v1 /write
curl -v "http://localhost:8181/write?db=sensors&u=admin&p=${INFLUXDB3_ENTERPRISE_ADMIN_TOKEN}" \
  --data-raw "sensors,location=home temperature=23.5 1622547800"

# Token auth works with /api/v2/write
curl -v "http://localhost:8181/api/v2/write?bucket=sensors" \
  --header "Authorization: Token ${INFLUXDB3_ENTERPRISE_ADMIN_TOKEN}" \
  --data-raw "sensors,location=home temperature=23.5 1622547800"

@@ -0,0 +1,4 @@
# [core3,enterprise3]
# Delete a database with hard delete at date
curl -v -X DELETE "http://localhost:8181/api/v3/configure/database?hard_delete_at=20250701&db=sensors" \
  --header "Authorization: Bearer ${INFLUXDB3_ENTERPRISE_ADMIN_TOKEN}"