Merge pull request #6282 from influxdata/jts-qol-vscode

chore(qol): Copilot no longer uses instruction settings; it automatic…

commit e2c7d728d0

@@ -4,6 +4,10 @@ Always follow these instructions first and fallback to additional search and con
 ## Working Effectively
 
+### Collaboration approach
+
+Be a critical thinking partner, provide honest feedback, and identify potential issues.
+
 ### Bootstrap, Build, and Test the Repository
 
 Execute these commands in order to set up a complete working environment:
@@ -54,16 +58,18 @@ yarn test:codeblocks:v2
 yarn test:codeblocks:telegraf
 ```
 
-#### Link Validation (takes 10-30 minutes, NEVER CANCEL - set timeout to 45+ minutes):
+#### Link Validation (takes 1-5 minutes):
 
+Runs automatically on pull requests.
+Requires the **link-checker** binary from the repo release artifacts.
 
 ```bash
 # Test all links (very long-running)
 yarn test:links
 
 # Test specific files/products (faster)
 yarn test:links content/influxdb3/core/**/*.md
 yarn test:links:v3
 yarn test:links:v2
 # JSON format is required for accurate reporting
 link-checker map content/influxdb3/core/**/*.md \
   | link-checker check \
     --config .ci/link-checker/production.lycherc.toml \
     --format json
 ```
 
 #### Style Linting (takes 30-60 seconds):
@@ -168,7 +174,8 @@ yarn test:links content/example.md
 - **Package Manager**: Yarn (1.22.22+) with Node.js (20.19.4+)
 - **Testing Framework**:
   - Pytest with pytest-codeblocks (for code examples)
-  - Cypress (for link validation and E2E tests)
+  - Cypress (for E2E tests)
+  - influxdata/docs-link-checker (for link validation)
   - Vale (for style and writing guidelines)
 - **Containerization**: Docker with Docker Compose
 - **Linting**: ESLint, Prettier, Vale
@@ -176,16 +183,6 @@ yarn test:links content/example.md

## Common Tasks and Build Times

### Time Expectations (CRITICAL - NEVER CANCEL)

- **Dependency installation**: 4 seconds
- **Hugo static build**: 75 seconds (NEVER CANCEL - timeout: 180+ seconds)
- **Hugo server startup**: 92 seconds (NEVER CANCEL - timeout: 150+ seconds)
- **Code block tests**: 5-15 minutes per product (NEVER CANCEL - timeout: 30+ minutes)
- **Link validation**: 10-30 minutes (NEVER CANCEL - timeout: 45+ minutes)
- **Style linting**: 30-60 seconds
- **Docker image build**: 30+ seconds (may fail due to network restrictions)

### Network Connectivity Issues

In restricted environments, these commands may fail due to external dependency downloads:
@@ -14,17 +14,6 @@
   },
   "vale.valeCLI.config": "${workspaceFolder}/.vale.ini",
   "vale.valeCLI.minAlertLevel": "warning",
-  "github.copilot.chat.codeGeneration.useInstructionFiles": true,
-  "github.copilot.chat.codeGeneration.instructions": [
-    {
-      "file": "${workspaceFolder}/.github/copilot-instructions.md"
-    }
-  ],
-  "github.copilot.chat.pullRequestDescriptionGeneration.instructions": [
-    {
-      "file": "${workspaceFolder}/.github/copilot-instructions.md"
-    }
-  ],
   "cSpell.words": [
     "influxctl"
   ]
@@ -349,7 +349,6 @@ services:
      - --data-dir=/var/lib/influxdb3/data
      - --plugin-dir=/var/lib/influxdb3/plugins
    environment:
      - INFLUXDB3_ENTERPRISE_LICENSE_EMAIL=${INFLUXDB3_ENTERPRISE_LICENSE_EMAIL}
      - INFLUXDB3_AUTH_TOKEN=/run/secrets/influxdb3-enterprise-admin-token
    volumes:
      - type: bind
@@ -1,373 +0,0 @@
# InfluxDB 3 Monolith (Core and Enterprise) Helper Scripts

This directory contains helper scripts specifically for InfluxDB 3 Core and Enterprise (monolith deployments), as opposed to distributed/clustered deployments.

## Overview

These scripts help with documentation workflows for InfluxDB 3 Core and Enterprise, including CLI change detection, authentication setup, API analysis, and release preparation.

## Prerequisites

- **Docker and Docker Compose**: For running InfluxDB 3 containers
- **Node.js 16+**: For running JavaScript ESM scripts
- **Active containers**: InfluxDB 3 Core and/or Enterprise containers running via `docker compose`
- **Secret files**: Docker Compose secrets for auth tokens (`~/.env.influxdb3-core-admin-token` and `~/.env.influxdb3-enterprise-admin-token`)

## Scripts

### 🔐 Authentication & Setup

#### `setup-auth-tokens.sh`

Creates and configures authentication tokens for InfluxDB 3 containers.

**Usage:**
```bash
./setup-auth-tokens.sh [core|enterprise|both]
```

**What it does:**
- Checks existing tokens in secret files (`~/.env.influxdb3-core-admin-token` and `~/.env.influxdb3-enterprise-admin-token`)
- Starts containers if not running
- Creates admin tokens using `influxdb3 create token --admin`
- Updates appropriate secret files with new tokens
- Tests tokens to ensure they work

**Example:**
```bash
# Set up both Core and Enterprise tokens
./setup-auth-tokens.sh both

# Set up only Enterprise
./setup-auth-tokens.sh enterprise
```
### 🔍 CLI Documentation Audit

#### `audit-cli-documentation.js`

JavaScript ESM script that audits InfluxDB 3 CLI commands against existing documentation to identify missing or outdated content.

**Usage:**
```bash
node audit-cli-documentation.js [core|enterprise|both] [version|local]
```

**Features:**
- Compares actual CLI help output with documented commands
- Identifies missing documentation for new CLI options
- Finds documented options that no longer exist in the CLI
- Supports both released versions and local containers
- Generates detailed audit reports with recommendations
- Handles authentication automatically using Docker secrets

**Examples:**
```bash
# Audit Core documentation against a local container
node audit-cli-documentation.js core local

# Audit Enterprise documentation against a specific version
node audit-cli-documentation.js enterprise v3.2.0

# Audit both products against local containers
node audit-cli-documentation.js both local
```

**Output:**
- `../output/cli-audit/documentation-audit-{product}-{version}.md` - Detailed audit report
- `../output/cli-audit/parsed-cli-{product}-{version}.md` - Parsed CLI structure
- `../output/cli-audit/patches/{product}/` - Generated patches for missing documentation
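The comparison at the heart of the audit can be sketched in a few lines: collect the `--option` flags that appear in the CLI help text, collect the ones mentioned in a docs page, and diff the two sets. This is a minimal, self-contained sketch; the function names and the regex are illustrative assumptions, not the script's actual implementation.

```javascript
// Extract long flags like --database or --retention-period from any text.
function extractOptions(text) {
  return new Set(text.match(/--[a-z][a-z0-9-]*/g) ?? []);
}

// Diff CLI help output against a documentation page:
// options the docs are missing, and documented options the CLI dropped.
function auditOptions(cliHelp, docsPage) {
  const cli = extractOptions(cliHelp);
  const docs = extractOptions(docsPage);
  return {
    missingFromDocs: [...cli].filter((o) => !docs.has(o)),
    staleInDocs: [...docs].filter((o) => !cli.has(o)),
  };
}

const cliHelp = 'Options:\n  --database <NAME>\n  --retention-period <DUR>';
const docsPage = 'Use `--database` to target a database. `--tier` is deprecated.';
console.log(auditOptions(cliHelp, docsPage));
```

The real script works the same way in spirit, but parses structured help sections per command rather than a single flat regex pass.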
### 🛠️ CLI Documentation Updates

#### `apply-cli-patches.js`

JavaScript ESM script that applies generated patches to update CLI documentation with missing options.

**Usage:**
```bash
node apply-cli-patches.js [core|enterprise|both] [--dry-run]
```

**Features:**
- Applies patches generated by `audit-cli-documentation.js`
- Creates CLI reference documentation for missing options
- Supports dry-run mode to preview changes
- Maintains existing documentation structure and formatting
- Skips destination files that already exist (nothing is overwritten)

**Examples:**
```bash
# Preview changes without applying (dry run)
node apply-cli-patches.js core --dry-run

# Apply patches to Enterprise documentation
node apply-cli-patches.js enterprise

# Apply patches to both products
node apply-cli-patches.js both
```

**Output:**
- Creates new CLI reference documentation files from patches
- Skips files that already exist rather than overwriting them
- Logs all changes made to the documentation
## Quick Start Guide

### 1. Initial Setup

```bash
# Navigate to the monolith scripts directory
cd helper-scripts/influxdb3-monolith

# Make scripts executable
chmod +x *.sh

# Set up authentication for both products
./setup-auth-tokens.sh both

# Restart containers to load new secrets
docker compose down && docker compose up -d influxdb3-core influxdb3-enterprise
```

### 2. CLI Documentation Audit

```bash
# Start your containers
docker compose up -d influxdb3-core influxdb3-enterprise

# Audit CLI documentation
node audit-cli-documentation.js core local
node audit-cli-documentation.js enterprise local

# Review the output
ls ../output/cli-audit/
```

### 3. Development Workflow

```bash
# Audit documentation for both products
node audit-cli-documentation.js both local

# Check the audit results
cat ../output/cli-audit/documentation-audit-core-local.md
cat ../output/cli-audit/documentation-audit-enterprise-local.md

# Apply patches if needed (dry run first)
node apply-cli-patches.js both --dry-run
```

### 4. Release Documentation Updates

For release documentation, use the audit and patch workflow:

```bash
# Audit against a released version
node audit-cli-documentation.js enterprise v3.2.0

# Review missing documentation
cat ../output/cli-audit/documentation-audit-enterprise-v3.2.0.md

# Apply patches to update documentation
node apply-cli-patches.js enterprise

# Verify the changes look correct
git diff content/influxdb3/enterprise/reference/cli/
```
## Container Integration

The scripts work with your Docker Compose setup:

**Expected container names:**
- `influxdb3-core` (port 8282)
- `influxdb3-enterprise` (port 8181)

**Docker Compose secrets:**
- `influxdb3-core-admin-token` - Admin token for Core (stored in `~/.env.influxdb3-core-admin-token`)
- `influxdb3-enterprise-admin-token` - Admin token for Enterprise (stored in `~/.env.influxdb3-enterprise-admin-token`)
- `INFLUXDB3_LICENSE_EMAIL` - Enterprise license email (set in the `.env.3ent` env_file)
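For reference, the secrets above might be wired into `compose.yaml` roughly like this. This is a hypothetical excerpt based on the conventions listed above and the `INFLUXDB3_AUTH_TOKEN=/run/secrets/...` pattern; the repo's actual service definitions may differ.

```yaml
services:
  influxdb3-enterprise:
    image: influxdb:3-enterprise
    secrets:
      - influxdb3-enterprise-admin-token
    environment:
      # The helper scripts read the token from the mounted secret path
      - INFLUXDB3_AUTH_TOKEN=/run/secrets/influxdb3-enterprise-admin-token

secrets:
  influxdb3-enterprise-admin-token:
    file: ~/.env.influxdb3-enterprise-admin-token
```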
## Use Cases

### 📋 Release Documentation

1. **Pre-release audit:**
   ```bash
   node audit-cli-documentation.js core v3.2.0
   ```
2. **Review audit results and update documentation**
3. **Apply patches for missing content**
4. **Test that documented commands work correctly**

### 🔬 Development Testing

1. **Audit local development:**
   ```bash
   node audit-cli-documentation.js enterprise local
   ```
2. **Verify new features are documented**
3. **Test authentication setup**
4. **Apply patches to keep docs current**

### 🚀 Release Preparation

1. **Final audit before release:**
   ```bash
   node audit-cli-documentation.js both local
   ```
2. **Apply all pending patches**
3. **Update examples and tutorials**
4. **Verify all CLI commands work as documented**
## Output Structure

```
helper-scripts/
├── output/
│   └── cli-audit/
│       ├── documentation-audit-core-local.md          # CLI documentation audit report
│       ├── documentation-audit-enterprise-v3.2.0.md   # CLI documentation audit report
│       ├── parsed-cli-core-local.md                   # Parsed CLI structure
│       ├── parsed-cli-enterprise-v3.2.0.md            # Parsed CLI structure
│       └── patches/
│           ├── core/                                  # Generated patches for Core
│           │   ├── influxdb3-cli-patch-001.md
│           │   └── influxdb3-cli-patch-002.md
│           └── enterprise/                            # Generated patches for Enterprise
│               ├── influxdb3-cli-patch-001.md
│               └── influxdb3-cli-patch-002.md
└── influxdb3-monolith/
    ├── README.md                    # This file
    ├── setup-auth-tokens.sh         # Auth setup
    ├── audit-cli-documentation.js   # CLI documentation audit
    └── apply-cli-patches.js         # CLI documentation patches
```
## Error Handling

### Common Issues

**Container not running:**
```bash
# Check status
docker compose ps

# Start a specific service
docker compose up -d influxdb3-core
```

**Authentication failures:**
```bash
# Recreate tokens
./setup-auth-tokens.sh both

# Test manually
docker exec influxdb3-core influxdb3 create token --admin
```

**Version not found:**
```bash
# Check that the versioned images are available
docker pull influxdb:3.2.0-core
docker pull influxdb:3.2.0-enterprise
```

### Debug Mode

Enable debug output for troubleshooting:
```bash
DEBUG=1 node audit-cli-documentation.js core local
```

## Integration with CI/CD

### GitHub Actions Example

```yaml
- name: Audit CLI Documentation
  run: |
    cd helper-scripts/influxdb3-monolith
    node audit-cli-documentation.js core ${{ env.VERSION }}

- name: Upload CLI Audit Results
  uses: actions/upload-artifact@v3
  with:
    name: cli-audit
    path: helper-scripts/output/cli-audit/
```

### CircleCI Example

```yaml
- run:
    name: CLI Documentation Audit
    command: |
      cd helper-scripts/influxdb3-monolith
      node audit-cli-documentation.js enterprise v3.2.0

- store_artifacts:
    path: helper-scripts/output/cli-audit/
```
## Best Practices

### 🔒 Security
- Secret files (`~/.env.influxdb3-*-admin-token`) are stored in your home directory, not in version control
- Rotate auth tokens regularly by re-running `setup-auth-tokens.sh`
- Use minimal token permissions when possible

### 📚 Documentation
- Run audits early in the release cycle
- Review all audit reports for missing content
- Apply patches to keep documentation current
- Test that all documented commands work correctly

### 🔄 Workflow
- Use the `local` version for development testing
- Audit against released versions for release prep
- Generate patches before documentation updates
- Validate changes with stakeholders

## Troubleshooting

### Script Permissions
```bash
chmod +x *.sh
```

### Missing Dependencies
```bash
# Node.js version
node --version  # Should be 16 or higher

# Docker Compose
docker compose version
```

### Container Health
```bash
# Check container logs
docker logs influxdb3-core
docker logs influxdb3-enterprise

# Test basic connectivity
docker exec influxdb3-core influxdb3 --version
```

## Contributing

When adding new scripts to this directory:

1. **Follow naming conventions**: Use lowercase with hyphens
2. **Add usage documentation**: Include help text in scripts
3. **Handle errors gracefully**: Use proper exit codes
4. **Test with both products**: Ensure Core and Enterprise compatibility
5. **Update this README**: Document new functionality

## Related Documentation

- [InfluxDB 3 Core CLI Reference](/influxdb3/core/reference/cli/)
- [InfluxDB 3 Enterprise CLI Reference](/influxdb3/enterprise/reference/cli/)
@@ -1,277 +0,0 @@
#!/usr/bin/env node

/**
 * Apply CLI documentation patches generated by audit-cli-documentation.js
 * Usage: node apply-cli-patches.js [core|enterprise|both] [--dry-run]
 */

import { promises as fs } from 'fs';
import { join, dirname } from 'path';
import { fileURLToPath } from 'url';
import process from 'node:process';

const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

// ANSI color codes for terminal output
const Colors = {
  RED: '\x1b[0;31m',
  GREEN: '\x1b[0;32m',
  YELLOW: '\x1b[1;33m',
  BLUE: '\x1b[0;34m',
  NC: '\x1b[0m', // No Color
};

async function fileExists(path) {
  try {
    await fs.access(path);
    return true;
  } catch {
    return false;
  }
}

async function ensureDir(dir) {
  await fs.mkdir(dir, { recursive: true });
}
async function extractFrontmatter(content) {
  const lines = content.split('\n');
  if (lines[0] !== '---') return { frontmatter: null, content };

  const frontmatterLines = [];
  let i = 1;
  while (i < lines.length && lines[i] !== '---') {
    frontmatterLines.push(lines[i]);
    i++;
  }

  // No closing '---' found: treat the whole file as content
  if (i >= lines.length) return { frontmatter: null, content };

  const frontmatterText = frontmatterLines.join('\n');
  const remainingContent = lines.slice(i + 1).join('\n');

  return { frontmatter: frontmatterText, content: remainingContent };
}

async function getActualDocumentationPath(docPath, projectRoot) {
  // If the documentation file exists and declares a `source:` field in its
  // frontmatter, patches should target that source file instead.
  const fullPath = join(projectRoot, docPath);

  if (await fileExists(fullPath)) {
    const content = await fs.readFile(fullPath, 'utf8');
    const { frontmatter } = await extractFrontmatter(content);

    if (frontmatter) {
      // Look for a source: field in the frontmatter
      const sourceMatch = frontmatter.match(/^source:\s*(.+)$/m);
      if (sourceMatch) {
        return sourceMatch[1].trim();
      }
    }
  }

  return docPath;
}
async function applyPatches(product, dryRun = false) {
  const patchDir = join(
    dirname(__dirname),
    'output',
    'cli-audit',
    'patches',
    product
  );
  const projectRoot = join(__dirname, '..', '..');

  console.log(
    `${Colors.BLUE}📋 Applying CLI documentation patches for ${product}${Colors.NC}`
  );
  if (dryRun) {
    console.log(
      `${Colors.YELLOW}🔍 DRY RUN - No files will be created${Colors.NC}`
    );
  }
  console.log();

  // Check if the patch directory exists
  if (!(await fileExists(patchDir))) {
    console.log(`${Colors.YELLOW}No patches found for ${product}.${Colors.NC}`);
    console.log("Run 'yarn audit:cli' first to generate patches.");
    return;
  }

  // Read all patch files
  const patchFiles = await fs.readdir(patchDir);
  const mdFiles = patchFiles.filter((f) => f.endsWith('.md'));

  if (mdFiles.length === 0) {
    console.log(
      `${Colors.YELLOW}No patch files found in ${patchDir}${Colors.NC}`
    );
    return;
  }

  console.log(`Found ${mdFiles.length} patch file(s) to apply:\n`);

  // Map patch files to their destination documentation paths
  const baseCliPath = `content/influxdb3/${product}/reference/cli/influxdb3`;
  const commandToFile = {
    'create-database.md': `${baseCliPath}/create/database.md`,
    'create-token.md': `${baseCliPath}/create/token/_index.md`,
    'create-token-admin.md': `${baseCliPath}/create/token/admin.md`,
    'create-trigger.md': `${baseCliPath}/create/trigger.md`,
    'create-table.md': `${baseCliPath}/create/table.md`,
    'create-last_cache.md': `${baseCliPath}/create/last_cache.md`,
    'create-distinct_cache.md': `${baseCliPath}/create/distinct_cache.md`,
    'show-databases.md': `${baseCliPath}/show/databases.md`,
    'show-tokens.md': `${baseCliPath}/show/tokens.md`,
    'delete-database.md': `${baseCliPath}/delete/database.md`,
    'delete-table.md': `${baseCliPath}/delete/table.md`,
    'query.md': `${baseCliPath}/query.md`,
    'write.md': `${baseCliPath}/write.md`,
  };

  let applied = 0;
  let skipped = 0;
  for (const patchFile of mdFiles) {
    const destinationPath = commandToFile[patchFile];

    if (!destinationPath) {
      console.log(
        `${Colors.YELLOW}⚠️  Unknown patch file: ${patchFile}${Colors.NC}`
      );
      continue;
    }

    // Get the actual documentation path (handles `source:` frontmatter)
    const actualPath = await getActualDocumentationPath(
      destinationPath,
      projectRoot
    );
    const fullDestPath = join(projectRoot, actualPath);
    const patchPath = join(patchDir, patchFile);

    // Skip if the destination already exists
    if (await fileExists(fullDestPath)) {
      console.log(
        `${Colors.YELLOW}⏭️  Skipping${Colors.NC} ${patchFile} - destination already exists:`
      );
      console.log(`   ${actualPath}`);
      skipped++;
      continue;
    }

    if (dryRun) {
      console.log(`${Colors.BLUE}🔍 Would create${Colors.NC} ${actualPath}`);
      console.log(`   from patch: ${patchFile}`);
      if (actualPath !== destinationPath) {
        console.log(`   (resolved from: ${destinationPath})`);
      }
      applied++;
    } else {
      try {
        // Ensure the destination directory exists
        await ensureDir(dirname(fullDestPath));

        // Copy the patch to the destination
        const content = await fs.readFile(patchPath, 'utf8');

        // Update the menu configuration based on product
        let updatedContent = content;
        if (product === 'enterprise') {
          updatedContent = content
            .replace('influxdb3/core/tags:', 'influxdb3/enterprise/tags:')
            .replace(
              'influxdb3_core_reference:',
              'influxdb3_enterprise_reference:'
            );
        }

        await fs.writeFile(fullDestPath, updatedContent);

        console.log(`${Colors.GREEN}✅ Created${Colors.NC} ${actualPath}`);
        console.log(`   from patch: ${patchFile}`);
        if (actualPath !== destinationPath) {
          console.log(`   (resolved from: ${destinationPath})`);
        }
        applied++;
      } catch (error) {
        console.log(
          `${Colors.RED}❌ Error${Colors.NC} creating ${actualPath}:`
        );
        console.log(`   ${error.message}`);
      }
    }
  }
  console.log();
  console.log(`${Colors.BLUE}Summary:${Colors.NC}`);
  console.log(`- Patches ${dryRun ? 'would be ' : ''}applied: ${applied}`);
  console.log(`- Files skipped (already exist): ${skipped}`);
  console.log(`- Total patch files: ${mdFiles.length}`);

  if (!dryRun && applied > 0) {
    console.log();
    console.log(
      `${Colors.GREEN}✨ Success!${Colors.NC} Created ${applied} new ` +
        'documentation file(s).'
    );
    console.log();
    console.log('Next steps:');
    console.log('1. Review the generated files and customize the content');
    console.log('2. Add proper examples with placeholders');
    console.log('3. Update descriptions and add any missing options');
    console.log('4. Run tests: yarn test:links');
  }
}

async function main() {
  const args = process.argv.slice(2);
  const product =
    args.find((arg) => ['core', 'enterprise', 'both'].includes(arg)) || 'both';
  const dryRun = args.includes('--dry-run');

  if (args.includes('--help') || args.includes('-h')) {
    console.log(
      'Usage: node apply-cli-patches.js [core|enterprise|both] [--dry-run]'
    );
    console.log();
    console.log('Options:');
    console.log(
      '  --dry-run    Show what would be done without creating files'
    );
    console.log();
    console.log('Examples:');
    console.log(
      '  node apply-cli-patches.js                  # Apply patches for both products'
    );
    console.log(
      '  node apply-cli-patches.js core --dry-run   # Preview core patches'
    );
    console.log(
      '  node apply-cli-patches.js enterprise       # Apply enterprise patches'
    );
    process.exit(0);
  }

  try {
    if (product === 'both') {
      await applyPatches('core', dryRun);
      console.log();
      await applyPatches('enterprise', dryRun);
    } else {
      await applyPatches(product, dryRun);
    }
  } catch (error) {
    console.error(`${Colors.RED}Error:${Colors.NC}`, error.message);
    process.exit(1);
  }
}

// Run if called directly
if (import.meta.url === `file://${process.argv[1]}`) {
  main();
}
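The `source:` frontmatter redirect implemented by `getActualDocumentationPath` above can be exercised in isolation. A minimal sketch with a hypothetical sample document (the file contents and paths below are illustrative, not from the repo):

```javascript
// Re-implementation of the script's frontmatter parsing: a markdown file
// that opens with a `---` block may declare `source:` to redirect patches
// to a shared source file.
function extractFrontmatter(content) {
  const lines = content.split('\n');
  if (lines[0] !== '---') return { frontmatter: null, content };

  let i = 1;
  while (i < lines.length && lines[i] !== '---') i++;
  if (i >= lines.length) return { frontmatter: null, content };

  return {
    frontmatter: lines.slice(1, i).join('\n'),
    content: lines.slice(i + 1).join('\n'),
  };
}

// Return the `source:` path when present, otherwise the original path.
function resolveSource(content, fallbackPath) {
  const { frontmatter } = extractFrontmatter(content);
  const m = frontmatter && frontmatter.match(/^source:\s*(.+)$/m);
  return m ? m[1].trim() : fallbackPath;
}

// Hypothetical stub document exercising the redirect
const doc =
  '---\ntitle: influxdb3 query\nsource: /shared/influxdb3-cli/query.md\n---\nBody text';
console.log(resolveSource(doc, 'content/influxdb3/core/reference/cli/influxdb3/query.md'));
// A document without frontmatter falls back to its own path
console.log(resolveSource('Plain body', 'content/example.md'));
```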
@@ -1,974 +0,0 @@
#!/usr/bin/env node

/**
 * Audit CLI documentation against current CLI help output
 * Usage: node audit-cli-documentation.js [core|enterprise|both] [version]
 * Example: node audit-cli-documentation.js core 3.2.0
 */

import { spawn } from 'child_process';
import { promises as fs } from 'fs';
import { homedir } from 'os';
import { join, dirname } from 'path';
import { fileURLToPath } from 'url';
import {
  validateVersionInputs,
  getRepositoryRoot,
} from '../common/validate-tags.js';

const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

// ANSI color codes for terminal output
const Colors = {
  RED: '\x1b[0;31m',
  GREEN: '\x1b[0;32m',
  YELLOW: '\x1b[1;33m',
  BLUE: '\x1b[0;34m',
  NC: '\x1b[0m', // No Color
};
class CLIDocAuditor {
  constructor(product = 'both', version = 'local') {
    this.product = product;
    this.version = version;
    this.outputDir = join(dirname(__dirname), 'output', 'cli-audit');

    // Token paths - prefer environment variables (Docker Compose secrets),
    // then fall back to local files. Existence and contents are validated
    // later in loadTokens(); the constructor can't await fileExists().
    const coreTokenEnv = process.env.INFLUXDB3_CORE_TOKEN;
    const enterpriseTokenEnv = process.env.INFLUXDB3_ENTERPRISE_TOKEN;

    if (coreTokenEnv) {
      // Running in Docker Compose with secrets
      this.coreTokenFile = coreTokenEnv;
      this.enterpriseTokenFile = enterpriseTokenEnv;
    } else {
      // Running locally
      this.coreTokenFile = join(homedir(), '.env.influxdb3-core-admin-token');
      this.enterpriseTokenFile = join(
        homedir(),
        '.env.influxdb3-enterprise-admin-token'
      );
    }

    // Commands to extract help for
    this.mainCommands = [
      'create',
      'delete',
      'disable',
      'enable',
      'query',
      'show',
      'test',
      'update',
      'write',
    ];
    this.subcommands = [
      'create database',
      'create token admin',
      'create token',
      'create trigger',
      'create last_cache',
      'create distinct_cache',
      'create table',
      'show databases',
      'show tokens',
      'show system',
      'delete database',
      'delete table',
      'delete trigger',
      'update database',
      'test wal_plugin',
      'test schedule_plugin',
    ];

    // Map for command tracking during option parsing
    this.commandOptionsMap = {};
  }

  async fileExists(path) {
    try {
      await fs.access(path);
      return true;
    } catch {
      return false;
    }
  }

  async ensureDir(dir) {
    await fs.mkdir(dir, { recursive: true });
  }
  async loadTokens() {
    let coreToken = null;
    let enterpriseToken = null;

    try {
      if (await this.fileExists(this.coreTokenFile)) {
        const stat = await fs.stat(this.coreTokenFile);
        if (stat.size > 0) {
          coreToken = (await fs.readFile(this.coreTokenFile, 'utf8')).trim();
        }
      }
    } catch {
      // Token file doesn't exist or can't be read
    }

    try {
      if (await this.fileExists(this.enterpriseTokenFile)) {
        const stat = await fs.stat(this.enterpriseTokenFile);
        if (stat.size > 0) {
          enterpriseToken = (
            await fs.readFile(this.enterpriseTokenFile, 'utf8')
          ).trim();
        }
      }
    } catch {
      // Token file doesn't exist or can't be read
    }

    return { coreToken, enterpriseToken };
  }
  runCommand(cmd, args = []) {
    return new Promise((resolve) => {
      // Note: child_process.spawn has no `encoding` option; the data events
      // emit Buffers, which toString() decodes as UTF-8 below.
      const child = spawn(cmd, args);
      let stdout = '';
      let stderr = '';

      child.stdout.on('data', (data) => {
        stdout += data.toString();
      });

      child.stderr.on('data', (data) => {
        stderr += data.toString();
      });

      child.on('close', (code) => {
        resolve({ code, stdout, stderr });
      });

      child.on('error', (err) => {
        resolve({ code: 1, stdout: '', stderr: err.message });
      });
    });
  }
  async extractCurrentCLI(product, outputFile) {
    process.stdout.write(
      `Extracting current CLI help from influxdb3-${product}...`
    );

    await this.loadTokens();

    if (this.version === 'local') {
      const containerName = `influxdb3-${product}`;

      // Check if the container is running
      const { code, stdout } = await this.runCommand('docker', [
        'ps',
        '--format',
        '{{.Names}}',
      ]);
      if (code !== 0 || !stdout.includes(containerName)) {
        console.log(` ${Colors.RED}✗${Colors.NC}`);
        console.log(`Error: Container ${containerName} is not running.`);
        console.log(`Start it with: docker compose up -d influxdb3-${product}`);
        return false;
      }

      // Extract comprehensive help
      let fileContent = '';

      // Main help
      const mainHelp = await this.runCommand('docker', [
        'exec',
        containerName,
        'influxdb3',
        '--help',
      ]);
      fileContent += mainHelp.code === 0 ? mainHelp.stdout : mainHelp.stderr;

      // Extract help for each top-level command
      for (const cmd of this.mainCommands) {
        fileContent += `\n\n===== influxdb3 ${cmd} --help =====\n`;
        const cmdHelp = await this.runCommand('docker', [
          'exec',
          containerName,
          'influxdb3',
          cmd,
          '--help',
        ]);
        fileContent += cmdHelp.code === 0 ? cmdHelp.stdout : cmdHelp.stderr;
      }

      // Extract detailed subcommand help
      for (const subcmd of this.subcommands) {
        fileContent += `\n\n===== influxdb3 ${subcmd} --help =====\n`;
        const cmdParts = [
          'exec',
          containerName,
          'influxdb3',
          ...subcmd.split(' '),
          '--help',
        ];
        const subcmdHelp = await this.runCommand('docker', cmdParts);
        fileContent +=
          subcmdHelp.code === 0 ? subcmdHelp.stdout : subcmdHelp.stderr;
      }

      await fs.writeFile(outputFile, fileContent);
      console.log(` ${Colors.GREEN}✓${Colors.NC}`);
    } else {
      // Use a specific version image
      const image = `influxdb:${this.version}-${product}`;

      process.stdout.write(`Extracting CLI help from ${image}...`);

      // Pull the image if needed
      const pullResult = await this.runCommand('docker', ['pull', image]);
      if (pullResult.code !== 0) {
        console.log(` ${Colors.RED}✗${Colors.NC}`);
        console.log(`Error: Failed to pull image ${image}`);
        return false;
      }

      // Extract help from the specific version
      let fileContent = '';

      // Main help
      const mainHelp = await this.runCommand('docker', [
        'run',
        '--rm',
        image,
        'influxdb3',
        '--help',
      ]);
      fileContent += mainHelp.code === 0 ? mainHelp.stdout : mainHelp.stderr;

      // Extract subcommand help
      for (const cmd of this.mainCommands) {
        fileContent += `\n\n===== influxdb3 ${cmd} --help =====\n`;
        const cmdHelp = await this.runCommand('docker', [
          'run',
          '--rm',
          image,
          'influxdb3',
          cmd,
          '--help',
        ]);
        fileContent += cmdHelp.code === 0 ? cmdHelp.stdout : cmdHelp.stderr;
      }

      await fs.writeFile(outputFile, fileContent);
      console.log(` ${Colors.GREEN}✓${Colors.NC}`);
    }

    return true;
  }
  async parseCLIHelp(helpFile, parsedFile) {
    const content = await fs.readFile(helpFile, 'utf8');
    const lines = content.split('\n');

    let output = '# CLI Commands and Options\n\n';
    let currentCommand = '';
    let inOptions = false;

    for (const line of lines) {
      // Detect command headers
      if (line.startsWith('===== influxdb3') && line.endsWith('--help =====')) {
        currentCommand = line
          .replace('===== ', '')
          .replace(' --help =====', '')
          .trim();
        output += `## ${currentCommand}\n\n`;
        inOptions = false;
        // Initialize the options list for this command
        this.commandOptionsMap[currentCommand] = [];
      }
      // Detect options sections
      else if (line.trim() === 'Options:') {
        output += '### Options:\n\n';
        inOptions = true;
      }
      // Parse option lines
      else if (inOptions && /^\s*-/.test(line)) {
        // Extract the option and its description
        const optionMatch = line.match(/--[a-z][a-z0-9-]*/);
        const shortMatch = line.match(/\s-[a-zA-Z],/);

        if (optionMatch) {
          const option = optionMatch[0];
          const shortOption = shortMatch
            ? shortMatch[0].replace(/[,\s]/g, '')
            : null;

          // Extract the description by removing the option parts
          let description = line.replace(/^\s*-[^\s]*\s*/, '');
          description = description.replace(/^\s*--[^\s]*\s*/, '').trim();

          if (shortOption) {
            output += `- \`${shortOption}, ${option}\`: ${description}\n`;
          } else {
            output += `- \`${option}\`: ${description}\n`;
          }

          // Store the option with its command context
          if (currentCommand && option) {
            this.commandOptionsMap[currentCommand].push(option);
          }
        }
      }
      // Reset the options flag for new sections
      else if (/^[A-Z][a-z]+:$/.test(line.trim())) {
        inOptions = false;
      }
    }

    await fs.writeFile(parsedFile, output);
  }

  findDocsPath(product) {
    if (product === 'core') {
      return 'content/influxdb3/core/reference/cli/influxdb3';
    } else if (product === 'enterprise') {
      return 'content/influxdb3/enterprise/reference/cli/influxdb3';
    }
    return '';
  }

  async extractCommandHelp(content, command) {
    // Find the section for this specific command in the CLI help
    const lines = content.split('\n');
    let inCommand = false;
    const helpText = [];
    const commandHeader = `===== influxdb3 ${command} --help =====`;

    for (let i = 0; i < lines.length; i++) {
      if (lines[i] === commandHeader) {
        inCommand = true;
        continue;
      }
      if (inCommand && lines[i].startsWith('===== influxdb3')) {
        break;
      }
      if (inCommand) {
        helpText.push(lines[i]);
      }
    }

    return helpText.join('\n').trim();
  }

  async generateDocumentationTemplate(command, helpText) {
    // Parse the help text to extract the description and options
    const lines = helpText.split('\n');
    let description = '';
    let usage = '';
    const options = [];
    let inOptions = false;

    for (let i = 0; i < lines.length; i++) {
      const line = lines[i];

      if (i === 0 && !line.startsWith('Usage:') && line.trim()) {
        description = line.trim();
      }
      if (line.startsWith('Usage:')) {
        usage = line.replace('Usage:', '').trim();
      }
      if (line.trim() === 'Options:') {
        inOptions = true;
        continue;
      }
      if (inOptions && /^\s*-/.test(line)) {
        const optionMatch = line.match(/--([a-z][a-z0-9-]*)/);
        const shortMatch = line.match(/\s-([a-zA-Z]),/);
        if (optionMatch) {
          const optionName = optionMatch[1];
          const shortOption = shortMatch ? shortMatch[1] : null;
          const optionDesc = line
            .replace(/^\s*-[^\s]*\s*/, '')
            .replace(/^\s*--[^\s]*\s*/, '')
            .trim();

          options.push({
            name: optionName,
            short: shortOption,
            description: optionDesc,
          });
        }
      }
    }

    // Generate the markdown template
    let template = `---
title: influxdb3 ${command}
description: >
  The \`influxdb3 ${command}\` command ${description.toLowerCase()}.
influxdb3/core/tags: [cli]
menu:
  influxdb3_core_reference:
    parent: influxdb3 cli
weight: 201
---

# influxdb3 ${command}

${description}

## Usage

\`\`\`bash
${usage || `influxdb3 ${command} [OPTIONS]`}
\`\`\`

`;

    if (options.length > 0) {
      template += `## Options

| Option | Description |
|--------|-------------|
`;

      for (const opt of options) {
        const optionDisplay = opt.short
          ? `\`-${opt.short}\`, \`--${opt.name}\``
          : `\`--${opt.name}\``;
        template += `| ${optionDisplay} | ${opt.description} |\n`;
      }
    }

    template += `
## Examples

### Example 1: Basic usage

{{% code-placeholders "PLACEHOLDER1|PLACEHOLDER2" %}}
\`\`\`bash
influxdb3 ${command} --example PLACEHOLDER1
\`\`\`
{{% /code-placeholders %}}

Replace the following:

- {{% code-placeholder-key %}}\`PLACEHOLDER1\`{{% /code-placeholder-key %}}: Description of placeholder
`;

    return template;
  }

  async extractFrontmatter(content) {
    const lines = content.split('\n');
    if (lines[0] !== '---') return { frontmatter: null, content };

    const frontmatterLines = [];
    let i = 1;
    while (i < lines.length && lines[i] !== '---') {
      frontmatterLines.push(lines[i]);
      i++;
    }

    if (i >= lines.length) return { frontmatter: null, content };

    const frontmatterText = frontmatterLines.join('\n');
    const remainingContent = lines.slice(i + 1).join('\n');

    return { frontmatter: frontmatterText, content: remainingContent };
  }

  async getActualContentPath(filePath) {
    // Resolve the actual content path, following `source` fields in frontmatter
    try {
      const content = await fs.readFile(filePath, 'utf8');
      const { frontmatter } = await this.extractFrontmatter(content);

      if (frontmatter) {
        const sourceMatch = frontmatter.match(/^source:\s*(.+)$/m);
        if (sourceMatch) {
          let sourcePath = sourceMatch[1].trim();
          // Handle paths relative to the project root
          if (sourcePath.startsWith('/shared/')) {
            sourcePath = `content${sourcePath}`;
          }
          return sourcePath;
        }
      }
      return null; // No source field found
    } catch {
      return null;
    }
  }

  async parseDocumentedOptions(filePath) {
    // Parse a documentation file to extract all documented options
    try {
      const content = await fs.readFile(filePath, 'utf8');
      const options = [];

      // Look for options in several patterns:
      // 1. Markdown tables with option columns
      // 2. Option lists with backticks
      // 3. Code examples with --option flags
      // 4. Environment variable mappings

      // Pattern 1: Markdown tables (| Option | Description |)
      const tableMatches = content.match(/\|\s*`?--[a-z][a-z0-9-]*`?\s*\|/gi);
      if (tableMatches) {
        for (const match of tableMatches) {
          const option = match.match(/--[a-z][a-z0-9-]*/i);
          if (option) {
            options.push(option[0]);
          }
        }
      }

      // Pattern 2: Backtick-enclosed options in text
      const backtickMatches = content.match(/`--[a-z][a-z0-9-]*`/gi);
      if (backtickMatches) {
        for (const match of backtickMatches) {
          options.push(match.replace(/`/g, ''));
        }
      }

      // Pattern 3: Options in code blocks
      const codeBlockMatches = content.match(/```[\s\S]*?```/g);
      if (codeBlockMatches) {
        for (const block of codeBlockMatches) {
          const blockOptions = block.match(/--[a-z][a-z0-9-]*/gi);
          if (blockOptions) {
            options.push(...blockOptions);
          }
        }
      }

      // Pattern 4: Environment variable mappings (INFLUXDB3_* to --option)
      const envMatches = content.match(
        /\|\s*`INFLUXDB3_[^`]*`\s*\|\s*`--[a-z][a-z0-9-]*`\s*\|/gi
      );
      if (envMatches) {
        for (const match of envMatches) {
          const option = match.match(/--[a-z][a-z0-9-]*/);
          if (option) {
            options.push(option[0]);
          }
        }
      }

      // Remove duplicates and return sorted
      return [...new Set(options)].sort();
    } catch {
      return [];
    }
  }

  async auditDocs(product, cliFile, auditFile) {
    const docsPath = this.findDocsPath(product);
    const sharedPath = 'content/shared/influxdb3-cli';
    const patchDir = join(this.outputDir, 'patches', product);
    await this.ensureDir(patchDir);

    let output = `# CLI Documentation Audit - ${product}\n`;
    output += `Generated: ${new Date().toISOString()}\n\n`;

    // GitHub base URLs for edit links
    const githubBase = 'https://github.com/influxdata/docs-v2/edit/master';
    const githubNewBase = 'https://github.com/influxdata/docs-v2/new/master';

    // VS Code links for local editing
    const vscodeBase = 'vscode://file';
    const projectRoot = join(__dirname, '..', '..');

    // Check for missing documentation
    output += '## Missing Documentation\n\n';

    let missingCount = 0;
    const missingDocs = [];

    // Map commands to their expected documentation files
    const commandToFile = {
      'create database': 'create/database.md',
      'create token': 'create/token/_index.md',
      'create token admin': 'create/token/admin.md',
      'create trigger': 'create/trigger.md',
      'create table': 'create/table.md',
      'create last_cache': 'create/last_cache.md',
      'create distinct_cache': 'create/distinct_cache.md',
      'show databases': 'show/databases.md',
      'show tokens': 'show/tokens.md',
      'delete database': 'delete/database.md',
      'delete table': 'delete/table.md',
      query: 'query.md',
      write: 'write.md',
    };

    // Extract commands from the CLI help
    const content = await fs.readFile(cliFile, 'utf8');
    const lines = content.split('\n');

    for (const line of lines) {
      if (line.startsWith('===== influxdb3') && line.endsWith('--help =====')) {
        const command = line
          .replace('===== influxdb3 ', '')
          .replace(' --help =====', '');

        if (commandToFile[command]) {
          const expectedFile = commandToFile[command];
          const productFile = join(docsPath, expectedFile);
          const sharedFile = join(sharedPath, expectedFile);

          const productExists = await this.fileExists(productFile);
          const sharedExists = await this.fileExists(sharedFile);

          let needsContent = false;
          let targetPath = null;
          let stubPath = null;

          if (!productExists && !sharedExists) {
            // Completely missing
            needsContent = true;
            targetPath = productFile;
          } else if (productExists) {
            // Check whether a source field points to missing content
            const actualPath = await this.getActualContentPath(productFile);
            if (actualPath && !(await this.fileExists(actualPath))) {
              needsContent = true;
              targetPath = actualPath;
              stubPath = productFile;
            }
          } else if (sharedExists) {
            // The shared file exists; check whether it has content
            const actualPath = await this.getActualContentPath(sharedFile);
            if (actualPath && !(await this.fileExists(actualPath))) {
              needsContent = true;
              targetPath = actualPath;
              stubPath = sharedFile;
            }
          }

          if (needsContent && targetPath) {
            const githubNewUrl = `${githubNewBase}/${targetPath}`;
            const localPath = join(projectRoot, targetPath);

            output += `- **Missing**: Documentation for \`influxdb3 ${command}\`\n`;
            if (stubPath) {
              output += `  - Stub exists at: \`${stubPath}\`\n`;
              output += `  - Content needed at: \`${targetPath}\`\n`;
            } else {
              output += `  - Expected: \`${targetPath}\` or \`${sharedFile}\`\n`;
            }
            output += `  - [Create on GitHub](${githubNewUrl})\n`;
            output += `  - Local: \`${localPath}\`\n`;

            // Generate a documentation template
            const helpText = await this.extractCommandHelp(content, command);
            const docTemplate = await this.generateDocumentationTemplate(
              command,
              helpText
            );

            // Save the patch file
            const patchFileName = `${command.replace(/ /g, '-')}.md`;
            const patchFile = join(patchDir, patchFileName);
            await fs.writeFile(patchFile, docTemplate);

            output += `  - **Template generated**: \`${patchFile}\`\n`;

            missingDocs.push({ command, file: targetPath, patchFile });
            missingCount++;
          }
        }
      }
    }

    if (missingCount === 0) {
      output += 'No missing documentation files detected.\n';
    } else {
      output += '\n### Quick Actions\n\n';
      output +=
        'Copy and paste these commands to create missing documentation:\n\n';
      output += '```bash\n';
      for (const doc of missingDocs) {
        const relativePatch = join(
          'helper-scripts/output/cli-audit/patches',
          product,
          `${doc.command.replace(/ /g, '-')}.md`
        );
        output += `# Create ${doc.command} documentation\n`;
        output += `mkdir -p $(dirname ${doc.file})\n`;
        output += `cp ${relativePatch} ${doc.file}\n\n`;
      }
      output += '```\n';
    }

    output += '\n';

    // Check for outdated options in existing docs
    output += '## Existing Documentation Review\n\n';

    // Parse the CLI help first to populate commandOptionsMap
    const parsedFile = join(
      this.outputDir,
      `parsed-cli-${product}-${this.version}.md`
    );
    await this.parseCLIHelp(cliFile, parsedFile);

    // For each command, check whether documentation exists and compare content
    const existingDocs = [];
    for (const [command, expectedFile] of Object.entries(commandToFile)) {
      const productFile = join(docsPath, expectedFile);
      const sharedFile = join(sharedPath, expectedFile);

      let docFile = null;
      let actualContentFile = null;

      // Find the documentation file
      if (await this.fileExists(productFile)) {
        docFile = productFile;
        // Check whether it is a stub with a source field
        const actualPath = await this.getActualContentPath(productFile);
        actualContentFile = actualPath
          ? join(projectRoot, actualPath)
          : join(projectRoot, productFile);
      } else if (await this.fileExists(sharedFile)) {
        docFile = sharedFile;
        actualContentFile = join(projectRoot, sharedFile);
      }

      if (docFile && (await this.fileExists(actualContentFile))) {
        const githubEditUrl = `${githubBase}/${docFile}`;
        const localPath = join(projectRoot, docFile);
        const vscodeUrl = `${vscodeBase}/${localPath}`;

        // Get the CLI options for this command
        const cliOptions = this.commandOptionsMap[`influxdb3 ${command}`] || [];

        // Parse the documentation content to find documented options
        const documentedOptions =
          await this.parseDocumentedOptions(actualContentFile);

        // Find missing options (in the CLI but not in the docs)
        const missingOptions = cliOptions.filter(
          (opt) => !documentedOptions.includes(opt)
        );

        // Find extra options (in the docs but not in the CLI)
        const extraOptions = documentedOptions.filter(
          (opt) => !cliOptions.includes(opt)
        );

        existingDocs.push({
          command,
          file: docFile,
          actualContentFile: actualContentFile.replace(
            join(projectRoot, ''),
            ''
          ),
          githubUrl: githubEditUrl,
          localPath,
          vscodeUrl,
          cliOptions,
          documentedOptions,
          missingOptions,
          extraOptions,
        });
      }
    }

    if (existingDocs.length > 0) {
      output += 'Review these existing documentation files for accuracy:\n\n';

      for (const doc of existingDocs) {
        output += `### \`influxdb3 ${doc.command}\`\n`;
        output += `- **File**: \`${doc.file}\`\n`;
        if (doc.actualContentFile !== doc.file) {
          output += `- **Content**: \`${doc.actualContentFile}\`\n`;
        }
        output += `- [Edit on GitHub](${doc.githubUrl})\n`;
        output += `- [Open in VS Code](${doc.vscodeUrl})\n`;
        output += `- **Local**: \`${doc.localPath}\`\n`;

        // Show the option analysis
        if (doc.missingOptions.length > 0) {
          output += `- **⚠️ Missing from docs** (${doc.missingOptions.length} options):\n`;
          for (const option of doc.missingOptions.sort()) {
            output += `  - \`${option}\`\n`;
          }
        }

        if (doc.extraOptions.length > 0) {
          output += `- **ℹ️ Documented but not in CLI** (${doc.extraOptions.length} options):\n`;
          for (const option of doc.extraOptions.sort()) {
            output += `  - \`${option}\`\n`;
          }
        }

        if (doc.missingOptions.length === 0 && doc.extraOptions.length === 0) {
          output += `- **✅ Options match** (${doc.cliOptions.length} options)\n`;
        }

        if (doc.cliOptions.length > 0) {
          output += `- **All CLI Options** (${doc.cliOptions.length}):\n`;
          const uniqueOptions = [...new Set(doc.cliOptions)].sort();
          for (const option of uniqueOptions) {
            const status = doc.missingOptions.includes(option) ? '❌' : '✅';
            output += `  - ${status} \`${option}\`\n`;
          }
        }
        output += '\n';
      }
    }

    output += '\n## Summary\n';
    output += `- Missing documentation files: ${missingCount}\n`;
    output += `- Existing documentation files: ${existingDocs.length}\n`;
    output += `- Generated templates: ${missingCount}\n`;
    output += '- Options are grouped by command for easier review\n\n';

    output += '## Automation Suggestions\n\n';
    output +=
      '1. **Use generated templates**: Check the `patches` directory for pre-filled documentation templates\n';
    output +=
      '2. **Batch creation**: Use the shell commands above to quickly create all missing files\n';
    output +=
      '3. **CI integration**: Add this audit to your CI pipeline to catch missing docs early\n';
    output +=
      '4. **Auto-PR**: Create a GitHub Action that runs this audit and opens PRs for missing docs\n\n';

    await fs.writeFile(auditFile, output);
    console.log(`📄 Audit complete: ${auditFile}`);

    if (missingCount > 0) {
      console.log(
        `📝 Generated ${missingCount} documentation templates in: ${patchDir}`
      );
    }
  }

  async run() {
    console.log(
      `${Colors.BLUE}🔍 InfluxDB 3 CLI Documentation Audit${Colors.NC}`
    );
    console.log('=======================================');
    console.log(`Product: ${this.product}`);
    console.log(`Version: ${this.version}`);
    console.log();

    // Ensure the output directory exists
    await this.ensureDir(this.outputDir);

    if (!['core', 'enterprise', 'both'].includes(this.product)) {
      console.error(`Error: Invalid product '${this.product}'`);
      console.error(
        'Usage: node audit-cli-documentation.js [core|enterprise|both] [version]'
      );
      process.exit(1);
    }

    // Run the same extract-then-audit flow for each requested product
    const products =
      this.product === 'both' ? ['core', 'enterprise'] : [this.product];

    for (const product of products) {
      const cliFile = join(
        this.outputDir,
        `current-cli-${product}-${this.version}.txt`
      );
      const auditFile = join(
        this.outputDir,
        `documentation-audit-${product}-${this.version}.md`
      );

      if (await this.extractCurrentCLI(product, cliFile)) {
        await this.auditDocs(product, cliFile, auditFile);
      }
    }

    console.log();
    console.log(
      `${Colors.GREEN}✅ CLI documentation audit complete!${Colors.NC}`
    );
    console.log();
    console.log('Next steps:');
    console.log(`1. Review the audit reports in: ${this.outputDir}`);
    console.log('2. Update missing documentation files');
    console.log('3. Verify options match current CLI behavior');
    console.log('4. Update examples and usage patterns');
  }
}

|
||||
// Main execution
|
||||
async function main() {
|
||||
const args = process.argv.slice(2);
|
||||
const product = args[0] || 'both';
|
||||
const version = args[1] || 'local';
|
||||
|
||||
// Validate product
|
||||
if (!['core', 'enterprise', 'both'].includes(product)) {
|
||||
console.error(`Error: Invalid product '${product}'`);
|
||||
console.error(
|
||||
'Usage: node audit-cli-documentation.js [core|enterprise|both] [version]'
|
||||
);
|
||||
console.error('Example: node audit-cli-documentation.js core 3.2.0');
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
// Validate version tag
|
||||
try {
|
||||
const repoRoot = await getRepositoryRoot();
|
||||
await validateVersionInputs(version, null, repoRoot);
|
||||
} catch (error) {
|
||||
console.error(`Version validation failed: ${error.message}`);
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
const auditor = new CLIDocAuditor(product, version);
|
||||
await auditor.run();
|
||||
}
|
||||
|
||||
// Run if called directly
|
||||
if (import.meta.url === `file://${process.argv[1]}`) {
|
||||
main().catch((err) => {
|
||||
console.error('Error:', err);
|
||||
process.exit(1);
|
||||
});
|
||||
}
|
||||
|
||||
export { CLIDocAuditor };
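The option parsing in `parseCLIHelp` hinges on two regular expressions (one for the long option, one for the `-X,` short form) plus two `replace` calls that strip the option tokens to leave the description. As a quick sanity check, here is a standalone sketch that applies the same patterns to a made-up help line — the sample line and option names are illustrative, not taken from actual `influxdb3` output:

```javascript
// Hypothetical CLI help line in the "  -d, --database ..." shape parseCLIHelp expects
const line = '  -d, --database <DATABASE>  The database to operate on';

// Same patterns as parseCLIHelp: long option, then "-X," short form
const optionMatch = line.match(/--[a-z][a-z0-9-]*/);
const shortMatch = line.match(/\s-[a-zA-Z],/);

const option = optionMatch ? optionMatch[0] : null;
const shortOption = shortMatch ? shortMatch[0].replace(/[,\s]/g, '') : null;

// The description is what remains after stripping both option tokens
let description = line.replace(/^\s*-[^\s]*\s*/, '');
description = description.replace(/^\s*--[^\s]*\s*/, '').trim();

console.log(option, shortOption, description);
```

Note that any value placeholder (such as `<DATABASE>` here) is left at the front of the extracted description, since the strip patterns only remove the option tokens themselves.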

@@ -1,164 +0,0 @@
#!/bin/bash
# Set up authentication tokens for InfluxDB 3 Core and Enterprise containers
# Usage: ./setup-auth-tokens.sh [core|enterprise|both]

set -e

# Color codes
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m'

# Parse arguments
TARGET=${1:-both}

echo -e "${BLUE}🔐 InfluxDB 3 Authentication Setup${NC}"
echo "=================================="
echo ""

# Check for and load existing secret files
SECRET_CORE_FILE="$HOME/.env.influxdb3-core-admin-token"
SECRET_ENT_FILE="$HOME/.env.influxdb3-enterprise-admin-token"

if [ -f "$SECRET_CORE_FILE" ]; then
  echo "✅ Found existing Core token secret file"
else
  echo "📝 Creating new Core token secret file: $SECRET_CORE_FILE"
  touch "$SECRET_CORE_FILE"
fi

if [ -f "$SECRET_ENT_FILE" ]; then
  echo "✅ Found existing Enterprise token secret file"
else
  echo "📝 Creating new Enterprise token secret file: $SECRET_ENT_FILE"
  touch "$SECRET_ENT_FILE"
fi

echo ""

# Set up authentication for a product
setup_auth() {
  local product=$1
  local container_name="influxdb3-${product}"
  local port
  local secret_file

  case "$product" in
    "core")
      port="8282"
      secret_file="$SECRET_CORE_FILE"
      ;;
    "enterprise")
      port="8181"
      secret_file="$SECRET_ENT_FILE"
      ;;
  esac

  echo -e "${BLUE}Setting up $(echo ${product} | awk '{print toupper(substr($0,1,1)) tolower(substr($0,2))}') authentication...${NC}"

  # Check whether a token already exists in the secret file
  if [ -s "$secret_file" ]; then
    local existing_token=$(cat "$secret_file")
    echo "✅ Token already exists in secret file"
    echo "   Token: ${existing_token:0:20}..."

    # Test whether the existing token works
    echo -n "🧪 Testing existing token..."
    if docker exec "${container_name}" influxdb3 show databases --token "${existing_token}" --host "http://localhost:${port}" > /dev/null 2>&1; then
      echo -e " ${GREEN}✓ Working${NC}"
      return 0
    else
      echo -e " ${YELLOW}⚠ Not working, will create new token${NC}"
    fi
  fi

  # Check whether the container is running
  if ! docker ps --format '{{.Names}}' | grep -q "^${container_name}$"; then
    echo "🚀 Starting ${container_name} container..."
    if ! docker compose up -d "${container_name}"; then
      echo -e "${RED}❌ Failed to start container${NC}"
      return 1
    fi

    echo -n "⏳ Waiting for container to be ready..."
    sleep 5
    echo -e " ${GREEN}✓${NC}"
  else
    echo "✅ Container ${container_name} is running"
  fi

  # Create an admin token
  echo "🔑 Creating admin token..."

  local token_output
  if token_output=$(docker exec "${container_name}" influxdb3 create token --admin 2>&1); then
    # Extract the token from the "Token: " line
    local new_token=$(echo "$token_output" | grep "^Token: " | sed 's/^Token: //' | tr -d '\r\n')

    echo -e "✅ ${GREEN}Token created successfully!${NC}"
    echo "   Token: ${new_token:0:20}..."

    # Update the secret file
    echo "${new_token}" > "$secret_file"

    echo "📝 Updated secret file: $secret_file"

    # Test the new token
    echo -n "🧪 Testing new token..."
    if docker exec "${container_name}" influxdb3 show databases --token "${new_token}" --host "http://localhost:${port}" > /dev/null 2>&1; then
      echo -e " ${GREEN}✓ Working${NC}"
    else
      echo -e " ${YELLOW}⚠ Test failed, but token was created${NC}"
    fi

  else
    echo -e "${RED}❌ Failed to create token${NC}"
    echo "Error output: $token_output"
    return 1
  fi

  echo ""
}

# Main execution
case "$TARGET" in
  "core")
    setup_auth "core"
    ;;
  "enterprise")
    setup_auth "enterprise"
    ;;
  "both")
    setup_auth "core"
    setup_auth "enterprise"
    ;;
  *)
    echo "Usage: $0 [core|enterprise|both]"
    exit 1
    ;;
esac

echo -e "${GREEN}🎉 Authentication setup complete!${NC}"
echo ""
echo "📋 Next steps:"
echo "1. Restart containers to load new secrets:"
echo "   docker compose down && docker compose up -d influxdb3-core influxdb3-enterprise"
echo "2. Test CLI commands with authentication:"
echo "   ./detect-cli-changes.sh core 3.1.0 local"
echo "   ./detect-cli-changes.sh enterprise 3.1.0 local"
echo ""
echo "📄 Your secret files now contain:"

# Show the Core token
if [ -f "$SECRET_CORE_FILE" ] && [ -s "$SECRET_CORE_FILE" ]; then
  token_preview=$(head -c 20 "$SECRET_CORE_FILE")
  echo "  $SECRET_CORE_FILE: ${token_preview}..."
fi

# Show the Enterprise token
if [ -f "$SECRET_ENT_FILE" ] && [ -s "$SECRET_ENT_FILE" ]; then
  token_preview=$(head -c 20 "$SECRET_ENT_FILE")
  echo "  $SECRET_ENT_FILE: ${token_preview}..."
fi
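The token-extraction pipeline used in `setup_auth` (grep the `Token:` line, strip the prefix with sed, drop stray line endings with tr) can be exercised on its own against sample output. The sample token value below is fabricated for illustration and is not a real `influxdb3` token:

```shell
# Hypothetical sample of `influxdb3 create token --admin` output
token_output='New token created successfully!
Token: apiv3_0123456789abcdef'

# Same extraction pipeline as in setup_auth
new_token=$(echo "$token_output" | grep "^Token: " | sed 's/^Token: //' | tr -d '\r\n')
echo "$new_token"
```

The `tr -d '\r\n'` step matters when the command runs through `docker exec`, where carriage returns can leak into captured output.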
@@ -55,12 +55,7 @@
     "test:codeblocks:v2": "docker compose run --rm --name v2-pytest v2-pytest",
     "test:codeblocks:stop-monitors": "./test/scripts/monitor-tests.sh stop cloud-dedicated-pytest && ./test/scripts/monitor-tests.sh stop clustered-pytest",
     "test:e2e": "node cypress/support/run-e2e-specs.js",
-    "test:shortcode-examples": "node cypress/support/run-e2e-specs.js --spec \"cypress/e2e/content/article-links.cy.js\" content/example.md",
-    "audit:cli": "node ./helper-scripts/influxdb3-monolith/audit-cli-documentation.js both local",
-    "audit:cli:3core": "node ./helper-scripts/influxdb3-monolith/audit-cli-documentation.js core local",
-    "audit:cli:3ent": "node ./helper-scripts/influxdb3-monolith/audit-cli-documentation.js enterprise local",
-    "audit:cli:apply": "node ./helper-scripts/influxdb3-monolith/apply-cli-patches.js both",
-    "audit:cli:apply:dry": "node ./helper-scripts/influxdb3-monolith/apply-cli-patches.js both --dry-run"
+    "test:shortcode-examples": "node cypress/support/run-e2e-specs.js --spec \"cypress/e2e/content/article-links.cy.js\" content/example.md"
   },
   "type": "module",
   "browserslist": [