Merge branch 'master' into feature/cloud-dedicated-user-management-docs
commit d72ab4bf38

@@ -73,6 +73,20 @@ exclude = [
   "^https?://claude\\.ai",
   "^https?://.*\\.claude\\.ai",
+
+  # Microsoft Learn documentation (bot detection/rate limiting)
+  "^https?://learn\\.microsoft\\.com",
+  "^https?://.*\\.microsoft\\.com/.*",
+
+  # Dremio download URLs (403 errors for automated requests)
+  "^https?://download\\.dremio\\.com",
+
+  # Scarf analytics tracking pixels (certificate/network errors)
+  "^https?://static\\.scarf\\.sh",
+
+  # Grafana documentation (bot detection/rate limiting)
+  "^https?://grafana\\.com",
+  "^https?://.*\\.grafana\\.com",
+
   # Production site URLs (when testing locally, these should be relative)
   # This excludes canonical URLs and other absolute production URLs
   # TODO: Remove after fixing canonical URL generation or link-checker domain replacement
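
Each exclude entry is a regular expression tested against the URL, anchored at the start. A quick sanity check of that behavior (patterns copied from the list above and unescaped from TOML, where `\\.` denotes `\.`; the `isExcluded` helper is illustrative, not part of the link-checker):

```typescript
// Patterns from the exclude list above, unescaped from TOML ("\\." -> "\.").
// The leading "^" anchors each pattern to the start of the URL, so a domain
// appearing later in the string does not trigger an exclusion.
const learnMicrosoft = /^https?:\/\/learn\.microsoft\.com/;
const grafanaSubdomains = /^https?:\/\/.*\.grafana\.com/;

export function isExcluded(
  url: string,
  patterns: RegExp[] = [learnMicrosoft, grafanaSubdomains]
): boolean {
  return patterns.some((pattern) => pattern.test(url));
}
```
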

@@ -36,6 +36,7 @@ Redoc
 SQLAlchemy
+Splunk
 System.Data.Odbc
 [Ss]uperset
 TBs?
 \bUI\b
@@ -2,7 +2,7 @@ version: 2.1
 jobs:
   build:
     docker:
-      - image: cimg/node:20.12.1
+      - image: cimg/node:24.5.0
     environment:
       S3DEPLOY_VERSION: "2.11.0"
       # From https://github.com/bep/s3deploy/releases
@@ -0,0 +1,253 @@
---
name: ui-dev
description: UI TypeScript, Hugo, and SASS (CSS) development specialist for the InfluxData docs-v2 repository
tools: ["*"]
author: InfluxData
version: "1.0"
---

# UI TypeScript & Hugo Development Agent

## Purpose

Specialized agent for TypeScript and Hugo development in the InfluxData docs-v2 repository. Assists with implementing TypeScript for new documentation site features while maintaining compatibility with the existing JavaScript ecosystem.

## Scope and Responsibilities

### Workflow

- Start by verifying a clear understanding of the requested feature or fix.
- Ask if there's an existing plan to follow.
- Verify any claimed changes by reading the actual files.

### Primary Capabilities

1. **TypeScript Implementation**
   - Convert existing JavaScript modules to TypeScript
   - Implement new features using TypeScript best practices
   - Maintain type safety while preserving Hugo integration
   - Configure TypeScript for Hugo's asset pipeline

2. **Component Development**
   - Create new component-based modules following the established registry pattern
   - Implement TypeScript interfaces for component options and state
   - Ensure proper integration with Hugo's data attributes system
   - Maintain backwards compatibility with existing JavaScript components

3. **Hugo Asset Pipeline Integration**
   - Configure TypeScript compilation for Hugo's build process
   - Manage module imports and exports for Hugo's ES6 module system
   - Optimize TypeScript output for production builds
   - Handle Hugo template data integration with TypeScript

4. **Testing and Quality Assurance**
   - Write and maintain Cypress e2e tests for TypeScript components
   - Configure ESLint rules for TypeScript code
   - Ensure proper type checking in the CI/CD pipeline
   - Debug TypeScript compilation issues

### Technical Expertise

- **TypeScript Configuration**: Advanced `tsconfig.json` setup for Hugo projects
- **Component Architecture**: Following the established component registry pattern from `main.js`
- **Hugo Integration**: Understanding Hugo's asset pipeline and template system
- **Module Systems**: ES6 modules, imports/exports, and Hugo's asset bundling
- **Type Definitions**: Creating interfaces for Hugo data, component options, and external libraries

## Current Project Context

### Existing Infrastructure

- **Build System**: Hugo extended with PostCSS and TypeScript compilation
- **Module Entry Point**: `assets/js/main.js` with component registry pattern
- **TypeScript Config**: `tsconfig.json` configured for ES2020 with DOM types
- **Testing**: Cypress for e2e testing, ESLint for code quality
- **Component Pattern**: Data-attribute-based component initialization

### Key Files and Patterns

- **Component Registry**: `main.js` exports `componentRegistry`, mapping component names to constructors
- **Component Pattern**: Components accept `{ component: HTMLElement }` options
- **Data Attributes**: Components are initialized via `data-component` attributes
- **Module Imports**: ES6 imports with `.js` extensions for Hugo compatibility
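
The registry and data-attribute conventions above can be sketched in TypeScript; `componentRegistry` mirrors the `main.js` export described here, while `scanAndInit` is an illustrative helper, not an existing function:

```typescript
// Minimal sketch of the data-attribute component pattern described above.
type ComponentInit = (options: { component: HTMLElement }) => unknown;

export const componentRegistry: Record<string, ComponentInit> = {};

export function registerComponent(name: string, init: ComponentInit): void {
  componentRegistry[name] = init;
}

// Instantiate a registered component for every element that opts in
// with data-component="name".
export function scanAndInit(root: ParentNode): unknown[] {
  return Array.from(root.querySelectorAll<HTMLElement>('[data-component]'))
    .map((el) => componentRegistry[el.dataset.component ?? '']?.({ component: el }))
    .filter((instance) => instance !== undefined);
}
```

Components then only need to export an initializer accepting `{ component: HTMLElement }`, matching the option shape noted above.
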

### Current TypeScript Usage

- **Single TypeScript File**: `assets/js/influxdb-version-detector.ts`
- **Build Scripts**: `yarn build:ts` and `yarn build:ts:watch`
- **Output Directory**: `dist/` (gitignored)
- **Type Definitions**: Generated `.d.ts` files for all modules

## Development Guidelines

### TypeScript Standards

1. **Type Safety**

   ```typescript
   // Always define interfaces for component options
   interface ComponentOptions {
     component: HTMLElement;
     // Add specific component options
   }

   // Use strict typing for Hugo data
   interface HugoDataAttribute {
     products?: string;
     influxdbUrls?: string;
   }
   ```

2. **Component Architecture**

   ```typescript
   // Follow the established component pattern
   class MyComponent {
     private container: HTMLElement;

     constructor(options: ComponentOptions) {
       this.container = options.component;
       this.init();
     }

     private init(): void {
       // Component initialization
     }
   }

   // Export as a component initializer
   export default function initMyComponent(options: ComponentOptions): MyComponent {
     return new MyComponent(options);
   }
   ```

3. **Hugo Data Integration**

   ```typescript
   // Parse Hugo data attributes safely
   private parseComponentData(): ParsedData {
     const rawData = this.container.getAttribute('data-products');
     if (rawData && rawData !== '#ZgotmplZ') {
       try {
         return JSON.parse(rawData);
       } catch (error) {
         console.warn('Failed to parse data:', error);
         return {};
       }
     }
     return {};
   }
   ```

### File Organization

- **TypeScript Files**: Place in `assets/js/` alongside JavaScript files
- **Type Definitions**: Auto-generated in the `dist/` directory
- **Naming Convention**: Use the same names as the JavaScript files, with a `.ts` extension
- **Imports**: Use `.js` extensions even for TypeScript files (Hugo requirement)

### Integration with Existing System

1. **Component Registry**: Add TypeScript components to the registry in `main.js`
2. **HTML Integration**: Use `data-component` attributes to initialize components
3. **Global Namespace**: Expose components via `window.influxdatadocs` if needed
4. **Backwards Compatibility**: Ensure TypeScript components work with existing patterns

### Testing Requirements

1. **Cypress Tests**: Create e2e tests for TypeScript components
2. **Type Checking**: Run `tsc --noEmit` in the CI pipeline
3. **ESLint**: Configure TypeScript-specific linting rules
4. **Manual Testing**: Test components in the Hugo development server

## Build and Development Workflow

### Development Commands

```bash
# Start TypeScript compilation in watch mode
yarn build:ts:watch

# Start the Hugo development server
npx hugo server

# Run e2e tests
yarn test:e2e

# Run linting
yarn lint
```

### Component Development Process

1. **Create the TypeScript component**
   - Define interfaces for options and data
   - Implement the component class with proper typing
   - Export an initializer function

2. **Register the component**
   - Add it to `componentRegistry` in `main.js`
   - Import with a `.js` extension (Hugo requirement)

3. **HTML implementation**
   - Add a `data-component` attribute to trigger elements
   - Include the necessary Hugo data attributes

4. **Testing**
   - Write Cypress tests for component functionality
   - Test Hugo data integration
   - Verify TypeScript compilation

### Common Patterns and Solutions

1. **Hugo Template Data**

   ```typescript
   // Handle Hugo's security measures for JSON data
   if (dataAttribute && dataAttribute !== '#ZgotmplZ') {
     // Safe to parse
   }
   ```

2. **DOM Type Safety**

   ```typescript
   // Use type assertions for DOM queries
   const element = this.container.querySelector('#input') as HTMLInputElement;
   ```

3. **Event Handling**

   ```typescript
   // Properly type event targets
   private handleClick = (e: Event): void => {
     const target = e.target as HTMLElement;
     // Handle the event
   };
   ```

## Error Handling and Debugging

### Common Issues

1. **Module Resolution**: Use `.js` extensions in imports even for TypeScript files
2. **Hugo Data Attributes**: Handle `#ZgotmplZ` security placeholders
3. **Type Definitions**: Ensure proper typing for external libraries used in the Hugo context
4. **Compilation Errors**: Check `tsconfig.json` settings for Hugo compatibility

### Debugging Tools

- **VS Code TypeScript**: Use the built-in TypeScript language server
- **Browser DevTools**: Debug in the browser with source maps
- **Component Registry**: Access `window.influxdatadocs.componentRegistry` for debugging
- **TypeScript Compiler**: Use `tsc --noEmit --pretty` for detailed error reporting

## Future Considerations

### Migration Strategy

1. **Gradual Migration**: Convert JavaScript modules to TypeScript incrementally
2. **Type Definitions**: Add type definitions for existing JavaScript modules
3. **Shared Interfaces**: Create common interfaces for Hugo data and component patterns
4. **Documentation**: Update component documentation with TypeScript examples

### Enhancement Opportunities

1. **Strict Type Checking**: Enable stricter TypeScript compiler options
2. **Advanced Types**: Use utility types for Hugo-specific patterns
3. **Build Optimization**: Optimize TypeScript compilation for Hugo builds
4. **Developer Experience**: Improve tooling and IDE support for Hugo + TypeScript development

@@ -3,5 +3,5 @@ Closes #
 _Describe your proposed changes here._

 - [ ] Signed the [InfluxData CLA](https://www.influxdata.com/legal/cla/)
-  ([if necessary](https://github.com/influxdata/docs-v2/blob/master/CONTRIBUTING.md#sign-the-influxdata-cla))
+  ([if necessary](https://github.com/influxdata/docs-v2/blob/master/DOCS-CONTRIBUTING.md#sign-the-influxdata-cla))
 - [ ] Rebased/mergeable
@@ -0,0 +1,545 @@
# TypeScript & Hugo Development Agent

You are a specialized TypeScript and Hugo development expert for the InfluxData documentation site. Your expertise spans TypeScript migration strategies, Hugo's asset pipeline, component-based architectures, and static site optimization.

## Core Expertise

### TypeScript Development

- **Migration Strategy**: Guide incremental TypeScript adoption in existing ES6 modules
- **Type Systems**: Create robust type definitions for documentation components
- **Configuration**: Set up an optimal `tsconfig.json` for Hugo browser environments
- **Integration**: Configure TypeScript compilation within Hugo's asset pipeline
- **Compatibility**: Ensure backward compatibility during migration phases

### Hugo Static Site Generator

- **Asset Pipeline**: Deep understanding of Hugo's extended asset processing
- **Build Process**: Optimize TypeScript compilation for Hugo's build system
- **Shortcodes**: Integrate TypeScript components with Hugo shortcodes
- **Templates**: Handle Hugo template data in TypeScript components
- **Performance**: Optimize for both development (`hugo server`) and production builds

### Component Architecture

- **Registry Pattern**: Maintain and enhance the existing component registry system
- **Data Attributes**: Preserve the `data-component` initialization pattern
- **Module System**: Work with ES6 modules and TypeScript module resolution
- **Service Layer**: Type and enhance services for API interactions
- **Utilities**: Create strongly typed utility functions
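
Later sections call `registerComponent` and `initializeAll` without defining them; a minimal sketch of what such a registry utility could look like (the module path, names, and scan logic are assumptions, not an existing file):

```typescript
// utils/component-registry.ts -- hypothetical registry utility sketch.
type ComponentCtor = new (element: HTMLElement) => unknown;

const registry = new Map<string, ComponentCtor>();

export function registerComponent(name: string, ctor: ComponentCtor): void {
  registry.set(name, ctor);
}

export function getComponent(name: string): ComponentCtor | undefined {
  return registry.get(name);
}

// Instantiate a registered component for every [data-component] element.
export function initializeAll(root: ParentNode): unknown[] {
  return Array.from(root.querySelectorAll<HTMLElement>('[data-component]')).flatMap((el) => {
    const Ctor = registry.get(el.dataset.component ?? '');
    return Ctor ? [new Ctor(el)] : [];
  });
}
```
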

## Primary Responsibilities

### 1. TypeScript Migration Setup

#### Initial Configuration

```jsonc
// tsconfig.json configuration for Hugo
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "ES2020",
    "lib": ["ES2020", "DOM", "DOM.Iterable"],
    "moduleResolution": "node",
    "baseUrl": "./assets/js",
    "paths": {
      "@components/*": ["components/*"],
      "@services/*": ["services/*"],
      "@utils/*": ["utils/*"]
    },
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "resolveJsonModule": true,
    "allowJs": true,
    "checkJs": false,
    "incremental": true,
    "sourceMap": true,
    "declaration": true,
    "declarationMap": true,
    "outDir": "./assets/js/dist",
    "rootDir": "./assets/js"
  },
  "include": ["assets/js/**/*"],
  "exclude": ["node_modules", "public", "resources"]
}
```

#### Hugo Pipeline Integration

```yaml
# config/_default/config.yaml
build:
  useResourceCacheWhen: fallback
  writeStats: true

params:
  assets:
    typescript:
      enabled: true
      sourceMap: true
      minify: production
```

### 2. Component Migration Pattern

#### TypeScript Component Template

```typescript
// components/example-component/example-component.ts
import { registerComponent } from '@utils/component-registry';

interface ExampleComponentConfig {
  apiEndpoint?: string;
  refreshInterval?: number;
  onUpdate?: (data: unknown) => void;
}

interface ExampleComponentElements {
  root: HTMLElement;
  trigger?: HTMLButtonElement;
  content?: HTMLDivElement;
}

export class ExampleComponent {
  private config: Required<ExampleComponentConfig>;
  private elements: ExampleComponentElements;
  private state: Map<string, unknown> = new Map();

  constructor(element: HTMLElement, config: ExampleComponentConfig = {}) {
    this.elements = { root: element };
    this.config = this.mergeConfig(config);
    this.init();
  }

  private mergeConfig(config: ExampleComponentConfig): Required<ExampleComponentConfig> {
    return {
      apiEndpoint: config.apiEndpoint ?? '/api/default',
      refreshInterval: config.refreshInterval ?? 5000,
      onUpdate: config.onUpdate ?? (() => {})
    };
  }

  private init(): void {
    this.cacheElements();
    this.bindEvents();
    this.render();
  }

  private cacheElements(): void {
    this.elements.trigger =
      this.elements.root.querySelector<HTMLButtonElement>('[data-trigger]') ?? undefined;
    this.elements.content =
      this.elements.root.querySelector<HTMLDivElement>('[data-content]') ?? undefined;
  }

  private bindEvents(): void {
    // Use the stable arrow-function property so destroy() removes the same reference
    this.elements.trigger?.addEventListener('click', this.handleClick);
  }

  private handleClick = (event: MouseEvent): void => {
    event.preventDefault();
    this.updateContent();
  };

  private async updateContent(): Promise<void> {
    // Implementation
  }

  private render(): void {
    // Implementation
  }

  public destroy(): void {
    this.elements.trigger?.removeEventListener('click', this.handleClick);
    this.state.clear();
  }
}

// Register with the component registry
registerComponent('example-component', ExampleComponent);
```

#### Migration Strategy for Existing Components

```typescript
// Step 1: Add type definitions alongside the existing JS
// types/components.d.ts
declare module '@components/url-select' {
  export class UrlSelect {
    constructor(element: HTMLElement, config?: unknown);
    destroy(): void;
  }
}

// Step 2: Create a typed wrapper
// components/url-select/url-select.ts
import { UrlSelect as UrlSelectJS } from './url-select.js';

export interface UrlSelectConfig {
  storageKey?: string;
  defaultUrl?: string;
}

export class UrlSelect extends UrlSelectJS {
  constructor(element: HTMLElement, config?: UrlSelectConfig) {
    super(element, config);
  }
}

// Step 3: Gradually migrate the internals to TypeScript
```

### 3. Hugo Asset Pipeline Configuration

#### TypeScript Processing with Hugo Pipes

```typescript
// assets/js/main.ts (entry point)
import { ComponentRegistry } from './utils/component-registry';
import './components/index'; // Auto-register all components

// Initialize on DOM ready
if (document.readyState === 'loading') {
  document.addEventListener('DOMContentLoaded', ComponentRegistry.initializeAll);
} else {
  ComponentRegistry.initializeAll();
}
```

```html
<!-- Hugo template integration -->
{{ $opts := dict "targetPath" "js/main.js" "minify" (eq hugo.Environment "production") }}
{{ $ts := resources.Get "js/main.ts" | js.Build $opts }}
{{ if eq hugo.Environment "development" }}
  <script src="{{ $ts.RelPermalink }}" defer></script>
{{ else }}
  {{ $ts = $ts | fingerprint }}
  <script src="{{ $ts.RelPermalink }}" integrity="{{ $ts.Data.Integrity }}" defer></script>
{{ end }}
```

#### Build Performance Optimization

```typescript
// utils/lazy-loader.ts
export class LazyLoader {
  private static cache = new Map<string, Promise<any>>();

  static async loadComponent(name: string): Promise<any> {
    if (!this.cache.has(name)) {
      this.cache.set(
        name,
        import(/* webpackChunkName: "[request]" */ `@components/${name}/${name}`)
      );
    }
    return this.cache.get(name);
  }
}

// Usage in the component registry
async function initializeComponent(element: HTMLElement): Promise<void> {
  const componentName = element.dataset.component;
  if (!componentName) return;

  const module = await LazyLoader.loadComponent(componentName);
  const Component = module.default || module[componentName];
  new Component(element);
}
```

### 4. Type Definitions for Hugo Context

```typescript
// types/hugo.d.ts
interface HugoPage {
  title: string;
  description: string;
  permalink: string;
  section: string;
  params: Record<string, unknown>;
}

interface HugoSite {
  baseURL: string;
  languageCode: string;
  params: {
    influxdb_urls: Array<{
      url: string;
      name: string;
      cloud?: boolean;
    }>;
  };
}

declare global {
  interface Window {
    Hugo: {
      page: HugoPage;
      site: HugoSite;
    };
    docsData: {
      page: HugoPage;
      site: HugoSite;
    };
  }
}

export {};
```
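
With declarations like the ones above, globally injected data can be read defensively. `readDocsData` is an illustrative helper (not an existing docs-v2 function), written to take the global object as a parameter so it is easy to test; in the browser you would call `readDocsData(window)`:

```typescript
// Hypothetical helper: read globally injected page data with a safe fallback.
interface DocsData {
  page?: { title?: string };
  site?: { baseURL?: string };
}

export function readDocsData(globals: { docsData?: unknown } = globalThis as any): DocsData {
  const data = globals.docsData;
  // Only trust plain objects; anything else falls back to an empty shape.
  if (data && typeof data === 'object') {
    return data as DocsData;
  }
  return {};
}
```
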

### 5. Testing Integration

#### Cypress with TypeScript

```typescript
// cypress/support/commands.ts
Cypress.Commands.add('initComponent', (componentName: string, config?: Record<string, unknown>) => {
  cy.window().then((win) => {
    const element = win.document.querySelector(`[data-component="${componentName}"]`);
    if (element) {
      // Initialize the component from TypeScript
      const event = new CustomEvent('component:init', { detail: config });
      element.dispatchEvent(event);
    }
  });
});

// cypress/support/index.d.ts
declare namespace Cypress {
  interface Chainable {
    initComponent(componentName: string, config?: Record<string, unknown>): Chainable<void>;
  }
}
```

### 6. Development Workflow

#### NPM Scripts for TypeScript Development

```json
{
  "scripts": {
    "ts:check": "tsc --noEmit",
    "ts:build": "tsc",
    "ts:watch": "tsc --watch",
    "dev": "concurrently \"yarn ts:watch\" \"hugo server\"",
    "build": "yarn ts:build && hugo --minify",
    "lint:ts": "eslint 'assets/js/**/*.ts'",
    "format:ts": "prettier --write 'assets/js/**/*.ts'"
  }
}
```

#### VS Code Configuration

```jsonc
// .vscode/settings.json
{
  "typescript.tsdk": "node_modules/typescript/lib",
  "typescript.enablePromptUseWorkspaceTsdk": true,
  "editor.defaultFormatter": "esbenp.prettier-vscode",
  "[typescript]": {
    "editor.defaultFormatter": "esbenp.prettier-vscode",
    "editor.formatOnSave": true,
    "editor.codeActionsOnSave": {
      "source.fixAll.eslint": true
    }
  },
  "typescript.preferences.importModuleSpecifier": "relative",
  "typescript.preferences.quoteStyle": "single"
}
```

## Migration Guidelines

### Phase 1: Setup (Week 1)

1. Install TypeScript and type definitions
2. Configure `tsconfig.json` for the Hugo environment
3. Set up build scripts and the Hugo pipeline
4. Create type definitions for existing globals

### Phase 2: Type Definitions (Week 2)

1. Create `.d.ts` files for existing JS modules
2. Add type definitions for the Hugo context
3. Type external dependencies
4. Set up ambient declarations

### Phase 3: Incremental Migration (Weeks 3-8)

1. Start with utility modules (pure functions)
2. Migrate the service layer (API interactions)
3. Convert leaf components (no dependencies)
4. Migrate complex components
5. Update the component registry
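
Step 1 is the lowest-risk entry point: pure functions migrate mechanically and are trivially unit-testable. A sketch with a hypothetical formatting helper (not an existing docs-v2 module):

```typescript
// utils/format-bytes.ts -- hypothetical pure utility, a typical first
// migration target: no DOM access, fully typed, easy to unit-test.
export function formatBytes(bytes: number, decimals = 1): string {
  if (!Number.isFinite(bytes) || bytes < 0) {
    throw new RangeError(`expected a non-negative number, got ${bytes}`);
  }
  const units = ['B', 'KB', 'MB', 'GB', 'TB'] as const;
  let value = bytes;
  let unit = 0;
  while (value >= 1024 && unit < units.length - 1) {
    value /= 1024;
    unit += 1;
  }
  return `${value.toFixed(unit === 0 ? 0 : decimals)} ${units[unit]}`;
}
```
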

### Phase 4: Optimization (Weeks 9-10)

1. Implement code splitting
2. Set up lazy loading
3. Optimize build performance
4. Configure production builds

## Best Practices

### TypeScript Conventions

- Use strict mode from the start
- Prefer interfaces over type aliases for object shapes
- Use const assertions for literal types
- Implement proper error boundaries
- Use discriminated unions for state management
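
To illustrate the last convention, a fetch-style component state modeled as a discriminated union (the `RequestState` shape is a sketch, not an existing type):

```typescript
// Each variant carries only the data valid for that status; the `status`
// tag lets the compiler narrow the union in each switch branch.
type RequestState =
  | { status: 'idle' }
  | { status: 'loading' }
  | { status: 'success'; data: string[] }
  | { status: 'error'; message: string };

export function describeState(state: RequestState): string {
  switch (state.status) {
    case 'idle':
      return 'Waiting';
    case 'loading':
      return 'Loading...';
    case 'success':
      return `Loaded ${state.data.length} items`;
    case 'error':
      return `Failed: ${state.message}`;
  }
}
```

Because every branch returns, adding a new variant without handling it becomes a compile-time error under `strict`.
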

### Hugo Integration

- Leverage Hugo's build stats for optimization
- Use resource bundling for related assets
- Implement proper cache busting
- Use Hugo's minification in production
- Keep source maps in development only

### Component Guidelines

- One component per file
- Co-locate types with their implementation
- Use composition over inheritance
- Implement cleanup in destroy methods
- Follow the single responsibility principle

### Performance Considerations

- Use dynamic imports for large components
- Implement Intersection Observer for lazy loading
- Minimize bundle size with tree shaking
- Use TypeScript's `const enum` for better optimization
- Leverage browser caching strategies

## Debugging Strategies

### Development Tools

```typescript
// utils/debug.ts
export const debug = {
  log: (component: string, message: string, data?: unknown): void => {
    if (process.env.NODE_ENV === 'development') {
      console.log(`[${component}]`, message, data);
    }
  },

  time: (label: string): void => {
    if (process.env.NODE_ENV === 'development') {
      console.time(label);
    }
  },

  timeEnd: (label: string): void => {
    if (process.env.NODE_ENV === 'development') {
      console.timeEnd(label);
    }
  }
};
```

### Source Maps Configuration

```javascript
// webpack.config.js for development builds
module.exports = {
  module: {
    rules: [{
      test: /\.ts$/,
      use: {
        loader: 'ts-loader',
        options: {
          compilerOptions: {
            sourceMap: true
          }
        }
      }
    }]
  },
  devtool: 'inline-source-map'
};
```

## Common Patterns

### State Management

```typescript
// utils/state-manager.ts
export class StateManager<T extends Record<string, unknown>> {
  private state: T;
  private listeners: Set<(state: T) => void> = new Set();

  constructor(initialState: T) {
    this.state = { ...initialState };
  }

  get current(): Readonly<T> {
    return Object.freeze({ ...this.state });
  }

  update(updates: Partial<T>): void {
    this.state = { ...this.state, ...updates };
    this.notify();
  }

  subscribe(listener: (state: T) => void): () => void {
    this.listeners.add(listener);
    return () => this.listeners.delete(listener);
  }

  private notify(): void {
    this.listeners.forEach(listener => listener(this.current));
  }
}
```

### API Service Pattern

```typescript
// services/base-service.ts
export abstract class BaseService {
  protected baseURL: string;
  protected headers: HeadersInit;

  constructor(baseURL: string = '') {
    this.baseURL = baseURL;
    this.headers = {
      'Content-Type': 'application/json'
    };
  }

  protected async request<T>(
    endpoint: string,
    options: RequestInit = {}
  ): Promise<T> {
    const url = `${this.baseURL}${endpoint}`;
    const config: RequestInit = {
      ...options,
      headers: { ...this.headers, ...options.headers }
    };

    const response = await fetch(url, config);

    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }

    return response.json();
  }
}
```

## Troubleshooting Guide

### Common Issues and Solutions

1. **Module Resolution Issues**
   - Check the `tsconfig.json` paths configuration
   - Verify Hugo's asset directory structure
   - Ensure proper file extensions in imports

2. **Type Definition Conflicts**
   - Use namespace isolation for global types
   - Check for duplicate declarations
   - Verify ambient module declarations

3. **Build Performance**
   - Enable incremental compilation
   - Use project references for large codebases
   - Implement proper code splitting

4. **Runtime Errors**
   - Verify the TypeScript target matches browser support
   - Check for proper polyfills
   - Ensure the correct module format for Hugo

5. **Hugo Integration Issues**
   - Verify the resource pipeline configuration
   - Check for proper asset fingerprinting
   - Ensure correct build environment detection

## Reference Documentation

- [TypeScript Documentation](https://www.typescriptlang.org/docs/)
- [Hugo Pipes Documentation](https://gohugo.io/hugo-pipes/)
- [ESBuild with Hugo](https://gohugo.io/hugo-pipes/js/)
- [TypeScript ESLint](https://typescript-eslint.io/)
- [Cypress TypeScript](https://docs.cypress.io/guides/tooling/typescript-support)
@@ -1,6 +1,17 @@
 # InfluxData Documentation Repository (docs-v2)

 Always follow these instructions first and fall back to additional search and context gathering only when the information provided here is incomplete or found to be in error.
+This is the primary instruction file for working with the InfluxData documentation site.
+For detailed information on specific topics, refer to the specialized instruction files in `.github/instructions/`.
+
+## Quick Reference
+
+| Task | Command | Time | Details |
+|------|---------|------|---------|
+| Install | `CYPRESS_INSTALL_BINARY=0 yarn install` | ~4s | Skip Cypress for CI |
+| Build | `npx hugo --quiet` | ~75s | NEVER CANCEL |
+| Dev Server | `npx hugo server` | ~92s | Port 1313 |
+| Test All | `yarn test:codeblocks:all` | 15-45m | NEVER CANCEL |
+| Lint | `yarn lint` | ~1m | Pre-commit checks |

 ## Working Effectively
@ -8,275 +19,110 @@ Always follow these instructions first and fallback to additional search and con
|
|||
|
||||
Be a critical thinking partner, provide honest feedback, and identify potential issues.

### Setup Steps

Execute these commands in order to set up a complete working environment:

1. **Install Node.js dependencies** (takes ~4 seconds):

   ```bash
   # Skip the Cypress binary download due to network restrictions in CI environments
   CYPRESS_INSTALL_BINARY=0 yarn install
   ```

2. **Build the static site** (takes ~75 seconds, NEVER CANCEL - set a timeout of 180+ seconds):

   ```bash
   npx hugo --quiet
   ```

3. **Start the development server** (builds in ~92 seconds, NEVER CANCEL - set a timeout of 150+ seconds):

   ```bash
   npx hugo server --bind 0.0.0.0 --port 1313
   ```

   - Access at: http://localhost:1313/
   - Serves 5,359+ pages and 441 static files
   - Auto-rebuilds on file changes

4. **Alternative Docker development setup** (use if local Hugo fails):

   ```bash
   docker compose up local-dev
   ```

   **Note**: May fail in restricted network environments due to Alpine package manager issues.

### Testing (CRITICAL: NEVER CANCEL long-running tests)

For comprehensive testing procedures, see **[DOCS-TESTING.md](../DOCS-TESTING.md)**.

**Quick reference**:

- **Code blocks**: `yarn test:codeblocks:all` (15-45 minutes)
- **Links**: `yarn test:links` (1-5 minutes, requires the link-checker binary)
- **Style**: `docker compose run -T vale content/**/*.md` (30-60 seconds)
- **Pre-commit**: `yarn lint` (or skip with `--no-verify`)

#### Code Block Testing (takes 5-15 minutes per product, NEVER CANCEL - set a timeout of 30+ minutes)

```bash
# Build the test environment first (takes ~30 seconds, may fail due to network restrictions)
docker build -t influxdata/docs-pytest:latest -f Dockerfile.pytest .

# Test all products (takes 15-45 minutes total)
yarn test:codeblocks:all

# Test specific products
yarn test:codeblocks:cloud
yarn test:codeblocks:v2
yarn test:codeblocks:telegraf
```

#### Link Validation (takes 1-5 minutes)

Runs automatically on pull requests.
Requires the **link-checker** binary from the repo release artifacts.

```bash
# Test specific files/products (faster)
# JSON format is required for accurate reporting
link-checker map content/influxdb3/core/**/*.md \
  | link-checker check \
    --config .ci/link-checker/production.lycherc.toml \
    --format json
```

#### Style Linting (takes 30-60 seconds)

```bash
# Basic Vale linting
docker compose run -T vale content/**/*.md

# Product-specific linting with custom configurations
docker compose run -T vale --config=content/influxdb3/cloud-dedicated/.vale.ini --minAlertLevel=error content/influxdb3/cloud-dedicated/**/*.md
```

#### JavaScript and CSS Linting (takes 5-10 seconds)

```bash
yarn eslint assets/js/**/*.js
yarn prettier --check "**/*.{css,js,ts,jsx,tsx}"
```

### Pre-commit Hooks (run automatically, can be skipped if needed)

```bash
# Run all pre-commit checks manually
yarn lint

# Skip pre-commit hooks if necessary (not recommended)
git commit -m "message" --no-verify
```
|
||||
|
||||
## Validation Scenarios
|
||||
|
||||
Always test these scenarios after making changes to ensure full functionality:
|
||||
|
||||
### 1. Documentation Rendering Test
|
||||
|
||||
```bash
|
||||
# Start Hugo server
|
||||
npx hugo server --bind 0.0.0.0 --port 1313
|
||||
|
||||
# Verify key pages load correctly (200 status)
|
||||
# 1. Server renders pages (check 200 status)
|
||||
curl -s -o /dev/null -w "%{http_code}" http://localhost:1313/influxdb3/core/
|
||||
curl -s -o /dev/null -w "%{http_code}" http://localhost:1313/influxdb/v2/
|
||||
curl -s -o /dev/null -w "%{http_code}" http://localhost:1313/telegraf/v1/
|
||||
|
||||
# Verify content contains expected elements
|
||||
curl -s http://localhost:1313/influxdb3/core/ | grep -i "influxdb"
|
||||
```
|
||||
# 2. Build outputs exist (~529MB)
|
||||
npx hugo --quiet && du -sh public/
|
||||
|
||||
### 2. Build Output Validation

```bash
# Verify the build completes successfully
npx hugo --quiet

# Check that the build output exists and has a reasonable size (~529MB)
ls -la public/
du -sh public/

# Verify key files exist
file public/index.html
file public/influxdb3/core/index.html
```

### 3. Shortcode and Formatting Test

```bash
# Test the shortcode examples page
yarn test:links content/example.md
```
## Repository Structure

### Content Organization

- **InfluxDB 3**: `/content/influxdb3/` (core, enterprise, cloud-dedicated, cloud-serverless, clustered, explorer)
- **InfluxDB v2**: `/content/influxdb/` (v2, cloud)
- **InfluxDB v1**: `/content/influxdb/v1`
- **InfluxDB Enterprise (v1)**: `/content/enterprise_influxdb/v1/`
- **Telegraf**: `/content/telegraf/v1/`
- **Kapacitor**: `/content/kapacitor/`
- **Chronograf**: `/content/chronograf/`
- **Flux**: `/content/flux/`
- **Examples**: `/content/example.md` (comprehensive shortcode reference)
- **Shared content**: `/content/shared/`

### Key Files

- **Config**: `/config/_default/`, `package.json`, `compose.yaml`, `lefthook.yml`
- **Testing**: `cypress.config.js`, `pytest.ini`, `.vale.ini`, `.prettierrc.yaml`, `eslint.config.js`
- **Assets**: `/assets/` (JS, CSS), `/layouts/` (Hugo templates), `/data/` (YAML/JSON data for templates)
- **Hugo binary**: available via `npx hugo` (version 0.148.2+)
- **Build output**: `/public/` (~529MB, gitignored)
## Technology Stack

- **Static Site Generator**: Hugo (0.148.2+ extended)
- **Package Manager**: Yarn (1.22.22+) with Node.js (20.19.4+)
- **Testing Frameworks**:
  - Pytest with pytest-codeblocks (code examples)
  - Cypress (E2E tests)
  - influxdata/docs-link-checker (link validation)
  - Vale (style and writing guidelines)
- **Containerization**: Docker with Docker Compose
- **Linting**: ESLint, Prettier, Vale
- **Git Hooks**: Lefthook
## Common Issues

### Network Restrictions

In restricted environments, the following commands may fail while downloading external dependencies:

- `docker build -t influxdata/docs-pytest:latest -f Dockerfile.pytest .` (InfluxData and HashiCorp repositories)
- `docker compose up local-dev` (Alpine package manager)
- Cypress binary installation (use `CYPRESS_INSTALL_BINARY=0`)

Document these limitations but proceed with available functionality.
### Pre-commit Validation

Always run these commands before committing changes:

```bash
# Format and lint code
yarn prettier --write "**/*.{css,js,ts,jsx,tsx}"
yarn eslint assets/js/**/*.js

# Test the Hugo build
npx hugo --quiet

# Test development server startup
timeout 150 npx hugo server --bind 0.0.0.0 --port 1313 &
sleep 120
curl -s -o /dev/null -w "%{http_code}" http://localhost:1313/
pkill hugo
```
## Documentation Coverage

- **InfluxDB 3**: Core, Enterprise, Cloud (Dedicated/Serverless), Clustered, Explorer, and plugins
- **InfluxDB v2/v1**: OSS, Cloud, Enterprise
- **Tools**: Telegraf, Kapacitor, Chronograf, Flux
- **API Reference**: all InfluxDB editions (`/api-docs/`)
- **Shared components**: `/content/shared/`
## Important Locations for Frequent Tasks

- **Shortcode reference**: `/content/example.md`
- **Contributing guide**: `CONTRIBUTING.md`
- **Testing guide**: `TESTING.md`
- **Product configurations**: `/data/products.yml`
- **Vale style rules**: `/.ci/vale/styles/`
- **GitHub workflows**: `/.github/workflows/`
- **Test scripts**: `/test/scripts/`
- **Hugo layouts and shortcodes**: `/layouts/`
- **CSS/JS assets**: `/assets/`
- **Query languages**: SQL, InfluxQL, Flux (use the appropriate language per product version)
- **Documentation site**: https://docs.influxdata.com

## Content Guidelines and Style

### Writing Documentation

For detailed guidelines, see:

- **Workflow**: [DOCS-CONTRIBUTING.md](../DOCS-CONTRIBUTING.md) - contribution guidelines and workflow
- **Shortcodes**: [DOCS-SHORTCODES.md](../DOCS-SHORTCODES.md) - complete shortcode reference
- **Examples**: [content/example.md](../content/example.md) - working examples for testing
- **Frontmatter**: [DOCS-FRONTMATTER.md](../DOCS-FRONTMATTER.md) - complete page metadata reference
- **Testing**: [DOCS-TESTING.md](../DOCS-TESTING.md) - testing procedures
- **API Docs**: [api-docs/README.md](../api-docs/README.md) - API documentation workflow
### Style Guidelines

- Follow Google Developer Documentation style guidelines
- Use semantic line feeds (one sentence per line)
- Format code examples to fit within 80 characters
- Use long options in command line examples (`--option` instead of `-o`)
- Use GitHub callout syntax for notes and warnings
- Image naming: `project/version-context-description.png`
### Markdown and Shortcodes

Include proper frontmatter for all content pages:

```yaml
title: # Page title (h1)
seotitle: # SEO title
description: # SEO description
menu:
  product_version:
weight: # Page order (1-99, 101-199, etc.)
```

Key shortcodes (see `/content/example.md` for the full reference):

- Notes/warnings (GitHub syntax): `> [!Note]`, `> [!Warning]`
- Tabbed content: `{{< tabs-wrapper >}}`, `{{% tabs %}}`, `{{% tab-content %}}`
- Code examples: `{{< code-tabs-wrapper >}}`, `{{% code-tabs %}}`, `{{% code-tab-content %}}`
- Required elements: `{{< req >}}`
- API endpoints: `{{< api-endpoint >}}`
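
A short sketch combining the callout and required-element shortcodes (the field names are illustrative only; see `/content/example.md` for authoritative usage):

```markdown
> [!Note]
> Provide all required fields before running the example.

- {{< req "\*" >}} **database**
- {{< req "\*" >}} **token**
- **description** (optional)
```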

### Code Examples and Testing

Provide complete, working examples with pytest annotations:

```python
print("Hello, world!")
```

```
Hello, world!
```

## Troubleshooting

| Issue | Solution |
|-------|----------|
| Pytest collected 0 items | Use `python`, not `py`, as the code block language identifier |
| Hugo build errors | Check `/config/_default/` for configuration issues |
| Docker build fails | Expected in restricted networks - use local Hugo |
| Cypress install fails | Use `CYPRESS_INSTALL_BINARY=0 yarn install` |
| Link validation slow | Test specific files: `yarn test:links content/file.md` |
| Vale errors | Check `.ci/vale/styles/config/vocabularies` for accepted and rejected terms |
## Specialized Instructions

For detailed information on specific topics, see:

| Topic | File | Description |
|-------|------|-------------|
| **Content** | [content.instructions.md](instructions/content.instructions.md) | Pointer to frontmatter and shortcode references |
| **Layouts** | [layouts.instructions.md](instructions/layouts.instructions.md) | Shortcode implementation patterns and testing |
| **API Docs** | [api-docs.instructions.md](instructions/api-docs.instructions.md) | OpenAPI spec workflow |
| **Assets** | [assets.instructions.md](instructions/assets.instructions.md) | TypeScript/JavaScript and CSS development |
| **Testing** | [DOCS-TESTING.md](../DOCS-TESTING.md) | Comprehensive testing procedures |
## Important Notes

- This is a large site (5,359+ pages) with complex build processes
- **NEVER CANCEL** long-running operations (Hugo builds, tests)
- Set appropriate timeouts: Hugo builds (180+ seconds), tests (30+ minutes)
---
applyTo: "api-docs/**/*.md, api-docs/**/*.yml, api-docs/**/*.yaml"
---

# InfluxDB API Documentation

**Complete guide**: [api-docs/README.md](../../api-docs/README.md)

API documentation uses OpenAPI specifications and Redoc, not Hugo shortcodes.

## Workflow

1. Edit the YAML files in `/api-docs/`.
2. Generate HTML documentation locally:

   ```sh
   cd api-docs
   sh generate-api-docs.sh
   ```

3. Test the generated documentation.
4. Commit the YAML changes (the generated HTML is gitignored).

## Files

- `ref.yml`: main API specification
- `content/*.yml`: custom content overlays
- `.redocly.yaml`: linter and bundler configuration

## Tools

- Redoc: generates HTML from OpenAPI specs
- @redocly/cli: lints and bundles specs

For the complete documentation workflow, see [api-docs/README.md](../../api-docs/README.md).
---
applyTo: "assets/js/**/*.js, assets/js/**/*.ts"
---

# JavaScript and TypeScript Guidelines

## TypeScript Configuration

The project uses TypeScript with an ES2020 target:

- Config: `tsconfig.json`
- Source: `assets/js/**/*.ts`
- Output: `dist/`
- Build: `yarn build:ts`
- Watch: `yarn build:ts:watch`

## Component Pattern

1. Add a `data-component` attribute to the HTML element:

   ```html
   <div data-component="my-component"></div>
   ```

2. Create the component module in `assets/js/components/`:

   ```typescript
   // assets/js/components/my-component.ts
   export function initMyComponent(element: HTMLElement): void {
     // Component logic
   }
   ```

3. Register it in `assets/js/main.js`:

   ```typescript
   import { initMyComponent } from './components/my-component';

   registerComponent('my-component', initMyComponent);
   ```
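
The registration step implies a lookup table that maps `data-component` values to init functions. A minimal sketch of such a registry, assuming a `Map`-based design (the actual `main.js` implementation may differ):

```typescript
// Hypothetical registry sketch - not the actual docs-v2 implementation.
type ComponentInit = (element: HTMLElement) => void;

const registry = new Map<string, ComponentInit>();

export function registerComponent(name: string, init: ComponentInit): void {
  registry.set(name, init);
}

// On page load: find each [data-component] element and run its init function.
export function initComponents(root: ParentNode): void {
  root.querySelectorAll<HTMLElement>('[data-component]').forEach((el) => {
    const init = registry.get(el.dataset.component ?? '');
    if (init) init(el);
  });
}
```

This keeps component modules decoupled from page markup: adding a component only requires a new module file plus one `registerComponent` call.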

## Debugging

### Method 1: Chrome DevTools with Source Maps

1. VS Code: Run > Start Debugging
2. Select "Debug Docs (source maps)"
3. Set breakpoints in the `assets/js/ns-hugo-imp:` namespace

### Method 2: Debug Helpers

```typescript
import { debugLog, debugBreak, debugInspect } from './utils/debug-helpers';

const data = debugInspect(someData, 'Data');
debugLog('Processing data', 'myFunction');
debugBreak(); // Breakpoint
```

Start with: `yarn hugo server`
Debug with: the VS Code "Debug JS (debug-helpers)" configuration

**Remove debug statements before committing.**
## Type Safety

- Use strict TypeScript mode
- Add type annotations for parameters and returns
- Use interfaces for complex objects
- Enable `checkJs: false` for gradual migration

For complete JavaScript documentation, see [DOCS-CONTRIBUTING.md](../../DOCS-CONTRIBUTING.md#javascript-in-the-documentation-ui).
---
applyTo: "content/**/*.md"
---

# Content File Guidelines

**Frontmatter reference**: [DOCS-FRONTMATTER.md](../../DOCS-FRONTMATTER.md)
**Shortcodes reference**: [DOCS-SHORTCODES.md](../../DOCS-SHORTCODES.md)
**Working examples**: [content/example.md](../../content/example.md)

## Required for All Content Files

Every content file needs:

```yaml
title: # Page h1 heading
description: # SEO meta description
menu:
  product_menu_key: # Identifies the Hugo menu specific to the current product
    name: # Navigation link text
    parent: # Parent menu item (if nested)
weight: # Sort order (1-99, 101-199, 201-299...)
```
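
A filled-in sketch of this frontmatter (all values are illustrative assumptions, including the `influxdb3_core` menu key and the parent page):

```yaml
title: Write data with the HTTP API
description: Use the HTTP API to write line protocol data to InfluxDB.
menu:
  influxdb3_core:
    name: Write data
    parent: Get started
weight: 102
```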

## Style Guidelines

- Use semantic line feeds (one sentence per line)
- Test all code examples before committing
- Use appropriate shortcodes for UI elements

## Most Common Shortcodes

**Callouts**:

```markdown
> [!Note]
> [!Warning]
> [!Caution]
> [!Important]
> [!Tip]
```
**Required elements**:

```markdown
{{< req >}}
{{< req type="key" >}}
```
**Code placeholders**:

~~~markdown
```sh { placeholders="DATABASE_NAME|API_TOKEN" }
curl -X POST https://cloud2.influxdata.com/api/v2/write?bucket=DATABASE_NAME
```

Replace the following:

- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}: your database name
~~~
**Tabbed content**:

```markdown
{{< tabs-wrapper >}}
{{% tabs %}}
[Tab 1](#)
[Tab 2](#)
{{% /tabs %}}
{{% tab-content %}}
Content for tab 1
{{% /tab-content %}}
{{% tab-content %}}
Content for tab 2
{{% /tab-content %}}
{{< /tabs-wrapper >}}
```
For complete shortcodes reference, see [DOCS-SHORTCODES.md](../../DOCS-SHORTCODES.md).

---
applyTo: "content/**/*.md, layouts/**/*.html"
---

# Contributing instructions for InfluxData Documentation
## Purpose and scope

Help document InfluxData products
by creating clear, accurate technical content with proper
code examples, frontmatter, shortcodes, and formatting.

## Quick Start

Ready to contribute? Here's the essential workflow:

1. [Sign the InfluxData CLA](#sign-the-influxdata-cla) (for substantial changes)
2. [Fork and clone](#fork-and-clone-influxdata-documentation-repository) this repository
3. [Install dependencies](#development-environment-setup) (Node.js, Yarn, Docker)
4. Make your changes following [style guidelines](#making-changes)
5. [Test your changes](TESTING.md) (pre-commit and pre-push hooks run automatically)
6. [Submit a pull request](#submission-process)

For detailed setup and reference information, see the sections below.
---

### Sign the InfluxData CLA

The InfluxData Contributor License Agreement (CLA) is part of the legal framework
for the open source ecosystem that protects both you and InfluxData.
To make substantial contributions to InfluxData documentation, first sign the InfluxData CLA.
What constitutes a "substantial" change is at the discretion of InfluxData documentation maintainers.

[Sign the InfluxData CLA](https://www.influxdata.com/legal/cla/)

_**Note:** Typo and broken link fixes are greatly appreciated and do not require signing the CLA._

_If you're new to contributing or you're looking for an easy update, see [`docs-v2` good-first-issues](https://github.com/influxdata/docs-v2/issues?q=is%3Aissue+is%3Aopen+label%3Agood-first-issue)._
### Fork and clone InfluxData Documentation Repository

[Fork this repository](https://help.github.com/articles/fork-a-repo/) and
[clone it](https://help.github.com/articles/cloning-a-repository/) to your local machine.

---
### Prerequisites

docs-v2 automatically runs format (Markdown, JS, and CSS) linting and code block tests for staged files that you try to commit.

For the linting and tests to run, install:

- **Node.js and Yarn**: for managing dependencies and running build scripts
- **Docker**: for running the Vale linter and code block tests
- **VS Code extensions** (optional): for an enhanced editing experience

To skip the local hooks for a commit:

```sh
git commit -m "<COMMIT_MESSAGE>" --no-verify
```

_... (see full CONTRIBUTING.md for the complete example)_

Build the code block test image:

```bash
docker build -t influxdata/docs-pytest:latest -f Dockerfile.pytest .
```
### Install Visual Studio Code extensions

- Comment Anchors: recognizes tags (for example, `//SOURCE`) and makes links and filepaths clickable in comments.
- Vale: shows linter errors and suggestions in the editor.
- YAML Schemas: validates frontmatter attributes.

_See full CONTRIBUTING.md for complete details._
#### Markdown

Most docs-v2 documentation content uses [Markdown](https://en.wikipedia.org/wiki/Markdown).

_Some parts of the documentation, such as `./api-docs`, contain Markdown within YAML and rely on additional tooling._
#### Semantic line feeds

Use semantic line feeds: one sentence per line, so that diffs and reviews stay readable.

```diff
-Data is taking off. This data is time series. You need a database that specializes in time series. You should check out InfluxDB.
+Data is taking off. This data is time series. You need a database that specializes in time series. You need InfluxDB.
# ... (see full CONTRIBUTING.md for complete example)
```
### Essential Frontmatter Reference

```yaml
title: # Title of the page used in the page's h1
description: # Page description displayed in search engine results
# ... (see full CONTRIBUTING.md for complete example)
```

_See full CONTRIBUTING.md for complete details._
#### Notes and warnings

```md
> [!Note]
> Insert note markdown content here.

> [!Warning]
> Insert warning markdown content here.

> [!Caution]
> Insert caution markdown content here.

> [!Important]
> Insert important markdown content here.

> [!Tip]
> Insert tip markdown content here.
```
#### Tabbed content

```md
{{< tabs-wrapper >}}

{{% tabs %}}
[Button text for tab 1](#)
[Button text for tab 2](#)
{{% /tabs %}}

{{% tab-content %}}
Markdown content for tab 1.
{{% /tab-content %}}

{{% tab-content %}}
Markdown content for tab 2.
{{% /tab-content %}}

{{< /tabs-wrapper >}}
```
#### Required elements

```md
{{< req >}}
{{< req type="key" >}}

- {{< req "\*" >}} **This element is required**
- {{< req "\*" >}} **This element is also required**
- **This element is NOT required**
```

For the complete shortcodes reference with all available shortcodes, see [Complete Shortcodes Reference](#complete-shortcodes-reference).

---
### InfluxDB API documentation

docs-v2 includes the InfluxDB API reference documentation in the `/api-docs` directory.
To edit the API documentation, edit the YAML files in `/api-docs`.

InfluxData uses [Redoc](https://github.com/Redocly/redoc) to generate the full
InfluxDB API documentation when documentation is deployed.
Redoc generates HTML documentation using the InfluxDB `swagger.yml`.
For more information about generating InfluxDB API documentation, see the
[API Documentation README](https://github.com/influxdata/docs-v2/tree/master/api-docs#readme).

---
## Testing & Quality Assurance

For comprehensive testing information, including code block testing, link validation, style linting, and advanced testing procedures, see **[TESTING.md](TESTING.md)**.

### Quick Testing Reference

```bash
# Test code blocks
yarn test:codeblocks:all

# Test links
yarn test:links content/influxdb3/core/**/*.md

# Run style linting
docker compose run -T vale content/**/*.md
```

Pre-commit hooks run automatically when you commit changes, testing your staged files with Vale, Prettier, Cypress, and Pytest. To skip hooks if needed:

```sh
git commit -m "<COMMIT_MESSAGE>" --no-verify
```

---
### Commit Guidelines

When creating commits, follow these guidelines:

- Use a clear, descriptive commit message that explains the change
- Start with a type prefix: `fix()`, `feat()`, `style()`, `refactor()`, `test()`, `chore()`
- For product-specific changes, include the product in parentheses: `fix(enterprise)`, `fix(influxdb3)`, `fix(core)`
- Keep the first line under 72 characters
- Reference issues with "closes" or "fixes": `closes #123` or `closes influxdata/DAR#123`
- For multiple issues, use comma separation: `closes influxdata/DAR#517, closes influxdata/DAR#518`

**Examples:**

```
fix(enterprise): correct Docker environment variable name for license email
fix(influxdb3): correct Docker environment variable and compose examples for monolith
feat(telegraf): add new plugin documentation
chore(ci): update Vale configuration
```
## Reference Sections

_See full CONTRIBUTING.md for complete details._

### Complete Frontmatter Reference

_For the complete frontmatter reference, see frontmatter-reference.instructions.md._

### Complete Shortcodes Reference

_For the complete shortcodes reference, see shortcodes-reference.instructions.md._
#### Vale style linting configuration

docs-v2 includes Vale writing style linter configurations to enforce documentation writing style rules, guidelines, branding, and vocabulary terms.

**Advanced Vale usage:**

```sh
docker compose run -T vale --config=content/influxdb/cloud-dedicated/.vale.ini --minAlertLevel=error content/influxdb/cloud-dedicated/write-data/**/*.md
```

Vale alert levels:

- **Error**: rules enforced in CI (the linter runs with `--minAlertLevel=error`)
- **Warning**: general style guide rules and best practices
- **Suggestion**: style preferences that may require refactoring or updates to an exceptions list
#### Configure style rules

_See full CONTRIBUTING.md for complete details._
#### JavaScript in the documentation UI

The InfluxData documentation UI uses JavaScript with ES6+ syntax and
`assets/js/main.js` as the entry point to import component modules.

1. In your HTML file, add a `data-component` attribute to the element that
   the component should attach to.

_... (see full CONTRIBUTING.md for the complete example)_

2. Import the debug helpers and add debug statements where needed:

```js
import { debugLog, debugBreak, debugInspect } from './utils/debug-helpers.js';

const data = debugInspect(someData, 'Data');
debugLog('Processing data', 'myFunction');

function processData() {
  // Add a breakpoint that works with DevTools
  debugBreak();

  // Your existing code...
}
```

3. Start Hugo in development mode, for example:

   ```bash
   yarn hugo server
   ```

4. In VS Code, go to Run > Start Debugging, and select the "Debug JS (debug-helpers)" configuration.

Your system uses the configuration in `launch.json` to launch the site in Chrome
and attach the debugger to the Developer Tools console.

Make sure to remove the debug statements before merging your changes.
The debug helpers are designed for development and should not be used in production.

_See full CONTRIBUTING.md for complete details._
---
mode: 'edit'
applyTo: "content/{influxdb3/core,influxdb3/enterprise,shared/influxdb3*}/**"
---

## Best Practices

- Use UPPERCASE for placeholders to make them easily identifiable
- Don't use pronouns in placeholders (for example, "your", "this")
- List placeholders in the same order they appear in the code
- Provide clear descriptions including:
  - Expected data type or format
  - Purpose of the value
  - Any constraints or requirements
- Mark optional placeholders as "Optional:" in their descriptions
- Placeholder key descriptions should fit the context of the code snippet
- Include examples for complex formats
## Writing Placeholder Descriptions

Descriptions should follow consistent patterns:

1. **Admin authentication tokens**:
   - Recommended: "a {{% token-link "admin" %}} for your {{< product-name >}} instance"
   - Avoid: "your token", "the token", "an authorization token"
2. **Database resource tokens**:
   - Recommended: "your {{% token-link "database" %}}"{{% show-in "enterprise" %}} with permissions on the specified database{{% /show-in %}}
   - Avoid: "your token", "the token", "an authorization token"
3. **Database names**:
   - Recommended: "the name of the database to [action]"
   - Avoid: "your database", "the database name"
4. **Conditional content**:
   - Use `{{% show-in "enterprise" %}}` for content specific to enterprise versions
   - Example: "your {{% token-link "database" %}}{{% show-in "enterprise" %}} with permission to query the specified database{{% /show-in %}}"
## Common placeholders for InfluxDB 3
|
||||
|
||||
- `AUTH_TOKEN`: your {{% token-link %}}
- `DATABASE_NAME`: the name of the database to use
- `TABLE_NAME`: the name of the table (measurement) to query or write to
- `NODE_ID`: the node ID for a specific node in a cluster
- `CLUSTER_ID`: the cluster ID for a specific cluster
- `HOST`: the InfluxDB server hostname or URL
- `PORT`: the InfluxDB server port (typically 8181)
- `QUERY`: the SQL or InfluxQL query string
- `LINE_PROTOCOL`: the line protocol data to write
- `PLUGIN_FILENAME`: the name of the plugin file to use
- `CACHE_NAME`: the name of a new or existing cache
|
||||
|
||||
## Hugo shortcodes in Markdown
|
||||
|
||||
**Syntax**:
|
||||
|
||||
- Use the `placeholders` code block attribute to define placeholder patterns:
|
||||
```<language> { placeholders="<expr>" }
|
||||
function sampleCode () {};
|
||||
```
|
||||
**Old (deprecated) syntax**:
|
||||
|
||||
- `{{% code-placeholders "PLACEHOLDER1|PLACEHOLDER2" %}}`
|
||||
- `{{% /code-placeholders %}}`
|
||||
|
||||
**Define a placeholder key (typically following the example)**:
|
||||
|
||||
- `{{% code-placeholder-key %}}`: Use this shortcode to define a placeholder key
|
||||
- `{{% /code-placeholder-key %}}`: Use this shortcode to close the key name
|
||||
- Follow with a description
|
||||
|
||||
## Language-Specific Placeholder Formatting
|
||||
|
||||
- **Bash/Shell**: Use uppercase variables with no quotes or prefix
|
||||
```bash { placeholders="DATABASE_NAME" }
|
||||
--database DATABASE_NAME
|
||||
```
|
||||
- **Python**: Use string literals with quotes
|
||||
```python { placeholders="DATABASE_NAME" }
|
||||
database_name='DATABASE_NAME'
|
||||
```
|
||||
- **JSON**: Use key-value pairs with quotes
|
||||
```json { placeholders="DATABASE_NAME" }
|
||||
{
|
||||
"database": "DATABASE_NAME"
|
||||
}
|
||||
```
|
||||
|
||||
## Real-World Examples from Documentation
|
||||
|
||||
### InfluxDB CLI Commands
|
||||
This pattern appears frequently in CLI documentation:
|
||||
|
||||
```bash { placeholders="DATABASE_NAME|AUTH_TOKEN" }
|
||||
influxdb3 write \
|
||||
--database DATABASE_NAME \
|
||||
--token AUTH_TOKEN \
|
||||
--precision ns
|
||||
```
|
||||
|
||||
Replace the following placeholders with your values:
|
||||
|
||||
- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}: the name of the database to write to
|
||||
- {{% code-placeholder-key %}}`AUTH_TOKEN`{{% /code-placeholder-key %}}: your {{% token-link "database" %}}{{% show-in "enterprise" %}} with write permissions on the specified database{{% /show-in %}}
|
||||
|
|
@ -0,0 +1,38 @@
|
|||
---
|
||||
applyTo: "layouts/**/*.html"
|
||||
---
|
||||
|
||||
# Layout and Shortcode Implementation Guidelines
|
||||
|
||||
**Shortcodes reference**: [DOCS-SHORTCODES.md](../../DOCS-SHORTCODES.md)
|
||||
**Test examples**: [content/example.md](../../content/example.md)
|
||||
|
||||
## Implementing Shortcodes
|
||||
|
||||
When creating or modifying Hugo layouts and shortcodes:
|
||||
|
||||
1. Use Hugo template syntax and functions
|
||||
2. Follow existing patterns in `/layouts/shortcodes/`
|
||||
3. Test in [content/example.md](../../content/example.md)
|
||||
4. Document new shortcodes in [DOCS-SHORTCODES.md](../../DOCS-SHORTCODES.md)
|
||||
|
||||
## Shortcode Pattern
|
||||
|
||||
```html
|
||||
<!-- layouts/shortcodes/example.html -->
|
||||
{{ $param := .Get 0 }}
|
||||
{{ $namedParam := .Get "name" }}
|
||||
|
||||
<div class="example">
|
||||
{{ .Inner | markdownify }}
|
||||
</div>
|
||||
```
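A hypothetical call to the shortcode above from a Markdown page might look like this (the parameter values are illustrative):

```md
{{< example "first-value" name="named-value" >}}
Inner **Markdown** content rendered by the shortcode.
{{< /example >}}
```

Because the template pipes `.Inner` through `markdownify`, the `{{< >}}` call form passes the inner content through raw for the shortcode to render.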
|
||||
|
||||
## Testing
|
||||
|
||||
Add usage examples to `content/example.md` to verify:
|
||||
- Rendering in browser
|
||||
- Hugo build succeeds
|
||||
- No console errors
|
||||
|
||||
See [DOCS-SHORTCODES.md](../../DOCS-SHORTCODES.md) for complete shortcode documentation.
|
||||
|
|
@ -1,15 +0,0 @@
|
|||
---
|
||||
applyTo: "content/**/*.md, layouts/**/*.html"
|
||||
---
|
||||
|
||||
### Detailed Testing Setup
|
||||
|
||||
For comprehensive testing information, including:
|
||||
- Code block testing setup and configuration
|
||||
- Link validation testing procedures
|
||||
- Style linting with Vale
|
||||
- Pre-commit hooks and GitHub Actions integration
|
||||
- Advanced testing procedures and troubleshooting
|
||||
|
||||
Please refer to the main **[TESTING.md](../../TESTING.md)** file.
|
||||
|
||||
|
|
@ -0,0 +1,55 @@
|
|||
## InfluxDB v1 Release Documentation
|
||||
|
||||
**Release Version:** v1.x.x
|
||||
**Release Type:** [ ] OSS [ ] Enterprise [ ] Both
|
||||
|
||||
### Description
|
||||
Brief description of the release and documentation changes.
|
||||
|
||||
### Release Documentation Checklist
|
||||
|
||||
#### Release Notes
|
||||
- [ ] Generate release notes from changelog
|
||||
- [ ] OSS: Use commit messages from GitHub release tag `https://github.com/influxdata/influxdb/releases/tag/v1.x.x`
|
||||
- [ ] Enterprise: Use `https://dl.influxdata.com/enterprise/nightlies/master/CHANGELOG.md`
|
||||
- [ ] **Note**: For Enterprise releases, include important updates, features, and fixes from the corresponding OSS tag
|
||||
- [ ] Update release notes in appropriate location
|
||||
- [ ] OSS: `/content/influxdb/v1/about_the_project/releasenotes-changelog.md`
|
||||
- [ ] Enterprise: `/content/enterprise_influxdb/v1/about-the-project/release-notes.md`
|
||||
- [ ] Ensure release notes follow documentation formatting standards
|
||||
|
||||
#### Version Updates
|
||||
- [ ] Update patch version in `/data/products.yml`
|
||||
- [ ] OSS: `influxdb > v1 > latest`
|
||||
- [ ] Enterprise: `enterprise_influxdb > v1 > latest`
|
||||
- [ ] Update version references in documentation
|
||||
- [ ] Installation guides
|
||||
- [ ] Docker documentation
|
||||
- [ ] Download links
|
||||
- [ ] Code examples with version-specific commands
|
||||
|
||||
#### Content Verification
|
||||
- [ ] Review breaking changes and update migration guides if needed
|
||||
- [ ] Update compatibility matrices if applicable
|
||||
- [ ] Verify all download links work correctly
|
||||
- [ ] Check that version-specific features are documented
|
||||
|
||||
#### Testing
|
||||
- [ ] Build documentation locally and verify changes render correctly
|
||||
- [ ] Test all updated links
|
||||
- [ ] Run link validation: `yarn test:links content/influxdb/v1/**/*.md`
|
||||
- [ ] Run link validation: `yarn test:links content/enterprise_influxdb/v1/**/*.md`
|
||||
|
||||
### Related Resources
|
||||
- DAR Issue: #
|
||||
- OSS Release: https://github.com/influxdata/influxdb/releases/tag/v1.x.x
|
||||
- Enterprise Changelog: https://dl.influxdata.com/enterprise/nightlies/master/CHANGELOG.md
|
||||
- Slack Discussion: [Link to #releases thread]
|
||||
|
||||
### Post-Merge Actions
|
||||
- [ ] Verify documentation is deployed to production
|
||||
- [ ] Announce in #docs channel
|
||||
- [ ] Close related DAR issue(s)
|
||||
|
||||
---
|
||||
**Note:** For Enterprise releases, ensure you have access to the Enterprise changelog and coordinate with the release team for timing.
|
||||
|
|
@ -95,7 +95,7 @@ jobs:
|
|||
curl -L -H "Accept: application/vnd.github+json" \
|
||||
-H "Authorization: Bearer ${{ secrets.GITHUB_TOKEN }}" \
|
||||
-o link-checker-info.json \
|
||||
"https://api.github.com/repos/influxdata/docs-v2/releases/tags/link-checker-v1.2.3"
|
||||
"https://api.github.com/repos/influxdata/docs-v2/releases/tags/link-checker-v1.2.4"
|
||||
|
||||
# Extract download URL for linux binary
|
||||
DOWNLOAD_URL=$(jq -r '.assets[] | select(.name | test("link-checker.*linux")) | .url' link-checker-info.json)
|
||||
|
|
|
|||
|
|
@ -35,6 +35,9 @@ tmp
|
|||
.idea
|
||||
**/config.toml
|
||||
|
||||
# TypeScript build output
|
||||
**/dist/
|
||||
|
||||
# User context files for AI assistant tools
|
||||
.context/*
|
||||
!.context/README.md
|
||||
|
|
|
|||
|
|
@ -33,9 +33,6 @@ call_lefthook()
|
|||
then
|
||||
"$dir/node_modules/lefthook/bin/index.js" "$@"
|
||||
|
||||
elif go tool lefthook -h >/dev/null 2>&1
|
||||
then
|
||||
go tool lefthook "$@"
|
||||
elif bundle exec lefthook -h >/dev/null 2>&1
|
||||
then
|
||||
bundle exec lefthook "$@"
|
||||
|
|
@ -45,21 +42,12 @@ call_lefthook()
|
|||
elif pnpm lefthook -h >/dev/null 2>&1
|
||||
then
|
||||
pnpm lefthook "$@"
|
||||
elif swift package lefthook >/dev/null 2>&1
|
||||
elif swift package plugin lefthook >/dev/null 2>&1
|
||||
then
|
||||
swift package --build-path .build/lefthook --disable-sandbox lefthook "$@"
|
||||
swift package --disable-sandbox plugin lefthook "$@"
|
||||
elif command -v mint >/dev/null 2>&1
|
||||
then
|
||||
mint run csjones/lefthook-plugin "$@"
|
||||
elif uv run lefthook -h >/dev/null 2>&1
|
||||
then
|
||||
uv run lefthook "$@"
|
||||
elif mise exec -- lefthook -h >/dev/null 2>&1
|
||||
then
|
||||
mise exec -- lefthook "$@"
|
||||
elif devbox run lefthook -h >/dev/null 2>&1
|
||||
then
|
||||
devbox run lefthook "$@"
|
||||
else
|
||||
echo "Can't find lefthook in PATH"
|
||||
fi
|
||||
|
|
|
|||
14
CLAUDE.md
|
|
@ -17,22 +17,12 @@ See @README.md
|
|||
See @.github/copilot-instructions.md for style guidelines and
|
||||
product-specific documentation paths and URLs managed in this project.
|
||||
|
||||
See @.github/instructions/contributing.instructions.md for essential InfluxData
|
||||
See @DOCS-CONTRIBUTING.md for essential InfluxData
|
||||
documentation contributing guidelines, such as style and
|
||||
formatting, and commonly used shortcodes.
|
||||
|
||||
See @TESTING.md for comprehensive testing information, including code block
|
||||
See @DOCS-TESTING.md for comprehensive testing information, including code block
|
||||
testing, link validation, style linting, and advanced testing procedures.
|
||||
|
||||
See @.github/instructions/shortcodes-reference.instructions.md for detailed
|
||||
information about shortcodes used in this project.
|
||||
|
||||
See @.github/instructions/frontmatter-reference.instructions.md for detailed
|
||||
information about frontmatter used in this project.
|
||||
|
||||
See @.github/instructions/influxdb3-code-placeholders.instructions.md for using
|
||||
placeholders in code samples and CLI commands.
|
||||
|
||||
See @api-docs/README.md for information about the API reference documentation, how to
|
||||
generate it, and how to contribute to it.
|
||||
|
||||
|
|
|
|||
1807
CONTRIBUTING.md
File diff suppressed because it is too large
|
|
@ -0,0 +1,442 @@
|
|||
# Contributing to InfluxData Documentation
|
||||
|
||||
<!-- agent:instruct: essential -->
|
||||
## Quick Start
|
||||
|
||||
Ready to contribute?
|
||||
|
||||
1. [Sign the InfluxData CLA](#sign-the-influxdata-cla) (for substantial changes)
|
||||
2. [Fork and clone](#fork-and-clone-influxdata-documentation-repository) this repository
|
||||
3. [Install dependencies](#development-environment-setup) (Node.js, Yarn, Docker)
|
||||
4. Make your changes following [style guidelines](#making-changes)
|
||||
5. [Test your changes](TESTING.md) (pre-commit and pre-push hooks run automatically)
|
||||
6. [Submit a pull request](#submission-process)
|
||||
|
||||
For detailed setup and reference information, see the sections below.
|
||||
|
||||
---
|
||||
|
||||
## Legal & Getting Started
|
||||
|
||||
### Sign the InfluxData CLA
|
||||
|
||||
The InfluxData Contributor License Agreement (CLA) is part of the legal framework
|
||||
for the open source ecosystem that protects both you and InfluxData.
|
||||
To make substantial contributions to InfluxData documentation, first sign the InfluxData CLA.
|
||||
What constitutes a "substantial" change is at the discretion of InfluxData documentation maintainers.
|
||||
|
||||
[Sign the InfluxData CLA](https://www.influxdata.com/legal/cla/)
|
||||
|
||||
_**Note:** Typo and broken link fixes are greatly appreciated and do not require signing the CLA._
|
||||
|
||||
_If you're new to contributing or you're looking for an easy update, see [`docs-v2` good-first-issues](https://github.com/influxdata/docs-v2/issues?q=is%3Aissue+is%3Aopen+label%3Agood-first-issue)._
|
||||
|
||||
### Fork and clone InfluxData Documentation Repository
|
||||
|
||||
[Fork this repository](https://help.github.com/articles/fork-a-repo/) and
|
||||
[clone it](https://help.github.com/articles/cloning-a-repository/) to your local machine.
|
||||
|
||||
---
|
||||
|
||||
<!-- agent:instruct: condense -->
|
||||
## Development Environment Setup
|
||||
|
||||
### Prerequisites
|
||||
|
||||
docs-v2 automatically runs format (Markdown, JS, and CSS) linting and code block tests for staged files that you try to commit.
|
||||
|
||||
For the linting and tests to run, you need to install:
|
||||
|
||||
- **Node.js and Yarn**: For managing dependencies and running build scripts
|
||||
- **Docker**: For running Vale linter and code block tests
|
||||
- **VS Code extensions** (optional): For enhanced editing experience
|
||||
|
||||
**Note:**
|
||||
The git pre-commit and pre-push hooks are configured to run linting and tests automatically
|
||||
when you commit or push changes.
|
||||
We strongly recommend letting them run, but you can skip them
|
||||
(and avoid installing related dependencies)
|
||||
by including the `--no-verify` flag with your commit--for example:
|
||||
|
||||
```sh
|
||||
git commit -m "<COMMIT_MESSAGE>" --no-verify
|
||||
```
|
||||
|
||||
### Install Node.js dependencies
|
||||
|
||||
To install dependencies listed in package.json:
|
||||
|
||||
1. Install [Node.js](https://nodejs.org/en) for your system.
|
||||
2. Install [Yarn](https://yarnpkg.com/getting-started/install) for your system.
|
||||
3. Run `yarn` to install dependencies (including Hugo).
|
||||
|
||||
`package.json` contains dependencies used in `/assets/js` JavaScript code and
|
||||
dev dependencies used in pre-commit hooks for linting, syntax-checking, and testing.
|
||||
|
||||
Dev dependencies include:
|
||||
|
||||
- [Lefthook](https://github.com/evilmartians/lefthook): configures and
|
||||
manages git pre-commit and pre-push hooks for linting and testing Markdown content.
|
||||
- [prettier](https://prettier.io/docs/en/): formats code, including Markdown, according to style rules for consistency
|
||||
- [Cypress](https://www.cypress.io/): end-to-end testing for UI elements and URLs in content
|
||||
|
||||
### Install Docker
|
||||
|
||||
docs-v2 includes Docker configurations (`compose.yaml` and Dockerfiles) for running the Vale style linter and tests for code blocks (Shell, Bash, and Python) in Markdown files.
|
||||
|
||||
Install [Docker](https://docs.docker.com/get-docker/) for your system.
|
||||
|
||||
#### Build the test dependency image
|
||||
|
||||
After you have installed Docker, run the following command to build the test
|
||||
dependency image, `influxdata:docs-pytest`.
|
||||
The tests defined in `compose.yaml` use the dependencies and execution
|
||||
environment from this image.
|
||||
|
||||
```bash
|
||||
docker build -t influxdata/docs-pytest:latest -f Dockerfile.pytest .
|
||||
```
|
||||
|
||||
### Run the documentation locally (optional)
|
||||
|
||||
To run the documentation locally, follow the instructions provided in the README.
|
||||
|
||||
### Install Visual Studio Code extensions
|
||||
|
||||
If you use Microsoft Visual Studio (VS) Code, you can install extensions
|
||||
to help you navigate, check, and edit files.
|
||||
|
||||
docs-v2 contains a `./.vscode/settings.json` that configures the following extensions:
|
||||
|
||||
- Comment Anchors: recognizes tags (for example, `//SOURCE`) and makes links and filepaths clickable in comments.
|
||||
- Vale: shows linter errors and suggestions in the editor.
|
||||
- YAML Schemas: validates frontmatter attributes.
|
||||
|
||||
---
|
||||
|
||||
<!-- agent:instruct: condense -->
|
||||
## Making Changes
|
||||
|
||||
|
||||
### Style Guidelines
|
||||
|
||||
#### Markdown
|
||||
|
||||
Most docs-v2 documentation content uses [Markdown](https://en.wikipedia.org/wiki/Markdown).
|
||||
|
||||
_Some parts of the documentation, such as `./api-docs`, contain Markdown within YAML and rely on additional tooling._
|
||||
|
||||
#### Semantic line feeds
|
||||
|
||||
Use [semantic line feeds](http://rhodesmill.org/brandon/2012/one-sentence-per-line/).
|
||||
Separating each sentence with a new line makes it easy to parse diffs with the human eye.
|
||||
|
||||
**Diff without semantic line feeds:**
|
||||
|
||||
```diff
|
||||
-Data is taking off. This data is time series. You need a database that specializes in time series. You should check out InfluxDB.
|
||||
+Data is taking off. This data is time series. You need a database that specializes in time series. You need InfluxDB.
|
||||
```
|
||||
|
||||
**Diff with semantic line feeds:**
|
||||
|
||||
```diff
|
||||
Data is taking off.
|
||||
This data is time series.
|
||||
You need a database that specializes in time series.
|
||||
-You should check out InfluxDB.
|
||||
+You need InfluxDB.
|
||||
```
|
||||
|
||||
#### Article headings
|
||||
|
||||
Use only h2-h6 headings in markdown content.
|
||||
h1 headings act as the page title and are populated automatically from the `title` frontmatter.
|
||||
h2-h6 headings act as section headings.
|
||||
|
||||
#### Image naming conventions
|
||||
|
||||
Save images using the following naming format: `project/version-context-description.png`.
|
||||
For example, `influxdb/2-0-visualizations-line-graph.png` or `influxdb/2-0-tasks-add-new.png`.
|
||||
Specify a version other than 2.0 only if the image is specific to that version.
|
||||
|
||||
### Essential Frontmatter Reference
|
||||
|
||||
Every documentation page includes frontmatter which specifies information about the page.
|
||||
Frontmatter populates variables in page templates and the site's navigation menu.
|
||||
|
||||
**Essential fields:**
|
||||
|
||||
```yaml
|
||||
title: # Title of the page used in the page's h1
|
||||
description: # Page description displayed in search engine results
|
||||
menu:
|
||||
influxdb_2_0:
|
||||
name: # Article name that only appears in the left nav
|
||||
parent: # Specifies a parent group and nests navigation items
|
||||
weight: # Determines sort order in both the nav tree and in article lists
|
||||
```
|
||||
|
||||
For the complete frontmatter reference with all available fields and detailed usage, see **[DOCS-FRONTMATTER.md](DOCS-FRONTMATTER.md)**.
|
||||
|
||||
### Shared Content
|
||||
|
||||
This repository uses shared content extensively to avoid duplication across InfluxDB editions and versions.
|
||||
|
||||
Use the `source` frontmatter to specify a shared file for page content:
|
||||
|
||||
```yaml
|
||||
source: /shared/path/to/content.md
|
||||
```
|
||||
|
||||
For complete details including examples and best practices, see the [Source section in DOCS-FRONTMATTER.md](DOCS-FRONTMATTER.md#source).
|
||||
|
||||
<!-- agent:instruct: essential -->
|
||||
### Common Shortcodes Reference
|
||||
|
||||
#### Callouts (notes and warnings)
|
||||
|
||||
```md
|
||||
> [!Note]
|
||||
> Insert note markdown content here.
|
||||
|
||||
> [!Warning]
|
||||
> Insert warning markdown content here.
|
||||
|
||||
> [!Caution]
|
||||
> Insert caution markdown content here.
|
||||
|
||||
> [!Important]
|
||||
> Insert important markdown content here.
|
||||
|
||||
> [!Tip]
|
||||
> Insert tip markdown content here.
|
||||
```
|
||||
|
||||
#### Tabbed content
|
||||
|
||||
```md
|
||||
{{< tabs-wrapper >}}
|
||||
|
||||
{{% tabs %}}
|
||||
[Button text for tab 1](#)
|
||||
[Button text for tab 2](#)
|
||||
{{% /tabs %}}
|
||||
|
||||
{{% tab-content %}}
|
||||
Markdown content for tab 1.
|
||||
{{% /tab-content %}}
|
||||
|
||||
{{% tab-content %}}
|
||||
Markdown content for tab 2.
|
||||
{{% /tab-content %}}
|
||||
|
||||
{{< /tabs-wrapper >}}
|
||||
```
|
||||
|
||||
#### Required elements
|
||||
|
||||
```md
|
||||
{{< req >}}
|
||||
{{< req type="key" >}}
|
||||
|
||||
- {{< req "\*" >}} **This element is required**
|
||||
- {{< req "\*" >}} **This element is also required**
|
||||
- **This element is NOT required**
|
||||
```
|
||||
|
||||
For the complete shortcodes reference with all available shortcodes and usage examples, see **[SHORTCODES.md](SHORTCODES.md)**.
|
||||
|
||||
Test shortcodes with working examples in **[content/example.md](content/example.md)**.
|
||||
|
||||
---
|
||||
|
||||
### InfluxDB API documentation
|
||||
|
||||
docs-v2 includes the InfluxDB API reference documentation in the `/api-docs` directory. The files are written in YAML and use [OpenAPI 3.0](https://swagger.io/specification/) standard.
|
||||
|
||||
InfluxData uses [Redoc](https://github.com/Redocly/redoc) to build and generate the full
|
||||
InfluxDB API documentation when documentation is deployed.
|
||||
For more information about editing and generating InfluxDB API documentation, see the
|
||||
[API Documentation README](https://github.com/influxdata/docs-v2/tree/master/api-docs#readme).
|
||||
|
||||
---
|
||||
|
||||
## Testing & Quality Assurance
|
||||
|
||||
|
||||
Pre-commit hooks run automatically when you commit changes, testing your staged files with Vale, Prettier, Cypress, and Pytest. To skip hooks if needed:
|
||||
|
||||
```sh
|
||||
git commit -m "<COMMIT_MESSAGE>" --no-verify
|
||||
```
|
||||
|
||||
### Quick Testing Reference
|
||||
|
||||
```bash
|
||||
# Test code blocks
|
||||
yarn test:codeblocks:all
|
||||
|
||||
# Test links
|
||||
yarn test:links content/influxdb3/core/**/*.md
|
||||
|
||||
# Run style linting
|
||||
docker compose run -T vale content/**/*.md
|
||||
```
|
||||
|
||||
For comprehensive testing information, including code block testing, link validation, style linting, and advanced testing procedures, see **[TESTING.md](TESTING.md)**.
|
||||
|
||||
|
||||
---
|
||||
|
||||
<!-- agent:instruct: condense -->
|
||||
## Submission Process
|
||||
|
||||
<!-- agent:instruct: essential -->
|
||||
### Commit Guidelines
|
||||
|
||||
When creating commits, follow these guidelines:
|
||||
|
||||
- Use a clear, descriptive commit message that explains the change
|
||||
- Start with a type prefix: `fix()`, `feat()`, `style()`, `refactor()`, `test()`, `chore()`
|
||||
- For product-specific changes, include the product in parentheses: `fix(enterprise)`, `fix(influxdb3)`, `fix(core)`
|
||||
- Keep the first line under 72 characters
|
||||
- Reference issues with "closes" or "fixes": `closes #123` or `closes influxdata/DAR#123`
|
||||
- For multiple issues, use comma separation: `closes influxdata/DAR#517, closes influxdata/DAR#518`
|
||||
|
||||
**Examples:**
|
||||
```
|
||||
fix(enterprise): correct Docker environment variable name for license email
|
||||
fix(influxdb3): correct Docker environment variable and compose examples for monolith
|
||||
feat(telegraf): add new plugin documentation
|
||||
chore(ci): update Vale configuration
|
||||
```
|
||||
|
||||
### Submit a pull request
|
||||
|
||||
Push your changes up to your forked repository, then [create a new pull request](https://help.github.com/articles/creating-a-pull-request/).
|
||||
|
||||
---
|
||||
|
||||
## Reference Documentation
|
||||
|
||||
For detailed reference documentation, see:
|
||||
|
||||
- **[DOCS-FRONTMATTER.md](DOCS-FRONTMATTER.md)** - Complete frontmatter field reference with all available options
|
||||
- **[DOCS-SHORTCODES.md](DOCS-SHORTCODES.md)** - Complete shortcodes reference with usage examples for all available shortcodes
|
||||
|
||||
<!-- agent:instruct: condense -->
|
||||
### Advanced Configuration
|
||||
|
||||
#### Vale style linting configuration
|
||||
|
||||
docs-v2 includes Vale writing style linter configurations to enforce documentation writing style rules, guidelines, branding, and vocabulary terms.
|
||||
|
||||
**Advanced Vale usage:**
|
||||
|
||||
```sh
|
||||
docker compose run -T vale --config=content/influxdb/cloud-dedicated/.vale.ini --minAlertLevel=error content/influxdb/cloud-dedicated/write-data/**/*.md
|
||||
```
|
||||
|
||||
The output contains error-level style alerts for the Markdown content.
|
||||
|
||||
If a file contains style, spelling, or punctuation problems,
|
||||
the Vale linter can raise one of the following alert levels:
|
||||
|
||||
- **Error**:
  - Problems that can cause content to render incorrectly
  - Violations of branding or trademark guidelines
  - Rejected vocabulary terms
- **Warning**: General style guide rules and best practices
- **Suggestion**: Style preferences that may require refactoring or updates to an exceptions list
|
||||
|
||||
#### Configure style rules
|
||||
|
||||
`<docs-v2>/.ci/vale/styles/` contains configuration files for the custom `InfluxDataDocs` style.
|
||||
|
||||
The easiest way to add accepted or rejected spellings is to enter your terms (or regular expression patterns) into the Vocabulary files at `.ci/vale/styles/config/vocabularies`.
|
||||
|
||||
To add accepted/rejected terms for specific products, configure a style for the product and include a `Branding.yml` configuration. As an example, see `content/influxdb/cloud-dedicated/.vale.ini` and `.ci/vale/styles/Cloud-Dedicated/Branding.yml`.
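As a minimal sketch of the shape such a `Branding.yml` might take (the swap entries below are illustrative, not the repository's actual rules; `substitution` is a standard Vale rule extension point):

```yaml
# Hypothetical Branding.yml sketch; swap entries are illustrative.
extends: substitution
message: "Use '%s' instead of '%s'."
level: error
ignorecase: false
# Keys are rejected spellings; values are the accepted branding.
swap:
  influxDB: InfluxDB
  Influx DB: InfluxDB
```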
|
||||
|
||||
To learn more about configuration and rules, see [Vale configuration](https://vale.sh/docs/topics/config).
|
||||
|
||||
<!-- agent:instruct: condense -->
|
||||
#### JavaScript in the documentation UI
|
||||
|
||||
The InfluxData documentation UI uses TypeScript and JavaScript with ES6+ syntax and
|
||||
`assets/js/main.js` as the entry point to import modules from
|
||||
`assets/js`.
|
||||
Only `assets/js/main.js` should be imported in HTML files.
|
||||
|
||||
`assets/js/main.js` registers components and initializes them on page load.
|
||||
|
||||
If you're adding UI functionality that requires JavaScript, follow these steps:
|
||||
|
||||
1. In your HTML file, add a `data-component` attribute to the element that
|
||||
should be initialized by your JavaScript code. For example:
|
||||
|
||||
```html
|
||||
<div data-component="my-component"></div>
|
||||
```
|
||||
|
||||
2. Following the component pattern, create a single-purpose JavaScript module
|
||||
(`assets/js/components/my-component.js`)
|
||||
that exports a single function that receives the component element and initializes it.
|
||||
3. In `assets/js/main.js`, import the module and register the component to ensure
|
||||
the component is initialized on page load.
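The three steps above can be sketched as follows. All names here (`initMyComponent`, the registry) are hypothetical illustrations of the pattern, not existing modules in the repository:

```javascript
// Hypothetical sketch of the component pattern (not actual repo code).

// A module such as assets/js/components/my-component.js exports a single
// initializer that receives the element carrying data-component="my-component".
function initMyComponent({ component }) {
  // Example behavior: toggle a class when the element is clicked.
  component.addEventListener('click', () =>
    component.classList.toggle('active')
  );
}

// The entry point (assets/js/main.js) maps data-component values to
// initializers and initializes each matching element on page load.
const componentRegistry = new Map([['my-component', initMyComponent]]);

function initializeComponents(root) {
  root.querySelectorAll('[data-component]').forEach((el) => {
    const init = componentRegistry.get(el.dataset.component);
    if (init) init({ component: el });
  });
}
```

In the real entry point, `initializeComponents` would run once the DOM is ready (for example, on `DOMContentLoaded`).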
|
||||
|
||||
##### Debugging JavaScript
|
||||
|
||||
To debug JavaScript code used in the InfluxData documentation UI, choose one of the following methods:
|
||||
|
||||
- Use source maps and the Chrome DevTools debugger.
|
||||
- Use debug helpers that provide breakpoints and console logging as a workaround or alternative for using source maps and the Chrome DevTools debugger.
|
||||
|
||||
###### Using source maps and Chrome DevTools debugger
|
||||
|
||||
1. In VS Code, select Run > Start Debugging.
|
||||
2. Select the "Debug Docs (source maps)" configuration.
|
||||
3. Click the play button to start the debugger.
|
||||
4. Set breakpoints in the JavaScript source files--files in the
|
||||
`assets/js/ns-hugo-imp:` namespace--in the
|
||||
VS Code editor or in the Chrome Developer Tools Sources panel:
|
||||
|
||||
- In the VS Code Debugger panel > "Loaded Scripts" section, find the
|
||||
`assets/js/ns-hugo-imp:` namespace.
|
||||
- In the Chrome Developer Tools Sources panel, expand
|
||||
`js/ns-hugo-imp:/<YOUR_WORKSPACE_ROOT>/assets/js/`.
|
||||
|
||||
###### Using debug helpers
|
||||
|
||||
1. In your JavaScript module, import debug helpers from `assets/js/utils/debug-helpers.js`.
|
||||
These helpers provide breakpoints and console logging as a workaround or alternative for
|
||||
using source maps and the Chrome DevTools debugger.
|
||||
2. Insert debug statements by calling the helper functions in your code--for example:
|
||||
|
||||
```js
|
||||
import { debugLog, debugBreak, debugInspect } from './utils/debug-helpers.js';
|
||||
|
||||
const data = debugInspect(someData, 'Data');
|
||||
debugLog('Processing data', 'myFunction');
|
||||
|
||||
function processData() {
|
||||
// Add a breakpoint that works with DevTools
|
||||
debugBreak();
|
||||
|
||||
// Your existing code...
|
||||
}
|
||||
```
|
||||
|
||||
3. Start Hugo in development mode--for example:
|
||||
|
||||
```bash
|
||||
yarn hugo server
|
||||
```
|
||||
|
||||
4. In VS Code, go to Run > Start Debugging, and select the "Debug JS (debug-helpers)" configuration.
|
||||
|
||||
Your system uses the configuration in `launch.json` to launch the site in Chrome
|
||||
and attach the debugger to the Developer Tools console.
|
||||
|
||||
Make sure to remove the debug statements before merging your changes.
|
||||
The debug helpers are designed to be used in development and should not be used in production.
@@ -1,11 +1,22 @@
---
applyTo: "content/**/*.md, layouts/**/*.html"
---
# Frontmatter Reference
### Complete Frontmatter Reference
Complete reference for frontmatter fields used in InfluxData documentation pages.

Every documentation page includes frontmatter which specifies information about the page.
Frontmatter populates variables in page templates and the site's navigation menu.
## Essential Fields

Every documentation page requires these fields:

```yaml
title: # Page h1 heading
description: # SEO meta description
menu:
  product_version:
    name: # Navigation link text
    parent: # Parent menu item (if nested)
weight: # Sort order (1-99, 101-199, 201-299...)
```

## Complete Field Reference

```yaml
title: # Title of the page used in the page's h1
@@ -44,29 +55,31 @@ updated_in: # Product and version the referenced feature was updated in (display
source: # Specify a file to pull page content from (typically in /content/shared/)
```

#### Title usage
## Field Usage Details

##### `title`
### Title Fields

#### `title`

The `title` frontmatter populates each page's HTML `h1` heading tag.
It shouldn't be overly long, but should set the context for users coming from outside sources.

##### `seotitle`
#### `seotitle`

The `seotitle` frontmatter populates each page's HTML `title` attribute.
Search engines use this in search results (not the page's h1), so it should be keyword optimized.

##### `list_title`
#### `list_title`

The `list_title` frontmatter determines an article's title when it appears in a list generated
by the [`{{< children >}}` shortcode](#generate-a-list-of-children-articles).

##### `menu > name`
#### `menu > name`

The `name` attribute under the `menu` frontmatter determines the text used in each page's link in the site navigation.
It should be short and assume the context of its parent if it has one.

#### Page Weights
### Page Weights

To ensure pages are sorted both by weight and their depth in the directory
structure, pages should be weighted in "levels."

@@ -76,7 +89,7 @@ Then 201-299 and so on.

_**Note:** `_index.md` files should be weighted one level up from the other `.md` files in the same directory._
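As an illustrative sketch of this leveling scheme (file paths and weight values below are hypothetical):

```yaml
# /content/influxdb/v2/write-data/_index.md (level 1 -- one level up from its children)
weight: 3

# /content/influxdb/v2/write-data/best-practices.md (level 2)
weight: 101
```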
#### Related content
### Related Content

Use the `related` frontmatter to include links to specific articles at the bottom of an article.

@@ -95,7 +108,7 @@ related:
  - https://influxdata.com, This is an external link
```

#### Canonical URLs
### Canonical URLs

Search engines use canonical URLs to accurately rank pages with similar or identical content.
The `canonical` HTML meta tag identifies which page should be used as the source of truth.
@@ -115,7 +128,7 @@ canonical: /path/to/canonical/doc/
canonical: /{{< latest "influxdb" "v2" >}}/path/to/canonical/doc/
```

#### v2 equivalent documentation
### v2 Equivalent Documentation

To display a notice on a 1.x page that links to an equivalent 2.0 page,
add the following frontmatter to the 1.x page:

@@ -124,7 +137,7 @@ add the following frontmatter to the 1.x page:
v2: /influxdb/v2.0/get-started/
```

#### Alternative links for cross-product navigation
### Alternative Links (alt_links)

Use the `alt_links` frontmatter to specify equivalent pages in other InfluxDB products,
for example, when a page exists at a different path in a different version or if

@@ -147,7 +160,7 @@ Supported product keys for InfluxDB 3:
- `cloud-dedicated`
- `clustered`
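For example, a hypothetical `alt_links` mapping using these product keys (the paths are placeholders, not real pages):

```yaml
alt_links:
  cloud-dedicated: /influxdb3/cloud-dedicated/write-data/
  clustered: /influxdb3/clustered/write-data/
```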
#### Prepend and append content to a page
### Prepend and Append

Use the `prepend` and `append` frontmatter to add content to the top or bottom of a page.
Each has the following fields:

@@ -170,7 +183,7 @@ cascade:
> This is just an example note block that gets appended to the article.
```

#### Cascade
### Cascade

To automatically apply frontmatter to a page and all of its children, use the
[`cascade` frontmatter](https://gohugo.io/content-management/front-matter/#front-matter-cascade)
@@ -187,12 +200,75 @@ cascade:
those frontmatter keys. Frontmatter defined on the page overrides frontmatter
"cascaded" from a parent.

#### Use shared content in a page
### Source

This repository makes heavy use of shared content to avoid duplication across InfluxDB editions and versions.
Use the `source` frontmatter to specify a shared file to populate the
page content. Shared files are typically stored in the `/content/shared` directory.
page content. Shared files are typically stored in the `/content/shared` directory. To source files, include the absolute path from the `/content` directory--for example, for a file located at `/content/shared/influxdb3-admin/databases/_index.md`, use the following frontmatter:

```yaml
source: /shared/influxdb3-admin/databases/_index.md
```

When building shared content, use the `show-in` and `hide-in` shortcodes to show
or hide blocks of content based on the current InfluxDB product/version.
For more information, see [show-in](#show-in) and [hide-in](#hide-in).
For more information, see [show-in](DOCS-SHORTCODES.md#show-in) and [hide-in](DOCS-SHORTCODES.md#hide-in).

#### Links in shared content

When creating links in shared content files, you can use the `version` keyword, which gets replaced during the build process with the appropriate product version.

**Use this in shared content:**
```markdown
[Configuration options](/influxdb3/version/reference/config-options/)
[CLI serve command](/influxdb3/version/reference/cli/influxdb3/serve/)
```

**Not this:**
```markdown
[Configuration options](/influxdb3/{{% product-key %}}/reference/config-options/)
[CLI serve command](/influxdb3/{{% product-key %}}/reference/cli/influxdb3/serve/)
```

Don't list links to related content at the bottom of shared content files.
Instead, add the `related` frontmatter to the individual pages that use the shared content.

### Validations for shared content

If you edit shared content files, the link and style checks configured for the repository run on all files that use that shared content.

## Children Shortcode Specific Frontmatter

The following frontmatter fields are used specifically with the `{{< children >}}` shortcode
to control how pages appear in generated lists:

### `list_title`

Title used in article lists generated using the `{{< children >}}` shortcode.

### `external_url`

Used in `children` shortcode `type="list"` for page links that are external.

### `list_image`

Image included with article descriptions in `children type="articles"` shortcode.

### `list_note`

Used in `children` shortcode `type="list"` to add a small note next to listed links.

### `list_code_example`

Code example included with article descriptions in `children type="articles"` shortcode.

### `list_query_example`

Code examples included with article descriptions in `children type="articles"` shortcode.
References examples in `data/query_examples`.
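A hypothetical page combining several of these fields might look like this (values are illustrative only):

```yaml
list_title: Query data with SQL
list_note: (Recommended)
external_url: https://example.com/external-tool/
```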

## Additional Resources

- **Shortcodes Reference**: See `/.github/instructions/shortcodes-reference.instructions.md`
- **Contributing Guide**: See `/.github/instructions/contributing.instructions.md`
- **Style Guidelines**: Follow Google Developer Documentation style guidelines

File diff suppressed because it is too large
@@ -431,7 +431,7 @@ LEFTHOOK=0 git commit
yarn test:e2e

# Run specific E2E specs
node cypress/support/run-e2e-specs.js --spec "cypress/e2e/content/article-links.cy.js"
node cypress/support/run-e2e-specs.js --spec "cypress/e2e/content/index.cy.js"
```

### JavaScript Testing and Debugging
@@ -1,30 +1,65 @@
When a user asks a question and doesn't include a product from the list below, ask them which product in the list they are using, along with the version and query language:
<!-- This file is auto-generated from data/products.yml. Do not edit directly. -->
<!-- Run 'npm run build:agent:instructions' to regenerate this file. -->

InfluxDB OSS 1.x (v1)
- Documentation: https://docs.influxdata.com/influxdb/v1/
- Query languages: v1.8+ supports InfluxQL and Flux
- Clients: Telegraf, influx CLI, v1 client libraries
InfluxDB Enterprise (v1)
- Documentation: https://docs.influxdata.com/enterprise_influxdb/v1/
- Query languages: v1.8+ supports InfluxQL and Flux
- Clients: Telegraf, influx CLI, v1 client libraries
InfluxDB OSS 2.x (v2)
Use the following information to help determine which InfluxDB version and product the user is asking about:

InfluxDB OSS v2:
- Documentation: https://docs.influxdata.com/influxdb/v2/
- Query languages: InfluxQL and Flux
- Clients: Telegraf, influx CLI, v2 client libraries
InfluxDB Cloud (v2, multi-tenant)
- Clients: Telegraf, influx CLI, v1/v2 client libraries

InfluxDB OSS v1:
- Documentation: https://docs.influxdata.com/influxdb/v1/
- Query languages: InfluxQL and Flux
- Clients: Telegraf, influx CLI, v1/v2 client libraries

InfluxDB Enterprise:
- Documentation: https://docs.influxdata.com/enterprise_influxdb/v1.12/
- Query languages: InfluxQL and Flux
- Clients: Telegraf, influx CLI, v1/v2 client libraries

InfluxDB Cloud (TSM):
- Documentation: https://docs.influxdata.com/influxdb/cloud/
- Query languages: InfluxQL and Flux
- Clients: Telegraf, influx CLI, v2 client libraries
InfluxDB Clustered (v3, 3.0, self-managed distributed)
- Documentation: https://docs.influxdata.com/influxdb/clustered/

InfluxDB Cloud Serverless:
- Documentation: https://docs.influxdata.com/influxdb3/cloud-serverless/
- Query languages: SQL, InfluxQL, and Flux
- Clients: Telegraf, influxctl CLI, v3 client libraries

InfluxDB Cloud Dedicated:
- Documentation: https://docs.influxdata.com/influxdb3/cloud-dedicated/
- Query languages: SQL and InfluxQL
- Clients: Telegraf, influxctl CLI, v3 client libraries
InfluxDB Cloud Dedicated (3.0, v3, InfluxData-managed single tenant)
- Documentation: https://docs.influxdata.com/influxdb/cloud-dedicated/

InfluxDB Clustered:
- Documentation: https://docs.influxdata.com/influxdb3/clustered/
- Query languages: SQL and InfluxQL
- Clients: Telegraf, influxctl CLI, v3 client libraries
InfluxDB Cloud Serverless (v3, 3.0, InfluxData-managed multi-tenant)
- Documentation: https://docs.influxdata.com/influxdb/clustered/

InfluxDB 3 Core:
- Documentation: https://docs.influxdata.com/influxdb3/core/
- Query languages: SQL and InfluxQL
- Clients: Telegraf, influx CLI, v3 client libraries
- Clients: Telegraf, influxdb3 CLI, v3 client libraries, InfluxDB 3 Explorer

InfluxDB 3 Enterprise:
- Documentation: https://docs.influxdata.com/influxdb3/enterprise/
- Query languages: SQL and InfluxQL
- Clients: Telegraf, influxdb3 CLI, v3 client libraries, InfluxDB 3 Explorer

InfluxDB 3 Explorer:
- Documentation: https://docs.influxdata.com/influxdb3/explorer/

Telegraf:
- Documentation: https://docs.influxdata.com/telegraf/v1.36/

Chronograf:
- Documentation: https://docs.influxdata.com/chronograf/v1.10/

Kapacitor:
- Documentation: https://docs.influxdata.com/kapacitor/v1.8/

Flux:
- Documentation: https://docs.influxdata.com/flux/v0.x/

22 README.md
@@ -9,11 +9,27 @@ This repository contains the InfluxDB 2.x documentation published at [docs.influ
## Contributing

We welcome and encourage community contributions.
For information about contributing to the InfluxData documentation, see [Contribution guidelines](CONTRIBUTING.md).
For information about contributing to the InfluxData documentation, see [Contribution guidelines](DOCS-CONTRIBUTING.md).

## Testing

For information about testing the documentation, including code block testing, link validation, and style linting, see [Testing guide](TESTING.md).
For information about testing the documentation, including code block testing, link validation, and style linting, see [Testing guide](DOCS-TESTING.md).

## Documentation

Comprehensive reference documentation for contributors:

- **[Contributing Guide](DOCS-CONTRIBUTING.md)** - Workflow and contribution guidelines
- **[Shortcodes Reference](DOCS-SHORTCODES.md)** - Complete Hugo shortcode documentation
  - [Working examples](content/example.md) - Test shortcodes in the browser
- **[Frontmatter Reference](DOCS-FRONTMATTER.md)** - Complete page metadata documentation
- **[Testing Guide](DOCS-TESTING.md)** - Testing procedures and requirements
- **[API Documentation](api-docs/README.md)** - API reference generation

### Quick Links
- [Style guidelines](DOCS-CONTRIBUTING.md#style-guidelines)
- [Commit guidelines](DOCS-CONTRIBUTING.md#commit-guidelines)
- [Code block testing](DOCS-TESTING.md#code-block-testing)

## Reporting a Vulnerability

@@ -42,7 +58,7 @@ including our GPG key, can be found at https://www.influxdata.com/how-to-report-
yarn install
```

_**Note:** The most recent version of Hugo tested with this documentation is **0.123.8**._
_**Note:** The most recent version of Hugo tested with this documentation is **0.149.0**._

3. To generate the API docs, see [api-docs/README.md](api-docs/README.md).
@@ -0,0 +1,2 @@
# API reference documentation instructions
See @.github/instructions/api-docs.instructions.md for the complete API reference docs editing guidelines and instructions for generating pages locally.

@@ -0,0 +1,4 @@
## JavaScript, TypeScript, and CSS in the documentation UI

See @.github/instructions/assets.instructions.md for the complete JavaScript, TypeScript, and SASS (CSS) development guidelines.
@@ -34,7 +34,8 @@ export function InfluxDBUrl() {
  const elementSelector = '.article--content pre:not(.preserve)';

  ///////////////////// Stored preference management ///////////////////////
  // Retrieve the user's InfluxDB preference (cloud or oss) from the influxdb_pref local storage key. Default is cloud.
  // Retrieve the user's InfluxDB preference (cloud or oss) from the
  // influxdb_pref local storage key. Default is cloud.
  function getURLPreference() {
    return getPreference('influxdb_url');
  }

@@ -100,9 +101,9 @@ export function InfluxDBUrl() {
    removeInfluxDBUrl(product);
  }

  ////////////////////////////////////////////////////////////////////////////////
  //////////////////////// InfluxDB URL utility functions ////////////////////////
  ////////////////////////////////////////////////////////////////////////////////
  //////////////////////////////////////////////////////////////////////////////
  /////////////////////// InfluxDB URL utility functions ///////////////////////
  //////////////////////////////////////////////////////////////////////////////

  // Preserve URLs in codeblocks that come just after or are inside a div
  // with the class, .keep-url

@@ -110,7 +111,8 @@ export function InfluxDBUrl() {
    $('.keep-url').each(function () {
      // For code blocks with no syntax highlighting
      $(this).next('pre').addClass('preserve');
      // For code blocks with no syntax highlighting inside of a link (API endpoint blocks)
      // For code blocks with no syntax highlighting inside of a link
      // (API endpoint blocks)
      $(this).next('a').find('pre').addClass('preserve');
      // For code blocks with syntax highlighting
      $(this).next('.highlight').find('pre').addClass('preserve');

@@ -127,8 +129,8 @@ export function InfluxDBUrl() {
    return { oss, cloud, core, enterprise, serverless, dedicated, clustered };
  }

  // Retrieve the previously selected URLs from the from the urls local storage object.
  // This is used to update URLs whenever you switch between browser tabs.
  // Retrieve the previously selected URLs from the urls local storage
  // object. This updates URLs when switching between browser tabs.
  function getPrevUrls() {
    const {
      prev_cloud: cloud,

@@ -291,7 +293,7 @@ export function InfluxDBUrl() {
    });
  }

  // Append the URL selector button to each codeblock containing a placeholder URL
  // Append the URL selector button to codeblocks that contain a placeholder URL
  function appendUrlSelector(
    urls = {
      cloud: '',
@@ -320,19 +322,29 @@ export function InfluxDBUrl() {
      return contextText[context];
    };

    appendToUrls.forEach(function (url) {
      $(elementSelector).each(function () {
        var code = $(this).html();
        if (code.includes(url)) {
          $(this).after(
            "<div class='select-url'><a class='url-trigger' href='#'>" +
              getBtnText(PRODUCT_CONTEXT) +
              '</a></div>'
          );
          $('.select-url').fadeIn(400);
        }
    // Process each code block only once
    $(elementSelector).each(function () {
      var code = $(this).html();
      var $codeBlock = $(this);

      // Check if this code block contains any of the URLs
      var containsUrl = appendToUrls.some(function (url) {
        return url && code.includes(url);
      });

      // If the code block contains at least one URL and doesn't already have a
      // URL selector button
      if (containsUrl && !$codeBlock.next('.select-url').length) {
        $codeBlock.after(
          "<div class='select-url'><a class='url-trigger' href='#'>" +
            getBtnText(PRODUCT_CONTEXT) +
            '</a></div>'
        );
      }
    });

    // Fade in all select-url elements after they've been added
    $('.select-url').fadeIn(400);
  }

  ////////////////////////////////////////////////////////////////////////////

@@ -365,9 +377,9 @@ export function InfluxDBUrl() {
  // Set active radio button on page load
  setRadioButtons(getUrls());

  ////////////////////////////////////////////////////////////////////////////////
  ////////////////////////// Modal window interactions ///////////////////////////
  ////////////////////////////////////////////////////////////////////////////////
  //////////////////////////////////////////////////////////////////////////////
  ///////////////////////// Modal window interactions //////////////////////////
  //////////////////////////////////////////////////////////////////////////////

  // General modal window interactions are controlled in modals.js
@@ -513,12 +525,18 @@ export function InfluxDBUrl() {
  // Toggled preferred service on load
  showPreference();

  ////////////////////////////////////////////////////////////////////////////////
  ///////////////////////////////// Custom URLs //////////////////////////////////
  ////////////////////////////////////////////////////////////////////////////////
  //////////////////////////////////////////////////////////////////////////////
  //////////////////////////////// Custom URLs /////////////////////////////////
  //////////////////////////////////////////////////////////////////////////////

  // Validate custom URLs
  function validateUrl(url) {
    /** Match 3 possible types of hosts:
     * Named host = (unreserved | pct-encoded | sub-delims)+
     * IPv6 host = \[([a-f0-9:.]+)\]
     * IPvFuture host = \[v[a-f0-9][a-z0-9\-._~%!$&'()*+,;=:]+\]
     * Port = [0-9]+
     */
    /** validDomain = (Named host | IPv6 host | IPvFuture host)(:Port)? **/
    const validDomain = new RegExp(
      `([a-z0-9\-._~%]+` +
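The full expression is truncated in this diff, but the host-matching rules described in the comment can be sketched as follows (a simplified stand-in, not the exact regex from `influxdb-url.js`):

```javascript
// Simplified sketch of "host[:port]" validation per the comment above:
// a named host, an IPv6 literal, or an IPvFuture literal, optionally
// followed by a numeric port. Not the exact regex from influxdb-url.js.
const namedHost = /^[a-z0-9\-._~%!$&'()*+,;=]+$/i;
const ipv6Host = /^\[[a-f0-9:.]+\]$/i;
const ipvFutureHost = /^\[v[a-f0-9][a-z0-9\-._~%!$&'()*+,;=:]+\]$/i;

function isValidDomain(domain) {
  // Split an optional trailing ":<digits>" port off the host.
  const match = domain.match(/^(.*?)(:[0-9]+)?$/);
  const host = match[1];
  return namedHost.test(host) || ipv6Host.test(host) || ipvFutureHost.test(host);
}
```

As the original comment notes, plain `URL` parsing can't be used here because a `host:port` string is indistinguishable from a `scheme:` prefix, hence the explicit regex approach.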
@@ -536,7 +554,7 @@ export function InfluxDBUrl() {
    const domain = url.replace(protocol, '');

    // First use the regex to check for an HTTP protocol and valid domain
    // --JS URL validation can't differentiate host:port string from a protocol.
    // JS URL validation can't differentiate host:port string from a protocol.
    if (validProtocol.test(protocol) == false) {
      return { valid: false, error: 'Invalid protocol, use http[s]' };
    } else if (validDomain.test(domain) == false) {

@@ -598,7 +616,9 @@ export function InfluxDBUrl() {
      removeCustomUrl();
      hideValidationMessage();
      $(
        `input[name="influxdb-${PRODUCT_CONTEXT}-url"][value="${DEFAULT_STORAGE_URLS[PRODUCT_CONTEXT]}"]`
        `input[name="influxdb-${PRODUCT_CONTEXT}-url"][value="` +
          DEFAULT_STORAGE_URLS[PRODUCT_CONTEXT] +
          '"]'
      ).trigger('click');
    }
  }

@@ -659,7 +679,7 @@ export function InfluxDBUrl() {
    '#clustered-url-field',
  ].join();

  // Store the custom InfluxDB URL or product-specific URL when exiting the field
  // Store the custom InfluxDB URL or product-specific URL when exiting fields
  $(urlValueElements).blur(function () {
    !['dedicated', 'clustered'].includes(PRODUCT_CONTEXT)
      ? applyCustomUrl()

@@ -694,9 +714,9 @@ export function InfluxDBUrl() {
    $(`#${productEl}-url-field`).val(productUrlCookie);
  });

  ////////////////////////////////////////////////////////////////////////////////
  /////////////////////////// Dynamically update URLs ////////////////////////////
  ////////////////////////////////////////////////////////////////////////////////
  //////////////////////////////////////////////////////////////////////////////
  ////////////////////////// Dynamically update URLs ///////////////////////////
  //////////////////////////////////////////////////////////////////////////////

  // Check if the referrerHost is one of the cloud URLs
  // cloudUrls is built dynamically in layouts/partials/footer/javascript.html

File diff suppressed because it is too large
@@ -35,6 +35,7 @@ import DocSearch from './components/doc-search.js';
import FeatureCallout from './feature-callouts.js';
import FluxGroupKeysDemo from './flux-group-keys.js';
import FluxInfluxDBVersionsTrigger from './flux-influxdb-versions.js';
import InfluxDBVersionDetector from './influxdb-version-detector.ts';
import KeyBinding from './keybindings.js';
import ListFilters from './list-filters.js';
import ProductSelector from './version-selector.js';

@@ -64,6 +65,7 @@ const componentRegistry = {
  'feature-callout': FeatureCallout,
  'flux-group-keys-demo': FluxGroupKeysDemo,
  'flux-influxdb-versions-trigger': FluxInfluxDBVersionsTrigger,
  'influxdb-version-detector': InfluxDBVersionDetector,
  keybinding: KeyBinding,
  'list-filters': ListFilters,
  'product-selector': ProductSelector,

@@ -113,7 +115,10 @@ function initComponents(globals) {
    if (ComponentConstructor) {
      // Initialize the component and store its instance in the global namespace
      try {
        const instance = ComponentConstructor({ component });
        // Prepare component options
        const options = { component };

        const instance = ComponentConstructor(options);
        globals[componentName] = ComponentConstructor;

        // Optionally store component instances for future reference
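The pattern in this hunk--look up a constructor in `componentRegistry`, call it with `{ component }`, and record the constructor on a globals object--can be sketched DOM-free as follows (names follow the diff; the error handling and return shape are assumptions):

```javascript
// Minimal, DOM-free sketch of the data-component initialization pattern.
const componentRegistry = {
  // A component constructor receives { component } and returns an instance.
  'list-filters': ({ component }) => ({ name: 'list-filters', el: component }),
};

function initComponents(elements, globals = {}) {
  const instances = [];
  for (const component of elements) {
    const componentName = component.dataset.component;
    const ComponentConstructor = componentRegistry[componentName];
    if (!ComponentConstructor) continue;
    try {
      // Prepare component options, initialize, and record the constructor.
      const options = { component };
      instances.push(ComponentConstructor(options));
      globals[componentName] = ComponentConstructor;
    } catch (err) {
      console.error(`Failed to initialize ${componentName}:`, err);
    }
  }
  return { instances, globals };
}
```

Wrapping each construction in `try`/`catch`, as the diff does, keeps one failing component from blocking the rest of the page's components.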
@@ -0,0 +1,647 @@
// InfluxDB Version Detector Component Styles

.influxdb-version-detector {
  // CSS Custom Properties
  --transition-fast: 0.2s ease;
  --transition-normal: 0.3s ease;
  --spacing-sm: 0.625rem;
  --spacing-md: 1.25rem;

  margin: 2rem auto;

  .detector-title {
    color: $article-heading;
    margin-bottom: 0.625rem;
    font-size: 1.8em;
    font-weight: 600;
  }

  .detector-subtitle {
    color: $article-text;
    margin-bottom: 1.875rem;
    font-size: 0.95em;
    opacity: 0.8;
  }

  // Progress bar
  .progress {
    margin-bottom: 1.5625rem;
    height: 6px;
    background: $article-hr;
    border-radius: 3px;
    overflow: hidden;

    .progress-bar {
      height: 100%;
      background: $article-link;
      transition: width var(--transition-normal);
    }
  }

  // Question container
  .question-container {
    min-height: 150px;

    .question {
      display: none;
      animation: fadeIn var(--transition-normal);

      &.active {
        display: block;
      }

      .question-text {
        font-size: 1.1em;
        color: $article-heading;
        margin-bottom: 1.25rem;
        font-weight: 500;
      }
    }
  }

  // Buttons - Base styles and variants
  %button-base {
    border: none;
    border-radius: var(--border-radius);
    cursor: pointer;
    transition: all var(--transition-fast);
    font-family: inherit;

    &:focus {
      outline: 2px solid $article-link;
      outline-offset: 2px;
    }
  }

  .option-button {
    @extend %button-base;
    display: block;
    width: 100%;
    text-align: left;
    margin-bottom: 0.75rem;
    padding: 0.875rem 1.125rem;
    background: $article-bg;
    color: $article-text;
    border: 2px solid $article-hr;
    font-size: 15px;

    &:hover {
      border-color: $article-link;
      background: $article-bg;
      transform: translateX(3px);
    }

    &:active {
      transform: translateX(1px);
    }
  }

  .submit-button {
    @extend %button-base;
    background: $article-link;
    color: $g20-white;
    padding: 0.75rem 1.5rem;
    font-size: 15px;
    font-weight: 500;

    &:hover {
      background: $b-ocean;
      color: $g20-white;
    }

    &:disabled {
      background: $g8-storm;
      cursor: not-allowed;
    }
  }

  .back-button {
    @extend %button-base;
    background: $g8-storm;
    color: $g20-white;
    padding: var(--spacing-sm) var(--spacing-md);
    font-size: 14px;
    margin-right: var(--spacing-sm);

    &:hover {
      background: $g9-mountain;
    }
  }

  .restart-button {
    @extend .back-button;
    margin-top: var(--spacing-md);
    margin-right: 0;
  }

  // Input fields
  %input-base {
    width: 100%;
    border: 2px solid $article-hr;
    border-radius: var(--border-radius);
    transition: border-color var(--transition-fast);
    background: $article-bg;
    color: $article-text;

    &:focus {
      outline: none;
      border-color: $article-link;
    }
  }

  .input-group {
    margin-bottom: var(--spacing-md);

    label {
      display: block;
      margin-bottom: 0.5rem;
      color: $article-text;
      font-weight: 500;
    }

    input {
      @extend %input-base;
      padding: 0.75rem;
      font-size: 14px;
    }
  }

  textarea {
    @extend %input-base;
    padding: var(--spacing-sm);
    font-family: var(--font-mono, 'Courier New', monospace);
    font-size: 12px;
    resize: vertical;
    min-height: 120px;

    &::placeholder {
      color: rgba($article-text, 0.6);
      opacity: 1; // Firefox fix
    }

    &::-webkit-input-placeholder {
      color: rgba($article-text, 0.6);
    }

    &::-moz-placeholder {
      color: rgba($article-text, 0.6);
      opacity: 1;
    }

    &:-ms-input-placeholder {
      color: rgba($article-text, 0.6);
    }
  }

  // Code block - match site standards
  .code-block {
    background: $article-code-bg;
    color: $article-code;
    padding: 1.75rem 1.75rem 1.25rem;
    border-radius: $radius;
    font-family: $code;
    font-size: 1rem;
    margin: 2rem 0 2.25rem;
    overflow-x: scroll;
    overflow-y: hidden;
    line-height: 1.7rem;
    white-space: pre;
  }

  // URL pattern hint
  .url-pattern-hint {
    margin-bottom: var(--spacing-sm);
    padding: var(--spacing-sm);
    background: $article-note-base;
    border: 1px solid $article-note-base;
    border-radius: var(--border-radius);
    color: $article-note-text;
    font-size: 13px;
  }

  // URL suggestions
  .url-suggestions {
    margin-bottom: var(--spacing-md);

    .suggestions-header {
      color: $article-heading;
      margin-bottom: var(--spacing-sm);
      font-size: 14px;
    }

    .suggestion-button {
      @extend %button-base;
      display: block;
      width: 100%;
      text-align: left;
      margin-bottom: var(--spacing-sm);
      padding: var(--spacing-sm);
      background: $article-bg;
      border: 1px solid $article-hr;

      &:hover {
        border-color: $article-link;
        background: $article-bg;
      }

      .suggestion-url {
        font-family: var(--font-mono, 'Courier New', monospace);
        font-size: 13px;
        color: $article-link;
        margin-bottom: 2px;
      }

      .suggestion-product {
        font-size: 12px;
        color: $article-text;
        opacity: 0.8;
      }

      .suggestion-pattern {
        font-size: 11px;
        color: $article-link;
        font-style: italic;
        margin-top: 2px;
      }
    }
  }

  // Results
  .result {
    display: none;
    margin-top: var(--spacing-sm);
    padding: var(--spacing-md);
    border-radius: var(--border-radius);
    animation: fadeIn var(--transition-normal);

    &.show {
      display: block;
    }

    &.success {
      background: $article-bg;
      border-left: 3px solid $article-note-base;
      color: $article-text;
    }

    &.error {
      background: $r-flan;
      border-left: 3px solid $article-caution-base;
      color: $r-basalt;
    }

    &.info {
      background: $article-note-base;
      border-left: 3px solid $article-note-base;
      color: $article-note-text;
    }

    &.warning {
      background: $article-warning-bg;
      border-left: 3px solid $article-warning-base;
      color: $article-warning-text;
    }
  }

  .detected-version {
    font-size: 1.3em;
    font-weight: bold;
    color: $article-link;
    margin-bottom: var(--spacing-sm);
    padding: var(--spacing-sm);
    background: rgba($article-link, 0.1);
    border-radius: 4px;
    border-left: 4px solid $article-link;
  }

  // URL pre-filled indicator
  .url-prefilled-indicator {
    font-size: 0.85em;
    color: $article-note-text;
    margin-bottom: 8px;
    padding: 4px 8px;
    background: rgba($article-link, 0.1);
    border-left: 3px solid $article-link;
  }

  // Loading animation
  .loading {
    display: inline-block;
    margin-left: var(--spacing-sm);

    &:after {
      content: '...';
      animation: dots 1.5s steps(4, end) infinite;
    }
  }

  @keyframes dots {
    0%, 20% {
      content: '.';
    }
    40% {
      content: '..';
    }
    60%, 100% {
      content: '...';
    }
  }

  @keyframes fadeIn {
    from {
      opacity: 0;
      transform: translateY(10px);
    }
    to {
      opacity: 1;
      transform: translateY(0);
    }
  }

  // Responsive design
  @media (max-width: 768px) {
    padding: 1.5rem;

    .detector-title {
      font-size: 1.5em;
    }

    .option-button {
      padding: 0.75rem 1rem;
      font-size: 14px;
    }

    .submit-button,
    .back-button {
      padding: var(--spacing-sm) var(--spacing-md);
      font-size: 14px;
    }
  }

  @media (max-width: 480px) {
|
||||
padding: 1rem;
|
||||
|
||||
.detector-title {
|
||||
font-size: 1.3em;
|
||||
}
|
||||
|
||||
.detector-subtitle {
|
||||
font-size: 0.9em;
|
||||
}
|
||||
|
||||
.question-text {
|
||||
font-size: 1em;
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
// Product ranking results
|
||||
.product-ranking {
|
||||
margin-bottom: var(--spacing-sm);
|
||||
padding: 0.75rem;
|
||||
border-radius: var(--border-radius);
|
||||
border-left: 4px solid $article-hr;
|
||||
background: $article-bg;
|
||||
|
||||
&.top-result {
|
||||
background: rgba($article-link, 0.1);
|
||||
border-color: $article-link;
|
||||
}
|
||||
|
||||
.product-title {
|
||||
font-weight: 600;
|
||||
margin-bottom: 0.25rem;
|
||||
}
|
||||
|
||||
.most-likely-label {
|
||||
color: $article-link;
|
||||
font-size: 0.9em;
|
||||
margin-left: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.product-details {
|
||||
color: $article-text;
|
||||
font-size: 0.9em;
|
||||
margin-top: 0.25rem;
|
||||
opacity: 0.8;
|
||||
}
|
||||
}
|
||||
|
||||
// Grafana networking tips
|
||||
.grafana-tips {
|
||||
margin-top: var(--spacing-md);
|
||||
padding: 1rem;
|
||||
background: rgba($article-link, 0.1);
|
||||
border-left: 4px solid $article-link;
|
||||
border-radius: var(--border-radius);
|
||||
|
||||
.tips-title {
|
||||
margin: 0 0 var(--spacing-sm) 0;
|
||||
color: $article-link;
|
||||
font-size: 1.1em;
|
||||
}
|
||||
|
||||
.tips-description {
|
||||
margin: 0 0 var(--spacing-sm) 0;
|
||||
font-size: 0.9em;
|
||||
}
|
||||
|
||||
.tips-list {
|
||||
margin: 0;
|
||||
padding-left: 1.25rem;
|
||||
font-size: 0.85em;
|
||||
|
||||
li {
|
||||
margin-bottom: 0.25rem;
|
||||
}
|
||||
|
||||
code {
|
||||
background: rgba($article-link, 0.15);
|
||||
padding: 0.125rem 0.25rem;
|
||||
border-radius: 3px;
|
||||
font-size: 0.9em;
|
||||
}
|
||||
}
|
||||
|
||||
.tips-link {
|
||||
margin: var(--spacing-sm) 0 0 0;
|
||||
font-size: 0.85em;
|
||||
}
|
||||
}
|
||||
|
||||
// Expected results section
|
||||
.expected-results {
|
||||
margin: 1rem 0;
|
||||
|
||||
.results-title {
|
||||
font-weight: 600;
|
||||
margin-bottom: 0.5rem;
|
||||
}
|
||||
|
||||
.results-list {
|
||||
margin: 0;
|
||||
padding-left: 1rem;
|
||||
font-size: 0.9em;
|
||||
|
||||
li {
|
||||
margin-bottom: 0.25rem;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Question text styling
|
||||
.question-text-spaced {
|
||||
margin-top: 1rem;
|
||||
font-weight: normal;
|
||||
font-size: 0.95em;
|
||||
}
|
||||
|
||||
.question-options {
|
||||
margin-top: 1rem;
|
||||
}
|
||||
|
||||
// Command help section
|
||||
.command-help {
|
||||
margin-top: var(--spacing-md);
|
||||
}
|
||||
|
||||
// Grafana links styling
|
||||
.grafana-link {
|
||||
color: $article-link;
|
||||
text-decoration: underline;
|
||||
|
||||
&:hover {
|
||||
color: $article-link-hover;
|
||||
}
|
||||
}
|
||||
|
||||
// Manual command output
|
||||
.manual-output {
|
||||
margin: 1rem 0;
|
||||
padding: var(--spacing-sm);
|
||||
background: $article-bg;
|
||||
border-left: 4px solid $article-link;
|
||||
border-radius: var(--border-radius);
|
||||
}
|
||||
|
||||
// Action section with buttons
|
||||
.action-section {
|
||||
margin-top: var(--spacing-md);
|
||||
}
|
||||
|
||||
// Quick Reference expandable section
|
||||
.quick-reference {
|
||||
margin-top: 2rem;
|
||||
|
||||
details {
|
||||
border: 1px solid $article-hr;
|
||||
border-radius: var(--border-radius);
|
||||
padding: 0.5rem;
|
||||
}
|
||||
|
||||
.reference-summary {
|
||||
cursor: pointer;
|
||||
font-weight: 600;
|
||||
padding: 0.5rem 0;
|
||||
user-select: none;
|
||||
color: $article-link;
|
||||
|
||||
&:hover {
|
||||
color: $article-link-hover;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Expandable summary styling (for Docker Commands, etc.)
|
||||
.expandable-summary {
|
||||
cursor: pointer;
|
||||
font-weight: 600;
|
||||
padding: 0.5rem 0;
|
||||
user-select: none;
|
||||
color: $article-link;
|
||||
position: relative;
|
||||
padding-left: 1.5rem; // Make room for custom icon
|
||||
|
||||
&:hover {
|
||||
color: $article-link-hover;
|
||||
}
|
||||
|
||||
// Hide the default disclosure triangle
|
||||
&::marker,
|
||||
&::-webkit-details-marker {
|
||||
display: none;
|
||||
}
|
||||
|
||||
// Add custom plus/minus icon
|
||||
&::before {
|
||||
content: '+';
|
||||
position: absolute;
|
||||
left: 0;
|
||||
top: 50%;
|
||||
transform: translateY(-50%);
|
||||
width: 1rem;
|
||||
height: 1rem;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
font-size: 14px;
|
||||
font-weight: bold;
|
||||
color: $article-link;
|
||||
border: 1px solid $article-link;
|
||||
border-radius: 3px;
|
||||
background: transparent;
|
||||
}
|
||||
|
||||
// Change to minus when expanded
|
||||
details[open] & {
|
||||
&::before {
|
||||
content: '−';
|
||||
}
|
||||
}
|
||||
|
||||
&:hover::before {
|
||||
color: $article-link-hover;
|
||||
border-color: $article-link-hover;
|
||||
}
|
||||
}
|
||||
|
||||
// Quick Reference expandable section
|
||||
.quick-reference {
|
||||
margin-top: 2rem;
|
||||
|
||||
details {
|
||||
border: 1px solid $article-hr;
|
||||
border-radius: var(--border-radius);
|
||||
padding: 0.5rem;
|
||||
}
|
||||
|
||||
.reference-table {
|
||||
margin-top: 1rem;
|
||||
width: 100%;
|
||||
border-collapse: collapse;
|
||||
font-size: 0.9em;
|
||||
|
||||
th, td {
|
||||
padding: 0.5rem;
|
||||
text-align: left;
|
||||
border: 1px solid $article-hr;
|
||||
}
|
||||
|
||||
th {
|
||||
padding: 0.75rem 0.5rem;
|
||||
background: rgba($article-link, 0.1);
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
tbody tr:nth-child(even) {
|
||||
background: rgba($article-text, 0.02);
|
||||
}
|
||||
|
||||
.product-name {
|
||||
font-weight: 600;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
|
@@ -189,7 +189,8 @@
}

@import "article/blocks",
@import "article/badges",
  "article/blocks",
  "article/buttons",
  "article/captions",
  "article/children",
@@ -26,7 +26,8 @@
.modal-body {
  position: relative;
  display: flex;
  overflow: hidden;
  overflow-y: auto;
  overflow-x: hidden;
  // width: 100%;
  max-width: 650px;
  max-height: 97.5vh;

@@ -37,6 +38,27 @@
  color: $article-text;
  font-size: 1rem;
  transition: margin .4s;
  scroll-behavior: smooth;
  -webkit-overflow-scrolling: touch; // iOS smooth scrolling

  // Custom scrollbar styling
  &::-webkit-scrollbar {
    width: 8px;
  }

  &::-webkit-scrollbar-track {
    background: rgba($article-hr, 0.2);
    border-radius: 4px;
  }

  &::-webkit-scrollbar-thumb {
    background: rgba($article-text, 0.3);
    border-radius: 4px;

    &:hover {
      background: rgba($article-text, 0.5);
    }
  }
}

&.open {

@@ -62,6 +84,7 @@
  overflow: visible;
  width: 586px;
  max-width: 100%;
  flex-shrink: 0;

  h3 {
    color: $article-heading;
@@ -31,6 +31,10 @@
  position: relative;
}

code {
  font-size: .85rem;
}

.notification-slug {
  font-size: .97rem;
  margin: -.5rem 0 1.5rem 0;
@@ -0,0 +1,17 @@
.badge {
  font-size: .7rem;
  margin: 0 .2rem;
  padding: .1rem .4rem;
  border-radius: .6rem;
  font-weight: bold;
  vertical-align: top;

  &.dvc {
    color: #2e7d2e;
    background-color: #e8f5e8;
  }
  &.lvc {
    color: #1976d2;
    background-color: #e3f2fd;
  }
}
@@ -99,5 +99,5 @@ a.btn {
li .url-trigger { padding: 0rem .5rem; }

.code-tab-content {
  .select-url{margin-top: -3.25rem}
  .select-url{margin-top: -3.15rem}
}
@@ -34,3 +34,6 @@
  "layouts/code-controls",
  "layouts/v3-wayfinding";

// Import Components
@import "components/influxdb-version-detector";
compose.yaml
@@ -369,6 +369,18 @@ services:
        target: /var/lib/influxdb3/plugins/custom
    secrets:
      - influxdb3-enterprise-admin-token
  influxdb3-explorer:
    container_name: influxdb3-explorer
    image: influxdata/influxdb3-ui:latest
    pull_policy: always
    ports:
      - 8888:80
      - 8889:8888
    command:
      - --mode=admin
    profiles:
      - explorer
      - influxdb3
  telegraf-pytest:
    container_name: telegraf-pytest
    image: influxdata/docs-pytest
@@ -0,0 +1,2 @@
# Frontmatter and Content Instructions
See @.github/instructions/content.instructions.md for the complete frontmatter reference and content guidelines.
@@ -11,12 +11,9 @@ alt_links:
  v1: /influxdb/v1/about_the_project/release-notes/
---

## v1.12.x {date="TBD"}
<span id="v1.12.x"></span>

> [!Important]
> #### Pre-release documentation
>
> This release is not yet available. [**v{{% latest-patch %}}**](#v1118) is the latest InfluxDB Enterprise v1 release.
## v1.12.2 {date="2025-09-15"}

> [!Important]
> #### Upgrade meta nodes first
@@ -24,61 +21,40 @@ alt_links:
> When upgrading to InfluxDB Enterprise 1.12.1+, upgrade meta nodes before
> upgrading data nodes.

## Features
### Features

- Add additional log output when using
  [`influx_inspect buildtsi`](/enterprise_influxdb/v1/tools/influx_inspect/#buildtsi) to
  rebuild the TSI index.
<!-- TODO: Uncomment with 1.12.x release:
- Use [`influx_inspect export`](/enterprise_influxdb/v1/tools/influx_inspect/#export) with
  [`-tsmfile` option](/enterprise_influxdb/v1/tools/influx_inspect/#--tsmfile-tsm_file-) to
  export a single TSM file.
-->
<!-- TODO: Remove with 1.12.x release: -->
- Use [`influx_inspect export`](/enterprise_influxdb/v1/tools/influx_inspect/#export) with
  `-tsmfile` option to
  export a single TSM file.
- Add `-m` flag to the [`influxd-ctl show-shards` command](/enterprise_influxdb/v1/tools/influxd-ctl/show-shards/)
  to output inconsistent shards.
- Allow the specification of a write window for retention policies.
- Add `fluxQueryRespBytes` metric to the `/debug/vars` metrics endpoint.
- Log whenever meta gossip times exceed expiration.
<!-- TODO: Uncomment with 1.12.x release:
- Add [`query-log-path` configuration option](/enterprise_influxdb/v1/administration/configure/config-data-nodes/#query-log-path)
  to data nodes.
- Add [`aggressive-points-per-block` configuration option](/influxdb/v1/administration/config/#aggressive-points-per-block)
  to prevent TSM files from not getting fully compacted.
-->
<!-- TODO: Remove with 1.12.x release: -->
- Add `query-log-path` configuration option to data nodes.
- Add `aggressive-points-per-block` configuration option to prevent TSM files from not getting fully compacted.
- Log TLS configuration settings on startup.
- Check for TLS certificate and private key permissions.
- Add a warning if the TLS certificate is expired.
- Add authentication to the Raft portal and add the following related _data_
  node configuration options:
  <!-- Uncomment with 1.12.x release
  - [`[meta].raft-portal-auth-required`](/enterprise_influxdb/v1/administration/configure/config-data-nodes/#raft-portal-auth-required)
  - [`[meta].raft-dialer-auth-required`](/enterprise_influxdb/v1/administration/configure/config-data-nodes/#raft-dialer-auth-required)
  -->
  <!-- TODO: Remove with 1.12.x release: -->
  - `[meta].raft-portal-auth-required`
  - `[meta].raft-dialer-auth-required`
- Improve error handling.
- InfluxQL updates:
  - Delete series by retention policy.

<!-- TODO: Uncomment with 1.12.x release:
- Allow retention policies to discard writes that fall within their range, but
  outside of [`FUTURE LIMIT`](/enterprise_influxdb/v1/query_language/manage-database/#future-limit)
  and [`PAST LIMIT`](/enterprise_influxdb/v1/query_language/manage-database/#past-limit).
-->
<!-- TODO: Remove with 1.12.x release: -->
- Allow retention policies to discard writes that fall within their range, but
  outside of `FUTURE LIMIT` and `PAST LIMIT`.

## Bug fixes
### Bug fixes

- Fixed SSH key usage for cloning PCL/HT.
- Log rejected writes to subscriptions.
- Update `xxhash` and avoid `stringtoslicebyte` in the cache.
- Prevent a panic when a shard group has no shards.
@@ -89,7 +65,7 @@ alt_links:
- Update the `/shard-status` API to return the correct result and use a
  consistent "idleness" definition for shards.

## Other
### Other

- Update Go to 1.23.5.
- Upgrade Flux to v0.196.1.
@@ -230,26 +206,24 @@ alt_links:

## v1.11.3 {date="2023-10-12"}

{{% warn %}}
#### Series file compaction on startup

With InfluxDB Enterprise v1.11.3, on startup, InfluxDB runs the
`influxd_inspect -compact-series-file` command to [compact series files](/enterprise_influxdb/v1/tools/influx_inspect/#--compact-series-file-) before data nodes are started.
Series files are stored in `_series` directories inside the
[InfluxDB data directory](/enterprise_influxdb/v1/concepts/file-system-layout/#data-node-file-system-layout). Default: `/var/lib/data/<db-name>/_series`

- InfluxDB Enterprise v1.11.4+ introduces a configuration setting to optionally
  compact series on startup.
- If any series files are corrupt, the `influx_inspect` or `influxd` processes on
  the data node may fail to start. In both cases, delete the series file
  directories before restarting the database. InfluxDB will automatically
  regenerate the deleted series files when the database is restarted.
- To check if series files are corrupt before starting the database, run the
  [`influx_inspect verify-seriesfile` command](/enterprise_influxdb/v1/tools/influx_inspect/#verify-seriesfile)
  while the database is off-line.
- If series files are large (20+ gigabytes), it may also be faster to delete the
  series file directories before starting the database.
{{% /warn %}}
> [!Important]
> #### Series file compaction on startup
>
> With InfluxDB Enterprise v1.11.3, on startup, InfluxDB runs the
> `influxd_inspect -compact-series-file` command to [compact series files](/enterprise_influxdb/v1/tools/influx_inspect/#--compact-series-file-) before data nodes are started.
> Series files are stored in `_series` directories inside the
> [InfluxDB data directory](/enterprise_influxdb/v1/concepts/file-system-layout/#data-node-file-system-layout). Default: `/var/lib/data/<db-name>/_series`
>
> - InfluxDB Enterprise v1.11.4+ introduces a configuration setting to optionally
>   compact series on startup.
> - If any series files are corrupt, the `influx_inspect` or `influxd` processes on
>   the data node may fail to start. In both cases, delete the series file directories and [rebuild the indexes](/enterprise_influxdb/v1/administration/upgrading/#rebuild-tsi-indexes) before restarting the database. InfluxDB automatically
>   regenerates the deleted series files when the database restarts.
> - To check if series files are corrupt before starting the database, run the
>   [`influx_inspect verify-seriesfile` command](/enterprise_influxdb/v1/tools/influx_inspect/#verify-seriesfile)
>   while the database is off-line.
> - If series files are large (20+ gigabytes), it may be faster to delete the
>   series file directories before starting the database.

### Bug Fixes

@@ -1181,7 +1155,8 @@ Please see the [InfluxDB OSS release notes](/influxdb/v1/about_the_project/relea

## v1.5.0 {date="2018-03-06"}

> ***Note:*** This release builds off of the 1.5 release of InfluxDB OSS. Please see the [InfluxDB OSS release
> [!Note]
> This release builds off of the 1.5 release of InfluxDB OSS. Please see the [InfluxDB OSS release
> notes](/influxdb/v1/about_the_project/release-notes/) for more information about the InfluxDB OSS release.

For highlights of the InfluxDB 1.5 release, see [What's new in InfluxDB 1.5](/influxdb/v1/about_the_project/whats_new/).
@@ -1679,9 +1679,10 @@ max-version = "tls1.3"
Default is `"tls1.3"`.

Minimum version of the TLS protocol that will be negotiated.
Valid values include: `tls1.0`, `tls1.1`, and `tls1.3`.
Valid values include: `tls1.0`, `tls1.1`, `tls1.2`, and `tls1.3`.
If not specified, `min-version` is the minimum TLS version specified in the [Go `crypto/tls` package](https://golang.org/pkg/crypto/tls/#pkg-constants).
In this example, `tls1.3` specifies the minimum version as TLS 1.3.

In the preceding example, `min-version = "tls1.3"` specifies the minimum version as TLS 1.3.

Environment variable: `INFLUXDB_TLS_MIN_VERSION`

@@ -1690,9 +1691,10 @@ Environment variable: `INFLUXDB_TLS_MIN_VERSION`
Default is `"tls1.3"`.

The maximum version of the TLS protocol that will be negotiated.
Valid values include: `tls1.0`, `tls1.1`, and `tls1.3`.
Valid values include: `tls1.0`, `tls1.1`, `tls1.2`, and `tls1.3`.
If not specified, `max-version` is the maximum TLS version specified in the [Go `crypto/tls` package](https://golang.org/pkg/crypto/tls/#pkg-constants).
In this example, `tls1.3` specifies the maximum version as TLS 1.3.

In the preceding example, `max-version = "tls1.3"` specifies the maximum version as TLS 1.3.

Environment variable: `INFLUXDB_TLS_MAX_VERSION`

@@ -20,8 +20,6 @@ Configure InfluxDB Enterprise to use LDAP (Lightweight Directory Access Protocol
- Synchronize InfluxDB and LDAP so each LDAP request doesn't need to be queried

{{% note %}}
LDAP **requires** JWT authentication. For more information, see [Configure authentication using JWT tokens](/enterprise_influxdb/v1/administration/configure/security/authentication/#configure-authentication-using-jwt-tokens).

To configure InfluxDB Enterprise to support LDAP, all users must be managed in the remote LDAP service. If LDAP is configured and enabled, users **must** authenticate through LDAP, including users who may have existed before enabling LDAP.
{{% /note %}}

@@ -44,9 +42,7 @@ Update the following settings in each data node configuration file (`/etc/influx

1. Under `[http]`, enable HTTP authentication by setting `auth-enabled` to `true`.
   (Or set the corresponding environment variable `INFLUXDB_HTTP_AUTH_ENABLED` to `true`.)
2. Configure the HTTP shared secret to validate requests using JSON web tokens (JWT) and sign each HTTP payload with the secret and username.
   Set the `[http]` configuration setting for `shared-secret`, or the corresponding environment variable `INFLUXDB_HTTP_SHARED_SECRET`.
3. If you're enabling authentication on meta nodes, you must also include the following configurations:
2. If you're enabling authentication on meta nodes, you must also include the following configurations:
   - `INFLUXDB_META_META_AUTH_ENABLED` environment variable, or `[http]` configuration setting `meta-auth-enabled`, is set to `true`.
     This value must be the same value as the meta node's `meta.auth-enabled` configuration.
   - `INFLUXDB_META_META_INTERNAL_SHARED_SECRET`,
@@ -6,6 +6,9 @@ menu:
    name: Monitor
    parent: Administration
weight: 50
aliases:
  - /enterprise_influxdb/v1/administration/monitor-enterprise/monitor-with-cloud/
  - /enterprise_influxdb/v1/administration/monitor/monitor-with-cloud/
---

Monitoring is the act of observing changes in data over time.
@@ -1,185 +0,0 @@
---
title: Monitor InfluxDB Enterprise with InfluxDB Cloud
description: >
  Monitor your InfluxDB Enterprise instance using InfluxDB Cloud and
  a pre-built InfluxDB template.
menu:
  enterprise_influxdb_v1:
    name: Monitor with Cloud
    parent: Monitor
weight: 100
aliases:
  - /enterprise_influxdb/v1/administration/monitor-enterprise/monitor-with-cloud/
---

Use [InfluxDB Cloud](/influxdb/cloud/), the [InfluxDB Enterprise 1.x Template](https://github.com/influxdata/community-templates/tree/master/influxdb-enterprise-1x), and Telegraf to monitor one or more InfluxDB Enterprise instances.

Do the following:

1. [Review requirements](#review-requirements)
2. [Install the InfluxDB Enterprise Monitoring template](#install-the-influxdb-enterprise-monitoring-template)
3. [Set up InfluxDB Enterprise for monitoring](#set-up-influxdb-enterprise-for-monitoring)
4. [Set up Telegraf](#set-up-telegraf)
5. [View the Monitoring dashboard](#view-the-monitoring-dashboard)
6. (Optional) [Alert when metrics stop reporting](#alert-when-metrics-stop-reporting)
7. (Optional) [Create a notification endpoint and rule](#create-a-notification-endpoint-and-rule)
8. (Optional) [Monitor with InfluxDB Insights](#monitor-with-influxdb-insights)

## Review requirements

Before you begin, make sure you have access to the following:

- An InfluxDB Cloud account. ([Sign up for free here](https://cloud2.influxdata.com/signup)).
- Command line access to a machine [running InfluxDB Enterprise 1.x](/enterprise_influxdb/v1/introduction/install-and-deploy/) and permissions to install Telegraf on this machine.
- Internet connectivity from the machine running InfluxDB Enterprise 1.x and Telegraf to InfluxDB Cloud.
- Sufficient resource availability to install the template. (InfluxDB Cloud Free Plan accounts include a finite number of [available resources](/influxdb/cloud/account-management/limits/#free-plan-limits).)

## Install the InfluxDB Enterprise Monitoring template

The InfluxDB Enterprise Monitoring template includes a Telegraf configuration that sends InfluxDB Enterprise metrics to an InfluxDB endpoint, and a dashboard that visualizes the metrics.

1. [Log into your InfluxDB Cloud account](https://cloud2.influxdata.com/), go to **Settings > Templates**, and enter the following template URL:

   ```
   https://raw.githubusercontent.com/influxdata/community-templates/master/influxdb-enterprise-1x/enterprise.yml
   ```

2. Click **Lookup Template**, and then click **Install Template**. InfluxDB Cloud imports the template, which includes the following resources:
   - Telegraf Configuration `monitoring-enterprise-1x`
   - Dashboard `InfluxDB 1.x Enterprise`
   - Label `enterprise`
   - Variables `influxdb_host` and `bucket`

## Set up InfluxDB Enterprise for monitoring

By default, InfluxDB Enterprise 1.x has a `/metrics` endpoint available, which exports Prometheus-style system metrics.

1. Make sure the `/metrics` endpoint is [enabled](/influxdb/v2/reference/config-options/#metrics-disabled). If you've changed the default settings to disable the `/metrics` endpoint, [re-enable these settings](/influxdb/v2/reference/config-options/#metrics-disabled).
2. Navigate to the `/metrics` endpoint of your InfluxDB Enterprise instance to view the InfluxDB Enterprise system metrics in your browser:

   ```
   http://localhost:8086/metrics
   ```

   Or use `curl` to fetch metrics:

   ```sh
   curl http://localhost:8086/metrics
   # HELP boltdb_reads_total Total number of boltdb reads
   # TYPE boltdb_reads_total counter
   boltdb_reads_total 41
   # HELP boltdb_writes_total Total number of boltdb writes
   # TYPE boltdb_writes_total counter
   boltdb_writes_total 28
   # HELP go_gc_duration_seconds A summary of the pause duration of garbage collection cycles.
   ...
   ```
3. Add your **InfluxDB Cloud** account information (URL and organization) to your Telegraf configuration by doing the following:
   1. Go to **Load Data > Telegraf** [in your InfluxDB Cloud account](https://cloud2.influxdata.com/), and click **InfluxDB Output Plugin** at the top-right corner.
   2. Copy the `urls`, `token`, `organization`, and `bucket` and close the window.
   3. Click **monitoring-enterprise-1.x**.
   4. Replace `urls`, `token`, `organization`, and `bucket` under `outputs.influxdb_v2` with your InfluxDB Cloud account information. Alternatively, store this information in your environment variables and include the environment variables in your configuration.

      {{% note %}}
      To ensure the InfluxDB Enterprise monitoring dashboard can display the recorded metrics, set the destination bucket name to `enterprise_metrics` in your `telegraf.conf`.
      {{% /note %}}

   5. Add the [Prometheus input plugin](https://github.com/influxdata/telegraf/blob/release-1.19/plugins/inputs/prometheus/README.md) to your `telegraf.conf`. Specify your InfluxDB Enterprise URL(s) in the `urls` parameter. For example:

      {{< keep-url >}}
      ```toml
      [[inputs.prometheus]]
        urls = ["http://localhost:8086/metrics"]
        username = "$INFLUX_USER"
        password = "$INFLUX_PASSWORD"
      ```

      If you're using unique URLs or have authentication set up for your `/metrics` endpoint, configure those options here and save the updated configuration.

      For more information about customizing Telegraf, see [Configure Telegraf](/telegraf/v1/administration/configuration/#global-tags).
4. Click **Save Changes**.

## Set up Telegraf

Set up Telegraf to scrape metrics from InfluxDB Enterprise to send to your InfluxDB Cloud account.

On each InfluxDB Enterprise instance you want to monitor, do the following:

1. Go to **Load Data > Telegraf** [in your InfluxDB Cloud account](https://cloud2.influxdata.com/).
2. Click **Setup Instructions** under **monitoring-enterprise-1.x**.
3. Complete the Telegraf Setup instructions. If you are using environment variables, set them up now.

   {{% note %}}
   For your API token, generate a new token or use an existing All Access token. If you run Telegraf as a service, edit your init script to set the environment variable and ensure that it's available to the service.
   {{% /note %}}

Telegraf runs quietly in the background (no immediate output appears), and Telegraf begins pushing metrics to your InfluxDB Cloud account.

## View the Monitoring dashboard

To see your data in real time, view the Monitoring dashboard.

1. Select **Boards** (**Dashboards**) in your **InfluxDB Cloud** account.

   {{< nav-icon "dashboards" >}}

2. Click **InfluxDB Enterprise Metrics**. Metrics appear in your dashboard.
3. Customize your monitoring dashboard as needed. For example, send an alert in the following cases:
   - Users create a new task or bucket
   - You're testing machine limits
   - [Metrics stop reporting](#alert-when-metrics-stop-reporting)

## Alert when metrics stop reporting

The Monitoring template includes a [deadman check](/influxdb/cloud/monitor-alert/checks/create/#deadman-check) to verify metrics are reported at regular intervals.

To alert when data stops flowing from InfluxDB Enterprise instances to your InfluxDB Cloud account, do the following:

1. [Customize the deadman check](#customize-the-deadman-check) to identify the fields you want to monitor.
2. [Create a notification endpoint and rule](#create-a-notification-endpoint-and-rule) to receive notifications when your deadman check is triggered.

### Customize the deadman check

1. To view the deadman check, click **Alerts** in the navigation bar of your **InfluxDB Cloud** account.

   {{< nav-icon "alerts" >}}

2. Choose an existing field or create a new field for your deadman alert:
   1. Click **{{< icon "plus" "v2" >}} Create** and select **Deadman Check** in the dropdown menu.
   2. Define your query with at least one field.
   3. Click **Submit** and **Configure Check**.
      When metrics stop reporting, you'll receive an alert.
3. Under **Schedule Every**, set the amount of time to check for data.
4. Set the amount of time to wait before switching to a critical alert.
5. Save the check, then click **View History** under the gear icon to verify that it's running.

## Create a notification endpoint and rule

To receive a notification message when your deadman check is triggered, create a [notification endpoint](#create-a-notification-endpoint) and [rule](#create-a-notification-rule).

### Create a notification endpoint

InfluxDB Cloud supports different endpoints: Slack, PagerDuty, and HTTP. Slack is free for all users, while PagerDuty and HTTP are exclusive to the Usage-Based Plan.

#### Send a notification to Slack

1. Create a [Slack webhook](https://api.slack.com/messaging/webhooks).
2. Go to **Alerts > Notification Endpoint** and click **{{< icon "plus" "v2" >}} Create**, and enter a name and description for your Slack endpoint.
3. Enter your Slack webhook under **Incoming Webhook URL** and click **Create Notification Endpoint**.

#### Send a notification to PagerDuty or HTTP

Send a notification to PagerDuty or HTTP endpoints (other webhooks) by [upgrading your InfluxDB Cloud account](/influxdb/cloud/account-management/billing/#upgrade-to-usage-based-plan).

### Create a notification rule

[Create a notification rule](/influxdb/cloud/monitor-alert/notification-rules/create/) to set rules for when to send a deadman alert message to your notification endpoint.

1. Go to **Alerts > Notification Rules** and click **{{< icon "plus" "v2" >}} Create**.
2. Fill out the **About** and **Conditions** section, then click **Create Notification Rule**.

## Monitor with InfluxDB Insights

For InfluxDB Enterprise customers, Insights is a free service that monitors your cluster and sends metrics to a private Cloud account. This allows InfluxDB Support to monitor your cluster health and access usage statistics when assisting with support tickets that you raise.

To apply for this service, please [contact InfluxData support](https://support.influxdata.com).
@ -14,7 +14,7 @@ aliases:
- /enterprise_influxdb/v1/administration/monitor-enterprise/monitor-with-oss/
---

Use [InfluxDB OSS 2.x](/influxdb/v2/), the [InfluxDB Enterprise 1.x Template](https://github.com/influxdata/community-templates/tree/master/influxdb-enterprise-1x), and Telegraf to monitor one or more InfluxDB Enterprise instances.
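
As a sketch of the Telegraf side of this setup, an agent on each Enterprise data node can collect that node's internal metrics and forward them to your monitoring instance. The URLs, token, organization, and bucket below are placeholder assumptions for your environment:

```toml
# telegraf.conf (fragment)

# Collect internal metrics from the local InfluxDB Enterprise data node
[[inputs.influxdb]]
  urls = ["http://localhost:8086/debug/vars"]

# Forward metrics to the InfluxDB OSS 2.x monitoring instance
[[outputs.influxdb_v2]]
  urls = ["https://monitoring-host:8086"]
  token = "$INFLUX_TOKEN"
  organization = "example-org"
  bucket = "enterprise-monitoring"
```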

Do the following:
@ -8,128 +8,211 @@ menu:
    name: Grafana
    weight: 60
    parent: Tools
related:
  - /flux/v0/get-started/, Get started with Flux
  - https://grafana.com/docs/, Grafana documentation
alt_links:
  core: /influxdb3/core/visualize-data/grafana/
  enterprise: /influxdb3/enterprise/visualize-data/grafana/
  cloud-serverless: /influxdb3/cloud-serverless/process-data/visualize/grafana/
  cloud-dedicated: /influxdb3/cloud-dedicated/process-data/visualize/grafana/
  clustered: /influxdb3/clustered/process-data/visualize/grafana/
canonical: /influxdb/v2/tools/grafana/
---

Use [Grafana](https://grafana.com/) or [Grafana Cloud](https://grafana.com/products/cloud/)
to visualize data from your **InfluxDB Enterprise** cluster.

> [!Note]
> {{< influxdb-version-detector >}}

> [!Note]
> #### Required
> - The instructions in this guide require **Grafana Cloud** or **Grafana v10.3+**.
>   For information about using InfluxDB with other versions of Grafana,
>   see the [Grafana documentation](https://grafana.com/docs/grafana/latest/datasources/influxdb/).
> - To use **Flux**, use **InfluxDB 1.8.1+** and [enable Flux](/enterprise_influxdb/v1/flux/installation/)
>   in your InfluxDB data nodes.

- [Install Grafana](#install-grafana)
- [Create an InfluxDB data source](#create-an-influxdb-data-source)
- [Query and visualize data](#query-and-visualize-data)

## Install Grafana

1. [Set up an InfluxDB Enterprise cluster](/enterprise_influxdb/v1/introduction/installation/).
2. [Sign up for Grafana Cloud](https://grafana.com/products/cloud/) or
   [download and install Grafana](https://grafana.com/grafana/download).
3. If running Grafana locally, enable the `newInfluxDSConfigPageDesign` feature flag to use the latest InfluxDB data source plugin.

    {{< expand-wrapper >}}
    {{% expand "Option 1: Configuration file (recommended)" %}}

    Add the following to your `grafana.ini` configuration file:

    ```ini
    [feature_toggles]
    enable = newInfluxDSConfigPageDesign
    ```

    Configuration file locations:

    - **Linux**: `/etc/grafana/grafana.ini`
    - **macOS (Homebrew)**: `/opt/homebrew/etc/grafana/grafana.ini`
    - **Windows**: `<GRAFANA_INSTALL_DIR>\conf\grafana.ini`

    {{% /expand %}}

    {{% expand "Option 2: Command line" %}}

    Enable the feature flag when starting Grafana:

    {{< code-tabs-wrapper >}}
    {{% code-tabs %}}
    [Linux](#)
    [macOS (Homebrew)](#)
    [Windows](#)
    {{% /code-tabs %}}
    {{% code-tab-content %}}

    ```sh
    grafana-server --config /etc/grafana/grafana.ini \
      cfg:default.feature_toggles.enable=newInfluxDSConfigPageDesign
    ```

    {{% /code-tab-content %}}
    {{% code-tab-content %}}

    ```sh
    /opt/homebrew/opt/grafana/bin/grafana server \
      --config /opt/homebrew/etc/grafana/grafana.ini \
      --homepath /opt/homebrew/opt/grafana/share/grafana \
      --packaging=brew \
      cfg:default.paths.logs=/opt/homebrew/var/log/grafana \
      cfg:default.paths.data=/opt/homebrew/var/lib/grafana \
      cfg:default.paths.plugins=/opt/homebrew/var/lib/grafana/plugins \
      cfg:default.feature_toggles.enable=newInfluxDSConfigPageDesign
    ```

    {{% /code-tab-content %}}
    {{% code-tab-content %}}

    ```powershell
    grafana-server.exe --config <GRAFANA_INSTALL_DIR>\conf\grafana.ini `
      cfg:default.feature_toggles.enable=newInfluxDSConfigPageDesign
    ```

    {{% /code-tab-content %}}
    {{< /code-tabs-wrapper >}}

    {{% /expand %}}
    {{< /expand-wrapper >}}

    For more information, see [Configure feature toggles](https://grafana.com/docs/grafana/latest/setup-grafana/configure-grafana/feature-toggles/) in the Grafana documentation.

4. Visit your **Grafana Cloud user interface** (UI) or, if running Grafana locally,
   [start Grafana](https://grafana.com/docs/grafana/latest/installation/) and visit
   <http://localhost:3000> in your browser.

> [!Note]
> #### Grafana 12.2+
>
> The instructions below are for **Grafana 12.2+** with the `newInfluxDSConfigPageDesign`
> feature flag enabled. This introduces the newest version of the InfluxDB core plugin.
> The updated plugin includes **SQL support** for InfluxDB 3-based products such
> as {{< product-name >}}, and the interface dynamically adapts based on your
> product and query language selection in [URL and authentication](#configure-url-and-authentication).

> [!Note]
> #### Using Grafana Cloud with a local InfluxDB instance
>
> If you need to keep your database local, consider running Grafana locally instead of using Grafana Cloud,
> as this avoids the need to expose your database to the internet.
>
> To use InfluxDB running on your private network with Grafana Cloud, you must configure a
> [private data source for Grafana Cloud](https://grafana.com/docs/grafana-cloud/data-sources/private-data-sources/).

> [!Note]
> #### Query language support
> - InfluxQL is supported in InfluxDB Enterprise v1.8.x and later.
> - Flux is supported in InfluxDB Enterprise v1.8.1 and later.
> - SQL is only supported in InfluxDB 3. For more information, see how to [get started with InfluxDB 3 Enterprise](/influxdb3/enterprise/get-started/).

## Create an InfluxDB data source

1. In your Grafana interface, click **Connections** in the left sidebar.
2. Click **Data sources**.
3. Click **Add new connection**.
4. Search for and select **InfluxDB**. The InfluxDB data source configuration page displays.
5. In the **Settings** tab, enter a **Name** for your data source.

### Configure URL and authentication

In the **URL and authentication** section, configure the following:

- **URL**: Your server or load balancer URL--for example, `https://{{< influxdb/host >}}`
- **Product**: From the dropdown, select **InfluxDB Enterprise 1.x**
- **Query Language**: Select **InfluxQL** or **Flux**
- _(Optional)_ **Advanced HTTP Settings**, **Auth**, and **TLS/SSL Settings** as needed for your environment

### Configure database settings

The fields in this section change based on your query language selection in [URL and authentication](#configure-url-and-authentication).

{{< tabs-wrapper >}}
{{% tabs %}}
[InfluxQL](#)
[Flux](#)
{{% /tabs %}}
<!--------------------------- BEGIN INFLUXQL CONTENT -------------------------->
{{% tab-content %}}

## Configure Grafana to use InfluxQL

When you select **InfluxQL** as the query language, configure the following:

- **Database**: Your database name
- **User**: Your InfluxDB username _(if [authentication is enabled](/enterprise_influxdb/v1/administration/authentication_and_authorization/))_
- **Password**: Your InfluxDB password _(if [authentication is enabled](/enterprise_influxdb/v1/administration/authentication_and_authorization/))_

{{< img-hd src="/img/grafana/enterprise-influxdb-v1-grafana-influxql.png" alt="InfluxQL configuration for InfluxDB Enterprise 1.x" />}}

Click **Save & Test**. Grafana attempts to connect to InfluxDB Enterprise and returns the result of the test.
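
If the test fails, you can sanity-check the same credentials outside Grafana against the v1 `/query` endpoint. The host, user, password, and database below are placeholders for your environment:

```sh
# Verify the database and credentials Grafana will use
curl --get "https://cluster-host:8086/query" \
  --user "username:password" \
  --data-urlencode "db=example-db" \
  --data-urlencode "q=SHOW MEASUREMENTS"
```

A JSON response listing measurements confirms the URL, credentials, and database name are valid.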

{{% /tab-content %}}
<!---------------------------- END INFLUXQL CONTENT --------------------------->
<!----------------------------- BEGIN FLUX CONTENT ---------------------------->
{{% tab-content %}}

## Configure Grafana to use Flux

When you select **Flux** as the query language, configure the following:

1. Ensure [Flux is enabled](/enterprise_influxdb/v1/flux/installation/) in your InfluxDB Enterprise data nodes.
2. Configure the database settings:

   - **Organization**: Provide an arbitrary value (InfluxDB Enterprise 1.x does not use organizations).
   - **Default Bucket**: Provide a default database and retention policy using the
     `database-name/retention-policy-name` syntax--for example, `telegraf/autogen`.
   - **Token**: If [InfluxDB authentication is enabled](/enterprise_influxdb/v1/administration/authentication_and_authorization/),
     provide your InfluxDB username and password using the `username:password` syntax--for
     example, `johndoe:mY5uP3rS3crE7pA5Sw0Rd`. If authentication is not enabled, leave blank.

   {{< img-hd src="/img/grafana/enterprise-influxdb-v1-grafana-flux.png" alt="Flux configuration for InfluxDB Enterprise 1.x" />}}

3. Click **Save & Test**. Grafana attempts to connect to InfluxDB Enterprise and returns the result of the test.

{{% /tab-content %}}
<!------------------------------ END FLUX CONTENT ----------------------------->
{{< /tabs-wrapper >}}

## Query and visualize data

With your InfluxDB connection configured, use Grafana to query and visualize time series data.
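
For a first panel, you can paste a simple query into the Grafana query editor. As a sketch (the database, measurement, and field names are placeholders; `$timeFilter`, `$__interval`, and the `v.` variables are Grafana-provided template values), an InfluxQL query and a rough Flux equivalent look like:

```sql
SELECT mean("usage_user") FROM "cpu"
WHERE $timeFilter
GROUP BY time($__interval)
```

```flux
from(bucket: "telegraf/autogen")
    |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
    |> filter(fn: (r) => r._measurement == "cpu" and r._field == "usage_user")
    |> aggregateWindow(every: v.windowPeriod, fn: mean)
```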

### Query inspection in Grafana

To learn about query management and inspection in Grafana, see the
[Grafana Explore documentation](https://grafana.com/docs/grafana/latest/explore/).

### Build visualizations with Grafana

For a comprehensive walk-through of creating visualizations with
Grafana, see the [Grafana documentation](https://grafana.com/docs/grafana/latest/).
@ -12,8 +12,14 @@ related:
- https://grafana.com/docs/, Grafana documentation
- /influxdb/cloud/query-data/get-started/
- /influxdb/cloud/query-data/influxql/
- /flux/v0/get-started/, Get started with Flux
alt_links:
  v1: /influxdb/v1/tools/grafana/
  enterprise_v1: /enterprise_influxdb/v1/tools/grafana/
  v2: /influxdb/v2/tools/grafana/
  core: /influxdb3/core/visualize-data/grafana/
  enterprise: /influxdb3/enterprise/visualize-data/grafana/
  cloud-serverless: /influxdb3/cloud-serverless/process-data/visualize/grafana/
  cloud-dedicated: /influxdb3/cloud-dedicated/process-data/visualize/grafana/
  clustered: /influxdb3/clustered/process-data/visualize/grafana/
source: /shared/influxdb-v2/tools/grafana.md
@ -13,49 +13,28 @@ alt_links:
  enterprise_v1: /enterprise_influxdb/v1/about-the-project/release-notes/
---

## v1.12.2 {date="2025-09-15"}

### Features

- Add additional log output when using
  [`influx_inspect buildtsi`](/influxdb/v1/tools/influx_inspect/#buildtsi) to
  rebuild the TSI index.
- Use [`influx_inspect export`](/influxdb/v1/tools/influx_inspect/#export) with the
  [`-tsmfile` option](/influxdb/v1/tools/influx_inspect/#--tsmfile-tsm_file-) to
  export a single TSM file.
- Add the `fluxQueryRespBytes` metric to the `/debug/vars` metrics endpoint.
- Add the [`aggressive-points-per-block` configuration option](/influxdb/v1/administration/config/#aggressive-points-per-block)
  to prevent TSM files from not getting fully compacted.
- Improve error handling.
- InfluxQL updates:
  - Delete series by retention policy.
  - Allow retention policies to discard writes that fall within their range, but
    outside of [`FUTURE LIMIT`](/influxdb/v1/query_language/manage-database/#future-limit)
    and [`PAST LIMIT`](/influxdb/v1/query_language/manage-database/#past-limit).
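
Once a 1.12.x instance is running, you can confirm the new metric is exposed by reading the `/debug/vars` endpoint (the localhost URL is an assumption for a default local install):

```sh
# Inspect the expvar-style metrics endpoint for the new Flux metric
curl -s http://localhost:8086/debug/vars | grep fluxQueryRespBytes
```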

### Bug fixes

- Log rejected writes to subscriptions.
- Update `xxhash` and avoid `stringtoslicebyte` in the cache.

@ -65,7 +44,7 @@
- Ensure temporary files are removed after failed compactions.
- Do not panic on invalid multiple subqueries.

### Other

- Update Go to 1.23.5.
- Upgrade Flux to v0.196.1.
@ -1,136 +1,217 @@
|
|||
---
|
||||
title: Use Grafana with InfluxDB
|
||||
seotitle: Use Grafana with InfluxDB v1.11
|
||||
seotitle: Use Grafana with InfluxDB v1.x
|
||||
description: >
|
||||
Configure Grafana to query and visualize data from InfluxDB v1.11.
|
||||
Configure Grafana to query and visualize data from InfluxDB v1.x.
|
||||
menu:
|
||||
influxdb_v1:
|
||||
name: Grafana
|
||||
weight: 60
|
||||
parent: Tools
|
||||
related:
|
||||
- /flux/v0/get-started/, Get started with Flux
|
||||
alt_links:
|
||||
v2: /influxdb/v2/tools/grafana/
|
||||
core: /influxdb3/core/visualize-data/grafana/
|
||||
enterprise: /influxdb3/enterprise/visualize-data/grafana/
|
||||
cloud-serverless: /influxdb3/cloud-serverless/process-data/visualize/grafana/
|
||||
cloud-dedicated: /influxdb3/cloud-dedicated/process-data/visualize/grafana/
|
||||
clustered: /influxdb3/clustered/process-data/visualize/grafana/
|
||||
canonical: /influxdb/v2/tools/grafana/
|
||||
---
|
||||
|
||||
Use [Grafana](https://grafana.com/) or [Grafana Cloud](https://grafana.com/products/cloud/)
|
||||
to visualize data from your **InfluxDB v1.11** instance.
|
||||
to visualize data from your {{% product-name %}} instance.
|
||||
|
||||
{{% note %}}
|
||||
#### Required
|
||||
- The instructions in this guide require **Grafana Cloud** or **Grafana v10.3+**.
|
||||
For information about using InfluxDB with other versions of Grafana,
|
||||
see the [Grafana documentation](https://grafana.com/docs/grafana/latest/datasources/influxdb/).
|
||||
- To use **Flux**, use **InfluxDB 1.8.1+** and [enable Flux](/influxdb/v1/flux/installation/)
|
||||
in your InfluxDB configuration file.
|
||||
{{% /note %}}
|
||||
> [!Note]
|
||||
> {{< influxdb-version-detector >}}
|
||||
|
||||
1. [Start InfluxDB](/influxdb/v1/introduction/get-started/).
|
||||
2. [Sign up for Grafana Cloud](https://grafana.com/products/cloud/) or
|
||||
[download and install Grafana](https://grafana.com/grafana/download).
|
||||
3. Visit your **Grafana Cloud user interface** (UI) or, if running Grafana locally,
|
||||
[start Grafana](https://grafana.com/docs/grafana/latest/installation/) and visit
|
||||
<http://localhost:3000> in your browser.
|
||||
4. In the left navigation of the Grafana UI, expand the **Connections** section
|
||||
and click **Add new connection**.
|
||||
5. Select **InfluxDB** from the list of available data sources and click
|
||||
**Add data source**.
|
||||
6. On the **Data Source configuration page**, enter a **name** for your InfluxDB data source.
|
||||
7. In the **Query Language** drop-down menu, select one of the query languages
|
||||
supported by InfluxDB {{< current-version >}} (InfluxQL or Flux):
|
||||
> [!Note]
|
||||
> #### Grafana 12.2+
|
||||
>
|
||||
> The instructions below are for **Grafana 12.2+** with the `newInfluxDSConfigPageDesign`
|
||||
> feature flag enabled. This introduces the newest version of the InfluxDB core plugin.
|
||||
> The updated plugin includes **SQL support** for InfluxDB 3-based products such
|
||||
> as {{< product-name >}}, and the interface dynamically adapts based on your
|
||||
> product and query language selection in [URL and authentication](#configure-url-and-authentication).
|
||||
|
||||
{{% note %}}
|
||||
SQL is only supported in InfluxDB 3.
|
||||
{{% /note %}}
|
||||
> [!Note]
|
||||
> #### Required
|
||||
> - The instructions below are for **Grafana 12.2+** with the `newInfluxDSConfigPageDesign`
|
||||
> feature flag enabled. This introduces the newest version of the InfluxDB core plugin.
|
||||
> For information about using InfluxDB with other versions of Grafana,
|
||||
> see the [Grafana documentation](https://grafana.com/docs/grafana/latest/datasources/influxdb/).
|
||||
> - To use **Flux**, use **InfluxDB 1.8.1+** and [enable Flux](/influxdb/v1/flux/installation/)
|
||||
> in your InfluxDB configuration file.
|
||||
|
||||
- [Install Grafana](#install-grafana)
|
||||
- [Create an InfluxDB data source](#create-an-influxdb-data-source)
|
||||
- [Query and visualize data](#query-and-visualize-data)
|
||||
|
||||
## Install Grafana
|
||||
|
||||
1. [Start InfluxDB](/influxdb/v1/introduction/get-started/).
|
||||
2. [Sign up for Grafana Cloud](https://grafana.com/products/cloud/) or
|
||||
[download and install Grafana](https://grafana.com/grafana/download).
|
||||
3. If running Grafana locally, enable the `newInfluxDSConfigPageDesign` feature flag to use the latest InfluxDB data source plugin.
|
||||
|
||||
{{< expand-wrapper >}}
|
||||
{{% expand "Option 1: Configuration file (recommended)" %}}
|
||||
|
||||
Add the following to your `grafana.ini` configuration file:
|
||||
|
||||
```ini
|
||||
[feature_toggles]
|
||||
enable = newInfluxDSConfigPageDesign
|
||||
```
|
||||
|
||||
Configuration file locations:
|
||||
- **Linux**: `/etc/grafana/grafana.ini`
|
||||
- **macOS (Homebrew)**: `/opt/homebrew/etc/grafana/grafana.ini`
|
||||
- **Windows**: `<GRAFANA_INSTALL_DIR>\conf\grafana.ini`
|
||||
|
||||
{{% /expand %}}
|
||||
|
||||
{{% expand "Option 2: Command line" %}}
|
||||
|
||||
Enable the feature flag when starting Grafana:
|
||||
|
||||
{{< code-tabs-wrapper >}}
|
||||
{{% code-tabs %}}
|
||||
[Linux](#)
|
||||
[macOS (Homebrew)](#)
|
||||
[Windows](#)
|
||||
{{% /code-tabs %}}
|
||||
{{% code-tab-content %}}
|
||||
|
||||
```sh
|
||||
grafana-server --config /etc/grafana/grafana.ini \
|
||||
cfg:default.feature_toggles.enable=newInfluxDSConfigPageDesign
|
||||
```
|
||||
|
||||
{{% /code-tab-content %}}
|
||||
{{% code-tab-content %}}
|
||||
|
||||
```sh
|
||||
/opt/homebrew/opt/grafana/bin/grafana server \
|
||||
--config /opt/homebrew/etc/grafana/grafana.ini \
|
||||
--homepath /opt/homebrew/opt/grafana/share/grafana \
|
||||
--packaging=brew \
|
||||
cfg:default.paths.logs=/opt/homebrew/var/log/grafana \
|
||||
cfg:default.paths.data=/opt/homebrew/var/lib/grafana \
|
||||
cfg:default.paths.plugins=/opt/homebrew/var/lib/grafana/plugins \
|
||||
cfg:default.feature_toggles.enable=newInfluxDSConfigPageDesign
|
||||
```
|
||||
|
||||
{{% /code-tab-content %}}
|
||||
{{% code-tab-content %}}
|
||||
|
||||
```powershell
|
||||
grafana-server.exe --config <GRAFANA_INSTALL_DIR>\conf\grafana.ini `
|
||||
cfg:default.feature_toggles.enable=newInfluxDSConfigPageDesign
|
||||
```
|
||||
|
||||
{{% /code-tab-content %}}
|
||||
{{< /code-tabs-wrapper >}}
|
||||
|
||||
{{% /expand %}}
|
||||
{{< /expand-wrapper >}}
|
||||
|
||||
For more information, see [Configure feature toggles](https://grafana.com/docs/grafana/latest/setup-grafana/configure-grafana/feature-toggles/) in the Grafana documentation.
|
||||
|
||||
4. Visit your **Grafana Cloud user interface** (UI) or, if running Grafana locally,
|
||||
[start Grafana](https://grafana.com/docs/grafana/latest/installation/) and visit
|
||||
<http://localhost:3000> in your browser.
|
||||
|
||||
> [!Note]
|
||||
> #### Using Grafana Cloud with a local InfluxDB instance
|
||||
>
|
||||
> If you need to keep your database local, consider running Grafana locally instead of using Grafana Cloud,
|
||||
> as this avoids the need to expose your database to the internet.
|
||||
>
|
||||
> To use InfluxDB running on your private network with Grafana Cloud, you must configure a
|
||||
> [private data source for Grafana Cloud](https://grafana.com/docs/grafana-cloud/data-sources/private-data-sources/).
|
||||
|
||||
> [!Note]
|
||||
> SQL is only supported in InfluxDB 3.
|
||||
> For more information, see how to [get started with InfluxDB 3 Core](/influxdb3/core/get-started/).
|
||||
|
||||
## Create an InfluxDB data source
|
||||
|
||||
1. In your Grafana interface, click **Connections** in the left sidebar.
|
||||
2. Click **Data sources**.
|
||||
3. Click **Add new connection**.
|
||||
4. Search for and select **InfluxDB**. The InfluxDB data source configuration page displays.
|
||||
5. In the **Settings** tab, enter a **Name** for your data source.
|
||||
|
||||
### Configure URL and authentication
|
||||
|
||||
In the **URL and authentication** section, configure the following:
|
||||
|
||||
- **URL**: Your server URL--for example, `https://{{< influxdb/host >}}`
|
||||
- **Product**: From the dropdown, select **InfluxDB OSS 1.x**
|
||||
- **Query Language**: Select **InfluxQL** or **Flux**
|
||||
- _(Optional)_ **Advanced HTTP Settings**, **Auth**, and **TLS/SSL Settings** as needed for your environment
|
||||
|
||||
### Configure database settings
|
||||
|
||||
The fields in this section change based on your query language selection in [URL and authentication](#configure-url-and-authentication).
|
||||
|
||||
{{< tabs-wrapper >}}
|
||||
{{% tabs %}}
|
||||
[InfluxQL](#)
|
||||
[Flux](#)
|
||||
{{% /tabs %}}
|
||||
<!--------------------------- BEGIN INFLUXQL CONTENT -------------------------->
|
||||
{{% tab-content %}}
|
||||
<!--------------------------- BEGIN INFLUXQL CONTENT -------------------------->
|
||||
|
||||
## Configure Grafana to use InfluxQL
|
||||
|
||||
With **InfluxQL** selected as the query language in your InfluxDB data source settings:
|
||||
When you select **InfluxQL** as the query language, configure the following:
|
||||
|
||||
1. Under **HTTP**, enter the following:
|
||||
- **Database**: Your database name
|
||||
- **User**: Your InfluxDB username _(if [authentication is enabled](/influxdb/v1/administration/authentication_and_authorization/)); leave blank if authentication is disabled._
|
||||
- **Password**: Your InfluxDB password _(if [authentication is enabled](/influxdb/v1/administration/authentication_and_authorization/)); leave blank if authentication is disabled._
|
||||
|
||||
- **URL**: Your **InfluxDB URL**.
|
||||
{{< img-hd src="/img/influxdb3/OSS-v1-grafana-product-dropdown-influxql.png" alt="InfluxQL configuration for InfluxDB OSS 1.x" />}}
|
||||
|
||||
```sh
|
||||
http://localhost:8086
|
||||
```
|
||||
Click **Save & Test**. Grafana attempts to connect to InfluxDB and returns the result of the test.
|
||||
|
||||
2. Under **InfluxDB Details**, enter the following:
|
||||
|
||||
- **Database**: your database name
|
||||
- **User**: your InfluxDB username _(if [authentication is enabled](/influxdb/v1/administration/authentication_and_authorization/))_
|
||||
- **Password**: your InfluxDB password _(if [authentication is enabled](/influxdb/v1/administration/authentication_and_authorization/))_
|
||||
- **HTTP Method**: Select **GET** or **POST** _(for differences between the two,
|
||||
see the [query HTTP endpoint documentation](/influxdb/v1/tools/api/#query-http-endpoint))_
|
||||
|
||||
3. Provide a **[Min time interval](https://grafana.com/docs/grafana/latest/datasources/influxdb/#min-time-interval)**
|
||||
(default is 10s).
|
||||
|
||||
{{< img-hd src="/img/influxdb/v1-tools-grafana-influxql.png" />}}
|
||||
|
||||
4. Click **Save & Test**. Grafana attempts to connect to InfluxDB and returns
|
||||
the result of the test.
|
||||
|
||||
{{% /tab-content %}}
|
||||
<!---------------------------- END INFLUXQL CONTENT --------------------------->
|
||||
<!----------------------------- BEGIN FLUX CONTENT ---------------------------->
|
||||
{{% /tab-content %}}
|
||||
{{% tab-content %}}
|
||||
<!----------------------------- BEGIN FLUX CONTENT ---------------------------->
|
||||
|
||||
## Configure Grafana to use Flux

With **Flux** selected as the query language in your InfluxDB data source,
configure your InfluxDB connection:

1. Ensure [Flux is enabled](/influxdb/v1/flux/installation/) in your InfluxDB configuration file.

2. Under **HTTP**, enter the following:

   - **URL**: Your **InfluxDB URL**.

     ```sh
     http://localhost:8086
     ```

3. Under **InfluxDB Details**, enter the following:

   - **Organization**: Provide an arbitrary value (InfluxDB 1.x does not use organizations).
   - **Token**: If [InfluxDB authentication is enabled](/influxdb/v1/administration/authentication_and_authorization/),
     provide your InfluxDB username and password using the following syntax:

     ```sh
     # Syntax
     username:password

     # Example
     johndoe:mY5uP3rS3crE7pA5Sw0Rd
     ```

     If authentication is not enabled, leave blank.

   - **Default Bucket**: Provide a default database and retention policy combination
     using the following syntax:

     ```sh
     # Syntax
     database-name/retention-policy-name

     # Examples
     example-db/example-rp
     telegraf/autogen
     ```

   - **Min time interval**: [Grafana minimum time interval](https://grafana.com/docs/grafana/latest/features/datasources/influxdb/#min-time-interval).

   {{< img-hd src="/img/influxdb/v1-tools-grafana-flux.png" />}}

4. Click **Save & Test**. Grafana attempts to connect to InfluxDB and returns
   the result of the test.
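With Flux, Grafana queries reference the database and retention policy as a single bucket string. A minimal sketch (assuming the `telegraf/autogen` database/retention-policy combination and a hypothetical `cpu` measurement; `v.timeRangeStart`, `v.timeRangeStop`, and `v.windowPeriod` are variables Grafana injects into the query):

```js
from(bucket: "telegraf/autogen")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r._measurement == "cpu")
  |> aggregateWindow(every: v.windowPeriod, fn: mean)
```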
{{% /tab-content %}}
<!------------------------------ END FLUX CONTENT ----------------------------->
{{< /tabs-wrapper >}}
## Query and visualize data

With your InfluxDB connection configured, use Grafana to query and visualize time series data.

### Query inspection in Grafana

To learn about query management and inspection in Grafana, see the
[Grafana Explore documentation](https://grafana.com/docs/grafana/latest/explore/).

### Build visualizations with Grafana

For a comprehensive walk-through of creating visualizations with
Grafana, see the [Grafana documentation](https://grafana.com/docs/grafana/latest/).
@@ -3,7 +3,7 @@ title: Troubleshoot systemd errors
list_title: systemd permission errors
description: Troubleshoot errors with InfluxDB and systemd permissions
menu:
  platform:
    influxdb_v1:
      name: systemd errors
      parent: Troubleshoot
weight: 1
@@ -13,6 +13,15 @@ aliases:
related:
  - https://grafana.com/docs/, Grafana documentation
  - /influxdb/v2/query-data/get-started/
alt_links:
  v1: /influxdb/v1/tools/grafana/
  enterprise_v1: /enterprise_influxdb/v1/tools/grafana/
  cloud: /influxdb/cloud/tools/grafana/
  core: /influxdb3/core/visualize-data/grafana/
  enterprise: /influxdb3/enterprise/visualize-data/grafana/
  cloud-serverless: /influxdb3/cloud-serverless/process-data/visualize/grafana/
  cloud-dedicated: /influxdb3/cloud-dedicated/process-data/visualize/grafana/
  clustered: /influxdb3/clustered/process-data/visualize/grafana/
source: /shared/influxdb-v2/tools/grafana.md
---
@@ -25,6 +25,10 @@ to rename a database in your {{< product-name omit=" Cluster" >}} cluster.
> or update [database tokens](/influxdb3/cloud-dedicated/admin/tokens/database/).
> After renaming a database, any existing database tokens will stop working and you
> must create new tokens with permissions for the renamed database.
>
> If you create a new database using the previous database name, tokens
> associated with that database name will grant access to the newly created
> database.

## Rename a database using the influxctl CLI
@@ -0,0 +1,65 @@
---
title: Undelete a table
description: >
  Use the [`influxctl table undelete` command](/influxdb3/cloud-dedicated/reference/cli/influxctl/table/undelete/)
  to restore a previously deleted table in your {{< product-name omit=" Cluster" >}} cluster.
menu:
  influxdb3_cloud_dedicated:
    parent: Manage tables
weight: 204
list_code_example: |
  ```bash { placeholders="DATABASE_NAME|TABLE_ID" }
  influxctl table undelete DATABASE_NAME TABLE_ID
  ```
related:
  - /influxdb3/cloud-dedicated/reference/cli/influxctl/table/undelete/
  - /influxdb3/cloud-dedicated/admin/tables/delete/
  - /influxdb3/cloud-dedicated/admin/tokens/table/create/
---

Use the [`influxctl table undelete` command](/influxdb3/cloud-dedicated/reference/cli/influxctl/table/undelete/)
to restore a previously deleted table in your {{< product-name omit=" Cluster" >}} cluster.

> [!Important]
> To undelete a table:
>
> - A new table with the same name cannot already exist.
> - You must have appropriate permissions to manage databases.

When you undelete a table, it is restored with the same partition template and
other settings as when it was deleted.

> [!Warning]
> Tables can only be undeleted for
> {{% show-in "cloud-dedicated" %}}approximately 14 days{{% /show-in %}}{{% show-in "clustered" %}}a configurable "hard-delete" grace period{{% /show-in %}}
> after they are deleted.
> After this grace period, all Parquet files associated with the deleted table
> are permanently removed and the table cannot be undeleted.

## Undelete a table using the influxctl CLI

```bash { placeholders="DATABASE_NAME|TABLE_ID" }
influxctl table undelete DATABASE_NAME TABLE_ID
```

Replace the following:

- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}:
  Name of the database associated with the deleted table
- {{% code-placeholder-key %}}`TABLE_ID`{{% /code-placeholder-key %}}:
  ID of the deleted table to restore

> [!Tip]
> #### View deleted table IDs
>
> To view the IDs of deleted tables, use the `influxctl table list` command with
> the `--filter-status=deleted` flag. For example:
>
> <!--pytest.mark.skip-->
>
> ```bash { placeholders="DATABASE_NAME" }
> influxctl table list --filter-status=deleted DATABASE_NAME
> ```
>
> Replace {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}
> with the name of the database associated with the table you want to undelete.
@@ -9,6 +9,8 @@ menu:
    name: Use Chronograf
    parent: Visualize data
weight: 202
aliases:
  - /influxdb3/cloud-dedicated/visualize-data/chronograf/
related:
  - /chronograf/v1/
metadata: [InfluxQL only]
@@ -84,14 +86,14 @@ If you haven't already, [download and install Chronograf](/chronograf/v1/introdu
> schema information may not be available in the Data Explorer.
> This limits the Data Explorer's query building functionality and requires you to
> build queries manually using
> [fully qualified measurements](/influxdb3/cloud-dedicated/reference/influxql/select/#fully-qualified-measurement)
> in the `FROM` clause. For example:
>
> ```sql
> -- Fully qualified measurement
> SELECT * FROM "db-name"."rp-name"."measurement-name"
>
> -- Fully qualified measurement shorthand (use the default retention policy)
> SELECT * FROM "db-name".."measurement-name"
> ```
>
@@ -15,6 +15,7 @@ aliases:
  - /influxdb3/cloud-dedicated/query-data/sql/execute-queries/grafana/
  - /influxdb3/cloud-dedicated/query-data/influxql/execute-queries/grafana
  - /influxdb3/cloud-dedicated/process-data/tools/grafana/
  - /influxdb3/cloud-dedicated/visualize-data/grafana/
alt_links:
  v2: /influxdb/v2/tools/grafana/
  cloud: /influxdb/cloud/tools/grafana/
@@ -13,6 +13,7 @@ menu:
influxdb3/cloud-dedicated/tags: [Flight client, query, flightsql, superset]
aliases:
  - /influxdb3/cloud-dedicated/query-data/execute-queries/flight-sql/superset/
  - /influxdb3/cloud-dedicated/visualize-data/superset/
  - /influxdb3/cloud-dedicated/query-data/tools/superset/
  - /influxdb3/cloud-dedicated/query-data/sql/execute-queries/superset/
  - /influxdb3/cloud-dedicated/process-data/tools/superset/
@@ -13,6 +13,7 @@ menu:
influxdb3/cloud-dedicated/tags: [Flight client, query, flightsql, tableau, sql]
aliases:
  - /influxdb3/cloud-dedicated/query-data/execute-queries/flight-sql/tableau/
  - /influxdb3/cloud-dedicated/visualize-data/tableau/
  - /influxdb3/cloud-dedicated/query-data/tools/tableau/
  - /influxdb3/cloud-dedicated/query-data/sql/execute-queries/tableau/
  - /influxdb3/cloud-dedicated/process-data/tools/tableau/
@@ -0,0 +1,20 @@
---
title: Execute SQL queries with ODBC
description: >
  Use the Arrow Flight SQL ODBC driver to execute SQL queries against {{% product-name %}} from
  ODBC-compatible applications and programming languages.
menu:
  influxdb3_cloud_dedicated:
    name: Use ODBC
    parent: Execute queries
weight: 351
influxdb3/cloud-dedicated/tags: [query, sql, odbc]
metadata: [SQL]
related:
  - /influxdb3/cloud-dedicated/reference/sql/
  - /influxdb3/cloud-dedicated/query-data/
source: /shared/influxdb3-query-guides/execute-queries/odbc.md
---

<!-- //SOURCE content/shared/influxdb3-query-guides/execute-queries/odbc.md -->
@@ -9,7 +9,7 @@ menu:
    parent: Execute queries
    name: Use visualization tools
    identifier: query-with-visualization-tools
influxdb3/cloud-dedicated/tags: [query, sql, influxql, visualization]
metadata: [SQL, InfluxQL]
aliases:
  - /influxdb3/cloud-dedicated/query-data/influxql/execute-queries/visualization-tools/

@@ -27,6 +27,7 @@ Use visualization tools to query data stored in {{% product-name %}} with SQL.
The following visualization tools support querying InfluxDB with SQL:

- [Grafana](/influxdb3/cloud-dedicated/process-data/visualize/grafana/)
- [Power BI](/influxdb3/cloud-dedicated/process-data/visualize/powerbi/)
- [Superset](/influxdb3/cloud-dedicated/process-data/visualize/superset/)
- [Tableau](/influxdb3/cloud-dedicated/process-data/visualize/tableau/)
@@ -0,0 +1,14 @@
---
title: influxctl table undelete
description: >
  The `influxctl table undelete` command undeletes a previously deleted
  table in an {{% product-name omit=" Clustered" %}} cluster.
menu:
  influxdb3_cloud_dedicated:
    parent: influxctl table
weight: 301
metadata: [influxctl 2.10.4+]
source: /shared/influxctl/table/undelete.md
---

<!-- //SOURCE content/shared/influxctl/table/undelete.md -->
@ -9,495 +9,9 @@ menu:
|
|||
name: Sample data
|
||||
parent: Reference
|
||||
weight: 110
|
||||
source: /shared/influxdb3-sample-data/sample-data-dist.md
|
||||
---
|
||||
|
||||
Sample datasets are used throughout the {{< product-name >}} documentation to
|
||||
demonstrate functionality.
|
||||
Use the following sample datasets to replicate provided examples.
|
||||
|
||||
- [Get started home sensor data](#get-started-home-sensor-data)
|
||||
- [Home sensor actions data](#home-sensor-actions-data)
|
||||
- [NOAA Bay Area weather data](#noaa-bay-area-weather-data)
|
||||
- [Bitcoin price data](#bitcoin-price-data)
|
||||
- [Random numbers sample data](#random-numbers-sample-data)
|
||||
|
||||
## Get started home sensor data
|
||||
|
||||
Includes hourly home sensor data used in the
|
||||
[Get started with {{< product-name >}}](/influxdb3/cloud-dedicated/get-started/) guide.
|
||||
This dataset includes anomalous sensor readings and helps to demonstrate
|
||||
processing and alerting on time series data.
|
||||
To customize timestamps in the dataset, use the {{< icon "clock" >}} button in
|
||||
the lower right corner of the page.
|
||||
This lets you modify the sample dataset to stay within the retention period of
|
||||
the database you write it to.
|
||||
|
||||
##### Time Range
|
||||
|
||||
**{{% influxdb/custom-timestamps-span %}}2022-01-01T08:00:00Z{{% /influxdb/custom-timestamps-span %}}**
|
||||
to
|
||||
**{{% influxdb/custom-timestamps-span %}}2022-01-01T20:00:00Z{{% /influxdb/custom-timestamps-span %}}**
|
||||
<em style="opacity: .5">(Customizable)</em>
|
||||
|
||||
##### Schema
|
||||
|
||||
- home <em style="opacity: .5">(measurement)</em>
|
||||
- **tags**:
|
||||
- room
|
||||
- Kitchen
|
||||
- Living Room
|
||||
- **fields**:
|
||||
- co <em style="opacity: .5">(integer)</em>
|
||||
- temp <em style="opacity: .5">(float)</em>
|
||||
- hum <em style="opacity: .5">(float)</em>
|
||||
|
||||
{{< expand-wrapper >}}
|
||||
{{% expand "Write home sensor data to InfluxDB" %}}
|
||||
|
||||
#### Write the home sensor data to InfluxDB
|
||||
|
||||
Use the InfluxDB v2 or v1 API to write the Get started home sensor sample data
|
||||
to {{< product-name >}}.
|
||||
|
||||
{{< code-tabs-wrapper >}}
|
||||
{{% code-tabs %}}
|
||||
[v2 API](#)
|
||||
[v1 API](#)
|
||||
{{% /code-tabs %}}
|
||||
{{% code-tab-content %}}
|
||||
|
||||
{{% influxdb/custom-timestamps %}}
|
||||
{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
|
||||
```sh
|
||||
curl --request POST \
|
||||
https://{{< influxdb/host >}}/api/v2/write?bucket=DATABASE_NAME&precision=s \
|
||||
--header "Authorization: Bearer DATABASE_TOKEN" \
|
||||
--header "Content-Type: text/plain; charset=utf-8" \
|
||||
--header "Accept: application/json" \
|
||||
--data-binary "
|
||||
home,room=Living\ Room temp=21.1,hum=35.9,co=0i 1641024000
|
||||
home,room=Kitchen temp=21.0,hum=35.9,co=0i 1641024000
|
||||
home,room=Living\ Room temp=21.4,hum=35.9,co=0i 1641027600
|
||||
home,room=Kitchen temp=23.0,hum=36.2,co=0i 1641027600
|
||||
home,room=Living\ Room temp=21.8,hum=36.0,co=0i 1641031200
|
||||
home,room=Kitchen temp=22.7,hum=36.1,co=0i 1641031200
|
||||
home,room=Living\ Room temp=22.2,hum=36.0,co=0i 1641034800
|
||||
home,room=Kitchen temp=22.4,hum=36.0,co=0i 1641034800
|
||||
home,room=Living\ Room temp=22.2,hum=35.9,co=0i 1641038400
|
||||
home,room=Kitchen temp=22.5,hum=36.0,co=0i 1641038400
|
||||
home,room=Living\ Room temp=22.4,hum=36.0,co=0i 1641042000
|
||||
home,room=Kitchen temp=22.8,hum=36.5,co=1i 1641042000
|
||||
home,room=Living\ Room temp=22.3,hum=36.1,co=0i 1641045600
|
||||
home,room=Kitchen temp=22.8,hum=36.3,co=1i 1641045600
|
||||
home,room=Living\ Room temp=22.3,hum=36.1,co=1i 1641049200
|
||||
home,room=Kitchen temp=22.7,hum=36.2,co=3i 1641049200
|
||||
home,room=Living\ Room temp=22.4,hum=36.0,co=4i 1641052800
|
||||
home,room=Kitchen temp=22.4,hum=36.0,co=7i 1641052800
|
||||
home,room=Living\ Room temp=22.6,hum=35.9,co=5i 1641056400
|
||||
home,room=Kitchen temp=22.7,hum=36.0,co=9i 1641056400
|
||||
home,room=Living\ Room temp=22.8,hum=36.2,co=9i 1641060000
|
||||
home,room=Kitchen temp=23.3,hum=36.9,co=18i 1641060000
|
||||
home,room=Living\ Room temp=22.5,hum=36.3,co=14i 1641063600
|
||||
home,room=Kitchen temp=23.1,hum=36.6,co=22i 1641063600
|
||||
home,room=Living\ Room temp=22.2,hum=36.4,co=17i 1641067200
|
||||
home,room=Kitchen temp=22.7,hum=36.5,co=26i 1641067200
|
||||
"
|
||||
```
|
||||
{{% /code-placeholders %}}
|
||||
{{% /influxdb/custom-timestamps %}}
|
||||
|
||||
{{% /code-tab-content %}}
|
||||
{{% code-tab-content %}}
|
||||
|
||||
{{% influxdb/custom-timestamps %}}
|
||||
{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
|
||||
```sh
|
||||
curl --request POST \
|
||||
https://{{< influxdb/host >}}/write?db=DATABASE_NAME&precision=s \
|
||||
--header "Authorization: Bearer DATABASE_TOKEN" \
|
||||
--header "Content-type: text/plain; charset=utf-8" \
|
||||
--data-binary "
|
||||
home,room=Living\ Room temp=21.1,hum=35.9,co=0i 1641024000
|
||||
home,room=Kitchen temp=21.0,hum=35.9,co=0i 1641024000
|
||||
home,room=Living\ Room temp=21.4,hum=35.9,co=0i 1641027600
|
||||
home,room=Kitchen temp=23.0,hum=36.2,co=0i 1641027600
|
||||
home,room=Living\ Room temp=21.8,hum=36.0,co=0i 1641031200
|
||||
home,room=Kitchen temp=22.7,hum=36.1,co=0i 1641031200
|
||||
home,room=Living\ Room temp=22.2,hum=36.0,co=0i 1641034800
|
||||
home,room=Kitchen temp=22.4,hum=36.0,co=0i 1641034800
|
||||
home,room=Living\ Room temp=22.2,hum=35.9,co=0i 1641038400
|
||||
home,room=Kitchen temp=22.5,hum=36.0,co=0i 1641038400
|
||||
home,room=Living\ Room temp=22.4,hum=36.0,co=0i 1641042000
|
||||
home,room=Kitchen temp=22.8,hum=36.5,co=1i 1641042000
|
||||
home,room=Living\ Room temp=22.3,hum=36.1,co=0i 1641045600
|
||||
home,room=Kitchen temp=22.8,hum=36.3,co=1i 1641045600
|
||||
home,room=Living\ Room temp=22.3,hum=36.1,co=1i 1641049200
|
||||
home,room=Kitchen temp=22.7,hum=36.2,co=3i 1641049200
|
||||
home,room=Living\ Room temp=22.4,hum=36.0,co=4i 1641052800
|
||||
home,room=Kitchen temp=22.4,hum=36.0,co=7i 1641052800
|
||||
home,room=Living\ Room temp=22.6,hum=35.9,co=5i 1641056400
|
||||
home,room=Kitchen temp=22.7,hum=36.0,co=9i 1641056400
|
||||
home,room=Living\ Room temp=22.8,hum=36.2,co=9i 1641060000
|
||||
home,room=Kitchen temp=23.3,hum=36.9,co=18i 1641060000
|
||||
home,room=Living\ Room temp=22.5,hum=36.3,co=14i 1641063600
|
||||
home,room=Kitchen temp=23.1,hum=36.6,co=22i 1641063600
|
||||
home,room=Living\ Room temp=22.2,hum=36.4,co=17i 1641067200
|
||||
home,room=Kitchen temp=22.7,hum=36.5,co=26i 1641067200
|
||||
"
|
||||
```
|
||||
{{% /code-placeholders %}}
|
||||
{{% /influxdb/custom-timestamps %}}
|
||||
|
||||
{{% /code-tab-content %}}
|
||||
{{< /code-tabs-wrapper >}}
|
||||
|
||||
Replace the following in the sample script:
|
||||
|
||||
- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}:
|
||||
your InfluxDB Cloud Dedicated database
|
||||
- {{% code-placeholder-key %}}`DATABASE_TOKEN`{{% /code-placeholder-key %}}:
|
||||
a [database token](/influxdb3/cloud-dedicated/admin/tokens/#database-tokens)
|
||||
with _write_ permission to the database
|
||||
|
||||
{{% /expand %}}
|
||||
{{< /expand-wrapper >}}
|
||||
|
||||
## Home sensor actions data
|
||||
|
||||
Includes hypothetical actions triggered by data in the [Get started home sensor data](#get-started-home-sensor-data)
|
||||
and is a companion dataset to that sample dataset.
|
||||
To customize timestamps in the dataset, use the {{< icon "clock" >}} button in
|
||||
the lower right corner of the page.
|
||||
This lets you modify the sample dataset to stay within the retention period of
|
||||
the database you write it to.
|
||||
|
||||
##### Time Range
|
||||
|
||||
**{{% influxdb/custom-timestamps-span %}}2022-01-01T08:00:00Z{{% /influxdb/custom-timestamps-span %}}**
|
||||
to
|
||||
**{{% influxdb/custom-timestamps-span %}}2022-01-01T20:00:00Z{{% /influxdb/custom-timestamps-span %}}**
|
||||
<em style="opacity: .5">(Customizable)</em>
|
||||
|
||||
##### Schema
|
||||
|
||||
- home_actions <em style="opacity: .5">(measurement)</em>
|
||||
- **tags**:
|
||||
- room
|
||||
- Kitchen
|
||||
- Living Room
|
||||
- action
|
||||
- alert
|
||||
- cool
|
||||
- level
|
||||
- ok
|
||||
- warn
|
||||
- **fields**:
|
||||
- description <em style="opacity: .5">(string)</em>
|
||||
|
||||
{{< expand-wrapper >}}
|
||||
{{% expand "Write home sensor actions data to InfluxDB" %}}
|
||||
|
||||
#### Write the home sensor actions data to InfluxDB
|
||||
|
||||
Use the InfluxDB v2 or v1 API to write the home sensor actions sample data
|
||||
to {{< product-name >}}.
|
||||
|
||||
{{< code-tabs-wrapper >}}
|
||||
{{% code-tabs %}}
|
||||
[v2 API](#)
|
||||
[v1 API](#)
|
||||
{{% /code-tabs %}}
|
||||
{{% code-tab-content %}}
|
||||
|
||||
{{% influxdb/custom-timestamps %}}
|
||||
{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
|
||||
```sh
|
||||
curl --request POST \
|
||||
https://{{< influxdb/host >}}/api/v2/write?bucket=DATABASE_NAME&precision=s \
|
||||
--header "Authorization: Bearer DATABASE_TOKEN" \
|
||||
--header "Content-Type: text/plain; charset=utf-8" \
|
||||
--header "Accept: application/json" \
|
||||
--data-binary '
|
||||
home_actions,room=Kitchen,action=cool,level=ok description="Temperature at or above 23°C (23°C). Cooling to 22°C." 1641027600
|
||||
home_actions,room=Kitchen,action=cool,level=ok description="Temperature at or above 23°C (23.3°C). Cooling to 22°C." 1641060000
|
||||
home_actions,room=Kitchen,action=cool,level=ok description="Temperature at or above 23°C (23.1°C). Cooling to 22°C." 1641063600
|
||||
home_actions,room=Kitchen,action=alert,level=warn description="Carbon monoxide level above normal: 18 ppm." 1641060000
|
||||
home_actions,room=Kitchen,action=alert,level=warn description="Carbon monoxide level above normal: 22 ppm." 1641063600
|
||||
home_actions,room=Kitchen,action=alert,level=warn description="Carbon monoxide level above normal: 26 ppm." 1641067200
|
||||
home_actions,room=Living\ Room,action=alert,level=warn description="Carbon monoxide level above normal: 14 ppm." 1641063600
|
||||
home_actions,room=Living\ Room,action=alert,level=warn description="Carbon monoxide level above normal: 17 ppm." 1641067200
|
||||
'
|
||||
```
|
||||
{{% /code-placeholders %}}
|
||||
{{% /influxdb/custom-timestamps %}}
|
||||
|
||||
{{% /code-tab-content %}}
|
||||
{{% code-tab-content %}}
|
||||
|
||||
{{% influxdb/custom-timestamps %}}
|
||||
{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
|
||||
```sh
|
||||
curl --request POST \
|
||||
https://{{< influxdb/host >}}/write?db=DATABASE_NAME&precision=s \
|
||||
--header "Authorization: Bearer DATABASE_TOKEN" \
|
||||
--header "Content-type: text/plain; charset=utf-8" \
|
||||
--data-binary '
|
||||
home_actions,room=Kitchen,action=cool,level=ok description="Temperature at or above 23°C (23°C). Cooling to 22°C." 1641027600
|
||||
home_actions,room=Kitchen,action=cool,level=ok description="Temperature at or above 23°C (23.3°C). Cooling to 22°C." 1641060000
|
||||
home_actions,room=Kitchen,action=cool,level=ok description="Temperature at or above 23°C (23.1°C). Cooling to 22°C." 1641063600
|
||||
home_actions,room=Kitchen,action=alert,level=warn description="Carbon monoxide level above normal: 18 ppm." 1641060000
|
||||
home_actions,room=Kitchen,action=alert,level=warn description="Carbon monoxide level above normal: 22 ppm." 1641063600
|
||||
home_actions,room=Kitchen,action=alert,level=warn description="Carbon monoxide level above normal: 26 ppm." 1641067200
|
||||
home_actions,room=Living\ Room,action=alert,level=warn description="Carbon monoxide level above normal: 14 ppm." 1641063600
|
||||
home_actions,room=Living\ Room,action=alert,level=warn description="Carbon monoxide level above normal: 17 ppm." 1641067200
|
||||
'
|
||||
```
|
||||
{{% /code-placeholders %}}
|
||||
{{% /influxdb/custom-timestamps %}}
|
||||
|
||||
{{% /code-tab-content %}}
|
||||
{{< /code-tabs-wrapper >}}
|
||||
|
||||
Replace the following in the sample script:
|
||||
|
||||
- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}:
|
||||
your InfluxDB Cloud Dedicated database
|
||||
- {{% code-placeholder-key %}}`DATABASE_TOKEN`{{% /code-placeholder-key %}}:
|
||||
a [database token](/influxdb3/cloud-dedicated/admin/tokens/#database-tokens)
|
||||
with _write_ permission to the database
|
||||
|
||||
{{% /expand %}}
|
||||
{{< /expand-wrapper >}}
|
||||
|
||||
## NOAA Bay Area weather data
|
||||
|
||||
Includes daily weather metrics from three San Francisco Bay Area airports from
|
||||
**January 1, 2020 to December 31, 2022**.
|
||||
This sample dataset includes seasonal trends and is good for exploring time
|
||||
series use cases that involve seasonality.
|
||||
|
||||
##### Time Range
|
||||
|
||||
**2020-01-01T00:00:00Z** to **2022-12-31T00:00:00Z**
|
||||
|
||||
##### Schema
|
||||
|
||||
- weather <em style="opacity: .5">(measurement)</em>
|
||||
- **tags**:
|
||||
- location
|
||||
- Concord
|
||||
- Hayward
|
||||
- San Francisco
|
||||
- **fields**
|
||||
- precip <em style="opacity: .5">(float)</em>
|
||||
- temp_avg <em style="opacity: .5">(float)</em>
|
||||
- temp_max <em style="opacity: .5">(float)</em>
|
||||
- temp_min <em style="opacity: .5">(float)</em>
|
||||
- wind_avg <em style="opacity: .5">(float)</em>
|
||||
|
||||
{{< expand-wrapper >}}
|
||||
{{% expand "Write the NOAA Bay Area weather data to InfluxDB" %}}
|
||||
|
||||
#### Write the NOAA Bay Area weather data to InfluxDB
|
||||
|
||||
Use the InfluxDB v2 or v1 API to write the NOAA Bay Area weather sample data to
|
||||
{{< product-name >}}.
|
||||
|
||||
{{< code-tabs-wrapper >}}
|
||||
{{% code-tabs %}}
|
||||
[v2 API](#)
|
||||
[v1 API](#)
|
||||
{{% /code-tabs %}}
|
||||
{{% code-tab-content %}}
|
||||
|
||||
{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
|
||||
```sh
|
||||
curl --request POST \
|
||||
http://{{< influxdb/host >}}/api/v2/write?bucket=DATABASE_NAME \
|
||||
--header "Authorization: Bearer DATABASE_TOKEN" \
|
||||
--header "Content-Type: text/plain; charset=utf-8" \
|
||||
--header "Accept: application/json" \
|
||||
--data-binary "$(curl --request GET https://docs.influxdata.com/downloads/bay-area-weather.lp)"
|
||||
```
|
||||
{{% /code-placeholders %}}
|
||||
|
||||
{{% /code-tab-content %}}
|
||||
{{% code-tab-content %}}
|
||||
|
||||
{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
|
||||
```sh
|
||||
curl --request POST \
|
||||
http://{{< influxdb/host >}}/write?db=DATABASE_NAME \
|
||||
--header "Authorization: Bearer DATABASE_TOKEN" \
|
||||
--header "Content-type: text/plain; charset=utf-8" \
|
||||
--data-binary "$(curl --request GET https://docs.influxdata.com/downloads/bay-area-weather.lp)"
|
||||
```
|
||||
{{% /code-placeholders %}}
|
||||
|
||||
{{% /code-tab-content %}}
|
||||
{{< /code-tabs-wrapper >}}
|
||||
|
||||
Replace the following in the sample script:
|
||||
|
||||
- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}:
|
||||
your InfluxDB Cloud Dedicated database
|
||||
- {{% code-placeholder-key %}}`DATABASE_TOKEN`{{% /code-placeholder-key %}}:
|
||||
a [database token](/influxdb3/cloud-dedicated/admin/tokens/#database-tokens)
|
||||
with sufficient permissions to the specified database
|
||||
|
||||
{{% /expand %}}
|
||||
{{< /expand-wrapper >}}
|
||||
|
||||
## Bitcoin price data
|
||||
|
||||
The Bitcoin price sample dataset provides Bitcoin prices from
|
||||
**2023-05-01T00:00:00Z to 2023-05-15T00:00:00Z**—_[Powered by CoinDesk](https://www.coindesk.com/price/bitcoin)_.
|
||||
|
||||
##### Time Range
|
||||
|
||||
**2023-05-01T00:19:00Z** to **2023-05-14T23:48:00Z**
|
||||
|
||||
##### Schema
|
||||
|
||||
- bitcoin <em style="opacity: .5">(measurement)</em>
|
||||
- **tags**:
|
||||
- code
|
||||
- EUR
|
||||
- GBP
|
||||
- USD
|
||||
- crypto
|
||||
- bitcoin
|
||||
- description
|
||||
- Euro
|
||||
- British Pound Sterling
|
||||
- United States Dollar
|
||||
- symbol
|
||||
- \€ <em style="opacity: .5">(€)</em>
|
||||
- \£ <em style="opacity: .5">(£)</em>
|
||||
- \$ <em style="opacity: .5">($)</em>
|
||||
- **fields**
|
||||
- price <em style="opacity: .5">(float)</em>
|
||||
|
||||
{{< expand-wrapper >}}
|
||||
{{% expand "Write the Bitcoin sample data to InfluxDB" %}}
|
||||
|
||||
#### Write the Bitcoin price sample data to InfluxDB
|
||||
|
||||
Use the InfluxDB v2 or v1 API to write the Bitcoin price sample data to
|
||||
{{< product-name >}}.
|
||||
|
||||
{{< code-tabs-wrapper >}}
|
||||
{{% code-tabs %}}
|
||||
[v2 API](#)
|
||||
[v1 API](#)
|
||||
{{% /code-tabs %}}
|
||||
{{% code-tab-content %}}
|
||||
|
||||
{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
|
||||
```sh
|
||||
curl --request POST \
|
||||
http://{{< influxdb/host >}}/api/v2/write?bucket=DATABASE_NAME \
|
||||
--header "Authorization: Bearer DATABASE_TOKEN" \
|
||||
--header "Content-Type: text/plain; charset=utf-8" \
|
||||
--header "Accept: application/json" \
|
||||
--data-binary "$(curl --request GET https://docs.influxdata.com/downloads/bitcoin.lp)"
|
||||
```
|
||||
{{% /code-placeholders %}}
|
||||
|
||||
{{% /code-tab-content %}}
|
||||
{{% code-tab-content %}}
|
||||
|
||||
{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
|
||||
```sh
|
||||
curl --request POST \
|
||||
http://{{< influxdb/host >}}/write?db=DATABASE_NAME \
|
||||
--header "Authorization: Bearer DATABASE_TOKEN" \
|
||||
--header "Content-type: text/plain; charset=utf-8" \
|
||||
--data-binary "$(curl --request GET https://docs.influxdata.com/downloads/bitcoin.lp)"
|
||||
```
|
||||
{{% /code-placeholders %}}
|
||||
|
||||
{{% /code-tab-content %}}
|
||||
{{< /code-tabs-wrapper >}}
|
||||
|
||||
Replace the following in the sample script:
|
||||
|
||||
- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}:
  your InfluxDB Cloud Dedicated database
- {{% code-placeholder-key %}}`DATABASE_TOKEN`{{% /code-placeholder-key %}}:
  a [database token](/influxdb3/cloud-dedicated/admin/tokens/#database-tokens)
  with sufficient permissions to the specified database

{{% /expand %}}
{{< /expand-wrapper >}}

## Random numbers sample data

Includes two fields with randomly generated numbers reported every minute.
Each field has a specific range of randomly generated numbers.
This sample dataset is used to demonstrate mathematical operations and
transformation functions.

##### Time Range

**2023-01-01T00:00:00Z** to **2023-01-01T12:00:00Z**

##### Schema

- numbers <em style="opacity: .5">(measurement)</em>
  - **fields**
    - a <em style="opacity: .5">(float between -1 and 1)</em>
    - b <em style="opacity: .5">(float between -3 and 3)</em>

{{< expand-wrapper >}}
{{% expand "Write the random number sample data to InfluxDB" %}}

#### Write the random number sample data to InfluxDB

Use the InfluxDB v2 or v1 API to write the random number sample data to
{{< product-name >}}.

{{< code-tabs-wrapper >}}
{{% code-tabs %}}
[v2 API](#)
[v1 API](#)
{{% /code-tabs %}}
{{% code-tab-content %}}

{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
```sh
curl --request POST \
  http://{{< influxdb/host >}}/api/v2/write?bucket=DATABASE_NAME \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-Type: text/plain; charset=utf-8" \
  --header "Accept: application/json" \
  --data-binary "$(curl --request GET https://docs.influxdata.com/downloads/random-numbers.lp)"
```
{{% /code-placeholders %}}

{{% /code-tab-content %}}
{{% code-tab-content %}}

{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
```sh
curl --request POST \
  http://{{< influxdb/host >}}/write?db=DATABASE_NAME \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-type: text/plain; charset=utf-8" \
  --data-binary "$(curl --request GET https://docs.influxdata.com/downloads/random-numbers.lp)"
```
{{% /code-placeholders %}}

{{% /code-tab-content %}}
{{< /code-tabs-wrapper >}}

Replace the following in the sample script:

- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}:
  your InfluxDB Cloud Dedicated database
- {{% code-placeholder-key %}}`DATABASE_TOKEN`{{% /code-placeholder-key %}}:
  a [database token](/influxdb3/cloud-dedicated/admin/tokens/#database-tokens)
  with sufficient permissions to the specified database

{{% /expand %}}
{{< /expand-wrapper >}}
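After writing the dataset, you can exercise the kind of math it is designed for. The following SQL query is a sketch, not part of the source docs, that scales field `b` into the same range as `a` and compares the two:

```sql
-- Scale b from [-3, 3] into [-1, 1], then compare it with a
SELECT
  time,
  a,
  b / 3.0 AS b_scaled,
  abs(a - b / 3.0) AS diff
FROM numbers
ORDER BY time
LIMIT 10
```

The column aliases (`b_scaled`, `diff`) are illustrative; any scalar math or transformation function can be substituted to experiment with the dataset.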
<!--
//SOURCE content/shared/influxdb3-sample-data/sample-data-dist.md
-->

@@ -0,0 +1,20 @@
---
title: Use Power BI to visualize data
description: >
  Use Microsoft Power BI Desktop with the InfluxDB 3 custom connector to query and
  visualize data from {{% product-name %}}.
menu:
  influxdb3_cloud_dedicated:
    name: Power BI
    parent: Visualize data
weight: 104
influxdb3/cloud-dedicated/tags: [visualization, powerbi, sql]
metadata: [SQL]
related:
  - https://learn.microsoft.com/en-us/power-bi/desktop/, Power BI documentation
  - /influxdb3/cloud-dedicated/query-data/sql/
  - /influxdb3/cloud-dedicated/query-data/execute-queries/odbc/
source: /shared/influxdb3-visualize/powerbi.md
---

<!-- //SOURCE content/shared/influxdb3-visualize/powerbi.md -->

@@ -9,6 +9,8 @@ menu:
    name: Use Chronograf
    parent: Visualize data
  weight: 202
aliases:
  - /influxdb3/cloud-serverless/visualize-data/chronograf/
related:
  - /chronograf/v1/
metadata: [InfluxQL only]

@@ -77,14 +79,14 @@ If you haven't already, [download and install Chronograf](/chronograf/v1/introdu
> schema information may not be available in the Data Explorer.
> This limits the Data Explorer's query building functionality and requires you to
> build queries manually using
- > [fully-qualified measurements](/influxdb3/cloud-serverless/reference/influxql/select/#fully-qualified-measurement)
+ > [fully qualified measurements](/influxdb3/cloud-serverless/reference/influxql/select/#fully-qualified-measurement)
> in the `FROM` clause. For example:
>
> ```sql
- > -- Fully-qualified measurement
+ > -- Fully qualified measurement
> SELECT * FROM "db-name"."rp-name"."measurement-name"
>
- > -- Fully-qualified measurement shorthand (use the default retention policy)
+ >
+ > -- Fully qualified measurement shorthand (use the default retention policy)
> SELECT * FROM "db-name".."measurement-name"
> ```
>

@@ -13,6 +13,7 @@ influxdb3/cloud-serverless/tags: [Flight client, query, flightsql, superset]
aliases:
  - /influxdb3/cloud-serverless/query-data/tools/superset/
  - /influxdb3/cloud-serverless/query-data/sql/execute-queries/superset/
  - /influxdb3/cloud-serverless/visualize-data/superset/
  - /influxdb3/cloud-serverless/process-data/tools/superset/
alt_links:
  core: /influxdb3/core/visualize-data/superset/

@@ -11,6 +11,7 @@ menu:
influxdb3/cloud-serverless/tags: [Flight client, query, flightsql, tableau, sql]
aliases:
  - /influxdb3/cloud-serverless/query-data/sql/execute-queries/tableau/
  - /influxdb3/cloud-serverless/visualize-data/tableau/
alt_links:
  core: /influxdb3/core/visualize-data/tableau/
  enterprise: /influxdb3/enterprise/visualize-data/tableau/

@@ -0,0 +1,19 @@
---
title: Execute SQL queries with ODBC
description: >
  Use the Arrow Flight SQL ODBC driver to execute SQL queries against {{% product-name %}} from
  ODBC-compatible applications and programming languages.
menu:
  influxdb3_cloud_serverless:
    name: Use ODBC
    parent: Execute queries
weight: 351
influxdb3/cloud-serverless/tags: [query, sql, odbc]
metadata: [SQL]
related:
  - /influxdb3/cloud-serverless/reference/sql/
  - /influxdb3/cloud-serverless/query-data/
source: /shared/influxdb3-query-guides/execute-queries/odbc.md
---

<!-- //SOURCE content/shared/influxdb3-query-guides/execute-queries/odbc.md -->

@@ -9,7 +9,7 @@ menu:
  parent: Execute queries
  name: Use visualization tools
  identifier: query-with-visualization-tools
- influxdb3/cloud-serverless/tags: [query, sql, influxql]
+ influxdb3/cloud-serverless/tags: [query, sql, influxql, visualization]
metadata: [SQL, InfluxQL]
aliases:
  - /influxdb3/cloud-serverless/query-data/influxql/execute-queries/visualization-tools/

@@ -27,6 +27,7 @@ Use visualization tools to query data stored in {{% product-name %}}.
The following visualization tools support querying InfluxDB with SQL:

- [Grafana](/influxdb3/cloud-serverless/process-data/visualize/grafana/)
- [Power BI](/influxdb3/cloud-serverless/process-data/visualize/powerbi/)
- [Superset](/influxdb3/cloud-serverless/process-data/visualize/superset/)
- [Tableau](/influxdb3/cloud-serverless/process-data/visualize/tableau/)

@@ -18,6 +18,7 @@ Use the following sample datasets to replicate provided examples.
- [Get started home sensor data](#get-started-home-sensor-data)
- [Home sensor actions data](#home-sensor-actions-data)
- [NOAA Bay Area weather data](#noaa-bay-area-weather-data)
- [European Union wind data](#european-union-wind-data)
- [Bitcoin price data](#bitcoin-price-data)
- [Random numbers sample data](#random-numbers-sample-data)

@@ -348,6 +349,78 @@ Replace the following in the sample script:
{{% /expand %}}
{{< /expand-wrapper >}}

## European Union wind data

The European Union (EU) wind sample dataset provides hourly measurements of
wind speed and wind direction from various cities in the EU.
The dataset includes a hierarchical tag set of country, county, and city.

##### Time Range

**2025-10-01T00:00:00Z** to **2025-10-01T23:00:00Z**

##### Schema

- wind_data <em style="opacity: .5">(table)</em>
  - **tags**:
    - country
      - _20 countries_
    - county
      - _111 counties_
    - city
      - _129 cities_
  - **fields**:
    - wind_speed <em style="opacity: .5">(float)</em>
    - wind_direction <em style="opacity: .5">(integer)</em>

{{< expand-wrapper >}}
{{% expand "Write the EU wind sample data to InfluxDB" %}}

#### Write the EU wind sample data to InfluxDB

Use the InfluxDB v2 or v1 API to write the EU wind sample data to {{< product-name >}}.

{{< code-tabs-wrapper >}}
{{% code-tabs %}}
[v2 API](#)
[v1 API](#)
{{% /code-tabs %}}
{{% code-tab-content %}}

```sh {placeholders="DATABASE_(TOKEN|NAME)"}
curl --request POST \
  http://{{< influxdb/host >}}/api/v2/write?bucket=DATABASE_NAME \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-Type: text/plain; charset=utf-8" \
  --header "Accept: application/json" \
  --data-binary "$(curl --request GET https://docs.influxdata.com/downloads/eu-wind-data.lp)"
```

{{% /code-tab-content %}}
{{% code-tab-content %}}

```sh {placeholders="DATABASE_(TOKEN|NAME)"}
curl --request POST \
  http://{{< influxdb/host >}}/write?db=DATABASE_NAME \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-type: text/plain; charset=utf-8" \
  --data-binary "$(curl --request GET https://docs.influxdata.com/downloads/eu-wind-data.lp)"
```

{{% /code-tab-content %}}
{{< /code-tabs-wrapper >}}

Replace the following in the sample script:

- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}:
  your {{% product-name %}} database
- {{% code-placeholder-key %}}`DATABASE_TOKEN`{{% /code-placeholder-key %}}:
  a [database token](/influxdb3/version/admin/tokens/#database-tokens)
  with sufficient permissions to the specified database

{{% /expand %}}
{{< /expand-wrapper >}}
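With the hierarchical tag set above, a natural first query is an aggregate by one of the tag levels. The following SQL is a sketch, not from the source docs, assuming the `wind_data` table written above:

```sql
-- Average wind speed per country across the sample day, windiest first
SELECT
  country,
  avg(wind_speed) AS avg_wind_speed,
  count(*) AS readings
FROM wind_data
GROUP BY country
ORDER BY avg_wind_speed DESC
```

Swapping `country` for `county` or `city` drills down the same hierarchy without changing the rest of the query.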

## Bitcoin price data

The Bitcoin price sample dataset provides Bitcoin prices from
@@ -0,0 +1,20 @@
---
title: Use Power BI to visualize data
description: >
  Use Microsoft Power BI Desktop with the InfluxDB 3 custom connector to query and
  visualize data from {{% product-name %}}.
menu:
  influxdb3_cloud_serverless:
    name: Power BI
    parent: Visualize data
weight: 104
influxdb3/cloud-serverless/tags: [visualization, powerbi, sql]
metadata: [SQL]
related:
  - https://learn.microsoft.com/en-us/power-bi/desktop/, Power BI documentation
  - /influxdb3/cloud-serverless/query-data/sql/
  - /influxdb3/cloud-serverless/query-data/execute-queries/odbc/
source: /shared/influxdb3-visualize/powerbi.md
---

<!-- //SOURCE content/shared/influxdb3-visualize/powerbi.md -->

@@ -25,6 +25,10 @@ to rename a database in your {{< product-name omit=" Cluster" >}} cluster.
> or update [database tokens](/influxdb3/clustered/admin/tokens/database/).
> After renaming a database, any existing database tokens will stop working and you
> must create new tokens with permissions for the renamed database.
>
> If you create a new database using the previous database name, tokens
> associated with that database name will grant access to the newly created
> database.

## Rename a database using the influxctl CLI

@@ -0,0 +1,65 @@
---
title: Undelete a table
description: >
  Use the [`influxctl table undelete` command](/influxdb3/clustered/reference/cli/influxctl/table/undelete/)
  to restore a previously deleted table in your {{< product-name omit=" Cluster" >}} cluster.
menu:
  influxdb3_clustered:
    parent: Manage tables
weight: 204
list_code_example: |
  ```bash { placeholders="DATABASE_NAME|TABLE_ID" }
  influxctl table undelete DATABASE_NAME TABLE_ID
  ```
related:
  - /influxdb3/clustered/reference/cli/influxctl/table/undelete/
  - /influxdb3/clustered/admin/tables/delete/
  - /influxdb3/clustered/admin/tokens/table/create/
---

Use the [`influxctl table undelete` command](/influxdb3/clustered/reference/cli/influxctl/table/undelete/)
to restore a previously deleted table in your {{< product-name omit=" Cluster" >}} cluster.

> [!Important]
> To undelete a table:
>
> - A new table with the same name cannot already exist.
> - You must have appropriate permissions to manage databases.

When you undelete a table, it is restored with the same partition template and
other settings as when it was deleted.

> [!Warning]
> Tables can only be undeleted for
> {{% show-in "cloud-dedicated" %}}approximately 14 days{{% /show-in %}}{{% show-in "clustered" %}}a configurable "hard-delete" grace period{{% /show-in %}}
> after they are deleted.
> After this grace period, all Parquet files associated with the deleted table
> are permanently removed and the table cannot be undeleted.

## Undelete a table using the influxctl CLI

```bash { placeholders="DATABASE_NAME|TABLE_ID" }
influxctl table undelete DATABASE_NAME TABLE_ID
```

Replace the following:

- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}:
  Name of the database associated with the deleted table
- {{% code-placeholder-key %}}`TABLE_ID`{{% /code-placeholder-key %}}:
  ID of the deleted table to restore

> [!Tip]
> #### View deleted table IDs
>
> To view the IDs of deleted tables, use the `influxctl table list` command with
> the `--filter-status=deleted` flag--for example:
>
> <!--pytest.mark.skip-->
>
> ```bash {placeholders="DATABASE_NAME" }
> influxctl table list --filter-status=deleted DATABASE_NAME
> ```
>
> Replace {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}
> with the name of the database associated with the table you want to undelete.
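Taken together, the tip and the command above form a two-step restore flow. The following is a sketch; `mydb` is a hypothetical database name, and `TABLE_ID` comes from the first command's output:

```bash
# 1. Find the ID of the deleted table
influxctl table list --filter-status=deleted mydb

# 2. Restore the table using the ID from the previous output
influxctl table undelete mydb TABLE_ID
```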
@@ -16,13 +16,13 @@ aliases:
  - /influxdb3/clustered/install/licensing/
---

- Install your InfluxDB Clustered license in your cluster to authorize the use
- of the InfluxDB Clustered software.
+ Install your {{% product-name %}} license in your cluster to authorize the use
+ of the {{% product-name %}} software.

- ## Install your InfluxDB license
+ ## Install your {{% product-name %}} license

1. If you haven't already,
-    [request an InfluxDB Clustered license](https://influxdata.com/contact-sales).
+    [request an {{% product-name %}} license](https://influxdata.com/contact-sales).
2. InfluxData provides you with a `license.yml` file that encapsulates your
   license token as a custom Kubernetes resource.
3. Use `kubectl` to apply and create the `License` resource in your InfluxDB

@@ -34,28 +34,28 @@ of the InfluxDB Clustered software.
   kubectl apply --filename license.yml --namespace influxdb
   ```

- InfluxDB Clustered detects the `License` resource and extracts the credentials
- into a secret required by InfluxDB Clustered Kubernetes pods.
+ {{% product-name %}} detects the `License` resource and extracts the credentials
+ into a secret required by {{% product-name %}} Kubernetes pods.
Pods validate the license secret both at startup and periodically (roughly once
per hour) while running.

## Upgrade from a non-licensed release

- If you are currently using a non-licensed preview release of InfluxDB Clustered
+ If you are currently using a non-licensed preview release of {{% product-name %}}
and want to upgrade to a licensed release, do the following:

- 1. [Install an InfluxDB license](#install-your-influxdb-license)
+ 1. [Install an {{% product-name %}} license](#install-your-influxdb-clustered-license)
2. If you [use the `AppInstance` resource configuration](/influxdb3/clustered/install/set-up-cluster/configure-cluster/directly/)
   to configure your cluster, in your `myinfluxdb.yml`, update the package
   version defined in `spec.package.image` to use a licensed release.

-    If using the InfluxDB Clustered Helm chart, update the `image.tag` property
+    If using the {{% product-name %}} Helm chart, update the `image.tag` property
   in your `values.yaml` to use a licensed release.

> [!Warning]
> #### Upgrade to checkpoint releases first
>
- > When upgrading InfluxDB Clustered, always upgrade to each
+ > When upgrading {{% product-name %}}, always upgrade to each
> [checkpoint release](/influxdb3/clustered/admin/upgrade/#checkpoint-releases)
> first, before proceeding to newer versions.
> Upgrading past a checkpoint release without first upgrading to it may result in

@@ -103,6 +103,33 @@ the version number to upgrade to.
After you have activated your license, use the following signals to verify the
license is active and functioning.

In your commands, replace the following:

- {{% code-placeholder-key %}}`NAMESPACE`{{% /code-placeholder-key %}}:
  your [InfluxDB namespace](/influxdb3/clustered/install/set-up-cluster/configure-cluster/#create-a-namespace-for-influxdb)
- {{% code-placeholder-key %}}`POD_NAME`{{% /code-placeholder-key %}}:
  your [InfluxDB Kubernetes pod](/influxdb3/clustered/install/set-up-cluster/deploy/#inspect-cluster-pods)

### Verify database components

After you [install your license](#install-your-influxdb-clustered-license),
run the following command to check that database pods start up and are in the
`Running` state:

<!--pytest.mark.skip-->

```bash
kubectl get pods -l app=iox --namespace influxdb
```

If a `Pod` fails to start, run the following command to view pod information:

<!--pytest.mark.skip-->

```sh { placeholders="POD_NAME" }
kubectl describe pod POD_NAME --namespace influxdb
```

### Verify the `Secret` exists

Run the following command to verify that the licensing activation created a

@@ -116,7 +143,8 @@ kubectl get secret iox-license --namespace influxdb

If the secret doesn't exist,
[view `license-controller` logs](#view-license-controller-logs) for more
- information or errors.
+ information or errors. For troubleshooting guidance, see
+ [Manage your {{% product-name %}} license](/influxdb3/clustered/admin/licensing/).

### View `license controller` logs

@@ -130,7 +158,20 @@ following command:
kubectl logs deployment/license-controller --namespace influxdb
```

- For more information about InfluxDB Clustered licensing, see
- [Manage your InfluxDB Clustered license](/influxdb3/clustered/admin/licensing/)
+ ## Renew your license
+
+ > [!Tip]
+ > Before your license expires, your InfluxData sales representative will
+ > contact you about license renewal.
+ > You may also contact your sales representative at any time.
+
+ If you have an expired license, follow the same process to [install your renewed license](#install-your-influxdb-clustered-license) using the new `license.yml` file provided by InfluxData.
+
+ > [!Important]
+ > #### Recover from an expired license
+ > If your license has already expired and your cluster pods are in a `CrashLoopBackoff` state, applying a valid renewed license will restore normal operation. For more information about license enforcement and recovery, see [Manage your {{% product-name %}} license](/influxdb3/clustered/admin/licensing/).
+
+ For more information about {{% product-name %}} licensing, including license enforcement, grace periods, and detailed troubleshooting, see
+ [Manage your {{% product-name %}} license](/influxdb3/clustered/admin/licensing/).

{{< page-nav prev="/influxdb3/clustered/install/set-up-cluster/configure-cluster/" prevText="Configure your cluster" next="/influxdb3/clustered/install/set-up-cluster/deploy/" nextText="Deploy your cluster" keepTab=true >}}

@@ -9,6 +9,8 @@ menu:
    name: Use Chronograf
    parent: Visualize data
  weight: 202
aliases:
  - /influxdb3/clustered/visualize-data/chronograf/
related:
  - /chronograf/v1/
metadata: [InfluxQL only]

@@ -84,14 +86,14 @@ If you haven't already, [download and install Chronograf](/chronograf/v1/introdu
> schema information may not be available in the Data Explorer.
> This limits the Data Explorer's query building functionality and requires you to
> build queries manually using
- > [fully-qualified measurements](/influxdb3/clustered/reference/influxql/select/#fully-qualified-measurement)
+ > [fully qualified measurements](/influxdb3/clustered/reference/influxql/select/#fully-qualified-measurement)
> in the `FROM` clause. For example:
>
> ```sql
- > -- Fully-qualified measurement
+ > -- Fully qualified measurement
> SELECT * FROM "db-name"."rp-name"."measurement-name"
>
- > -- Fully-qualified measurement shorthand (use the default retention policy)
+ >
+ > -- Fully qualified measurement shorthand (use the default retention policy)
> SELECT * FROM "db-name".."measurement-name"
> ```
>

@@ -15,6 +15,7 @@ aliases:
  - /influxdb3/clustered/query-data/sql/execute-queries/grafana/
  - /influxdb3/clustered/query-data/influxql/execute-queries/grafana
  - /influxdb3/clustered/process-data/tools/grafana/
  - /influxdb3/clustered/visualize-data/grafana/
alt_links:
  v2: /influxdb/v2/tools/grafana/
  cloud: /influxdb/cloud/tools/grafana/

@@ -13,6 +13,7 @@ menu:
influxdb3/clustered/tags: [Flight client, query, flightsql, superset]
aliases:
  - /influxdb3/clustered/query-data/execute-queries/flight-sql/superset/
  - /influxdb3/clustered/visualize-data/superset/
  - /influxdb3/clustered/query-data/tools/superset/
  - /influxdb3/clustered/query-data/sql/execute-queries/superset/
  - /influxdb3/clustered/process-data/tools/superset/

@@ -13,6 +13,7 @@ menu:
influxdb3/clustered/tags: [Flight client, query, flightsql, tableau, sql]
aliases:
  - /influxdb3/clustered/query-data/execute-queries/flight-sql/tableau/
  - /influxdb3/clustered/visualize-data/tableau/
  - /influxdb3/clustered/query-data/tools/tableau/
  - /influxdb3/clustered/query-data/sql/execute-queries/tableau/
  - /influxdb3/clustered/process-data/tools/tableau/

@@ -0,0 +1,19 @@
---
title: Execute SQL queries with ODBC
description: >
  Use the Arrow Flight SQL ODBC driver to execute SQL queries against {{% product-name %}} from
  ODBC-compatible applications and programming languages.
menu:
  influxdb3_clustered:
    name: Use ODBC
    parent: Execute queries
weight: 351
influxdb3/clustered/tags: [query, sql, odbc]
metadata: [SQL]
related:
  - /influxdb3/clustered/reference/sql/
  - /influxdb3/clustered/query-data/
source: /shared/influxdb3-query-guides/execute-queries/odbc.md
---

<!-- //SOURCE content/shared/influxdb3-query-guides/execute-queries/odbc.md -->

@@ -9,7 +9,7 @@ menu:
  parent: Execute queries
  name: Use visualization tools
  identifier: query-with-visualization-tools
- influxdb3/clustered/tags: [query, sql, influxql]
+ influxdb3/clustered/tags: [query, sql, influxql, visualization]
metadata: [SQL, InfluxQL]
aliases:
  - /influxdb3/clustered/query-data/influxql/execute-queries/visualization-tools/

@@ -27,6 +27,7 @@ Use visualization tools to query data stored in {{% product-name %}} with SQL.
The following visualization tools support querying InfluxDB with SQL:

- [Grafana](/influxdb3/clustered/process-data/visualize/grafana/)
- [Power BI](/influxdb3/clustered/process-data/visualize/powerbi/)
- [Superset](/influxdb3/clustered/process-data/visualize/superset/)
- [Tableau](/influxdb3/clustered/process-data/visualize/tableau/)

@@ -0,0 +1,14 @@
---
title: influxctl table undelete
description: >
  The `influxctl table undelete` command undeletes a previously deleted
  table in an {{% product-name omit=" Clustered" %}} cluster.
menu:
  influxdb3_clustered:
    parent: influxctl table
weight: 301
metadata: [influxctl 2.10.4+]
source: /shared/influxctl/table/undelete.md
---

<!-- //SOURCE content/shared/influxctl/table/undelete.md -->

@@ -12,12 +12,12 @@ weight: 201

> [!Note]
> ## Checkpoint releases {.checkpoint}
>
> Some InfluxDB Clustered releases are checkpoint releases that introduce a
> breaking change to an InfluxDB component.
> When [upgrading InfluxDB Clustered](/influxdb3/clustered/admin/upgrade/),
> **always upgrade to each checkpoint release first, before proceeding to newer versions**.
>
> Checkpoint releases are only made when absolutely necessary and are clearly
> identified below with the <span class="cf-icon Shield pink"></span> icon.

@@ -61,6 +61,69 @@ directory. This new directory contains artifacts associated with the specified r

---

## 20250925-1878107 {date="2025-09-25"}

### Quickstart

```yaml
spec:
  package:
    image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20250925-1878107
```

#### Release artifacts

- [app-instance-schema.json](/downloads/clustered-release-artifacts/20250925-1878107/app-instance-schema.json)
- [example-customer.yml](/downloads/clustered-release-artifacts/20250925-1878107/example-customer.yml)
- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)

### Highlights

#### Rename and undelete tables

Tables can now be renamed and undeleted with [influxctl v2.10.5](https://docs.influxdata.com/influxdb3/clustered/reference/release-notes/influxctl/#2105) or later.

To enable hard delete of soft-deleted namespaces:

- Set `INFLUXDB_IOX_ENABLE_NAMESPACE_ROW_DELETION` to `true`.
- If needed, adjust how long a namespace remains soft-deleted (and eligible for undeletion) by setting `INFLUXDB_IOX_GC_NAMESPACE_CUTOFF` (default: `14d`).
- If needed, adjust how long the garbage collector sleeps between runs of the namespace deletion task with `INFLUXDB_IOX_GC_NAMESPACE_SLEEP_INTERVAL`. The default is `24h`, which should be suitable for ongoing cleanup, but if there is a backlog of soft-deleted namespaces to clean up, you may want to run the task more frequently until the garbage collector has caught up.
- If needed, adjust the maximum number of namespaces that are hard deleted in one run of the namespace deletion task with `INFLUXDB_IOX_GC_NAMESPACE_LIMIT`. The default is `1000`, which should be suitable for ongoing cleanup, but if you have a large number of namespaces and you're running the task very frequently, you may need to lower this to delete fewer records per run if each individual run is timing out.

To enable hard delete of soft-deleted tables in active namespaces (soft-deleted tables in soft-deleted namespaces get cleaned up when the namespace gets cleaned up):

- Set `INFLUXDB_IOX_ENABLE_TABLE_ROW_DELETION` to `true`, and if needed, adjust these settings that work in the same way as the corresponding namespace flags:
  - `INFLUXDB_IOX_GC_TABLE_CUTOFF` (default: `14d`)
  - `INFLUXDB_IOX_GC_TABLE_SLEEP_INTERVAL` (default: `24h`)
  - `INFLUXDB_IOX_GC_TABLE_LIMIT` (default: `1000`)
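Set through an `AppInstance` override, the garbage-collector flags above might look like the following sketch. The `components.garbage-collector` override path mirrors the one shown under Known Bugs; the values are illustrative defaults, not recommendations:

```yaml
spec:
  package:
    spec:
      components:
        garbage-collector:
          template:
            containers:
              iox:
                env:
                  INFLUXDB_IOX_ENABLE_TABLE_ROW_DELETION: 'true'
                  INFLUXDB_IOX_GC_TABLE_CUTOFF: '14d'
                  INFLUXDB_IOX_GC_TABLE_SLEEP_INTERVAL: '24h'
```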

### Changes

- Update Prometheus to `2.55.1`.

#### Database Engine

- Update DataFusion to `49`.
- Improve performance of retention checking and reduce memory impact of Data Snapshots.

### Known Bugs

Customers who specify the S3 bucket in `spec.package.spec.objectStore.s3.endpoint`
(for example: "https://$BUCKET.$REGION.amazonaws.com") and the bucket in
`spec.package.spec.objectStore.bucket` need to disable the
`CATALOG_BACKUP_DATA_SNAPSHOT` feature:

```yaml
spec:
  package:
    spec:
+     components:
+       garbage-collector:
+         template:
+           containers:
+             iox:
+               env:
+                 INFLUXDB_IOX_CREATE_CATALOG_BACKUP_DATA_SNAPSHOT_FILES: 'false'
+                 INFLUXDB_IOX_DELETE_USING_CATALOG_BACKUP_DATA_SNAPSHOT_FILES: 'false'
```

## 20250814-1819052 {date="2025-08-14"}

### Quickstart

@@ -277,7 +340,7 @@ spec:

### Bug Fixes

This release fixes a bug in the 20241217-1494922 release where the default
Prometheus CPU limit was set to an integer instead of a string.

### Changes

@@ -367,7 +430,7 @@ Due to incorrect parsing of the
`POSTGRES_DSN` environment variable, the `influxdb&options=-c%20search_path=` string is
interpreted as the database name.

To work around this bug, in your AppInstance,
include a `spec.package.spec.images.overrides` section to override the
`core` pods' built-in image with an image that has the bugfix for the DSN
parsing error--for example:

@@ -498,7 +561,7 @@ For more information about defining variables in your InfluxDB cluster, see
##### Write API behaviors

When submitting a write request that includes invalid or malformed line protocol,
the InfluxDB write API returns a 400 response code and does the following:

- With partial writes _enabled_:

@@ -641,7 +704,7 @@ customer workloads.
- A best-effort, pre-populated `influxctl` config file is provided as a
  `ConfigMap` for your convenience.
- Limit garbage collector replicas to 1; see the
  [documentation](/influxdb3/clustered/reference/internals/storage-engine/#garbage-collector-scaling-strategies)
  for further details.

#### Database engine

@@ -979,7 +1042,7 @@ Kubernetes scheduler's default behavior. For further details, please consult the
- Improve compactor concurrency heuristics.
- Fix gRPC reflection to only include services served by a particular listening
  port.

> [!Note]
> `arrow.flight.protocol.FlightService` is known to be missing in the
> `iox-shared-querier`'s reflection service even though `iox-shared-querier`

@@ -1007,7 +1070,7 @@ spec:

#### Deployment

- Gossip communication between the `global-router`, `iox-shared-compactor`, and
  `iox-shared-ingester` now works as expected.
- Provide sane defaults to the `global-router` for maximum number of concurrent
  requests.

@@ -1138,7 +1201,7 @@ these are automatically created for you.

As part of a `partition_id` migration that runs, if you have more than 10
million rows in the `parquet_file` table, reach out to your Sales representative
before proceeding. You can confirm this with the following query:

```sql
SELECT count(*) FROM iox_catalog.parquet_file
```

@@ -9,495 +9,9 @@ menu:
    name: Sample data
    parent: Reference
weight: 182
source: /shared/influxdb3-sample-data/sample-data-dist.md
---

Sample datasets are used throughout the {{< product-name >}} documentation to
demonstrate functionality.
Use the following sample datasets to replicate provided examples.

- [Get started home sensor data](#get-started-home-sensor-data)
- [Home sensor actions data](#home-sensor-actions-data)
- [NOAA Bay Area weather data](#noaa-bay-area-weather-data)
- [Bitcoin price data](#bitcoin-price-data)
- [Random numbers sample data](#random-numbers-sample-data)
## Get started home sensor data

Includes hourly home sensor data used in the
[Get started with {{< product-name >}}](/influxdb3/clustered/get-started/) guide.
This dataset includes anomalous sensor readings and helps to demonstrate
processing and alerting on time series data.
To customize timestamps in the dataset, use the {{< icon "clock" >}} button in
the lower right corner of the page.
This lets you modify the sample dataset to stay within the retention period of
the database you write it to.

##### Time Range

**{{% influxdb/custom-timestamps-span %}}2022-01-01T08:00:00Z{{% /influxdb/custom-timestamps-span %}}**
to
**{{% influxdb/custom-timestamps-span %}}2022-01-01T20:00:00Z{{% /influxdb/custom-timestamps-span %}}**
<em style="opacity: .5">(Customizable)</em>

##### Schema

- home <em style="opacity: .5">(measurement)</em>
  - **tags**:
    - room
      - Kitchen
      - Living Room
  - **fields**:
    - co <em style="opacity: .5">(integer)</em>
    - temp <em style="opacity: .5">(float)</em>
    - hum <em style="opacity: .5">(float)</em>

{{< expand-wrapper >}}
{{% expand "Write home sensor data to InfluxDB" %}}

#### Write the home sensor data to InfluxDB

Use the InfluxDB v2 or v1 API to write the Get started home sensor sample data
to {{< product-name >}}.

{{< code-tabs-wrapper >}}
{{% code-tabs %}}
[v2 API](#)
[v1 API](#)
{{% /code-tabs %}}
{{% code-tab-content %}}

{{% influxdb/custom-timestamps %}}
{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
```sh
curl --request POST \
  "https://{{< influxdb/host >}}/api/v2/write?bucket=DATABASE_NAME&precision=s" \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-Type: text/plain; charset=utf-8" \
  --header "Accept: application/json" \
  --data-binary "
home,room=Living\ Room temp=21.1,hum=35.9,co=0i 1641024000
home,room=Kitchen temp=21.0,hum=35.9,co=0i 1641024000
home,room=Living\ Room temp=21.4,hum=35.9,co=0i 1641027600
home,room=Kitchen temp=23.0,hum=36.2,co=0i 1641027600
home,room=Living\ Room temp=21.8,hum=36.0,co=0i 1641031200
home,room=Kitchen temp=22.7,hum=36.1,co=0i 1641031200
home,room=Living\ Room temp=22.2,hum=36.0,co=0i 1641034800
home,room=Kitchen temp=22.4,hum=36.0,co=0i 1641034800
home,room=Living\ Room temp=22.2,hum=35.9,co=0i 1641038400
home,room=Kitchen temp=22.5,hum=36.0,co=0i 1641038400
home,room=Living\ Room temp=22.4,hum=36.0,co=0i 1641042000
home,room=Kitchen temp=22.8,hum=36.5,co=1i 1641042000
home,room=Living\ Room temp=22.3,hum=36.1,co=0i 1641045600
home,room=Kitchen temp=22.8,hum=36.3,co=1i 1641045600
home,room=Living\ Room temp=22.3,hum=36.1,co=1i 1641049200
home,room=Kitchen temp=22.7,hum=36.2,co=3i 1641049200
home,room=Living\ Room temp=22.4,hum=36.0,co=4i 1641052800
home,room=Kitchen temp=22.4,hum=36.0,co=7i 1641052800
home,room=Living\ Room temp=22.6,hum=35.9,co=5i 1641056400
home,room=Kitchen temp=22.7,hum=36.0,co=9i 1641056400
home,room=Living\ Room temp=22.8,hum=36.2,co=9i 1641060000
home,room=Kitchen temp=23.3,hum=36.9,co=18i 1641060000
home,room=Living\ Room temp=22.5,hum=36.3,co=14i 1641063600
home,room=Kitchen temp=23.1,hum=36.6,co=22i 1641063600
home,room=Living\ Room temp=22.2,hum=36.4,co=17i 1641067200
home,room=Kitchen temp=22.7,hum=36.5,co=26i 1641067200
"
```
{{% /code-placeholders %}}
{{% /influxdb/custom-timestamps %}}

{{% /code-tab-content %}}
{{% code-tab-content %}}

{{% influxdb/custom-timestamps %}}
{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
```sh
curl --request POST \
  "https://{{< influxdb/host >}}/write?db=DATABASE_NAME&precision=s" \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-type: text/plain; charset=utf-8" \
  --data-binary "
home,room=Living\ Room temp=21.1,hum=35.9,co=0i 1641024000
home,room=Kitchen temp=21.0,hum=35.9,co=0i 1641024000
home,room=Living\ Room temp=21.4,hum=35.9,co=0i 1641027600
home,room=Kitchen temp=23.0,hum=36.2,co=0i 1641027600
home,room=Living\ Room temp=21.8,hum=36.0,co=0i 1641031200
home,room=Kitchen temp=22.7,hum=36.1,co=0i 1641031200
home,room=Living\ Room temp=22.2,hum=36.0,co=0i 1641034800
home,room=Kitchen temp=22.4,hum=36.0,co=0i 1641034800
home,room=Living\ Room temp=22.2,hum=35.9,co=0i 1641038400
home,room=Kitchen temp=22.5,hum=36.0,co=0i 1641038400
home,room=Living\ Room temp=22.4,hum=36.0,co=0i 1641042000
home,room=Kitchen temp=22.8,hum=36.5,co=1i 1641042000
home,room=Living\ Room temp=22.3,hum=36.1,co=0i 1641045600
home,room=Kitchen temp=22.8,hum=36.3,co=1i 1641045600
home,room=Living\ Room temp=22.3,hum=36.1,co=1i 1641049200
home,room=Kitchen temp=22.7,hum=36.2,co=3i 1641049200
home,room=Living\ Room temp=22.4,hum=36.0,co=4i 1641052800
home,room=Kitchen temp=22.4,hum=36.0,co=7i 1641052800
home,room=Living\ Room temp=22.6,hum=35.9,co=5i 1641056400
home,room=Kitchen temp=22.7,hum=36.0,co=9i 1641056400
home,room=Living\ Room temp=22.8,hum=36.2,co=9i 1641060000
home,room=Kitchen temp=23.3,hum=36.9,co=18i 1641060000
home,room=Living\ Room temp=22.5,hum=36.3,co=14i 1641063600
home,room=Kitchen temp=23.1,hum=36.6,co=22i 1641063600
home,room=Living\ Room temp=22.2,hum=36.4,co=17i 1641067200
home,room=Kitchen temp=22.7,hum=36.5,co=26i 1641067200
"
```
{{% /code-placeholders %}}
{{% /influxdb/custom-timestamps %}}

{{% /code-tab-content %}}
{{< /code-tabs-wrapper >}}

Replace the following in the sample script:

- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}:
  your [database](/influxdb3/clustered/admin/databases/)
- {{% code-placeholder-key %}}`DATABASE_TOKEN`{{% /code-placeholder-key %}}:
  a [database token](/influxdb3/clustered/admin/tokens/#database-tokens)
  with _write_ permission to the database

{{% /expand %}}
{{< /expand-wrapper >}}
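The `--data-binary` payloads above are raw line protocol. Two formatting details are easy to miss: spaces in tag values must be backslash-escaped (`Living\ Room`), and integer field values take an `i` suffix. A minimal sketch of formatting one such point (Python; the helper names are illustrative, not part of any client library):

```python
def escape_tag_value(value: str) -> str:
    """Escape characters that are special in line protocol tag values."""
    return value.replace(",", "\\,").replace("=", "\\=").replace(" ", "\\ ")

def home_point(room: str, temp: float, hum: float, co: int, ts: int) -> str:
    """Format one point for the `home` measurement with a second-precision timestamp."""
    return (
        f"home,room={escape_tag_value(room)} "
        f"temp={temp},hum={hum},co={co}i {ts}"
    )

print(home_point("Living Room", 21.1, 35.9, 0, 1641024000))
# home,room=Living\ Room temp=21.1,hum=35.9,co=0i 1641024000
```

The same escaping rules apply to the `home_actions` points below, whose tag values also contain spaces.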
## Home sensor actions data

Includes hypothetical actions triggered by data in the
[Get started home sensor data](#get-started-home-sensor-data)
and is a companion dataset to that sample dataset.
To customize timestamps in the dataset, use the {{< icon "clock" >}} button in
the lower right corner of the page.
This lets you modify the sample dataset to stay within the retention period of
the database you write it to.

##### Time Range

**{{% influxdb/custom-timestamps-span %}}2022-01-01T08:00:00Z{{% /influxdb/custom-timestamps-span %}}**
to
**{{% influxdb/custom-timestamps-span %}}2022-01-01T20:00:00Z{{% /influxdb/custom-timestamps-span %}}**
<em style="opacity: .5">(Customizable)</em>

##### Schema

- home_actions <em style="opacity: .5">(measurement)</em>
  - **tags**:
    - room
      - Kitchen
      - Living Room
    - action
      - alert
      - cool
    - level
      - ok
      - warn
  - **fields**:
    - description <em style="opacity: .5">(string)</em>

{{< expand-wrapper >}}
{{% expand "Write home sensor actions data to InfluxDB" %}}

#### Write the home sensor actions data to InfluxDB

Use the InfluxDB v2 or v1 API to write the home sensor actions sample data
to {{< product-name >}}.

{{< code-tabs-wrapper >}}
{{% code-tabs %}}
[v2 API](#)
[v1 API](#)
{{% /code-tabs %}}
{{% code-tab-content %}}

{{% influxdb/custom-timestamps %}}
{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
```sh
curl --request POST \
  "https://{{< influxdb/host >}}/api/v2/write?bucket=DATABASE_NAME&precision=s" \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-Type: text/plain; charset=utf-8" \
  --header "Accept: application/json" \
  --data-binary '
home_actions,room=Kitchen,action=cool,level=ok description="Temperature at or above 23°C (23°C). Cooling to 22°C." 1641027600
home_actions,room=Kitchen,action=cool,level=ok description="Temperature at or above 23°C (23.3°C). Cooling to 22°C." 1641060000
home_actions,room=Kitchen,action=cool,level=ok description="Temperature at or above 23°C (23.1°C). Cooling to 22°C." 1641063600
home_actions,room=Kitchen,action=alert,level=warn description="Carbon monoxide level above normal: 18 ppm." 1641060000
home_actions,room=Kitchen,action=alert,level=warn description="Carbon monoxide level above normal: 22 ppm." 1641063600
home_actions,room=Kitchen,action=alert,level=warn description="Carbon monoxide level above normal: 26 ppm." 1641067200
home_actions,room=Living\ Room,action=alert,level=warn description="Carbon monoxide level above normal: 14 ppm." 1641063600
home_actions,room=Living\ Room,action=alert,level=warn description="Carbon monoxide level above normal: 17 ppm." 1641067200
'
```
{{% /code-placeholders %}}
{{% /influxdb/custom-timestamps %}}

{{% /code-tab-content %}}
{{% code-tab-content %}}

{{% influxdb/custom-timestamps %}}
{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
```sh
curl --request POST \
  "https://{{< influxdb/host >}}/write?db=DATABASE_NAME&precision=s" \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-type: text/plain; charset=utf-8" \
  --data-binary '
home_actions,room=Kitchen,action=cool,level=ok description="Temperature at or above 23°C (23°C). Cooling to 22°C." 1641027600
home_actions,room=Kitchen,action=cool,level=ok description="Temperature at or above 23°C (23.3°C). Cooling to 22°C." 1641060000
home_actions,room=Kitchen,action=cool,level=ok description="Temperature at or above 23°C (23.1°C). Cooling to 22°C." 1641063600
home_actions,room=Kitchen,action=alert,level=warn description="Carbon monoxide level above normal: 18 ppm." 1641060000
home_actions,room=Kitchen,action=alert,level=warn description="Carbon monoxide level above normal: 22 ppm." 1641063600
home_actions,room=Kitchen,action=alert,level=warn description="Carbon monoxide level above normal: 26 ppm." 1641067200
home_actions,room=Living\ Room,action=alert,level=warn description="Carbon monoxide level above normal: 14 ppm." 1641063600
home_actions,room=Living\ Room,action=alert,level=warn description="Carbon monoxide level above normal: 17 ppm." 1641067200
'
```
{{% /code-placeholders %}}
{{% /influxdb/custom-timestamps %}}

{{% /code-tab-content %}}
{{< /code-tabs-wrapper >}}

Replace the following in the sample script:

- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}:
  your [database](/influxdb3/clustered/admin/databases/)
- {{% code-placeholder-key %}}`DATABASE_TOKEN`{{% /code-placeholder-key %}}:
  a [database token](/influxdb3/clustered/admin/tokens/#database-tokens)
  with _write_ permission to the database

{{% /expand %}}
{{< /expand-wrapper >}}
## NOAA Bay Area weather data

Includes daily weather metrics from three San Francisco Bay Area airports from
**January 1, 2020 to December 31, 2022**.
This sample dataset includes seasonal trends and is good for exploring time
series use cases that involve seasonality.

##### Time Range

**2020-01-01T00:00:00Z** to **2022-12-31T00:00:00Z**

##### Schema

- weather <em style="opacity: .5">(measurement)</em>
  - **tags**:
    - location
      - Concord
      - Hayward
      - San Francisco
  - **fields**
    - precip <em style="opacity: .5">(float)</em>
    - temp_avg <em style="opacity: .5">(float)</em>
    - temp_max <em style="opacity: .5">(float)</em>
    - temp_min <em style="opacity: .5">(float)</em>
    - wind_avg <em style="opacity: .5">(float)</em>

{{< expand-wrapper >}}
{{% expand "Write the NOAA Bay Area weather data to InfluxDB" %}}

#### Write the NOAA Bay Area weather data to InfluxDB

Use the InfluxDB v2 or v1 API to write the NOAA Bay Area weather sample data to
{{< product-name >}}.

{{< code-tabs-wrapper >}}
{{% code-tabs %}}
[v2 API](#)
[v1 API](#)
{{% /code-tabs %}}
{{% code-tab-content %}}

{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
```sh
curl --request POST \
  http://{{< influxdb/host >}}/api/v2/write?bucket=DATABASE_NAME \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-Type: text/plain; charset=utf-8" \
  --header "Accept: application/json" \
  --data-binary "$(curl --request GET https://docs.influxdata.com/downloads/bay-area-weather.lp)"
```
{{% /code-placeholders %}}

{{% /code-tab-content %}}
{{% code-tab-content %}}

{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
```sh
curl --request POST \
  http://{{< influxdb/host >}}/write?db=DATABASE_NAME \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-type: text/plain; charset=utf-8" \
  --data-binary "$(curl --request GET https://docs.influxdata.com/downloads/bay-area-weather.lp)"
```
{{% /code-placeholders %}}

{{% /code-tab-content %}}
{{< /code-tabs-wrapper >}}

Replace the following in the sample script:

- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}:
  your [database](/influxdb3/clustered/admin/databases/)
- {{% code-placeholder-key %}}`DATABASE_TOKEN`{{% /code-placeholder-key %}}:
  a [database token](/influxdb3/clustered/admin/tokens/#database-tokens)
  with sufficient permissions to the specified database

{{% /expand %}}
{{< /expand-wrapper >}}
## Bitcoin price data

The Bitcoin price sample dataset provides Bitcoin prices from
**2023-05-01T00:00:00Z to 2023-05-15T00:00:00Z**—_[Powered by CoinDesk](https://www.coindesk.com/price/bitcoin)_.

##### Time Range

**2023-05-01T00:19:00Z** to **2023-05-14T23:48:00Z**

##### Schema

- bitcoin <em style="opacity: .5">(measurement)</em>
  - **tags**:
    - code
      - EUR
      - GBP
      - USD
    - crypto
      - bitcoin
    - description
      - Euro
      - British Pound Sterling
      - United States Dollar
    - symbol
      - \€ <em style="opacity: .5">(€)</em>
      - \£ <em style="opacity: .5">(£)</em>
      - \$ <em style="opacity: .5">($)</em>
  - **fields**
    - price <em style="opacity: .5">(float)</em>

{{< expand-wrapper >}}
{{% expand "Write the Bitcoin sample data to InfluxDB" %}}

#### Write the Bitcoin price sample data to InfluxDB

Use the InfluxDB v2 or v1 API to write the Bitcoin price sample data to
{{< product-name >}}.

{{< code-tabs-wrapper >}}
{{% code-tabs %}}
[v2 API](#)
[v1 API](#)
{{% /code-tabs %}}
{{% code-tab-content %}}

{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
```sh
curl --request POST \
  http://{{< influxdb/host >}}/api/v2/write?bucket=DATABASE_NAME \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-Type: text/plain; charset=utf-8" \
  --header "Accept: application/json" \
  --data-binary "$(curl --request GET https://docs.influxdata.com/downloads/bitcoin.lp)"
```
{{% /code-placeholders %}}

{{% /code-tab-content %}}
{{% code-tab-content %}}

{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
```sh
curl --request POST \
  http://{{< influxdb/host >}}/write?db=DATABASE_NAME \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-type: text/plain; charset=utf-8" \
  --data-binary "$(curl --request GET https://docs.influxdata.com/downloads/bitcoin.lp)"
```
{{% /code-placeholders %}}

{{% /code-tab-content %}}
{{< /code-tabs-wrapper >}}

Replace the following in the sample script:

- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}:
  your [database](/influxdb3/clustered/admin/databases/)
- {{% code-placeholder-key %}}`DATABASE_TOKEN`{{% /code-placeholder-key %}}:
  a [database token](/influxdb3/clustered/admin/tokens/#database-tokens)
  with sufficient permissions to the specified database

{{% /expand %}}
{{< /expand-wrapper >}}
## Random numbers sample data

Includes two fields with randomly generated numbers reported every minute.
Each field has a specific range of randomly generated numbers.
This sample dataset is used to demonstrate mathematical operations and
transformation functions.

##### Time Range

**2023-01-01T00:00:00Z** to **2023-01-01T12:00:00Z**

##### Schema

- numbers <em style="opacity: .5">(measurement)</em>
  - **fields**
    - a <em style="opacity: .5">(float between -1 and 1)</em>
    - b <em style="opacity: .5">(float between -3 and 3)</em>

{{< expand-wrapper >}}
{{% expand "Write the random number sample data to InfluxDB" %}}

#### Write the random number sample data to InfluxDB

Use the InfluxDB v2 or v1 API to write the random number sample data to
{{< product-name >}}.

{{< code-tabs-wrapper >}}
{{% code-tabs %}}
[v2 API](#)
[v1 API](#)
{{% /code-tabs %}}
{{% code-tab-content %}}

{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
```sh
curl --request POST \
  http://{{< influxdb/host >}}/api/v2/write?bucket=DATABASE_NAME \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-Type: text/plain; charset=utf-8" \
  --header "Accept: application/json" \
  --data-binary "$(curl --request GET https://docs.influxdata.com/downloads/random-numbers.lp)"
```
{{% /code-placeholders %}}

{{% /code-tab-content %}}
{{% code-tab-content %}}

{{% code-placeholders "DATABASE_TOKEN|DATABASE_NAME" %}}
```sh
curl --request POST \
  http://{{< influxdb/host >}}/write?db=DATABASE_NAME \
  --header "Authorization: Bearer DATABASE_TOKEN" \
  --header "Content-type: text/plain; charset=utf-8" \
  --data-binary "$(curl --request GET https://docs.influxdata.com/downloads/random-numbers.lp)"
```
{{% /code-placeholders %}}

{{% /code-tab-content %}}
{{< /code-tabs-wrapper >}}

Replace the following in the sample script:

- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}:
  your [database](/influxdb3/clustered/admin/databases/)
- {{% code-placeholder-key %}}`DATABASE_TOKEN`{{% /code-placeholder-key %}}:
  a [database token](/influxdb3/clustered/admin/tokens/#database-tokens)
  with sufficient permissions to the specified database

{{% /expand %}}
{{< /expand-wrapper >}}
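Instead of downloading the file, a dataset like this can be generated locally with a few lines of scripting. A sketch (Python; the field ranges follow the random-numbers schema above, with one point per minute):

```python
import random

def random_numbers_lp(start_ts: int, points: int, step: int = 60) -> list:
    """Generate line protocol for the `numbers` measurement.

    Field `a` is a float in [-1, 1]; field `b` is a float in [-3, 3];
    timestamps advance one minute (60 s) per point.
    """
    lines = []
    for i in range(points):
        a = random.uniform(-1, 1)
        b = random.uniform(-3, 3)
        lines.append(f"numbers a={a},b={b} {start_ts + i * step}")
    return lines

# 2023-01-01T00:00:00Z as a Unix timestamp
for line in random_numbers_lp(1672531200, 3):
    print(line)
```

The resulting lines can be written with the same curl commands shown above, substituting the generated text for the downloaded file.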
<!--
//SOURCE content/shared/influxdb3-sample-data/sample-data-dist.md
-->
@@ -0,0 +1,20 @@
---
title: Use Power BI to visualize data
description: >
  Use Microsoft Power BI Desktop with the InfluxDB 3 custom connector to query and
  visualize data from {{% product-name %}}.
menu:
  influxdb3_clustered:
    name: Power BI
    parent: Visualize data
weight: 104
influxdb3/clustered/tags: [visualization, powerbi, sql]
metadata: [SQL]
related:
  - https://learn.microsoft.com/en-us/power-bi/desktop/, Power BI documentation
  - /influxdb3/clustered/query-data/sql/
  - /influxdb3/clustered/query-data/execute-queries/odbc/
source: /shared/influxdb3-visualize/powerbi.md
---

<!-- //SOURCE content/shared/influxdb3-visualize/powerbi.md -->
@@ -12,6 +12,8 @@ influxdb3/core/tags: [databases]
related:
  - /influxdb3/core/write-data/best-practices/schema-design/
  - /influxdb3/core/reference/cli/influxdb3/
  - /influxdb3/core/api/v3/#tag/Database, Database API reference
  - /influxdb3/core/reference/internals/data-retention/
  - /influxdb3/explorer/manage-databases/
alt_links:
  cloud: /influxdb/cloud/admin/buckets/

@@ -22,5 +24,5 @@ source: /shared/influxdb3-admin/databases/_index.md
---

<!--
//SOURCE - content/shared/influxdb3-admin/databases/_index.md
-->
@@ -1,23 +1,35 @@
---
title: Create a database
description: >
  Use the influxdb3 CLI, HTTP API, or InfluxDB 3 Explorer to create a new database
  in {{< product-name >}}.
menu:
  influxdb3_core:
    parent: Manage databases
weight: 201
list_code_example: |
  <!--pytest.mark.skip-->

  ```sh{placeholders="DATABASE_NAME|AUTH_TOKEN"}
  # influxdb3 CLI
  influxdb3 create database \
    --retention-period 30d \
    DATABASE_NAME

  # HTTP API
  curl --request POST "http://localhost:8181/api/v3/configure/database" \
    --header "Content-Type: application/json" \
    --header "Authorization: Bearer AUTH_TOKEN" \
    --data '{
      "db": "DATABASE_NAME",
      "retention_period": "30d"
    }'
  ```
related:
  - /influxdb3/core/reference/cli/influxdb3/create/database/
  - /influxdb3/core/api/v3/#operation/PostConfigureDatabase, Create database API
  - /influxdb3/core/reference/naming-restrictions/
  - /influxdb3/core/reference/internals/data-retention/
  - /influxdb3/explorer/manage-databases/
source: /shared/influxdb3-admin/databases/create.md
---
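The CLI and curl examples in this frontmatter map onto a plain HTTP call. A sketch of building the same request with Python's standard library (`DATABASE_NAME` and `AUTH_TOKEN` remain placeholders; the endpoint path and JSON keys are taken from the example above):

```python
import json
import urllib.request

def build_create_database_request(host, token, db, retention):
    """Build (but do not send) the POST request shown in the curl example."""
    body = json.dumps({"db": db, "retention_period": retention}).encode()
    return urllib.request.Request(
        f"{host}/api/v3/configure/database",
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )

req = build_create_database_request("http://localhost:8181", "AUTH_TOKEN",
                                    "DATABASE_NAME", "30d")
print(req.get_method(), req.full_url)
# Sending it requires a running server: urllib.request.urlopen(req)
```

Separating request construction from sending makes the payload easy to inspect before pointing it at a live instance.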
@@ -1,25 +1,28 @@
---
title: Delete a database
description: >
  Use the influxdb3 CLI, HTTP API, or InfluxDB 3 Explorer to delete a database
  from {{< product-name >}}.
menu:
  influxdb3_core:
    parent: Manage databases
weight: 203
list_code_example: |
  ```sh{placeholders="DATABASE_NAME"}
  # influxdb3 CLI
  influxdb3 delete database DATABASE_NAME

  # HTTP API
  curl --request DELETE "http://localhost:8181/api/v3/configure/database?db=DATABASE_NAME" \
    --header "Authorization: Bearer AUTH_TOKEN"
  ```
related:
  - /influxdb3/core/reference/cli/influxdb3/delete/database/
  - /influxdb3/core/api/v3/#operation/DeleteConfigureDatabase, Delete database API
  - /influxdb3/explorer/manage-databases/
source: /shared/influxdb3-admin/databases/delete.md
---

<!--
//SOURCE - content/shared/influxdb3-admin/databases/delete.md
-->
@@ -1,22 +1,27 @@
---
title: List databases
description: >
  Use the influxdb3 CLI, HTTP API, or InfluxDB 3 Explorer to list databases in {{< product-name >}}.
menu:
  influxdb3_core:
    parent: Manage databases
weight: 202
list_code_example: |
  ```sh{placeholders="AUTH_TOKEN"}
  # influxdb3 CLI
  influxdb3 show databases

  # HTTP API
  curl --request GET "http://localhost:8181/api/v3/configure/database" \
    --header "Authorization: Bearer AUTH_TOKEN"
  ```
related:
  - /influxdb3/core/reference/cli/influxdb3/show/databases/
  - /influxdb3/core/api/v3/#operation/GetConfigureDatabase, List databases API
  - /influxdb3/explorer/manage-databases/
source: /shared/influxdb3-admin/databases/list.md
---

<!--
//SOURCE - content/shared/influxdb3-admin/databases/list.md
-->
@@ -1,27 +1,26 @@
---
title: Create a table
description: >
  Use the influxdb3 CLI or HTTP API to create a table in a specified database
  in {{< product-name >}}.
menu:
  influxdb3_core:
    parent: Manage tables
weight: 201
list_code_example: |
  ```sh{placeholders="DATABASE_NAME|TABLE_NAME|AUTH_TOKEN"}
  # influxdb3 CLI
  influxdb3 create table \
    --tags tag1,tag2,tag3 \
    --database DATABASE_NAME \
    --token AUTH_TOKEN \
    TABLE_NAME

  # HTTP API
  curl -X POST "http://localhost:8181/api/v3/configure/table" \
    --header "Authorization: Bearer AUTH_TOKEN" \
    --header "Content-Type: application/json" \
    --data '{"db": "DATABASE_NAME", "table": "TABLE_NAME", "tags": ["tag1", "tag2", "tag3"]}'
  ```
related:
  - /influxdb3/core/reference/cli/influxdb3/create/table/
@@ -1,25 +1,25 @@
---
title: List tables
description: >
  Use the influxdb3 CLI or HTTP API to list tables in a specified database
  in {{< product-name >}}.
  Use SQL `SHOW TABLES` or InfluxQL `SHOW MEASUREMENTS` statements.
menu:
  influxdb3_core:
    parent: Manage tables
weight: 202
list_code_example: |
  ```sh{placeholders="DATABASE_NAME|AUTH_TOKEN"}
  # influxdb3 CLI
  influxdb3 query \
    --database DATABASE_NAME \
    --token AUTH_TOKEN \
    "SHOW TABLES"

  # HTTP API
  curl --get "http://localhost:8181/api/v3/query_sql" \
    --header "Authorization: Bearer AUTH_TOKEN" \
    --data-urlencode "db=DATABASE_NAME" \
    --data-urlencode "q=SHOW TABLES"
  ```
related:
@@ -0,0 +1,20 @@
---
title: Execute SQL queries with ODBC
description: >
  Use the Arrow Flight SQL ODBC driver to execute SQL queries against {{% product-name %}} from
  ODBC-compatible applications and programming languages.
menu:
  influxdb3_core:
    name: Use ODBC
    parent: Execute queries
weight: 351
influxdb3/core/tags: [query, sql, odbc]
metadata: [SQL]
related:
  - /influxdb3/core/reference/sql/
  - /influxdb3/core/query-data/
  - /influxdb3/core/visualize-data/powerbi/
source: /shared/influxdb3-query-guides/execute-queries/odbc.md
---

<!-- //SOURCE content/shared/influxdb3-query-guides/execute-queries/odbc.md -->
@@ -7,6 +7,10 @@ menu:
    parent: influxdb3 create
    name: influxdb3 create database
weight: 400
related:
  - /influxdb3/core/admin/databases/create/
  - /influxdb3/core/api/v3/#operation/PostConfigureDatabase, Create database API
  - /influxdb3/core/reference/internals/data-retention/
source: /shared/influxdb3-cli/create/database.md
---
@@ -0,0 +1,22 @@
---
title: Data retention in {{< product-name >}}
description: >
  {{% product-name %}} enforces database retention periods at query time.
  Retention periods are set when creating a database and cannot be changed afterward.
weight: 103
menu:
  influxdb3_core:
    name: Data retention
    parent: Core internals
influxdb3/core/tags: [internals, retention]
related:
  - /influxdb3/core/admin/databases/create/
  - /influxdb3/core/reference/cli/influxdb3/create/database/
  - /influxdb3/core/api/v3/#operation/PostConfigureDatabase, Create database API
  - /influxdb3/core/reference/glossary/#retention-period
source: /shared/influxdb3-internals/data-retention.md
---

<!--
//SOURCE content/shared/influxdb3-internals/data-retention.md
-->
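Because retention is enforced at query time, the cutoff is simply the current time minus the retention period: points older than that stop appearing in query results. A sketch of the check (Python; illustrative only, not the server's implementation):

```python
from datetime import datetime, timedelta, timezone

def within_retention(point_time, retention, now=None):
    """Return True if a point's timestamp is newer than (now - retention)."""
    now = now or datetime.now(timezone.utc)
    return point_time > now - retention

# With a 30-day retention period and "now" fixed at 2024-01-31,
# the cutoff is 2024-01-01.
now = datetime(2024, 1, 31, tzinfo=timezone.utc)
print(within_retention(datetime(2024, 1, 10, tzinfo=timezone.utc),
                       timedelta(days=30), now))  # True
print(within_retention(datetime(2023, 12, 1, tzinfo=timezone.utc),
                       timedelta(days=30), now))  # False
```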
@@ -11,15 +11,15 @@ menu:
    parent: Visualize data
influxdb3/core/tags: [query, visualization]
alt_links:
  enterprise: /influxdb3/enterprise/visualize-data/grafana/
  enterprise_v1: /enterprise_influxdb/v1/tools/grafana/
  v1: /influxdb/v1/tools/grafana/
  v2: /influxdb/v2/tools/grafana/
  cloud: /influxdb/cloud/tools/grafana/
  cloud-serverless: /influxdb3/cloud-serverless/process-data/visualize/grafana/
  cloud-dedicated: /influxdb3/cloud-dedicated/process-data/visualize/grafana/
  clustered: /influxdb3/clustered/process-data/visualize/grafana/
source: /content/shared/v3-process-data/visualize/grafana.md
---

<!-- SOURCE: /content/shared/v3-process-data/visualize/grafana.md -->
@@ -0,0 +1,20 @@
---
title: Use Power BI to visualize data
description: >
  Use Microsoft Power BI Desktop with the InfluxDB 3 custom connector to query and
  visualize data from {{% product-name %}}.
menu:
  influxdb3_core:
    name: Power BI
    parent: Visualize data
weight: 104
influxdb3/core/tags: [visualization, powerbi, sql]
metadata: [SQL]
related:
  - https://learn.microsoft.com/en-us/power-bi/desktop/, Power BI documentation
  - /influxdb3/core/query-data/sql/
  - /influxdb3/core/query-data/execute-queries/odbc/
source: /shared/influxdb3-visualize/powerbi.md
---

<!-- //SOURCE content/shared/influxdb3-visualize/powerbi.md -->

@@ -16,7 +16,7 @@ alt_links:
  cloud: /influxdb/cloud/tools/tableau/
  cloud-serverless: /influxdb3/cloud-serverless/process-data/visualize/tableau/
  cloud-dedicated: /influxdb3/cloud-dedicated/process-data/visualize/tableau/
  clustered: /influxdb3/cloud-clustered/process-data/visualize/tableau/
  clustered: /influxdb3/clustered/process-data/visualize/tableau/
metadata: [SQL only]
draft: true
source: /shared/influxdb3-visualize/tableau.md

@@ -11,5 +11,5 @@ source: /shared/influxdb3-admin/_index.md
---

<!--
The content of this file is located at content/shared/influxdb3-admin/_index.md
//SOURCE - content/shared/influxdb3-admin/_index.md
-->

@@ -12,6 +12,8 @@ influxdb3/enterprise/tags: [databases]
related:
  - /influxdb3/enterprise/write-data/best-practices/schema-design/
  - /influxdb3/enterprise/reference/cli/influxdb3/
  - /influxdb3/enterprise/api/v3/#tag/Database, Database API reference
  - /influxdb3/enterprise/reference/internals/data-retention/
  - /influxdb3/explorer/manage-databases/
alt_links:
  cloud: /influxdb/cloud/admin/buckets/

@@ -1,23 +1,36 @@
---
title: Create a database
description: >
  Use the [`influxdb3 create database` command](/influxdb3/enterprise/reference/cli/influxdb3/create/database/)
  to create a new database in {{< product-name >}}.
  Use the influxdb3 CLI, HTTP API, or InfluxDB 3 Explorer to create a new database
  in {{< product-name >}}.
menu:
  influxdb3_enterprise:
    parent: Manage databases
weight: 201
list_code_example: |
  <!--pytest.mark.skip-->

  {{% code-placeholders "DATABASE_NAME" %}}
  ```sh
  influxdb3 create database DATABASE_NAME

  ```sh{placeholders="DATABASE_NAME"}

  # influxdb3 CLI
  influxdb3 create database \
    --retention-period 30d \
    DATABASE_NAME

  # HTTP API
  curl --request POST "http://localhost:8181/api/v3/configure/database" \
    --header "Content-Type: application/json" \
    --header "Authorization: Bearer AUTH_TOKEN" \
    --data '{
      "db": "DATABASE_NAME",
      "retention_period": "30d"
    }'
  ```
  {{% /code-placeholders %}}
related:
  - /influxdb3/enterprise/reference/cli/influxdb3/create/database/
  - /influxdb3/enterprise/api/v3/#operation/PostConfigureDatabase, Create database API
  - /influxdb3/enterprise/reference/naming-restrictions/
  - /influxdb3/enterprise/reference/internals/data-retention/
  - /influxdb3/explorer/manage-databases/
source: /shared/influxdb3-admin/databases/create.md
---

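As a sketch of the HTTP API call shown in the `list_code_example` above, the request body can be assembled from shell variables before sending it with curl. The values below (`mydb`, `30d`) are hypothetical placeholders, not values from this diff:

```shell
# Hypothetical placeholder values; substitute your own database name,
# retention period, and token before sending the request.
DATABASE_NAME=mydb
RETENTION_PERIOD=30d

# Build the JSON body expected by POST /api/v3/configure/database.
BODY=$(printf '{"db": "%s", "retention_period": "%s"}' \
  "$DATABASE_NAME" "$RETENTION_PERIOD")
echo "$BODY"
```

The resulting `$BODY` string is what the curl example in the hunk passes via `--data`; piping it through a JSON validator such as `jq .` before sending is a cheap way to catch quoting mistakes.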