Merge branch 'master' into restructure-processing-engine-docs
commit 946a6e1f4d

@@ -1,4 +1,4 @@
version: 2
version: 2.1
jobs:
build:
docker:

@@ -41,7 +41,7 @@ jobs:
- /home/circleci/bin
- run:
name: Hugo Build
command: npx hugo --logLevel info --minify --destination workspace/public
command: yarn hugo --environment production --logLevel info --gc --destination workspace/public
- persist_to_workspace:
root: workspace
paths:

@@ -68,7 +68,6 @@ jobs:
when: on_success

workflows:
version: 2
build:
jobs:
- build

@@ -1679,7 +1679,7 @@ The shortcode takes a regular expression for matching placeholder names.
Use the `code-placeholder-key` shortcode to format the placeholder names in
text that describes the placeholder--for example:

```
```markdown
{{% code-placeholders "DATABASE_NAME|USERNAME|PASSWORD_OR_TOKEN|API_TOKEN|exampleuser@influxdata.com" %}}
```sh
curl --request POST http://localhost:8086/write?db=DATABASE_NAME \

@@ -1703,3 +1703,83 @@ InfluxDB API documentation when documentation is deployed.
Redoc generates HTML documentation using the InfluxDB `swagger.yml`.
For more information about generating InfluxDB API documentation, see the
[API Documentation README](https://github.com/influxdata/docs-v2/tree/master/api-docs#readme).

## JavaScript in the documentation UI

The InfluxData documentation UI uses JavaScript with ES6+ syntax and
`assets/js/main.js` as the entry point to import modules from
`assets/js`.
Only `assets/js/main.js` should be imported in HTML files.

`assets/js/main.js` registers components and initializes them on page load.

If you're adding UI functionality that requires JavaScript, follow these steps:

1. In your HTML file, add a `data-component` attribute to the element that
should be initialized by your JavaScript code. For example:

```html
<div data-component="my-component"></div>
```

2. Following the component pattern, create a single-purpose JavaScript module
(`assets/js/components/my-component.js`)
that exports a single function that receives the component element and initializes it.
3. In `assets/js/main.js`, import the module and register the component to ensure
the component is initialized on page load.
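The steps above can be sketched as follows. This is a minimal illustration of the pattern, not the actual docs-v2 implementation: the function name `initMyComponent`, the registry shape, and the plain-object stand-in for DOM elements are all assumptions.

```javascript
// Hypothetical sketch of the component pattern described above.
// In docs-v2, the component module and registration live in separate files;
// they are shown together here only to keep the example self-contained.

// assets/js/components/my-component.js would export one function that
// receives the component's root element and initializes it:
function initMyComponent(el) {
  el.initialized = true; // e.g., attach event listeners, render state
}

// assets/js/main.js would map data-component values to initializers and
// run the matching one for each element on page load:
const componentRegistry = { 'my-component': initMyComponent };

function initComponents(elements) {
  for (const el of elements) {
    const init = componentRegistry[el.dataset.component];
    if (init) init(el);
  }
}

// Plain-object stand-in for document.querySelectorAll('[data-component]'):
const elements = [{ dataset: { component: 'my-component' } }];
initComponents(elements);
```

Keeping one registry in `main.js` means each page element opts in to behavior declaratively through its `data-component` attribute, and no other HTML file needs to import scripts directly.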

### Debugging JavaScript

To debug JavaScript code used in the InfluxData documentation UI, choose one of the following methods:

- Use source maps and the Chrome DevTools debugger.
- Use debug helpers that provide breakpoints and console logging as a workaround or alternative for using source maps and the Chrome DevTools debugger.

#### Using source maps and Chrome DevTools debugger

1. In VS Code, select Run > Start Debugging.
2. Select the "Debug JS (source maps)" configuration.
3. Click the play button to start the debugger.
4. Set breakpoints in the JavaScript source files--files in the
`assets/js/ns-hugo-imp:` namespace--in the
VS Code editor or in the Chrome Developer Tools Sources panel:

- In the VS Code Debugger panel > "Loaded Scripts" section, find the
`assets/js/ns-hugo-imp:` namespace.
- In the Chrome Developer Tools Sources panel, expand
`js/ns-hugo-imp:/<YOUR_WORKSPACE_ROOT>/assets/js/`.

#### Using debug helpers

1. In your JavaScript module, import debug helpers from `assets/js/utils/debug-helpers.js`.
These helpers provide breakpoints and console logging as a workaround or alternative for
using source maps and the Chrome DevTools debugger.
2. Insert debug statements by calling the helper functions in your code--for example:

```js
import { debugLog, debugBreak, debugInspect } from './utils/debug-helpers.js';

const data = debugInspect(someData, 'Data');
debugLog('Processing data', 'myFunction');

function processData() {
// Add a breakpoint that works with DevTools
debugBreak();

// Your existing code...
}
```

3. Start Hugo in development mode--for example:

```bash
yarn hugo server
```

4. In VS Code, go to Run > Start Debugging, and select the "Debug JS (debug-helpers)" configuration.

Your system uses the configuration in `launch.json` to launch the site in Chrome
and attach the debugger to the Developer Tools console.

Make sure to remove the debug statements before merging your changes.
The debug helpers are designed to be used in development and should not be used in production.

@@ -15,11 +15,13 @@ node_modules
!telegraf-build/templates
!telegraf-build/scripts
!telegraf-build/README.md
/cypress/downloads
/cypress/screenshots/*
/cypress/videos/*
test-results.xml
/influxdb3cli-build-scripts/content
.vscode/*
!.vscode/launch.json
.idea
**/config.toml
package-lock.json

@@ -3,3 +3,4 @@
**/.svn
**/.hg
**/node_modules
assets/jsconfig.json

@@ -0,0 +1,47 @@
{
"version": "0.2.0",
"configurations": [
{
"name": "Debug JS (debug-helpers)",
"type": "chrome",
"request": "launch",
"url": "http://localhost:1313",
"webRoot": "${workspaceFolder}",
"skipFiles": [
"<node_internals>/**"
],
"sourceMaps": false,
"trace": true,
"smartStep": false
},
{
"name": "Debug JS (source maps)",
"type": "chrome",
"request": "launch",
"url": "http://localhost:1313",
"webRoot": "${workspaceFolder}",
"sourceMaps": true,
"sourceMapPathOverrides": {
"*": "${webRoot}/assets/js/*",
"main.js": "${webRoot}/assets/js/main.js",
"page-context.js": "${webRoot}/assets/js/page-context.js",
"ask-ai-trigger.js": "${webRoot}/assets/js/ask-ai-trigger.js",
"ask-ai.js": "${webRoot}/assets/js/ask-ai.js",
"utils/*": "${webRoot}/assets/js/utils/*",
"services/*": "${webRoot}/assets/js/services/*"
},
"skipFiles": [
"<node_internals>/**",
"node_modules/**",
"chrome-extension://**"
],
"trace": true,
"smartStep": true,
"disableNetworkCache": true,
"userDataDir": "${workspaceFolder}/.vscode/chrome-user-data",
"runtimeArgs": [
"--disable-features=VizDisplayCompositor"
]
}
]
}
@@ -1667,7 +1667,7 @@ The shortcode takes a regular expression for matching placeholder names.
Use the `code-placeholder-key` shortcode to format the placeholder names in
text that describes the placeholder--for example:

```
```markdown
{{% code-placeholders "DATABASE_NAME|USERNAME|PASSWORD_OR_TOKEN|API_TOKEN|exampleuser@influxdata.com" %}}
```sh
curl --request POST http://localhost:8086/write?db=DATABASE_NAME \

@@ -1691,3 +1691,83 @@ InfluxDB API documentation when documentation is deployed.
Redoc generates HTML documentation using the InfluxDB `swagger.yml`.
For more information about generating InfluxDB API documentation, see the
[API Documentation README](https://github.com/influxdata/docs-v2/tree/master/api-docs#readme).

## JavaScript in the documentation UI

The InfluxData documentation UI uses JavaScript with ES6+ syntax and
`assets/js/main.js` as the entry point to import modules from
`assets/js`.
Only `assets/js/main.js` should be imported in HTML files.

`assets/js/main.js` registers components and initializes them on page load.

If you're adding UI functionality that requires JavaScript, follow these steps:

1. In your HTML file, add a `data-component` attribute to the element that
should be initialized by your JavaScript code. For example:

```html
<div data-component="my-component"></div>
```

2. Following the component pattern, create a single-purpose JavaScript module
(`assets/js/components/my-component.js`)
that exports a single function that receives the component element and initializes it.
3. In `assets/js/main.js`, import the module and register the component to ensure
the component is initialized on page load.

### Debugging JavaScript

To debug JavaScript code used in the InfluxData documentation UI, choose one of the following methods:

- Use source maps and the Chrome DevTools debugger.
- Use debug helpers that provide breakpoints and console logging as a workaround or alternative for using source maps and the Chrome DevTools debugger.

#### Using source maps and Chrome DevTools debugger

1. In VS Code, select Run > Start Debugging.
2. Select the "Debug JS (source maps)" configuration.
3. Click the play button to start the debugger.
4. Set breakpoints in the JavaScript source files--files in the
`assets/js/ns-hugo-imp:` namespace--in the
VS Code editor or in the Chrome Developer Tools Sources panel:

- In the VS Code Debugger panel > "Loaded Scripts" section, find the
`assets/js/ns-hugo-imp:` namespace.
- In the Chrome Developer Tools Sources panel, expand
`js/ns-hugo-imp:/<YOUR_WORKSPACE_ROOT>/assets/js/`.

#### Using debug helpers

1. In your JavaScript module, import debug helpers from `assets/js/utils/debug-helpers.js`.
These helpers provide breakpoints and console logging as a workaround or alternative for
using source maps and the Chrome DevTools debugger.
2. Insert debug statements by calling the helper functions in your code--for example:

```js
import { debugLog, debugBreak, debugInspect } from './utils/debug-helpers.js';

const data = debugInspect(someData, 'Data');
debugLog('Processing data', 'myFunction');

function processData() {
// Add a breakpoint that works with DevTools
debugBreak();

// Your existing code...
}
```

3. Start Hugo in development mode--for example:

```bash
yarn hugo server
```

4. In VS Code, go to Run > Start Debugging, and select the "Debug JS (debug-helpers)" configuration.

Your system uses the configuration in `launch.json` to launch the site in Chrome
and attach the debugger to the Developer Tools console.

Make sure to remove the debug statements before merging your changes.
The debug helpers are designed to be used in development and should not be used in production.

@@ -146,15 +146,15 @@ tags:
description: |
Manage Processing engine triggers, test plugins, and send requests to trigger On Request plugins.

InfluxDB 3 Core provides the InfluxDB 3 Processing engine, an embedded Python VM that can dynamically load and trigger Python plugins in response to events in your database.
InfluxDB 3 Core provides the InfluxDB 3 processing engine, an embedded Python VM that can dynamically load and trigger Python plugins in response to events in your database.
Use Processing engine plugins and triggers to run code and perform tasks for different database events.

To get started with the Processing engine, see the [Processing engine and Python plugins](/influxdb3/core/processing-engine/) guide.
To get started with the processing engine, see the [Processing engine and Python plugins](/influxdb3/core/processing-engine/) guide.
- name: Query data
description: Query data using SQL or InfluxQL
- name: Quick start
description: |
1. [Create an admin token](#section/Authentication) for the InfluxDB 3 Core API.
1. [Create an admin token](#section/Authentication) to authorize API requests.

```bash
curl -X POST "http://localhost:8181/api/v3/configure/token/admin"

@@ -385,12 +385,7 @@ paths:
parameters:
- $ref: '#/components/parameters/dbWriteParam'
- $ref: '#/components/parameters/accept_partial'
- name: precision
in: query
required: true
schema:
$ref: '#/components/schemas/PrecisionWrite'
description: Precision of timestamps.
- $ref: '#/components/parameters/precisionParam'
- name: no_sync
in: query
schema:

@@ -440,16 +435,8 @@ paths:
description: Executes an SQL query to retrieve data from the specified database.
parameters:
- $ref: '#/components/parameters/db'
- name: q
in: query
required: true
schema:
type: string
- name: format
in: query
required: false
schema:
type: string
- $ref: '#/components/parameters/querySqlParam'
- $ref: '#/components/parameters/format'
- $ref: '#/components/parameters/AcceptQueryHeader'
- $ref: '#/components/parameters/ContentType'
responses:
@@ -1072,15 +1059,104 @@ paths:
post:
operationId: PostConfigureProcessingEngineTrigger
summary: Create processing engine trigger
description: Creates a new processing engine trigger.
description: |
Creates a processing engine trigger with the specified plugin file and trigger specification.

### Related guides

- [Processing engine and Python plugins](/influxdb3/core/plugins/)
requestBody:
required: true
content:
application/json:
schema:
$ref: '#/components/schemas/ProcessingEngineTriggerRequest'
examples:
schedule_cron:
summary: Schedule trigger using cron
description: |
In `"cron:CRON_EXPRESSION"`, `CRON_EXPRESSION` uses extended 6-field cron format.
The cron expression `0 0 6 * * 1-5` means the trigger will run at 6:00 AM every weekday (Monday to Friday).
value:
db: DATABASE_NAME
plugin_filename: schedule.py
trigger_name: schedule_cron_trigger
trigger_specification: cron:0 0 6 * * 1-5
schedule_every:
summary: Schedule trigger using interval
description: |
In `"every:DURATION"`, `DURATION` specifies the interval between trigger executions.
The duration `1h` means the trigger will run every hour.
value:
db: mydb
plugin_filename: schedule.py
trigger_name: schedule_every_trigger
trigger_specification: every:1h
schedule_every_seconds:
summary: Schedule trigger using seconds interval
description: |
Example of scheduling a trigger to run every 30 seconds.
value:
db: mydb
plugin_filename: schedule.py
trigger_name: schedule_every_30s_trigger
trigger_specification: every:30s
schedule_every_minutes:
summary: Schedule trigger using minutes interval
description: |
Example of scheduling a trigger to run every 5 minutes.
value:
db: mydb
plugin_filename: schedule.py
trigger_name: schedule_every_5m_trigger
trigger_specification: every:5m
all_tables:
summary: All tables trigger example
description: |
Trigger that fires on write events to any table in the database.
value:
db: mydb
plugin_filename: all_tables.py
trigger_name: all_tables_trigger
trigger_specification: all_tables
table_specific:
summary: Table-specific trigger example
description: |
Trigger that fires on write events to a specific table.
value:
db: mydb
plugin_filename: table.py
trigger_name: table_trigger
trigger_specification: table:sensors
api_request:
summary: On-demand request trigger example
description: |
Creates an HTTP endpoint `/api/v3/engine/hello-world` for manual invocation.
value:
db: mydb
plugin_filename: request.py
trigger_name: hello_world_trigger
trigger_specification: path:hello-world
cron_friday_afternoon:
summary: Cron trigger for Friday afternoons
description: |
Example of a cron trigger that runs every Friday at 2:30 PM.
value:
db: reports
plugin_filename: weekly_report.py
trigger_name: friday_report_trigger
trigger_specification: cron:0 30 14 * * 5
cron_monthly:
summary: Cron trigger for monthly execution
description: |
Example of a cron trigger that runs on the first day of every month at midnight.
value:
db: monthly_data
plugin_filename: monthly_cleanup.py
trigger_name: monthly_cleanup_trigger
trigger_specification: cron:0 0 0 1 * *
responses:
'201':
'200':
description: Success. Processing engine trigger created.
'400':
description: Bad request.

@@ -1157,7 +1233,7 @@ paths:
$ref: '#/components/schemas/ProcessingEngineTriggerRequest'
responses:
'200':
description: Success. The processing engine trigger has been enabled.
description: Success. The processing engine trigger is enabled.
'400':
description: Bad request.
'401':

@@ -1170,7 +1246,14 @@ paths:
post:
operationId: PostInstallPluginPackages
summary: Install plugin packages
description: Installs packages for the plugin environment.
description: |
Installs the specified Python packages into the processing engine plugin environment.

This endpoint is synchronous and blocks until the packages are installed.

### Related guides

- [Processing engine and Python plugins](/influxdb3/core/plugins/)
parameters:
- $ref: '#/components/parameters/ContentType'
requestBody:

@@ -1179,10 +1262,30 @@ paths:
application/json:
schema:
type: object
additionalProperties: true
properties:
packages:
type: array
items:
type: string
description: |
A list of Python package names to install.
Can include version specifiers (e.g., "scipy==1.9.0").
example:
- influxdb3-python
- scipy
- pandas==1.5.0
- requests
required:
- packages
example:
packages:
- influxdb3-python
- scipy
- pandas==1.5.0
- requests
responses:
'200':
description: Success. The packages have been installed.
description: Success. The packages are installed.
'400':
description: Bad request.
'401':

@@ -1193,7 +1296,15 @@ paths:
post:
operationId: PostInstallPluginRequirements
summary: Install plugin requirements
description: Installs requirements for the plugin environment.
description: |
Installs requirements from a requirements file (also known as a "pip requirements file") into the processing engine plugin environment.

This endpoint is synchronous and blocks until the requirements are installed.

### Related

- [Processing engine and Python plugins](/influxdb3/core/plugins/)
- [Python requirements file format](https://pip.pypa.io/en/stable/reference/requirements-file-format/)
parameters:
- $ref: '#/components/parameters/ContentType'
requestBody:

@@ -1202,7 +1313,17 @@ paths:
application/json:
schema:
type: object
additionalProperties: true
properties:
requirements_location:
type: string
description: |
The path to the requirements file containing Python packages to install.
Can be a relative path (relative to the plugin directory) or an absolute path.
example: requirements.txt
required:
- requirements_location
example:
requirements_location: requirements.txt
responses:
'200':
description: Success. The requirements have been installed.

@@ -1248,18 +1369,18 @@ paths:
parameters:
- name: plugin_path
description: |
The path configured in the `trigger-spec` for the plugin.
The path configured in the request trigger specification `"path:<plugin_path>"` for the plugin.

For example, if you define a trigger with the following:

```
trigger-spec: "request:hello-world"
```yaml
trigger-spec: "path:hello-world"
```

then the HTTP API exposes the following plugin endpoint:

```
<INFLUXDB_HOST>/api/v3/engine/hello-world
<INFLUXDB3_HOST>/api/v3/engine/hello-world
```
in: path
required: true

@@ -1269,7 +1390,7 @@ paths:
operationId: GetProcessingEnginePluginRequest
summary: On Request processing engine plugin request
description: |
Sends a request to invoke an _On Request_ processing engine plugin.
Executes the On Request processing engine plugin specified in `<plugin_path>`.
The request can include request headers, query string parameters, and a request body, which InfluxDB passes to the plugin.

An On Request plugin implements the following signature:

@@ -1296,7 +1417,7 @@ paths:
operationId: PostProcessingEnginePluginRequest
summary: On Request processing engine plugin request
description: |
Sends a request to invoke an _On Request_ processing engine plugin.
Executes the On Request processing engine plugin specified in `<plugin_path>`.
The request can include request headers, query string parameters, and a request body, which InfluxDB passes to the plugin.

An On Request plugin implements the following signature:

@@ -1335,8 +1456,6 @@ paths:
description: |
Creates an admin token.
An admin token is a special type of token that has full access to all resources in the system.

This endpoint is only available in InfluxDB 3 Enterprise.
responses:
'201':
description: |

@@ -1357,8 +1476,6 @@ paths:
summary: Regenerate admin token
description: |
Regenerates an admin token and revokes the previous token with the same name.

This endpoint is only available in InfluxDB 3 Enterprise.
parameters: []
responses:
'201':

@@ -1429,7 +1546,6 @@ components:
schema:
type: string
description: |
The name of the database.
The name of the database.
InfluxDB creates the database if it doesn't already exist, and then
writes all points in the batch to the database.

@@ -1747,15 +1863,69 @@ components:
type: string
plugin_filename:
type: string
description: |
The path and filename of the plugin to execute--for example,
`schedule.py` or `endpoints/report.py`.
The path can be absolute or relative to the `--plugins-dir` directory configured when starting InfluxDB 3.

The plugin file must implement the trigger interface associated with the trigger's specification (`trigger_spec`).
trigger_name:
type: string
trigger_specification:
type: string
description: |
Specifies when and how the processing engine trigger should be invoked.

## Supported trigger specifications:

### Cron-based scheduling
Format: `cron:CRON_EXPRESSION`

Uses extended (6-field) cron format (second minute hour day_of_month month day_of_week):
```
┌───────────── second (0-59)
│ ┌───────────── minute (0-59)
│ │ ┌───────────── hour (0-23)
│ │ │ ┌───────────── day of month (1-31)
│ │ │ │ ┌───────────── month (1-12)
│ │ │ │ │ ┌───────────── day of week (0-6, Sunday=0)
│ │ │ │ │ │
* * * * * *
```
Examples:
- `cron:0 0 6 * * 1-5` - Every weekday at 6:00 AM
- `cron:0 30 14 * * 5` - Every Friday at 2:30 PM
- `cron:0 0 0 1 * *` - First day of every month at midnight

### Interval-based scheduling
Format: `every:DURATION`

Supported durations: `s` (seconds), `m` (minutes), `h` (hours), `d` (days):
- `every:30s` - Every 30 seconds
- `every:5m` - Every 5 minutes
- `every:1h` - Every hour
- `every:1d` - Every day

### Table-based triggers
- `all_tables` - Triggers on write events to any table in the database
- `table:TABLE_NAME` - Triggers on write events to a specific table

### On-demand triggers
Format: `path:ENDPOINT_NAME`

Creates an HTTP endpoint `/api/v3/engine/ENDPOINT_NAME` for manual invocation:
- `path:hello-world` - Creates endpoint `/api/v3/engine/hello-world`
- `path:data-export` - Creates endpoint `/api/v3/engine/data-export`
pattern: ^(cron:[0-9 *,/-]+|every:[0-9]+[smhd]|all_tables|table:[a-zA-Z_][a-zA-Z0-9_]*|path:[a-zA-Z0-9_-]+)$
example: cron:0 0 6 * * 1-5
trigger_arguments:
type: object
additionalProperties: true
description: Optional arguments passed to the plugin.
disabled:
type: boolean
default: false
description: Whether the trigger is disabled.
required:
- db
- plugin_filename

@@ -1879,8 +2049,6 @@ components:
scheme: bearer
bearerFormat: JWT
description: |
_During Alpha release, an API token is not required._

A Bearer token for authentication.

Provide the scheme and the API token in the `Authorization` header--for example:

@@ -146,15 +146,15 @@ tags:
description: |
Manage Processing engine triggers, test plugins, and send requests to trigger On Request plugins.

InfluxDB 3 Enterprise provides the InfluxDB 3 Processing engine, an embedded Python VM that can dynamically load and trigger Python plugins in response to events in your database.
InfluxDB 3 Enterprise provides the InfluxDB 3 processing engine, an embedded Python VM that can dynamically load and trigger Python plugins in response to events in your database.
Use Processing engine plugins and triggers to run code and perform tasks for different database events.

To get started with the Processing engine, see the [Processing engine and Python plugins](/influxdb3/enterprise/processing-engine/) guide.
To get started with the processing engine, see the [Processing engine and Python plugins](/influxdb3/enterprise/processing-engine/) guide.
- name: Query data
description: Query data using SQL or InfluxQL
- name: Quick start
description: |
1. [Create an admin token](#section/Authentication) for the InfluxDB 3 Enterprise API.
1. [Create an admin token](#section/Authentication) to authorize API requests.

```bash
curl -X POST "http://localhost:8181/api/v3/configure/token/admin"

@@ -385,12 +385,7 @@ paths:
parameters:
- $ref: '#/components/parameters/dbWriteParam'
- $ref: '#/components/parameters/accept_partial'
- name: precision
in: query
required: true
schema:
$ref: '#/components/schemas/PrecisionWrite'
description: Precision of timestamps.
- $ref: '#/components/parameters/precisionParam'
- name: no_sync
in: query
schema:

@@ -440,16 +435,8 @@ paths:
description: Executes an SQL query to retrieve data from the specified database.
parameters:
- $ref: '#/components/parameters/db'
- name: q
in: query
required: true
schema:
type: string
- name: format
in: query
required: false
schema:
type: string
- $ref: '#/components/parameters/querySqlParam'
- $ref: '#/components/parameters/format'
- $ref: '#/components/parameters/AcceptQueryHeader'
- $ref: '#/components/parameters/ContentType'
responses:
@ -1072,15 +1059,104 @@ paths:
|
|||
post:
|
||||
operationId: PostConfigureProcessingEngineTrigger
|
||||
summary: Create processing engine trigger
|
||||
description: Creates a new processing engine trigger.
|
||||
description: |
|
||||
Creates a processing engine trigger with the specified plugin file and trigger specification.
|
||||
|
||||
### Related guides
|
||||
|
||||
- [Processing engine and Python plugins](/influxdb3/enterprise/plugins/)
|
||||
requestBody:
|
||||
required: true
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/ProcessingEngineTriggerRequest'
|
||||
examples:
|
||||
schedule_cron:
|
||||
summary: Schedule trigger using cron
|
||||
description: |
|
||||
In `"cron:CRON_EXPRESSION"`, `CRON_EXPRESSION` uses extended 6-field cron format.
|
||||
The cron expression `0 0 6 * * 1-5` means the trigger will run at 6:00 AM every weekday (Monday to Friday).
|
||||
value:
|
||||
db: DATABASE_NAME
|
||||
plugin_filename: schedule.py
|
||||
trigger_name: schedule_cron_trigger
|
||||
trigger_specification: cron:0 0 6 * * 1-5
|
||||
              schedule_every:
                summary: Schedule trigger using interval
                description: |
                  In `"every:DURATION"`, `DURATION` specifies the interval between trigger executions.
                  The duration `1h` means the trigger will run every hour.
                value:
                  db: mydb
                  plugin_filename: schedule.py
                  trigger_name: schedule_every_trigger
                  trigger_specification: every:1h
              schedule_every_seconds:
                summary: Schedule trigger using seconds interval
                description: |
                  Example of scheduling a trigger to run every 30 seconds.
                value:
                  db: mydb
                  plugin_filename: schedule.py
                  trigger_name: schedule_every_30s_trigger
                  trigger_specification: every:30s
              schedule_every_minutes:
                summary: Schedule trigger using minutes interval
                description: |
                  Example of scheduling a trigger to run every 5 minutes.
                value:
                  db: mydb
                  plugin_filename: schedule.py
                  trigger_name: schedule_every_5m_trigger
                  trigger_specification: every:5m
              all_tables:
                summary: All tables trigger example
                description: |
                  Trigger that fires on write events to any table in the database.
                value:
                  db: mydb
                  plugin_filename: all_tables.py
                  trigger_name: all_tables_trigger
                  trigger_specification: all_tables
              table_specific:
                summary: Table-specific trigger example
                description: |
                  Trigger that fires on write events to a specific table.
                value:
                  db: mydb
                  plugin_filename: table.py
                  trigger_name: table_trigger
                  trigger_specification: table:sensors
              api_request:
                summary: On-demand request trigger example
                description: |
                  Creates an HTTP endpoint `/api/v3/engine/hello-world` for manual invocation.
                value:
                  db: mydb
                  plugin_filename: request.py
                  trigger_name: hello_world_trigger
                  trigger_specification: path:hello-world
              cron_friday_afternoon:
                summary: Cron trigger for Friday afternoons
                description: |
                  Example of a cron trigger that runs every Friday at 2:30 PM.
                value:
                  db: reports
                  plugin_filename: weekly_report.py
                  trigger_name: friday_report_trigger
                  trigger_specification: cron:0 30 14 * * 5
              cron_monthly:
                summary: Cron trigger for monthly execution
                description: |
                  Example of a cron trigger that runs on the first day of every month at midnight.
                value:
                  db: monthly_data
                  plugin_filename: monthly_cleanup.py
                  trigger_name: monthly_cleanup_trigger
                  trigger_specification: cron:0 0 0 1 * *
      responses:
        '200':
          description: Success. Processing engine trigger created.
        '400':
          description: Bad request.
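The `every:` values in the examples above use single-unit duration shorthand. As an illustration of how `DURATION` is interpreted, here is a small sketch (the helper name and error handling are assumptions, not part of this spec) that converts an interval trigger specification to seconds using the `s`/`m`/`h`/`d` units this spec documents:

```javascript
// Convert an `every:DURATION` trigger specification to seconds.
// Units follow the shorthand documented in this spec: s, m, h, d.
// everySeconds is a hypothetical helper for illustration only.
function everySeconds(spec) {
  const match = /^every:([0-9]+)([smhd])$/.exec(spec);
  if (!match) {
    throw new Error(`Not an interval trigger specification: ${spec}`);
  }
  const unitSeconds = { s: 1, m: 60, h: 3600, d: 86400 };
  return Number(match[1]) * unitSeconds[match[2]];
}

console.log(everySeconds('every:1h')); // 3600
```

So `every:1h` in the `schedule_every` example fires 3600 seconds apart, and `every:30s` fires every 30 seconds.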
@ -1157,7 +1233,7 @@ paths:
            $ref: '#/components/schemas/ProcessingEngineTriggerRequest'
      responses:
        '200':
          description: Success. The processing engine trigger is enabled.
        '400':
          description: Bad request.
        '401':
@ -1170,7 +1246,14 @@ paths:
    post:
      operationId: PostInstallPluginPackages
      summary: Install plugin packages
      description: |
        Installs the specified Python packages into the processing engine plugin environment.

        This endpoint is synchronous and blocks until the packages are installed.

        ### Related guides

        - [Processing engine and Python plugins](/influxdb3/enterprise/plugins/)
      parameters:
        - $ref: '#/components/parameters/ContentType'
      requestBody:
@ -1179,10 +1262,30 @@ paths:
          application/json:
            schema:
              type: object
              additionalProperties: true
              properties:
                packages:
                  type: array
                  items:
                    type: string
                  description: |
                    A list of Python package names to install.
                    Can include version specifiers (e.g., "scipy==1.9.0").
                  example:
                    - influxdb3-python
                    - scipy
                    - pandas==1.5.0
                    - requests
              required:
                - packages
            example:
              packages:
                - influxdb3-python
                - scipy
                - pandas==1.5.0
                - requests
      responses:
        '200':
          description: Success. The packages are installed.
        '400':
          description: Bad request.
        '401':
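The request body schema above requires a non-empty `packages` array, where each entry may carry a pip-style version specifier. A minimal sketch of building that body client-side (the helper name and validation are assumptions; the endpoint URL and auth header are elided here and should be taken from the spec):

```javascript
// Build the JSON request body for the install-packages operation.
// buildInstallPackagesBody is a hypothetical helper for illustration.
function buildInstallPackagesBody(packages) {
  if (!Array.isArray(packages) || packages.length === 0) {
    throw new Error('packages is required and must be a non-empty array');
  }
  // Entries may include version specifiers, e.g. "pandas==1.5.0".
  return JSON.stringify({ packages });
}

console.log(buildInstallPackagesBody(['influxdb3-python', 'pandas==1.5.0']));
// {"packages":["influxdb3-python","pandas==1.5.0"]}
```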
@ -1193,7 +1296,15 @@ paths:
    post:
      operationId: PostInstallPluginRequirements
      summary: Install plugin requirements
      description: |
        Installs requirements from a requirements file (also known as a "pip requirements file") into the processing engine plugin environment.

        This endpoint is synchronous and blocks until the requirements are installed.

        ### Related

        - [Processing engine and Python plugins](/influxdb3/enterprise/plugins/)
        - [Python requirements file format](https://pip.pypa.io/en/stable/reference/requirements-file-format/)
      parameters:
        - $ref: '#/components/parameters/ContentType'
      requestBody:
@ -1202,7 +1313,17 @@ paths:
          application/json:
            schema:
              type: object
              additionalProperties: true
              properties:
                requirements_location:
                  type: string
                  description: |
                    The path to the requirements file containing Python packages to install.
                    Can be a relative path (relative to the plugin directory) or an absolute path.
                  example: requirements.txt
              required:
                - requirements_location
            example:
              requirements_location: requirements.txt
      responses:
        '200':
          description: Success. The requirements have been installed.
@ -1248,18 +1369,18 @@ paths:
      parameters:
        - name: plugin_path
          description: |
            The path configured in the request trigger specification (`path:<plugin_path>`) for the plugin.

            For example, if you define a trigger with the following:

            ```
            trigger-spec: "path:hello-world"
            ```

            then the HTTP API exposes the following plugin endpoint:

            ```
            <INFLUXDB3_HOST>/api/v3/engine/hello-world
            ```
          in: path
          required: true
@ -1269,7 +1390,7 @@ paths:
      operationId: GetProcessingEnginePluginRequest
      summary: On Request processing engine plugin request
      description: |
        Executes the On Request processing engine plugin specified in `<plugin_path>`.
        The request can include request headers, query string parameters, and a request body, which InfluxDB passes to the plugin.

        An On Request plugin implements the following signature:
@ -1296,7 +1417,7 @@ paths:
      operationId: PostProcessingEnginePluginRequest
      summary: On Request processing engine plugin request
      description: |
        Executes the On Request processing engine plugin specified in `<plugin_path>`.
        The request can include request headers, query string parameters, and a request body, which InfluxDB passes to the plugin.

        An On Request plugin implements the following signature:
@ -1448,7 +1569,6 @@ components:
        schema:
          type: string
        description: |
          The name of the database.
          InfluxDB creates the database if it doesn't already exist, and then
          writes all points in the batch to the database.
@ -1804,15 +1924,69 @@ components:
          type: string
        plugin_filename:
          type: string
          description: |
            The path and filename of the plugin to execute--for example,
            `schedule.py` or `endpoints/report.py`.
            The path can be absolute or relative to the `--plugins-dir` directory configured when starting InfluxDB 3.

            The plugin file must implement the trigger interface associated with the trigger's specification (`trigger_spec`).
        trigger_name:
          type: string
        trigger_specification:
          type: string
          description: |
            Specifies when and how the processing engine trigger should be invoked.

            ## Supported trigger specifications

            ### Cron-based scheduling
            Format: `cron:CRON_EXPRESSION`

            Uses extended (6-field) cron format (second minute hour day_of_month month day_of_week):
            ```
            ┌───────────── second (0-59)
            │ ┌───────────── minute (0-59)
            │ │ ┌───────────── hour (0-23)
            │ │ │ ┌───────────── day of month (1-31)
            │ │ │ │ ┌───────────── month (1-12)
            │ │ │ │ │ ┌───────────── day of week (0-6, Sunday=0)
            │ │ │ │ │ │
            * * * * * *
            ```
            Examples:
            - `cron:0 0 6 * * 1-5` - Every weekday at 6:00 AM
            - `cron:0 30 14 * * 5` - Every Friday at 2:30 PM
            - `cron:0 0 0 1 * *` - First day of every month at midnight

            ### Interval-based scheduling
            Format: `every:DURATION`

            Supported durations: `s` (seconds), `m` (minutes), `h` (hours), `d` (days):
            - `every:30s` - Every 30 seconds
            - `every:5m` - Every 5 minutes
            - `every:1h` - Every hour
            - `every:1d` - Every day

            ### Table-based triggers
            - `all_tables` - Triggers on write events to any table in the database
            - `table:TABLE_NAME` - Triggers on write events to a specific table

            ### On-demand triggers
            Format: `path:ENDPOINT_NAME`

            Creates an HTTP endpoint `/api/v3/engine/ENDPOINT_NAME` for manual invocation:
            - `path:hello-world` - Creates endpoint `/api/v3/engine/hello-world`
            - `path:data-export` - Creates endpoint `/api/v3/engine/data-export`
          pattern: ^(cron:[0-9 *,/-]+|every:[0-9]+[smhd]|all_tables|table:[a-zA-Z_][a-zA-Z0-9_]*|path:[a-zA-Z0-9_-]+)$
          example: cron:0 0 6 * * 1-5
        trigger_arguments:
          type: object
          additionalProperties: true
          description: Optional arguments passed to the plugin.
        disabled:
          type: boolean
          default: false
          description: Whether the trigger is disabled.
      required:
        - db
        - plugin_filename
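The schema declares a `pattern` for `trigger_specification`, which a client can reuse to validate a specification before sending the request. A sketch using that regular expression verbatim (the helper name is an assumption for illustration):

```javascript
// The `pattern` declared for trigger_specification in this schema,
// expressed as a JavaScript regular expression literal.
const TRIGGER_SPEC_PATTERN =
  /^(cron:[0-9 *,\/-]+|every:[0-9]+[smhd]|all_tables|table:[a-zA-Z_][a-zA-Z0-9_]*|path:[a-zA-Z0-9_-]+)$/;

// Hypothetical client-side validation helper.
function isValidTriggerSpec(spec) {
  return TRIGGER_SPEC_PATTERN.test(spec);
}

console.log(isValidTriggerSpec('cron:0 0 6 * * 1-5')); // true
console.log(isValidTriggerSpec('every:90x')); // false
```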
@ -2,7 +2,7 @@
///////////////// Preferred Client Library programming language ///////////////
////////////////////////////////////////////////////////////////////////////////
import { activateTabs, updateBtnURLs } from './tabbed-content.js';
import { getPreference, setPreference } from './services/local-storage.js';

function getVisitedApiLib() {
  const path = window.location.pathname.match(
@ -9,8 +9,8 @@ function setUser(userid, email) {
    user: {
      uniqueClientId: userid,
      email: email,
    },
  };
}

// Initialize the chat widget
@ -19,18 +19,20 @@ function initializeChat({onChatLoad, chatAttributes}) {
   * available configuration options.
   * All values are strings.
   */
  // If you make changes to data attributes here, you also need to
  // port the changes to the api-docs/template.hbs API reference template.
  const requiredAttributes = {
    websiteId: 'a02bca75-1dd3-411e-95c0-79ee1139be4d',
    projectName: 'InfluxDB',
    projectColor: '#020a47',
    projectLogo: '/img/influx-logo-cubo-white.png',
  };

  const optionalAttributes = {
    modalDisclaimer:
      'This AI can access [documentation for InfluxDB, clients, and related tools](https://docs.influxdata.com). Information you submit is used in accordance with our [Privacy Policy](https://www.influxdata.com/legal/privacy-policy/).',
    modalExampleQuestions:
      'Use Python to write data to InfluxDB 3,How do I query using SQL?,How do I use MQTT with Telegraf?',
    buttonHide: 'true',
    exampleQuestionButtonWidth: 'auto',
    modalOpenOnCommandK: 'true',
@ -52,7 +54,7 @@ function initializeChat({onChatLoad, chatAttributes}) {
    modalHeaderBorderBottom: 'none',
    modalTitleColor: '#fff',
    modalTitleFontSize: '1.25rem',
  };

  const scriptUrl = 'https://widget.kapa.ai/kapa-widget.bundle.js';
  const script = document.createElement('script');
@ -66,8 +68,12 @@ function initializeChat({onChatLoad, chatAttributes}) {
    console.error('Error loading AI chat widget script');
  };

  const dataset = {
    ...requiredAttributes,
    ...optionalAttributes,
    ...chatAttributes,
  };
  Object.keys(dataset).forEach((key) => {
    // Assign dataset attributes from the object
    script.dataset[key] = dataset[key];
  });
@ -91,12 +97,11 @@ function getProductExampleQuestions() {
 * userid: optional, a unique user ID for the user (not currently used for public docs)
 */
export default function AskAI({ userid, email, onChatLoad, ...chatParams }) {
  const modalExampleQuestions = getProductExampleQuestions();
  const chatAttributes = {
    ...(modalExampleQuestions && { modalExampleQuestions }),
    ...chatParams,
  };
  initializeChat({ onChatLoad, chatAttributes });

  if (userid) {
@ -2,7 +2,7 @@ import $ from 'jquery';

function initialize() {
  var codeBlockSelector = '.article--content pre';
  var $codeBlocks = $(codeBlockSelector);

  var appendHTML = `
    <div class="code-controls">
@ -15,7 +15,7 @@ function initialize() {
  `;

  // Wrap all codeblocks with a new 'codeblock' div
  $codeBlocks.each(function () {
    $(this).wrap("<div class='codeblock'></div>");
  });
@ -68,7 +68,9 @@ function initialize() {
  // Trigger copy failure state lifecycle

  $('.copy-code').click(function () {
    let text = $(this)
      .closest('.code-controls')
      .prevAll('pre:has(code)')[0].innerText;

    const copyContent = async () => {
      try {
@ -90,7 +92,10 @@ Disable scrolling on the body.
Disable user selection on everything but the fullscreen codeblock.
*/
$('.fullscreen-toggle').click(function () {
  var code = $(this)
    .closest('.code-controls')
    .prevAll('pre:has(code)')
    .clone();

  $('#fullscreen-code-placeholder').replaceWith(code[0]);
  $('body').css('overflow', 'hidden');
@ -0,0 +1,78 @@
// Memoize the mermaid module import
let mermaidPromise = null;

export default function Diagram({ component }) {
  // Import mermaid.js module (memoized)
  if (!mermaidPromise) {
    mermaidPromise = import('mermaid');
  }
  mermaidPromise
    .then(({ default: mermaid }) => {
      // Configure mermaid with InfluxData theming
      mermaid.initialize({
        startOnLoad: false, // We'll manually call run()
        theme: document.body.classList.contains('dark-theme')
          ? 'dark'
          : 'default',
        themeVariables: {
          fontFamily: 'Proxima Nova',
          fontSize: '16px',
          lineColor: '#22ADF6',
          primaryColor: '#22ADF6',
          primaryTextColor: '#545454',
          secondaryColor: '#05CE78',
          tertiaryColor: '#f4f5f5',
        },
        securityLevel: 'loose', // Required for interactive diagrams
        logLevel: 'error',
      });

      // Process the specific diagram component
      try {
        mermaid.run({ nodes: [component] });
      } catch (error) {
        console.error('Mermaid diagram rendering error:', error);
      }

      // Store reference to mermaid for theme switching
      if (!window.mermaidInstances) {
        window.mermaidInstances = new Map();
      }
      window.mermaidInstances.set(component, mermaid);
    })
    .catch((error) => {
      console.error('Failed to load Mermaid library:', error);
    });

  // Listen for theme changes to refresh diagrams
  const observer = new MutationObserver((mutations) => {
    mutations.forEach((mutation) => {
      if (
        mutation.attributeName === 'class' &&
        document.body.classList.contains('dark-theme') !== window.isDarkTheme
      ) {
        window.isDarkTheme = document.body.classList.contains('dark-theme');

        // Reload this specific diagram with new theme
        if (window.mermaidInstances?.has(component)) {
          const mermaid = window.mermaidInstances.get(component);
          mermaid.initialize({
            theme: window.isDarkTheme ? 'dark' : 'default',
          });
          mermaid.run({ nodes: [component] });
        }
      }
    });
  });

  // Watch for theme changes on body element
  observer.observe(document.body, { attributes: true });

  // Return cleanup function to be called when component is destroyed
  return () => {
    observer.disconnect();
    if (window.mermaidInstances?.has(component)) {
      window.mermaidInstances.delete(component);
    }
  };
}
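The module-level `mermaidPromise` ensures the library is fetched once no matter how many diagram components initialize. The same memoized-loader pattern in isolation, with a stand-in loader instead of `import('mermaid')` (both helper names here are illustrative, not docs-v2 code):

```javascript
// Memoized loader: the underlying load runs once; later calls reuse the promise.
function makeMemoizedLoader(load) {
  let promise = null;
  return function () {
    if (!promise) {
      promise = load();
    }
    return promise;
  };
}

// Stand-in for a dynamic import; counts how often the load actually runs.
let loads = 0;
const fakeLoad = () => {
  loads += 1;
  return Promise.resolve({ default: 'mermaid-module' });
};

const getMermaid = makeMemoizedLoader(fakeLoad);
Promise.all([getMermaid(), getMermaid()]).then(() => {
  console.log(loads); // 1
});
```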
@ -0,0 +1,180 @@
/**
 * DocSearch component for InfluxData documentation
 * Handles asynchronous loading and initialization of Algolia DocSearch
 */
const debug = false; // Set to true for debugging output

export default function DocSearch({ component }) {
  // Store configuration from component data attributes
  const config = {
    apiKey: component.getAttribute('data-api-key'),
    appId: component.getAttribute('data-app-id'),
    indexName: component.getAttribute('data-index-name'),
    inputSelector: component.getAttribute('data-input-selector'),
    searchTag: component.getAttribute('data-search-tag'),
    includeFlux: component.getAttribute('data-include-flux') === 'true',
    includeResources:
      component.getAttribute('data-include-resources') === 'true',
    debug: component.getAttribute('data-debug') === 'true',
  };

  // Initialize global object to track DocSearch state
  window.InfluxDocs = window.InfluxDocs || {};
  window.InfluxDocs.search = {
    initialized: false,
    options: config,
  };

  // Load DocSearch asynchronously
  function loadDocSearch() {
    if (debug) {
      console.log('Loading DocSearch script...');
    }
    const script = document.createElement('script');
    script.src =
      'https://cdn.jsdelivr.net/npm/docsearch.js@2/dist/cdn/docsearch.min.js';
    script.async = true;
    script.onload = initializeDocSearch;
    document.body.appendChild(script);
  }

  // Initialize DocSearch after script loads
  function initializeDocSearch() {
    if (debug) {
      console.log('Initializing DocSearch...');
    }
    const multiVersion = ['influxdb'];

    // Use object-based lookups instead of conditionals for version and product names
    // These can be replaced with data from productData in the future

    // Version display name mappings
    const versionDisplayNames = {
      cloud: 'Cloud (TSM)',
      core: 'Core',
      enterprise: 'Enterprise',
      'cloud-serverless': 'Cloud Serverless',
      'cloud-dedicated': 'Cloud Dedicated',
      clustered: 'Clustered',
      explorer: 'Explorer',
    };

    // Product display name mappings
    const productDisplayNames = {
      influxdb: 'InfluxDB',
      influxdb3: 'InfluxDB 3',
      explorer: 'InfluxDB 3 Explorer',
      enterprise_influxdb: 'InfluxDB Enterprise',
      flux: 'Flux',
      telegraf: 'Telegraf',
      chronograf: 'Chronograf',
      kapacitor: 'Kapacitor',
      platform: 'InfluxData Platform',
      resources: 'Additional Resources',
    };

    // Initialize DocSearch with configuration
    window.docsearch({
      apiKey: config.apiKey,
      appId: config.appId,
      indexName: config.indexName,
      inputSelector: config.inputSelector,
      debug: config.debug,
      transformData: function (hits) {
        // Format version using object lookup instead of if-else chain
        function fmtVersion(version, productKey) {
          if (version == null) {
            return '';
          } else if (versionDisplayNames[version]) {
            return versionDisplayNames[version];
          } else if (multiVersion.includes(productKey)) {
            return version;
          } else {
            return '';
          }
        }

        hits.map((hit) => {
          const pathData = new URL(hit.url).pathname
            .split('/')
            .filter((n) => n);
          const product = productDisplayNames[pathData[0]] || pathData[0];
          const version = fmtVersion(pathData[1], pathData[0]);

          hit.product = product;
          hit.version = version;
          hit.hierarchy.lvl0 =
            hit.hierarchy.lvl0 +
            ` <span class="search-product-version">${product} ${version}</span>`;
          hit._highlightResult.hierarchy.lvl0.value =
            hit._highlightResult.hierarchy.lvl0.value +
            ` <span class="search-product-version">${product} ${version}</span>`;
        });
        return hits;
      },
      algoliaOptions: {
        hitsPerPage: 10,
        facetFilters: buildFacetFilters(config),
      },
      autocompleteOptions: {
        templates: {
          header:
            '<div class="search-all-content"><a href="https://support.influxdata.com" target="_blank">Search all InfluxData content <span class="icon-arrow-up-right"></span></a>',
          empty:
            '<div class="search-no-results"><p>Not finding what you\'re looking for?</p> <a href="https://support.influxdata.com" target="_blank">Search all InfluxData content <span class="icon-arrow-up-right"></span></a></div>',
        },
      },
    });

    // Mark DocSearch as initialized
    window.InfluxDocs.search.initialized = true;

    // Dispatch event for other components to know DocSearch is ready
    window.dispatchEvent(new CustomEvent('docsearch-initialized'));
  }

  /**
   * Helper function to build facet filters based on config
   * - Uses nested arrays for AND conditions
   * - Includes space after colon in filter expressions
   */
  function buildFacetFilters(config) {
    if (!config.searchTag) {
      return ['latest:true'];
    } else if (config.includeFlux) {
      // Return a nested array to match original template structure
      // Note the space after each colon
      return [
        [
          'searchTag: ' + config.searchTag,
          'flux:true',
          'resources: ' + config.includeResources,
        ],
      ];
    } else {
      // Return a nested array to match original template structure
      // Note the space after each colon
      return [
        [
          'searchTag: ' + config.searchTag,
          'resources: ' + config.includeResources,
        ],
      ];
    }
  }

  // Load DocSearch when page is idle or after a slight delay
  if ('requestIdleCallback' in window) {
    requestIdleCallback(loadDocSearch);
  } else {
    setTimeout(loadDocSearch, 500);
  }

  // Return cleanup function
  return function cleanup() {
    // Clean up any event listeners if needed
    if (debug) {
      console.log('DocSearch component cleanup');
    }
  };
}
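The facet-filter logic is a pure function and can be exercised on its own. This standalone copy of `buildFacetFilters` (lifted out of the component closure for illustration) shows the shapes Algolia receives for each configuration:

```javascript
// Standalone copy of the component's buildFacetFilters helper.
// Nested arrays express AND conditions; note the space after each colon.
function buildFacetFilters(config) {
  if (!config.searchTag) {
    return ['latest:true'];
  } else if (config.includeFlux) {
    return [
      [
        'searchTag: ' + config.searchTag,
        'flux:true',
        'resources: ' + config.includeResources,
      ],
    ];
  } else {
    return [
      [
        'searchTag: ' + config.searchTag,
        'resources: ' + config.includeResources,
      ],
    ];
  }
}

console.log(JSON.stringify(buildFacetFilters({})));
// ["latest:true"]
console.log(
  JSON.stringify(buildFacetFilters({ searchTag: 'v2', includeResources: false }))
);
// [["searchTag: v2","resources: false"]]
```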
@ -0,0 +1,6 @@
import SearchInteractions from '../utils/search-interactions.js';

export default function SidebarSearch({ component }) {
  const searchInput = component.querySelector('.sidebar--search-field');
  SearchInteractions({ searchInput });
}
@ -1,7 +1,7 @@
import $ from 'jquery';
import { Datepicker } from 'vanillajs-datepicker';
import { toggleModal } from './modals.js';
import * as localStorage from './services/local-storage.js';

// Placeholder start date used in InfluxDB custom timestamps
const defaultStartDate = '2022-01-01';
@ -111,7 +111,7 @@ function timeToUnixSeconds(time) {

function updateTimestamps(newStartDate, seedTimes = defaultTimes) {
  // Update the times array with replacement times
  const times = seedTimes.map((x) => {
    var newStartTimestamp = x.rfc3339.replace(/^.*T/, newStartDate + 'T');

    return {
@ -1,30 +1,54 @@
import $ from 'jquery';

var date = new Date();
var currentTimestamp = date.toISOString().replace(/^(.*)(\.\d+)(Z)/, '$1$3'); // 2023-01-01T12:34:56Z

// Microsecond offset appended to the current time string for formatting purposes
const MICROSECOND_OFFSET = '084216';

var currentTime =
  date.toISOString().replace(/(^.*T)(.*)(Z)/, '$2') + MICROSECOND_OFFSET; // 12:34:56.000084216
function currentDate(offset = 0, trimTime = false) {
  let outputDate = new Date(date);
  outputDate.setDate(outputDate.getDate() + offset);

  if (trimTime) {
    return outputDate.toISOString().replace(/T.*$/, ''); // 2023-01-01
  } else {
    return outputDate.toISOString().replace(/T.*$/, 'T00:00:00Z'); // 2023-01-01T00:00:00Z
  }
}

function enterpriseEOLDate() {
  const monthNames = [
    'January',
    'February',
    'March',
    'April',
    'May',
    'June',
    'July',
    'August',
    'September',
    'October',
    'November',
    'December',
  ];
  var inTwoYears = new Date(date);
  inTwoYears.setFullYear(inTwoYears.getFullYear() + 2);
  let earliestEOL = new Date(inTwoYears);
  return `${monthNames[earliestEOL.getMonth()]} ${earliestEOL.getDate()}, ${earliestEOL.getFullYear()}`;
}

function initialize() {
  $('span.current-timestamp').text(currentTimestamp);
  $('span.current-time').text(currentTime);
  $('span.enterprise-eol-date').text(enterpriseEOLDate());
  $('span.current-date').each(function () {
    var dayOffset = parseInt($(this).attr('offset'));
    var trimTime = $(this).attr('trim-time') === 'true';
    $(this).text(currentDate(dayOffset, trimTime));
  });
}

export { initialize };
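The date arithmetic in `currentDate` can be shown deterministically by injecting the base date instead of reading the module-level `date`. A sketch with that one change (the function name and signature here are stand-ins, not the module's exports):

```javascript
// Same logic as currentDate above, but the base date is a parameter
// so the result is reproducible. Illustrative stand-in only.
function dateWithOffset(baseDate, offset = 0, trimTime = false) {
  const outputDate = new Date(baseDate);
  outputDate.setDate(outputDate.getDate() + offset);
  return trimTime
    ? outputDate.toISOString().replace(/T.*$/, '') // 2023-01-01
    : outputDate.toISOString().replace(/T.*$/, 'T00:00:00Z'); // 2023-01-01T00:00:00Z
}

console.log(dateWithOffset(new Date('2023-01-15T12:00:00Z'), 1, true));
// 2023-01-16
```

Note that `setDate` works in local time while `toISOString` is UTC, so offsets near midnight can shift the printed date by one day depending on the runtime's timezone.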
@ -2,37 +2,24 @@
This feature is designed to callout new features added to the documentation
CSS is required for the callout bubble to determine look and position, but the
element must have the `callout` class and a unique id.
Callouts are treated as notifications and use the LocalStorage notification API.
*/

import $ from 'jquery';
import * as LocalStorageAPI from './services/local-storage.js';

// Get notification ID
function getCalloutID(el) {
  return $(el).attr('id');
}

// Show the url feature callouts on page load
export default function FeatureCallout({ component }) {
  const calloutID = getCalloutID($(component));

  if (!LocalStorageAPI.notificationIsRead(calloutID, 'callout')) {
    $(`#${calloutID}.feature-callout`)
      .fadeIn(300)
      .removeClass('start-position');
  }
}
@ -1,49 +1,148 @@
import $ from 'jquery';

// Sample data
let data = [
  [
    {
      _time: '2021-01-01T00:00:00Z',
      _measurement: 'example',
      loc: 'rm1',
      sensorID: 'A123',
      _field: 'temp',
      _value: 110.3,
    },
    {
      _time: '2021-01-01T00:01:00Z',
      _measurement: 'example',
      loc: 'rm1',
      sensorID: 'A123',
      _field: 'temp',
      _value: 112.5,
    },
    {
      _time: '2021-01-01T00:02:00Z',
      _measurement: 'example',
      loc: 'rm1',
      sensorID: 'A123',
      _field: 'temp',
      _value: 111.9,
    },
  ],
  [
    {
      _time: '2021-01-01T00:00:00Z',
      _measurement: 'example',
      loc: 'rm1',
      sensorID: 'A123',
      _field: 'hum',
      _value: 73.4,
    },
    {
      _time: '2021-01-01T00:01:00Z',
      _measurement: 'example',
      loc: 'rm1',
      sensorID: 'A123',
      _field: 'hum',
      _value: 73.7,
    },
    {
      _time: '2021-01-01T00:02:00Z',
      _measurement: 'example',
      loc: 'rm1',
      sensorID: 'A123',
      _field: 'hum',
      _value: 75.1,
    },
  ],
  [
    {
      _time: '2021-01-01T00:00:00Z',
      _measurement: 'example',
      loc: 'rm2',
      sensorID: 'B456',
      _field: 'temp',
      _value: 108.2,
    },
    {
      _time: '2021-01-01T00:01:00Z',
      _measurement: 'example',
      loc: 'rm2',
      sensorID: 'B456',
      _field: 'temp',
      _value: 108.5,
    },
    {
      _time: '2021-01-01T00:02:00Z',
      _measurement: 'example',
      loc: 'rm2',
      sensorID: 'B456',
      _field: 'temp',
      _value: 109.6,
    },
  ],
  [
    {
      _time: '2021-01-01T00:00:00Z',
      _measurement: 'example',
      loc: 'rm2',
      sensorID: 'B456',
      _field: 'hum',
      _value: 71.8,
    },
    {
      _time: '2021-01-01T00:01:00Z',
      _measurement: 'example',
      loc: 'rm2',
      sensorID: 'B456',
      _field: 'hum',
      _value: 72.3,
    },
    {
      _time: '2021-01-01T00:02:00Z',
      _measurement: 'example',
      loc: 'rm2',
      sensorID: 'B456',
      _field: 'hum',
      _value: 72.1,
    },
  ],
];

// Default group key
let groupKey = ['_measurement', 'loc', 'sensorID', '_field'];

export default function FluxGroupKeysDemo({ component }) {
  $('.column-list label').click(function () {
    toggleCheckbox($(this));
    groupKey = getChecked(component);
    groupData();
    buildGroupExample(component);
  });

  // Group and render tables on load
  groupData();
}

// Build a table group (group key and table) using an array of objects
function buildTable(inputData) {
  // Build the group key string
  function wrapString(column, value) {
    var stringColumns = ['_measurement', 'loc', 'sensorID', '_field'];
    if (stringColumns.includes(column)) {
      return '"' + value + '"'
return '"' + value + '"';
|
||||
} else {
|
||||
return value
|
||||
return value;
|
||||
}
|
||||
}
|
||||
var groupKeyString = "Group key instance = [" + (groupKey.map(column => column + ": " + wrapString(column, (inputData[0])[column])) ).join(", ") + "]";
|
||||
var groupKeyLabel = document.createElement("p");
|
||||
groupKeyLabel.className = "table-group-key"
|
||||
groupKeyLabel.innerHTML = groupKeyString
|
||||
|
||||
var groupKeyString =
|
||||
'Group key instance = [' +
|
||||
groupKey
|
||||
.map((column) => column + ': ' + wrapString(column, inputData[0][column]))
|
||||
.join(', ') +
|
||||
']';
|
||||
var groupKeyLabel = document.createElement('p');
|
||||
groupKeyLabel.className = 'table-group-key';
|
||||
groupKeyLabel.innerHTML = groupKeyString;
|
||||
|
||||
// Extract column headers
|
||||
var columns = [];
|
||||
|
@ -56,54 +155,55 @@ function buildTable(inputData) {
|
|||
}
|
||||
|
||||
// Create the table element
|
||||
var table = document.createElement("table");
|
||||
const table = document.createElement('table');
|
||||
|
||||
// Create the table header
|
||||
for (let i = 0; i < columns.length; i++) {
|
||||
var header = table.createTHead();
|
||||
var th = document.createElement("th");
|
||||
var th = document.createElement('th');
|
||||
th.innerHTML = columns[i];
|
||||
if (groupKey.includes(columns[i])) {
|
||||
th.className = "grouped-by";
|
||||
th.className = 'grouped-by';
|
||||
}
|
||||
header.appendChild(th);
|
||||
}
|
||||
|
||||
// Add inputData to the HTML table
|
||||
for (let i = 0; i < inputData.length; i++) {
|
||||
tr = table.insertRow(-1);
|
||||
let tr = table.insertRow(-1);
|
||||
for (let j = 0; j < columns.length; j++) {
|
||||
var td = tr.insertCell(-1);
|
||||
td.innerHTML = inputData[i][columns[j]];
|
||||
// Highlight the value if column is part of the group key
|
||||
if (groupKey.includes(columns[j])) {
|
||||
td.className = "grouped-by";
|
||||
td.className = 'grouped-by';
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Create a table group with group key and table
|
||||
var tableGroup = document.createElement("div");
|
||||
tableGroup.innerHTML += groupKeyLabel.outerHTML + table.outerHTML
|
||||
var tableGroup = document.createElement('div');
|
||||
tableGroup.innerHTML += groupKeyLabel.outerHTML + table.outerHTML;
|
||||
|
||||
return tableGroup
|
||||
return tableGroup;
|
||||
}
|
||||
|
||||
// Clear and rebuild all HTML tables
|
||||
function buildTables(data) {
|
||||
existingTables = tablesElement[0]
|
||||
let tablesElement = $('#flux-group-keys-demo #grouped-tables');
|
||||
let existingTables = tablesElement[0];
|
||||
while (existingTables.firstChild) {
|
||||
existingTables.removeChild(existingTables.firstChild);
|
||||
}
|
||||
for (let i = 0; i < data.length; i++) {
|
||||
var table = buildTable(data[i])
|
||||
var table = buildTable(data[i]);
|
||||
tablesElement.append(table);
|
||||
}
|
||||
}
|
||||
|
||||
// Group data based on the group key and output new tables
|
||||
function groupData() {
|
||||
let groupedData = data.flat()
|
||||
let groupedData = data.flat();
|
||||
|
||||
function groupBy(array, f) {
|
||||
var groups = {};
|
||||
|
@ -114,20 +214,19 @@ function groupData() {
|
|||
});
|
||||
return Object.keys(groups).map(function (group) {
|
||||
return groups[group];
|
||||
})
|
||||
});
|
||||
}
|
||||
|
||||
groupedData = groupBy(groupedData, function (r) {
|
||||
return groupKey.map(v => r[v]);
|
||||
return groupKey.map((v) => r[v]);
|
||||
});
|
||||
|
||||
buildTables(groupedData);
|
||||
}
|
||||
|
||||
function getChecked(component) {
|
||||
// Get selected column names
|
||||
var checkboxes = $("input[type=checkbox]");
|
||||
|
||||
function getChecked() {
|
||||
var checkboxes = $(component).find('input[type=checkbox]');
|
||||
var checked = [];
|
||||
for (var i = 0; i < checkboxes.length; i++) {
|
||||
var checkbox = checkboxes[i];
|
||||
|
@ -141,17 +240,12 @@ function toggleCheckbox(element) {
|
|||
}
|
||||
|
||||
// Build example group function
|
||||
function buildGroupExample() {
|
||||
var columnCollection = getChecked().map(i => '<span class=\"s2\">"' + i + '"</span>').join(", ")
|
||||
$("pre#group-by-example")[0].innerHTML = "data\n <span class='nx'>|></span> group(columns<span class='nx'>:</span> [" + columnCollection + "])";
|
||||
function buildGroupExample(component) {
|
||||
var columnCollection = getChecked(component)
|
||||
.map((i) => '<span class=\"s2\">"' + i + '"</span>')
|
||||
.join(', ');
|
||||
$('pre#group-by-example')[0].innerHTML =
|
||||
"data\n <span class='nx'>|></span> group(columns<span class='nx'>:</span> [" +
|
||||
columnCollection +
|
||||
'])';
|
||||
}
|
||||
|
||||
$(".column-list label").click(function () {
|
||||
toggleCheckbox($(this))
|
||||
groupKey = getChecked();
|
||||
groupData();
|
||||
buildGroupExample();
|
||||
});
|
||||
|
||||
// Group and render tables on load
|
||||
groupData()
|
||||
|
|
|
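The `groupBy` helper in the diff above groups the flattened records by their group-key column values, producing one table per group key instance. A minimal standalone sketch of that technique (plain JavaScript, illustrative names only, no DOM or jQuery required):

```javascript
// Group an array of records by the values of the given key columns.
// Records sharing identical values for every key column land in one group,
// mirroring how Flux builds one table per group key instance.
function groupByKey(records, keyColumns) {
  const groups = {};
  for (const record of records) {
    // Serialize the composite key so it can index a plain object
    const key = JSON.stringify(keyColumns.map((col) => record[col]));
    (groups[key] = groups[key] || []).push(record);
  }
  return Object.values(groups);
}

const data = [
  { loc: 'rm1', _field: 'temp', _value: 112.5 },
  { loc: 'rm1', _field: 'hum', _value: 73.4 },
  { loc: 'rm2', _field: 'temp', _value: 108.2 },
];

console.log(groupByKey(data, ['loc']).length); // → 2 (rm1, rm2)
console.log(groupByKey(data, ['loc', '_field']).length); // → 3
```

Serializing the mapped key values with `JSON.stringify` is what lets a composite (multi-column) key index a plain object, the same trick the demo's `groupBy` uses with an array-returning key function.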
@@ -1,22 +0,0 @@
-$('.exp-btn').click(function() {
-  var targetBtnElement = $(this).parent()
-  $('.exp-btn > p', targetBtnElement).fadeOut(100);
-  setTimeout(function() {
-    $('.exp-btn-links', targetBtnElement).fadeIn(200)
-    $('.exp-btn', targetBtnElement).addClass('open');
-    $('.close-btn', targetBtnElement).fadeIn(200);
-  }, 100);
-})
-
-$('.close-btn').click(function() {
-  var targetBtnElement = $(this).parent().parent()
-  $('.exp-btn-links', targetBtnElement).fadeOut(100)
-  $('.exp-btn', targetBtnElement).removeClass('open');
-  $(this).fadeOut(100);
-  setTimeout(function() {
-    $('p', targetBtnElement).fadeIn(100);
-  }, 100);
-})
-
-/////////////////////////////// EXPANDING BUTTONS //////////////////////////////
@@ -1 +0,0 @@
-export * from './main.js';
@@ -3,7 +3,6 @@
 ///////////////////////// INFLUXDB URL PREFERENCE /////////////////////////////
 ////////////////////////////////////////////////////////////////////////////////
 */
-import * as pageParams from '@params';
 import {
   DEFAULT_STORAGE_URLS,
   getPreference,

@@ -12,15 +11,18 @@ import {
   removeInfluxDBUrl,
   getInfluxDBUrl,
   getInfluxDBUrls,
-} from './local-storage.js';
+} from './services/local-storage.js';
 import $ from 'jquery';
 import { context as PRODUCT_CONTEXT, referrerHost } from './page-context.js';
+import { influxdbUrls } from './services/influxdb-urls.js';
 import { delay } from './helpers.js';
 import { toggleModal } from './modals.js';

 let CLOUD_URLS = [];
-if (pageParams && pageParams.influxdb_urls) {
-  CLOUD_URLS = Object.values(pageParams.influxdb_urls.cloud.providers).flatMap((provider) => provider.regions?.map((region) => region.url));
+if (influxdbUrls?.cloud) {
+  CLOUD_URLS = Object.values(influxdbUrls.cloud.providers).flatMap((provider) =>
+    provider.regions?.map((region) => region.url)
+  );
 }
 export { CLOUD_URLS };

@@ -120,7 +122,8 @@ export function InfluxDBUrl() {

   // Retrieve the currently selected URLs from the urls local storage object.
   function getUrls() {
-    const { cloud, oss, core, enterprise, serverless, dedicated, clustered } = getInfluxDBUrls();
+    const { cloud, oss, core, enterprise, serverless, dedicated, clustered } =
+      getInfluxDBUrls();
     return { oss, cloud, core, enterprise, serverless, dedicated, clustered };
   }

@@ -289,7 +292,8 @@ export function InfluxDBUrl() {
   }

   // Append the URL selector button to each codeblock containing a placeholder URL
-  function appendUrlSelector(urls={
+  function appendUrlSelector(
+    urls = {
       cloud: '',
       oss: '',
       core: '',

@@ -297,7 +301,8 @@ export function InfluxDBUrl() {
       serverless: '',
       dedicated: '',
       clustered: '',
-  }) {
+    }
+  ) {
     const appendToUrls = Object.values(urls);

     const getBtnText = (context) => {

@@ -336,14 +341,26 @@ export function InfluxDBUrl() {

   // Add the preserve tag to code blocks that shouldn't be updated
   addPreserve();
-  const { cloud, oss, core, enterprise, serverless, dedicated, clustered } = DEFAULT_STORAGE_URLS;
+  const { cloud, oss, core, enterprise, serverless, dedicated, clustered } =
+    DEFAULT_STORAGE_URLS;

   // Append URL selector buttons to code blocks
-  appendUrlSelector({ cloud, oss, core, enterprise, serverless, dedicated, clustered });
+  appendUrlSelector({
+    cloud,
+    oss,
+    core,
+    enterprise,
+    serverless,
+    dedicated,
+    clustered,
+  });

   // Update URLs on load
-  updateUrls({ cloud, oss, core, enterprise, serverless, dedicated, clustered }, getUrls());
+  updateUrls(
+    { cloud, oss, core, enterprise, serverless, dedicated, clustered },
+    getUrls()
+  );

   // Set active radio button on page load
   setRadioButtons(getUrls());
@@ -1,41 +1,58 @@
-// Dynamically update keybindings or hotkeys
-function getPlatform() {
-  if (/Mac/.test(navigator.platform)) {
-    return "osx"
-  } else if (/Win/.test(navigator.platform)) {
-    return "win"
-  } else if (/Linux/.test(navigator.platform)) {
-    return "linux"
+import { getPlatform } from './utils/user-agent-platform.js';
+import $ from 'jquery';
+
+/**
+ * Adds OS-specific class to component
+ * @param {string} osClass - OS-specific class to add
+ * @param {Object} options - Component options
+ * @param {jQuery} options.$component - jQuery element reference
+ */
+function addOSClass(osClass, { $component }) {
+  $component.addClass(osClass);
+}
+
+/**
+ * Updates keybinding display based on detected platform
+ * @param {Object} options - Component options
+ * @param {jQuery} options.$component - jQuery element reference
+ * @param {string} options.platform - Detected platform
+ */
+function updateKeyBindings({ $component, platform }) {
+  const osx = $component.data('osx');
+  const linux = $component.data('linux');
+  const win = $component.data('win');
+
+  let keybind;
+
+  if (platform === 'other') {
+    if (win !== linux) {
+      keybind =
+        `<code class="osx">${osx}</code> for macOS, ` +
+        `<code>${linux}</code> for Linux, ` +
+        `and <code>${win}</code> for Windows`;
   } else {
-    return "other"
-  }
-}
-
-const platform = getPlatform()
-
-function addOSClass(osClass) {
-  $('.keybinding').addClass(osClass)
-}
-
-function updateKeyBindings() {
-  $('.keybinding').each(function() {
-    var osx = $(this).data("osx")
-    var linux = $(this).data("linux")
-    var win = $(this).data("win")
-
-    if (platform === "other") {
-      if (win != linux) {
-        var keybind = '<code class="osx">' + osx + '</code> for macOS, <code>' + linux + '</code> for Linux, and <code>' + win + '</code> for Windows';
      } else {
-        var keybind = '<code>' + linux + '</code> for Linux and Windows and <code class="osx">' + osx + '</code> for macOS';
+      keybind =
+        `<code>${linux}</code> for Linux and Windows and ` +
+        `<code class="osx">${osx}</code> for macOS`;
      }
    } else {
-      var keybind = '<code>' + $(this).data(platform) + '</code>'
+    keybind = `<code>${$component.data(platform)}</code>`;
    }
-
-    $(this).html(keybind)
-  })
+
+  $component.html(keybind);
 }
-
-addOSClass(platform)
-updateKeyBindings()
+
+/**
+ * Initialize and render platform-specific keybindings
+ * @param {Object} options - Component options
+ * @param {HTMLElement} options.component - DOM element
+ * @returns {void}
+ */
+export default function KeyBinding({ component }) {
+  // Initialize keybindings
+  const platform = getPlatform();
+  const $component = $(component);
+
+  addOSClass(platform, { $component });
+  updateKeyBindings({ $component, platform });
+}
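The keybinding selection logic in the hunk above can be isolated from the DOM entirely. A sketch of the same branching (illustrative `keybindHtml` name and key strings are mine, not part of the diff):

```javascript
// Choose the keybinding markup for a detected platform, mirroring the
// updateKeyBindings branching above: known platforms pick their own binding,
// unknown ("other") platforms list all of them.
function keybindHtml(platform, { osx, linux, win }) {
  if (platform === 'other') {
    return win !== linux
      ? `<code class="osx">${osx}</code> for macOS, <code>${linux}</code> for Linux, and <code>${win}</code> for Windows`
      : `<code>${linux}</code> for Linux and Windows and <code class="osx">${osx}</code> for macOS`;
  }
  // Look up the binding for the detected platform
  return `<code>${{ osx, linux, win }[platform]}</code>`;
}

console.log(keybindHtml('osx', { osx: '⌘K', linux: 'Ctrl+K', win: 'Ctrl+K' }));
// → '<code>⌘K</code>'
```

Keeping the string-building pure like this (platform and data in, HTML out) is what makes the refactored component testable without a jQuery element.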
@@ -1,11 +1,15 @@
+import $ from 'jquery';
+
 // Count tag elements
 function countTag(tag) {
-  return $(".visible[data-tags*='" + tag + "']").length
+  return $(".visible[data-tags*='" + tag + "']").length;
 }

-function getFilterCounts() {
-  $('#list-filters label').each(function() {
-    var tagName = $('input', this).attr('name').replace(/[\W/]+/, "-");
+function getFilterCounts($labels) {
+  $labels.each(function () {
+    var tagName = $('input', this)
+      .attr('name')
+      .replace(/[\W/]+/, '-');
     var tagCount = countTag(tagName);
     $(this).attr('data-count', '(' + tagCount + ')');
     if (tagCount <= 0) {

@@ -13,38 +17,58 @@ function getFilterCounts() {
     } else {
       $(this).fadeTo(400, 1.0);
     }
-  })
+  });
 }

-// Get initial filter count on page load
-getFilterCounts()
+/** TODO: Include the data source value as an additional attribute
+ * in the HTML and pass it into the component, which would let us use selectors
+ * for only the source items and let us have more than one
+ * list filter component per page without conflicts */
+export default function ListFilters({ component }) {
+  const $component = $(component);
+  const $labels = $component.find('label');
+  const $inputs = $component.find('input');

-$("#list-filters input").click(function() {
+  getFilterCounts($labels);
+
+  $inputs.click(function () {
     // List of tags to hide
-    var tagArray = $("#list-filters input:checkbox:checked").map(function(){
-      return $(this).attr('name').replace(/[\W]+/, "-");
-    }).get();
+    var tagArray = $component
+      .find('input:checkbox:checked')
+      .map(function () {
+        return $(this).attr('name').replace(/[\W]+/, '-');
+      })
+      .get();

     // List of tags to restore
-    var restoreArray = $("#list-filters input:checkbox:not(:checked)").map(function(){
-      return $(this).attr('name').replace(/[\W]+/, "-");
-    }).get();
+    var restoreArray = $component
+      .find('input:checkbox:not(:checked)')
+      .map(function () {
+        return $(this).attr('name').replace(/[\W]+/, '-');
+      })
+      .get();

     // Actions for filter select
     if ($(this).is(':checked')) {
       $.each(tagArray, function (index, value) {
-        $(".filter-item.visible:not([data-tags~='" + value + "'])").removeClass('visible').fadeOut()
-      })
+        $(".filter-item.visible:not([data-tags~='" + value + "'])")
+          .removeClass('visible')
+          .fadeOut();
+      });
     } else {
       $.each(restoreArray, function (index, value) {
-        $(".filter-item:not(.visible)[data-tags~='" + value + "']").addClass('visible').fadeIn()
-      })
+        $(".filter-item:not(.visible)[data-tags~='" + value + "']")
+          .addClass('visible')
+          .fadeIn();
+      });
       $.each(tagArray, function (index, value) {
-        $(".filter-item.visible:not([data-tags~='" + value + "'])").removeClass('visible').hide()
-      })
+        $(".filter-item.visible:not([data-tags~='" + value + "'])")
+          .removeClass('visible')
+          .hide();
+      });
     }

     // Refresh filter count
-    getFilterCounts()
+    getFilterCounts($labels);
   });
+}
@@ -1,7 +1,7 @@
 // assets/js/main.js

-// If you need to pass parameters from the calling Hugo page, you can import them here like so:
-// import * as pageParams from '@params';
+// Import dependencies that we still need to load in the global scope
+import $ from 'jquery';

 /** Import modules that are not components.
  * TODO: Refactor these into single-purpose component modules.

@@ -9,9 +9,10 @@
 import * as apiLibs from './api-libs.js';
 import * as codeControls from './code-controls.js';
 import * as contentInteractions from './content-interactions.js';
 import * as datetime from './datetime.js';
 import { delay } from './helpers.js';
 import { InfluxDBUrl } from './influxdb-url.js';
-import * as localStorage from './local-storage.js';
+import * as localStorage from './services/local-storage.js';
 import * as modals from './modals.js';
 import * as notifications from './notifications.js';
 import * as pageContext from './page-context.js';

@@ -29,8 +30,17 @@ import * as v3Wayfinding from './v3-wayfinding.js';
 import AskAITrigger from './ask-ai-trigger.js';
 import CodePlaceholder from './code-placeholders.js';
 import { CustomTimeTrigger } from './custom-timestamps.js';
 import Diagram from './components/diagram.js';
 import DocSearch from './components/doc-search.js';
 import FeatureCallout from './feature-callouts.js';
 import FluxGroupKeysDemo from './flux-group-keys.js';
 import FluxInfluxDBVersionsTrigger from './flux-influxdb-versions.js';
 import KeyBinding from './keybindings.js';
 import ListFilters from './list-filters.js';
 import ProductSelector from './version-selector.js';
 import ReleaseToc from './release-toc.js';
 import { SearchButton } from './search-button.js';
 import SidebarSearch from './components/sidebar-search.js';
 import { SidebarToggle } from './sidebar-toggle.js';
 import Theme from './theme.js';
 import ThemeSwitch from './theme-switch.js';

@@ -49,11 +59,20 @@ const componentRegistry = {
   'ask-ai-trigger': AskAITrigger,
   'code-placeholder': CodePlaceholder,
   'custom-time-trigger': CustomTimeTrigger,
   diagram: Diagram,
   'doc-search': DocSearch,
   'feature-callout': FeatureCallout,
   'flux-group-keys-demo': FluxGroupKeysDemo,
   'flux-influxdb-versions-trigger': FluxInfluxDBVersionsTrigger,
   keybinding: KeyBinding,
   'list-filters': ListFilters,
   'product-selector': ProductSelector,
   'release-toc': ReleaseToc,
   'search-button': SearchButton,
   'sidebar-search': SidebarSearch,
   'sidebar-toggle': SidebarToggle,
-  'theme': Theme,
-  'theme-switch': ThemeSwitch
+  theme: Theme,
+  'theme-switch': ThemeSwitch,
 };

@@ -72,6 +91,11 @@ function initGlobals() {
   window.influxdatadocs.toggleModal = modals.toggleModal;
   window.influxdatadocs.componentRegistry = componentRegistry;

+  // Re-export jQuery to global namespace for legacy scripts
+  if (typeof window.jQuery === 'undefined') {
+    window.jQuery = window.$ = $;
+  }
+
   return window.influxdatadocs;
 }

@@ -103,10 +127,13 @@ function initComponents(globals) {

       globals.instances[componentName].push({
         element: component,
-        instance
+        instance,
       });
     } catch (error) {
-      console.error(`Error initializing component "${componentName}":`, error);
+      console.error(
+        `Error initializing component "${componentName}":`,
+        error
+      );
     }
   } else {
     console.warn(`Unknown component: "${componentName}"`);

@@ -122,6 +149,7 @@ function initModules() {
   apiLibs.initialize();
   codeControls.initialize();
   contentInteractions.initialize();
   datetime.initialize();
   InfluxDBUrl();
   notifications.initialize();
   pageFeedback.initialize();
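The `componentRegistry` / `initComponents` hunks above implement a lookup-and-initialize pattern: each element declaring a `data-component` attribute is matched against the registry and initialized, with a warning for unknown names. A DOM-free sketch of that pattern (registry entries and the `dataset` stand-in are illustrative, not the real modules):

```javascript
// Map data-component names to component init functions (illustrative subset).
const componentRegistry = {
  keybinding: ({ component }) => ({ kind: 'keybinding', component }),
  'list-filters': ({ component }) => ({ kind: 'list-filters', component }),
};

// Initialize every element against the registry; unknown names warn and skip.
function initComponents(elements) {
  const instances = [];
  for (const el of elements) {
    const name = el.dataset.component;
    const Component = componentRegistry[name];
    if (!Component) {
      console.warn(`Unknown component: "${name}"`);
      continue;
    }
    instances.push(Component({ component: el }));
  }
  return instances;
}

// Plain objects stand in for DOM elements (dataset mimics data-* attributes)
const els = [{ dataset: { component: 'keybinding' } }];
console.log(initComponents(els)[0].kind); // → 'keybinding'
```

The registry keeps `main.js` as the single entry point: adding a UI behavior means registering one component function, not adding another script tag.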
@@ -1,34 +1,80 @@
 /** This module retrieves browser context information and site data for the
  * current page, version, and product.
  */
-import { products, influxdb_urls } from '@params';
-
-const safeProducts = products || {};
-const safeUrls = influxdb_urls || {};
+import { products } from './services/influxdata-products.js';
+import { influxdbUrls } from './services/influxdb-urls.js';

 function getCurrentProductData() {
   const path = window.location.pathname;
   const mappings = [
-    { pattern: /\/influxdb\/cloud\//, product: safeProducts.cloud, urls: safeUrls.influxdb_cloud },
-    { pattern: /\/influxdb3\/core/, product: safeProducts.influxdb3_core, urls: safeUrls.core },
-    { pattern: /\/influxdb3\/enterprise/, product: safeProducts.influxdb3_enterprise, urls: safeUrls.enterprise },
-    { pattern: /\/influxdb3\/cloud-serverless/, product: safeProducts.influxdb3_cloud_serverless, urls: safeUrls.cloud },
-    { pattern: /\/influxdb3\/cloud-dedicated/, product: safeProducts.influxdb3_cloud_dedicated, urls: safeUrls.dedicated },
-    { pattern: /\/influxdb3\/clustered/, product: safeProducts.influxdb3_clustered, urls: safeUrls.clustered },
-    { pattern: /\/enterprise_v1\//, product: safeProducts.enterprise_influxdb, urls: safeUrls.oss },
-    { pattern: /\/influxdb.*v1\//, product: safeProducts.influxdb, urls: safeUrls.oss },
-    { pattern: /\/influxdb.*v2\//, product: safeProducts.influxdb, urls: safeUrls.oss },
-    { pattern: /\/kapacitor\//, product: safeProducts.kapacitor, urls: safeUrls.oss },
-    { pattern: /\/telegraf\//, product: safeProducts.telegraf, urls: safeUrls.oss },
-    { pattern: /\/chronograf\//, product: safeProducts.chronograf, urls: safeUrls.oss },
-    { pattern: /\/flux\//, product: safeProducts.flux, urls: safeUrls.oss },
+    {
+      pattern: /\/influxdb\/cloud\//,
+      product: products.cloud,
+      urls: influxdbUrls.influxdb_cloud,
+    },
+    {
+      pattern: /\/influxdb3\/core/,
+      product: products.influxdb3_core,
+      urls: influxdbUrls.core,
+    },
+    {
+      pattern: /\/influxdb3\/enterprise/,
+      product: products.influxdb3_enterprise,
+      urls: influxdbUrls.enterprise,
+    },
+    {
+      pattern: /\/influxdb3\/cloud-serverless/,
+      product: products.influxdb3_cloud_serverless,
+      urls: influxdbUrls.cloud,
+    },
+    {
+      pattern: /\/influxdb3\/cloud-dedicated/,
+      product: products.influxdb3_cloud_dedicated,
+      urls: influxdbUrls.dedicated,
+    },
+    {
+      pattern: /\/influxdb3\/clustered/,
+      product: products.influxdb3_clustered,
+      urls: influxdbUrls.clustered,
+    },
+    {
+      pattern: /\/enterprise_v1\//,
+      product: products.enterprise_influxdb,
+      urls: influxdbUrls.oss,
+    },
+    {
+      pattern: /\/influxdb.*v1\//,
+      product: products.influxdb,
+      urls: influxdbUrls.oss,
+    },
+    {
+      pattern: /\/influxdb.*v2\//,
+      product: products.influxdb,
+      urls: influxdbUrls.oss,
+    },
+    {
+      pattern: /\/kapacitor\//,
+      product: products.kapacitor,
+      urls: influxdbUrls.oss,
+    },
+    {
+      pattern: /\/telegraf\//,
+      product: products.telegraf,
+      urls: influxdbUrls.oss,
+    },
+    {
+      pattern: /\/chronograf\//,
+      product: products.chronograf,
+      urls: influxdbUrls.oss,
+    },
+    { pattern: /\/flux\//, product: products.flux, urls: influxdbUrls.oss },
   ];

   for (const { pattern, product, urls } of mappings) {
     if (pattern.test(path)) {
       return {
         product: product || 'unknown',
-        urls: urls || {}
+        urls: urls || {},
       };
     }
   }

@@ -36,7 +82,8 @@ function getCurrentProductData() {
   return { product: 'other', urls: {} };
 }

-// Return the page context (cloud, serverless, oss/enterprise, dedicated, clustered, other)
+// Return the page context
+// (cloud, serverless, oss/enterprise, dedicated, clustered, other)
 function getContext() {
   if (/\/influxdb\/cloud\//.test(window.location.pathname)) {
     return 'cloud';

@@ -78,8 +125,12 @@ const context = getContext(),
   protocol = location.protocol,
   referrer = document.referrer === '' ? 'direct' : document.referrer,
   referrerHost = getReferrerHost(),
-  // TODO: Verify this still does what we want since the addition of InfluxDB 3 naming and the Core and Enterprise versions.
-  version = (/^v\d/.test(pathArr[1]) || pathArr[1]?.includes('cloud') ? pathArr[1].replace(/^v/, '') : "n/a")
+  // TODO: Verify this works since the addition of InfluxDB 3 naming
+  // and the Core and Enterprise versions.
+  version =
+    /^v\d/.test(pathArr[1]) || pathArr[1]?.includes('cloud')
+      ? pathArr[1].replace(/^v/, '')
+      : 'n/a';

 export {
   context,
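The `mappings` table above resolves the current product by walking an ordered list of path patterns and returning on the first match, so more specific patterns must precede broader ones. A minimal sketch of that lookup (an illustrative two-entry subset, not the full table):

```javascript
// First match wins: walk an ordered list of { pattern, product } mappings
// and return the product for the first pattern matching the path.
function productForPath(path, mappings) {
  for (const { pattern, product } of mappings) {
    if (pattern.test(path)) return product;
  }
  return 'other';
}

const mappings = [
  { pattern: /\/influxdb3\/core/, product: 'influxdb3_core' },
  { pattern: /\/influxdb.*v2\//, product: 'influxdb' },
];

console.log(productForPath('/influxdb3/core/get-started/', mappings));
// → 'influxdb3_core'
console.log(productForPath('/docs/other/', mappings)); // → 'other'
```

Because loose patterns like `/\/influxdb.*v1\//` would also match many product paths, keeping the array ordered from most to least specific is what makes the early return correct.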
@@ -4,34 +4,24 @@
  * This script is used to generate a table of contents for the
  * release notes pages.
  */

-// Use jQuery filter to get an array of all the *release* h2 elements
-const releases = $('h2').filter(
-  (_i, el) => !el.id.match(/checkpoint-releases/)
+export default function ReleaseToc({ component }) {
+  // Get all h2 elements that are not checkpoint-releases
+  const releases = Array.from(document.querySelectorAll('h2')).filter(
+    (el) => !el.id.match(/checkpoint-releases/)
   );

   // Extract data about each release from the array of releases
-releaseData = releases.map((_i, el) => ({
+  const releaseData = releases.map((el) => ({
     name: el.textContent,
     id: el.id,
     class: el.getAttribute('class'),
-  date: el.getAttribute('date')
+    date: el.getAttribute('date'),
   }));

-// Use release data to generate a list item for each release
-getReleaseItem = (releaseData) => {
-  var li = document.createElement("li");
-  if (releaseData.class !== null) {
-    li.className = releaseData.class;
-  }
-  li.innerHTML = `<a href="#${releaseData.id}">${releaseData.name}</a>`;
-  li.setAttribute('date', releaseData.date);
-  return li;
-}
-
-// Use jQuery each to build the release table of contents
-releaseData.each((_i, release) => {
-  $('#release-toc ul')[0].appendChild(getReleaseItem(release));
+  // Build the release table of contents
+  const releaseTocUl = component.querySelector('#release-toc ul');
+  releaseData.forEach((release) => {
+    releaseTocUl.appendChild(getReleaseItem(release));
   });

   /*

@@ -39,20 +29,43 @@ releaseData.each((_i, release) => {
   * number specified in the `show` attribute of `ul.release-list`.
   * Once all the release items are visible, the "Show More" button is hidden.
   */
-$('#release-toc .show-more').click(function () {
+  const showMoreBtn = component.querySelector('.show-more');
+  if (showMoreBtn) {
+    showMoreBtn.addEventListener('click', function () {
       const itemHeight = 1.885; // Item height in rem
       const releaseNum = releaseData.length;
       const maxHeight = releaseNum * itemHeight;
-  const releaseIncrement = Number($('#release-list')[0].getAttribute('show'));
-  const currentHeight = Number(
-    $('#release-list')[0].style.height.match(/\d+\.?\d+/)[0]
-  );
+      const releaseList = document.getElementById('release-list');
+      const releaseIncrement = Number(releaseList.getAttribute('show'));
+      const currentHeightMatch = releaseList.style.height.match(/\d+\.?\d+/);
+      const currentHeight = currentHeightMatch
+        ? Number(currentHeightMatch[0])
+        : 0;
       const potentialHeight = currentHeight + releaseIncrement * itemHeight;
-  const newHeight = potentialHeight > maxHeight ? maxHeight : potentialHeight;
+      const newHeight =
+        potentialHeight > maxHeight ? maxHeight : potentialHeight;

-  $('#release-list')[0].style.height = `${newHeight}rem`;
+      releaseList.style.height = `${newHeight}rem`;

       if (newHeight >= maxHeight) {
-    $('#release-toc .show-more').fadeOut(100);
+        // Simple fade out
+        showMoreBtn.style.transition = 'opacity 0.1s';
+        showMoreBtn.style.opacity = 0;
+        setTimeout(() => {
+          showMoreBtn.style.display = 'none';
+        }, 100);
       }
     });
+  }
+}
+
+// Use release data to generate a list item for each release
+function getReleaseItem(releaseData) {
+  const li = document.createElement('li');
+  if (releaseData.class !== null) {
+    li.className = releaseData.class;
+  }
+  li.innerHTML = `<a href="#${releaseData.id}">${releaseData.name}</a>`;
+  li.setAttribute('date', releaseData.date);
+  return li;
+}
@@ -1,10 +0,0 @@
-// Fade content wrapper when focusing on search input
-$('#algolia-search-input').focus(function() {
-  $('.content-wrapper').fadeTo(300, .35);
-})
-
-// Hide search dropdown when leaving search input
-$('#algolia-search-input').blur(function() {
-  $('.content-wrapper').fadeTo(200, 1);
-  $('.ds-dropdown-menu').hide();
-})
@@ -0,0 +1,3 @@
+import { products as productsParam } from '@params';
+
+export const products = productsParam || {};
@@ -0,0 +1,3 @@
+import { influxdb_urls as influxdbUrlsParam } from '@params';
+
+export const influxdbUrls = influxdbUrlsParam || {};
@ -10,7 +10,8 @@
- messages: Messages (data/notifications.yaml) that have been seen (array)
- callouts: Feature callouts that have been seen (array)
*/
import * as pageParams from '@params';
import { influxdbUrls } from './influxdb-urls.js';

// Prefix for all InfluxData docs local storage
const storagePrefix = 'influxdata_docs_';

@ -82,14 +83,12 @@ function getPreferences() {
//////////// MANAGE INFLUXDATA DOCS URLS IN LOCAL STORAGE //////////////////////
////////////////////////////////////////////////////////////////////////////////

const defaultUrls = {};
// Guard against pageParams being null/undefined and safely access nested properties
if (pageParams && pageParams.influxdb_urls) {
  Object.entries(pageParams.influxdb_urls).forEach(([product, {providers}]) => {
    defaultUrls[product] = providers.filter(provider => provider.name === 'Default')[0]?.regions[0]?.url;
  Object.entries(influxdbUrls).forEach(([product, { providers }]) => {
    defaultUrls[product] =
      providers.filter((provider) => provider.name === 'Default')[0]?.regions[0]
        ?.url || 'https://cloud2.influxdata.com';
  });
}

export const DEFAULT_STORAGE_URLS = {
  oss: defaultUrls.oss,
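The "Default provider" lookup in the hunk above can be sketched as a standalone function, assuming the provider/region shape shown in the diff; `defaultUrlFor` and the sample `providers` array below are hypothetical names used only for illustration:

```javascript
// Hypothetical helper mirroring the defaultUrls lookup above: pick the
// "Default" provider's first region URL, falling back to the Cloud URL.
function defaultUrlFor(providers, fallback = 'https://cloud2.influxdata.com') {
  return (
    providers.filter((provider) => provider.name === 'Default')[0]?.regions[0]
      ?.url || fallback
  );
}

const sampleProviders = [
  { name: 'AWS', regions: [{ url: 'https://us-east-1.aws.example.test' }] },
  { name: 'Default', regions: [{ url: 'https://example-default.test' }] },
];

console.log(defaultUrlFor(sampleProviders)); // → https://example-default.test
console.log(defaultUrlFor([])); // → https://cloud2.influxdata.com
```

The optional chaining (`?.`) is what makes the guard against missing providers or regions safe without the explicit `if (pageParams && ...)` check the old code needed.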
@ -177,7 +176,10 @@ const defaultNotificationsObj = {
function getNotifications() {
  // Initialize notifications data if it doesn't already exist
  if (localStorage.getItem(notificationStorageKey) === null) {
    initializeStorageItem('notifications', JSON.stringify(defaultNotificationsObj));
    initializeStorageItem(
      'notifications',
      JSON.stringify(defaultNotificationsObj)
    );
  }

  // Retrieve and parse the notifications data as JSON

@ -221,7 +223,10 @@ function setNotificationAsRead(notificationID, notificationType) {
  readNotifications.push(notificationID);
  notificationsObj[notificationType + 's'] = readNotifications;

  localStorage.setItem(notificationStorageKey, JSON.stringify(notificationsObj));
  localStorage.setItem(
    notificationStorageKey,
    JSON.stringify(notificationsObj)
  );
}

// Export functions as a module and make the file backwards compatible for non-module environments until all remaining dependent scripts are ported to modules

@ -3,7 +3,7 @@
http://www.thesitewizard.com/javascripts/change-style-sheets.shtml
*/

import * as localStorage from './local-storage.js';
import * as localStorage from './services/local-storage.js';

// *** TO BE CUSTOMISED ***
var sidebar_state_preference_name = 'sidebar_state';
@ -1,20 +1,21 @@
import Theme from './theme.js';

export default function ThemeSwitch({ component }) {
  if ( component == undefined) {
  if (component === undefined) {
    component = document;
  }
  component.querySelectorAll(`.theme-switch-light`).forEach((button) => {

  component.querySelectorAll('.theme-switch-light').forEach((button) => {
    button.addEventListener('click', function (event) {
      event.preventDefault();
      Theme({ style: 'light-theme' });
      Theme({ component, style: 'light-theme' });
    });
  });

  component.querySelectorAll(`.theme-switch-dark`).forEach((button) => {
  component.querySelectorAll('.theme-switch-dark').forEach((button) => {
    button.addEventListener('click', function (event) {
      event.preventDefault();
      Theme({ style: 'dark-theme' });
      Theme({ component, style: 'dark-theme' });
    });
  });
}

@ -1,4 +1,4 @@
import { getPreference, setPreference } from './local-storage.js';
import { getPreference, setPreference } from './services/local-storage.js';

const PROPS = {
  style_preference_name: 'theme',

@ -12,13 +12,16 @@ function getPreferredTheme () {

function switchStyle({ styles_element, css_title }) {
  // Disable all other theme stylesheets
  styles_element.querySelectorAll('link[rel*="stylesheet"][title*="theme"]')
  styles_element
    .querySelectorAll('link[rel*="stylesheet"][title*="theme"]')
    .forEach(function (link) {
      link.disabled = true;
    });

  // Enable the stylesheet with the specified title
  const link = styles_element.querySelector(`link[rel*="stylesheet"][title="${css_title}"]`);
  const link = styles_element.querySelector(
    `link[rel*="stylesheet"][title="${css_title}"]`
  );
  link && (link.disabled = false);

  setPreference(PROPS.style_preference_name, css_title.replace(/-theme/, ''));

@ -38,5 +41,4 @@ export default function Theme({ component, style }) {
  if (component.dataset?.themeCallback === 'setVisibility') {
    setVisibility(component);
  }

}
@ -0,0 +1,38 @@
/**
 * Helper functions for debugging without source maps
 * Example usage:
 * In your code, you can use these functions like this:
 * ```javascript
 * import { debugLog, debugBreak, debugInspect } from './debug-helpers.js';
 *
 * const data = debugInspect(someData, 'Data');
 * debugLog('Processing data', 'myFunction');
 *
 * function processData() {
 *   // Add a breakpoint that works with DevTools
 *   debugBreak();
 *
 *   // Your existing code...
 * }
 * ```
 *
 * @fileoverview DEVELOPMENT USE ONLY - Functions should not be committed to production
 */

/* eslint-disable no-debugger */
/* eslint-disable-next-line */
// NOTE: These functions are detected by ESLint rules to prevent committing debug code

export function debugLog(message, context = '') {
  const contextStr = context ? `[${context}]` : '';
  console.log(`DEBUG${contextStr}: ${message}`);
}

export function debugBreak() {
  debugger;
}

export function debugInspect(value, label = 'Inspect') {
  console.log(`DEBUG[${label}]:`, value);
  return value;
}
@ -0,0 +1,107 @@
/**
 * Manages search interactions for DocSearch integration
 * Uses MutationObserver to watch for dropdown creation
 */
export default function SearchInteractions({ searchInput }) {
  const contentWrapper = document.querySelector('.content-wrapper');
  let observer = null;
  let dropdownObserver = null;
  let dropdownMenu = null;
  const debug = false; // Set to true for debugging logs

  // Fade content wrapper when focusing on search input
  function handleFocus() {
    contentWrapper.style.opacity = '0.35';
    contentWrapper.style.transition = 'opacity 300ms';
  }

  // Hide search dropdown when leaving search input
  function handleBlur(event) {
    // Only process blur if not clicking within dropdown
    const relatedTarget = event.relatedTarget;
    if (
      relatedTarget &&
      (relatedTarget.closest('.algolia-autocomplete') ||
        relatedTarget.closest('.ds-dropdown-menu'))
    ) {
      return;
    }

    contentWrapper.style.opacity = '1';
    contentWrapper.style.transition = 'opacity 200ms';

    // Hide dropdown if it exists
    if (dropdownMenu) {
      dropdownMenu.style.display = 'none';
    }
  }

  // Add event listeners
  searchInput.addEventListener('focus', handleFocus);
  searchInput.addEventListener('blur', handleBlur);

  // Use MutationObserver to detect when dropdown is added to the DOM
  observer = new MutationObserver((mutations) => {
    for (const mutation of mutations) {
      if (mutation.type === 'childList') {
        const newDropdown = document.querySelector(
          '.ds-dropdown-menu:not([data-monitored])'
        );
        if (newDropdown) {
          // Save reference to dropdown
          dropdownMenu = newDropdown;
          newDropdown.setAttribute('data-monitored', 'true');

          // Monitor dropdown removal/display changes
          dropdownObserver = new MutationObserver((dropdownMutations) => {
            for (const dropdownMutation of dropdownMutations) {
              if (debug) {
                if (
                  dropdownMutation.type === 'attributes' &&
                  dropdownMutation.attributeName === 'style'
                ) {
                  console.log(
                    'Dropdown style changed:',
                    dropdownMenu.style.display
                  );
                }
              }
            }
          });

          // Observe changes to dropdown attributes (like style)
          dropdownObserver.observe(dropdownMenu, {
            attributes: true,
            attributeFilter: ['style'],
          });

          // Add event listeners to keep dropdown open when interacted with
          dropdownMenu.addEventListener('mousedown', (e) => {
            // Prevent blur on searchInput when clicking in dropdown
            e.preventDefault();
          });
        }
      }
    }
  });

  // Start observing the document body for dropdown creation
  observer.observe(document.body, {
    childList: true,
    subtree: true,
  });

  // Return cleanup function
  return function cleanup() {
    searchInput.removeEventListener('focus', handleFocus);
    searchInput.removeEventListener('blur', handleBlur);

    if (observer) {
      observer.disconnect();
    }

    if (dropdownObserver) {
      dropdownObserver.disconnect();
    }
  };
}
@ -0,0 +1,35 @@
/**
 * Platform detection utility functions
 * Provides methods for detecting user's operating system
 */

/**
 * Detects user's operating system using modern techniques
 * Falls back to userAgent parsing when newer APIs aren't available
 * @returns {string} Operating system identifier ("osx", "win", "linux", or "other")
 */
export function getPlatform() {
  // Try to use modern User-Agent Client Hints API first (Chrome 89+, Edge 89+)
  if (navigator.userAgentData && navigator.userAgentData.platform) {
    const platform = navigator.userAgentData.platform.toLowerCase();

    if (platform.includes('mac')) return 'osx';
    if (platform.includes('win')) return 'win';
    if (platform.includes('linux')) return 'linux';
  }

  // Fall back to userAgent string parsing
  const userAgent = navigator.userAgent.toLowerCase();

  if (
    userAgent.includes('mac') ||
    userAgent.includes('iphone') ||
    userAgent.includes('ipad')
  )
    return 'osx';
  if (userAgent.includes('win')) return 'win';
  if (userAgent.includes('linux') || userAgent.includes('android'))
    return 'linux';

  return 'other';
}
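The userAgent fallback in `getPlatform` can be exercised outside a browser by factoring it into a pure function that takes the UA string as a parameter; `platformFromUserAgent` below is a hypothetical name for this sketch, not part of the commit's module:

```javascript
// Hypothetical pure-function sketch of the userAgent fallback branch above.
// Takes the UA string as input so it can run without a browser `navigator`.
function platformFromUserAgent(ua) {
  const userAgent = ua.toLowerCase();
  if (
    userAgent.includes('mac') ||
    userAgent.includes('iphone') ||
    userAgent.includes('ipad')
  )
    return 'osx';
  if (userAgent.includes('win')) return 'win';
  if (userAgent.includes('linux') || userAgent.includes('android'))
    return 'linux';
  return 'other';
}

console.log(platformFromUserAgent('Mozilla/5.0 (Windows NT 10.0; Win64; x64)')); // → win
console.log(platformFromUserAgent('Mozilla/5.0 (X11; Linux x86_64)')); // → linux
```

Note the check order matters: the `mac`/`iphone`/`ipad` branch runs first, so an iPadOS UA that also mentions another token still maps to `osx`.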
@ -1,6 +1,14 @@
import { CLOUD_URLS } from './influxdb-url.js';
import * as localStorage from './local-storage.js';
import { context, host, hostname, path, protocol, referrer, referrerHost } from './page-context.js';
import * as localStorage from './services/local-storage.js';
import {
  context,
  host,
  hostname,
  path,
  protocol,
  referrer,
  referrerHost,
} from './page-context.js';

/**
 * Builds a referrer whitelist array that includes the current page host and all

@ -69,8 +77,6 @@ function setWayfindingInputState() {
}

function submitWayfindingData(engine, action) {

  // Build lp using page data and engine data
  const lp = `ioxwayfinding,host=${hostname},path=${path},referrer=${referrer},engine=${engine} action="${action}"`;

@ -81,10 +87,7 @@ function submitWayfindingData(engine, action) {
    'https://j32dswat7l.execute-api.us-east-1.amazonaws.com/prod/wayfinding'
  );
  xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
  xhr.setRequestHeader(
    'Access-Control-Allow-Origin',
    `${protocol}//${host}`
  );
  xhr.setRequestHeader('Access-Control-Allow-Origin', `${protocol}//${host}`);
  xhr.setRequestHeader('Content-Type', 'text/plain; charset=utf-8');
  xhr.setRequestHeader('Accept', 'application/json');
  xhr.send(lp);
@ -1,19 +1,21 @@
export default function ProductSelector({ component }) {
  // Select the product dropdown and dropdown items
  const productDropdown = document.querySelector("#product-dropdown");
  const dropdownItems = document.querySelector("#dropdown-items");
  const productDropdown = component.querySelector('#product-dropdown');
  const dropdownItems = component.querySelector('#dropdown-items');

  // Expand the menu on click
  if (productDropdown) {
    productDropdown.addEventListener("click", function() {
      productDropdown.classList.toggle("open");
      dropdownItems.classList.toggle("open");
    productDropdown.addEventListener('click', function () {
      productDropdown.classList.toggle('open');
      dropdownItems.classList.toggle('open');
    });
  }

  // Close the dropdown by clicking anywhere else
  document.addEventListener("click", function(e) {
  document.addEventListener('click', function (e) {
    // Check if the click was outside of the '.product-list' container
    if (!e.target.closest('.product-list')) {
      dropdownItems.classList.remove("open");
      dropdownItems.classList.remove('open');
    }
  });
}
@ -0,0 +1,18 @@
/*
  Datetime Components
  ----------------------------------------------
*/

.current-timestamp,
.current-date,
.current-time,
.enterprise-eol-date {
  color: $current-timestamp-color;
  display: inline-block;
  font-family: $proxima;
  white-space: nowrap;
}

.nowrap {
  white-space: nowrap;
}
@ -105,7 +105,7 @@
.product {
  padding: 0 1rem;
  display: flex;
  flex: 1 1 50%;
  flex: 1 1 33%;
  flex-direction: column;
  justify-content: space-between;
  max-width: 33%;

@ -118,11 +118,10 @@
      line-height: 1.5rem;
      color: rgba($article-text, .7);
    }
  }

  &.new {
    .product-info h3::after {
      content: "New";
  h3[state] {
    &::after {
      content: attr(state);
      margin-left: .5rem;
      font-size: 1rem;
      padding: .25em .5em .25em .4em;

@ -132,6 +131,8 @@
      font-style: italic;
      vertical-align: middle;
    }
  }
}

ul.product-links {
@ -227,6 +228,30 @@
  background: $article-bg;
}

.categories {
  display: flex;
  flex-direction: row;
  flex-wrap: wrap;
  // margin: 0 -1rem;
  width: calc(100% + 2rem);

  .category {
    &.full-width {
      width: 100%;
    }
    &.two-thirds {
      width: 66.66%;
      .product { max-width: 50%; }
    }
    &.one-third {
      width: 33.33%;
      .product {
        max-width: 100%;
      }
    }
  }
}

.category-head{
  margin: 1rem 0 2rem;
  &::after {

@ -234,6 +259,7 @@
    display: block;
    border-top: 1px solid $article-hr;
    margin-top: -1.15rem;
    width: calc(100% - 2rem);
  }
}
@ -441,6 +467,16 @@
    ul {margin-bottom: 0;}
  }
}
.categories .category {
  &.two-thirds {
    width: 100%;
    .product { max-width: 100%; }
  }
  &.one-third {
    width: 100%;
    .product { max-width: 100%; }
  }
}
}
#telegraf {
  flex-direction: column;
@ -96,4 +96,5 @@ blockquote {
  "blocks/tip",
  "blocks/important",
  "blocks/warning",
  "blocks/caution";
  "blocks/caution",
  "blocks/beta";

@ -16,6 +16,10 @@
  background: $article-code-bg !important;
  font-size: .85em;
  font-weight: $medium;

  p {
    background: $article-bg !important;
  }
}

.node {
@ -0,0 +1,105 @@
.block.beta {
  @include gradient($grad-burningDusk);
  padding: 4px;
  border: none;
  border-radius: 25px !important;

  .beta-content {
    background: $article-bg;
    border-radius: 21px;
    padding: calc(1.65rem - 4px) calc(2rem - 4px) calc(.1rem + 4px) calc(2rem - 4px);

    h4 {
      color: $article-heading;
    }

    p {margin-bottom: 1rem;}

    .expand-wrapper {
      border: none;
      margin: .5rem 0 1.5rem;
    }
    .expand {
      border: none;
      padding: 0;

      .expand-content p {
        margin-left: 2rem;
      }

      ul {
        margin-top: -1rem;

        &.feedback-channels {
          padding: 0;
          margin: -1rem 0 1.5rem 2rem;
          list-style: none;

          a {
            color: $article-heading;
            font-weight: $medium;
            position: relative;

            &.discord:before {
              content: url('/svgs/discord.svg');
              display: inline-block;
              height: 1.1rem;
              width: 1.25rem;
              vertical-align: top;
              margin: 2px .65rem 0 0;
            }

            &.community:before {
              content: "\e900";
              color: $article-heading;
              margin: 0 .65rem 0 0;
              font-size: 1.2rem;
              font-family: 'icomoon-v2';
              vertical-align: middle;
            }

            &.slack:before {
              content: url('/svgs/slack.svg');
              display: inline-block;
              height: 1.1rem;
              width: 1.1rem;
              vertical-align: text-top;
              margin-right: .65rem;
            }

            &.reddit:before {
              content: url('/svgs/reddit.svg');
              display: inline-block;
              height: 1.1rem;
              width: 1.2rem;
              vertical-align: top;
              margin: 2px .65rem 0 0;
            }

            &::after {
              content: "\e90a";
              font-family: 'icomoon-v4';
              font-weight: bold;
              font-size: 1.3rem;
              display: inline-block;
              position: absolute;
              @include gradient($grad-burningDusk);
              background-clip: text;
              -webkit-text-fill-color: transparent;
              right: 0;
              transform: translateX(.25rem);
              opacity: 0;
              transition: transform .2s, opacity .2s;
            }

            &:hover {
              &::after {transform: translateX(1.5rem); opacity: 1;}
            }
          }
        }
      }
    }
  }
}
@ -23,6 +23,7 @@
  "layouts/syntax-highlighting",
  "layouts/algolia-search-overrides",
  "layouts/landing",
  "layouts/datetime",
  "layouts/error-page",
  "layouts/footer-widgets",
  "layouts/modals",

@ -203,6 +203,12 @@ $article-btn-text-hover: $g20-white;
$article-nav-icon-bg: $g5-pepper;
$article-nav-acct-bg: $g3-castle;

// Datetime shortcode colors
$current-timestamp-color: $g15-platinum;
$current-date-color: $g15-platinum;
$current-time-color: $g15-platinum;
$enterprise-eol-date-color: $g15-platinum;

// Error Page Colors
$error-page-btn: $b-pool;
$error-page-btn-text: $g20-white;

@ -203,6 +203,12 @@ $article-btn-text-hover: $g20-white !default;
$article-nav-icon-bg: $g6-smoke !default;
$article-nav-acct-bg: $g5-pepper !default;

// Datetime Colors
$current-timestamp-color: $article-text !default;
$current-date-color: $article-text !default;
$current-time-color: $article-text !default;
$enterprise-eol-date-color: $article-text !default;

// Error Page Colors
$error-page-btn: $b-pool !default;
$error-page-btn-text: $g20-white !default;
@ -1,2 +0,0 @@
import:
  - hugo.yml

@ -1,4 +1,4 @@
baseURL: 'https://docs.influxdata.com/'
baseURL: https://docs.influxdata.com/
languageCode: en-us
title: InfluxDB Documentation
@ -49,21 +49,52 @@ privacy:
  youtube:
    disable: false
    privacyEnhanced: true

outputFormats:
  json:
    mediaType: application/json
    baseName: pages
    isPlainText: true

# Asset processing configuration for development
build:
  # Ensure Hugo correctly processes JavaScript modules
  jsConfig:
    nodeEnv: "development"
  # Development asset processing
  writeStats: false
  useResourceCacheWhen: "fallback"
  noJSConfigInAssets: false

# Asset processing configuration
assetDir: "assets"

module:
  mounts:
    - source: assets
      target: assets
    - source: node_modules
      target: assets/node_modules

# Environment parameters
params:
  env: development
  environment: development

# Configure the server for development
server:
  port: 1313
  baseURL: 'http://localhost:1313/'
  watchChanges: true
  disableLiveReload: false

# Ignore specific warning logs
ignoreLogs:
  - warning-goldmark-raw-html

# Disable minification for development
minify:
  disableJS: true
  disableCSS: true
  disableHTML: true
  minifyOutput: false
@ -0,0 +1,40 @@
# Production overrides for CI/CD builds
baseURL: 'https://docs.influxdata.com/'

# Production environment parameters
params:
  env: production
  environment: production

# Enable minification for production
minify:
  disableJS: false
  disableCSS: false
  disableHTML: false
  minifyOutput: true

# Production asset processing
build:
  writeStats: false
  useResourceCacheWhen: "fallback"
  buildOptions:
    sourcemap: false
    target: "es2015"

# Asset processing configuration
assetDir: "assets"

# Mount assets for production
module:
  mounts:
    - source: assets
      target: assets
    - source: node_modules
      target: assets/node_modules

# Disable development server settings
server: {}

# Suppress the warning mentioned in the error
ignoreLogs:
  - 'warning-goldmark-raw-html'
@ -0,0 +1,17 @@
build:
  writeStats: false
  useResourceCacheWhen: "fallback"
  buildOptions:
    sourcemap: false
    target: "es2015"
minify:
  disableJS: false
  disableCSS: false
  disableHTML: false
  minifyOutput: true
params:
  env: production
  environment: production
server: {
  disableLiveReload: true
}

@ -0,0 +1,18 @@
baseURL: https://test2.docs.influxdata.com/
build:
  writeStats: false
  useResourceCacheWhen: "fallback"
  buildOptions:
    sourcemap: false
    target: "es2015"
minify:
  disableJS: false
  disableCSS: false
  disableHTML: false
  minifyOutput: true
params:
  env: staging
  environment: staging
server:
  disableLiveReload: true
@ -1,20 +0,0 @@
baseURL: 'http://localhost:1315/'

server:
  port: 1315

# Override settings for testing
buildFuture: true

# Configure what content is built in testing env
params:
  environment: testing
  buildTestContent: true

# Keep your shared content exclusions
ignoreFiles:
  - "content/shared/.*"

# Ignore specific warning logs
ignoreLogs:
  - warning-goldmark-raw-html
@ -22,7 +22,7 @@ We recommend the following design guidelines for most use cases:

Your queries should guide what data you store in [tags](/enterprise_influxdb/v1/concepts/glossary/#tag) and what you store in [fields](/enterprise_influxdb/v1/concepts/glossary/#field):

- Store commonly queried and grouping ([`group()`](/flux/v0.x/stdlib/universe/group) or [`GROUP BY`](/enterprise_influxdb/v1/query_language/explore-data/#group-by-tags)) metadata in tags.
- Store commonly queried and grouping ([`group()`](/flux/v0/stdlib/universe/group) or [`GROUP BY`](/enterprise_influxdb/v1/query_language/explore-data/#group-by-tags)) metadata in tags.
- Store data in fields if each data point contains a different value.
- Store numeric values as fields ([tag values](/enterprise_influxdb/v1/concepts/glossary/#tag-value) only support string values).
@ -1267,3 +1267,106 @@ This is small tab 2.4 content.
{{% /tab-content %}}

{{< /tabs-wrapper >}}

## Group key demo

Used to demonstrate Flux group keys

{{< tabs-wrapper >}}
{{% tabs "small" %}}
[Input](#)
[Output](#)
<span class="tab-view-output">Click to view output</span>
{{% /tabs %}}
{{% tab-content %}}

The following data is output from the last `filter()` and piped forward into `group()`:

> [!Note]
> `_start` and `_stop` columns have been omitted.

{{% flux/group-key "[_measurement=home, room=Kitchen, _field=hum]" true %}}

| _time                | _measurement | room        | _field | _value |
| :------------------- | :----------- | :---------- | :----- | :----- |
| 2022-01-01T08:00:00Z | home         | Kitchen     | hum    | 35.9   |
| 2022-01-01T09:00:00Z | home         | Kitchen     | hum    | 36.2   |
| 2022-01-01T10:00:00Z | home         | Kitchen     | hum    | 36.1   |

{{% flux/group-key "[_measurement=home, room=Living Room, _field=hum]" true %}}

| _time                | _measurement | room        | _field | _value |
| :------------------- | :----------- | :---------- | :----- | :----- |
| 2022-01-01T08:00:00Z | home         | Living Room | hum    | 35.9   |
| 2022-01-01T09:00:00Z | home         | Living Room | hum    | 35.9   |
| 2022-01-01T10:00:00Z | home         | Living Room | hum    | 36     |

{{% flux/group-key "[_measurement=home, room=Kitchen, _field=temp]" true %}}

| _time                | _measurement | room        | _field | _value |
| :------------------- | :----------- | :---------- | :----- | :----- |
| 2022-01-01T08:00:00Z | home         | Kitchen     | temp   | 21     |
| 2022-01-01T09:00:00Z | home         | Kitchen     | temp   | 23     |
| 2022-01-01T10:00:00Z | home         | Kitchen     | temp   | 22.7   |

{{% flux/group-key "[_measurement=home, room=Living Room, _field=temp]" true %}}

| _time                | _measurement | room        | _field | _value |
| :------------------- | :----------- | :---------- | :----- | :----- |
| 2022-01-01T08:00:00Z | home         | Living Room | temp   | 21.1   |
| 2022-01-01T09:00:00Z | home         | Living Room | temp   | 21.4   |
| 2022-01-01T10:00:00Z | home         | Living Room | temp   | 21.8   |

{{% /tab-content %}}
{{% tab-content %}}

When grouped by `_field`, all rows with the `temp` field will be in one table
and all the rows with the `hum` field will be in another.
`_measurement` and `room` columns no longer affect how rows are grouped.

{{% note %}}
`_start` and `_stop` columns have been omitted.
{{% /note %}}

{{% flux/group-key "[_field=hum]" true %}}

| _time                | _measurement | room        | _field | _value |
| :------------------- | :----------- | :---------- | :----- | :----- |
| 2022-01-01T08:00:00Z | home         | Kitchen     | hum    | 35.9   |
| 2022-01-01T09:00:00Z | home         | Kitchen     | hum    | 36.2   |
| 2022-01-01T10:00:00Z | home         | Kitchen     | hum    | 36.1   |
| 2022-01-01T08:00:00Z | home         | Living Room | hum    | 35.9   |
| 2022-01-01T09:00:00Z | home         | Living Room | hum    | 35.9   |
| 2022-01-01T10:00:00Z | home         | Living Room | hum    | 36     |

{{% flux/group-key "[_field=temp]" true %}}

| _time                | _measurement | room        | _field | _value |
| :------------------- | :----------- | :---------- | :----- | :----- |
| 2022-01-01T08:00:00Z | home         | Kitchen     | temp   | 21     |
| 2022-01-01T09:00:00Z | home         | Kitchen     | temp   | 23     |
| 2022-01-01T10:00:00Z | home         | Kitchen     | temp   | 22.7   |
| 2022-01-01T08:00:00Z | home         | Living Room | temp   | 21.1   |
| 2022-01-01T09:00:00Z | home         | Living Room | temp   | 21.4   |
| 2022-01-01T10:00:00Z | home         | Living Room | temp   | 21.8   |

{{% /tab-content %}}
{{< /tabs-wrapper >}}

## datetime/current-timestamp shortcode

### Default usage

{{< datetime/current-timestamp >}}

### Format YYYY-MM-DD HH:mm:ss

{{< datetime/current-timestamp format="YYYY-MM-DD HH:mm:ss" >}}

### Format with UTC timezone

{{< datetime/current-timestamp format="YYYY-MM-DD HH:mm:ss" timezone="UTC" >}}

### Format with America/New_York timezone

{{< datetime/current-timestamp format="YYYY-MM-DD HH:mm:ss" timezone="America/New_York" >}}
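The group-key behavior demonstrated above (partitioning rows into tables by the values of the group key columns) can be sketched outside Flux; `groupBy` below is a hypothetical JavaScript illustration, not an InfluxDB API:

```javascript
// Hypothetical sketch: partition rows into tables keyed by group-key column values,
// mirroring how `group(columns: [...])` regroups rows in the demo above.
function groupBy(rows, keyColumns) {
  const tables = new Map();
  for (const row of rows) {
    const key = keyColumns.map((col) => `${col}=${row[col]}`).join(', ');
    if (!tables.has(key)) tables.set(key, []);
    tables.get(key).push(row);
  }
  return tables;
}

const rows = [
  { _measurement: 'home', room: 'Kitchen', _field: 'hum', _value: 35.9 },
  { _measurement: 'home', room: 'Living Room', _field: 'hum', _value: 35.9 },
  { _measurement: 'home', room: 'Kitchen', _field: 'temp', _value: 21 },
  { _measurement: 'home', room: 'Living Room', _field: 'temp', _value: 21.1 },
];

// Grouping by _field alone yields two tables; room and _measurement no longer
// affect the grouping, matching the "Output" tab above.
console.log([...groupBy(rows, ['_field']).keys()]); // → [ '_field=hum', '_field=temp' ]
```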
@ -433,7 +433,7 @@ representative of the Flux SPEC.
  details.
- Add tagging support to Flux tests.
- Add new function [`experimental.catch()`](/flux/v0/stdlib/experimental/catch/).
- Add new function [`testing.shouldError()`](/flux/v0.x/stdlib/testing/shoulderror/).
- Add new function [`testing.shouldError()`](/flux/v0/stdlib/testing/shoulderror/).

### Bug fixes

@ -12,8 +12,8 @@ menu:
  parent: Account management
name: View data usage
related:
  - /flux/v0.x/stdlib/experimental/usage/from/
  - /flux/v0.x/stdlib/experimental/usage/limits/
  - /flux/v0/stdlib/experimental/usage/from/
  - /flux/v0/stdlib/experimental/usage/limits/
alt_links:
  cloud-serverless: /influxdb3/cloud-serverless/admin/billing/data-usage/
---
@ -9,8 +9,8 @@ menu:
  parent: Account management
name: Adjustable quotas and limits
related:
  - /flux/v0.x/stdlib/experimental/usage/from/
  - /flux/v0.x/stdlib/experimental/usage/limits/
  - /flux/v0/stdlib/experimental/usage/from/
  - /flux/v0/stdlib/experimental/usage/limits/
  - /influxdb/cloud/write-data/best-practices/resolve-high-cardinality/
alt_links:
  cloud-serverless: /influxdb3/cloud-serverless/admin/billing/limits/

@ -97,7 +97,7 @@ Combine delete predicate expressions (if possible) into a single request. Influx

The {{< product-name >}} UI displays a notification message when service quotas or limits are exceeded. The error messages correspond with the relevant [API error responses](#api-error-responses).

Errors can also be viewed in the [Usage page](/influxdb/cloud/account-management/data-usage/) under **Limit Events**, e.g. `event_type_limited_query`, `event_type_limited_write`,`event_type_limited_cardinality`, or `event_type_limited_delete_rate`.
Errors can also be viewed in the [Usage page](/influxdb/cloud/account-management/data-usage/) under **Limit Events**, for example: `event_type_limited_query`, `event_type_limited_write`, `event_type_limited_cardinality`, or `event_type_limited_delete_rate`.

## API error responses
@ -40,7 +40,7 @@ To replicate the state of an organization:
### Write data with Flux
Perform a query to return all specified data.
Write results directly to a bucket in the new organization with the Flux
[`to()` function](/flux/v0.x/stdlib/influxdata/influxdb/to/).
[`to()` function](/flux/v0/stdlib/influxdata/influxdb/to/).

{{% note %}}
If writes are prevented by rate limiting,

@ -25,7 +25,7 @@ types of demo data that let you explore and familiarize yourself with InfluxDB C
{{% note %}}
#### Free to use and read-only
- InfluxDB Cloud demo data buckets are **free to use** and are **_not_ subject to
  [Free Plan rate limits](influxdb/cloud/account-management/limits/#free-plan-rate-limits) rate limits**.
  [Free Plan rate limits](/influxdb/cloud/account-management/limits/#free-plan-rate-limits) rate limits**.
- Demo data buckets are **read-only**. You cannot write data into demo data buckets.
{{% /note %}}
@@ -13,7 +13,7 @@ prepend: |
 > [Use InfluxQL to query InfluxDB](/influxdb/cloud/query-data/influxql/).
 > For information about manually converting InfluxQL queries to Flux, see:
 >
-> - [Get started with Flux](/flux/v0.x/get-started/)
+> - [Get started with Flux](/flux/v0/get-started/)
 > - [Query data with Flux](/influxdb/cloud/query-data/flux/)
 > - [Migrate continuous queries to Flux tasks](/influxdb/cloud/upgrade/v1-to-cloud/migrate-cqs/)
 source: /shared/influxdb-v2/reference/cli/influx/transpile/_index.md

@@ -188,7 +188,7 @@ Now, you can add the following buckets with sample data to your notebooks:

 ### Add ability to share notebooks

-Add ability to [share a notebook](/influxdb/cloud/tools/notebooks/manage-notebooks/#share-a-notebook) in the the InfluxDB Cloud notebook UI.
+Add ability to [share a notebook](/influxdb/cloud/tools/notebooks/manage-notebooks/#share-a-notebook) in the InfluxDB Cloud notebook UI.

 ## October 2021

@@ -209,7 +209,7 @@ Refresh the look and feel of InfluxDB Cloud UI. The updated icons, fonts, and la

 ### Flux update

-Upgrade to [Flux v0.139](/flux/v0.x/release-notes/).
+Upgrade to [Flux v0.139](/flux/v0/release-notes/).

 ### Telegraf configuration UI

@@ -347,7 +347,7 @@ Install and customize any [InfluxDB community template](https://github.com/influ

 ## Features
 - **InfluxDB OSS 2.0 alpha-17** –
-  _See the [alpha-17 release notes](/influxdb/v2%2E0/reference/release-notes/influxdb/#v200-alpha17) for details._
+  _See the [alpha-17 release notes](/influxdb/v2/reference/release-notes/influxdb/#v200-alpha17) for details._
 - Alerts and Notifications to Slack (Free Plan), PagerDuty and HTTP (Usage-based Plan).
 - Rate limiting on cardinality for Free Plan.
 - Billing notifications.
@@ -359,7 +359,7 @@ Install and customize any [InfluxDB community template](https://github.com/influ
 ### Features

 - **InfluxDB OSS 2.0 alpha-15** –
-  _See the [alpha-9 release notes](/influxdb/v2%2E0/reference/release-notes/influxdb/#v200-alpha15) for details._
+  _See the [alpha-9 release notes](/influxdb/v2/reference/release-notes/influxdb/#v200-alpha15) for details._
 - Usage-based Plan.
 - Adjusted Free Plan rate limits.
 - Timezone selection in the user interface.
@@ -386,7 +386,7 @@ Install and customize any [InfluxDB community template](https://github.com/influ
 ### Features

 - **InfluxDB OSS 2.0 alpha-9** –
-  _See the [alpha-9 release notes](/influxdb/v2%2E0/reference/release-notes/influxdb/#v200-alpha9) for details._
+  _See the [alpha-9 release notes](/influxdb/v2/reference/release-notes/influxdb/#v200-alpha9) for details._

 ### Bug fixes

@@ -403,7 +403,7 @@ Install and customize any [InfluxDB community template](https://github.com/influ
 ### Features

 - **InfluxDB OSS 2.0 alpha-7** –
-  _See the [alpha-7 release notes](/influxdb/v2%2E0/reference/release-notes/influxdb/#v200-alpha7) for details._
+  _See the [alpha-7 release notes](/influxdb/v2/reference/release-notes/influxdb/#v200-alpha7) for details._

 ### Bug fixes

@@ -22,7 +22,7 @@ We recommend the following design guidelines for most use cases:

 Your queries should guide what data you store in [tags](/influxdb/v1/concepts/glossary/#tag) and what you store in [fields](/influxdb/v1/concepts/glossary/#field) :

-- Store commonly queried and grouping ([`group()`](/flux/v0.x/stdlib/universe/group) or [`GROUP BY`](/influxdb/v1/query_language/explore-data/#group-by-tags)) metadata in tags.
+- Store commonly queried and grouping ([`group()`](/flux/v0/stdlib/universe/group) or [`GROUP BY`](/influxdb/v1/query_language/explore-data/#group-by-tags)) metadata in tags.
 - Store data in fields if each data point contains a different value.
 - Store numeric values as fields ([tag values](/influxdb/v1/concepts/glossary/#tag-value) only support string values).

@@ -83,7 +83,7 @@ customSumProduct = (tables=<-) => tables

 #### Check if a statically defined record contains a key

-When you use the [record literal syntax](/flux/v0.x/data-types/composite/record/#record-syntax)
+When you use the [record literal syntax](/flux/v0/data-types/composite/record/#record-syntax)
 to statically define a record, Flux knows the record type and what keys to expect.

 - If the key exists in the static record, `exists` returns `true`.

@@ -52,7 +52,7 @@ never be removed by the retention enforcement service.
 You can customize [table (measurement) limits](#table-limit) and
 [table column limits](#column-limit) when you
 [create](#create-a-database) or
-[update a database](#update-a-database) in {{< product-name >}}.
+[update a database](#update-a-database) in {{% product-name %}}.

 ### Table limit

@@ -39,7 +39,7 @@ Each Telegraf configuration must **have at least one input plugin and one output
 Telegraf input plugins retrieve metrics from different sources.
 Telegraf output plugins write those metrics to a destination.

-Use the [`outputs.influxdb_v2`](/telegraf/v1/plugins/#output-influxdb_v2) plugin to write metrics collected by Telegraf to {{< product-name >}}.
+Use the [`outputs.influxdb_v2`](/telegraf/v1/plugins/#output-influxdb_v2) plugin to write metrics collected by Telegraf to {{% product-name %}}.

 ```toml
 # ...

@@ -115,7 +115,7 @@ For {{% product-name %}}, set this to an empty string (`""`).
 The name of the {{% product-name %}} database to write data to.

 > [!Note]
-> ##### Write to InfluxDB v1.x and {{< product-name >}}
+> ##### Write to InfluxDB v1.x and {{% product-name %}}
 >
 > If a Telegraf agent is already writing to an InfluxDB v1.x database,
 > enabling the InfluxDB v2 output plugin will write data to both v1.x and your {{< product-name omit="Clustered" >}} cluster.

@@ -9,8 +9,8 @@ menu:
     parent: Manage billing
     name: View data usage
 related:
-  - /flux/v0.x/stdlib/experimental/usage/from/
-  - /flux/v0.x/stdlib/experimental/usage/limits/
+  - /flux/v0/stdlib/experimental/usage/from/
+  - /flux/v0/stdlib/experimental/usage/limits/
 alt_links:
   cloud: /influxdb/cloud/account-management/data-usage/
 aliases:

@@ -9,8 +9,8 @@ menu:
     parent: Manage billing
     name: Adjustable quotas and limits
 related:
-  - /flux/v0.x/stdlib/experimental/usage/from/
-  - /flux/v0.x/stdlib/experimental/usage/limits/
+  - /flux/v0/stdlib/experimental/usage/from/
+  - /flux/v0/stdlib/experimental/usage/limits/
   - /influxdb3/cloud-serverless/write-data/best-practices/
 alt_links:
   cloud: /influxdb/cloud/account-management/limits/

@@ -40,7 +40,7 @@ Each Telegraf configuration must **have at least one input plugin and one output
 Telegraf input plugins retrieve metrics from different sources.
 Telegraf output plugins write those metrics to a destination.

-Use the [`outputs.influxdb_v2`](/telegraf/v1/plugins/#output-influxdb_v2) plugin to write metrics collected by Telegraf to {{< product-name >}}.
+Use the [`outputs.influxdb_v2`](/telegraf/v1/plugins/#output-influxdb_v2) plugin to write metrics collected by Telegraf to {{% product-name %}}.

 ```toml
 # ...

@@ -110,7 +110,7 @@ For {{% product-name %}}, set this to an empty string (`""`).
 The name of the {{% product-name %}} bucket to write data to.

 > [!Note]
-> ##### Write to InfluxDB v1.x and {{< product-name >}}
+> ##### Write to InfluxDB v1.x and {{% product-name %}}
 >
 > If a Telegraf agent is already writing to an InfluxDB v1.x database,
 > enabling the InfluxDB v2 output plugin will write data to both v1.x and your {{< product-name >}} bucket.

@@ -52,7 +52,7 @@ never be removed by the retention enforcement service.
 You can customize [table (measurement) limits](#table-limit) and
 [table column limits](#column-limit) when you
 [create](#create-a-database) or
-[update a database](#update-a-database) in {{< product-name >}}.
+[update a database](#update-a-database) in {{% product-name %}}.

 ### Table limit

@@ -51,7 +51,7 @@ Use the following command to return the image Kubernetes uses to build your
 InfluxDB cluster:

 ```sh
-kubectl get appinstances.kubecfg.dev influxdb -o jsonpath='{.spec.package.image}'
+kubectl get appinstances.kubecfg.dev influxdb -n influxdb -o jsonpath='{.spec.package.image}'
 ```

 The package version number is at the end of the returned string (after `influxdb:`):
@@ -66,8 +66,8 @@ us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:PACKAGE_VERSION

 ### Identify the version to upgrade to

-All available InfluxDB Clustered package versions are provided at
-[oci.influxdata.com](https://oci.influxdata.com).
+All available InfluxDB Clustered package versions are provided in the
+[InfluxDB Clustered release notes](/influxdb3/clustered/reference/release-notes/clustered/).
 Find the package version you want to upgrade to and copy the version number.


@@ -76,7 +76,7 @@ Find the package version you want to upgrade to and copy the version number.
 Some InfluxDB Clustered releases are _checkpoint releases_ that introduce a
 breaking change to an InfluxDB component.
 Checkpoint releases are only made when absolutely necessary and are clearly
-identified at [oci.influxdata.com](https://oci.influxdata.com).
+identified in the [InfluxDB Clustered release notes](/influxdb3/clustered/reference/release-notes/clustered/).

 **When upgrading, always upgrade to each checkpoint release first, before proceeding
 to newer versions.**

@@ -21,6 +21,42 @@ weight: 201
 > Checkpoint releases are only made when absolutely necessary and are clearly
 > identified below with the <span class="cf-icon Shield pink"></span> icon.

+{{< expand-wrapper >}}
+{{% expand "Download release artifacts manually" %}}
+
+To download a bundle of release artifacts for a specific version of
+InfluxDB Clustered:
+
+1. [install `crane`](https://github.com/google/go-containerregistry/tree/main/cmd/crane#installation)
+   and [`jq`](https://jqlang.org/download/).
+2. Ensure your InfluxData pull secret is in the `/tmp/influxdbsecret` directory
+   on your local machine. This secret was provided to you by InfluxData to
+   authorize the use of InfluxDB Clustered images.
+3. Run the following shell script:
+
+   {{% code-placeholders "RELEASE_VERSION" %}}
+   <!-- pytest.mark.skip -->
+   ```bash
+   INFLUXDB_RELEASE="RELEASE_VERSION"
+   IMAGE="us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:$INFLUXDB_RELEASE"
+   DOCKER_CFG="/tmp/influxdbsecret"
+
+   DIGEST=$(DOCKER_CONFIG="$DOCKER_CFG" crane manifest "$IMAGE" | jq -r '.layers[1].digest')
+
+   DOCKER_CONFIG="$DOCKER_CFG" \
+     crane blob "$IMAGE@$DIGEST" | tar -xvzf - -C ./
+   ```
+   {{% /code-placeholders %}}
+
+_Replace {{% code-placeholder-key %}}`RELEASE_VERSION`{{% /code-placeholder-key %}}
+with the InfluxDB Clustered release version you want to download artifacts for._
+
+The script creates an `influxdb-3.0-clustered` directory in the current working
+directory. This new directory contains artifacts associated with the specified release.
+
+{{% /expand %}}
+{{< /expand-wrapper >}}
+
 {{< release-toc >}}

 ---

@@ -35,6 +71,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20250508-1719206
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20250508-1719206/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20250508-1719206/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Changes

 #### Deployment

@@ -59,6 +101,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20250212-1570743
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20250212-1570743/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20250212-1570743/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Bug Fixes

 This release fixes a bug in the 20241217-1494922 release where the default

@@ -88,6 +136,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20241217-1494922
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20241217-1494922/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20241217-1494922/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Bug Fixes

 This fixes a bug present in release [20241024-1354148](#20241024-1354148), in

@@ -122,6 +176,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20241022-1346953
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20241022-1346953/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20241022-1346953/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Known Bugs

 ### `core` service DSN parsing errors

@@ -318,6 +378,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20240819-1176644
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20240819-1176644/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20240819-1176644/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### `admin` section is no longer required

@@ -397,6 +463,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20240717-1117630
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20240717-1117630/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20240717-1117630/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Experimental license enforcement

@@ -508,6 +580,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20240605-1035562
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20240605-1035562/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20240605-1035562/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 Multiple improvements to compaction, pruning, and performance of concurrent queries.

@@ -574,6 +652,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20240430-976585
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20240430-976585/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20240430-976585/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 - Added configuration settings for an optional Prometheus `ServiceMonitor`

@@ -605,6 +689,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20240418-955990
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20240418-955990/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20240418-955990/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Minimum `influxctl` version

@@ -645,6 +735,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20240325-920726
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20240325-920726/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20240325-920726/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Lower defaults for garbage collection

@@ -696,6 +792,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20240227-883344
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20240227-883344/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20240227-883344/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Changes

 #### Deployment

@@ -724,6 +826,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20240214-863513
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20240214-863513/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20240214-863513/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Grafana dashboards by default

@@ -783,6 +891,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20240111-824437
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20240111-824437/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20240111-824437/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Ingress improvements

@@ -845,6 +959,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20231213-791734
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20231213-791734/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20231213-791734/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Labels/annotations

@@ -885,6 +1005,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20231117-750011
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20231117-750011/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20231117-750011/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 > ![Important]

@@ -910,6 +1036,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20231115-746129
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20231115-746129/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20231115-746129/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Ingress templating

@@ -1022,6 +1154,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20231024-711448
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20231024-711448/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20231024-711448/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Additional `AppInstance` parameters

@@ -1083,6 +1221,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20231004-666907
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20231004-666907/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20231004-666907/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Object store custom certificates

@@ -1150,6 +1294,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20230922-650371
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20230922-650371/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20230922-650371/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Configuration simplification

@@ -1202,6 +1352,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20230915-630658
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20230915-630658/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20230915-630658/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Persistent volume fixes

@@ -1228,6 +1384,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20230914-628600
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20230914-628600/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20230914-628600/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Updated Azure AD documentation

@@ -1263,6 +1425,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20230912-619813
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20230912-619813/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20230912-619813/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Custom CA certificates {note="(Optional)"}

@@ -1333,6 +1501,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20230911-604209
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20230911-604209/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20230911-604209/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 This release contains a breaking change to the monitoring subsystem that

@@ -1382,6 +1556,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20230908-600131
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20230908-600131/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20230908-600131/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Highlights

 #### Default storage class

@@ -1409,6 +1589,12 @@ spec:
     image: us-docker.pkg.dev/influxdb2-artifacts/clustered/influxdb:20230907-597343
 ```

+#### Release artifacts
+
+- [app-instance-schema.json](/downloads/clustered-release-artifacts/20230907-597343/app-instance-schema.json)
+- [example-customer.yml](/downloads/clustered-release-artifacts/20230907-597343/example-customer.yml)
+- [InfluxDB Clustered README EULA July 2024.txt](/downloads/clustered-release-artifacts/InfluxDB%20Clustered%20README%20EULA%20July%202024.txt)
+
 ### Upgrade Notes

 This release requires a new configuration block:

@@ -39,7 +39,7 @@ Each Telegraf configuration must **have at least one input plugin and one output
 Telegraf input plugins retrieve metrics from different sources.
 Telegraf output plugins write those metrics to a destination.

-Use the [`outputs.influxdb_v2`](/telegraf/v1/plugins/#output-influxdb_v2) plugin to write metrics collected by Telegraf to {{< product-name >}}.
+Use the [`outputs.influxdb_v2`](/telegraf/v1/plugins/#output-influxdb_v2) plugin to write metrics collected by Telegraf to {{% product-name %}}.

 ```toml
 # ...

@@ -112,7 +112,7 @@ For {{% product-name %}}, set this to an empty string (`""`).
 The name of the {{% product-name %}} database to write data to.

 > [!Note]
-> ##### Write to InfluxDB v1.x and {{< product-name >}}
+> ##### Write to InfluxDB v1.x and {{% product-name %}}
 >
 > If a Telegraf agent is already writing to an InfluxDB v1.x database,
 > enabling the InfluxDB v2 output plugin will write data to both v1.x and your {{< product-name omit="Clustered" >}} cluster.

@@ -9,9 +9,10 @@ menu:
   influxdb3_core:
     name: InfluxDB 3 Core
     weight: 1
-source: /shared/v3-core-get-started/_index.md
+source: /shared/influxdb3/_index.md
 ---

 <!--
-The content of this page is at /shared/v3-core-get-started/_index.md
+The content of this page is at
+//SOURCE - content/shared/influxdb3/_index.md
 -->

@@ -16,4 +16,5 @@ source: /shared/influxdb3-admin/distinct-value-cache/_index.md
 ---

 <!-- The content for this page is located at
-// SOURCE content/shared/influxdb3-admin/distinct-value-cache/_index.md -->
+// SOURCE content/shared/influxdb3-admin/distinct-value-cache/_index.md
+-->

@@ -17,4 +17,5 @@ source: /shared/influxdb3-admin/last-value-cache/_index.md
 ---

 <!-- The content for this page is located at
-// SOURCE content/shared/influxdb3-admin/last-value-cache/_index.md -->
+// SOURCE content/shared/influxdb3-admin/last-value-cache/_index.md
+-->

@@ -1,7 +1,7 @@
 ---
 title: Manage tokens
 description: >
-  Manage tokens to authenticate and authorize access to resources and data in an {{< product-name >}} instance.
+  Manage tokens to authenticate and authorize access to server actions, resources, and data in an {{< product-name >}} instance.
 menu:
   influxdb3_core:
     parent: Administer InfluxDB

|
|||
|
||||
<!-- The content for this page is at
|
||||
// SOURCE content/shared/influxdb3-admin/tokens/_index.md
|
||||
-->>
|
||||
-->
|
|
@@ -11,9 +11,9 @@ menu:
     name: Admin tokens
 weight: 101
 influxdb3/core/tags: [tokens]
-source: /shared/influxdb3-admin/tokens/_index.md
+source: /shared/influxdb3-admin/tokens/admin/_index.md
 ---

 <!-- The content for this page is at
-// SOURCE content/shared/influxdb3-admin/tokens/_index.md
+// SOURCE content/shared/influxdb3-admin/tokens/admin/_index.md
 -->

@@ -2,7 +2,7 @@
 title: Create an admin token
 description: >
   Use the [`influxdb3 create token --admin` command](/influxdb3/core/reference/cli/influxdb3/create/token/)
-  or the [HTTP API](/influxdb3/core/api/v3/)
+  or the HTTP API [`/api/v3/configure/token/admin`](/influxdb3/core/api/v3/#operation/PostCreateAdminToken) endpoint
   to create an [admin token](/influxdb3/core/admin/tokens/admin/) for your {{< product-name omit="Clustered" >}} instance.
   An admin token grants access to all actions on the server.
 menu:

@@ -2,10 +2,9 @@
 title: Regenerate an admin token
 description: >
   Use the [`influxdb3 create token --admin` command](/influxdb3/core/reference/cli/influxdb3/create/token/)
-  or the [HTTP API](/influxdb3/core/api/v3/)
-  to regenerate an [admin token](/influxdb3/core/admin/tokens/admin/) for your {{< product-name omit="Clustered" >}} instance.
-  An admin token grants access to all actions on the server.
-  Regenerating an admin token deactivates the previous token.
+  or the HTTP API [`/api/v3/configure/token/admin/regenerate`](/influxdb3/core/api/v3/#operation/PostRegenerateAdminToken) endpoint
+  to regenerate an [operator token](/influxdb3/core/admin/tokens/admin/) for your {{< product-name omit="Clustered" >}} instance.
+  Regenerating an operator token deactivates the previous token.
 menu:
   influxdb3_core:
     parent: Admin tokens
@@ -14,8 +13,15 @@ list_code_example: |
   ##### CLI
   ```bash
   influxdb3 create token --admin \
-    --token ADMIN_TOKEN \
     --regenerate
+    OPERATOR_TOKEN
   ```
+
+  ##### HTTP API
+  ```bash
+  curl -X POST "http://{{< influxdb/host >}}/api/v3/configure/token/admin/regenerate" \
+    --header 'Authorization: Bearer OPERATOR_TOKEN' \
+    --header 'Accept: application/json' \
+    --header 'Content-Type: application/json'
+  ```
 source: /shared/influxdb3-admin/tokens/admin/regenerate.md
 ---
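The regenerate endpoint above returns the replacement token in its JSON response body. A minimal POSIX-shell sketch of pulling that value out of a saved response; the `token` field name and the sample payload are assumptions for illustration, not confirmed by this diff:

```bash
#!/bin/sh
# Hypothetical response body from POST /api/v3/configure/token/admin/regenerate
# (the "token" field name is an assumption; check the API reference)
response='{"id":0,"name":"_admin","token":"apiv3_new-operator-token"}'

# Extract the token value with sed so the sketch has no jq dependency
new_token=$(printf '%s' "$response" | sed -n 's/.*"token":"\([^"]*\)".*/\1/p')
echo "$new_token"
```

Store the extracted value securely; as the frontmatter notes, regenerating deactivates the previous token immediately.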
@@ -13,7 +13,7 @@ related:
 - /influxdb3/core/admin/query-system-data/
 - /influxdb3/core/write-data/
 - /influxdb3/core/query-data/
-source: /shared/v3-core-get-started/_index.md
+source: /shared/influxdb3-get-started/_index.md
 prepend: |
   > [!Note]
   > InfluxDB 3 Core is purpose-built for real-time data monitoring and recent data.
@@ -26,5 +26,5 @@ prepend: |

 <!--
 The content of this page is at
-// SOURCE content/shared/v3-core-get-started/_index.md
+// SOURCE content/shared/influxdb3-get-started/_index.md
 -->
@@ -12,7 +12,7 @@ alt_links:

 - [System Requirements](#system-requirements)
 - [Quick install](#quick-install)
-- [Download {{< product-name >}} binaries](#download-influxdb-3-{{< product-key >}}-binaries)
+- [Download {{% product-name %}} binaries](#download-influxdb-3-{{< product-key >}}-binaries)
 - [Docker image](#docker-image)

 ## System Requirements
@@ -79,7 +79,7 @@ source ~/.zshrc
 {{% /code-tab-content %}}
 {{< /code-tabs-wrapper >}}

-## Download {{< product-name >}} binaries
+## Download {{% product-name %}} binaries

 {{< tabs-wrapper >}}
 {{% tabs %}}
@@ -103,7 +103,7 @@ influxdb3 -h
 influxdb3 --help
 ```

-### Run the {{< product-name >}} server with extra verbose logging
+### Run the {{% product-name %}} server with extra verbose logging

 <!--pytest.mark.skip-->

@@ -114,7 +114,7 @@ influxdb3 serve -v \
   --node-id my-host-01
 ```

-### Run {{< product-name >}} with debug logging using LOG_FILTER
+### Run {{% product-name %}} with debug logging using LOG_FILTER

 <!--pytest.mark.skip-->

@@ -7,9 +7,9 @@ menu:
     parent: influxdb3 create
     name: influxdb3 create token
 weight: 400
-source: /shared/influxdb3-cli/create/token.md
+source: /shared/influxdb3-cli/create/token/_index.md
 ---

 <!-- The content for this page is at
-// SOURCE content/shared/influxdb3-cli/create/token.md
+// SOURCE content/shared/influxdb3-cli/create/token/_index.md
 -->
@@ -0,0 +1,15 @@
+---
+title: influxdb3 create token admin
+description: >
+  The `influxdb3 create token admin` command creates an operator token or named admin token with full administrative privileges for server actions.
+menu:
+  influxdb3_core:
+    parent: influxdb3 create token
+    name: influxdb3 create token admin
+weight: 400
+source: /shared/influxdb3-cli/create/token/admin.md
+---
+
+<!-- The content for this page is at
+// SOURCE content/shared/influxdb3-cli/create/token/admin.md
+-->
@@ -0,0 +1,15 @@
+---
+title: influxdb3 test schedule_plugin
+description: >
+  The `influxdb3 test schedule_plugin` command tests a schedule plugin file without needing to create a trigger.
+menu:
+  influxdb3_core:
+    parent: influxdb3 test
+    name: influxdb3 test schedule_plugin
+weight: 401
+source: /shared/influxdb3-cli/test/schedule_plugin.md
+---
+
+<!--
+The content of this file is at content/shared/influxdb3-cli/test/schedule_plugin.md
+-->
@@ -9,9 +9,10 @@ menu:
   influxdb3_enterprise:
     name: InfluxDB 3 Enterprise
 weight: 1
-source: /shared/v3-enterprise-get-started/_index.md
+source: /shared/influxdb3/_index.md
 ---

 <!--
-The content of this page is at /shared/v3-enterprise-get-started/_index.md
+The content of this page is at
+//SOURCE - content/shared/influxdb3/_index.md
 -->
@@ -153,7 +153,7 @@ existing license if it's still valid.
    environment variable
 7. If no license is found, the server won't start

-#### Example: Start the {{< product-name >}} server with your license email:
+#### Example: Start the {{% product-name %}} server with your license email:

 {{< code-tabs-wrapper >}}
 {{% code-tabs %}}
@@ -187,7 +187,7 @@ influxdb3 serve \
 {{% /code-tab-content %}}
 {{< /code-tabs-wrapper >}}

-#### Example: Start the {{< product-name >}} server with your license file:
+#### Example: Start the {{% product-name %}} server with your license file:

 {{< code-tabs-wrapper >}}
 {{% code-tabs %}}
@@ -1,7 +1,7 @@
 ---
 title: Manage tokens
 description: >
-  Manage tokens to authenticate and authorize access to resources and data in an {{< product-name >}} instance.
+  Manage tokens to authenticate and authorize access to server actions, resources, and data in an {{< product-name >}} instance.
 menu:
   influxdb3_enterprise:
     parent: Administer InfluxDB
@@ -2,8 +2,8 @@
 title: Create an admin token
 description: >
   Use the [`influxdb3 create token --admin` command](/influxdb3/enterprise/reference/cli/influxdb3/create/token/)
-  or the [HTTP API](/influxdb3/enterprise/api/v3/)
-  to create an [admin token](/influxdb3/enterprise/admin/tokens/admin/) for your {{< product-name omit="Clustered" >}} instance.
+  or the HTTP API [`/api/v3/configure/token/admin`](/influxdb3/enterprise/api/v3/#operation/PostCreateAdminToken)
+  endpoint to create an operator or named [admin token](/influxdb3/enterprise/admin/tokens/admin/) for your {{< product-name omit="Clustered" >}} instance.
   An admin token grants access to all actions on the server.
 menu:
   influxdb3_enterprise:
@@ -12,13 +12,15 @@ weight: 201
 list_code_example: |
   ##### CLI
   ```bash
-  influxdb3 create token --admin
+  influxdb3 create token --admin --name TOKEN_NAME
   ```
   ##### HTTP API
   ```bash
   curl -X POST "http://{{< influxdb/host >}}/api/v3/configure/token/admin" \
-    --header 'Accept: application/json' \
-    --header 'Content-Type: application/json'
+    --header 'Authorization: Bearer ADMIN_TOKEN' \
+    --json '{
+      "name": "TOKEN_NAME"
+    }'
   ```
 alt_links:
   cloud-dedicated: /influxdb3/cloud-dedicated/admin/tokens/create-token/
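Both the create and regenerate endpoints authenticate with a bearer token, and the HTTP header field syntax requires a colon after the field name (`Authorization: Bearer …`). A trivial shell sketch of composing that header with a placeholder credential:

```bash
#!/bin/sh
# Placeholder credential; a real value comes from `influxdb3 create token --admin`
token="apiv3_example-admin-token"

# "Name: value" is the required HTTP header field syntax
auth_header="Authorization: Bearer ${token}"
echo "$auth_header"
```

Pass the composed value to curl with `--header "$auth_header"`.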
@@ -1,10 +1,9 @@
 ---
-title: Regenerate an admin token
+title: Regenerate an operator token
 description: >
   Use the [`influxdb3 create token --admin` command](/influxdb3/enterprise/reference/cli/influxdb3/create/token/)
   or the [HTTP API](/influxdb3/enterprise/api/v3/)
-  to regenerate an [admin token](/influxdb3/enterprise/admin/tokens/admin/) for your {{< product-name omit="Clustered" >}} instance.
-  An admin token grants access to all actions on the server.
+  to regenerate an [operator token](/influxdb3/enterprise/admin/tokens/admin/) for your {{< product-name omit="Clustered" >}} instance.
   Regenerating an admin token deactivates the previous token.
 menu:
   influxdb3_enterprise:
@@ -14,9 +13,15 @@ list_code_example: |
   ##### CLI
   ```bash
   influxdb3 create token --admin \
-    --token ADMIN_TOKEN \
+    --token OPERATOR_TOKEN \
     --regenerate
   ```

+  ##### HTTP API
+  ```bash
+  curl -X POST "http://{{< influxdb/host >}}/api/v3/configure/token/admin/regenerate" \
+    --header 'Authorization: Bearer OPERATOR_TOKEN'
+  ```
+
 source: /shared/influxdb3-admin/tokens/admin/regenerate.md
 ---
@@ -3,7 +3,7 @@ title: Manage resource tokens
 seotitle: Manage resource tokens in {{< product-name >}}
 description: >
   Manage resource tokens in your {{< product-name >}} instance.
-  Resource tokens grant fine-grained permissions on resources, such as databases
+  Resource tokens grant permissions on specific resources, such as databases
   and system information endpoints in your {{< product-name >}} instance.
   Database resource tokens allow for actions like writing and querying data.
 menu:
@@ -15,13 +15,12 @@ influxdb3/enterprise/tags: [tokens]
 ---

 Manage resource tokens in your {{< product-name >}} instance.
-Resource tokens grant fine-grained permissions on resources, such as databases
-and system information endpoints in your {{< product-name >}} instance.
+Resource tokens provide scoped access to specific resources:

-- **Databases**: Database tokens allow for actions like writing and querying data.
+- **Database tokens**: provide access to specific databases for actions like writing and querying data
+- **System tokens**: provide access to system-level resources, such as API endpoints for server runtime statistics and health.

-- **System resources**: System information tokens allow read access to server runtime statistics and health.
 Access controls for system information API endpoints help prevent information leaks and attacks (such as DoS).
 Resource tokens are user-defined and available only in {{% product-name %}}.

 {{< children depth="1" >}}
@@ -13,10 +13,10 @@ related:
 - /influxdb3/enterprise/admin/query-system-data/
 - /influxdb3/enterprise/write-data/
 - /influxdb3/enterprise/query-data/
-source: /shared/v3-enterprise-get-started/_index.md
+source: /shared/influxdb3-get-started/_index.md
 ---

 <!--
 The content of this page is at
-// SOURCE content/shared/v3-enterprise-get-started/_index.md
+// SOURCE content/shared/influxdb3-get-started/_index.md
 -->
@@ -12,7 +12,7 @@ alt_links:

 - [System Requirements](#system-requirements)
 - [Quick install](#quick-install)
-- [Download {{< product-name >}} binaries](#download-influxdb-3-{{< product-key >}}-binaries)
+- [Download {{% product-name %}} binaries](#download-influxdb-3-{{< product-key >}}-binaries)
 - [Docker image](#docker-image)

 ## System Requirements
@@ -79,7 +79,7 @@ source ~/.zshrc
 {{% /code-tab-content %}}
 {{< /code-tabs-wrapper >}}

-## Download {{< product-name >}} binaries
+## Download {{% product-name %}} binaries

 {{< tabs-wrapper >}}
 {{% tabs %}}
@@ -108,7 +108,7 @@ influxdb3 -h
 influxdb3 --help
 ```

-### Run the {{< product-name >}} server with extra verbose logging
+### Run the {{% product-name %}} server with extra verbose logging

 <!--pytest.mark.skip-->

@@ -120,7 +120,7 @@ influxdb3 serve -v \
   --cluster-id my-cluster-01
 ```

-### Run {{< product-name >}} with debug logging using LOG_FILTER
+### Run {{% product-name %}} with debug logging using LOG_FILTER

 <!--pytest.mark.skip-->

@@ -1,16 +0,0 @@
----
-title: influxdb3 create token
-description: >
-  The `influxdb3 create token` command creates a new authentication token.
-menu:
-  influxdb3_enterprise:
-    parent: influxdb3 create
-    name: influxdb3 create token
-weight: 400
-source: /shared/influxdb3-cli/create/token.md
----
-
-<!--
-The content of this file is at
-// SOURCE content/shared/influxdb3-cli/create/token.md
--->
@@ -1,19 +1,16 @@
 ---
 title: influxdb3 create token
 description: >
-  The `influxdb3 create token` command creates an admin token or a resource (fine-grained
-  permissions) token for authenticating and authorizing actions in an {{% product-name %}} instance.
+  The `influxdb3 create token` command creates an admin token or a scoped resource token for authenticating and authorizing actions in an {{% product-name %}} instance.
 menu:
   influxdb3_enterprise:
     parent: influxdb3
     name: influxdb3 create token
 weight: 300
 aliases:
   - /influxdb3/enterprise/reference/cli/influxdb3/create/token/admin/
-source: /shared/influxdb3-cli/create/token.md
+source: /shared/influxdb3-cli/create/token/_index.md
 ---

 <!--
 The content of this page is at
-// SOURCE content/shared/influxdb3-cli/create/token.md
+// SOURCE content/shared/influxdb3-cli/create/token/_index.md
 -->
@@ -0,0 +1,15 @@
+---
+title: influxdb3 create token admin
+description: >
+  The `influxdb3 create token admin` command creates an operator token or named admin token with full administrative privileges for server actions.
+menu:
+  influxdb3_enterprise:
+    parent: influxdb3 create token
+    name: influxdb3 create token admin
+weight: 400
+source: /shared/influxdb3-cli/create/token/admin.md
+---
+
+<!-- The content for this page is at
+// SOURCE content/shared/influxdb3-cli/create/token/admin.md
+-->