Merge branch 'master' into clustered-artifacts
commit b5189765ab

@@ -1,4 +1,4 @@
-version: 2
+version: 2.1
 jobs:
   build:
     docker:

@@ -41,7 +41,7 @@ jobs:
             - /home/circleci/bin
       - run:
           name: Hugo Build
-          command: npx hugo --logLevel info --minify --destination workspace/public
+          command: yarn hugo --environment production --logLevel info --gc --destination workspace/public
       - persist_to_workspace:
          root: workspace
          paths:

@@ -68,7 +68,6 @@ jobs:
           when: on_success

 workflows:
-  version: 2
   build:
     jobs:
       - build

@@ -1679,7 +1679,7 @@ The shortcode takes a regular expression for matching placeholder names.
 Use the `code-placeholder-key` shortcode to format the placeholder names in
 text that describes the placeholder--for example:

-```
+```markdown
 {{% code-placeholders "DATABASE_NAME|USERNAME|PASSWORD_OR_TOKEN|API_TOKEN|exampleuser@influxdata.com" %}}
 ```sh
 curl --request POST http://localhost:8086/write?db=DATABASE_NAME \

@@ -1703,3 +1703,83 @@ InfluxDB API documentation when documentation is deployed.
 Redoc generates HTML documentation using the InfluxDB `swagger.yml`.
 For more information about generating InfluxDB API documentation, see the
 [API Documentation README](https://github.com/influxdata/docs-v2/tree/master/api-docs#readme).
+
+## JavaScript in the documentation UI
+
+The InfluxData documentation UI uses JavaScript with ES6+ syntax and
+`assets/js/main.js` as the entry point to import modules from
+`assets/js`.
+Only `assets/js/main.js` should be imported in HTML files.
+
+`assets/js/main.js` registers components and initializes them on page load.
+
+If you're adding UI functionality that requires JavaScript, follow these steps:
+
+1. In your HTML file, add a `data-component` attribute to the element that
+   should be initialized by your JavaScript code. For example:
+
+   ```html
+   <div data-component="my-component"></div>
+   ```
+
+2. Following the component pattern, create a single-purpose JavaScript module
+   (`assets/js/components/my-component.js`)
+   that exports a single function that receives the component element and initializes it.
+3. In `assets/js/main.js`, import the module and register the component to ensure
+   the component is initialized on page load.
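The three steps above can be sketched in one place as follows; the initializer, registry, and `initComponents` names are illustrative assumptions, not the actual docs-v2 source:

```javascript
// Sketch of the component pattern (file paths in comments are hypothetical).

// assets/js/components/my-component.js would export this single-purpose
// initializer, which receives the element found via data-component:
function initMyComponent(el) {
  // Initialize the element marked with data-component="my-component".
  el.textContent = 'initialized';
}

// assets/js/main.js would import the initializer and register it by name:
const registry = { 'my-component': initMyComponent };

function initComponents(root) {
  // Called on page load: initialize every registered component element.
  for (const el of root.querySelectorAll('[data-component]')) {
    const init = registry[el.dataset.component];
    if (init) init(el);
  }
}
```

In the browser, `initComponents(document)` would run once on page load.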
+
+### Debugging JavaScript
+
+To debug JavaScript code used in the InfluxData documentation UI, choose one of the following methods:
+
+- Use source maps and the Chrome DevTools debugger.
+- Use debug helpers that provide breakpoints and console logging as a workaround or alternative for using source maps and the Chrome DevTools debugger.
+
+#### Using source maps and Chrome DevTools debugger
+
+1. In VS Code, select Run > Start Debugging.
+2. Select the "Debug JS (source maps)" configuration.
+3. Click the play button to start the debugger.
+4. Set breakpoints in the JavaScript source files--files in the
+   `assets/js/ns-hugo-imp:` namespace--in the
+   VS Code editor or in the Chrome Developer Tools Sources panel:
+
+   - In the VS Code Debugger panel > "Loaded Scripts" section, find the
+     `assets/js/ns-hugo-imp:` namespace.
+   - In the Chrome Developer Tools Sources panel, expand
+     `js/ns-hugo-imp:/<YOUR_WORKSPACE_ROOT>/assets/js/`.
+
+#### Using debug helpers
+
+1. In your JavaScript module, import debug helpers from `assets/js/utils/debug-helpers.js`.
+   These helpers provide breakpoints and console logging as a workaround or alternative for
+   using source maps and the Chrome DevTools debugger.
+2. Insert debug statements by calling the helper functions in your code--for example:
+
+   ```js
+   import { debugLog, debugBreak, debugInspect } from './utils/debug-helpers.js';
+
+   const data = debugInspect(someData, 'Data');
+   debugLog('Processing data', 'myFunction');
+
+   function processData() {
+     // Add a breakpoint that works with DevTools
+     debugBreak();
+
+     // Your existing code...
+   }
+   ```
+
+3. Start Hugo in development mode--for example:
+
+   ```bash
+   yarn hugo server
+   ```
+
+4. In VS Code, go to Run > Start Debugging, and select the "Debug JS (debug-helpers)" configuration.
+
+   Your system uses the configuration in `launch.json` to launch the site in Chrome
+   and attach the debugger to the Developer Tools console.
+
+Make sure to remove the debug statements before merging your changes.
+The debug helpers are designed to be used in development and should not be used in production.
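A minimal sketch of what `debug-helpers.js` might look like, inferred only from how `debugLog`, `debugBreak`, and `debugInspect` are called above; the real module's implementation may differ:

```javascript
// Hypothetical sketch of assets/js/utils/debug-helpers.js (not the actual source).

function debugLog(message, context) {
  // Prefixed console output so debug lines are easy to find and remove later.
  console.log(`[debug${context ? ':' + context : ''}] ${message}`);
}

function debugBreak() {
  // A debugger statement pauses execution when DevTools is attached
  // and is a no-op otherwise.
  debugger; // eslint-disable-line no-debugger
}

function debugInspect(value, label) {
  // Log a labeled value and return it unchanged so it can wrap expressions inline.
  console.log(`[inspect] ${label ?? ''}`, value);
  return value;
}
```

Because `debugInspect` returns its argument, it can wrap any expression without changing program behavior.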
@@ -21,6 +21,7 @@ node_modules
 test-results.xml
 /influxdb3cli-build-scripts/content
 .vscode/*
+!.vscode/launch.json
 .idea
 **/config.toml
 package-lock.json

@@ -3,3 +3,4 @@
 **/.svn
 **/.hg
 **/node_modules
+assets/jsconfig.json

@@ -0,0 +1,47 @@
+{
+  "version": "0.2.0",
+  "configurations": [
+    {
+      "name": "Debug JS (debug-helpers)",
+      "type": "chrome",
+      "request": "launch",
+      "url": "http://localhost:1313",
+      "webRoot": "${workspaceFolder}",
+      "skipFiles": [
+        "<node_internals>/**"
+      ],
+      "sourceMaps": false,
+      "trace": true,
+      "smartStep": false
+    },
+    {
+      "name": "Debug JS (source maps)",
+      "type": "chrome",
+      "request": "launch",
+      "url": "http://localhost:1313",
+      "webRoot": "${workspaceFolder}",
+      "sourceMaps": true,
+      "sourceMapPathOverrides": {
+        "*": "${webRoot}/assets/js/*",
+        "main.js": "${webRoot}/assets/js/main.js",
+        "page-context.js": "${webRoot}/assets/js/page-context.js",
+        "ask-ai-trigger.js": "${webRoot}/assets/js/ask-ai-trigger.js",
+        "ask-ai.js": "${webRoot}/assets/js/ask-ai.js",
+        "utils/*": "${webRoot}/assets/js/utils/*",
+        "services/*": "${webRoot}/assets/js/services/*"
+      },
+      "skipFiles": [
+        "<node_internals>/**",
+        "node_modules/**",
+        "chrome-extension://**"
+      ],
+      "trace": true,
+      "smartStep": true,
+      "disableNetworkCache": true,
+      "userDataDir": "${workspaceFolder}/.vscode/chrome-user-data",
+      "runtimeArgs": [
+        "--disable-features=VizDisplayCompositor"
+      ]
+    }
+  ]
+}

@@ -1667,7 +1667,7 @@ The shortcode takes a regular expression for matching placeholder names.
 Use the `code-placeholder-key` shortcode to format the placeholder names in
 text that describes the placeholder--for example:

-```
+```markdown
 {{% code-placeholders "DATABASE_NAME|USERNAME|PASSWORD_OR_TOKEN|API_TOKEN|exampleuser@influxdata.com" %}}
 ```sh
 curl --request POST http://localhost:8086/write?db=DATABASE_NAME \

@@ -1691,3 +1691,83 @@ InfluxDB API documentation when documentation is deployed.
 Redoc generates HTML documentation using the InfluxDB `swagger.yml`.
 For more information about generating InfluxDB API documentation, see the
 [API Documentation README](https://github.com/influxdata/docs-v2/tree/master/api-docs#readme).
+
+## JavaScript in the documentation UI
+
+The InfluxData documentation UI uses JavaScript with ES6+ syntax and
+`assets/js/main.js` as the entry point to import modules from
+`assets/js`.
+Only `assets/js/main.js` should be imported in HTML files.
+
+`assets/js/main.js` registers components and initializes them on page load.
+
+If you're adding UI functionality that requires JavaScript, follow these steps:
+
+1. In your HTML file, add a `data-component` attribute to the element that
+   should be initialized by your JavaScript code. For example:
+
+   ```html
+   <div data-component="my-component"></div>
+   ```
+
+2. Following the component pattern, create a single-purpose JavaScript module
+   (`assets/js/components/my-component.js`)
+   that exports a single function that receives the component element and initializes it.
+3. In `assets/js/main.js`, import the module and register the component to ensure
+   the component is initialized on page load.
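Sketched as one self-contained snippet (the initializer, registry, and `initComponents` names here are illustrative assumptions, not the actual docs-v2 source):

```javascript
// Sketch of the component pattern (file paths in comments are hypothetical).

// assets/js/components/my-component.js would export this initializer:
function initMyComponent(el) {
  // Initialize the element marked with data-component="my-component".
  el.textContent = 'initialized';
}

// assets/js/main.js would import the initializer and register it by name:
const registry = { 'my-component': initMyComponent };

function initComponents(root) {
  // Called on page load: initialize every registered component element.
  for (const el of root.querySelectorAll('[data-component]')) {
    const init = registry[el.dataset.component];
    if (init) init(el);
  }
}
```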
+
+### Debugging JavaScript
+
+To debug JavaScript code used in the InfluxData documentation UI, choose one of the following methods:
+
+- Use source maps and the Chrome DevTools debugger.
+- Use debug helpers that provide breakpoints and console logging as a workaround or alternative for using source maps and the Chrome DevTools debugger.
+
+#### Using source maps and Chrome DevTools debugger
+
+1. In VS Code, select Run > Start Debugging.
+2. Select the "Debug JS (source maps)" configuration.
+3. Click the play button to start the debugger.
+4. Set breakpoints in the JavaScript source files--files in the
+   `assets/js/ns-hugo-imp:` namespace--in the
+   VS Code editor or in the Chrome Developer Tools Sources panel:
+
+   - In the VS Code Debugger panel > "Loaded Scripts" section, find the
+     `assets/js/ns-hugo-imp:` namespace.
+   - In the Chrome Developer Tools Sources panel, expand
+     `js/ns-hugo-imp:/<YOUR_WORKSPACE_ROOT>/assets/js/`.
+
+#### Using debug helpers
+
+1. In your JavaScript module, import debug helpers from `assets/js/utils/debug-helpers.js`.
+   These helpers provide breakpoints and console logging as a workaround or alternative for
+   using source maps and the Chrome DevTools debugger.
+2. Insert debug statements by calling the helper functions in your code--for example:
+
+   ```js
+   import { debugLog, debugBreak, debugInspect } from './utils/debug-helpers.js';
+
+   const data = debugInspect(someData, 'Data');
+   debugLog('Processing data', 'myFunction');
+
+   function processData() {
+     // Add a breakpoint that works with DevTools
+     debugBreak();
+
+     // Your existing code...
+   }
+   ```
+
+3. Start Hugo in development mode--for example:
+
+   ```bash
+   yarn hugo server
+   ```
+
+4. In VS Code, go to Run > Start Debugging, and select the "Debug JS (debug-helpers)" configuration.
+
+   Your system uses the configuration in `launch.json` to launch the site in Chrome
+   and attach the debugger to the Developer Tools console.
+
+Make sure to remove the debug statements before merging your changes.
+The debug helpers are designed to be used in development and should not be used in production.
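One plausible shape for `debug-helpers.js`, inferred from the calls shown above (`debugInspect` returns its value, `debugLog` takes a message and context, `debugBreak` pauses); the real module may differ:

```javascript
// Hypothetical sketch of assets/js/utils/debug-helpers.js (not the actual source).

function debugLog(message, context) {
  // Prefixed console output so debug lines are easy to find and remove later.
  console.log(`[debug${context ? ':' + context : ''}] ${message}`);
}

function debugBreak() {
  // Pauses execution only when DevTools is attached; otherwise a no-op.
  debugger; // eslint-disable-line no-debugger
}

function debugInspect(value, label) {
  // Log a labeled value and return it unchanged so it can wrap expressions inline.
  console.log(`[inspect] ${label ?? ''}`, value);
  return value;
}
```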
@@ -146,15 +146,15 @@ tags:
     description: |
       Manage Processing engine triggers, test plugins, and send requests to trigger On Request plugins.

-      InfluxDB 3 Core provides the InfluxDB 3 Processing engine, an embedded Python VM that can dynamically load and trigger Python plugins in response to events in your database.
+      InfluxDB 3 Core provides the InfluxDB 3 processing engine, an embedded Python VM that can dynamically load and trigger Python plugins in response to events in your database.
       Use Processing engine plugins and triggers to run code and perform tasks for different database events.

-      To get started with the Processing engine, see the [Processing engine and Python plugins](/influxdb3/core/processing-engine/) guide.
+      To get started with the processing engine, see the [Processing engine and Python plugins](/influxdb3/core/processing-engine/) guide.
   - name: Query data
     description: Query data using SQL or InfluxQL
   - name: Quick start
     description: |
-      1. [Create an admin token](#section/Authentication) for the InfluxDB 3 Core API.
+      1. [Create an admin token](#section/Authentication) to authorize API requests.

       ```bash
       curl -X POST "http://localhost:8181/api/v3/configure/token/admin"

@@ -385,12 +385,7 @@ paths:
       parameters:
         - $ref: '#/components/parameters/dbWriteParam'
         - $ref: '#/components/parameters/accept_partial'
-        - name: precision
-          in: query
-          required: true
-          schema:
-            $ref: '#/components/schemas/PrecisionWrite'
-          description: Precision of timestamps.
+        - $ref: '#/components/parameters/precisionParam'
         - name: no_sync
           in: query
           schema:

@@ -440,16 +435,8 @@ paths:
       description: Executes an SQL query to retrieve data from the specified database.
       parameters:
         - $ref: '#/components/parameters/db'
-        - name: q
-          in: query
-          required: true
-          schema:
-            type: string
-        - name: format
-          in: query
-          required: false
-          schema:
-            type: string
+        - $ref: '#/components/parameters/querySqlParam'
+        - $ref: '#/components/parameters/format'
         - $ref: '#/components/parameters/AcceptQueryHeader'
         - $ref: '#/components/parameters/ContentType'
       responses:

@@ -1072,15 +1059,104 @@ paths:
     post:
       operationId: PostConfigureProcessingEngineTrigger
       summary: Create processing engine trigger
-      description: Creates a new processing engine trigger.
+      description: |
+        Creates a processing engine trigger with the specified plugin file and trigger specification.
+
+        ### Related guides
+
+        - [Processing engine and Python plugins](/influxdb3/core/plugins/)
       requestBody:
         required: true
         content:
           application/json:
             schema:
               $ref: '#/components/schemas/ProcessingEngineTriggerRequest'
+            examples:
+              schedule_cron:
+                summary: Schedule trigger using cron
+                description: |
+                  In `"cron:CRON_EXPRESSION"`, `CRON_EXPRESSION` uses extended 6-field cron format.
+                  The cron expression `0 0 6 * * 1-5` means the trigger will run at 6:00 AM every weekday (Monday to Friday).
+                value:
+                  db: DATABASE_NAME
+                  plugin_filename: schedule.py
+                  trigger_name: schedule_cron_trigger
+                  trigger_specification: cron:0 0 6 * * 1-5
+              schedule_every:
+                summary: Schedule trigger using interval
+                description: |
+                  In `"every:DURATION"`, `DURATION` specifies the interval between trigger executions.
+                  The duration `1h` means the trigger will run every hour.
+                value:
+                  db: mydb
+                  plugin_filename: schedule.py
+                  trigger_name: schedule_every_trigger
+                  trigger_specification: every:1h
+              schedule_every_seconds:
+                summary: Schedule trigger using seconds interval
+                description: |
+                  Example of scheduling a trigger to run every 30 seconds.
+                value:
+                  db: mydb
+                  plugin_filename: schedule.py
+                  trigger_name: schedule_every_30s_trigger
+                  trigger_specification: every:30s
+              schedule_every_minutes:
+                summary: Schedule trigger using minutes interval
+                description: |
+                  Example of scheduling a trigger to run every 5 minutes.
+                value:
+                  db: mydb
+                  plugin_filename: schedule.py
+                  trigger_name: schedule_every_5m_trigger
+                  trigger_specification: every:5m
+              all_tables:
+                summary: All tables trigger example
+                description: |
+                  Trigger that fires on write events to any table in the database.
+                value:
+                  db: mydb
+                  plugin_filename: all_tables.py
+                  trigger_name: all_tables_trigger
+                  trigger_specification: all_tables
+              table_specific:
+                summary: Table-specific trigger example
+                description: |
+                  Trigger that fires on write events to a specific table.
+                value:
+                  db: mydb
+                  plugin_filename: table.py
+                  trigger_name: table_trigger
+                  trigger_specification: table:sensors
+              api_request:
+                summary: On-demand request trigger example
+                description: |
+                  Creates an HTTP endpoint `/api/v3/engine/hello-world` for manual invocation.
+                value:
+                  db: mydb
+                  plugin_filename: request.py
+                  trigger_name: hello_world_trigger
+                  trigger_specification: path:hello-world
+              cron_friday_afternoon:
+                summary: Cron trigger for Friday afternoons
+                description: |
+                  Example of a cron trigger that runs every Friday at 2:30 PM.
+                value:
+                  db: reports
+                  plugin_filename: weekly_report.py
+                  trigger_name: friday_report_trigger
+                  trigger_specification: cron:0 30 14 * * 5
+              cron_monthly:
+                summary: Cron trigger for monthly execution
+                description: |
+                  Example of a cron trigger that runs on the first day of every month at midnight.
+                value:
+                  db: monthly_data
+                  plugin_filename: monthly_cleanup.py
+                  trigger_name: monthly_cleanup_trigger
+                  trigger_specification: cron:0 0 0 1 * *
       responses:
-        '201':
+        '200':
           description: Success. Processing engine trigger created.
         '400':
           description: Bad request.
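All of the trigger examples above share one request-body shape. A minimal client-side sketch that assembles and checks such a payload before sending it (the helper and its validation logic are illustrative assumptions, not part of the API):

```javascript
// Build a processing engine trigger request body. Field names are taken
// from the examples above; this helper itself is hypothetical.
function buildTriggerRequest({ db, pluginFilename, triggerName, spec }) {
  const body = {
    db,
    plugin_filename: pluginFilename,
    trigger_name: triggerName,
    trigger_specification: spec,
  };
  // db and plugin_filename are required by the trigger request schema.
  for (const field of ['db', 'plugin_filename']) {
    if (!body[field]) throw new Error(`missing required field: ${field}`);
  }
  return JSON.stringify(body);
}
```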
@@ -1157,7 +1233,7 @@ paths:
               $ref: '#/components/schemas/ProcessingEngineTriggerRequest'
       responses:
         '200':
-          description: Success. The processing engine trigger has been enabled.
+          description: Success. The processing engine trigger is enabled.
         '400':
           description: Bad request.
         '401':

@@ -1170,7 +1246,14 @@ paths:
     post:
       operationId: PostInstallPluginPackages
       summary: Install plugin packages
-      description: Installs packages for the plugin environment.
+      description: |
+        Installs the specified Python packages into the processing engine plugin environment.
+
+        This endpoint is synchronous and blocks until the packages are installed.
+
+        ### Related guides
+
+        - [Processing engine and Python plugins](/influxdb3/core/plugins/)
       parameters:
         - $ref: '#/components/parameters/ContentType'
       requestBody:

@@ -1179,10 +1262,30 @@ paths:
           application/json:
             schema:
               type: object
-              additionalProperties: true
+              properties:
+                packages:
+                  type: array
+                  items:
+                    type: string
+                  description: |
+                    A list of Python package names to install.
+                    Can include version specifiers (e.g., "scipy==1.9.0").
+                  example:
+                    - influxdb3-python
+                    - scipy
+                    - pandas==1.5.0
+                    - requests
+              required:
+                - packages
+            example:
+              packages:
+                - influxdb3-python
+                - scipy
+                - pandas==1.5.0
+                - requests
       responses:
         '200':
-          description: Success. The packages have been installed.
+          description: Success. The packages are installed.
         '400':
           description: Bad request.
         '401':

@@ -1193,7 +1296,15 @@ paths:
     post:
       operationId: PostInstallPluginRequirements
       summary: Install plugin requirements
-      description: Installs requirements for the plugin environment.
+      description: |
+        Installs requirements from a requirements file (also known as a "pip requirements file") into the processing engine plugin environment.
+
+        This endpoint is synchronous and blocks until the requirements are installed.
+
+        ### Related
+
+        - [Processing engine and Python plugins](/influxdb3/core/plugins/)
+        - [Python requirements file format](https://pip.pypa.io/en/stable/reference/requirements-file-format/)
       parameters:
         - $ref: '#/components/parameters/ContentType'
       requestBody:

@@ -1202,7 +1313,17 @@ paths:
           application/json:
             schema:
               type: object
-              additionalProperties: true
+              properties:
+                requirements_location:
+                  type: string
+                  description: |
+                    The path to the requirements file containing Python packages to install.
+                    Can be a relative path (relative to the plugin directory) or an absolute path.
+                  example: requirements.txt
+              required:
+                - requirements_location
+            example:
+              requirements_location: requirements.txt
       responses:
         '200':
           description: Success. The requirements have been installed.

@@ -1248,18 +1369,18 @@ paths:
       parameters:
         - name: plugin_path
           description: |
-            The path configured in the `trigger-spec` for the plugin.
+            The path configured in the request trigger specification `"path:<plugin_path>"` for the plugin.

             For example, if you define a trigger with the following:

-            ```
-            trigger-spec: "request:hello-world"
+            ```json
+            trigger-spec: "path:hello-world"
             ```

             then, the HTTP API exposes the following plugin endpoint:

             ```
-            <INFLUXDB_HOST>/api/v3/engine/hello-world
+            <INFLUXDB3_HOST>/api/v3/engine/hello-world
             ```
           in: path
           required: true

@@ -1269,7 +1390,7 @@ paths:
       operationId: GetProcessingEnginePluginRequest
       summary: On Request processing engine plugin request
       description: |
-        Sends a request to invoke an _On Request_ processing engine plugin.
+        Executes the On Request processing engine plugin specified in `<plugin_path>`.
         The request can include request headers, query string parameters, and a request body, which InfluxDB passes to the plugin.

         An On Request plugin implements the following signature:

@@ -1296,7 +1417,7 @@ paths:
       operationId: PostProcessingEnginePluginRequest
       summary: On Request processing engine plugin request
       description: |
-        Sends a request to invoke an _On Request_ processing engine plugin.
+        Executes the On Request processing engine plugin specified in `<plugin_path>`.
         The request can include request headers, query string parameters, and a request body, which InfluxDB passes to the plugin.

         An On Request plugin implements the following signature:

@@ -1335,8 +1456,6 @@ paths:
       description: |
         Creates an admin token.
         An admin token is a special type of token that has full access to all resources in the system.
-
-        This endpoint is only available in InfluxDB 3 Enterprise.
       responses:
         '201':
           description: |

@@ -1357,8 +1476,6 @@ paths:
       summary: Regenerate admin token
       description: |
         Regenerates an admin token and revokes the previous token with the same name.
-
-        This endpoint is only available in InfluxDB 3 Enterprise.
       parameters: []
       responses:
         '201':

@@ -1429,7 +1546,6 @@ components:
         schema:
           type: string
         description: |
-          The name of the database.
           The name of the database.
           InfluxDB creates the database if it doesn't already exist, and then
           writes all points in the batch to the database.

@@ -1747,15 +1863,69 @@ components:
         type: string
       plugin_filename:
         type: string
+        description: |
+          The path and filename of the plugin to execute--for example,
+          `schedule.py` or `endpoints/report.py`.
+          The path can be absolute or relative to the `--plugins-dir` directory configured when starting InfluxDB 3.
+
+          The plugin file must implement the trigger interface associated with the trigger's specification (`trigger_spec`).
       trigger_name:
         type: string
       trigger_specification:
         type: string
+        description: |
+          Specifies when and how the processing engine trigger should be invoked.
+
+          ## Supported trigger specifications:
+
+          ### Cron-based scheduling
+          Format: `cron:CRON_EXPRESSION`
+
+          Uses extended (6-field) cron format (second minute hour day_of_month month day_of_week):
+          ```
+          ┌───────────── second (0-59)
+          │ ┌───────────── minute (0-59)
+          │ │ ┌───────────── hour (0-23)
+          │ │ │ ┌───────────── day of month (1-31)
+          │ │ │ │ ┌───────────── month (1-12)
+          │ │ │ │ │ ┌───────────── day of week (0-6, Sunday=0)
+          │ │ │ │ │ │
+          * * * * * *
+          ```
+          Examples:
+          - `cron:0 0 6 * * 1-5` - Every weekday at 6:00 AM
+          - `cron:0 30 14 * * 5` - Every Friday at 2:30 PM
+          - `cron:0 0 0 1 * *` - First day of every month at midnight
+
+          ### Interval-based scheduling
+          Format: `every:DURATION`
+
+          Supported durations: `s` (seconds), `m` (minutes), `h` (hours), `d` (days):
+          - `every:30s` - Every 30 seconds
+          - `every:5m` - Every 5 minutes
+          - `every:1h` - Every hour
+          - `every:1d` - Every day
+
+          ### Table-based triggers
+          - `all_tables` - Triggers on write events to any table in the database
+          - `table:TABLE_NAME` - Triggers on write events to a specific table
+
+          ### On-demand triggers
+          Format: `path:ENDPOINT_NAME`
+
+          Creates an HTTP endpoint `/api/v3/engine/ENDPOINT_NAME` for manual invocation:
+          - `path:hello-world` - Creates endpoint `/api/v3/engine/hello-world`
+          - `path:data-export` - Creates endpoint `/api/v3/engine/data-export`
+        pattern: ^(cron:[0-9 *,/-]+|every:[0-9]+[smhd]|all_tables|table:[a-zA-Z_][a-zA-Z0-9_]*|path:[a-zA-Z0-9_-]+)$
+        example: cron:0 0 6 * * 1-5
       trigger_arguments:
         type: object
         additionalProperties: true
         description: Optional arguments passed to the plugin.
       disabled:
         type: boolean
         default: false
         description: Whether the trigger is disabled.
     required:
       - db
       - plugin_filename
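The `pattern` in the schema above can be exercised directly. This sketch validates sample `trigger_specification` values against that regular expression and parses `every:` durations into seconds; the helper functions themselves are illustrative assumptions, not part of the API:

```javascript
// Regular expression copied from the trigger_specification schema pattern above.
const TRIGGER_SPEC = /^(cron:[0-9 *,/-]+|every:[0-9]+[smhd]|all_tables|table:[a-zA-Z_][a-zA-Z0-9_]*|path:[a-zA-Z0-9_-]+)$/;

function isValidTriggerSpec(spec) {
  return TRIGGER_SPEC.test(spec);
}

// Convert an `every:DURATION` spec to seconds, using the documented
// units: s (seconds), m (minutes), h (hours), d (days).
function parseEverySeconds(spec) {
  const m = /^every:([0-9]+)([smhd])$/.exec(spec);
  if (!m) throw new Error(`not an every: spec: ${spec}`);
  const unit = { s: 1, m: 60, h: 3600, d: 86400 }[m[2]];
  return Number(m[1]) * unit;
}
```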
@@ -1879,8 +2049,6 @@ components:
       scheme: bearer
       bearerFormat: JWT
       description: |
-        _During Alpha release, an API token is not required._
-
         A Bearer token for authentication.

         Provide the scheme and the API token in the `Authorization` header--for example:

@@ -146,15 +146,15 @@ tags:
     description: |
       Manage Processing engine triggers, test plugins, and send requests to trigger On Request plugins.

-      InfluxDB 3 Enterprise provides the InfluxDB 3 Processing engine, an embedded Python VM that can dynamically load and trigger Python plugins in response to events in your database.
+      InfluxDB 3 Enterprise provides the InfluxDB 3 processing engine, an embedded Python VM that can dynamically load and trigger Python plugins in response to events in your database.
       Use Processing engine plugins and triggers to run code and perform tasks for different database events.

-      To get started with the Processing engine, see the [Processing engine and Python plugins](/influxdb3/enterprise/processing-engine/) guide.
+      To get started with the processing engine, see the [Processing engine and Python plugins](/influxdb3/enterprise/processing-engine/) guide.
   - name: Query data
     description: Query data using SQL or InfluxQL
   - name: Quick start
     description: |
-      1. [Create an admin token](#section/Authentication) for the InfluxDB 3 Enterprise API.
+      1. [Create an admin token](#section/Authentication) to authorize API requests.

       ```bash
       curl -X POST "http://localhost:8181/api/v3/configure/token/admin"

@@ -385,12 +385,7 @@ paths:
       parameters:
         - $ref: '#/components/parameters/dbWriteParam'
         - $ref: '#/components/parameters/accept_partial'
-        - name: precision
-          in: query
-          required: true
-          schema:
-            $ref: '#/components/schemas/PrecisionWrite'
-          description: Precision of timestamps.
+        - $ref: '#/components/parameters/precisionParam'
         - name: no_sync
           in: query
           schema:

@@ -440,16 +435,8 @@ paths:
       description: Executes an SQL query to retrieve data from the specified database.
       parameters:
         - $ref: '#/components/parameters/db'
-        - name: q
-          in: query
-          required: true
-          schema:
-            type: string
-        - name: format
-          in: query
-          required: false
-          schema:
-            type: string
+        - $ref: '#/components/parameters/querySqlParam'
+        - $ref: '#/components/parameters/format'
         - $ref: '#/components/parameters/AcceptQueryHeader'
         - $ref: '#/components/parameters/ContentType'
       responses:

@ -1072,15 +1059,104 @@ paths:
|
|||
post:
|
||||
operationId: PostConfigureProcessingEngineTrigger
|
||||
summary: Create processing engine trigger
|
||||
description: Creates a new processing engine trigger.
|
||||
description: |
|
||||
Creates a processing engine trigger with the specified plugin file and trigger specification.
|
||||
|
||||
### Related guides
|
||||
|
||||
- [Processing engine and Python plugins](/influxdb3/enterprise/plugins/)
|
||||
requestBody:
|
||||
required: true
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/ProcessingEngineTriggerRequest'
|
||||
examples:
|
||||
schedule_cron:
|
||||
summary: Schedule trigger using cron
|
||||
description: |
|
||||
In `"cron:CRON_EXPRESSION"`, `CRON_EXPRESSION` uses extended 6-field cron format.
|
||||
The cron expression `0 0 6 * * 1-5` means the trigger will run at 6:00 AM every weekday (Monday to Friday).
|
||||
value:
|
||||
db: DATABASE_NAME
|
||||
plugin_filename: schedule.py
|
||||
trigger_name: schedule_cron_trigger
|
||||
trigger_specification: cron:0 0 6 * * 1-5
|
||||
schedule_every:
|
||||
summary: Schedule trigger using interval
|
||||
description: |
|
||||
In `"every:DURATION"`, `DURATION` specifies the interval between trigger executions.
|
||||
The duration `1h` means the trigger will run every hour.
value:
  db: mydb
  plugin_filename: schedule.py
  trigger_name: schedule_every_trigger
  trigger_specification: every:1h
schedule_every_seconds:
  summary: Schedule trigger using seconds interval
  description: |
    Example of scheduling a trigger to run every 30 seconds.
  value:
    db: mydb
    plugin_filename: schedule.py
    trigger_name: schedule_every_30s_trigger
    trigger_specification: every:30s
schedule_every_minutes:
  summary: Schedule trigger using minutes interval
  description: |
    Example of scheduling a trigger to run every 5 minutes.
  value:
    db: mydb
    plugin_filename: schedule.py
    trigger_name: schedule_every_5m_trigger
    trigger_specification: every:5m
all_tables:
  summary: All tables trigger example
  description: |
    Trigger that fires on write events to any table in the database.
  value:
    db: mydb
    plugin_filename: all_tables.py
    trigger_name: all_tables_trigger
    trigger_specification: all_tables
table_specific:
  summary: Table-specific trigger example
  description: |
    Trigger that fires on write events to a specific table.
  value:
    db: mydb
    plugin_filename: table.py
    trigger_name: table_trigger
    trigger_specification: table:sensors
api_request:
  summary: On-demand request trigger example
  description: |
    Creates an HTTP endpoint `/api/v3/engine/hello-world` for manual invocation.
  value:
    db: mydb
    plugin_filename: request.py
    trigger_name: hello_world_trigger
    trigger_specification: path:hello-world
cron_friday_afternoon:
  summary: Cron trigger for Friday afternoons
  description: |
    Example of a cron trigger that runs every Friday at 2:30 PM.
  value:
    db: reports
    plugin_filename: weekly_report.py
    trigger_name: friday_report_trigger
    trigger_specification: cron:0 30 14 * * 5
cron_monthly:
  summary: Cron trigger for monthly execution
  description: |
    Example of a cron trigger that runs on the first day of every month at midnight.
  value:
    db: monthly_data
    plugin_filename: monthly_cleanup.py
    trigger_name: monthly_cleanup_trigger
    trigger_specification: cron:0 0 0 1 * *
responses:
  '201':
  '200':
    description: Success. Processing engine trigger created.
  '400':
    description: Bad request.
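For reference, the example bodies above are plain JSON objects a client can construct directly. The sketch below builds the `schedule_every_seconds` body and posts it; the `/api/v3/configure/processing_engine_trigger` endpoint path and the bearer-token `Authorization` header are assumptions for illustration, not details taken from this excerpt.

```javascript
// Trigger-creation request body matching the "schedule_every_seconds" example.
const body = {
  db: 'mydb',
  plugin_filename: 'schedule.py',
  trigger_name: 'schedule_every_30s_trigger',
  trigger_specification: 'every:30s',
};

// Hypothetical client call; endpoint path and auth header are assumptions.
async function createTrigger(host, token) {
  const response = await fetch(
    `${host}/api/v3/configure/processing_engine_trigger`,
    {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${token}`,
      },
      body: JSON.stringify(body),
    }
  );
  return response.status; // success per the responses listed above
}
```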
@ -1157,7 +1233,7 @@ paths:
$ref: '#/components/schemas/ProcessingEngineTriggerRequest'
responses:
  '200':
    description: Success. The processing engine trigger has been enabled.
    description: Success. The processing engine trigger is enabled.
  '400':
    description: Bad request.
  '401':
@ -1170,7 +1246,14 @@ paths:
post:
  operationId: PostInstallPluginPackages
  summary: Install plugin packages
  description: Installs packages for the plugin environment.
  description: |
    Installs the specified Python packages into the processing engine plugin environment.

    This endpoint is synchronous and blocks until the packages are installed.

    ### Related guides

    - [Processing engine and Python plugins](/influxdb3/enterprise/plugins/)
  parameters:
    - $ref: '#/components/parameters/ContentType'
  requestBody:
@ -1179,10 +1262,30 @@ paths:
application/json:
  schema:
    type: object
    additionalProperties: true
    properties:
      packages:
        type: array
        items:
          type: string
        description: |
          A list of Python package names to install.
          Can include version specifiers (e.g., "scipy==1.9.0").
        example:
          - influxdb3-python
          - scipy
          - pandas==1.5.0
          - requests
    required:
      - packages
  example:
    packages:
      - influxdb3-python
      - scipy
      - pandas==1.5.0
      - requests
responses:
  '200':
    description: Success. The packages have been installed.
    description: Success. The packages are installed.
  '400':
    description: Bad request.
  '401':
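The request body defined by this schema can be sketched client-side as follows; the values mirror the example above and nothing beyond the schema is assumed:

```javascript
// Request body for installing plugin packages, per the schema above.
// "packages" is required; entries may pin versions with "==".
const installPackagesBody = {
  packages: ['influxdb3-python', 'scipy', 'pandas==1.5.0', 'requests'],
};

// Serialized form sent with Content-Type: application/json
const payload = JSON.stringify(installPackagesBody);
```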
@ -1193,7 +1296,15 @@ paths:
post:
  operationId: PostInstallPluginRequirements
  summary: Install plugin requirements
  description: Installs requirements for the plugin environment.
  description: |
    Installs requirements from a requirements file (also known as a "pip requirements file") into the processing engine plugin environment.

    This endpoint is synchronous and blocks until the requirements are installed.

    ### Related

    - [Processing engine and Python plugins](/influxdb3/enterprise/plugins/)
    - [Python requirements file format](https://pip.pypa.io/en/stable/reference/requirements-file-format/)
  parameters:
    - $ref: '#/components/parameters/ContentType'
  requestBody:
@ -1202,7 +1313,17 @@ paths:
application/json:
  schema:
    type: object
    additionalProperties: true
    properties:
      requirements_location:
        type: string
        description: |
          The path to the requirements file containing Python packages to install.
          Can be a relative path (relative to the plugin directory) or an absolute path.
        example: requirements.txt
    required:
      - requirements_location
  example:
    requirements_location: requirements.txt
responses:
  '200':
    description: Success. The requirements have been installed.
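The body for this endpoint is a single-field JSON object; a minimal sketch matching the schema above:

```javascript
// Request body for installing from a requirements file, per the schema above.
// "requirements_location" may be relative to the plugin directory or absolute.
const installRequirementsBody = {
  requirements_location: 'requirements.txt',
};
const payload = JSON.stringify(installRequirementsBody);
```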
@ -1248,18 +1369,18 @@ paths:
parameters:
  - name: plugin_path
    description: |
      The path configured in the `trigger-spec` for the plugin.
      The path configured in the request trigger specification `path:<plugin_path>` for the plugin.

      For example, if you define a trigger with the following:

      ```
      trigger-spec: "request:hello-world"
      ```
      ```json
      trigger-spec: "path:hello-world"
      ```

      then, the HTTP API exposes the following plugin endpoint:

      ```
      <INFLUXDB_HOST>/api/v3/engine/hello-world
      <INFLUXDB3_HOST>/api/v3/engine/hello-world
      ```
    in: path
    required: true
@ -1269,7 +1390,7 @@ paths:
operationId: GetProcessingEnginePluginRequest
summary: On Request processing engine plugin request
description: |
  Sends a request to invoke an _On Request_ processing engine plugin.
  Executes the On Request processing engine plugin specified in `<plugin_path>`.
  The request can include request headers, query string parameters, and a request body, which InfluxDB passes to the plugin.

  An On Request plugin implements the following signature:
@ -1296,7 +1417,7 @@ paths:
operationId: PostProcessingEnginePluginRequest
summary: On Request processing engine plugin request
description: |
  Sends a request to invoke an _On Request_ processing engine plugin.
  Executes the On Request processing engine plugin specified in `<plugin_path>`.
  The request can include request headers, query string parameters, and a request body, which InfluxDB passes to the plugin.

  An On Request plugin implements the following signature:
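Given a trigger created with `path:hello-world`, a client can build the plugin endpoint URL like this; the host and port are placeholder assumptions:

```javascript
// Build the URL for an On Request plugin endpoint created by a
// "path:<plugin_path>" trigger specification. Query string parameters
// are passed through to the plugin by InfluxDB.
function engineUrl(host, pluginPath, params = {}) {
  const url = new URL(`/api/v3/engine/${pluginPath}`, host);
  for (const [key, value] of Object.entries(params)) {
    url.searchParams.set(key, value);
  }
  return url.toString();
}
```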
@ -1448,7 +1569,6 @@ components:
schema:
  type: string
  description: |
    The name of the database.
    The name of the database.
    InfluxDB creates the database if it doesn't already exist, and then
    writes all points in the batch to the database.
@ -1804,15 +1924,69 @@ components:
type: string
plugin_filename:
  type: string
  description: |
    The path and filename of the plugin to execute--for example,
    `schedule.py` or `endpoints/report.py`.
    The path can be absolute or relative to the `--plugins-dir` directory configured when starting InfluxDB 3.

    The plugin file must implement the trigger interface associated with the trigger's specification (`trigger_spec`).
trigger_name:
  type: string
trigger_specification:
  type: string
  description: |
    Specifies when and how the processing engine trigger should be invoked.

    ## Supported trigger specifications:

    ### Cron-based scheduling
    Format: `cron:CRON_EXPRESSION`

    Uses extended (6-field) cron format (second minute hour day_of_month month day_of_week):
    ```
    ┌───────────── second (0-59)
    │ ┌───────────── minute (0-59)
    │ │ ┌───────────── hour (0-23)
    │ │ │ ┌───────────── day of month (1-31)
    │ │ │ │ ┌───────────── month (1-12)
    │ │ │ │ │ ┌───────────── day of week (0-6, Sunday=0)
    │ │ │ │ │ │
    * * * * * *
    ```
    Examples:
    - `cron:0 0 6 * * 1-5` - Every weekday at 6:00 AM
    - `cron:0 30 14 * * 5` - Every Friday at 2:30 PM
    - `cron:0 0 0 1 * *` - First day of every month at midnight

    ### Interval-based scheduling
    Format: `every:DURATION`

    Supported durations: `s` (seconds), `m` (minutes), `h` (hours), `d` (days):
    - `every:30s` - Every 30 seconds
    - `every:5m` - Every 5 minutes
    - `every:1h` - Every hour
    - `every:1d` - Every day

    ### Table-based triggers
    - `all_tables` - Triggers on write events to any table in the database
    - `table:TABLE_NAME` - Triggers on write events to a specific table

    ### On-demand triggers
    Format: `path:ENDPOINT_NAME`

    Creates an HTTP endpoint `/api/v3/engine/ENDPOINT_NAME` for manual invocation:
    - `path:hello-world` - Creates endpoint `/api/v3/engine/hello-world`
    - `path:data-export` - Creates endpoint `/api/v3/engine/data-export`
  pattern: ^(cron:[0-9 *,/-]+|every:[0-9]+[smhd]|all_tables|table:[a-zA-Z_][a-zA-Z0-9_]*|path:[a-zA-Z0-9_-]+)$
  example: cron:0 0 6 * * 1-5
trigger_arguments:
  type: object
  additionalProperties: true
  description: Optional arguments passed to the plugin.
disabled:
  type: boolean
  default: false
  description: Whether the trigger is disabled.
required:
  - db
  - plugin_filename
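The schema's `pattern` can be reused client-side to validate a specification before sending a request; the regex below is the pattern above copied into a JavaScript literal:

```javascript
// The trigger_specification pattern from the schema, as a JS regex.
const triggerSpecPattern =
  /^(cron:[0-9 *,\/-]+|every:[0-9]+[smhd]|all_tables|table:[a-zA-Z_][a-zA-Z0-9_]*|path:[a-zA-Z0-9_-]+)$/;

// One example of each supported specification form should validate.
const valid = [
  'cron:0 0 6 * * 1-5',
  'every:30s',
  'all_tables',
  'table:sensors',
  'path:hello-world',
].every((spec) => triggerSpecPattern.test(spec));
```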
@ -2,7 +2,7 @@
///////////////// Preferred Client Library programming language ///////////////
////////////////////////////////////////////////////////////////////////////////
import { activateTabs, updateBtnURLs } from './tabbed-content.js';
import { getPreference, setPreference } from './local-storage.js';
import { getPreference, setPreference } from './services/local-storage.js';

function getVisitedApiLib() {
  const path = window.location.pathname.match(
@ -9,28 +9,30 @@ function setUser(userid, email) {
    user: {
      uniqueClientId: userid,
      email: email,
    }
  }
    },
  };
}

// Initialize the chat widget
function initializeChat({onChatLoad, chatAttributes}) {
function initializeChat({ onChatLoad, chatAttributes }) {
  /* See https://docs.kapa.ai/integrations/website-widget/configuration for
   * available configuration options.
   * All values are strings.
   */
  // If you make changes to data attributes here, you also need to port the changes to the api-docs/template.hbs API reference template.
  // If you make changes to data attributes here, you also need to
  // port the changes to the api-docs/template.hbs API reference template.
  const requiredAttributes = {
    websiteId: 'a02bca75-1dd3-411e-95c0-79ee1139be4d',
    projectName: 'InfluxDB',
    projectColor: '#020a47',
    projectLogo: '/img/influx-logo-cubo-white.png',
  }
  };

  const optionalAttributes = {
    modalDisclaimer: 'This AI can access [documentation for InfluxDB, clients, and related tools](https://docs.influxdata.com). Information you submit is used in accordance with our [Privacy Policy](https://www.influxdata.com/legal/privacy-policy/).',
    modalExampleQuestions: 'Use Python to write data to InfluxDB 3,How do I query using SQL?,How do I use MQTT with Telegraf?',
    modalDisclaimer:
      'This AI can access [documentation for InfluxDB, clients, and related tools](https://docs.influxdata.com). Information you submit is used in accordance with our [Privacy Policy](https://www.influxdata.com/legal/privacy-policy/).',
    modalExampleQuestions:
      'Use Python to write data to InfluxDB 3,How do I query using SQL?,How do I use MQTT with Telegraf?',
    buttonHide: 'true',
    exampleQuestionButtonWidth: 'auto',
    modalOpenOnCommandK: 'true',
@ -52,28 +54,32 @@ function initializeChat({onChatLoad, chatAttributes}) {
    modalHeaderBorderBottom: 'none',
    modalTitleColor: '#fff',
    modalTitleFontSize: '1.25rem',
  }
  };

  const scriptUrl = 'https://widget.kapa.ai/kapa-widget.bundle.js';
  const script = document.createElement('script');
  script.async = true;
  script.src = scriptUrl;
  script.onload = function() {
  script.onload = function () {
    onChatLoad();
    window.influxdatadocs.AskAI = AskAI;
  };
  script.onerror = function() {
  script.onerror = function () {
    console.error('Error loading AI chat widget script');
  };

  const dataset = {...requiredAttributes, ...optionalAttributes, ...chatAttributes};
  Object.keys(dataset).forEach(key => {
  const dataset = {
    ...requiredAttributes,
    ...optionalAttributes,
    ...chatAttributes,
  };
  Object.keys(dataset).forEach((key) => {
    // Assign dataset attributes from the object
    script.dataset[key] = dataset[key];
  });

  // Check for an existing script element to remove
  const oldScript= document.querySelector(`script[src="${scriptUrl}"]`);
  const oldScript = document.querySelector(`script[src="${scriptUrl}"]`);
  if (oldScript) {
    oldScript.remove();
  }
@ -89,15 +95,14 @@ function getProductExampleQuestions() {
 * chatParams: specify custom (for example, page-specific) attribute values for the chat, pass the dataset key-values (collected in ...chatParams). See https://docs.kapa.ai/integrations/website-widget/configuration for available configuration options.
 * onChatLoad: function to call when the chat widget has loaded
 * userid: optional, a unique user ID for the user (not currently used for public docs)
*/
 */
export default function AskAI({ userid, email, onChatLoad, ...chatParams }) {
  const modalExampleQuestions = getProductExampleQuestions();
  const chatAttributes = {
    ...(modalExampleQuestions && { modalExampleQuestions }),
    ...chatParams,
  }
  initializeChat({onChatLoad, chatAttributes});
};
  };
  initializeChat({ onChatLoad, chatAttributes });

  if (userid) {
    setUser(userid, email);
@ -2,7 +2,7 @@ import $ from 'jquery';
function initialize() {
  var codeBlockSelector = '.article--content pre';
  var codeBlocks = $(codeBlockSelector);
  var $codeBlocks = $(codeBlockSelector);

  var appendHTML = `
    <div class="code-controls">
@ -15,7 +15,7 @@ function initialize() {
  `;

  // Wrap all codeblocks with a new 'codeblock' div
  $(codeBlocks).each(function () {
  $codeBlocks.each(function () {
    $(this).wrap("<div class='codeblock'></div>");
  });
@ -68,7 +68,9 @@ function initialize() {
  // Trigger copy failure state lifecycle

  $('.copy-code').click(function () {
    let text = $(this).closest('.code-controls').prevAll('pre:has(code)')[0].innerText;
    let text = $(this)
      .closest('.code-controls')
      .prevAll('pre:has(code)')[0].innerText;

    const copyContent = async () => {
      try {
@ -90,7 +92,10 @@ Disable scrolling on the body.
Disable user selection on everything but the fullscreen codeblock.
*/
  $('.fullscreen-toggle').click(function () {
    var code = $(this).closest('.code-controls').prevAll('pre:has(code)').clone();
    var code = $(this)
      .closest('.code-controls')
      .prevAll('pre:has(code)')
      .clone();

    $('#fullscreen-code-placeholder').replaceWith(code[0]);
    $('body').css('overflow', 'hidden');
@ -0,0 +1,78 @@
// Memoize the mermaid module import
let mermaidPromise = null;

export default function Diagram({ component }) {
  // Import mermaid.js module (memoized)
  if (!mermaidPromise) {
    mermaidPromise = import('mermaid');
  }
  mermaidPromise
    .then(({ default: mermaid }) => {
      // Configure mermaid with InfluxData theming
      mermaid.initialize({
        startOnLoad: false, // We'll manually call run()
        theme: document.body.classList.contains('dark-theme')
          ? 'dark'
          : 'default',
        themeVariables: {
          fontFamily: 'Proxima Nova',
          fontSize: '16px',
          lineColor: '#22ADF6',
          primaryColor: '#22ADF6',
          primaryTextColor: '#545454',
          secondaryColor: '#05CE78',
          tertiaryColor: '#f4f5f5',
        },
        securityLevel: 'loose', // Required for interactive diagrams
        logLevel: 'error',
      });

      // Process the specific diagram component
      try {
        mermaid.run({ nodes: [component] });
      } catch (error) {
        console.error('Mermaid diagram rendering error:', error);
      }

      // Store reference to mermaid for theme switching
      if (!window.mermaidInstances) {
        window.mermaidInstances = new Map();
      }
      window.mermaidInstances.set(component, mermaid);
    })
    .catch((error) => {
      console.error('Failed to load Mermaid library:', error);
    });

  // Listen for theme changes to refresh diagrams
  const observer = new MutationObserver((mutations) => {
    mutations.forEach((mutation) => {
      if (
        mutation.attributeName === 'class' &&
        document.body.classList.contains('dark-theme') !== window.isDarkTheme
      ) {
        window.isDarkTheme = document.body.classList.contains('dark-theme');

        // Reload this specific diagram with new theme
        if (window.mermaidInstances?.has(component)) {
          const mermaid = window.mermaidInstances.get(component);
          mermaid.initialize({
            theme: window.isDarkTheme ? 'dark' : 'default',
          });
          mermaid.run({ nodes: [component] });
        }
      }
    });
  });

  // Watch for theme changes on body element
  observer.observe(document.body, { attributes: true });

  // Return cleanup function to be called when component is destroyed
  return () => {
    observer.disconnect();
    if (window.mermaidInstances?.has(component)) {
      window.mermaidInstances.delete(component);
    }
  };
}
@ -0,0 +1,180 @@
/**
 * DocSearch component for InfluxData documentation
 * Handles asynchronous loading and initialization of Algolia DocSearch
 */
const debug = false; // Set to true for debugging output

export default function DocSearch({ component }) {
  // Store configuration from component data attributes
  const config = {
    apiKey: component.getAttribute('data-api-key'),
    appId: component.getAttribute('data-app-id'),
    indexName: component.getAttribute('data-index-name'),
    inputSelector: component.getAttribute('data-input-selector'),
    searchTag: component.getAttribute('data-search-tag'),
    includeFlux: component.getAttribute('data-include-flux') === 'true',
    includeResources:
      component.getAttribute('data-include-resources') === 'true',
    debug: component.getAttribute('data-debug') === 'true',
  };

  // Initialize global object to track DocSearch state
  window.InfluxDocs = window.InfluxDocs || {};
  window.InfluxDocs.search = {
    initialized: false,
    options: config,
  };

  // Load DocSearch asynchronously
  function loadDocSearch() {
    if (debug) {
      console.log('Loading DocSearch script...');
    }
    const script = document.createElement('script');
    script.src =
      'https://cdn.jsdelivr.net/npm/docsearch.js@2/dist/cdn/docsearch.min.js';
    script.async = true;
    script.onload = initializeDocSearch;
    document.body.appendChild(script);
  }

  // Initialize DocSearch after script loads
  function initializeDocSearch() {
    if (debug) {
      console.log('Initializing DocSearch...');
    }
    const multiVersion = ['influxdb'];

    // Use object-based lookups instead of conditionals for version and product names
    // These can be replaced with data from productData in the future

    // Version display name mappings
    const versionDisplayNames = {
      cloud: 'Cloud (TSM)',
      core: 'Core',
      enterprise: 'Enterprise',
      'cloud-serverless': 'Cloud Serverless',
      'cloud-dedicated': 'Cloud Dedicated',
      clustered: 'Clustered',
      explorer: 'Explorer',
    };

    // Product display name mappings
    const productDisplayNames = {
      influxdb: 'InfluxDB',
      influxdb3: 'InfluxDB 3',
      explorer: 'InfluxDB 3 Explorer',
      enterprise_influxdb: 'InfluxDB Enterprise',
      flux: 'Flux',
      telegraf: 'Telegraf',
      chronograf: 'Chronograf',
      kapacitor: 'Kapacitor',
      platform: 'InfluxData Platform',
      resources: 'Additional Resources',
    };

    // Initialize DocSearch with configuration
    window.docsearch({
      apiKey: config.apiKey,
      appId: config.appId,
      indexName: config.indexName,
      inputSelector: config.inputSelector,
      debug: config.debug,
      transformData: function (hits) {
        // Format version using object lookup instead of if-else chain
        function fmtVersion(version, productKey) {
          if (version == null) {
            return '';
          } else if (versionDisplayNames[version]) {
            return versionDisplayNames[version];
          } else if (multiVersion.includes(productKey)) {
            return version;
          } else {
            return '';
          }
        }

        hits.map((hit) => {
          const pathData = new URL(hit.url).pathname
            .split('/')
            .filter((n) => n);
          const product = productDisplayNames[pathData[0]] || pathData[0];
          const version = fmtVersion(pathData[1], pathData[0]);

          hit.product = product;
          hit.version = version;
          hit.hierarchy.lvl0 =
            hit.hierarchy.lvl0 +
            ` <span class=\"search-product-version\">${product} ${version}</span>`;
          hit._highlightResult.hierarchy.lvl0.value =
            hit._highlightResult.hierarchy.lvl0.value +
            ` <span class=\"search-product-version\">${product} ${version}</span>`;
        });
        return hits;
      },
      algoliaOptions: {
        hitsPerPage: 10,
        facetFilters: buildFacetFilters(config),
      },
      autocompleteOptions: {
        templates: {
          header:
            '<div class="search-all-content"><a href="https:\/\/support.influxdata.com" target="_blank">Search all InfluxData content <span class="icon-arrow-up-right"></span></a>',
          empty:
            '<div class="search-no-results"><p>Not finding what you\'re looking for?</p> <a href="https:\/\/support.influxdata.com" target="_blank">Search all InfluxData content <span class="icon-arrow-up-right"></span></a></div>',
        },
      },
    });

    // Mark DocSearch as initialized
    window.InfluxDocs.search.initialized = true;

    // Dispatch event for other components to know DocSearch is ready
    window.dispatchEvent(new CustomEvent('docsearch-initialized'));
  }

  /**
   * Helper function to build facet filters based on config
   * - Uses nested arrays for AND conditions
   * - Includes space after colon in filter expressions
   */
  function buildFacetFilters(config) {
    if (!config.searchTag) {
      return ['latest:true'];
    } else if (config.includeFlux) {
      // Return a nested array to match original template structure
      // Note the space after each colon
      return [
        [
          'searchTag: ' + config.searchTag,
          'flux:true',
          'resources: ' + config.includeResources,
        ],
      ];
    } else {
      // Return a nested array to match original template structure
      // Note the space after each colon
      return [
        [
          'searchTag: ' + config.searchTag,
          'resources: ' + config.includeResources,
        ],
      ];
    }
  }

  // Load DocSearch when page is idle or after a slight delay
  if ('requestIdleCallback' in window) {
    requestIdleCallback(loadDocSearch);
  } else {
    setTimeout(loadDocSearch, 500);
  }

  // Return cleanup function
  return function cleanup() {
    // Clean up any event listeners if needed
    if (debug) {
      console.log('DocSearch component cleanup');
    }
  };
}
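For reference, the filter shapes `buildFacetFilters` produces for different configs look like this; the sketch below restates the same logic standalone so the outputs can be seen directly:

```javascript
// Standalone restatement of the buildFacetFilters logic above.
// Nested arrays express AND conditions; note the space after "searchTag:".
function buildFacetFilters(config) {
  if (!config.searchTag) {
    return ['latest:true'];
  } else if (config.includeFlux) {
    return [
      [
        'searchTag: ' + config.searchTag,
        'flux:true',
        'resources: ' + config.includeResources,
      ],
    ];
  } else {
    return [
      ['searchTag: ' + config.searchTag, 'resources: ' + config.includeResources],
    ];
  }
}

const noTag = buildFacetFilters({ searchTag: '' });
const withFlux = buildFacetFilters({
  searchTag: 'v2',
  includeFlux: true,
  includeResources: false,
});
```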
@ -0,0 +1,6 @@
import SearchInteractions from '../utils/search-interactions.js';

export default function SidebarSearch({ component }) {
  const searchInput = component.querySelector('.sidebar--search-field');
  SearchInteractions({ searchInput });
}
@ -1,7 +1,7 @@
import $ from 'jquery';
import { Datepicker } from 'vanillajs-datepicker';
import { toggleModal } from './modals.js';
import * as localStorage from './local-storage.js';
import * as localStorage from './services/local-storage.js';

// Placeholder start date used in InfluxDB custom timestamps
const defaultStartDate = '2022-01-01';
@ -53,8 +53,8 @@ function timeToUnixSeconds(time) {
  return unixSeconds;
}

// Default time values in getting started sample data
const defaultTimes = [
// Default time values in getting started sample data
const defaultTimes = [
  {
    rfc3339: `${defaultStartDate}T08:00:00Z`,
    unix: timeToUnixSeconds(`${defaultStartDate}T08:00:00Z`),
@ -107,11 +107,11 @@ function timeToUnixSeconds(time) {
    rfc3339: `${defaultStartDate}T20:00:00Z`,
    unix: timeToUnixSeconds(`${defaultStartDate}T20:00:00Z`),
  }, // 1641067200
];
];

function updateTimestamps (newStartDate, seedTimes=defaultTimes) {
function updateTimestamps(newStartDate, seedTimes = defaultTimes) {
  // Update the times array with replacement times
  const times = seedTimes.map(x => {
  const times = seedTimes.map((x) => {
    var newStartTimestamp = x.rfc3339.replace(/^.*T/, newStartDate + 'T');

    return {
@ -178,7 +178,7 @@ function updateTimestamps (newStartDate, seedTimes=defaultTimes) {
/////////////////////// MODAL INTERACTIONS / DATE PICKER ///////////////////////

function CustomTimeTrigger({component}) {
function CustomTimeTrigger({ component }) {
  const $component = $(component);
  $component
    .find('a[data-action="open"]:first')
@ -1,30 +1,54 @@
const monthNames = ["January", "February", "March", "April", "May", "June", "July", "August", "September", "October", "November", "December"];
var date = new Date()
var currentTimestamp = date.toISOString().replace(/^(.*)(\.\d+)(Z)/, '$1$3') // 2023-01-01T12:34:56Z
var currentTime = date.toISOString().replace(/(^.*T)(.*)(Z)/, '$2') + '084216' // 12:34:56.000084216
import $ from 'jquery';

function currentDate(offset=0, trimTime=false) {
  outputDate = new Date(date)
  outputDate.setDate(outputDate.getDate() + offset)
var date = new Date();
var currentTimestamp = date.toISOString().replace(/^(.*)(\.\d+)(Z)/, '$1$3'); // 2023-01-01T12:34:56Z

// Microsecond offset appended to the current time string for formatting purposes
const MICROSECOND_OFFSET = '084216';

var currentTime =
  date.toISOString().replace(/(^.*T)(.*)(Z)/, '$2') + MICROSECOND_OFFSET; // 12:34:56.000084216
function currentDate(offset = 0, trimTime = false) {
  let outputDate = new Date(date);
  outputDate.setDate(outputDate.getDate() + offset);

  if (trimTime) {
    return outputDate.toISOString().replace(/T.*$/, '') // 2023-01-01
    return outputDate.toISOString().replace(/T.*$/, ''); // 2023-01-01
  } else {
    return outputDate.toISOString().replace(/T.*$/, 'T00:00:00Z') // 2023-01-01T00:00:00Z
    return outputDate.toISOString().replace(/T.*$/, 'T00:00:00Z'); // 2023-01-01T00:00:00Z
  }
}

function enterpriseEOLDate() {
  var inTwoYears = date.setFullYear(date.getFullYear() + 2)
  earliestEOL = new Date(inTwoYears)
  return `${monthNames[earliestEOL.getMonth()]} ${earliestEOL.getDate()}, ${earliestEOL.getFullYear()}`
  const monthNames = [
    'January',
    'February',
    'March',
    'April',
    'May',
    'June',
    'July',
    'August',
    'September',
    'October',
    'November',
    'December',
  ];
  var inTwoYears = new Date(date);
  inTwoYears.setFullYear(inTwoYears.getFullYear() + 2);
  let earliestEOL = new Date(inTwoYears);
  return `${monthNames[earliestEOL.getMonth()]} ${earliestEOL.getDate()}, ${earliestEOL.getFullYear()}`;
}

$('span.current-timestamp').text(currentTimestamp)
$('span.current-time').text(currentTime)
$('span.enterprise-eol-date').text(enterpriseEOLDate)
$('span.current-date').each(function() {
  var dayOffset = parseInt($(this).attr("offset"))
  var trimTime = $(this).attr("trim-time") === "true"
  $(this).text(currentDate(dayOffset, trimTime))
})
function initialize() {
  $('span.current-timestamp').text(currentTimestamp);
  $('span.current-time').text(currentTime);
  $('span.enterprise-eol-date').text(enterpriseEOLDate());
  $('span.current-date').each(function () {
    var dayOffset = parseInt($(this).attr('offset'));
    var trimTime = $(this).attr('trim-time') === 'true';
    $(this).text(currentDate(dayOffset, trimTime));
  });
}

export { initialize };
@ -2,37 +2,24 @@
This feature is designed to callout new features added to the documentation
CSS is required for the callout bubble to determine look and position, but the
element must have the `callout` class and a unique id.
Callouts are treated as notifications and use the notification cookie API in
assets/js/cookies.js.
Callouts are treated as notifications and use the LocalStorage notification API.
*/

import $ from 'jquery';
import * as LocalStorageAPI from './services/local-storage.js';

// Get notification ID
function getCalloutID (el) {
function getCalloutID(el) {
  return $(el).attr('id');
}

// Hide a callout and update the cookie with the viewed callout
function hideCallout (calloutID) {
  if (!window.LocalStorageAPI.notificationIsRead(calloutID)) {
    window.LocalStorageAPI.setNotificationAsRead(calloutID, 'callout');
    $(`#${calloutID}`).fadeOut(200);
  }
}

// Show the url feature callouts on page load
$(document).ready(function () {
  $('.feature-callout').each(function () {
    const calloutID = getCalloutID($(this));
export default function FeatureCallout({ component }) {
  const calloutID = getCalloutID($(component));

  if (!window.LocalStorageAPI.notificationIsRead(calloutID, 'callout')) {
  if (!LocalStorageAPI.notificationIsRead(calloutID, 'callout')) {
    $(`#${calloutID}.feature-callout`)
      .fadeIn(300)
      .removeClass('start-position');
  }
  });
});

  // Hide the InfluxDB URL selector callout
  // $('button.url-trigger, #influxdb-url-selector .close').click(function () {
  //   hideCallout('influxdb-url-selector');
  // });
}
@@ -1,49 +1,148 @@
-var tablesElement = $("#flux-group-keys-demo #grouped-tables")
+import $ from 'jquery';

 // Sample data
 let data = [
   [
-    { _time: "2021-01-01T00:00:00Z", _measurement: "example", loc: "rm1", sensorID: "A123", _field: "temp", _value: 110.3 },
-    { _time: "2021-01-01T00:01:00Z", _measurement: "example", loc: "rm1", sensorID: "A123", _field: "temp", _value: 112.5 },
-    { _time: "2021-01-01T00:02:00Z", _measurement: "example", loc: "rm1", sensorID: "A123", _field: "temp", _value: 111.9 }
+    {
+      _time: '2021-01-01T00:00:00Z',
+      _measurement: 'example',
+      loc: 'rm1',
+      sensorID: 'A123',
+      _field: 'temp',
+      _value: 110.3,
+    },
+    {
+      _time: '2021-01-01T00:01:00Z',
+      _measurement: 'example',
+      loc: 'rm1',
+      sensorID: 'A123',
+      _field: 'temp',
+      _value: 112.5,
+    },
+    {
+      _time: '2021-01-01T00:02:00Z',
+      _measurement: 'example',
+      loc: 'rm1',
+      sensorID: 'A123',
+      _field: 'temp',
+      _value: 111.9,
+    },
   ],
   [
-    { _time: "2021-01-01T00:00:00Z", _measurement: "example", loc: "rm1", sensorID: "A123", _field: "hum", _value: 73.4 },
-    { _time: "2021-01-01T00:01:00Z", _measurement: "example", loc: "rm1", sensorID: "A123", _field: "hum", _value: 73.7 },
-    { _time: "2021-01-01T00:02:00Z", _measurement: "example", loc: "rm1", sensorID: "A123", _field: "hum", _value: 75.1 }
+    {
+      _time: '2021-01-01T00:00:00Z',
+      _measurement: 'example',
+      loc: 'rm1',
+      sensorID: 'A123',
+      _field: 'hum',
+      _value: 73.4,
+    },
+    {
+      _time: '2021-01-01T00:01:00Z',
+      _measurement: 'example',
+      loc: 'rm1',
+      sensorID: 'A123',
+      _field: 'hum',
+      _value: 73.7,
+    },
+    {
+      _time: '2021-01-01T00:02:00Z',
+      _measurement: 'example',
+      loc: 'rm1',
+      sensorID: 'A123',
+      _field: 'hum',
+      _value: 75.1,
+    },
   ],
   [
-    { _time: "2021-01-01T00:00:00Z", _measurement: "example", loc: "rm2", sensorID: "B456", _field: "temp", _value: 108.2 },
-    { _time: "2021-01-01T00:01:00Z", _measurement: "example", loc: "rm2", sensorID: "B456", _field: "temp", _value: 108.5 },
-    { _time: "2021-01-01T00:02:00Z", _measurement: "example", loc: "rm2", sensorID: "B456", _field: "temp", _value: 109.6 }
+    {
+      _time: '2021-01-01T00:00:00Z',
+      _measurement: 'example',
+      loc: 'rm2',
+      sensorID: 'B456',
+      _field: 'temp',
+      _value: 108.2,
+    },
+    {
+      _time: '2021-01-01T00:01:00Z',
+      _measurement: 'example',
+      loc: 'rm2',
+      sensorID: 'B456',
+      _field: 'temp',
+      _value: 108.5,
+    },
+    {
+      _time: '2021-01-01T00:02:00Z',
+      _measurement: 'example',
+      loc: 'rm2',
+      sensorID: 'B456',
+      _field: 'temp',
+      _value: 109.6,
+    },
   ],
   [
-    { _time: "2021-01-01T00:00:00Z", _measurement: "example", loc: "rm2", sensorID: "B456", _field: "hum", _value: 71.8 },
-    { _time: "2021-01-01T00:01:00Z", _measurement: "example", loc: "rm2", sensorID: "B456", _field: "hum", _value: 72.3 },
-    { _time: "2021-01-01T00:02:00Z", _measurement: "example", loc: "rm2", sensorID: "B456", _field: "hum", _value: 72.1 }
-  ]
-]
+    {
+      _time: '2021-01-01T00:00:00Z',
+      _measurement: 'example',
+      loc: 'rm2',
+      sensorID: 'B456',
+      _field: 'hum',
+      _value: 71.8,
+    },
+    {
+      _time: '2021-01-01T00:01:00Z',
+      _measurement: 'example',
+      loc: 'rm2',
+      sensorID: 'B456',
+      _field: 'hum',
+      _value: 72.3,
+    },
+    {
+      _time: '2021-01-01T00:02:00Z',
+      _measurement: 'example',
+      loc: 'rm2',
+      sensorID: 'B456',
+      _field: 'hum',
+      _value: 72.1,
+    },
+  ],
+];

 // Default group key
-let groupKey = ["_measurement", "loc", "sensorID", "_field"]
+let groupKey = ['_measurement', 'loc', 'sensorID', '_field'];

+export default function FluxGroupKeysDemo({ component }) {
+  $('.column-list label').click(function () {
+    toggleCheckbox($(this));
+    groupKey = getChecked(component);
+    groupData();
+    buildGroupExample(component);
+  });
+
+  // Group and render tables on load
+  groupData();
+}
+
 // Build a table group (group key and table) using an array of objects
 function buildTable(inputData) {
   // Build the group key string
   function wrapString(column, value) {
-    var stringColumns = ["_measurement", "loc", "sensorID", "_field"]
+    var stringColumns = ['_measurement', 'loc', 'sensorID', '_field'];
     if (stringColumns.includes(column)) {
-      return '"' + value + '"'
+      return '"' + value + '"';
     } else {
-      return value
+      return value;
     }
   }
-  var groupKeyString = "Group key instance = [" + (groupKey.map(column => column + ": " + wrapString(column, (inputData[0])[column])) ).join(", ") + "]";
-  var groupKeyLabel = document.createElement("p");
-  groupKeyLabel.className = "table-group-key"
-  groupKeyLabel.innerHTML = groupKeyString
+  var groupKeyString =
+    'Group key instance = [' +
+    groupKey
+      .map((column) => column + ': ' + wrapString(column, inputData[0][column]))
+      .join(', ') +
+    ']';
+  var groupKeyLabel = document.createElement('p');
+  groupKeyLabel.className = 'table-group-key';
+  groupKeyLabel.innerHTML = groupKeyString;

   // Extract column headers
   var columns = [];
@@ -56,54 +155,55 @@ function buildTable(inputData) {
   }

   // Create the table element
-  var table = document.createElement("table");
+  const table = document.createElement('table');

   // Create the table header
   for (let i = 0; i < columns.length; i++) {
     var header = table.createTHead();
-    var th = document.createElement("th");
+    var th = document.createElement('th');
     th.innerHTML = columns[i];
     if (groupKey.includes(columns[i])) {
-      th.className = "grouped-by";
+      th.className = 'grouped-by';
     }
     header.appendChild(th);
   }

   // Add inputData to the HTML table
   for (let i = 0; i < inputData.length; i++) {
-    tr = table.insertRow(-1);
+    let tr = table.insertRow(-1);
     for (let j = 0; j < columns.length; j++) {
       var td = tr.insertCell(-1);
       td.innerHTML = inputData[i][columns[j]];
       // Highlight the value if column is part of the group key
       if (groupKey.includes(columns[j])) {
-        td.className = "grouped-by";
+        td.className = 'grouped-by';
       }
     }
   }

   // Create a table group with group key and table
-  var tableGroup = document.createElement("div");
-  tableGroup.innerHTML += groupKeyLabel.outerHTML + table.outerHTML
+  var tableGroup = document.createElement('div');
+  tableGroup.innerHTML += groupKeyLabel.outerHTML + table.outerHTML;

-  return tableGroup
+  return tableGroup;
 }

 // Clear and rebuild all HTML tables
 function buildTables(data) {
-  existingTables = tablesElement[0]
+  let tablesElement = $('#flux-group-keys-demo #grouped-tables');
+  let existingTables = tablesElement[0];
   while (existingTables.firstChild) {
     existingTables.removeChild(existingTables.firstChild);
   }
   for (let i = 0; i < data.length; i++) {
-    var table = buildTable(data[i])
+    var table = buildTable(data[i]);
     tablesElement.append(table);
   }
 }

 // Group data based on the group key and output new tables
 function groupData() {
-  let groupedData = data.flat()
+  let groupedData = data.flat();

   function groupBy(array, f) {
     var groups = {};
@@ -114,20 +214,19 @@ function groupData() {
     });
     return Object.keys(groups).map(function (group) {
       return groups[group];
-    })
+    });
   }

   groupedData = groupBy(groupedData, function (r) {
-    return groupKey.map(v => r[v]);
+    return groupKey.map((v) => r[v]);
   });

   buildTables(groupedData);
 }

-// Get selected column names
-var checkboxes = $("input[type=checkbox]");
-
-function getChecked() {
+function getChecked(component) {
+  // Get selected column names
+  var checkboxes = $(component).find('input[type=checkbox]');
   var checked = [];
   for (var i = 0; i < checkboxes.length; i++) {
     var checkbox = checkboxes[i];
@@ -141,17 +240,12 @@ function toggleCheckbox(element) {
 }

 // Build example group function
-function buildGroupExample() {
-  var columnCollection = getChecked().map(i => '<span class=\"s2\">"' + i + '"</span>').join(", ")
-  $("pre#group-by-example")[0].innerHTML = "data\n <span class='nx'>|></span> group(columns<span class='nx'>:</span> [" + columnCollection + "])";
+function buildGroupExample(component) {
+  var columnCollection = getChecked(component)
+    .map((i) => '<span class=\"s2\">"' + i + '"</span>')
+    .join(', ');
+  $('pre#group-by-example')[0].innerHTML =
+    "data\n <span class='nx'>|></span> group(columns<span class='nx'>:</span> [" +
+    columnCollection +
+    '])';
 }
-
-$(".column-list label").click(function () {
-  toggleCheckbox($(this))
-  groupKey = getChecked();
-  groupData();
-  buildGroupExample();
-});
-
-// Group and render tables on load
-groupData()
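The flux-group-keys demo above flattens the sample series into rows and regroups them with `groupBy(array, f)`, where `f` returns the row's group-key values. The middle of `groupBy` is elided by the hunk boundary; the sketch below is a standalone version of that grouping step, assuming (not confirmed by the diff) that buckets are keyed on the stringified group-key values:

```javascript
// Standalone sketch of the demo's grouping step. The JSON.stringify bucket
// key is an assumption -- the diff elides the middle of groupBy -- but the
// visible input shape and return statement match.
function groupBy(array, f) {
  var groups = {};
  array.forEach(function (row) {
    var group = JSON.stringify(f(row));
    groups[group] = groups[group] || [];
    groups[group].push(row);
  });
  return Object.keys(groups).map(function (group) {
    return groups[group];
  });
}

// Flattened rows regroup into one table per unique combination of
// group-key column values (here: loc and _field).
var groupKey = ['loc', '_field'];
var rows = [
  { loc: 'rm1', _field: 'temp', _value: 110.3 },
  { loc: 'rm1', _field: 'temp', _value: 112.5 },
  { loc: 'rm2', _field: 'temp', _value: 108.2 },
];
var tables = groupBy(rows, function (r) {
  return groupKey.map(function (v) {
    return r[v];
  });
});
```

With this input, the two `rm1` rows land in one table and the `rm2` row in another, which is exactly what the rendered table groups show.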
@@ -1,22 +0,0 @@
-$('.exp-btn').click(function() {
-  var targetBtnElement = $(this).parent()
-  $('.exp-btn > p', targetBtnElement).fadeOut(100);
-  setTimeout(function() {
-    $('.exp-btn-links', targetBtnElement).fadeIn(200)
-    $('.exp-btn', targetBtnElement).addClass('open');
-    $('.close-btn', targetBtnElement).fadeIn(200);
-  }, 100);
-})
-
-$('.close-btn').click(function() {
-  var targetBtnElement = $(this).parent().parent()
-  $('.exp-btn-links', targetBtnElement).fadeOut(100)
-  $('.exp-btn', targetBtnElement).removeClass('open');
-  $(this).fadeOut(100);
-  setTimeout(function() {
-    $('p', targetBtnElement).fadeIn(100);
-  }, 100);
-})
-
-/////////////////////////////// EXPANDING BUTTONS //////////////////////////////
@@ -1 +0,0 @@
-export * from './main.js';
@@ -3,7 +3,6 @@
 ///////////////////////// INFLUXDB URL PREFERENCE /////////////////////////////
 ////////////////////////////////////////////////////////////////////////////////
 */
-import * as pageParams from '@params';
 import {
   DEFAULT_STORAGE_URLS,
   getPreference,
@@ -12,15 +11,18 @@ import {
   removeInfluxDBUrl,
   getInfluxDBUrl,
   getInfluxDBUrls,
-} from './local-storage.js';
+} from './services/local-storage.js';
 import $ from 'jquery';
 import { context as PRODUCT_CONTEXT, referrerHost } from './page-context.js';
+import { influxdbUrls } from './services/influxdb-urls.js';
 import { delay } from './helpers.js';
 import { toggleModal } from './modals.js';

 let CLOUD_URLS = [];
-if (pageParams && pageParams.influxdb_urls) {
-  CLOUD_URLS = Object.values(pageParams.influxdb_urls.cloud.providers).flatMap((provider) => provider.regions?.map((region) => region.url));
+if (influxdbUrls?.cloud) {
+  CLOUD_URLS = Object.values(influxdbUrls.cloud.providers).flatMap((provider) =>
+    provider.regions?.map((region) => region.url)
+  );
 }
 export { CLOUD_URLS };
@@ -120,9 +122,10 @@ export function InfluxDBUrl() {
   // Retrieve the currently selected URLs from the urls local storage object.
   function getUrls() {
-    const { cloud, oss, core, enterprise, serverless, dedicated, clustered } = getInfluxDBUrls();
+    const { cloud, oss, core, enterprise, serverless, dedicated, clustered } =
+      getInfluxDBUrls();
     return { oss, cloud, core, enterprise, serverless, dedicated, clustered };
   }

   // Retrieve the previously selected URLs from the from the urls local storage object.
   // This is used to update URLs whenever you switch between browser tabs.
@@ -289,7 +292,8 @@ export function InfluxDBUrl() {
   }

   // Append the URL selector button to each codeblock containing a placeholder URL
-  function appendUrlSelector(urls={
+  function appendUrlSelector(
+    urls = {
       cloud: '',
       oss: '',
       core: '',
@@ -297,7 +301,8 @@ export function InfluxDBUrl() {
       serverless: '',
       dedicated: '',
       clustered: '',
-  }) {
+    }
+  ) {
     const appendToUrls = Object.values(urls);

     const getBtnText = (context) => {
@@ -330,20 +335,32 @@ export function InfluxDBUrl() {
   });
 }

 ////////////////////////////////////////////////////////////////////////////
 ////////////////// Initialize InfluxDB URL interactions ////////////////////
 ////////////////////////////////////////////////////////////////////////////

 // Add the preserve tag to code blocks that shouldn't be updated
 addPreserve();
-const { cloud, oss, core, enterprise, serverless, dedicated, clustered } = DEFAULT_STORAGE_URLS;
+const { cloud, oss, core, enterprise, serverless, dedicated, clustered } =
+  DEFAULT_STORAGE_URLS;

 // Append URL selector buttons to code blocks
-appendUrlSelector({ cloud, oss, core, enterprise, serverless, dedicated, clustered });
+appendUrlSelector({
+  cloud,
+  oss,
+  core,
+  enterprise,
+  serverless,
+  dedicated,
+  clustered,
+});

 // Update URLs on load
-updateUrls({ cloud, oss, core, enterprise, serverless, dedicated, clustered }, getUrls());
+updateUrls(
+  { cloud, oss, core, enterprise, serverless, dedicated, clustered },
+  getUrls()
+);

 // Set active radio button on page load
 setRadioButtons(getUrls());
@@ -1,41 +1,58 @@
-// Dynamically update keybindings or hotkeys
-function getPlatform() {
-  if (/Mac/.test(navigator.platform)) {
-    return "osx"
-  } else if (/Win/.test(navigator.platform)) {
-    return "win"
-  } else if (/Linux/.test(navigator.platform)) {
-    return "linux"
-  } else {
-    return "other"
-  }
-}
+import { getPlatform } from './utils/user-agent-platform.js';
+import $ from 'jquery';

-const platform = getPlatform()
+/**
+ * Adds OS-specific class to component
+ * @param {string} osClass - OS-specific class to add
+ * @param {Object} options - Component options
+ * @param {jQuery} options.$component - jQuery element reference
+ */
+function addOSClass(osClass, { $component }) {
+  $component.addClass(osClass);
+}

-function addOSClass(osClass) {
-  $('.keybinding').addClass(osClass)
-}
+/**
+ * Updates keybinding display based on detected platform
+ * @param {Object} options - Component options
+ * @param {jQuery} options.$component - jQuery element reference
+ * @param {string} options.platform - Detected platform
+ */
+function updateKeyBindings({ $component, platform }) {
+  const osx = $component.data('osx');
+  const linux = $component.data('linux');
+  const win = $component.data('win');

-function updateKeyBindings() {
-  $('.keybinding').each(function() {
-    var osx = $(this).data("osx")
-    var linux = $(this).data("linux")
-    var win = $(this).data("win")
+  let keybind;

-    if (platform === "other") {
-      if (win != linux) {
-        var keybind = '<code class="osx">' + osx + '</code> for macOS, <code>' + linux + '</code> for Linux, and <code>' + win + '</code> for Windows';
+  if (platform === 'other') {
+    if (win !== linux) {
+      keybind =
+        `<code class="osx">${osx}</code> for macOS, ` +
+        `<code>${linux}</code> for Linux, ` +
+        `and <code>${win}</code> for Windows`;
     } else {
-        var keybind = '<code>' + linux + '</code> for Linux and Windows and <code class="osx">' + osx + '</code> for macOS';
+      keybind =
+        `<code>${linux}</code> for Linux and Windows and ` +
+        `<code class="osx">${osx}</code> for macOS`;
     }
   } else {
-      var keybind = '<code>' + $(this).data(platform) + '</code>'
+    keybind = `<code>${$component.data(platform)}</code>`;
   }

-    $(this).html(keybind)
-  })
+  $component.html(keybind);
 }

-addOSClass(platform)
-updateKeyBindings()
+/**
+ * Initialize and render platform-specific keybindings
+ * @param {Object} options - Component options
+ * @param {HTMLElement} options.component - DOM element
+ * @returns {void}
+ */
+export default function KeyBinding({ component }) {
+  // Initialize keybindings
+  const platform = getPlatform();
+  const $component = $(component);

+  addOSClass(platform, { $component });
+  updateKeyBindings({ $component, platform });
+}
@@ -1,11 +1,15 @@
 import $ from 'jquery';

 // Count tag elements
 function countTag(tag) {
-  return $(".visible[data-tags*='" + tag + "']").length
+  return $(".visible[data-tags*='" + tag + "']").length;
 }

-function getFilterCounts() {
-  $('#list-filters label').each(function() {
-    var tagName = $('input', this).attr('name').replace(/[\W/]+/, "-");
+function getFilterCounts($labels) {
+  $labels.each(function () {
+    var tagName = $('input', this)
+      .attr('name')
+      .replace(/[\W/]+/, '-');
     var tagCount = countTag(tagName);
     $(this).attr('data-count', '(' + tagCount + ')');
     if (tagCount <= 0) {
@@ -13,38 +17,58 @@ function getFilterCounts() {
     } else {
       $(this).fadeTo(400, 1.0);
     }
-  })
+  });
 }

-// Get initial filter count on page load
-getFilterCounts()
+/** TODO: Include the data source value in the as an additional attribute
+ * in the HTML and pass it into the component, which would let us use selectors
+ * for only the source items and let us have more than one
+ * list filter component per page without conflicts */
+export default function ListFilters({ component }) {
+  const $component = $(component);
+  const $labels = $component.find('label');
+  const $inputs = $component.find('input');

-$("#list-filters input").click(function() {
+  getFilterCounts($labels);

+  $inputs.click(function () {
     // List of tags to hide
-  var tagArray = $("#list-filters input:checkbox:checked").map(function(){
-    return $(this).attr('name').replace(/[\W]+/, "-");
-  }).get();
+    var tagArray = $component
+      .find('input:checkbox:checked')
+      .map(function () {
+        return $(this).attr('name').replace(/[\W]+/, '-');
+      })
+      .get();

     // List of tags to restore
-  var restoreArray = $("#list-filters input:checkbox:not(:checked)").map(function(){
-    return $(this).attr('name').replace(/[\W]+/, "-");
-  }).get();
+    var restoreArray = $component
+      .find('input:checkbox:not(:checked)')
+      .map(function () {
+        return $(this).attr('name').replace(/[\W]+/, '-');
+      })
+      .get();

     // Actions for filter select
-  if ( $(this).is(':checked') ) {
-    $.each( tagArray, function( index, value ) {
-      $(".filter-item.visible:not([data-tags~='" + value + "'])").removeClass('visible').fadeOut()
-    })
+    if ($(this).is(':checked')) {
+      $.each(tagArray, function (index, value) {
+        $(".filter-item.visible:not([data-tags~='" + value + "'])")
+          .removeClass('visible')
+          .fadeOut();
+      });
     } else {
-    $.each( restoreArray, function( index, value ) {
-      $(".filter-item:not(.visible)[data-tags~='" + value + "']").addClass('visible').fadeIn()
-    })
-    $.each( tagArray, function( index, value ) {
-      $(".filter-item.visible:not([data-tags~='" + value + "'])").removeClass('visible').hide()
-    })
+      $.each(restoreArray, function (index, value) {
+        $(".filter-item:not(.visible)[data-tags~='" + value + "']")
+          .addClass('visible')
+          .fadeIn();
+      });
+      $.each(tagArray, function (index, value) {
+        $(".filter-item.visible:not([data-tags~='" + value + "'])")
+          .removeClass('visible')
+          .hide();
+      });
     }

     // Refresh filter count
-    getFilterCounts()
-});
+    getFilterCounts($labels);
+  });
+}
@@ -1,7 +1,7 @@
 // assets/js/main.js

-// If you need to pass parameters from the calling Hugo page, you can import them here like so:
-// import * as pageParams from '@params';
+// Import dependencies that we still need to load in the global scope
+import $ from 'jquery';

 /** Import modules that are not components.
  * TODO: Refactor these into single-purpose component modules.
@@ -9,9 +9,10 @@
 import * as apiLibs from './api-libs.js';
 import * as codeControls from './code-controls.js';
 import * as contentInteractions from './content-interactions.js';
+import * as datetime from './datetime.js';
 import { delay } from './helpers.js';
 import { InfluxDBUrl } from './influxdb-url.js';
-import * as localStorage from './local-storage.js';
+import * as localStorage from './services/local-storage.js';
 import * as modals from './modals.js';
 import * as notifications from './notifications.js';
 import * as pageContext from './page-context.js';
@@ -29,8 +30,17 @@ import * as v3Wayfinding from './v3-wayfinding.js';
 import AskAITrigger from './ask-ai-trigger.js';
 import CodePlaceholder from './code-placeholders.js';
 import { CustomTimeTrigger } from './custom-timestamps.js';
+import Diagram from './components/diagram.js';
+import DocSearch from './components/doc-search.js';
+import FeatureCallout from './feature-callouts.js';
+import FluxGroupKeysDemo from './flux-group-keys.js';
+import FluxInfluxDBVersionsTrigger from './flux-influxdb-versions.js';
+import KeyBinding from './keybindings.js';
+import ListFilters from './list-filters.js';
 import ProductSelector from './version-selector.js';
+import ReleaseToc from './release-toc.js';
 import { SearchButton } from './search-button.js';
+import SidebarSearch from './components/sidebar-search.js';
 import { SidebarToggle } from './sidebar-toggle.js';
 import Theme from './theme.js';
 import ThemeSwitch from './theme-switch.js';
@@ -49,11 +59,20 @@ const componentRegistry = {
   'ask-ai-trigger': AskAITrigger,
   'code-placeholder': CodePlaceholder,
   'custom-time-trigger': CustomTimeTrigger,
+  diagram: Diagram,
+  'doc-search': DocSearch,
+  'feature-callout': FeatureCallout,
+  'flux-group-keys-demo': FluxGroupKeysDemo,
+  'flux-influxdb-versions-trigger': FluxInfluxDBVersionsTrigger,
+  keybinding: KeyBinding,
+  'list-filters': ListFilters,
   'product-selector': ProductSelector,
+  'release-toc': ReleaseToc,
   'search-button': SearchButton,
+  'sidebar-search': SidebarSearch,
   'sidebar-toggle': SidebarToggle,
-  'theme': Theme,
-  'theme-switch': ThemeSwitch
+  theme: Theme,
+  'theme-switch': ThemeSwitch,
 };

 /**
@@ -72,6 +91,11 @@ function initGlobals() {
   window.influxdatadocs.toggleModal = modals.toggleModal;
   window.influxdatadocs.componentRegistry = componentRegistry;

+  // Re-export jQuery to global namespace for legacy scripts
+  if (typeof window.jQuery === 'undefined') {
+    window.jQuery = window.$ = $;
+  }

   return window.influxdatadocs;
 }
@@ -103,10 +127,13 @@ function initComponents(globals) {

       globals.instances[componentName].push({
         element: component,
-        instance
+        instance,
       });
     } catch (error) {
-      console.error(`Error initializing component "${componentName}":`, error);
+      console.error(
+        `Error initializing component "${componentName}":`,
+        error
+      );
     }
   } else {
     console.warn(`Unknown component: "${componentName}"`);
@@ -122,6 +149,7 @@ function initModules() {
   apiLibs.initialize();
   codeControls.initialize();
   contentInteractions.initialize();
+  datetime.initialize();
   InfluxDBUrl();
   notifications.initialize();
   pageFeedback.initialize();
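The main.js hunks above pair each `data-component` attribute value with an initializer from `componentRegistry`, record the created instance, and warn on unknown names. A minimal DOM-free sketch of that dispatch loop (the registry entries and element objects here are illustrative stand-ins, not the real modules):

```javascript
// Illustrative registry: data-component attribute value -> initializer.
// The real registry maps these names to imported component modules.
const componentRegistry = {
  'feature-callout': ({ component }) => ({ type: 'FeatureCallout', component }),
  keybinding: ({ component }) => ({ type: 'KeyBinding', component }),
};

// Mirror of the initComponents loop: look up each element's initializer,
// call it with the element, and collect { element, instance } records.
function initComponents(elements) {
  const instances = [];
  for (const element of elements) {
    const name = element.dataset.component;
    const init = componentRegistry[name];
    if (init) {
      instances.push({ element, instance: init({ component: element }) });
    } else {
      console.warn(`Unknown component: "${name}"`);
    }
  }
  return instances;
}

// Plain objects stand in for DOM elements so the sketch runs anywhere.
const elements = [
  { dataset: { component: 'feature-callout' } },
  { dataset: { component: 'keybinding' } },
];
const instances = initComponents(elements);
```

This is why converted scripts like feature-callouts.js and keybindings.js now export a default `({ component }) => …` function instead of running on import: registration in main.js replaces their old top-level `$(document).ready` side effects.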
@@ -1,34 +1,80 @@
 /** This module retrieves browser context information and site data for the
  * current page, version, and product.
  */
-import { products, influxdb_urls } from '@params';
-
-const safeProducts = products || {};
-const safeUrls = influxdb_urls || {};
+import { products } from './services/influxdata-products.js';
+import { influxdbUrls } from './services/influxdb-urls.js';

 function getCurrentProductData() {
   const path = window.location.pathname;
   const mappings = [
-    { pattern: /\/influxdb\/cloud\//, product: safeProducts.cloud, urls: safeUrls.influxdb_cloud },
-    { pattern: /\/influxdb3\/core/, product: safeProducts.influxdb3_core, urls: safeUrls.core },
-    { pattern: /\/influxdb3\/enterprise/, product: safeProducts.influxdb3_enterprise, urls: safeUrls.enterprise },
-    { pattern: /\/influxdb3\/cloud-serverless/, product: safeProducts.influxdb3_cloud_serverless, urls: safeUrls.cloud },
-    { pattern: /\/influxdb3\/cloud-dedicated/, product: safeProducts.influxdb3_cloud_dedicated, urls: safeUrls.dedicated },
-    { pattern: /\/influxdb3\/clustered/, product: safeProducts.influxdb3_clustered, urls: safeUrls.clustered },
-    { pattern: /\/enterprise_v1\//, product: safeProducts.enterprise_influxdb, urls: safeUrls.oss },
-    { pattern: /\/influxdb.*v1\//, product: safeProducts.influxdb, urls: safeUrls.oss },
-    { pattern: /\/influxdb.*v2\//, product: safeProducts.influxdb, urls: safeUrls.oss },
-    { pattern: /\/kapacitor\//, product: safeProducts.kapacitor, urls: safeUrls.oss },
-    { pattern: /\/telegraf\//, product: safeProducts.telegraf, urls: safeUrls.oss },
-    { pattern: /\/chronograf\//, product: safeProducts.chronograf, urls: safeUrls.oss },
-    { pattern: /\/flux\//, product: safeProducts.flux, urls: safeUrls.oss },
+    {
+      pattern: /\/influxdb\/cloud\//,
+      product: products.cloud,
+      urls: influxdbUrls.influxdb_cloud,
+    },
+    {
+      pattern: /\/influxdb3\/core/,
+      product: products.influxdb3_core,
+      urls: influxdbUrls.core,
+    },
+    {
+      pattern: /\/influxdb3\/enterprise/,
+      product: products.influxdb3_enterprise,
+      urls: influxdbUrls.enterprise,
+    },
+    {
+      pattern: /\/influxdb3\/cloud-serverless/,
+      product: products.influxdb3_cloud_serverless,
+      urls: influxdbUrls.cloud,
+    },
+    {
+      pattern: /\/influxdb3\/cloud-dedicated/,
+      product: products.influxdb3_cloud_dedicated,
+      urls: influxdbUrls.dedicated,
+    },
+    {
+      pattern: /\/influxdb3\/clustered/,
+      product: products.influxdb3_clustered,
+      urls: influxdbUrls.clustered,
+    },
+    {
+      pattern: /\/enterprise_v1\//,
+      product: products.enterprise_influxdb,
+      urls: influxdbUrls.oss,
+    },
+    {
+      pattern: /\/influxdb.*v1\//,
+      product: products.influxdb,
+      urls: influxdbUrls.oss,
+    },
+    {
+      pattern: /\/influxdb.*v2\//,
+      product: products.influxdb,
+      urls: influxdbUrls.oss,
+    },
+    {
+      pattern: /\/kapacitor\//,
+      product: products.kapacitor,
+      urls: influxdbUrls.oss,
+    },
+    {
+      pattern: /\/telegraf\//,
+      product: products.telegraf,
+      urls: influxdbUrls.oss,
+    },
+    {
+      pattern: /\/chronograf\//,
+      product: products.chronograf,
+      urls: influxdbUrls.oss,
+    },
+    { pattern: /\/flux\//, product: products.flux, urls: influxdbUrls.oss },
   ];

   for (const { pattern, product, urls } of mappings) {
     if (pattern.test(path)) {
       return {
         product: product || 'unknown',
-        urls: urls || {}
+        urls: urls || {},
       };
     }
   }
@@ -36,7 +82,8 @@ function getCurrentProductData() {
   return { product: 'other', urls: {} };
 }

-// Return the page context (cloud, serverless, oss/enterprise, dedicated, clustered, other)
+// Return the page context
+// (cloud, serverless, oss/enterprise, dedicated, clustered, other)
 function getContext() {
   if (/\/influxdb\/cloud\//.test(window.location.pathname)) {
     return 'cloud';
@@ -78,8 +125,12 @@ const context = getContext(),
   protocol = location.protocol,
   referrer = document.referrer === '' ? 'direct' : document.referrer,
   referrerHost = getReferrerHost(),
-  // TODO: Verify this still does what we want since the addition of InfluxDB 3 naming and the Core and Enterprise versions.
-  version = (/^v\d/.test(pathArr[1]) || pathArr[1]?.includes('cloud') ? pathArr[1].replace(/^v/, '') : "n/a")
+  // TODO: Verify this works since the addition of InfluxDB 3 naming
+  // and the Core and Enterprise versions.
+  version =
+    /^v\d/.test(pathArr[1]) || pathArr[1]?.includes('cloud')
+      ? pathArr[1].replace(/^v/, '')
+      : 'n/a';

 export {
   context,
@@ -3,45 +3,34 @@
 /*
  * This script is used to generate a table of contents for the
  * release notes pages.
  */
+export default function ReleaseToc({ component }) {
+  // Get all h2 elements that are not checkpoint-releases
+  const releases = Array.from(document.querySelectorAll('h2')).filter(
+    (el) => !el.id.match(/checkpoint-releases/)
+  );

-// Get all h2 elements that are not checkpoint-releases
-const releases = Array.from(document.querySelectorAll('h2')).filter(
-  el => !el.id.match(/checkpoint-releases/)
-);
-
-// Extract data about each release from the array of releases
-const releaseData = releases.map(el => ({
+  // Extract data about each release from the array of releases
+  const releaseData = releases.map((el) => ({
     name: el.textContent,
     id: el.id,
     class: el.getAttribute('class'),
-  date: el.getAttribute('date')
-}));
+    date: el.getAttribute('date'),
+  }));

-// Use release data to generate a list item for each release
-function getReleaseItem(releaseData) {
-  const li = document.createElement("li");
-  if (releaseData.class !== null) {
-    li.className = releaseData.class;
-  }
-  li.innerHTML = `<a href="#${releaseData.id}">${releaseData.name}</a>`;
-  li.setAttribute('date', releaseData.date);
-  return li;
-}
-
-// Build the release table of contents
-const releaseTocUl = document.querySelector('#release-toc ul');
-releaseData.forEach(release => {
+  // Build the release table of contents
+  const releaseTocUl = component.querySelector('#release-toc ul');
+  releaseData.forEach((release) => {
     releaseTocUl.appendChild(getReleaseItem(release));
-});
+  });

-/*
+  /*
  * This script is used to expand the release notes table of contents by the
  * number specified in the `show` attribute of `ul.release-list`.
  * Once all the release items are visible, the "Show More" button is hidden.
  */
-const showMoreBtn = document.querySelector('#release-toc .show-more');
-if (showMoreBtn) {
+  const showMoreBtn = component.querySelector('.show-more');
+  if (showMoreBtn) {
   showMoreBtn.addEventListener('click', function () {
     const itemHeight = 1.885; // Item height in rem
     const releaseNum = releaseData.length;
@@ -53,7 +42,8 @@ if (showMoreBtn) {
  ? Number(currentHeightMatch[0])
  : 0;
const potentialHeight = currentHeight + releaseIncrement * itemHeight;
const newHeight = potentialHeight > maxHeight ? maxHeight : potentialHeight;
const newHeight =
  potentialHeight > maxHeight ? maxHeight : potentialHeight;

releaseList.style.height = `${newHeight}rem`;
@@ -66,4 +56,16 @@ if (showMoreBtn) {
      }, 100);
    }
  });
}
}

// Use release data to generate a list item for each release
function getReleaseItem(releaseData) {
  const li = document.createElement('li');
  if (releaseData.class !== null) {
    li.className = releaseData.class;
  }
  li.innerHTML = `<a href="#${releaseData.id}">${releaseData.name}</a>`;
  li.setAttribute('date', releaseData.date);
  return li;
}
@@ -1,10 +0,0 @@
// Fade content wrapper when focusing on search input
$('#algolia-search-input').focus(function() {
  $('.content-wrapper').fadeTo(300, .35);
})

// Hide search dropdown when leaving search input
$('#algolia-search-input').blur(function() {
  $('.content-wrapper').fadeTo(200, 1);
  $('.ds-dropdown-menu').hide();
})
@@ -0,0 +1,3 @@
import { products as productsParam } from '@params';

export const products = productsParam || {};
@@ -0,0 +1,3 @@
import { influxdb_urls as influxdbUrlsParam } from '@params';

export const influxdbUrls = influxdbUrlsParam || {};
@@ -10,7 +10,8 @@
- messages: Messages (data/notifications.yaml) that have been seen (array)
- callouts: Feature callouts that have been seen (array)
*/
import * as pageParams from '@params';
import { influxdbUrls } from './influxdb-urls.js';

// Prefix for all InfluxData docs local storage
const storagePrefix = 'influxdata_docs_';
@@ -82,14 +83,12 @@ function getPreferences() {
//////////// MANAGE INFLUXDATA DOCS URLS IN LOCAL STORAGE //////////////////////
////////////////////////////////////////////////////////////////////////////////

const defaultUrls = {};
// Guard against pageParams being null/undefined and safely access nested properties
if (pageParams && pageParams.influxdb_urls) {
  Object.entries(pageParams.influxdb_urls).forEach(([product, {providers}]) => {
    defaultUrls[product] = providers.filter(provider => provider.name === 'Default')[0]?.regions[0]?.url;
  });
}
Object.entries(influxdbUrls).forEach(([product, { providers }]) => {
  defaultUrls[product] =
    providers.filter((provider) => provider.name === 'Default')[0]?.regions[0]
      ?.url || 'https://cloud2.influxdata.com';
});

export const DEFAULT_STORAGE_URLS = {
  oss: defaultUrls.oss,
@@ -177,7 +176,10 @@ const defaultNotificationsObj = {
function getNotifications() {
  // Initialize notifications data if it doesn't already exist
  if (localStorage.getItem(notificationStorageKey) === null) {
    initializeStorageItem('notifications', JSON.stringify(defaultNotificationsObj));
    initializeStorageItem(
      'notifications',
      JSON.stringify(defaultNotificationsObj)
    );
  }

  // Retrieve and parse the notifications data as JSON
@@ -221,7 +223,10 @@ function setNotificationAsRead(notificationID, notificationType) {
  readNotifications.push(notificationID);
  notificationsObj[notificationType + 's'] = readNotifications;

  localStorage.setItem(notificationStorageKey, JSON.stringify(notificationsObj));
  localStorage.setItem(
    notificationStorageKey,
    JSON.stringify(notificationsObj)
  );
}

// Export functions as a module and make the file backwards compatible for non-module environments until all remaining dependent scripts are ported to modules
@@ -3,7 +3,7 @@
http://www.thesitewizard.com/javascripts/change-style-sheets.shtml
*/

import * as localStorage from './local-storage.js';
import * as localStorage from './services/local-storage.js';

// *** TO BE CUSTOMISED ***
var sidebar_state_preference_name = 'sidebar_state';
@@ -1,20 +1,21 @@
import Theme from './theme.js';

export default function ThemeSwitch({ component }) {
  if ( component == undefined) {
  if (component === undefined) {
    component = document;
  }
  component.querySelectorAll(`.theme-switch-light`).forEach((button) => {
    button.addEventListener('click', function(event) {

  component.querySelectorAll('.theme-switch-light').forEach((button) => {
    button.addEventListener('click', function (event) {
      event.preventDefault();
      Theme({ style: 'light-theme' });
      Theme({ component, style: 'light-theme' });
    });
  });

  component.querySelectorAll(`.theme-switch-dark`).forEach((button) => {
    button.addEventListener('click', function(event) {
  component.querySelectorAll('.theme-switch-dark').forEach((button) => {
    button.addEventListener('click', function (event) {
      event.preventDefault();
      Theme({ style: 'dark-theme' });
      Theme({ component, style: 'dark-theme' });
    });
  });
}
@@ -1,4 +1,4 @@
import { getPreference, setPreference } from './local-storage.js';
import { getPreference, setPreference } from './services/local-storage.js';

const PROPS = {
  style_preference_name: 'theme',
@@ -6,19 +6,22 @@ const PROPS = {
  style_domain: 'docs.influxdata.com',
};

function getPreferredTheme () {
function getPreferredTheme() {
  return `${getPreference(PROPS.style_preference_name)}-theme`;
}

function switchStyle({ styles_element, css_title }) {
  // Disable all other theme stylesheets
  styles_element.querySelectorAll('link[rel*="stylesheet"][title*="theme"]')
  styles_element
    .querySelectorAll('link[rel*="stylesheet"][title*="theme"]')
    .forEach(function (link) {
      link.disabled = true;
    });

  // Enable the stylesheet with the specified title
  const link = styles_element.querySelector(`link[rel*="stylesheet"][title="${css_title}"]`);
  const link = styles_element.querySelector(
    `link[rel*="stylesheet"][title="${css_title}"]`
  );
  link && (link.disabled = false);

  setPreference(PROPS.style_preference_name, css_title.replace(/-theme/, ''));
@@ -38,5 +41,4 @@ export default function Theme({ component, style }) {
  if (component.dataset?.themeCallback === 'setVisibility') {
    setVisibility(component);
  }

}
@@ -0,0 +1,38 @@
/**
 * Helper functions for debugging without source maps
 * Example usage:
 * In your code, you can use these functions like this:
 * ```javascript
 * import { debugLog, debugBreak, debugInspect } from './debug-helpers.js';
 *
 * const data = debugInspect(someData, 'Data');
 * debugLog('Processing data', 'myFunction');
 *
 * function processData() {
 *   // Add a breakpoint that works with DevTools
 *   debugBreak();
 *
 *   // Your existing code...
 * }
 * ```
 *
 * @fileoverview DEVELOPMENT USE ONLY - Functions should not be committed to production
 */

/* eslint-disable no-debugger */
/* eslint-disable-next-line */
// NOTE: These functions are detected by ESLint rules to prevent committing debug code

export function debugLog(message, context = '') {
  const contextStr = context ? `[${context}]` : '';
  console.log(`DEBUG${contextStr}: ${message}`);
}

export function debugBreak() {
  debugger;
}

export function debugInspect(value, label = 'Inspect') {
  console.log(`DEBUG[${label}]:`, value);
  return value;
}
@@ -0,0 +1,107 @@
/**
 * Manages search interactions for DocSearch integration
 * Uses MutationObserver to watch for dropdown creation
 */
export default function SearchInteractions({ searchInput }) {
  const contentWrapper = document.querySelector('.content-wrapper');
  let observer = null;
  let dropdownObserver = null;
  let dropdownMenu = null;
  const debug = false; // Set to true for debugging logs

  // Fade content wrapper when focusing on search input
  function handleFocus() {
    contentWrapper.style.opacity = '0.35';
    contentWrapper.style.transition = 'opacity 300ms';
  }

  // Hide search dropdown when leaving search input
  function handleBlur(event) {
    // Only process blur if not clicking within dropdown
    const relatedTarget = event.relatedTarget;
    if (
      relatedTarget &&
      (relatedTarget.closest('.algolia-autocomplete') ||
        relatedTarget.closest('.ds-dropdown-menu'))
    ) {
      return;
    }

    contentWrapper.style.opacity = '1';
    contentWrapper.style.transition = 'opacity 200ms';

    // Hide dropdown if it exists
    if (dropdownMenu) {
      dropdownMenu.style.display = 'none';
    }
  }

  // Add event listeners
  searchInput.addEventListener('focus', handleFocus);
  searchInput.addEventListener('blur', handleBlur);

  // Use MutationObserver to detect when dropdown is added to the DOM
  observer = new MutationObserver((mutations) => {
    for (const mutation of mutations) {
      if (mutation.type === 'childList') {
        const newDropdown = document.querySelector(
          '.ds-dropdown-menu:not([data-monitored])'
        );
        if (newDropdown) {
          // Save reference to dropdown
          dropdownMenu = newDropdown;
          newDropdown.setAttribute('data-monitored', 'true');

          // Monitor dropdown removal/display changes
          dropdownObserver = new MutationObserver((dropdownMutations) => {
            for (const dropdownMutation of dropdownMutations) {
              if (debug) {
                if (
                  dropdownMutation.type === 'attributes' &&
                  dropdownMutation.attributeName === 'style'
                ) {
                  console.log(
                    'Dropdown style changed:',
                    dropdownMenu.style.display
                  );
                }
              }
            }
          });

          // Observe changes to dropdown attributes (like style)
          dropdownObserver.observe(dropdownMenu, {
            attributes: true,
            attributeFilter: ['style'],
          });

          // Add event listeners to keep dropdown open when interacted with
          dropdownMenu.addEventListener('mousedown', (e) => {
            // Prevent blur on searchInput when clicking in dropdown
            e.preventDefault();
          });
        }
      }
    }
  });

  // Start observing the document body for dropdown creation
  observer.observe(document.body, {
    childList: true,
    subtree: true,
  });

  // Return cleanup function
  return function cleanup() {
    searchInput.removeEventListener('focus', handleFocus);
    searchInput.removeEventListener('blur', handleBlur);

    if (observer) {
      observer.disconnect();
    }

    if (dropdownObserver) {
      dropdownObserver.disconnect();
    }
  };
}
@@ -0,0 +1,35 @@
/**
 * Platform detection utility functions
 * Provides methods for detecting user's operating system
 */

/**
 * Detects user's operating system using modern techniques
 * Falls back to userAgent parsing when newer APIs aren't available
 * @returns {string} Operating system identifier ("osx", "win", "linux", or "other")
 */
export function getPlatform() {
  // Try to use modern User-Agent Client Hints API first (Chrome 89+, Edge 89+)
  if (navigator.userAgentData && navigator.userAgentData.platform) {
    const platform = navigator.userAgentData.platform.toLowerCase();

    if (platform.includes('mac')) return 'osx';
    if (platform.includes('win')) return 'win';
    if (platform.includes('linux')) return 'linux';
  }

  // Fall back to userAgent string parsing
  const userAgent = navigator.userAgent.toLowerCase();

  if (
    userAgent.includes('mac') ||
    userAgent.includes('iphone') ||
    userAgent.includes('ipad')
  )
    return 'osx';
  if (userAgent.includes('win')) return 'win';
  if (userAgent.includes('linux') || userAgent.includes('android'))
    return 'linux';

  return 'other';
}
@@ -1,6 +1,14 @@
import { CLOUD_URLS } from './influxdb-url.js';
import * as localStorage from './local-storage.js';
import { context, host, hostname, path, protocol, referrer, referrerHost } from './page-context.js';
import * as localStorage from './services/local-storage.js';
import {
  context,
  host,
  hostname,
  path,
  protocol,
  referrer,
  referrerHost,
} from './page-context.js';

/**
 * Builds a referrer whitelist array that includes the current page host and all
@@ -69,8 +77,6 @@ function setWayfindingInputState() {
}

function submitWayfindingData(engine, action) {

  // Build lp using page data and engine data
  const lp = `ioxwayfinding,host=${hostname},path=${path},referrer=${referrer},engine=${engine} action="${action}"`;
@@ -81,10 +87,7 @@ function submitWayfindingData(engine, action) {
    'https://j32dswat7l.execute-api.us-east-1.amazonaws.com/prod/wayfinding'
  );
  xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
  xhr.setRequestHeader(
    'Access-Control-Allow-Origin',
    `${protocol}//${host}`
  );
  xhr.setRequestHeader('Access-Control-Allow-Origin', `${protocol}//${host}`);
  xhr.setRequestHeader('Content-Type', 'text/plain; charset=utf-8');
  xhr.setRequestHeader('Accept', 'application/json');
  xhr.send(lp);
@@ -1,19 +1,21 @@
// Select the product dropdown and dropdown items
const productDropdown = document.querySelector("#product-dropdown");
const dropdownItems = document.querySelector("#dropdown-items");
export default function ProductSelector({ component }) {
  // Select the product dropdown and dropdown items
  const productDropdown = component.querySelector('#product-dropdown');
  const dropdownItems = component.querySelector('#dropdown-items');

// Expand the menu on click
if (productDropdown) {
  productDropdown.addEventListener("click", function() {
    productDropdown.classList.toggle("open");
    dropdownItems.classList.toggle("open");
  // Expand the menu on click
  if (productDropdown) {
    productDropdown.addEventListener('click', function () {
      productDropdown.classList.toggle('open');
      dropdownItems.classList.toggle('open');
    });
  }
}

// Close the dropdown by clicking anywhere else
document.addEventListener("click", function(e) {
  // Close the dropdown by clicking anywhere else
  document.addEventListener('click', function (e) {
    // Check if the click was outside of the '.product-list' container
    if (!e.target.closest('.product-list')) {
      dropdownItems.classList.remove("open");
      dropdownItems.classList.remove('open');
    }
  });
});
}
@@ -3,8 +3,7 @@
  "baseUrl": ".",
  "paths": {
    "*": [
      "*",
      "../node_modules/*"
      "*"
    ]
  }
}
@@ -0,0 +1,18 @@
/*
  Datetime Components
  ----------------------------------------------
*/

.current-timestamp,
.current-date,
.current-time,
.enterprise-eol-date {
  color: $current-timestamp-color;
  display: inline-block;
  font-family: $proxima;
  white-space: nowrap;
}

.nowrap {
  white-space: nowrap;
}
@@ -16,6 +16,10 @@
  background: $article-code-bg !important;
  font-size: .85em;
  font-weight: $medium;

  p {
    background: $article-bg !important;
  }
}

.node {
@@ -23,6 +23,7 @@
"layouts/syntax-highlighting",
"layouts/algolia-search-overrides",
"layouts/landing",
"layouts/datetime",
"layouts/error-page",
"layouts/footer-widgets",
"layouts/modals",
@@ -203,6 +203,12 @@ $article-btn-text-hover: $g20-white;
$article-nav-icon-bg: $g5-pepper;
$article-nav-acct-bg: $g3-castle;

// Datetime shortcode colors
$current-timestamp-color: $g15-platinum;
$current-date-color: $g15-platinum;
$current-time-color: $g15-platinum;
$enterprise-eol-date-color: $g15-platinum;

// Error Page Colors
$error-page-btn: $b-pool;
$error-page-btn-text: $g20-white;
@@ -203,6 +203,12 @@ $article-btn-text-hover: $g20-white !default;
$article-nav-icon-bg: $g6-smoke !default;
$article-nav-acct-bg: $g5-pepper !default;

// Datetime Colors
$current-timestamp-color: $article-text !default;
$current-date-color: $article-text !default;
$current-time-color: $article-text !default;
$enterprise-eol-date-color: $article-text !default;

// Error Page Colors
$error-page-btn: $b-pool !default;
$error-page-btn-text: $g20-white !default;
@@ -1,2 +0,0 @@
import:
  - hugo.yml
@@ -1,4 +1,4 @@
baseURL: 'https://docs.influxdata.com/'
baseURL: https://docs.influxdata.com/
languageCode: en-us
title: InfluxDB Documentation
@@ -49,21 +49,52 @@ privacy:
  youtube:
    disable: false
    privacyEnhanced: true

outputFormats:
  json:
    mediaType: application/json
    baseName: pages
    isPlainText: true

# Asset processing configuration for development
build:
  # Ensure Hugo correctly processes JavaScript modules
  jsConfig:
    nodeEnv: "development"
  # Development asset processing
  writeStats: false
  useResourceCacheWhen: "fallback"
  noJSConfigInAssets: false

# Asset processing configuration
assetDir: "assets"

module:
  mounts:
    - source: assets
      target: assets

    - source: node_modules
      target: assets/node_modules

# Environment parameters
params:
  env: development
  environment: development

# Configure the server for development
server:
  port: 1313
  baseURL: 'http://localhost:1313/'
  watchChanges: true
  disableLiveReload: false

# Ignore specific warning logs
ignoreLogs:
  - warning-goldmark-raw-html

# Disable minification for development
minify:
  disableJS: true
  disableCSS: true
  disableHTML: true
  minifyOutput: false
@@ -0,0 +1,40 @@
# Production overrides for CI/CD builds
baseURL: 'https://docs.influxdata.com/'

# Production environment parameters
params:
  env: production
  environment: production

# Enable minification for production
minify:
  disableJS: false
  disableCSS: false
  disableHTML: false
  minifyOutput: true

# Production asset processing
build:
  writeStats: false
  useResourceCacheWhen: "fallback"
  buildOptions:
    sourcemap: false
    target: "es2015"

# Asset processing configuration
assetDir: "assets"

# Mount assets for production
module:
  mounts:
    - source: assets
      target: assets
    - source: node_modules
      target: assets/node_modules

# Disable development server settings
server: {}

# Suppress the warning mentioned in the error
ignoreLogs:
  - 'warning-goldmark-raw-html'
@@ -0,0 +1,17 @@
build:
  writeStats: false
  useResourceCacheWhen: "fallback"
  buildOptions:
    sourcemap: false
    target: "es2015"
minify:
  disableJS: false
  disableCSS: false
  disableHTML: false
  minifyOutput: true
params:
  env: production
  environment: production
server: {
  disableLiveReload: true
}
@@ -0,0 +1,18 @@
baseURL: https://test2.docs.influxdata.com/
build:
  writeStats: false
  useResourceCacheWhen: "fallback"
  buildOptions:
    sourcemap: false
    target: "es2015"
minify:
  disableJS: false
  disableCSS: false
  disableHTML: false
  minifyOutput: true
params:
  env: staging
  environment: staging
server:
  disableLiveReload: true
@@ -1,20 +0,0 @@
baseURL: 'http://localhost:1315/'

server:
  port: 1315

# Override settings for testing
buildFuture: true

# Configure what content is built in testing env
params:
  environment: testing
  buildTestContent: true

# Keep your shared content exclusions
ignoreFiles:
  - "content/shared/.*"

# Ignore specific warning logs
ignoreLogs:
  - warning-goldmark-raw-html
@@ -1267,3 +1267,106 @@ This is small tab 2.4 content.
{{% /tab-content %}}

{{< /tabs-wrapper >}}

## Group key demo

Used to demonstrate Flux group keys

{{< tabs-wrapper >}}
{{% tabs "small" %}}
[Input](#)
[Output](#)
<span class="tab-view-output">Click to view output</span>
{{% /tabs %}}
{{% tab-content %}}

The following data is output from the last `filter()` and piped forward into `group()`:

> [!Note]
> `_start` and `_stop` columns have been omitted.

{{% flux/group-key "[_measurement=home, room=Kitchen, _field=hum]" true %}}

| _time                | _measurement | room        | _field | _value |
| :------------------- | :----------- | :---------- | :----- | :----- |
| 2022-01-01T08:00:00Z | home         | Kitchen     | hum    | 35.9   |
| 2022-01-01T09:00:00Z | home         | Kitchen     | hum    | 36.2   |
| 2022-01-01T10:00:00Z | home         | Kitchen     | hum    | 36.1   |

{{% flux/group-key "[_measurement=home, room=Living Room, _field=hum]" true %}}

| _time                | _measurement | room        | _field | _value |
| :------------------- | :----------- | :---------- | :----- | :----- |
| 2022-01-01T08:00:00Z | home         | Living Room | hum    | 35.9   |
| 2022-01-01T09:00:00Z | home         | Living Room | hum    | 35.9   |
| 2022-01-01T10:00:00Z | home         | Living Room | hum    | 36     |

{{% flux/group-key "[_measurement=home, room=Kitchen, _field=temp]" true %}}

| _time                | _measurement | room        | _field | _value |
| :------------------- | :----------- | :---------- | :----- | :----- |
| 2022-01-01T08:00:00Z | home         | Kitchen     | temp   | 21     |
| 2022-01-01T09:00:00Z | home         | Kitchen     | temp   | 23     |
| 2022-01-01T10:00:00Z | home         | Kitchen     | temp   | 22.7   |

{{% flux/group-key "[_measurement=home, room=Living Room, _field=temp]" true %}}

| _time                | _measurement | room        | _field | _value |
| :------------------- | :----------- | :---------- | :----- | :----- |
| 2022-01-01T08:00:00Z | home         | Living Room | temp   | 21.1   |
| 2022-01-01T09:00:00Z | home         | Living Room | temp   | 21.4   |
| 2022-01-01T10:00:00Z | home         | Living Room | temp   | 21.8   |

{{% /tab-content %}}
{{% tab-content %}}

When grouped by `_field`, all rows with the `temp` field will be in one table
and all the rows with the `hum` field will be in another.
`_measurement` and `room` columns no longer affect how rows are grouped.

{{% note %}}
`_start` and `_stop` columns have been omitted.
{{% /note %}}

{{% flux/group-key "[_field=hum]" true %}}

| _time                | _measurement | room        | _field | _value |
| :------------------- | :----------- | :---------- | :----- | :----- |
| 2022-01-01T08:00:00Z | home         | Kitchen     | hum    | 35.9   |
| 2022-01-01T09:00:00Z | home         | Kitchen     | hum    | 36.2   |
| 2022-01-01T10:00:00Z | home         | Kitchen     | hum    | 36.1   |
| 2022-01-01T08:00:00Z | home         | Living Room | hum    | 35.9   |
| 2022-01-01T09:00:00Z | home         | Living Room | hum    | 35.9   |
| 2022-01-01T10:00:00Z | home         | Living Room | hum    | 36     |

{{% flux/group-key "[_field=temp]" true %}}

| _time                | _measurement | room        | _field | _value |
| :------------------- | :----------- | :---------- | :----- | :----- |
| 2022-01-01T08:00:00Z | home         | Kitchen     | temp   | 21     |
| 2022-01-01T09:00:00Z | home         | Kitchen     | temp   | 23     |
| 2022-01-01T10:00:00Z | home         | Kitchen     | temp   | 22.7   |
| 2022-01-01T08:00:00Z | home         | Living Room | temp   | 21.1   |
| 2022-01-01T09:00:00Z | home         | Living Room | temp   | 21.4   |
| 2022-01-01T10:00:00Z | home         | Living Room | temp   | 21.8   |

{{% /tab-content %}}
{{< /tabs-wrapper >}}

## datetime/current-timestamp shortcode

### Default usage

{{< datetime/current-timestamp >}}

### Format YYYY-MM-DD HH:mm:ss

{{< datetime/current-timestamp format="YYYY-MM-DD HH:mm:ss" >}}

### Format with UTC timezone

{{< datetime/current-timestamp format="YYYY-MM-DD HH:mm:ss" timezone="UTC" >}}

### Format with America/New_York timezone

{{< datetime/current-timestamp format="YYYY-MM-DD HH:mm:ss" timezone="America/New_York" >}}
@@ -51,7 +51,7 @@ Use the following command to return the image Kubernetes uses to build your
InfluxDB cluster:

```sh
kubectl get appinstances.kubecfg.dev influxdb -o jsonpath='{.spec.package.image}'
kubectl get appinstances.kubecfg.dev influxdb -n influxdb -o jsonpath='{.spec.package.image}'
```

The package version number is at the end of the returned string (after `influxdb:`):
@@ -0,0 +1,15 @@
---
title: influxdb3 test schedule_plugin
description: >
  The `influxdb3 test schedule_plugin` command tests a schedule plugin file without needing to create a trigger.
menu:
  influxdb3_core:
    parent: influxdb3 test
    name: influxdb3 test schedule_plugin
weight: 401
source: /shared/influxdb3-cli/test/schedule_plugin.md
---

<!--
The content of this file is at content/shared/influxdb3-cli/test/schedule_plugin.md
-->
@@ -0,0 +1,15 @@
---
title: influxdb3 test schedule_plugin
description: >
  The `influxdb3 test schedule_plugin` command tests a schedule plugin file without needing to create a trigger.
menu:
  influxdb3_enterprise:
    parent: influxdb3 test
    name: influxdb3 test schedule_plugin
weight: 401
source: /shared/influxdb3-cli/test/schedule_plugin.md
---

<!--
The content of this file is at content/shared/influxdb3-cli/test/schedule_plugin.md
-->
@@ -407,15 +407,15 @@ if __name__ == '__main__':
    agent.handler = h

    # Anything printed to STDERR from a UDF process gets captured into the Kapacitor logs.
    print >> sys.stderr, "Starting agent for TTestHandler"
    print("Starting agent for TTestHandler", file=sys.stderr)
    agent.start()
    agent.wait()
    print >> sys.stderr, "Agent finished"
    print("Agent finished", file=sys.stderr)

```

That was a lot, but now we are ready to configure Kapacitor to run our
code. Create a scratch dir for working through the rest of this
code. Make sure that `scipy` is installed (`$ pip3 install scipy`). Create a scratch dir for working through the rest of this
guide:

```bash
@@ -434,7 +434,7 @@ Add this snippet to your Kapacitor configuration file (typically located at `/et
[udf.functions]
  [udf.functions.tTest]
    # Run python
    prog = "/usr/bin/python2"
    prog = "/usr/bin/python3"
    # Pass args to python
    # -u for unbuffered STDIN and STDOUT
    # and the path to the script
@@ -468,8 +468,8 @@ correctly:
service kapacitor restart
```

Check the logs (`/var/log/kapacitor/`) to make sure you see a
*Listening for signals* line and that no errors occurred. If you
Check the logs (`/var/log/kapacitor/` or `journalctl -f -n 256 -u kapacitor.service`) to make sure you see a
_Listening for signals_ line and that no errors occurred. If you
don't see the line, it's because the UDF process is hung and not
responding. It should be killed after a timeout, so give it a moment
to stop properly. Once stopped, you can fix any errors and try again.
@@ -544,6 +544,20 @@ the Kapacitor task:
kapacitor define print_temps -tick print_temps.tick
```

Ensure that the task is enabled:

```bash
kapacitor enable print_temps
```

And then list the tasks:

```bash
kapacitor list tasks
ID            Type      Status    Executing  Databases and Retention Policies
print_temps   stream    enabled   true       ["printer"."autogen"]
```

### Generating test data

To simulate our printer for testing, we will write a simple Python
@@ -557,7 +571,7 @@ to use real data for testing our TICKscript and UDF, but this is
faster (and much cheaper than a 3D printer).

```python
#!/usr/bin/python2
#!/usr/bin/env python

from numpy import random
from datetime import timedelta, datetime
@@ -672,7 +686,11 @@ fake data so that we can easily iterate on the task:
```sh
# Start the recording in the background
kapacitor record stream -task print_temps -duration 24h -no-wait
# Grab the ID from the output and store it in a var
# List recordings to find the ID
kapacitor list recordings
ID                                    Type      Status    Size   Date
7bd3ced5-5e95-4a67-a0e1-f00860b1af47  stream    running   0 B    04 May 16 11:34 MDT
# Copy the ID and store it in a variable
rid=7bd3ced5-5e95-4a67-a0e1-f00860b1af47
# Run our python script to generate data
chmod +x ./printer_data.py
@ -0,0 +1,84 @@
The `influxdb3 test schedule_plugin` command tests a schedule plugin. Use this command to verify plugin behavior without creating a trigger.

## Usage

<!--pytest.mark.skip-->

```bash
influxdb3 test schedule_plugin [OPTIONS] --database <DATABASE_NAME> <FILENAME>
```

## Arguments

- **FILENAME**: Path to the plugin file. Use the absolute path or the path relative to the current working directory, such as `<plugin-dir>/<plugin-file-name>.py`.

## Options

| Option | Flag                | Description                                                                                     |
| :----- | :------------------ | :---------------------------------------------------------------------------------------------- |
| `-H`   | `--host`            | URL of the running {{< product-name >}} server <br>(default: `http://127.0.0.1:8181`)           |
| `-d`   | `--database`        | _({{< req >}})_ Name of the database you want to test the plugin against                        |
|        | `--token`           | _({{< req >}})_ Authentication token                                                            |
|        | `--input-arguments` | JSON map of key/value pairs to pass as plugin input arguments (for example, `'{"key":"val"}'`)  |
|        | `--schedule`        | Cron schedule to simulate when testing the plugin <br>(default: `* * * * *`)                    |
|        | `--cache-name`      | Optional cache name to associate with the test                                                  |
|        | `--tls-ca`          | Path to a custom TLS certificate authority for self-signed certs                                |
| `-h`   | `--help`            | Show basic help information                                                                     |
|        | `--help-all`        | Show all available help options                                                                 |

### Option environment variables

You can use the following environment variables to set command options:

| Environment Variable       | Corresponding Option |
| :------------------------- | :------------------- |
| `INFLUXDB3_HOST_URL`       | `--host`             |
| `INFLUXDB3_DATABASE_NAME`  | `--database`         |
| `INFLUXDB3_AUTH_TOKEN`     | `--token`            |
| `INFLUXDB3_TLS_CA`         | `--tls-ca`           |
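For example, a sketch of setting these variables up front so the corresponding flags can be omitted (the values shown are placeholders, not real credentials):

```shell
# Placeholders: replace with your server URL, database, and token
export INFLUXDB3_HOST_URL="http://localhost:8181"
export INFLUXDB3_DATABASE_NAME="DATABASE_NAME"
export INFLUXDB3_AUTH_TOKEN="AUTH_TOKEN"

# With the variables set, the flags can be omitted:
# influxdb3 test schedule_plugin PLUGIN_DIR/FILENAME.py
```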

## Examples

In the examples below, replace the following:

- {{% code-placeholder-key %}}`DATABASE_NAME`{{% /code-placeholder-key %}}: Your target database
- {{% code-placeholder-key %}}`AUTH_TOKEN`{{% /code-placeholder-key %}}: Your authentication token
- {{% code-placeholder-key %}}`PLUGIN_DIR`{{% /code-placeholder-key %}}: The path to the plugin directory you provided when starting the server
- {{% code-placeholder-key %}}`FILENAME`{{% /code-placeholder-key %}}: The plugin file name

{{% code-placeholders "DATABASE_NAME|PLUGIN_DIR|FILENAME|AUTH_TOKEN" %}}

### Test a schedule plugin

<!--pytest.mark.skip-->

```bash
influxdb3 test schedule_plugin \
  --database DATABASE_NAME \
  --token AUTH_TOKEN \
  PLUGIN_DIR/FILENAME.py
```

### Test with input arguments and a custom cron schedule

You can pass input arguments to your plugin as key-value pairs and specify a custom cron schedule (using Quartz cron syntax with six fields):

<!--pytest.mark.skip-->

```bash
influxdb3 test schedule_plugin \
  --host http://localhost:8182 \
  --database DATABASE_NAME \
  --token AUTH_TOKEN \
  --input-arguments threshold=10,unit=seconds \
  --schedule "0 0 * * * ?" \
  PLUGIN_DIR/FILENAME.py
```

- Pass plugin parameters using `--input-arguments` as comma-separated `key=value` pairs.
- Use `--schedule` to set the plugin's execution time with a Quartz cron expression. For example, `"0 0 * * * ?"` runs the plugin at the start of every hour.

{{% /code-placeholders %}}
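As a rough illustration of the six-field Quartz order (seconds, minutes, hours, day of month, month, day of week), the small helper below splits an expression into named fields. It is a hypothetical sketch for explanation only, not part of the influxdb3 CLI:

```python
QUARTZ_FIELDS = ["seconds", "minutes", "hours",
                 "day_of_month", "month", "day_of_week"]

def parse_quartz(expr):
    """Split a six-field Quartz cron expression into named fields."""
    parts = expr.split()
    if len(parts) != len(QUARTZ_FIELDS):
        raise ValueError(f"expected {len(QUARTZ_FIELDS)} fields, got {len(parts)}")
    return dict(zip(QUARTZ_FIELDS, parts))

# "0 0 * * * ?" fires when seconds=0 and minutes=0: the top of every hour
print(parse_quartz("0 0 * * * ?"))
```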

@ -169,10 +169,10 @@ Before you begin, make sure:
Choose a plugin type based on your automation goals:

| Plugin Type | Best For | Trigger Type |
|-------------|----------|-------------|
| ---------------- | ------------------------------------------- | ------------------------ |
| **Data write**   | Processing data as it arrives               | `table:` or `all_tables` |
| **Scheduled**    | Running code at specific times              | `every:` or `cron:`      |
| **HTTP request** | Creating API endpoints                      | `path:`                  |
| **Scheduled**    | Running code at specific intervals or times | `every:` or `cron:`      |
| **HTTP request** | Running code on demand via API endpoints    | `path:`                  |

#### Create your plugin file

@ -336,8 +336,9 @@ influxdb3 create trigger \
  regular_check

# Run on a cron schedule (8am daily)
# Supports extended cron format with seconds
influxdb3 create trigger \
  --trigger-spec "cron:0 8 * * *" \
  --trigger-spec "cron:0 0 8 * * *" \
  --plugin-filename "daily_report.py" \
  --database my_database \
  daily_report
@ -522,27 +523,90 @@ influxdb3 create trigger \

### Install Python dependencies

If your plugin needs additional Python packages, use the `influxdb3 install` command:
Use the `influxdb3 install package` command to add third-party libraries (like `pandas`, `requests`, or `influxdb3-python`) to your plugin environment.
This installs packages into the Processing Engine's embedded Python environment to ensure compatibility with your InfluxDB instance.

{{% code-placeholders "CONTAINER_NAME|PACKAGE_NAME" %}}

{{< code-tabs-wrapper >}}

{{% code-tabs %}}
[CLI](#)
[Docker](#)
{{% /code-tabs %}}

{{% code-tab-content %}}

```bash
# Install a package directly
# Use the CLI to install a Python package
influxdb3 install package pandas
```

{{% /code-tab-content %}}

{{% code-tab-content %}}

```bash
# With Docker
# Use the CLI to install a Python package in a Docker container
docker exec -it CONTAINER_NAME influxdb3 install package pandas
```

This creates a Python virtual environment in your plugins directory with the specified packages installed.
{{% /code-tab-content %}}

{{< /code-tabs-wrapper >}}

These examples install the specified Python package (for example, pandas) into the Processing Engine's embedded virtual environment.

- Use the CLI command when running InfluxDB directly on your system.
- Use the Docker variant if you're running InfluxDB in a containerized environment.

> [!Important]
> #### Use bundled Python for plugins
> When you start the server with the `--plugin-dir` option, InfluxDB 3 creates a Python virtual environment (`<PLUGIN_DIR>/venv`) for your plugins.
> If you need to create a custom virtual environment, use the Python interpreter bundled with InfluxDB 3. Don't use the system Python.
> Creating a virtual environment with the system Python (for example, using `python -m venv`) can lead to runtime errors and plugin failures.
>
> For more information, see the [processing engine README](https://github.com/influxdata/influxdb/blob/main/README_processing_engine.md#official-builds).

{{% /code-placeholders %}}

InfluxDB creates a Python virtual environment in your plugins directory with the specified packages installed.

{{% show-in "enterprise" %}}

### Connect Grafana to your InfluxDB instance
## Distributed cluster considerations

When configuring Grafana to connect to an InfluxDB 3 Enterprise instance:
When you deploy {{% product-name %}} in a multi-node environment, configure each node based on its role and the plugins it runs.

- **URL**: Use a querier URL or any node that serves queries
### Match plugin types to the correct node

Each plugin must run on a node that supports its trigger type:

| Plugin type  | Trigger spec             | Runs on                      |
|--------------|--------------------------|------------------------------|
| Data write   | `table:` or `all_tables` | Ingester nodes               |
| Scheduled    | `every:` or `cron:`      | Any node with scheduler      |
| HTTP request | `path:`                  | Nodes that serve API traffic |

For example:

- Run write-ahead log (WAL) plugins on ingester nodes.
- Run scheduled plugins on any node configured to execute them.
- Run HTTP-triggered plugins on querier nodes or any node that handles HTTP endpoints.

Place all plugin files in the `--plugin-dir` directory configured for each node.

> [!Note]
> Triggers fail if the plugin file isn't available on the node where it runs.

### Route third-party clients to querier nodes

External tools, such as Grafana, custom dashboards, or REST clients, must connect to querier nodes in your InfluxDB Enterprise deployment.

#### Examples

- **Grafana**: When adding InfluxDB 3 as a Grafana data source, use a querier node URL, such as:
  `https://querier.example.com:8086`
- **REST clients**: Applications using `POST /api/v3/query/sql` or similar endpoints must target a querier node.

Example URL format: `https://querier.your-influxdb.com:8086`
{{% /show-in %}}

@ -2,8 +2,10 @@ import { spawn } from 'child_process';
import fs from 'fs';
import http from 'http';
import net from 'net';
import process from 'process';

// Hugo server constants
export const HUGO_ENVIRONMENT = 'testing';
export const HUGO_PORT = 1315;
export const HUGO_LOG_FILE = '/tmp/hugo_server.log';

@ -28,7 +30,8 @@ export async function isPortInUse(port) {
/**
 * Start the Hugo server with the specified options
 * @param {Object} options - Configuration options for Hugo
 * @param {string} options.configFile - Path to Hugo config file (e.g., 'config/testing/config.yml')
 * @param {string} options.configFile - Path to Hugo config file
 * @param {string} options.environment - Environment to run Hugo in
 * @param {number} options.port - Port number for Hugo server
 * @param {boolean} options.buildDrafts - Whether to build draft content
 * @param {boolean} options.noHTTPCache - Whether to disable HTTP caching

@ -36,9 +39,10 @@ export async function isPortInUse(port) {
 * @returns {Promise<Object>} Child process object
 */
export async function startHugoServer({
  configFile = 'config/testing/config.yml',
  configFile = 'config/_default/hugo.yml',
  port = HUGO_PORT,
  buildDrafts = true,
  environment = 'testing',
  buildDrafts = false,
  noHTTPCache = true,
  logFile = HUGO_LOG_FILE,
} = {}) {

@ -48,6 +52,8 @@ export async function startHugoServer({
  const hugoArgs = [
    'hugo',
    'server',
    '--environment',
    environment,
    '--config',
    configFile,
    '--port',

@ -64,16 +70,16 @@ export async function startHugoServer({

  return new Promise((resolve, reject) => {
    try {
      // Use npx to find and execute Hugo, which will work regardless of installation method
      console.log(`Running Hugo with npx: npx ${hugoArgs.join(' ')}`);
      const hugoProc = spawn('npx', hugoArgs, {
      // Use yarn to find and execute Hugo, which will work regardless of installation method
      console.log(`Running Hugo with yarn: yarn ${hugoArgs.join(' ')}`);
      const hugoProc = spawn('yarn', hugoArgs, {
        stdio: ['ignore', 'pipe', 'pipe'],
        shell: true,
      });

      // Check if the process started successfully
      if (!hugoProc || !hugoProc.pid) {
        return reject(new Error('Failed to start Hugo server via npx'));
        return reject(new Error('Failed to start Hugo server via yarn'));
      }

      // Set up logging
@ -38,9 +38,10 @@ import fs from 'fs';
import path from 'path';
import cypress from 'cypress';
import net from 'net';
import matter from 'gray-matter';
import { Buffer } from 'buffer';
import { displayBrokenLinksReport, initializeReport } from './link-reporter.js';
import {
  HUGO_ENVIRONMENT,
  HUGO_PORT,
  HUGO_LOG_FILE,
  startHugoServer,

@ -90,28 +91,6 @@ async function isPortInUse(port) {
  });
}

/**
 * Extract source information from frontmatter
 * @param {string} filePath - Path to the markdown file
 * @returns {string|null} Source information if present
 */
function getSourceFromFrontmatter(filePath) {
  if (!fs.existsSync(filePath)) {
    return null;
  }

  try {
    const fileContent = fs.readFileSync(filePath, 'utf8');
    const { data } = matter(fileContent);
    return data.source || null;
  } catch (err) {
    console.warn(
      `Warning: Could not extract frontmatter from ${filePath}: ${err.message}`
    );
    return null;
  }
}

/**
 * Ensures a directory exists, creating it if necessary
 * Also creates an empty file to ensure the directory is not empty

@ -296,7 +275,7 @@ async function main() {
    });

    console.log('Hugo is available on the system');
  } catch (checkErr) {
  } catch {
    console.log(
      'Hugo not found on PATH, will use project-local Hugo via yarn'
    );

@ -304,9 +283,8 @@ async function main() {

  // Use the startHugoServer function from hugo-server.js
  hugoProc = await startHugoServer({
    configFile: 'config/testing/config.yml',
    environment: HUGO_ENVIRONMENT,
    port: HUGO_PORT,
    buildDrafts: true,
    noHTTPCache: true,
    logFile: HUGO_LOG_FILE,
  });

@ -412,7 +390,7 @@ async function main() {
      `ℹ️ Note: ${testFailureCount} test(s) failed but no broken links were detected in the report.`
    );
    console.warn(
      ` This usually indicates test errors unrelated to link validation.`
      ' This usually indicates test errors unrelated to link validation.'
    );

    // We should not consider special case domains (those with expected errors) as failures
@ -66,6 +66,23 @@ cloud:
        - name: East US (Virginia)
          location: Virginia, USA
          url: https://eastus-1.azure.cloud2.influxdata.com

serverless:
  product: InfluxDB Cloud
  providers:
    - name: Amazon Web Services
      short_name: AWS
      iox: true
      regions:
        - name: US East (Virginia)
          location: Virginia, USA
          url: https://us-east-1-1.aws.cloud2.influxdata.com
          iox: true
        - name: EU Frankfurt
          location: Frankfurt, Germany
          url: https://eu-central-1-1.aws.cloud2.influxdata.com
          iox: true

cloud_dedicated:
  providers:
    - name: Default
@ -106,6 +106,33 @@ export default [
    files: ['assets/js/**/*.js'],
    rules: {
      // Rules specific to JavaScript in Hugo assets
      // Prevent imports from debug-helpers.js
      'no-restricted-imports': [
        'error',
        {
          paths: [
            {
              name: './utils/debug-helpers.js',
              message:
                'Remove debugging functions before committing. Debug helpers should not be used in production code.',
            },
            {
              name: '/utils/debug-helpers.js',
              message:
                'Remove debugging functions before committing. Debug helpers should not be used in production code.',
            },
          ],
        },
      ],
      // Prevent use of debug functions in production code
      'no-restricted-syntax': [
        'error',
        {
          selector: 'CallExpression[callee.name=/^debug(Log|Break|Inspect)$/]',
          message:
            'Remove debugging functions before committing. Debug helpers should not be used in production code.',
        },
      ],
    },
  },
  {
@ -1,60 +0,0 @@
baseURL: https://test2.docs.influxdata.com/
languageCode: en-us
title: InfluxDB Documentation

# Git history information for lastMod-type functionality
enableGitInfo: true

# Syntax Highlighting
pygmentsCodefences: true
pygmentsUseClasses: true

# Preserve case in article tags
preserveTaxonomyNames: true

# Generate a robots.txt
enableRobotsTXT: true

# Custom staging params
params:
  environment: staging

# Markdown rendering options
blackfriday:
  hrefTargetBlank: true
  smartDashes: false

taxonomies:
  influxdb/v2/tag: influxdb/v2/tags
  influxdb/cloud/tag: influxdb/cloud/tags
  influxdb3/cloud-serverless/tag: influxdb/cloud-serverless/tags
  influxdb/cloud-dedicated/tag: influxdb/cloud-dedicated/tags
  influxdb/clustered/tag: influxdb/clustered/tags
  influxdb3/core/tag: influxdb3/core/tags
  influxdb3/enterprise/tag: influxdb3/enterprise/tags
  flux/v0/tag: flux/v0/tags

markup:
  goldmark:
    renderer:
      unsafe: true
    extensions:
      linkify: false
    parser:
      attribute:
        block: true

privacy:
  googleAnalytics:
    anonymizeIP: false
    disable: false
    respectDoNotTrack: true
    useSessionStorage: false
  youtube:
    disable: false
    privacyEnhanced: true
outputFormats:
  json:
    mediaType: application/json
    baseName: pages
    isPlainText: true
@ -19,7 +19,7 @@
<div class="home-content">

  <div class="section search">
    <div class="sidebar--search">
    <div class="sidebar--search" data-component="sidebar-search">
      <input class="sidebar--search-field"
        id="algolia-search-input"
        type="text"

@ -13,7 +13,7 @@
{{ $urlCalloutText := $scratch.Get "urlCalloutText" }}

<!-- {{ if or $isOSS $isCloud $isHome }}
<div class="feature-callout start-position" id="callout-url-selector">
<div class="feature-callout start-position" id="callout-url-selector" data-component="feature-callout">
  <p>{{ $urlCalloutText }} <a href="#" class="close"><span class="icon-remove"></span></a></p>
</div>
{{ end }} -->

@ -1,18 +1,3 @@
{{ $jquery := resources.Get "js/jquery-3.5.0.min.js" }}
{{ $versionSelector := resources.Get "js/version-selector.js" }}
{{ $searchInteractions := resources.Get "js/search-interactions.js" }}
{{ $listFilters := resources.Get "js/list-filters.js" }}
{{ $featureCallouts := resources.Get "js/feature-callouts.js" }}
{{ $keybindings := resources.Get "js/keybindings.js" }}
{{ $fluxGroupKeys := resources.Get "js/flux-group-keys.js" }}
{{ $dateTime := resources.Get "js/datetime.js" }}
{{ $homepageInteractions := resources.Get "js/home-interactions.js" }}
{{ $releaseTOC := resources.Get "/js/release-toc.js" }}
{{ $footerjs := slice $jquery $versionSelector $searchInteractions $listFilters $featureCallouts $keybindings $homepageInteractions | resources.Concat "js/footer.bundle.js" | resources.Fingerprint }}
{{ $fluxGroupKeyjs := $fluxGroupKeys | resources.Fingerprint }}
{{ $dateTimejs := $dateTime | resources.Fingerprint }}
{{ $releaseTOCjs := $releaseTOC | resources.Fingerprint }}

<!-- Load cloudUrls array -->
<script type="text/javascript">
  cloudUrls = [

@ -21,37 +6,3 @@
  {{ end -}}
  ]
</script>

{{ if .Page.HasShortcode "diagram" }}
<!-- Load mermaid.js for diagrams -->
<script src="https://cdn.jsdelivr.net/npm/mermaid/dist/mermaid.min.js"></script>
<script>
  mermaid.initialize({
    startOnLoad: true,

    themeVariables: {
      fontFamily: "Proxima Nova",
      fontSize: '18px',
    }
  })
</script>
{{ end }}

<!-- Load group key demo JS if when the group key demo shortcode is present -->
{{ if .Page.HasShortcode "flux/group-key-demo" }}
<script type="text/javascript" src="{{ $fluxGroupKeyjs.RelPermalink }}"></script>
{{ end }}

<!-- Load datetime js if when datetime shortcodes are present -->
{{ if or (.Page.HasShortcode "datetime/current-time") (.Page.HasShortcode "datetime/current-timestamp")
(.Page.HasShortcode "datetime/current-date") (.Page.HasShortcode "datetime/enterprise-eol-date") }}
<script type="text/javascript" src="{{ $dateTimejs.RelPermalink }}"></script>
{{ end }}

<!-- Load code release-toc js when release-toc shortcode is present -->
{{ if .Page.HasShortcode "release-toc" }}
<script type="text/javascript" src="{{ $releaseTOCjs.RelPermalink }}"></script>
{{ end }}

<!-- Load footer.js -->
<script type="text/javascript" src="{{ $footerjs.RelPermalink }}"></script>

@ -7,84 +7,15 @@
{{ $includeFlux := and (in $fluxSupported $product) (in $influxdbFluxSupport $version) }}
{{ $includeResources := not (in (slice "cloud-serverless" "cloud-dedicated" "clustered" "core" "enterprise" "explorer") $version) }}

<script src="https://cdn.jsdelivr.net/npm/docsearch.js@2/dist/cdn/docsearch.min.js"></script>
<script>
  var multiVersion = ['influxdb']
  docsearch({
    apiKey: '501434b53a46a92a7931aecc7c9672e2',
    appId: 'WHM9UWMP6M',
    indexName: 'influxdata',
    inputSelector: '#algolia-search-input',
    // Set debug to true if you want to inspect the dropdown
    debug: true,
    transformData: function (hits) {
      function fmtVersion (version, productKey) {
        if (version == null) {
          return '';
        } else if (version === 'cloud') {
          return 'Cloud (TSM)';
        } else if (version === 'core') {
          return 'Core';
        } else if (version === 'enterprise') {
          return 'Enterprise';
        } else if (version === 'explorer') {
          return 'Explorer';
        } else if (version === 'cloud-serverless') {
          return 'Cloud Serverless';
        } else if (version === 'cloud-dedicated') {
          return 'Cloud Dedicated';
        } else if (version === 'clustered') {
          return 'Clustered';
        } else if (multiVersion.includes(productKey)) {
          return version;
        } else {
          return '';
        }
      };
      productNames = {
        influxdb: 'InfluxDB',
        influxdb3: 'InfluxDB 3',
        enterprise_influxdb: 'InfluxDB Enterprise',
        flux: 'Flux',
        telegraf: 'Telegraf',
        chronograf: 'Chronograf',
        kapacitor: 'Kapacitor',
        platform: 'InfluxData Platform',
        resources: 'Additional Resources',
      };
      hits.map(hit => {
        pathData = new URL(hit.url).pathname.split('/').filter(n => n);
        product = productNames[pathData[0]];
        version = fmtVersion(pathData[1], pathData[0]);

        hit.product = product;
        hit.version = version;
        hit.hierarchy.lvl0 =
          hit.hierarchy.lvl0 +
          ` <span class=\"search-product-version\">${product} ${version}</span>`;
        hit._highlightResult.hierarchy.lvl0.value =
          hit._highlightResult.hierarchy.lvl0.value +
          ` <span class=\"search-product-version\">${product} ${version}</span>`;
      });
      return hits;
    },
    algoliaOptions: {
      hitsPerPage: 10,
      'facetFilters': [
        {{ if or (eq $product "platform") (eq $product "resources") (le (len $productPathData) 1) }}
        'latest:true'
        {{ else if $includeFlux }}
        ['searchTag: {{ $product }}-{{ $version }}', 'flux:true', 'resources:{{ $includeResources }}']
        {{ else }}
        ['searchTag: {{ $product }}-{{ $version }}', 'resources:{{ $includeResources }}']
        {{ end }}
      ]
    },
    autocompleteOptions: {
      templates: {
        header: '<div class="search-all-content"><a href="https:\/\/support.influxdata.com" target="_blank">Search all InfluxData content <span class="icon-arrow-up-right"></span></a>',
        empty: '<div class="search-no-results"><p>Not finding what you\'re looking for?</p> <a href="https:\/\/support.influxdata.com" target="_blank">Search all InfluxData content <span class="icon-arrow-up-right"></span></a></div>'
      }
    }
  });
</script>
<!-- DocSearch Component Container -->
<div
  data-component="doc-search"
  data-api-key="501434b53a46a92a7931aecc7c9672e2"
  data-app-id="WHM9UWMP6M"
  data-index-name="influxdata"
  data-input-selector="#algolia-search-input"
  data-search-tag="{{ $product }}-{{ $version }}"
  data-include-flux="{{ $includeFlux }}"
  data-include-resources="{{ $includeResources }}"
  data-debug="{{ if hugo.IsProduction }}false{{ else }}true{{ end }}"
></div>

@ -5,20 +5,49 @@
<!-- Get site data -->
<!-- Load cloudUrls array -->
{{ $cloudUrls := slice }}
{{- range.Site.Data.influxdb_urls.cloud.providers }}
{{- range.regions }}
{{ $cloudUrls = $cloudUrls | append "{{ safeHTML .url }}" }}
{{- range .Site.Data.influxdb_urls.cloud.providers }}
{{- range .regions }}
{{ $cloudUrls = $cloudUrls | append (safeHTML .url) }}
{{ end -}}
{{ end -}}
{{ $products := .Site.Data.products }}
{{ $influxdb_urls := .Site.Data.influxdb_urls }}

<!-- Build main.js -->
{{ with resources.Get "js/index.js" }}
{{ $isDevelopment := false }}
{{ $isTesting := false }}

{{ with hugo }}
  {{ if .IsDevelopment }}
    {{ $isDevelopment = .IsDevelopment }}
  {{ end }}
{{ end }}

{{ if eq .Site.Params.env "testing" }}
  {{ $isTesting = true }}
{{ end }}
{{ if eq .Site.Params.environment "testing" }}
  {{ $isTesting = true }}
{{ end }}

{{ $isDevelopmentOrTesting := or $isDevelopment $isTesting }}

{{ with resources.Get "js/main.js" }}
  {{ $opts := dict
    "minify" hugo.IsProduction
    "sourceMap" (cond hugo.IsProduction "" "external")
    "sourceMap" (cond $isDevelopmentOrTesting "inline" "")
    "format" (cond $isDevelopmentOrTesting "esm" "iife")
    "bundle" true
    "targetPath" "js/main.js"
    "params" (dict "product" $product "currentVersion" $currentVersion "isServer" hugo.IsServer "products" $products "influxdb_urls" $influxdb_urls "cloudUrls" $cloudUrls)
    "params" (dict
      "product" $product
      "currentVersion" $currentVersion
      "isServer" hugo.IsServer
      "products" $products
      "influxdb_urls" $influxdb_urls
      "cloudUrls" $cloudUrls
      "isDevelopment" $isDevelopmentOrTesting
    )
  }}
  {{ with . | js.Build $opts }}
    {{ if hugo.IsProduction }}

@ -26,7 +55,7 @@
      <script src="{{ .RelPermalink }}" integrity="{{ .Data.Integrity }}" crossorigin="anonymous"></script>
    {{ end }}
    {{ else }}
    <script src="{{ .RelPermalink }}"></script>
    <script type="module" src="{{ .RelPermalink }}"></script>
    {{ end }}
  {{ end }}
{{ end }}

@ -49,7 +49,7 @@
<aside class="sidebar">
  {{ partial "sidebar/sidebar-toggle.html" (dict "state" "Close") }}
  <div class="search-and-nav-toggle">
    <div class="sidebar--search">
    <div class="sidebar--search" data-component="sidebar-search">
      <input class="sidebar--search-field"
        id="algolia-search-input"
        type="text"

@ -43,7 +43,7 @@ Identify products by their product path. Dictionary schema:

{{ $templateDefaults := dict "context" . "productInfo" $productInfo "altLinks" $altLinks "pageRoot" $pageRoot "useRootProductLink" $useRootProductLink }}

<div class="product-list">
<div class="product-list" data-component="product-selector">
  <div id="product-dropdown">
    <p class="selected">{{ index (index $productInfo $pageRoot) 0 | default "Select product" }}</p>
  </div>

@ -1,3 +1,3 @@
<div class="mermaid">
<div class="mermaid" data-component="diagram">
  {{.Inner}}
</div>

@ -1,4 +1,4 @@
<div id="flux-group-keys-demo">
<div id="flux-group-keys-demo" data-component="flux-group-keys-demo">
  <div id="group-by-columns">
    <ul class="column-list">
      <li>

@ -5,4 +5,4 @@
{{- $mac := .Get "mac" | default $default -}}
{{- $win := .Get "win" | default $default -}}
{{- $linux := .Get "linux" | default $default -}}
<span class="keybinding" data-osx="{{ $mac }}" data-win="{{ $win }}" data-linux="{{ $linux }}"><code>{{ $default }}</code></span>
<span class="keybinding" data-osx="{{ $mac }}" data-win="{{ $win }}" data-linux="{{ $linux }}" data-component="keybinding"><code>{{ $default }}</code></span>

@ -1,6 +1,6 @@
{{ $source := .Get 0 | default "telegraf"}}

<div id="list-filters">
<div id="list-filters" data-component="list-filters">

{{ range ( index .Site.Data.list_filters $source) }}
{{ $numValues := len .values }}

@ -2,7 +2,7 @@
{{- $currentVersion := index $productPathData 1 -}}
{{- $show := .Get "show" | default 12 -}}

<div id="release-toc" class="{{ $currentVersion }}">
<div id="release-toc" class="{{ $currentVersion }}" data-component="release-toc">
  <ul id="release-list" style="height: calc({{ $show }} * 1.885rem);" show="{{ $show }}">
    <!-- PLACEHOLDER FOR JS-GENERATED LIST ITEMS -->
  </ul>
@ -5,6 +5,10 @@
pre-commit:
  parallel: true
  commands:
    eslint-debug-check:
      glob: "assets/js/*.js"
      run: yarn eslint {staged_files}
      fail_text: "Debug helpers found! Remove debug imports and calls before committing."
    build-copilot-instructions:
      glob: "CONTRIBUTING.md"
      run: yarn build:copilot-instructions

@ -58,6 +62,10 @@ pre-commit:
        { echo "⚠️ Prettier found formatting issues. Automatic formatting applied."
          git add {staged_files}
        }
    lint-js:
      glob: "assets/js/*.{js,ts}"
      run: yarn eslint {staged_files}
      fail_text: "JavaScript linting failed. Fix errors before committing."
pre-push:
  commands:
    packages-audit:

@ -66,8 +66,6 @@
    "test:links:api-docs": "node cypress/support/run-e2e-specs.js --spec \"cypress/e2e/content/article-links.cy.js\" /influxdb3/core/api/,/influxdb3/enterprise/api/,/influxdb3/cloud-dedicated/api/,/influxdb3/cloud-dedicated/api/v1/,/influxdb/cloud-dedicated/api/v1/,/influxdb/cloud-dedicated/api/management/,/influxdb3/cloud-dedicated/api/management/",
    "test:shortcode-examples": "node cypress/support/run-e2e-specs.js --spec \"cypress/e2e/content/article-links.cy.js\" content/example.md"
  },
  "main": "assets/js/main.js",
  "module": "assets/js/main.js",
  "type": "module",
  "browserslist": [
    "last 2 versions",
@ -0,0 +1,24 @@
def process_request(influxdb3_local, query_parameters, request_headers, request_body, args=None):
    """
    Process an HTTP request to a custom endpoint in the InfluxDB 3 processing engine.

    Args:
        influxdb3_local: Local InfluxDB API client
        query_parameters: Query parameters from the HTTP request
        request_headers: Headers from the HTTP request
        request_body: Body of the HTTP request
        args: Optional arguments passed from the trigger configuration
    """
    # Handle HTTP requests to a custom endpoint
    influxdb3_local.info("Processing HTTP request to custom endpoint")

    # Log the request parameters
    influxdb3_local.info(f"Received request with parameters: {query_parameters}")

    # Process the request body
    if request_body:
        import json
        data = json.loads(request_body)
        influxdb3_local.info(f"Request data: {data}")

    # Return a response (automatically converted to JSON)
    return {"status": "success", "message": "Request processed"}
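To experiment with this handler outside the server, the engine-provided `influxdb3_local` client can be stubbed out. The sketch below reuses the handler body above (with the `json` import hoisted to module level); `StubClient` is a hypothetical test double, not part of the Processing Engine API:

```python
import json

class StubClient:
    """Minimal stand-in for the engine-provided influxdb3_local client."""
    def __init__(self):
        self.logs = []
    def info(self, msg):
        self.logs.append(msg)

def process_request(influxdb3_local, query_parameters, request_headers, request_body, args=None):
    influxdb3_local.info("Processing HTTP request to custom endpoint")
    influxdb3_local.info(f"Received request with parameters: {query_parameters}")
    if request_body:
        data = json.loads(request_body)
        influxdb3_local.info(f"Request data: {data}")
    return {"status": "success", "message": "Request processed"}

client = StubClient()
resp = process_request(client, {"q": "1"}, {}, '{"temp": 21.5}')
print(resp["status"])  # success
```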

@ -0,0 +1,12 @@
def process_scheduled_call(influxdb3_local, call_time, args=None):
    """
    Process a scheduled call from the InfluxDB 3 processing engine.

    Args:
        influxdb3_local: Local InfluxDB API client
        call_time: Time when the trigger was called
        args: Optional arguments passed from the trigger configuration
    """
    influxdb3_local.info(f"Processing scheduled call at {call_time}")
    if args:
        influxdb3_local.info(f"With arguments: {args}")
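The same stubbing approach works for the scheduled example: call the function directly with a fake call time and arguments. `StubClient` below is a hypothetical test double for the engine-provided `influxdb3_local`:

```python
from datetime import datetime, timezone

class StubClient:
    """Minimal stand-in for the engine-provided influxdb3_local client."""
    def __init__(self):
        self.logs = []
    def info(self, msg):
        self.logs.append(msg)

def process_scheduled_call(influxdb3_local, call_time, args=None):
    influxdb3_local.info(f"Processing scheduled call at {call_time}")
    if args:
        influxdb3_local.info(f"With arguments: {args}")

client = StubClient()
process_scheduled_call(client, datetime(2024, 1, 1, tzinfo=timezone.utc),
                       {"threshold": "10"})
print(client.logs)
```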

@ -0,0 +1,18 @@
def process_writes(influxdb3_local, table_batches, args=None):
    """
    Process writes to the InfluxDB 3 processing engine, handling
    data persisted to the object store.
    """
    # Process data as it's written to the database
    for table_batch in table_batches:
        table_name = table_batch["table_name"]
        rows = table_batch["rows"]

        # Log information about the write
        influxdb3_local.info(f"Processing {len(rows)} rows from {table_name}")

        # Write derived data back to the database
        # LineBuilder is provided by the Processing Engine at runtime
        line = LineBuilder("processed_data")
        line.tag("source_table", table_name)
        line.int64_field("row_count", len(rows))
        influxdb3_local.write(line)
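Because `LineBuilder` and `influxdb3_local` are injected by the engine at runtime, exercising this plugin outside the server means stubbing both. The stubs below are hypothetical test doubles, not the real Processing Engine API; the plugin body is the one shown above, with `LineBuilder` passed in as a parameter for testability:

```python
class StubLine:
    """Records LineBuilder-style calls for inspection (test stub only)."""
    def __init__(self, measurement):
        self.measurement = measurement
        self.tags = {}
        self.fields = {}
    def tag(self, key, value):
        self.tags[key] = value
    def int64_field(self, key, value):
        self.fields[key] = value

class StubClient:
    """Minimal stand-in for the engine-provided influxdb3_local client."""
    def __init__(self):
        self.logs = []
        self.written = []
    def info(self, msg):
        self.logs.append(msg)
    def write(self, line):
        self.written.append(line)

def process_writes(influxdb3_local, table_batches, args=None, LineBuilder=StubLine):
    for table_batch in table_batches:
        table_name = table_batch["table_name"]
        rows = table_batch["rows"]
        influxdb3_local.info(f"Processing {len(rows)} rows from {table_name}")
        line = LineBuilder("processed_data")
        line.tag("source_table", table_name)
        line.int64_field("row_count", len(rows))
        influxdb3_local.write(line)

client = StubClient()
process_writes(client, [{"table_name": "cpu", "rows": [{}, {}]}])
print(client.written[0].fields["row_count"])  # 2
```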