Merge branch 'master' into sql-window-functions

pull/5854/head
Scott Anderson 2025-02-19 13:44:53 -07:00
commit c6f0fdd5d9
6 changed files with 1621 additions and 624 deletions

View File

@ -129,8 +129,8 @@ spec:
containers:
iox:
env:
INFLUXDB_IOX_CREATE_CATALOG_BACKUP_DATA_SNAPSHOT_FILES: true
INFLUXDB_IOX_DELETE_USING_CATALOG_BACKUP_DATA_SNAPSHOT_FILES: true
INFLUXDB_IOX_CREATE_CATALOG_BACKUP_DATA_SNAPSHOT_FILES: 'true'
INFLUXDB_IOX_DELETE_USING_CATALOG_BACKUP_DATA_SNAPSHOT_FILES: 'true'
INFLUXDB_IOX_KEEP_HOURLY_CATALOG_BACKUP_FILE_LISTS: '30d'
INFLUXDB_IOX_KEEP_DAILY_CATALOG_BACKUP_FILE_LISTS: '90d'
INFLUXDB_IOX_GC_OBJECTSTORE_CUTOFF: '14d'
@ -140,20 +140,20 @@ spec:
#### INFLUXDB_IOX_CREATE_CATALOG_BACKUP_DATA_SNAPSHOT_FILES
Enable hourly Catalog snapshotting. The default is `false`. Set to `true`:
Enable hourly Catalog snapshotting. The default is `'false'`. Set to `'true'`:
```yaml
INFLUXDB_IOX_CREATE_CATALOG_BACKUP_DATA_SNAPSHOT_FILES: true
INFLUXDB_IOX_CREATE_CATALOG_BACKUP_DATA_SNAPSHOT_FILES: 'true'
```
#### INFLUXDB_IOX_DELETE_USING_CATALOG_BACKUP_DATA_SNAPSHOT_FILES
Enable a snapshot check when deleting files to ensure the Garbage Collector does
not remove Parquet files from the object store that are associated with existing
snapshots. The default is `false`. Set to `true`:
snapshots. The default is `'false'`. Set to `'true'`:
```yaml
INFLUXDB_IOX_DELETE_USING_CATALOG_BACKUP_DATA_SNAPSHOT_FILES: true
INFLUXDB_IOX_DELETE_USING_CATALOG_BACKUP_DATA_SNAPSHOT_FILES: 'true'
```
> [!Caution]

View File

@ -40,12 +40,13 @@ to
#### Write the home sensor data to InfluxDB
Use the `influxdb3` CLI, InfluxDB v2 API, or InfluxDB v1 API to write the
Use the `influxdb3` CLI, InfluxDB v3 API, InfluxDB v2 API, or InfluxDB v1 API to write the
home sensor sample data to {{< product-name >}}.
{{< code-tabs-wrapper >}}
{{% code-tabs %}}
[influxdb3](#)
[v3 API](#)
[v2 API](#)
[v1 API](#)
{{% /code-tabs %}}
@ -90,6 +91,43 @@ home,room=Kitchen temp=22.7,hum=36.5,co=26i 1641067200'
{{% /code-tab-content %}}
{{% code-tab-content %}}
{{% influxdb/custom-timestamps %}}
{{% code-placeholders "AUTH_TOKEN|DATABASE_NAME" %}}
```sh
curl -v "http://localhost:8181/api/v3/write_lp?db=sensors&precision=auto&accept_partial=true" \
--data-raw "home,room=Living\ Room temp=21.1,hum=35.9,co=0i 1735545600
home,room=Kitchen temp=21.0,hum=35.9,co=0i 1735545600
home,room=Living\ Room temp=21.4,hum=35.9,co=0i 1735549200
home,room=Kitchen temp=23.0,hum=36.2,co=0i 1735549200
home,room=Living\ Room temp=21.8,hum=36.0,co=0i 1735552800
home,room=Kitchen temp=22.7,hum=36.1,co=0i 1735552800
home,room=Living\ Room temp=22.2,hum=36.0,co=0i 1735556400
home,room=Kitchen temp=22.4,hum=36.0,co=0i 1735556400
home,room=Living\ Room temp=22.2,hum=35.9,co=0i 1735560000
home,room=Kitchen temp=22.5,hum=36.0,co=0i 1735560000
home,room=Living\ Room temp=22.4,hum=36.0,co=0i 1735563600
home,room=Kitchen temp=22.8,hum=36.5,co=1i 1735563600
home,room=Living\ Room temp=22.3,hum=36.1,co=0i 1735567200
home,room=Kitchen temp=22.8,hum=36.3,co=1i 1735567200
home,room=Living\ Room temp=22.3,hum=36.1,co=1i 1735570800
home,room=Kitchen temp=22.7,hum=36.2,co=3i 1735570800
home,room=Living\ Room temp=22.4,hum=36.0,co=4i 1735574400
home,room=Kitchen temp=22.4,hum=36.0,co=7i 1735574400
home,room=Living\ Room temp=22.6,hum=35.9,co=5i 1735578000
home,room=Kitchen temp=22.7,hum=36.0,co=9i 1735578000
home,room=Living\ Room temp=22.8,hum=36.2,co=9i 1735581600
home,room=Kitchen temp=23.3,hum=36.9,co=18i 1735581600
home,room=Living\ Room temp=22.5,hum=36.3,co=14i 1735585200
home,room=Kitchen temp=23.1,hum=36.6,co=22i 1735585200
home,room=Living\ Room temp=22.2,hum=36.4,co=17i 1735588800
home,room=Kitchen temp=22.7,hum=36.5,co=26i 1735588800"
```
{{% /code-placeholders %}}
{{% /influxdb/custom-timestamps %}}
{{% /code-tab-content %}}
{{% code-tab-content %}}
{{% influxdb/custom-timestamps %}}
{{% code-placeholders "AUTH_TOKEN|DATABASE_NAME" %}}
```sh
@ -227,12 +265,13 @@ to
#### Write the home sensor actions data to InfluxDB
Use the `influxdb3` CLI, InfluxDB v2 API, or InfluxDB v1 API to write the
Use the `influxdb3` CLI, InfluxDB v3 API, InfluxDB v2 API, or InfluxDB v1 API to write the
home sensor actions sample data to {{< product-name >}}.
{{< code-tabs-wrapper >}}
{{% code-tabs %}}
[influxdb3](#)
[v3 API](#)
[v2 API](#)
[v1 API](#)
{{% /code-tabs %}}
@ -259,6 +298,25 @@ home_actions,room=Living\ Room,action=alert,level=warn description="Carbon monox
{{% /code-tab-content %}}
{{% code-tab-content %}}
{{% influxdb/custom-timestamps %}}
{{% code-placeholders "AUTH_TOKEN|DATABASE_NAME" %}}
```sh
curl -v "http://localhost:8181/api/v3/write_lp?db=sensors&precision=auto&accept_partial=true" \
--data-raw "home_actions,room=Kitchen,action=cool,level=ok description=\"Temperature at or above 23°C (23°C). Cooling to 22°C.\" 1739437200
home_actions,room=Kitchen,action=cool,level=ok description=\"Temperature at or above 23°C (23.3°C). Cooling to 22°C.\" 1739469600
home_actions,room=Kitchen,action=cool,level=ok description=\"Temperature at or above 23°C (23.1°C). Cooling to 22°C.\" 1739473200
home_actions,room=Kitchen,action=alert,level=warn description=\"Carbon monoxide level above normal: 18 ppm.\" 1739469600
home_actions,room=Kitchen,action=alert,level=warn description=\"Carbon monoxide level above normal: 22 ppm.\" 1739473200
home_actions,room=Kitchen,action=alert,level=warn description=\"Carbon monoxide level above normal: 26 ppm.\" 1739476800
home_actions,room=Living\ Room,action=alert,level=warn description=\"Carbon monoxide level above normal: 14 ppm.\" 1739473200
home_actions,room=Living\ Room,action=alert,level=warn description=\"Carbon monoxide level above normal: 17 ppm.\" 1739476800"
```
{{% /code-placeholders %}}
{{% /influxdb/custom-timestamps %}}
{{% /code-tab-content %}}
{{% code-tab-content %}}
{{% influxdb/custom-timestamps %}}
{{% code-placeholders "AUTH_TOKEN|DATABASE_NAME" %}}
```sh
@ -354,12 +412,13 @@ series use cases that involve seasonality.
#### Write the NOAA Bay Area weather data to InfluxDB
Use the `influxdb3` CLI, InfluxDB v2 API, or InfluxDB v1 API to write the
Use the `influxdb3` CLI, InfluxDB v3 API, InfluxDB v2 API, or InfluxDB v1 API to write the
NOAA Bay Area weather sample data to {{< product-name >}}.
{{< code-tabs-wrapper >}}
{{% code-tabs %}}
[influxdb3](#)
[v3 API](#)
[v2 API](#)
[v1 API](#)
{{% /code-tabs %}}
@ -374,6 +433,16 @@ influxdb3 write \
```
{{% /code-placeholders %}}
{{% /code-tabs %}}
{{% code-tab-content %}}
{{% code-placeholders "AUTH_TOKEN|DATABASE_NAME" %}}
```sh
curl -v "http://localhost:8181/api/v3/write_lp?db=sensors&precision=auto&accept_partial=false" \
--data-binary "$(curl --request GET https://docs.influxdata.com/downloads/bay-area-weather.lp)"
```
{{% /code-placeholders %}}
{{% /code-tab-content %}}
{{% code-tab-content %}}
@ -454,12 +523,13 @@ The Bitcoin price sample dataset provides Bitcoin prices from
#### Write the Bitcoin price sample data to InfluxDB
Use the `influxdb3` CLI, InfluxDB v2 API, or InfluxDB v1 API to write the
Use the `influxdb3` CLI, InfluxDB v3 API, InfluxDB v2 API, or InfluxDB v1 API to write the
Bitcoin price sample data to {{< product-name >}}.
{{< code-tabs-wrapper >}}
{{% code-tabs %}}
[influxdb3](#)
[v3 API](#)
[v2 API](#)
[v1 API](#)
{{% /code-tabs %}}
@ -474,6 +544,16 @@ influxdb3 write \
```
{{% /code-placeholders %}}
{{% /code-tabs %}}
{{% code-tab-content %}}
{{% code-placeholders "AUTH_TOKEN|DATABASE_NAME" %}}
```sh
curl -v "http://localhost:8181/api/v3/write_lp?db=sensors&precision=auto&accept_partial=false" \
--data-binary "$(curl --request GET https://docs.influxdata.com/downloads/bitcoin.lp)"
```
{{% /code-placeholders %}}
{{% /code-tab-content %}}
{{% code-tab-content %}}
@ -528,12 +608,13 @@ transformation functions.
#### Write the random number sample data to InfluxDB
Use the `influxdb3` CLI, InfluxDB v2 API, or InfluxDB v1 API to write the
Use the `influxdb3` CLI, InfluxDB v3 API, InfluxDB v2 API, or InfluxDB v1 API to write the
random number sample data to {{< product-name >}}.
{{< code-tabs-wrapper >}}
{{% code-tabs %}}
[influxdb3](#)
[v3 API](#)
[v2 API](#)
[v1 API](#)
{{% /code-tabs %}}
@ -551,6 +632,16 @@ influxdb3 write \
{{% /code-tab-content %}}
{{% code-tab-content %}}
{{% code-placeholders "AUTH_TOKEN|DATABASE_NAME" %}}
```sh
curl -v "http://localhost:8181/api/v3/write_lp?db=sensors&precision=auto&accept_partial=false" \
--data-binary "$(curl --request GET https://docs.influxdata.com/downloads/bitcoin.lp)"
```
{{% /code-placeholders %}}
{{% /code-tab-content %}}
{{% code-tab-content %}}
{{% code-placeholders "AUTH_TOKEN|DATABASE_NAME" %}}
```sh
curl --request POST \

View File

@ -570,11 +570,6 @@ influxdb3 create distinct_cache -h
### Python plugins and the Processing engine
> [!Important]
> #### Processing engine only works with Docker
>
> The Processing engine is currently supported only in Docker x86 environments. Non-Docker support is coming soon. The engine, API, and developer experience are actively evolving and may change. Join our [Discord](https://discord.gg/9zaNCW2PRT) for updates and feedback.
The InfluxDB 3 Processing engine is an embedded Python VM for running code inside the database to process and transform data.
To use the Processing engine, you create [plugins](#plugin) and [triggers](#trigger).
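For orientation, here is a minimal sketch of what a WAL-flush plugin can look like. The entry-point name `process_writes` and the shape of `table_batches` are assumptions for illustration; the full example appears under "Example: Python plugin for WAL flush" below.
```python
# Minimal WAL-flush plugin sketch (entry-point name and argument shapes assumed).
def process_writes(influxdb3_local, table_batches, args=None):
    for table_batch in table_batches:
        table_name = table_batch["table_name"]
        row_count = len(table_batch["rows"])
        # Log a summary through the local API object the engine passes in.
        influxdb3_local.info(f"{table_name}: {row_count} rows flushed")
```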
@ -609,11 +604,6 @@ InfluxDB 3 provides the following types of triggers:
### Test, create, and trigger plugin code
> [!Important]
> #### Processing engine only works with Docker
>
> The Processing engine is currently supported only in Docker x86 environments. Non-Docker support is coming soon. The engine, API, and developer experience are actively evolving and may change. Join our [Discord](https://discord.gg/9zaNCW2PRT) for updates and feedback.
##### Example: Python plugin for WAL flush
```python
@ -699,10 +689,9 @@ Test your InfluxDB 3 plugin safely without affecting written data. During a plug
To test a plugin, do the following:
1. Create a _plugin directory_--for example, `/path/to/.influxdb/plugins`
2. Make the plugin directory available to the Docker container (for example, using a bind mount)
3. Run the Docker command to [start the server](#start-influxdb) and include the `--plugin-dir` option with your plugin directory path.
4. Save the [preceding example code](#example-python-plugin) to a plugin file inside of the plugin directory. If you haven't yet written data to the table in the example, comment out the lines where it queries.
5. To run the test, enter the following command with the following options:
2. [Start the InfluxDB server](#start-influxdb) and include the `--plugin-dir` option with your plugin directory path.
3. Save the [preceding example code](#example-python-plugin) to a plugin file inside of the plugin directory. If you haven't yet written data to the table in the example, comment out the lines where it queries.
4. To run the test, enter the following command with the following options:
- `--lp` or `--file`: The line protocol to test
- Optional: `--input-arguments`: A comma-delimited list of `<KEY>=<VALUE>` arguments for your plugin code
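As a hedged sketch of that command (the `test wal_plugin` subcommand name, the `--database` flag, and the positional plugin filename are assumptions; `--lp` and `--input-arguments` are the options listed above):
```bash
# Sketch only -- verify flag names with `influxdb3 test wal_plugin -h`.
influxdb3 test wal_plugin \
  --lp "home,room=Kitchen temp=22.7,hum=36.5,co=26i 1735588800" \
  --input-arguments "threshold=21.0" \
  --database sensors \
  test_plugin.py
```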

View File

@ -1,7 +1,3 @@
> [!Important]
> #### Processing engine only works with Docker
>
> The Processing engine is currently supported only in Docker x86 environments. Non-Docker support is coming soon. The engine, API, and developer experience are actively evolving and may change. Join our [Discord](https://discord.gg/9zaNCW2PRT) for updates and feedback.
Use the {{% product-name %}} Processing engine to run code and perform tasks
for different database events.
@ -35,6 +31,7 @@ The Processing engine provides four types of plugins and triggers--each type cor
- **On Request**: Bound to the HTTP API `/api/v3/engine/<CUSTOM_PATH>` endpoint and triggered by a GET or POST request to the endpoint.
## Activate the Processing engine
To enable the Processing engine, start the {{% product-name %}} server with the `--plugin-dir` option and a path to your plugins directory (it doesn't need to exist yet)--for example:
```bash
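# Sketch only: the exact serve options depend on your deployment; the relevant
# part here is pointing --plugin-dir at your plugins directory.
influxdb3 serve \
  --plugin-dir /path/to/.influxdb/plugins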
@ -351,6 +348,7 @@ def process_scheduled_call(influxdb3_local, time, args=None):
```
### Schedule Trigger Configuration
Schedule plugins are set with a `trigger-spec` of `schedule:<cron_expression>` or `every:<duration>`. The `args` parameter can be used to pass configuration to the plugin. For example, if we wanted to use the system-metrics example from the GitHub repo and have it collect every 10 seconds, we could use the following trigger definition:
```shell
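# Sketch only: the plugin filename is a placeholder for the system-metrics
# example; "every:10s" follows the trigger-spec format described above.
influxdb3 create trigger \
  --trigger-spec "every:10s" \
  --plugin-filename "system_metrics.py" \
  --database mydb \
  system-metrics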
@ -361,6 +359,7 @@ influxdb3 create trigger \
```
## On Request Plugin
On Request plugins are triggered by a request to a specific endpoint under `/api/v3/engine`. The plugin will receive the local API, query parameters `Dict[str, str]`, request headers `Dict[str, str]`, request body (as bytes), and any arguments passed in the trigger definition. Here's an example of a simple On Request plugin:
```python
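# Sketch only: a minimal On Request plugin. The parameters follow the list
# above (local API, query parameters, request headers, request body, trigger
# args); the JSON handling and return-value behavior are assumptions.
import json

def process_request(influxdb3_local, query_parameters, request_headers, request_body, args=None):
    # Parse the request body (bytes) as JSON, if present, and log what arrived.
    payload = json.loads(request_body) if request_body else {}
    influxdb3_local.info(f"request: params={query_parameters}, body={payload}")
    # Assumed: the returned value is serialized as the HTTP response body.
    return {"status": "ok"}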
@ -387,6 +386,7 @@ def process_request(influxdb3_local, query_parameters, request_headers, request_
```
### On Request Trigger Configuration
On Request plugins are set with a `trigger-spec` of `request:<endpoint>`. The `args` parameter can be used to pass configuration to the plugin. For example, if we wanted the above plugin to run on the endpoint `/api/v3/engine/my_plugin`, we would use `request:my_plugin` as the `trigger-spec`.
Trigger specs must be unique across all configured plugins, regardless of which database they are tied to, because the endpoint path is shared. Here's an example that creates a request trigger tied to the "hello-world" path using a plugin in the plugin-dir:
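A hedged sketch of that command (flag names assumed; the plugin filename and trigger name are placeholders):
```bash
# Sketch only: binds the plugin to GET/POST requests at /api/v3/engine/hello-world.
influxdb3 create trigger \
  --trigger-spec "request:hello-world" \
  --plugin-filename "hello_world.py" \
  --database mydb \
  hello-world-trigger
```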

View File

@ -560,11 +560,6 @@ influxdb3 create distinct_cache -h
### Python plugins and the Processing engine
> [!Important]
> #### Processing engine only works with Docker
>
> The Processing engine is currently supported only in Docker x86 environments. Non-Docker support is coming soon. The engine, API, and developer experience are actively evolving and may change. Join our [Discord](https://discord.gg/9zaNCW2PRT) for updates and feedback.
The InfluxDB 3 Processing engine is an embedded Python VM for running code inside the database to process and transform data.
To use the Processing engine, you create [plugins](#plugin) and [triggers](#trigger).
@ -599,11 +594,6 @@ InfluxDB 3 provides the following types of triggers:
### Test, create, and trigger plugin code
> [!Important]
> #### Processing engine only works with Docker
>
> The Processing engine is currently supported only in Docker x86 environments. Non-Docker support is coming soon. The engine, API, and developer experience are actively evolving and may change. Join our [Discord](https://discord.gg/9zaNCW2PRT) for updates and feedback.
##### Example: Python plugin for WAL flush
```python
@ -689,10 +679,9 @@ Test your InfluxDB 3 plugin safely without affecting written data. During a plug
To test a plugin, do the following:
1. Create a _plugin directory_--for example, `/path/to/.influxdb/plugins`
2. Make the plugin directory available to the Docker container (for example, using a bind mount)
3. Run the Docker command to [start the server](#start-influxdb) and include the `--plugin-dir` option with your plugin directory path.
4. Save the [preceding example code](#example-python-plugin) to a plugin file inside of the plugin directory. If you haven't yet written data to the table in the example, comment out the lines where it queries.
5. To run the test, enter the following command with the following options:
2. [Start the InfluxDB server](#start-influxdb) and include the `--plugin-dir` option with your plugin directory path.
3. Save the [preceding example code](#example-python-plugin) to a plugin file inside of the plugin directory. If you haven't yet written data to the table in the example, comment out the lines where it queries.
4. To run the test, enter the following command with the following options:
- `--lp` or `--file`: The line protocol to test
- Optional: `--input-arguments`: A comma-delimited list of `<KEY>=<VALUE>` arguments for your plugin code

yarn.lock (2090 changes): file diff suppressed because it is too large.