feat(v3): WIP: Revise API specs for Core and Enterprise

- Adds basic support for core and enterprise in getswagger.sh
- Adds custom OpenAPI info for Core and Enterprise
- Validates as OpenAPI 3.0 (using Spectral)
    - operationId
    - tags
- Revises use of Legacy, v2
- TODO: need to check and validate in UI, adjust tags if necessary.
- Add and remove components
- Update parameters
- Add examples
- Add tests for Core
pull/5870/head
Jason Stirnaman 2025-02-04 18:04:47 -06:00
parent f11461a419
commit 4314589c07
26 changed files with 3150 additions and 179 deletions


@ -0,0 +1,11 @@
extends: substitution
message: Did you mean '%s' instead of '%s'?
level: warning
ignorecase: false
# swap maps tokens in form of bad: good
# NOTE: The left-hand (bad) side can match the right-hand (good) side;
# Vale ignores alerts that match the intended form.
swap:
'cloud-serverless|cloud-dedicated|clustered': core
'Cloud Serverless|Cloud Dedicated|Clustered': Core
'API token': database token


@ -0,0 +1,10 @@
extends: substitution
message: Did you mean '%s' instead of '%s'?
level: warning
ignorecase: false
# swap maps tokens in form of bad: good
# NOTE: The left-hand (bad) side can match the right-hand (good) side;
# Vale ignores alerts that match the intended form.
swap:
'(?i)bucket': database
'(?i)measurement': table
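Vale `substitution` rules treat each left-hand key as a regex (case-sensitive unless an inline `(?i)` flag is used, as above) and suggest the right-hand value. A rough Python sketch of that matching behavior — not Vale's actual engine, just an illustration of how the swap table is applied:

```python
import re

# Approximation of a Vale substitution rule: each left-hand (bad)
# pattern maps to the suggested (good) replacement.
swap = {r"(?i)bucket": "database", r"(?i)measurement": "table"}

def suggestions(text):
    """Return (matched_text, suggestion) pairs for a line of prose."""
    hits = []
    for bad, good in swap.items():
        for m in re.finditer(bad, text):
            hits.append((m.group(0), good))
    return hits

print(suggestions("Write points to a Bucket"))  # [('Bucket', 'database')]
```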

.frontmatter-schema.json Normal file

@ -0,0 +1,40 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"type": "object",
"properties": {
"title": {
"type": "string",
"description": "Title of the page"
},
"description": {
"type": "string",
"description": "Page description that supports multi-line text"
},
"menu": {
"type": "object",
"properties": {
"influxdb3_core": {
"type": "object",
"properties": {
"name": {
"type": "string",
"description": "Menu item name"
}
},
"required": ["name"]
}
}
},
"weight": {
"type": "integer",
"description": "Order weight for menu items",
"minimum": 0
},
"source": {
"type": "string",
"description": "Path to source content file",
"pattern": "^/shared/.+\\.md$"
}
},
"required": ["title", "description", "menu", "weight"]
}
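A minimal standard-library sketch of what this schema enforces (a real setup would use a JSON Schema validator such as `jsonschema`; the sample frontmatter below is hypothetical):

```python
import re

REQUIRED = {"title", "description", "menu", "weight"}
SOURCE_PATTERN = re.compile(r"^/shared/.+\.md$")

def check_frontmatter(fm: dict) -> list:
    """Return a list of problems; an empty list means the frontmatter passes."""
    problems = [f"missing required key: {k}" for k in REQUIRED - fm.keys()]
    weight = fm.get("weight", 0)
    if not isinstance(weight, int) or weight < 0:
        problems.append("weight must be a non-negative integer")
    src = fm.get("source")
    if src is not None and not SOURCE_PATTERN.match(src):
        problems.append("source must match ^/shared/.+\\.md$")
    return problems

# Hypothetical page frontmatter that satisfies the schema.
fm = {
    "title": "Write data",
    "description": "Write data to InfluxDB 3 Core.",
    "menu": {"influxdb3_core": {"name": "Write data"}},
    "weight": 101,
    "source": "/shared/influxdb3-write-guides/_index.md",
}
print(check_frontmatter(fm))  # []
```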

.gitignore vendored

@ -8,7 +8,7 @@ node_modules
*.log
/resources
.hugo_build.lock
/content/influxdb*/*/api/**/*.html
/api-docs/redoc-static.html*
.vscode/*
.idea

.vscode/settings.json vendored

@ -1,4 +1,17 @@
{
"commentAnchors.tags.anchors":
{ "SOURCE": {
"scope": "file",
"behavior": "link",
"iconColor": "#FF0000",
"highlightColor": "#FF0000",
"style": "bold"
}},
"commentAnchors.workspace.matchFiles": "**/*.{md,ini,json,yaml,yml}",
"commentAnchors.workspace.enabled": true,
"yaml.schemas": {
"./.frontmatter-schema.json": "${workspaceFolder}/content/**/*.md"
},
"vale.valeCLI.config": "${workspaceFolder}/.vale.ini",
"vale.valeCLI.minAlertLevel": "warning",
}


@ -46,9 +46,10 @@ To install dependencies listed in package.json:
4. Install the Yarn package manager and run `yarn` to install project dependencies.
`package.json` contains dependencies for linting and running Git hooks.
docs-v2 uses [Lefthook](https://github.com/evilmartians/lefthook) to configure and manage pre-commit hooks for linting and testing Markdown content.
Other dependencies used in the project:
- **[husky](https://github.com/typicode/husky)**: manages Git hooks, including the pre-commit hook for linting and testing
- **[lint-staged](https://github.com/lint-staged/lint-staged)**: passes staged files to commands
- **[prettier](https://prettier.io/docs/en/)**: formats code, including Markdown, according to style rules for consistency
### Install Docker
@ -72,6 +73,17 @@ docker build -t influxdata:docs-pytest -f Dockerfile.pytest .
To run the documentation locally, follow the instructions provided in the README.
### Install Visual Studio Code extensions
If you use Microsoft Visual Studio (VS) Code, you can install extensions
to help you navigate, check, and edit files.
docs-v2 contains a `./.vscode/settings.json` that configures the following extensions:
- Comment Anchors: recognizes tags (for example, `//SOURCE`) and makes links and filepaths clickable in comments.
- Vale: shows linter errors and suggestions in the editor.
- YAML Schemas: validates frontmatter attributes.
### Make your changes
Make your suggested changes, being sure to follow the [style and formatting guidelines](#style--formatting) outlined below.
@ -80,15 +92,15 @@ Make your suggested changes being sure to follow the [style and formatting guide
### Automatic pre-commit checks
docs-v2 uses Lefthook to manage Git hooks, such as pre-commit hooks that lint Markdown and test code blocks.
When you try to commit changes (`git commit`), Git runs
the commands configured in `lefthook.yml` which pass your **staged** files to Vale, Prettier, and Pytest (in a Docker container).
### Skip pre-commit hooks
**We strongly recommend running linting and tests**, but you can skip them
(and avoid installing dependencies)
by including the `LEFTHOOK=0` environment variable or the `--no-verify` flag with
your commit--for example:
```sh
@ -96,11 +108,9 @@ git commit -m "<COMMIT_MESSAGE>" --no-verify
```
```sh
LEFTHOOK=0 git commit
```
### Set up test scripts and credentials
To set up your docs-v2 instance to run tests locally, do the following:


@ -34,6 +34,11 @@ RUN apt-get update && apt-get upgrade -y && apt-get install -y \
telegraf \
wget
# Install InfluxDB 3 Core
RUN curl -O https://www.influxdata.com/d/install_influxdb3.sh \
&& chmod +x install_influxdb3.sh \
&& yes | ./install_influxdb3.sh
RUN ln -s /usr/bin/python3 /usr/bin/python
# Create a virtual environment for Python to avoid conflicts with the system Python and having to use the --break-system-packages flag when installing packages with pip.


@ -62,7 +62,7 @@ function showHelp {
subcommand=$1
case "$subcommand" in
cloud-dedicated-v2|cloud-dedicated-management|cloud-serverless-v2|clustered-v2|cloud-v2|v2|v1-compat|core-v3|enterprise-v3|all)
product=$1
shift
@ -176,6 +176,17 @@ function updateCloudDedicatedV2 {
postProcess $outFile 'influxdb3/cloud-dedicated/.config.yml' v2@2
}
function updateCloudServerlessV2 {
outFile="influxdb3/cloud-serverless/v2/ref.yml"
if [[ -z "$baseUrl" ]];
then
echo "Using existing $outFile"
else
curl $UPDATE_OPTIONS ${baseUrl}/contracts/ref/cloud.yml -o $outFile
fi
postProcess $outFile 'influxdb3/cloud-serverless/.config.yml' v2@2
}
function updateClusteredV2 {
outFile="influxdb3/clustered/v2/ref.yml"
if [[ -z "$baseUrl" ]];
@ -187,15 +198,28 @@ function updateClusteredV2 {
postProcess $outFile 'influxdb3/clustered/.config.yml' v2@2
}
function updateCoreV3 {
outFile="influxdb3/core/v3/ref.yml"
if [[ -z "$baseUrl" ]];
then
echo "Using existing $outFile"
else
local url="${baseUrl}/TO_BE_DECIDED"
curl $UPDATE_OPTIONS $url -o $outFile
fi
postProcess $outFile 'influxdb3/core/.config.yml' v3@3
}
function updateEnterpriseV3 {
outFile="influxdb3/enterprise/v3/ref.yml"
if [[ -z "$baseUrl" ]];
then
echo "Using existing $outFile"
else
local url="${baseUrl}/TO_BE_DECIDED"
curl $UPDATE_OPTIONS $url -o $outFile
fi
postProcess $outFile 'influxdb3/enterprise/.config.yml' v3@3
}
function updateOSSV2 {
@ -220,7 +244,7 @@ function updateV1Compat {
postProcess $outFile 'influxdb/cloud/.config.yml' 'v1-compatibility'
outFile="influxdb/v2/v1-compatibility/swaggerV1Compat.yml"
cp influxdb/cloud/v1-compatibility/swaggerV1Compat.yml $outFile
postProcess $outFile 'influxdb/v2/.config.yml' 'v1-compatibility'
outFile="influxdb3/cloud-dedicated/v1-compatibility/swaggerV1Compat.yml"
@ -257,6 +281,12 @@ then
elif [ "$product" = "clustered-v2" ];
then
updateClusteredV2
elif [ "$product" = "core-v3" ];
then
updateCoreV3
elif [ "$product" = "enterprise-v3" ];
then
updateEnterpriseV3
elif [ "$product" = "v2" ];
then
updateOSSV2
@ -270,9 +300,11 @@ then
updateCloudDedicatedManagement
updateCloudServerlessV2
updateClusteredV2
updateCoreV3
updateEnterpriseV3
updateOSSV2
updateV1Compat
else
echo "Provide a product argument: cloud-v2, cloud-serverless-v2, cloud-dedicated-v2, cloud-dedicated-management, clustered-v2, core-v3, enterprise-v3, v2, v1-compat, or all."
showHelp
fi
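The new `core-v3` and `enterprise-v3` subcommands extend the script's existing `case`/`elif` dispatch. A minimal sketch of that pattern (function names mirror the script, but the bodies here are stubs, not the real fetch-and-post-process logic):

```shell
#!/usr/bin/env bash

# Stubbed update functions; the real getswagger.sh fetches and
# post-processes the OpenAPI contracts.
updateCoreV3()       { echo "updating influxdb3/core/v3/ref.yml"; }
updateEnterpriseV3() { echo "updating influxdb3/enterprise/v3/ref.yml"; }

product="${1:-core-v3}"
case "$product" in
  core-v3)        updateCoreV3 ;;
  enterprise-v3)  updateEnterpriseV3 ;;
  all)            updateCoreV3; updateEnterpriseV3 ;;
  *)              echo "Provide a product argument: core-v3, enterprise-v3, or all." ;;
esac
```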


@ -0,0 +1,23 @@
title: InfluxDB 3 Core API Service
x-influxdata-short-title: InfluxDB 3 API
x-influxdata-version-matrix:
v1: Legacy compatibility layer
v2: Backward compatibility with InfluxDB 2.x
v3: Current native API
summary: The InfluxDB HTTP API for InfluxDB 3 Core provides a programmatic interface for writing data to and querying data from an InfluxDB 3 Core database.
description: |
Write and query data, and perform administrative tasks, such as managing databases and processing engine plugins.
The InfluxDB HTTP API for InfluxDB 3 Core includes endpoints for compatibility with InfluxDB 2.x and InfluxDB 1.x APIs.
<!-- TODO: verify where to host the spec that users can download.
This documentation is generated from the
[InfluxDB OpenAPI specification](https://raw.githubusercontent.com/influxdata/openapi/master/contracts/ref/cloud.yml).
-->
license:
name: MIT
url: 'https://opensource.org/licenses/MIT'
contact:
name: InfluxData
url: https://www.influxdata.com
email: support@influxdata.com


@ -0,0 +1,8 @@
- url: https://{baseurl}
description: InfluxDB 3 Core API URL
variables:
baseurl:
enum:
- 'localhost:8181'
default: 'localhost:8181'
description: InfluxDB 3 Core URL
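Clients resolve the templated server URL by substituting each variable's default (or a chosen enum value) into the `url` template. A small Python sketch of that resolution:

```python
# OpenAPI server object as declared above.
server = {
    "url": "https://{baseurl}",
    "description": "InfluxDB 3 Core API URL",
    "variables": {
        "baseurl": {
            "enum": ["localhost:8181"],
            "default": "localhost:8181",
            "description": "InfluxDB 3 Core URL",
        }
    },
}

def resolve(server: dict) -> str:
    """Substitute each server variable's default into the URL template."""
    defaults = {name: var["default"] for name, var in server.get("variables", {}).items()}
    return server["url"].format(**defaults)

print(resolve(server))  # https://localhost:8181
```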


@ -0,0 +1,13 @@
- name: Using the InfluxDB HTTP API
tags:
- Quick start
- Authentication
- Headers
- Pagination
- Response codes
- System information endpoints
- name: All endpoints
tags:
- Ping
- Query
- Write


@ -3,34 +3,44 @@ info:
title: InfluxDB 3 Core HTTP API
description: HTTP API service for managing, writing to, and querying from InfluxDB 3 Core.
version: 1.0.2
license:
name: MIT
url: 'https://opensource.org/licenses/MIT'
contact:
name: InfluxData
url: https://www.influxdata.com
email: support@influxdata.com
servers:
- url: http://localhost:8080
description: Local development server
tags:
- name: Databases
description: Create, read, update, and delete database and cache resources
- name: Tables
description: Manage table schemas and data
- name: Data Operations
description: Write, query, and process data
- name: Legacy APIs
description: Backwards compatibility APIs for v1.x and v2.x clients
paths:
/write:
post:
operationId: PostLegacyV1Write
summary: Write line protocol (v1-compatible)
description: >
Writes line protocol to the specified database.
This is a legacy endpoint compatible with InfluxDB 2.x client libraries, the Telegraf `outputs.influxdb` output plugin, and third-party tools.
Use this endpoint to send data in [line protocol](https://docs.influxdata.com/influxdb3/core/reference/syntax/line-protocol/) format to InfluxDB.
Use query parameters to specify options for writing data.
parameters:
- name: accept_partial
in: query
required: false
schema:
type: boolean
default: true
description: Whether to accept partial writes.
- $ref: '#/components/parameters/dbWriteParam'
- name: precision
in: query
required: true
schema:
$ref: '#/components/schemas/LegacyWritePrecision'
description: Precision of timestamps.
- name: no_sync
in: query
@ -46,12 +56,52 @@ paths:
text/plain:
schema:
type: string
format: byte
examples:
plain-utf8-multiline:
value: |
measurement,tag=id001 field=1.0 1234567890
measurement,tag=id001 field=1.1 1234567900
responses:
"204":
description: Success ("No Content"). All data in the batch is written and queryable.
"400":
description: |
Bad request. Some (a _partial write_) or all of the data from the batch was rejected and not written.
If a partial write occurred, then some points from the batch are written and queryable.
The response body:
- indicates if a partial write occurred or all data was rejected.
- contains details about the [rejected points](/influxdb3/core/write-data/troubleshoot/#troubleshoot-rejected-points), up to 100 points.
content:
application/json:
examples:
rejectedAllPoints:
summary: Rejected all points in the batch
value: |
{
"error": "write of line protocol failed",
"data": [
{
"original_line": "home,room=Kitchen temp=hi",
"line_number": 2,
"error_message": "No fields were provided"
}
]
}
partialWriteErrorWithRejectedPoints:
summary: Partial write rejected some points in the batch
value: |
{
"error": "partial write of line protocol occurred",
"data": [
{
"original_line": "home,room=Kitchen temp=hi",
"line_number": 2,
"error_message": "No fields were provided"
}
]
}
"401":
$ref: '#/components/responses/Unauthorized'
"403":
@ -60,19 +110,77 @@ paths:
description: Request entity too large.
security:
- BearerAuth: []
tags:
- Legacy v1-compatible
- Write data
/api/v2/write:
post:
operationId: PostLegacyV2Write
summary: Write line protocol (v2-compatible)
description: >
Writes line protocol to the specified database.
This is a legacy endpoint compatible with InfluxDB 2.x client libraries, the Telegraf `outputs.influxdb_v2` output plugin, and third-party tools.
Use this endpoint to send data in [line protocol](/influxdb3/core/reference/syntax/line-protocol/) format to InfluxDB.
Use query parameters to specify options for writing data.
parameters:
- description: |
The compression applied to the line protocol in the request payload.
To send a gzip payload, pass `Content-Encoding: gzip` header.
in: header
name: Content-Encoding
schema:
default: identity
description: |
Content coding.
Use `gzip` for compressed data or `identity` for unmodified, uncompressed data.
enum:
- gzip
- identity
type: string
- description: |
The format of the data in the request body.
To send a line protocol payload, pass `Content-Type: text/plain; charset=utf-8`.
in: header
name: Content-Type
schema:
default: text/plain; charset=utf-8
description: |
`text/plain` is the content type for line protocol. `UTF-8` is the default character set.
enum:
- text/plain
- text/plain; charset=utf-8
type: string
- description: |
The size of the entity-body, in bytes, sent to InfluxDB.
in: header
name: Content-Length
schema:
description: The size of the request body, as a decimal number of octets.
type: integer
- description: |
The content type that the client can understand.
Writes only return a response body if they fail (partially or completely)--for example,
due to a syntax problem or type mismatch.
in: header
name: Accept
schema:
default: application/json
description: Error content type.
enum:
- application/json
type: string
- name: db
in: query
required: true
schema:
type: string
description: |
A database name.
InfluxDB creates the database if it doesn't already exist, and then
writes all points in the batch to the database.
- name: accept_partial
in: query
required: false
@ -84,10 +192,8 @@ paths:
in: query
required: true
schema:
$ref: '#/components/schemas/LegacyWritePrecision'
description: The precision for unix timestamps in the line protocol batch.
- name: no_sync
in: query
required: false
@ -116,19 +222,20 @@ paths:
description: Request entity too large.
security:
- BearerAuth: []
tags:
- Legacy v2-compatible
- Write data
/api/v3/write_lp:
post:
operationId: PostWriteLP
summary: Write line protocol
description: >
Writes line protocol to the specified database.
Use this endpoint to send data in [line protocol](/influxdb3/core/reference/syntax/line-protocol/) format to InfluxDB.
Use query parameters to specify options for writing data.
parameters:
- $ref: '#/components/parameters/dbWriteParam'
- name: accept_partial
in: query
required: false
@ -140,9 +247,7 @@ paths:
in: query
required: true
schema:
$ref: '#/components/schemas/WritePrecision'
description: Precision of timestamps.
- name: no_sync
in: query
@ -174,16 +279,15 @@ paths:
description: Unprocessable entity.
security:
- BearerAuth: []
tags:
- Write data
/api/v3/query_sql:
get:
operationId: GetExecuteQuerySQL
summary: Execute SQL query
description: Executes an SQL query to retrieve data from the specified database.
parameters:
- $ref: '#/components/parameters/db'
- name: q
in: query
required: true
@ -197,7 +301,7 @@ paths:
- $ref: '#/components/parameters/Accept'
responses:
"200":
description: Success. The response body contains query results.
content:
application/json:
schema:
@ -231,9 +335,12 @@ paths:
description: Unprocessable entity.
security:
- BearerAuth: []
tags:
- Query data
post:
operationId: PostExecuteQuerySQL
summary: Execute SQL query
description: Executes an SQL query to retrieve data from the specified database.
parameters:
- $ref: '#/components/parameters/ContentType'
requestBody:
@ -272,16 +379,15 @@ paths:
description: Unprocessable entity.
security:
- BearerAuth: []
tags:
- Query data
/api/v3/query_influxql:
get:
operationId: GetExecuteInfluxQLQuery
summary: Execute InfluxQL query
description: Executes an InfluxQL query to retrieve data from the specified database.
parameters:
- $ref: '#/components/parameters/db'
- name: q
in: query
required: true
@ -323,9 +429,12 @@ paths:
description: Unprocessable entity.
security:
- BearerAuth: []
tags:
- InfluxQL query
post:
operationId: PostExecuteQueryInfluxQL
summary: Execute InfluxQL query
description: Executes an InfluxQL query to retrieve data from the specified database.
parameters:
- $ref: '#/components/parameters/ContentType'
requestBody:
@ -364,16 +473,20 @@ paths:
description: Unprocessable entity.
security:
- BearerAuth: []
tags:
- InfluxQL query
/query:
get:
operationId: GetLegacyV1ExecuteQuery
summary: Execute InfluxQL query (v1-compatible)
description: |
Executes an InfluxQL query to retrieve data from the specified database.
This endpoint is compatible with InfluxDB 1.x client libraries and third-party integrations such as Grafana.
Use query parameters to specify the database and the InfluxQL query.
parameters:
- $ref: '#/components/parameters/db'
- name: q
in: query
required: true
@ -415,9 +528,13 @@ paths:
description: Unprocessable entity.
security:
- BearerAuth: []
tags:
- Query data
- Legacy v1-compatible
post:
operationId: PostExecuteLegacyV1Query
summary: Execute InfluxQL query (v1-compatible)
description: Executes an InfluxQL query to retrieve data from the specified database.
parameters:
- $ref: '#/components/parameters/ContentType'
requestBody:
@ -428,7 +545,7 @@ paths:
$ref: '#/components/schemas/QueryRequest'
responses:
"200":
description: Success. The response body contains query results.
content:
application/json:
schema:
@ -456,8 +573,12 @@ paths:
description: Unprocessable entity.
security:
- BearerAuth: []
tags:
- Query data
- Legacy v1-compatible
/health:
get:
operationId: GetHealth
summary: Health Check
description: Returns the status of the service.
responses:
@ -465,8 +586,11 @@ paths:
description: Service is running.
"500":
description: Service is unavailable.
tags:
- Server
/api/v1/health:
get:
operationId: GetHealthV1
summary: Health Check (v1)
description: Returns the status of the service.
responses:
@ -474,8 +598,14 @@ paths:
description: Service is running.
"500":
description: Service is unavailable.
tags:
- Server
- Legacy v1-compatible
/ping:
get:
operationId: GetPing
tags:
- Server
summary: Ping the Server
description: Returns basic server information.
responses:
@ -483,13 +613,17 @@ paths:
description: Server is reachable.
/metrics:
get:
operationId: GetMetrics
summary: Metrics
description: Retrieves Prometheus-compatible metrics.
responses:
"200":
description: Metrics returned.
tags:
- Server
/api/v3/configure/database:
get:
operationId: GetConfigureDatabase
summary: List Databases
description: Retrieves a list of databases.
responses:
@ -507,7 +641,11 @@ paths:
description: Database not found.
security:
- BearerAuth: []
tags:
- List
- Database
post:
operationId: PostConfigureDatabase
summary: Create a Database
description: Creates a new database in the system.
requestBody:
@ -527,15 +665,17 @@ paths:
description: Database already exists.
security:
- BearerAuth: []
tags:
- Create
- Database
delete:
operationId: DeleteConfigureDatabase
summary: Delete a Database
description: |
Soft deletes a database.
The database is scheduled for deletion and unavailable for querying.
parameters:
- $ref: '#/components/parameters/db'
responses:
"200":
description: Database deleted.
@ -545,8 +685,12 @@ paths:
description: Database not found.
security:
- BearerAuth: []
tags:
- Delete
- Database
/api/v3/configure/table:
post:
operationId: PostConfigureTable
summary: Create a Table
description: Creates a new table within a database.
requestBody:
@ -566,15 +710,17 @@ paths:
description: Database not found.
security:
- BearerAuth: []
tags:
- Create
- Table
delete:
operationId: DeleteConfigureTable
summary: Delete a Table
description: |
Soft deletes a table.
The table is scheduled for deletion and unavailable for querying.
parameters:
- $ref: '#/components/parameters/db'
- name: table
in: query
required: true
@ -589,8 +735,12 @@ paths:
description: Table not found.
security:
- BearerAuth: []
tags:
- Delete
- Table
/api/v3/configure/distinct_cache:
post:
operationId: PostConfigureDistinctCache
summary: Create Distinct Cache
description: Creates a distinct cache for a table.
requestBody:
@ -612,15 +762,15 @@ paths:
description: Cache already exists.
security:
- BearerAuth: []
tags:
- Create
- Cache
delete:
operationId: DeleteConfigureDistinctCache
summary: Delete Distinct Cache
description: Deletes a distinct cache.
parameters:
- $ref: '#/components/parameters/db'
- name: table
in: query
required: true
@ -642,8 +792,12 @@ paths:
description: Cache not found.
security:
- BearerAuth: []
tags:
- Delete
- Cache
/api/v3/configure/last_cache:
post:
operationId: PostConfigureLastCache
summary: Create Last Cache
description: Creates a last cache for a table.
requestBody:
@ -665,15 +819,15 @@ paths:
description: Cache already exists.
security:
- BearerAuth: []
tags:
- Create
- Cache
delete:
operationId: DeleteConfigureLastCache
summary: Delete Last Cache
description: Deletes a last cache.
parameters:
- $ref: '#/components/parameters/db'
- name: table
in: query
required: true
@ -695,8 +849,12 @@ paths:
description: Cache not found.
security:
- BearerAuth: []
tags:
- Delete
- Cache
/api/v3/configure/processing_engine_trigger:
post:
operationId: PostConfigureProcessingEngineTrigger
summary: Create Processing Engine Trigger
description: Creates a new processing engine trigger.
requestBody:
@ -716,15 +874,15 @@ paths:
description: Trigger not found.
security:
- BearerAuth: []
tags:
- Create
- Processing engine
delete:
operationId: DeleteConfigureProcessingEngineTrigger
summary: Delete Processing Engine Trigger
description: Deletes a processing engine trigger.
parameters:
- $ref: '#/components/parameters/db'
- name: trigger_name
in: query
required: true
@ -747,8 +905,12 @@ paths:
description: Trigger not found.
security:
- BearerAuth: []
tags:
- Delete
- Processing engine
/api/v3/configure/processing_engine_trigger/disable:
post:
operationId: PostDisableProcessingEngineTrigger
summary: Disable Processing Engine Trigger
description: Disables a processing engine trigger.
parameters:
@ -770,8 +932,11 @@ paths:
description: Trigger not found.
security:
- BearerAuth: []
tags:
- Processing engine
/api/v3/configure/processing_engine_trigger/enable:
post:
operationId: PostEnableProcessingEngineTrigger
summary: Enable Processing Engine Trigger
description: Enables a processing engine trigger.
parameters:
@ -793,8 +958,11 @@ paths:
description: Trigger not found.
security:
- BearerAuth: []
tags:
- Processing engine
/api/v3/configure/plugin_environment/install_packages:
post:
operationId: PostInstallPluginPackages
summary: Install Plugin Packages
description: Installs packages for the plugin environment.
parameters:
@ -815,8 +983,11 @@ paths:
$ref: '#/components/responses/Unauthorized'
security:
- BearerAuth: []
tags:
- Processing engine
/api/v3/configure/plugin_environment/install_requirements:
post:
operationId: PostInstallPluginRequirements
summary: Install Plugin Requirements
description: Installs requirements for the plugin environment.
parameters:
@ -837,10 +1008,13 @@ paths:
$ref: '#/components/responses/Unauthorized'
security:
- BearerAuth: []
tags:
- Processing engine
/api/v3/plugin_test/wal:
post:
operationId: PostTestWALPlugin
summary: Test WAL Plugin
description: Executes a test of a write-ahead logging (WAL) plugin.
responses:
"200":
description: Plugin test executed.
@ -852,10 +1026,13 @@ paths:
description: Plugin not enabled.
security:
- BearerAuth: []
tags:
- Processing engine
/api/v3/plugin_test/schedule:
post:
operationId: PostTestSchedulingPlugin
summary: Test Scheduling Plugin
description: Executes a test of a scheduling plugin.
responses:
"200":
description: Plugin test executed.
@ -867,6 +1044,8 @@ paths:
description: Plugin not enabled.
security:
- BearerAuth: []
tags:
- Processing engine
/api/v3/engine/{plugin_path}:
parameters:
- name: plugin_path
@ -875,6 +1054,7 @@ paths:
schema:
type: string
get:
operationId: GetProcessingEnginePluginRequest
summary: Custom Processing Engine Request (GET)
description: Sends a custom request to a processing engine plugin.
parameters:
@ -892,7 +1072,10 @@ paths:
description: Processing failure.
security:
- BearerAuth: []
tags:
- Processing engine
post:
operationId: PostProcessingEnginePluginRequest
summary: Custom Processing Engine Request (POST)
description: Sends a custom request to a processing engine plugin.
parameters:
@ -917,56 +1100,178 @@ paths:
description: Processing failure.
security:
- BearerAuth: []
tags:
- Processing engine
components:
parameters:
ContentEncoding:
name: Content-Encoding
in: header
schema:
$ref: '#/components/schemas/ContentEncoding'
required: false
description: Optional encoding of the request body.
Accept:
name: Accept
in: header
schema:
$ref: '#/components/schemas/Accept'
required: false
description: The content type that the client can understand.
ContentLength:
name: Content-Length
in: header
description: |
The size of the entity-body, in bytes, sent to InfluxDB.
schema:
$ref: '#/components/schemas/ContentLength'
ContentType:
name: Content-Type
in: header
description: |
The format of the data in the request body.
To send a line protocol payload, pass `Content-Type: text/plain; charset=utf-8`.
schema:
$ref: '#/components/schemas/ContentType'
required: false
dbWriteParam:
name: db
in: query
required: true
schema:
type: string
description: |
A database name.
InfluxDB creates the database if it doesn't already exist, and then
writes all points in the batch to the database.
accept_partial:
name: accept_partial
in: query
required: false
schema:
type: boolean
default: false
description: Whether to accept partial writes.
precision:
name: precision
in: query
required: true
schema:
$ref: '#/components/schemas/WritePrecision'
description: The precision for unix timestamps in the line protocol batch.
precisionLegacyParam:
name: precision
in: query
required: true
schema:
$ref: '#/components/schemas/LegacyWritePrecision'
description: The precision for unix timestamps in the line protocol batch.
no_sync:
name: no_sync
in: query
required: false
schema:
type: boolean
default: false
description: Do not sync writes.
db:
name: db
in: query
required: true
schema:
type: string
description: |
The name of the database.
q:
name: q
in: query
required: true
schema:
type: string
description: |
The query to execute.
requestBodies:
lineProtocolRequestBody:
required: true
content:
text/plain:
schema:
type: string
examples:
line:
summary: Example line protocol
value: "measurement,tag=value field=1 1234567890"
multiline:
summary: Example line protocol with UTF-8 characters
value: |
measurement,tag=value field=1 1234567890
measurement,tag=value field=2 1234567900
measurement,tag=value field=3 1234568000
schemas:
WriteParams:
type: object
properties:
db:
type: string
accept_partial:
type: boolean
default: true
precision:
type: string
enum: [ns, us, ms, s]
default: ns
no_sync:
type: boolean
default: false
required:
- db
- precision
example:
db: "mydb"
accept_partial: true
precision: "ns"
no_sync: false
ContentEncoding:
type: string
enum: [gzip, identity]
description: |
Content coding.
Use `gzip` for compressed data or `identity` for unmodified, uncompressed data.
default: identity
ContentType:
type: string
enum: [text/plain, text/plain; charset=utf-8]
description: |
`text/plain` is the content type for line protocol. `UTF-8` is the default character set.
default: text/plain; charset=utf-8
ContentLength:
type: integer
description: The size of the request body, as a decimal number of octets.
Accept:
type: string
enum: [application/json]
description: The content type of error responses.
default: application/json
db:
type: string
description: |
A database name.
InfluxDB creates the database if it doesn't already exist, and then
writes all points in the batch to the database.
accept_partial:
type: boolean
default: false
description: Whether to accept partial writes.
precision:
$ref: '#/components/schemas/LegacyWritePrecision'
no_sync:
type: boolean
default: false
description: Do not sync writes.
LegacyWritePrecision:
enum:
- ms
- s
- us
- ns
type: string
description: |
The precision for unix timestamps in the line protocol batch.
Use `ms` for milliseconds, `s` for seconds, `us` for microseconds, or `ns` for nanoseconds.
WritePrecision:
enum:
- auto
- millisecond
- second
- microsecond
- nanosecond
type: string
QueryRequest:
type: object
properties:
@ -1136,13 +1441,38 @@ components:
data:
type: object
nullable: true
LineProtocolError:
type: object
properties:
code:
description: Code is the machine-readable error code.
enum:
- internal error
- not found
- conflict
- invalid
- empty value
- unavailable
readOnly: true
type: string
err:
description: Stack of errors that occurred during processing of the request. Useful for debugging.
readOnly: true
type: string
line:
description: First line in the request body that contains malformed data.
format: int32
readOnly: true
type: integer
message:
description: Human-readable message.
readOnly: true
type: string
op:
description: Describes the logical code operation when the error occurred. Useful for debugging.
readOnly: true
type: string
required:
- code
responses:
Unauthorized:
description: Unauthorized access.
@ -1165,4 +1495,13 @@ components:
securitySchemes:
BearerAuth:
type: http
scheme: bearer
bearerFormat: JWT
description: |
A Bearer token for authentication.
Provide the scheme and the API token in the `Authorization` header--for example:
```bash
curl http://localhost:8181/api/v3/query_influxql \
--header "Authorization: Bearer API_TOKEN"
```

View File

@ -0,0 +1,19 @@
title: InfluxDB 3 Enterprise API Service
x-influxdata-short-title: InfluxDB 3 API
summary: The InfluxDB HTTP API for InfluxDB 3 Enterprise provides a programmatic interface for writing and querying data stored in an InfluxDB 3 Enterprise database.
description: |
Write and query data, and perform administrative tasks, such as managing databases and processing engine plugins.
The InfluxDB HTTP API for InfluxDB 3 Enterprise includes endpoints for compatibility with InfluxDB 2.x and InfluxDB 1.x APIs.
<!-- TODO: verify where to host the spec that users can download.
This documentation is generated from the
[InfluxDB OpenAPI specification](https://raw.githubusercontent.com/influxdata/openapi/master/contracts/ref/cloud.yml).
-->
license:
name: MIT
url: 'https://opensource.org/licenses/MIT'
contact:
name: InfluxData
url: https://www.influxdata.com
email: support@influxdata.com

View File

@ -0,0 +1,8 @@
- url: https://{baseurl}
description: InfluxDB 3 Enterprise API URL
variables:
baseurl:
enum:
- 'localhost:8181'
default: 'localhost:8181'
description: InfluxDB 3 Enterprise URL

View File

@ -0,0 +1,13 @@
- name: Using the InfluxDB HTTP API
tags:
- Quick start
- Authentication
- Headers
- Pagination
- Response codes
- System information endpoints
- name: All endpoints
tags:
- Ping
- Query
- Write

File diff suppressed because it is too large

View File

@ -16,6 +16,10 @@ function SetInfo(data) {
}
if(data.hasOwnProperty('summary')) {
info.summary = data.summary;
} else {
// Remove summary if not provided.
// info.summary isn't a valid OpenAPI 3.0 property, but it's used by Redocly.
info['summary'] = undefined;
}
if(data.hasOwnProperty('description')) {
info.description = data.description;
@ -23,6 +27,9 @@ function SetInfo(data) {
if(data.hasOwnProperty('license')) {
info.license = data.license;
}
if(data.hasOwnProperty('contact')) {
info.contact = data.contact;
}
}
}
}

View File

@ -6,5 +6,8 @@
"license": "MIT",
"dependencies": {
"js-yaml": "^4.1.0"
},
"devDependencies": {
"spectral": "^0.0.0"
}
}

View File

@ -2,14 +2,90 @@
# yarn lockfile v1
ansi-regex@^2.0.0:
version "2.1.1"
resolved "https://registry.yarnpkg.com/ansi-regex/-/ansi-regex-2.1.1.tgz#c3b33ab5ee360d86e0e628f0468ae7ef27d654df"
integrity sha512-TIGnTpdo+E3+pCyAluZvtED5p5wCqLdezCyhPZzKPcxvFplEt4i+W7OONCKgeZFT3+y5NZZfOOS/Bdcanm1MYA==
ansi-styles@^2.2.1:
version "2.2.1"
resolved "https://registry.yarnpkg.com/ansi-styles/-/ansi-styles-2.2.1.tgz#b432dd3358b634cf75e1e4664368240533c1ddbe"
integrity sha512-kmCevFghRiWM7HB5zTPULl4r9bVFSWjz62MhqizDGUrq2NWuNMQyuv4tHHoKJHs69M/MF64lEcHdYIocrdWQYA==
argparse@^2.0.1:
version "2.0.1"
resolved "https://registry.npmjs.org/argparse/-/argparse-2.0.1.tgz"
integrity sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q==
chalk@^1.0.0:
version "1.1.3"
resolved "https://registry.yarnpkg.com/chalk/-/chalk-1.1.3.tgz#a8115c55e4a702fe4d150abd3872822a7e09fc98"
integrity sha512-U3lRVLMSlsCfjqYPbLyVv11M9CPW4I728d6TCKMAOJueEeB9/8o+eSsMnxPJD+Q+K909sdESg7C+tIkoH6on1A==
dependencies:
ansi-styles "^2.2.1"
escape-string-regexp "^1.0.2"
has-ansi "^2.0.0"
strip-ansi "^3.0.0"
supports-color "^2.0.0"
commander@^2.8.1:
version "2.20.3"
resolved "https://registry.yarnpkg.com/commander/-/commander-2.20.3.tgz#fd485e84c03eb4881c20722ba48035e8531aeb33"
integrity sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ==
escape-string-regexp@^1.0.2:
version "1.0.5"
resolved "https://registry.yarnpkg.com/escape-string-regexp/-/escape-string-regexp-1.0.5.tgz#1b61c0562190a8dff6ae3bb2cf0200ca130b86d4"
integrity sha512-vbRorB5FUQWvla16U8R/qgaFIya2qGzwDrNmCZuYKrbdSUMG6I1ZCGQRefkRVhuOkIGVne7BQ35DSfo1qvJqFg==
extend@^2.0.1:
version "2.0.2"
resolved "https://registry.yarnpkg.com/extend/-/extend-2.0.2.tgz#1b74985400171b85554894459c978de6ef453ab7"
integrity sha512-AgFD4VU+lVLP6vjnlNfF7OeInLTyeyckCNPEsuxz1vi786UuK/nk6ynPuhn/h+Ju9++TQyr5EpLRI14fc1QtTQ==
has-ansi@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/has-ansi/-/has-ansi-2.0.0.tgz#34f5049ce1ecdf2b0649af3ef24e45ed35416d91"
integrity sha512-C8vBJ8DwUCx19vhm7urhTuUsr4/IyP6l4VzNQDv+ryHQObW3TTTp9yB68WpYgRe2bbaGuZ/se74IqFeVnMnLZg==
dependencies:
ansi-regex "^2.0.0"
js-yaml@^4.1.0:
version "4.1.0"
resolved "https://registry.npmjs.org/js-yaml/-/js-yaml-4.1.0.tgz"
integrity sha512-wpxZs9NoxZaJESJGIZTyDEaYpl0FKSA+FB9aJiyemKhMwkxQg63h4T1KJgUGHpTqPDNRcmmYLugrRjJlBtWvRA==
dependencies:
argparse "^2.0.1"
moment@^2.10.3:
version "2.30.1"
resolved "https://registry.yarnpkg.com/moment/-/moment-2.30.1.tgz#f8c91c07b7a786e30c59926df530b4eac96974ae"
integrity sha512-uEmtNhbDOrWPFS+hdjFCBfy9f2YoyzRpwcl+DqpC6taX21FzsTLQVbMV/W7PzNSX6x/bhC1zA3c2UQ5NzH6how==
spectral@^0.0.0:
version "0.0.0"
resolved "https://registry.yarnpkg.com/spectral/-/spectral-0.0.0.tgz#a244b28c0726a7907374ad39c58024f934b9e8a1"
integrity sha512-tJamrVCLdpHt3geQn9ypWLlcS7K02+TZV5hj1bnPjGcjQs5N0dtxzJVitcmHbR9tZQgjwj2hAO1f8v1fzzwF1Q==
dependencies:
chalk "^1.0.0"
commander "^2.8.1"
extend "^2.0.1"
moment "^2.10.3"
string-etc "^0.2.0"
string-etc@^0.2.0:
version "0.2.0"
resolved "https://registry.yarnpkg.com/string-etc/-/string-etc-0.2.0.tgz#a0f84a2d8816082266384a3c7229acbb8064eda5"
integrity sha512-J9RfI2DvBDlnISBhfOBOAXPFxE4cpEgNC6zJTjULmagQaMuu2sYrE44H8h5Paxf3Bm9Wcer92DJv9n77OAHIRg==
strip-ansi@^3.0.0:
version "3.0.1"
resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-3.0.1.tgz#6a385fb8853d952d5ff05d0e8aaf94278dc63dcf"
integrity sha512-VhumSSbBqDTP8p2ZLKj40UjBCV4+v8bUSEpUb4KjRgWk9pbqGF4REFj6KEagidb2f/M6AzC0EmFyDNGaw9OCzg==
dependencies:
ansi-regex "^2.0.0"
supports-color@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/supports-color/-/supports-color-2.0.0.tgz#535d045ce6b6363fa40117084629995e9df324c7"
integrity sha512-KKNVtd6pCYgPIKU4cp2733HWYCpplQhddZLBUryaAHou723x+FRzQ5Df824Fj+IyyuiQTRoub4SnIFfIcrp70g==

View File

@ -245,6 +245,68 @@ services:
source: test-content
target: /app/content
working_dir: /app
influxdb3-core-pytest:
container_name: influxdb3-core-pytest
image: influxdata/docs-pytest
build:
context: .
dockerfile: Dockerfile.pytest
entrypoint:
- /bin/bash
- /src/test/scripts/run-tests.sh
- pytest
command:
# In the command, pass file paths to test.
# The container preprocesses the files for testing and runs the tests.
- content/influxdb3/core/**/*.md
- content/shared/**/*.md
environment:
- CONTENT_PATH=content/influxdb3/core
profiles:
- test
- influxdb3
stdin_open: true
tty: true
volumes:
# Site configuration files.
- type: bind
source: .
target: /src
read_only: true
# Files shared between host and container and writeable by both.
- type: bind
source: ./test/shared
target: /shared
- type: bind
source: ./content/influxdb3/core/.env.test
target: /app/.env.test
read_only: true
# In your code samples, use `/app/data/<FILE.lp>` or `data/<FILE.lp>` to access sample data files from the `static/downloads` directory.
- type: bind
source: ./static/downloads
target: /app/data
read_only: true
# In your code samples, use `/app/iot-starter` to store example modules or project files.
- type: volume
source: influxdb3-core-tmp
target: /app/iot-starter
# Target directory for the content under test.
# Files are copied from /src/content/<productpath> to /app/content/<productpath> before running tests.
- type: volume
source: test-content
target: /app/content
working_dir: /app
influxdb3-core:
container_name: influxdb3-core
image: quay.io/influxdb/influxdb3-core:latest
ports:
- 8181:8181
command:
- serve
- --node-id=sensors_node0
- --log-filter=debug
- --object-store=file
- --data-dir=/var/lib/influxdb3
telegraf-pytest:
container_name: telegraf-pytest
image: influxdata/docs-pytest

View File

@ -0,0 +1,22 @@
# Custom Vale configuration for InfluxDB 3.x Core documentation.
# Custom styles are defined in .ci/vale/styles/InfluxDB3-Core:
# // SOURCE .ci/vale/styles/InfluxDB3-Core/Branding.yml
# // SOURCE .ci/vale/styles/InfluxDB3-Core/v3Schema.yml
StylesPath = "../../../.ci/vale/styles"
Vocab = InfluxDataDocs
MinAlertLevel = warning
Packages = Google, write-good, Hugo
[*.md]
BasedOnStyles = Vale, InfluxDataDocs, InfluxDB3-Core, Google, write-good
Google.Acronyms = NO
Google.DateFormat = NO
Google.Ellipses = NO
Google.Headings = NO
Google.WordList = NO
Vale.Spelling = NO

File diff suppressed because one or more lines are too long

View File

@ -42,7 +42,7 @@ Use the InfluxDB 3 quick install script to install {{< product-name >}} on
1. Use the following command to download and install the appropriate
{{< product-name >}} package on your local machine:
<!--pytest.mark.skip-->
```bash
curl -O https://www.influxdata.com/d/install_influxdb3.sh \
&& sh install_influxdb3.sh
@ -72,6 +72,7 @@ source ~/.bashrc
```
{{% /code-tab-content %}}
{{% code-tab-content %}}
<!--pytest.mark.skip-->
```bash
source ~/.zshrc
```
@ -134,6 +135,9 @@ Use the `influxdb3-core` Docker image to deploy {{< product-name >}} in a
Docker container.
The image is available for x86_64 (AMD64) and ARM64 architectures.
### Using Docker CLI
<!--pytest.mark.skip-->
```bash
docker pull quay.io/influxdb/influxdb3-core:latest
```
@ -156,4 +160,47 @@

```bash
docker pull \
  quay.io/influxdb/influxdb3-core:latest
```
### Using Docker Compose
1. Open `compose.yaml` for editing and add a `services` entry for {{% product-name %}}--for example:
```yaml
# compose.yaml
services:
influxdb3-core:
container_name: influxdb3-core
image: quay.io/influxdb/influxdb3-{{% product-key %}}:latest
ports:
- 8181:8181
command:
- serve
- --node-id=node0
- --log-filter=debug
- --object-store=file
- --data-dir=/var/lib/influxdb3
```
2. Use the Docker Compose CLI to start the server.
Optional: to make sure you have the latest version of the image before you
start the server, run `docker compose pull`.
<!--pytest.mark.skip-->
```bash
docker compose pull && docker compose run influxdb3-core
```
> [!Note]
> #### Stopping an InfluxDB 3 container
>
> To stop a running InfluxDB 3 container, find and terminate the process--for example:
>
> <!--pytest.mark.skip-->
> ```bash
> ps -ef | grep influxdb3
> kill -9 <PROCESS_ID>
> ```
>
> Currently, a bug prevents using `Ctrl-c` in the terminal to stop an InfluxDB 3 container.
{{< page-nav next="/influxdb3/core/get-started/" nextText="Get started with InfluxDB 3 Core" >}}

View File

@ -68,6 +68,8 @@ This guide covers InfluxDB 3 Core (the open source release), including the follo
<!--------------- BEGIN LINUX AND MACOS -------------->
To get started quickly, download and run the install script--for example, using [curl](https://curl.se/download.html):
<!--pytest.mark.skip-->
```bash
curl -O https://www.influxdata.com/d/install_influxdb3.sh \
&& sh install_influxdb3.sh
@ -106,6 +108,8 @@ is available for x86_64 (AMD64) and ARM64 architectures.
Pull the image:
<!--pytest.mark.skip-->
```bash
docker pull quay.io/influxdb/influxdb3-core:latest
```
@ -126,6 +130,8 @@ influxdb3 --version
If your system doesn't locate `influxdb3`, then `source` the configuration file (for example, `.bashrc` or `.zshrc`) for your shell--for example:
<!--pytest.mark.skip-->
```zsh
source ~/.zshrc
```
@ -145,13 +151,13 @@ and provide the following:
The following examples show how to start InfluxDB 3 with different object store configurations:
```bash
# Memory object store
# Stores data in RAM; doesn't persist data
influxdb3 serve --node-id=local01 --object-store=memory
```
```bash
# Filesystem object store
# Provide the filesystem directory
influxdb3 serve \
--node-id=local01 \
@ -164,10 +170,13 @@ To run the [Docker image](/influxdb3/core/install/#docker-image) and persist dat
- `-v /path/on/host:/path/in/container`: Mounts a directory from your filesystem to the container
- `--object-store file --data-dir /path/in/container`: Uses the mount for server storage
<!--pytest.mark.skip-->
```bash
# Filesystem object store with Docker
# Create a mount
# Provide the mount path
docker run -it \
-v /path/on/host:/path/in/container \
quay.io/influxdb/influxdb3-core:latest serve \
@ -177,15 +186,29 @@ docker run -it \
```
```bash
# S3 object store (default is the us-east-1 region)
# Specify the object store type and associated options
influxdb3 serve \
  --node-id=local01 \
  --object-store=s3 \
  --bucket=BUCKET \
  --aws-access-key=AWS_ACCESS_KEY \
  --aws-secret-access-key=AWS_SECRET_ACCESS_KEY
```
```bash
# Minio or other open source object store
# (using the AWS S3 API with additional parameters)
# Specify the object store type and associated options
influxdb3 serve --node-id=local01 --object-store=s3 --bucket=BUCKET \
  --aws-access-key=AWS_ACCESS_KEY \
  --aws-secret-access-key=AWS_SECRET_ACCESS_KEY \
  --aws-endpoint=ENDPOINT \
  --aws-allow-http
```
_For more information about server options, run `influxdb3 serve --help`._
@ -197,15 +220,17 @@ _For more information about server options, run `influxdb3 serve --help`._
> Use the `docker kill` command to stop the container:
>
> 1. Enter the following command to find the container ID:
> <!--pytest.mark.skip-->
> ```bash
> docker ps -a
> ```
> 2. Enter the command to stop the container:
> <!--pytest.mark.skip-->
> ```bash
> docker kill <CONTAINER_ID>
> ```
### Data Model
### Data model
The database server contains logical databases, which have tables, which have columns. Compared to previous versions of InfluxDB, you can think of a database as a `bucket` in v2 or as a `db/retention_policy` in v1. A `table` is equivalent to a `measurement` and has columns that can be of type `tag` (a string dictionary), `int64`, `float64`, `uint64`, `bool`, or `string`. Finally, every table has a `time` column that is a nanosecond-precision timestamp.
@ -214,7 +239,7 @@ This is the sort order used for all Parquet files that get created. When you cre
Tags should hold unique identifying information like `sensor_id`, `building_id`, or `trace_id`. All other data should be kept in fields. You will be able to add fast last-N-value and distinct-value lookups later for any column, whether it is a field or a tag.
### Write Data
### Write data
InfluxDB is a schema-on-write database. You can start writing data and InfluxDB creates the logical database, tables, and their schemas on the fly.
After a schema is created, InfluxDB validates future write requests against it before accepting the data.
@ -222,23 +247,43 @@ Subsequent requests can add new fields on-the-fly, but can't add new tags.
{{% product-name %}} is optimized for recent data, but accepts writes from any time period. It persists that data in Parquet files for access by third-party systems for longer term historical analysis and queries. If you require longer historical queries with a compactor that optimizes data organization, consider using [InfluxDB 3 Enterprise](/influxdb3/enterprise/get-started/).
The database has three write API endpoints that respond to HTTP `POST` requests:

#### /api/v3/write_lp endpoint

{{% product-name %}} adds the `/api/v3/write_lp` endpoint.

{{<api-endpoint endpoint="/api/v3/write_lp?db=mydb&precision=nanosecond&accept_partial=true" method="post" >}}

This endpoint accepts the same line protocol syntax as previous versions,
and supports the `?accept_partial=<BOOLEAN>` parameter, which
lets you accept or reject partial writes (default is `true`).

#### /api/v2/write InfluxDB v2 compatibility endpoint

Provides backwards compatibility with clients that can write data to InfluxDB OSS v2.x and Cloud 2 (TSM).

{{<api-endpoint endpoint="/api/v2/write?bucket=mydb&precision=ns" method="post" >}}

#### /write InfluxDB v1 compatibility endpoint

Provides backwards compatibility for clients that can write data to InfluxDB v1.x.

{{<api-endpoint endpoint="/write?db=mydb&precision=ns" method="post" >}}

Keep in mind that these compatibility APIs differ from the v1 and v2 APIs in previous versions in the following ways:

- Tags in a table (measurement) are _immutable_.
- A tag and a field can't have the same name within a table.
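As a sketch of how the `/api/v3/write_lp` query parameters fit together (the host, database, and token are placeholders, and the `curl` command is printed rather than executed):

```shell
# Compose a write_lp request; echo the curl command instead of running it
host="http://localhost:8181"
url="${host}/api/v3/write_lp?db=mydb&precision=nanosecond&accept_partial=true"
echo curl "$url" \
  --header "Authorization: Bearer API_TOKEN" \
  --data-binary 'cpu,host=Alpha val=1i'
```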
#### Write line protocol
The following code block is an example of time series data in [line protocol](/influxdb3/core/reference/syntax/line-protocol/) syntax:
- `cpu`: the table name.
- `host`, `region`, `application`: the tags. A tag set is an ordered, comma-separated list of key/value pairs where the values are strings.
- `val`, `usage_percent`, `status`: the fields. A field set is a comma-separated list of key/value pairs.
- timestamp: If you don't specify a timestamp, InfluxDB uses the time when data is written.
The default precision is a nanosecond epoch.
To specify a different precision, pass the `precision` query parameter.
```
cpu,host=Alpha,region=us-west,application=webserver val=1i,usage_percent=20.5,status="OK"
cpu,host=Bravo,region=us-central,application=database val=5i,usage_percent=80.5,status="OK"
cpu,host=Alpha,region=us-west,application=webserver val=6i,usage_percent=25.3,status="Warn"
```
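A client can assemble a line like these from its parts--for example, this shell sketch (the tag and field names are illustrative, and `date +%s%N` assumes GNU `date`):

```shell
# Build one line of line protocol: table,tag_set field_set timestamp
table="cpu"
tags="host=Alpha,region=us-west,application=webserver"
fields='val=1i,usage_percent=20.5,status="OK"'
ts=$(date +%s%N)  # nanosecond epoch (GNU date)
line="${table},${tags} ${fields} ${ts}"
echo "$line"
```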
##### Example: write data using the influxdb3 CLI
If you save the preceding line protocol to a file (for example, `server_data`), then you can use the `influxdb3` CLI to write the data--for example:
```bash
influxdb3 write --database=mydb --file=server_data
```
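The sample file itself can be created from the shell--for example, writing two of the lines above to `server_data`:

```shell
# Save sample line protocol to a file named server_data
cat > server_data <<'EOF'
cpu,host=Alpha,region=us-west,application=webserver val=1i,usage_percent=20.5,status="OK"
cpu,host=Bravo,region=us-central,application=database val=5i,usage_percent=80.5,status="OK"
EOF
wc -l < server_data
```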
##### Example: write data using the /api/v3 HTTP API
The following examples show how to write data using `curl` and the `/api/v3/write_lp` HTTP endpoint.
To show the difference between accepting and rejecting partial writes, line `2` in the example contains a `string` value for a `float` field (`temp=hi`).
##### Partial write of line protocol occurred
###### Partial write of line protocol occurred
With `accept_partial=true`:
@ -284,7 +333,7 @@ With `accept_partial=true`:
Line `1` is written and queryable.
The response is an HTTP error (`400`) status, and the response body contains the error message `partial write of line protocol occurred` with details about the problem line.
##### Parsing failed for write_lp endpoint
###### Parsing failed for write_lp endpoint
With `accept_partial=false`:
@ -315,7 +364,7 @@ The response is the following:
InfluxDB rejects all points in the batch.
The response is an HTTP error (`400`) status, and the response body contains `parsing failed for write_lp endpoint` and details about the problem line.
##### Data durability
#### Data durability
When you write data to InfluxDB, InfluxDB ingests the data and writes it to WAL files, created once per second, and to an in-memory queryable buffer.
Later, InfluxDB snapshots the WAL and persists the data into object storage as Parquet files.
@ -327,7 +376,7 @@ For more information, see [diskless architecture](#diskless-architecture).
> Because InfluxDB sends a write response after the WAL file has been flushed to the configured object store (default is every second), individual write requests might not complete quickly, but you can make many concurrent requests to achieve higher total throughput.
> Future enhancements will include an API parameter that lets requests return without waiting for the WAL flush.
#### Create a database or table
### Create a database or table
To create a database without writing data, use the `create` subcommand--for example:
@ -341,7 +390,7 @@

To learn more about a subcommand, use the `-h, --help` flag:

```bash
influxdb3 create -h
```
### Query the database
### Query a database
InfluxDB 3 now supports native SQL for querying, in addition to InfluxQL, an
SQL-like language customized for time series queries.

View File

@ -27,6 +27,8 @@
"jquery": "^3.7.1",
"js-cookie": "^3.0.5",
"js-yaml": "^4.1.0",
"lefthook": "^1.10.10",
"markdown-link": "^0.1.1",
"mermaid": "^11.4.1",
"vanillajs-datepicker": "^1.3.4"
},

View File

@ -2657,6 +2657,79 @@ lazy-ass@^1.6.0:
resolved "https://registry.yarnpkg.com/lazy-ass/-/lazy-ass-1.6.0.tgz#7999655e8646c17f089fdd187d150d3324d54513"
integrity sha512-cc8oEVoctTvsFZ/Oje/kGnHbpWHYBe8IAJe4C0QNc3t8uM/0Y8+erSz/7Y1ALuXTEZTMvxXwO6YbX1ey3ujiZw==
lazystream@^1.0.0:
version "1.0.1"
resolved "https://registry.npmjs.org/lazystream/-/lazystream-1.0.1.tgz"
integrity sha512-b94GiNHQNy6JNTrt5w6zNyffMrNkXZb3KTkCZJb2V1xaEGCk093vkZ2jk3tpaeP33/OiXC+WvK9AxUebnf5nbw==
dependencies:
readable-stream "^2.0.5"
lefthook-darwin-arm64@1.10.10:
version "1.10.10"
resolved "https://registry.yarnpkg.com/lefthook-darwin-arm64/-/lefthook-darwin-arm64-1.10.10.tgz#48a3eb7935cb171a36a037e61a68c0dc800efc5f"
integrity sha512-hEypKdwWpmNSl4Q8eJxgmlGb2ybJj1+W5/v13Mxc+ApEmjbpNiJzPcdjC9zyaMEpPK4EybiHy8g5ZC0dLOwkpA==
lefthook-darwin-x64@1.10.10:
version "1.10.10"
resolved "https://registry.yarnpkg.com/lefthook-darwin-x64/-/lefthook-darwin-x64-1.10.10.tgz#729f5ddd296f876da703496e5071a735e6cf3625"
integrity sha512-9xNbeE78i4Amz+uOheg9dcy7X/6X12h98SUMrYWk7fONvjW/Bp9h6nPGIGxI5krHp9iRB8rhmo33ljVDVtTlyg==
lefthook-freebsd-arm64@1.10.10:
version "1.10.10"
resolved "https://registry.yarnpkg.com/lefthook-freebsd-arm64/-/lefthook-freebsd-arm64-1.10.10.tgz#24abbba49d5d6007381883bb122089f4d33f0e48"
integrity sha512-GT9wYxPxkvO1rtIAmctayT9xQIVII5xUIG3Pv6gZo+r6yEyle0EFTLFDbmVje7p7rQNCsvJ8XzCNdnyDrva90g==
lefthook-freebsd-x64@1.10.10:
version "1.10.10"
resolved "https://registry.yarnpkg.com/lefthook-freebsd-x64/-/lefthook-freebsd-x64-1.10.10.tgz#e509ab6efe42741a0b8f57e87155f855c701366e"
integrity sha512-2BB/HRhEb9wGpk5K38iNkHtMPnn+TjXDtFG6C/AmUPLXLNhGnNiYp+v2uhUE8quWzxJx7QzfnU7Ga+/gzJcIcw==
lefthook-linux-arm64@1.10.10:
version "1.10.10"
resolved "https://registry.yarnpkg.com/lefthook-linux-arm64/-/lefthook-linux-arm64-1.10.10.tgz#1c5c9339f86fa427d311026af92e9f84a89191a1"
integrity sha512-GJ7GALKJ1NcMnNZG9uY+zJR3yS8q7/MgcHFWSJhBl+w4KTiiD/RAdSl5ALwEK2+UX36Eo+7iQA7AXzaRdAii4w==
lefthook-linux-x64@1.10.10:
version "1.10.10"
resolved "https://registry.yarnpkg.com/lefthook-linux-x64/-/lefthook-linux-x64-1.10.10.tgz#629f7f91618073ca8d712ff95e4f0e54d839af3c"
integrity sha512-dWUvPM9YTIJ3+X9dB+8iOnzoVHbnNmpscmUqEOKSeizgBrvuuIYKZJGDyjEtw65Qnmn1SJ7ouSaKK93p5c7SkQ==
lefthook-openbsd-arm64@1.10.10:
version "1.10.10"
resolved "https://registry.yarnpkg.com/lefthook-openbsd-arm64/-/lefthook-openbsd-arm64-1.10.10.tgz#29859c0357f00ba828f73ada56fe709bd108bb15"
integrity sha512-KnwDyxOvbvGSBTbEF/OxkynZRPLowd3mIXUKHtkg3ABcQ4UREalX+Sh0nWU2dNjQbINx7Eh6B42TxNC7h+qXEg==
lefthook-openbsd-x64@1.10.10:
version "1.10.10"
resolved "https://registry.yarnpkg.com/lefthook-openbsd-x64/-/lefthook-openbsd-x64-1.10.10.tgz#2ff9cd0ed72f9d5deabdcc84fba4d8e1b6e14c50"
integrity sha512-49nnG886CI3WkrzVJ71D1M2KWpUYN1BP9LMKNzN11cmZ0j6dUK4hj3nbW+NcrKXxgYzzyLU3FFwrc51OVy2eKA==
lefthook-windows-arm64@1.10.10:
version "1.10.10"
resolved "https://registry.yarnpkg.com/lefthook-windows-arm64/-/lefthook-windows-arm64-1.10.10.tgz#6ba521f289909cd1467b4f408f8ef8a1e87d278f"
integrity sha512-9ni0Tsnk+O5oL7EBfKj9C5ZctD1mrTyHCtiu1zQJBbREReJtPjIM9DwWzecfbuVfrIlpbviVQvx5mjZ44bqlWw==
lefthook-windows-x64@1.10.10:
version "1.10.10"
resolved "https://registry.yarnpkg.com/lefthook-windows-x64/-/lefthook-windows-x64-1.10.10.tgz#aac9caca3152df8f288713929c2ec31ee0a10b54"
integrity sha512-gkKWYrlay4iecFfY1Ris5VcRYa0BaNJKMk0qE/wZmIpMgu4GvNg+f9BEwTMflkQIanABduT9lrECaL1lX5ClKw==
lefthook@^1.10.10:
version "1.10.10"
resolved "https://registry.yarnpkg.com/lefthook/-/lefthook-1.10.10.tgz#29d0b221429f55d699785ddeeb6fa3c8f9951e6f"
integrity sha512-YW0fTONgOXsephvXq2gIFbegCW19MHCyKYX7JDWmzVF1ZiVMnDBYUL/SP3i0RtFvlCmqENl4SgKwYYQGUMnvig==
optionalDependencies:
lefthook-darwin-arm64 "1.10.10"
lefthook-darwin-x64 "1.10.10"
lefthook-freebsd-arm64 "1.10.10"
lefthook-freebsd-x64 "1.10.10"
lefthook-linux-arm64 "1.10.10"
lefthook-linux-x64 "1.10.10"
lefthook-openbsd-arm64 "1.10.10"
lefthook-openbsd-x64 "1.10.10"
lefthook-windows-arm64 "1.10.10"
lefthook-windows-x64 "1.10.10"
levn@^0.4.1:
version "0.4.1"
resolved "https://registry.yarnpkg.com/levn/-/levn-0.4.1.tgz#ae4562c007473b932a6200d403268dd2fffc6ade"
@ -2798,6 +2871,11 @@ make-dir@^1.0.0:
dependencies:
pify "^3.0.0"
markdown-link@^0.1.1:
version "0.1.1"
resolved "https://registry.yarnpkg.com/markdown-link/-/markdown-link-0.1.1.tgz#32c5c65199a6457316322d1e4229d13407c8c7cf"
integrity sha512-TurLymbyLyo+kAUUAV9ggR9EPcDjP/ctlv9QAFiqUH7c+t6FlsbivPo9OKTU8xdOx9oNd2drW/Fi5RRElQbUqA==
marked@^13.0.2:
version "13.0.3"
resolved "https://registry.yarnpkg.com/marked/-/marked-13.0.3.tgz#5c5b4a5d0198060c7c9bc6ef9420a7fed30f822d"