Merge pull request #320 from influxdata/query/sql-guide
Query SQL guide with sample data
commit 28c41bdcd4
@@ -56,6 +56,7 @@ pre {
  padding: 0;
  font-size: .95rem;
  line-height: 1.5rem;
  white-space: pre-wrap;
}
}
@@ -6,7 +6,7 @@ menu:
  v2_0:
    name: Create histograms
    parent: How-to guides
-    weight: 207
+    weight: 208
---
@@ -0,0 +1,217 @@
---
title: Query SQL data sources
seotitle: Query SQL data sources with InfluxDB
description: >
  The Flux `sql` package provides functions for working with SQL data sources.
  Use `sql.from()` to query SQL databases like PostgreSQL and MySQL.
v2.0/tags: [query, flux, sql]
menu:
  v2_0:
    parent: How-to guides
    weight: 207
---

The [Flux](/v2.0/reference/flux) `sql` package provides functions for working with SQL data sources.
[`sql.from()`](/v2.0/reference/flux/functions/sql/from/) lets you query SQL data sources
like [PostgreSQL](https://www.postgresql.org/) and [MySQL](https://www.mysql.com/)
and use the results with InfluxDB dashboards, tasks, and other operations.

- [Query a SQL data source](#query-a-sql-data-source)
- [Join SQL data with data in InfluxDB](#join-sql-data-with-data-in-influxdb)
- [Use SQL results to populate dashboard variables](#use-sql-results-to-populate-dashboard-variables)
- [Sample sensor data](#sample-sensor-data)

## Query a SQL data source
To query a SQL data source:

1. Import the `sql` package in your Flux query.
2. Use the `sql.from()` function to specify the driver, data source name (DSN),
   and query used to query data from your SQL data source:

{{< code-tabs-wrapper >}}
{{% code-tabs %}}
[PostgreSQL](#)
[MySQL](#)
{{% /code-tabs %}}

{{% code-tab-content %}}
```js
import "sql"

sql.from(
  driverName: "postgres",
  dataSourceName: "postgresql://user:password@localhost",
  query: "SELECT * FROM example_table"
)
```
{{% /code-tab-content %}}

{{% code-tab-content %}}
```js
import "sql"

sql.from(
  driverName: "mysql",
  dataSourceName: "user:password@tcp(localhost:3306)/db",
  query: "SELECT * FROM example_table"
)
```
{{% /code-tab-content %}}
{{< /code-tabs-wrapper >}}

_See the [`sql.from()` documentation](/v2.0/reference/flux/functions/sql/from/) for
information about required function parameters._

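Note that the call shape is identical for both drivers; only the `driverName` and the DSN format change. As a rough sketch (the `flux_sql_from` helper and the `DSN_FORMATS` table are hypothetical, not part of Flux or InfluxDB), you could generate the Flux query text programmatically:

```ruby
# Hypothetical helper that builds a Flux sql.from() call for a given driver.
# The DSN formats mirror the PostgreSQL and MySQL tabs above.
DSN_FORMATS = {
  "postgres" => "postgresql://user:password@localhost",
  "mysql"    => "user:password@tcp(localhost:3306)/db"
}

def flux_sql_from(driver, query)
  dsn = DSN_FORMATS.fetch(driver)
  <<~FLUX
    import "sql"

    sql.from(
      driverName: "#{driver}",
      dataSourceName: "#{dsn}",
      query: "#{query}"
    )
  FLUX
end

puts flux_sql_from("postgres", "SELECT * FROM example_table")
```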
## Join SQL data with data in InfluxDB
One of the primary benefits of querying SQL data sources from InfluxDB
is the ability to enrich query results with data stored outside of InfluxDB.

Using the [air sensor sample data](#sample-sensor-data) below, the following query
joins air sensor metrics stored in InfluxDB with sensor information stored in PostgreSQL.
The joined data lets you query and filter results based on sensor information
that isn't stored in InfluxDB.

```js
// Import the "sql" package
import "sql"

// Query data from PostgreSQL
sensorInfo = sql.from(
  driverName: "postgres",
  dataSourceName: "postgresql://localhost?sslmode=disable",
  query: "SELECT * FROM sensors"
)

// Query data from InfluxDB
sensorMetrics = from(bucket: "example-bucket")
  |> range(start: -1h)
  |> filter(fn: (r) => r._measurement == "airSensors")

// Join InfluxDB query results with PostgreSQL query results
join(tables: {metric: sensorMetrics, info: sensorInfo}, on: ["sensor_id"])
```

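Conceptually, `join()` matches rows from the two streams that share the same `sensor_id` value and merges their columns. A rough Ruby analogy of that matching (the row data here is illustrative, not actual query output):

```ruby
# Illustrative rows standing in for the InfluxDB metrics stream
metrics = [
  {sensor_id: "TLM0100", temperature: 71.2},
  {sensor_id: "TLM0101", temperature: 71.8}
]

# Illustrative rows standing in for the PostgreSQL sensors table
info = [
  {sensor_id: "TLM0100", location: "Main Lobby"},
  {sensor_id: "TLM0101", location: "Room 101"}
]

# Merge columns from both sides for rows that share a sensor_id
joined = metrics.map do |m|
  match = info.find { |i| i[:sensor_id] == m[:sensor_id] }
  match ? m.merge(match) : nil
end.compact

joined.each { |row| puts row }
```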
## Use SQL results to populate dashboard variables
Use `sql.from()` to [create dashboard variables](/v2.0/visualize-data/variables/create-variable/)
from SQL query results.
The following example uses the [air sensor sample data](#sample-sensor-data) below to
create a variable that lets you select the location of a sensor.

```js
import "sql"

sql.from(
  driverName: "postgres",
  dataSourceName: "postgresql://localhost?sslmode=disable",
  query: "SELECT * FROM sensors"
)
  |> rename(columns: {location: "_value"})
  |> keep(columns: ["_value"])
```

Use the variable to manipulate queries in your dashboards.

{{< img-hd src="/img/2-0-sql-dashboard-variable.png" alt="Dashboard variable from SQL query results" />}}

---

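The `rename` + `keep` pair reshapes the SQL result so that only a `_value` column of location names remains, which is the shape dashboard variables expect. As a rough analogy in Ruby (illustrative rows, not real query output):

```ruby
# Illustrative rows standing in for the SQL query result
sensors = [
  {sensor_id: "TLM0100", location: "Main Lobby"},
  {sensor_id: "TLM0101", location: "Room 101"}
]

# rename(columns: {location: "_value"}) then keep(columns: ["_value"]):
# each row keeps only the location, relabeled as _value
values = sensors.map { |row| {_value: row[:location]} }
values.each { |v| puts v[:_value] }
```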
## Sample sensor data
The [sample data generator](#download-and-run-the-sample-data-generator) and
[sample sensor information](#import-the-sample-sensor-information) simulate a
group of sensors that measure temperature, humidity, and carbon monoxide
in rooms throughout a building.
Each collected data point is stored in InfluxDB with a `sensor_id` tag that identifies
the specific sensor it came from.
Sample sensor information is stored in PostgreSQL.

**Sample data includes:**

- Simulated data collected from each sensor and stored in the `airSensors` measurement in **InfluxDB**:
  - temperature
  - humidity
  - co

- Information about each sensor stored in the `sensors` table in **PostgreSQL**:
  - sensor_id
  - location
  - model_number
  - last_inspected

### Import and generate sample sensor data

#### Download and run the sample data generator
`air-sensor-data.rb` is a script that generates air sensor data and stores the data in InfluxDB.
To use `air-sensor-data.rb`:

1. [Create a bucket](/v2.0/organizations/buckets/create-bucket/) to store the data.
2. Download the sample data generator. _This tool requires [Ruby](https://www.ruby-lang.org/en/)._

    <a class="btn download" href="/downloads/air-sensor-data.rb" download>Download Air Sensor Generator</a>

3. Give `air-sensor-data.rb` executable permissions:

    ```sh
    chmod +x air-sensor-data.rb
    ```

4. Start the generator. Specify your organization, bucket, and authorization token.
   _For information about retrieving your token, see [View tokens](/v2.0/security/tokens/view-tokens/)._

    ```sh
    ./air-sensor-data.rb -o your-org -b your-bucket -t YOURAUTHTOKEN
    ```

    The generator begins to write data to InfluxDB and will continue until stopped.
    Use `ctrl-c` to stop the generator.

    _**Note:** Use the `--help` flag to view other configuration options._

5. [Query your target bucket](/v2.0/query-data/execute-queries/) to ensure the
   generated data is writing successfully.
   The generator doesn't catch errors from write requests, so it will continue running
   even if data is not writing to InfluxDB successfully.

    ```js
    from(bucket: "example-bucket")
      |> range(start: -1m)
      |> filter(fn: (r) => r._measurement == "airSensors")
    ```

#### Import the sample sensor information
1. [Download and install PostgreSQL](https://www.postgresql.org/download/).
2. Download the sample sensor information CSV.

    <a class="btn download" href="/downloads/sample-sensor-info.csv" download>Download Sample Data</a>

3. Use a PostgreSQL client (`psql` or a GUI) to create the `sensors` table:

    ```sql
    CREATE TABLE sensors (
      sensor_id character varying(50),
      location character varying(50),
      model_number character varying(50),
      last_inspected date
    );
    ```

4. Import the downloaded CSV sample data.
   _Update the `FROM` file path to the path of the downloaded CSV sample data._

    ```sql
    COPY sensors(sensor_id,location,model_number,last_inspected)
    FROM '/path/to/sample-sensor-info.csv' DELIMITER ',' CSV HEADER;
    ```

5. Query the table to ensure the data was imported correctly:

    ```sql
    SELECT * FROM sensors;
    ```

#### Import the sample data dashboard
Download and import the Air Sensors dashboard to visualize the generated data:

<a class="btn download" href="/downloads/air_sensors_dashboard.json" download>Download Air Sensors dashboard</a>

_For information about importing a dashboard, see [Create a dashboard](/v2.0/visualize-data/dashboards/create-dashboard/#create-a-new-dashboard)._

@@ -60,11 +60,11 @@ _**Data type:** String_
```js
import "sql"

sql.from(
  driverName: "mysql",
  dataSourceName: "user:password@tcp(localhost:3306)/db",
  query: "SELECT * FROM ExampleTable"
)
sql.from(
  driverName: "mysql",
  dataSourceName: "user:password@tcp(localhost:3306)/db",
  query: "SELECT * FROM ExampleTable"
)
```

### Query a Postgres database

@@ -0,0 +1,131 @@
#! /usr/bin/ruby
require "optparse"
require "net/http"
require "openssl"
require "uri"

# CLI Options
options = {
  protocol: "http",
  host: "localhost",
  port: "9999",
  interval: 5
}

OptionParser.new do |opt|
  opt.banner = "Usage: air-sensor-data [OPTIONS]"

  opt.on("-o", "--org ORG", "The organization to write data to. REQUIRED.") do |org|
    options[:org] = org
  end

  opt.on("-b", "--bucket BUCKET", "The bucket to write data to. REQUIRED.") do |bucket|
    options[:bucket] = bucket
  end

  opt.on("-t", "--token TOKEN", "Your InfluxDB authentication token. REQUIRED.") do |token|
    options[:token] = token
  end

  opt.on("-h", "--host host", "Your InfluxDB host. Defaults to 'localhost'.") do |host|
    options[:host] = host
  end

  opt.on("-p", "--port port", "Your InfluxDB port. Defaults to '9999'.") do |port|
    options[:port] = port
  end

  opt.on("-i", "--interval interval", Integer, "The interval (in seconds) at which to write data. Defaults to '5'.") do |interval|
    options[:interval] = interval
  end

  opt.on("-s", "--tls", "Sends data over HTTPS.") do |tls|
    options[:protocol] = "https"
  end

  opt.on("--help", "Displays this help information.") do
    puts opt
    exit
  end
end.parse!

unless options[:org] && options[:bucket] && options[:token]
  $stderr.puts "\nError: you must specify an organization, bucket, and token.\nUse the '--help' flag for more info.\n\n"
  exit 1
end

# Global Variables
$protocol = options[:protocol]
$host = options[:host]
$port = options[:port]
$org = options[:org]
$bucket = options[:bucket]
$token = options[:token]
$interval = options[:interval]

# Seed Data
seeds = [
  {id: 100, t: 71.2, h: 35.1, c: 0.5, t_inc: -0.05..0.05, h_inc: -0.05..0.05, c_inc: -0.02..0.02},
  {id: 101, t: 71.8, h: 34.9, c: 0.5, t_inc: -0.05..0.05, h_inc: -0.05..0.05, c_inc: -0.02..0.02},
  {id: 102, t: 72.0, h: 34.9, c: 0.5, t_inc: -0.05..0.05, h_inc: -0.05..0.05, c_inc: -0.02..0.02},
  {id: 103, t: 71.3, h: 35.2, c: 0.4, t_inc: -0.05..0.05, h_inc: -0.05..0.05, c_inc: -0.02..0.02},
  {id: 200, t: 73.6, h: 35.8, c: 0.5, t_inc: -0.05..0.05, h_inc: -0.05..0.05, c_inc: -0.02..0.05},
  {id: 201, t: 74.0, h: 35.2, c: 0.5, t_inc: -0.05..0.05, h_inc: -0.05..0.05, c_inc: -0.02..0.02},
  {id: 202, t: 75.3, h: 35.7, c: 0.5, t_inc: -0.05..0.05, h_inc: -0.05..0.05, c_inc: -0.02..0.02},
  {id: 203, t: 74.8, h: 35.9, c: 0.4, t_inc: -0.05..0.05, h_inc: -0.05..0.05, c_inc: -0.02..0.02},
]

def increment_data(data={})
  data[:t] += rand(data[:t_inc])
  data[:h] += rand(data[:h_inc])
  data[:c] += rand(data[:c_inc])

  # Avoid negative humidity and co
  if data[:h] < 0
    data[:h] = 0
  end
  if data[:c] < 0
    data[:c] = 0
  end

  return data
end

def line_protocol_batch(point_data=[])
  batch = []
  point_data.each do |v|
    batch << "airSensors,sensor_id=TLM0#{v[:id]} temperature=#{v[:t]},humidity=#{v[:h]},co=#{v[:c]}"
  end
  return batch.join("\n")
end

def send_data(batch)
  uri = URI.parse("#{$protocol}://#{$host}:#{$port}/api/v2/write?org=#{URI::encode($org)}&bucket=#{URI::encode($bucket)}")
  request = Net::HTTP::Post.new(uri)
  request["Authorization"] = "Token #{$token}"
  request.body = "#{batch}"

  req_options = {
    use_ssl: uri.scheme == "https",
    ssl_version: :SSLv23
  }

  response = Net::HTTP.start(uri.hostname, uri.port, req_options) do |http|
    http.request(request)
  end
end

def send_batches(dataset=[], interval=$interval)
  dataset.map! { |seed| increment_data(seed) }
  send_data(line_protocol_batch(dataset))
  sleep interval
  send_batches(dataset, interval)
end

begin
  puts "Sending data to #{$protocol}://#{$host}:#{$port}..."
  puts "  (ctrl-c to kill the data stream)"
  send_batches(seeds)
rescue Interrupt
  puts "\nStopping data stream..."
end
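For reference, `line_protocol_batch` above emits one line-protocol point per sensor. A standalone sketch of the same formatting logic (copied from the script) shows the output shape:

```ruby
# Same formatting logic as line_protocol_batch in the script above
def line_protocol_batch(point_data=[])
  batch = []
  point_data.each do |v|
    batch << "airSensors,sensor_id=TLM0#{v[:id]} temperature=#{v[:t]},humidity=#{v[:h]},co=#{v[:c]}"
  end
  batch.join("\n")
end

# One seed point, using the first seed's initial values
sample = [{id: 100, t: 71.2, h: 35.1, c: 0.5}]
puts line_protocol_batch(sample)
# airSensors,sensor_id=TLM0100 temperature=71.2,humidity=35.1,co=0.5
```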
File diff suppressed because it is too large
@@ -0,0 +1,9 @@
sensor_id,location,model_number,last_inspected
TLM0100,Main Lobby,TLM89092A,1/11/2019
TLM0101,Room 101,TLM89092A,1/11/2019
TLM0102,Room 102,TLM89092B,1/11/2019
TLM0103,Mechanical Room,TLM90012Z,1/14/2019
TLM0200,Conference Room,TLM89092B,9/24/2018
TLM0201,Room 201,TLM89092B,9/24/2018
TLM0202,Room 202,TLM89092A,9/24/2018
TLM0203,Room 203,TLM89092A,9/24/2018
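The CSV columns above map directly onto the `sensors` table created in the guide. A quick sketch with Ruby's standard `csv` library (a few rows inlined here for illustration, rather than reading the downloaded file):

```ruby
require "csv"

# A subset of the sample sensor CSV, inlined for illustration
data = <<~CSV
  sensor_id,location,model_number,last_inspected
  TLM0100,Main Lobby,TLM89092A,1/11/2019
  TLM0103,Mechanical Room,TLM90012Z,1/14/2019
CSV

# headers: true makes each row addressable by column name
rows = CSV.parse(data, headers: true)
rows.each { |r| puts "#{r["sensor_id"]} -> #{r["location"]}" }
```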
Binary file not shown