refactor: merge CI integration tests (#3049)

* refactor: merge CI integration tests

* chore: update docs
Raphael Taylor-Davies 2021-11-10 16:48:16 +00:00 committed by GitHub
parent a439404e82
commit f650962221
5 changed files with 37 additions and 135 deletions


@ -149,7 +149,16 @@ jobs:
# "1" means line tables only, which is useful for panic tracebacks.
RUSTFLAGS: "-C debuginfo=1"
RUST_BACKTRACE: "1"
# Run integration tests
TEST_INTEGRATION: 1
INFLUXDB_IOX_INTEGRATION_LOCAL: 1
KAFKA_CONNECT: "localhost:9092"
steps:
- run:
name: Run Kafka
# Sudo needed because data directory owned by root but container runs as unprivileged user
command: sudo rpk redpanda start
background: true
- checkout
- rust_components
- cache_restore
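
These three environment variables are how the merged `test` job opts in to the integration tests that previously ran in separate jobs. The gating helper the tests use is not part of this diff; the following is only a minimal Rust sketch of the usual pattern (the helper and test names are hypothetical), assuming tests skip themselves when `TEST_INTEGRATION` is unset:

```
use std::env;

// Hypothetical helper (not from the IOx codebase): return the broker address
// if integration tests are enabled, otherwise tell the caller to skip.
fn kafka_connect_or_skip() -> Option<String> {
    if env::var("TEST_INTEGRATION").is_err() {
        eprintln!("TEST_INTEGRATION not set; skipping integration test");
        return None;
    }
    // Fall back to the address the CI job exports as KAFKA_CONNECT above.
    Some(env::var("KAFKA_CONNECT").unwrap_or_else(|_| "localhost:9092".to_string()))
}

#[test]
fn kafka_write_buffer_smoke() {
    let connect = match kafka_connect_or_skip() {
        Some(c) => c,
        None => return, // not an integration run; the test passes as a no-op
    };
    // ... connect to the broker at `connect` and exercise the write buffer ...
    assert!(!connect.is_empty());
}
```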
@ -222,42 +231,6 @@ jobs:
trap "cat perf/logs/test.log" ERR
perf/perf.py --debug --no-volumes --object-store memory battery-0
test_kafka_integration:
machine: true
resource_class: xlarge
environment:
# Disable incremental compilation to avoid overhead. We are not preserving these files anyway.
CARGO_INCREMENTAL: "0"
# Disable full debug symbol generation to speed up CI build
# "1" means line tables only, which is useful for panic tracebacks.
RUSTFLAGS: "-C debuginfo=1"
RUST_BACKTRACE: "1"
steps:
- checkout
- run:
name: Run integration tests with Docker Compose
command: docker-compose -f docker/ci-kafka-docker-compose.yml up --build --force-recreate --exit-code-from rust
# Integration tests for the influxdb2_client crate against InfluxDB 2.0 OSS.
test_influxdb2_client:
docker:
- image: quay.io/influxdb/rust:ci
environment:
# Disable incremental compilation to avoid overhead. We are not preserving these files anyway.
CARGO_INCREMENTAL: "0"
# Disable full debug symbol generation to speed up CI build
# "1" means line tables only, which is useful for panic tracebacks.
RUSTFLAGS: "-C debuginfo=1"
RUST_BACKTRACE: "1"
steps:
- checkout
- rust_components
- cache_restore
- run:
name: Cargo test
command: TEST_INTEGRATION=1 INFLUXDB_IOX_INTEGRATION_LOCAL=1 cargo test -p influxdb2_client
- cache_save
# Build a dev binary.
#
# Compiles a binary with the default ("dev") cargo profile from the iox source
@ -434,8 +407,6 @@ workflows:
      - test
      - test_heappy
      - test_perf
      - test_kafka_integration
      - test_influxdb2_client
      - build
      - doc
      - perf_image:
@ -450,8 +421,6 @@ workflows:
      - test
      - test_heappy
      - test_perf
      - test_kafka_integration
      - test_influxdb2_client
      - build
      - doc


@ -1,24 +0,0 @@
###
# Dockerfile for integration tests that connect to Kafka
# It expects to be run with `docker-compose -f ci-kafka-docker-compose.yml`
##
# Reuse most of the configuration for the rest of the CI builds
FROM quay.io/influxdb/rust:ci
# Create a new directory that will contain the code checkout
ADD . /home/rust/iox
# Make the rust user the owner
RUN sudo chown -R rust:rust /home/rust/iox
# Work in this directory
WORKDIR /home/rust/iox
ENV CARGO_INCREMENTAL=0
ENV RUSTFLAGS="-C debuginfo=1"
ENV TEST_INTEGRATION=1
ENV KAFKA_CONNECT=kafka:9092
# Run the integration tests that connect to Kafka that will be running in another container
CMD ["sh", "-c", "cargo test -p write_buffer kafka -- --nocapture"]


@ -1 +0,0 @@
target/


@ -1,37 +0,0 @@
version: "2"
services:
  zookeeper:
    image: docker.io/bitnami/zookeeper:3
    ports:
      - "2181:2181"
    volumes:
      - "zookeeper_data:/bitnami"
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
  kafka:
    image: docker.io/bitnami/kafka:2
    ports:
      - "9093:9093"
    volumes:
      - "kafka_data:/bitnami"
    environment:
      - KAFKA_CFG_ZOOKEEPER_CONNECT=zookeeper:2181
      - ALLOW_PLAINTEXT_LISTENER=yes
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CLIENT:PLAINTEXT,EXTERNAL:PLAINTEXT
      - KAFKA_CFG_LISTENERS=CLIENT://:9092,EXTERNAL://:9093
      - KAFKA_CFG_ADVERTISED_LISTENERS=CLIENT://kafka:9092,EXTERNAL://localhost:9093
      - KAFKA_INTER_BROKER_LISTENER_NAME=CLIENT
      - KAFKA_CFG_LOG_RETENTION_CHECK_INTERVAL_MS=100
    depends_on:
      - zookeeper
  rust:
    build:
      context: ..
      dockerfile: docker/Dockerfile.ci.integration
    depends_on:
      - kafka

volumes:
  zookeeper_data: {}
  kafka_data: {}
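
The removed Compose file configures two Kafka listeners because clients reach the broker at different addresses: the `rust` service inside the Compose network uses the `CLIENT` listener at `kafka:9092`, while `cargo test` on the host uses the `EXTERNAL` listener advertised at `localhost:9093`. A minimal Rust sketch of that address choice (illustrative only, not code from the repository):

```
// Mirrors KAFKA_CFG_ADVERTISED_LISTENERS in the removed Compose file; illustrative only.
const INSIDE_COMPOSE_NETWORK: &str = "kafka:9092"; // CLIENT listener (e.g. the `rust` service)
const FROM_THE_HOST: &str = "localhost:9093"; // EXTERNAL listener (e.g. `cargo test` on your machine)

/// Pick the bootstrap address based on where the test process runs.
fn bootstrap_servers(runs_inside_compose: bool) -> &'static str {
    if runs_inside_compose {
        INSIDE_COMPOSE_NETWORK
    } else {
        FROM_THE_HOST
    }
}

fn main() {
    // A host-side `cargo test` run would export KAFKA_CONNECT=localhost:9093.
    println!("KAFKA_CONNECT={}", bootstrap_servers(false));
}
```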


@ -111,48 +111,43 @@ If you do not want to use Docker locally, but you do have `influxd` for InfluxDB
## Kafka Write Buffer
If you want to run integration tests with a Kafka instance serving as a write buffer, you will need
to set `TEST_INTEGRATION=1`.
By default, the integration tests for the Kafka-based write buffer are not run.
You will also need to set `KAFKA_CONNECT` to the host and port where the tests can connect to a
running Kafka instance.
In order to run them you must set two environment variables:
There is a Docker Compose file for running Kafka and Zookeeper using Docker in
`docker/ci-kafka-docker-compose.yml` that CI also uses to run the integration tests.
* `TEST_INTEGRATION=1`
* `KAFKA_CONNECT` to a host and port where the tests can connect to a running Kafka broker
You have two options for running `cargo test`: on your local (host) machine (likely what you
normally do with tests), or within another Docker container (what CI does).
### Running Kafka Locally
### Running `cargo test` on the host machine
[Redpanda](https://vectorized.io/redpanda/) is a Kafka-compatible broker that can be used to run the tests, and is used
by the CI to test IOx.
If you want to compile the tests and run `cargo test` on your local machine, you can start Kafka
using the Docker Compose file with:
Either follow the instructions on the Redpanda website to install it directly onto your system, or run it in a
Docker container with:
```
$ docker-compose -f docker/ci-kafka-docker-compose.yml up kafka
docker run -d --pull=always --name=redpanda-1 --rm \
-p 9092:9092 \
-p 9644:9644 \
docker.vectorized.io/vectorized/redpanda:latest \
redpanda start \
--overprovisioned \
--smp 1 \
--memory 1G \
--reserve-memory 0M \
--node-id 0 \
--check=false
```
You can then run the tests with `KAFKA_CONNECT=localhost:9093`. To run just the Kafka integration
tests, the full command would then be:
It is then just a case of setting the environment variables and running the tests as normal:
```
TEST_INTEGRATION=1 KAFKA_CONNECT=localhost:9093 cargo test
```
Or, to run just the Kafka tests:
```
TEST_INTEGRATION=1 KAFKA_CONNECT=localhost:9093 cargo test -p write_buffer kafka -- --nocapture
```
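For context, the commands above drive integration tests that talk to the broker at `KAFKA_CONNECT`. The sketch below is not the repository's `write_buffer` test; it is a toy produce/consume roundtrip using the `rdkafka` crate (assumed here as the Kafka client), with hypothetical topic and group names, to show the kind of traffic such a test exercises:

```
use std::time::Duration;

use rdkafka::config::ClientConfig;
use rdkafka::consumer::{BaseConsumer, Consumer};
use rdkafka::producer::{BaseProducer, BaseRecord, Producer};
use rdkafka::Message;

fn main() {
    // Assumes a broker reachable at KAFKA_CONNECT (e.g. the Redpanda container above).
    let broker = std::env::var("KAFKA_CONNECT")
        .expect("set KAFKA_CONNECT, e.g. localhost:9092 for the Redpanda container");

    // Produce one record to a throwaway topic (auto-created if the broker allows it).
    let producer: BaseProducer = ClientConfig::new()
        .set("bootstrap.servers", broker.as_str())
        .create()
        .expect("producer creation failed");
    producer
        .send(BaseRecord::to("demo_roundtrip").key("k").payload("hello"))
        .map_err(|(e, _)| e)
        .expect("failed to enqueue record");
    let _ = producer.flush(Duration::from_secs(5));

    // Read it back from the beginning of the topic.
    let consumer: BaseConsumer = ClientConfig::new()
        .set("bootstrap.servers", broker.as_str())
        .set("group.id", "demo_group")
        .set("auto.offset.reset", "earliest")
        .create()
        .expect("consumer creation failed");
    consumer
        .subscribe(&["demo_roundtrip"])
        .expect("subscribe failed");
    let message = consumer
        .poll(Duration::from_secs(10))
        .expect("timed out waiting for a message")
        .expect("kafka error while polling");
    assert_eq!(message.payload(), Some(&b"hello"[..]));
    println!("roundtrip ok via {}", broker);
}
```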
### Running `cargo test` in a Docker container
Alternatively, you can do what CI does by compiling the tests and running `cargo test` in a Docker
container as well. First, make sure you have the latest `rust:ci` image by running:
```
docker image pull quay.io/influxdb/rust:ci
```
Then run this Docker Compose command that uses `docker/Dockerfile.ci.integration`:
```
docker-compose -f docker/ci-kafka-docker-compose.yml up --build --force-recreate --exit-code-from rust
```
Because the `rust` service depends on the `kafka` service in the Docker Compose file, you don't
need to start the `kafka` service separately.