ce46bbaada
This adds the scaffolding for the ingester server to consume data from Kafka. Data is ingested into an in-memory structure while records are created in the catalog for any partitions that don't yet exist. I've removed catalog_update.rs in the ingester for now; it was mostly a placeholder, and its functionality will land in a combination of handler.rs and data.rs in my next PR, which will have a primitive lifecycle wired up. There's one ugly bit here where the DML write is cloned because it's borrowed to output spans and metrics. I'll need to follow up with a refactor so that the DML write's tables can be consumed without gumming up the metrics code.

Co-authored-by: kodiakhq[bot] <49736102+kodiakhq[bot]@users.noreply.github.com>
# IOx Catalog
This crate contains the code for the IOx Catalog: the definitions of namespaces, their tables, the columns of those tables and their types, which Parquet files are in object storage, and delete tombstones. It also holds some configuration information that the overall distributed system uses for operation.
To run this crate's tests you'll need Postgres installed and running locally. You'll also need to set the `DATABASE_URL` environment variable so that sqlx can connect to your local DB. For example, with user and password filled in:

`DATABASE_URL=postgres://<postgres user>:<postgres password>@localhost/iox_shared`
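As a concrete sketch, the DSN can be composed and exported like this (the user, password, and database name below are placeholder assumptions; substitute your own):

```shell
# Hypothetical local credentials -- replace with your own Postgres user/password.
PG_USER=postgres
PG_PASSWORD=postgres

# Compose the DSN for the iox_shared database and export it for sqlx.
export DATABASE_URL="postgres://${PG_USER}:${PG_PASSWORD}@localhost/iox_shared"
echo "$DATABASE_URL"
```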
You'll then need to create the database. You can do this via the sqlx command line tool:

`cargo install sqlx-cli`

`sqlx database create`

This will set up the database based on the files in `./migrations` in this crate. SQLx also creates a table to keep track of which migrations have been run.
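If you want to apply the migrations explicitly rather than relying on the test setup to do it, sqlx-cli can run them directly. This sketch assumes `DATABASE_URL` is already exported as described above and that you are running from this crate's directory, so `./migrations` is found:

```shell
# Apply any pending migrations from ./migrations to the database
# named in DATABASE_URL.
sqlx migrate run

# Show which migrations have been applied, to confirm the schema is current.
sqlx migrate info
```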
## Tests

To run the Postgres integration tests, ensure the above setup is complete first, then:

- Set the `DATABASE_URL=<dsn>` environment variable (see above)
- Set `TEST_INTEGRATION=1`
- Run `cargo test`

**CAUTION:** existing data in the database is dropped when tests are run.
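Putting the steps above together, a one-shot invocation might look like the following. The DSN is a placeholder assumption, and given the caution above it should point at a disposable database:

```shell
# Placeholder DSN -- use a throwaway database, since the test run drops
# any existing data in it.
DATABASE_URL="postgres://postgres:postgres@localhost/iox_shared" \
TEST_INTEGRATION=1 \
cargo test
```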