Elementary provides anomaly detection and schema tests for detecting data quality issues. Elementary data tests are configured and executed like native tests in your dbt project.

Elementary tests can be used in addition to dbt tests, tests from packages (such as dbt-expectations), and custom tests. All of these test results are presented in the Elementary UI and alerts.
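
As a minimal sketch of the configuration style, an Elementary test can sit next to a native dbt test in a model's properties file (the model, column, and timestamp names below are placeholders, not part of the package):

```yaml
# models/schema.yml -- illustrative names only
models:
  - name: orders
    columns:
      - name: order_id
        tests:
          - not_null                       # native dbt test
    tests:
      - elementary.volume_anomalies:       # Elementary test from the package
          timestamp_column: updated_at
```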

The Elementary dbt package offers two test types:

  • Pipeline tests: Monitor the health of data pipelines, ensuring timely and smooth data ingestion, transformation, and loading.
  • Data quality tests: Validate data accuracy, completeness, and correctness, detect anomalies and schema changes, and ensure the data meets predefined business rules.

Together, these tests ensure reliable pipelines and trusted data.

In addition to the previously mentioned tests, the Elementary Cloud Platform offers automated pipeline tests. While traditional tests query the monitored tables directly, automated pipeline tests analyze query history and metadata. This method is both faster and more cost-efficient, as it eliminates the need to query large datasets and relies solely on the metadata layer.

Elementary automatically creates monitors for every model and source in your dbt project as soon as you set up your environment; no configuration is required. Learn more about automated tests.

Anomaly detection tests

Tests that detect anomalies in data quality metrics such as volume, freshness, and null rates, as well as anomalies in specific dimensions.

Volume anomalies

Monitors table row count over time to detect drops or spikes in volume.
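
A minimal configuration sketch (the model and column names are placeholders, and the optional direction parameter is shown for illustration):

```yaml
models:
  - name: orders                         # placeholder model name
    tests:
      - elementary.volume_anomalies:
          timestamp_column: updated_at   # buckets row counts over time
          anomaly_direction: drop        # optional: alert only on drops
```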

Freshness anomalies

Monitors the latest timestamp of a table to detect data delays.
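
A minimal configuration sketch, assuming a placeholder model and timestamp column:

```yaml
models:
  - name: orders                         # placeholder model name
    tests:
      - elementary.freshness_anomalies:
          timestamp_column: updated_at   # latest timestamp monitored for delays
```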

Event freshness anomalies

Monitors the gap between the latest event timestamp and its loading time, to detect event freshness issues.
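
A minimal configuration sketch; the model and column names are placeholders for the event-time and load-time fields:

```yaml
models:
  - name: raw_events                                  # placeholder model name
    tests:
      - elementary.event_freshness_anomalies:
          event_timestamp_column: event_created_at    # when the event occurred
          update_timestamp_column: loaded_at          # when it landed in the warehouse
```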

Dimension anomalies

Monitors the row count per dimension over time, and alerts on unexpected changes in the distribution. It is best to configure it on low-cardinality fields.
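
A minimal configuration sketch with placeholder model, timestamp, and dimension names:

```yaml
models:
  - name: orders                          # placeholder model name
    tests:
      - elementary.dimension_anomalies:
          timestamp_column: updated_at
          dimensions:                     # low-cardinality fields to group by
            - country
            - platform
```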

All columns anomalies

Activates the column anomalies test on all the columns of the table. It’s possible to exclude specific columns.
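
A minimal configuration sketch; the model name, monitored metrics, and the exclusion pattern shown here are illustrative assumptions:

```yaml
models:
  - name: orders                              # placeholder model name
    tests:
      - elementary.all_columns_anomalies:
          column_anomalies:                   # metrics to monitor on each column
            - null_count
            - missing_count
          exclude_regexp: ".*_raw"            # assumed: skip columns matching this pattern
```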

Schema tests

Schema changes

Fails on schema changes: columns that were added or deleted, or a change in a column's data type.
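
A minimal configuration sketch with a placeholder model name; the severity setting is standard dbt test config, shown here only as an example:

```yaml
models:
  - name: orders                     # placeholder model name
    tests:
      - elementary.schema_changes:
          config:
            severity: warn           # warn instead of fail on schema changes
```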

Schema changes from baseline

Fails if the table's column names or column types differ from a configured baseline (which can be generated with a macro).
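
A hedged sketch, assuming the baseline is the set of column names and data types declared in the model's YAML (a baseline like this can be generated with the package's baseline macro rather than written by hand); all names are placeholders:

```yaml
models:
  - name: orders                      # placeholder model name
    columns:                          # declared columns serve as the baseline
      - name: order_id
        data_type: string
      - name: order_total
        data_type: float
    tests:
      - elementary.schema_changes_from_baseline
```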

JSON schema

Monitors a JSON-typed column and fails if it contains events that don't match a configured JSON schema (which can be generated with a macro).
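
A minimal column-level sketch; the model and column names are placeholders, and the layout shown (passing the JSON schema keys directly as test arguments) is an assumption:

```yaml
models:
  - name: raw_events                  # placeholder model name
    columns:
      - name: event_payload           # JSON-typed column (placeholder)
        tests:
          - elementary.json_schema:
              type: object
              properties:
                event_name:
                  type: string
              required:
                - event_name
```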

Exposure schema

Monitors changes in your models' columns that break the schema expected by downstream exposures, such as BI dashboards.
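
A hedged two-part sketch with placeholder names: the test is configured on the model, and the exposure is assumed to declare the columns it depends on under meta:

```yaml
# models/schema.yml
models:
  - name: orders
    tests:
      - elementary.exposure_schema_validity

# models/exposures.yml
exposures:
  - name: orders_dashboard
    type: dashboard
    depends_on:
      - ref('orders')
    owner:
      name: Data Team
      email: data@example.com
    meta:
      referenced_columns:             # columns the dashboard relies on (assumed layout)
        - column_name: order_total
          data_type: numeric
```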

Other tests

Python tests

Write your own custom tests using Python scripts.
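
A minimal configuration sketch, assuming the test is pointed at a project macro that returns the Python code to run; the model and macro names are hypothetical:

```yaml
models:
  - name: orders                                 # placeholder model name
    tests:
      - elementary.python:
          code_macro: validate_orders_python     # hypothetical macro returning the Python test code
```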