Releases: snowplow/dbt-snowplow-attribution

snowplow-attribution v0.4.0

15 Oct 10:33

Summary

This release extends support to Apache Spark with the Iceberg file format and updates integration tests accordingly.

Features

  • Add support for Apache Spark with Iceberg file format

Under the hood

  • Fix return_limit_from_model macro

Upgrading

Update the snowplow-attribution version in your packages.yml file.
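For example, if you install from dbt hub, the entry might look like the following (a sketch; check the package's hub page for the exact latest version range):

```yml
packages:
  - package: snowplow/snowplow_attribution
    version: [">=0.4.0", "<0.5.0"]
```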

snowplow-attribution v0.3.0

26 Jul 11:05

Summary

This release introduces new user stitching logic for when the snowplow__conversion_stitching variable is not enabled, so you no longer need to enable both view and conversion stitching in the Unified package to get the most accurate user journeys and conversions. To accomplish this, the package relies on one more source from the Unified Digital dbt package: the snowplow_unified_user_mapping table.
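As a rough illustration of the idea (not the package's actual SQL; the table references and column names here are simplified assumptions), the stitching resembles a left join to the user mapping table, falling back to the original identifier where no mapping exists:

```sql
-- Hypothetical sketch: stitch event-level identifiers to a canonical user id
-- using the Unified user mapping table (names simplified for illustration).
select
  coalesce(um.user_id, ev.user_identifier) as stitched_user_identifier,
  ev.*
from {{ var('snowplow__conversion_path_source') }} as ev
left join {{ ref('snowplow_unified_user_mapping') }} as um
  on ev.user_identifier = um.user_identifier
```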

Features

  • Change stitching logic

Upgrading

Bump the snowplow-attribution version in your packages.yml file.

🚨 Breaking Changes 🚨

Due to the new user mapping table join in the paths_to_conversions() macro, you may need to change your snowplow__conversion_clause variable by adding the ev table alias to the fields it references (user_identifier or user_id).
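For instance, if your clause previously referenced user_identifier directly, it would need the ev alias, along these lines (the clause value shown is a hypothetical example, not a default):

```yml
vars:
  snowplow_attribution:
    # before: 'user_identifier is not null'
    snowplow__conversion_clause: 'ev.user_identifier is not null'
```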

By default, the package now relies on the snowplow_unified_user_mapping table. Most users already use the Unified package as a source, so for them this should not be a breaking change; where that table is not available, the paths_to_conversions() macro will have to be overridden in the dbt project that references the package. Similarly, the optional paths_to_non_conversion model has also changed and would need to be disabled and overridden in that case, too.
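Overriding a package macro is typically done with dbt's dispatch configuration: define a macro of the same name in your own project, then tell dbt to search your project first. A minimal sketch for dbt_project.yml (the project name my_dbt_project is a placeholder):

```yml
# dbt_project.yml
dispatch:
  - macro_namespace: snowplow_attribution
    search_order: ['my_dbt_project', 'snowplow_attribution']
```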

snowplow-attribution v0.2.2

19 Jun 09:57

Summary

This release fixes a bug in the source_checks() macro, which could fail if the snowplow__conversions_source or snowplow__conversion_path_source variables were overridden. With this fix, users can provide a schema.table or warehouse.schema.table style string reference for these sources (similar to the default) without having to define a dbt source in their project, if necessary.
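A string-style reference might look like this in dbt_project.yml (the schema and table names below are placeholders for illustration):

```yml
vars:
  snowplow_attribution:
    snowplow__conversions_source: 'my_schema.my_conversions_table'
    snowplow__conversion_path_source: 'my_warehouse.my_schema.my_paths_table'
```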

Fixes

  • Fix source checks macro

Upgrading

Bump the snowplow-attribution version in your packages.yml file.

snowplow-attribution v0.2.1

11 Jun 15:09

Summary

This release excludes rows with a null user_identifier when the conversions_source and path_source are joined. Previously, catching nulls on this field relied solely on tests in the Unified package; that reliance has been removed, as it is easier to enforce the constraint directly in the model itself.
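Conceptually, the join now filters nulls in the model itself, along these lines (a simplified sketch with placeholder relation names, not the package's exact SQL):

```sql
-- Simplified sketch: drop rows without a user identifier around the join
select cv.*
from conversions_source as cv
join path_source as pv
  on cv.user_identifier = pv.user_identifier
where cv.user_identifier is not null
  and pv.user_identifier is not null
```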

Features

  • Exclude null user identifiers from sources

Under the hood

  • Add Redshift to tests

Upgrading

Bump the snowplow-attribution version in your packages.yml file.

snowplow-attribution v0.2.0

26 Mar 10:21

Summary

This release adds new cv_id and cv_type columns to align with the conversion table structure of the latest snowplow_unified package, which this package can take as a base (recommended). It also makes a few internal tweaks, including adding support for schema grants.

🚨 Breaking Changes 🚨

  • The surrogate_key and the update column have changed; it is best to do a full refresh as part of the upgrade.

Features

  • Add support for schema grants
  • Add cv_id and cv_type

Under the hood

  • Enforce full refresh flag for refreshing manifest tables
  • Safeguard spend source from duplication
  • Disable reporting models

Upgrading

To upgrade, bump the snowplow-attribution version in your packages.yml file and run a full refresh.
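Assuming you run the package's models via a selector on the package, the refresh might look like:

```shell
dbt run --select snowplow_attribution --full-refresh
```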

Although there are no direct dependencies between the packages, the snowplow-unified package needs to be upgraded to v0.4.0 as well for the new package structure to work.

snowplow-attribution v0.1.0

31 Jan 12:54

Summary

This is the first release of the Snowplow Attribution package, which contains incremental tables to prepare data for marketing attribution analysis, as well as report tables that help you understand and visualize which channels or campaigns attributed the most to your revenue. It supports various attribution models out of the box, all in SQL.

Features

  • Incremental dbt package that produces tables for marketing attribution analysis
  • Support for Snowflake / BigQuery / Databricks / Redshift

Installation

To install the package, add the following to the packages.yml in your project:

Github

packages:
  - git: "https://github.com/snowplow/dbt-snowplow-attribution.git"
    revision: 0.1.0

dbt hub

Please note that it may be a few days before the package is available on dbt hub after the initial release.

packages:
  - package: snowplow/snowplow_attribution
    version: [">=0.1.0", "<0.2.0"]