Releases: dbt-labs/metricflow
MetricFlow 0.201.0
This release adds duckdb support and provides a partial fix for user-reported issues with partition pruning in BigQuery.
MetricFlow 0.201.0 - August 17, 2023
Features
- Add dbt-duckdb as a supported adapter and remove legacy DuckDB sql client (#583)
Fixes
- Remove barrier to partition pruning for certain time-partition filter predicates (#712)
v0.200.0
This release represents a complete overhaul of MetricFlow targeted at a first-class integration with dbt! Complete changelog below:
MetricFlow 0.200.0 - August 02, 2023
Breaking Changes
- License Change - Version 0 to 0.140.0 was covered by the Affero GPL license. Version 0.150.0 and greater is covered by the BSL license. (#465)
- Removing time_format from DimensionTypeParams (#494)
- Use Templates For Defining Metric Filters (#505)
- Rename Metric.constraint to Metric.filter (#511)
- Deprecate and refactor CLI commands
- Removes async query and query cancel methods from SqlClient protocols (#577)
- Remove time spine introspection and table creation, which may break cumulative metric queries (#592)
- Remove SqlEngineAttributes construct from SqlClient interface in favor of dialect rendering and engine type properties (#577)
- CLI needs to be run in a dbt project root directory
- Remove expr & ratio metrics and bundle with derived metrics. (#504)
- explain_get_dimension_values & get_dimension_values take a list of metrics parameters
- Remove MetricFlow config file - all future configuration must originate with the dbt project (#624)
- Update to dbt-semantic-interfaces==0.1.0.dev8. (#634)
- Changed the --group-bys option in mf query to be --group-by
- Add Support for primary_entity in Semantic Models (#694)
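As a rough illustration (the semantic model and entity names below are hypothetical, not from the release), primary_entity supplies an entity name for models that do not declare an entity with type: primary:

```yaml
semantic_models:
  - name: transactions                  # hypothetical model name
    model: ref('fct_transactions')
    # primary_entity names the grain of this model so it can participate
    # in joins even though no entity below is marked type: primary
    primary_entity: transaction
    entities:
      - name: customer
        type: foreign
        expr: customer_id
```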
Features
- Added new entity calls to CLI/MetricFlowEngine
- Script to Generate Snapshots for Supported SQL Engines. (#609)
- Add dbt adapter support for postgres and enable it for tests (#578)
- Use dbt adapter to run queries and warehouse validations from MetricFlow CLI (#624)
- Enable Snowflake queries in dbt <-> MetricFlow CLI integration (#579)
- Refactor mf tutorial to work alongside a dbt project.
- New package dbt-metricflow, which bundles dbt-core, metricflow, and dbt-adapters
- Add Support for Python 3.10 / 3.11 (#659)
- Include metric_time in List Dimensions Output Where Appropriate (#673)
- Enable support for Redshift queries in dbt-metricflow integration (#582)
- Enable Databricks support for the dbt-metricflow integration (#580)
- Enable support for BigQuery for dbt metricflow integration users (#581)
Fixes
- Removes MySQL from SqlEngine and SqlDialect options since it is not supported. (#0)
- Derived metrics were not respecting the constraint defined in the original input metric's definition.
- Fixes type error in BigQuerySqlExpressionRenderer (#536)
- Fix broken type signature for log_call decorator
- Apply transformations to dbt-generated serialized model to fix issue with query generation (#624)
- Improve error message rendering in MetricFlow CLI (#646)
- Added --version and fixed manifest transformer rules for dbt-core~=1.6.0b8 (#650)
- Include granularity suffix on time dimension name rendering for all time dimension granularities
- Clean up list dimensions outputs
Under the Hood
- Adding Changie (#457)
- Ensure use of ValidationIssue instead of ValidationIssueType. ValidationIssueType was from a time before ValidationIssue classes had proper inheritance, and its continued use had become problematic for typing.
- Removing model from the ModelValidator.validate return type. The model isn't altered, and thus doesn't need to be returned.
- Moving AggregationType enum into dbt-semantic-interfaces
- Moving errors relevant to dbt_semantic_interfaces to dbt_semantic_interfaces
- Migrating to RapidFuzz (#470)
- Matching dbt-core issue templates (#457)
- Removing the transform CLA (#450)
- Pinning dbt-core to 1.4 (#475)
- Removing YamlLint (#472)
- Add ObjectToReference class in preparation for removing the .reference calls (#463, #464)
- Moving all *Reference objects to dbt-semantic-interfaces.
- Add pytest flag to use a persistent source schema for faster repeat testing. (#482)
- Renamed instances of and related to Identifiers to Entities. (dbt-semantic-interfaces#9)
- Improves typechecking coverage by upgrading to MyPy 0.942 and removing the blanket ignore-all-imports setting (#536)
- Push mypy to run using local environment packages in pre-commit. Developers should always use a clean virtual environment to ensure consistency with CI. (#530, #536)
- Update mypy to 1.3.0 (#546)
- Migrate from Poetry -> Hatch for Project / Package Management (#549)
- Enable the ability to return only dimensions requested in the query, specifically used for dimension values queries.
- Raising an UnsupportedEngineFeatureError instead of a generic RuntimeError when a data platform doesn't support a feature
- Remove SqlIsolationLevel constructs and other vestigial remnants of defunct SqlClient features (#577)
- Raise a more specific exception when a Metric isn't found during linking.
- Update test environment configuration to allow for more streamlined dependencies
- Remove DDL and other unused methods from SqlClient protocol
Dependencies
- Switches MetricFlow SemanticManifest dependencies from the local dbt semantic interfaces package to the initial dev release (#540)
- Clean up unused dependencies, relax tabulate version pin (#545)
- Update dbt dependencies to support development on the new integration (#571)
- Move SQLAlchemy and SQL engine dependencies out of the production package (#672)
- Update dependencies and attribution file in preparation for 0.200.0 release (#703)
v0.140.0
Highlights
We've added a number of new features, including:
- Derived metrics
- Support for joining against versioned dimensions (Slowly Changing Dimensions, or SCD)
- Percentile measures
- dbt Cloud support
Breaking Changes
- Result layout is changing from one row per metric/null dimension value pair to one row per null dimension value, regardless of the number of metrics in the query. This only affects queries for multiple metrics where the requested dimensions contain null values. See the description on the relevant PR for more detailed information and an example illustrating how the output will change.
- Updates to the required SqlClient protocol could cause typechecking failures for users injecting a custom SqlClient implementation into the MetricFlowClient
- Version minimum changes in SQLAlchemy and snowflake-sqlalchemy could cause dependency conflicts when installed in python environments with libraries requiring an older version of either of these dependencies.
New Feature Details
Derived Metrics (@WilliamDee)
MetricFlow now enables the user to reference metrics in the definition of a metric - an expression of metrics. This feature will further simplify and DRY out code by removing the need to create pre-aggregated subqueries in the data source definition or duplicated measure definitions. For example:
metric:
name: net_sales_per_user
owners:
- [email protected]
type: derived
type_params:
expr: gross_sales - cogs / active_users
metrics:
# these are all metrics (can be a derived metric, meaning building a derived metric with derived metrics)
- name: gross_sales
- name: cogs
- name: users
constraint: is_active # Optional additional constraint
alias: active_users # Optional alias to use in the expr
Versioned dimension (SCD Type II) join support (@tlento)
MetricFlow now supports versioned dimension (Slowly Changing Dimension (SCD) Type II) joins!
Given an SCD Type II table with an entity key and dimension values with an appropriate start and end timestamp column, you can now fetch the slowly changing dimension from an SCD Type II table through extra configurations in your data source.
For specific details and examples, please see the documentation on slowly changing dimensions.
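As a rough sketch of the shape of that configuration (the table and column names below are hypothetical; consult the documentation for the exact fields), the start and end timestamp columns are marked with validity_params:

```yaml
data_source:
  name: sales_person_tiers              # hypothetical SCD Type II table
  dimensions:
    - name: tier_start
      type: time
      type_params:
        time_granularity: day
        validity_params:
          is_start: true                # row becomes valid at this timestamp
    - name: tier_end
      type: time
      type_params:
        time_granularity: day
        validity_params:
          is_end: true                  # row stops being valid at this timestamp
    - name: tier
      type: categorical                 # the slowly changing dimension value
```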
Percentile measures (@kyleli626)
MetricFlow now supports percentile calculations in measures! Simply specify percentile for the agg type in your data sources and input the desired percentile within agg_params as seen in the documentation for configuring measures. This feature also provides a median
aggregation type as a convenience around the appropriate percentile configuration. For example:
measures:
- name: p99_transaction_value
description: The 99th percentile transaction value
expr: transaction_amount_usd
agg: percentile
agg_params:
percentile: .99
# False will calculate the discrete percentile and True will calculate the continuous percentile
use_discrete_percentile: False
create_metric: True
- name: median_transaction_value
description: The median transaction value
expr: transaction_amount_usd
agg: median
create_metric: True
Note that MetricFlow allows for choosing between continuous or discrete percentiles via the use_discrete_percentile parameter.
dbt Cloud support (@QMalcolm)
MetricFlow is now available to use with dbt Cloud. Instead of requiring additional MetricFlow config yamls to enable dbt in your MetricFlow model (as in the previous dbt metrics release), the MetricFlow CLI can work off of a semantic model built from dbt Cloud. To use, simply follow these two steps:
- Install our dbt cloud package (e.g., by running pip install "metricflow[dbt-cloud]")
- Add the following to .metricflow/config.yaml:
dbt_cloud_job_id: <job_id>
# The following service token MUST have dbt Metadata API access for the project containing the specified job
dbt_cloud_service_token: <dbt_service_token>
Other changes
Added
- Support for querying metrics without grouping by dimensions (@WilliamDee)
- A cancel_request API in the SQL client for canceling running queries, with the necessary support for SQL isolation levels and asynchronous query submission (@plypaul)
- Support for passing in query tags for Snowflake queries (@plypaul)
- DataFlowPlan optimization to reduce source table scans (@plypaul)
- Internal API to enable developers to fetch joinable data source targets from an input data source (@courtneyholcomb)
Updated
- Improved readability of validation error messages (@QMalcolm)
- Made Postgres engine tests merge-blocking in CI to reduce cycle time on detecting engine-specific errors (@tlento)
- Updated poetry and python versions in CI to align with our build process and verify all supported Python versions (@tlento)
- Eliminated data source level primary time dimension requirement in cases where all measures have an aggregation time dimension set (@QMalcolm)
- Extended support for typed values for bind parameters (@courtneyholcomb)
- Removed the optional Python levenshtein package from build dependencies in order to streamline package version requirements (@plypaul)
- Consolidated join validation logic to eliminate code duplication and speed development (@plypaul)
- Factored join building logic out of DataflowToSqlQueryPlanBuilder to streamline development (@tlento)
- Improved visibility on underlying errors thrown by sql client requests (@courtneyholcomb)
- Updated SQLAlchemy and snowflake-sqlalchemy minimum version requirements to resolve a version incompatibility introduced with SQLAlchemy 1.4.42 (@tlento)
- Added CI coverage for Databricks SQL Warehouse execution environments (@tlento)
Fixed
- Resolved error encountered in Databricks whenever table rename methods were invoked (@courtneyholcomb)
- Fixed bug with warehouse measure validation where an error would be inappropriately thrown when users with measure-specific agg_time_dimension configurations attempted to run the full validation suite (@WilliamDee)
- Issue with parsing explain output for Databricks SQL warehouse configurations (@courtneyholcomb)
- Floating point comparison errors in CI tests (@tlento)
- Issue with incomplete time range constraint validation that could result in invalid queries (@plypaul)
- Resolved GitHub upgrade warnings on use of deprecated APIs and node.js build versions (@tlento)
- Resolved python-levenshtein optimization warning on CLI startup (@jzhu13)
- Resolved SQLAlchemy warning about the impending deprecation of the engine.table_names method (@Jstein77)
- Improved error message for queries with time range constraints which were too narrow for the chosen time granularity (@kyleli626)
- Eliminated a SQL rendering error in BigQuery which would intermittently produce invalid GROUP BY specifications (@tlento)
v0.130.1
Highlights
We've improved safeguards for proper model development and added support for profile and targets overrides for dbt queries!
Added
- Support for overriding dbt profile and targets attributes when querying dbt models (@QMalcolm)
- Validation to block use of the DISTINCT keyword in COUNT aggregation expressions, as this can lead to incorrect results if optimized queries relying on partial aggregation attempt to do something like SUM(counts) to retrieve a less granular total value. (@tlento)
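To illustrate the pattern the new validation rejects (the measure and column names here are hypothetical):

```yaml
measures:
  # Now blocked by validation: DISTINCT inside a COUNT aggregation
  # expression can produce incorrect partially-aggregated results
  - name: unique_users
    agg: count
    expr: DISTINCT user_id
  # Use the dedicated aggregation type instead
  - name: unique_users
    agg: count_distinct
    expr: user_id
```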
Updated
- Made minor improvements to safeguards for internal development (@tlento)
v0.130.0
Highlights
Introducing query support for dbt metrics!
With this release you can now use MetricFlow to run queries against your dbt metrics config! If you wish to use the MetricFlow toolchain to query your dbt metrics you can now do this with a simple configuration change. To use, reinstall MetricFlow with the appropriate dbt package (see below for supported installations) and make sure the following is in your .metricflow/config.yaml:
model_path: /path/to/dbt/project/root
dbt_repo: true
From there you can use all of Metricflow's tools to query that model!
Our supported installations can be added as follows:
- BigQuery:
pip install metricflow[dbt-bigquery]
- Postgres:
pip install metricflow[dbt-postgres]
- Redshift:
pip install metricflow[dbt-redshift]
- Snowflake:
pip install metricflow[dbt-snowflake]
Packaging changes
- Use of the new dbt integration requires installation of extended package dependencies. These are not pulled in by default.
- Developers experiencing module not found errors with dbt models will need to run the expanded installation via poetry install -E dbt-<data_warehouse>, where <data_warehouse> is one of the supported extras noted above.
Updated
- Internal refactor to use the more appropriate MetricReferences as lookup keys in place of MetricSpec classes. (@WilliamDee)
v0.120.0
Highlights
We now support Databricks! If you use Databricks for your metrics data warehousing needs, give it a go! You'll need your server hostname, http path, and access token (see the Databricks documentation on where to find these). You'll need to use the keys dwh_host, dwh_http_path, and dwh_access_token for these properties in your data warehouse configuration.
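Putting that together, a Databricks warehouse configuration might look roughly like the following (the dwh_dialect key and all values shown are illustrative assumptions, not taken from the release notes):

```yaml
# .metricflow/config.yaml -- illustrative values only
dwh_dialect: databricks                           # assumed engine selector
dwh_host: dbc-12345678-90ab.cloud.databricks.com  # server hostname
dwh_http_path: /sql/1.0/warehouses/abc123def456   # http path
dwh_access_token: dapiXXXXXXXXXXXXXXXX            # access token
```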
We have also fixed some bugs and improved support for semi-additive measures, and added the ability to use measure-specific constraints inside of more complex metric definitions!
Breaking Changes
- Minor API change on SqlClient protocol could break existing API users writing custom SqlClient classes. MyPy should detect this issue. See the relevant PR for details.
Added
- Support for Databricks! Now you can use Metricflow with your Databricks-backed warehouse! (@courtneyholcomb)
- The ability to define constraints on input measures for ratio and expr metric types. This is a temporary quality of life improvement until full-featured derived metrics (i.e., metrics based on other metrics) are available. (@tlento)
- Support for different time granularities, improved group by expression handling, and corrected constraint handling on semi-additive measures (@WilliamDee)
- Support for count as a measure aggregation type. Note this is implemented as an alias around sum, so use of the DISTINCT keyword in expressions is not supported, and will be blocked via validation in a separate update. Users wishing for a COUNT(DISTINCT thing) equivalent should continue to use the count_distinct aggregation type. (@WilliamDee)
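A short sketch of the new aggregation type next to its distinct-counting counterpart (measure and column names are hypothetical):

```yaml
measures:
  # count is rendered as a sum-style aggregation under the hood,
  # so DISTINCT is not supported in expr
  - name: transaction_count
    agg: count
    expr: transaction_id
  # COUNT(DISTINCT ...) equivalents should keep using count_distinct
  - name: unique_purchasers
    agg: count_distinct
    expr: customer_id
```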
Fixed
- Resolved incorrect constraint handling with semi-additive measures (@WilliamDee)
- Eliminated Dataclass deserialization errors on default parameters (@tlento)
Updated
- Optimized multi-hop join candidate selection (@plypaul)
- Improved error handling and error messages on config validation (@QMalcolm, @tlento)
- Streamlined our project README (@nhandel)
- CLI now accepts metric-only queries in limited circumstances. See the assertion checks in the PR for details. We will announce this as a new feature when it is complete, but for the time being users may try it out to see if it meets their needs in its current form. (@WilliamDee)
v0.111.1
Highlights
We now have a basic extension for authoring configs available for Visual Studio Code, with simple support for auto-completion and inline schema validation, courtesy of @jack-transform!
This release also features several bugfixes and improvements, most notably a dramatic speedup (as much as 10x) in plan building for complex queries.
Changes:
Added
- Metric model template for Shopify metrics - if you use Shopify, this template can get you up and running for tracking your Shopify metrics! (@hesham-nawaz)
Fixed
- Fixed warehouse validation failures caused by interaction with measure-specific aggregation times (@QMalcolm)
- Resolved intermittent semantic validation failures due to unpicklable pydantic objects (@tlento)
- Fixed edge case with semi-additive measures failing on certain join dimension connections (@WilliamDee)
- Restructured semantic container dependencies to eliminate certain classes of circular import errors (@tlento)
Updated
- Replaced Pydantic objects with serializable Dataclasses to improve plan building speed. This can reduce time to generate dataflow plans by 10-20x for complex queries. (@plypaul)
- Refactored validation logic to make it easier to reason about (@QMalcolm)
- Modified inference detection rules to make it more likely to classify low cardinality string columns as categorical dimensions (@tlento)
v0.111.0
Metricflow 0.111.0 was released today!
The big news is we have added data source inference as a Beta feature. It allows new users to quickly get started with Metricflow by bootstrapping their data source configurations from a warehouse connection.
Please note that this is still a Beta feature. As such, it should not be expected to be free of bugs, and its CLI/Python interfaces might change without prior notice.
For full details see our release notes.
[0.111.0] - 2022-08-10
Added
- Early beta for data source inference - Snowflake users can now run a command to bootstrap data source definitions based on warehouse metadata for tables in a given schema. Check out mf infer --help for more details. Feedback welcome! (@serramatutu)
- Support for semi-additive measures - semi-additive measures are measures which can be aggregated across some dimensions, but not others. The most common use case for this is an end-of-period measure, such as a statement balance or end of day inventory hold. As of this release we support semi-additive restrictions bound by any time dimension. See the description on PR #183 for more details on usage, and the sketch after this list. (@WilliamDee)
- Expanded data warehouse validations - data warehouse validation runs will now check to ensure all measures, identifiers, and dimensions defined in data sources are queryable. (@QMalcolm)
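A rough sketch of a semi-additive measure using the non_additive_dimension configuration described in PR #183 (measure, dimension, and grouping names here are hypothetical):

```yaml
measures:
  - name: ending_account_balance        # hypothetical end-of-period measure
    agg: sum
    expr: account_balance
    non_additive_dimension:
      name: ds                          # time dimension the measure cannot be summed across
      window_choice: max                # take the latest value within each period
      window_groupings:
        - user                          # keep per-user balances before aggregating
```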
Fixed
- Resolved serialization error on custom Pydantic objects in the model parser (@tlento)
- Cleaned up error message handling and missing test dependencies on model validation (@tlento)
- Eliminated a class of circular import errors by forking reference classes from the specs module (@WilliamDee)
- Fixed error state when specifying a where constraint with metric_time (@plypaul)
Updated
- Updated the tutorial to use metric_time instead of arbitrary time dimensions (@plypaul)
- Increased strictness of typechecking enforcement (@tlento)
- Removed unnecessary "all time" WHERE clause rendering for cases where time constraints were not provided (@WilliamDee)
- Updated Click version from ^8.1.3 to >=7.1.2 to temporarily resolve a dependency issue for downstream Flask 1.1.4 usage (@jack-transform)
- Updated Jinja2 version from 2.11.0 to >=2.11.0 to allow downstream Flask 1.1.4 users to update to 2.x.x (@jpreillymb, @tlento)
v0.110.1
Patch fixes and minor improvements on v0.110.0:
- Fixing a bug where columns would occasionally render in the wrong order (@WilliamDee)
- Package metadata cleanup to fix broken links on pypi (@tlento)
- Rendering BETWEEN expressions to streamline SQL time range comparisons (@WilliamDee)
- Consolidating error detection in validation to surface all errors in a single pass (@QMalcolm)
Get the latest here: https://pypi.org/project/metricflow/0.110.1/
v0.110.0
Metricflow 0.110.0 was released today!
The big news is we have added support for setting custom time dimensions on a measure-by-measure basis. This change also means we no longer require the primary time dimensions to have the same names across all data sources.
To run a query for a time series metric computation, simply request the metric_time
dimension, and Metricflow will use the aggregation time dimension associated with each measure in your query and line everything up on your time series chart for you. You don't even need to know the names of the time dimensions! Please note metric_time
is now a reserved name.
Speaking of reserved names, we have also improved our validation against SQL reserved keywords, which should provide more rapid feedback in the metric config development workflow.
For full details see our release notes.
[0.110.0] - 2022-07-21
Breaking Changes
- Updated query inputs for time series dimensions to use metric_time instead of dimension names, since it is now possible for measures to have different time dimensions for time series aggregation. This also removes the restriction that all data sources have the same primary time dimension name. However, users issuing queries might experience exceptions if they are not using metric_time as their requested primary time dimension. (@plypaul)
- Added enforcement for new reserved keyword metric_time (@tlento)
- Reordered column output to time dimension, identifiers, dimensions, metrics, which could break automation relying on order-dependent parsing of CLI output. We encourage affected users to switch to using the API, and to access the resulting data frame with order-independent (i.e., by name) access to column values. (@WilliamDee)
- Removed support for SQLite - expected impact is minimal as the repo has been cut to DuckDB for in memory testing (@plypaul)